
Editorial: Artificial intelligence: A sword with two edges

Machines will have the ability to pool their knowledge. And that might not end well for us.
The OpenAI logo on a mobile phone in front of a computer screen displaying output from ChatGPT. MICHAEL DWYER, THE ASSOCIATED PRESS

This is an editorial aimed at expounding the jeopardies of artificial astuteness.

Think that’s a rather wordy opening sentence? Yes, but it was written by an artificial intelligence program called “Complex Sentence Generator.”

We asked it for an opening line about the dangers of AI, and that’s what it came up with. Not bad for a machine.

But in another sense altogether, not good. It’s all too close to the real thing.

And that has some of the world’s most respected AI experts cautioning that this technology is close to becoming a real and present danger.

Some 350 leading executives and researchers have signed a letter calling for governments to act before it’s too late.

They warn that “mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

To date, the main concern about artificial intelligence has been that jobs will increasingly be put at risk as smart machines take over. That concern is real. Numerous tasks, from analyzing stock market trends to navigating ships, can already be performed by clever software.

And there is no end in sight to how far this replacement of human labour by machines may stretch.

But an entirely different concern is taking shape.

It is in the nature of AI applications that they can store vast quantities of information. By some estimates, the chatbot ChatGPT holds 10,000 times more data than any human can.

This is an impressive ability that clearly offers benefits. An AI doctor could know everything that all of the physicians on Earth know.

But it is also ominous. Through the internet, anything that one AI application knows, the others can learn.

Potentially, that gives machines the ability to pool their knowledge. And that might not end well for us.

History is replete with examples of more advanced civilizations subduing, then oppressing, less technologically sophisticated societies. The aboriginal peoples of Canada can attest to that unfortunate truth.

And here a marketplace reality comes into play. It is to the advantage of AI manufacturers to push the power of their product ahead of the competition.

The smarter, the quicker, the better it sells. The emphasis is not on responsible progress. It’s on beating the other guy to the punch.

Yet what can be done? Some experts have called for a body of regulations that would constrain the over-extension of AI.

And the federal government has introduced legislation that creates “Canada-wide obligations and prohibitions pertaining to the design, development and use of artificial intelligence systems … This applies to any technological system that [processes data] through the use of a genetic algorithm, a neural network, or machine learning … in order to generate content or make decisions, recommendations or predictions.”

Yet how are these “obligations and prohibitions” to be imposed? The technologies in question are expanding so fast that they may well outgrow any attempt at limitation. The horse leaves the barn before the door is closed.

Moreover, the sheer number of actors worldwide now developing artificial intelligence makes any effective oversight and control nearly impossible.

It comes down to this. Smart machines are already close to the point where they must be considered intelligent.

But we know how intelligence has proved a double-edged sword for the human species. The past 100 years have shown the risks as well as rewards that accompany the power of invention.

Two world wars and numerous localized conflicts testify to that inarguable fact.

Might not the same prove true for intelligent machines? This does appear to be, as the experts have said, an existential threat.

>>> To comment on this article, write a letter to the editor: [email protected]