Ross: The fundamental problem with new artificial intelligence
Apr 18, 2022, 7:06 AM | Updated: 10:06 am

Misty II, the first professional-grade platform robot by Misty Robotics, being exhibited at the Dell stand at Mobile World Congress. (Photo by Joan Cros/NurPhoto via Getty Images)
I read Steven Johnson's New York Times article about a new system of artificial intelligence called GPT-3.
GPT-3 stands for Generative Pre-trained Transformer 3, an artificial intelligence program that mimics brain synapses and is housed in a supercomputer in Iowa. This machine is reading the Internet 24-7 and digesting its contents, mapping speech patterns, and teaching itself to write original prose in answer to any question.
It learns by teaching itself to complete partial sentences, much in the same way Microsoft Outlook offers to complete your e-mail responses.
But GPT-3 goes way beyond this. It can write original short stories. It's even been trained to write movie scripts.
Imagine you could choose a subject, a style, and a tone (anything from Northwest Nice to Five Jalapeños) and in less than a second, a grammatically perfect paragraph pops out.
The excerpts in the Times article are stunning compared to what I was seeing a few years ago.
But there is a fundamental problem with this technology, and that's the temptation to use it. If you're just churning out fiction for entertainment, great. However, you know trouble is ahead when you read the stipulations in the software license: the designers specifically forbid using the technology to determine who gets a credit card, a payday loan, a job, or an apartment. It is also forbidden to use it to generate spam, promote "pseudo-pharmaceuticals," or influence the political process.
Except there's no mention of how to enforce that. Since computers cannot experience fear, shame, pain, poverty, loss, or death, they have no motivation to control themselves, so any control will have to be imposed by humans, and I guarantee those humans have every intention of using these machines to do all of those forbidden things.
GPT-3 will also be used to generate high school essays, eventually college essays, and ultimately, human experts will simply read an AI-generated script in their teleprompter glasses.
Of course, ultimately, this will be used to make decisions, like: do we stick with conventional weapons, or is it time for the nukes?
I want to say here and now: that would be bad. I want to get that on the record before this segment is turned over to GPT-3.
Time may be running short. After reading the article, I also read some of the comments.
One of them, from a reader in New Jersey named "Archer," read as follows:
"I have no objection to any of this. I am tired of reading the scribblings of carbon-based writers."
Like I said, that was from a reader in New Jersey named Archer … unless it wasn't.
Listen to Seattle's Morning News with Dave Ross and Colleen O'Brien weekday mornings from 5-9 a.m. on KIRO Newsradio, 97.3 FM. Subscribe to the podcast here.