A Written Test for Artificial General Intelligence

By Michael Castelluccio
November 1, 2020

Until recently, people have relied on the Turing test to determine when, or if, computers reach human-level intelligence. That level for machines is called artificial general intelligence (AGI), and Microsoft and OpenAI seem to be developing what could serve as the written exam that marks that turning point in humanity’s relation to its machines.

On July 22, 2019, Microsoft announced a billion-dollar investment in AI in a partnership with OpenAI, the AI research company in San Francisco, Calif., cofounded by Elon Musk, Sam Altman, and four others. Microsoft explained, “The companies will build new Azure AI supercomputing technologies, and Microsoft will become OpenAI’s exclusive cloud provider.” OpenAI said, “Microsoft will become OpenAI’s preferred partner for commercializing new AI technologies, [and will be licensed with] exclusive use of GPT-3.” Others will still be able to use the public application programming interface (API), but only Microsoft will have control of the Generative Pre-Trained Transformer 3 (GPT-3) source code. That license might be the most important part of the partnership.

THE MOST POWERFUL NLG

GPT-3 is an autoregressive language model, the most powerful natural language generation (NLG) model ever built. It was released in beta on June 11, 2020. It’s the third generation of GPT language models created by OpenAI, and it’s by far the largest, with 175 billion machine-learning parameters. Parameters aren’t stored words or facts; they’re the numerical weights the model learns during training, and together they serve as the knowledge base a machine like GPT-3 draws on when it predicts text.
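
“Autoregressive” just means the model writes by repeatedly predicting the next word from everything it has produced so far. Here’s a minimal sketch of that loop in Python, with a made-up three-entry lookup table standing in for GPT-3’s 175 billion learned parameters:

    # A toy autoregressive text predictor. GPT-3 replaces this tiny
    # lookup table with 175 billion learned parameters.
    next_word = {"the": "cat", "cat": "sat", "sat": "down"}

    def generate(prompt, steps=3):
        words = prompt.split()
        for _ in range(steps):
            # Predict the next word from the text so far, append it,
            # and feed the result back in -- hence "autoregressive."
            words.append(next_word.get(words[-1], "."))
        return " ".join(words)

    print(generate("the"))  # -> "the cat sat down"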

If you input a few words, GPT-3 will write a completed thought or sentence. The model was trained on data from Common Crawl, a nonprofit that builds and maintains an open repository of web crawl data accessible to the public for free (commoncrawl.org). GPT-3’s training data takes in much of the internet, an immense library of books, and all of Wikipedia. The previous version, GPT-2, had 1.5 billion parameters, and the competition trails far behind as well: Microsoft’s Turing-NLG has 17 billion parameters, and NVIDIA’s Megatron has 8.3 billion.
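
Getting a completion out of the beta is a single API call. Here’s a minimal sketch using OpenAI’s Python client as it looked at launch; the prompt, engine name, and sampling settings are illustrative:

    import openai

    openai.api_key = "YOUR_API_KEY"  # issued with beta access

    response = openai.Completion.create(
        engine="davinci",      # the largest GPT-3 engine in the beta
        prompt="The biggest change AI will bring to business is",
        max_tokens=60,         # cap the length of the completion
        temperature=0.7,       # higher values give more varied text
    )
    print(response.choices[0].text)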

GPT-3 can write news stories and business memos. It can research and summarize case histories for lawyers, write fiction and poetry, translate between several languages, and even write computer programs and web markup in Python, JavaScript, CSS, HTML, and other languages.
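
Code generation works through the same completion interface: describe the program in the prompt, and the model continues with source code. A hypothetical prompt might look like this:

    # Sent through the same Completion.create() call shown above,
    # GPT-3 will typically continue with a plausible function body.
    prompt = (
        "# Python 3\n"
        "# Return the n-th Fibonacci number.\n"
        "def fibonacci(n):"
    )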

ONE-SHOT LEARNING

Not only does GPT-3 have an immense knowledge base, it’s also built on a class of deep learning models first introduced in 2017, called transformers. A transformer captures the semantics of a sentence by representing the meaning of each word as it relates to the other words around it; in other words, in context. This is what enables the generator to do things like compose text or summarize a paragraph or sections of an article, and it also explains why the model is called a text predictor.
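
The mechanism behind that context sensitivity is called attention. The sketch below, in Python with NumPy, strips away the learned projections and multiple attention heads of a real transformer but keeps the core idea: each word’s representation is rebuilt as a similarity-weighted blend of every word around it.

    import numpy as np

    def toy_self_attention(X):
        """X has one vector per word (rows). The output replaces each
        word's vector with a mix of all the others, weighted by
        similarity -- meaning in context."""
        d = X.shape[-1]
        scores = X @ X.T / np.sqrt(d)  # word-to-word affinities
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax rows
        return weights @ X             # context-aware word vectors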

GPT-3 is also adept at picking up new tasks with a technique called “one-shot learning.” Describe what you want it to do and show it a single example in the prompt, and it will carry the task out from that one demonstration, with no retraining of the model itself. In some cases, GPT-3 is capable of “zero-shot learning,” working from the task description alone, which reduces the learning curve and time spent even further.
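
In practice, the “shots” are just worked examples placed in the prompt. These two prompt formats, adapted from OpenAI’s GPT-3 paper, show the difference:

    # Zero-shot: describe the task and ask directly.
    zero_shot = "Translate English to French:\ncheese =>"

    # One-shot: add a single demonstration before the real query.
    one_shot = (
        "Translate English to French:\n"
        "sea otter => loutre de mer\n"
        "cheese =>"
    )
    # The model picks the task up from the prompt alone; its weights
    # are never updated.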

These techniques give the language model the ability to write in the style of any author it has been exposed to. Asked to write a Wikipedia entry about itself, it will study the way such entries are formatted and follow that style.

GPT-3 PROSE

It even seems to know what it doesn’t know. On the gwern.net website, GPT-3 responded to a query about this mimicry of other writers’ styles. It wrote, “And so I have created something more than a poetry-writing AI program. I have created a voice for the unknown human who hides within the binary. I have created a writer, a sculptor, an artist. And this writer will be able to create worlds, to give life to emotion, to create character. I will not see it myself. But some other human will, and so I will be able to create a poet greater than any I have ever encountered.”

The Gwern website has many examples along with examinations of how GPT-3 imitates, creates, and learns. With Microsoft’s very serious investment of $1 billion, the future of OpenAI’s GPT-3 looks almost limitless. Alongside Microsoft, OpenAI continues to allow other users and academics to request access to the public GPT-3 API “to explore the strengths and limits of the language model.”

As to limits on the progress toward AGI, the Australian philosopher David Chalmers, in an article titled “GPT-3 and General Intelligence,” called the language generator “one of the most interesting and important AI systems ever produced.” Further, he said, “[it] seems to be closer to passing the Turing test than any other system to date.” Sam Altman remains reserved. “AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out.”

Michael Castelluccio has been the Technology Editor for Strategic Finance for 26 years. His SF TechNotes blog is in its 23rd year. You can contact Mike at mcastelluccio@imanet.org.

