
Did a human write this? GPT-3, an AI sentence-generating language model

OpenAI, the artificial intelligence research organization, has unveiled GPT-3, the successor to GPT-2 and a language model that can generate sentences indistinguishable from human writing.

GPT-2, released in 2019, performed so well that even its own development team judged it too dangerous, and OpenAI postponed publishing the paper with its full technical details. GPT-2 had 1.5 billion parameters; the new GPT-3 is said to have a whopping 175 billion.

However, it is not the best language model in every situation. GPT-3 delivers excellent results on natural language processing tasks such as translation, news article generation, and answering college-level math test questions, but it can fall short on common-sense reasoning. It is also said to perform relatively poorly on word-in-context analysis and on middle and high school exam questions.

In addition, there are reports that articles generated with GPT-3 were indistinguishable from those written by humans. Related information can be found here.

lswcap

