
OpenAI, an artificial intelligence research organization, has unveiled GPT-3, the successor to GPT-2 — a language model that can generate text difficult to distinguish from human writing.
When GPT-2 was released in 2019, OpenAI also postponed publishing a paper containing its technical details, saying that the model's performance made it too dangerous to release in full. GPT-2 had 1.5 billion parameters; the new GPT-3 is said to have a whopping 175 billion.
However, GPT-3 is not the best language model for every situation. It produces excellent results on natural language processing tasks such as translation, news article generation, and college-level math test questions, but it can fall short at common-sense reasoning. It reportedly also performs worse at word-context analysis and on middle and high school test questions.
In addition, there are reports that GPT-3 has been used to produce articles indistinguishable from those written by humans. Related information can be found here.