Hey Everyone,
The cascade of Generative AI technologies made possible by GPT-3 is getting a lot of hype and Venture Capital interest. GPT-3, introduced in May 2020 and in beta testing as of July 2020, is part of a broader trend in natural language processing (NLP) toward pre-trained language representations.
We are thus due for a GPT-4 announcement soon, heading into 2023, and there are rumors that it will be a significant leap. It’s been nearly 2.5 years since GPT-3 was announced, and it has changed the landscape of artificial intelligence, spawning a wide variety of projects, startups and momentum.
Back in the Fall of 2021, Sam Altman, the CEO of OpenAI, spoke about the impending GPT-4 release in a question-and-answer session at the AC10 online meetup. GPT-1 was released in 2018, GPT-2 in 2019, and GPT-3 in 2020. Most industry watchers believe GPT-4 will be launched sometime in early 2023, and many believe it will be worth the wait.
According to Altman, GPT-4 won’t be much bigger than GPT-3. So, we can assume that it will have around 175B-280B parameters, similar to DeepMind’s language model Gopher. That said, I am not one to make predictions about what GPT-4 will do, how big it will be, or what its impact on the industry will be.
According to forecasts by Gartner, the worldwide AI software market is growing at a rate of 21.3% per year and is expected to generate $62.5 billion in revenue in 2022.
Given the pace of Generative A.I., it’s undeniable how significant GPT-3 was in the history of artificial intelligence. Still, in simple terms, a larger size does not mean higher performance. Megatron-Turing NLG, at 530B parameters roughly three times the size of GPT-3, did not exceed it in performance.
2022 was an exciting year in A.I. In July 2022, OpenAI opened up DALL-E 2, its state-of-the-art text-to-image model. A few weeks later, Stability AI released Stable Diffusion, an open-source text-to-image model. Midjourney and many others followed, and applications are already spinning out across the technology sector, including TikTok’s AI green screen effect.
How transformative will GPT-4 be? Altman said that they are focusing on making smaller models perform better. Large language models require large datasets, massive computing resources, and complex implementations.
Twitter is now full of rumors, speculation, conjecture and hype around what GPT-4 will be able to do. Whether you call it Generative A.I., Software 3.0 or something else, it’s yet another wave of A.I. hype.