OpenAI’s push into artificial intelligence “apps” does not necessarily spell the end for startups building AI products, two of the company’s investors said at the Reuters Next conference on Thursday.
Investors, they said, are still searching for fresh AI products that improve user interfaces and tackle complex technical problems such as brain-computer interfaces.
Earlier this week, the maker of the hugely popular ChatGPT chatbot unveiled a marketplace where users can find customized AI “apps” for tasks ranging from creating stickers to teaching arithmetic.
The move has stoked fears among founders of AI startups that they will not be able to compete with OpenAI, which is building an AI empire by selling products to both businesses and consumers.
“AI has a ton of potential for future innovation. We’re in an intermediary step in a decades-long revolution,” Konstantine Buhler, a partner at Sequoia Capital, said at the conference. “You can play a huge role in shaping this.”
Sequoia invested in 2021 in OpenAI, the company behind ChatGPT, in which Microsoft (MSFT.O) also holds a significant stake.
Avery Klemmer, a partner at Thrive Capital, which recently expanded its investment in OpenAI, said she likewise sees potential for consumer applications to grow beyond ChatGPT.
She also anticipates advances beyond the current chatbot format that ChatGPT made famous.
“I think there will be really novel formats and forms of engagement that get invented,” Klemmer said.
Despite the current frenzy of businesses and venture capital firms pouring money into the technology, analysts and investors say the development of AI products is still in its early stages.
Jill Chase, a partner at CapitalG, said that while building applications on large language models remains relatively expensive, the cost of AI inference, that is, running a trained model to produce predictions or outputs, could fall quickly given the accelerated pace of research in the field, opening the door to new products.
“The cost of inference coming down so dramatically may seem small, but it’s hugely impactful for what types of businesses can be created, and what use cases incumbents can empower,” she continued.
