Last week at ProMat, I interviewed Covariant co-founder and CEO Peter Chen. Was the timing great or terrible, you ask? I'm sure the startup's communications team is happy that I'm writing a follow-up a week later on a fresh investment round.
The $75 million Series C extension is hard to ignore, bringing the AI firm's total fundraising to $222 million. Radical Ventures and Index Ventures led the round, which included new investors (Pension Plan Investment Board and Amplify Partners) and returning ones (Gates Frontier Holdings, AIX Ventures, and Northgate Capital). It follows an $80 million Series C in July 2021.
“This financing allows us to further develop the Covariant Brain as a more powerful foundation model and to apply the Covariant Brain to even more use cases across a broad variety of sectors,” Chen tells TechCrunch. “With eCommerce demand growing and supply chain resilience becoming more critical, we’ve made fantastic progress with global retailers and logistics providers, and we’re looking forward to providing more specifics on these partnerships.”
Covariant demoed its technology for me at the show. Watching the system pick and place items is impressive; knowing what drives those decisions is extraordinary. The logistics play relies on the Covariant Brain's enormous library of package sizes, shapes, and materials, built from real-world picking decisions.
Chen drew a comparison to generative AI systems like ChatGPT. It's more than a tenuous connection to the latest hype cycle: three of the company's four co-founders have direct ties to OpenAI.
Chen says:
Before ChatGPT, there were plenty of natural language processing AIs. Search, translation, sentiment detection, spam detection: natural language AIs were everywhere. Before GPT, you trained a separate AI for each use case on a smaller dataset. Now, GPT handles translation well even though it wasn't built specifically to do so. Instead of using limited amounts of data to train a model specialized to one case, you train a huge, universal foundation model on far more data, making the AI more generalized.