SambaNova Systems, an artificial intelligence chip startup, revealed a new semiconductor on Tuesday that it says will allow its clients to run higher-quality AI models at a lower total cost.
According to the Palo Alto, California-based business, the SN40L processor is designed to run AI models that are more than double the size of the advanced version of OpenAI’s ChatGPT.
“SN40L is specifically built for large language models running enterprise applications,” said SambaNova CEO Rodrigo Liang. “We’ve built a full stack that has allowed us to really understand the enterprise use case extremely well.”
According to Liang, large enterprises trying to put AI to creative use face a different and more complex set of considerations than those addressed by consumer products like ChatGPT.
To be useful to commercial customers, AI technology must be designed differently with respect to security, accuracy, and privacy.
Nvidia (NVDA.O) dominates the AI chip industry, but a surge in demand driven by interest in generative AI software has made its sought-after chips difficult for some organizations to procure. Intel (INTC.O), AMD (AMD.O), and companies like SambaNova have moved to fill the gap.
The new SambaNova chip can run a 5 trillion parameter model and includes two advanced forms of memory; memory is often a bottleneck when processing AI data. According to the company, this hardware combination lets users run larger AI models without trading accuracy for size.
SambaNova's chip is manufactured by Taiwan Semiconductor Manufacturing Company (2330.TW).