OpenAI is exploring alternatives to some of NVIDIA Corp's (NASDAQ:NVDA) latest AI chips, potentially altering the dynamics between two key players in the AI sector. The move underscores OpenAI's focus on improving AI inference performance, which is essential for applications like ChatGPT.
The decision stems from dissatisfaction with how quickly Nvidia's hardware handles certain tasks. OpenAI is considering partnerships with companies such as Cerebras and Groq to enhance its inference capabilities, Reuters reports.
OpenAI’s Bold Move Against Nvidia’s Dominance
The shift in OpenAI’s strategy comes amid prolonged negotiations with Nvidia over a potential $100 billion investment. While Nvidia remains a leader in training AI models, OpenAI’s pursuit of alternatives in the inference chip market could test Nvidia’s dominance.
OpenAI Chief Executive Sam Altman expressed a desire to remain a significant customer of Nvidia despite seeking alternatives. Nvidia makes "the best AI chips in the world," Altman said, as quoted by the outlet, emphasizing that the company still relies on Nvidia for most of its inference needs.
Are Alternative AI Chips The Future?
OpenAI's pursuit of alternative chips focuses on SRAM-heavy designs, which keep model data in fast on-chip memory and could offer speed advantages for AI applications. Nvidia's GPUs rely on external memory, which adds processing time, a concern for OpenAI's coding product, Codex.
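The speed argument largely comes down to memory bandwidth: generating each token in a large language model requires streaming the model's weights through the chip, so designs that hold weights in on-chip SRAM can serve tokens faster than those fetching them from external memory. The rough arithmetic below is an illustrative sketch only; the model size and bandwidth figures are assumed placeholders, not numbers from the article or any vendor.

```python
# Back-of-envelope estimate of decode speed when inference is memory-bandwidth bound.
# All figures are illustrative assumptions, not vendor specifications.

def max_tokens_per_second(model_size_gb: float, memory_bandwidth_gbps: float) -> float:
    """Upper bound on tokens/second: each generated token requires reading
    the full set of model weights from memory once."""
    return memory_bandwidth_gbps / model_size_gb

# Hypothetical model occupying 70 GB of weights.
model_size_gb = 70.0

# Assumed bandwidth figures, rough orders of magnitude for illustration only:
external_memory_bw = 3_000.0   # GB/s, external (off-chip) memory on a GPU
on_chip_sram_bw = 25_000.0     # GB/s, SRAM-heavy design keeping weights on-chip

print(f"External memory: ~{max_tokens_per_second(model_size_gb, external_memory_bw):.0f} tokens/s")
print(f"On-chip SRAM:    ~{max_tokens_per_second(model_size_gb, on_chip_sram_bw):.0f} tokens/s")
```

Real deployments complicate this picture with batching, caching, and parallelism across many chips, but the sketch illustrates why keeping weights on-chip translates into lower per-token latency for interactive workloads such as coding assistants.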
According to Reuters, OpenAI's collaboration with Cerebras aims to meet the demand for faster performance in coding models. Customers using OpenAI's coding models will "put a big premium on speed for coding work," Altman noted during a recent call.
Nvidia has also shown interest in acquiring companies like Cerebras and Groq to bolster its technology portfolio. However, Cerebras opted for a commercial deal with OpenAI, while Nvidia secured a licensing agreement with Groq.
Photo: Prathmesh T on Shutterstock.com