OpenAI is developing an AI inference chip in collaboration with Broadcom
OpenAI has reportedly partnered with American chip design firm Broadcom to work on a specialized AI inference chip. Sources from within OpenAI told Reuters that the company is looking to diversify its supply chains to reduce costs.
At one time, OpenAI was planning to build a network of foundries to manufacture in-house chips, but those plans are now on hold because of the costs and the time they would take to complete.
While it works on an in-house chip, OpenAI has begun adding AMD and Nvidia AI chips to its workflows. The sources told the publication that OpenAI will rely on a mix of internal and external sources, managed through industry partnerships, to keep its chip supply steady.
Two sources told Reuters that OpenAI is still weighing elements of its chip design and may bring on additional external partners. The company has built an in-house team of around 20 people, including Thomas Norrie and Richard Ho, who previously worked on TPUs (Tensor Processing Units) at Google.
OpenAI has managed to secure manufacturing capacity at TSMC (Taiwan Semiconductor Manufacturing Company) with Broadcom's help and plans to begin production of in-house chips in 2026, though the sources said this timeline could change.