
OpenAI May Build Custom AI Chips for ChatGPT

OpenAI is reportedly discussing creating its own custom AI chips

OpenAI, the developer of the highly popular and much-discussed ChatGPT artificial intelligence service, is exploring the development of its own custom AI chips due to the high cost and scarcity of Nvidia compute GPUs and other processors used for AI training. OpenAI is even considering acquiring an existing AI chip company to obtain the intellectual property it needs and to speed up its own chip development.

With everyone scrambling to get on the AI bandwagon, compute chips for AI inference and training are becoming scarce, with supply failing to keep up with demand. On top of that, even a single Nvidia compute GPU carries an enormous price tag, and AI training can require thousands of GPUs working together to speed up the training process.
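
To give a rough sense of how those thousands of GPUs are used together, the sketch below shows data-parallel training with PyTorch's DistributedDataParallel. It is purely illustrative: the tiny model, random data and batch sizes are placeholders, not anything OpenAI has described, and it assumes a job launched with torchrun so that each process drives one GPU.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, WORLD_SIZE and LOCAL_RANK; one process per GPU
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Stand-in for a large language model; real models are vastly bigger
    model = torch.nn.Linear(1024, 1024).cuda()
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        # Each process trains on its own shard of data (random here for brevity)
        batch = torch.randn(32, 1024, device=local_rank)
        loss = model(batch).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()   # DDP averages gradients across every GPU in the job
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with something like `torchrun --nproc_per_node=8 train.py`, the same script scales from a single machine to clusters of many GPUs, which is exactly why demand for Nvidia hardware is so intense.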

Designing a custom chip would be a major shift in OpenAI's market strategy and would entail a significant investment on the company's part. Even if OpenAI decides to pursue this route, it could take years to develop its own AI processors, which would mean continued reliance on current manufacturers such as Nvidia, Intel and AMD. Acquiring an already established chip company would be easier and would accelerate the development process.

Companies such as Amazon Web Services (AWS), Google and Tesla have undertaken chip development for critical services. Should OpenAI join the party, it would align the company with some of the world's tech giants. It is worth noting that even some of these giants have struggled: Meta, for example, took up custom chip development and later abandoned it due to the challenges involved.

The driving force behind OpenAI's exploration of custom chip development is the limited supply of the AI GPUs it uses to train and run its large language models. OpenAI's dependence on GPUs is evident from its use of a massive supercomputer provided by Microsoft, which uses 10,000 Nvidia A100 GPUs. Nvidia reportedly holds around 80% of the global AI processor market.

OpenAI’s CEO, Sam Altman, has voiced concerns about the shortage of GPUs and the expense of operating AI software on such platforms and at such scale. The operational cost of ChatGPT alone is immense: if it were to scale up to the size of Google search, the annual operational cost, putting aside the cost of buying the required hardware, would be approximately 19 trillion Rand.
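
As a rough illustration of how such scaling estimates are put together (a cost per query multiplied by query volume), the sketch below uses purely hypothetical placeholder inputs. These are not figures from the article, OpenAI or Google, and they are not meant to reproduce the 19-trillion-Rand estimate, which rests on the article's own sources.

```python
def annual_serving_cost_zar(cost_per_query_usd: float,
                            queries_per_day: float,
                            zar_per_usd: float) -> float:
    """Yearly cost, in Rand, of serving every query with a large language model."""
    return cost_per_query_usd * queries_per_day * 365 * zar_per_usd

# Hypothetical placeholder inputs chosen only to show the shape of the calculation
estimate = annual_serving_cost_zar(cost_per_query_usd=0.01,   # assumed cost per response
                                   queries_per_day=8.5e9,     # assumed Google-scale volume
                                   zar_per_usd=19.0)          # assumed exchange rate
print(f"Illustrative estimate: R{estimate / 1e12:.2f} trillion per year")
```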
