Has the game changed in artificial intelligence? CPUs instead of GPUs

The artificial intelligence revolution took off with the launch of ChatGPT in late 2022. The suddenness of this boom made AI accelerator GPUs such as the Nvidia H100 hugely popular. However, judging by the figures Google announced today, things are starting to turn around: especially for AI inference, the stage where trained models generate outputs, CPUs are increasingly being preferred over GPUs.

According to Google, CPU usage in artificial intelligence has surpassed GPU usage

Artificial intelligence technology took a sudden, huge leap forward with ChatGPT in late 2022. The wave of AI tools that followed, and the billion-dollar budgets allocated by large companies, turned the whole market upside down. In this first era of AI, Nvidia and its H100 GPUs came to the fore; OpenAI and Microsoft in particular built their AI servers around Nvidia H100 units.

GPUs are still far more effective for training AI models today. However, the real load in the AI sector is now shifting to inference, the generation of outputs from trained models. To make that inference more reliable and faster, CPUs have started to come to the fore. Google Cloud product manager Brandon Royal explained the role of CPUs in AI at the Tech Field Day event he attended.

GPUs, or graphics processors, generally have far more cores than CPUs, which lets graphics cards run many more operations in parallel. Brandon Royal emphasized how important this massively parallel, simultaneous processing is during AI training.
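
As a rough, illustrative sketch of the core-count gap described above (not something from Google's presentation), the short Python snippet below compares the CPU core count reported by the operating system with the number of streaming multiprocessors on an Nvidia GPU, assuming PyTorch with CUDA support is installed; the exact numbers will vary by machine.

```python
# Minimal sketch: compare CPU core count with GPU parallel-compute units.
# Assumes PyTorch is installed; the GPU branch only runs if CUDA is available.
import os
import torch

print(f"CPU logical cores: {os.cpu_count()}")

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    # Each streaming multiprocessor contains many CUDA cores,
    # which is where the GPU's massive parallelism comes from.
    print(f"GPU: {props.name}, streaming multiprocessors: {props.multi_processor_count}")
else:
    print("No CUDA GPU detected; only CPU figures are available.")
```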

When it comes to inference, however, graphics cards lose some of that advantage. Although CPUs cannot match GPUs in massively parallel work, they offer much stronger per-core performance when the job is to focus on a single task.

Brandon Royal stated that processing speed and efficiency matter far more for inference. Although graphics cards make the overall pipeline more optimized, they can lag behind in per-request speed. For this reason, the industry has begun turning to CPUs with built-in AI acceleration for inference workloads.
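
To make the idea of CPU-based inference concrete, here is a minimal, hypothetical Python sketch using PyTorch: it pins the CPU thread count and runs a tiny stand-in model in inference mode. The model, layer sizes, and thread count are illustrative assumptions, not details from Google's setup.

```python
# Minimal sketch: run inference for a small model entirely on the CPU.
# The model and sizes here are illustrative assumptions, not Google's setup.
import torch
import torch.nn as nn

torch.set_num_threads(8)  # limit intra-op threads; tune to the server's cores

model = nn.Sequential(
    nn.Linear(512, 1024),
    nn.ReLU(),
    nn.Linear(1024, 10),
).eval()  # inference mode: no dropout or batch-norm updates

with torch.inference_mode():      # skip autograd bookkeeping for speed
    batch = torch.randn(32, 512)  # a batch of 32 dummy requests
    logits = model(batch)         # forward pass runs on the CPU by default

print(logits.shape)  # torch.Size([32, 10])
```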

Another reason CPUs are on the rise for inference is cost. Server processors are cheaper and easier to obtain than AI-oriented server GPUs such as the H100. Since the AI boom, Nvidia H100 and higher-end GPUs have become especially hard to source.

As a result, AI servers have started favoring AI-capable processors to get faster results, and it remains to be seen whether this shift will unseat Nvidia. Today, AI servers commonly use Intel processors with the AVX-512 vector extensions. Meanwhile, Meta has already rolled up its sleeves on its own AI processor, AMD and Microsoft are preparing major investments in the area, and OpenAI CEO Sam Altman is reportedly seeking a colossal 5 to 7 trillion dollars to develop AI processors.
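
As a hedged illustration of how an operator might verify that a Linux server's CPU actually exposes the AVX-512 extensions mentioned above, the generic Python sketch below reads the flags listed in /proc/cpuinfo; it is not a tool referenced in the article.

```python
# Minimal sketch: check a Linux server's CPU flags for AVX-512 support.
# Reads /proc/cpuinfo, so it only works on Linux systems.
def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
avx512 = sorted(f for f in flags if f.startswith("avx512"))
print("AVX-512 extensions:", ", ".join(avx512) if avx512 else "none detected")
```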
