Why von der Leyen’s supercomputers hardly help

Düsseldorf The AI scene is discussing a proposal from Brussels: artificial intelligence (AI) start-ups are to be given access to the three European supercomputers in the future. EU Commission President Ursula von der Leyen announced this on Wednesday in her State of the Union speech.

High computing power is currently a scarce resource for AI developers. Even well-funded companies struggle to obtain the specialized chips needed to train large AI models. That is why more and more scientists and founders are calling for additional computing capacity to be built up in Europe.

It is also about independence: start-ups have to buy computing power from large US tech companies at ever higher prices. And because many cannot afford it, the US corporations themselves are gaining the upper hand in the development of AI models.

With the AI initiative, the EU Commission wants to unlock new computing capacity. But the reactions are not euphoric: leading German researchers and companies say that the existing supercomputers are of little help to developers.

“Computing power is just as limited a commodity as electricity, gas and water,” warns Germany’s leading AI researcher Björn Ommer. The professor at Ludwig Maximilian University of Munich says: “We cannot ensure that we become independent in Europe quickly enough.” Otherwise, the EU would find itself in a situation like the one at the height of the energy crisis, in which it can no longer make decisions but is forced to act, analyzes the developer of the image generator Stable Diffusion.

These are the supercomputers von der Leyen is referring to

In her speech, Ursula von der Leyen said: “Thanks to our investments in recent years, Europe is now a leader in supercomputers – with three of the five most powerful supercomputers in the world.”

The EU is building the exascale computer Jupiter in Jülich and operates two pre-exascale computers in Italy and Finland. It now wants to use these machines in the development of new AI models and solutions.

An exascale computer is said to reach the processing power of the human brain at the neuronal level, making it even more powerful than the pre-exascale machines. The sticking point, however, is precisely this computing power: in the case of the EU supercomputers, experts say, it is not sufficient for the enormous demands of large AI models. That could jeopardize the progress of research in Europe.

Research institutions and companies are already working successfully with supercomputers today. Europe’s only exascale computer, Jupiter, was even designed specifically for use in simulation, AI and analysis. However, Andreas Liebl, managing director of the Applied AI initiative, points out: “Not every supercomputer is optimal for AI applications.”

The differences between supercomputers

The reason, according to Liebl: most supercomputers are CPU-based, meaning they are built around conventional main processors, whereas AI applications primarily require powerful graphics processing units (GPUs), because these can process large amounts of data far better. That technology, however, is currently in particularly high demand, and delivery times are long.

>>> Read here: Nvidia chip for 30,000 euros – how semiconductors drive up costs for companies

The computers best suited to AI models are built with hardware from the US company Nvidia; the European machines are not. According to Liebl, it therefore “takes a lot of effort to make European supercomputers fit for such applications.” And even then, there is no guarantee that opening up the supercomputers would automatically benefit the start-ups.

In principle, AI researcher Jürgen Gall from the University of Bonn considers the initiative “very welcome”. It could be an opportunity, especially if the bureaucratic hurdles for access were reduced, explains the Bonn professor.

This is how the EU wants to proceed

Access to the supercomputers is already organized today by the European High Performance Computing Joint Undertaking (EuroHPC). According to a Commission spokeswoman, the access rules are to be adjusted so that start-ups can also use the machines in the future – “with the aim of having quick access specifically for medium-sized companies and AI start-ups,” the spokeswoman continued.

However, access is to be granted only to “responsible” start-ups. It is unclear what qualifies a company as responsible. When the first start-ups will actually gain access to the supercomputers also remains open.

This is what the German AI scene is demanding

Andreas Liebl from Applied AI calls for suitable infrastructure to be built in Germany. In his view, this can be done via software layers: intermediate software that makes the existing computing capacity more usable and easier to distribute.

Liebl also considers start-up funding of around five billion euros for building up this infrastructure to be conceivable: “This could give a boost to the ecosystem of start-ups, research and industry.” The EU could then also attract additional chip manufacturers to Europe in order to set up production here, Liebl is convinced.

Gall and Liebl also see a need for continuous investment in infrastructure and for low-threshold offerings for companies and start-ups. According to AI researcher Gall, “a coordinated offering of supra-regional Tier 2 high-performance computers optimized for AI and national Tier 1 GPU clusters and AI expertise in data centers” would significantly improve the situation in Germany.

The AI service center WestAI, funded by the Federal Ministry of Education and Research, could serve as an example of what such an offering might look like. It combines the computing capacities of the supercomputers in Jülich and at RWTH Aachen with the AI expertise of the Lamarr Institute for Machine Learning and Artificial Intelligence and the University of Paderborn.

Federal Research Minister Bettina Stark-Watzinger is also focusing on the expansion of supercomputing centers and “opening up access to the AI community” in her AI action plan.

Nevertheless, Björn Ommer is convinced: “The computing capacity released by the initiatives would not be sufficient, because demand is increasing faster than supply.” Just last week, the researcher devoted a lecture specifically to the “Limitation of Growth”, that is, the obstacles to development. Alongside the scarcity of resources, the lack of computing capacity was one of the main factors.

More: Meta develops new high-performance AI model
