Humanoid robots

“AI factories”: Chip giant Nvidia has big plans

Image source: JHVEPhoto / Shutterstock.com

The chip company Nvidia wants to extend its dominance in artificial intelligence technology from data centers into the real world with robots. At its in-house developer conference GTC, Nvidia CEO Jensen Huang presented a platform designed to accelerate the development of humanoid robots.

Entertainment giant Disney and Google's AI company DeepMind are also involved in the initiative, called Isaac GR00T.

Robots “will be a very, very big industry,” said Huang, not least because there will be a shortage of at least 15 million workers for some jobs by the end of this decade. But even beyond that: “Everything that moves will be autonomous.” Robots will need huge amounts of data for training and will have to understand their environment, Huang emphasized. Nvidia wants to cover all the building blocks: training and testing the AI software as well as deploying it in the real world.

Celebrated like a rock star

Chips from Nvidia have become a key technology for artificial intelligence. Initially, the systems were used around the world primarily to train AI applications. Tech giants such as Google and Facebook parent Meta fill entire data centers with them – but AI start-ups such as ChatGPT maker OpenAI rely on them too. This position has fueled explosive growth in Nvidia's business in recent years.

The GTC is Nvidia's annual event, at which the company traditionally offers an outlook on the future. Once again, Huang was celebrated like a rock star. Wearing his usual leather jacket, he fired T-shirts into the audience before the start of his more than two-hour presentation, which he said he delivered without a teleprompter.

“AI factories”

The Nvidia boss describes data centers as “AI factories” that house the computing power for artificial intelligence. “Every industry that manufactures something will have two factories in the future,” said Huang. One will manufacture the physical products as before – and the second will supply the software for them.

In addition to robots, there were also many other announcements:

  • New chip systems: Nvidia wants to meet the rapidly growing demand for AI computing power with the next generation of its AI computers. The new system, called “Vera Rubin,” is set to launch in fall 2026, Huang said. Rubin, along with an upgrade of the current Blackwell platform announced for this year, is intended to drastically reduce the cost of operating AI software compared with previous technology. The next generation is named after Vera Rubin, an American astronomer who made important discoveries about dark matter.
  • AI computers for the desktop: With DGX Spark and DGX Station, Nvidia wants to give AI developers and researchers more computing power locally. Both systems are built around the current generation of AI chips, Blackwell.
  • Robot cars: In the development of technology for autonomous driving, Nvidia has won US car giant General Motors as a customer for its computer systems and software. GM had only recently abandoned its years-long robotaxi development under its Cruise subsidiary.
  • Digital twins: Nvidia relies on training robots, self-driving cars and other AI systems with the help of simulations, which let developers run through countless situations in a short space of time. Under the name Cosmos, Nvidia offers software that simulates real environments in photorealistic video.

100 times more computing power than expected

At the same time, Huang tried to allay investors’ concerns that the world could make do with less AI computing power in the future – and that expectations for Nvidia’s future business could therefore be too high.

The world, he argued, is moving toward using artificial intelligence to generate fresh answers rather than retrieving stored ones. The new reasoning models in particular, which solve problems by building up a chain of thought step by step, are especially hungry for computing power.

As an example, Huang demonstrated how the Chinese DeepSeek R1 model needed 150 times more computing power than traditional AI software to work out the seating plan for a wedding, taking into account traditions and the relationships between individual family members. The conventional model, despite using far less compute, failed at the task.

Overall, easily 100 times more computing capacity will be needed than was assumed a year ago, Huang said. That he chose DeepSeek for the demonstration was probably no coincidence: R1 is said to have been trained with significantly less computing power than earlier AI models, a claim that sent Nvidia's share price plummeting a few weeks ago. Huang argues, however, that the real demand for computing power arises not during training but when generating answers.

Investors, however, were not entirely convinced: Nvidia shares closed down 3.43 percent and lost a further 0.55 percent in US after-hours trading.

dpa
