Nvidia’s new chips accelerate large AI system training, data shows

Published 04/06/2025, 16:56

Investing.com -- Nvidia (NASDAQ:NVDA)’s latest chips have shown significant progress in training large artificial intelligence (AI) systems, according to data released on Wednesday, which points to a sharp decline in the number of chips needed to train large language models.

The data was released by MLCommons, a nonprofit organization that shares benchmark performance results for AI systems. The organization provided new information about chips from Nvidia and Advanced Micro Devices (NASDAQ:AMD), among others, used for AI training. This process involves feeding AI systems large volumes of data to learn from. While much of the stock market’s focus has shifted towards AI inference, where AI systems address user queries, the number of chips needed for system training remains a key competitive factor. For instance, China’s DeepSeek asserts it can create a competitive chatbot with far fewer chips than its U.S. counterparts.

The results are the first MLCommons has released on how chips perform at training AI systems such as Llama 3.1 405B, an open-source AI model developed by Meta Platforms (NASDAQ:META). The model's large number of so-called "parameters" gives an indication of how the chips might handle some of the world's most complex training tasks, which can involve trillions of parameters.

Only Nvidia and its partners submitted data on training this large model. The data showed that Nvidia's new Blackwell chips are more than twice as fast, on a per-chip basis, as the previous generation of Hopper chips.

In the fastest result for Nvidia's new chips, 2,496 Blackwell chips completed the training test in 27 minutes. That was faster than a configuration using more than three times as many chips from Nvidia's previous generation, according to the data.
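To make the per-chip comparison concrete, below is a minimal back-of-the-envelope sketch in Python. The Blackwell chip count and time come from the figures cited above; the Hopper-side chip count and wall-clock time are assumed placeholders (the article says only "more than three times as many chips"), and the calculation assumes near-linear scaling across chips, so the result is illustrative rather than a measured benchmark.

```python
# Illustrative arithmetic only. Blackwell figures are from the article;
# the Hopper-generation chip count and time are ASSUMED placeholders chosen
# to reflect the "more than three times as many chips" comparison.
# Chip-minutes (chips * minutes) is used as a crude proxy for total work,
# which presumes near-linear scaling across chips -- a simplification.

blackwell_chips = 2496
blackwell_minutes = 27.0

# Assumed previous-generation (Hopper) configuration, hypothetical values.
hopper_chips = 3 * blackwell_chips   # "more than three times as many"
hopper_minutes = 27.0                # assume a comparable wall-clock time

blackwell_chip_minutes = blackwell_chips * blackwell_minutes
hopper_chip_minutes = hopper_chips * hopper_minutes

# Ratio of chip-minutes approximates the per-chip speedup.
per_chip_speedup = hopper_chip_minutes / blackwell_chip_minutes
print(f"Approximate per-chip speedup: {per_chip_speedup:.1f}x")
# Prints ~3.0x under these assumptions, consistent with the article's
# "more than twice as fast on a per-chip basis" claim.
```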

In a press conference, Chetan Kapoor, chief product officer at CoreWeave, which worked with Nvidia on some of the results, said the AI industry is shifting toward assembling smaller groups of chips into subsystems that handle separate AI training tasks, rather than building homogeneous clusters of 100,000 chips or more. Kapoor said this approach allows the industry to keep reducing the time required to train models with trillions of parameters.

