12/11/2024 | News release
Superfast chips are in high demand - not just from the artificial intelligence industry, but also from companies working in computer graphics, robotics, autonomous vehicles, and drug discovery. "It's fun to see all these amazing applications being created," says Jensen Huang, the CEO of Nvidia.
Speaking to David Solomon, the CEO of Goldman Sachs, at the Communacopia + Technology conference in San Francisco, Huang explained how computer graphics, for example, rely heavily on AI infrastructure. "We compute one pixel, and we infer the other 32," he says in an edition of Goldman Sachs Talks. "Computing one pixel takes a lot of energy. Inferring the other 32 takes very little energy, and you can do it very fast. And the image quality is incredible."
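A rough way to see the arithmetic behind that claim is sketched below. The 1-to-32 ratio comes from Huang's example; the relative per-pixel energy costs are illustrative assumptions, not published figures.

# Rough illustration of the "compute one pixel, infer the other 32" idea.
# The 1:32 ratio is from Huang's example; the energy costs below are
# illustrative assumptions, not published figures.

RENDERED_FRACTION = 1 / 33          # one pixel fully computed out of every 33
INFERRED_FRACTION = 32 / 33         # the other 32 filled in by the AI model

COST_RENDER = 1.0                   # arbitrary energy units per fully computed pixel
COST_INFER = 0.05                   # assumed to be far cheaper per inferred pixel

cost_all_rendered = COST_RENDER
cost_hybrid = RENDERED_FRACTION * COST_RENDER + INFERRED_FRACTION * COST_INFER

print(f"Energy per pixel, all rendered: {cost_all_rendered:.3f}")
print(f"Energy per pixel, 1 rendered + 32 inferred: {cost_hybrid:.3f}")
print(f"Approximate energy reduction: {cost_all_rendered / cost_hybrid:.1f}x")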
Given this speed and flexibility, the infrastructure more than pays for itself, Huang says, responding to a question from Solomon about returns on investment for customers. By spending on such equipment, "the computing cost goes up a little bit - maybe it doubles," Huang says. "But you reduce the computing time by a factor of about 20. You get 10x savings."
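The arithmetic behind that estimate is straightforward; a minimal sketch using Huang's round numbers:

# Huang's round numbers: accelerated hardware roughly doubles the cost of the
# computation, but cuts the computing time by a factor of about 20.
cost_multiplier = 2.0    # compute cost goes up ~2x
time_reduction = 20.0    # the job finishes ~20x faster

# If total spend scales with (cost rate) x (time), the net saving is:
net_savings = time_reduction / cost_multiplier
print(f"Net savings: {net_savings:.0f}x")   # -> 10x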
How Huang sees the data center market
Chips that accelerate computing are everywhere, but there is no such thing as a universal accelerator, Huang says. Instead, every time a chip company enters a new market, it must learn new algorithms, which differ by purpose: the algorithm for image processing is different from the one used to model fluid dynamics.
"Usually, some 5-10% of the code represents 99.999% of the run time," Huang says. "So if you take that 5% of the code and offloaded it onto an accelerator, then technically you should be able to speed up the application a hundred times."
The promise of this kind of accelerated computing has led to keen investor interest in the data center market, Huang says. He thinks this infrastructure can yet be improved. For one thing, the average data center is "super-inefficient, because it's filled with air, and air is a lousy conductor of electricity." Making data centers denser - eliminating the air, in other words - will make them cheaper and more energy efficient.
Another revolution lies in how data centers now understand not just how to process data but the meaning of the data itself, and how to translate one form of data to another, Huang says: "English to images, images to English, English to proteins, proteins to chemicals."
The chip supply chain needs to be resilient
The ecosystem of manufacturers and suppliers to the chip industry is sprawling and complex, and particularly concentrated in Asia. As a result, Nvidia tries to design diversity and redundancy into every aspect of its supply chain.
Companies need to have "enough intellectual property" to be able to shift their manufacturing from one "fab" - or chip-making facility - to another if they have to, Huang says. "Maybe the process technology won't be as great, or you won't get the same level of performance or cost, but you will still be able to provide the supply."