Nvidia’s powerful new chip aims to help AI understand you better

What is happening

Nvidia is launching a new chip, the H100 “Hopper”, which has the potential to accelerate the artificial intelligence that is sweeping the tech industry.

Why it’s important

The chip helps cement Nvidia’s lead in technology that is revolutionizing everything in computing, from self-driving cars to translating language as people speak.

Nvidia will begin selling a new AI acceleration chip later this year, as part of the company’s efforts to secure leadership in the artificial intelligence computing revolution.

The H100 “Hopper” processor, which Nvidia chief executive Jensen Huang unveiled in March, is expected to allow AI developers to accelerate their research and create more advanced AI models, especially for challenges such as understanding human language and driving self-driving cars. The chip is expected to start shipping next quarter.

The H100 processor has 80 billion transistors and measures 814 square millimeters, which is almost as large as physically possible with today’s chip manufacturing equipment. (CNET got a glimpse of the H100 Hopper chips and Nvidia’s new Voyager building that will house hardware and software development work.)

The H100 competes with huge, power-hungry AI processors like AMD’s MI250X, Google’s TPU v4 and Intel’s upcoming Ponte Vecchio. These chips are goliaths most commonly found in the preferred environment for AI training: data centers filled with racks of computer hardware and thick copper power cables.

The new chip embodies Nvidia’s evolution from a designer of graphics processing units used for video games into an AI powerhouse. The company got there by tailoring its GPUs to the particular mathematics of AI, such as multiplying large arrays of numbers.
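To make that concrete, here is an illustrative sketch (not Nvidia code) of the workload in question: a single neural-network layer is, at its core, one big matrix multiplication, and AI chips are built to do millions of these multiply-add operations in parallel. NumPy stands in here for what a GPU would execute; the sizes are arbitrary.

```python
import numpy as np

# A toy neural-network "layer": a batch of inputs times a weight matrix.
rng = np.random.default_rng(0)
inputs = rng.random((32, 512))    # batch of 32 examples, 512 features each
weights = rng.random((512, 256))  # layer mapping 512 features to 256 outputs

# One matrix multiply: 32 x 512 x 256 = ~4.2 million multiply-add operations.
# Training a large model repeats operations like this trillions of times,
# which is why dedicated AI hardware matters.
outputs = inputs @ weights

print(outputs.shape)  # (32, 256)
```

A CPU runs these multiply-adds largely in sequence; a GPU spreads them across thousands of cores at once, which is the tailoring the article describes.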

Circuitry for accelerating AI is becoming increasingly important as the technology makes its way into everything from iPhones to Dawn, reportedly the fastest supercomputer in the world. Chips like the H100 are essential for speeding up tasks like training an AI model to translate speech live from one language to another or to generate video captions automatically. Faster performance lets AI developers tackle tougher challenges, such as autonomous vehicles, and speeds up their experimentation, and one of the main areas of improvement is language processing.

Linley Gwennap, an analyst at TechInsights, says the H100, along with Nvidia’s software tools, solidifies the company’s position in the AI processor market.

“Nvidia dominates its competitors,” Gwennap wrote in a report in April.

Pindrop, a longtime Nvidia customer that uses AI-powered voice analytics to help customer service representatives authenticate legitimate customers and spot scammers, says the chipmaker’s steady progress has let it expand into identifying audio deepfakes, the sophisticated computer simulations that can be used to perpetrate fraud or spread misinformation.

“We couldn’t get there if we didn’t have the latest generation of Nvidia GPUs,” said Ellie Khoury, the company’s director of research.

Training its AI system involves processing an enormous amount of information, including audio data from 100,000 voices, each manipulated in multiple ways to simulate conditions like background chatter and bad phone connections. That’s why H100 advancements such as expanded memory and faster processing matter to AI customers.

Nvidia estimates that its H100 is, overall, six times faster than its predecessor, the A100, which the company launched two years ago. One important area that benefits is natural language processing. Also known as NLP, this field of AI helps computers understand speech, summarize documents and translate languages, among other tasks.

Nvidia is a big player in NLP, a field at the forefront of AI. Google’s PaLM AI system can disentangle cause and effect in a sentence, write programming code, explain jokes and play an emoji movie game, but Nvidia’s flexible GPUs are popular with researchers too. For example, Meta, the parent company of Facebook, this week released sophisticated NLP technology for free to speed up AI research, and it runs on 16 Nvidia GPUs.

With the H100, NLP researchers and product developers can work faster, said Ian Buck, vice president of Nvidia’s hyperscale and high performance computing group. “What took months should take less than a week.”

The H100 offers a breakthrough for transformers, an artificial intelligence technology created at Google that can assess the importance of the context around words and detect subtle relationships between different kinds of information. Data such as photos, speech and text used to train AI often must be carefully labeled before use, but transformer-based AI models can use raw data like large swaths of text from the web, said Aidan Gomez, co-founder of the AI language startup Cohere.

“It reads the internet. It consumes the internet,” Gomez said, and the AI model then turns that raw data into useful information that captures what humans know about the world. The effect of transformers, he said, “moved my timeline forward decades” when it comes to the pace of AI progress.
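The context-weighing that transformers perform can be sketched in a few lines. The following is a minimal illustration of scaled dot-product attention, the operation at the heart of transformer models, written in NumPy; the vectors and sizes are made up for demonstration, and real models add learned projections, multiple heads and many layers on top of this.

```python
import numpy as np

def attention(queries, keys, values):
    """Each output row is a context-weighted mix of the value vectors:
    words that are more relevant to each other get higher weights."""
    d = queries.shape[-1]
    # Relevance score of every word to every other word.
    scores = queries @ keys.T / np.sqrt(d)
    # Softmax turns scores into weights that sum to 1 per row.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values

# Four "words", each embedded as an 8-dimensional vector (random for illustration).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))

# Self-attention: every word attends to every other word in the sequence.
out = attention(x, x, x)
print(out.shape)  # (4, 8): one context-aware vector per word
```

This all-pairs structure is what lets the model assess context around words, and it is also why the computation is so heavy: the score matrix grows with the square of the sequence length, which is the kind of workload the H100's transformer-focused hardware targets.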

We can all benefit from the H100’s ability to accelerate AI research and development, said Hang Liu, an assistant professor at the Stevens Institute of Technology. Amazon can spot more fake reviews, chipmakers can better lay out chip circuits, and a computer can turn your words into Chinese as you speak them, he said. “Right now, AI is completely reshaping almost every area of business life.”