News Tech World

Nvidia's new AI chip is 4.5x faster than its predecessor

The future of AI looks good

Nvidia says its upcoming H100 "Hopper" Tensor Core GPU set records in its debut on the industry-standard MLPerf benchmarks. The results were striking: the chip ran up to 4.5 times faster than the A100, currently Nvidia's fastest production AI chip.

The MLPerf benchmarks (MLPerf™ Inference 2.1) measure "inference" workloads: how well a chip can apply a previously trained machine-learning model to new data. MLCommons developed the MLPerf benchmarks in 2018 to give potential customers a standardized metric for machine-learning performance.
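To make the training/inference distinction concrete, here is a toy sketch (the model, weights, and numbers are all made up for illustration and have nothing to do with MLPerf's actual workloads): training fits a model's parameters once, while an inference workload repeatedly applies those fixed parameters to new inputs.

```python
# Toy illustration of "inference": applying an already-trained model
# to new data. The model is a trivial linear predictor whose weights
# were fixed during a (not shown) training phase.

TRAINED_WEIGHTS = [0.5, -1.2, 2.0]  # hypothetical, frozen after training
BIAS = 0.1

def infer(features):
    """Inference: a forward pass with fixed weights, no learning."""
    return sum(w * x for w, x in zip(TRAINED_WEIGHTS, features)) + BIAS

# An inference benchmark measures how quickly a chip can run calls
# like these over a stream of fresh inputs.
new_samples = [[1.0, 0.0, 0.5], [0.2, 1.0, -0.3]]
predictions = [infer(s) for s in new_samples]
print(predictions)
```

Real inference benchmarks run far larger models (image classifiers, language models), but the shape of the work is the same: fixed weights, new data, as many forward passes per second as the hardware can manage.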

Nvidia's H100 benchmark results versus the A100, in bar-graph form. (Image: Nvidia)

The H100 also did well in the BERT-Large benchmark, which measures natural-language-processing performance using the BERT model developed by Google. Nvidia credits this result to the Hopper architecture's Transformer Engine, which accelerates the training of transformer models. This suggests the H100 could accelerate future natural-language models similar to OpenAI's GPT-3, which can hold conversational chats and compose written work in many different styles.
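For context, the core operation that transformer models like BERT and GPT-3 are built on is attention. A minimal pure-Python sketch of scaled dot-product attention, with made-up 2-dimensional vectors (not anything Nvidia ships), gives a feel for the arithmetic that hardware like the Transformer Engine is designed to run at scale:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention on plain Python lists.

    Each query scores every key; its output is a softmax-weighted
    average of the value vectors.
    """
    d = len(keys[0])  # vector dimensionality, used for scaling
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
        outputs.append(out)
    return outputs

# Made-up example: one query attending over two key/value pairs.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
print(attention(Q, K, V))
```

A real transformer runs this over thousands of high-dimensional vectors per layer, many layers deep, which is why dedicated matrix hardware makes such a difference.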

The H100 is a high-end GPU designed for AI and supercomputer applications such as large language models, image recognition, image synthesis, and more. It will replace the A100 as Nvidia's flagship data-center GPU, but it is still being readied for release.

But last week the US government imposed restrictions on exporting the chips to China, which could slow production: part of the H100's development takes place in China, and Nvidia warned it might not be able to deliver the chip by the end of 2022.

Nvidia then clarified, in a second Securities and Exchange Commission filing last week, that the US government will allow development in China to continue, so the chip could still arrive by the end of 2022.

If Nvidia's numbers hold up, the H100 could power many of the groundbreaking AI applications to come.