Nvidia has announced a new configuration of its advanced artificial intelligence chips that is designed to speed up generative AI applications.
The new version of the Grace Hopper Superchip increases the amount of high-bandwidth memory, giving the design the capacity to run larger AI models. The configuration is optimized for AI inference, the workload that powers generative AI applications such as ChatGPT.
Nvidia’s Grace Hopper Superchip design stitches together one of the company’s H100 graphics processing units (GPUs) with an Nvidia-designed central processor. The additional memory in the new configuration allows larger AI models to remain resident on a single GPU, improving performance and reducing the need to spread a model across multiple systems or GPUs.
The new configuration, called GH200, will be available in the second quarter of 2024. Nvidia plans to sell two flavours of the chip: a version comprising two chips that customers can integrate into their own systems, and a complete server system that combines two Grace Hopper designs.
The sources for this piece include an article in Reuters.