Nvidia publishes AI factory reference designs
Business Fortune
07 November, 2024
The reference designs are intended to make AI-oriented data center deployments easier and faster.
After discussing AI factories for a while, Nvidia is finally releasing reference designs to aid in their construction. The chip maker has published a set of blueprints known as Enterprise Reference Architectures (Enterprise RAs), which are meant to make building AI-focused data centers easier.
Even by data center standards, building an AI-oriented facility is a difficult undertaking, and for most organizations it would be a first: few have ever constructed an AI factory before. Nvidia's Enterprise RAs are designed to make the process as straightforward as possible while also helping businesses ensure their Nvidia AI factories can scale in the future.
All of the recommended hardware and software for the setup are included in the reference architecture. Each Enterprise RA offers Nvidia-certified server configurations, AI-optimized networking via the Spectrum-X AI Ethernet and BlueField-3 DPUs, and Nvidia's AI Enterprise software for executing AI applications, according to a blog post by Bob Pette, VP for enterprise platforms at Nvidia.
Storage is the one area the reference architectures do not address, because Nvidia does not sell storage. Instead, Nvidia's certified storage partners, including Dell Technologies, Pure Storage, and NetApp, are responsible for the storage hardware and software.
Nvidia's partners, including Cisco, Dell, HPE, Lenovo, and Supermicro, provide AI solutions built on top of its Enterprise RAs; the program spans 23 certified data center partners and 577 certified systems.
On the software side, Nvidia's AI Enterprise platform consists of Nvidia Base Command Manager Essentials, which offers tools for workload management, resource monitoring, and infrastructure provisioning, as well as microservices such as Nvidia NeMo and Nvidia NIM for building and deploying AI applications.
The primary advantage of using Nvidia's AI reference architectures is a faster path to deployment: users follow tested guidance rather than having to figure everything out on their own.