Supermicro and NVIDIA partner on AI at the edge
NVIDIA is playing a crucial role in enabling Supermicro to expand its AI solutions portfolio, allowing customers to bring advanced AI capabilities to edge computing environments.
Supermicro is using NVIDIA GPUs to provide greater processing power for AI workloads at the edge, alongside the NVIDIA AI Platform, which offers software tools and libraries optimised for running those workloads.
Supermicro's application-optimised servers with NVIDIA GPUs make it easier to fine-tune pre-trained models and deploy AI inference solutions at the edge, where the data is generated, improving response times and decision-making.
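The workflow described here, fine-tuning a pre-trained model and then shipping it to the edge for inference, can be illustrated with a minimal sketch. It uses PyTorch and a torchvision ResNet-18 purely as stand-ins (neither is named in the article), and the dataset path and class count are hypothetical.

```python
# Minimal sketch: fine-tune a pre-trained model, then export it for edge inference.
# Assumptions (not from the article): PyTorch + torchvision, a hypothetical
# ImageFolder dataset at ./edge_data, and 4 target classes.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load a model pre-trained on ImageNet and replace its head for the edge task.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 4)  # 4 hypothetical edge classes
model.to(device)

# Hypothetical site-specific data collected at the edge location.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("./edge_data/train", transform=preprocess)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

# Short fine-tuning loop: only a few epochs are needed because the backbone
# already carries general-purpose features.
optimiser = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
model.train()
for epoch in range(3):
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimiser.step()

# Export a TorchScript artefact that an edge server can load for inference
# without needing the training code or dataset.
model.eval()
scripted = torch.jit.script(model)
scripted.save("edge_model.pt")
```

Once exported, the TorchScript file can be copied to an edge server and loaded with torch.jit.load, so inference runs locally on the GPU next to the data source.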
Utilising AI to drive real data centre business value
Last year (2023) saw huge progress in innovation entwined with AI, the Internet of Things (IoT) and other new technologies that businesses built into their digital strategies. This year (2024), enterprises are starting to put those ideas into practice and considering how they can further optimise their operations.
As a result, companies like Supermicro are seeking to expand their remote edge computing services, as customers require high-performance AI inference and training solutions to advance workloads and increase productivity. Supermicro is expanding its AI portfolio to enable customers to leverage the power of AI in edge locations such as public spaces, retail stores and industrial infrastructure.
The company's edge AI solutions can support pre-trained models tailored to its customers' edge environments.
“The Supermicro Hyper-E server, based on the dual 5th Gen Intel Xeon processors, can support up to three NVIDIA H100 Tensor Core GPUs, delivering unparalleled performance for Edge AI,” says Charles Liang, President and CEO of Supermicro. “With up to 8TB of memory in these servers, we are bringing data centre AI processing power to edge locations. Supermicro continues to provide the industry with optimised solutions as enterprises build a competitive advantage by processing AI data at their edge locations.”
Enhancing customer experience
In edge data centres, data can be processed and analysed more quickly and effectively, which is crucial for applications that require low latency or high bandwidth.
By processing data closer to the user, edge data centres can reduce the time it takes for data to travel to and from the central data centre, significantly improving the responsiveness of the application.
With server advancements like these, users no longer need to send data to the cloud for processing and then retrieve the results at the edge. Customers can now run pre-trained large language models (LLMs), optimised for performance and available with NVIDIA AI Enterprise, at the edge locations where the data is generated and real-time decisions need to be made.
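As a rough illustration of what running such a model locally can look like, the sketch below loads a pre-trained LLM onto an on-site GPU using the open-source Hugging Face transformers library. This is not NVIDIA AI Enterprise's own tooling (which ships optimised runtimes not shown here), and the model name and prompt are illustrative placeholders.

```python
# Minimal sketch: run a pre-trained LLM on an edge server's local GPU,
# so prompts and data never leave the site. Uses Hugging Face transformers
# as a generic stand-in; model name and prompt are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-Instruct-v0.2"  # illustrative choice
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # half precision to fit edge GPU memory
).to(device)

# A hypothetical prompt built from locally generated retail or sensor data.
prompt = "Summarise the last hour of checkout-queue observations: ..."
inputs = tokenizer(prompt, return_tensors="pt").to(device)

# Generate a response entirely on the local GPU, no round trip to the cloud.
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Because both the model weights and the data stay on the edge server, the only latency is local inference time rather than a round trip to a central data centre.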
Supermicro has been keen to advance its AI solutions, having recently launched a full-stack storage solution for AI and machine learning data pipelines. The end goal is to enable businesses across a wide range of industries to train AI models faster.
“Businesses across industries, including healthcare, retail, manufacturing and auto, are increasingly looking to leverage AI at the edge,” says Kevin Connors, Vice President of Partner Alliances at NVIDIA. “The new Supermicro NVIDIA-Certified Systems, powered by the NVIDIA AI platform, are built to deliver the highest-performing accelerated computing infrastructure, as well as NVIDIA AI Enterprise software to help run edge AI workloads.”
In addition to NVIDIA hardware and software, Supermicro also offers an extensive portfolio of Intel- and AMD-based storage servers to advance customer AI.