Bill Pearson, vice president, Data Center & AI Software Solutions and Ecosystem, Intel
“Intel is excited to collaborate with Red Hat to enable Red Hat AI Inference Server on
Intel® Gaudi® accelerators. This integration will provide our customers with an optimized solution
to streamline and scale AI inference, delivering advanced performance and efficiency for a wide
range of enterprise AI applications.”
John Fanelli, vice president, Enterprise Software, NVIDIA
“High-performance inference enables models and AI agents not just to answer, but to reason and
adapt in real time. With open, full-stack NVIDIA accelerated computing and Red Hat AI Inference
Server, developers can run efficient reasoning at scale across hybrid clouds, and deploy with
confidence using Red Hat Inference Server with the new NVIDIA Enterprise AI validated design.”
Additional Resources
Read a technical deep dive on Red Hat AI Inference Server
Hear more about Red Hat AI Inference Server from Red Hat executives
Find out more about Red Hat AI
Learn more about Red Hat OpenShift AI
Learn more about Red Hat Enterprise Linux AI
Read more about the llm-d project
Learn about the latest updates to Red Hat AI
Learn more about Red Hat Summit
See all of Red Hat’s announcements this week in the Red Hat Summit newsroom
Follow @RedHatSummit or #RHSummit on X for event-specific updates
Connect with Red Hat
Learn more about Red Hat
Get more news in the Red Hat newsroom
Read the Red Hat blog