Page 19 - Red Hat PR REPORT - MAY-JUNE 2025
Press Release
Red Hat Summit
Join the Red Hat Summit keynotes to hear the latest from Red Hat executives, customers and
partners:
• Modernized infrastructure meets enterprise-ready AI — Tuesday, May 20, 8-10 a.m. EDT
(YouTube)
• Hybrid cloud evolves to deliver enterprise innovation — Wednesday, May 21, 8-9:30 a.m.
EDT (YouTube)
Supporting Quotes
Joe Fernandes, vice president and general manager, AI Business Unit, Red Hat
“Faster, more efficient inference is emerging as the newest decision point for gen AI innovation.
Red Hat AI, with enhanced inference capabilities through Red Hat AI Inference Server and a new
collection of validated third-party models, helps equip organizations to deploy intelligent
applications where they need to, how they need to and with the components that best meet their
unique needs.”
Michele Rosen, research manager, IDC
“Organizations are moving beyond initial AI explorations and are focused on practical
deployments. The key to their continued success lies in the ability to be adaptable with their AI
strategies to fit various environments and needs. The future of AI not only demands powerful
models, but models that can be deployed with agility and cost-effectiveness. Enterprises seeking
to scale their AI initiatives and deliver business value will find this flexibility absolutely essential.”
Additional Resources
• Learn more about Red Hat AI
• Learn more about Red Hat AI Inference Server
• Learn more about Red Hat OpenShift AI
• Learn more about Red Hat Enterprise Linux AI
• Learn more about Red Hat AI validated models
• Hear more about Red Hat AI Inference Server from Red Hat executives
• Read about the llm-d community project
• Learn about Red Hat’s work with Meta
• Learn about Red Hat’s work with NVIDIA
• Learn about Red Hat’s work with AMD
• Learn about Red Hat’s success with Hitachi
• Learn about Red Hat’s success with DenizBank
• Read more about Llama Stack and MCP
• Read more about model validation
• Read more about LLM model compression
• Read more about feature stores