
Press Release





               Building the foundation for next-generation AI agents


               AI agents are poised to transform how applications are built, and their complex, autonomous
               workflows will place heavy demands on inference capabilities. The Red Hat OpenShift AI 3.0 release
               continues to lay the groundwork for scalable agentic AI systems not only through its inference
               capabilities but also with new features and enhancements focused on agent management.

               To accelerate agent creation and deployment, Red Hat has introduced a Unified API layer based on
               Llama Stack, which helps align development with industry standards such as OpenAI-compatible LLM
               interface protocols. Additionally, to champion a more open and interoperable ecosystem, Red Hat is
               an early adopter of the Model Context Protocol (MCP), a powerful, emerging standard that
               streamlines how AI models interact with external tools, a fundamental capability for modern AI agents.
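
               To illustrate what an OpenAI-compatible interface means in practice, the minimal Python sketch below
               uses the standard openai client pointed at a self-hosted endpoint. The base URL, API key, and model
               name are illustrative placeholders, not Red Hat OpenShift AI values.

               # Minimal sketch: calling an OpenAI-compatible LLM endpoint with the
               # standard openai Python client. The base_url, api_key, and model name
               # are placeholders for a self-hosted deployment.
               from openai import OpenAI

               client = OpenAI(
                   base_url="https://models.example.internal/v1",  # hypothetical gateway URL
                   api_key="placeholder-token",                    # placeholder credential
               )

               response = client.chat.completions.create(
                   model="example-instruct-model",  # substitute the model served on your cluster
                   messages=[
                       {"role": "system", "content": "You are a helpful assistant."},
                       {"role": "user", "content": "Summarize today's open support tickets."},
                   ],
               )
               print(response.choices[0].message.content)

               Because the interface follows the OpenAI protocol, existing agent frameworks and client libraries
               can typically be pointed at such an endpoint by changing only the base URL and model name.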

               Red Hat AI 3 introduces a new modular and extensible toolkit for model customization, built on
               existing InstructLab functionality. It provides specialized Python libraries that give developers
               greater flexibility and control. The toolkit is powered by open source projects like Docling for
               data processing, which streamlines the ingestion of unstructured documents into an AI-readable
               format. It also includes a flexible framework for synthetic data generation and a training hub for
               LLM fine-tuning. The integrated evaluation hub helps AI engineers monitor and validate results,
               empowering them to confidently leverage their proprietary data for more accurate and relevant AI
               outcomes.
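
               As a hedged sketch of the Docling step described above, the snippet below uses the open source
               Docling library's Python API to convert an unstructured PDF into Markdown text. The input path is a
               placeholder, and how the output feeds the rest of the customization toolkit is an assumption for
               illustration only.

               # Minimal sketch: ingesting an unstructured document into an AI-readable
               # format with the open source Docling library.
               from docling.document_converter import DocumentConverter

               converter = DocumentConverter()
               result = converter.convert("reports/quarterly-review.pdf")  # hypothetical input file

               # Export to Markdown so the text can be used for downstream steps such as
               # synthetic data generation or fine-tuning.
               markdown_text = result.document.export_to_markdown()
               print(markdown_text[:500])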

               Supporting Quotes

               Joe Fernandes, vice president and general manager, AI Business Unit, Red Hat
               "As enterprises scale AI from experimentation to production, they face a new wave of complexity, cost
               and control challenges. With Red Hat AI 3, we are providing an enterprise-grade, open source platform
               that minimizes these hurdles. By bringing new capabilities like distributed inference with llm-d and a
               foundation for agentic AI, we are enabling IT teams to more confidently operationalize next-generation
               AI, on their own terms, across any infrastructure."

               Dan McNamara, senior vice president and general manager, Server and Enterprise AI, AMD
               “As Red Hat brings distributed AI inference into production, AMD is proud to provide the
               high-performance foundation behind it. Together, we’ve integrated the efficiency of AMD EPYC™
               processors, the scalability of AMD Instinct™ GPUs, and the openness of the AMD ROCm™ software stack
               to help enterprises move beyond experimentation and operationalize next-generation AI, turning
               performance and scalability into real business impact across on-prem, cloud, and edge environments.”

               Mariano Greco, chief executive officer, ARSAT
               “As a provider of connectivity infrastructure for Argentina, ARSAT handles massive volumes of customer
               interactions and sensitive data. We needed a solution that would move us beyond simple automation to
               'Augmented Intelligence' while delivering absolute data sovereignty for our customers. By building our
               agentic AI platform on Red Hat OpenShift AI, we went from identifying the need to live production in just
               45 days. Red Hat OpenShift AI has not only helped us improve our service and reduce the time engineers