

Red Hat Delivers Accessible, Open Source Generative AI Innovation with Red Hat Enterprise Linux AI

14/05/2024





• The offering is the first to deliver supported, indemnified and open source-licensed IBM Granite LLMs under Red Hat’s flexible and proven enterprise subscription model
• Adds open source InstructLab model alignment tools to the world’s leading enterprise Linux platform to simplify generative AI model experimentation and alignment tuning
• Provides a supported, enterprise-ready model runtime environment across AMD, Intel and NVIDIA platforms for fueling AI innovation built on open source



MAY 13, 2024 – Red Hat, Inc., the world's leading provider of open source solutions, today announced the launch of Red Hat Enterprise Linux AI (RHEL AI), a foundation model platform that enables users to more seamlessly develop, test and deploy generative AI (GenAI) models. RHEL AI brings together the open source-licensed Granite large language model (LLM) family from IBM Research, InstructLab model alignment tools based on the LAB (Large-scale Alignment for chatBots) methodology and a community-driven approach to model development through the InstructLab project. The entire solution is packaged as an optimized, bootable RHEL image for individual server deployments across the hybrid cloud and is also included as part of OpenShift AI, Red Hat’s hybrid machine learning operations (MLOps) platform, for running models and InstructLab at scale across distributed cluster environments.
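To make the model runtime side of this concrete, the sketch below shows how an application might query a Granite model served locally behind an OpenAI-compatible HTTP API, which is a common way such runtimes are exposed. The endpoint URL, port and model identifier are assumptions for illustration only and are not taken from the announcement; a real deployment's serving address and model name would come from its own configuration.

# Minimal sketch, assuming a locally served Granite model behind an
# OpenAI-compatible chat-completions endpoint. URL, port and model name
# below are hypothetical placeholders, not values from the announcement.
import json
import urllib.request

ENDPOINT = "http://localhost:8000/v1/chat/completions"  # assumed local serving address

payload = {
    "model": "granite-7b-lab",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Summarize our support policy in two sentences."}
    ],
    "temperature": 0.2,
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    reply = json.load(response)

# The OpenAI-compatible schema returns generated text under choices[0].message.content.
print(reply["choices"][0]["message"]["content"])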



The launch of ChatGPT generated tremendous interest in GenAI, with the pace of innovation only accelerating since then. Enterprises have begun moving from early evaluations of GenAI services to building out AI-enabled applications. A rapidly growing ecosystem of open model options has spurred further AI innovation and illustrated that there won’t be “one model to rule them all.” Customers will benefit from an array of choices to address specific requirements, all of which stands to be further accelerated by an open approach to innovation.



Implementing an AI strategy requires more than simply selecting a model; technology organizations need the expertise to tune a given model for their specific use case, as well as the ability to manage the significant costs of AI implementation. The scarcity of data science skills is compounded by substantial financial requirements, including:
• Procuring AI infrastructure or consuming AI services
