
model alignment that uses taxonomy-guided synthetic data generation and a novel multi-phase
               tuning framework. This approach makes AI model development more open and accessible to all
               users by reducing reliance on expensive human annotations and proprietary models. Using the
               LAB method, models can be improved by specifying skills and knowledge attached to a taxonomy,
               generating synthetic data from that information at scale, and using the generated data to
               train the model.
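
               For illustration, here is a minimal Python sketch of the taxonomy-to-training-data flow
               described above. Every name in it (TeacherModel, build_prompt, generate_synthetic_data, the
               leaf layout) is a hypothetical stand-in rather than the actual InstructLab API; the real
               tooling drives this process through its own command-line interface and data formats.

               # Minimal sketch of taxonomy-guided synthetic data generation.
               # All names are hypothetical; the stub teacher just echoes templated
               # text so the example runs end to end without a served model.

               import random

               class TeacherModel:
                   """Stand-in for a strong 'teacher' LLM that synthesizes examples."""

                   def complete(self, prompt: str) -> dict:
                       question_line = prompt.splitlines()[1]
                       return {"question": f"(variant of) {question_line}",
                               "answer": "(teacher-generated answer)"}

               def build_prompt(task_description: str, seed: dict) -> str:
                   """Format one taxonomy seed example as a generation prompt."""
                   return (f"Task: {task_description}\n"
                           f"Example question: {seed['question']}\n"
                           f"Example answer: {seed['answer']}\n"
                           "Write one new, similar question and its answer.")

               def generate_synthetic_data(leaf: dict, teacher: TeacherModel,
                                           n_per_seed: int = 3) -> list:
                   """Scale a few human-written seeds into many synthetic pairs."""
                   dataset = []
                   for seed in leaf["seed_examples"]:
                       prompt = build_prompt(leaf["task_description"], seed)
                       for _ in range(n_per_seed):
                           dataset.append(teacher.complete(prompt))
                   random.shuffle(dataset)  # mix examples before tuning
                   return dataset

               if __name__ == "__main__":
                   # One hypothetical taxonomy leaf: a skill plus its seed examples.
                   leaf = {
                       "task_description": "Answer questions about Linux service management.",
                       "seed_examples": [
                           {"question": "How do I list running systemd services?",
                            "answer": "systemctl list-units --type=service --state=running"}],
                   }
                   data = generate_synthetic_data(leaf, TeacherModel())
                   print(f"Generated {len(data)} synthetic training pairs.")
                   # These pairs would then feed the multi-phase tuning stage.

               The point the sketch tries to capture is that a handful of human-written seed examples per
               taxonomy leaf is enough to bootstrap a much larger synthetic training set, which is what lets
               contributors improve a model without expensive human annotation.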

               After seeing that the LAB method could significantly improve model performance, IBM and
               Red Hat decided to launch InstructLab, an open source community built around the LAB method
               and the open source Granite models from IBM. The InstructLab project aims to put LLM
               development into the hands of developers by making building and contributing to an LLM as
               simple as contributing to any other open source project.

               As part of the InstructLab launch, IBM has also released a family of select Granite English
               language and code models in the open. These models are released under an Apache license, with
               transparency on the datasets used to train them. The Granite 7B English language model has
               been integrated into the InstructLab community, where end users can contribute skills and
               knowledge to collectively enhance this model, just as they would when contributing to any
               other open source project. Similar support for Granite code models within InstructLab will
               be available soon.

               Open source AI innovation on a trusted Linux backbone

               RHEL AI builds on this open approach to AI innovation, incorporating an enterprise-ready version
               of the InstructLab project and the Granite language and code models along with the world’s leading
               enterprise Linux platform to simplify deployment across a hybrid infrastructure environment. This
               creates a foundation model platform for bringing open source-licensed GenAI models into the
               enterprise. RHEL AI includes:

                     •  Open source-licensed Granite language and code models that are supported and
                        indemnified by Red Hat.
                     •  A supported, lifecycled distribution of InstructLab that provides a scalable,
                        cost-effective solution for enhancing LLM capabilities and making knowledge and
                        skills contributions accessible to a much wider range of users.
                     •  Optimized bootable model runtime instances with Granite models and InstructLab
                        tooling, packaged as bootable RHEL images via RHEL image mode, including optimized
                        PyTorch runtime libraries and accelerators for AMD Instinct™ MI300X, Intel and
                        NVIDIA GPUs, and NeMo frameworks.
                     •  Red Hat’s complete enterprise support and lifecycle promise that starts with a
                        trusted enterprise product distribution, 24x7 production support and extended
                        lifecycle support.


               As organizations experiment with and tune new AI models on RHEL AI, they have a ready
               on-ramp for scaling these workflows with Red Hat OpenShift AI, which will include RHEL AI.
               There they can leverage OpenShift’s Kubernetes engine to train and serve AI models at scale,
               and OpenShift AI’s integrated MLOps capabilities to manage the model lifecycle. IBM’s
               watsonx.ai enterprise studio, which is built on Red Hat OpenShift AI today, will benefit
               from the inclusion of RHEL AI in OpenShift AI upon availability, bringing additional
               capabilities for enterprise AI development, data management, model governance and improved
               price performance.