Page 9 - LARGE LANGUAGE MODELS AND SMALL LANGUAGE MODELS

What are SLMs?

Small language models (SLMs) are AI models that use specialized techniques to be smaller and to require less computing power than larger models.


SLMs use techniques such as model compression, knowledge distillation, and efficient architectures to reduce their size. Model compression uses methods like low-rank factorization, quantization, and pruning to reduce the number of parameters without sacrificing much performance, as in the sketch below.
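The snippet below is a minimal sketch of two of these compression techniques applied to a toy PyTorch model; the layer sizes, pruning ratio, and quantization settings are illustrative assumptions, not values taken from any particular SLM.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy stand-in for a language model: a small stack of linear layers.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))

# Pruning: zero out the 30% smallest-magnitude weights in each linear
# layer, cutting the number of effective parameters.
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the pruning mask into the weights

# Dynamic quantization: store linear-layer weights as 8-bit integers
# instead of 32-bit floats, shrinking those layers roughly 4x.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 512)
print(quantized(x).shape)  # the compressed model still maps 512 -> 512
```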

Knowledge distillation involves teaching a smaller student model to reproduce the behavior of a larger, already trained teacher model; a minimal sketch follows.
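The snippet below sketches the standard distillation objective in PyTorch; the teacher and student networks, temperature, and loss weighting are hypothetical placeholders rather than a specific published recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size = 1000
teacher = nn.Linear(256, vocab_size)                     # stand-in for a large, trained model
student = nn.Sequential(nn.Linear(256, 64), nn.ReLU(),   # much smaller model being trained
                        nn.Linear(64, vocab_size))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature, alpha = 2.0, 0.5                            # soften logits; balance the two losses

features = torch.randn(32, 256)                          # a batch of input representations
labels = torch.randint(0, vocab_size, (32,))

with torch.no_grad():                                    # the teacher is frozen
    teacher_logits = teacher(features)
student_logits = student(features)

# Distillation term: KL divergence between softened teacher and student distributions.
soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
log_probs = F.log_softmax(student_logits / temperature, dim=-1)
distill_loss = F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2

# Task term: ordinary cross-entropy against the true labels.
task_loss = F.cross_entropy(student_logits, labels)

loss = alpha * distill_loss + (1 - alpha) * task_loss
loss.backward()
optimizer.step()
```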

Efficient architectures, like Transformer-XL and Linformer, are specially designed structures that focus on efficiency; Linformer, for example, replaces full self-attention with a low-rank projection so the attention cost grows linearly rather than quadratically with sequence length. A sketch of that idea appears below.
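The class below is a rough sketch of the Linformer-style trick of projecting keys and values down to a fixed length; the dimensions, single attention head, and fixed sequence length are simplifying assumptions, not the full architecture from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LowRankSelfAttention(nn.Module):
    def __init__(self, d_model=256, seq_len=1024, k=64):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.kv = nn.Linear(d_model, 2 * d_model)
        # Learned projections that compress the sequence dimension: n -> k.
        self.proj_k = nn.Linear(seq_len, k, bias=False)
        self.proj_v = nn.Linear(seq_len, k, bias=False)
        self.scale = d_model ** -0.5

    def forward(self, x):                      # x: (batch, n, d_model)
        q = self.q(x)
        keys, values = self.kv(x).chunk(2, dim=-1)
        # Compress keys/values along the sequence axis: (batch, k, d_model).
        keys = self.proj_k(keys.transpose(1, 2)).transpose(1, 2)
        values = self.proj_v(values.transpose(1, 2)).transpose(1, 2)
        # The attention map is now (n x k) instead of (n x n).
        attn = F.softmax(q @ keys.transpose(1, 2) * self.scale, dim=-1)
        return attn @ values                   # (batch, n, d_model)

x = torch.randn(2, 1024, 256)
print(LowRankSelfAttention()(x).shape)         # torch.Size([2, 1024, 256])
```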