In a significant advancement in artificial intelligence, AMD announced the release of AMD-135M, its first small language model (SLM). According to AMD.com, this new model aims to address some of the limitations faced by large language models (LLMs) such as GPT-4 and Llama while providing specialized functionality.
AMD-135M: The first AMD small language model
Part of the Llama family, AMD-135M is AMD’s first foray into SLMs. The model was trained from scratch on AMD Instinct™ MI250 accelerators using 670 billion tokens, producing two variants: AMD-Llama-135M and AMD-Llama-135M-code. The former was pre-trained on general data, while the latter was further fine-tuned on an additional 20 billion tokens of code data.
Pre-training AMD-Llama-135M took six days on four MI250 nodes; fine-tuning the code-centric AMD-Llama-135M-code variant required an additional four days.
All associated training code, datasets, and model weights are open source, allowing developers to reproduce the model and contribute to the training of other SLMs and LLMs.
Optimization through speculative decoding
One notable advancement in AMD-135M is the use of speculative decoding. Existing autoregressive approaches for large-scale language models often have low memory access efficiency because each forward pass produces only a single token. Speculative decoding solves this problem by using a small draft model to generate candidate tokens and then verifying them with a larger target model. This method allows generating multiple tokens per forward pass, significantly improving memory access efficiency and inference speed.
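To make the draft-and-verify loop concrete, below is a minimal greedy sketch in PyTorch. It is not AMD’s implementation; it assumes Hugging Face-style causal language models that return `.logits` and a batch size of one.

```python
import torch

@torch.no_grad()
def speculative_decode_greedy(target, draft, input_ids, k=4, max_new_tokens=64):
    """Greedy speculative decoding sketch: a small draft model proposes k
    candidate tokens, the large target model verifies them in one forward pass.
    Assumes batch size 1 and models that return .logits (Hugging Face style)."""
    ids = input_ids
    prompt_len = input_ids.shape[1]
    while ids.shape[1] - prompt_len < max_new_tokens:
        # 1) The draft model proposes k candidate tokens autoregressively.
        draft_ids = ids
        for _ in range(k):
            next_tok = draft(draft_ids).logits[:, -1, :].argmax(-1, keepdim=True)
            draft_ids = torch.cat([draft_ids, next_tok], dim=-1)
        candidates = draft_ids[:, ids.shape[1]:]

        # 2) The target model scores the context plus all candidates in a
        #    single forward pass, instead of one pass per token.
        target_logits = target(draft_ids).logits
        target_pred = target_logits[:, ids.shape[1] - 1:-1, :].argmax(-1)

        # 3) Accept candidates up to the first disagreement, then emit the
        #    target's own token there, so every pass yields at least one token.
        n_accept = 0
        while n_accept < k and candidates[0, n_accept] == target_pred[0, n_accept]:
            n_accept += 1
        if n_accept < k:
            correction = target_pred[:, n_accept:n_accept + 1]
        else:
            correction = target_logits[:, -1, :].argmax(-1, keepdim=True)
        ids = torch.cat([ids, candidates[:, :n_accept], correction], dim=-1)
    return ids
```

Full speculative decoding uses a rejection-sampling rule during verification so the output distribution matches the target model exactly; the greedy variant above simply illustrates why one target forward pass can now yield several accepted tokens.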
Accelerated inference performance
AMD tested the inference performance of AMD-Llama-135M-code as the draft model for CodeLlama-7b on a variety of hardware configurations, including MI250 accelerators and Ryzen™ AI processors. The results show that speculative decoding significantly improves inference performance. This enhancement establishes an end-to-end workflow for training and inference on selected AMD platforms.
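A comparable draft/target pairing can be tried with Hugging Face transformers’ assisted generation, which implements the same propose-and-verify scheme. This is a hedged sketch, not AMD’s benchmark setup: the Hub repository IDs are assumptions about where the checkpoints are published, and the draft model must share the target model’s tokenizer and vocabulary.

```python
# Hedged sketch: speculative (assisted) decoding via transformers' generate().
# Repo IDs are assumed; adjust to the actual published checkpoints.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-hf")
target = AutoModelForCausalLM.from_pretrained(
    "codellama/CodeLlama-7b-hf", torch_dtype=torch.float16, device_map="auto")
draft = AutoModelForCausalLM.from_pretrained(
    "amd/AMD-Llama-135M-code", torch_dtype=torch.float16, device_map="auto")

inputs = tokenizer("def quicksort(arr):", return_tensors="pt").to(target.device)
# assistant_model enables assisted decoding: the 135M draft proposes tokens,
# and CodeLlama-7b verifies them in batched forward passes.
outputs = target.generate(**inputs, assistant_model=draft, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```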
Next steps
AMD aims to foster innovation within the AI community by providing open source reference implementations. The company encourages developers to explore and contribute to new areas of AI technology.
For more information about the AMD-135M, visit the full technology blog on AMD.com.