Crypto Flexs
  • DIRECTORY
  • CRYPTO
    • ETHEREUM
    • BITCOIN
    • ALTCOIN
  • BLOCKCHAIN
  • EXCHANGE
  • TRADING
  • SUBMIT
ADOPTION NEWS

TEAL, Introducing Training-Free Activation Sparsity to Improve LLM Efficiency

By Crypto Flexs · September 1, 2024 · 3 Mins Read

Jack Anderson
September 1, 2024 08:34

TEAL provides a training-free approach to activation sparsity that significantly improves the efficiency of large language models (LLMs) with minimal performance degradation.

TEAL (Training-Free Activation Sparsity in LLMs) has emerged as a groundbreaking approach to improving the efficiency of large language models (LLMs) without additional training. According to together.ai, the method achieves 40-50% activation sparsity with minimal degradation by applying magnitude-based pruning to hidden states throughout the model. Because fewer weights then need to be transferred to on-chip memory, this addresses the memory-bound nature of LLM inference and translates into a 1.53-1.8x wall-clock speedup in single-batch decoding.
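The core idea of magnitude pruning on hidden states can be sketched as follows. This is an illustrative toy, not TEAL's actual implementation: it zeroes the smallest-magnitude entries of an activation vector until a target sparsity level is reached.

```python
# Illustrative sketch (not TEAL's actual kernel): magnitude-based pruning
# of a hidden-state vector to a target sparsity level.

def sparsify(hidden_state, target_sparsity):
    """Zero out the smallest-magnitude entries of `hidden_state` so that
    roughly `target_sparsity` of its entries become zero."""
    n = len(hidden_state)
    k = int(n * target_sparsity)               # number of entries to zero
    # Threshold = magnitude of the k-th smallest |value|
    magnitudes = sorted(abs(v) for v in hidden_state)
    threshold = magnitudes[k - 1] if k > 0 else 0.0
    return [0.0 if abs(v) <= threshold else v for v in hidden_state]

x = [0.9, -0.05, 0.3, 0.01, -1.2, 0.02, 0.4, -0.08, 1.1, 0.03]
sparse_x = sparsify(x, target_sparsity=0.5)
zeros = sum(1 for v in sparse_x if v == 0.0)
print(zeros, sparse_x)   # half the entries are zeroed; large values survive
```

In TEAL the threshold is chosen per tensor offline from calibration data rather than re-sorted on every forward pass; the sketch above only shows which entries a 50% threshold would drop.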

Background

LLMs are known for their enormous size, which makes inference difficult, mainly because of the speed limit on transferring parameters from device memory to registers. Various techniques, such as quantization, weight sparsity, and speculative decoding, have been developed to address this 'memory wall'. Activation sparsity, which exploits zero values in the hidden state, is a less explored method that avoids transferring unnecessary weight channels during decoding.
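Why zero activations save memory traffic can be seen in a matrix-vector product: for y = Wx, any column of W paired with a zero entry of x contributes nothing, so that column never needs to leave device memory. A minimal sketch:

```python
# Illustrative sketch of why activation sparsity saves memory traffic:
# columns of W paired with zero activations can be skipped entirely.

def matvec_dense(W, x):
    return [sum(row[j] * x[j] for j in range(len(x))) for row in W]

def matvec_sparse(W, x):
    # Only touch columns where the activation is nonzero.
    active = [j for j, v in enumerate(x) if v != 0.0]
    return [sum(row[j] * x[j] for j in active) for row in W]

W = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]
x = [0.5, 0.0, -1.0]          # 1 of 3 activations is zero
print(matvec_sparse(W, x))    # matches the dense result, reading 2/3 of W
```

On real hardware the saving comes from a sparse-gather GEMV kernel that only loads the active weight columns from HBM; the Python above only demonstrates the equivalence.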

Older models like OPT-175B exhibit high activation sparsity, allowing significant speedups with methods like DejaVu. However, newer models like LLaMA have moved to SwiGLU variants, making these methods difficult to apply. Recent studies have attempted to ‘recover’ models that exhibit activation sparsity, but these models require extensive retraining on large datasets.

Motivation: Activation Distribution Characteristics of LLMs

Studies have shown that the hidden states of LLMs contain outliers, are zero-centered, and have similar distribution shapes across layers. Specifically, the states before the MLP and attention blocks are Gaussian-shaped, while the intermediate states are Laplacian-shaped. This suggests that many low-magnitude activations can be eliminated with negligible model degradation, a notion also observed in other studies such as CATS.
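These distributional shapes make the pruning threshold easy to reason about. As a hedged illustration (the scale value here is hypothetical, not from the paper): for a zero-centered Laplacian with scale b, the probability that |X| falls below a cutoff t is 1 - exp(-t/b), so the cutoff that zeroes a fraction p of entries is t = -b·ln(1 - p).

```python
import math

# Illustrative: for zero-centered Laplacian activations with scale b,
# the cutoff t that zeroes a fraction p of entries satisfies
#   P(|X| <= t) = 1 - exp(-t / b)   =>   t = -b * ln(1 - p)

def laplace_threshold(b, p):
    """Magnitude cutoff achieving sparsity p for a Laplace(0, b) tensor."""
    return -b * math.log(1.0 - p)

t = laplace_threshold(b=1.0, p=0.5)
print(round(t, 4))   # ~0.6931: half the mass lies within |x| <= ln(2)
```

This is why a single calibrated threshold per tensor suffices: the distribution shape is stable across layers, so the sparsity level it induces is predictable.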

TEAL

TEAL optimizes the model by sparsifying every tensor, achieving near-zero degradation at 25% sparsity and minimal degradation at 40% sparsity. At 50% sparsity, the Llama-3 variants show slightly more degradation than earlier models such as Llama-2 and Mistral. TEAL outperforms CATS by sparsifying all tensors rather than a subset, and by sparsifying the input, which yields lower error.

Hardware-Aware Speedups

To benchmark real-world speedups, TEAL is integrated with GPT-Fast, achieving significant speedups of up to 1.53x and 1.8x at 40% and 50% sparsity, respectively. The kernel is faster than cuBLAS even at 0% sparsity, though there is still room for further optimization.
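A back-of-envelope check (not a benchmark) puts these numbers in context: if decoding is fully memory-bound, skipping a fraction s of weight reads gives an ideal speedup of 1 / (1 - s). The reported figures sit below these ceilings, reflecting kernel and non-weight overheads.

```python
# Back-of-envelope: ideal memory-bound speedup vs. TEAL's reported numbers.

def ideal_speedup(sparsity):
    """Upper bound on decode speedup if weight reads dominate runtime."""
    return 1.0 / (1.0 - sparsity)

for s, measured in [(0.40, 1.53), (0.50, 1.8)]:
    print(f"sparsity={s:.0%}: ideal {ideal_speedup(s):.2f}x, "
          f"measured {measured}x")
```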

Compatibility with Quantization

TEAL also demonstrates compatibility with quantization, another technique for efficient LLM inference. Combining activation sparsity with quantization opens up a new regime for transferring weights from GPU memory to registers, leading to faster inference speeds.
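The reductions multiply because they act on independent factors of the memory traffic: quantization shrinks the bytes per weight, while activation sparsity shrinks the fraction of weights read at all. The figures below are hypothetical arithmetic, not results from the TEAL work:

```python
# Illustrative arithmetic: quantization and activation sparsity multiply.
# Parameter count and bit widths below are hypothetical examples.

def bytes_moved(n_params, bits_per_weight, sparsity):
    """Bytes of weights transferred per decode step, skipping the
    `sparsity` fraction of weight channels."""
    return n_params * (bits_per_weight / 8) * (1.0 - sparsity)

base = bytes_moved(8e9, 16, 0.0)    # fp16, dense
combo = bytes_moved(8e9, 4, 0.5)    # int4 + 50% activation sparsity
print(base / combo)                 # 8x less weight traffic
```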

Applications

The most immediate application of TEAL is to accelerate inference in resource-constrained edge settings, especially in single-batch scenarios. It also enables inference providers like Together AI, which hosts over 100 open-source models on large fleets of GPUs, to serve their models more efficiently.

Image source: Shutterstock

