BLOCKCHAIN NEWS

StripedHyena-7B: Next-generation AI architecture for improved performance and efficiency

By Crypto Flexs | January 4, 2024 | 2 Mins Read

Recent advances in AI have been greatly influenced by the Transformer architecture, a key component of large models across fields as diverse as language, vision, audio, and biology. However, the quadratic complexity of the Transformer’s attention mechanism limits its ability to process long sequences; even sophisticated models such as GPT-4 are constrained by it.
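
To make the scaling issue concrete, the back-of-envelope sketch below (an illustration, not from the article; the 4096 hidden size is an assumed value typical of 7B-class models) shows how the cost of forming the attention score matrix grows with context length.

```python
# Back-of-envelope sketch (illustrative only): self-attention compares every token
# with every other token, so the cost of the score matrix grows quadratically with
# the sequence length L. The hidden size of 4096 is an assumed, typical value for a
# 7B-class model, not a figure from the article.

def attention_score_ops(seq_len: int, d_model: int = 4096) -> int:
    """Rough multiply-add count for the L x L attention score matrix of one layer."""
    return seq_len * seq_len * d_model

for L in (4_000, 32_000, 128_000):
    print(f"L = {L:>7,}: ~{attention_score_ops(L):.3e} multiply-adds per layer")

# Going from 4k to 128k tokens (32x longer input) multiplies this term by ~1000x,
# which is the long-sequence bottleneck described above.
```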

Breakthrough Advances with StripedHyena

To address these issues, Together Research recently open-sourced StripedHyena, a language model built on a new architecture optimized for long contexts. StripedHyena can handle sequences of up to 128,000 tokens and has demonstrated improvements over the Transformer architecture in both training and inference performance. It is the first alternative architecture to be competitive with the best open-source Transformer models on both short and long contexts.

StripedHyena’s Hybrid Architecture

StripedHyena incorporates a hybrid architecture that combines multi-head, grouped-query attention with gated convolutions arranged in Hyena blocks, a design that departs from traditional decoder-only Transformer models. During generation, the convolutions are represented as state-space models or truncated filters, allowing the Hyena blocks to decode with a constant-size memory. As a result, the architecture offers lower latency, faster decoding, and higher throughput than Transformers.
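
As a rough illustration of the constant-memory decoding idea, the toy NumPy sketch below (an assumed, simplified example, not Together’s code) shows that a causal convolution whose filter is generated by a diagonal state-space model can be evaluated recurrently while carrying only a small fixed-size state.

```python
# Minimal sketch (assumed toy parameterization, not StripedHyena's actual filters):
# a long causal convolution whose filter comes from a state-space model (SSM) can be
# decoded recurrently with a fixed-size state instead of re-reading the whole past.
import numpy as np

rng = np.random.default_rng(0)
d_state, seq_len = 16, 512            # toy sizes; real Hyena filters are much richer

# Diagonal SSM parameters
A = rng.uniform(0.5, 0.99, d_state)   # per-channel decay, kept stable (|A| < 1)
B = rng.normal(size=d_state)
C = rng.normal(size=d_state)
x = rng.normal(size=seq_len)          # 1-D signal standing in for one channel

# View 1: explicit causal convolution with the filter k[n] = sum(C * A**n * B)
k = np.array([np.sum(C * (A ** n) * B) for n in range(seq_len)])
y_conv = np.array([np.sum(k[: t + 1][::-1] * x[: t + 1]) for t in range(seq_len)])

# View 2: recurrent decoding -- only a d_state-sized vector is carried forward
h = np.zeros(d_state)
y_rec = np.empty(seq_len)
for t in range(seq_len):
    h = A * h + B * x[t]              # constant-memory state update
    y_rec[t] = np.sum(C * h)

assert np.allclose(y_conv, y_rec, atol=1e-6)
print("convolutional and recurrent views match; decoding state shape:", h.shape)
```

In a full model the filters, state sizes, and parameterization are far richer, but the trade-off is the same: the convolutional view is convenient for training, while the recurrent view keeps per-token decoding memory constant.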

Improved training and efficiency

In end-to-end training on sequences of 32k, 64k, and 128k tokens, StripedHyena is more than 30%, 50%, and 100% faster, respectively, than existing Transformers. In terms of memory efficiency, it reduces memory usage during autoregressive generation by more than 50% compared to Transformers.
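
For intuition on where the generation-time savings come from, the rough estimate below uses assumed, typical 7B-class decoder dimensions (32 layers, 8 grouped-query KV heads of size 128, fp16); none of these numbers come from the article or from StripedHyena itself.

```python
# Rough estimate with assumed, typical 7B-class dimensions -- not figures from the
# article: a Transformer's key-value cache grows linearly with context length during
# autoregressive generation.

def kv_cache_bytes(seq_len: int, n_layers: int = 32, n_kv_heads: int = 8,
                   head_dim: int = 128, bytes_per_value: int = 2) -> int:
    """Key + value cache size for one sequence in a Transformer decoder (fp16)."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_value

for L in (32_000, 64_000, 128_000):
    print(f"context {L:>7,} tokens -> ~{kv_cache_bytes(L) / 2**30:.1f} GiB of KV cache")

# Convolution/state-space layers instead carry a fixed-size state per layer, so their
# decoding memory stays flat as the context grows -- the source of the savings the
# article describes.
```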

Comparative performance against attention-based models

StripedHyena significantly narrows the quality gap with attention at scale, reducing computational cost while achieving similar perplexity and downstream performance without relying on attention alone.

Applications beyond language processing

StripedHyena’s versatility extends to image recognition. The researchers tested substituting attention in Vision Transformers (ViT) and observed comparable accuracy on an image classification task using the ImageNet-1k dataset.

StripedHyena represents an important advance in AI architecture, providing a more efficient alternative to Transformer models, especially for processing long sequences. Its hybrid design and improved training and inference performance make it a promising tool for a wide range of applications in language and vision processing.

Image source: Shutterstock
