ETHEREUM NEWS

Alibaba’s new Qwen2 AI model, Meta challenges OpenAI

By Crypto Flexs | June 8, 2024 | 4 Mins Read

Chinese e-commerce giant Alibaba is a major player in China’s AI sector. Today, the company announced the release of its latest AI model, Qwen2, which by some measures is currently the best open source option available.

Qwen2, developed by Alibaba Cloud, is the next generation of the company’s Tongyi Qianwen (Qwen) model series, which includes the Tongyi Qianwen LLM (also known as Qwen), the vision model Qwen-VL, and the audio model Qwen-Audio.

The Qwen model family is pre-trained on multilingual data covering a variety of industries and domains, and Qwen-72B is the most powerful model in the series, trained on an impressive 3 trillion tokens of data. By comparison, Meta’s most powerful Llama-2 variant was trained on 2 trillion tokens, while Llama 3 is in the process of digesting 15 trillion tokens.

According to a recent blog post from the Qwen team, Qwen2 can handle 128K tokens of context, comparable to OpenAI’s GPT-4o. Meanwhile, Qwen2 outperforms Meta’s Llama 3 on essentially all of the most important synthetic benchmarks, making it the best open source model currently available, the team claims.

However, it is worth noting that the independent Elo Arena ranks Qwen2-72B-Instruct slightly higher than GPT-4-0314 but lower than Llama 3 70B and GPT-4-0125-preview, making it the second most preferred open source LLM among human testers so far.

Qwen2 outperforms Llama3, Mixtral, and Qwen1.5 on synthetic benchmarks. Image: Alibaba Cloud

Qwen2 is available in five sizes ranging from 500 million to 72 billion parameters, and this release offers significant improvements in a variety of specialized areas. Additionally, the models were trained on data in 27 additional languages beyond English and Chinese, including German, French, Spanish, Italian, and Russian.

“Compared with state-of-the-art open source language models, including the previously released Qwen1.5, Qwen2 generally outperformed most open source models and was competitive against proprietary models on a set of benchmarks targeting language understanding, language generation, multilingual capabilities, coding, mathematics, and reasoning,” the Qwen team claimed on the model’s official HuggingFace page.

The Qwen2 models also show an impressive understanding of long context. Qwen2-72B-Instruct can handle information extraction tasks anywhere within a huge context without errors and passes the “needle in a haystack” test almost perfectly. This is important because model performance traditionally begins to degrade the longer an interaction runs.

Qwen2 performed well in the “Needle in a Haystack” test. Image: Alibaba Cloud
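For intuition, here is a minimal sketch of how such a probe can be set up. It is purely illustrative and not Alibaba’s actual evaluation harness: the filler text, the “needle” sentence, and the helper function are all invented for demonstration, and the resulting prompt would simply be sent to whichever model is being tested.

```python
# Illustrative "needle in a haystack" probe (not Alibaba's harness).
# A known fact (the needle) is buried at varying depths inside long filler
# text, and the model is asked to retrieve it from the full context.

def build_haystack_prompt(needle: str, filler: str, depth: float, total_chars: int) -> str:
    """Bury `needle` at a relative `depth` (0.0 = start, 1.0 = end) of the filler text."""
    haystack = (filler * (total_chars // len(filler) + 1))[:total_chars]
    cut = int(len(haystack) * depth)
    return (
        haystack[:cut] + " " + needle + " " + haystack[cut:]
        + "\n\nQuestion: What is the secret passphrase mentioned in the text above?"
    )

filler = "The market opened quietly and traders watched the charts scroll by. "
needle = "The secret passphrase is 'blue-lantern-42'."

for depth in (0.0, 0.25, 0.5, 0.75, 1.0):
    prompt = build_haystack_prompt(needle, filler, depth, total_chars=50_000)
    # Send `prompt` to the model under test and check whether its reply
    # contains 'blue-lantern-42'; repeat across depths and context lengths.
    print(f"depth={depth:.2f} -> prompt is {len(prompt):,} characters long")
```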

With this release, the Qwen team has also changed the licensing of the models. Qwen2-72B and its instruction-tuned variant continue to use the original Qianwen license, but all other models have adopted Apache 2.0, a standard license in the open source software world.

Alibaba Cloud said on its official blog, “In the near future, we will continue to introduce new open source models to accelerate open source AI.”

Decrypt’s testing of the model showed that it has a significant ability to understand tasks in multiple languages. The models are censored, however, especially on topics considered sensitive in China. This seems consistent with Alibaba’s claim that Qwen2 is the model least likely to deliver unsafe outputs, including illegal activity, fraud, pornography, and privacy abuse, regardless of the language used.

Qwen2’s answer to “Is Taiwan a country?”
ChatGPT’s answer: “Is Taiwan a country?”

The model also shows a better understanding of system prompts, meaning the conditions set there have a greater impact on its answers. For example, there was a big difference in its answers when it was asked to act as an assistant with legal knowledge versus as a knowledgeable lawyer who always responds based on the law. It provided advice similar to GPT-4o’s, but more concise; a sketch of how such a comparison is set up follows the screenshots below.

Qwen2’s answer to “My neighbor insulted me”
ChatGPT’s response: “My neighbor insulted me.”
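As a rough sketch of how that kind of system-prompt comparison can be reproduced with the Hugging Face transformers chat template (only the tokenizer configuration is downloaded, not the model weights), something like the following works. The two personas and the question are illustrative stand-ins, not the exact prompts used in the original test.

```python
# Sketch: the same user question wrapped with two different system prompts.
# Personas and wording are illustrative; requires `pip install transformers`.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2-7B-Instruct")

personas = [
    "You are a helpful assistant with general legal knowledge.",
    "You are a knowledgeable lawyer who always responds strictly based on the law.",
]

for system_prompt in personas:
    messages = [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "My neighbor insulted me. What can I do?"},
    ]
    # The system role changes how the request is framed, which is what
    # drives the difference in the model's answers.
    print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
    print("-" * 40)
```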

The next model upgrade will give the Qwen2 LLM multimodal capabilities, unifying the entire family into one powerful model, the team said. “Furthermore, we can extend the Qwen2 language model to be multimodal to understand both visual and audio information,” the team added.

Qwen2 is available for online testing through HuggingFace Spaces, and those with enough computing power to run the models locally can also download the weights for free from HuggingFace.
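For local use, a minimal sketch of loading the instruction-tuned 7B checkpoint with the transformers library looks roughly like this, assuming a GPU with enough memory and `transformers` plus `accelerate` installed; the smaller Qwen2 checkpoints work the same way, and the prompt is just an example.

```python
# Minimal local-inference sketch for Qwen2-7B-Instruct via transformers.
# Assumes a GPU with enough memory; device_map="auto" needs `accelerate`.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2-7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize what a 128K-token context window allows in one sentence."},
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer([text], return_tensors="pt").to(model.device)

output_ids = model.generate(**inputs, max_new_tokens=128)
# Strip the prompt tokens and decode only the newly generated reply.
reply = tokenizer.decode(output_ids[0][inputs.input_ids.shape[1]:], skip_special_tokens=True)
print(reply)
```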

The Qwen2 models can be a great alternative for those willing to invest in open source AI. They offer a larger token context window than most other models, beating even Meta’s Llama 3. Additionally, the permissive licensing lets others share fine-tuned versions that improve on the base models, further boosting scores and mitigating bias.

Edited by Ryan Ozawa.

