Qwen-72B Model Tops Hugging Face Open-Source Large Model Pre-training Leaderboard

Posted in AI Insights by baoshi.rao
    The open-source large model community Hugging Face has released its latest open-source large model rankings, with Qwen (Tongyi Qianwen) standing out in the pre-trained model category and taking the top spot.

    The Hugging Face open-source large model leaderboard covers hundreds of top open-source large models worldwide, evaluating them across six dimensions, including reading comprehension, logical reasoning, mathematical computation, and factual question answering.


    Among these models, Qwen's 72-billion-parameter model performed exceptionally well, achieving a comprehensive score of 73.6, the highest among all pre-trained models.

    Tongyi Qianwen is a large-scale language model launched by Alibaba Cloud, featuring capabilities such as multi-turn dialogue, copywriting, logical reasoning, multimodal understanding, and multilingual support. On December 1st, Alibaba Cloud announced the open-sourcing of its 72-billion-parameter Tongyi Qianwen model.

    Experience address: https://qianwen.aliyun.com/
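
    For readers who would rather try the open weights programmatically than through the hosted demo, here is a minimal sketch using the Hugging Face `transformers` library. The repo id `Qwen/Qwen-72B` and the `trust_remote_code` requirement are assumptions based on how Qwen checkpoints are typically published on the Hub; running the full 72B model requires multiple high-memory GPUs.

    ```python
    # Hypothetical sketch: loading the open-sourced Qwen-72B from the
    # Hugging Face Hub. Repo id and trust_remote_code flag are assumptions.
    def load_qwen(repo_id: str = "Qwen/Qwen-72B"):
        """Return (tokenizer, model) for a Qwen checkpoint on the Hub."""
        # Imports deferred so the sketch can be read without transformers installed.
        from transformers import AutoModelForCausalLM, AutoTokenizer

        tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
        model = AutoModelForCausalLM.from_pretrained(
            repo_id,
            device_map="auto",       # shard weights across available GPUs
            trust_remote_code=True,  # Qwen repos ship custom modeling code
        )
        return tokenizer, model
    ```

    Note that calling `load_qwen()` downloads on the order of 140 GB of weights, so for a quick test the hosted demo linked above is the more practical route.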
