R&D Cost $10 Billion! NVIDIA's Latest AI Chip GB200 Priced Over $30,000

Posted in AI Insights (ai-articles) by baoshi.rao
NVIDIA introduced its new-generation Blackwell GPU platform at the GTC 2024 conference. The first chip, named GB200, will be released this year; it combines two B200 Blackwell GPUs with an Arm-based Grace CPU.

According to NVIDIA CEO Jensen Huang, the chip is priced between $30,000 (approximately ¥220,000) and $40,000.

He estimates that NVIDIA spent approximately $10 billion (around ¥72 billion) on research and development. This price range is similar to that of the previous-generation H100 (Hopper), but the GB200 brings significant performance improvements.

According to reports, the GB200 is built on TSMC's 4-nanometer (4NP) process, with its dies linked by a 10 TB/s NVLink 5.0 interconnect.

The GB200 contains 208 billion transistors, compared with only 80 billion in the previous-generation H100. For AI workloads, the GB200 delivers 20 petaflops, versus 4 petaflops for the H100.
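As a quick sanity check, the generational jump implied by these figures can be computed directly. This is a minimal sketch using only the numbers quoted above (transistor counts in billions, AI performance in petaflops):

```python
# Figures quoted in the article for the H100 and GB200.
h100 = {"transistors_b": 80, "ai_pflops": 4}
gb200 = {"transistors_b": 208, "ai_pflops": 20}

# Generation-over-generation ratios.
transistor_ratio = gb200["transistors_b"] / h100["transistors_b"]
perf_ratio = gb200["ai_pflops"] / h100["ai_pflops"]

print(f"Transistor count: {transistor_ratio:.1f}x")  # 2.6x
print(f"AI performance:   {perf_ratio:.1f}x")        # 5.0x
```

So by these quoted numbers, the GB200 offers roughly 2.6x the transistors and 5x the AI throughput of the H100, at a similar price point.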
