New AI Framework H2O: Real-time Conversion of Human Actions to Robot Movements - Walking, Backflips, Kicking, etc.

ai-articles

baoshi.rao wrote:
    In the latest research, a team from Carnegie Mellon University has developed a framework called H2O (Human to Humanoid), which achieves real-time full-body remote control of humanoid robots through reinforcement learning.

    This framework enables a humanoid robot, using only an RGB camera, to imitate and perform a wide range of dynamic full-body movements in real time, including walking, backflips, kicking, turning, waving, pushing, and boxing. To make this possible, the team proposed a scalable 'simulation-to-reality' pipeline for constructing a large-scale human motion dataset that supplies training samples for real-time humanoid teleoperation. In this process, privileged imitation filters out infeasible motions, and the humanoid body model is optimized so the robot replicates human movements with high fidelity.

    The framework consists of three key stages. First, the SMPL (Skinned Multi-Person Linear) body model is aligned with the humanoid robot's structure by optimizing shape and motion parameters, yielding a foundational motion dataset. Second, a privileged imitation policy is trained and used to remove abnormal or infeasible motions, producing a refined, physically realistic dataset. Finally, a real-time imitation policy trained in simulation achieves zero-shot teleoperation of the physical humanoid robot.
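    The first two stages above can be sketched as a retarget-then-filter data pipeline. This is a minimal, hypothetical illustration only: all function names and the difficulty threshold are made up, and a simple feasibility check stands in for the RL-trained privileged imitation policy that the actual H2O work uses.

    ```python
    # Hypothetical sketch of H2O's dataset pipeline (stages 1-2).
    # Names and thresholds are illustrative, not from the paper's code.

    def retarget(motion):
        """Stage 1: map a human (SMPL) motion onto the robot's skeleton.
        Placeholder: real retargeting optimizes shape/motion parameters."""
        return {"name": motion["name"], "difficulty": motion["difficulty"]}

    def is_feasible(motion, max_difficulty=0.8):
        """Stage 2: in H2O, a privileged policy tries to track each motion in
        simulation and untrackable clips are dropped; here a difficulty
        threshold stands in for that tracking test."""
        return motion["difficulty"] <= max_difficulty

    def build_clean_dataset(human_motions):
        """Retarget every clip, then keep only the feasible ones."""
        return [m for m in map(retarget, human_motions) if is_feasible(m)]

    raw = [
        {"name": "walk", "difficulty": 0.2},
        {"name": "backflip", "difficulty": 0.7},
        {"name": "impossible_contortion", "difficulty": 0.95},
    ]
    clean = build_clean_dataset(raw)
    print([m["name"] for m in clean])  # the infeasible clip is removed
    ```

    The design point is that filtering happens in simulation with privileged information (full state access), so the final real-time policy is only ever trained on motions the robot can physically track.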

    In real-time teleoperation, an RGB camera and a pose estimator capture the operator's movements, which the trained imitation policy then reproduces on the humanoid robot instantly. The research demonstrates learning-based real-time teleoperation of full-body movements in real-world scenarios such as kicking a ball with either foot, handing over boxes, walking forward, jumping backward, and boxing. The authors describe this as the first learning-based framework to achieve real-time whole-body humanoid teleoperation, providing strong support for new human-machine interaction and collaboration scenarios.
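    The teleoperation loop described above (camera frame → pose estimate → imitation policy → joint targets) can be sketched as follows. Everything here is a stand-in: the function names are hypothetical, and trivial arithmetic replaces the real pose estimator and RL policy.

    ```python
    # Hypothetical sketch of the real-time teleoperation loop.
    # All names are illustrative; real components are neural networks.

    def estimate_pose(frame):
        """Stand-in for an RGB pose estimator; returns body keypoints."""
        return {"keypoints": frame["pixels"]}  # placeholder passthrough

    def imitation_policy(pose, robot_state):
        """Stand-in for the trained policy that maps the estimated pose
        (plus the robot's own state) to joint position targets."""
        return [k * 0.5 for k in pose["keypoints"]]  # placeholder mapping

    def teleop_step(frame, robot_state):
        """One control tick: perceive the human, then act on the robot."""
        pose = estimate_pose(frame)
        return imitation_policy(pose, robot_state)

    # One simulated control tick with dummy camera data.
    targets = teleop_step({"pixels": [2, 4]}, robot_state=None)
    print(targets)
    ```

    Because this loop runs every control tick, the latency of the pose estimator and policy inference bounds how responsive the robot feels to the operator.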

    Project entry: https://human2humanoid.com/
