news 2026-04-15 · NVIDIA Newsroom

What if robots could understand spoken commands, see the world around them, and carry out complex tasks on their own... how much would the world change?


Ever asked someone to grab something off a table? They just do it, with no need to explain how to move their arm or how tightly to grip, because the human brain figures it out. Robots couldn't do that... until now.


NVIDIA just unveiled two breakthroughs that will reshape robotics:

1. GR00T N1.7 — a robotic "brain" that understands human language, perceives its surroundings, and takes action. It's ready for real-world deployment now.

2. Newton 1.0 — an open-source physics simulator (co-built with Google DeepMind and Disney) that lets robots train in virtual worlds before performing in the real one.

Both are already backed by 110 partner companies worldwide, from ABB and FANUC to Universal Robots and Figure.


🎯 Why this matters:


It's like a new employee you can simply tell, "organize the shelf by size," and they just do it. That's what GR00T gives robots.

Imagine factories where robots understand commands instantly, restaurants where robots serve customers, farms where robots distinguish weeds from crops. All of this is becoming real.


Robots aren't science fiction anymore — they're stepping into the real world as actual coworkers.

📄 Source

NVIDIA Newsroom