Boston Dynamics & Google DeepMind: Spot Robot Now Reads Handwritten Notes and Follows Natural Language Commands
April 18, 2026 — Boston Dynamics has integrated Google DeepMind's Gemini Robotics-ER 1.6 model into its Spot quadruped robot, transforming it from a scripted patrol unit into an intelligent assistant that understands natural language and physical context.
Key Capabilities:
- Natural Language Tasking: Spot now interprets handwritten instructions like "put away the shoes" and autonomously plans actions
- Physical Understanding: The robot reads its environment through cameras, interprets visual context, and executes multi-step tasks
- Industrial Autonomy: No human operator needed — Spot detects, classifies, and reports anomalies independently
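The capabilities above describe an instruction being turned into an ordered action plan. As a minimal sketch of that idea only: in the real system a vision-language model (Gemini Robotics-ER) produces the plan from camera input and the note's text, while here a toy keyword mapper stands in. All function names and plan steps are illustrative assumptions, not the Boston Dynamics or DeepMind API.

```python
# Hypothetical sketch: map a natural-language instruction to an ordered
# action plan. A toy keyword rule stands in for the vision-language model.
def plan_task(instruction: str) -> list[str]:
    text = instruction.lower()
    if "put away" in text:
        # Everything after "put away" is treated as the target object.
        obj = text.split("put away", 1)[1].strip(" .")
        return [
            f"locate {obj}",
            f"grasp {obj}",
            f"carry {obj} to storage",
            f"release {obj}",
        ]
    # Unknown instructions fall back to asking for help.
    return ["request clarification"]
```

For example, `plan_task("Put away the shoes.")` yields the four-step plan `["locate the shoes", "grasp the shoes", "carry the shoes to storage", "release the shoes"]`.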
Real-World Deployment: The upgrade targets industrial environments where Spot already operates: factories, petrochemical facilities, and data centers. The robot now patrols autonomously, reads pressure gauges, and detects water leaks, thermal anomalies, and open doors without pre-programmed scripts.
Partnership Background: The Boston Dynamics & Google DeepMind partnership began in early 2025. Gemini Robotics-ER 1.6 is its first public result, with plans to improve the model's physical-world understanding through new training scenarios.