Ping-pong robot beats top-level human players

· hardware

TLDR

  • Sony AI’s robot “Ace” defeated top-level table tennis professionals; the peer-reviewed result in Nature marks a rapid SOTA leap for physical robotics.

Key Takeaways

  • The robot is built by Sony AI and named “Ace”; results are peer-reviewed in Nature (2026).
  • One year earlier, Google DeepMind’s SOTA table tennis robot could beat only casual amateurs, not people who actually play the sport.
  • The pace of improvement from amateur-level to professional-beating in one year mirrors the trajectory seen in coding AI, not the slow grind expected from physical robotics.
  • Based on the paper’s methodology, the setup appears to require bright, controlled lighting and specialized sensor rigs.

Hacker News Comment Review

  • The dominant reaction is surprise at the speed of physical robotics progress after a decade of Boston Dynamics demos that never materialized into real capability; multiple commenters invoke the Deep Blue/Kasparov moment as the closest analogy.
  • A structural fairness debate emerged: human players read an opponent’s body kinematics to predict spin and trajectory, so a robot with no readable human movement gives pros less to work with, making the win harder to interpret.
  • Some commenters set a higher bar, arguing the result only becomes meaningful when the robot faces the same kinematic constraints as a human body rather than purpose-built actuator geometry.

Notable Comments

  • @dmurray: Documents the one-year gap between DeepMind’s amateur-level SOTA and Ace, framing it as an anomalously fast jump for physical robotics.
  • @halfnhalf: Raises the opponent-readability issue: pros train to predict shots from human body cues, which a robot opponent does not provide.
  • @janalsncm: Links the Nature paper and notes the bright lighting requirement, suggesting controlled lab conditions matter to the result.

Original | Discuss on HN