Eka, a Cambridge startup founded by an MIT professor and a former DeepMind researcher, has demoed a robotic gripper that handles novel objects with fluid, recovery-aware dexterity unlike anything currently on the market.
Key Takeaways
Eka uses a proprietary vision-force-action model trained in physics-accurate simulation that incorporates mass, inertia, and touch sensing, rather than the human demonstration videos used by competing vision-language-action (VLA) approaches.
The sim-to-real transfer gap, the problem that famously derailed OpenAI’s Dactyl project, is Eka’s claimed core differentiator; the founders say their simulation fidelity closes it more reliably than rivals can.
Demos include screwing in a light bulb, handling a jumble of keys with a plush fob, and placing chicken nuggets into containers on a moving conveyor, improvising short tosses when needed.
Founders Pulkit Agrawal and Tuomas Haarnoja compare the system's current stage to GPT-1 before ChatGPT: a nascent but general physical intelligence they expect to scale.
The target is superhuman dexterity, with fine manipulation tasks such as iPhone assembly cited as the long-term benchmark.
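For readers unfamiliar with the sim-to-real gap mentioned above: the standard mitigation, used notably in OpenAI's Dactyl, is domain randomization, where physics parameters are resampled every training episode so the policy cannot overfit to one simulator configuration. The sketch below is a generic, minimal illustration of that idea only; Eka's actual simulation pipeline is proprietary, and every name and parameter range here is hypothetical.

```python
import random
from dataclasses import dataclass

# Hypothetical illustration of domain randomization, a common
# sim-to-real technique. This is NOT Eka's published method;
# all names and ranges below are invented for illustration.

@dataclass
class PhysicsParams:
    mass_kg: float       # object mass
    friction: float      # surface friction coefficient
    sensor_noise: float  # touch-sensor noise std-dev

def randomize_params(rng: random.Random) -> PhysicsParams:
    """Sample a fresh physics configuration at each episode reset,
    so a policy trained across many samples transfers more robustly
    to the (unknown) real-world parameters."""
    return PhysicsParams(
        mass_kg=rng.uniform(0.05, 0.5),
        friction=rng.uniform(0.3, 1.2),
        sensor_noise=rng.uniform(0.0, 0.02),
    )

rng = random.Random(0)  # seeded for reproducibility
episodes = [randomize_params(rng) for _ in range(3)]
for p in episodes:
    print(f"mass={p.mass_kg:.3f} kg, friction={p.friction:.2f}")
```

In a real training loop each sampled `PhysicsParams` would configure the simulator for one episode; the claim in the takeaways is that Eka goes further, relying on simulation fidelity rather than randomization breadth alone.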