Inside Google’s 7-Year Mission to Give AI a Robot Body


Often during evenings and sometimes weekends, when the robots weren’t busy doing their daily chores, Catie and her impromptu team would gather a dozen or so robots in a large atrium in the middle of X. Flocks of robots began moving together, at times haltingly, yet always in interesting patterns, with what often felt like curiosity and sometimes even grace and beauty. Tom Engbersen, a roboticist from the Netherlands who painted replicas of classic masterpieces in his spare time, began a side project with Catie exploring how dancing robots might respond to music or even play an instrument. At one point he had a novel idea: What if the robots became instruments themselves? This kicked off an exploration in which each joint on the robot played a sound when it moved. When the base moved it played a bass sound; when a gripper opened and closed it made a bell sound. When we turned on music mode, the robots created unique orchestral scores every time they moved. Whether they were traveling down a hallway, sorting trash, cleaning tables, or “dancing” as a flock, the robots moved and sounded like a new type of approachable creature, unlike anything I had ever experienced.

This Is Only the Beginning

In late 2022, the end-to-end versus hybrid conversations were still going strong. Peter and his teammates, with our colleagues in Google Brain, had been working on applying reinforcement learning, imitation learning, and transformers—the architecture behind LLMs—to several robot tasks. They were making good progress on showing that robots could learn tasks in ways that made them general, robust, and resilient. Meanwhile, the applications team led by Benjie was working on taking AI models and using them with traditional programming to prototype and build robot services that could be deployed among people in real-world settings.

Meanwhile, Project Starling, as Catie’s multi-robot installation ended up being called, was changing how I felt about these machines. I noticed how people were drawn to the robots with wonder, joy, and curiosity. It helped me understand that how robots move among us, and what they sound like, will trigger deep human emotion; it will be a big factor in how, and even whether, we welcome them into our everyday lives.

We were, in other words, on the cusp of truly capitalizing on the biggest bet we had made: robots powered by AI. AI was giving them the ability to understand what they heard (spoken and written language) and translate it into actions, or understand what they saw (camera images) and translate that into scenes and objects that they could act on. And as Peter’s team had demonstrated, robots had learned to pick up objects. After more than seven years we were deploying fleets of robots across multiple Google buildings. A single type of robot was performing a range of services: autonomously wiping tables in cafeterias, inspecting conference rooms, sorting trash, and more.

Which was when, in January 2023, two months after OpenAI introduced ChatGPT, Google shut down Everyday Robots, citing overall cost concerns. The robots and a small number of people eventually landed at Google DeepMind to conduct research. Despite the high cost and the long timeline, everyone involved was shocked.

A National Imperative

In 1970, for every person over 64 in the world, there were 10 people of working age. By 2050, there will likely be fewer than four. We’re running out of workers. Who will care for the elderly? Who will work in factories, hospitals, restaurants? Who will drive trucks and taxis? Countries like Japan, China, and South Korea understand the immediacy of this problem. There, robots are not optional. Those nations have made it a national imperative to invest in robotics technologies.

Giving AI a body in the real world is both an issue of national security and an enormous economic opportunity. If a technology company like Google decides it cannot invest in “moonshot” efforts like the AI-powered robots that will complement and supplement the workers of the future, then who will? Will Silicon Valley or other startup ecosystems step up, and if so, will there be access to patient, long-term capital? I have doubts. The reason we called Everyday Robots a moonshot is that building highly complex systems at this scale went way beyond what venture-capital-funded startups have historically had the patience for. While the US is ahead in AI, building the physical manifestation of it—robots—requires skills and infrastructure in which other nations, most notably China, are already leading.

The robots did not show up in time to help my mother. She passed away in early 2021. Our frequent conversations toward the end of her life convinced me more than ever that a future version of what we started at Everyday Robots will be coming. In fact, it can’t come soon enough. So the question we are left to ponder becomes: How does this kind of change and future happen? I remain curious, and concerned.

