The Remote-Controlled Future of Household Robots: A $20,000 Glimpse of Distant Autonomy
Despite ambitious promises of fully automated homes, the reality of household robotics remains firmly tethered to human control. A new entrant, the Neo robot from Silicon Valley company 1x, exemplifies this paradox: a $20,000 humanoid now available for pre-order in Switzerland and Germany, yet reliant on remote operation by human employees. This raises critical questions about the path to true robotic autonomy and the value of current research investments.
The allure is strong. 1x pitches Neo not just as a domestic helper, but as a “social companion” capable of engaging in conversations and providing amusement. Customers can alternatively rent the robot for $499 a month. This offering fuels hopes that artificial intelligence is on the cusp of delivering a versatile, all-purpose robot to consumers. However, the current iteration of Neo is, as one expert put it, “as useful as a mixer without a blade.”
The fundamental challenge lies in replicating common sense. Tasks that are trivial for humans – distinguishing a sock from a ball, understanding the danger of placing a hot pan on plastic – remain significant hurdles for robots. Nor do robots have a nuanced sense of force: they don’t know how hard they can grip a wine glass, or that placing a tomato in a vegetable crate calls for a different force than hauling a pumpkin.
For now, Neo operates through teleoperation. A 1x employee controls the robot’s movements remotely over the internet, viewing the customer’s home through the robot’s onboard cameras. This arrangement, while enabling functionality, introduces significant privacy concerns. Potential buyers must accept that dozens, or even hundreds, of individuals could have access to footage of their homes, with 1x retaining both video and robot data for future development. According to a statement from 1x CEO Bernt Bornich to the Wall Street Journal, this level of access “must be ok for you for the product to be useful.” The company has not disclosed pre-order numbers, and a request for comment from the NZZ remained unanswered.
The primary justification for this remote-control model is data collection. 1x aims to replicate the success of generative AI models like ChatGPT, which achieved proficiency through massive datasets. By having humans demonstrate tasks, the company hopes to amass the data necessary to train an AI capable of autonomous operation. While the robot washes a wine glass under human direction, 1x records the force, movement, and sequence of actions, building a database for future AI development.
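A teleoperation log of this kind can be pictured as a stream of timestamped records pairing the operator’s commands with the robot’s sensor readings. The schema below is purely illustrative – 1x has not published its data format, and the field names here are assumptions:

```python
from dataclasses import dataclass
import time

@dataclass
class DemoFrame:
    """One timestamped sample from a teleoperated demonstration (illustrative schema)."""
    t: float                  # seconds since the start of the episode
    joint_angles: list[float] # commanded joint positions, in radians
    gripper_force: float      # measured grip force, in newtons
    task_label: str           # e.g. "wash_wine_glass"

def record_episode(samples, task_label):
    """Collect operator commands and sensor readings into one demonstration episode."""
    episode = []
    start = time.monotonic()
    for joints, force in samples:
        episode.append(DemoFrame(time.monotonic() - start, joints, force, task_label))
    return episode

# Two samples of a hypothetical glass-washing demonstration:
episode = record_episode([([0.1, 0.4], 2.5), ([0.2, 0.5], 2.7)], "wash_wine_glass")
print(len(episode))  # 2
```

Thousands of such episodes, each labeled with the task being demonstrated, would form the training corpus described above.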
However, the scale of data required is daunting. Ken Goldberg, a robotics professor at the University of California, Berkeley, estimates that training a general-purpose robot would take roughly 100,000 years’ worth of demonstration data – a scale comparable to the accumulated human output behind the text used to train ChatGPT. Researchers have so far collected only about one year’s worth, meaning that, “at the current rate of data collection, a universal robot (…) would be available in about 100,000 years.”
While this timeline seems insurmountable, Goldberg suggests alternative approaches, such as leveraging data from 100,000 cleaners equipped with motion-tracking technology. Imitation learning, a technique in which robots learn by observing human movements, has shown promise in specific applications. However, simulations, while capable of producing impressive feats like robotic backflips (as demonstrated by Boston Dynamics), fall short when it comes to the complex manipulation of real-world objects. The fine motor skills of the human hand remain beyond the capabilities of current computer models, hindering tasks like picking up a single sheet of paper or gauging the ripeness of a tangerine.
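In its simplest form, imitation learning is supervised learning from (observation, action) pairs. The toy policy below just memorizes a handful of human demonstrations and replays the action from the most similar recorded state – a minimal sketch of the idea, using the article’s wine-glass/tomato/pumpkin examples, not any lab’s actual method:

```python
import math

# Toy imitation learning (nearest-neighbor behavior cloning): the robot copies
# whatever the human demonstrator did in the most similar previously seen state.
demonstrations = [
    # (observed object width in cm,), grip force the human applied in newtons
    ((2.0,), 1.5),    # stem of a wine glass: very gentle
    ((8.0,), 3.0),    # tomato: firmer, but still careful
    ((25.0,), 12.0),  # pumpkin: strong grip
]

def nearest_neighbor_policy(state):
    """Return the demonstrated grip force whose recorded state is closest to `state`."""
    return min(demonstrations, key=lambda pair: math.dist(pair[0], state))[1]

print(nearest_neighbor_policy((7.5,)))   # closest demo is the tomato -> 3.0
print(nearest_neighbor_policy((22.0,)))  # closest demo is the pumpkin -> 12.0
```

Real systems replace the lookup with a learned model over camera images and joint states, which is exactly why they need the enormous demonstration datasets discussed above.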
The debate extends beyond data acquisition. Aude Billard, a robotics professor at EPFL, argues that current research is “burning millions of francs on the wrong research,” prioritizing data collection over fundamental theoretical breakthroughs. She advocates for “securing limited but high-quality data” and “developing the right models,” drawing a parallel to cosmology: increased observation alone doesn’t advance understanding without intelligent interpretation.
Roland Siegwart, a robotics professor at ETH Zurich, adds another layer to the discussion, emphasizing the need for advancements in artificial intelligence itself. “Teaching a computer to learn is still extremely inefficient at the moment,” he states. He points out that humans can recognize new concepts from just a few examples, a capability that remains elusive for algorithms. Furthermore, robots must simultaneously process visual information, control motor functions, and interpret sensory data – a computationally intensive task.
Ultimately, both Siegwart and Billard believe that decades of research are still required before truly autonomous household robots become a reality. 1x, acknowledging these challenges, plans to begin delivering remote-controlled Neos to customers in the USA next year, with European delivery planned for a later date. The Neo robot, therefore, represents not a solution, but a data-gathering experiment – a costly, and potentially privacy-invasive, step toward a future where robots may finally be able to handle the hot pan without incident.
