Meta acquires Assured Robot Intelligence to build Android for humanoid robots

The TL;DR
Meta has acquired Assured Robot Intelligence, a startup co-founded by former Fauna Robotics co-founder Lerrel Pinto and former Nvidia researcher Xiaolong Wang, to bolster its humanoid robotics platform strategy. The deal, which brings full-body robot control models and touch sensor technology to Meta Superintelligence Labs, reflects Meta’s desire to be the Android of humanoids: provide the intelligence layer and let others build the machines.
Lerrel Pinto co-founded Fauna Robotics, a startup developing an accessible bipedal robot called Sprout. He left in 2025. Amazon acquired Fauna in March, along with its 50 employees and its three-and-a-half-foot-tall dancing humanoid, to enter the consumer robot market. Pinto then co-founded Assured Robot Intelligence with Xiaolong Wang, a former Nvidia researcher and associate professor at UC San Diego who won the MLSys 2024 Best Paper Award for work on AI model development. On Friday, Meta acquired ARI, and both founders joined Meta Superintelligence Labs. The acquisition closed the same day it was announced. Financial terms were not disclosed. The interesting question isn't what Meta paid for a startup with employees in San Diego and New York. It's what Meta aims to do with the technology, and what that aim reveals about the company's theory of how the humanoid market will develop.
The platform
Meta’s stated goal in robotics is to replicate what Google’s Android operating system and Qualcomm’s chips did for the smartphone industry: create a foundation for everyone else to build on. The company launched Meta Robotics Studio last year, hired former Cruise CEO Marc Whitten to lead the effort, and began hiring about 100 engineers to develop humanoid hardware alongside the AI models that power it. CTO Andrew Bosworth has described humanoid robots as Meta’s next bet at a scale comparable to augmented reality, a category where Meta has already spent tens of billions through its Reality Labs division. The ARI acquisition adds a concrete capability to this effort: robot control models that allow humanoids to understand, predict, and adapt to human behavior in unstructured environments.
The platform strategy is clear. Meta aims to develop sensors, software, and AI models for robots and make them available across the industry, meaning the technology can be used by manufacturers that Meta does not own or control. This is the Android model applied to machines. For smartphones, Google provided the operating system and captured value through search, advertising, and the Google Play ecosystem. For robotics, Meta would provide the intelligence layer and capture value through data, ecosystem effects, and integration with existing Meta platforms, where 3.3 billion people already interact daily. Meta has been aggressively acquiring AI talent, hiring five founding members of Thinking Machines Lab, including a researcher whose six-year package reportedly reached $1.5 billion. The ARI acquisition fits the same pattern: small team, specialized capability, rapid integration into the Superintelligence Labs research division.
Technology
ARI’s technology contribution centers on what the company calls “robotic intelligence designed to enable robots to understand, predict and adapt to human behavior in complex and dynamic environments.” In practice, this means whole-body control models for humanoids: the ability to coordinate a robot’s limbs, balance, and movement in response to real-time sensory input from an unpredictable physical world. These models are designed to run on the limited computer inside the robot itself, rather than needing a constant connection to a remote data center.
The company also developed Flesh, a tactile sensor that uses magnets and magnetometers to measure deformation in 3D-printable microstructures. Tactile sensing is one of the unsolved problems in humanoid robotics. A robot that can see through cameras and lidar still can’t tell the difference between catching an egg and catching a tennis ball without touch feedback. The gap between how robots learn in simulation and how they operate in the physical world remains a major barrier to deployment at scale. ARI’s learning-based work in robot control, combined with its sensor technology, addresses both sides of that gap: better models and better sensory input.
The market
The humanoid robot market has gone from speculative to competitive in about 18 months. Tesla plans to start mass production of its Optimus V3 humanoid between July and August, aiming for an annual volume of one million units by the end of 2026 and a price point between $20,000 and $30,000. 1X Technologies opened a factory in Hayward, California, to produce 10,000 NEO humanoid robots in its first year; that first year’s capacity sold out within five days of pre-orders opening. Apptronik raised $520 million at a $5 billion valuation and is partnering with Google DeepMind on its Gemini Robotics models. Amazon has made two robotics acquisitions in a single month. Unitree aims for 20,000 humanoid shipments by 2026. Morgan Stanley projects that the global humanoid robot market will reach $38 billion by 2035 and $5 trillion by 2050.
The competitive dynamics sort the field into three categories. The first is vertically integrated manufacturers, companies like Tesla and 1X that design, build, and sell complete robots. The second is platform providers, companies that supply the intelligence layer, operating system, or core components used by multiple manufacturers. The third is component suppliers, chip makers and sensor companies that sell to both. Meta places itself in the second category, and it is not alone. Google, through DeepMind’s Gemini Robotics program and its partnership with Apptronik, is pursuing a similar platform strategy. Europe is developing its own approach to the humanoid race, with companies and research institutes pursuing strategies that emphasize safety, industrial precision, and compliance over the speed-to-market approach favored by American and Chinese rivals.
A bet
Meta’s history with hardware platforms is instructive. The company missed the mobile phone entirely. Facebook Home, its 2013 attempt to become the default interface on Android phones, was discontinued within a year. The company then spent more than $50 billion on Reality Labs trying to own the next computing platform in virtual and augmented reality, a bet that has yet to produce returns on anything approaching the scale of its advertising business. The Ray-Ban Meta smart glasses are the closest the company has come to a successful hardware product outside its core social media business, and those succeed as an accessory for Meta’s AI assistant rather than as a standalone computing device.
The robotics bet is different in one important way: Meta isn’t trying to scale the hardware itself. It is trying to provide the intelligence, the models, the sensor technology, and the software stack, and let others build the machines. This is a lower-capital strategy than the Reality Labs approach, and it plays to Meta’s genuine strengths in AI research, open-source model distribution, and ecosystem economics. But it depends on the humanoid market developing the way the smartphone market did: with hundreds of manufacturers that need a common software platform. If the market instead consolidates around a few vertically integrated players, each with proprietary AI, the Android model doesn’t work. Tesla won’t license Meta’s intelligence layer. Neither will 1X. The companies that might want it are the ones that don’t exist yet, the humanoid equivalents of Samsung and Xiaomi and Oppo, manufacturers that can build the bodies but need someone else to supply the brain. Meta is betting those companies are coming. The ARI purchase is its latest investment in making sure that when they arrive, Meta’s technology is the first thing they reach for.