Watch awkward Chinese humanoid robot lay it all down on the dance floor - Live Science
The vision element of Adam's VLA is driven by an Intel RealSense D455 depth vision sensor, along with a number of lidar and standard cameras, enabling precise 3D environment modeling and real-time spatial awareness.

Key takeaways
- Humanoid robots have moved from high-profile demos to early-stage commercial use in early 2026.
- At CES 2026, Figure AI demonstrated that its Figure 02 units can run 10-hour shifts on BMW's X3 line, having already handled 90,000 parts for 30,000 vehicles, while Boston Dynamics unveiled a production-ready Atlas to be built in the tens of thousands of units per year for Hyundai's Savannah plant.
- Tesla began training its Optimus Gen 2 models at the Austin Gigafactory using imitation learning, and the company now ranks among the top five humanoid suppliers with roughly 5 percent of global installations.
- A market-tracking report shows that total humanoid deployments reached 16,000 units in 2025, dominated by Chinese firms: AgiBot (31% share, with more than 5,000 X2/G2 units), Unitree (27%) and UBTech (just over 5%).
- Hyundai and Siemens reported a proof-of-concept logistics deployment, and Schaeffler signed a five-year plan to place hundreds of humanoids in factories beginning in 2026-27.
- Faraday Future's new FF AI-Robotics division announced three consumer-oriented robots, including a full-size humanoid slated for shipment in late February, positioning the technology as the "iPhone moment" for robots.
- Not all milestones are smooth: XPeng's IRON robot fell face-first during a public showcase, highlighting ongoing balance challenges.
- Researchers at NUS and SMART unveiled a neural blueprint that gives soft-robot platforms human-like intelligence, promising more adaptable future humanoid systems.
- Together, these developments indicate that humanoid robots are transitioning from laboratory prototypes to limited industrial and commercial roles, with rapid growth expected over the next few years.
The fully humanoid robot shown dancing is being developed in parallel with PNDbotics' Adam-U robot, a stationary model designed primarily as a data collection platform. According to the company's website, four fully mobile humanoid robots are also in development, with varying degrees of freedom of movement, sensory capabilities and computing power.