The American startup Figure, working in partnership with OpenAI, has shown impressive progress in the development of humanoid robots. Its robot Figure 01, which recently gained artificial intelligence from the developers of the well-known ChatGPT, has demonstrated the ability to interact with humans in depth: it perceives speech and commands through its "organs" of vision and hearing and gives coherent answers to questions. In the presentation video, Figure 01 was asked to describe what it saw in front of it. The robot replied that it saw an apple, a dish drainer, and the person asking the question. When asked to give the person something to eat, it chose the apple, explaining that it was the only edible item on the table. It also fulfilled a request to clear the dishes, carefully placing a glass on the drying rack.

The latest leap in the Figure robot's development was made possible by the integration of artificial intelligence technology from OpenAI. Data collected by the android's built-in cameras is processed by a powerful vision-language AI model trained by OpenAI, which lets the robot not only perceive human speech but also understand it and convert it into specific actions. Figure's own specialized neural networks, in turn, provide the accuracy and speed of the robot's responses to external stimuli.
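The division of labor described above, a vision-language model for high-level understanding feeding a separate fast controller for motion, can be sketched in pseudocode form. This is a minimal illustration only: none of the class or function names below come from Figure or OpenAI, and the real models are large neural networks, not rule-based stubs.

```python
# Hypothetical sketch of a perceive-reason-act loop: a stand-in "VLM"
# interprets the scene plus a spoken request and emits an intent; a
# stand-in low-level controller turns that intent into a motor routine.
from dataclasses import dataclass


@dataclass
class SceneObject:
    name: str
    edible: bool


def vision_language_model(scene: list[SceneObject], request: str) -> str:
    """Stand-in for the OpenAI-trained model: request -> intent string."""
    if "eat" in request:
        edible = [o.name for o in scene if o.edible]
        return f"hand_over:{edible[0]}" if edible else "report:nothing edible here"
    if "see" in request:
        return "report:" + ", ".join(o.name for o in scene)
    return "report:unknown request"


def low_level_controller(intent: str) -> str:
    """Stand-in for Figure's fast control networks: intent -> motor routine."""
    action, _, arg = intent.partition(":")
    if action == "hand_over":
        return f"grasp {arg} and extend arm"
    return f"say '{arg}'"


scene = [SceneObject("apple", True), SceneObject("dish drainer", False)]
intent = vision_language_model(scene, "can I have something to eat?")
print(low_level_controller(intent))  # grasp apple and extend arm
```

The design point the sketch captures is the one in the paragraph above: slow, general reasoning (choosing the apple because it is the only edible item) is separated from fast, specialized control (actually grasping it).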

Since its inception, the Figure project has developed rapidly, and this is not surprising: the startup's founder, Brett Adcock, assembled a team of experts from leading technology companies, including Google DeepMind, Boston Dynamics, and Tesla. In just a few months, their robot went from basic movements to performing complex tasks and operating household appliances, as shown in videos published by the startup in which it loads boxes and runs a coffee machine. Having gained intelligence, Figure 01 has reached a new level and may well become the first representative of a new generation of smart robots.