
Tesla finally unveiled its humanoid robot project, code-named Optimus, at its recent AI Day. As many had already speculated, it was not a ground-breaking revelation full of cutting-edge technologies that would revolutionise the world of humanoid robotics. Indeed, we believe the technical demonstration was on a par with leading robotics platforms from around 7-8 years ago. However, Tesla does appear to have come a long way in a relatively short period of time.

Tesla’s CEO Elon Musk, who has recently said that he expects the robotics platform to one day be worth more than the company's EV business, revealed that the team is focusing on hands inspired by human biology, with opposable thumbs, which would make Optimus useful in most settings. The robot would be powered by a 2.3 kWh battery pack with 500 W of peak power usage when active, giving it a battery life of just over four hours. Tesla envisions that the robot would be able to move at up to five miles per hour and, once in mass production, would eventually cost around $20,000 or less.
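
As a rough sanity check, the quoted pack capacity and peak draw are consistent with a few hours of continuous operation. The short sketch below simply works through that arithmetic, assuming, for simplicity, that the robot draws the full 500 W the whole time; real-world duty cycles would differ.

```python
# Rough runtime estimate from the quoted Optimus figures.
battery_capacity_kwh = 2.3   # quoted pack capacity
peak_power_w = 500           # quoted peak draw when the robot is active

# Assuming (simplistically) a constant draw at the peak figure:
runtime_hours = battery_capacity_kwh * 1000 / peak_power_w
print(f"~{runtime_hours:.1f} h of operation at a constant {peak_power_w} W draw")
# -> ~4.6 h; lighter or intermittent workloads would stretch this further.
```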


He wants to achieve all these objectives in the next three to five years, which is an aggressive timetable. While many humanoid robotics projects are ongoing in the world today, progress towards the vision of humanoid worker robots remains slow.

In many ways, a multi-role humanoid robot is a much more difficult engineering feat than the Level 5 autonomous driving that Tesla is also working towards.




Drones working together can create large 3D-printed structures made of foam or cement. Such creations would only be restricted by structural engineering constraints and factors like drone flight logistics.

The drone swarm construction takes inspiration from animals such as wasps and termites. “If you want to build something very large, typically in nature what happens is that many animals work together,” said Prof Mirko Kovac at Imperial College London and Advisory Board Member at Robocap, who led the project.

Kovac and his colleagues showed how several drones could cooperatively build a 2-metre-tall cylinder made of insulation foam and a 0.18-metre-tall cylinder made of special cement. The Aerial-AM framework comprises two kinds of aerial robot: the BuilDrone and the ScanDrone. The BuilDrone autonomously deposits the physical building material according to a predefined plan, while the ScanDrone monitors the structure as it takes shape, providing real-time feedback to the BuilDrone, which adjusts for variations with every layer deposited. The BuilDrone is also equipped with an adjustable nozzle — a “delta manipulator” — that can immediately compensate for errors that arise as the building material is deposited.
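
As an illustration of that division of labour, here is a minimal conceptual sketch of the print-and-scan feedback loop in Python. The class and method names (BuilDrone, ScanDrone, deposit_layer and so on) and the numbers are assumptions made for this sketch only, not the project's actual software interface.

```python
# Conceptual sketch of the layer-by-layer feedback loop described above.
# All names and values here are illustrative assumptions, not the Aerial-AM code.

from dataclasses import dataclass

@dataclass
class LayerScan:
    layer_index: int
    deviation_mm: float  # measured offset between the printed layer and the plan

class BuilDrone:
    def __init__(self, nozzle_range_mm: float = 5.0):
        self.nozzle_offset_mm = 0.0        # current delta-manipulator correction
        self.nozzle_range_mm = nozzle_range_mm

    def deposit_layer(self, layer_index: int) -> None:
        # Fly the planned toolpath for this layer, applying the current correction.
        print(f"layer {layer_index}: depositing with nozzle offset "
              f"{self.nozzle_offset_mm:+.1f} mm")

    def apply_feedback(self, scan: LayerScan) -> None:
        # Small errors are absorbed by the adjustable nozzle; the correction is
        # clamped to the manipulator's physical range.
        correction = self.nozzle_offset_mm - scan.deviation_mm
        self.nozzle_offset_mm = max(-self.nozzle_range_mm,
                                    min(self.nozzle_range_mm, correction))

class ScanDrone:
    def scan(self, layer_index: int) -> LayerScan:
        # Stand-in for real-time mapping of the as-built geometry.
        return LayerScan(layer_index, deviation_mm=0.8)  # dummy measurement

def build(num_layers: int) -> None:
    builder, scanner = BuilDrone(), ScanDrone()
    for layer in range(num_layers):
        builder.deposit_layer(layer)       # BuilDrone prints to the predefined plan
        feedback = scanner.scan(layer)     # ScanDrone checks the result
        builder.apply_feedback(feedback)   # correction carried into the next layer

build(num_layers=3)
```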


Each drone can operate for up to 10 minutes before needing to reload building materials and sometimes get a fresh battery. Additional testing and simulations demonstrated how up to 15 drones could coordinate flight paths and work together to build a dome. The drones can make their own AI-guided decisions about where to fly and how to deposit building materials, but they still require human supervision.
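
For a sense of the flight logistics this implies, the back-of-the-envelope sketch below estimates how many drones would be needed per printing position to keep deposition continuous. Only the 10-minute working window comes from the researchers; the 15-minute turnaround figure is an assumption made purely for illustration.

```python
# Back-of-the-envelope rotation estimate. The turnaround time is an assumed
# figure for illustration; only the 10-minute working window is from the source.

work_minutes = 10          # stated flight/deposition window per drone
turnaround_minutes = 15    # assumed reload + battery-swap time (not from the source)
printing_slots = 1         # drones depositing material at any one time

# To keep deposition continuous, enough drones must be cycling through
# turnaround while one is printing (ceiling division of cycle time by work time).
drones_needed = printing_slots * -(-(work_minutes + turnaround_minutes) // work_minutes)
print(f"Roughly {drones_needed} drones per printing slot to avoid idle time")
# -> Roughly 3 drones per printing slot to avoid idle time
```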



The European Commission has proposed new rules to help people harmed by products using artificial intelligence (AI) and digital devices like drones. The AI Liability Directive would reduce the burden of proof on people suing over incidents involving such items.

Justice Commissioner Didier Reynders said it would create a legal framework fit for the digital age. Self-driving cars, voice assistants and search engines could all fall under the directive's scope. If passed, the Commission's rules could run alongside the EU's proposed Artificial Intelligence Act, the first law of its kind to set limits on how and when AI systems can be used.

The AI Liability Directive will introduce a "presumption of causality" for those claiming injuries caused by AI-enabled products.

This means victims will not have to untangle complicated AI systems to prove their case, so long as a causal link between a product's AI performance and the associated harm is established. For a long time, social media firms have hidden behind the caveat that they are merely platforms for other people's information and therefore not responsible for its content.

The European Union does not want to repeat this scenario, with companies that make drones, for example, getting off the hook if their products cause harm simply because the firm itself was not directly at the controls.

The clear message is that if your product is capable of causing distress or damage, then you need to take responsibility when it does.


