At the annual Tesla AI Day event, Tesla CEO Elon Musk revealed the company’s prototype robot, meant to revolutionize its assembly line. The robot was only able to walk on stage and wave to the crowd. Elon and his team then showed what they hope the production unit will look like once it is ready.
Welcome to Tesla AI Day 2022. I do want to set some expectations with respect to our Optimus robot. As you know, last year it was just a person in a robot suit, but we’ve come a long way since then, and compared to that I think it’s going to be very impressive. So, should we bring out the bot? Before we do that, we have one little bonus tip for the day: this is actually the first time we’ve tried this robot without any backup support, cranes, mechanical mechanisms, no cables, nothing. We wanted it to join you on stage tonight, but it really was the first time. Let’s see. Ready? Let’s go.

So this is essentially the same self-driving computer that runs in your Tesla cars, by the way. This is literally the first time the robot has operated without a tether, and that was on stage tonight. The robot can actually do a lot more than we just showed you; we just didn’t want it to fall on its face. So we’ll show you some videos now of the robot doing a bunch of other things.

We wanted to show a little bit more of what we’ve done over the past few months with the bot, beyond just walking around and dancing on stage. These are humble beginnings, but you can see the Autopilot neural networks running as is, just retrained for the bot, directly on that new platform. That’s my watering can. When you see a rendered view, that’s the world the robot sees, so it’s very clearly identifying objects, deciding which object it should pick up, and picking it up. We used the same process as we did for Autopilot: collect data, train neural networks, and then deploy them on the robot. That was an example that illustrates the upper body a little bit more.
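As a rough illustration of the loop described above (a retrained vision network labels objects, the target is selected, and a pick-up command is issued), here is a minimal Python sketch. The class and function names are hypothetical, not Tesla code.

```python
# Hypothetical sketch of the perception-to-action step described in the talk:
# a detector labels objects in the camera view, the target object is selected,
# and its pose is handed to a grasp planner. Names are illustrative only.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str            # e.g. "watering_can"
    confidence: float     # network confidence in [0, 1]
    position_xyz: tuple   # estimated 3D position in the robot frame (meters)

def pick_target(detections, wanted_label, min_conf=0.6):
    """Return the most confident detection matching the requested label."""
    candidates = [d for d in detections
                  if d.label == wanted_label and d.confidence >= min_conf]
    return max(candidates, key=lambda d: d.confidence) if candidates else None

# Example: the network saw two objects; the robot should pick up the watering can.
frame_detections = [
    Detection("watering_can", 0.92, (0.45, -0.10, 0.80)),
    Detection("box", 0.75, (0.60, 0.30, 0.75)),
]
target = pick_target(frame_detections, "watering_can")
if target is not None:
    print(f"Grasp target at {target.position_xyz}")  # pose for the grasp planner
```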
What you saw was what we call Bumble C; that’s our sort of rough development robot using semi-off-the-shelf actuators. But we’ve actually gone a step further than that already. The team has done an incredible job, and we actually have an Optimus bot with fully Tesla-designed actuators, battery pack, control system, everything. It wasn’t quite ready to walk, but I think it will walk in a few weeks. We wanted to show you the robot, something that’s actually fairly close to what will go into production, and show you all the things it can do. So let’s bring it out.

With the degrees of freedom that we expect to have in Optimus production unit one, it has the ability to move all the fingers independently and a thumb with two degrees of freedom, so it has opposable thumbs on both the left and right hands. It’s able to operate tools and do useful things. Our goal is to make a useful humanoid robot as quickly as possible. Optimus is designed to be an extremely capable robot, but made in very high volume, probably ultimately millions of units, and it is expected to cost much less than a car; I would say probably less than twenty thousand dollars. The potential of Optimus is, I think, appreciated by very few people. As usual, Tesla demos are coming in hot.

There have been software integration and hardware upgrades over the months since then, but in parallel we’ve also been designing the next generation, this one over here.
Obviously there’s a lot that’s changed since last year, but there are a few things that are still the same. You’ll notice we still have this really detailed focus on the true human form. On the screen here you’ll see, in orange, the actuators, which we’ll get to in a little bit, and in blue, our electrical system. In the middle of our torso, and it actually is the torso, we have our battery pack, sized at 2.3 kilowatt-hours, which is perfect for about a full day’s worth of work. What’s really unique about this battery pack is that it has all of the battery electronics integrated into a single PCB within the pack.
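For a rough sense of what “about a full day’s worth of work” implies for a 2.3 kWh pack, here is a back-of-the-envelope calculation; the eight-hour shift is an assumption for illustration, not a figure quoted in the talk.

```python
# Back-of-the-envelope power budget for the 2.3 kWh pack mentioned above.
# The 8-hour shift is an assumed working day, used only for illustration.

pack_energy_wh = 2300        # 2.3 kWh battery pack
shift_hours = 8              # assumed length of a working day

average_power_w = pack_energy_wh / shift_hours
print(f"Average power budget: {average_power_w:.0f} W over a {shift_hours} h shift")
# -> roughly 290 W of average draw, which has to cover compute, actuators,
#    and idle losses across the whole shift.
```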
Going on to our brain: it’s not in the head, but it’s pretty close. Also in our torso we have our central computer. It’s going to do everything that a human brain does: processing vision data, making split-second decisions based on multiple sensory inputs, and also communications. To support communications it’s equipped with wireless connectivity as well as audio support, and it also has hardware-level security features, which are important to protect both the robot and the people around the robot.

Now that we have our core, we’re going to need some limbs on this guy, and we’d love to show you a little bit about our actuators and our fully functional hands as well. There are many similarities between a car and the robot when it comes to powertrain design; the most important things that matter here are energy, mass, and cost.
In this particular case, you see a car with two drive units, and the drive units are used to accelerate the car, for a zero-to-sixty-miles-per-hour time or a city drive cycle, while the robot has 28 actuators. It’s not obvious what the tasks are at the actuator level, so we have tasks that are higher level, like walking, climbing stairs, or carrying a heavy object, which need to be translated into joint specs. The rotary actuator in particular has a mechanical clutch integrated on the high-speed side, an angular-contact ball bearing on the high-speed side, and a cross-roller bearing on the low-speed side. The gear train is a strain wave gear, and there are three integrated sensors here and a bespoke permanent-magnet machine.
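To give a concrete flavor of how a high-level task such as “carry a heavy object” turns into a joint-level spec, here is a simplified static-torque estimate; the masses and arm length are made-up example numbers, not Optimus design values.

```python
# Simplified example of turning a task-level requirement ("hold a load at
# arm's length") into a joint-level torque spec for a shoulder actuator.
# All numbers below are illustrative assumptions, not Optimus specifications.

g = 9.81                 # gravity, m/s^2
payload_kg = 10.0        # assumed object mass
arm_mass_kg = 4.0        # assumed arm mass, lumped at the arm midpoint
arm_length_m = 0.6       # assumed shoulder-to-hand distance

# Static torque about the shoulder with the arm held horizontally (worst case):
torque_nm = g * (payload_kg * arm_length_m + arm_mass_kg * arm_length_m / 2)
print(f"Required static shoulder torque: {torque_nm:.0f} N*m")
# -> about 71 N*m before any dynamic margin; repeating this for each task
#    sketches the torque/speed envelope the 28 actuators have to cover.
```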
Our actuator is able to lift a half-ton, nine-foot concert grand piano. Our fingers are driven by metallic tendons that are both flexible and strong. We have the ability to complete wide-aperture power grasps while also being optimized for precision gripping of small, thin, and delicate objects. Some basic stats about our hand: it has six actuators and eleven degrees of freedom, and it has an in-hand controller which drives the fingers and receives sensor feedback.
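The in-hand controller described above closes a loop between finger sensing and the tendon actuators. A minimal proportional-control sketch of that idea might look like the following; the gains, units, and interfaces are purely illustrative assumptions, not Tesla’s firmware.

```python
# Illustrative sketch of an in-hand control loop: read finger-position
# feedback, compare it against the commanded grasp, and drive the tendon
# actuators. Gains and interfaces are assumptions for illustration only.

def finger_control_step(desired_pos, measured_pos, kp=4.0, max_cmd=1.0):
    """One proportional-control step per finger actuator (positions in radians)."""
    commands = []
    for target, actual in zip(desired_pos, measured_pos):
        cmd = kp * (target - actual)                       # proportional error term
        commands.append(max(-max_cmd, min(max_cmd, cmd)))  # saturate the command
    return commands

# Example: close the fingers toward a power-grasp pose (one setpoint per actuator).
desired = [0.9, 0.9, 0.9, 0.9, 0.6, 0.4]
measured = [0.2, 0.3, 0.25, 0.2, 0.1, 0.1]   # feedback from the finger sensors
print(finger_control_step(desired, measured))
```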
The software was ported directly from Autopilot to the bot’s situation. It’s exactly the same occupancy network, which we’ll talk about in a little more detail later with the Autopilot team, that is now running on the bot here in this video; the only thing that changed, really, is the training data, which we had to recollect. We’re also trying to find ways to improve those occupancy networks using work done on neural radiance fields, to get really great volumetric rendering of the bot’s environment, for example some machinery that the bot might have to interact with.
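Neural radiance fields produce that kind of volumetric rendering by integrating color and density along camera rays. Below is a minimal sketch of the standard NeRF rendering rule, with a toy density field standing in for a trained network; this is the generic technique, not Tesla’s implementation.

```python
import numpy as np

# Minimal sketch of the volume-rendering rule used by neural radiance fields:
# sample densities and colors along a camera ray and composite front-to-back.
# In a real NeRF the `field` function is a trained network; here it is a toy.

def render_ray(field, origin, direction, near=0.5, far=5.0, n_samples=64):
    ts = np.linspace(near, far, n_samples)                  # sample depths along the ray
    deltas = np.diff(ts, append=ts[-1] + (far - near) / n_samples)
    points = origin + ts[:, None] * direction               # 3D sample positions
    densities, colors = field(points)                       # sigma: (N,), rgb: (N, 3)

    alphas = 1.0 - np.exp(-densities * deltas)               # opacity of each segment
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))  # transmittance
    weights = alphas * trans
    return (weights[:, None] * colors).sum(axis=0)            # composited RGB

def toy_field(points):
    """Toy stand-in field: a fuzzy grey blob of density around the origin."""
    sigma = 3.0 * np.exp(-np.sum(points**2, axis=1))
    rgb = np.full((points.shape[0], 3), 0.5)
    return sigma, rgb

print(render_ray(toy_field, np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0])))
```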
We’ve also been training more neural networks to identify high-frequency features, key points within the bot’s camera streams, and to track them across frames over time as the bot navigates its environment. We’re using those points to get a better estimate of the bot’s pose and trajectory within its environment as it’s walking.
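That detect-track-estimate idea resembles classic visual odometry. The OpenCV sketch below shows the same pipeline with conventional tools (corner detection, optical-flow tracking, relative-pose recovery); it stands in for whatever learned keypoint networks Tesla actually uses, and the camera intrinsics are example values.

```python
import cv2
import numpy as np

# Rough visual-odometry sketch of the idea described above: find trackable key
# points in one camera frame, track them into the next frame, and recover the
# relative camera motion. Chaining these estimates gives a pose/trajectory.

def relative_pose(prev_gray, next_gray, K):
    # 1. Detect distinctive corners in the previous frame.
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=8)
    # 2. Track them into the next frame with pyramidal Lucas-Kanade optical flow.
    pts_next, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts_prev, None)
    good = status.ravel() == 1
    p0, p1 = pts_prev[good], pts_next[good]
    # 3. Estimate the relative rotation and (unit-scale) translation between frames.
    E, inliers = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p0, p1, K, mask=inliers)
    return R, t

# Example camera intrinsic matrix (illustrative values).
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
```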
This is a video of the motion-control code running in our simulator, showing the evolution of the robot’s work over time. As you can see, we started quite slowly in April and started accelerating as we unlocked more joints and deeper, more advanced techniques like arms balancing. Over the past few months we wanted the robot to manipulate objects while looking as natural as possible, and also to get there quickly. So what we’ve done is break this process down into two steps: first, generating a library of natural motion references, or we could call them demonstrations, and then adapting these motion references online to the current real-world situation.

So let’s say we have a human demonstration of picking up an object. We can get a motion capture of that demonstration, which is visualized right here as a bunch of keyframes representing the locations of the hands, the elbows, and the torso. We can map that to the robot using inverse kinematics, and if we collect a lot of these, we now have a library that we can work with. But a single demonstration is not generalizable to the variation in the real world; for instance, this would only work for a box in one very particular location.
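To make the retargeting step concrete, here is a toy example of mapping one motion-capture keyframe (a desired hand position) onto robot joint angles with a simple planar two-link inverse-kinematics solve. The link lengths are made up and the real system handles the full body, so treat this strictly as an illustration of the idea.

```python
import math

# Toy retargeting step for the motion-library idea above: take a hand position
# from a mocap keyframe and solve a planar two-link arm's joint angles with
# analytic inverse kinematics. Link lengths are illustrative assumptions.

def two_link_ik(x, y, l1=0.35, l2=0.30):
    """Return (shoulder, elbow) angles in radians that reach point (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))      # clamp for numerical safety
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# One keyframe from a human demonstration: the hand hovering over a box.
hand_keyframe = (0.45, 0.20)
print(two_link_ik(*hand_keyframe))
# Repeating this for every keyframe and joint chain yields a robot-space
# motion reference that can be stored in the library.
```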
So what we’ve also done is run these reference trajectories through a trajectory optimization program, which solves for where the hand should be and how the robot should balance when it needs to adapt the motion to the real world. For instance, if the box is in this other location, then our optimizer will create this trajectory instead.
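That adaptation step can be pictured as an optimization that stays close to the stored reference while forcing the hand to end up wherever the box actually is. The small SciPy sketch below illustrates the idea; it is not Tesla’s optimizer, and balance and full-body terms are omitted.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch of adapting a reference hand trajectory to a new goal position:
# stay close to the demonstrated reference, keep the motion smooth, and end
# at the box's observed location. Balance terms are omitted in this toy.

T = 20                                                 # number of time steps
reference = np.linspace([0.0, 0.0], [0.4, 0.2], T)     # stored reference (T x 2)
new_box = np.array([0.5, 0.35])                        # box observed at a new location

def cost(flat_traj):
    traj = flat_traj.reshape(T, 2)
    track = np.sum((traj - reference) ** 2)            # stay near the reference
    smooth = np.sum(np.diff(traj, axis=0) ** 2)        # penalize jerky motion
    goal = 100.0 * np.sum((traj[-1] - new_box) ** 2)   # end at the real box pose
    return track + 10.0 * smooth + goal

result = minimize(cost, reference.ravel(), method="L-BFGS-B")
adapted = result.x.reshape(T, 2)
print("final hand position:", adapted[-1])             # close to the new box location
```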
I think the first thing, within the next few weeks, is to get Optimus at least on par with Bumble C, the other bot prototype you saw earlier, and probably beyond that. We’re also going to start focusing on a real use case at one of our factories, to make this product a reality and change the entire economy. All of this was done in barely six or eight months. Thank you very much.
Transcribed from the video “Elon Musk reveals a humanoid robot at Tesla AI Day 2022” by The Verge.