Elon Musk opened Tesla's AI Day with a call for AI researchers to join the company and help it build out Tesla's advanced computer vision and self-driving technology.
Musk wasn't lying when he said this was an event for the AI professionals the company hopes to recruit. It was easily one of the densest, most opaque presentations we've seen in a very long time.
Tesla's Autopilot takes the stage
AI Day's first deep dive came from Andrej Karpathy, Tesla's Director of AI and Autopilot Vision, who came on stage to describe the development of Tesla's computer vision system that powers Autopilot.
One of the more interesting things Karpathy discussed was the so-called HydraNets, multi-task learning systems that share a single backbone across many prediction tasks. Particularly notable were the challenges these systems face with the car's Summon feature as it navigates to the user.
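The core idea behind a multi-task network like this can be sketched in a few lines: one shared backbone does the expensive feature extraction once, and several lightweight task heads read off those shared features. The layer sizes and task names below are illustrative assumptions, not Tesla's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared backbone: a single dense layer with ReLU, standing in for a
# much larger feature extractor that runs once per frame.
W_shared = rng.standard_normal((64, 128)) * 0.1

# Task-specific heads that all reuse the same backbone features.
# These task names are hypothetical examples, not Tesla's real heads.
heads = {
    "object_detection": rng.standard_normal((128, 10)) * 0.1,
    "curb_detection": rng.standard_normal((128, 2)) * 0.1,
    "depth_estimation": rng.standard_normal((128, 1)) * 0.1,
}

def forward(x):
    # Shared computation happens exactly once...
    features = np.maximum(x @ W_shared, 0.0)
    # ...then each head produces its own output from the same features.
    return {task: features @ W for task, W in heads.items()}

camera_features = rng.standard_normal((1, 64))  # stand-in for fused camera input
outputs = forward(camera_features)
for task, out in outputs.items():
    print(task, out.shape)
```

The payoff is that adding a new task costs only a small head, not a whole new network, which is the efficiency argument usually made for this kind of architecture.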
Apparently, Autopilot really has trouble with curbs, and the reason why is pretty fascinating. Because the system receives data from eight different cameras, translating the objects it identifies into the vehicle's "vector space", that is, figuring out where each object actually sits in the car's three-dimensional map of its surroundings, is a hellish challenge.
Better heuristics can help, though, by cutting down the number of individual nodes the car needs to process when running its pathfinding algorithm. A small parking lot alone can contain as many as 400,000 nodes, and a street with traffic may have many more, which remains a formidable challenge for full self-driving (FSD) vehicles.
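To make that point concrete, here's a toy illustration of how a heuristic shrinks the work a pathfinding search has to do: uniform-cost search (no heuristic) versus A* with a Manhattan-distance heuristic on a small open grid. This is a generic textbook comparison, not Tesla's actual planner.

```python
import heapq

def search(size, start, goal, heuristic):
    """Best-first search on an open grid; returns how many nodes it expanded."""
    # Frontier entries: (estimated total cost, -cost so far, node).
    # The -cost tie-break prefers deeper nodes among equal estimates.
    frontier = [(heuristic(start), 0, start)]
    seen = set()
    expanded = 0
    while frontier:
        _, neg_cost, node = heapq.heappop(frontier)
        if node in seen:
            continue
        seen.add(node)
        expanded += 1
        if node == goal:
            return expanded
        cost = -neg_cost
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < size and 0 <= nxt[1] < size and nxt not in seen:
                heapq.heappush(frontier, (cost + 1 + heuristic(nxt), -(cost + 1), nxt))
    return expanded

goal = (49, 49)
manhattan = lambda n: abs(n[0] - goal[0]) + abs(n[1] - goal[1])

expanded_uc = search(50, (0, 0), goal, lambda n: 0)    # no heuristic: visits the whole grid
expanded_astar = search(50, (0, 0), goal, manhattan)   # informed: far fewer expansions
print("uniform cost:", expanded_uc)
print("A* manhattan:", expanded_astar)
```

On a 50x50 grid the uninformed search expands every node, while the Manhattan heuristic lets A* head almost straight for the goal; scaled up to hundreds of thousands of nodes, that pruning is the difference the paragraph above is pointing at.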
What is Tesla Dojo?
Tesla's new Dojo computing architecture got some love at Tesla's AI Day, with a pretty thorough breakdown of the new system's distributed architecture and interconnected "Learning Units" (LU).
Acting somewhat like the Tensor cores in Nvidia's latest GPUs, these LUs would be networked together in a fabric to form the building blocks of the Dojo D1 chip; D1 chips would in turn be arrayed together, 25 at a time, to make up a Dojo training tile.
It doesn't end there, since these tiles, effectively machine learning server blades, would be stacked up and interconnected into a whole new kind of server cabinet, with a full cluster of those cabinets (what Tesla calls an "ExaPOD") providing 1.1 exaFLOPS of computing power (that's 1.1 x 10^18 floating-point operations per second) dedicated to neural networks and machine learning research.
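The headline figure checks out as simple arithmetic on the numbers Tesla presented: 9 petaFLOPS of BF16 compute per training tile, 120 tiles in the planned ExaPOD. Treat these as Tesla's claims, not measured performance.

```python
# Back-of-the-envelope check of Tesla's stated Dojo numbers.
PFLOPS = 1e15
tile_bf16_flops = 9 * PFLOPS   # one Dojo training tile (25 D1 chips), per Tesla
tiles = 120                    # tiles in the planned ExaPOD cluster, per Tesla

total = tile_bf16_flops * tiles
print(f"{total / 1e18:.2f} exaFLOPS")  # ~1.08, rounded up to the quoted 1.1 figure
```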
None of this has been built at full scale yet (that's why they need AI researchers and engineers, obviously), but it sounded impressive enough, much like most of Tesla's technological claims over the years. Can Tesla deliver on this one? It'd be cooler if they did; we'll just say that.
Is Tesla Bot for real?
Most of us were expecting some kind of surprise or reveal at the end of the AI Day event, and while we shouldn't be shocked that Elon Musk and Tesla want to build a humanoid robot that could one day replace human workers, it still was a surprise.
Humanoid robotics is a field with a long and deep history, and the challenge of maneuvering a bipedal robot even on level terrain is a completely different beast from FSD cars.
Still, Musk isn't wrong that Tesla's work with artificial intelligence and machine learning for its Autopilot and FSD does at least give Tesla something of a head start on this one.
Do we think the Tesla Bot will ever see the light of day as an actual product? Almost certainly not, but you can't rule out a couple of prototype units somehow making their way to an event somewhere, at least to show that Tesla made the attempt.
What is Tesla's 'AI Day'?
More than anything, Tesla's AI Day is, by Musk's own admission, a recruiting event for engineers in the field of artificial intelligence, a subset of programmers, researchers, and engineers who are increasingly in demand and in relatively short supply.
Musk's promise of Tesla FSD by 2020 proved harder to deliver than he apparently thought, and if he's going to regain the trust of investors, he's going to need to start delivering on at least some of his bigger promises.
The Model S Plaid+, which was supposed to use Tesla's new battery technology to deliver a 520-plus-mile range, was recently canceled. His visions for Autopilot and FSD are running into regulatory challenges after a number of high-profile accidents involving Tesla vehicles in which Autopilot may have played a role.
Musk's vision for Dojo, his Gigafactories, solar-powered homes equipped with Tesla batteries, and all the rest is running into an increasingly skeptical audience who have seen results decent enough to be convinced that Musk isn't running a scheme, but who have also watched his often market-moving statements repeatedly fail to meet concrete targets.
In many ways, AI Day is more about Musk and Tesla bringing on board the kind of quality engineers who can hopefully get things back on track for Tesla by eliminating the kinds of edge case accidents and failures that are starting to drag Autopilot and Musk's FSD dreams down.
If Dojo is as powerful as Tesla believes it is, then that is definitely a step in the right direction, but the proof is in the doing, and so far Musk hasn't done it yet.
This isn't to say that Tesla's AI processing chips aren't genuinely impressive, but Musk has yet to prove that this hardware solution is up to the task of safely driving a car from point A to B in a less than ideal environment. He's not alone in this. Industry experts regularly put fully autonomous, level 5 self-driving technology much further down the road than Musk ever has – and for good reason.
If Tesla's Autopilot system is drawn to parked, unoccupied ambulances often enough to cause accidents, and Musk cannot tell us why that is happening, then it's going to become an increasingly heavy anchor on Tesla's brand in a way that Musk simply won't be able to tweet his way out of.
The challenge with neural networks is that they process so much data, so quickly, that asking how a machine learning algorithm arrived at a particular conclusion is all but impossible. They are proverbial black boxes: you can see what goes in and what comes out, but never the inner workings in between.
If something in Tesla's Autopilot system is so bizarrely attracted to emergency vehicles that it crashes into them, that's not a problem with any particular input; that's a problem with the black box you're feeding those inputs into, and if it's ambulances today, there's little reason to believe it won't be school buses tomorrow.
Fixing these problems will be a grueling challenge. Maybe some bright young engineers at Tesla's AI Day have some fresh new ideas. It's not like Musk and Tesla don't need the help right now.