Tesla Autopilot Mystery Solved — HW3 Full Potential Soon To Be Unlocked

Published on January 31st, 2020 | by Chanan Bos

In June 2019, after Tesla’s Autonomy Investor Day, we did a deep dive into Tesla’s HW3 chip that explored the various capabilities and potential of the HW3 processor system on a chip (SoC). I may have geeked out a little and made that a bit too technical, so I will try not to repeat that mistake in this article. Long story short, HW3 is a total beast. It is very different from the NVIDIA chip Tesla was using in the previous generation. So it was very surprising when Elon said there was no big rush to retrofit existing cars because, at the time, you would not notice much of an increase in performance over HW2.

For the next bit, you are going to need to take a good look at the image below, so take a moment to study it and return to it if necessary.

So, retrofitting a HW2 car with HW3 in the second half of 2019 did not improve performance. What this likely means is that Tesla pretty much took the existing Autopilot software designed for HW2 and emulated it on HW3. For those of you unfamiliar with emulation, a good explanation is the movie Inception, but for computers. Imagine a Windows 10 computer running the Android operating system in a window: Android becomes just another program on the computer rather than the operating system itself.
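To make the emulation idea a bit more concrete, here is a tiny toy sketch in Python (purely illustrative and nothing to do with Tesla’s actual code): an emulator is essentially a host program that reads the guest’s instructions one at a time and reproduces their effect in software. The two-register instruction set below is made up for the example.

```python
# Toy illustration of emulation (not Tesla code): the "host" is ordinary
# Python, and the "guest" is a made-up two-register instruction set.
# An emulator is just a loop that reads guest instructions and reproduces
# their effect in host software.

GUEST_PROGRAM = [
    ("LOAD", "r0", 7),      # r0 = 7
    ("LOAD", "r1", 5),      # r1 = 5
    ("ADD",  "r0", "r1"),   # r0 = r0 + r1
    ("PRINT", "r0"),        # output r0
    ("HALT",),
]

def run_guest(program):
    registers = {"r0": 0, "r1": 0}
    pc = 0                              # program counter into the guest code
    while True:
        op, *args = program[pc]
        pc += 1
        if op == "LOAD":
            registers[args[0]] = args[1]
        elif op == "ADD":
            registers[args[0]] += registers[args[1]]
        elif op == "PRINT":
            print("guest says:", registers[args[0]])
        elif op == "HALT":
            break

run_guest(GUEST_PROGRAM)   # prints "guest says: 12"
```

The point of the sketch is simply that every guest instruction costs extra host work, which is why emulation is slower than running software written for the hardware directly.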

In this case, HW3 emulated HW2 to get the existing Autopilot software to function. The only problem is that HW3 is supposed to run most tasks not on the processor or graphics card but on its Neural Processing Units (NPUs), which are not designed for direct software emulation and are probably not capable of it. In principle, the Graphics Processing Unit (GPU) and the Central Processing Unit (CPU) together are capable of emulation. However, the CPU and GPU in HW3 are less powerful than the ones in HW2, so on their own they cannot carry the full emulated workload. Tesla therefore had to move some tasks onto the NPUs to make it work, but Autopilot still needed a major rewrite of its base code to truly unlock HW3’s potential.
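As a rough mental model of how that split works, here is a hypothetical sketch (the operation names and the NPU_SUPPORTED_OPS list are assumptions for illustration, not Tesla’s actual scheduler): matrix-heavy neural-net operations get routed to an NPU-style accelerator, while everything else, including any emulated legacy code, stays on the general-purpose CPU/GPU path.

```python
# Hypothetical sketch of heterogeneous dispatch (illustrative only):
# matrix-heavy neural-net ops go to an NPU-style accelerator, everything
# else stays on the general-purpose CPU/GPU path.

NPU_SUPPORTED_OPS = {"conv2d", "matmul", "relu"}   # assumed accelerator-friendly ops

def run_on_npu(op, data):
    # stand-in for an accelerator call; here it just tags the result
    return f"NPU:{op}({data})"

def run_on_cpu(op, data):
    # fallback path, e.g. emulated legacy code or plain control logic
    return f"CPU:{op}({data})"

def dispatch(op, data):
    """Route an operation to the accelerator if it supports it."""
    if op in NPU_SUPPORTED_OPS:
        return run_on_npu(op, data)
    return run_on_cpu(op, data)

pipeline = ["decode_frame", "conv2d", "relu", "matmul", "apply_traffic_rules"]
for step in pipeline:
    print(dispatch(step, "frame_0"))
```

The design point is the same one the article makes: the more of the pipeline that can be expressed as NPU-friendly neural-net operations, the less depends on the comparatively weak CPU and GPU.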

On a side note, HW3’s CPU and GPU are more powerful than they need to be for the minor tasks the NPUs cannot handle. What this likely means is that those components were oversized to let the existing Autopilot software transition onto the new hardware. The next generation (HW4) will likely have a much smaller GPU and CPU, and either make room for even more complicated neural nets plus higher-resolution, higher-frame-rate cameras, or simply reduce the power requirements of the SoC.

Thanks to Third Row Podcast’s interview with Elon Musk, we now have confirmation of the above theory and some really juicy new details. All these months, Tesla has been rewriting Autopilot’s base code behind the scenes and will soon(ish) push that update to all vehicles running HW3. This could even signal the end of major updates for HW2 and HW2.5 Autopilot systems.

The next thing we find out is what kind of under-the-hood changes Tesla has been allocating those extra neural nets to. Basically, the first of two major improvements is intertwining the different systems and decisions so that the neural networks work collectively. In other words, the car will be better at predicting that A results in B rather than observing A and reacting only after seeing B. We made a short 30-second clip for the occasion.
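To illustrate the difference between predicting that A results in B and only reacting once B has happened, here is a made-up toy example (not Autopilot’s planner; the lane width, time horizon, and function names are assumptions): a reactive controller brakes only once a neighboring car has already entered our lane, while a predictive one extrapolates the car’s lateral drift and eases off before the cut-in happens.

```python
# Toy contrast between reactive and predictive behavior (illustrative only).
# State of a neighboring car: lateral offset from our lane center (meters)
# and lateral velocity toward our lane (meters/second).

LANE_HALF_WIDTH = 1.8  # assumed lane half-width in meters

def reactive_decision(lateral_offset):
    """React only to B: brake once the car is already inside our lane."""
    return "brake" if abs(lateral_offset) < LANE_HALF_WIDTH else "maintain speed"

def predictive_decision(lateral_offset, lateral_velocity, horizon_s=2.0):
    """Predict that A leads to B: extrapolate the drift a couple of seconds ahead."""
    predicted_offset = lateral_offset + lateral_velocity * horizon_s
    if abs(predicted_offset) < LANE_HALF_WIDTH:
        return "ease off early"          # cut-in expected, act before it happens
    return "maintain speed"

# A car is still 3 m away laterally but drifting toward us at 1 m/s.
print(reactive_decision(lateral_offset=-3.0))                          # maintain speed
print(predictive_decision(lateral_offset=-3.0, lateral_velocity=1.0))  # ease off early
```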

The other change pertains to how Autopilot looks at the world and interprets information from the cameras. Elon once described a human driver as 2 cameras on a gimbal powered by a supercomputer: the eyes, the neck, and the brain. Here is how to visualize how Autopilot works now: imagine a person sitting at a desk, tasked with drawing on a blank piece of paper the positions and trajectories of all the cars around your vehicle by looking at 6 different screens positioned in front of him. Hard work. The new Autopilot system, by contrast, is one camera: a 360° camera. People sometimes joke that you need eyes in the back of your head, but imagine being able to see in 360 degrees and intuitively comprehend everything happening around you. So awesome. Well, that is how the new Autopilot system works. It stitches the data from all the cameras together into one 360° view. This should significantly improve the system’s ability to learn from driving experience.
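One simple way to picture the stitching is the hand-rolled sketch below (the camera layout, angles, and detections are made up for illustration; this is not Tesla’s calibration or vision stack): each camera reports objects in its own field of view, and a fusion step converts every detection into a single vehicle-centered 360° coordinate frame, so the rest of the system sees one surround view instead of six separate ones.

```python
import math

# Illustrative sketch of fusing per-camera detections into one 360-degree,
# vehicle-centered view (made-up camera layout, not Tesla's calibration).

# Each camera's mounting yaw relative to the front of the car, in degrees.
CAMERA_YAW = {
    "front": 0.0,
    "left_repeater": 120.0,
    "right_repeater": -120.0,
    "rear": 180.0,
}

def to_vehicle_frame(camera, bearing_deg, distance_m):
    """Convert a detection from camera-relative bearing to a car-centered x/y."""
    azimuth = math.radians(CAMERA_YAW[camera] + bearing_deg)  # 0 deg = straight ahead
    x = distance_m * math.cos(azimuth)   # forward axis of the car
    y = distance_m * math.sin(azimuth)   # left of the car is positive
    return round(x, 1), round(y, 1)

# Detections as (camera, bearing within that camera's view, distance).
detections = [
    ("front", 5.0, 30.0),            # car slightly to the left, 30 m ahead
    ("left_repeater", -10.0, 8.0),   # car to the left and a little behind
    ("rear", 0.0, 15.0),             # car directly behind
]

surround_view = [to_vehicle_frame(cam, b, d) for cam, b, d in detections]
print(surround_view)   # one unified list of positions around the vehicle
```

Once everything lives in one shared frame like this, downstream planning and training can treat the surroundings as a single picture rather than six partially overlapping ones.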

Part 2 of Elon's Story starts with a little announcement…

"There's quite a significant foundational rewrite in the Tesla Autopilot system that's almost complete" –– @elonmusk pic.twitter.com/am58oVz6Rb

— Third Row Podcast (@thirdrowtesla) January 30, 2020

Elon described this as part of the general trend of “the neural net eating up more and more of the system.”

There are neural nets that do things like object detection but then there's also normal procedural code to do things like follow traffic rules

More learning over time

— Third Row Podcast (@thirdrowtesla) January 30, 2020


Tesla’s HW3 computer is an absolute beast. It can handle 7 times as many frames and run neural nets 7 times larger, and as was said in the presentation, “There are a lot of ways you can spend that.” Tesla has indeed slowly been allocating the SoC’s resources, and by the sound of it, the team will soon be done rewriting Autopilot’s existing functions, will add a few more vital ones, will publish that update, and will then continue building on top of it.

A lot of people don’t believe that Tesla’s autonomous robotaxi network is less than a decade away, let alone 4 or 5 years away. I say that those people underestimate exponential progress and machine learning, and do not fully comprehend how much HW3 changes the playing field. Just to clarify, that does not mean Level 5 autonomy, on all roads and off-road, will be available all over the world at once, but at least one country with very good roads and infrastructure will have it, and other countries will follow suit.
 






Chanan grew up in a multicultural, multi-lingual environment that often gives him a unique perspective on a variety of topics. He is always in thought about big picture topics like AI, quantum physics, philosophy, Universal Basic Income, climate change, sci-fi concepts like the singularity, misinformation, and the list goes on. Currently, he is studying creative media & technology but already has diplomas in environmental sciences as well as business & management. His goal is to discourage linear thinking, bias, and confirmation bias whilst encouraging out-of-the-box thinking and helping people understand exponential progress. Chanan is very worried about his future and the future of humanity. That is why he has a tremendous admiration for Elon Musk and his companies, foremost because of their missions, philosophy, and intent to help humanity and its future. He sees Tesla as one of the few companies that can help us save ourselves from climate change.
