⚡️Tesla Investor Day Overview
Tesla hosted their first Autonomy Investor Day yesterday. The press event comes shortly ahead of their Q1 financial results and is an attempt to explain the technical details behind their claimed competitive advantage over other automotive self-driving efforts. The audience, mostly financial representatives from large shareholders, was granted a four-hour deep dive into the nuances of chip design, neural network architecture, and software development. Unsurprisingly, the topic proved a bit esoteric for that crowd, and most of the questions revolved around the enforceability of patents, various financial projection scenarios, and specific anecdotes about personal Autopilot use. I’ve captured some of the interesting points below.
🔩Hardware
Pete Bannon - VP Silicon Engineering
Lots of numbers and hardware specs. The computer sits between the glovebox and the front trunk in an independent enclosure.
Chip design started in February 2016, and concluded 18 months later in August 2017. This is quite fast, all things considered (new team, no in-house experience with chip fabrication).
Batch size of 1, compared to Google’s TPU, which operates with a batch size of 256. This effectively means that computation can start as soon as an image enters the chip, rather than waiting for the rest of the batch to fill.
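To make the latency point concrete, here is a toy sketch (every number in it is invented for illustration, not from the talk): with a large batch, the first frame has to wait for the batch to fill before any compute begins.

```python
# Toy latency model: all numbers are invented for illustration.
CAMERA_FPS = 30                      # assumed camera frame rate
FRAME_PERIOD_MS = 1000 / CAMERA_FPS  # ~33 ms between frames
COMPUTE_MS = 5                       # assumed time to process one batch

def first_result_latency_ms(batch_size: int) -> float:
    """Time from the first frame arriving to the first result."""
    fill_time = (batch_size - 1) * FRAME_PERIOD_MS  # waiting for the batch
    return fill_time + COMPUTE_MS

print(first_result_latency_ms(1))    # ~5 ms: compute starts immediately
print(first_result_latency_ms(256))  # ~8500 ms: dominated by batch fill
```

(Datacenter TPUs fill their batches from many parallel streams, so this comparison is unfair to Google; it just illustrates why a single car wants batch size 1.)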
The GPU on the chip was intentionally underpowered. It was originally there for post-processing, but neural net improvements mean images can now be consumed at full resolution, so extensive post-processing is no longer needed.
For non-specialized architectures (GPUs), power consumption increases roughly linearly with model complexity and compute. This proves problematic for electric vehicles due to the effect on range.
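Some rough arithmetic shows why this matters; all figures below are my assumptions, not numbers from the presentation.

```python
# Back-of-the-envelope range impact of the autonomy computer's power draw.
# Every figure here is an assumption for illustration only.
BATTERY_KWH = 75             # assumed pack size
CONSUMPTION_WH_PER_KM = 150  # assumed driving consumption
AVG_SPEED_KMH = 60           # assumed average speed

def range_km(compute_watts: float) -> float:
    # Energy the computer burns per km at the assumed average speed
    compute_wh_per_km = compute_watts / AVG_SPEED_KMH
    return BATTERY_KWH * 1000 / (CONSUMPTION_WH_PER_KM + compute_wh_per_km)

print(range_km(0))     # ~500 km baseline with no compute load
print(range_km(100))   # ~495 km: an FSD-computer-class draw barely registers
print(range_km(1000))  # ~450 km: a kW-class GPU stack costs ~10% of range
```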
The two systems-on-a-chip (SoCs) are designed to be fully redundant, although in production they are used to cross-validate computation and to run shadow calculations for development purposes.
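A minimal sketch of the cross-validation idea (function names and tolerance are mine, not Tesla’s): both SoCs compute a plan from the same inputs, and the outputs are compared before anything is actuated.

```python
import numpy as np

TOLERANCE = 1e-3  # assumed agreement threshold

def validated_plan(frame, run_plan_a, run_plan_b):
    """Run the same input through both SoCs and cross-check the results."""
    plan_a = np.asarray(run_plan_a(frame))  # stand-ins for the two
    plan_b = np.asarray(run_plan_b(frame))  # independent computations
    if np.max(np.abs(plan_a - plan_b)) > TOLERANCE:
        # A real system would enter a safe fallback rather than raise.
        raise RuntimeError("SoC outputs diverged; enter fail-safe")
    return plan_a
```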
👁 Vision
Andrej Karpathy - Senior Director of AI
Reiteration that visual recognition is essential for autonomy.
All road signs contain text, which is not captured by LiDAR.
“Fleet learning” is the term being used for the collection of live data coming from Tesla cars on the road. There are currently half a million Teslas with the full sensor suite operating in this capacity today. Note that Mobileye also has data from cars on the road today (incl. Audi, BMW), but lacks the vertical integration of the Tesla sensor suite.
Data coming from the fleet is valuable due to the long tail (cars flipped over, unusual road conditions, pickup trucks filled with objects, etc), not the magnitude. At some point, you don’t need any more images of highways; what becomes important are the edge cases.
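One way to express “long tail beats volume” in code is to weight samples by the inverse frequency of their scenario label instead of drawing them uniformly. The weighting scheme here is my own illustration, not a description of Tesla’s pipeline.

```python
import random
from collections import Counter

def rarity_weighted_sample(scenes, labels, k):
    """Sample k scenes, favoring rare scenario labels over common ones."""
    counts = Counter(labels)
    # Inverse-frequency weights: one flipped-car frame outweighs
    # thousands of routine highway frames.
    weights = [1.0 / counts[lbl] for lbl in labels]
    return random.choices(scenes, weights=weights, k=k)
```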
Tesla has a data collection system that is engaged when drivers disengage Autopilot, among other triggers. All human input is treated as error (e.g. any time a human touches the wheel, it is logged as an error and the NN team examines the incident or updates the model).
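In pseudocode, the trigger idea might look like this (event names and record fields are hypothetical): any human override is captured as a labeled incident for the NN team to review.

```python
import json
import time

TRIGGERS = {"autopilot_disengaged", "steering_override", "hard_brake"}

def maybe_log_incident(event, sensor_snapshot, log_path="incidents.jsonl"):
    """Record the sensor state around a human intervention as an 'error'."""
    if event not in TRIGGERS:
        return
    record = {
        "ts": time.time(),
        "trigger": event,             # human input is treated as model error
        "snapshot": sensor_snapshot,  # frames + telemetry around the event
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
```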
The compute on the Tesla SoC is optimized for inference, not training. In other words, the Tesla idling in your garage cannot be sent training data to update the model, due to the specialization of the hardware.
Later in the conversation, Elon somewhat backtracks on this statement when prompted by an investor who is clearly seeking a way to factor an AWS-like revenue potential into the fleet.
SpaceX uses LiDAR to navigate the Dragon capsule when docking with the Space Station. Elon has been known to publicly disparage the use of LiDAR for autonomy, going so far as to claim that the entire industry will drop the technology. This statement would lead one to believe that he has significant experience with LiDAR outside of Tesla and can accurately assess its limitations.
Autopilot progress in snowy environments will see seasonal improvements. Following the fleet learning mentality, this becomes obvious, but it is still interesting to note that the system learns from each winter.
🖥 Software - Navigate on Autopilot
Stuart Bowers - VP Engineering
Algorithms are tested on the fleet using Shadow Mode: a development version of the model is pushed to the non-primary SoC, and the delta between its proposed actions and the actual movement of the vehicle is calculated.
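Conceptually, the delta calculation amounts to something like the sketch below (all names are mine): the candidate model runs passively, with no control authority, and its proposed trajectory is compared against what the car actually did.

```python
import numpy as np

def shadow_mode_delta(candidate_model, frames, actual_trajectory):
    """Run a dev model passively and measure how far its plan deviates
    from the vehicle's actual motion."""
    proposed = np.array([candidate_model(f) for f in frames])
    actual = np.asarray(actual_trajectory)
    deviation = np.linalg.norm(proposed - actual, axis=1)
    # Large deviations flag disagreements worth uploading for review.
    return deviation.mean(), deviation.max()
```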
100,000 automated lane changes per day using Navigate on Autopilot.
📝 Other Notes
In recent weeks there has been increasing reference to “Regulatory Approval”, treated as a blocker for deployment of full self-driving. It seems as though this straw man is conveniently being lined up to help obfuscate any potential delays from the Autopilot team.
Robotaxi will be available “next year” (fool me once, shame on you. fool me twice …)
Any Tesla customer can enroll in the program and choose to share the car at particular times (e.g. while at work, overnight)
Sharing can be limited to friends, co-workers, or friends on social media (interesting one, that)
Tesla will take a large % cut (30% was mentioned)
This will help to offset the cost of car payments
~$30,000 per year / per car with an operational lifespan of 11 years
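Taking those quoted figures at face value (the purchase price is my assumption, and it’s unclear whether the $30k is before or after Tesla’s cut; I assume before):

```python
# Back-of-the-envelope on the quoted robotaxi economics.
GROSS_PER_YEAR = 30_000  # ~$30k/year per car, as quoted
LIFESPAN_YEARS = 11      # quoted operational lifespan
TESLA_CUT = 0.30         # 30% platform fee, as mentioned
CAR_PRICE = 50_000       # assumed purchase price (not from the talk)

owner_gross = GROSS_PER_YEAR * LIFESPAN_YEARS  # $330,000 lifetime gross
owner_net = owner_gross * (1 - TESLA_CUT)      # $231,000 after Tesla's cut
print(owner_net - CAR_PRICE)  # ~$181,000 over 11 years, before running costs
```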
Model Y and Semi launch 2020