Last week I got a question on Twitter regarding Tesla’s FSD: what exactly is FSD? First off, it stands for Full Self-Driving and represents Tesla’s development efforts towards fully autonomous, driverless vehicles.
If you’re watching and monitoring every move Tesla makes and every Elon interview, then you’ll already know, but the average person considering the purchase of a new vehicle may not be aware of the detail. When it comes to FSD, there is a lot of detail, and this is one area where the detail really matters.
Initially Tesla described their autonomous driving efforts as Autopilot. Over time that has changed, and now when you configure a new vehicle on the Tesla website, you’ll see it is now included.
Autopilot now refers to the car’s ability to accelerate and brake automatically, similar to adaptive cruise control. Autopilot also steers your car between the lines on the road. Unlike the technology in most vehicles, Tesla uses lane centering: the car measures the distance between the two lines, halves it to determine the midpoint, then positions your car there, so it doesn’t ping-pong between the lines.
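That lane centering reduces to simple geometry. Here’s a toy sketch in Python, assuming the perception system already reports the lateral positions of the two lane lines – the function and variable names are illustrative, not Tesla’s actual code:

```python
def lane_centre_offset(left_line_m, right_line_m, car_position_m):
    """Return how far the car should shift (in metres) to sit on the
    midpoint between the two detected lane lines. Positive means shift
    towards the right line, negative towards the left."""
    midpoint_m = (left_line_m + right_line_m) / 2.0
    return midpoint_m - car_position_m

# Lines at 0.0 m and 3.6 m, car at 1.2 m: midpoint is 1.8 m,
# so the car should shift about 0.6 m to centre itself.
```

Real lane keeping layers control loops and curved-lane geometry on top of this, but the “halve the gap, aim for the midpoint” idea is the core of it.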
The car leverages the on-board array of cameras, sensors and radar, integrated subtly in the vehicle’s body.
While humans have their vision obscured by the A, B and C pillars, creating blind spots, Tesla vehicles can see the entire 360-degree picture around the car thanks to 8 surround cameras, sampling multiple times per second at a range of up to 250 metres.
Tesla also uses 12 ultrasonic sensors, allowing for detection of both hard and soft objects. A forward-facing radar with enhanced processing provides additional data about the world on a redundant wavelength that is able to see through heavy rain, fog, dust and even the car ahead.
Full Self-Driving Capability
To move beyond level 2/3 autonomy, which still requires a driver, and hit level 4/5, where humans are no longer needed, the car has to have a brain: a way to process the insane number of possible scenarios on the fly, in environments it encounters for the first time, and never fail.
Tesla have taken a very different approach to that of other automakers and are using AI, running on some serious custom-built hardware, to be that brain. Initially they leveraged Nvidia’s Drive PX platform, but quickly outgrew it and designed their own.
This new onboard computer has over 40 times the computing power of the previous generation and runs the new Tesla-developed neural net for vision, sonar and radar processing. Together, this system provides a view of the world that a driver alone cannot access, seeing in every direction simultaneously, and on wavelengths that go far beyond the human senses.
What’s neat about this implementation is that Tesla is letting this brain learn from Tesla vehicles out in the real world, finding all the edge cases, like roadworks, a car flipped on its side, or debris on the road that you need to avoid. When drivers using Autopilot have to take control, the car goes back a few seconds, grabs frames from the video feeds of all cameras along with data from the sensors and vehicle telemetry, and sends it back to Tesla HQ over the in-built cellular connection.
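That “go back a few seconds” behaviour is essentially a rolling buffer of recent sensor frames. A minimal sketch of the idea, using hypothetical frame and telemetry objects (none of these names come from Tesla):

```python
from collections import deque


class DisengagementRecorder:
    """Keep only the last few seconds of frames; on an Autopilot
    disengagement, package the buffered frames plus telemetry for upload."""

    def __init__(self, seconds=5, frames_per_second=10):
        # A deque with maxlen silently drops the oldest frame as new ones arrive
        self.buffer = deque(maxlen=seconds * frames_per_second)

    def record(self, frame):
        self.buffer.append(frame)

    def on_disengagement(self, telemetry):
        # Snapshot the buffer so frames recorded afterwards don't mutate the upload
        return {"frames": list(self.buffer), "telemetry": telemetry}
```

The point of the bounded buffer is that the car never stores or uploads a whole drive – only the moments immediately before a human had to intervene.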
This information is then analysed and labelled, often by humans but increasingly by image recognition AI, so that edge case is understood and solved for. That learning by the central Tesla brain is then pushed back to each car in the form of an over-the-air software update.
What’s neat about the Tesla architecture is that they can actually run new software in shadow mode. This means the vehicle monitors the environment, models what response the unreleased software would have taken, then compares that with the current build to see if the results are better or worse. If the fix for that scenario is confirmed, it’d be rolled into the next public build. That’s a loop that will accelerate the resolution of edge cases exponentially as all vehicles with FSD hardware (more sold every day) contribute.
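Conceptually, shadow mode is just running two builds on the same input and logging disagreements, while only ever acting on the released one. A hedged sketch – the models here are stand-in functions, not Tesla’s networks:

```python
def shadow_compare(scene, production_model, shadow_model):
    """Run both builds on the same scene. Only the production decision is
    acted on; any disagreement is logged for later analysis."""
    production_decision = production_model(scene)
    shadow_decision = shadow_model(scene)
    disagreement = None
    if shadow_decision != production_decision:
        disagreement = {
            "scene": scene,
            "production": production_decision,
            "shadow": shadow_decision,
        }
    return production_decision, disagreement
```

Because the shadow build never touches the controls, Tesla can trial a candidate release across the whole fleet with zero added risk to drivers.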
As you can imagine, the list of edge cases out in the world is enormous, but with every km driven by hundreds of thousands of Teslas helping, the brain is learning fast. At some point, the car will be smarter and safer than a human driver. Tesla believes that point will be the end of this year, 2019, less than 6 months from now.
Being ‘feature complete’, as Musk calls it, means the vehicle can cope with all the regular scenarios we encounter as we drive – red lights, roundabouts, overtaking etc. – as well as the weird and unforeseen ones.
Right now the Tesla website lists the following as Full Self-Driving features:
- Auto Lane Change: automatic lane changes while driving on the motorway.
- Autopark: both parallel and perpendicular spaces.
Coming later this year:
- Recognise and respond to traffic lights and stop signs.
- Automatic driving on city streets.
- Summon: your parked car will come find you anywhere in a car park. Really.
- Navigate on Autopilot: automatic driving from motorway on-ramp to off-ramp including interchanges and overtaking slower cars.
That ‘coming later this year’ is a little cheeky, as this requires approval from Government authorities in each country, and sometimes each state, before it’s allowed for use. It’s likely we see this approved first in select areas of America, then rolled out to Australia once our politicians understand it better.
Right now, when you use Autopilot, the way Tesla know you’re paying attention is through inputs on the steering wheel. If you don’t have your hands on the wheel every 30-40 seconds or so, the car will complain until you do. Once FSD arrives, that annoying requirement will finally go away.
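That hands-on-wheel check is effectively a watchdog timer that steering input resets. A toy version below – the 30-second timeout is this article’s rough figure, not an official spec, and the names are illustrative:

```python
class AttentionMonitor:
    """Warn when no steering-wheel input has been seen within the timeout."""

    def __init__(self, timeout_s=30.0):
        self.timeout_s = timeout_s
        self.last_input_s = 0.0

    def wheel_input(self, now_s):
        # Any torque on the wheel resets the countdown
        self.last_input_s = now_s

    def should_warn(self, now_s):
        return (now_s - self.last_input_s) > self.timeout_s
```

In the real car the escalation goes further – visual nags, then audible alerts, then Autopilot disengaging – but the underlying reset-on-input timer is the same idea.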
Every new Tesla comes with what Tesla calls Hardware V3, which includes the full self-driving computer, plus the cameras, sensors and radar necessary to achieve the dream of a driverless car. It’s just the software that needs to be completed, then approved.
Achieving this approval won’t be easy, but a massive array of data proving beyond doubt that Tesla’s technology is many times safer than a human would be a hard thing for any politician to argue against.
Australia has spent millions of dollars on road safety campaigns to keep people safe on our roads, but relieving humans of their duties is the only real way of solving the issue. At that point, you can use your phone, watch a movie, hell even be intoxicated. Hopefully, as a society, we can find much better things to do with a vehicle that can drive itself, like reduce the number of vehicles in your garage from 2 to 1.
Another massive opportunity for Tesla owners who get FSD is the ability, in the future, to enrol their vehicle in the Tesla fleet. This would work like Uber, without the driver. You could fit another person in the car, already an efficiency gain, and with no driver taking their cut, you could make money by sending the car out to work while you don’t need it.
Using something like Enhanced Summon (available to early access testers) to have the car come and pick you up from the carpark is neat, but the opportunity for Tesla to turn owners into small businesses is a much greater use.
Tesla let you buy FSD during the checkout process (currently A$8,500), or after delivery through a software unlock.
Right now Tesla warns that the currently enabled features require active driver supervision and do not make the vehicle autonomous. When FSD is fully realised, that won’t be the case. When asked who would bear the liability should something happen, Musk said Tesla probably should, setting the benchmark for all automakers to back their own hardware/software stack to keep passengers safe.
In April, Tesla released a video showcasing their latest build (at the time) of FSD. It shows the vehicle navigating residential streets, stopping at signs and traffic lights, changing lanes and more. This gives you an idea of what’s coming, but by the time 2020 rolls around, we’re all expecting a lot more.
The car should learn in a relatively short period of time where your home and work are, and daily commutes should be achievable without user input. For those times you’re going somewhere else, you’d just tell the vehicle and it should safely navigate you there, doing things like checking the available range and charger locations relative to the journey.