The Tesla Model 3 in my garage received its second update in as many weeks today. The car was delivered with V9.0, and the big feature update arrived in V10.0 a few days later. Now, today, Australians began receiving another update.
This release is version 2019.32.12.1 and focuses on bug fixes rather than features. One of the biggest issues with V10.0 was the dramatic increase in Autopilot nag frequency, reducing hands-free driving time to as little as 10 seconds.
The term 'Autopilot nag' refers to the system reminding drivers to put their hands on the wheel, to confirm to the car that they are paying attention. It's very possible you are paying attention the whole time, but the car's only data point currently is the force (or lack of force) you apply to the steering wheel.
When you engage Autopilot, the lane lines on the display turn blue, indicating the car has locked onto the lane and has control. It takes just a matter of minutes to begin to trust the system. While it's not perfect, you quickly become accustomed to relaxing a little more while driving, spending less mental effort ensuring the car stays between the lines and reduces speed based on the cars ahead.
In V9.0, Autopilot would only nag you after not receiving an input for as much as a minute. This was generous and gave you a real preview of a future where the car can drive itself, but the reminder would eventually come, to make clear we're not there yet.
In V10.0 the alert frequency became much more aggressive, shortening the timeframe to as little as 10 seconds.
The nag interval is actually not a fixed value; it is determined by vehicle speed. This means that on the highway you will see it less often than on slower roads. This reflects the system's strong confidence in highway driving, complemented by the typically well-marked lanes on a highway.
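To illustrate the idea of a speed-dependent nag interval, here's a minimal sketch. Tesla hasn't published the actual mapping, so the speed thresholds and timeouts below are invented for illustration only; the real system presumably uses different values and more inputs.

```python
def nag_interval_seconds(speed_kmh: float) -> float:
    """Hypothetical mapping from vehicle speed to the hands-on-wheel
    reminder interval. All thresholds and values here are assumptions
    for illustration, not Tesla's actual figures."""
    if speed_kmh >= 80:
        # Highway speeds: well-marked lanes, higher system confidence
        return 30.0
    elif speed_kmh >= 50:
        # Suburban arterial roads: moderate confidence
        return 20.0
    else:
        # Slow, complex urban streets: nag most frequently
        return 10.0
```

The key design point is simply that the timeout scales with the system's confidence in the driving environment, and highway speed is a cheap proxy for that confidence.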
Tesla’s Autopilot system is powered by machine learning: it takes labelled data and learns how to respond to similar situations on the road.
With every Autopilot disengagement, Tesla can leverage the data from the car’s cameras and have humans label it to detail the path the car should have taken. Once that data is labelled, it is fed back into the ML model for retraining.
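The data loop described above can be sketched in a few lines. This is a toy outline under stated assumptions, not Tesla's pipeline: `label_fn` stands in for the human annotation step and `retrain_fn` for whatever training process updates the model.

```python
def data_engine_cycle(fleet_clips, label_fn, retrain_fn, model):
    """Illustrative sketch of the disengagement feedback loop:
    collect camera clips from the fleet, have humans label the
    correct path, then retrain the model on the labelled data.
    label_fn and retrain_fn are hypothetical stand-ins."""
    # Pair each disengagement clip with its human-provided label
    labelled = [(clip, label_fn(clip)) for clip in fleet_clips]
    # Feed the labelled data back into training and return the updated model
    return retrain_fn(model, labelled)
```

Each pass through this loop is one turn of the crank: more disengagements in, a slightly better model out.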
Back in April at Tesla’s Autonomy Day, we learnt the company has what it calls ‘shadow mode’. This is the capability to run an updated model side-by-side with the active model and see whether its decision in a similar situation would have been better or worse.
If the result is positive, the update is rolled into production and propagated through the Tesla fleet. When a car next finds itself in a similar scenario, the way it responds is improved.
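A minimal sketch of that promote-if-better comparison, assuming the models can be replayed over logged scenarios and that some scoring metric exists (the `score_fn` here is a hypothetical stand-in; Tesla hasn't detailed its evaluation criteria):

```python
def shadow_mode_evaluation(scenarios, active_model, candidate_model, score_fn):
    """Illustrative 'shadow mode' sketch: replay logged scenarios through
    both the active and candidate models, score each decision, and
    promote the candidate only if it outperforms the active model."""
    active_score = sum(score_fn(active_model(s), s) for s in scenarios)
    candidate_score = sum(score_fn(candidate_model(s), s) for s in scenarios)
    # True means: roll the candidate into production and push to the fleet
    return candidate_score > active_score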
This cycle repeats, and Autopilot should get more confident in how it navigates the world, requiring driver input less often. As the rate of data collection increases with more Tesla vehicles on the road, these improvements shouldn’t happen linearly, but exponentially.
Given this trajectory, it was surprising to see the Autopilot nag interval shortened, when the system should be getting more confident about tracking lane lines around corners, through intersections and more.
Thankfully, today’s software update extends the Autopilot nag interval to a more practical timeframe of as much as 30 seconds. If the car had close to 100% confidence in lane line markings, drivable space and path planning, I’d wish for no Autopilot nag at all; however, I understand we’re not there yet.
As Tesla vehicles continue to gain confidence in the environments they see around them, we should be able to go longer without requiring input.
Achieving fully autonomous vehicles doesn’t happen automatically; you have to iterate your way there. Until we have cars that can drive themselves, it is a challenge to find the right balance between maintaining driver attention and handing control back to the driver elegantly, should the system need to.
The Model 3 in particular is an interesting vehicle. It features a camera above the mirror that is currently disabled. Likely designed to be a safety system for when we have RoboTaxis, it would be great to see Tesla leverage that camera for driver monitoring, rather than relying on the fairly basic steering wheel sensor.
So far Elon Musk has resisted calls to do so, as he believes this interim period before Full Self-Driving is too short to justify the effort required. General Motors has Super Cruise, a system that uses driver monitoring, in place of wheel inputs, to determine attention levels; however, it is currently restricted to highway driving in the US.
Here’s a good example of why you still need to be vigilant about your attention levels: wildlife is very unpredictable, and avoiding animals is not currently supported by Autopilot.