    PSA: Tesla know about phantom braking and are working their ass off to fix it

    Chances are, if you’ve driven a Tesla on Autopilot for any significant length of time, you will have experienced phantom braking.

    This week, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into the issue after receiving 354 complaints from customers; importantly, none of these involved accidents.

    “The Office of Defects Investigation (ODI) has received 354 complaints alleging unexpected brake activation in 2021-2022 Tesla Model 3 and Model Y vehicles. Received over the past nine months, the reports have often been characterized as “phantom braking” by consumers. Tesla describes the subject vehicles as equipped with a suite of advanced driver assistance system (ADAS) features referred to as Autopilot which Tesla states will allow the vehicle to brake and steer automatically within its lanes.

    The complaints allege that while utilizing the ADAS features including adaptive cruise control, the vehicle unexpectedly applies its brakes while driving at highway speeds. Complainants report that the rapid deceleration can occur without warning, at random, and often repeatedly in a single drive cycle.”

    So what are the potential outcomes here?

    Many online have welcomed the investigation by the NHTSA, believing it will finally get Tesla to fix the issue. Those who believe fixing this is simply a matter of Tesla prioritising developer effort in this area haven’t been paying attention.

    Tesla’s Autopilot system works by leveraging AI, specifically computer vision, to take the data inputs from cameras and sensors and build an understanding of the world around it. The system makes inferences about what each object is (a curb, a car, a pedestrian, etc.) and often its trajectory. It then determines a safe, drivable path through the scene.
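
    To make that pipeline concrete, here is a minimal, purely illustrative sketch in Python. None of these names come from Tesla’s software; the Detection class, plan_path and safe_stopping_distance are hypothetical stand-ins for the stages described above: detect objects, score confidence, then decide whether the path ahead is drivable.

        from dataclasses import dataclass

        # Hypothetical detection record; a real perception stack emits far richer data.
        @dataclass
        class Detection:
            label: str         # e.g. "car", "pedestrian", "curb", "shadow"
            confidence: float  # 0.0-1.0 score from the vision model
            distance_m: float  # estimated distance ahead of the vehicle

        def safe_stopping_distance(speed_ms: float) -> float:
            # Simple kinematics: v^2 / (2a), assuming roughly 5 m/s^2 of braking.
            return speed_ms ** 2 / (2 * 5.0)

        def plan_path(detections: list[Detection], speed_ms: float) -> str:
            # Yield if any confidently detected object sits inside stopping distance.
            for det in detections:
                if det.confidence >= 0.5 and det.distance_m < safe_stopping_distance(speed_ms):
                    return "brake"
            return "proceed"

        scene = [
            Detection("shadow", confidence=0.35, distance_m=40.0),
            Detection("car", confidence=0.97, distance_m=120.0),
        ]
        print(plan_path(scene, speed_ms=30.0))  # -> "proceed"

    In this toy scene the shadow scores below the confidence threshold, so the planner proceeds; phantom braking is what happens when a harmless input like that scores just high enough to trip the brake.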

    The phantom braking is most likely the result of the inference engine seeing something in the data and not being confident it was safe to proceed. Humans quickly observe and understand that there are no solid objects ahead and that it is safe to continue; the car, however, is relying entirely on its inputs and the AI models that enable it to understand objects.

    To improve this inference accuracy and avoid phantom braking events, Tesla needs to continue to train its models on millions of miles (or kilometres) of driving; over time these events will occur less often and eventually, in all likelihood, virtually never.

    It is not accurate to say Tesla is not working on the problem. They are pouring thousands of hours of engineering effort in every week to progress their autonomous driving solution, known as FSD, and that same effort is what will resolve this issue.

    The fix is not something you can simply flip a switch to resolve. It’s also not going to be solved by adding radar back, as many have proposed. It is worth highlighting that plenty of ICE vehicles with Level 2 driver-assist systems also experience this issue, and almost all of those use forward-facing radar.

    Personally, I have experienced phantom braking in my 2019 Model 3 Performance, but thankfully rarely. This week, for example, I took a 3-hour round trip with 95% of it on Autopilot / Navigate on Autopilot. This allowed me to relax for much of the drive and simply monitor the car, rather than expend the mental energy required to keep within the lane lines and adapt to the speed of the vehicles ahead, which is a massive benefit.

    On the return trip, I had a single phantom braking event, with nobody else around and no obstacles ahead, just a couple of shadows on the road, which are often pointed to as a potential source of the confusion. I always have my right foot hovering on or near the accelerator, and within half a second I responded to the vehicle slowing and accelerated without issue.

    The biggest concern is that a phantom braking event could result in the Tesla being rear-ended. Thankfully, the data suggests this is not actually occurring, and the responsibility is on the following driver to keep their distance, regardless of what the car ahead does. Ironically, if they are in a car using adaptive cruise control at a safe following distance, their car would slow without issue, albeit annoyingly.

    It seems Tesla is opting to brake when it has low or no confidence about the path ahead, and as annoying as phantom braking is, that’s clearly the lesser of two evils. If Tesla were to proceed despite having little to no confidence that the environment ahead is clear, the risk is that a collision would actually occur, a much greater risk than that of a rear-end collision after braking out of caution.
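
    One way to see why braking on low confidence is the rational default is to compare expected costs. The sketch below makes the asymmetry explicit; every number in it is invented for illustration and is not drawn from Tesla or NHTSA data.

        # Illustrative expected-cost comparison for a brake/proceed decision
        # under uncertainty. All costs and probabilities are made-up numbers.
        COST_PHANTOM_BRAKE = 1.0       # annoyance plus a small rear-end risk
        COST_MISSED_OBSTACLE = 1000.0  # a real frontal collision

        def expected_cost(action: str, p_obstacle: float) -> float:
            # Expected cost given the model's belief that a solid obstacle is present.
            if action == "brake":
                # Braking for nothing costs a little; braking for something costs ~0.
                return (1 - p_obstacle) * COST_PHANTOM_BRAKE
            # Proceeding into a real obstacle is catastrophic.
            return p_obstacle * COST_MISSED_OBSTACLE

        # Even when the model thinks an obstacle is unlikely (say 2%), braking
        # is the cheaper bet because the costs are so asymmetric.
        print(expected_cost("brake", 0.02))    # 0.98
        print(expected_cost("proceed", 0.02))  # 20.0

    Under those toy numbers, proceeding only becomes the better choice once the model is more than about 99.9% sure the path is clear, which is exactly the cautious behaviour described above.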

    The takeaway from all of this: to address the concerns raised by customers, the NHTSA will likely recommend Tesla do better in this area. Tesla is already working incredibly hard to improve its AI-powered software, and with the ability to push over-the-air updates to resolve issues like this, we should expect to see the issue resolved in the months ahead.

    By comparison, vehicles from other OEMs that also experience phantom braking, like my wife’s 2018 Honda CR-V VTi-LX, will never be updated.

    Jason Cartwright
    Creator of techAU, Jason has spent a dozen-plus years covering technology in Australia and around the world. Bringing a background in multimedia and a passion for technology to the job, Cartwright delivers detailed product reviews, event coverage and industry news on a daily basis. Disclaimer: Tesla shareholder since 20/01/2021.

    1 COMMENT

    1. Thanks Jason. 354 complaints out of close to a million cars… Not bad.
      It’s the same with my 2019 M3P. I use TACC almost all the time, and Autopilot a hell of a lot. It makes any highway driving far less tiring and enhances safety.
      I only very rarely have issues with phantom braking (and actually would prefer that to ignoring an obstacle in front).
      Likewise I always “hover” my foot over the GO pedal, which reduces any such events to just a bit of a nuisance.
      In my experience the only place it’s a significant recurring issue is single-lane roads with parked cars and no painted edge line – something which basic Autopilot is NOT designed for anyway.
