
    Tesla driver blames FSD Beta for 8-car pileup, but that makes no sense, let’s see the crash data, Elon Musk

    Whenever autonomous vehicles are involved in an accident, they get a lot of attention as we compare their abilities and deficits against those of human drivers. Today, an incident involving a Tesla got plenty of attention after an article and videos were released by the publication ‘The Intercept’ (keep in mind the author appears incredibly anti-Tesla).

    The incident occurred on November 24th, 2022 in the Yerba Buena Island Tunnel and involved no fewer than 8 vehicles. The white 2021 Model S entered the tunnel, then braked and turned left into the path of the vehicles behind it.

    Things get interesting when the driver, Nicholas Jollymore (77 years old) of San Francisco, claimed that Tesla’s Full Self-Driving Beta Version 11 was active and that it malfunctioned.

    Three days later, an investigating officer called Jollymore to clarify his statement, and he reiterated that the car was in Full Self-Driving Beta mode and travelling at around 55 miles per hour.

    The crash was captured on video by the Transportation Management Center (TMC) on the Bay Bridge, and the footage has been released. Rather than clarifying what occurred in the incident, the videos raise more questions than they answer.

    Let’s start by stating that there is no doubt the operation of the vehicle is still the responsibility of the driver. This is made clear at checkout when buying the vehicle, again when you purchase and enable FSD, and again when you enable the Beta. Drivers are also prompted to pay attention when engaging FSD.

    Those who understand the abilities and limitations of Tesla’s software will know that FSD Beta, which uses the newer AI stack to navigate city streets, falls back to the legacy Autopilot codebase when travelling on highways. With this stretch of road having a 50 mph+ speed limit, it’s safe to say, FSD Beta (aka, Tesla’s most advanced software), is not responsible here.

    We can also see the lane lines are well marked, an environment in which Autopilot does a great job of tracking the lane and centring the vehicle. Anyone who’s used Autopilot knows the car does not randomly dart to the left, and it is not responsible for the kind of lane change seen in the incident.

    Phantom braking is something that does occur from time to time, but for it to happen with nothing ahead of the vehicle is incredibly unlikely, and even an application of the brakes (which we see in the rear video) does not explain the hard turn to the left.

    Navigate on Autopilot can change lanes, and does so for a couple of reasons: the first is to be in the correct lane to follow the navigation route, and the second is to overtake slower traffic. Given the next turn was not for some time and there was no traffic in front of the car, both of these also seem unlikely. Automatic lane changes also only occur if you have disabled stalk confirmation for lane changes.

    One theory was that an orange light adjacent to an off-ramp ahead of the tunnel’s entrance may have been incorrectly identified as a red light the car needed to stop for. The problem with that theory is the timing and distance at which the brake lights would illuminate. This is again where experience driving a Tesla (much of it on Autopilot) helps: if the car does respond to a light, the detection and response are virtually instant, if not before passing the light. It does not have a delayed reaction and apply the brakes seconds after passing a light or sign, which also makes this theory extremely unlikely.
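    To put rough numbers on that timing argument, here’s a quick back-of-envelope sketch. The ~55 mph speed comes from the driver’s own statement; the one-to-three-second delay values are purely illustrative assumptions, not measurements from the video.

        # Back-of-envelope only: how far a car travelling ~55 mph covers for each
        # second of delayed reaction. The speed is from the driver's statement;
        # the delay values are illustrative assumptions, not measured from the footage.

        MPH_TO_MPS = 0.44704  # metres per second per mile per hour

        speed_mps = 55 * MPH_TO_MPS  # ~24.6 m/s (~80.7 ft/s)

        for delay_s in (1, 2, 3):
            distance_m = speed_mps * delay_s
            print(f"{delay_s} s after passing the light: ~{distance_m:.0f} m "
                  f"(~{distance_m * 3.281:.0f} ft) further down the road")

    At that speed, a braking response even two or three seconds late would begin roughly 50 to 75 metres past the light, which does not match the near-instant traffic-light response Tesla owners are used to seeing.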

    So if it wasn’t Autopilot and it wasn’t Navigate on Autopilot and it wasn’t FSD Beta, what was the cause of this accident?

    We really need Tesla to share the vehicle data from the incident, as they have done in the past, to know for sure, but my best assessment is that the driver was responsible and blamed FSD Beta.

    It is not possible to tell from the video how much of the vehicle’s movement the driver was responsible for, but we can see that it takes multiple seconds for any response. Typically, attentive drivers would apply the accelerator in the event of phantom braking and, in the event of an unwanted lane change, simply steer out of it. Neither of these things occurred.

    The drivers in the cars behind also took far too long to respond to the changing environment ahead of them, which resulted in the pile-up. The fact that these cars were following at unsafe distances was highlighted in the summary of the report.

    Here’s the original article from The Intercept, here’s the Traffic Crash Report from the Department of California Highway Patrol, and here are the videos.

    We’ve reached out to Jollymore for comment and clarification on the incident and will update this post with any response.

    Jason Cartwright (https://techau.com.au/author/jason/)
    Creator of techAU, Jason has spent more than a dozen years covering technology in Australia and around the world. Bringing a background in multimedia and a passion for technology to the job, Cartwright delivers detailed product reviews, event coverage and industry news on a daily basis. Disclaimer: Tesla shareholder from 20/01/2021

    9 COMMENTS

    1. Hey Jason. Good analysis here, but I think it’s inconclusive, and saying that FSD is not to blame is an incorrect headline. Without the data, we cannot accurately ascertain if there was an immediate error and/or whether the driver was prevented from responding by taking over. As you can see in this youtuber’s (MKBHD) video (https://www.youtube.com/watch?v=9nF0K2nJ7N8&t=407s), the FSD beta still has some moments where it’s challenged. This does not absolve the driver of responsibility – even a regular car could slow and come to a stop due to a mechanical or electrical fault. However, it may signal that the “sell or promise” of FSD may need to be re-marketed as Driver Assisted Self Driving or something like that.

      • Except that FSD cannot be enabled on a highway. This would have been Autopilot or Navigate on Autopilot. That means no stopping for red lights (or perceived red lights), no turns, and no lane changes except to navigate to the correct freeway at a split.

    2. This is the problem with Tesla fanboys… You wrote that ‘it’s safe to say, FSD Beta (aka, Tesla’s most advanced software), is not responsible here.’ However, the driver made a statement to the police saying the opposite. I am not sure why you would go out of your way to call the driver a liar in their police statement? That’s just wrong. You are asserting that the Tesla software is perfect when it has been shown in multiple demonstrations not to be. It’s ok that it is not perfect. Don’t defend it and call people liars in their police statements because you have rose tinted glasses. And let’s also admit that the idea that FSD is meant to be used with a fully attentive driver is a false narrative for what people will actually try and do with it. Just like Oxycodone was only meant for severe pain and recovery from surgery, millions of people used it recreationally and became addicted. FSD is meant to have an attentive driver but many people who use it won’t be. So the software needs to be perfect, with a high bar set by human traits to abuse the gifts they have been given.

      • As I wrote elsewhere… FSD beta does not function on highways. If the driver stated otherwise, he was incorrect. This would have been Autopilot (and maybe Navigate on Autopilot).

    3. I wasn’t aware that Tesla already had vehicles out there that are fully autonomous, with no steering wheel or accelerator pedal, and that require no driver monitoring and take over if things don’t go exactly as planned.
      (And yes, I’m being VERY sarcastic).
      I hold the driver ENTIRELY at fault here, along with following drivers following too close, which of course rarely if ever happens (yes- more sarcasm.)
      What the hell was he thinking, allowing the car to fully slow and then stop in the far left (i.e. fast) lane?

    4. Describing the way the software is supposed to behave isn’t evidence it can’t misbehave. Absent any evidence self driving wasn’t on, I’m not sure why anyone would doubt the police statement aside from wanting to believe the software is infallible. FSD does warn to remain attentive but realistically, we’re all human. If you use something long enough, and everything goes fine – you’re going to get at least a little complacent. If you need the same attentiveness and reaction time as if you were driving yourself, the only way to get that is to actually drive yourself.

    5. “Phantom braking is something that does occur from time to time”

      If this is the case, then these “driver aids” should not be allowed on the road.

      It’s hard enough dealing with unpredictable drivers on the road but to also have to deal with an unpredictable vehicle which is your own?

      Tesla have overcommitted and underdelivered and it’s extremely dangerous.
