    Existing Teslas move to Tesla Vision in latest software update

    We’ve known this was coming for a while, but with today’s over-the-air software update (2022.24.6), the radar in my Model 3 is now effectively useless.

    Tesla’s approach to autonomy uses computer vision: taking inputs from cameras positioned around the car, making inferences about the surrounding environment, then planning a path through that environment, often leveraging the route entered into your navigation.

    While all new Model Ys made since June 2022 have no radar (behind the front bar), my 2019 Model 3, like many others, had until now used a combination of sensor inputs, including front-facing radar, to adapt to cars and objects ahead of it, along with a variety of active safety features.

    It’s no secret that customers experience phantom braking from time to time, and as detailed at the CVPR keynote in 2021, Tesla’s former head of AI, Andrej Karpathy, did a great job explaining why the company was so bullish on moving to vision-only. Karpathy explained that inputs from a variety of sensors often competed with one another, and understanding which to believe was complex. When considering which should be the source of truth, they looked at which could be the most accurate not just now, but over time.

    Radar is ultimately a fairly noisy input, and even back then, Tesla was finding it could achieve better results using vision alone.

    While my car got 2022.24.6 today, a check of the release notes revealed a new item, listed under the earlier 2022.20 branch.

    Tesla Vision
    Your vehicle is now running Tesla Vision! It will rely on camera vision coupled with neural net processing to deliver certain Autopilot and active safety features. Vehicles using Tesla Vision have received top safety ratings, and fleet data shows that it provides overall enhanced safety for our customers. Note that, with Tesla Vision, available following distance settings are from 2-7 and Autosteer top speed is 140km/h (85 mph).

    The following active safety features previously leveraged Radar and now use Tesla Vision.

    • Forward Collision Warning
    • Automatic Emergency Braking
    • Lane Departure Warning / Avoidance
    • Emergency Lane Departure Avoidance
    • Pedal Misapplication Mitigation
    • Auto High Beam
    • Autowiper
    • Blind Spot Collision Warning Chime
    • Side Collision Warning

    Tesla Vision has been a key part of the FSD Beta, which is yet to make its way outside the United States and Canada, but with more than 100,000 Beta users, it’s clearly proving successful. Tesla evidently has the data to show that radar is inferior to vision-only, despite many opponents and competitors claiming radar and lidar are necessary.

    With today being Father’s Day in Australia, I had the opportunity to drive the car for close to 300km using Vision-only and can now compare it to my experience with a radar-assisted technology stack.

    When I drive, I spend 95% of my time on Autopilot, enabling it anywhere it’ll let me, because it makes me a safer driver. Being confident the car will adapt to the speed of the cars ahead and keep within the lane lines gives you more opportunity to take in the environment around you.

    When you’re in this mode you feel confident the car has your back, but knowing the dramatic shift in this new software build, I was more cautious than usual.

    The real risk areas are sections of road with gaps in the markings, where the road widens significantly, where new lanes are added, or where lanes end and you need to merge. There are also attributes like how well the car centres itself in the lane, particularly around corners, and in these areas I felt it did a really great job.

    During my drive today, the only time I experienced braking I wasn’t expecting was when I used the windscreen wipers to clean the windscreen. In this instance I saw an alert shown on the display and felt the car slow by around 5km/h. A few seconds later the windscreen was clean, the wipers stopped and the speed returned to normal.

    There was one section of the trip I was keen to drive, to test if any improvements had been made. This stretch of road is something the car had struggled on before. The segment features a long bend to the left, where a turning lane to the right emerges (blind to the car). The natural flow leaves you, as a driver, running through the entry to the right-turn lane, something permitted by the dashed lines marking it.

    Technically you’re probably meant to move left when continuing straight and avoid the turning lane altogether, but that’s not actually the challenge. The car spots the turning lane late, as a human would at 100km/h, and when it does, it slows before making the decision to proceed through the right-turn entry, as many humans would. Today there was really no change in this behaviour.

    The key takeaway from all of this is ultimately this: phantom braking from shadows, cross-traffic, intersection confusion or a million other things could still occur today with vision-only, but there’s an awful lot of runway here.

    Tesla’s autonomous efforts are being fed by data from millions of vehicles across the world, and every day that number continues to grow. While things are not perfect, Tesla’s approach has the potential to resolve these issues, unlike virtually every other car on the market.

    Phantom braking does occur in vehicles from other brands, and where it does, it will keep happening for the life of that vehicle. Tesla’s approach of ingesting data from the fleet, training on that data, and then feeding improved models back to our cars to make better inferences allows the system to get better over time. With noisy radar out of the equation, I expect vision-only to really stretch its legs and deliver improvements in the weeks and months ahead.

    Jason Cartwright
    Creator of techAU, Jason has spent the past dozen-plus years covering technology in Australia and around the world. Bringing a background in multimedia and a passion for technology to the job, Cartwright delivers detailed product reviews, event coverage and industry news on a daily basis. Disclaimer: Tesla Shareholder from 20/01/2021

    2 COMMENTS

    1. I have one particular place where the car tries to jerk into a turn lane just after a concrete divider. It does this about 30% of the time. This behaviour is based on the vision system, as radar can’t see line markings. On 2022.24.6 this behaviour is still exactly the same.
      I think we will see the next significant improvements when the single stack, that is used in FSD Beta, is implemented.

    2. Whilst I have not noticed any difference with Tesla Vision during the daytime, when driving home last night the automatic headlights (which are required to be on to use Autopilot) were blinding oncoming traffic. You have to keep turning them off to stop this.
