    Tesla FSD moves to hands-off, eyes-on driving with FSD 12.4.1 on all roads, leaving Ford’s BlueCruise and GM Super Cruise in the rear vision mirror

    Tesla’s latest software update, build 2024.15.5, contains the company’s autonomous driving effort, Full Self-Driving (Supervised) version 12.4.1. The release may carry only an incremental build number over 12.3.x, but it features a significant change to how the system works.

    In FSD 12.4.1, Tesla changes the way it monitors the driver’s attentiveness: rather than measuring torque input on the steering wheel, the system now uses computer vision AI to watch the driver through the interior cabin camera.

    As per the release notes, Vision-Based Attention Monitoring is described as follows:

    When Full Self-Driving (Supervised) is enabled, the driver monitoring system now primarily relies on the cabin camera to determine driver attentiveness. This enhancement is available on vehicles equipped with a cabin camera and only when the cabin camera has clear and continuous visibility of the driver’s eyes (e.g., the camera is not occluded, there is sufficient cabin illumination, and the driver is looking forward at the road ahead and not wearing sunglasses, a hat with a low brim, or other objects covering the eyes).

    Outside of these circumstances, the driver monitoring system will continue to rely on a combination of torque-based (steering wheel) and vision-based monitoring to detect driver attentiveness. When the cabin camera is actively monitoring driver attentiveness, a green dot appears next to the steering wheel icon on the touchscreen.

    If the camera detects the driver to be inattentive, a warning will appear. The warning can be dismissed by the driver immediately reverting their attention back to the road ahead. Warnings will escalate depending on the nature and frequency of detected inattentiveness, with continuous inattention leading to a Strikeout.
    Cabin camera images do not leave the vehicle itself, which means the system cannot save or transmit information unless you enable data sharing.
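    The monitoring behaviour described in the release notes amounts to a simple state machine: vision-only monitoring when the camera clearly sees the driver’s eyes, a torque-plus-vision fallback otherwise, and escalating warnings that end in a Strikeout. Here’s an illustrative sketch of that logic; the class names, states and strike threshold are all assumptions, not Tesla’s actual implementation, which isn’t public.

    ```python
    # Illustrative sketch only: names and thresholds are assumed, not Tesla's code.
    from dataclasses import dataclass

    STRIKE_LIMIT = 3  # assumed number of escalations before a Strikeout


    @dataclass
    class CabinCameraState:
        occluded: bool            # camera blocked?
        sufficient_light: bool    # enough cabin illumination?
        eyes_visible: bool        # False if sunglasses, low-brim hat, etc.


    class DriverMonitor:
        def __init__(self):
            self.warnings = 0
            self.struck_out = False

        def mode(self, cam: CabinCameraState) -> str:
            """Vision-only when the camera clearly sees the driver's eyes,
            otherwise fall back to combined torque + vision monitoring."""
            if not cam.occluded and cam.sufficient_light and cam.eyes_visible:
                return "vision"        # green dot next to the steering wheel icon
            return "torque+vision"     # fallback described in the release notes

        def on_inattention(self) -> str:
            """Escalate repeated inattention; continued inattention -> Strikeout."""
            self.warnings += 1
            if self.warnings >= STRIKE_LIMIT:
                self.struck_out = True
                return "strikeout"
            return "warning"

        def on_attention_restored(self):
            """Warning is dismissed when the driver looks back at the road."""
            if not self.struck_out:
                self.warnings = 0
    ```

    The key design point the release notes imply is that the camera is the primary signal, with steering-wheel torque retained only as a fallback when the camera can’t see the driver’s eyes.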

    Having been through internal testing, Tesla began rolling out FSD 12.4.1 to customer cars, meaning that Tesla owners in the US can now have their cars drive them to a destination without touching the wheel or pedals.

    The system isn’t perfect, which is why Tesla is carefully monitoring that the driver is paying attention and ready to take over, but this is a major step forward in their efforts to deliver a fully automated robotaxi service and deliver on the name of their autonomous software package – Full Self Driving.

    The competition

    Other automakers like Ford and GM have promoted their ADAS systems (BlueCruise and Super Cruise) as being eyes-on, hands-off for some time, which left some customers with the impression they were ahead of Tesla when it comes to autonomy.

    If you took the narrow use case of highway driving, that argument would have had some merit, but those paying attention would have noticed that Tesla’s broad approach to autonomy was leading to this moment.

    With an OTA software update, Tesla is able to ship this software to millions of customers with one massive difference: this isn’t just for highways. Tesla’s driver monitoring and hands-off driving experience is available on all roads (in the US).

    If you woke up today and saw this news, you may see it as Tesla having just leapfrogged the competition, but those who understand Tesla’s approach knew this day was coming, and that FSD was already superior in many aspects of the driving task.

    If you’re thinking Ford or GM could simply release an update to their software and enable hands-off driving everywhere, think again: their systems rely on HD mapping of roads, and while that’s been done for 130,000 miles and 400,000 miles respectively, that leaves millions of miles uncovered. Mapping an entire country is challenging, and maintaining those maps is even harder, which is why Tesla doesn’t even try.

    Tesla uses a fundamentally different approach to autonomy, which has the dramatic benefit of working at scale. Tesla’s computer vision takes input from the cameras around the car and feeds it into the driving model, effectively a brain like ours that makes decisions about how to safely navigate the environment the car is in. FSD is trained on millions of miles of driving, and videos from owners who already have 12.4.1 are showing impressive results so far.

    This will now leave many Ford and GM owners wondering if they bought the wrong car, while Tesla owners (who purchase or subscribe to FSD) are able to enable the system and be driven to their destination, just by looking out the windscreen.

    Another key difference is camera placement: Tesla mounted their interior camera above the rear view mirror, while Ford and GM positioned their cameras in front of the steering wheel, meaning that on a turn, the car’s view of the driver can be blocked.

    We still don’t know when Tesla will roll this out internationally; however, we know they need to train the model further to support right-hand-drive (RHD) markets, and the focus is certainly America-first – natural for an American company.

    As always, I recommend a diverse diet of YouTube videos of FSD drives; here are a few below, and if you have more, please leave them in the comments.

    Jason Cartwright
    https://techau.com.au/author/jason/
    Creator of techAU, Jason has spent a dozen-plus years covering technology in Australia and around the world. Bringing a background in multimedia and a passion for technology to the job, Cartwright delivers detailed product reviews, event coverage and industry news on a daily basis. Disclaimer: Tesla Shareholder from 20/01/2021
