Tesla FSD avoids hitting a deer as NHTSA probes rosy claims about its abilities
Elon Musk is on record saying that he expects Tesla to be able to demonstrate that its Full Self-Driving (Supervised) feature is safer than human drivers in the second quarter of 2025.
Tesla could then drop the "Supervised" part of the name and pave the way for regulatory approval of the driverless Robotaxi to provide Cybercab services on public roads, which should happen before its promised mass production in 2026.
A recent incident recorded on a Canadian highway near Calgary serves as another example of the kind of adequate FSD reaction Tesla will need to demonstrate to convince regulators that unsupervised self-driving is safe to greenlight.
The Tesla was driving on FSD at about 70 mph on the highway at night when a deer suddenly appeared from the left, trying to cross in front of the moving car. It took the FSD software a split second to recognize the danger, brake briefly, and let the deer pass in front of the car, jolting the passengers inside.
The animal could then be seen running away unharmed in the rear camera footage, while the Tesla owner explained the brief pause visible in the clip:
When the car applied the brakes, the relative speed between the deer and the car briefly decreased, allowing the camera to capture more frames within that moment. This effect is visible in the slow-motion replay, where you can see a slight pause in the deer’s movement—corresponding to the instant when the brakes were engaged.
Rather than easing regulatory scrutiny over Tesla's FSD, though, the NHTSA has now opened another investigation, this time for misleading communication about its abilities on social media.
Tesla sometimes cites events like the deer incident that present FSD in a positive light on X, but, according to the NHTSA, those humblebrags can mislead the public.
Tesla, it argues, is acting as if its FSD feature is already making its cars good enough to serve as autonomous robotaxis "rather than a partial automation/driver assist system that requires persistent attention and intermittent intervention by the driver."
Needless to say, Tesla argues the opposite, pointing to the many warnings it issues while driving and the cautionary language in its FSD manual.
"We believe that Tesla’s postings conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task," continues the NHTSA, which also has another probe open into FSD's safety record in situations with low visibility that is still ongoing.
Source(s)
Reuters, HaobamMano (X)