GPS spoofing attack could drive a Tesla Model S or 3 off the road, security researchers find
One of Tesla's latest selling points is Navigate on Autopilot, a feature that allows a car to carry out certain driving maneuvers by itself. In theory, this should happen under close human supervision, although some settings can reduce the degree of oversight required. The feature is available on the American company's newer Model S and Model 3 cars.
Navigate on Autopilot relies on the car's GNSS (or GPS) receiver to complete the tasks involved. Accordingly, the security firm Regulus Cyber decided to test its theory that this software is vulnerable to attacks known as GPS spoofing. The experiment was allegedly carried out on a Tesla Model 3 in Europe, using the "low-cost, open source hardware and software...that is accessible to anyone via e-commerce websites and open source projects on GitHub" with which, according to Regulus, the firm's spoofing tests are routinely conducted.
The company reports that successfully spoofing the Model 3's GNSS took "less than one minute". The attack allegedly caused the car to mistake a straight stretch of road for a turning located 3 miles ahead, while it was driving autonomously and "maintaining a constant speed and position in the middle of the lane". As a result, the vehicle decelerated and then turned sharply to the right, which the driver was apparently unable to prevent.
This may be a worrying indication of GPS spoofing's potential to compromise a car's ability to drive autonomously while keeping its occupants safe. The attack reportedly also affected the test car's air suspension as the vehicle made its erroneous exit, causing it to alter its ride height as if it had detected a change in road surface. Regulus Cyber has made these findings public, and will hopefully also pass them on to Tesla.