06:19 GMT, 06 April 2020

    Security firm McAfee revealed in a Wednesday report that some Tesla vehicles can be tricked into speeding while their automated driving systems are active.

    The experiment was conducted on two 2016 Tesla cars - a Model S and a Model X - which use camera technology developed by Mobileye, an Intel-owned supplier of semiautonomous driving technology. The researchers found that when Tesla’s Automatic Cruise Control feature was activated, the 2016 Model S determined the speed limit using its Mobileye EyeQ3 camera.

    After 18 months of research, the firm found that altering road signs - such as a 35 mph speed limit sign - with electrical tape could trick the car’s camera into misreading them. For example, adding a 2-inch piece of black electrical tape across the ‘3’ on a 35 mph speed limit sign caused the Tesla to interpret the speed limit as 85 mph.
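
    The trick is a physical example of what machine-learning researchers call an adversarial perturbation: a small, deliberate change to an input that flips a classifier’s output. As a purely illustrative sketch - not McAfee’s actual methodology, and assuming a hypothetical sign-classification model and test image - the idea can be simulated in software like this:

```python
# Conceptual sketch only (not McAfee's actual tooling): simulate the "electrical
# tape" perturbation digitally and compare a classifier's output before and after.
# classify_sign() and the image path are hypothetical placeholders.
from PIL import Image, ImageDraw


def add_tape_strip(sign: Image.Image, box: tuple[int, int, int, int]) -> Image.Image:
    """Return a copy of the sign image with a black rectangle (the 'tape') drawn on it."""
    patched = sign.copy()
    ImageDraw.Draw(patched).rectangle(box, fill="black")
    return patched


def classify_sign(img: Image.Image) -> str:
    """Placeholder for a real traffic-sign recognition model."""
    raise NotImplementedError("plug in an actual sign classifier here")


if __name__ == "__main__":
    original = Image.open("speed_limit_35.jpg")  # hypothetical test photo of a 35 mph sign
    # Extend the middle stroke of the '3' so a weak classifier may read it as an '8'.
    patched = add_tape_strip(original, box=(140, 95, 195, 112))  # coordinates are illustrative
    patched.save("speed_limit_35_patched.jpg")
    # With a real model plugged in:
    # print(classify_sign(original), classify_sign(patched))
```
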

    However, in their Wednesday blog post, McAfee researchers Steve Povolny and Shivangee Trivedi noted that the newest Tesla models no longer include the Mobileye EyeQ3 camera. Conducting the same experiment on a 2020 Tesla vehicle equipped with a newer version of the Mobileye camera did not yield the same results, so it appears that only Teslas manufactured between 2014 and 2016 with the EyeQ3 camera exhibit the vulnerability. The researchers also noted that McAfee disclosed the findings to Tesla and Mobileye last year.

    Following the report’s release, Mobileye released a statement: "Traffic sign fonts are determined by regulators, and so advanced driver-assistance systems are primarily focused on other more challenging use cases, and this system in particular was designed to support human drivers — not autonomous driving. Autonomous vehicle technology will not rely on sensing alone, but will also be supported by various other technologies and data, such as crowdsourced mapping, to ensure the reliability of the information received from the camera sensor and offer more robust redundancies and safety." 

    Tesla, however, has yet to comment on the report’s findings.

    This is not the first time that the car company’s autonomous driving technology has been scrutinized. 

    In April 2019, researchers at Keen Labs in China, a well-known cybersecurity research group, released a report describing how they were able to trick a Tesla Model S into steering into oncoming traffic simply by placing stickers on the road, thereby interfering with the car’s Autopilot lane-recognition technology.

    The researchers also showed that Tesla’s automatic windshield wipers could be falsely triggered simply by placing a television set in front of the car. In its response at the time, Tesla noted that the automatic windshield wiper system can be switched off at any point to give the driver manual control of the wipers.

    Related:

    Tesla’s First European Plant in Trouble After German Court Orders to Stop Ground Works
    Elon Musk on What You Need to Get a Job at Tesla: ‘Don’t Care If You Graduated High School’
    Tesla Temporarily Closes Chinese Stores - Reports
    Tesla Recalls Thousands of Vehicles Over Power Steering Issue
    Tesla Remotely Disables Autopilot on Used Model S Resold to Another Customer - Reports