Video: Chinese experiment finds flaws with Tesla Autopilot
Tencent Keen Security Lab says the attack chain was fixed immediately after it reported its findings to Elon Musk-led Tesla
From India to America and Europe to the Middle East, petrolheads around the world debated the pros and cons of electric vehicles in 2018. As those debates – over benefits such as the cars’ environmental credentials – continue in the global car community, the launch of the Nissan Leaf and the more recent Aston Martin electric vehicle proves that industry leader Elon Musk’s Tesla is doing something right – even if Chinese technology experts have found flaws with its Autopilot system.
Tencent Keen Security Lab, a part of China’s Shenzhen-headquartered Tencent Holdings – which owns a 5% stake in Musk’s company – recently revealed the findings of its study into the advanced driver assistance systems (Adas) used by Tesla.
On its website, Tencent Keen Security Lab says its experiment demonstrates how the Autopilot system on a Tesla Model S can be remotely compromised, adding that the “attack chain [was] fixed immediately” after it reported its findings to Tesla.
“Tesla Autopilot can identify the wet weather through image recognition technology, and then turn on the wipers if necessary,” Tencent Keen Security Lab’s description of the experiment reads.
“Based on our research, with an adversarial example craftily generated in the physical world, the system will be interfered and return an ‘improper’ result, then turn on the wipers.”
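Keen Lab has not published the exact method behind its wiper attack, but the quote describes a classic adversarial example: a small, deliberately crafted input change that flips a model’s decision. As a purely illustrative sketch – using a toy linear model and the well-known fast gradient sign method (FGSM), with every name hypothetical and no connection to Tesla’s or Keen Lab’s actual systems – it might look like this:

```python
# Illustrative only: a toy linear "rain classifier" and the fast gradient
# sign method (FGSM), a standard way of crafting adversarial examples.
# Nothing here reflects Tesla's or Keen Lab's real models or methods.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: score = w . x + b; score > 0 means "rain detected".
w = rng.normal(size=16)
b = 0.0

def rain_score(x):
    return float(w @ x + b)

# A benign "dry" input that the model scores well below zero.
x = -0.5 * np.sign(w) + 0.01 * rng.normal(size=16)

# FGSM step: for a linear model the gradient of the score with respect
# to the input is simply w, so nudging each input element by eps in the
# direction of the gradient's sign pushes the score toward "rain".
eps = 1.0
x_adv = x + eps * np.sign(w)

print(rain_score(x), rain_score(x_adv))  # the adversarial score is far higher
```

The point of the sketch is that the perturbation is systematic, not random: each element moves in exactly the direction that most increases the model’s output, which is why small physical-world changes (such as an image shown on a screen) can trigger a confident but wrong response.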
In response to this apparent flaw in the visual recognition system, Tesla said the conditions of Keen Lab’s experiment – displaying an image on a TV in front of the car’s windshield – were “not a real-world situation that drivers would face, nor a safety or security issue”.
The Chinese research hub also studied how Autopilot could be misled through its lane recognition feature. It explains: “Tesla Autopilot recognises lanes and assists control by identifying road traffic markings.
“Based on the research, we proved that by placing interference stickers on the road, the Autopilot system will capture these information and make an abnormal judgement, which causes the vehicle to enter into the reverse lane.”
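Keen Lab has not detailed how the stickers bias the lane estimate, but the general idea – a few false marker detections dragging a fitted lane line sideways – can be shown with a toy model. This is a hypothetical, simplified illustration using a naive least-squares fit, not Keen Lab’s experiment or Tesla’s lane-detection pipeline:

```python
# Illustrative only: how a handful of adversarial marker points can drag
# a naive least-squares lane fit toward the opposite lane. A toy model,
# not Keen Lab's experiment or Tesla's actual lane-detection pipeline.
import numpy as np

# Genuine lane-marking detections: lateral positions x near the lane
# centre (0.0) at increasing distances y ahead of the car, in metres.
y = np.arange(10, dtype=float)
x_seen = np.zeros(10)

# Three "sticker" detections far to the left at the farthest distances.
x_seen[-3:] = -3.0

# Fit x = a*y + c by least squares, as a naive lane estimator might.
a, c = np.polyfit(y, x_seen, 1)

# The predicted lane position 15 m ahead veers well left of the true lane.
print(a * 15 + c)
```

Because a least-squares fit weights every detection equally, even three spurious points are enough to tilt the whole estimated lane line, which is the kind of “abnormal judgement” the researchers describe.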
Similar to its previous response, Tesla, commenting on the lane recognition section of the findings, said: “In this demonstration, the researchers adjusted the physical environment (for example, placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when Autopilot is in use.
“This is not a real-world concern, given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”
The final part of Tencent Keen Security Lab’s study is arguably among its most interesting findings, and Tesla’s response here does not lean on real-world-scenario arguments.
Explaining this section of the experiment, Tencent Keen Security Lab says: “After [compromising] the Autopilot system on the Tesla Model S (ver 2018.6.1), Keen Lab further proved that we can control the steering system through the Autopilot system with a wireless gamepad, even when the Autopilot system is not activated by the driver.”
In response, Tesla says: “The primary vulnerability addressed in this report was fixed by Tesla through a robust security update in 2017, followed by another comprehensive security update in 2018, both of which we released before this group reported this research to us. In the many years that we have had cars on the road, we have never seen a single customer ever affected by any of the research in this report.”