Ed Markey to Tesla: Turn Off Autopilot Before It Kills Someone
He cited a video of the infamous Mass Pike snoozer.
Some guy literally asleep at the wheel on the Mass Pike (great place for it).
Teslas are sick, I guess? pic.twitter.com/ARSpj1rbVn
— Dakota Randall (@DakRandall) September 8, 2019
The now-infamous video shared by NESN’s Dakota Randall this September appears to show a man at the wheel of a Tesla hurtling down the Mass Pike with his head slumped forward and eyes clearly not on the road, stoking outrage about the technology that allowed such a terrifying scenario to play out on a busy highway. And now U.S. Sen. Ed Markey says it’s time to do something about it.
First, some background: Teslas, as you may know, come equipped with an "Autopilot" feature, which takes over many of the driving responsibilities for the vehicle. It's designed to disengage if a driver fails to keep their hands on the wheel, but as many Tesla owners have figured out, the system can be fooled by wedging an object, like a water bottle, into the steering wheel just so. Whether the much-discussed Mass Pike snoozer had done such a thing remains a mystery (Tesla has raised suspicions that the video may be a hoax, although Randall disputes that claim).
Either way, Markey says the video, along with a subsequent NBC Boston report about a Massachusetts man who claims he accidentally drove 14 miles while asleep at the wheel, is evidence that Elon Musk needs to put the brakes on the feature, at least for now. This week, Markey sent a letter to Tesla calling on the company to disable the technology until it can solve the issue.
“Alarmingly, you can go to YouTube right now and learn about some of these tricks,” he said during a Senate Commerce Committee hearing this week. “Somebody’s gonna die because they can go to YouTube as a driver, find a way to do this, and then some innocent person on the street will wind up dead or a driver in another car will wind up dead,” he continued, adding, “We can’t entrust the lives of our drivers and everyone else on the road to a water bottle.”
If @Tesla‘s Autopilot can be tricked with an orange, then it’s a lemon on safety. We need to make sure that “driver assistance” systems can’t be hacked into unauthorized “driver replacement” technologies. I’ve called on Tesla and NHTSA to fix Autopilot’s safety flaws immediately. pic.twitter.com/hwdmSFMXyp
— Ed Markey (@SenMarkey) November 22, 2019
To make his case, Markey pointed to one video in particular, in which the host of a Tesla-focused YouTube channel demonstrates the water bottle technique for fooling Autopilot. The host offers some ominous warnings about trying the hack, namely that Autopilot does not seem to be able to intervene if a car traveling in front of it swerves too quickly to avoid a parked vehicle, which investigators believe is what happened when a Tesla slammed into a parked firetruck back in September.