The Future of Self-Driving Cars
What the government’s investigation of Tesla’s autopilot system means for autonomous vehicles
It seems like we’ve been on the verge of handing over the keys (and the gas pedal) for a while now. From joy rides and taxis to cargo, AI-led autonomous vehicles are the future. Right?
These driver-assistance systems will—in theory—make the roads a safer place and minimize the potential for driver error to cause harm. In practice, it isn’t that simple. Suddenly, it’s harder to make out the road ahead—pun mildly intended.
Car and Driver recently demonstrated that driver-assist systems can be easily fooled, and made it painfully clear that most come with significant limitations. And now, one of the highest-profile driver-assist systems is the subject of a federal investigation.
The US National Highway Traffic Safety Administration has opened an investigation into Tesla's Autopilot system. The probe follows 11 crashes involving parked first-responder vehicles since 2018, which resulted in 17 injuries and one death. According to Engadget, the agency will investigate how Tesla's Autopilot system monitors the road and, more importantly, how it ensures drivers stay engaged with what's going on.
Distracted driving can be deadly in any car. But safety experts say Autopilot may encourage distraction by lulling people into thinking that their cars are more capable than they are. And the New York Times, reporting on the system, found that it doesn’t include safeguards to make sure drivers are paying attention to the road and can retake control if something goes wrong—unlike the systems used by GM or Ford.
We’ll no doubt get there, but perhaps fully autonomous vehicles are further in the future than we’d originally thought. Waymo—formerly the Google self-driving car project—is 99% of the way there. But, according to Bloomberg, that last 1% has proven increasingly difficult and remains just out of reach.