
Self-driving cars need to pass a driving test before they're allowed on the road

Google's self-driving car prototype. Getty Images

Every year, the July 4 weekend is one of the deadliest on the road.


We won't know how many people died this past weekend until the National Highway Traffic Safety Administration (NHTSA) releases its 2016 stats next year, but to put it all in perspective, an estimated 378 people died over the July 4 weekend in 2015, according to National Safety Council data reported by Bloomberg.

No one is freaking out about that stat. It's normal. We expect hundreds of people to die on the road every July 4.

We expect people to get liquored up and get behind the wheel. We expect their brains to go into vacation mode and their attention to be on fireworks or the massive quantities of grilled meats they intend to eat. We expect kids screaming in the back seat to distract their parents and people to lose focus in their rush to get home in time for work at the end of the long weekend.

We expect a lot of death on the road.


Instead of an uproar over all that death, the world was focused on just one fatality: a crash several weeks ago involving a Tesla driving in Autopilot mode, the company's semi-autonomous driving system.

It's easy to understand why people panicked. The idea that a human wasn't in full control of his destiny while a computer had the wheel is unnerving to say the least. And it does raise an important question: While there's plenty of evidence to suggest semi-autonomous and fully autonomous vehicles could save a lot of lives, should those cars be required to pass a driver's test the same way humans do before getting behind the wheel?

Humans have to start with a learner's permit and drive under the guidance of another experienced licensed driver. In some states, they're not allowed to drive at night until they reach a certain age. They have to know the legal limit for blood alcohol content. They have to know how many feet before a stoplight they're allowed to change lanes. And a zillion other things. Autonomous vehicles don't have to do that yet.

Humans also have to pass a vision test. Meanwhile, there's evidence in the Autopilot death case that the Tesla belonging to the victim, Joshua Brown, could not see the truck coming at it from the side and failed to automatically apply the brakes.


A research paper published last year by Michael Sivak and Brandon Schoettle of the University of Michigan Transportation Research Institute tackled the issue of licensing self-driving cars. The paper concluded that self-driving cars would have to pass a lot of standards before being allowed on the road, including tests in harsh weather conditions and unforeseen situations that could put both the driver and a third party at risk. There's no gradual learning curve like with humans. It's all or nothing. And there's little evidence so far that even semi-autonomous vehicles are capable of making all the decisions a human driver can make in every situation.

We're a long way from that, yet we already have semi-autonomous vehicles on the road at a time when people have yet to fully grasp how they really work. Right now, there are a number of vehicles from Tesla, Mercedes, BMW, and Volvo that offer semi-autonomous driving features, but they all function differently.

For example, Tesla's Autopilot can steer and brake for you while driving at high speeds, but Volvo's semi-autonomous system will only steer and brake while driving up to 30 miles per hour. If you want to go any faster, you are required to steer while it does the accelerating and braking.

It's that kind of fragmented system that spurs people to do crazy things when they think they're in a self-driving car, despite clear warnings from the manufacturer that you should remain alert and in control at all times.


You've probably seen the videos of people sleeping behind the wheel or riding in the back seat as Autopilot guides them down the road. Standards need to be set to both educate the public on how self-driving systems work and to make sure self-driving vehicles meet a minimum set of requirements before they're allowed on the road. That's not happening today.

There's plenty of evidence to support what Tesla, Google, Volvo, and other tech and car companies have been saying for years: self-driving vehicles could drastically reduce deaths on the road. There were over 32,000 road deaths in the US last year, according to the NHTSA. So far, we only know of one death caused by an autonomous or semi-autonomous vehicle, and it happened after a total of 130 million Autopilot miles had been logged, according to Tesla. In other cars, there's one death approximately every 90 million miles, according to the NHTSA.

Those are definitely better odds. And they'll likely get even better once vehicles are fully autonomous. 

Tesla has repeatedly declined to comment on the accident beyond the clinical description of how it happened that the company published in a blog post. The NHTSA is now evaluating Tesla's Autopilot technology. Later this month, the NHTSA is expected to release guidelines carmakers will have to follow when building self-driving or semi-autonomous vehicles. And President Barack Obama has proposed $4 billion in spending over the next 10 years to research self-driving vehicles.


Those are all smart steps that should have happened years ago, when it became clear self-driving cars were coming whether the government attempted to regulate them or not. Brown's death is a sign that the technology may not be perfect, but it could be far safer than what we have today. Meanwhile, it's clear Google has been right all along: we should skip semi-autonomous systems and wait for full autonomy instead.

But until we start licensing or regulating self-driving vehicles under a clear set of standards the same way we regulate human drivers, there's no way to be sure.
