We have all sat and wondered what it would be like to have a chauffeur: how convenient it would be to have someone on hand to drive you wherever you want, whenever you want to go. Car manufacturers have taken this idea and run with it. Self-driving cars, almost as good as a live driver at your fingertips, are now a reality. Autonomous vehicles have been on the horizon for some time; we all knew they were coming, and the day is finally here. Early versions appeared in the 1980s, such as Carnegie Mellon University's Navlab and ALV projects in 1984, and the work of Mercedes-Benz and Bundeswehr University Munich under the EUREKA Prometheus Project beginning in 1987. The current forerunners in the industry are Google and Tesla, both of which have versions of the self-driving car already on the roads.
Being on the roads, however, doesn't mean people have fully accepted these cars or that they are completely safe. The U.S. National Highway Traffic Safety Administration (NHTSA) and the Society of Automotive Engineers (SAE) classify them in two different ways. The NHTSA uses a scale of 0 to 4 based on the car's driving capabilities: at level 0 the person drives the car completely, and at level 4 the car requires no human interaction at any time. The SAE scale ranges from 0 to 5 and is judged by how much attentiveness and intervention the human inside must provide: level 0 is a car with no automated control, and level 5 is a car that requires no human interaction beyond entering a destination and starting the car.
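For the technically inclined reader, the SAE scale is easy to picture as a small lookup table. The sketch below is a hypothetical Python rendering; the level names are paraphrased from common summaries of the SAE J3016 standard, and the driver_attention_required helper is an invented illustration of the attentiveness criterion, not part of the standard itself.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 automation levels (names paraphrased for illustration)."""
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # one automated function, e.g. adaptive cruise
    PARTIAL_AUTOMATION = 2      # steering and speed automated, human monitors
    CONDITIONAL_AUTOMATION = 3  # car monitors the road, human takes over on request
    HIGH_AUTOMATION = 4         # no human needed within a defined driving domain
    FULL_AUTOMATION = 5         # human only enters a destination and starts the car

def driver_attention_required(level: SAELevel) -> bool:
    """At levels 0-2 the human must continuously supervise the drive."""
    return level <= SAELevel.PARTIAL_AUTOMATION

if __name__ == "__main__":
    for level in SAELevel:
        print(f"Level {level.value} ({level.name}): "
              f"supervision required = {driver_attention_required(level)}")
```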
There is no doubt about it: these cars are on their way to becoming as much a part of our lives as our cell phones and tablets. But how safe are they, and who is to blame if they are involved in an accident? These cars are built to remove the human errors that cause fatal accidents each year on our busy roads. The trouble is, computers make mistakes too. Before these cars become part of everyday life, carmakers and lawmakers are trying to decide the best legal course of action when one of them causes an accident. While it's obvious that carmakers will take the brunt of legal action in most, if not all, of these cases, lawmakers will also have to reinterpret the laws pertaining to liability in car accidents.
Carmakers point to the extreme sensitivity of their self-driving cars and how advantageous that is to everyone on the road. The array of sensors, lasers, cameras, radar, and ultrasound devices makes it possible for these cars to distinguish between humans and inanimate objects like trash cans. They can tell if an obstacle is in the road and drive around it, or stop for pedestrians to cross. They sense when the car in front of them brakes or slows. They know when it's safe to merge or change lanes. Even with all this technology, things can still go wrong. The key will be determining whether an accident was caused by a manufacturer defect, such as a faulty airbag or a bad brake system. As with many other kinds of car accidents, there could be one or many liable parties, depending on the circumstances of the accident, the state in which it occurred, and other factors.
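No manufacturer publishes its actual decision logic, but the kind of reasoning described above can be pictured as a toy rule set built on fused sensor readings. The Python sketch below is purely illustrative: the Detection class, the thresholds, and the plan_action rule are invented assumptions, not any carmaker's algorithm.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One fused camera/radar/lidar reading -- all fields hypothetical."""
    distance_m: float         # range to the detected object, in meters
    closing_speed_mps: float  # positive if the car is approaching the object
    is_pedestrian: bool       # camera classifier output

SAFE_GAP_M = 5.0  # illustrative minimum following gap

def plan_action(obj: Detection) -> str:
    """Toy decision rule: always yield to pedestrians; otherwise brake
    when the time to collision or the remaining gap gets too small."""
    if obj.is_pedestrian and obj.distance_m < 30.0:
        return "stop"
    time_to_collision = (obj.distance_m / obj.closing_speed_mps
                         if obj.closing_speed_mps > 0 else float("inf"))
    if time_to_collision < 2.0 or obj.distance_m < SAFE_GAP_M:
        return "brake"
    return "maintain_speed"

# A car 12 m ahead, closing at 8 m/s, triggers braking (1.5 s to collision).
print(plan_action(Detection(distance_m=12.0, closing_speed_mps=8.0,
                            is_pedestrian=False)))
```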
Self-driving car makers such as Google and Volvo have said they are already willing to take legal responsibility for accidents. Saying they will take full responsibility for self-driving car accidents isn't much of a revelation; it's fairly obvious. On whom else could the blame be placed? The statement serves more as a security blanket for the public than as a meaningful legal commitment.
Many potential self-driving car owners are looking at the future of auto insurance with a hopeful eye, the hope being that some fees and responsibilities can be lifted from owners if the cars drive themselves. Not likely, says Bryant Walker Smith, assistant professor at the University of South Carolina School of Law. According to Smith, because non-collision insurance is still relevant, people will still need to carry individual policies on their cars; things like hail damage and graffiti will always be a problem, and insurance for them will always be necessary. Collision insurance may change eventually, but so far none of the autonomous car companies have made any strides to challenge current insurance industry standards. The reason may be that even though car companies will be assuming responsibility for more crashes, there will be fewer crashes overall. Smith's analogy: a bigger piece of a smaller pie.
While it is fairly clear that self-driving car makers will be footing the bill for their cars' accidents, what isn't so clear is how lawmakers will construe current laws to determine guilt and culpability, especially since the majority of transportation laws were written on the assumption that a licensed human driver would be operating the vehicle. According to the National Association of Insurance Commissioners (NAIC), some states, such as Michigan, are considering bills that would require self-driving cars to pass a licensing test before traveling on the road. Some aspects will require completely new laws, such as how to confirm a vehicle's safety before it begins to drive and how to check for proper maintenance.
The death of self-driving Tesla owner Joshua D. Brown in a May 7, 2016 crash in Williston, Florida was a devastating blow to the industry. According to Tesla, the Model S did not distinguish the bright white side of a tractor trailer from the clear, brilliant sky behind it. The car did not engage its brakes and plowed into the truck and trailer, killing Brown; he did not engage the brakes either. The accident seems to prove that the technology is not fully ready. At the very least it proves, as Tesla points out to its buyers, that drivers of all cars must stay alert, regardless of the car's autonomous capabilities. We must never assume the car is smarter than we are. We must stay vigilant and ready to take control of any situation.
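To see why a white trailer against a bright sky is such a hard case, consider a deliberately oversimplified sketch. The brightness values and the looks_like_sky rule below are invented for illustration; Tesla's actual perception system is far more sophisticated than a single threshold, but the failure mode is the same in spirit: two very different things can produce nearly identical sensor readings.

```python
# Toy illustration (values invented): a detector that separates obstacles
# from sky by pixel brightness fails when both are nearly white.
SKY_BRIGHTNESS = 0.97      # hypothetical normalized intensity of a bright sky
TRAILER_BRIGHTNESS = 0.95  # hypothetical intensity of a white trailer side

def looks_like_sky(pixel: float, threshold: float = 0.9) -> bool:
    """Naive rule: anything brighter than the threshold is 'sky'."""
    return pixel > threshold

# Both readings clear the threshold, so the trailer is misread as open sky
# and no braking obstacle is registered.
for name, value in [("sky", SKY_BRIGHTNESS), ("trailer", TRAILER_BRIGHTNESS)]:
    print(f"{name}: classified as sky = {looks_like_sky(value)}")
```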