Multiple companies are now developing the next step in vehicular technology: driverless cars. Once fully developed, these vehicles should function in every situation without a human driver. To achieve this goal, a driverless car must be aware of its environment so that it can navigate safely.
Self-driving cars use multiple sensors, such as cameras and lasers, to perceive the world around them. A built-in artificial intelligence then uses the data from these sensors to determine the appropriate speed, whether to apply the brakes, and when to merge into traffic.
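At its simplest, that sense-then-decide pipeline can be pictured as a loop: read sensor data, evaluate it, choose an action. The sketch below is a drastically simplified, hypothetical illustration of that control flow; the thresholds, function names, and actions are invented for clarity and do not come from any real vehicle system.

```python
# Hypothetical, highly simplified sketch of a sense-decide step.
# Real autonomous-driving stacks fuse data from many sensors and run
# far more sophisticated models; this only illustrates the idea.

def decide(obstacle_distance_m: float, current_speed_mph: float) -> str:
    """Choose a driving action from (hypothetical) fused sensor input."""
    if obstacle_distance_m < 10:
        return "brake"           # obstacle too close: apply the brakes
    if obstacle_distance_m < 30:
        return "slow_down"       # reduce speed as a precaution
    if current_speed_mph < 40:
        return "accelerate"      # road is clear: resume cruising speed
    return "maintain_speed"      # road is clear and at speed already

print(decide(5.0, 40.0))    # brake
print(decide(50.0, 25.0))   # accelerate
```

Even this toy version hints at why accident investigations are hard: the "right" action depends entirely on what the sensors reported and how the software weighed it.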
Although driverless cars have advanced radically in the last few years, the technology is still in its early stages. In the course of companies' testing of their self-driving vehicles, a number of accidents have occurred, some of which were fatal.
Automated Accidents and Driverless Death
The quest for fully autonomous automobiles has caused a great deal of damage since development began. As of August 2019, there have been 186 collisions involving driverless cars in California alone.
Last year, an automated Uber car struck and killed a woman crossing the street in Tempe, Arizona. She was the first pedestrian killed by an automated vehicle. The shocking accident occurred on the evening of March 18, 2018. According to police and the car's own footage, the driverless Uber SUV hit the woman at 40 miles an hour without even slowing down.
Although the vehicle had a human backup operator in it at the time of the accident, the accident happened too quickly for her to intervene. Footage of the car’s interior also showed that the backup operator was watching a program on her phone. As of August 2019, the case has yet to be fully resolved.
Ride-sharing companies are no strangers to lawsuits; there are even lawyers who specialize in Uber accidents. But the introduction of driverless cars to the road raises a serious question: if an autonomous vehicle gets into an accident, who is liable for the damages?
Law enforcement agencies and legal representatives can already have a hard time determining liability in a conventional car accident. Driverless cars complicate the issue even further. One thing that might help is to clarify that many cars advertised as "self-driving" aren't actually capable of full automation.
Most such cars still need the input of a human driver, and many warn their users when the computer thinks the human driver should retake control of the vehicle. When an automated vehicle crashes because its human occupant chose to ignore its advice to switch back to manual operation, determining liability could be clear-cut.
There have also been instances when the automated emergency braking systems or sensor arrays of driverless vehicles have failed to activate. This was certainly the case in the accident in Tempe. Although it might seem that the developers of the technology would be at fault in such a scenario, Uber wasn't charged for the fatality.
New technology inevitably brings new legal challenges. Driverless technology is still in its infancy, and regulations are still being discussed; the AV Start Act, for example, failed to pass. Perhaps the law will evolve right alongside these vehicles and protect passengers and pedestrians alike. Until then, authorities and attorneys will have to work doubly hard to bring justice to the families of driverless technology's victims.