Accidents of Self-Driving Cars


Image by Mikael Seegen on Unsplash

While driving may appear to some as just another tedious duty, many others find it a pleasurable pastime. Over the years, several automakers have attempted to build technology that removes the need for a human driver, and this has, of course, piqued the interest of those who follow the trend. The tragedy, however, is that these cars have been involved in multiple accidents, prompting concerns about when they should be released to the general public.

Who are those in support of the movement for self-driving vehicles?

Google has been testing self-driving cars since 2009. Uber commissioned its own self-driving car project in California in 2016, which drew the attention of California's law enforcement and led Uber to put the initiative on hold there. The hold applied only in California, however; the program remained active in Pennsylvania, Arizona, and Canada.

The first fatal accident involving a self-driving car occurred in March 2018, when an Uber self-driving vehicle struck a woman. A safety driver was inside the vehicle at the time but was not controlling the car, which was in autonomous mode. The crash prompted Uber to suspend its self-driving car project.

But Uber's suspension of its project does not mean the end of self-driving cars. Many top auto companies, including Volvo, Toyota, Tesla, GM, and Ford, are building their own versions. Waymo, for instance, operates self-driving cars in Arizona with no human driver behind the wheel.

Many of these self-driving vehicle initiatives have achieved enormous success, far exceeding Uber's. From 2019 through the first nine months of 2020, Waymo's cars drove more than 6 million autonomous miles, including 65,000 with no person behind the wheel. According to the New York Times, in 2017 Waymo cars traveled over 5,600 miles before a human driver had to take control to avert a collision. Waymo released these statistics when it first launched Waymo One. The partnership's most recent development is an integration with UPS involving Class 8 trucks.

In total, Waymo's cars were involved in 18 collisions with pedestrians, cyclists, drivers, and other objects. They also logged 29 disengagements, junctures at which human drivers were compelled to take control and which very likely would otherwise have ended in an accident.

Who is regulating self-driving cars?

If self-driving cars are unpredictable, regulating their behavior is an even greater mystery. For a long time, the US government has trusted self-driving car manufacturers to produce their own assessments of how safe their vehicles are. Following the March 2018 crash, numerous jurisdictions in the United States began treating self-driving cars more seriously. California has a statute that allows self-driving cars on the road as long as their manufacturers meet notification and safety regulations.

For a rideshare service like Lyft or Uber, or for a vehicle builder, the first mention of "self-driving cars" conjures up dollar signs. It is no secret what most rideshare companies stand to gain by building these programs: a taxi service with no drivers on the payroll is practically the realization of their dreams. There are also interesting arguments for self-driving cars on the merits. Many auto experts state that human error is the primary cause of recorded road accidents, and they claim that letting computers take control could save more lives.

Many individuals feel that auto companies pushing to introduce self-driving cars are concerned with their profits rather than the public's safety. Another school of thought simply does not believe self-driving automobiles are the best option for people. Because of the high degree of technology involved, self-driving cars are exceedingly complicated: the amount of software a self-driving vehicle's processors must execute is greater than that of Facebook, the F-35 combat plane, and the Chevy Volt combined. Isn't it incredible?

Self-driving vehicles, without a doubt, require a mix of controlled test environments and real-world situations to fully determine their degree of precision.

But it is also worth noting that humans actually caused many of the crashes for which self-driving cars have been blamed. That is what a recent report by Axios concluded: after reviewing the filings at the California Department of Motor Vehicles, Axios discovered that human beings were responsible for at least 81 of the 88 accidents involving autonomous vehicles.

The Axios report grouped autonomous vehicle accidents by whether the car was stopped or in motion when the accident occurred and by driving mode. Of the 62 accidents that occurred while the car was in autonomous mode, the self-driving car was rightfully implicated in only one.

Tesla and Waymo may be at the forefront of self-driving car development, but they are far from the only players. Fifty-five automakers already hold licenses to test self-driving vehicles in California, and many are seeking the best ways to conduct these tests efficiently with little or no human interference.

The 2018 Uber self-driving car crash

In March 2018, a woman in her forties was killed in Arizona by an Uber self-driving car, the first time a self-driving automobile was blamed for a fatal collision. According to Tempe police in Arizona, the car was in autonomous mode when it struck the victim as she crossed the street, inflicting serious injuries. She was taken to a hospital, where she died. Uber responded on Twitter, saying it was working with local investigators to figure out what happened at the accident scene, though an Uber official declined to comment further on the situation.

The victim was confirmed as Elaine Herzberg, a 49-year-old woman who was struck outside a crosswalk on a Sunday night. The car, a 2017 Volvo SUV traveling at 40 miles per hour, failed to identify her, resulting in a catastrophic collision that shocked the whole world. This stands in stark contrast to what should be expected, given that most self-driving cars are equipped with sensors and detection systems meant to recognize people, automobiles, bikes, and other objects.

Following the crash, proponents of self-driving vehicles faced a barrage of criticism, with many advocating more rigorous regulations to guarantee that such an incident never happens again. According to John M. Simpson, the project director for privacy and technology at Consumer Watchdog, the accident highlights the need for more stringent laws around the operation of autonomous cars. In its wake, his advocacy group emphasized the necessity of a nationwide halt to self-driving car testing.

The 2016 and 2021 Tesla self-driving car crashes

Tesla was the first company to reveal news of a fatal crash involving an autonomous vehicle. The accident occurred in 2016, when the sensors in the autonomous car failed to detect a heavy-duty trailer. The vehicle drove beneath the trailer, killing the middle-aged man inside the car.


California lawmakers recently approved the testing of autonomous vehicles with no human intervention on public roads. The Uber crash, however, suggested that the technology may need more time before it can be fully legalized on our roads.

Governor Doug Ducey of Arizona, a strong advocate of allowing firms to test self-driving vehicle technology in his state, is well known for attacking other states' officials over their strict regulations on self-driving car testing. In 2016, he invited Uber to move its self-driving car tests from California to Arizona, and he has also issued new rules governing self-driving automobiles. He frequently notes that over 600 self-driving cars have driven on Arizona's public roadways.


Despite the growing popularity of and media craze over self-driving cars, it may, unfortunately for the proponents of this emerging technology, take a while before governments around the world agree to allow them on their roads.

Many developers and programmers are working on the tech glitches and kinks that could cause adverse effects. But even companies that promise a clean and efficient transport system once self-driving cars are legalized can do little if the public holds a negative attitude toward them. To win public confidence, the companies involved in self-driving car projects should do their homework well and develop more efficient, more intelligent systems that experience fewer fatal crashes.