Self-Driving Cars: Rewards and Risks

Several companies around the world are racing to be the first to make self-driving cars a common sight on the roads. Many of them are underestimating how long it will take for their vehicles to become fully autonomous. Tesla, for example, has underestimated multiple times how long it would take to create a fully autonomous car (source). Nonetheless, self-driving vehicles have already made several contributions to the world, even though the roads aren't yet filled with them. For example, some self-driving vehicles have made a large impact during the current COVID-19 pandemic: they can deliver supplies to hospitals without the risk of human contact (source).
"Beep" self-driving car used for transporting supplies to hospitals across the United States. (source)
The self-driving vehicle, named “Beep” (developed by NAVYA), traveled about “150 miles from the company’s headquarters in Orlando” to deliver supplies to hospitals, as well as tests from drive-through COVID-19 testing sites (source). The vehicle traveled a route that was “isolated from pedestrians, traffic and staff” (source). This brings up the biggest issue currently associated with self-driving vehicles: safety. When it comes to driving automation, there are six levels, 0 through 5, that describe how much automation exists in a vehicle.
The 6 levels of driving automation. (source)
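For reference, here is that taxonomy as a minimal Python mapping. The level names follow the widely cited SAE J3016 standard; this is an illustrative sketch, not any vendor's API:

```python
# The six SAE J3016 levels of driving automation, as a simple lookup table.
SAE_LEVELS = {
    0: "No Automation - the human driver does everything",
    1: "Driver Assistance - e.g., adaptive cruise control OR lane keeping",
    2: "Partial Automation - steering and speed together, driver supervises",
    3: "Conditional Automation - system drives, driver must take over on request",
    4: "High Automation - no driver needed within a defined operating domain",
    5: "Full Automation - no driver needed under any conditions",
}

print(SAE_LEVELS[5])  # Full Automation - no driver needed under any conditions
```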
Currently, there are no level 5 cars on the road. However, Elon Musk, the CEO of Tesla, predicted a year ago that Tesla would have developed a level 5 car by now. Tesla and other companies like Uber currently operate vehicles with high levels of automation, but these vehicles have proven to be far from perfect. There have been several instances of traffic deaths associated with these “high automation” vehicles due to bugs in the software (source). These cars use machine learning to make decisions about steering, braking, and parking: whenever the car is in motion, it takes in sensory data about its surroundings and feeds that data into its decision making.
All the various ways in which self-driving cars gather data from the world around them. (source)
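To make that sense-decide-act loop concrete, here is a heavily simplified, hypothetical sketch in Python. Every name in it (SensorReading, Controls, plan_controls) is invented for illustration; a real autonomous-driving stack runs learned perception and planning models far more complex than this toy policy:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """One fused snapshot of the car's surroundings (all fields hypothetical)."""
    lidar_distances: list  # distances to detected obstacles, in meters
    speed_mps: float       # current vehicle speed, meters per second

@dataclass
class Controls:
    steering_angle: float  # radians; 0.0 means straight ahead
    throttle: float        # 0.0 (none) to 1.0 (full)
    brake: float           # 0.0 (none) to 1.0 (full)

def plan_controls(reading: SensorReading) -> Controls:
    """Toy decision policy: brake hard if anything is close, else cruise.
    A real system would feed the sensor data through trained models here."""
    nearest = min(reading.lidar_distances, default=float("inf"))
    if nearest < 5.0:             # obstacle within 5 m: emergency brake
        return Controls(0.0, throttle=0.0, brake=1.0)
    if reading.speed_mps < 13.0:  # below ~47 km/h: gently accelerate
        return Controls(0.0, throttle=0.3, brake=0.0)
    return Controls(0.0, throttle=0.0, brake=0.0)  # at speed: coast

# One iteration of the control cycle (actuation omitted).
reading = SensorReading(lidar_distances=[42.0, 7.5], speed_mps=11.0)
print(plan_controls(reading))  # -> throttle 0.3, no brake
```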
Despite the ever-increasing technology involved in self-driving vehicles, the question of safety will always be on people's minds. On one hand, as machine learning capabilities and powerful radar systems continue to develop, the likelihood of accidents should keep decreasing. On the other hand, AI-driven cars may never be fully capable of reading situations the way humans can: the facial expressions of other drivers, for example, or the cues of pedestrians. Yet a study from University College London found that 90% of car crashes are due to human error (source). If such a large share of crashes comes down to human error, then the human way of perceiving the road is less of an advantage than it seems. I think that even though there have been crashes and other issues associated with self-driving cars, the rate at which self-driving cars have accidents will still be much lower than the rate at which humans crash; human error in a subjective task like driving is likely greater than a machine's. This raises my final question: are self-driving cars worth the risk?
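As a rough back-of-envelope illustration of that human-error argument (all numbers below are hypothetical and chosen only to show the shape of the calculation; only the 90% figure comes from the cited study):

```python
# If ~90% of crashes stem from human error, automation that eliminates even
# half of those errors cuts the overall crash rate dramatically.
baseline_crashes = 100.0   # assumed crashes per some fixed mileage (illustrative)
human_error_share = 0.90   # share of crashes due to human error (cited study)
errors_eliminated = 0.50   # assume automation removes half of human-error crashes

remaining = baseline_crashes * (1 - human_error_share * errors_eliminated)
print(remaining)  # 55.0 -> a 45% reduction under these assumptions
```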


Comments

  1. I think that as far as general driving ability goes, self-driving cars would be way safer than human drivers, but my concern is the cars getting hacked and accidents being caused that way. I don't know how possible or likely that is, but data breaches seem to be so common these days that I would feel uncomfortable.

    Replies
    1. At the moment, most self-driving car accidents are from human error, right? Well, if we do switch to all self-driving cars, I feel like people will have put as many safeguards in place as possible to prevent cars from getting hacked. Data breaches have been an issue, but with cybersecurity becoming a bigger priority, I'm sure data breaches will go down by the time self-driving cars are the norm.

  2. I wonder if "Beep" had an easier time on its route due to the fact that traffic has plummeted since the quarantine began. Can we expect similar results when "Beep" has to navigate through areas with high traffic density?

    Replies
    1. Great question, Jason. The article about "Beep" mentioned that it traveled routes free from pedestrians, traffic, and other obstacles, so I'm assuming that it is not programmed to be able to avoid those types of obstacles. It seems like there's a long way to go, in general, for autonomous vehicles to be able to safely navigate the roads completely on their own.

    2. That is a great question. There is no "like" feature for comments; otherwise, I would like this comment, Jason.

  3. This comment has been removed by the author.

  4. After reading this article, I can see how autonomous vehicles can be appealing. That being said, I don't believe that anyone would be fully comfortable with self-driving cars if there were any chance of crashing, even though we all take huge risks getting behind the wheel ourselves. There's something to be said for being in control of a situation and I think if there were any accidents with self-driving cars, and no one to blame but a bug, that would cause confusion and upset a lot of people, making it difficult to build back trust and redeem these vehicles.

  5. I agree with Olivia; adding on, it reminded me of the Boeing 737 MAX situation. As we saw, all models of that plane were grounded all over the world. We don't stop flying when planes crash due to human error, but we stopped when it was the machine's fault. Though it rarely happens, this shows that we trust humans more than AI that could have bugs.

  6. Reading this article gave me such a broad and different view of autonomous vehicles and their benefits. However, I think it would be really hard for these self-driving cars to gain human trust, especially the level 5, full-automation ones, if they ever get on the market. Aside from their own bugs, they run a great risk of being hacked and taken full control of, which poses a big security threat to drivers, passengers, and even pedestrians.

  7. I think when it comes to autonomous vehicles, the level of driving automation has to be level 0, 1, or 5 and not any of the others. The problem with semi-automated cars is that when it comes to shifting to manual control, humans have to be ready to take over. If we are traveling using automated driving, it is very likely that we would not be paying attention. If the automated car were to switch to manual when a human is not ready, it could result in an accident.

  8. If an autonomous vehicle causes an accident, who should be responsible for it? As you mentioned in the post, 'there have been several instances of traffic deaths associated with these “high automation” vehicles due to bugs in the software'; how did they lawfully handle these cases? Do we have any laws on autonomous vehicles yet?

    Replies
    1. The source from University College London talks about the implementation of automated cars into society and about the laws associated with that. The article pushes for strict laws due to the possibility of fatalities associated with bugs in software. Companies like Tesla and Uber have dealt with lawsuits due to errors in their autonomous vehicles, so I'm sure there are laws in the works that are informed by these lawsuits and that will hopefully prevent future accidents like those from happening.

  9. The issue with autonomous vehicle accidents is one of liability: who's at fault? Is it the software; is it the car manufacturer; is it an injured pedestrian; is it no one? Self-driving cars could perform better and more safely than regular cars. But in the cases where they are involved in accidents and are found at fault, there needs to be some standard for prosecution and settlements.

    Replies
    1. I agree; I think there is a major grey area there, and creating laws and regulations for self-driving cars would help with that. If a law or regulation is created that places blame on the autonomous vehicle's manufacturer no matter the circumstances, those manufacturers will work a lot harder to ensure safety and raise their standards in order to avoid any backlash from their cars' crashes and bugs.

  10. I agree with Ehren's comment: at what point does automated driving become dangerous because we are too reliant on it? We have already seen examples of drivers asleep at the wheel of their Teslas because they feel safe enough, or lazy enough, to do so in a dangerous situation like driving. Can we find a good balance between automated driving and human reliance to make sure drivers are prepared for situations where they are needed?

    Replies
    1. You and Ehren bring up good points. I think if a grey area exists about a car's level of automation, then there will be more risk for error. In order to fully integrate autonomous cars into our lives, they need to be level 5, because everything in between mixes human error with technology error, leading to more accidents.

  11. Do you think that all autonomous cars will have the same design in order to keep their sensors working at the highest efficiency? If so, would the AI systems have to be re-trained to adapt to new models of car?

    Replies
    1. I think that would be a good measure to have, considering that more cars with the same sensors would mean more data for the cars to learn from. If all the data is consistent, then the cars will get better. I would think that each company should keep its models the same, though not across companies.

  12. This post reminds me of Professor Tauheed's presentation, and it's just fascinating to see how far AI has come. The question of safety is the biggest one here, and I hope we get to a point where we are all confident and feel safe getting into one of these cars, because as of now, there is no way I am getting into a self-driving car.

  13. I am still on the humans' side in this case. I don't think self-driving cars are trustworthy enough or worth the risks. There is a moral issue related to self-driving cars that concerns me: in a situation where an injury or death is unavoidable, what will the AI choose? Of course, I don't want to be the one chosen to be sacrificed by an AI.

  14. I believe that in order for this to work, we cannot have a large 50/50 split of autonomous and human-controlled cars. We'd need a network where all cars are connected. Perhaps we don't need to convert to all autonomous cars; instead, we could make it mandatory that all cars have a chip implanted for Bluetooth (with range for when cars are close together locally, say in city driving) and GPS. This does, however, bring up a lot of privacy concerns.


