June 3: Self-Driving Cars
The dilemma posed by self-driving cars has been widely debated. This video explains that the ethical problem comes down to programming the car, ahead of time, to make a tough choice in an unavoidable accident, which raises many questions about how to set its moral guidelines.
Link: TED-Ed Talk
The video presented an important question: if you were the programmer, would you have the car swerve left into the SUV, continue straight into the obstacle, or swerve right into the motorcycle? I would choose to swerve left into the SUV (a rough sketch of how that choice might be coded follows my reasons below).
Reasons:
- The safest option for the driver
- Minimizes injuries for passengers inside the self-driving car
- Limits the damage done to the outside environment
- Running into the motorcyclist would almost guarantee death or serious injuries for them
- Braking and hitting the obstacle in front of the truck would still cause many injuries
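To make this concrete, here is a minimal sketch of how a programmer might encode this kind of harm-minimizing choice. Everything in it is a hypothetical assumption made for illustration: the option names, the harm scores, and the weights do not come from the video or from any real autonomous-driving system.

```python
# Illustrative sketch only; all names and values below are made-up assumptions.
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    passenger_harm: float  # estimated harm to the self-driving car's passengers (0 to 1)
    outsider_harm: float   # estimated harm to people outside the car (0 to 1)

def choose_maneuver(options, passenger_weight=1.0, outsider_weight=1.0):
    """Pick the option with the lowest weighted total of estimated harm."""
    def total_harm(opt):
        return passenger_weight * opt.passenger_harm + outsider_weight * opt.outsider_harm
    return min(options, key=total_harm)

# Hypothetical harm scores for the three maneuvers in the video's scenario.
options = [
    Option("swerve left into the SUV",         passenger_harm=0.3, outsider_harm=0.3),
    Option("brake straight into the obstacle", passenger_harm=0.7, outsider_harm=0.1),
    Option("swerve right into the motorcycle", passenger_harm=0.2, outsider_harm=0.9),
]

print(choose_maneuver(options).name)  # with these made-up scores: swerve left into the SUV
```

What makes this an ethical question rather than just a coding question is the choice of weights: if harm to people outside the car is weighted more heavily than harm to the passengers, the same code can flip to a different maneuver, so the programmer's values are baked into the decision before the accident ever happens.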
Risks of self-driving cars:
- Vehicle crashes: automated systems can malfunction
- They can fail to recognize road signs
- Automated systems struggle to anticipate the actions of pedestrians or animals, increasing the risk of collisions.
- Self-driving cars can pose fire hazards
- Hacking: hackers could override the car's systems
- Health risks may arise from radiation given off by the systems in self-driving cars
Further Thoughts:
I think the most significant issue is deciding whose lives matter more: the people inside the self-driving car or the people in the surrounding vehicles. You also have to consider how to determine who is responsible for accidents involving autonomous vehicles: the manufacturer, the software developer, the user, or the other drivers on the road.
Here are more things to consider:
- There could be kids in the SUV who could face serious injuries
- Ensuring that the AI systems driving these cars do not exhibit biases based on the data they were trained on, which could lead to unfair or unsafe outcomes for certain populations
- How could they be programmed to make decisions about unpredictable variables that arise while driving?