Self-Driving Cars: The Moral Dilemma


We’re slamming our brakes on the thought of self-driving cars (for now), and we’re not alone!

While the folks in Silicon Valley are joyfully cruising down the road doing anything but driving, we have a few hesitations about self-driving cars, and about ‘driving without driving’ our cars.

While there are certainly times when it would be nice if your car could take you places (like when you’ve had a pint, when you’re going to the airport at 3 in the morning, and maybe even on the last part of your commute home from work), there are still a lot of uncertainties and unknowns to consider.

On one hand, the joy of stepping on the accelerator and feeling yourself pushed back into your seat would disappear, as would the satisfaction of getting the shift from second to third gear just right.

On the other, it would eliminate human error, the unfortunate cause of most accidents, and that seems like a good idea.

But before we can properly introduce self-driving cars onto our roads, there is a plethora of legislative and moral challenges that we must consider.


Self-driving cars - the moral dilemma

Consider this scenario: a self-driving car is driving down the road when an unpredictable and dangerous situation arises, and the car has three options:


A: Hit a little girl who has stepped out onto the road

B: Swerve off the road and risk the driver’s life

C: Collide with an older couple driving the other way


Now how do you program a computer to make the right decision, when we don’t know what the right decision is?

Will software developers in California write an algorithm that takes risk, the value of lives and perhaps life expectancy into consideration? And how would we put a value on a life?

Life value = (estimated life expectancy × amount donated to charity × Girl Scout cookies bought × number of times one has given up a seat for the elderly) ÷ personal ranking on Santa’s list

Mercedes has actually taken the matter into its own hands and published a statement saying its cars will be programmed to sacrifice pedestrians if that will save the life of the driver.

So, in the above scenario, the car would hit the little girl to save the driver’s life.
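To make that rule concrete, here is a purely illustrative sketch (not any manufacturer’s actual system) of an occupant-first policy: given several manoeuvres, the car picks the one with the lowest estimated risk to its occupant. Every name and every risk number below is invented for the sake of the example.

```python
from dataclasses import dataclass


@dataclass
class Manoeuvre:
    label: str
    occupant_risk: float   # estimated chance of harming the car's occupant
    bystander_risk: float  # estimated chance of harming people outside the car


def occupant_first_choice(options):
    """Pick the manoeuvre that minimises risk to the occupant,
    breaking ties by lower risk to bystanders."""
    return min(options, key=lambda m: (m.occupant_risk, m.bystander_risk))


# The three options from the scenario above, with made-up numbers:
options = [
    Manoeuvre("A: hit the girl", occupant_risk=0.05, bystander_risk=0.95),
    Manoeuvre("B: swerve off the road", occupant_risk=0.60, bystander_risk=0.05),
    Manoeuvre("C: hit the oncoming couple", occupant_risk=0.40, bystander_risk=0.90),
]

print(occupant_first_choice(options).label)  # with these numbers: option A
```

The uncomfortable part is not the code, which is trivial, but the risk estimates fed into it: whoever chooses those numbers is effectively writing the moral policy.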

Dilemmas like these can only be resolved if we give car manufacturers the right to create a life-and-death navigation system.

Now, this is only the first part of the problem. Regardless of what manoeuvre the car chooses, someone needs to be held responsible and accountable for the accident and the damages caused by it.

Who is legally responsible? Tesla? Google? The car occupant?

Our infrastructure is built by and for people, and while the technology might be ready in the next decade, society and the infrastructure might not be.

There are problems that must be solved, a moral compass that must be set, and safety as well as legislative procedures that must be laid out before the magical cars can take to the road.

What do you think of driverless cars? Let us know on Facebook or Twitter.


If this blog has inspired you to learn more about self-driving cars, here are a few interesting reads:


5 Things That Give Self-Driving Cars Headaches

Could Self-Driving Cars Speed Hurricane Evacuations?

How The Tech In Your Car Is Making You A Bad Driver

Top 20 Pros And Cons Associated With Self-Driving Cars

Forget Self-Driving Cars. Let’s Make Self-Driving Living Rooms 

So Nobody Knows How Much Self-Driving Cars Will Pollute