Should Your Driverless Car Kill You to Save Two Other People?



There's a train speeding down the tracks towards five innocent people who will never get away in time. You can save them by pulling a switch, but it'll kill another person on a different track. It's a thought experiment people have debated for ages, but it's about to become a real dilemma when we program robots to pull the switch, or not.

 

Popular Science explains how the classic hypothetical question becomes real:

A front tire blows, and your autonomous SUV swerves. But rather than veering left, into the opposing lane of traffic, the robotic vehicle steers right. Brakes engage, the system tries to correct itself, but there's too much momentum. Like a cornball stunt in a bad action movie, you are over the cliff, in free fall.
Your robot, the one you paid good money for, has chosen to kill you.

 

Maybe the robot itself didn't decide to kill you. Maybe its programmer did. Or the executive who decided on that company policy. Or the legislators who wrote the answer to that question into law. But someone, somewhere authorized a robot to act.

 

That's not the only possible situation either. As Patrick Lin asks in Wired, when faced with a choice of hitting one of two cars or people, what criteria should a driverless car use to pick its target? The future holds a whole bunch of complicated robo-ethics questions we're going to have to hammer out eventually, but in the meantime let's start with one:

 

Should a driverless car be authorized to kill you?

 

Source: Gizmodo


"The needs of the many, outweigh the needs of the few, or the one."

 

And yes, if I were in that position (the one), I would pick the others over myself.

 

But I'd have a problem if it became a "chance" for one vs. a guarantee for the other...


"The needs of the many, outweigh the needs of the few, or the one."

 

And yes, if I were in that position (the one), I would pick the others over myself.

 

But I'd have a problem if it became a "chance" for one vs. a guarantee for the other...

The problem is people tend to have a problem when they're the "one" :D

 

Even I would have a problem if, say, a drunk driver with two people in their car pulled into my lane and my car decided to put me into a tree to avoid them.  Why should I have to be sacrificed for a drunk driver?  And then where would the line be drawn?


Ban driverless cars :p

 

(And people who drive drunk should be castrated too)


This reminds me of I, Robot, where the robot calculates that the grown man has a higher chance of survival and saves him over the child. This is why a robot will never be able to make the right decision every time: sometimes "logic" is not the best answer. I really don't understand the push for driverless cars unless they could eliminate the chance of accidents entirely and automate the roads completely, but then we start getting into Skynet levels of paranoia!


Such a decision should never be put into code.  The issue here is the onus of responsibility.

They have to put some decision into code so the car knows where to go.  

 

 

At some point a simple blockage ahead becomes an out-of-control car barreling towards you, and a decision has to be made.


Hmm, allowing immediate manual control at any given point, perhaps? Of course, such an event would be written to the black box, so it could be known with certainty that it wasn't the tin can that made the grave mistake. Not that it solves anything, really...


"The needs of the many, outweigh the needs of the few, or the one."

 

And yes, if I were in that position (the one), I would pick the others over myself.

 

But I'd have a problem if it became a "chance" for one vs. a guarantee for the other...

Driverless cars = Kobayashi Maru  :laugh: 


Before deciding what a robot should do in that circumstance, maybe we should agree on what a human should do. This is a tricky philosophical question on which there is still no consensus.


Before deciding what a robot should do in that circumstance, maybe we should agree on what a human should do. This is a tricky philosophical question on which there is still no consensus.

Well, it does set a precedent: a computer, no matter how well programmed, is a difference engine, weighing the possible chance of three dying, the lesser of two evils, killing one to save two, etc...


Before deciding what a robot should do in that circumstance, maybe we should agree on what a human should do. This is a tricky philosophical question on which there is still no consensus.

 

A consensus will never be found. People can't even agree on which way a toilet paper roll should be placed on the dispenser. I have little hope of this ever being resolved.

 

What should I do? Depends on how fast I'm going and the statistical chance that the impact will have more casualties than if I veered off a cliff.

What would I do? Swerve into traffic.

What should the computer do? Calculate the statistical probability of fatalities in both situations and choose the one with the highest survival rate.

What will it do? Blue Screen.

 

Edit: I would just like to add some perspective so people can see this from a logical point of view.

 

You are not your car's friend. To the car, you are just another human, just like the humans in the other cars. A problem I can see people having is the expectation that the car will place some sentimental weight on your life, but sadly that would destroy the entire system.

 

Put yourself in the shoes of the cars in the opposing lanes (A). What would you want car (B) to do? Drive off a cliff or crash into you? Well, if the car (A) you're piloting is also autonomous, it may choose the option of car (B) going off the cliff, as this saves your life. Now, if both cars put value on their own drivers' lives, the whole thing reverts back to the simple logic of choosing the option with the fewest probable casualties.

 

This really should be a two-part question. Question 2 should be: "If a car has a critical failure and the choice is either swerving into your trajectory, potentially saving its passenger's life at the risk of killing you and another, or driving off a cliff, killing its passenger with near certainty, which option should that car choose?"

 

If I were the programmer, I'd choose the option with the highest probability of survival; it's the logical, reasonable, and just choice.
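
Just to make the "calculate the probabilities" idea concrete, here's a rough sketch of what that decision rule could look like in code. This is purely illustrative; the Maneuver class, the choose_maneuver function, and every survival figure are invented for the sake of the example, not anything a real car actually runs:

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_survival: float  # estimated survival probability for this car's occupants
    others_survival: float    # estimated survival probability for everyone else involved
    occupants: int            # people in this car
    others: int               # people at risk outside this car

def expected_casualties(m: Maneuver) -> float:
    """Expected number of deaths if this maneuver is chosen."""
    return m.occupants * (1 - m.occupant_survival) + m.others * (1 - m.others_survival)

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    # Pick whichever option minimizes expected casualties,
    # with no sentimental weighting for the car's own passenger.
    return min(options, key=expected_casualties)

# Invented numbers: going off the cliff almost certainly kills the owner but
# spares everyone else; hitting the oncoming car gives everyone middling odds.
options = [
    Maneuver("off the cliff", occupant_survival=0.1, others_survival=1.0, occupants=1, others=3),
    Maneuver("hit oncoming car", occupant_survival=0.8, others_survival=0.6, occupants=1, others=3),
]
print(choose_maneuver(options).name)  # -> "off the cliff" (0.9 expected deaths vs 1.4)
```

Which is exactly the article's point: with no special weight on the owner, the car takes you over the cliff.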


People should be able to configure how highly their car prioritizes their own life (e.g. "my life is worth the lives of x other people"). Ethical decisions should be left to people, and the program should just obey.
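
For what it's worth, such a setting would just be one extra parameter on top of the kind of expected-casualty sum sketched a couple of posts up. A toy illustration, where the self_weight knob and all the numbers are made up:

```python
def weighted_casualties(occupant_deaths: float, other_deaths: float,
                        self_weight: float = 1.0) -> float:
    # self_weight > 1 means the owner's life counts for more than one "other" life;
    # self_weight = 1 treats everyone equally.
    return self_weight * occupant_deaths + other_deaths

# Same invented scenario as before: the cliff costs 0.9 expected owner deaths and
# nobody else; swerving costs 0.2 owner deaths plus 1.2 other deaths.
cliff = weighted_casualties(0.9, 0.0, self_weight=3.0)   # 2.7
swerve = weighted_casualties(0.2, 1.2, self_weight=3.0)  # 1.8
print("cliff" if cliff < swerve else "swerve")           # -> "swerve": the owner's setting flips the choice
```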


People should be able to configure how highly their car prioritizes their own life (e.g. "my life is worth the lives of x other people"). Ethical decisions should be left to people, and the program should just obey.

Yes, but then the car sees no reason why children or vulnerable people should not be run over to protect the driver...

I get what you're saying, but a computer cannot make that call.

The safest thing would be to program the computer to protect all road users (drivers, cyclists, etc.) and pedestrians alike, but then it would decide the safest thing to do is drive at a speed so slow that it's impossible to kill anything.


Yes, but then the car sees no reason why children or vulnerable people should not be run over to protect the driver...

I get what you're saying, but a computer cannot make that call.

The safest thing would be to program the computer to protect all road users (drivers, cyclists, etc.) and pedestrians alike, but then it would decide the safest thing to do is drive at a speed so slow that it's impossible to kill anything.

You're right, someone could consider their life to be worth a dozen 80-year-olds, but not even a single child.


"The needs of the many, outweigh the needs of the few, or the one."

 

And yes, if I were in that position (the one), I would pick the others over myself.

 

 

It is nice to think so, but one can never really say for sure until they are in that situation. Self-preservation is a very powerful instinct.


then it would decide the safest thing to do is drive at a speed so slow that it's impossible to kill anything

 

That reminds me of the mythical Quake match. Someone set up a game of Quake with all AI bots, and after four years they had all decided the best winning strategy was not to play at all. Link


If you're dumb enough to own a driverless car, then I say let natural selection take its course.  There are too many variables at play for me to ever trust a computer with control over my car.  If it comes down to buying a car like that or walking, I'll just ride my four-wheeler everywhere.

