
Should Your Driverless Car Kill You to Save Two Other People?



#1 Ironman273

Ironman273

    Neowinian Fanatic

  • 8,258 posts
  • Joined: 26-October 01
  • Location: Florida
  • OS: Windows 8.1 Pro (Work) Windows 10 (Home)
  • Phone: Nokia Lumia 830

Posted 13 May 2014 - 02:21

Should Your Driverless Car Kill You to Save Two Other People?

 


 

There's a train speeding down the tracks towards five innocent people who will never get away in time. You can save them by pulling a switch, but it'll kill another person on a different track. It's a thought experiment people have debated for ages, but it's about to become a real dilemma when we program robots to pull the switch, or not.

 

Popular Science explains how the classic hypothetical question becomes real:

A front tire blows, and your autonomous SUV swerves. But rather than veering left, into the opposing lane of traffic, the robotic vehicle steers right. Brakes engage, the system tries to correct itself, but there's too much momentum. Like a cornball stunt in a bad action movie, you are over the cliff, in free fall.
Your robot, the one you paid good money for, has chosen to kill you.

 

Maybe the robot itself didn't decide to kill you. Maybe its programmer did. Or the executive who decided on that company policy. Or the legislators who wrote the answer to that question into law. But someone, somewhere authorized a robot to act.

 

That's not the only possible situation either. As Patrick Lin asks in Wired, when faced with a choice of hitting one of two cars or people, what criteria should a driverless car use to pick its target? The future holds a whole bunch of complicated robo-ethics questions we're going to have to hammer out eventually, but in the meantime let's start with one:

 

Should a driverless car be authorized to kill you?

 

Source: Gizmodo




#2 mastercoms

mastercoms

    Expert Microsoft Fanboy & C# Coder

  • 1,189 posts
  • Joined: 21-May 13
  • Location: Marietta, Georgia
  • OS: W10 + Fedora 21
  • Phone: Lumia 928 WP8.1U1 Black

Posted 13 May 2014 - 02:23

Well, if every driverless car killed its user whenever an accident seemed possible, then the people one car was trying to save would just be killed by their own cars too.



#3 Hum

Hum

    totally wAcKed

  • 63,471 posts
  • Joined: 05-October 03
  • Location: Odder Space
  • OS: Windows XP, 7

Posted 13 May 2014 - 02:26

Should a driverless car be authorized to kill you?

 

Yes -- there are too many people.



#4 Raa

Raa

    Resident president

  • 12,978 posts
  • Joined: 03-April 02
  • Location: NSW, Australia

Posted 13 May 2014 - 02:33

"The needs of the many, outweigh the needs of the few, or the one."

 

And yes if I was in that position (the one), I would pick the others over myself.

 

But i'd have a problem if it became a "chance" for one vs a guarantee for the other...



#5 Praetor

Praetor

    ASCii / ANSi Designer

  • 3,500 posts
  • Joined: 05-June 02
  • Location: Lisbon
  • OS: Windows Eight dot One dot One 1!one

Posted 13 May 2014 - 02:37


Should a driverless car be authorized to kill you?

 

if her name's Christine, then yes.



#6 OP Ironman273

Ironman273

    Neowinian Fanatic

  • 8,258 posts
  • Joined: 26-October 01
  • Location: Florida
  • OS: Windows 8.1 Pro (Work) Windows 10 (Home)
  • Phone: Nokia Lumia 830

Posted 13 May 2014 - 02:37

"The needs of the many, outweigh the needs of the few, or the one."

 

And yes if I was in that position (the one), I would pick the others over myself.

 

But i'd have a problem if it became a "chance" for one vs a guarantee for the other...

The problem is people tend to have a problem when they're the "one" :D

 

Even I would have a problem if, say, a drunk driver with 2 people in their car pulled into my lane and my car decided to put me into a tree to avoid them.  Why should I have to be sacrificed for a drunk driver?  And then where would the line be drawn?



#7 Aheer.R.S.

Aheer.R.S.

    I cannot Teach Him, the Boy has no Patience!

  • 12,033 posts
  • Joined: 15-October 10

Posted 13 May 2014 - 02:44

Ban driverless cars :p

 

(And people who drive drunk should be castrated too)


Edited by Daniel F., 13 May 2014 - 10:19. Reason: edit for language


#8 Skiver

Skiver

    Neowinian Senior

  • 3,796 posts
  • Joined: 10-October 05
  • Location: UK, Reading

Posted 13 May 2014 - 12:33

This reminds me of I, Robot, where the robot calculates that the grown man has a higher chance of survival and saves him over the child. This is where a robot will never be able to make the right decision every time, because sometimes "logic" is not the best answer. I really don't understand the push for driverless cars unless they could 100% take away the chance of accidents and automate the roads completely, but then we start getting into Skynet levels of paranoia!
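
The "logic" in that scene is basically expected-survival maximization. A toy Python sketch of that decision rule, purely illustrative (the names and probabilities are invented for the example, not taken from any real system):

# Toy sketch of the "I, Robot" decision rule: save whoever
# maximizes expected survivors. Everything here is hypothetical.

def expected_survivors(options):
    """Map each possible rescue to its expected number of survivors."""
    return {choice: sum(probs) for choice, probs in options.items()}

options = {
    "save_adult": [0.45],  # the grown man's estimated survival chance
    "save_child": [0.11],  # the child's estimated survival chance
}

scores = expected_survivors(options)
print(max(scores, key=scores.get))  # -> "save_adult"

A rule like this will pick the adult every single time, however that sits with us, which is exactly why "logic" alone isn't the answer.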



#9 +Nik L

Nik L

    Where's my pants?

  • 34,367 posts
  • Joined: 14-January 03

Posted 13 May 2014 - 12:40

Such a decision should never be put into code.  The issue here is the onus of responsibility.



#10 OP Ironman273

Ironman273

    Neowinian Fanatic

  • 8,258 posts
  • Joined: 26-October 01
  • Location: Florida
  • OS: Windows 8.1 Pro (Work) Windows 10 (Home)
  • Phone: Nokia Lumia 830

Posted 13 May 2014 - 13:17

Such a decision should never be put into code.  The issue here is the onus of responsibility.

They have to put some decision into code so the car knows where to go.

At some point a simple blockage ahead becomes an out-of-control car barreling towards you, and a decision has to be made.
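
Even a crude sketch makes the point: there is no neutral default, because weighing my safety against everyone else's is itself the decision someone has to author. A purely hypothetical Python illustration (the maneuvers, harm scores, and weights are all invented; no real car works this way):

# Hypothetical sketch: even "just avoid the blockage" forces the
# software to rank outcomes. All numbers are invented for illustration.

def choose_maneuver(maneuvers, w_occupants=1.0, w_others=1.0):
    """Pick the maneuver with the lowest weighted harm estimate.

    maneuvers maps a name to (harm_to_occupants, harm_to_others),
    each a rough 0-to-1 estimate. The weights ARE the ethics policy.
    """
    return min(
        maneuvers,
        key=lambda m: w_occupants * maneuvers[m][0] + w_others * maneuvers[m][1],
    )

maneuvers = {
    "brake_straight": (0.7, 0.3),  # plow into the out-of-control car
    "swerve_to_tree": (0.9, 0.0),  # sacrifice the occupant
    "swerve_to_lane": (0.2, 0.8),  # endanger oncoming traffic
}

print(choose_maneuver(maneuvers))  # -> "swerve_to_tree"

With equal weights it puts me into the tree, which is exactly the line-drawing problem: whoever sets those weights has decided whose life counts for how much.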



#11 Phouchg

Phouchg

    has stopped responding

  • 5,689 posts
  • Joined: 28-March 11

Posted 13 May 2014 - 13:28

Hmm, allowing immediate manual control at any given point, perhaps? Of course, such an event would be written to the black box, so that it can be known with certainty that it wasn't the tin can that made the grave mistake. Not that it solves anything, really...
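
In code terms that is just two requirements: hand over control the instant the human asks, and log the handover so responsibility can be established afterwards. A toy Python sketch, entirely hypothetical (no real vehicle interface is being described):

# Toy sketch: unconditional manual override, with every handover
# recorded to an append-only black box. All names are hypothetical.

import time

class BlackBox:
    def __init__(self):
        self.events = []

    def record(self, description):
        # Append-only log of (timestamp, what happened).
        self.events.append((time.time(), description))

class Car:
    def __init__(self, black_box):
        self.black_box = black_box
        self.controller = "autopilot"

    def request_manual_control(self):
        # Granted immediately, no questions asked.
        self.controller = "human"
        self.black_box.record("manual override: human took control")

box = BlackBox()
car = Car(box)
car.request_manual_control()
print(box.events)  # proof of who was in control, and when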



#12 xrobwx

xrobwx

    Leave the gun. Take the cannoli.

  • 1,148 posts
  • Joined: 14-June 03
  • Location: Panama City Beach, FL USA
  • OS: Win 8.1
  • Phone: Galaxy Note II

Posted 13 May 2014 - 13:30

"The needs of the many, outweigh the needs of the few, or the one."

 

And yes if I was in that position (the one), I would pick the others over myself.

 

But i'd have a problem if it became a "chance" for one vs a guarantee for the other...

Driverless cars = Kobayashi Maru  :laugh: 



#13 Lord Method Man

Lord Method Man

    Banned

  • 3,758 posts
  • Joined: 18-September 12

Posted 13 May 2014 - 13:37

Human decisions are removed from defensive driving. Google begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th.



#14 Hum

Hum

    totally wAcKed

  • 63,471 posts
  • Joined: 05-October 03
  • Location: Odder Space
  • OS: Windows XP, 7

Posted 14 May 2014 - 01:35

if her name's Christine, then yes.

... or Sarah Connor



#15 Andre S.

Andre S.

    Asik

  • 7,870 posts
  • Joined: 26-October 05

Posted 14 May 2014 - 01:39

Before deciding what a robot should do in that circumstance, maybe we should agree on what a human should do. This is a tricky philosophical question on which there is still no consensus.