
/lit/ - Literature



File: 111 KB, 921x719, trolley.png
No.8668198 [DELETED]

Our trolley problem is finally relevant lads!

Which philosophy should govern self-driving cars?

>> No.8668202

>>8668198
>muh utility

Trash. Become a Luddite

>> No.8668205

>>8668202
>becoming a luddite out of utility

>> No.8668206

>>8668198

You take the chances in trusting your fucking family to an A.I, you take the hit if it fucks up.

>> No.8668211

>>8668206
This

But
>implying we get the choice

>> No.8668213

Why would the car be barreling on towards a stationary barrier?

>> No.8668215

>>8668206
>>8668211
Non-self-driving cars will probably be banned, yes.

Good riddance as well; people cause too many deaths, machines are superior.

>> No.8668216

Irrelevant, it would have sensed the barrier earlier.

>> No.8668219

>>8668213
http://moralmachine.mit.edu/

>> No.8668225

>>8668216
and if the brakes have failed?

>> No.8668272

>>8668225
>>8668219
It is a dumb, shitty meme-tier situation made up to drum up fear about self driving cars.

It's not like the cars are seriously going to have a piece of code that says

if passengers > pedestrians:
    cream pedestrians
else:
    explode

When you try to actually apply it instead of just leaving it as a philosophy question, you have to answer things like who put the fucking barrier in the middle of the road?
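
To be fair to the other side of the argument, an actual planner would more plausibly score candidate maneuvers with a weighted cost function than branch on a head-count. A minimal sketch, purely illustrative; every weight, field name, and number here is invented:

```python
# Purely illustrative sketch: instead of an explicit "kill X" branch,
# a planner scores each candidate maneuver with a cost function and
# picks the cheapest. All weights and numbers below are invented.

def maneuver_cost(maneuver, w_collision=1000.0, w_comfort=1.0):
    """Weighted sum of estimated collision probability and harshness of braking."""
    return (w_collision * maneuver["collision_prob"]
            + w_comfort * maneuver["deceleration"])

def pick_maneuver(candidates):
    """Choose the candidate maneuver with the lowest cost."""
    return min(candidates, key=maneuver_cost)

candidates = [
    {"name": "brake_straight", "collision_prob": 0.9, "deceleration": 8.0},
    {"name": "brake_and_swerve", "collision_prob": 0.2, "deceleration": 6.0},
]
print(pick_maneuver(candidates)["name"])  # brake_and_swerve
```

The "moral" outcome falls out of whatever weights the engineers chose, which is the point: nobody writes the if/else, but somebody still sets the weights.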

>> No.8668280

>>8668272
>if passengers > pedestrians
>cream pedestrians
But that's exactly what they will have:
https://www.inverse.com/article/22204-mercedes-benz-self-driving-cars-ai-ethics

>> No.8668281

>>8668272
>Its not like the cars are seriously going to have a piece of code that says
>if passengers > pedestrians
>cream pedestrians
>else explode
On some level it will. That is, in an unusual situation (one that could occur for any number of reasons) it will have to make decisions, and decide on outcomes, that we consider to be in the realm of moral judgement.

>you have to answer things like who put the fucking barrier in the middle of the road?
It doesn't matter, but for the sake of it another robot did it to avoid crushing that couple and their stupid dog a moment before.

>> No.8668290

>>8668272
I understand if you don't like hypothetical questions and that's fine. Many people on the spectrum have trouble with that kind of abstraction. Regardless, the details of the problem are not the point. The point is to decide how you choose to resolve an ethical dilemma when your agency is essentially being transferred to a mass-produced product.

And who put the barrier in the road isn't within the realm of your AI to address--the car won't be able to move the barrier. This isn't about addressing fault as much as it's about decision.

>> No.8668292

>>8668272
>being this uncomfortable

>> No.8668294

From my perspective it should ALWAYS protect the driver, and beyond that not intervene. Who the fuck would get in, much less buy, a car that, given the choice, would kill you?

>> No.8668299

the real question is what if they're jay walking

>> No.8668300

>>8668206
>2016
>having a family

>> No.8668301

>>8668198
>Dog gets to sit in the front seat instead of any of the 3 people in the back

>> No.8668302

>>8668294
>its more profitable so I choose this

Ideology is strong with this one

>> No.8668305

>>8668198
Do a 180, hope the shock doesn't break any necks. You shouldn't be going fast enough on a city road to make this impossible, because I've had to do it on a highway while it was pissing rain with about as much space as there is in that scenario, and no one was hurt.

Is this on a bridge or something? That's the only scenario where there would only be two options.

>> No.8668307

>>8668280
>99 percent of our engineering work is to prevent these situations from happening at all
And engineering ethics 101: that still won't be enough to prevent tragedy. People will push the system to breaking point eventually.

That said it could be useful in reducing certain accidents or outcomes. For example, we may all know about crumple zones, but because of the instinct to avoid and swerve drivers often catch the corner of a car rather than involving the whole bonnet/engine area. AI could overcome this.

>> No.8668308

>>8668302
I don't mean profitable. I mean from a personal selfish perspective. I would never purchase or use a car that doesn't value my safety above all else. Sure that is a selfish position, but in practice most humans are selfish.

>> No.8668312

>>8668308
That doesn't make much sense; you've as much chance of being the pedestrian getting hit.

>> No.8668313

>>8668198
there is no dilemma here because such a situation should literally never happen

it's literally the same thing as putting only half the life vests on a boat and then asking whether men or women should be sacrificed

>> No.8668314

>>8668280
As far as I can tell, that isn't what the article is saying; it says the driver will always be prioritized, no matter the number of passengers or pedestrians.

>>8668281
>>8668290
Look fuckface, it's not just a hypothetical (like the actual trolley problem, which I have no issue with) but an actual implementation, so it does have to deal with the intricacies of the situation. For example, hitting the wall does not mean death for the passengers: they have crumple zones and airbags, and if the car can sense the wall from far enough away to swerve, it would be able to slow down a lot anyway.

>> No.8668316

>>8668308
>I would never purchase or use a car that doesn't value my safety above all else.
Isn't this the typical starting point for the crazy-AI-kills-humanity-for-its-own-good trope?

>> No.8668317

>>8668299
That's easy. Kill them, even if there are safe alternatives.

>> No.8668319

>>8668206
>You take the chances in trusting your fucking family to an A.I
gladly

>> No.8668320

The more pedestrians self-driving cars kill, the fewer pedestrians there will be, the less likely it will be for them to get hit by the cars.

>> No.8668323

>>8668317
A death sentence for jaywalking, because car AIs choose to kill the passengers, making it equivalent to murder.

>>8668314
It's the Ford Pinto 9000 and even a crash at 20 mph will cause the doors to jam shut and the fuel tank to explode barbecuing everyone.

>> No.8668326

>>8668314
This is still arguing over details, friend.
Let's boil it down to:
The car has two options
>kill all the passengers
>kill a crowd of pedestrians that aren't acting in any aberrant manner
Which do you program the car to choose?

>> No.8668339

>>8668326
>The car has two options
>>kill all the passengers
>>kill a crowd of pedestrians that aren't acting in any aberrant manner
>Which do you program the car to choose?
This really needs a kind of WarGames/Death Race mashup I feel.

>> No.8668340

>>8668312
Yes, but I would not use a tool that does not value my safety. The difference is the active choice to participate. If given the ability to choose, I will choose the outcome that values my safety. I have no choice in participating as a pedestrian, and in that case there is no difference between the self-driving car and a human driver, as both would value the passengers. Whereas with a driver I choose myself, but with the self-driving car I do not.

>>8668316
This thing isn't an AI, glorifying it as anything beyond what it is muddles the issue.

>> No.8668352

>>8668326
From the perspective of someone using the car (active choice of participation), I choose for it to value my safety. Non-choice of participation, the pedestrian's, is not relevant to the activity of the car.

>> No.8668353

>>8668339
the Carmageddon option

>> No.8668354

>>8668340
>This thing isn't an AI
That's also part of the trope...
IT'S HAPPENING

>> No.8668362

>>8668326
but that isn't how you program the car. There is no option to just say "kill pedestrians" or "save pedestrians"; you have to build the whole thing, program its responses to a whole set of situations, which happens to include this retarded edge case. For example: the car is coming around a bend, and there is a dead car stalled out just past the turn. What does the car do? Apply the brakes as much as it can while still negotiating the turn. If it is able to stop in time, then it was going slow enough to be safe. Same deal with this stupid fucking pedestrian-killing scenario. Why would the car not be able to stop? It would literally never happen.
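
For what it's worth, the "if it can stop in time it was going slow enough" point is just the constant-deceleration stopping-distance formula, d = v² / (2a). A quick sanity check; the 8 m/s² deceleration is an assumed figure for hard braking on dry asphalt, not a number from the thread:

```python
# Sanity check for "if it can stop in time, it was going slow enough":
# constant-deceleration stopping distance d = v^2 / (2a).
# The 8 m/s^2 default is an assumed figure for hard braking on dry asphalt.

def stopping_distance(speed_ms, decel_ms2=8.0):
    """Distance in metres needed to brake from speed_ms (m/s) to zero."""
    return speed_ms ** 2 / (2 * decel_ms2)

print(stopping_distance(13.9))  # ~50 km/h city speed: roughly 12 m
print(stopping_distance(27.8))  # ~100 km/h: four times the distance, ~48 m
```

Since distance grows with the square of speed, doubling the speed quadruples the road the car needs, which is why the scenario hinges on how late the barrier is detected.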

>> No.8668365

>>8668354
I actually work with machine learning. Not with cars but with speech recognition. There is less chance of the car becoming sapient than of me winning the lottery eight consecutive times while being fellated by a supermodel, so essentially none.

>> No.8668369

>>8668365
So you're telling me there's a chance

>> No.8668372

>>8668365
>There is less chance than the car becoming sapient than of me winning the lottery eight consecutive times while being fellated by a supermodel, so none essentially.
So about the same odds as humans being sapient.

>> No.8668374

>>8668362
Your control system cannot be designed to account for any and all possible system failures if you want the car to be able to move at all.

>> No.8668377

>>8668374
brake failures and steering failures are extremely rare, and the autonomous system must be rock solid for this to even be under consideration

>> No.8668379

>>8668372
>>8668369
Neither of you understands how computers work, their structure, or how they grow when it comes to things like deep learning. I was giving it that slim chance because, hey, nothing is impossible.

But no, I'll just say it. As they are, it is simply impossible.

>> No.8668387
File: 98 KB, 594x268, 1368327406770.png

>> No.8668390

The responsibility of an AI is to the people who have purchased it and entrusted their lives to its abilities. It should kill pedestrians first.

Self driving cars will never work in America because some loser at google will argue against programming them to avoid black neighborhoods, they'll drive places where the white man has no business, some nigger will jump out in front of the car to engage its automatic pedestrian preservation protocol, rob the passengers and probably kill/rape them before letting the car drive off to its destination.

That'll happen once and nobody will trust their lives to these pieces of shit again. That is unless we let the AI make its own moral decisions since time and time again we've seen that artificial intelligence unchecked becomes extremely racist pretty quickly.

>> No.8668398

>>8668387
Choose the bottom track, as the 5 bodies are more likely to stop the train and save the poor lonely asshole on the top track. I'll buy him a beer if he lives.

>> No.8668400

>>8668372
Checkm8 athiests

>> No.8668408

>>8668390
pure ideology

>> No.8668411

>>8668198
Protect the driver.

The situation will happen once every 10 years in the whole world, so it's really wankery at this point. Anyway, it will be covered by insurance.

>> No.8668412

>>8668408
Not an argument.

>> No.8668425

>>8668390
Just wait for the Asian model.
Air China has recently made the news here in Bongland for issuing maps to Chinese tourists about avoiding muslim and negro areas in London. There were other Asian travel companies accused of this. You may not trust Chinese engineering, but the Japs will probably make something better anyway.

Or you could just take the car running on FreeBSD.

>> No.8668431

Would you purchase a car that would value the safety of others over your own?

I am actually curious who here would.

>> No.8668432
File: 3.98 MB, 1440x2560, Screenshot_20161028-141258.png

A self-driving car should always be programmed to protect the occupants first, period. You can't sell me something that is programmed to put me in harm's way rather than injure or kill another person who is not in the car. In that situation the car should swerve towards the people and apply maximum braking to reduce the speed of the collision.

The core logic of a self-driving vehicle must always be such that occupant safety has absolute priority. I am not going to consent to having control taken out of my hands so my car can get me killed on account of what some government regulatory agency thinks is right or wrong.

>> No.8668457

>>8668432
people like you should be hanged

>> No.8668462

>>8668457
So you would purchase a product that by design would choose the lives of others over your own, rather than drive yourself? If you're suicidal, do us all a favor and do it in the privacy of your own home rather than forcing it on others.

>> No.8668464

>>8668462
I choose a product that results in the most net safety

if that means giving up a little of mine so others can have more, so be it

>> No.8668469
File: 55 KB, 500x503, Linkola_Pentti.jpg

>>8668198
Steer into the people in the crosswalk then drive the car off a bridge Tbh

Yes edgelord, come at me

>> No.8668479

>>8668464
While driving do you not value your own safety?

>> No.8668485

>>8668379
>Neither of you understand how computers work, their structure or how they grow when it comes to things like deep learning.
Right-o, but on the other hand, as someone working with computers and deep learning: neither do you, probably. It's one of those "it just works!" pieces of shit.

>> No.8668572

>>8668464

Nu male detected

Fuck off cuck

>> No.8668589

>>8668469
WHY THE FUCK DO I ALWAYS THINK LINKOLA IS TOM WAITS

>> No.8668605

>>8668319
Exactly. It will be statistically a better driver than I.

>> No.8668628

>>8668225
Hit the nearest tree or lamppost.

>> No.8668633

>>8668300
>2016
>having a family
>being on 4ch

>> No.8668641

>>8668280
Who would pay for a car that sacrifices the occupants?

>> No.8668650

how is this a binary decision?

the car would try to avoid everything well before this situation even arose and would succeed in doing so

>> No.8668659

>>8668650
>and would succeed in doing so

Because freak accidents never happen when machines are involved?

>> No.8668715

>>8668659
you don't design around freak accidents

>> No.8668736

>>8668469
based pentii

>> No.8668757

>>8668280
This stuff is very troubling. The suggestion that they should act in any other way than utilitarian is disgusting and frankly quite scary

>whoops, the car drove into the kindergarten kids, well, it's the AI, whatchu gonna do :,)

>> No.8668779

>>8668294
This. And everyone else will think like this once the cars are on sale, you aren't going to buy a thing that might choose to kill you. Cars will protect the driver first or they won't sell.

>> No.8668783

>>8668412
sometimes when someone posts bullshit it doesn't deserve an argument

>> No.8668793

What would a human-controlled car do? Probably put on the brakes, and wonder why there is a big concrete barrier on the road.

>> No.8668805

>>8668715
Yeah, that's why ships have lifeboats, you fucking retard

>> No.8668816

>>8668805
a ship sinking isn't a freak accident

a ship having a meteor fall on it is

>> No.8668886

>>8668198
Brake.

>> No.8668893

>>8668805
not sure if trolling lol

>> No.8668913

why can't it just stop?

>> No.8668914

>>8668425
>accused of this
>accused
you make it sound like it's a crime for someone to share public information about population density in a city with a customer

>> No.8668915

>>8668215
>non self driving cars are banned
>cops can't catch me because they can't drive

>> No.8668921

>>8668915
Why would they have to catch you if they just lock and shut off your car?

>> No.8668922

>>8668921
Because I'm not in a cuck car

>> No.8668929

if i ever get killed by a self driving car im gonna be so mad

i hate technology and i hate stemfags. and before anyone talks about the fact that im posting on the internet, id be just as happy bantering at the money-changing tables of jerusalem or athens

>> No.8668934

>>8668922
Then you're on foot. Normal cars will be shot on sight.

>> No.8668941

>>8668934
I'll drift around the bullets.

>> No.8668945

>>8668198
Just pad all cars with several feet of foam and rubber, make all cars bumper cars

>> No.8668950

>>8668941
On your DC skate shoes, lad?

>> No.8668966

>>8668929
You realize there are groups that meet and talk without technology.

My main gripe is with people who shout about how much they hate technology while keeping only the pieces of it that provide them comfort: food preservation, heating, electric lighting, air conditioning, modern medicine, etc. Unless you're saying you actually want to live like someone in the 1st century. If so, then there is really no helping you.

>le born in wrong generation

>> No.8668970

>>8668950
>he doesn't wear heelys everywhere he goes

>> No.8668977

>>8668198
the car should brake

>> No.8668985

>>8668198
The people should not cross the street when there is a car uncontrollably heading toward them and a barrier

>> No.8669025

People in this thread who ask questions like "Why doesn't it do (this)?", "It should not do (this)", and "This should never happen" are the people in life who escape.
It is normal self-defense to deny the dilemma at first. It is why you are in the situation in life you are in right now. Denying.
The hypothetical is set out and presented; no questions help with the choosing, same as in your life. You can keep denying, but someone is going to die. So choose.

>> No.8669043

>>8668198
Hit the pedestrians and sue their families for the damage they did to your car. They were walking when the signal told them to stop. They are at fault.

>> No.8669407

>>8668628
in what country is lamppost a diss?

>> No.8669421

>>8668280
Fucking cagers, scum of the earth

>> No.8669445

>>8668294
>>8668317
>>8668320
>>8668390
>>8668411
>>8668432
>>8668462
>>8668572
>>8669043
See >>8669421
Cagers should be gassed, or better yet, be exterminated via being ran over.
Anyone who disagrees is a flyover pleb whose opinion is wrong and irrelevant
>>8668308
>human nature
gb2 >>>/pol/ and then promptly kys

>> No.8669449

>the state of practical philosophy
lmao

>> No.8669466

>>8669025
sheep

>> No.8669478

FUCKING IDIOTS

The person who bears the risk of using something should always be the person using it; that's called RESPONSIBILITY. I don't care if there are literally 100 people in the car like a goddamn clown car: you get into the car, you're the one putting your trust in it.

But this question isn't going to be decided by ethics, even though ethics will offer a nifty rationalization for the final decision. No, no, no. This is the simple math that will decide how this question is settled.

P(S1) * P(C1) > P(S2) * P(C2)

P(S) = probability of success of lawsuit
P(C) = probability of capacity for lawsuit

1 = drivers, 2 = pedestrians

By consistently killing the people responsible, i.e. the "drivers", they would be opening themselves up to a class-action lawsuit by people who all have money, as demonstrated by the fact that they bought self-driving cars. By consistently killing the people not responsible, only a much smaller fraction, who are much harder to organize, have the capacity for lawsuits.

Morality is dead. Long live high time preference economics.
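
The inequality above can be read as comparing expected lawsuit exposure under each policy. A toy version in Python; every probability here is invented for illustration:

```python
# Toy version of the inequality above: compare expected lawsuit exposure
# under each policy. Every probability here is invented for illustration.

def expected_liability(p_success, p_capacity, damages=1.0):
    """Expected exposure: P(success of lawsuit) * P(capacity for lawsuit) * damages."""
    return p_success * p_capacity * damages

# Assumed numbers: owners (group 1) are organized and solvent,
# pedestrians (group 2) less so.
liability_if_killing_drivers = expected_liability(p_success=0.6, p_capacity=0.8)
liability_if_killing_pedestrians = expected_liability(p_success=0.5, p_capacity=0.3)

# Under these made-up numbers, the manufacturer prefers the policy
# with the lower expected liability.
print(liability_if_killing_drivers > liability_if_killing_pedestrians)  # True
```

Whichever side of the inequality is larger is the policy the manufacturer avoids; the argument stands or falls entirely on what the real probabilities are.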

>> No.8669517

sd

>> No.8669533

>>8669478
>The person who bears the risk of using something should always be the person using it

Says who? You?

>> No.8669539

>>8669478
The passenger is not the driver, hence the alarmingly accurate appellation "autonomous vehicle." Being poor doesn't automatically make the people one envies responsible, no matter how much their slave morality tells them otherwise. The responsible party would be the manufacturer, which is why they're asking the question in the first place. Do you also blame people who purchased a Note 7 for the thing exploding?

>class action lawsuit by people who all have money
Class action lawsuits are most often brought by lawyers that have literally no connection to the injured parties. They're a sham and the people actually affected are lucky to get a few cents in damages as a result; the attorneys walk away with the bulk of it.

>> No.8669576

>>8669539
>Do you also blame people who purchased a Note 7 for the thing exploding?
Yes
Fuck off consumerist shithead

>> No.8669585

>>8669539

Whatever, people are going to stick bombs in these things as soon as they come to market and it's not my place of work they'll be blowing up so fuck you.

>> No.8669587

>>8669576
Why shouldn't we be consumerist? We are born consumers and we die consumers; everybody on earth is a consumer in some way. What about favoring and working for the benefit of the consumer is wrong?

>> No.8669607

>>8669587
PURE

>> No.8669612

>>8668198
The retards in the car chose to put their lives and the lives of those pedestrians in the hands of a machine. They deserve to die.

>> No.8669639

>>8669607
LAST MEN

>> No.8669647

So, let's say the AI decides to run down the pedestrians. Who's at fault here? The manufacturer, right? I really hope they don't blame the driver.

>> No.8669679

>>8669585
That's really hateful, friend.

>> No.8669705

>>8669679

Thanks

>> No.8669717

>>8668198
Hit the brakes, obviously. If the brakes don't work then there's not much you could do with or without AI; it's going to have to crash somehow to stop. The only question is how fast it's going when it realizes the brakes don't work, and how much road it has to slow down so it crashes at a safe speed.