
/lit/ - Literature



File: 265 KB, 700x627, 1515029264042.png
No.13786717

Well?

>> No.13786720

>>13786717
the trolley isn't even moving

>> No.13786725

>>13786720
Also, there is no trolley, human being, or robot; it's only a drawing.

>> No.13786726
File: 251 KB, 700x627, zoom.png

>>13786720
how about now?

>> No.13786728
File: 65 KB, 900x883, 1567051346291.jpg

It would throw itself into the trolley's axles.

>> No.13786730

Let the trolley kill the group, then stomp the single person to death.

>> No.13786736

If you had read the book, you would know the police robots were able to hurt human beings, but only with extreme discomfort.

>> No.13786738

>>13786717
0th law says that humanity as a whole takes precedence, i.e. the needs of the many outweigh the needs of the few.

>> No.13786743

>a law can never be broken

>> No.13786746

>>13786725
Also, we don't know if the robot has free will, and we can't hold it responsible for anything it does, because it doesn't do anything by itself. It's the same as if a stone lay near the lever; robots are just objects.

>> No.13786747

>>13786738
That's how you get an AI overlord in the books.

>> No.13786752

Realistically, I would immediately walk away as far as possible and try to avoid being seen, so that I'm not mixed up in any of this bullshit from a legal standpoint. I probably wouldn't like those five people anyway. Welcome to China.

>> No.13786779

>>13786743
idiot

>> No.13786783

>>13786746
yes anon you're very cute, have a biscuit

>> No.13786929

>>13786779

My facetious comment aside, if you think AI could ever be programmed with the certainty of a bullshit scientific law from a sci-fi novel, then you're retarded

probably a grey goo believing soiboi

>> No.13786951

>>13786929
I used to like Asimov, but studying AI at uni made me lose all respect for the hack

>> No.13787004

>>13786929
imagine being this heiney-massacred over intentionally flawed fictional laws

>>13786951
easy to lose respect for something you never understood in the first place

>> No.13787030

>>13786717
So after upgrading to 1903, when I hit the windows key the "Type to search" box appears for a brief moment, but then it disappears before I can 'type to search'. Does anyone know how to fix this besides "creating a new profile"?

>> No.13787056

>>13786951
The whole point of his AI system was that they had human-like minds and even personalities. It was to write good stories, not to be accurate to something that doesn't even exist now, let alone back when he was writing his books.

>> No.13787087

>>13786717
He kind of covers that in the books: there are different levels of priority inside each of the laws. The robot would likely choose the option that saved the most lives.

>> No.13787179

Turn the crank halfway and shoot the gap

>> No.13787203

>>13786730
And then masterbait to it :^)

>> No.13787273

It's better to kill those four people than the single person. If you kill the one person, the other four might find a way to antagonize you and overwhelm you with their numbers. If the one person lives, he would think twice about messing with a crazy fucker who just murdered four people.

>> No.13787364

>>13786726
IT'S TOO FAST

>> No.13787402

>>13786728
this

>> No.13787556

Methinks we should find the guy who keeps letting trolleys loose.

>> No.13787888

>>13786717
The robot would have to request that a human override the trolley lever. It cannot resolve this situation without being sent to robot jail, for breaking robot law.