
/sci/ - Science & Math


>> No.9273748
File: 990 KB, 480x270, 1480348877746.gif

i think this is a fundamentally flawed discussion, because it assumes not only that there will be a situation in which braking is impossible, but also that the driver will be unable to take manual control and apply the emergency brake.

if, against all odds, the regular brakes have failed and the emergency brake is not working, the car should immediately hand over control to whoever happens to be in it, thus transferring any moral obligation to the user and conveniently avoiding this whole debate.
if there is no one in the car, the situation is slightly different, and the car is permitted to destroy itself, but i feel like this is an edge case.
if you just want philosophical circlejerking, ask the generalized version of this question, which will definitely get better responses:
>in a situation where human death due to a machine is unavoidable, what "choices" should that machine make?
