
/lit/ - Literature


>> No.20999548
File: 22 KB, 611x611, TrolleyProblem .jpg

Name a single real-world situation to which this problem is applicable.

>> No.20936706
File: 22 KB, 611x611, TrolleyProblem .jpg

There is no situation that cannot fall into a binary of good or evil.

>> No.19404439
File: 23 KB, 611x611, 03CB827B-335A-43DE-891A-9008BC1A0CDF.jpg

Assume no emotions, desires, or preferences exist. Then morality cannot exist, as no one would benefit from or be harmed by any action. Now imagine that an individual has preferences, but the world does not. Then he will certainly prefer some actions over others, even believing that other people have preferences as he does, but in the end, he’s only acting out of his own irrational preferences. Now imagine that he does not have preferences, but that the world does. Then, to him, it does not matter what he does, as all experiences and therefore all actions are equally preferable to him.

So it is clear that morality, if it can be said to exist, is wholly dependent on subjective preferences, which are beyond rationality. The only moral axiom that could exist is that I should do what I will have preferred in the end, though this is circular, as the only justification for why I should do what is preferable is that I prefer it. But any other formulation of morality is no less circular, and certainly more absurd. The word “should” is an odd word, but it can be clarified if you remember that it is always based on some goal. For example, if you want to go to sleep early, you should stop using electronics before bed. But everyone wants to have the least regret, everyone wants to live a preferable life and be happy, so “should” in a general sense is based on this goal. We can say with confidence that certain things are healthy and good for us, but of course, no one truly knows what is best in the long run for the self, just as the utilitarian does not know what is best for the world. After all, any “gray area” in moral decisions is nothing other than the result of not knowing which choice is better for the self.
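
To make that goal-relative reading of “should” concrete, here is a minimal decision-theory sketch in Python: each act is scored by the expected utility of its possible outcomes, and “should” just means “maximizes that score relative to the chosen goal.” The acts, probabilities, and utility numbers are invented for illustration and are not taken from this thread.

```python
# Goal-relative "should" as expected-utility maximization.
# Each act maps to a lottery: a list of (probability, utility) pairs.
# All numbers are illustrative assumptions.

acts = {
    "pull_lever": [(0.9, -1), (0.1, -5)],  # likely one death, small chance of a worse outcome
    "do_nothing": [(0.9, -5), (0.1, -1)],  # likely five deaths
}

def expected_utility(lottery):
    """Sum the probability-weighted utilities of one act's outcomes."""
    return sum(p * u for p, u in lottery)

for act, lottery in acts.items():
    print(act, expected_utility(lottery))

best = max(acts, key=lambda a: expected_utility(acts[a]))
print("preferred act:", best)
```

The “gray area” the post mentions corresponds to uncertainty about the probabilities or the utility numbers themselves; change either and the preferred act can flip.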

>> No.15877904
File: 23 KB, 611x611, 8A83D362-156C-483D-8D93-22F5E67EA937.jpg

>> No.15502688
File: 23 KB, 611x611, 0b0584bf02449513f879837cc95f19e7e0-09-trolley.rsquare.w700.jpg

>>15502523
Isn't this basically our old friend Mr. Trolley, the question of passive vs. active violence?

>> No.15493591 [DELETED]
File: 23 KB, 611x611, 123.jpg

You cannot be guilty of minding your own business.

>> No.15273424
File: 23 KB, 611x611, 0b0584bf02449513f879837cc95f19e7e0-09-trolley.rsquare.w700.jpg

Any good books on decision theory?
