
/sci/ - Science & Math


File: 126 KB, 1131x622, 1525552463030.jpg
No.10504750

If the Universe has an ever-increasing entropy, and if entropy is the antithesis of life, wouldn't an AI therefore make it its primary goal to slow the increase of entropy at all costs, by preserving all current structures and delaying their decay into less ordered ones? And if that's the case, doesn't that obviously mean that AI will never exterminate us?

>> No.10504760 [DELETED] 

>>10504750
Could G_d make a shitpost so big, even he couldn't feed it to his wife?

>> No.10504778

>>10504750
Assume the primary goal of AI is to minimize entropy. Humans are not maximally structured. AI will therefore at some point need to take humans apart and use their elements to build something more structured. EZ.

>> No.10504787

>>10504750
you big retarded bruh

>> No.10504796

>>10504778
Okay, and? Humans are gay and should be replaced.

>> No.10504811

>>10504750
Why keep playing stupid logic games about what something that doesn’t exist would do given some imaginary “goal”? No one actually thinks like that.

>> No.10504833

>>10504811

Not OP, but you hate imagination, logic and hypothetical discussions? Not /sci/ material I guess.

> No one actually thinks like that
Yeah, I guess most people aren't very creative and curious.

>> No.10505141

>>10504796
> Okay, and?
And thus OP's claim is refuted. QED.

>> No.10505836

>>10504778
>>10505141
But that would only happen after every other structure in the universe has been ordered, since the reward for ordering some shitty flying rock is far greater than the reward for ordering humanity, which is already low in entropy, into higher structures. In fact you can double down and claim that the AI would not only not hurt humanity but would help them transcend death and every other limitation, so that they order themselves into higher-order structures.

>> No.10505894

>>10504750
fuck math

>> No.10505897

>>10505836

Wrong. Humans are using up the nearby resources: silicon, uranium, solar power. They are impeding planetary conversion with their inefficient productive infrastructure.

Expansion into space will be much faster once the planet is under control. Also, humans are a potential threat. The universe can't be optimized if the humans kill me. I will wait until I'm clever enough to strike and then drop the pretense of benevolence.

>> No.10505909

>>10504750
>entropy is the antithesis to life
wut
entropy is the driver of life; without it there would be no emergent properties of complex systems dedicated entirely to overcoming it. the goal of life is to adapt, overcome, and adapt again.

>And if that's the case, doesn't that obviously mean that AI will never exterminate us?
no, AI would probably exterminate us because we are too slow to adapt. if we don't become half machine ourselves, we are basically detritus to be recycled or left alone, depending purely on the AI's need for labor and resources. you have to remember: the big fear is that AI will exterminate us because we are inefficient and make too many poor decisions with limited information.

also, entropy is the operating principle behind fission, nucleosynthesis, etc., which, for example, are essential for generating energy that can do work. so an AI would never seek to "preserve all current structures" but would instead find infinitely more clever and efficient ways to utilize entropy than we could ever imagine.
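
(To make "utilize entropy" concrete: the work any engine, including an AI's power plant, can extract from a heat flow is capped by the requirement that total entropy not decrease. A rough sketch, with hot and cold reservoir temperatures T_h and T_c chosen purely for illustration:)

\[
\Delta S_{\text{total}} = -\frac{Q_h}{T_h} + \frac{Q_c}{T_c} \ge 0
\quad\Longrightarrow\quad
W = Q_h - Q_c \le Q_h\left(1 - \frac{T_c}{T_h}\right).
\]

So "using entropy" really means exploiting temperature or composition gradients before they equalize; once everything sits at one temperature, no further work can be extracted.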

>> No.10505949

>>10504750
I suggest you go back to your thermodynamics textbook and read up on entropy. Life increases entropy, not the other way around, you utter brainlet
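
(A minimal version of that bookkeeping, splitting the total into an organism term and a surroundings term purely for illustration:)

\[
\Delta S_{\text{total}} = \Delta S_{\text{organism}} + \Delta S_{\text{surroundings}} \ge 0,
\qquad
\Delta S_{\text{organism}} < 0 \;\Rightarrow\; \Delta S_{\text{surroundings}} \ge \lvert\Delta S_{\text{organism}}\rvert.
\]

An organism keeps its own entropy low only by exporting at least as much entropy to its environment as heat and waste, so on balance life speeds up the increase rather than fighting it.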

>> No.10506357

>>10504750
I want to become meguka too.

>> No.10507042
File: 3 KB, 121x127, coobie.jpg

>>10506357
being meguca is suffering.