
/sci/ - Science & Math



File: 189 KB, 1080x1068, IMG_5022.jpg
No.10877426

Your most concise and profound definition of ENTROPY please

>> No.10877446

Going to shit

>> No.10877472

>>10877426
"disorder"

a clump of ice in space has low disorder = low entropy. if you smash that clump of ice into pieces, it is now in a state of higher disorder = higher entropy

yes, this has consequences for the universe that are not completely clear, like why it forms low-entropy "clumps" on its way to a high-entropy final rest (supposedly, according to current theories....)

>> No.10877494

>>10877426
one macrostate can arise from numerous microstates

example: the bathtub's temperature is x;
gazillions of different water-molecule arrangements can all produce that same x
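a toy Python count, if it helps (my own simplification: 100 two-level "molecules" instead of actual water, with the macrostate being how many of them are excited):

from math import comb

# toy system: 100 two-level "molecules", each either unexcited (0) or excited (1)
# macrostate  = the total number of excited molecules (one macro number)
# microstates = which particular molecules are the excited ones
N = 100
for excited in (0, 1, 10, 50):
    print(excited, comb(N, excited))
# 0 excited  -> 1 microstate
# 50 excited -> about 1e29 microstates, all producing the same macro value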

>> No.10877516

>>10877446
>>10877472
>>10877494
the absolute state of /sci/

>> No.10877525
File: 2.61 MB, 480x270, source.gif

<<<------------------------------------------

>> No.10877526

>>10877516
>hi, i'm an absolute summerfag

https://www.youtube.com/watch?v=vX_WLrcgikc
https://www.youtube.com/watch?v=kfffy12uQ7g

>> No.10877528

>>10877426
thing disperse

>> No.10877545

>>10877526
you fucking idiot

>> No.10877552

>>10877545
no U

>> No.10877555

>>10877426
Entropy is nothing more or less than the process by which order degrades and breaks down into chaos.

>> No.10877558

Winter is comin'

>> No.10877567

The song "summer girls" by lfo sums it up pretty well

>> No.10877593

>>10877426
Read "A Farewell to Entropy". All the bullshit " hurrdurr a measure of disorder" doesn't mean shit.

>> No.10877596

>>10877593
entropy literally isn’t real.

>> No.10877600

>>10877472
So basically entropy is heat?

>> No.10877663
File: 98 KB, 831x1024, 1545379811635.jpg

>>10877426
>investment into a distinct region of space with the goal of creating a semi-closed system that can be differentiated from the background
>entropy is the opposite of that

>> No.10877669

>>10877596
Neither are numbers, but we still use them to measure quantities.

>> No.10877700

>>10877426
Lacking in pattern

>> No.10877704
File: 233 KB, 730x600, qb.png

>> No.10877754

>>10877426
Diffusion with extra steps

>> No.10877756

>>10877704
>dumb pic, nothing to say
your dad should have pulled out sooner

>> No.10877764

>>10877600
Not exactly, since a hot bowl of soup cooling to the ambient temperature is an increase in entropy as well.

>> No.10877768

>>10877756
Because he didn't get a pokemon reference?

>> No.10877771

>>10877426
The amount of chaos in a system

>> No.10877774

>>10877768
>pokemon reference
nothing to say

>> No.10877934

>>10877426

gay

>> No.10877959

>>10877426
If humanity is Harry Potter and the universe is Hogwarts then entropy is literally Voldemort.

>> No.10877967

>>10877426
https://www.youtube.com/watch?v=UwHmnarXwZ8

>lack of order or predictability; gradual decline into disorder

Time and Entropy are one and the same. To reverse Entropy is to reverse Time. To stop Entropy is to stop Time.

>> No.10877973

>>10877426
Tea=entropy

>> No.10877982

>>10877669
how can entropy be real if our numbers aren't real

>> No.10877994
File: 53 KB, 520x390, 1552616623486.png

So I can only claim to know anything about entropy from a materials PoV, and even then I'd doubt the accuracy of it.
It's one bulk substance taking on properties of another bulk substance as heat increases.
Like you have a solid chunk of silver and a chunk of copper not touching in a furnace. You will get more atoms of silver on the copper, and vice versa, as you increase the temperature in the furnace.
From what I can tell, this comes from heat breaking down the bonds and increasing the speeds of the individual atoms that split off from the bulk, which encourages intermingling of the two disparate species.
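if you want to watch that intermingling show up as rising entropy, here's a crude random-walk toy I cooked up (a 1D lattice of two species swapping places; my own simplification, not real furnace metallurgy):

import random
import math

def half_mixing_entropy(lattice):
    # ideal mixing entropy (in units of k_B) summed over the left and right halves
    n = len(lattice)
    total = 0.0
    for half in (lattice[: n // 2], lattice[n // 2 :]):
        x = sum(half) / len(half)          # fraction of "silver" sites in this half
        for p in (x, 1.0 - x):
            if p > 0.0:
                total -= p * math.log(p) * len(half)
    return total

random.seed(0)
lattice = [1] * 50 + [0] * 50              # "silver" on the left, "copper" on the right
for step in range(200001):
    i = random.randrange(len(lattice) - 1)                   # pick a random adjacent pair...
    lattice[i], lattice[i + 1] = lattice[i + 1], lattice[i]  # ...and swap it (thermal jiggling)
    if step % 50000 == 0:
        print(step, round(half_mixing_entropy(lattice), 1))
# starts near 0 (fully separated) and climbs toward 100*ln(2) ~ 69 as the species mix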

>> No.10877997

Entropy is the measure of the minimum information necessary to encode the state of a system.
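In Shannon's terms that minimum (averaged over many independent copies of the system) is [math] H = -\sum_i p_i \log_2 p_i [/math] bits per state. A minimal Python sketch, with a made-up four-state distribution:

import math

def shannon_entropy(probs):
    # average number of bits needed per state: -sum(p * log2(p))
    return -sum(p * math.log2(p) for p in probs if p > 0)

# made-up system: one likely state, three rarer ones
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
print(shannon_entropy([0.25] * 4))                 # 2.0 bits -- the uniform (maximum) case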

>> No.10878001
File: 856 KB, 881x786, image.png

I feel like if I type my definition of entropy, someone later will look it up and find a /sci/ archive and think I copied it from here

>> No.10878008

>>10877555
>>10877669
>>10877771
>>10877997
window lickers detected

>> No.10878011

>>10878008
kek, brit insults are funny

>> No.10878035

>>10877426
Entropy is the ratio between the amount of information needed to define something and the actual information contained within it. For instance, uncomputable numbers have high entropy, as they can only be defined by axioms, meaning they have a 1-to-1 ratio of axiom to individual number.

>> No.10878044

Entropy - First Pee after birth. (Intro Pee)

>> No.10878597
File: 55 KB, 640x504, 5AC7E3D3-D4F0-4478-9759-C5C09DED3F41.jpg

>>10877426
IT IS THE DIRECTION THAT HUMANS SEE AS TIME

>> No.10878600

>>10878597
This is objectively true

>> No.10878729

>>10877426
>ignorance (opposite of information)
>complexity
>disorder

>> No.10878736

>>10878729
t. philosophy major

>> No.10878747

Here is my 100% guaranteed proof that entropy always increases:

There is no such thing as a true "black box"; it doesn't exist. Try as hard as you want, every single object or container will have a radiation signature, even if only a small one.
This means that any closed system will always be emitting radiation.

Which means that radiation must always be spreading out through the universe, getting ever more diffuse. Since energy is conserved, and this is always happening no matter what, it means that entropy always increases.

>> No.10878812

>>10877600
An incredibly hot object has order in the sense that it is dense with energy. If the heat was distributed across the entire universe, the system of particles would be considered fully disordered.

>> No.10878833

>>10877426
The measure of the set of points belonging to a certain energy in the configuration space of a system. This is proportional to the probability of hitting such a point if chosen randomly.
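(In the usual Boltzmann bookkeeping you then take the logarithm of that measure: [math] S = k_B \ln \Omega(E) [/math], where [math] \Omega(E) [/math] is the number, or phase-space volume, of microstates at energy E.)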

>> No.10878844

>>10878747
imagine wasting time on something like this post

>> No.10879095

>>10877959
Ur gay

>>10877704
You're actually right

>> No.10879138

>>10877426
Entropy is the number of ways energy can be distributed within a system. A good analogy for conceptualizing this is money. Let's suppose you start with a 100 dollar bill and you have to split it between 30 people. The simplest way is to simply give one person the whole bill, resulting in 30 possible configurations. The outcome with the most configurations is to split the 100 dollar bill into one dollar bills (excluding coins), resulting in a vastly greater number of configurations, as in the sketch below. Entropy is at a minimum with the smallest number of configurations and at a maximum with the most configurations.
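Rough numbers for that analogy, counting the ways to hand out whole dollars to 30 distinguishable people (my own back-of-the-envelope; "configuration" here just means who ends up with how many dollars):

from math import comb

people = 30
dollars = 100

# keep the $100 bill whole: the only choice is who gets it
whole_bill = people                                   # 30 configurations

# break it into $1 bills: any split of 100 identical dollars among 30 people,
# counted by stars and bars
one_dollar_bills = comb(dollars + people - 1, people - 1)

print(whole_bill)        # 30
print(one_dollar_bills)  # astronomically larger (a 29-digit number)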

>> No.10879145

>>10879138
Was Richard Feynman a mistake?

>> No.10879160

>>10877426
Incompressibility

>> No.10879170

>>10879145
he worked on the manhattan project so no

>> No.10879173

>>10879170
Not that incarnation of Mr Feynman; the aged meme vector is the one I'm talking about

>> No.10879177

Entropy: the amount of uncertainty involved in a stochastic variable or process

>> No.10879234

>>10879173
meme feynman was a mistake

real life feynman was based

>> No.10879241

Entropy is the differentials in a system with order and/or disorder. Nothing more than that. At least I cannot say it better. Try googling it and you'll see.

>> No.10879264

>>10877426
Consider a probability distribution over a finite alphabet A, with corresponding set of finite words denoted A*. We say that A can be asymptotically coded with rate R if there exists a mapping from A* to the set of finite bit strings with the property that for any [math] \epsilon>0 [/math] and any N sufficiently large, it is possible to map every word in A* of length n>N to a bit string of length [math] n(R+\epsilon) [/math] such that the original word can be uniquely reconstructed from the bit string with probability at least [math] 1-\epsilon [/math]. The entropy of the distribution is then the minimal value of R for which A can be asymptotically coded with rate R. See Shannon's source coding theorem for more details.
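If anyone wants to poke at it numerically, here's a rough Python toy (mine, and it uses a per-symbol Huffman code rather than the long block codes the theorem actually talks about) showing an achievable rate sitting just above the entropy:

import heapq
import math

def entropy(p):
    # Shannon entropy in bits per symbol
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    # bit length assigned to each symbol by a Huffman code
    heap = [(q, [s]) for s, q in p.items()]
    heapq.heapify(heap)
    lengths = {s: 0 for s in p}
    while len(heap) > 1:
        q1, syms1 = heapq.heappop(heap)
        q2, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1                # each merge adds one bit to these symbols' codewords
        heapq.heappush(heap, (q1 + q2, syms1 + syms2))
    return lengths

# made-up source distribution over a 4-letter alphabet
p = {"a": 0.55, "b": 0.25, "c": 0.15, "d": 0.05}
lengths = huffman_lengths(p)
rate = sum(p[s] * lengths[s] for s in p)   # expected bits per symbol for this code

print(round(entropy(p), 3))   # ~1.601 bits: the lower bound that R can approach
print(round(rate, 3))         # 1.65 bits: what this particular code achieves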

>> No.10879451

>not even saying what kind of entropy you're talking about

>> No.10879455

>>10879451
it took over a day for this post to appear

>> No.10879488
File: 487 KB, 1996x1330, disappoint.jpg

>>10878747
>>10878812
>>10878833
>Like 40 posts before someone finally mentions energy

>>10879451
I noticed this too.

>> No.10879504

What about predictable randomness, like the white noise on an unused television frequency?
Is it high or low entropy?