
/sci/ - Science & Math



File: 16 KB, 666x500, klepsydra.jpg
No.6052761 [DELETED]

Hi /sci/. I've got this deep life problem that bugs me every day. How do you describe time? Or else, what do you associate it with?

>> No.6052763

pic related

>> No.6052767

>>6052761
yes

>> No.6052774

Time is just a measurement of when things happened.

I have a hard time seeing it as a 4th dimension, though, because it's not physical.

>> No.6052788

Unit of time: the second. The second is the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.

There, time is now defined.
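
Put as a formula (standard SI convention; Δν_Cs denotes the cesium hyperfine transition frequency):

[eqn] \Delta\nu_{\mathrm{Cs}} = 9\,192\,631\,770~\mathrm{Hz} \quad\Longrightarrow\quad 1~\mathrm{s} = \frac{9\,192\,631\,770}{\Delta\nu_{\mathrm{Cs}}} [/eqn]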

>> No.6052811

>>6052761
One of the elements of "reality". I try not to think of it as a dimension but as a condition.

>> No.6052824

>>6052788
>using duration to describe time
>using a measure of time to define time

>> No.6052828

>>6052824
Yes, I agree, we should use a measure of weight to define time.

>> No.6052875

>>6052761
movement

>> No.6052895
File: 1.94 MB, 266x148, 1366867253924.gif

the rate of change, OP

>> No.6052909

>>6052895
I keep seeing this gif. Does anyone know what show it's from?

>> No.6053943

>>6052761
> Or else what do you associate it with?
Changing of matter. Go to fucking uni already. But this time pick *accurately*.

>> No.6053949

Direction of increase of entropy.

>> No.6053952
File: 109 KB, 476x600, .jpg

>>6053949
> Time
> Vector

>> No.6053973

>>6053952
>Time
>not a vector

>> No.6054007

>>6053973
>Tripfag
>Trolling
It's like
>Bowler
>Bowling

>> No.6054017

If entropy increases with time

Doesn't that imply that the past is more stable???

>> No.6054035

>>6054017

No, it implies it is less stable.

>> No.6054042

>>6054035
Actually, it implies it is just as stable, as there is always a proportional tendency to minimize the energy of the system.

>> No.6054049

>>6052828
i lol'd

>> No.6054050

>>6053949
The system that is the Earth decreases in entropy.
Stop defining time via entropy.

>> No.6054052

>>6052774

isn't it?

>> No.6054077

Time is what makes the description of motion of free particles in an inertial reference frame as simple as possible.

>> No.6054156

We could define entropy as time.
Since entropy in a closed system is non-decreasing, that matches time going forward only (it also means you could decrease entropy locally / travel back in time by increasing it somewhere else).

It also fits the fact that on atomic scales the timescales of processes are short.
Since a bigger volume means a bigger system, there are also many more possible states the system can take, so since the maximum entropy is higher, it also takes longer to get there. Therefore atomic excitations decay after nanoseconds, cells function on timescales of microseconds, a banana peel takes weeks until it decays into boring goo, and stars live for millions of years.
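
A toy illustration of that size/timescale claim (my own sketch, not from the thread): the Ehrenfest urn model, where N particles sit in two boxes and one random particle hops per step. More particles means more accessible states, and a longer wait to relax from a low-entropy start to the 50/50 equilibrium.

[code]
import random

def ehrenfest_relaxation(n_particles, trials=20):
    """Average steps for an all-in-box-A start to first reach the 50/50 split."""
    total = 0
    for _ in range(trials):
        in_a = n_particles          # start: every particle in box A (low entropy)
        steps = 0
        while in_a != n_particles // 2:
            # each step, one uniformly random particle hops to the other box
            if random.random() < in_a / n_particles:
                in_a -= 1
            else:
                in_a += 1
            steps += 1
        total += steps
    return total / trials

for n in (10, 100, 1000):
    print(n, ehrenfest_relaxation(n))
# bigger systems have more accessible states, and the relaxation toward
# the most probable (maximum-entropy) split takes correspondingly longer
[/code]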

>> No.6054172

>>6053952
obviously!
since entropy is a scalar field, and it changes, the direction of change is a vector field in 3D.
But that would require a new metric, because vector addition would be trickier.
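
In symbols (just formalizing what this post proposes): for an entropy density field S(x, y, z, t), the spatial direction of change is the gradient, a 3D vector field, while the rate of change in time stays a scalar:

[eqn] \nabla S = \left(\frac{\partial S}{\partial x},\ \frac{\partial S}{\partial y},\ \frac{\partial S}{\partial z}\right), \qquad \frac{\partial S}{\partial t} [/eqn]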

>> No.6054207

You guys, I think, are going too basic, even in your textbook explanations.

Maybe we should ask: why do we all experience time the same? Apparently gravity has a lot to do with time. Why the fuck does gravity affect time? Gravity increases mass; the greater the mass, the less you experience time relative to a lower-mass area. Time is movement over distance, but not only that: time isn't so much a force as a name for an effect of several forces together that are never the same.

>> No.6054233

>>6054156
most systems aren't closed, but time still passes in them.
Entropy doesn't change at a constant rate with respect to time.
>>6054172
> direction of change is a vector field in 3D.
wut? 4D? I think I'm missing something

>> No.6054242
File: 12 KB, 306x251, QB.jpg [View same] [iqdb] [saucenao] [google]
6054242

>>6054172

>> No.6054316

>>6054207
> Maybe we should ask, why do we all experience time the same?
Maybe we should ask, why do we all experience sex the same?

>> No.6054327

Maybe we should ask, why do we all experience tuna the same.

>> No.6054381

http://lesswrong.com/lw/qp/timeless_physics/

>> No.6054417

>>6054327
Maybe we should ask tuna why it experiences us all the same.

>> No.6054461

Maybe we should not create idiotic threads?

>> No.6054536

Time is change.

If no change is measured-experienced, then no time has passed.

It's not that difficult.

>> No.6055481

people here are so fucking stupid, it's astonishing.

>> No.6055515

>>6054381
Along the same lines, but a bit more up to date and, it turns out, very useful for calculation, as it scraps the need for Feynman diagrams.

http://arxiv.org/pdf/1212.5605.pdf

>> No.6055575

it's the movement of matter in space

>> No.6055588

>>6054156
>We could define Entropy as time

Entropy,
therefore memory.

So memory
is the only evidence of the past.

Without memory, time is nothing.

Now if memory could be manipulated... would that equal time travel?

Are mad men rudimentary time travellers?

>> No.6055590

>>6053952
>1380377337299.jpg
Passable!

>> No.6055600

>>6054417
Maybe we should ask time.

>> No.6055603

>>6054017
It just implies that the past is less entropized.

>> No.6055654

>>6055588
No.
Entropy is not memory.
Entropy is entropy. It can kind of be described as chaos, but it's really the loss of the ability to do useful work.
Entropy has also been co-opted for CS purposes, talking about information and the loss of information to chaos. It mostly means you cannot go back, because something happened to the system that was irreversible.

Time exists without an observer. Broken observers do not affect time.

>> No.6055699

>>6055654
>Entropy is not memory.
Well, entropy is lack of memory. The glass is half-empty. Amirite?

>> No.6055705

>>6055699
> entropy is lack of memory
no it is not

>> No.6055710

>>6055603
the future is closer to equilibrium (according to the second law of thermo)

>> No.6055722

>>6055654
>>6055699
Entropy is clearly defined. It's counting states. There's more if you look at the wiki.

The change in entropy between two states is what is positive. There's a subtle but important difference between entropy and change in entropy.

Time is seen as a fourth dimension in that it is related, intertwined, with space. Time is relative and changes depending on how fast you are going and a bunch of other stuff that General Relativity says.

So in order to describe time you also have to describe space. So what is space?
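
For reference, the standard special-relativity formula behind "changes depending on how fast you are going" (textbook result, not something the post spells out): a clock moving at speed v runs slow by the Lorentz factor,

[eqn] \Delta t' = \frac{\Delta t}{\sqrt{1 - v^2/c^2}} [/eqn]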

>> No.6055723

>>6055705
1. Isn't entropy a measure of randomness, or missing information? You've just said something like this yourself. I thought I was saying the same using different words.
2. Isn't "memory" just another word for "information"?

>> No.6055755

>>6055723
1. Entropy is a measure of the number of specific ways in which a system may be arranged.

2. Memory is stored information.
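
Definition 1 is Boltzmann's, from statistical mechanics: with Ω the number of microstates compatible with the macrostate and k_B Boltzmann's constant,

[eqn] S = k_B \ln \Omega [/eqn]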

>> No.6055760

>>6055722
>Time is change in entropy.
>Change in entropy is the 4th dimension of spacetime.
>So what are the remaining 3 dimensions?
Yeah... such simple questions... and all humans are too stupid to conceive answers.

WE, MONKEYS.

Not being gay
but maybe we shouldn't know the answers.

>> No.6055773

Entropy is history, not memory.

>> No.6055793

>>6055722
you can go between states without changing entropy...
A Carnot engine does not change entropy.
That's why it's shitty for defining time.
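
Spelled out (standard result): over one reversible Carnot cycle, the entropy absorbed from the hot reservoir exactly balances the entropy dumped into the cold one, so the net change is zero:

[eqn] \Delta S = \frac{Q_H}{T_H} - \frac{Q_C}{T_C} = 0 [/eqn]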

>> No.6055805

>>6055760
We should always seek answers and ask questions. And if you find a way to answer them, with experiments to back them up, then you win a Nobel prize. Einstein was the one who provided us with the insight that space and time are related (his Nobel was actually for the photoelectric effect, but still). So saying that people are too stupid is rather harsh; we're all trying our hardest to make sense of the universe.

Also, there is nothing that says entropy is time. It's just observed that systems tend to increase their entropy. But there are also systems that decrease entropy, such as a black hole. It's a correlation vs. causation debate that I see going on in this thread.

>> No.6055828

>>6055723
This is the CS definition, NOT the one used for the second law and the entropic death of the universe.
Entropy about information is not the missing information; it's the distorted information.
Entropy is used to describe irreversible processes done to that information.
For instance, we try to lose information to get a truly random number. We go through multiple irreversible processes that distort the old information. This distortion can't be something like a key. It's like multiplying the whole number by the atmospheric disturbances (something considered very random).

Memory is stored information.

When talking about thermodynamics, entropy is a property of a state. It increases when there's heat transfer or mixing. Say I have a hot plate and a cold plate and I push them together so they conduct. I cannot go back to the state where the temperatures are separate again without having an open system that allows energy to go in.
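
As a worked version of that plate example (standard thermo; assume two identical plates of heat capacity C at initial temperatures T_h and T_c, equilibrating to T_f = (T_h + T_c)/2):

[eqn] \Delta S = C\ln\frac{T_f}{T_h} + C\ln\frac{T_f}{T_c} = C\ln\frac{T_f^2}{T_h T_c} \geq 0 [/eqn]

It's strictly positive whenever T_h ≠ T_c, because the arithmetic mean T_f exceeds the geometric mean of T_h and T_c. That inequality is the "can't go back" in numbers.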

>>6055755
I've heard this definition, but I can't wrap my head around using it for thermo stuff.

>> No.6055849

>>6055828
>When talking about thermodynamics, entropy is a property of a state. It increases when there's a heat transfer or mixture. Say I have a hot plate and a cold plate and I push them together so they conduct. I cannot go back to the state where the temperatures are separate again without having an open system that allows energy to go in.
This is still about information. The key difference between the hot/cold plates and the two warm plates is that you know a lot more about the former. If you have two warm plates, and you know the exact position and velocity of each atom in it, then you CAN in fact go back to a cold and a hot plate with no extra energy going in. Compare Maxwell's demon, which does in fact work if you know where all the hot and cold gas particles are.

>> No.6055853

>>6055828
It's pretty difficult to relate it to thermodynamics, since you'll need a firm grasp of statistics. But it's actually well thought out and clearly defined. The problem is that people gloss over important concepts and nuances in counting states and arranging objects.

>> No.6055858

>>6055828
Nay, it's not just CS; thermodynamic entropy can be rigorously described by statistical mechanics.

>> No.6055861

>>6055849
> If you have two warm plates, and you know the exact position and velocity of each atom in it, then you CAN in fact go back to a cold and a hot plate with no extra energy going in.
entropy theory does not require atomic theory. The original statements talk about heat transfer and the ability to obtain useful work from a thermal reservoir.

Besides that I don't exactly understand what you are saying/I've only talked about entropy in a thermodynamics class.
How could you separate them if you knew where every particle was?
Are you using the thermal energy of some of the particles and giving it to the others to move them back to their original temperature state?
There's no method of doing this. Maxwell's demon doesn't exist.

>> No.6055867

>>6055861
Not him, but I think he means it will come back to its original state by pure chance after a long enough period of time.

For any macroscopic system, that time is offensively huge, longer than the lifetime of a googolplex of universes, but it's there.

>> No.6055872

>>6055793
>Carnot engine
>close system

>> No.6055875 [DELETED] 

>>6055872
> Doesn't know what a thermal reservoir is

>> No.6055876

>>6055861
>The original statements talk about heat transfer and the ability to obtain useful work from a thermal reservoir.
And that is an approximation, not the ultimate underlying truth.

>How could you separate them if you knew where every particle was?
>There's no method of doing this. Maxwell's demon doesn't exist.
Let's discuss the Maxwell's Demon case; the plates case is in the end similar but more complicated.
In the Maxwell's Demon thought experiment, you have a warm gas and you want to separate it into a hot part and a cold part by putting it in a device consisting of two chambers, one called "hot" and one called "cold", with a door in between that opens when a hot gas particle tries to enter the hot chamber or a cold gas particle tries to enter the cold chamber, and closes otherwise.
The reason that this is impossible without costing extra energy is that the door mechanism needs to MEASURE the temperature (read: velocity) of the approaching particle, and this takes energy.
However, if you already KNOW the velocity of each particle, and are of course able to simulate physics fast enough to keep your information up to date as time progresses, then you CAN in fact make the door work this way. There's no need to measure anything -- you can open and close based on your existing knowledge. You can turn your knowledge of the system into useful work.

And that's why entropy is really just a measure of your lack of knowledge.
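
Here's a toy simulation of that door (my own sketch, not from the thread): particles fly around a 1D box split into two chambers, and the "demon" already knows every velocity, so the door opens by lookup instead of measurement. The gas sorts itself into a hot side and a cold side.

[code]
import random

random.seed(0)

N, DOOR, DT, STEPS = 400, 1.0, 0.01, 5000
THRESHOLD = 1.0   # the demon calls a particle "hot" if |v| > THRESHOLD

# a warm gas: random positions in the box [0, 2), random velocities
pos = [random.uniform(0.0, 2.0) for _ in range(N)]
vel = [random.uniform(-2.0, 2.0) for _ in range(N)]

def mean_speed(right_side):
    s = [abs(v) for p, v in zip(pos, vel) if (p > DOOR) == right_side]
    return sum(s) / len(s)

print("before: left %.3f  right %.3f" % (mean_speed(False), mean_speed(True)))

for _ in range(STEPS):
    for i in range(N):
        x = pos[i] + vel[i] * DT
        crossing = (pos[i] - DOOR) * (x - DOOR) < 0  # would pass the door
        hot = abs(vel[i]) > THRESHOLD
        going_right = vel[i] > 0
        # the demon's rule, applied from KNOWN velocities, no measurement:
        # hot particles may only enter the right chamber, cold only the left
        if crossing and hot != going_right:
            vel[i] = -vel[i]                         # door stays shut: bounce
        elif x < 0.0 or x > 2.0:                     # outer walls reflect
            vel[i] = -vel[i]
        pos[i] += vel[i] * DT

print("after:  left %.3f  right %.3f" % (mean_speed(False), mean_speed(True)))
# the right chamber ends up hot and the left cold "for free" -- the cost
# was hidden in already knowing every velocity
[/code]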

>> No.6055880

>>6055875
But I do, it's a simplification done to solve a specific problem, not a physical object...

>> No.6055885

>>6055880
I can't remember why Carnot cycles are the best cycles. I'll look into it again eventually.

>> No.6055887

>>6055876
>And that's why entropy is really just a measure of your lack of knowledge.

Stop saying this, you are confusing people.
Entropy is a measure of the number of specific ways in which a system may be arranged. Period.

>> No.6055892

>>6055887
>Entropy is a measure of the number of specific ways in which a system may be arranged. Period.
The number of specific ways in which a system may be arranged *according to my knowledge*. Probability and uncertainty are in the mind, not in the actual world. The actual world is only arranged in one specific state. The ways it "may be" arranged is a description of my mind, not a description of the world.

>> No.6055891

>>6055876
> needs to MEASURE the temperature (read: velocity)
so a small amount of the "velocity" is transferred to the measuring device. Like a pressure gauge?
I love it.
I thought statistical mechanics was basically just what I learned in my two years of thermo classes.
I love learning new shit like this. Thanks guys.

>> No.6055909

>>6055892
Bro, you're heading into the philosophy area and needlessly complicating things. If you want to discuss the nature of truth and what knowledge is then I suggest you take a gander over to /lit/.

>> No.6055912

>>6055909
My explanation makes experimental predictions, see >>6055876. Experimentally confirmed predictions, I might add.

>> No.6055921

>>6055909
He's not wrong.
You might complain that he is complicating things for laymen but that's another issue.

The information-entropy relation is very physical.
For example, it creates a lower bound on the heating created by a computer CPU. Of course we are still way above that lower bound but it's there. It exists because every transistor destroys information. Two bits in, one bit out.

If we wish to go beyond that limit eventually, we absolutely do have to make reversible electronics, where no information is destroyed in the computing process. i.e. two bits in, two bits out, and you can deduce the two bits in from the two bits out.

It may all seem philosophical and strange but it all makes sense if you represent those as inductors discharging into each other.
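
A small sketch of that reversibility point (my own illustration; k_B·T·ln 2 per erased bit is Landauer's bound, the standard result behind the CPU-heating claim):

[code]
import math
from itertools import product

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Landauer limit: minimum heat released per erased bit, at 300 K
print("min heat per erased bit:", K_B * 300 * math.log(2), "J")

# irreversible gate: AND. Two bits in, one bit out -- inputs not recoverable.
preimages = {}
for a, b in product((0, 1), repeat=2):
    preimages.setdefault(a & b, []).append((a, b))
print("AND inputs that map to 0:", preimages[0])  # three collapse into one

# reversible gate: Toffoli (CCNOT). Three bits in, three bits out,
# and it is its own inverse, so no information is ever destroyed.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

for bits in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*bits)) == bits  # applying it twice restores input
print("Toffoli is reversible on all 8 inputs")
[/code]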

>> No.6055924

http://en.wikipedia.org/wiki/Solomonoff%27s_theory_of_inductive_inference

>> No.6055926

>>6055876
Is there a reason why measuring velocity takes energy?
Measuring a distance doesn't mean I lose some distance.
Is it because you can only measure it via a force it exerts (and thus work)? You can't look at two states and see "yep, that's the velocity".

>>6055887
> Entropy is a measure of the number of specific ways in which a system may be arranged.
This is what's confusing as hell. My water bottle can be arranged in many different ways, but most of them would mean that my water is all over the place.

Knowledge is a mathematical thing too. We used a measure of certainty in robotics. When you measure something you gain knowledge and certainty; when you move something you lose knowledge and certainty.

>> No.6055932

>>6055921
Well, we'd need an objective TOE to achieve reversibility, but that's more something to approach than something to attain.

>> No.6055933

>>6055912
Bro, you're confusing determinism with entropy, which is a different debate. Determinism arose with the philosophical implications of Newtonian physics. Then quantum mechanics came along and, well, you can read about that on your own. Don't worry, a lot of people thought like you do, including Einstein.

>> No.6055937

>>6055921
> It exists because every transistor destroys information. Two bits in, one bit out.
???
Is this because to turn on a transistor you have to power the transistor and then turn it on?

Idk, I've basically taken enough classes to build transistors into a weak bus-system thingy (not an EE, if you can't tell).

>> No.6055938

>>6055926
>My water bottle can be arranged in many different ways, but most of them will mean that my water is all over the place.
Good example actually, let's explain that. Let's consider the water at rest.
Simple fluid mechanics says the surface is flat, and the volume of water has the shape of your bottle.

Now consider one water molecule near the bottom, and another on top. If you switched those two molecules, would the result be any different? No, physically it would be exactly the same.
Consider now that you can do that for any pair of molecules. You could reach a very large number of permutations like that, all resulting into the same thing.

The logarithm of that number is the entropy of your water, up to a constant.
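
Putting that counting into symbols (just formalizing the post's simplified picture): swapping any pair of the N molecules gives the same physical state, so there are N! equivalent permutations, and

[eqn] S \sim k_B \ln N! \approx k_B\,(N \ln N - N) [/eqn]

by Stirling's approximation.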

>> No.6055941

>>6055932
Of course, I'm talking reversibility on a computational level, not perfect reversibility in the whole physical process.

>> No.6055946

>>6055937
That's because the two bits in are Vg and Vd, and the bit out is the current.

>> No.6055948

>>6055933
Needs to read up on Bell's theorem before declaring the proper way to interpret QM.

Locality and counterfactual definiteness are incompatible. Knowing which of the two (if not both) is the unjustified assumption could provide insight into whether the world is deterministic or indeterministic, but it's still an open question.

>> No.6055952

>>6055948
>Locality and counterfactual definiteness are incompatible.
Not so. Locality, counterfactual definiteness, and a single world are incompatible. Each pair of two is consistent.

>> No.6055954

>>6055926
The easiest explanation of entropy is a deck of 52 cards. Shuffle it and line the cards up in all the possible ways you can arrange them: there are 52! possible orderings.
Then start counting. How many of them are in order of number and suit (i.e., ace of spades, ace of clubs, ace of diamonds, ace of hearts, then two of spades, etc.)? Exactly one, i.e. a probability of 1/52!. OK, now ask yourself: how many ways do you get the ace of hearts as the first card? How many ways can you get an ace (of any suit) as the first card?
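
Worked out in code (my sketch of the arithmetic the post is gesturing at):

[code]
import math

total = math.factorial(52)                 # all orderings of the deck
print("total orderings:", total)

print("P(fully sorted):", 1 / total)       # exactly one sorted arrangement

ace_of_hearts_first = math.factorial(51)   # fix one card, permute the rest
any_ace_first = 4 * math.factorial(51)     # four choices for the first card
print("P(ace of hearts first):", ace_of_hearts_first / total)  # = 1/52
print("P(any ace first):", any_ace_first / total)              # = 1/13
[/code]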

>> No.6055958

>>6055926
>Is there a reason why measuring velocity takes energy?
>Is it because you can only measure it via a force it expels (and thus work)?
http://lesswrong.com/lw/o5/the_second_law_of_thermodynamics_and_engines_of/ explains it better than I ever could.

>> No.6055961

>>6055952
MWI is local without counterfactual definiteness.

>> No.6055973

>>6055946
gate-drain-source
gate and drain are high.
but isn't one of them always high?
Transistors have always confused the crap out of me. I always cheated and had an EE look at my diagrams before soldering.

>>6055954
> Ok now ask yourself how many ways do you get a Ace of heart as the first card? How many ways can you get an Ace(of any suit) as the first card?
I was good until this.
Why does it matter if the ace of hearts is the first card? Did we measure and get that?
And for the second, did you measure that there was an ace but not see the suit?
These measurements would increase knowledge about the system and would decrease the number of total possibilities, so that kinda makes sense.

>> No.6055991

>>6055973
You're overthinking things. I'm simply asking, out of all the possible combinations, how many of them have the ace of hearts as the first card. Then I ask for any ace as the first card. You will find that their "probabilities" are different.

Why is this important? Because now statistics arise. Now you have probability and spread.

>> No.6056003

>>6055991
Of course they are different. What does this have to do with thermal equilibrium?
Probabilities make sense...
Are you saying that you kinda expect it to be an ace, and that you have a spread of probabilities, with the ace of hearts first being rarer than just any ace first?

>> No.6056056

>>6056003
It is the different probabilities of states that is important. The sets of cards in which the first 26 cards are all 8 or lower and the other half are greater than 8 occur much less often than sets in which high cards and low cards are all over the place.

Now instead of "a deck of 52 cards" I have a bottle of 6.02×10^23 atoms of helium. And instead of card ranks from "ace to king" I have different velocities, from zero to a really big number.
Now I ask: what is the probability of the states in which all the helium atoms in the top part of my bottle have a velocity of, say, 200 m/s or greater, while in the lower half all the particles have a speed of 100 m/s or lower, so that the bottle's average speed is 150 m/s? Keep in mind that temperature is (roughly) the average speed of a collection of particles, so this would mean that the top of my bottle is hot compared to the bottom, which is cold. You will find that this probability is extremely, extremely low compared to one in which the velocities of the particles in the top and bottom are mixed (meaning the top and bottom have the same temperature).
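
A back-of-the-envelope version (my own sketch; crudest possible model, where each atom is independently "fast" or "slow" with probability 1/2): the chance that all of them sort themselves is of order (1/2)^N, so small that even the exponent is astronomical.

[code]
import math

N = 6.02e23                    # helium atoms in the bottle
log10_p = -N * math.log10(2)   # log10 of (1/2)**N
print("P(spontaneously sorted) ~ 10^(%.3g)" % log10_p)
# ~ 10^(-1.8e23): an exponent comparable to the number of atoms itself
[/code]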

I hope you got something out of what I just said, since seeing the connection clearly requires a course on statistical thermodynamics.