
/sci/ - Science & Math


No.12707418

Redshift over cosmic scales: Anyone here know if it makes events appear to last longer? For example, if we measure an event to last exactly one second in our galaxy, would an observer 13 billion light years away also measure it to last one second? Seems like if the light 'stretches' from blue to red then the 'total length' of the light waves that are observed would need to increase too, right? Since the length of the wave increases, it would take longer to pass through the far-away detector, which would appear as a longer event.

Anyone know, and if there are any megabrains here, do you know by how much the time of the event would increase? Since we're talking nanometers of shift from blue to red, it seems like it would be a very small increase, right?
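For reference: in standard cosmology the observed duration stretches by exactly the same factor as the wavelength, 1 + z, where z = (λ_observed − λ_emitted) / λ_emitted. A minimal sketch of the arithmetic in Python (the wavelengths here are made-up illustrative values, not from any real measurement):

# Cosmological time dilation: observed duration = (1 + z) * emitted duration,
# with z defined by the fractional wavelength shift.
lambda_emitted = 450e-9   # metres, blue light at the source (illustrative)
lambda_observed = 700e-9  # metres, shifted to red at the detector (illustrative)

z = (lambda_observed - lambda_emitted) / lambda_emitted
duration_emitted = 1.0  # seconds, as measured at the source
duration_observed = (1 + z) * duration_emitted
print(f"z = {z:.3f}, observed duration = {duration_observed:.3f} s")
# prints: z = 0.556, observed duration = 1.556 s

Note that the stretch depends on the fractional shift, not the absolute nanometers: a shift all the way from blue to red is a ~56% increase in duration, which is anything but small.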

>> No.12707547

>>12707418
Yes. Time would appear to be ticking slower in the distant galaxy, but it has nothing to do with redshift due to the expansion of space; it has to do with the fact that the galaxy is moving away from us.
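For what it's worth, the two descriptions give the same stretch factor: for a source receding at speed v = βc, the special-relativistic Doppler factor sqrt((1 + β)/(1 − β)) equals 1 + z, and observed durations stretch by that same factor. A minimal sketch (the recession speed below is an illustrative assumption):

import math

# Special-relativistic Doppler for a receding source:
# observed interval = emitted interval * sqrt((1 + beta) / (1 - beta)),
# and this factor equals 1 + z for the same source.
beta = 0.1  # recession speed as a fraction of c (illustrative assumption)

doppler_factor = math.sqrt((1 + beta) / (1 - beta))
print(f"1 s at the source is observed to last {doppler_factor:.4f} s")
# prints: 1 s at the source is observed to last 1.1055 s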

>> No.12707640

>>12707547

I'm not sure I understand. Isn't the distant galaxy moving away from us BECAUSE of the expansion of space?

Here's why I started wondering about this -- I read a Wikipedia article about an odd supernova that seemed to go off twice, once in 1954 and then again exactly 60 years later in 2014. I believe it was 500M light years away. I got curious about time measurements over large distances. Basically I'm curious whether, if I flip a switch on for exactly one second, it measures as one second to all observers no matter how far away.
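For a rough sense of scale at 500 million light years, using Hubble's law v = H0 * d and the small-z approximation z ≈ v/c (H0 ≈ 70 km/s/Mpc is an assumed round value; measurements vary):

# Back-of-the-envelope dilation at 500 Mly.
H0 = 70.0            # km/s/Mpc (assumed round value)
c = 299_792.458      # speed of light, km/s
d_mpc = 500 / 3.262  # 500 million light years in megaparsecs (1 Mpc ~ 3.262 Mly)

v = H0 * d_mpc       # recession speed from Hubble's law, km/s
z = v / c            # small-z approximation
print(f"z ~ {z:.3f}: a 1 s event is observed to last ~{1 + z:.3f} s")
# prints: z ~ 0.036: a 1 s event is observed to last ~1.036 s

So on the standard picture the answer to the switch question is no: a sufficiently distant observer sees the one-second flash stretched by 1 + z, though at 500 Mly that is only a few percent.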

Good point about that relativistic nature of time thing, though. Makes the answer a lot more complicated.