/sci/ - Science & Math

>> No.12116840
File: 83 KB, 468x147, Convolution_of_box_signal_with_itself2.gif

>>12116735
>>12116754
Bre, you manage to be very confusing in a few sentences.
Both start at the same starting point? Then why not start at 0 and play it to a random height?
You say
>then it is the next players turn.
without having mentioned beforehand any action taken only by the first player.

And in any case, with those small bounds, it sounds like the game will pretty surely be over before, say, the fourth turn.
After n jumps, the expected distance will be [math] \sum_{k=1}^n \int_0^1 x \, dx = \frac{n}{2} [/math], wouldn't it?
The density for the first step is uniform, a box.
I might be wrong, but I think the density for the second step is just the convolution of the box with itself, e.g. giving zero chance of being at 0 or at 2, but higher chances in between.
And then you keep doing convolutions, with the peak at n/2.

Easy to test in simulation if that's correct.
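
A minimal sketch of that test (Python/NumPy; the uniform jump range [0, 1], the number of jumps, and all other parameters are my own assumptions, not anything fixed by the thread):

[code]
import numpy as np

rng = np.random.default_rng(0)
n_jumps, n_trials = 4, 100_000   # toy parameters, chosen arbitrarily

# Monte Carlo: total distance after n independent uniform(0, 1) jumps.
totals = rng.uniform(0.0, 1.0, size=(n_trials, n_jumps)).sum(axis=1)
print(f"sample mean = {totals.mean():.3f}, expected n/2 = {n_jumps / 2}")

# Same distribution via repeated convolution of the box with itself.
dx = 0.001
box = np.ones(int(1 / dx))                     # uniform density on [0, 1]
density = box.copy()
for _ in range(n_jumps - 1):
    density = np.convolve(density, box) * dx   # the dx keeps it normalized

grid = np.arange(len(density)) * dx
print(f"convolved density peaks near x = {grid[np.argmax(density)]:.2f}")
[/code]

If the picture above is right, the sample mean and the peak of the convolved density should both land near n/2 = 2, and they do.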

>> No.10437321
File: 83 KB, 468x147, Convolution_of_box_signal_with_itself2.gif

>>10436277
intuitively, convolution maps out the shared area between two functions as one passes over the other.
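
A quick numerical sketch of that picture (Python/NumPy; the unit boxes and the grid spacing are just my own toy choices): for two box functions the product integral at each shift is literally the overlap area, so sliding one box across the other traces out a triangle.

[code]
import numpy as np

dx = 0.01                        # arbitrary grid spacing for illustration
box = np.ones(int(1 / dx))       # box of height 1 on [0, 1)

# np.convolve slides the (flipped) box across the other one and sums the
# product at every shift - exactly the "shared area" picture for boxes.
overlap = np.convolve(box, box) * dx
shifts = np.arange(len(overlap)) * dx

# The overlap is largest (area ~1) when the boxes coincide, near shift 1.
print(f"peak overlap {overlap.max():.2f} at shift {shifts[np.argmax(overlap)]:.2f}")
[/code]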

>> No.9802121
File: 83 KB, 468x147, giphy.gif

>>9801715
>>9802021
As >>9801799 posted, it's perhaps best understood by studying the proof of the classical central limit theorem.

For any distribution and a function g(X) with a series expansion (a sum of powers of X), the expectation value E[g(X)] can be computed if you know the moments E[X^n]. The characteristic function f(t)=E[exp(itX)] captures exactly those data: you get E[X^n] by computing (-i ∂/∂t)^n f(t) at t=0. The Fourier transform of a Gaussian is a Gaussian. What remains to show is that the characteristic function of the process given by the sum of independent variables converges to that of a Gaussian.
From that standpoint, you could say the Gaussian pops up because it's a nice object w.r.t. the Fourier transform - a niceness that comes out of the independence and linearity assumptions about the process under consideration - and because the characteristic function has that exp(itX) that ties it to the Fourier transform.
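
A small check of that moment property (Python/SymPy; X uniform on [0, 1] is my own example distribution, for which the characteristic function works out to (exp(it) - 1)/(it) and the moments should be E[X^n] = 1/(n+1)):

[code]
import sympy as sp

t = sp.symbols("t", real=True)
# Characteristic function f(t) = E[exp(itX)] for the example X ~ Uniform(0, 1).
f = (sp.exp(sp.I * t) - 1) / (sp.I * t)

for n in range(1, 4):
    # E[X^n] = (-i d/dt)^n f(t) evaluated at t = 0.
    moment = sp.limit((-sp.I) ** n * sp.diff(f, t, n), t, 0)
    print(f"E[X^{n}] = {moment}  (expected {sp.Rational(1, n + 1)})")
[/code]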

To believe in the central limit theorem, what I find helpful is computing the n-fold convolution of a simple distribution, e.g. a rectangle, where you see that if you keep smearing one function against itself a bunch of times, you approach a Gaussian. E.g. if you compute the convolution of two rectangles you get a triangle (pic related), which is broader and pointier. Then if you convolve the triangle with another rectangle, the corners round off and the whole shape becomes a bump. Then if you convolve that bump with a fourth rectangle, you get an even smoother bump, and so on. That is the distribution of 4 independent random jumps, each with a max distance of half the width of the rectangle. If you took thin rectangles and computed 1000 jumps, you'd have a nice-looking Gaussian.
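
A minimal numerical version of that progression (Python/NumPy; the grid spacing and number of convolutions are my own choices): convolve a rectangle with itself repeatedly and watch the gap to the Gaussian with matching mean and variance shrink.

[code]
import numpy as np

dx = 0.001                       # arbitrary grid spacing for illustration
box = np.ones(int(1 / dx))       # uniform density on [0, 1]

density = box.copy()
for n in range(2, 5):            # triangle, then bump, then smoother bump
    density = np.convolve(density, box) * dx
    x = np.arange(len(density)) * dx
    # A sum of n uniform(0, 1) variables has mean n/2 and variance n/12.
    mean, var = n / 2, n / 12
    gauss = np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    print(f"n = {n}: max |density - Gaussian| = {np.abs(density - gauss).max():.3f}")
[/code]

The printed gap shrinks with every extra convolution, which is the central limit theorem showing up numerically.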
