
/sci/ - Science & Math



File: 7 KB, 300x168, laplace.png
No.12027119

So, I never took any PDE in my math undergrad and somehow I got shoved into an applied PDE course.

I was asked to prove that the Laplace equation is invariant under rotation.

So, I'm pretty good at making concrete examples, seeing what mechanisms are happening, and then making a general proof.

I proved that Uxx + Uyy = Ux'x' + Uy'y', where x', y' arise from a fixed rotation by theta. Essentially I can prove the statement in two dimensions... but how to do this in general I do not understand.

I've seen proofs online that start with v(x) := u(Ox), i.e. y = Ox. My question is: are x and y vectors or just variables?

If they are vectors, I get a column vector y = (orthogonal matrix) * (column vector x)... what do I need to do next?
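The two-dimensional case can be spot-checked symbolically. A minimal sympy sketch (the test function f and the sample values below are my own, not from the thread): rotate the coordinates, take Uxx + Uyy of the rotated function, and compare against the Laplacian of the original function evaluated at the rotated point.

```python
# Symbolic spot-check (sympy) of 2-D rotation invariance of the Laplacian.
# f is an arbitrary smooth, non-harmonic test function (my choice).
import sympy as sp

x, y, t = sp.symbols('x y theta', real=True)
f = x**3*y + sp.sin(x*y)
lap = lambda g: sp.diff(g, x, 2) + sp.diff(g, y, 2)

# rotated coordinates: x' = x cos(theta) - y sin(theta), y' = x sin(theta) + y cos(theta)
xp = sp.cos(t)*x - sp.sin(t)*y
yp = sp.sin(t)*x + sp.cos(t)*y

v = f.subs({x: xp, y: yp}, simultaneous=True)   # v(x, y) = f(Ox)

# Laplacian of the rotated function minus Laplacian of f at the rotated point
diff = lap(v) - lap(f).subs({x: xp, y: yp}, simultaneous=True)
print(sp.simplify(diff))
```

Note the `simultaneous=True`: since x' and y' each contain both x and y, substituting one at a time would corrupt the other.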

>> No.12027194

Fuck. I thought you guys were smart. LOL.

>> No.12027277

>>12027119
i didnt even finish highschool but x is always the same distance from y in this case and therefore not variables and indeed vectors

idk draw some lines or something dude fuck i just started community college yesterday and have to take remedial math to catch up

>> No.12027323

x and y are probably vectors that contain each variable x_0 -> x_n and y_0 -> y_n. If you already proved it in 2-D what's so hard about generalizing your proof?

>> No.12027352

>>12027119
Nigga i just graduated hs but applied shit is ez af https://math.stackexchange.com/questions/970892/show-laplace-operator-is-rotationally-invariant essentially u transform to polar then differentiate leaving u with invariance

>> No.12027372

>>12027352
https://en.wikipedia.org/wiki/Rotation_matrix?wprov=sfla1
I think you can use this to generalize
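One way to see that this generalizes: an n-D rotation can be built as a product of plane (Givens) rotations, and the only property the invariance argument needs is O^T O = I. A small numpy sketch (function names mine):

```python
# Build an n-D rotation as a product of Givens (plane) rotations and
# confirm the property the invariance proof relies on: O^T O = I.
import numpy as np

def givens(n, i, j, theta):
    """Rotation by theta in the (i, j) coordinate plane of R^n."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c; G[j, j] = c
    G[i, j] = -s; G[j, i] = s
    return G

rng = np.random.default_rng(0)
n = 5
O = np.eye(n)
for i in range(n):
    for j in range(i + 1, n):
        O = O @ givens(n, i, j, rng.uniform(0, 2*np.pi))

print(np.allclose(O.T @ O, np.eye(n)))    # True: O is orthogonal
print(np.isclose(np.linalg.det(O), 1.0))  # True: a proper rotation
```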

>> No.12027407
File: 79 KB, 1024x768, 1592364923718.jpg

>>12027119
relate the derivatives to each other with the chain rule and it all falls out

[eqn]
\frac{\partial}{\partial x_i}
= \sum_j \frac{\partial x'_j}{\partial x_i} \frac{\partial}{\partial x'_j}
[/eqn]
this relates the gradient in two (completely arbitrary) bases. writing this in matrix notation we get
[eqn]
\nabla = \mathbf{J} \nabla'
[/eqn]
where [math]\mathbf{J}[/math] is the jacobian of the coordinate transformation, with the convention [math]J_{ij} = \partial x'_j / \partial x_i[/math].

next, apply the definition of divergence to get the laplacian
[eqn]
\nabla^2 = \nabla \cdot \nabla = (\mathbf{J} \nabla') \cdot \mathbf{J} \nabla' = \nabla'^T \mathbf{J}^T \mathbf{J} \nabla'
[/eqn]
assuming an orthogonal coordinate transformation [math]\mathbf{J} \in \{\mathbf{O} \in \mathbb{R}^{n \times n} \mid \mathbf{O}^T \mathbf{O} = \mathbf{I}\}[/math] (which is exactly what a rotation is), the above simplifies
[eqn]
\nabla^2 = \nabla'^T \mathbf{I} \nabla' = \nabla'^T \nabla' = \nabla' \cdot \nabla' = \nabla'^2
[/eqn]
thus
[eqn]
\nabla^2 = \nabla'^2
[/eqn]

be careful when manipulating derivative operators. you really should give it a function to work on so you don't miss any applications of the product rule, etc.
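The identity above can also be checked numerically with finite differences: the Laplacian of v(x) = u(Ox) at a point x matches the Laplacian of u at the rotated point Ox. A sketch (the test function u and the sample point are my own choices):

```python
# Finite-difference check in 3-D: Laplacian of u(Ox) at x equals the
# Laplacian of u at Ox, for an orthogonal O.
import numpy as np

def u(p):
    x, y, z = p
    return np.exp(x) * np.sin(y) + x * z**3   # arbitrary smooth test function

def laplacian(f, p, h=1e-4):
    """Central-difference Laplacian of f at point p."""
    total = 0.0
    for k in range(len(p)):
        e = np.zeros(len(p)); e[k] = h
        total += (f(p + e) - 2*f(p) + f(p - e)) / h**2
    return total

# a rotation about the z-axis (any orthogonal matrix works)
t = 0.7
O = np.array([[np.cos(t), -np.sin(t), 0],
              [np.sin(t),  np.cos(t), 0],
              [0,          0,         1]])

p = np.array([0.3, -1.2, 0.5])
v = lambda q: u(O @ q)                  # v(x) = u(Ox)

# the two values agree up to finite-difference error
print(laplacian(v, p), laplacian(u, O @ p))
```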

>> No.12027413

I'm back. And I got answers I expected from this shit hole.

>> No.12027430

>>12027407
ccc

>> No.12027434

>>12027407
Hey, I am OP, and thanks. Can I ask you a few more questions? I am 34 and just got into grad school. I got my bachelor's in math at 21... have just taught high school since. I'm fucking lost dude.

>> No.12027436

>>12027434
go for it

>> No.12027453

>>12027436
So I see why ∇^2 = ∇⋅∇ = (J∇′)⋅J∇′

but why is (J∇′)⋅J∇′ = ∇′ᵀJᵀJ∇′?

i am missing something!

>> No.12027490

>>12027453
i'm going back and forth between vector and matrix notation for inner products
[eqn]
\vec{u} \cdot \vec{v} = \vec{v}^T \vec{u}
[/eqn]
and am also using the basic property of transposes that
[eqn]
(\mathbf{A} \mathbf{B})^T = \mathbf{B}^T \mathbf{A}^T
[/eqn]

btw, when manipulating these expressions, i'm thinking of [math]\nabla[/math] in terms of
[eqn]
\nabla = \begin{bmatrix}
\frac{\partial}{\partial x_1} \\
\frac{\partial}{\partial x_2} \\
\frac{\partial}{\partial x_3}
\end{bmatrix}
[/eqn]
that may help you expand out the expressions if you want to
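Both identities are easy to spot-check numerically; a quick numpy sketch (all names mine):

```python
# Numeric spot-check of the two identities used above:
#   u . v == v^T u   and   (AB)^T == B^T A^T
import numpy as np

rng = np.random.default_rng(1)
u_, v_ = rng.standard_normal(3), rng.standard_normal(3)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))

# inner product written as a matrix product
print(np.isclose(np.dot(u_, v_), v_ @ u_))   # True

# transpose of a product reverses the order
print(np.allclose((A @ B).T, B.T @ A.T))     # True
```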

>> No.12027530

>>12027490
I've erased 10 things I was going to type to you, lol. I'm very grateful for the help. I'm kind of baffled by the first statement that you just typed...how do I take the transpose of a vector?

I remember the property for two matrices, but only after I saw it. I understand what the gradient operator is and I remember divergence is a scalar.

I have some catching up to do....Will you be up awhile? I am going to be posting on here daily for the next 8 years till I get my phd.

My background is in industrial math. I did a lot of engineering courses in my undergrad, not math.

>> No.12027534

>>12027530
transposing a vector turns it from a column to a row vector, or vice versa. it's the same as with a matrix; just think of it as an m×1 matrix turning into a 1×m matrix

>> No.12027538

>>12027530
I suppose the transpose of a column vector that is n×1 would be a 1×n row vector... but I could just be an idiot.

>> No.12027540

>>12027534
I guess i'm not an idiot.

>> No.12027541

>>12027530
Transpose of a row vector is the same vector as a column and vice versa
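In numpy terms (a minimal sketch):

```python
# Shape check: transposing an m-by-1 column gives a 1-by-m row, and back.
import numpy as np

col = np.arange(3).reshape(3, 1)   # 3x1 column vector
row = col.T                        # 1x3 row vector
print(col.shape, row.shape)        # (3, 1) (1, 3)
print((row.T == col).all())        # True: transposing twice restores it
```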

>> No.12027544

>>12027534
We typed that virtually at the same time. I was pretty good at linear algebra. So, how are you typing math font on 4chan?

>> No.12027548
File: 243 KB, 3600x1300, joseflatex.png

>>12027544
I'm not the person you're responding to initially.

do you know latex?