
/sci/ - Science & Math



File: 2 KB, 285x97, lb.png
No.2283515

Mathematicians and physicists,

Guys, what's the difference between a matrix and a tensor? I understand that a matrix is a type of tensor...right? The only tensors I've come across are in Special Relativity, which I at first thought were just matrices (4x4s).

So how do they really differ? Apparently the Levi-Civita tensor is a mind fuck. What is that?

I know the Lorentz boost is a tensor, because of its two non-summed indices in the notation, but this is a 4x4, and I've seen a tensor before that's 3x6? What's this about?

Vague and hopeless question I know, but thanks.

>> No.2283579
File: 95 KB, 472x369, forever alone face.png

>> No.2283592

A tensor works as a linear machine.
It's notationally similar to a matrix, yet it performs some work on general elements (which may be scalars, vectors, or matrices themselves).

>> No.2283712

Given a linear vector space V of dimension n over the reals R, a dual tensor, <span class="math">\phi[/spoiler], of degree two is a bilinear mapping <span class="math">\phi : V \times V \rightarrow R[/spoiler]. An n by n matrix is simply an array containing <span class="math">n^2[/spoiler] entries.

Let V* be the dual space of V and assume that there is a non-degenerate bilinear form <span class="math">\phi[/spoiler] on <span class="math">V \times V[/spoiler]. Then there is a one to one correspondence between <span class="math">n \times n[/spoiler] matrices and linear mappings <span class="math">V \rightarrow V*[/spoiler]. On top of this, there is a one to one correspondence between mappings <span class="math">V \rightarrow V*[/spoiler] and dual tensors of degree 2:

Let T(V) be the set of tensors on V and L(V) the set of linear transformations as above. Define a mapping <span class="math">f: T(V) \rightarrow L(V)[/spoiler] by <span class="math">(f(\phi) (v))(u) = \phi (v)(u)[/spoiler]. This is a bijection.

What this means is pretty much that you can think of bilinear forms as matrices.

Now, there are tensors of higher order and they are still just bilinear mappings from some cartesian product of vector spaces.

>> No.2283733

>>2283712
>assume that there is a non-degenerate bilinear form <span class="math">\phi[/spoiler] on <span class="math">V \times V[/spoiler]
Disregard that

>> No.2283758

To make the construction explicit, I'm just saying that given a bilinear mapping <span class="math">\phi[/spoiler] I can identify it with the matrix M defined by <span class="math">M _{ij} = \phi (e_i,e_j)[/spoiler]. Note that this makes sense since <span class="math">\phi (e_i,e_j)[/spoiler] is a scalar.
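
To make it even more explicit, here's a minimal numpy sketch of that identification (the particular form phi and the test vectors are made-up examples, not anything from the thread):

```python
import numpy as np

# A hypothetical bilinear form on R^3, given as a function of two vectors.
def phi(v, u):
    return 2 * v[0] * u[0] + v[1] * u[2] - v[2] * u[1]

n = 3
e = np.eye(n)  # standard basis e_1, ..., e_n as rows

# M_ij = phi(e_i, e_j): evaluate the form on every pair of basis vectors.
M = np.array([[phi(e[i], e[j]) for j in range(n)] for i in range(n)])

# Sanity check: by bilinearity, phi(v, u) must equal v^T M u for all v, u.
v, u = np.array([1.0, 2.0, 3.0]), np.array([-1.0, 0.5, 4.0])
assert np.isclose(phi(v, u), v @ M @ u)
```

The assert at the end is just bilinearity at work: once you know phi on all basis pairs, v^T M u reproduces phi everywhere.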

>> No.2283776

>>2283712
<div class="math">(f(\phi) (v))(u) = \phi (v)(u)</div>
should of course be
<div class="math">(f(\phi) (v))(u) = \phi (v,u)</div>

>> No.2283799
File: 127 KB, 300x427, 1268284375066.jpg

>>2283515
A very simple way to understand it:

A scalar - it is just a number, no degrees of freedom - rank 0 tensor

A vector - one degree of freedom - it can be a row vector or a column vector - rank 1 tensor

A matrix - "a 2d object" - 2 degrees of freedom - rank 2 tensor

A "cube of numbers" - 3 degrees of freedom - rank 3 tensor
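
In numpy terms that ladder is just the number of indices an array carries (an illustration of the "degrees of freedom" idea, not a definition of tensor):

```python
import numpy as np

scalar = np.float64(3.0)          # rank 0: no indices
vector = np.zeros(4)              # rank 1: one index
matrix = np.zeros((4, 4))         # rank 2: two indices
cube   = np.zeros((4, 4, 4))      # rank 3: three indices

# numpy calls the number of indices "ndim"; it climbs 0, 1, 2, 3 here.
for t in (scalar, vector, matrix, cube):
    print(np.ndim(t))
```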

Anything else I can help you with?

>> No.2283823

>>2283799
But you're just restating the question... this is probably the explanation his book gave him, which is why he is confused.

>> No.2283836

>>2283799
scalar - a point of numbers - rank 0 tensor
vector - a line of numbers - rank 1 tensor
matrix - a plane of numbers - rank 2 tensor

etc

>> No.2283858

From the little GR I remember, a tensor has additional constraints to make it physical or something.

>> No.2283859
File: 58 KB, 521x785, 1292779622381.jpg

>>2283823
What exactly is OP's question?
Even he says it is too vague.

Please clarify OP
>>2283515

>> No.2283882

>>2283859
>Guys, what's the difference between a matrix and a tensor?

That's a pretty concrete question.

>> No.2283894

>>2283712
>Now, there are tensors of higher order and they are still just _multi_linear mappings from some cartesian product of vector spaces.

fixed

>> No.2283908

>>2283515
A matrix is just an arrow or "step" with {x, y, z} components.

The most tangible way I've heard was to consider a cube with 6 faces, each face having a vector coming out of it. The cube also gets "shearing forces" depending on how it spins about the 3 axes, making 3 more forces for 9 in total. 9 forces times 3 (x,y,z) components for each vector gives 27 variables.

>> No.2283912
File: 116 KB, 1200x810, 1292779141898.jpg

>>2283882
The post with the examples is more helpful, but whatever.....

Tensors are geometric entities introduced into mathematics and physics to extend the notion of scalars, geometric vectors, and matrices to higher orders.

/thread

>> No.2283934

>>2283912
No. No they are not. They are multilinear maps into the base field, something which it is natural to study in linear algebra.

In physics they arise naturally since the only output from an experiment is a scalar, which is exactly the output you get from a tensor.

>> No.2283961
File: 101 KB, 760x1140, 1293745347725.jpg

>>2283934
You say nothing that contradicts my post.

>> No.2283977

>Tensors are geometric entities

>> No.2283985

tensors are matrices of matrices

>> No.2283991

>>2283858
A tensor is like a matrix (of sorts) that obeys the tensor transformation rules

http://mathworld.wolfram.com/Tensor.html

the Levi-Civita permutation tensor is a bit down the page and explained in more detail

Also guys, don't let your urge for precision stand in the way of understanding. There's no need to get into mappings with a newbie.
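
Since OP asked about the Levi-Civita tensor specifically, here's a small numpy sketch of the rank-3 Levi-Civita symbol, used to build the cross product (its most familiar job; the helper name cross_via_eps is made up):

```python
import numpy as np

# Build the 3x3x3 Levi-Civita symbol: +1 for even permutations of (0,1,2),
# -1 for odd permutations, 0 whenever an index repeats.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

# The cross product in index notation: (a x b)_i = eps_ijk a_j b_k
def cross_via_eps(a, b):
    return np.einsum('ijk,j,k->i', eps, a, b)

a, b = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
print(cross_via_eps(a, b))  # e_x cross e_y = e_z, i.e. [0. 0. 1.]
```

So the "mind fuck" is just a rank-3 array of 0s and ±1s; contracting it against two vectors is what produces the antisymmetry of the cross product.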

>> No.2284001

>>2283977
Any tensor could be regarded as a geometric entity, if need be. Show me a tensor that I couldn't use as a geometric entity in some n-dimensional space? lol

>> No.2284035

>>2283991
It's not an urge for precision, I just feel that it's the only way to really explain the difference. What's the difference between a matrix and a linear map? Well, one's an array, the other is a mapping... the essential point in this discussion is that you can identify the space of second degree tensors on a space with the space of linear mappings from that space into its dual space, thereby identifying it with a matrix.

>>2284001
It's pretty clear at this point that you are a physicist or an engineer. It is ok that you mostly think of tensors in a setting where they can be interpreted geometrically. They are however not intrinsically geometric entities.

>> No.2284076

Being an idiot, how might tensors be explained to me?

>> No.2284097

>>2284076
For many practical purposes, tensors of degree two are all you need, in which case you can just think of them as matrices. When you apply them to two vectors, you apply the matrix as a linear transformation to the first vector (viewed as a column vector) and then apply the second vector (viewed as a row vector) to the resulting vector. This gives you a scalar, as was needed, and all tensors can be obtained in this fashion.
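
As a quick numpy sketch of that recipe (the matrix and vectors are arbitrary made-up examples):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])        # the matrix representing a degree-2 tensor
v = np.array([1.0, 0.0])          # first argument (treated as a column vector)
u = np.array([0.0, 1.0])          # second argument (treated as a row vector)

# Apply M to v, then apply u as a row to the result: u^T (M v) is a scalar.
value = u @ (M @ v)
print(value)  # 3.0
```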

>> No.2285988

OP here. Sorry for the late reply; posted this question before I went to sleep (I'm in the UK).

Most helpful answer so far is:
>>2283908

MOAR

>> No.2286395

This is why maths > physics

>what's the difference between a matrix and a tensor?
>a matrix is an arrow, the stress tensor is a tensor
>physicist finds answer enlightening.

>> No.2286433

A matrix is a table of coefficients of a given basis in a 2nd order tensor. That's it. With matrices/vectors you are also, superficially, locked to 0th, 1st, and 2nd order tensors (even though it would be perfectly possible to denote higher orders in a similar way).

With vectors you have the scalar product; with tensors you have an inner product. You have the transpose and scalar products of vectors; with tensors you have outer products (dyadic products).
With 1st order tensors a transpose is meaningless. There are no "column tensors" or "row tensors" because there is no need for them. A transpose of a second order tensor is as simple as swapping the two indices of the coefficient.

In continuum mechanics one often uses third and fourth order tensors.

Usually these higher order tensors are tangents that appear when differentiating lower order tensors.
<div class="math">\sigma_ij = E_{ijkl} \epsilon_{kl}j</div>

I find it very elegant to think of these as tangents or linear mappings, especially when considering different bases (as with the case of large deformations in mechanics).
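
That contraction sigma_ij = E_ijkl epsilon_kl is a single einsum call in numpy. A sketch using the isotropic elasticity tensor as a concrete E (the Lamé constants lam and mu below are arbitrary made-up values):

```python
import numpy as np

lam, mu = 1.0, 0.5                      # made-up Lamé constants
d = np.eye(3)                           # Kronecker delta

# Isotropic 4th-order stiffness: E_ijkl = lam d_ij d_kl + mu (d_ik d_jl + d_il d_jk)
E = (lam * np.einsum('ij,kl->ijkl', d, d)
     + mu * (np.einsum('ik,jl->ijkl', d, d) + np.einsum('il,jk->ijkl', d, d)))

strain = np.array([[0.1, 0.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])    # a made-up small-strain tensor

# sigma_ij = E_ijkl epsilon_kl: contract the 4th-order tensor with the 2nd-order one.
sigma = np.einsum('ijkl,kl->ij', E, strain)
```

For this isotropic E the contraction reduces to the familiar sigma = lam*tr(eps)*I + 2*mu*eps, which is an easy way to check the einsum by hand.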

>> No.2286434

>>2286433
I messed up the latex
<div class="math"> \sigma_{ij} = E_{ijkl} \epsilon_{kl} </div>

>> No.2286446

>>2283991

Correct and complete answer by this anon.

Tensors are matrices that transform like a tensor according to the tensor transformation law.

>> No.2286458

>>2286446
No, dear physicists, could you please stop posting this crap? A tensor is not a matrix which transforms according to a certain law.

>>2286433
This is the truth.

>> No.2286499

>>2286458

You are directly inducing an interpretation for a matrix, which you can't. Certain matrices are tables of coefficients, others aren't. There are further differences.

Also, where is the difference between
>>2286433
and
>>2283991
?

There is practically none, as the transformation law implies everything that follows.

>> No.2286536

>>2286499
No, the ones directly inducing an interpretation for a matrix are the physicists. You need to understand that a matrix is simply an array. There is nothing more to it.

A vector is an element of a vector space. If you choose a basis, you can write it as a column vector, writing out its components with respect to that basis.

A linear mapping between vector spaces isn't a matrix. It is a linear mapping. If you choose a basis, you can fully describe the action of the linear mapping by writing down a matrix and interpreting this matrix in a certain way. Since matrices are themselves vectors, it makes sense to talk of the entries as the components of the linear mapping with respect to a basis.

A tensor is a multilinear mapping from a cartesian product of vector spaces into the base field. Again, if we choose a basis, we can identify it with a multi-dimensional array of numbers as described in
>>2283758
A tensor is not a matrix which transforms in a certain way. Rather, any tensor can be represented by a (multidimensional) matrix.

>> No.2286543

Summing up, you can represent a second degree tensor by a matrix.

>> No.2286544

>>2286536

>You need to understand that a matrix is simply an array. There is nothing more to it.

Well, that's what I wanted to say. But it contradicts this:
>A matrix is a table of coefficients of a given basis in a 2nd order tensor.

See what I mean?

>> No.2286565

>>2286544
Yea, I see, he should probably have worded it as
>Given a basis and a 2nd order tensor, there is a way to identify it with a matrix.

I don't disagree with
>>2283991
he's just giving the "down to earth" explanation. What I didn't like was
>Tensors are matrices that transform like a tensor according to the tensor transformation law.

>> No.2286572
File: 38 KB, 201x228, 1281408348236.jpg

wtf is a linear mapping?

>> No.2286575

>>2286565

Sounds good that way.

What didn't you like about the last line you quoted, if I might ask?

>> No.2286619

A matrix is not a tensor and a tensor is not a matrix. Rather, given a basis, the coordinate representation of a tensor w.r.t. that basis is a matrix, and any matrix "induces" a tensor in a certain way w.r.t. that basis.

However, the correspondence is one-to-one, so if it is convenient it makes sense to choose a basis and simply identify the tensors with their corresponding matrices.

>> No.2286629

>>2286619

Not "any" matrix.

>> No.2286648

>>2286629
Yes, indeed any matrix. The correspondence is bijective. I can write it out explicitly if you'd like, but it's just going to be the generalisation to finitely many dimensions of the construction in
>>2283758

>> No.2286877

>>2286648

I believe one can come up with a matrix that does not induce a tensor with respect to any basis.

>> No.2286915

>>2286619
>implying matrices arent rank 2 tensors

>> No.2286942

>>2286877
One cannot. I'll just do the example for second rank tensors (bilinear forms); it's the same for higher ranks.

Given any bilinear form, say <span class="math">\phi[/spoiler], it is obvious that <span class="math">\phi[/spoiler] is completely determined by choosing a basis <span class="math">\{ e_1,...,e_n \}[/spoiler] and giving the values of <span class="math">\phi(e_i,e_j)[/spoiler]. Define the matrix M by <span class="math">M_{ij}=\phi (e_i,e_j)[/spoiler]. This is how you get a matrix from a bilinear form.

On the other hand, if you have any matrix M, define a bilinear form by <span class="math">\phi (e_i,e_j) = M_{ij}[/spoiler] and extend by bilinearity, that is <span class="math">\phi (v,u) = \sum _{i,j} v_i u_j \phi (e_i,e_j) = \sum _{i,j} v_i u_j M_{ij}[/spoiler]. This is a bilinear form, hence it is a tensor of rank 2.
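
Both directions of that correspondence, sketched in numpy (the starting matrix is an arbitrary made-up example):

```python
import numpy as np

n = 3
M = np.arange(9, dtype=float).reshape(n, n)   # any matrix at all

# Matrix -> bilinear form: extend phi(e_i, e_j) = M_ij by bilinearity,
# i.e. phi(v, u) = sum_ij v_i u_j M_ij.
def phi(v, u):
    return v @ M @ u

# Bilinear form -> matrix: read off N_ij = phi(e_i, e_j).
e = np.eye(n)
N = np.array([[phi(e[i], e[j]) for j in range(n)] for i in range(n)])

# The round trip recovers the matrix we started from: the map is a bijection.
assert np.allclose(M, N)
```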

>> No.2286970

>>2286942
The construction here is of course not basis independent. Notice that I do not require that the matrix transforms as a tensor.
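
To see the basis dependence explicitly: if the columns of P are the new basis vectors, the matrix of the same bilinear form in the new basis is P^T M P (the standard change-of-basis fact for bilinear forms; the numbers below are random made-up data):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))   # matrix of a bilinear form in the basis e
P = rng.standard_normal((3, 3))   # columns = new basis vectors in e-coordinates

def phi(v, u):                    # the form itself, defined without any basis
    return v @ M @ u

# New basis vectors f_i = P[:, i]; the new matrix is M'_ij = phi(f_i, f_j),
# which works out to (P^T M P)_ij.
M_new = np.array([[phi(P[:, i], P[:, j]) for j in range(3)] for i in range(3)])
assert np.allclose(M_new, P.T @ M @ P)
```

Since M_new differs from M for a generic P, the tensor-matrix identification really does depend on the chosen basis, which is the point of the post above.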

>> No.2287095

>>2286915
Yea, they aren't.