
/sci/ - Science & Math



File: 99 KB, 1200x1094, 1200px-Components_stress_tensor.svg[1].png
No.12107122

Not asking about the definition. Just have a specific question. Can a tensor be thought of as a function that takes an input and spits out an output? For a rank-2 tensor, which is a matrix, can it take a vector and rotate it, for example? How is such a tensor then different from a general linear map that rotates/resizes things?

>> No.12107402

https://www.youtube.com/watch?v=bpG3gqDM80w

>> No.12107668

>>12107122
https://www.tutorialspoint.com/computer_programming/computer_programming_arrays.htm

A function with an array input? No, the tensor is that array imo, not the function it goes into.

>> No.12107835

>>12107122
Yes, a tensor can be considered a multilinear map on q copies of a vector space V and p copies of its dual V*.

A rank 2 tensor is not a matrix. A matrix is a (1,1) tensor. They are in general different due to their transformation properties under change of basis.
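Spelled out (a sketch; the ordering of the factors and the roles of p and q vary between authors), the multilinear map is
[eqn] T\colon \underbrace{V^{*} \times \dots \times V^{*}}_{p\ \text{copies}} \times \underbrace{V \times \dots \times V}_{q\ \text{copies}} \to \mathbb{R}, [/eqn]
linear in each argument separately.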

>> No.12107838

>>12107122
The most mundane view of a tensor is as a generalization of vectors and matrices where one specifies a shape [math] (i_1, i_2, \dots, i_n) [/math] for a table that indexes values using n coordinates.
What makes tensors interesting is their mathematical properties as multilinear maps and the associated transformation laws. It's not necessarily *hard*; it's more about being careful with the algebra
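A minimal numpy sketch of the mundane view (numpy is just one arbitrary choice of array library; the values are made up):
[code]
import numpy as np

# an "order-3 tensor" in the mundane sense: a table of values
# indexed by 3 coordinates, here with shape (2, 3, 4)
T = np.arange(24, dtype=float).reshape(2, 3, 4)

print(T.ndim)      # 3 -- the number of indices
print(T.shape)     # (2, 3, 4)
print(T[1, 2, 3])  # 23.0 -- a single entry, addressed by 3 coordinates
[/code]
None of the interesting multilinear structure is visible here; that only shows up once you ask how the entries change under a change of basis.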

>> No.12107849

tensors are multilinear maps it's simple really...

>> No.12107882
File: 287 KB, 1080x1193, Screenshot_20200911-005400_Google.jpg

>a tensor is an element of a tensor product

>> No.12107894
File: 110 KB, 680x490, brain.jpg

>>12107882
A tensor is something that transforms like a tensor.

>> No.12107910
File: 48 KB, 680x636, 1591569832100.jpg

>>12107894
It's an array of numbers.

>> No.12107924
File: 17 KB, 800x458, tensor univ.png

>>12107882
Holy based...

>> No.12107929

>>12107882
>>12107894
Why are physics undergrads so impressed by recursive definitions?

>> No.12107931

a tensor is a [math]GL(n)[/math]-equivariant map from the set of frames into a suitable cartesian power of the set of coordinates
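For a concrete instance of what that equivariance means (a sketch for the (1,1) case; conventions differ): if a frame F is changed by [math]g \in GL(n)[/math], the components must respond by
[eqn] T(F \cdot g)^i_{\ j} = (g^{-1})^i_{\ k}\, T(F)^k_{\ l}\, g^l_{\ j}, [/eqn]
so the map from frames to component arrays intertwines the two GL(n) actions.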

>> No.12107941

>>12107929
>tensor is an element of a tensor product
>recursive definition
Do you think "a vector is an element of a vector space" is recursive also?

>> No.12107956

>>12107929
>>12107910
brainlet
>>12107882
>>12107894
brain chad

>> No.12108030

>>12107931
Holy shit, that's great. Recently learned the basics of frame theory. Gonna use that definition from now on.

>> No.12108040

>>12107122
The last definition I read was that a tensor is a bilinear map. That's it. I know nothing more than that.

>> No.12108044

>>12107668
>tutorialspoint.com/
Don't post low-tier website sources here. This degrades this board's overall quality.

>> No.12108065
File: 73 KB, 700x432, qZ_Mj_ma5irFk3rXS5vRWQ_r.jpg

>>12107122
>Update

>> No.12108100

>>12108030
what the hell is "frame theory"?

>> No.12108107

>>12108100
https://arxiv.org/pdf/1204.0096.pdf

>> No.12108240

>>12108040
I think that's a particular type of rank-2 tensor, probably (1,1), since a bilinear map is an inner/dot product, which is basically matrix multiplication of a contravariant and a covariant vector.

>> No.12108242

>>12107931
that sounds close to the definition of the tensor product as a multilinear map, where you multiply dimensions as opposed to adding them as in the case of a direct sum.

>> No.12108243

>>12107929
>recursive definitions
That's what Tooker does, and everyone shits on him for doing that.

>> No.12108245

>>12107835
>A rank 2 tensor is not a matrix. A matrix is a (1,1) tensor.
But (1,1) is rank 2 = 1+1. I don't understand what you mean. It could be (2,0) or (0,2) or (1,1); it is still a 2-dimensional matrix. I think it is about how many contravariant and covariant vectors (basically columns or rows) you combine to get that matrix. I am more curious about what that matrix does, not how you build it.

>> No.12108252

>>12108242
not really

>> No.12108262

>>12108243
there's a difference between recursive definition and circular definition

>> No.12108266

>>12108262
Not in this case. The tensor definition is literally
>X is something that does X.
It doesn't make sense

>> No.12108272

>>12108266
because it's circular, not recursive

>> No.12108277

>>12108266
"Element of a tensor product" has at least the potential of being non-circular. "Transforming like a tensor" is just plain circular It's like saying a duck is anything that walks like a duck. It limits what a duck can be somewhat, but doesn't rule out the possibility that ducks are the same things as armadillos.

>> No.12108285

>>12108272
>>12108277
Well OK, that's my point. Since if "tensor is something that transforms like a tensor" is an acceptable mathematical definition of a tensor, then Tooker is a genius and his work is correct.
>introduce real numbers via some other concept
>use that concept to define real numbers
Same trick.

>> No.12108287

>>12108285
>"tensor is something that transforms like a tensor"
of course it's not acceptable and nobody claims it is. but at least the idea is not wrong, it can be formalized as >>12107931

>> No.12108298

>>12107894
could someone explain what the tensor transformations are? Or just link me to that one crystallography dude's explanation of tensors? I used to understand them, but forgot.

>> No.12108307

>>12108298
https://www.youtube.com/watch?v=rlpziTbJZk0

>> No.12108315

>>12108298
fix a basis -> tensors can be written as linear combinations of (tensor products of) elements of this basis and the corresponding dual basis. change the basis -> the coefficients also change, and they do so according to the well known formula.
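For concreteness, a numpy sketch of that change of coefficients for a (1,1) tensor (numpy is an arbitrary choice here; A is a random change-of-basis matrix, assumed invertible):
[code]
import numpy as np

n = 3
T = np.random.rand(n, n)  # components T^i_j in the old basis
A = np.random.rand(n, n)  # change-of-basis matrix (almost surely invertible)

T_new = np.linalg.inv(A) @ T @ A  # the "well known formula" for a (1,1) tensor

# the underlying map is unchanged: acting on a vector and changing
# basis commute
v_new = np.random.rand(n)  # a vector's components in the new basis
v_old = A @ v_new          # the same vector's components in the old basis
assert np.allclose(A @ (T_new @ v_new), T @ v_old)
[/code]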

>> No.12108515

>>12108245
no, it's not a matrix, because matrices transform one way and (2, 0) and (0, 2) tensors transform in different ways. a matrix is not just a 2d collection of numbers. it is a representation of a linear transformation with respect to a basis.
there is no canonical map between a vector space and its dual, therefore there is no canonical correspondence between (2, 0) and (1, 1) tensors. just because there is a vector space isomorphism doesn't mean shit.

>> No.12108628

Eigenchris on youtube has a great series of videos on tensor algebra and tensor calculus. He fucks up the first few with a silly mistake but it gets much better.

>> No.12108635

What I don't understand is whether a tensor is a *result* of an operation, or an operation that acts on something. You know, similar to a linear map that maps some input (e.g. a vector) to an output (another vector) by transposing it, shrinking it, etc. Like what the hell do you do with it in practical terms, such as in physics? Do you combine a couple of things to get a rank-2 tensor and you are done? That's the outcome? Or do you then use that tensor (represented as a matrix) as some sort of operation that acts on other objects, i.e. it takes some input and produces an output, such as a vector? I have asked that question before and I am always given a million confusing/mutually exclusive/evasive/misleading answers. Does anyone here actually understand tensors as physical objects other than at the level of abstract definitions learned by rote in an abstract math class? (That seems to be what mathematicians are trained to do; if I didn't understand shit I would also prefer an extremely abstract language to share my confusion with others.)

>> No.12108661

>>12108515
you're wrong. a matrix IS literally a collection of numbers. it can represent a (1,1), (0,2) or (2,0) tensor, but there's no reason to prefer any of these cases.

>> No.12108662

>>12108635
a tensor is an operation which acts on something, but people use them to hold information in a nice way because they have the structure of multidimensional arrays if you express them in coordinates
the thing you have to understand is that just about anything can be envisioned as acting on something.
for example, a linear functional is a (0, 1) tensor which acts on a vector, which is a (1, 0) tensor. but you can view it the other way, as the tensor acting on the linear functional by evaluating the functional. all these things are transformations. but sometimes it's more valuable to think of them as things which hold data.
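A tiny numpy illustration of that symmetry of viewpoint (made-up numbers):
[code]
import numpy as np

v   = np.array([1.0, 2.0, 3.0])   # a vector: a (1, 0) tensor
phi = np.array([4.0, 0.0, -1.0])  # a linear functional: a (0, 1) tensor

# the functional acting on the vector...
print(phi @ v)  # 1.0

# ...equals the vector "acting on" the functional by evaluating it;
# which one is the actor is purely a matter of viewpoint
print(v @ phi)  # 1.0
[/code]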

>> No.12108670

>>12108661
NO, because matrices come equipped with an ALGEBRA STRUCTURE, and it only makes sense to multiply (1,1) tensors to get a (1, 1) tensor (see rules of tensor contractions) and it makes no sense to multiply (2, 0) tensors to get another rank 2 tensor. You're a fucking moron. IF IT DOESN'T OBEY MATRIX MULTIPLICATION, THEN IT'S NOT A FUCKING MATRIX. IT'S A 2D ARRAY.
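In index notation, the point is (a sketch of the contraction rules being invoked): (1,1) components compose like matrices because an upper index can meet a lower index,
[eqn] (AB)^i_{\ k} = \sum_j A^i_{\ j} B^j_{\ k}, [/eqn]
whereas two (2,0) tensors [math]A^{ij}[/math] and [math]B^{kl}[/math] offer no upper-lower pair to contract, so there is no basis-independent product of them giving another rank-2 tensor unless you add extra structure (e.g. a metric to lower an index).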

>> No.12108683

>>12108662
>a tensor is an operation which acts on something, but people use them to hold information in a nice way
So... is it both then? To hold information for what purpose? To apply that information to convert some input to some output?

>> No.12108694

>>12108670
what makes a collection of numbers organized by rows and columns obey or not obey anything? you have two matrices, you multiply them according to the rules. what prevents you from treating any array of numbers as a matrix that you can multiply by another array of numbers which is another matrix?

>> No.12108696

>>12108670
>IF IT DOESN'T OBEY MATRIX MULTIPLICATION, THEN IT'S NOT A FUCKING MATRIX. IT'S A 2D ARRAY.
this is one of the most retarded things I've ever read on sci

matrix literally is a 2D array, get over it

>> No.12108700

>>12108662
>about anything can be envisioned as acting on something.
Well, I mean, consider a linear map. It can be a 2D matrix that takes a vector as input and rotates it. That's what I mean by "acting". Of course you can act on that matrix as well, but I don't think that's relevant here. Which is why I was wondering if tensors can be compared to transformation matrices, if they can transform some kind of input.

>> No.12108724

>>12107122
>is whether a tensor is a *result* of an operation, or an operation that acts on something
It encapsulates both. A (pure) tensor is essentially a list of vectors [math]v_1 \otimes v_2 \otimes ... \otimes v_n[/math]. Because finite-dimensional vector spaces are isomorphic to their dual space, some of these can be thought of as functionals instead (these slots are usually called covariant indices). In this way, you can get e.g. square matrices as a special case, since [math]V \otimes V^* \cong \hom(V,V)[/math] in a natural way.
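A numpy sketch of that isomorphism on a pure tensor [math]v \otimes \varphi[/math] (made-up numbers):
[code]
import numpy as np

v   = np.array([1.0, 2.0])   # v in V
phi = np.array([3.0, -1.0])  # phi in V*, as components

M = np.outer(v, phi)         # the pure tensor v (x) phi, laid out as a matrix

# as an element of hom(V,V), it acts by (v (x) phi)(w) = phi(w) * v
w = np.array([0.5, 4.0])
assert np.allclose(M @ w, (phi @ w) * v)
[/code]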

>> No.12108728

>>12108724
Meant for >>12108635

>> No.12108760

>>12108683
No, literally just to hold information. It turns out tensors are a useful, general concept with many uses.
Obviously someone using them to hold data isn't using them to transform things or to perform other tensor operations on them. This is why a lot of people just use multidimensional arrays and don't call them tensors.
In reality, you need to carefully differentiate between a tensor (a transformation) and a representation of a tensor in a basis (numbers in a multidimensional array). There are people who only care about the latter, who are using them to store and process data. There are reasons, though, why you would want to think of these like tensors, because certain operations on tensors allow you to perform data compression and stuff like that (see: tensor decompositions). There are also people who care about the transformations, but the way you store said transformation in a computer is by writing it in coordinates (which produces a multidimensional array of numbers).
>>12108700
Only certain tensors transform space in the way you're thinking of it. It's not fruitful to think of tensors as an extension of the way that matrices multiply vectors. What's useful is thinking of tensors as an extension of the way matrices are sandwiched by vectors (a row on the left and a column on the right), which produces a number. i.e. it's useful to think of a matrix as encoding a quadratic form.
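The "sandwich" in numpy (made-up numbers):
[code]
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
x = np.array([1.0, 1.0])   # row on the left
y = np.array([2.0, -1.0])  # column on the right

# row . matrix . column -> a single number: the matrix as a bilinear form
print(x @ A @ y)  # -3.0
[/code]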

>> No.12108769

>>12108694
It doesn't make any sense to multiply two (2, 0) tensors. They don't multiply the way (1, 1) tensors multiply. You're abusing notation; an algebraic structure always includes a collection of operations.
>B-but matrices form a set!
The instant you use matrices to represent transformations, they inherit a fucking algebraic structure.

>> No.12108808

>>12108044
Sorry, but it explains what an array is best, even better than the wiki, because there's actual code that works, so you get a mental image of the matrix notation.

>> No.12108832
File: 65 KB, 272x204, freg.png

>>12107122
Honestly the most confusing thing about tensors is all the different conventions and conceptions used by different fields to approach the subject.
>he says rank-n tensor when he means order-n tensor

>> No.12108945

>>12108760
>you need to carefully differentiate between
>a tensor (a transformation)
#1

>a representation of a tensor in a basis
#2

Yes, I am trying to be as careful as possible, even with the exact wording:

So, #1 a "transformation" is something that transforms other things, correct?

And #2 can actually be "transformed" as the object of a transformation, for example under a change of basis? So we encode certain information in the tensor, and that information can change as a result of the change of basis, correct?

That's the distinction between #1 and #2?

At this point it is not even about math; it is almost about ambiguities in the wording (object vs subject), which also adds to the confusion!
Such as: I transform (transition from male to female).
Or I transform (other people, since I am a surgeon).

>> No.12109007

>>12108945
Yeah, when people say "a tensor is something that transforms like a tensor", "transform" means "gets transformed by a change of basis" not "transforms other objects"
I see where your confusion is.
The problem is that a tensor is technically still a multilinear transformation, in the sense that it takes a bunch of vectors and turns them into a number.
I think part of your issue is that you're trying very hard to pick out a very particular, specific definition for "tensor." There is no such thing. Plenty of people use the word tensor to describe different, related things. What it really is is a signal word to indicate that there is a certain universal property at play, a universal property which extends the idea of bilinearity.

>> No.12109337

can anyone get me that array of numbers meme

>> No.12109388

>>12109337
Where's the lie though?
It's a multidimensional array of numbers whose algebraic properties are defined in terms of the abstract tensor product (i.e. the quotient of the free vector space on the Cartesian product) and multilinear maps.

>> No.12109414

>>12107894
Physicists really sometimes surprise me with this level of retardedness

>> No.12109761

>>12108245
If you want to represent (2,0) and (0,2) tensors as arrays with contraction rules compatible with the multiplication rules for matrices and vectors, you would need to do something like:
(2,0) => column vector of column vectors
(1,1) => column vector of row vectors or vice-versa
(0,2) => row vector of row vectors
In this case, the "columnness" or "rowness" of the enclosing arrays would be encoding the contravariance or covariance of the components.

>> No.12109766

>>12109414
>The laws of physics assume their simplest form in an inertial frame
>An inertial frame is one in which the laws of physics assume their simplest form

>> No.12110265

>>12109007
>tensor is technically still a multilinear transformation, in the sense that it takes a bunch of vectors and turns them into a number.
Yeah, I read that definition, which is confusing since some rank-2 tensors in physics act on a vector and spit out a vector. In my mind they are like a prism that refracts incident light at a different angle. For example, permeability may be a dyadic 3x3 tensor. B = μH, so it acts on H, which is a vector, and returns B, which is another vector that now points in a different direction. So it changes both the magnitude and the direction, as encoded in the tensor. So technically it is a dot product of μ and H, which is a bilinear map, but it returns a vector, not a scalar, since μ is a 3x3 matrix, not a vector. Urgh, so confusing.
From an intro to tensors by NASA:

>if we form the inner product of a vector and a tensor of rank 2, a dyad, the result will be another vector with both a new magnitude and a new direction

How is this related to a multilinear map (or a bilinear map) that eats 2 or more vectors and spits out a scalar?
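As a concrete numpy sketch of the B = μH half of this (the permeability values are made up, just some anisotropic 3x3 matrix):
[code]
import numpy as np

# hypothetical anisotropic permeability tensor, made-up values
mu = np.array([[2.0, 0.5, 0.0],
               [0.5, 1.0, 0.0],
               [0.0, 0.0, 3.0]])

H = np.array([1.0, 0.0, 0.0])

B = mu @ H  # [2.0, 0.5, 0.0]
# B is not parallel to H: the tensor changed both magnitude and direction
print(B)
[/code]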

>> No.12110304

>>12108515
Not that anon, but does that indicate that humans are best at understanding matrices when they represent 2d transformations of vectors?
Does it have to do with how used we are to 2d representations in general? Or is it a byproduct of LinAlg1 being taught like that?

>> No.12111508

>>12110265
Bilinear form is not the same as bilinear map.

>> No.12111536

>>12110265
in general, a tensor of type (p,q) is a multilinear map which takes q covectors and p vectors as input and returns a scalar as output.
equivalently it's a multilinear map which takes p vectors as input and gives a tensor product of q vectors as outputs. (one is inclined to think that it eats p vectors and returns q vectors, but that's not entirely correct)
a "dot product of a tensor and (co)vector" is a misleading terminology. it probably means the evaluation of a tensor on a (co)vector.

>> No.12111549

>>12111536
>a tensor of type (p,q) is a multilinear map which takes q covectors and p vectors as input and returns a scalar as output.
But for example if p+q=2, it is a rank-2 dyadic tensor and it is a matrix, not a number, right?

>> No.12111558

>>12111549
Yes, and it needs a vector and covector as input to give you a number via matrix multiplication

>> No.12111565

>>12111549
I haven't said a tensor is a number. it's a multilinear map which takes p+q inputs and returns a scalar, at least technically. it can be represented by a (p+q)-dimensional array of numbers. for example a (1,1) tensor is technically a bilinear map which takes a vector and a covector and returns a scalar, but you can interpret it as a map which takes a vector and gives back another vector, because of some canonical isomorphism.
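Both readings side by side in numpy (made-up numbers):
[code]
import numpy as np

T   = np.array([[1.0, 2.0],
                [3.0, 4.0]])  # components of a (1,1) tensor
v   = np.array([1.0, -1.0])   # a vector
phi = np.array([2.0, 0.0])    # a covector

# reading 1: bilinear map (covector, vector) -> scalar
print(phi @ T @ v)  # -2.0

# reading 2: via the canonical isomorphism, a map vector -> vector
print(T @ v)        # [-1. -1.]
[/code]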

>> No.12111571

>>12107122
If you tried to explain it to a little kid, how would you say what a tensor is? Because "a bunch of numbers with a size in every direction" isn't quite a fit, but symmetric tensors get used everywhere.

>> No.12111582

>>12111565
>a bilinear map which takes a vector and a covector and returns a scalar,
via which operations? matrix multiplication done twice? we have three things here: the tensor (a 2D matrix, 9 elements) and two vectors. What do we do with them to get one single scalar in the end?

>> No.12111607

>>12111582
yes, write covector-matrix-vector and multiply, result is a scalar.

>> No.12111625

>>12111582
>via which operations
Is it always matrix multiplication or do others exist and have to be specified?

>> No.12111659

>>12111625
I dunno, gotta be matrix multiplication since the dot product is lurking there somewhere... It is everywhere. Bilinear form is the dot (inner) product, too.

>> No.12111685

>>12111607
Thanks anon. I cannot say it is crystal clear, but it is not entirely insane to think that way. I still like to think that a 2D rank-2 dyadic tensor matrix eats a vector and returns a vector via matrix multiplication. That makes sense in physics. And that explains the need for a dyadic tensor, since it is constructed via the tensor (dyadic) product of two vectors and can therefore encode both magnitude and direction for some specific applications in physics.

>> No.12111693

>>12111625
I'll write how it goes for (1,1). it doesn't matter if you don't get it in detail, it's just to give you an impression. let [math]T[/math] be the tensor. technically it's a map which eats [math]\varphi \in V^*[/math] and [math]v \in V[/math] and returns [math]T(\varphi,v) \in \mathbb{R}[/math]. this is how it looks in coordinates:
let [math](e_i)[/math] be a basis and [math](\eta^i)[/math] the dual basis. the tensor can be written as a linear combination [math]T = \sum T^i_j e_i \otimes \eta^j[/math]. the summands [math]e_i \otimes \eta^j[/math] work simply as [math]e_i \otimes \eta^j(\varphi,v) = \varphi(e_i) \eta^j(v)[/math], where the right hand side is a multiplication of scalars. express [math]v = v^i e_i[/math] and [math]\varphi = \varphi_i \eta^i[/math] in your basis. using bilinearity and the fact that [math]e_i[/math] and [math]\eta^j[/math] are dual, the evaluation becomes
[eqn]T(\varphi,v) = \sum T^i_j e_i\otimes \eta^j (\varphi_l \eta^l,v^k e_k)
= \sum T^i_j \varphi_l v^k \eta^l(e_i) \eta^j(e_k) = \sum T^i_j \varphi_i v^j[/eqn]
which is matrix multiplication of the row vector with entries [math]\varphi_i[/math], the 2D matrix with entries [math]T^i_j[/math], and the column vector with entries [math]v^j[/math]. tensors of different ranks work the same in principle, in coordinates the evaluation always looks something like [math]\sum T^i_{jk} \varphi_i v^j w^k[/math], which would be the case for a (1,2) tensor. whether you interpret this as "3D matrix multiplication" is up to you.
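That final formula is easy to sanity-check numerically; a numpy sketch with random components:
[code]
import numpy as np

n = 3
T   = np.random.rand(n, n)  # components T^i_j
phi = np.random.rand(n)     # covector components phi_i
v   = np.random.rand(n)     # vector components v^j

lhs = np.einsum('ij,i,j->', T, phi, v)  # sum over i,j of T^i_j phi_i v^j
rhs = phi @ T @ v                       # row . matrix . column

assert np.allclose(lhs, rhs)
[/code]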

>> No.12111705

>>12111693
Nice.

>technically it's a map which eats [math]\varphi \in V^*[/math] and [math]v \in V[/math] and returns [math]T(\varphi,v) \in \mathbb{R}[/math]
So the dot product. There is always the dot product there somewhere.

>> No.12111721

>>12111705
>So the dot product. There is always the dot product there somewhere.
no, there's no dot product. in coordinates it looks like matrix multiplication of a row and a column and this happens to be the same formula as the standard dot product in R^n, but it doesn't mean there's a "hidden dot product" somewhere and it's misleading to think otherwise.

>> No.12111748

>>12111693
Makes sense, thanks anon.

>> No.12111754
File: 507 KB, 1070x601, 33481015d04b3974f9ed7acf616592901b13507ebdabf48ee1d6d09d63acc2c4[1].png

>>12111721
>mfw
lol every time i think i got it, it is wrong
But I've always thought that a map that eats two vectors, from V and V*, and returns a scalar is called a bilinear form, aka an inner product, aka the dot product in a finite-dimensional euclidean space. Having said that, I am not sure if it is actually V x V* -> R or the same vector space, V x V -> R. However, there is a placeholder there (.) which denotes the dot product, as in B<b,.>

>> No.12111784

>>12111754
Oh wait, I realized that it could be V x V -> R or V x V* -> R; it doesn't matter, it is just (2,0) vs (1,1), but it is still a bilinear form, which is the dot product, right? How come there is no dot product then?

>> No.12111799

>>12111754
dot product is a symmetric, positive definite bilinear form, i.e. a bilinear map [math]V \times V \to \mathbb{R}[/math] satisfying some extra properties. by definition every dot product is a tensor of type (0,2). not every tensor of type (0,2) is a dot product, because it might not be symmetric or positive definite. a tensor of type (1,1) cannot be a dot product, because it has the wrong domain. a dot product needs to eat two vectors, not a vector and a covector.
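For instance (numpy sketch, made-up numbers): a (0,2) tensor whose component matrix is not symmetric is a perfectly fine bilinear form, but it is not a dot product:
[code]
import numpy as np

g = np.array([[1.0, 2.0],
              [0.0, 1.0]])  # components of a (0,2) tensor

u = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])

print(u @ g @ w, w @ g @ u)  # 2.0 0.0 -- g(u,w) != g(w,u), so not a dot product
[/code]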

>> No.12111883

>>12111799
OK, that makes sense. But the dual space V* of covectors is really a space of one-forms that eat a vector from V and return a scalar, right? But how do they do that? Don't they do that via the dot product of themselves with v from V? Because that linear functional is actually <v*,.> and that . is a placeholder to take v from V and apply the dot product with it, right? So it sounds like it is really a dot product of a row vector from V* with a column vector from V. Or is that not a true dot product?

>> No.12111900

>>12111883
No, <v,.> would be the element of V*

>> No.12111911

>>12111883
no, evaluation of a covector (a 1-form) on a vector is not a dot product. some people might call it that, or they might write the pairing in the brackets <,> to indicate some similarity, but it's not a dot product, it just isn't. in R^n the formula for the dot product is "transpose one vector and do matrix multiplication", but that doesn't mean "product of a row with a column = dot product". it's ok to think about it like that at first, but if you attempt to understand things seriously, you need to get this out of your head.

>> No.12111935

>>12111883
>>12111911
also the thing is that in [math]\mathbb{R}^n[/math] you can easily switch between vectors and covectors, simply by transposition. so formulas like [math]x^T y[/math] can mean both dot product of [math]x,y \in \mathbb{R}^n[/math] or evaluation of the 1-form [math]x^T \in \mathbb{R}^{n*}[/math] on [math]y \in \mathbb{R^n}[/math]. but once you move into some more abstract vector space (such as tangent space to some manifold), it's super important to keep the distinction between vectors and covectors.

>> No.12111980

>>12111900
Ah that's right, v* as an element of V* is actually <v,.>

>>12111935
And so v in <v,.> is actually a transpose of v from V (for R^n).

>> No.12111993

>>12111980
Yes. In general, if you have a vector space [math]V[/math] equipped with a dot product, then every vector [math]v \in V[/math] determines a covector [math]\langle v,- \rangle \in V^*[/math]. actually this is an isomorphism of vector spaces [math]V \cong V^*[/math]. so that's why you can switch between vectors and covectors e.g. in [math]\mathbb{R^n}[/math]. if you follow the convention that [math]\mathbb{R}^n[/math] are columns and [math]\mathbb{R}^{n*}[/math] are rows, then the isomorphism is exactly transposition.
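In numpy terms (a sketch, using the standard dot product on R^n): the covector determined by v is just "v used as a row":
[code]
import numpy as np

v = np.array([1.0, 2.0, 3.0])

# the covector <v, -> as a function; with column conventions this is w -> v^T w
cov = lambda w: v @ w

w = np.array([4.0, 5.0, 6.0])
print(cov(w))        # 32.0
print(np.dot(v, w))  # 32.0 -- same number
[/code]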

>> No.12112470

>>12111993
thanks anon, really appreciate your input.

>> No.12112679

>>12107122
>which is a matrix
kys engineer fag