
This isn't quite right, but it's a common misconception. Any rank-2 tensor can be represented in matrix form, but not all matrices are tensors; similarly for rank-1 tensors and vectors.

The distinction is important because thinking about it the way you have presented it leads to confusion about what tensors are...



Sorry, this is just wrong. Maybe you're trying to get at the co-/contravariant properties of tensors, in which case your statement can be put more clearly as, e.g., the spaces of rank (2,0) and rank (1,1) tensors admit different interpretations as internal hom spaces of vector spaces. But under any interpretation of your statement the distinction is never important, because all of the spaces it distinguishes are isomorphic via canonical isomorphisms.
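As a sketch of the hom-space identifications being alluded to here (for a finite-dimensional vector space V; the notation is mine, not the parent's):

```latex
V \otimes V \;\cong\; \operatorname{Hom}(V^{*}, V)
\qquad\text{and}\qquad
V \otimes V^{*} \;\cong\; \operatorname{Hom}(V, V)
```

Both identifications are canonical; identifying the two tensor spaces with each other additionally requires a choice of basis, or an inner product via the index-raising isomorphism between V and V*.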


You might want to think that through a bit more. Leaving aside the issue of needing the underlying vector space, are you sure you are comfortable with the statement that all matrices are tensors?


Yes, absolutely. Given a matrix as an array of numbers, there are a number of natural ways to interpret it as a tensor. What are you uncomfortable about?


I was just explaining the meaning of "higher-order" within the context of this library. But thanks; now I have a clearer picture of what to include in the main README file and the tutorial notebooks, so as to avoid misleading anyone.


Can you explain this distinction in more detail?


Sure! There are several different ways to define them, but fundamentally tensors are geometric objects that define linear relationships with respect to particular vector spaces. They are often defined somewhat loosely by their behaviour under transformations.

In context we often refer to scalars, vectors, and matrices as order/rank 0, 1, and 2 tensors (higher-order tensors don't have the same sort of common shorthand). This works fine when you have the context of the underlying vector space and understand the "rules". Physicists do this a lot, and they often love shortcuts :)
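The transformation behaviour mentioned above can be sketched with NumPy (a toy example; the particular matrices are illustrative): the components of a (1,1)-tensor change as A⁻¹TA under a change of basis A, so the array of numbers changes while basis-independent data, such as the trace, does not.

```python
import numpy as np

# Components of a (1,1)-tensor (a linear map) in the standard basis.
T = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# An invertible change-of-basis matrix (columns are the new basis vectors).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Components of the same tensor in the new basis: T' = A^{-1} T A.
T_new = np.linalg.inv(A) @ T @ A

# The component arrays differ...
print(np.allclose(T, T_new))          # False
# ...but basis-independent data, such as the trace, agrees.
print(np.trace(T), np.trace(T_new))   # both 5.0
```

This is the sense in which a tensor is more than its array of components: the array depends on the chosen basis, the tensor does not.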

However, there is a growing use/abuse of the terminology (see machine learning) to mean just n-dimensional arrays. The analogy is that a matrix is the 2-D version, but you can have 3-D, 4-D, etc. While it's true that an NxN matrix can represent a tensor (given the context, as above), that misses most of the structure... as such, it's an unfortunate use of the name.
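The machine-learning usage can be illustrated in a few lines (a sketch; the array contents are arbitrary): an n-dimensional array supports contractions that generalise matrix multiplication, which is where the borrowed name comes from, but nothing in the object records a vector space or a basis.

```python
import numpy as np

# In ML libraries a "tensor" is just an n-dimensional array of numbers:
X = np.arange(24.0).reshape(2, 3, 4)   # a "3rd-order tensor" in ML parlance
print(X.ndim)   # 3

# Contractions generalise matrix multiplication, e.g. summing the last
# axis against a vector:
v = np.ones(4)
Y = np.einsum('ijk,k->ij', X, v)
print(Y.shape)  # (2, 3)

# Nothing here records a vector space, a basis, or a transformation law,
# so the geometric structure of a tensor is absent.
```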



