Most non-mathematicians might guess, without any formal proof, that tensor operations are pretty hard, but new results prove that many of them are NP-hard - which is not good news if you are trying to work something out using them.
A tensor is the n-dimensional generalization of a matrix. A matrix is a 2D tensor in that it has rows and columns. A 3D tensor is a cube of numbers with rows, columns and slices. In a matrix the numbers are indexed by two variables, e.g. something like a_{ij}; in a tensor the number of indexing variables can be more than two, e.g. a_{ijk} for a tensor of dimension - or, more accurately, rank - 3.
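The indexing idea is easy to see in code. Here is a minimal NumPy sketch (the array values are arbitrary, chosen just to illustrate the shapes):

```python
import numpy as np

# A matrix is a rank-2 tensor: entries indexed by two variables (row, column).
A = np.arange(6).reshape(2, 3)       # a_{ij}, shape (2, 3)

# A rank-3 tensor adds a third index: a cube of numbers with "slices".
T = np.arange(24).reshape(2, 3, 4)   # a_{ijk}, shape (2, 3, 4)

print(A[1, 2])           # entry a_{12} (0-based), i.e. 5
print(T[1, 2, 3])        # entry a_{123}, i.e. 23
print(T[:, :, 0].shape)  # one 2x3 slice of the cube: (2, 3)
```

Nothing deep here - a rank-3 tensor is just an array that needs one more subscript than a matrix.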
We use matrix algebra and matrix numerical methods in almost every numerical program - they are the foundation of just about all the number crunching we do. We solve linear systems of equations, find eigenvalues, perform singular value decompositions and so on. Some programmers might regard these as difficult math, but for the majority of these tasks we have polynomial-time algorithms. That is, vector and matrix computations are in P.
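As a quick illustration of how routine these polynomial-time tasks are, all three show up as one-liners in NumPy (the random matrix here is just for demonstration; each call runs in roughly O(n^3) time):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
b = rng.standard_normal(4)

x = np.linalg.solve(A, b)        # solve the linear system Ax = b
eigvals = np.linalg.eigvals(A)   # eigenvalues of A
U, s, Vt = np.linalg.svd(A)      # singular value decomposition

print(np.allclose(A @ x, b))                # True: x really solves the system
print(np.allclose(U @ np.diag(s) @ Vt, A))  # True: the SVD reconstructs A
```

Each of these is the matrix (rank-2) version of a problem that, as described below, becomes NP-hard for rank-3 tensors.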
Tensor algebra generalizes the linear algebra we are all familiar with to higher dimensions - multilinear algebra. It turns up in advanced geometry, the theory of gravity - the general theory of relativity, that is - numerical mechanics, and so on. It is also used in AI, computer vision and a number of cutting-edge approaches to all sorts of topics. The good thing about tensor algebra is that, from an algorithmic point of view, it isn't really anything new - in most cases just a few more indices to cope with.
Surely the algorithms we need to work with, say, rank-3 tensors are no more difficult than those for rank-2 tensors, i.e. matrices? It has to be in P - right?
Well no.
According to a recent paper, things are much worse for rank-3 tensors. They seem to mark a dividing line between the linear, convex problems that are tractable and the more difficult nonlinear, nonconvex class. The paper shows that a whole list of generalizations of tractable matrix problems are NP-hard in their rank-3 formulations. These include finding eigenvalues, finding approximate eigenvalues, symmetric eigenvalues, singular values, proving positive definiteness, approximating a tensor by rank-1 tensors and so on. There is even a mention of a generalized question about finding a matrix function which is shown to be undecidable for any tensor 20x20x2 or greater. This is a big surprise, because the equivalent matrix function can be found easily - and it is another hint that throwing just one extra index into a matrix problem makes it very much more difficult. As the title of the paper puts it - Most Tensor Problems Are NP-Hard.
Should we abandon tensor methods because most of them are not in P?
You need to keep in mind that these sorts of analyses are only valid asymptotically, as the size n of an n x n x n tensor gets very large. It could well be that for a range of small n we can find algorithms that get the job done in reasonable time. The fact that a problem is NP-hard doesn't mean it's impossible!
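To make that point concrete, here is one well-known heuristic for the rank-1 approximation problem mentioned above: alternating updates, sometimes called higher-order power iteration. This is only a sketch, not the paper's algorithm, and being a heuristic it carries no guarantee of finding the global optimum - but on small tensors it often does well:

```python
import numpy as np

def rank1_approx(T, iters=100):
    """Heuristic best rank-1 approximation lam * (u ⊗ v ⊗ w) of a 3-tensor.

    Alternately update each factor with the other two held fixed,
    normalizing as we go. No optimality guarantee - the exact problem
    is NP-hard - but it often converges to a good answer for small T.
    """
    n1, n2, n3 = T.shape
    u, v, w = np.ones(n1), np.ones(n2), np.ones(n3)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)  # best scale for fixed u, v, w
    return lam, u, v, w

# Easy sanity check: an exactly rank-1 tensor should be recovered.
a, b, c = np.array([1.0, 2.0]), np.array([3.0, 1.0]), np.array([1.0, 1.0])
T = 2.0 * np.einsum('i,j,k->ijk', a, b, c)
lam, u, v, w = rank1_approx(T)
approx = lam * np.einsum('i,j,k->ijk', u, v, w)
print(np.allclose(approx, T))  # True for this rank-1 input
```

On genuinely hard instances such a method can stall in a poor local optimum, which is exactly what the NP-hardness result leads you to expect.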
The paper concludes with some open questions and, if you are reasonably good at math, you should find it an interesting read.