Numerical data are frequently organized as d-dimensional arrays, also called tensors. However, storing such an array explicitly is feasible only for small values of d, since the number of entries grows exponentially with the dimension and computer memory is limited. In the high-dimensional case, special representation formats become crucial, namely the so-called tensor decompositions. Recently, the known tensor decompositions have been considerably revisited, and two of them, previously used only in theoretical physics, are now recognized as the most adequate and useful tools for numerical analysis: the Tensor-Train and Hierarchical-Tucker decompositions. Both are intrinsically related to low-rank matrices associated with a given tensor. We present these decompositions and the role of low-rank matrices in the construction of efficient numerical algorithms.
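As an illustration of the idea, the Tensor-Train format can be computed by a sequence of truncated SVDs applied to matrix unfoldings of the tensor, so that each core captures one dimension and the TT-ranks are exactly the ranks of those low-rank unfolding matrices. The following is a minimal sketch of this scheme in NumPy (the function names `tt_svd` and `tt_to_full` and the tolerance parameter `eps` are illustrative choices, not part of the original text):

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a d-dimensional array into Tensor-Train cores
    via sequential truncated SVDs of its matrix unfoldings."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    # Start with the first unfolding: rows indexed by mode 0,
    # columns by all remaining modes.
    mat = tensor.reshape(r_prev * shape[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        # Keep singular values above a relative threshold;
        # r is the TT-rank of this unfolding.
        r = max(1, int(np.sum(s > eps * s[0])))
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        # Fold the remaining factor and expose the next mode as rows.
        mat = (s[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into the full tensor."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([-1], [0]))
    # Drop the dummy boundary ranks of size 1.
    return res.squeeze(axis=(0, -1))
```

With a negligible truncation tolerance the reconstruction is exact up to rounding, while a larger `eps` trades accuracy for smaller TT-ranks and thus less storage.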