Numerical data are frequently organized as d-dimensional arrays, also called tensors. However, only small values of d are feasible if such data are to be stored in computer memory directly. In the case of many dimensions, special representation formats are crucial, and it is natural to turn to the so-called tensor decompositions. In the recent decade, the known tensor decompositions have been considerably revisited, and two of them have emerged and are now recognized as the most adequate and useful tools for numerical analysis: the Tensor-Train and Hierarchical-Tucker decompositions. Both are intrinsically related to low-rank matrices associated with a given tensor. In the talk, we expound the role of low-rank matrices in the construction of efficient numerical algorithms and consider possible developments of the idea of cross approximation, which proved to be very fruitful for matrices and has since been successfully extended to tensors. A nice property of the approach is that the approximation is constructed using only a small portion of the data. The idea of cross approximation is substantiated by the maximal-volume concept for low-rank approximation of matrices and is related to the classic problem of choosing a ``good'' basis from a given set of vectors. We discuss possible advantages of using ``good'' frames and what they may offer for better work with tensors.
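To make the cross-approximation idea concrete, here is a minimal illustrative sketch (not the speaker's implementation): a rank-r matrix is reconstructed from r of its rows and columns as A ≈ C U⁻¹ R, where U is the submatrix at their intersections. The function name `cross_approximation` and the greedy full-pivot selection are assumptions for illustration; practical cross algorithms inspect only a small sample of rows and columns rather than scanning the whole residual as done here for clarity.

```python
import numpy as np

def cross_approximation(A, rank):
    """Greedy cross (skeleton) approximation A ~ C @ inv(U) @ R,
    built from `rank` rows and columns chosen by residual pivoting.
    Illustrative only: the pivot search below scans the full residual,
    whereas practical cross methods touch only a small part of A."""
    residual = np.array(A, dtype=float, copy=True)
    rows, cols = [], []
    for _ in range(rank):
        # pick the entry of largest modulus in the current residual (the "cross" pivot)
        i, j = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        if abs(residual[i, j]) < 1e-14:
            break  # numerical rank reached
        rows.append(i)
        cols.append(j)
        # subtract the rank-1 cross through (i, j) from the residual
        residual -= np.outer(residual[:, j], residual[i, :]) / residual[i, j]
    C = A[:, cols]                 # selected columns
    U = A[np.ix_(rows, cols)]      # intersection submatrix
    R = A[rows, :]                 # selected rows
    return C @ np.linalg.solve(U, R)

# Example: a rank-3 matrix is recovered exactly (up to roundoff) from 3 crosses
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 3)) @ rng.standard_normal((3, 40))
A_approx = cross_approximation(A, 3)
```

For an exactly rank-r matrix with a nonsingular intersection submatrix U, the skeleton decomposition C U⁻¹ R reproduces the matrix exactly; the maximal-volume concept mentioned in the abstract gives a principled way to choose rows and columns so that U is well conditioned and the approximation error is nearly optimal.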