The rapid development of deep ANNs has made the number of hyperparameters grow constantly. As a consequence, various characteristics of ANNs, such as inference time, efficiency of resource utilization, loss, and even training time, are strongly affected. In general, hyperparameter tuning methods are used either to adapt well-known ANN models to new tasks, or to tasks in related areas, without pre-training, or to synthesize new architectures for specific problems. In this article we compare different hyperparameter tuning approaches such as CoDeepNEAT, Naive Evolution, Tree-structured Parzen Estimators, and structured annealing with MorphNet post-tuning. We apply these methods to particular network architectures for image processing and HRM signal estimation. Adapting this technology to large networks requires substantial computational resources, so parallel implementations are necessary; this can be achieved by utilizing HPC systems with hybrid computational nodes. We also propose a new tool based on Microsoft NNI, which is used for comparing tuners, analyzing their convergence, and running different tuners in parallel on cluster nodes.
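To make the tuner-comparison setting concrete, the sketch below contrasts a random-search baseline with an annealing-style local search on a toy loss surface. It is only an illustration of the comparison methodology, not the article's actual tool: the objective function, hyperparameter ranges, and both tuner implementations are invented for this example and do not use the NNI API.

```python
import math
import random

def objective(params):
    # Toy loss surface standing in for a trained network's validation loss;
    # minimum at lr = 1e-2, width = 64 (values chosen for illustration only).
    return (math.log10(params["lr"]) + 2.0) ** 2 \
        + (params["width"] - 64) ** 2 / 1000.0

def random_tuner(n_trials, rng):
    # Baseline tuner: sample hyperparameters uniformly at random.
    best = float("inf")
    for _ in range(n_trials):
        params = {"lr": 10 ** rng.uniform(-5, -1),
                  "width": rng.randint(16, 256)}
        best = min(best, objective(params))
    return best

def anneal_tuner(n_trials, rng):
    # Annealing-style local search: perturb the current best configuration
    # with a step size that shrinks as the trial budget is spent.
    cur = {"lr": 1e-3, "width": 128}
    best = objective(cur)
    for t in range(n_trials):
        scale = 1.0 - t / n_trials  # annealing schedule for the step size
        step = int(64 * scale) + 1
        cand = {
            "lr": min(1e-1, max(1e-5,
                                cur["lr"] * 10 ** rng.uniform(-scale, scale))),
            "width": min(256, max(16,
                                  cur["width"] + rng.randint(-step, step))),
        }
        loss = objective(cand)
        if loss < best:  # greedy acceptance: keep only improvements
            best, cur = loss, cand
    return best

rng = random.Random(0)
print("random search best loss:", random_tuner(200, rng))
print("annealing   best loss:", anneal_tuner(200, rng))
```

A real comparison along these lines would replace `objective` with an actual training-and-evaluation run and collect the per-trial loss curves, which is what convergence analysis of the tuners is based on.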