https://mbns.bruker.com/acton/attachment/9063/f-ab007725-a93b-45be-a660-1886932ec702/1/-/-/-/-/Rev-A0_Nanobruecken%202021%20Program-BRUKER.pdf

Modern equipment makes it possible to perform in-situ indentation tests under simultaneous observation by SEM or (S)TEM, yet little free software is available for processing the resulting graphical data. pyNIDA is open-source (GPL-3), Python-based software that covers the full cycle of in-situ nanoindentation data processing.

The first part performs digital tracking of the object displacements observed in the microscope, using Digital Image Correlation functions from OpenCV. Displacement of the system as a whole can also be taken into account by tracking the substrate. This part produces a csv file with the observed object displacement vs. time.

The second part compares the indenter displacement data with the results of the visual observations. It calculates a sample drift function, accounting for zero-point differences and scale factors, and outputs the drift function together with a csv file of drift-corrected load and displacement data vs. time.

The third part analyzes the load-displacement data with one of the proposed models: Hooke's law, or the Hertzian model for a sphere or cylinder.

The fourth part analyzes the evolution of the particle shape over time. Two different approaches are in use: contour analysis with OpenCV, and random-walker segmentation with skimage. So far this part has been properly tested for spherical particles only; it is still under development.

The code is freely available at https://github.com/LebedevV/pynida. Your feedback is very important to us, so feel free to send comments, opinions, and patches. All test data were collected with a MEMS-based Hysitron PI-95 holder in a Zeiss Libra 200MC TEM.
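The displacement-tracking step described above can be illustrated with a minimal sketch. This is not the pyNIDA API: the function name and signature are hypothetical, and OpenCV template matching is used here as a simplified stand-in for a full DIC workflow, assuming the video has already been decoded into grayscale frames.

```python
# Hypothetical sketch of frame-by-frame object tracking, in the spirit of
# pyNIDA's first part. Names are illustrative, not the actual pyNIDA API;
# template matching stands in for a full DIC pipeline.
import numpy as np
import cv2

def track_displacement(frames, roi):
    """Track a region of interest across frames.

    frames : list of 2-D uint8 arrays (e.g. frames of an in-situ TEM video)
    roi    : (x, y, w, h) of the tracked object in the first frame
    Returns an (n_frames, 2) array of (x, y) positions.
    """
    x0, y0, w, h = roi
    template = frames[0][y0:y0 + h, x0:x0 + w]
    positions = []
    for frame in frames:
        # Normalized cross-correlation of the template against each frame
        res = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(res)
        positions.append(max_loc)  # top-left corner of the best match
    return np.asarray(positions)
```

Running the same routine on a second region of interest placed on the substrate and subtracting the two trajectories would correct for the rigid displacement of the whole system, as the abstract describes.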
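The model-fitting step for load-displacement data can likewise be sketched. The code below is an assumption-laden illustration, not the pyNIDA implementation: function names and the default tip radius are invented for the example, and the standard Hertzian contact expression for a sphere, P = (4/3) E_r sqrt(R) h^(3/2), is fitted with scipy.

```python
# Hypothetical sketch of fitting drift-corrected load-displacement data with
# the Hertzian sphere model (cf. the third part of the pipeline). Names and
# defaults are illustrative, not the pyNIDA API.
import numpy as np
from scipy.optimize import curve_fit

def hertz_sphere(h, E_r, R):
    """Hertzian load on a sphere of radius R: P = (4/3) E_r sqrt(R) h^(3/2)."""
    return (4.0 / 3.0) * E_r * np.sqrt(R) * h**1.5

def fit_reduced_modulus(h, P, R=50e-9):
    """Fit the reduced modulus E_r [Pa] from displacement h [m] and load P [N].

    R is the indenter/particle radius; 50 nm here is an arbitrary example value.
    """
    popt, _ = curve_fit(lambda x, E: hertz_sphere(x, E, R), h, P, p0=[1e9])
    return popt[0]
```

For elastic loading of a slender sample, the same fitting scaffold applies with Hooke's law (P proportional to h) in place of the Hertzian expression.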
This work was partially supported by the Russian Science Foundation (Grant 19-13-00151).