Many of us do not move much when we focus on solving intellectual problems. However, when we use computers to enhance our intellectual power, our fingers have to make multiple precise movements to send our commands through physical interfaces such as a keyboard or a mouse. Although this activity is highly automated, it still requires brain activity very different from that needed for solving the problem, and it may distract us from the main focus. Could we become more effective at mental problem solving if interaction between our brain and computers were possible without such mediation by motor activity? This question certainly deserves experimental clarification. It could be addressed in experiments with invasive brain-computer interfaces (BCIs), but these are so far too risky for extensive experimentation. Non-invasive BCIs, at least at the current state of technology, are too slow and imprecise to increase the fluency of interaction on their own. However, passive BCIs (Zander and Kothe, 2011) may provide machines with the information needed to respond better to the user's needs, and we may try to combine them with other interaction technologies to create new modes of communication.

Gaze interaction is based on the activity of tiny eye muscles and is often closely related to head movements, which involve more massive muscle work. Both gaze activity and the related head movements rarely require any effort and are mostly unconscious. Moreover, they are inherent to our use of graphical user interfaces (GUIs): gaze often reaches screen buttons and menu items even earlier than the mouse cursor does. Probably the only serious obstacle preventing gaze from becoming an almost ideal means of fluent interaction is that gaze behavior is essentially the same whether gaze is used for interaction or for spontaneous or intentional exploration of the visual scene.
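The gaze dwells discussed below can be operationalized simply: a selection candidate is registered when gaze stays within a small screen region for a minimum duration. The following is a minimal sketch of such a dwell detector; the radius and duration thresholds are purely illustrative assumptions, not values used in the systems described here.

```python
import math

def detect_dwells(samples, radius_px=40.0, min_dur_s=0.5):
    """Detect gaze dwells in a time-ordered list of (t, x, y) samples.

    A dwell is reported when consecutive samples stay within `radius_px`
    of the dwell's first sample for at least `min_dur_s` seconds.
    Thresholds are illustrative; real systems calibrate them per tracker.
    """
    dwells = []
    anchor = None  # (t, x, y) of the current candidate dwell
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) <= radius_px:
            # Report each dwell only once, when its duration first exceeds the threshold.
            if t - t0 >= min_dur_s and (not dwells or dwells[-1][0] != t0):
                dwells.append((t0, x0, y0))
        else:
            anchor = (t, x, y)  # gaze moved away: restart the candidate
    return dwells

# Example: ~0.65 s of fixation at (100, 100), then a brief glance at (400, 100).
gaze = [(i / 60, 100.0, 100.0) for i in range(40)] + \
       [(0.7 + i / 60, 400.0, 100.0) for i in range(10)]
```

With this input, only the first fixation lasts long enough to count as a dwell; the second segment is too short. The key design point, relevant to the discussion below, is that the detector alone cannot tell an intentional selection dwell from a spontaneous one.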
We may assume, however, that patterns of brain activity differ substantially between these cases, and therefore could be differentiated by passive BCIs. The basic approach of our group at the Kurchatov Institute to building an interface for presumably fluent human-machine interaction is straightforward: we analyze the brain signals recorded during the use of gaze interaction technology and apply the results of this analysis to brain-signal-based detection of gaze behaviors related to the user's intentions. We demonstrated that in certain cases the gaze dwells used for interaction and spontaneous gaze dwells can be differentiated based on statistical features of brain activity likely related to expectation of the system's response to the dwell (Shishkin et al., 2016), in line with the earlier proposal by Zander's group (Ihme and Zander, 2011; Protzak et al., 2013). Interestingly, the use of an expectation-related electroencephalogram (EEG) component for "the direct cerebral control of machines" was proposed by Grey Walter as early as half a century ago (Walter, 1966); this component is part of the Contingent Negative Variation (CNV), which he had discovered earlier. However, expectation of the machine's action implies a delay between the intention and the action, and it seems unlikely that this delay can be made much shorter than about half a second while keeping the related brain activity detectable. Such a delay seems too long for fluent interaction. Grey Walter (1966) also proposed to use, as a separate control channel, the "intention wave", an earlier component of the CNV, but it appeared to be linked to motor preparation and therefore cannot help in non-motor interaction design.
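The dwell-differentiation idea can be illustrated with a toy sketch: a dwell-locked EEG epoch is reduced to one statistical feature (mean amplitude in a late time window, where an expectation-related negativity would appear), and a threshold separates intentional from spontaneous dwells. The window boundaries, threshold, and synthetic data below are illustrative assumptions only, not the actual features or classifier of Shishkin et al. (2016).

```python
import random

random.seed(0)

def mean_amplitude(epoch, start, end):
    """Mean amplitude (in µV) over a sample window of a dwell-locked epoch."""
    window = epoch[start:end]
    return sum(window) / len(window)

def classify_dwell(epoch, start=100, end=250, threshold=-1.0):
    """Toy single-feature classifier (illustrative, not the published method):
    intentional dwells are assumed to show an expectation-related negativity,
    i.e. a more negative mean amplitude in the late window before feedback."""
    label = "intentional" if mean_amplitude(epoch, start, end) < threshold else "spontaneous"
    return label

# Synthetic epochs: 250 samples (~500 ms at a hypothetical 500 Hz), values in µV.
noise = lambda: random.gauss(0.0, 0.5)
intentional = [noise() for _ in range(100)] + [noise() - 3.0 for _ in range(150)]
spontaneous = [noise() for _ in range(250)]
```

In practice the published work used richer statistical features and machine-learning classification rather than a fixed threshold; the sketch only shows where in the processing chain the expectation-related negativity would enter.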
A solution may come from the analysis of the typical temporal organization of actions, which can be considered as grouped into segments, or chunks, and from understanding gaze-based human-machine interaction as communication (Fedorova et al., 2015). In this perspective, a delay in the machine's response would probably not undermine the fluency of interaction if it occurs at the beginning of the interaction sequence, e.g., at the first gaze dwell. The machine can then recognize the rest of the sequence as commands without needing to collect data for BCI decisions, so that much shorter gaze dwells can be used (Shishkin et al., 2017). Our group is currently developing an online version of an Eye-Brain-Computer Interface (EBCI) that implements components of this methodology (Nuzhdin et al., 2017). If this system indeed proves to increase the fluency of interaction at least in some use scenarios, it could be applied to a number of tasks, especially creative ones, and could also help patients who have motor disabilities but whose gaze control is preserved. In addition, a human-machine system based on interaction with increased fluency may have emergent qualities of interest not only for solving practical problems but also for basic science.

References

Fedorova A.A., Shishkin S.L., Nuzhdin Y.O., Velichkovsky B.M. (2015) Gaze based robot control: The communicative approach. 7th Int. IEEE/EMBS Conf. NER, 751–754.

Ihme K., Zander T.O. (2011) What you expect is what you get? Potential use of contingent negative variation for passive BCI systems in gaze-based HCI. Int. Conf. ACII, 447–456.

Nuzhdin Y.O., Shishkin S.L., Fedorova A.A., Kozyrskiy B.L., Medyntsev A.A., Svirin E.P., Korsun O.V., Dubynin I.A., Trofimov A.G., Velichkovsky B.M. (2017) Passive detection of feedback expectation: Towards fluent hybrid eye-brain-computer interfaces. 7th Graz BCI Conf., 361–366.

Protzak J., Ihme K., Zander T.O. (2013) A passive brain-computer interface for supporting gaze-based human-machine interaction. Int. Conf. UAHCI, 662–671.

Shishkin S.L., Nuzhdin Y.O., Svirin E.P., Trofimov A.G., Fedorova A.A., Kozyrskiy B.L., Velichkovsky B.M. (2016) EEG negativity in fixations used for gaze-based control: Toward converting intentions into actions with an Eye-Brain-Computer Interface. Front. Neurosci. 10:528.

Shishkin S.L., Zhao D.G., Isachenko A.V., Velichkovsky B.M. (2017) Gaze-and-brain-controlled interfaces for human-computer and human-robot interaction. Psychology in Russia: State of the Art, 10, 120–137.

Walter W.G. (1966) Expectancy waves and intention waves in the human brain and their application to the direct cerebral control of machines. Electroenceph. Clin. Neurophysiol. 21, 616.

Zander T.O., Kothe C. (2011) Towards passive brain–computer interfaces: applying brain–computer interface technology to human–machine systems in general. J. Neural Eng. 8:025005.