Abstract
This paper describes a process for evaluating an assistive technology that uses eye and head movements for human-computer interaction (HCI). Data were collected with the Glasses Mouse Interface (IOM - Interface Óculos Mouse), a device under development at the Sul-rio-grandense Federal Institute of Education, Science and Technology (IFSul), which was evaluated according to principles of user experience and usability testing.
Keywords
- Accessibility
- Assistive technologies
- Computer human interaction
- Glasses mouse
- Assistive technologies evaluation
1 Introduction
People with disabilities generally do not have the same access to health care, education, and employment opportunities. Very often they do not receive the support they need and end up excluded from everyday activities; overall statistics on people with disabilities show that the social inclusion of minorities is not a simple task. One of these everyday activities is the use of information technology. Although it can be considered a basic activity, for people affected by a motor disability the search for technological means of accessing a computer becomes a necessity, especially in educational institutions.
The problem arises when access to technological resources is denied due to high cost. Several initiatives, in many areas of knowledge, try to enable the full use of computer tools by people with motor disabilities through what are called assistive technologies (ATs), which provide accessibility and inclusion for users with those needs.
Many projects involving different areas of knowledge are already developing new human-computer interaction (HCI) applications in assistive technology.
One such project is the Glasses Mouse Interface (IOM) [19], created at the Sul-rio-grandense Federal Institute in Pelotas (RS), Brazil. The device aims to allow people with motor paralysis (without compromised intellectual capacity) to use the computer, including Information and Communication Technologies (ICTs), by controlling mouse movement through head movement and triggering mouse clicks by blinking. The project's features are still under development. On the other hand, it is already patented, with prototypes built and tested for later industrialization, which will enable a device that is affordable compared to many existing devices with a similar purpose.
To validate the development of the IOM project, a systematic mapping of assistive technologies with the same purpose is required, mainly those that enable users with motor disabilities to manipulate graphical computer interfaces. The purpose of this mapping is to understand the state of the art in this scenario and to confirm that the research is moving in the right direction.
User eXperience (UX) and usability evaluation tests were also carried out; the analysis of their results is intended to provide input on how this device interacts with these interfaces, describing difficulties and opportunities for improvement of the IOM project.
2 Glasses Mouse Interface (IOM - Interface Óculos Mouse)
The IOM system, represented in Fig. 1, consists of a pair of glasses with two sensors (a gyroscope and an accelerometer) that allow people to use head and eye movements to control computer tasks. The IOM enables an interaction style that is an alternative to mouse and keyboard controls; hence, the solution can be very useful for performing hands-free control tasks.
The device is a pair of glasses that may or may not contain ocular lenses, according to the user's needs. It carries two types of sensors, responsible for capturing voluntary eye blinks and the position and inclination of the head. Depending on the user's motor disability, software control routines, such as the calibration of the inertial position and of the cursor speed, may be adjusted by the user through the computer interface to increase comfort and performance.
Besides making hands-free interaction with the computer possible, another goal of the IOM project is to develop a low-cost, lightweight, and comfortable assistive technology. For this purpose we use prototypes at a preliminary design stage to evaluate their performance against these requirements.
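The mapping from head motion to cursor motion described above can be sketched as follows. This is a minimal illustration, not the IOM's actual firmware: the dead-zone and speed-gain values are hypothetical stand-ins for the per-user calibration parameters (inertial position and cursor speed) that the text says the device exposes.

```python
import math

def head_to_cursor_delta(yaw_rate, pitch_rate, speed_gain=8.0, dead_zone=2.0):
    """Map head angular velocity (degrees/s, e.g. from a gyroscope) to a
    cursor displacement in pixels per update.

    A small dead zone suppresses involuntary head tremor; the speed gain
    plays the role of the per-user cursor-speed calibration. All units
    and constants here are illustrative assumptions.
    """
    def scale(rate):
        if abs(rate) < dead_zone:
            return 0.0  # inside the dead zone: treat as tremor, no movement
        # scale the motion beyond the dead zone, preserving direction
        return math.copysign((abs(rate) - dead_zone) * speed_gain, rate)
    return scale(yaw_rate), scale(pitch_rate)
```

Raising `speed_gain` makes the cursor faster but harder to aim, which is exactly the speed/accuracy trade-off the evaluation in Sect. 5 reports.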
3 Related Works
Several projects in the HCI area are under development, or have already been developed, to facilitate and enable computer access for users with disabilities, as shown in Table 1.
Some examples of these devices are switches, joysticks, and pointing devices activated by body movements; on-screen virtual keyboard software; speech recognition systems; eye-movement control based on computer vision; devices that track head movements; and more sophisticated devices that employ the electrical potential of the brain through EEG (electroencephalography) signals, signals derived from eye movements through EOG (electrooculography) [4], or contraction signals of voluntary muscles through EMG (electromyography) [12], up to brain-computer interfaces (BCIs) that combine several brain sensors [3].
The detection of head movements can be done through video cameras (video-based or camera-based approaches) or by a device attached to the head with sensors that capture these movements (head-based or head-tracking approaches). Another motion-capture method is face monitoring, which detects the region of the nose and mouth to control the mouse pointer. Alternatively, eye tracking (gaze-based tracking) positions the mouse cursor at the estimated position of the user's gaze.
Eye-tracking interfaces have even outperformed the traditional (hand-operated) mouse in terms of speed; however, the accuracy of current eye trackers is not sufficient for satisfactory real-time pointing [6]. In [6], a technique combining more than one movement to control the cursor, head movement and eye tracking, was proposed, called HMAGIC (Head Movement And Gaze Input Cascaded Pointing). The mouse pointer is positioned by the direction of gaze, and head movements are used by the user for fine-tuning. The idea is to combine the speed of eye movements with the precision of head movements.
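The cascaded idea can be illustrated with a short sketch: a large gaze jump warps the cursor near the target (fast but coarse), and head motion then nudges it (slow but precise). This is only a schematic reading of the technique in [6]; the warp threshold and units are assumptions, not values from the cited work.

```python
def hmagic_update(cursor, gaze, head_delta, warp_threshold=100.0):
    """One update step of a cascaded gaze-then-head pointer.

    cursor, gaze: (x, y) positions in pixels; head_delta: (dx, dy) from
    head movement. If the gaze estimate is far from the cursor, warp to
    it (coarse stage); head movement always refines (fine stage).
    """
    cx, cy = cursor
    gx, gy = gaze
    # coarse stage: a big gaze jump teleports the cursor to the gaze point
    if ((gx - cx) ** 2 + (gy - cy) ** 2) ** 0.5 > warp_threshold:
        cx, cy = gx, gy
    # fine stage: head movement adjusts the final position
    dx, dy = head_delta
    return (cx + dx, cy + dy)
```

Small gaze drift below the threshold is ignored, so eye-tracker jitter does not destroy the precision gained from the head stage.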
Other related works use the Camera Mouse software [23] to capture head movements by recognizing the face and using the user's nose as a central point. In [1], a feature was added to Camera Mouse to avoid accidental clicks: a pop-up window asks the user to confirm the click whenever it occurs within a period in which it cannot be assumed that the user really intended to click the object in focus.
An overview of these assistive-technology devices for HCI by users with motor disabilities was produced as a comparative table, using systematic mapping as a tool, as shown in Table 1.
4 Evaluation Methodology
Several tests were applied in order to quantify and qualify the IOM user experience with respect to its control of the GUI (Graphical User Interface). A qualitative approach, evaluating the device's UX and usability with task-oriented activities, was combined with quantitative data analysis for continual improvement of the IOM. Although the project focuses on the development of AT for people with motor disabilities, it was decided at this stage to apply the tests to typical users (without restricted mobility). The reason is that testing the device directly with people with motor disabilities creates an expectation of immediate use of a product that is still in its initial development cycle. These users form high expectations that can influence the evaluation of UX aspects, while typical users are completely "disengaged" and run no risk of frustration.
It was also considered that feedback from a typical user about the IOM user experience approximates, in basic terms of cognitive ergonomics and of the tool's role in computer use, the experience of a user with motor disabilities, since during the tests the only restriction is to use the IOM as the sole HCI device, without any other apparatus.

According to Badre [20], the test-conduction process follows the steps below, which were used in the evaluation: test planning, organization of material, preparation of the location, pilot test, choice of users, test conduction, and result analysis. The intended audience for the evaluation tests was composed of students of the institution, and the operating system used in the test activities was Microsoft Windows. As explained before, the intention was not to assess the IOM interface specifically with disabled users (despite the central bias of the project predicting that) but with the general public, which may yield general contributions that are useful for the use of the IOM in a specific context.

If we consider the situations in which people with motor disabilities would use the IOM, there are two common usage scenarios. The first is use at home, in different postural positions, some of which can compromise the efficiency of the usability testing and the operation of the device (e.g., a person using the computer while lying down). The other scenario, which was chosen for the test environment, is the classroom or workplace, in which the IOM could include the user in a real setting with accessibility purposes such as education and work. We opted for this second condition because of the ease of using the project's actual educational development environment.

For verification in terms of usability, a script of predefined tasks was applied, consisting of user interaction with some graphic elements arranged in the interface.
This script was explained to users in real time as the tasks were carried out; upon completing a task, the next step was revealed. For the tasks listed below, the time interval between tasks was measured, as well as the errors made (clicking in an inappropriate spot, for example), using the formula: time in seconds / wrong clicks.
List of WIMP (Windows, Icons, Menus, and Pointers) tasks:

1. Click on an icon to open the file manager;
2. Click on an icon to maximize the window;
3. Scroll to the bottom of the screen;
4. Click on an icon to open a ".doc" file;
5. Click on an icon to change the tools tab;
6. Click on an icon to close the software;
7. Click on an icon to open the file manager menu;
8. Click on an icon to close the file manager.
The listed tasks generated quantitative data: the task execution time was logged through video recording, from which it was possible to identify usage patterns and recurring errors in the use of the IOM, making its usability evident. To qualify the user experience, we applied our observations to the video recordings, in which bodily expressions and audio about the IOM's use were collected. Two questionnaires were also applied: one focused on the general perception of use, and the other based on the AttrakDiff [21] online tool, which generates a user experience chart from opposite pairs of adjectives attributed by users to the tested device.
5 Results
A total of 9 volunteers, 3 women and 6 men aged between 18 and 45 years, were recruited at IFSul - campus Pelotas to carry out the planned tests. Most of them were students and had never used any device or software based on eye movements and head pointing to control the GUI. Half of them wear glasses for vision correction, and almost all use the mouse as their primary tool for interacting with the computer. Following the established testing protocol, the typical users first performed a routine of tasks with the common mouse and then with the IOM device. The whole routine was recorded, both on the GUI and through facial and body recording of the users. They were also asked to answer a user experience questionnaire, which raised ergonomic aspects of the IOM's usage, such as possible eye fatigue generated by its use and overall comfort.
Some results from the questionnaire applied to the users:
- Most users stated that the use of the IOM caused little or no eye fatigue;
- Regarding comfort of the neck, which needs to be moved to control the pointer on the GUI through the IOM, three users felt moderate to high fatigue;
- Regarding general physical effort, the data also point to low or no fatigue for most users. Only two users reported moderate to high fatigue, which is consistent with the good comfort reported by the majority of IOM users.
In addition to the ergonomic aspects, which generate important axes of analysis, especially regarding the IOM design principles, the functional aspects of the interaction with the computer were addressed by users, who, in general, indicated their difficulties and overall experience of using the IOM. Most users indicated difficulties both in controlling the speed of the mouse pointer's movement and in its accuracy, as shown in the charts below.
This qualitative analysis of the questionnaire is confirmed by Table 2, which shows the average task completion time in the usability tests and demonstrates the higher time spent with the IOM, as well as the recurring errors in each task, relative to the use of the mouse.
In general terms, the average time to complete the tasks with the traditional mouse is around 17% faster than with the IOM. Although, comparatively, the mouse was the device that allowed greater agility (shorter task run times) and a lower error rate in the context of this research, the majority of users indicated a satisfactory user experience with the IOM on first use.
6 Discussion
The comparative Table 1 of the systematic mapping shows that several studies present the use of head and eye movements as good control techniques for HCI devices, especially in the scenario of assistive technology aimed at people with motor disabilities.
Even though capturing eye movements involves less physical motion than head movements, both require less physical effort than moving the arms or hands. However, the results of the usability evaluation with the IOM were significantly worse than with the mouse, which leads us to consider some aspects. One hypothesis is that the learning curve for users faced with a new device is steeper, requiring more usage time before comparative results between the IOM and the already widespread mouse become consistent. New tests will be required with devices that actually use this same interaction technique, in order to establish standards for better-calibrated comparisons with the device evaluated here, such as those listed in the systematic mapping. These tests must also follow more specific and refined protocols, as established by Morimoto [6], so that the tabulated data have higher accuracy; this method has already been applied in tests with other devices, such as the Camera Mouse [4], which allows a good degree of comparison.
We considered applying the Communicability Evaluation Method (CEM), or the think-aloud method, in the tests, but this did not establish a direct relationship with the aim of the testing, since our goal was not to assess semantic aspects of the graphic elements of the interface, but rather the usability and user experience of the IOM device. In terms of user experience, the tests showed that users found it somewhat difficult to evaluate this new device, because it has characteristics of familiar reference objects (glasses and mouse) used in a new context, as an HCI device. This estrangement should be regarded as normal, considering that the apparatus has no widespread counterpart, which makes it a product with innovative features for the intended audience of people with motor disabilities. Through the AttrakDiff questionnaire, how friendly and attractive the IOM is was qualified; the results show that the IOM is a desirable and self-oriented device.
7 AttrakDiff Tool
In Fig. 2, the average value of the product dimensions lies right at the threshold between a self-oriented and a desired product. From these results one can therefore perceive room for development both in terms of usability and of hedonic quality. The user is clearly stimulated by the product, but only at the threshold of this valence, and the same can be said of the perception of self-sufficiency that the product conveys.

The results of the usability tests confirm the need to improve the IOM at some specific points: the control of cursor movement speed, its accuracy, and the click itself, controlled by eye blinks, presented usability issues. Suggestions collected from users and from the group's perceptions of the IOM and of its testing compared with other devices with the same purpose (some already being implemented in software) include: control of the speed derived from head movement; implementation of a graphical interface element that improves click accuracy; and new click control modes, such as a longer blink, or a period without moving the cursor, to activate the action.

As further work, it is necessary to apply new tests with these protocols and an increasing systematization of the results of the IOM compared with other assistive technologies that use the same interaction principle, in order to produce a final product that is more pleasant and usable for users with motor disabilities.
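The long-blink click mode suggested above amounts to classifying blinks by duration: very short blinks are natural and must be ignored, while deliberately long ones trigger a click. A minimal sketch follows; the millisecond thresholds are illustrative assumptions, not values used by the IOM.

```python
def classify_blink(duration_ms, short_max=250, long_min=500):
    """Classify an eye-blink by its duration in milliseconds.

    Blinks up to short_max ms are treated as involuntary and ignored;
    blinks of at least long_min ms are interpreted as a deliberate click.
    Durations in between are discarded as ambiguous. Thresholds here
    are hypothetical and would be tuned per user in practice.
    """
    if duration_ms <= short_max:
        return "ignore"      # natural blink: no action
    if duration_ms >= long_min:
        return "click"       # deliberately long blink: trigger the click
    return "ambiguous"       # in-between: safer to do nothing
```

Leaving a gap between the two thresholds is a simple guard against the accidental clicks that the evaluation identified as a usability issue.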
References
1. Kwan, C., Paquette, I., Magee, J.J., Lee, P.Y., Betke, M.: Click control: improving mouse interaction for people with motor impairments. In: Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2011), pp. 231–232. ACM, New York (2011)
2. Martins, J.M.S., Rodrigues, J.M.F., Martins, J.A.C.: Low-cost natural interface based on head movements. Procedia Comput. Sci. 67, 312–321 (2015)
3. Hakonen, M., Piitulainen, H., Visala, A.: Current state of digital signal processing in myoelectric interfaces and related applications. Biomed. Signal Process. Control 18, 334–359 (2015)
4. Naves Jr., E., Pino, P., Losson, E., Andrade, A.: Alternative communication systems for people with severe motor disabilities: a survey. BioMed. Eng. OnLine (2011)
5. Feng, W., Chen, M., Betke, M.: Target reverse crossing: a selection method for camera-based mouse-replacement systems. In: Proceedings of the 7th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2014), May 2014
6. Kurauchi, A., Feng, W., Morimoto, C., Betke, M.: HMAGIC: head movement and gaze input cascaded pointing. In: Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2015), June 2015
7. Huo, X.: Tongue drive: a wireless tongue-operated assistive technology for people with severe disabilities, 3 November 2011. https://smartech.gatech.edu/handle/1853/45887. Accessed 2 Dec 2015
8. Ossmann, R., Thaller, D., Nussbaum, G., Pühretmair, F., Veigl, C., Weiß, C., Morales, B., Diaz, U.: AsTeRICS, a flexible assistive technology construction set. Procedia Comput. Sci. 14, 1–9 (2012)
9. Vickers, S., Istance, H., Hyrskykari, A.: Performing locomotion tasks in immersive computer games with an adapted eye-tracking interface. ACM Trans. Access. Comput. 5(1), Article 2, 33 p. (2013)
10. Su, M., Yeh, C., Hsieh, Y., Lin, S., Wang, P.: An image-based mouth switch for people with severe disabilities. Recent Pat. Comput. Sci. 5, 66–71 (2012)
11. Zhu, D., Gedeon, T., Taylor, K.: Head or gaze? Controlling remote camera for hands-busy tasks in teleoperation: a comparison. In: Proceedings of the 22nd Conference of the Computer-Human Interaction Special Interest Group of Australia on Computer-Human Interaction (OZCHI 2010), pp. 300–303. ACM, New York (2010)
12. Perez-Maldonado, C., Wexler, A., Joshi, S.: Two-dimensional cursor-to-target control from single muscle site sEMG signals. IEEE Trans. Neural Syst. Rehabil. Eng. 18, 203–209 (2010)
13. Biswas, P., Langdon, P.: Multimodal intelligent eye-gaze tracking system. Int. J. Hum.-Comput. Int. 31(4), 277–294 (2015)
14. Azmi, A., Alsabhan, N.M., AlDosari, M.S.: The wiimote with SAPI: creating an accessible low-cost, human computer interface for the physically disabled. IJCSNS Int. J. Comput. Sci. Netw. Secur. 9(12), 63–68 (2009)
15. Nguyen, V.T.: Enhancing touchless interaction with the leap motion using a haptic glove. Comput. Sci. (2014)
16. Su, M.C., et al.: Assistive systems for disabled persons and patients with Parkinson's disease. Lecture Notes on Wireless Healthcare Research, p. 105
17. Manresa-Yee, C., Varona, J., Perales, F.J., Salinas, I.: Design Recommendations for Camera-Based Head-Controlled Interfaces that Replace the Mouse for Motion-Impaired Users. Springer, Heidelberg (2013)
18. Montanini, L., Cippitelli, E., Gambi, E., Spinsante, S.: Low complexity head tracking on portable android devices for real time message composition (2015)
19. Machado, M.B., Colares, A., Quadros, C., Carvalho, F., Sampaio, A.: Óculos Mouse: mouse controlado pelos movimentos da cabeça do usuário [Glasses Mouse: mouse controlled by the user's head movements]. Brazilian Patent INPI n. PI10038213, Brazil (2010)
20. Badre, A.N.: Shaping Web Usability, 304 p. Addison-Wesley, Boston (2002)
21. Hassenzahl, M., Burmester, M., Koller, F.: AttrakDiff: Ein Fragebogen zur Messung wahrgenommener hedonischer und pragmatischer Qualität [AttrakDiff: a questionnaire for measuring perceived hedonic and pragmatic quality]. In: Mensch & Computer 2003, pp. 187–196. Vieweg+Teubner (2003)
22. World Health Organization: World Report on Disability. WHO, Geneva (2011). http://whqlibdoc.who.int/publications/2011/9789240685215_eng.pdf?ua=1. Accessed 20 July 2015
23. Betke, M., Gips, J., Fleming, P.: The camera mouse: visual tracking of body features to provide computer access for people with severe disabilities. IEEE Trans. Neural Syst. Rehabil. Eng. 10(1), 1–10 (2002)
24. Alonso-Valerdi, L.M., Salido-Ruiz, R.A., Ramirez-Mendoza, R.A.: Motor imagery based brain-computer interfaces: an emerging technology to rehabilitate motor deficits. Neuropsychologia (2015, in press)
25. Perini, E., Soria, S., Prati, A., Cucchiara, R.: FaceMouse: a human-computer interface for tetraplegic people. In: Huang, T.S., Sebe, N., Lew, M., Pavlović, V., Kölsch, M., Galata, A., Kisačanin, B. (eds.) ECCV 2006 Workshop on HCI. LNCS, vol. 3979, pp. 99–108. Springer, Heidelberg (2006)
26. Kjeldsen, R.: Improvements in vision-based pointer control. In: Proceedings of the ACM SIGACCESS Conference on Computers and Accessibility, pp. 189–196. ACM Press (2006)
27. Kjeldsen, R., Hartman, J.: Design issues for vision-based computer interaction systems. In: Perceptual User Interfaces 2001, Orlando, FL (2001)
28. Missimer, E., Betke, M.: Blink and wink detection for mouse pointer control. In: Makedon, F., Maglogiannis, I., Kapidakis, S. (eds.) Proceedings of the 3rd International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2010). ACM, New York, Article 23, 8 p. (2010). doi:10.1145/1839294.1839322
29. Gonçalves, C., Padilha Lanari Bó, A., Richay, R.: Tracking head movement for augmentative and alternative communication
© 2016 Springer International Publishing Switzerland
Rodrigues, A.S., et al.: Evaluation of the use of eye and head movements for mouse-like functions by using IOM device. In: Antona, M., Stephanidis, C. (eds.) Universal Access in Human-Computer Interaction. Interaction Techniques and Environments (UAHCI 2016). LNCS, vol. 9738. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-40244-4_9