Abstract
Automatic gesture recognition systems generally require two separate processes: a motion sensing process, in which motion features are extracted from the visual input, and a classification process, in which those features are recognised as gestures. We have developed the Hand Motion Understanding (HMU) system, which combines a 3D model-based hand tracker for motion sensing with an adaptive fuzzy expert system for motion classification. The HMU system understands static and dynamic hand signs of Australian Sign Language (Auslan). This paper presents the hand tracker, which extracts 3D hand configuration data with 21 degrees of freedom (DOFs) from a 2D image sequence captured from a single viewpoint, with the aid of a colour-coded glove. The temporal sequence of 3D hand configurations detected by the tracker is then recognised as a sign by the adaptive fuzzy expert system. The HMU system was evaluated on 22 static and dynamic signs: before training it achieved 91% recognition, and after training it achieved over 95%.
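The abstract describes a two-stage architecture: a model-based tracker recovers a 21-DOF hand configuration from each frame, and an adaptive fuzzy expert system classifies the resulting temporal sequence as a sign. The sketch below illustrates that pipeline in outline only; all names (track_hand, FuzzySignClassifier, triangular_membership), the stubbed model fit, and the start/end-configuration rule form are hypothetical illustrations, not the authors' implementation.

```python
# Minimal, self-contained sketch of the two-stage pipeline described in the
# abstract: (1) motion sensing that recovers a 21-DOF hand configuration per
# frame, and (2) fuzzy classification of the temporal sequence as a sign.
# All names and rule details are hypothetical, not the authors' code.

from typing import Dict, List, Sequence

NUM_DOFS = 21  # 3D hand configuration: 21 degrees of freedom per frame

HandConfiguration = List[float]  # one frame's pose parameters, length NUM_DOFS


def track_hand(frames: Sequence[object]) -> List[HandConfiguration]:
    """Stage 1 (motion sensing): for each 2D frame, fit a 3D hand model
    (aided by the colour-coded glove) and return its 21-DOF parameters.
    The model fit is stubbed out with zeros for illustration."""
    return [[0.0] * NUM_DOFS for _ in frames]


def triangular_membership(x: float, centre: float, width: float) -> float:
    """Standard triangular fuzzy membership function."""
    return max(0.0, 1.0 - abs(x - centre) / width)


class FuzzySignClassifier:
    """Stage 2 (classification): each sign is described by fuzzy constraints
    on the first and last hand configurations of the sequence; the sign whose
    rule fires most strongly wins."""

    def __init__(self, rules: Dict[str, Dict[str, List[float]]]):
        # rules[sign] = {"start": [...NUM_DOFS centres...], "end": [...]}
        self.rules = rules
        self.width = 0.5  # fuzzy set width (illustrative constant)

    def rule_strength(self, rule: Dict[str, List[float]],
                      sequence: List[HandConfiguration]) -> float:
        start, end = sequence[0], sequence[-1]
        memberships = [
            triangular_membership(value, centre, self.width)
            for frame, centres in ((start, rule["start"]), (end, rule["end"]))
            for value, centre in zip(frame, centres)
        ]
        return min(memberships)  # conjunction of the fuzzy conditions

    def classify(self, sequence: List[HandConfiguration]) -> str:
        scores = {sign: self.rule_strength(rule, sequence)
                  for sign, rule in self.rules.items()}
        return max(scores, key=scores.get)


if __name__ == "__main__":
    frames = [object()] * 10                    # stand-in for a 10-frame clip
    configurations = track_hand(frames)         # stage 1: motion sensing
    classifier = FuzzySignClassifier(
        {"example_sign": {"start": [0.0] * NUM_DOFS, "end": [0.0] * NUM_DOFS}}
    )
    print(classifier.classify(configurations))  # stage 2: classification
```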
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Holden, EJ., Owens, R. (2001). Visual Sign Language Recognition. In: Klette, R., Gimel’farb, G., Huang, T. (eds) Multi-Image Analysis. Lecture Notes in Computer Science, vol 2032. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45134-X_20