Abstract
This paper describes new applications of a locomotion interface that uses fingers instead of legs. With this device, users let two fingers “stand” or “walk” on a ball floating on water. The first-person perspective presented to the user is updated according to the state of the ball. The aim is to make users feel virtually present by means of the synchrony between their vision and the haptic information from their fingers. The difficulty of controlling the ball with the fingers lets users subjectively experience an unsteady foothold. The proposed system is space saving and cost effective compared with an ordinary full-body motion simulator and is thus suitable for museum exhibitions.
1 Introduction
In everyday life, we often want to share our own experiences with others and, conversely, to experience what others have experienced. After a trip, for example, we may want to convey the rural landscape, the historical surroundings, or the emotional ups and downs of seeing them. We can summarize this as the feeling of “being there,” which can never be communicated by mere images or videos. Digital information technologies that measure, record, and preserve experiences have been developed in recent years. Wearable wide-angle cameras such as the GoPro have become commonplace, making it easy to record an individual experience. Accordingly, the question of how to present such archived information to users has attracted increasing attention: what is the best way to experience what others have experienced as if it were our own?
The sensation of walking plays a key role in the subjective feeling of presence, i.e., the sensation of being in one place even when physically situated in another [1]. It has been reported that the extent to which a locomotion technique resembles its real-world counterpart has a positive effect on the sensation of presence [2, 3]. Research on locomotion has taken two directions: the development of wide-area trackers so that users can actually walk about, and the development of body-active surrogates for walking, e.g., treadmills and walking-in-place [4–6]. Both ideas are straightforward and have the advantage of possibly providing direct haptic feedback to the user’s legs; if fully achieved, they would constitute the most natural locomotion system. However, the devices and systems tend to be large and complicated because of the need to support full-body motion, so constructing such full-body simulators is not always feasible. Another method for operating a virtual body is to use the fingers as an input system instead of the legs. For example, finger motions that mimic leg movements can be used to operate bipedal walking robots [7] and to perform navigation tasks [8]. With this method, the physical motion is scaled down, and consequently physical fatigue should be reduced compared with full-body locomotion techniques. In our previous study [9], we tested whether a sense of body ownership occurs in this setting. Our preliminary results suggested that the synchrony between the first-person perspective, a key component of human self-consciousness, and proprioceptive information was able to induce body ownership over a virtual avatar’s invisible legs. This body ownership causes users to interpret haptic stimulation of their fingers as deriving from the avatar’s legs.
This paper describes new applications of a locomotion interface that uses fingers. With this interface, users let two fingers “stand” or “walk” on a ball floating on water. The first-person perspective provided to the user is updated according to the state of the ball. Users thus receive two main streams of feedback: the first-person perspective and the haptic information from their fingers. The method is the same as in our previous study [9]. The aim of this study is to make users feel as if they exist in the virtual world by means of the synchrony between their vision and the haptic information.
After building several prototypes and conducting informal tests, we found that the floating ball exhibits an interesting property when interacting with human fingers: it is difficult to control in the desired manner with two fingers. We took advantage of this difficulty to let users experience an unsteady foothold. When do we feel an unsteady foothold? An earthquake is an obvious case in which people cannot maintain their posture. People are also likely to stagger on a suspension bridge or a boat. Playground equipment, e.g., a swing or a unicycle, makes people balance playfully and enjoy the unsteadiness. Imagining these scenes, we can readily grasp how people perceive unsteady ground; in other words, when we want to recreate such scenes virtually, an unsteady feeling underfoot is indispensable for presence. Generating an unsteady foothold for the whole body would require shaking the ground mechanically. By using anthropomorphic finger motion as input instead, the system can be structured in a space-saving and cost-effective way compared with an ordinary full-body simulator and is thus suitable for museum exhibitions. We adopted this method and tried to make users feel an unsteady foothold in two scenes.
2 Related Work
Walking in a virtual world is a fundamental task that virtual-reality technologies should be able to accomplish. Providing the ability to walk through virtual scenes is of great importance for many applications such as training, architectural visits, tourism, entertainment, games, and rehabilitation. Over the years, a large number of technical approaches have been proposed and investigated. Most of these address locomotion interfaces that support human leg motions themselves [4, 5]. Full-body locomotion using these interfaces facilitates the acquisition of spatial knowledge of an environment and results in better navigation in the virtual environment than common input devices do [10]. However, there is a problem for practical use: devices supporting full-body motion tend to be large and complicated. Instead of simulating full-body locomotion, several interaction techniques using a full-body metaphor have been presented [3, 11, 12]. These techniques meet the demand of avoiding user collisions with real-world obstacles; however, they are lacking in kinesthetic feedback.
Another method for realizing locomotion is to use fingers instead of legs, taking advantage of the structural similarities between fingers and legs. Users operate their fingers as if walking or running. For example, two fingers have been used to mimic leg movements for generating the full-body motion of animated characters [13] or for operating bipedal walking robots [7]. The advantage of this approach is the possibility of scaling down the device and the environment around the fingers. We studied whether tactile stimulation of the user’s fingers can be felt as a tactile experience on the sole of a virtual avatar. We showed that the synchrony between the first-person perspective of the avatar presented on a monitor and the proprioceptive information, together with the motor activity of the user’s fingers, is able to induce an illusory feeling equivalent to a sense of ownership over the invisible avatar’s legs [9]. Under this condition, the ground under the virtual avatar’s foot is felt through the user’s fingertips. We also investigated the plasticity of tactile perception when using an anthropomorphic finger motion interface: the experimental results suggested that participants interpreted the tactile sensation on the basis of the difference in scale between fingers and legs, and that the perceived tactile size was proportional to the avatar’s body (foot) size.
There has been another approach based on the “walking-in-place” technique. The finger walking in place (FWIP) method was proposed in [8, 14]. In this method, users can move forward or backward and rotate in a virtual world as the user’s fingers slide on a multitouch surface. In terms of spatial-knowledge acquisition (e.g., spatial relationships and features), FWIP exhibited better performance than a rate-based translation and turning system (i.e., a joystick) during maze navigation tasks.
A locomotion interface using fingers has some restrictions, such as limits on translational or rotational direction imposed by the wrist, and is not suitable for long-term use because of fatigue. On the other hand, it supports spatial-knowledge acquisition and is suitable for situations in which users cannot use their legs, or for paraplegic users.
3 Prototype
When using finger motions as an interface, users remain at the same physical position. Therefore, an interface based on an actual walking motion is not suitable because of space restrictions. Solutions address the physical constraints on the user’s movement in one of two ways. One is to develop a finger-scale walking simulator analogous to treadmills [4, 5] or the Virtusphere [15]. The other is the so-called walking-in-place technique, which enables the user to navigate virtual environments by walking in place. Because such metaphor techniques generally lack kinesthetic feedback, we adopted the finger-walking simulator. A prototype is illustrated in Fig. 1. We designed it with reference to the Virtusphere [15], but the entire system was scaled down and built simply from consumer goods.
When the user’s fingers “walk” on the plastic ball and rotate it, the optical mouse under the ball tracks the rotation. This device enables unlimited walking within the same physical space. Two aspects of this device need to be improved. First, considerable friction is generated between the plastic ball and the three ball casters, which makes users feel an awkward contact force while walking. Second, it is difficult to provide force feedback: for precise tracking, the distance between the ball and the optical mouse must be strictly maintained, so the ball cannot be translated to give the fingers force feedback.
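To illustrate the tracking principle, the mouse’s motion counts can be converted into ball rotation angles, since the sensor reports the linear distance of the surface sliding past it. The sketch below is a minimal example; the counts-per-inch resolution and ball radius are placeholder values, not the parameters of the actual prototype.

```python
INCH_M = 0.0254  # metres per inch

def mouse_delta_to_rotation(dx_counts: float, dy_counts: float,
                            cpi: float = 800.0,
                            ball_radius_m: float = 0.05) -> tuple:
    """Convert optical-mouse motion counts into ball rotation angles (rad).

    The mouse sees the ball's surface sliding past it, so the reported
    linear distance equals the arc length travelled on the surface, and
    angle = arc length / radius.
    """
    arc_x = dx_counts / cpi * INCH_M  # surface travel along x, in metres
    arc_y = dy_counts / cpi * INCH_M
    return arc_x / ball_radius_m, arc_y / ball_radius_m
```

For instance, 800 counts at 800 cpi correspond to one inch of surface travel, i.e., 0.508 rad of rotation for a 5 cm radius ball.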
4 Proposed System
4.1 Concept
Smooth movement of the surface and force feedback would strengthen the user’s subjective feeling of presence. Considering these points, we improved the device through trial and error and finally arrived at the idea of using a ball floating on water. The water replaced the ball casters as the support for the ball, leading to smooth rotation, and the buoyancy was felt by the user’s fingers. We aim to make users feel ground instability through their fingers via the difficulty of controlling the ball. Haptic and visual feedback is provided to the user from the floating ball in response to finger input, and these two types of sensory information are integrated. We expect that illusory body ownership over the avatar in the virtual world will be induced if users perceive no discrepancy between the haptic feedback to the fingers and the visual information.
4.2 System Configuration
Figure 2 shows a diagram of the proposed system. The system comprises three parts: a user who experiences the content, a ball that serves as the haptic input/output, and a server that renders the visual world. The system provides the user with visual and haptic information. We use a ball floating on water for haptic input/output: the fingers contact and interact with the ball, which is otherwise free to move. The state of the ball is tracked by a PlayStation Move (PS Move) controller fixed inside the ball. The PS Move provides values from an accelerometer, a gyroscope, and a magnetometer; these values are sent to the server via Bluetooth 2.0 wireless communication. The server updates the graphical perspective as new values arrive and projects the new view.
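The data flow can be sketched as a simple per-frame polling loop. The sample structure and function names below are illustrative placeholders, not the actual implementation: `read_sample` stands in for the Bluetooth receive step and `update_view` for the perspective update on the server.

```python
from dataclasses import dataclass

@dataclass
class ImuSample:
    accel: tuple  # (ax, ay, az) accelerometer reading, m/s^2
    gyro: tuple   # (gx, gy, gz) gyroscope reading, rad/s
    mag: tuple    # (mx, my, mz) magnetometer reading

def run_frames(read_sample, update_view, frames: int) -> int:
    """Poll one IMU sample per rendered frame and push it to the renderer."""
    drawn = 0
    for _ in range(frames):
        update_view(read_sample())  # perspective update from the latest sample
        drawn += 1
    return drawn
```

In the real system the loop runs continuously, paced by the display refresh rate rather than a fixed frame count.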
5 Application
Informal user tests showed that it was difficult to control the ball with two fingers. The force users apply to the ball makes it unsteady as a result of the interaction between the ball and the water. For example, to keep the ball at a constant position, users must apply force to it, and as they push the ball into the water, the buoyancy increases. We prepared a hemispherical container filled with water so that a reaction force always arises, no matter which direction the ball is pushed (Fig. 3).
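The restoring force grows with how deep the ball is pushed: by Archimedes’ principle, the buoyant force equals the weight of the displaced water, and for a sphere the submerged part is a spherical cap. A rough sketch of this relationship, with assumed water density and ball radius:

```python
import math

def buoyant_force(h_m: float, radius_m: float,
                  rho: float = 1000.0, g: float = 9.81) -> float:
    """Upward buoyant force (N) on a sphere submerged to depth h_m.

    The submerged part is a spherical cap of height h, whose volume is
    V = pi * h^2 * (3r - h) / 3; the force is rho * g * V (Archimedes).
    """
    h = min(max(h_m, 0.0), 2.0 * radius_m)  # clamp to the sphere's extent
    cap_volume = math.pi * h * h * (3.0 * radius_m - h) / 3.0
    return rho * g * cap_volume
```

The force increases monotonically with depth, which is what makes holding the ball down feel like balancing against an unsteady, spring-like foothold.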
With this system, we developed two scenes. One is to walk down a mountain on a rainy day. The other is to stand on a small boat.
5.1 Walking Down a Mountain on a Rainy Day
The aim of this application is to induce in users the feeling of actually walking down a mountain trail with their own feet. Users are asked to “walk” on the floating ball with two fingers while watching a first-person video of a person walking down a mountain on a rainy day. The video was taken on Trail 6 at Mt. Takao, the most dangerous trail on the mountain. The cameraman captured the video by holding a GoPro camera near his eyes. The avatar in the video pays attention to the muddy ground and the wet rock face and looks down at the ground, so the cameraman’s feet appear in the video (Fig. 4).
In order to induce the feeling of “being there” and of actually walking down a mountain trail with their own feet, we did three things. The first was to couple the rotation speed of the ball, driven by the finger motion, to the playback speed of the first-person video. The PS Move inside the ball measures the rotation about the horizontal axis facing the user, and this measured rotation determines the playback speed. The walking speed of the cameraman was roughly, though not perfectly, constant. If users rotate the ball faster, the avatar in the video moves faster; if the ball is rotated slowly, the avatar moves correspondingly slowly. The avatar can move not only forward but also backward by means of reverse playback; the direction depends on the direction of rotation of the floating ball. Second, we made full use of force feedback to the user’s fingers, aiming to have users interpret the reaction force from the pushed ball as the impact of a sole stepping down the mountain trail. Third, the difficulty of controlling the ball with the fingers mirrors the unsteadiness of the steps while walking down the rainy, rocky trail. Together, these three effects are expected to make users experience the video subjectively.
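The coupling between the finger-driven ball rotation and the video playback can be sketched as a signed, clamped gain: the sign of the ball’s angular rate selects forward or reverse playback, and the magnitude scales the speed. The gain and limit values below are illustrative assumptions, not the tuned parameters of the exhibit.

```python
def playback_rate(ball_rate_rad_s: float,
                  gain: float = 0.5,
                  max_rate: float = 2.0) -> float:
    """Map the ball's signed angular rate about the axis facing the user
    to a video playback rate: positive plays forward, negative plays the
    video in reverse, and the magnitude is clamped to keep it watchable.
    """
    rate = gain * ball_rate_rad_s
    return max(-max_rate, min(max_rate, rate))
```

A stationary ball yields a rate of zero, so the avatar simply stands still until the fingers move.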
5.2 Standing on a Small Boat
The aim of this application was to induce a feeling in users of standing on a boat with their own feet. Users are asked to “stand” on the ball and keep it steady with two fingers while watching a first-person perspective of a virtual avatar over a small boat floating on the water. The graphical perspective was rendered by Unity (Fig. 5). In this application, the floating ball in the real world corresponds to the small boat in the virtual environment.
A weight was fixed at the bottom of the inside of the ball, so the ball does not rotate easily and remains steady. The orientation of the ball was calculated from the values measured by the PS Move, and this orientation determines the orientation of the first-person perspective of the virtual avatar. Keeping the ball at a constant position resembles the sensation of keeping one’s balance on a boat with two legs. We aimed to make users feel ground instability through their fingers.
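Because the weighted ball moves slowly and barely rotates, its tilt can be estimated from the gravity direction measured by the accelerometer alone. A minimal sketch, assuming a z-up axis convention for the sensor (the actual implementation may fuse the gyroscope and magnetometer as well):

```python
import math

def tilt_from_gravity(ax: float, ay: float, az: float) -> tuple:
    """Estimate roll and pitch (radians) of the ball at rest from the
    accelerometer's gravity vector. With the ball level, gravity lies
    along +z and both angles are zero.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch
```

The resulting roll and pitch drive the first-person camera over the virtual boat, so tilting the ball tilts the horizon.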
6 Exhibition in a Public Space
We demonstrated our applications at a public media art exhibition on campus (Fig. 6). The number of people who tried out these applications was over 500.
We found that the ways in which users moved their fingers were different from person to person. We obtained feedback from the visitors about these applications.
For the application of walking down the trail, most feedback indicated that the playback of the video synchronized with the user’s finger motion made users feel as if they were actually moving forward with their legs, even though the footsteps of the person in the video were not perfectly synchronized. When users rotated the ball, their fingers got wet, and some reported that the wet fingers contributed strongly to immersion in the rainy scene of the video.
As for the application of standing on a boat, we received the following feedback. It was difficult to maintain balance with two fingers. The rotation of the viewpoint made users feel that they were in the virtual world. Some reported that the feeling of ground instability was much stronger when operating the ball with their fingers than when just watching someone else operate the device. This indicates that the application had the effect of enhancing presence.
7 Conclusion
This paper described new applications of a locomotion interface that uses fingers instead of legs. With this device, users let two fingers “stand” or “walk” on a ball floating on water. The first-person perspective presented to the user was updated according to the state of the ball. The aim was to make users feel that they are “there” by means of the synchrony between their vision and the haptic information from their fingers. The difficulty of controlling the ball with the fingers lets users subjectively experience an unsteady foothold. The proposed system is space saving and far more cost effective than an ordinary full-body motion simulator and is thus suitable for museum exhibitions.
References
Witmer, B.G., Singer, M.J.: Measuring presence in virtual environments: a presence questionnaire. Presence: Teleoperators Virtual Environ. 7, 225–240 (1998)
Slater, M., Usoh, M., Steed, A.: Taking steps: the influence of a walking technique on presence in virtual reality. ACM Trans. Comput.-Hum. Interact. 2(3), 201–219 (1995)
Usoh, M., et al.: Walking > walking-in-place > flying, in virtual environments. In: SIGGRAPH 1999 Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques, pp. 359–364 (1999). doi:10.1145/311535.311589
Darken, R., Cockayne, W., Carmein, D.: The omni-directional treadmill: a locomotion device for virtual worlds. In: ACM Symposium on User Interface Software and Technology, pp. 213–221 (1997). doi:10.1145/263407.263550
Iwata, H.: Torus treadmill: realizing locomotion in VEs. IEEE Comput. Graph. Appl. 19, 30–35 (1999)
Slater, M., Steed, A., Usoh, M.: The virtual treadmill: a naturalistic metaphor for navigation in immersive virtual environments. In: Göbel, M. (ed.) Virtual Environments 1995. Eurographics, pp. 135–148. Springer, Vienna (1995). doi:10.1007/978-3-7091-9433-1_12
Fernando, C.L., et al.: An operating method for a bipedal walking robot for entertainment. In: ACM SIGGRAPH ASIA 2009 Art Gallery & Emerging Technologies: Adaptation 79, ACM (2009). doi:10.1145/1665137.1665198
Kim, J.-S., Gračanin, D., Matković, K., Quek, F.: Finger walking in place (FWIP): a traveling technique in virtual environments. In: Butz, A., Fisher, B., Kruger, A., Olivier, P., Christie, M. (eds.) Smart Graphics. LNCS, vol. 5166, pp. 58–69. Springer, Heidelberg (2008)
Ujitoko, Y., Hirota, K.: Interpretation of tactile sensation using an anthropomorphic finger motion interface to operate a virtual avatar. In: 2014 24th International Conference on Artificial Reality and Telexistence (ICAT) (2014). doi:10.1109/ICAT.2013.6728900
Waller, D., Loomis, J.M., Haun, D.B.M.: Body-based senses enhance knowledge of directions in large-scale environments. Psychon. Bull. Rev. 11, 157–163 (2004)
Razzaque, S., Kohn, Z., Whitton, M.C.: Redirected walking. In: Proceedings of the EUROGRAPHICS, pp. 289–294 (2001)
Williams, B., Narasimham, G.: Updating orientation in large virtual environments using scaled translational gain. In: Proceedings of the 3rd Symposium on Applied Perception in Graphics and Visualization, vol. 1, pp. 21–29 (2006)
Lockwood, N., Singh, K.: Finger walking: motion editing with contact-based hand performance. In: Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation, pp. 43–52 (Eurographics Association, 2012)
Kim, J.-S., Gračanin, D., Matković, K., Quek, F.: The effects of finger-walking in place (fwip) for spatial knowledge acquisition in virtual environments. In: Taylor, R., Boulanger, P., Krüger, A., Olivier, P. (eds.) Smart Graphics. LNCS, vol. 6133, pp. 56–67. Springer, Heidelberg (2010)
Medina, E., Fruland, R., Weghorst, S.: Virtusphere: walking in a human size vr “hamster ball”. Proc. Human Factors Ergon. Soc. Annu. Meet. 52, 2102–2106 (2008)
© 2015 Springer International Publishing Switzerland
Ujitoko, Y., Hirota, K. (2015). Application of the Locomotion Interface Using Anthropomorphic Finger Motion. In: Yamamoto, S. (ed.) Human Interface and the Management of Information. Information and Knowledge in Context. HIMI 2015. Lecture Notes in Computer Science, vol. 9173. Springer, Cham. https://doi.org/10.1007/978-3-319-20618-9_65