Abstract
In the automotive context, human-systems interaction design is a major challenge, given road safety issues and the complexity of the driving task under high time constraints. To support this task, existing on-board systems mainly display visual messages, forcing drivers to move their eyes away from the road. This paper presents an overview of studies on drivers’ perception and cognition when this information is displayed on the windshield (Head-Up Display, or HUD), as this can be a way to reduce the duration and frequency with which drivers look away from the traffic scene. Nevertheless, HUDs may also have shortcomings that raise new critical situations, which are discussed. The Augmented Reality (AR) concept is also presented; while it can address some HUD shortcomings, it bears potential drawbacks of its own, such as the risk of occluding relevant traffic objects and phenomena like perception tunneling and cognitive capture.
Keywords
- Head Up Display
- Augmented Reality
- Road safety
- Human factors in automotive
- Advanced driver information system
1 Introduction
Driver information and assistance systems (IVIS, In-Vehicle Information Systems, and ADAS, Advanced Driver Assistance Systems) have been developed for many years now, with the aim of improving driver safety, performance, efficiency and comfort through the support of information and communication technology. Driving is a highly complex activity, partly because of the time constraints under which the driver must detect, perceive and process information before deciding and reacting, in a road environment full of unpredictable events. In this framework, the display of relevant information such as “step-by-step guidance to the destination”, “road, traffic or weather events ahead”, “road accidents or disturbances”, “distance to the vehicle ahead”, “danger of losing road holding”, “obstacle ahead” and so on can be highly useful in supporting the driver’s decisions and preparing the correct action. The questions for human factors design lie in specifying the modalities that allow efficient and effective cooperation between the human and the machine in a context where driving remains the priority task. Whether these systems prove efficient or, on the contrary, have negative consequences for road safety depends on the compatibility between their interfaces, their modes of dialogue, the road environment and the functional abilities of the drivers [1].
In terms of perception, it is well recognized that driving performance is closely linked to visual ability and visual strategy [2]. Many authors agree that the visual channel is the essential one for the driving task, with up to an estimated 90 % of the information needed to accomplish it perceived visually. Existing on-board systems display mainly visual messages: texts, pictograms and/or cartographic displays on screens implemented in the dashboard. When drivers need to operate these systems, they are forced to move their eyes away from the road for a few seconds. The longer drivers take their eyes off the road while driving, the higher the probability that a traffic accident will occur [3–5], with glances away from the road lasting more than two seconds being the critical value linked to accident probability [6]. Broadly speaking, each in-vehicle display can be considered to have an associated “visual cost”, quantifiable as the number and duration of glances required to obtain a specific piece of information from the system [7].
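As an illustration only (the data format and summary fields are assumptions, not part of the cited studies), such a glance-based visual cost could be summarized as follows, flagging glances longer than the two-second threshold mentioned above:

```python
# Illustrative sketch only: glance-based "visual cost" of an in-vehicle display.
# The input format and summary fields are assumptions, not from the cited studies.

GLANCE_THRESHOLD_S = 2.0  # critical off-road glance duration cited in the text

def visual_cost(glances_s):
    """Summarize a list of off-road glance durations (seconds)."""
    total = sum(glances_s)
    long_glances = [g for g in glances_s if g > GLANCE_THRESHOLD_S]
    return {
        "n_glances": len(glances_s),
        "total_eyes_off_road_s": total,
        "mean_glance_s": total / len(glances_s) if glances_s else 0.0,
        "n_glances_over_threshold": len(long_glances),
    }

# Example: hypothetical glances toward a dashboard navigation screen
print(visual_cost([0.8, 1.4, 2.6, 1.1]))
# {'n_glances': 4, 'total_eyes_off_road_s': 5.9, 'mean_glance_s': 1.475, 'n_glances_over_threshold': 1}
```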
As there is a gap, or distance, between physical spaces (e.g. the real road environment) and virtual information spaces (e.g. the in-vehicle screen), users take time and have to expend cognitive effort to adjust from one space to the other. This gap is defined as the “cognitive distance” between computing and physical spaces [6]. Cognitive distance comprises two distinct components. The first is the cognitive effort required to move one’s attention from the physical space to the information space and to locate the appropriate information within it: eyes from the road toward the system. The second is the effort required to move back from the information space to the physical space and apply the extracted information to the task at hand: for example, in a typical GPS use case, eyes from the electronic map back toward the real road environment to make decisions about maneuvers and vehicle control actions. As the effort required for either component grows, the overall cognitive distance grows. Furthermore, if users are required to switch between these two spaces frequently, the impact of the cognitive distance can be even greater. This is particularly true for people who have a cognitive difficulty, or who are completing a task that is time-sensitive or carries a high cognitive load, and it certainly applies to older drivers who may be suffering from age-related cognitive decline [6].
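As an illustrative formalization (not taken from [6]), the overall cost of using such a display during a task can be sketched as the two per-switch efforts summed and multiplied by the number of switches between the two spaces:

```latex
% Illustrative formalization only (not from the cited work).
% e_1 : effort to shift attention from the road to the display and locate the information
% e_2 : effort to shift back to the road and apply the information
% n   : number of switches between the two spaces during the task
C_{\text{total}} \approx n \,\bigl(e_1 + e_2\bigr)
```

Both growing per-switch effort and frequent switching inflate the total, which is the intuition behind trying to reduce either one.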
In this context, the human factors concern is how to define criteria and effective recommendations to support the design of vehicle systems that display visual messages without distracting drivers from the driving task, taking into consideration major road safety issues [8] and knowing that the number of systems implemented in automobiles increases every year [9, 10].
2 Head Up Display and Augmented Reality in Automotive Context
Display on the windshield, or Head-Up Display (HUD), can be part of the solution for presenting system information to the driver, since it is recognized to reduce the time and frequency with which drivers look away from the traffic scene [11], the HUD being defined as “any transparent display that presents data without requiring users to look away from their usual viewpoints.”
Furthermore, the Augmented Reality (AR) concept, in which the information displayed on the windshield matches elements of the real road scenery and can be presented at the very location that motivates the information, further reduces the number of glances needed to obtain critical visual information relevant to the driver.
2.1 The Head Up Display Concept
Historically, the HUD has been used in fighter aircraft to present information without requiring pilots to look away from their usual view. In the automobile industry, General Motors first employed a HUD in 1988 on the Oldsmobile Cutlass Supreme. However, even though the HUD is not a new concept, various problems with the technology, the light source and the optics meant it did not sell as well as anticipated. Today, interest in HUDs among vehicle manufacturers is growing, following considerable advances in and maturation of the technology.
One of the main benefits expected from the HUD’s longer focal distance is a decrease in the accommodative shift, that is, reduced re-accommodation demands when drivers fixate external targets. The principal beneficiaries of this shorter focal transition are expected to be elderly drivers, given their restricted accommodative range [6] and the fact that they no longer have to look through the near correction (lower part) of their eyeglasses, as required to view the instrument panel [15].
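As a rough numerical illustration (the viewing distances below are assumptions, not values reported in the cited studies), accommodation demand in diopters is the reciprocal of the viewing distance in meters, so the refocusing step from a HUD image to the road ahead is much smaller than from the instrument panel:

```latex
% Accommodation demand (diopters) as a function of viewing distance d (meters): A = 1/d.
% Distances below are assumed for illustration, not taken from the cited studies.
A_{\text{panel}} \approx \frac{1}{0.75\,\text{m}} \approx 1.3\,\text{D}, \qquad
A_{\text{HUD}} \approx \frac{1}{2.5\,\text{m}} = 0.4\,\text{D}, \qquad
A_{\text{road}} \approx \frac{1}{50\,\text{m}} = 0.02\,\text{D}
```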
If the HUD has gained popularity because it reduces focal accommodation time [12], this mode of display also improves “eyes on the road” time by reducing the number of glances to the in-vehicle display [13, 14]. Drivers’ response time to an urgent event is faster with a HUD than with a HDD (Head-Down Display), and speed control is also more consistent [4, 16]. The HUD allows more time to scan the traffic scene, quicker reaction times to external road events, earlier detection of road hazards, less mental stress for drivers and easier use by first-time users [17]; it yields lower recognition error rates and shorter task times than the traditional dashboard [18], and it enhances understanding of the vehicle’s surrounding space, particularly under low-visibility conditions [16]. Overall, most drivers feel safer when driving with a HUD [19].
This benefit in terms of increased situation awareness can improve the probability that a driver will succeed in detecting a time-critical event [15].
Therefore, HUDs are expected to become indispensable devices for most drivers in the future.
Nevertheless, some criticisms have been raised concerning these HUD advantages: for example, the recorded scan-time savings may be valid only for low-workload situations and may not generalize to higher-workload conditions [15]. Furthermore, a negative impact of the HUD can emerge because superimposing symbology on the forward driving scene may mask external objects through contrast interference [20], an effect that depends on the extent to which the HUD symbology fills a given viewing area and on the contrast of the HUD imagery with the visual background [15].
There is empirical evidence that lens accommodation (i.e., the optical focus of the eyes) is not at infinity when viewing the HUD, but somewhat nearer [21]. This deviation, or misaccommodation, causes objects in the outside world to appear smaller and more distant [22]. The phenomenon is highly correlated with the large variation among individuals in their resting accommodation position. When the distance selected for automotive HUD imagery places it near the front edge of the car, by definition all outside objects (e.g., other vehicles) will be at considerably greater distances than the HUD images. Sojourner and Antin [23] recognized the possibility that automotive HUDs cause the kind of size and distance misperception described by Roscoe [24]. They mentioned that the effect may be mitigated by the number of visual cues in the driving environment: “still, if, on a foggy night, the HUD imagery caused a leading automobile’s taillights to be minified, resulting in their distance being overestimated, the likelihood of a rear-end collision would be increased”.
In addition to the HUD focal distance affecting drivers’ accommodation and perception of actual objects while driving, HUD images may clutter or block drivers’ view and affect visual attention [11], which can create serious safety hazards. For example, a simulator study showed that the majority of drivers (77 %) did not want the HUD image within their focal area while driving; instead, they chose to locate it to the right of, below, or to the left of the area they were immediately looking at [25]. The tradeoff between increased eyes-on-the-road time and increased visual clutter from HUD symbology, in terms of response effectiveness for safety-critical targets in the forward driving scene, remains to be determined [15].
Compared to an in-car display, a HUD has no static background. Instead, its image plane lies in the outside environment, which moves as the car moves. As a result, the frame of reference of objects in the HUD is different: an object shown so that it appears to be standing still on the ground at a certain position moves across the HUD display. Basic psychophysical research needs to be conducted on perceptual estimation of the absolute distances of objects while participants’ eyes are focused on automotive HUD imagery. The actual distances of the objects should cover a range applicable to driving safety. Furthermore, the distance estimations should be obtained for the same set of participants with and without the HUD imagery present, in order to anchor the effect to a naturalistic baseline and to permit an examination of potential individual differences [11].
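A minimal sketch of this frame-of-reference effect, assuming a simple pinhole-style projection (an illustration, not an implementation from the cited work): a point that is fixed on the ground ahead of the car drifts across the HUD image plane as the vehicle advances, so a registered overlay has to be redrawn continuously.

```python
# Illustrative sketch: a ground-fixed point drifts across the HUD image as the car moves.
# The pinhole projection and all parameter values are assumptions for illustration only.

def project_to_hud(point_world, car_x, image_plane_dist=2.5):
    """Project a world point (x ahead, y lateral, z up, in meters) onto a HUD
    image plane located image_plane_dist meters ahead of the driver's eye."""
    x, y, z = point_world
    dx = x - car_x                 # longitudinal distance from the eye point
    if dx <= 0:
        return None                # point is behind the driver
    scale = image_plane_dist / dx  # simple pinhole scaling
    return (y * scale, z * scale)  # lateral/vertical position on the HUD plane

# A marker "standing still" 40 m ahead, 1.5 m to the right, 1.2 m below eye level
marker = (40.0, 1.5, -1.2)
for car_x in (0.0, 10.0, 20.0, 30.0):
    print(car_x, project_to_hud(marker, car_x))
# As the car advances, the same world-fixed marker moves outward and downward
# on the HUD plane, so an AR renderer must re-register it every frame.
```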
As previously noted, the cognitive load of switching from the instrument panel or dashboard to the outside scene is much lighter with a HUD [26]. Nevertheless, HUDs overloaded with information, especially those using textual output, can create the effect known as cognitive capture [27]. Cognitive capture occurs whenever the driver is distracted by the presence of multiple visual stimuli. These visual cues consume a significant amount of the driver’s attentional resources and can dilute the essential focus on the driving task. The result of cognitive capture is to decrease the relative salience of objects in the real road scene and, consequently, to lower the likelihood that these objects will be noticed or detected by the driver. Overall, cognitive capture is a disruptive cognitive issue with an immediate, adverse impact on a driver’s performance and safety.
2.2 The Augmented Reality Concept
Some of the HUD shortcomings previously described can be overcome by the Augmented Reality (AR) concept. AR extends the three-dimensional world by superimposing computer-generated virtual objects onto the environment of the user [28]. This concept has recently been developed in the automotive context, allowing information displayed on the windshield to be matched with the real world the driver is looking at [16, 29]. Combining objects or places with their inherent information condenses the information and thus enhances perception. The presentation of information uses new, implicit presentation schemes that require less mental load to interpret. Information related to the spatial relationships in the car’s environment, in particular, lends itself to being transferred to AR.
For more than a decade, researchers have investigated and evaluated a number of AR-based visualization concepts using mobile platforms or projector-based driving simulators [30–32].
The fact that the information can be spatially related to the object of concern introduces new opportunities for fast and efficient information presentation, but also generates new issues.
In comparison with the HUD, AR presentations carry potential drawbacks such as the risk of occluding relevant traffic objects, as well as phenomena like perception tunneling and cognitive capture. Yet it can be strongly argued that information presentation through this emerging technology is well suited to time-critical environments.
In the following section, HUD-AR displays are discussed for several types of functionality useful in the driving context.
3 HUD-AR and Driver’s Assistance Functionalities
Considering the potential advantages, and in order to test possible adverse effects, HUD-AR displays have been studied for several driver assistance functionalities.
3.1 Drive Path Support
Lane Change. Lorenz et al. [33] tested an AR display indicating a safe corridor for a lane change so that the driver could take over safely. They showed an improvement in the take-over process through two positive effects of the AR display compared with the reference situation: (1) more drivers used the brake pedal to reduce speed, which is generally a positive indicator in terms of safety; (2) all drivers steered and braked in a very similar manner, as the trajectories showed. Nevertheless, it was not assessed how drivers would have followed an adverse recommendation, e.g. if a car had been located in the blind spot or had been approaching fast. Furthermore, this study showed that, in the lane-change situation, the AR condition tended to make drivers look at the side mirror later than drivers not supported by AR information, as the drivers’ visual attention was first attracted to the AR information on the road. Indeed, it was only after interpreting this AR information that drivers started checking the side mirror to prepare for the lane change.
Lane Keeping. Lane keeping can be especially critical for inexperienced drivers, and lane-keeping support can be very desirable in bad weather conditions and darkness. The AR concept makes it possible to underline the drive path, making it more salient for the driver.
Displaying the drive path in AR can improve lane-keeping behavior by decreasing the lane deviation [19] (Fig. 1).
3.2 Detection of Critical Road Events
Drivers have to watch for vehicles, road hazards, lanes, pedestrians and traffic signs while controlling vehicle speed and direction. All these tasks increase physical and mental workload, which is especially dangerous for elderly people and people with slow recognition and response. Therefore, an alarm that alerts the driver to a danger on the road can help minimize driver workload and reduce vehicle accidents. Displaying a critical road event on the windshield can help the driver detect the hazard, compared with a HDD, with a 100-ms HUD advantage in participant reaction time for the detection of pop-up obstacles [34] (Fig. 2).
3.3 Night Vision
AR display can improve visualization at night, indicating the location of pedestrians and obstacles and enabling effective and efficient information transfer that is directly understandable by the driver [32, 35].
3.4 Navigation
The concept of projecting navigation instructions and guidance onto the windshield using HUD or AR has been investigated for some years, with the objective of making decisions easier for drivers orienting themselves in various traffic situations and complex road infrastructures. Indeed, a GPS-based navigation system displaying on-screen information creates issues of divided attention, as drivers have to focus on both the information display and the road, and extra cognitive load in matching the computer-generated streets on the GPS system to the real streets in the 3-dimensional perspective that drivers have (Fig. 3).
An AR projection can be used to minimize visual distraction, divided attention and cognitive load by overlaying driving directions on the windshield and on the road, making it easier for the driver to focus attention on a single location and to translate the virtual information into effective navigation actions.
Sato et al. [36] use the whole windshield surface to project navigation information, such as destination and distance, combined with the direction the driver should follow (Fig. 4).
AR-based visualization has also been employed for the purposes of supporting navigation and perception in the cases of hidden exits or roundabouts [30].
The virtual cable uses a volumetric display to create a true 3D image and superimposes it on the windscreen (Fig. 5).
Nevertheless, because the AR virtual image is, by definition, matched to the actual road infrastructure, the potential for anticipation with this type of display while navigating is lower than with a screen display, which can inform the driver about upcoming actions much further in advance, when the related infrastructure is not yet in the driver’s field of view. To overcome this shortcoming, Kim and Dey [6] imagined an AR display that allows the next actions to be anticipated even when the related infrastructure is not yet in the driver’s field of view. In this original concept, the AR information is superimposed on the real street scene and extended with a display of the roads to follow, which appears before the driver reaches the physical location (see Fig. 6).
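A minimal sketch of the presentation logic such an anticipatory display implies (hypothetical names and thresholds; this is not the implementation described in [6]): the cue for the next maneuver is shown as an unregistered preview while the junction is still out of view, then switches to a road-registered overlay once the infrastructure becomes visible.

```python
# Illustrative sketch of anticipatory AR navigation cues (not the implementation from [6]).
# Names, thresholds and the notion of a "visible range" are assumptions for illustration.

VISIBLE_RANGE_M = 80.0   # assumed distance at which the junction enters the driver's view
PREVIEW_RANGE_M = 300.0  # assumed distance at which an anticipatory preview is shown

def navigation_cue(distance_to_turn_m, turn_direction):
    """Decide how to present the next maneuver on the windshield."""
    if distance_to_turn_m > PREVIEW_RANGE_M:
        return None  # too early: show nothing
    if distance_to_turn_m > VISIBLE_RANGE_M:
        # Junction not yet in view: unregistered preview cue
        # (e.g. a path rising up the windshield, as described in the text)
        return {"mode": "preview", "direction": turn_direction}
    # Junction in view: cue registered onto the real intersection
    return {"mode": "registered_overlay", "direction": turn_direction}

print(navigation_cue(250.0, "left"))  # {'mode': 'preview', 'direction': 'left'}
print(navigation_cue(60.0, "left"))   # {'mode': 'registered_overlay', 'direction': 'left'}
```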
Testing this display, the authors showed that, generally speaking, elderly drivers liked the fact that AR allowed them to look at both the navigation display and the street at the same time, and mentioned that this made it easier to notice pedestrians crossing the street [6]. As expected, AR reduces the impact of divided attention and cognitive load for older drivers who have difficulty using navigation aids and may suffer from cognitive decline.
Nevertheless, it has to be noted that, when the visualization indicated an upcoming turn in advance, some drivers made errors and turned at an earlier intersection than the one intended. Other drivers commented that when the visualization indicated that they should go straight (via a highlighted path rising vertically up the windshield), they thought it meant they could continue to go straight regardless of the state of the traffic lights.
4 Conclusion
Even though the HUD has been around since the 1980s, it is still a rather uncommon way to display visual information in the automotive context. Studies have shown that the HUD has great potential but not a high level of acceptance among drivers [38]. One possible reason why HUDs have not become more popular is that their design focus has been on implementing a technology rather than on accommodating users. When considering the usefulness of a system such as the HUD, people tend to use or not use an application to the extent they believe it will help them perform their task better [39], and performance gains often depend on users’ willingness to accept and use the system.
Design principles from classic 2D displays are no longer fully applicable to this mode of display, due to the altered motion behavior of the visualized objects. The information displayed on the windshield can be continuous or discrete, displayed in 2D or 3D, and spatially registered to the environment or presented as unregistered symbolic content. Investigation should continue to determine which combination of design principles for HUD and AR works best for each specific driving task.
Studies have to be conducted in real road environments rather than in simulators to gain a better understanding of drivers’ acceptance of HUDs and insight into where drivers prefer the HUD image to be located [25]. Since most drivers prefer user-friendly, user-centered automobile devices sensitive to their personal taste and emotions, some recent research investigates how to present HUD images that cater to drivers’ psychological feelings and emotions [40].
Only 2 % of automobiles sold in 2012 had HUDs; however, that rate is expected to rise to 9 % by 2020. Japan had the highest fitment rate of car HUD systems in 2010, but Europe is expected to take the lead by 2020. Furthermore, HUD technology combined with AR has the potential to overcome existing bottlenecks in increasing the visual information displayed in the driving context, compared with in-vehicle screen displays.
Based on this perspective, the overview presented in this paper underlines that HUD-AR visual display holds great promise for supporting drivers through enhanced perception and lower workload, but requires great caution in designing and implementing information on the windshield, right in the driver’s field of view. Human factors research therefore has to be conducted extensively in order to fully understand how to optimize the great technical opportunity that the HUD-AR concept offers for automotive, so as to keep the best and avoid the worst, taking road safety issues into account.
References
Pauzié, A., Amditis, A.: Intelligent Driver Support System functions in cars and their potential consequences on safety. In: Safety of Intelligent Driver Support Systems: Design, Evaluation, and Social Perspectives, pp. 7–25. Ashgate (2010)
Castro, C.: Visual demands and driving. In: Human Factors of Visual and Cognitive Performance in Driving. CRC Press (2008)
Caird, J.K.: A meta-analysis of the effects of cell phones on driver performance. Accid Anal Prev. 40(4), 1282–1293 (2008)
Liu, Y.C., Wen, M.H.: Driving performance of commercial vehicle operators in Taiwan: Comparison of head-up display (HUD) vs. head-down display (HDD). Int. J. Hum. Comput. Stud. 61, 679–697 (2004)
Wittmann, M., Kiss, M., Gugg, P., Steffen, A., Fink, M., Pöppel, E.: Effects of display position of a visual in-vehicle task on simulated driving. Appl. Ergon. 37, 187–199 (2006)
Kim, S., Dey, A.K.: Simulated augmented reality windshield display as a cognitive mapping aid for elder driver navigation. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2009), pp. 133–142 (2009)
Yantis, S., Jonides, J.: Attentional capture by abrupt onsets: New perceptual objects or visual masking? J. Exp. Psychol. Hum. Percept. Perform. 27(6), 1505–1513 (1996)
Wickens, C.D., Hollands, J.G.: Engineering Psychology and Human Performance, 3rd edn. Prentice Hall, Upper Saddle River, NJ (2000)
Bishop, R.: Intelligent Vehicle Technology and Trends. Artech House Inc., Norwood, MA (2005)
Walker, G.H., Stanton, N.A., Young, M.S.: Where is computing driving cars? International Journal of Human-Computer Interaction 13(2), 203–229 (2001)
Tufano, D.R.: Automotive HUDs: the overlooked safety issues. Hum. Factors 39(2), 303–311 (1997)
Burnett, G.: A road-based evaluation of a head-up display for presenting navigation information. In: Proceedings of the Tenth International Conference on Human-Computer Interaction, pp. 180–184. Lawrence Erlbaum Associates, New Jersey (2003)
Horrey, W.J., Wickens, C.D., Alexander, A.L.: The Effects of Head-Up Display Clutter and In-Vehicle Display Separation on Concurrent Driving Performance. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, p. 1880 (2003)
Kiefer, R.J.: Effects of a head-up versus head-down digital speedometer on visual sampling behavior and speed control performance during daytime automobile driving (SAE Technical Paper 910111). Society of Automotive Engineers, Warrendale, PA (1991)
Gish, K.W., Staplin, L.: Human Factors Aspects of Using Head Up Displays in Automobiles: A Review of the Literature, DOT HS 808 320 (1995)
Charissis, V., Papanastasiou, S.: Human–machine collaboration through vehicle head up display interface. Cogn. Technol. Work 12, 41–50 (2010)
Liu, Y.C.: Effect of using head-up display in automobile context on attention demand and driving performance. Displays 24, 157–165 (2003)
Okabayashi, S., Sakata, M., Fukano, J., Daidoji, S., Hashimoto, C., Ishikawa, T.: Development of practical heads-up display for production vehicle application (SAE Technical Paper No. 890559). Society of Automotive Engineers, New York (1989)
Tonnis, M., Lange, C., Klinker, G.: Visual longitudinal and lateral driving assistance in the head-up display of cars. In: Proceedings of the Sixth IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, pp. 128–131 (2007)
Okabayashi, S., Sakata, M., Hatada, T.: Driver’s ability to recognize objects in the forward view with superposition of head-up display images. Proceedings of the Society for Information Display 32, 465–468 (1991)
Iavecchia, J.H., Iavecchia, H.P., Roscoe, S.N.: Eye accommodation to head-up virtual images. Hum. Factors 30, 689–702 (1988)
Smith, G., Meehan, J.W., Day, R.H.: The effect of accommodation on retinal image size. Hum. Factors 34, 289–301 (1992)
Sojourner, R.J., Antin, J.F.: The effects of a simulated head-up display speedometer on perceptual task performance. Hum. Factors 32(3), 329–339 (1990)
Roscoe, S.: The trouble with HUDs and HMDs. Hum. Factors Soc. Bull. 30(7), 1–3 (1987)
Tretten, P., Gärling, A., Nilsson, R., Larsson, T.C.: An On-Road Study of Head-Up Display: Preferred Location and Acceptance Levels. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 55, p. 1914 (2011)
Weintraub, D.J., Ensing, M.: Human Factors Issues in Head-Up Display Design: The Book of HUD (CSERIAC State-of-the-Art Report) (1992)
Bossi, L., Ward, N., Parkes, A.: The effect of simulated vision enhancement systems on driver peripheral target detection and identification. Ergon. Design 4, 192–195 (1994)
Azuma, R.: A survey of augmented reality. Presence: Teleoperators Virtual Environ. 6(4), 355–385 (1997)
Park, H.S., Park, M.W., Won, K.H., Kim, K.H., Jung, S.K.: In- vehicle AR-HUD system to provide driving-safety information. ETRI J. 35(6), 1038–1047 (2013)
Narzt, W., Pomberger, G., Ferscha, A., Kolb, D., Muller, R., Wieghardt, J., Hortner, H., Lindinger, C.: Augmented reality navigation systems. Univ. Access Inf. Soc. 4(3), 177–187 (2006)
Sawano, H., Okada, M.: A car-navigation system based on augmented reality. In: SIGGRAPH 2005 Sketches, p. 119 (2005)
Scott-Young, S.: Seeing the Road Ahead: GPS-Augmented Reality Aids Drivers. GPS World 14(11), 22–28 (2003)
Lorenz, L., Kerschbaum, Ph., Schumann, J.: Designing take over scenarios for automated driving: How does augmented reality support the driver to get back into the loop? In: Proceedings of the Human Factors and Ergonomics Society 58th Annual Meeting (2014)
Weihrauch, M., Melocny, G., Goesch, T.: The first head-up display introduced by General Motors (SAE Technical Paper No. 890228). Society of Automotive Engineers, New York (1989)
Bergmeier, U., Lange, C.: Acceptance of Augmented Reality for driver assistance information. In: Proceedings 2nd International Conference on Applied Human Factors and Ergonomics, Las Vegas (2008)
Sato, A., Kitahara, I., Yoshinari, K., Yuichi, O.: Visual navigation system on windshield head-up display. In: Proceedings of 13th world congress & exhibition on intelligent transport systems and services (2006)
Plavsic, M., Bubb, H., Duschl, M., Tonnis, M., Klinker, G.: Ergonomic Design and Evaluation of Augmented Reality Based Cautionary Warnings for Driving Assistance in Urban Environments. In: Proceedings of the International Ergonomics Association (2009)
Gish, K.W., Staplin, L., Stewart, J., Perel, M.: Sensory and Cognitive Factors Affecting Automotive Head-Up Display Effectiveness. Transportation Research Record 1694, Paper No. 99–0736, pp. 11-19 (1999)
Davis, F.D.: Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 13, 319–339 (1989)
Smith, S., Shih-Hang, F.S.: The relationships between automobile head-up display presentation images and drivers’ Kansei. Displays 32, 58–68 (2011)