
1 Introduction

Sensor-based, mobile behavioral analytics offer a new form of adaptive instructional system for researchers and learning scientists to consider. Real-time use of data from Internet of Things (IoT) connected devices provides a means of processing data quickly and efficiently under dynamic, changing conditions such as live simulation training. Once these data streams are captured, they may be processed in real time using artificial intelligence (AI)-infused machine learning techniques such as deep learning and computer vision. In this paper, we describe our conceptualization of an adaptive, human-machine instructional system incorporating IoT and video behavioral sensing of first responder teams engaged in a large-scale live simulation training exercise, examined through an exploratory research effort.

2 Adaptive Human-Machine Instructional Systems and Emergency Response Teams

Adaptive instructional systems are defined as “…artificially-intelligent, computer-based systems that guide learning experiences by tailoring instruction and recommendations based on the goals, needs and preferences of each individual learner or team in the context of domain learning objectives” (Bell and Sottilare [2], p. 3). When the domain is the complex system of multiple emergency response teams engaged in a real-world live simulation training exercise, the adaptive instructional computer-based system may be conceived somewhat differently. A system of sensor-based devices that seamlessly collects behavioral and environmental data in situ (i.e., in the real world as live action unfolds) allows relevant data streams to be automatically processed and extracted to inform learning and training in the simulation debrief or after-action review. This integrated, seamless data collection from multiple sources can provide targeted, real-time visualization of important temporal, behavioral, and environmental metrics to inform an iterative cycle of reflection, learning, and improved performance for the first responder teams.

Specifically, this proposed human-machine iterative cycle represents a socio-technical system whose adaptivity lies in its ability to observe and sense the environment, leveraging machine learning algorithms to quickly interpret contextual information from the building environment and the dynamic behavioral activity of the teams as it unfolds in real time, providing both human and machine feedback for learning. This adaptive and reciprocal feedback loop (see Fig. 1) supports the selection and visualization of rich sensor-based information using machine learning algorithms and computer vision techniques. These advanced computational methods can better inform leadership, incident command, and the first responder teams about their performance and the environmental conditions of the simulated emergency, providing situational awareness and contextual and tactical feedback with which to adapt their behavior for the next simulation run. Individually and in combination, these processed data streams can provide targeted, integrated information for enhanced human sensemaking of a complex and dynamic live-action scenario involving multiple teams and large amounts of behavioral and environmental data. Machine learning and computer vision techniques support data-driven decision-making for the responder teams that goes beyond what is possible with human observation and feedback alone.

Fig. 1. Adaptive cycle of a human-machine, sensor-based instructional system for use in emergency response simulation training

Currently, our work is situated in the multiteam emergency response training context. Emergency response multiteam systems are composed of functionally distinct teams from different agencies that must engage in interdependent activity to ensure public safety [10]. The nature of emergency situations places a premium not only on effective coordination among members of the same team (or agency), but also on effective coordination between these teams [9]. Our research program aims to improve coordination within and between first responder teams by utilizing a sensor-based adaptive instructional system.

This paper details our initial effort, which focuses on the response times of first responder organizations by providing a form of instant replay. This instant replay, with various representations of video and sensor data, permits enhanced feedback and research inquiry into the visualization of within- and between-team tactics and provides fine-grained timing of important behavioral events. When visualized for the participating first responder teams, these analytics open a new window for research into how teams function together, team tactical strategy, and incident command, and can potentially improve the shared common operating picture and overall performance of the system. We see this as a human-machine interactive cycle that seamlessly captures team behavior and environmental data through sensor and video devices during the live simulation training, leveraging machine learning and computer vision algorithms in real time for the processing, analysis, and visualization of selected information for enhanced feedback. The adaptive socio-technical human-machine system works in a cyclical reflective process: the sensor and video data are processed in the cloud and continuously updated as each simulation run or exercise provides additional data, from which the computational system adapts and learns, providing rich information for the human system to adapt and learn in turn.

3 Data Streams and User Studies in Emergency Response Simulation Training

When an emergency incident occurs, information from a variety of data sources is triggered that could potentially be mined with advanced computational techniques to inform emergency response and live simulation training. These data sources include: 1) public data streams such as television broadcasts, public radio communications, video, social media, physical sensing streams, and consumer drone footage; 2) first responder data streams such as mission-critical secure radio communications, wearable camera feeds, biometric data, emergency response robotic/drone information, and physical sensing devices such as blue force tracking; and 3) building environment contextual data streams such as indoor location-based data, occupancy data, Wi-Fi detection, humidity, temperature, light, sound, carbon dioxide, organic gases, and barometric pressure, with data visualized and represented on a digital twin model of the building. These data streams, individually or in combination, could potentially be tapped to inform incident response, team- and multiteam-based tactics, and team- and multiteam-based learning and decision-making, while also capturing the changing conditions of an emergency such as a fire, severe weather, or an active violence incident (AVI). Used in a secure manner, these information streams may enhance the contextual and situational awareness of first responders, their leadership, and the public. In simulated emergency conditions, these seamlessly captured and processed multimodal data streams can also provide access to important behavioral and environmental data for learning and training. Leveraging advanced artificial intelligence (AI) computational techniques to process and sift through massive amounts of this multimodal data permits the visualization of important metrics and data-driven analyses for immediate viewing by first responder teams and their leadership. This creates an opportunity for a socio-technical, adaptive, and reciprocal human-machine learning cycle. Specifically, capitalizing on machine learning and computer vision techniques can transform these large amounts of in-situ data into enhanced information for learning and training feedback, with dynamic and actionable intelligence for first responders to improve individual, team, and multiteam system response and behavior (see Fig. 1).
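To make the idea of combining such heterogeneous streams concrete, the short Python sketch below shows one way the streams might be normalized into a common record format before joint processing. The stream categories mirror the three groups above; the class, field names, and example values are our own illustrative assumptions, not a schema from the exercise.

```python
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Any, Optional


class StreamType(Enum):
    """Stream categories mirroring the three groups described above."""
    PUBLIC = "public"        # broadcasts, social media, consumer drone footage
    RESPONDER = "responder"  # secure radio, wearables, blue force tracking
    BUILDING = "building"    # occupancy, Wi-Fi detection, temperature, gases


@dataclass
class StreamRecord:
    """A single normalized observation from any multimodal source."""
    stream_type: StreamType
    source_id: str                  # e.g., a sensor pod or camera ID
    timestamp: datetime
    location: Optional[str] = None  # indoor zone on the digital twin
    payload: dict[str, Any] = field(default_factory=dict)


# Example: a hypothetical occupancy reading from a building sensor pod.
reading = StreamRecord(
    stream_type=StreamType.BUILDING,
    source_id="pod-07",
    timestamp=datetime.now(timezone.utc),
    location="arena-concourse-east",
    payload={"occupancy_count": 12, "co2_ppm": 640},
)
print(reading.stream_type.value, reading.payload)
```

A shared record of this kind would let downstream analytics treat a drone video frame, a radio event, and an occupancy reading uniformly when aligning them in time and space.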

In addition to providing enhanced data-driven information to the first responder teams, an adaptive sensor-based instructional system can also give citizens participating in the live simulation training exercise an opportunity to experience and react to the use of various communication channels that incorporate these data streams. For example, the system could include processed data and recommended evacuation routes that change as the dynamics of an emergency situation change. When new forms of communication and enhanced information are introduced to citizens in an emergency response simulation training exercise, these citizens can provide important input and guidance through iterative cycles of user studies to improve the system. User studies provide an important iterative revision cycle, evaluating the new forms of information and resulting communication from both the first responder and the citizen user perspectives.

Figure 1 broadly captures this adaptive cycle of a human-machine, sensor-based instructional system for use in emergency response simulation training. The cyclical and iterative phases are depicted from the start of an incident response: the multimodal data streams described above are seamlessly collected (represented in the graphic as buckets of integrated data streams), processed through machine learning and advanced computational techniques in the cloud, and then visualized, providing enhanced analytics and information for both the first responders and the public, and improved through user studies (see Fig. 1).

4 Research and Development Targeting the Adaptive Human-Machine Iterative Experiential Learning Cycle

The sensor-based adaptive instructional system with seamless data collection described in this paper collects multimodal stream analytics from multiple data streams (e.g., sensing through video, audio, IoT building environment devices, media, social media, and wearable technologies, represented on a digital twin of the building) to extract relevant information about the context of the simulated emergency and individual, team, and multiteam behaviors. We propose a systematic approach to seamlessly collect these data on first responder team activity and the surrounding environment in the live simulation exercise, in a dynamic and adaptive human-machine cycle, to improve learning and training. The selected data streams are efficiently processed for relevant patterns through advanced computation, including machine learning and computer vision techniques. These patterns are stored with cloud metadata and indexed in an Elasticsearch database, resulting in a structured representation for targeted retrieval of information such as timing and behavioral data on various types of visual interfaces. These computational techniques can thus efficiently process multiple, multimodal, and continuous data streams to identify important behavioral patterns and information (e.g., which persons were in a particular building location at a designated point in the action). The relevant information is visualized in real time for the first responder teams and their leadership to leverage for enhanced reflection and feedback, which the teams use to learn and adapt to the changing, dynamic conditions from run to run. The adaptive instructional cycle results from the capture, ingestion, and advanced computation of multiple data streams that, when automatically processed in real time, provide insights into specific patterns of behavior; these are made immediately available to first responder leadership and the participating teams for reflection and sensemaking in the debrief, providing a basis for experiential learning and behavior change.
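As an illustration of the storage and retrieval step, the following sketch uses the official Elasticsearch Python client to index a detected behavioral event and then query who was in a given location during a time window. The endpoint, index name, and event fields are hypothetical, and the query assumes the zone field is mapped as a keyword; this is a sketch of the pattern, not the system's actual schema.

```python
from datetime import datetime, timezone

from elasticsearch import Elasticsearch  # pip install elasticsearch

# Hypothetical endpoint and index name; adjust to the actual deployment.
es = Elasticsearch("http://localhost:9200")
INDEX = "simulation-events"

# Index one behavioral event emitted by the ML/computer vision pipeline.
event = {
    "run_id": "run-01",
    "event_type": "responder_entered_zone",
    "person_id": "ems-03",
    "zone": "arena-concourse-east",
    "timestamp": datetime.now(timezone.utc).isoformat(),
}
es.index(index=INDEX, document=event)

# Targeted retrieval: who was in a given zone during a time window?
resp = es.search(
    index=INDEX,
    query={
        "bool": {
            "filter": [
                {"term": {"zone": "arena-concourse-east"}},
                {"range": {"timestamp": {
                    "gte": "2024-01-01T14:00:00Z",
                    "lte": "2024-01-01T14:05:00Z",
                }}},
            ]
        }
    },
)
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["person_id"], hit["_source"]["timestamp"])
```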

4.1 Kolb’s Experiential Learning Cycle

This type of experiential learning cycle aligns with Kolb’s experiential learning theory [8]. The cycle may be described as “…the process whereby knowledge is created through the transformation of experience,” in which “…knowledge results from the combination of grasping and transforming experience” [7]. The sensor-based, adaptive instructional system grasps fine-grained data on the first responder teams’ behavioral activity and the environmental conditions of the simulated emergency, transforming and processing it for enhanced situational awareness and reflection in the debrief or after-action review. Kolb’s (1984) theory describes two modes of experiential learning: grasping experience, through a cycle of Concrete Experience (CE) and Abstract Conceptualization (AC), and transforming experience, through Reflective Observation (RO) and Active Experimentation (AE). The first responder teams have an initial concrete experience in the live simulation training exercise; aspects of that experience are captured and transformed into data representations, visualized as an abstract conceptualization of the experience for reflective team-based observation and learning. As that learning and reflection occur, with enhanced guidance from leadership and trainers, they can be leveraged and acted upon through active experimentation in the next simulation run. In this way, the human-machine dynamic and reciprocal feedback loop provides an iterative representation of an adaptive learning cycle that reflects Kolb’s description of experiential learning. The system is adaptive in that it captures dynamic behavioral information and ingests the changing conditions of the emergency situation, processing them to provide enhanced information for human interpretation and sensemaking. The computational part of the system autonomously captures this behavioral and contextual data to process and reveal selected important aspects, such as response time intervals or the location and dispersion of personnel in the exercise, to the first responder trainers and leadership for feedback. The human part of the system associates and interprets the behavior of the first responder teams and the contextual information from the building environment to provide additional guidance and support for team- and multiteam-based experiential learning, much as instant replay helps sports teams improve their learning and performance in the next play.

4.2 Visualization of Information in the Sensor-Based Adaptive Instructional System

The visualized behavioral and contextual data are then incorporated into an enhanced common operating picture and training system to provide contextual and behavioral intelligence for reciprocal cycles of human sensemaking and machine computational analysis that may adapt to each other. As the first responder teams view the processed information, they can reflect and make decisions for the next run based on the highlighted, fine-grained data. For example, the computer vision algorithm may detect team behavioral patterns that are more or less efficient in providing care to the injured simulated patients in the scenario. The behavioral and environmental conditions are visualized back to the teams for enhanced learning and feedback to potentially influence the next run.

In turn, the computational system can potentially learn from each run the behavioral patterns of the teams: who was near whom, at what point in time, and in what part of the building, and whether the first responders worked well as a team or system of teams to care for all the simulated patients. This visualized information can illuminate, for example, the dispersion of team members and teams across the patient care needs in the situation and determine the timing of care given by responders to patients in need. If more responders react and begin to provide care to the first simulated patients they see, rather than searching for other patients in different parts of the building, outcomes for the later-identified patients can be compromised. The system can spatially visualize the location and occupancy of the first responders in specific parts of the building to demonstrate how in-situ decision-making influences response times or the behavioral patterns of teams. When this information is shown back to the teams, they can adjust their behavior. It is conceivable that, over time, the computational system may learn these patterns of contextual information, behavior, and related time intervals, iteratively improving the information visualized for response and training. While our work has not yet matured to this level or to the operational environment, we imagine a day when a fully integrated human-computational system may learn dynamically and adaptively as it leverages sensor-based data capture and intelligence to track and update contextual conditions for adaptive recommendations and informed decision-making in this context.
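A minimal sketch of two such metrics follows, assuming the system has already extracted responder positions on the floor plan and timestamped responder-patient contact events. The function names and data shapes are illustrative, not the system's actual interfaces.

```python
from statistics import mean


def dispersion(positions):
    """Mean pairwise Euclidean distance between responder positions
    (x, y in meters on the digital twin floor plan)."""
    if len(positions) < 2:
        return 0.0
    dists = [
        ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
        for i, (x1, y1) in enumerate(positions)
        for (x2, y2) in positions[i + 1:]
    ]
    return mean(dists)


def time_to_first_care(patient_events):
    """Seconds from scenario start to first responder contact per patient.

    patient_events: dict mapping patient ID to a time-sorted list of
    (seconds_since_start, responder_id) contact events.
    """
    return {
        pid: events[0][0] if events else None
        for pid, events in patient_events.items()
    }


# Illustrative data: responders clustered near one entrance, and one
# patient who was never reached (None flags a compromised outcome).
print(dispersion([(1.0, 2.0), (1.5, 2.5), (2.0, 2.0)]))
print(time_to_first_care({"patient-1": [(42.0, "ems-03")], "patient-2": []}))
```

A low dispersion value combined with unreached patients would surface, in the debrief, exactly the clustering pattern described above.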

5 Exploratory Study

In this paper, we detail our exploratory study, in which we focus on leveraging multimodal data stream analytics, visualization, and machine learning algorithms to examine these unstructured, multimodal data streams and gain insight into important timing, contextual information, and team- and multiteam-based activity [13]. The human-machine adaptive cycle above takes a broad system perspective, illustrating how data streams from multiple, multimodal sources can inform an emergency incident. Specifically, these sensor data streams can provide important timing, contextual information, and tactical, team-based information. Data from multiple sources were processed with machine learning and computer vision techniques to reveal and enhance the human experiential learning cycle, visualizing team- and multiteam-based tactics and timing while providing cyclical input from which the computational system can also potentially learn with each run. Improved selection, processing, and enhancement of important sensor-based information, including behavioral and environmental data, provide enhanced information and visualization to prompt improved human sensemaking and contextual reasoning about the emergency training situation. The reciprocal loop between the computational system and the human system provided a form of socio-technical adaptivity: grasping experience through the seamless capture of continuous data streams from the sensor-based technologies, and transforming the first responders’ concrete experience into visual information for reflective observation of important behavioral and contextual data in the debrief and for active experimentation based on this information in the next simulation run. This selective capture and visualization of information through advanced computation and machine-learned patterns contributed to human sensemaking and situational/contextual awareness, which could then adaptively contribute to experimentation and change in individual, team-, and multiteam-based behaviors. The human-machine adaptive cycle described here moves us closer toward an enhanced decision support system informed by the continually changing dynamics of the situation.

5.1 Live Simulation Training Context

Live simulation training is a form of training that places learners in a realistic, immersive experience within the context in which they work. Simulation is an effective tool for developing complex skills across multiple contexts. In comparison to alternative training approaches (e.g., tabletop exercises, computer simulations), live simulation training exercises more fully replicate the physical features of the real working environment by utilizing live actors and real equipment [5]. Within the emergency response environment, live simulation training is an important component of first responder preparedness. It affords first responders the opportunity to practice key processes including individual, team, and system coordination, planning, and communication [1]. These processes are practiced within an environment that elicits physiological responses similar to those the first responders are likely to experience in a real emergency. Additionally, live simulation training helps to elucidate team and between-team coordination, communication, and knowledge-sharing issues in an environment where these human errors can have major negative consequences [3].

Due to the personnel and material resources required to implement live simulation training, several design components must be carefully considered. It has been recommended that live simulations ensure that 1) instructional features are embedded within the simulation, 2) the exercise scenarios are carefully crafted and contain opportunities for performance measurement and diagnostic feedback, 3) the learning experience is guided (e.g., through timely facilitated debriefs), and 4) simulation fidelity is matched to task requirements [12]. Regarding the guided learning experience, researchers have noted that ineffective training tools used to assist training simulation instructors and trainees may result in suboptimal training effectiveness outcomes [4, 6]. For example, many live simulation training debriefs are conducted using observation-based methods, which may lose valuable information and therefore diminish the quality of performance feedback delivered to the learners [11]. However, adaptive human-machine instructional systems may address some of these concerns and offer a more effective way to deliver quality feedback to learners, facilitating the transfer of critical learning outcomes.

5.2 IoT Device Data Streams for Adaptive Instruction in Law Enforcement Training Example

Our exploratory research involved staging a live simulation active violence incident (AVI) in a large public arena on a university campus in the mid-Atlantic region of the U.S., with six separate first responder teams comprising more than 70 fire, EMS, and law enforcement personnel and volunteers. The exercise was part of an ongoing research and development effort geared toward improved understanding of how the integration of multiple streams of data can be leveraged in a human-machine adaptive experiential learning cycle to enhance public safety training and response effectiveness in emergency situations. Specifically, the staged scenario provided a pilot test environment incorporating adaptive cycles of feedback: capturing individual, team-, and multiteam-based experience and providing data-driven reflection and guidance with machine learning and computer vision techniques, leveraging smart building and sensor-based technology systems targeted to support first responders’ tactics and experiential learning.

The live simulation exercise pilot-tested several technologies instrumented in the arena, including video, location-based and occupancy sensors, and displays from a variety of technology innovators. Key technology partners developed the core IoT infrastructure, providing the analytics platform for utilizing first responder team behavioral data and daily building environment operation data, as well as a 3D digital twin visualization of the arena, to inform learning from emergency response simulation training. During routine operations, some of these smart building technologies were designed to improve the operational and energy efficiency of the arena, enhance comfort, and provide additional services to patrons. In an emergency simulation exercise, however, the 3D digital twin visualization, video, and sensors can also help responders more rapidly determine the location and type of emergency, find victims more quickly, and learn from their behavior and experience through mobile behavioral and building environment analytics to, ultimately, save lives.

5.3 Sensor Systems and Displays

Specific sensors and displays included 24 sensor pods incorporating remote video capability, Wi-Fi detectors, blue force tracking, LiDAR occupancy detectors, and particulate and environmental sensors, along with 2D/3D visualization tools. The analytics platform, together with a digital twin of the facility and a Wi-Fi-based indoor location platform, allowed rich information data streams to be captured, along with real-time first responder team tracking capabilities, for review in the simulation debrief to enhance team and multiteam learning.
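For illustration, the sketch below parses a hypothetical JSON message from one of these sensor pods into a normalized reading for downstream analytics. The field names are our own assumptions, since the vendors' actual payload schemas are not described here.

```python
import json

# Hypothetical JSON payload from one of the 24 sensor pods; the field
# names are illustrative assumptions, not a vendor's actual schema.
raw = """{
  "pod_id": "pod-07",
  "timestamp": "2024-01-01T14:02:11Z",
  "wifi_devices_detected": 18,
  "lidar_occupancy": 12,
  "particulate_ugm3": 35.2,
  "temperature_c": 21.4
}"""


def parse_pod_reading(message: str) -> dict:
    """Validate and normalize a raw pod message before ingestion."""
    reading = json.loads(message)
    required = {"pod_id", "timestamp", "lidar_occupancy"}
    missing = required - reading.keys()
    if missing:
        raise ValueError(f"pod message missing fields: {missing}")
    return reading


print(parse_pod_reading(raw)["lidar_occupancy"])  # -> 12
```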

The live simulation exercise engaged teams of first responders who, prior to this event, had never collectively participated in a joint exercise together. The staged incident was an active violence incident mass casualty exercise in the large arena, requiring the joint efforts of local fire, EMS, and law enforcement first responders. First responders were not provided the system interface and its available data, such as the 3D digital twin of the building, for use during the staged incident or prior to arriving on scene. After the exercise, the participating first responder teams and simulated patient actors were shown the adaptive instruction technology system for input and evaluation: during a feedback session, leadership representing the participating agencies were given the opportunity to closely review the technology system, and the first responder teams as well as the citizen volunteer simulated patients were interviewed about the adaptive system.

6 Data Analysis

We analyzed the captured video and sensor data displays during and after the simulation exercise. We recorded, transcribed, and analyzed the first responder debrief sessions to capture insights about the exercise itself. The first responder teams were then shown some of the captured and processed data for feedback, and an expert panel drawn from agency leadership provided rich feedback on the potential use of this type of system in their agencies.

7 Results

The teams and expert panel emphasized the desire for increased situational awareness for first responders. Members of the response agencies noted the usefulness of the 3D digital twin visualization in providing an enhanced and shared understanding of the operational environment, particularly for incident command responsible for managing personnel and response efforts. This was deemed especially helpful for first responder teams unfamiliar with the layout and structure of the building, which is likely to be the case in real emergency events. Additionally, the occupancy and location-based sensors were seen as helpful data streams for search and rescue after the exercise suspect was neutralized. The security of the collected data streams was also cited as an important consideration by all involved; accordingly, securing the data and mitigating cybersecurity attack or intrusion was highlighted as a critical component of the technology system’s design. When recording was enabled, all data collected via the technology system were housed in a secure custom solution.

We designed and employed an initial prototype of an AI-infused system (see Fig. 2) that seeks to augment the learning experience of emergency responders in the live simulation training exercise. The AI-infused, adaptive instructional system was created to support training instructors with multimodal learning analytics for enhanced training debriefs (e.g., after-action reviews). The system applied AI methods of computer vision and machine learning to process, in near real time, the various multimodal data streams collected by the aforementioned state-of-the-art sensing technologies during the training exercise.

Fig. 2. Initial prototype of an AI-infused system
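To suggest the kind of frame-level video processing such a system might perform, the following sketch samples recorded exercise video at roughly one frame per second and estimates how many people are in view using OpenCV's built-in HOG pedestrian detector. This detector stands in for the production computer vision models, which are not specified here, and the video file name is hypothetical.

```python
import cv2  # pip install opencv-python

# Classical HOG pedestrian detector as a stand-in for the actual models.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture("exercise_run01.mp4")  # hypothetical file name
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
frame_idx = 0
detections = []  # (seconds_into_run, number_of_people_in_frame)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % int(fps) == 0:  # sample roughly one frame per second
        rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        detections.append((frame_idx / fps, len(rects)))
    frame_idx += 1

cap.release()
print(detections[:5])  # timestamped occupancy estimates for the debrief
```

Timestamped person counts of this kind, fused with the pod and location sensors, are the raw material for the key indicators reviewed in the debrief.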

A core objective of the training simulation debrief after different scenario runs is to review and analyze the effectiveness of specific events of interest, such as the time differential between response dispatch and neutralization of the shooter, or the time of interaction between responders and victims or patients. The AI system provided first responders with an experimental user interface displaying information for these key indicators. Additionally, the AI-infused system provides a starting point for important insights into coordination activities within teams and between the different responder teams. This effort promotes the opportunity to facilitate both within- and between-team learning, an often neglected yet critical component of emergency response management operations. Future exercises will further explore the effectiveness of the technology and the AI-infused system in initiating change in behavioral patterns, or the coordinated dance, among the multiteam system composed of these brave first responder teams.
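As a simple illustration of how such time differentials might be computed once the system has extracted key events, consider the following sketch; the event names and timestamps are invented for the example.

```python
from datetime import datetime


def interval_seconds(timeline: dict[str, str], start: str, end: str) -> float:
    """Elapsed seconds between two named events on the exercise timeline."""
    t0 = datetime.fromisoformat(timeline[start])
    t1 = datetime.fromisoformat(timeline[end])
    return (t1 - t0).total_seconds()


# Illustrative event timestamps as the system might extract them.
timeline = {
    "dispatch": "2024-01-01T14:00:05",
    "first_unit_on_scene": "2024-01-01T14:03:40",
    "shooter_neutralized": "2024-01-01T14:06:12",
    "first_patient_contact": "2024-01-01T14:07:30",
}

print(interval_seconds(timeline, "dispatch", "shooter_neutralized"))  # 367.0
print(interval_seconds(timeline, "shooter_neutralized",
                       "first_patient_contact"))                      # 78.0
```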

Preliminary insights from the training and the initial exploratory data analysis indicate that first responders and their leadership see significant value in this type of joint agency training event for building trust among the different agencies that would respond and work together in an actual event. Situating the live simulation training and the corresponding research cycles for evaluating these emerging technologies, building intelligence, and their security in a university setting provides a unique opportunity to explore new technologies in situ, incorporating direct insights from first responders in an applied research approach. The combination and integration of technology systems may provide enhanced situational awareness and security within a building, and first responder wearable devices collecting sensor and environmental data to enhance actionable data for emergency preparedness training will also allow researchers to study within- and between-team interaction and team and multiteam system learning with real-time data from these devices. Future training and research events are currently being planned.

8 Summary

In our work supporting the experiential learning of first responder teams, we connect their concrete experience, captured through seamless sensor-based data on the behavioral activity taking place in the real world, to processing through advanced machine computational techniques and representation as an abstract conceptualization of team- and multiteam-based behavior for reflective observation and learning. This reflective human observation by the first responder teams and leadership leads to active experimentation by the teams in the next simulation run, based on their learning and reflection on the processed, AI-infused data streams representing their prior team activity. Creating a reciprocal human-machine iterative loop that supports this experiential learning cycle is the goal of this ongoing research and development. We position the described system as an adaptive instructional system in the sense that, by ingesting and processing dynamic information about the changing conditions of the simulated emergency situation, it provides opportunities for enhanced human interpretation and sensemaking. The computational part of the system autonomously captures behavioral and contextual data for processing in the cloud, and artificial intelligence techniques help to reveal important aspects and timing information for shaping the emergency response multiteam system’s future behavior. The human part of the system interprets this enhanced feedback in the debrief to influence the experiential learning cycle and behavior of the first responder teams in an adaptive, real-time change cycle based on those data. This human-machine cycle may provide a novel conceptualization of adaptive instructional systems for future research.