Research
Impact of Motion and Eye-Tracking Data in Augmented Reality Learning
In our current NSF project, we conducted multiple experiments to examine the effects of metacognitive monitoring feedback in a new motion- and location-based AR learning environment. Additionally, we investigated the changes in motion and eye-tracking data with metacognitive monitoring feedback and their relationship to students' AR learning.
Augmented Reality in Undergraduate Engineering Laboratories
AR technology and sensor-based motion- and location-tracking systems are combined to present fine-grained, computer-generated 3D images alongside actual physical objects in real time. The team innovates technologically on the AR system by integrating a real-time motion-tracking sensor with an optical see-through head-mounted display (OST-HMD).
Integrating Augmented Reality and Location Tracking Systems
In this study, we developed an interactive learning solution for engineering education by combining augmented reality (AR) and Near-Field Electromagnetic Ranging (NFER) technologies. We built an instructional system that integrates AR devices and real-time positioning sensors to improve the interactive experience of learners in an immersive learning environment.
Indoor Tracking System for Health Care Process Validation
This study investigates the reliability of the Near-Field Electromagnetic Ranging (NFER) system in capturing real-time location data of nurses’ workflow and evaluates whether the NFER system could serve as a new method for conducting a time-motion study in an intensive care unit (ICU).
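A time-motion study of this kind typically aggregates how long each nurse spends in each area of the unit. As a minimal sketch, assuming the positioning system emits a time-ordered stream of (timestamp, zone) readings (the data shape and function name are our illustration, not the study's actual pipeline):

```python
from collections import defaultdict

def dwell_time_per_zone(samples):
    """Total dwell time per zone from time-ordered (timestamp_s, zone) readings.

    Credits the interval between consecutive samples to the zone reported
    at the start of that interval.
    """
    totals = defaultdict(float)
    for (t0, zone), (t1, _) in zip(samples, samples[1:]):
        totals[zone] += t1 - t0
    return dict(totals)

# Hypothetical readings: 15 s in zone "A", then 10 s in zone "B".
print(dwell_time_per_zone([(0, "A"), (10, "A"), (15, "B"), (25, "B")]))
```

A real NFER deployment would add gap handling for dropped readings; this sketch assumes a continuous sample stream.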
Measuring Human Performance using Eye-tracking Technology
In this study, we developed a gauge-monitoring simulation and applied eye- and head-tracking technology to assess the effectiveness of the flow, level, pressure, and temperature gauge shapes developed by the Abnormal Situation Management Consortium. The primary objective of this research is to investigate the relationship between human perceptual stress (pupil diameter) and performance (sensitivity d′) in a process-control monitoring task using the eye- and head-integrated human-in-the-loop (HITL) simulation.
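Sensitivity d′ is the standard signal-detection measure: the difference between the z-transformed hit rate and false-alarm rate. A minimal sketch of the computation (the function name and the log-linear correction for extreme rates are our choices, not details from the study):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' = Z(hit rate) - Z(false-alarm rate)."""
    # Log-linear correction keeps rates away from exactly 0 or 1,
    # where the inverse normal CDF is undefined.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    return z(hit_rate) - z(fa_rate)

# An operator who detects most abnormal gauge states with few false alarms
# yields a clearly positive d'.
print(d_prime(hits=45, misses=5, false_alarms=10, correct_rejections=40))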
Eye Inter-fixation Analysis of User Behavior in a Dynamic Control Task
The primary objective of this research is to investigate the relationship between eye inter-fixation time and situation awareness in a dynamic control task. In this study, participants were trained to use the Anti-Air Warfare Coordinator (AAWC) simulation. According to the results, participants with relatively high levels of situation awareness showed lower inter-fixation times than those with relatively low levels of situation awareness.
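Inter-fixation time is the gap between the end of one fixation and the start of the next. As a sketch of how it can be derived from an eye-tracker's fixation sequence (the (onset, offset) representation in milliseconds is our assumption, not the study's actual data format):

```python
def inter_fixation_times(fixations):
    """Gaps between consecutive fixations.

    fixations: time-ordered list of (onset_ms, offset_ms) pairs.
    Returns the list of gaps (next onset minus previous offset).
    """
    return [next_on - prev_off
            for (_, prev_off), (next_on, _) in zip(fixations, fixations[1:])]

# Three hypothetical fixations produce two inter-fixation gaps.
gaps = inter_fixation_times([(0, 200), (240, 500), (530, 800)])
print(gaps)                      # gaps between fixations, in ms
print(sum(gaps) / len(gaps))     # mean inter-fixation time
```

Under the study's finding, participants with higher situation awareness would show a lower mean of these gaps.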
Measuring Effectiveness of Aftermarket Crash Avoidance Technology
This study aimed to assess the collision avoidance capabilities of aftermarket products through on-road evaluation. The team conducted a real-driving usability study to evaluate aftermarket Collision Avoidance Technology (CAT) devices using eye-tracking technology.
Developing NGOMSL model for EMR Use in an Emergency Department
The primary objective of this research is to develop a Natural GOMS Language (NGOMSL, where GOMS stands for Goals, Operators, Methods, and Selection rules) model for Electronic Health Record (EHR) use in an Emergency Department (ED). In this study, the research team visited the ED Clinical Engineering lab at the Mayo Clinic in Minnesota and collaborated with scientists in systems engineering to observe emergency physicians’ and nurses’ behaviors related to the EHR process. Behavior-monitoring equipment in the lab was used to collect physicians’ and nurses’ behavior data. The team developed hierarchical task analysis (HTA) charts from the collected data. Following this, the team will develop the NGOMSL model based on the HTA charts.
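An HTA chart decomposes a top-level goal into nested subtasks, and the leaf tasks become candidates for elementary operators in the NGOMSL model. A minimal sketch of such a decomposition as a nested structure (the task names below are hypothetical illustrations, not tasks from the Mayo Clinic observations):

```python
# Hypothetical HTA fragment for an EHR documentation goal.
hta = {
    "task": "Document patient encounter in EHR",
    "subtasks": [
        {"task": "Log in to workstation", "subtasks": []},
        {"task": "Open patient record", "subtasks": [
            {"task": "Search by patient ID", "subtasks": []},
        ]},
        {"task": "Enter clinical notes", "subtasks": []},
    ],
}

def count_leaf_tasks(node):
    """Count leaf tasks, i.e. the elementary actions a GOMS-style
    model would assign operators to."""
    if not node["subtasks"]:
        return 1
    return sum(count_leaf_tasks(child) for child in node["subtasks"])

print(count_leaf_tasks(hta))
```

A traversal like this is one simple way to move from an HTA chart toward counting the operators an NGOMSL model needs to specify.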