Three people interact with an apartment model on a desk using virtual reality headsets.

The Johns Hopkins Engineering Center for Learning Design and Technology (CLDT) congratulates the inaugural winners of the Extended Reality (XR) Innovation Awards, the first awards at JHU to reflect the school's investment in extended reality for learning and research. These new awards support projects in teaching, learning, and research involving spatial computing, virtual reality, augmented reality, and mixed reality.

In addition to supporting XR efforts through these awards, the CLDT has set up an XR lab in the Stieff Silver building, suite 110, where faculty, staff, and students can incorporate alternate realities into their research projects and lesson planning.

Extended Reality (XR) Innovation Award Winners

Making Observations in the Clinic Through Virtual Reality
Eileen Haase, Brock Wester, Caitlin Torgerson, Arielle Drummond, Soumya Acharya, Jon Resar, Rodney Omron, Clifford Weiss, and Ashish Nimgaonkar

For biomedical engineers, hospital immersion is a central component of observing and classifying clinical needs. Both the BME undergraduate and graduate (CBID) design teams spend weeks on the hospital floor identifying potential areas for improvement, then apply the design process to develop a prototype that solves a problem. This model of bringing students into the operating room, while inspiring, has significant limitations. The number of students allowed on the hospital floor has always been strictly limited, so many biomedical engineering students never have a chance to spend time in an operating room. More recently, the pandemic has made it difficult for anyone who is not a clinician to make observations. We propose using extended reality (XR), including augmented and virtual reality, to bring the hospital examining and operating rooms to every student in biomedical engineering, from first-years through the graduate level. We will start by focusing on bedside procedures commonly performed in hospital and clinic settings; the cardiovascular system, including transcatheter aortic valve replacement (TAVR) and EKGs; imaging (ultrasound, echocardiograms, interventional radiology); and interventional gastroenterology. The goal is to combine anatomical XR with clinical immersion so that students can experience a system from the perspectives of the doctor, the patient, and the organ.

Gamifying Linear Algebra With AR and VR
Joseph Cutrone, Sergey Kushnarev

This project seeks to build three interactive AR mobile games for teaching, practicing, and engaging with content from introductory linear algebra. The AR games will use the GPS locations of iconic areas, images, or statues on the Homewood campus, where students apply linear algebra concepts to solve puzzles involving data analysis, modeling, and cryptography. The three games will be connected by a storyline that provides a final clue once all individual puzzles are solved. For online students, a VR model of the JHU campus will simulate the experience of locating the items on campus. Engagement metrics, completion data, and quiz scores will be collected for research on the effectiveness of the XR experience for students.

Cybersecurity Learning Through Extended Reality
David Concepcion, Nelson Sanchez

This application proposes the design and development of a virtual reality environment where students will learn, communicate, interact, and share information about cybersecurity. In today’s world, cybersecurity must be applied to all manner of technologies, from web-based and mobile applications to internet-of-things devices, cloud services, containers, and many other complex systems. Cybersecurity will take hold only if each IT professional makes the effort to build systems with security at the forefront. Most IT systems comprise multiple layers and components, and that complexity makes it a challenge to grasp where to apply security principles. We propose implementing progressive learning techniques in collaboration with the APL XR team to design, develop, and deploy a virtual learning environment prototype that provides an interactive 3D representation of the interfaces, components, actors, and data related to cybersecurity. The first phase of our project will provide WSE with the digital components of an XR learning environment. The second phase would use those components to create a scenario-driven gamification system that entices students to play repeatedly while discovering new outcomes, thus cementing learned cybersecurity concepts.

Extended Reality Training and Assessment System for Health Care
Ehsan Azimi, Chien-Ming Huang, Nassir Navab, Judy Huang

The complexity of medical interventions and processes is continuously increasing. At the same time, working-hour restrictions, rising costs, and ethical concerns regarding patient safety mean that clinical training opportunities are continuously decreasing. Educating and training healthcare professionals to a high standard remains a challenge, and despite considerable improvements in immersive technologies such as augmented reality (AR) and virtual reality (VR), these modalities have not yet been integrated systematically into the curriculum or practical training of healthcare professionals. We therefore propose an interactive ecosystem for training and assessment of clinical tasks in mixed reality, consisting of authoring the desired surgical task, immersive training and practice, assessment of the trainee, and sensory and behavioral analysis. This information-based ecosystem will also provide the data to train machine learning algorithms. We use the insertion of a central venous line, a common procedure that is widely taught and learned, as our illustrative use case; however, the modular design of the system makes it expandable to other procedures. Our mixed reality training ecosystem is agnostic to the procedure and specific hardware, which makes it scalable and sustainable.