Educational Innovation through Extended Reality Initiative (XRI)
The Extended Reality Initiative (XRI) began in fall 2021 with funding from the IU Indianapolis “(Re)Building Community Through Engagement” grant, which supported proposals offering ways of “(re)building our campus community through intentional engagement strategies.” With this funding, the XR Initiative launched with a half-day event that highlighted extended reality technologies such as Oculus Quest VR headsets, the MERGE Cube, Google Cardboard VR glasses, and more.
The XR Initiative is a collaboration among multiple units that will develop future programming highlighting innovative technologies available for use in teaching and learning. The initiative aims to inspire faculty to develop and implement course activities using these technologies and to inform students about how they can use these emerging technologies to enhance their studies at IU and inspire their future careers.
The XRI has since expanded to offer XRI Faculty Fellow grants, which introduce extended reality technologies to faculty and encourage the development and implementation of these innovative technologies in their courses.
2024 XRI Faculty Fellow Grant Recipients & Project Descriptions
Associate Professor of Clinical Anatomy, Cell Biology & Physiology – IU School of Medicine IU Indianapolis/IU Northwest Gross Anatomy & Neuroanatomy Courses
This project will involve students enrolled in the Human Structure course at Indiana University School of Medicine at the IU Northwest campus. The VR experience/assignment will focus on the gross anatomy portion of the course. Students learn much of their gross anatomy through anatomical dissection of a human cadaver. To do this, 30 students are divided into 5 dissection groups, each consisting of 6 students. Each group of 6 students is further divided into an “A” dissecting group and a “B” dissecting group. Groups A and B perform cadaver dissections on alternate days. At the beginning of each laboratory session, the group that completed the prior dissection peer-teaches that material to the group that will be performing the current dissection. Thus, if Group A dissected on Tuesday and Group B dissects on Wednesday, Group A peer-teaches the material from Tuesday to Group B before Group B begins their dissection on Wednesday. Currently, students are provided with a “pre-lab” dissection video of a faculty member demonstrating, on a prepared cadaver, the structures students will be seeing in the laboratory. Student evaluations have shown that students do not find these videos helpful in preparing them for the laboratory. Thus, finding an alternative solution that adequately prepares students for each dissection is an ongoing objective of mine.
I would like to try utilizing VR Anatomy activities designed to replace the pre-laboratory videos and see whether students feel the activities make them better prepared for each day's dissection than the current videos do. My working hypothesis is that students will find a short VR activity introducing them to the structures they will be locating in the lab more efficient and effective than the current pre-laboratory videos. My initial plan is to choose two laboratory sessions that involve material that is historically very difficult to visualize on the human cadaver itself, create “pre-lab” activities for each of those labs using the VR software, and then have students use the VR software to visualize the structures prior to performing their dissections. I will distribute short questionnaires to students after they’ve completed their dissections and ask them to compare the utility of visualizing structures in VR prior to lab with the utility of watching the pre-lab videos. I will create pre-laboratory activities for a minimum of two laboratory sessions so that students in both Group A and Group B will be given an opportunity to utilize the VR software.
There is one other aspect of the course where I envision VR Anatomy has the potential to be beneficial for students. Recall that not all students perform every dissection. I'd therefore like to make the VR anatomy program(s) available to students in non-dissecting groups for any lab they might choose. Thus, if Group A is not performing a particular dissection, they could spend the "lab" time going over the structures to know for that lab using VR Anatomy. Because this is a secondary goal, it will be optional for students. However, I suspect that if the software itself is detailed enough, students would prefer to review the laboratory structures as a group using VR software in place of the mediocre "review" materials we currently provide.
Stuart Schrader Clinical Associate Professor – IU School of Dentistry IU Indianapolis VR Use in Providing a Solution for Dental Fear, Anxiety, and Fear of Pain
Not enough patients living with special needs seek out or complete dental treatment due to their dental anxiety and/or fear of pain. The question then arises of how to provide a solution for dental fear, anxiety, and pain reduction that is mindful of: (1) the need for reducing opioid prescriptions and their unintended consequence of possibly leading to addiction, (2) contraindications of using certain anesthetics (e.g., increased heart rate from epinephrine for hypothyroid, medically complex patients), and (3) how to support patients’ stated interest, expressed in the DentaVox study, in finding a non-pharmacological method for managing anxiety and pain. More generally, 45% of that study’s participants reported feeling that local anesthesia was harmful or somewhat harmful, and 80% reported being very to somewhat likely to refrain from local anesthesia if this further reduced their dental bill. Ultimately, these intersecting conditions and study findings strongly suggest that both the general patient population and people living with special care considerations want non-medication alternatives that address the reasons they either don’t visit a dentist, don’t stay the full length of their visit, or don’t return for immediate or near-term follow-up visits.
One such solution for anxiety and pain is virtual reality (VR) and mixed reality (MR), an emerging non-pharmacologic, cognitive-behavioral (CB), therapeutic form of meditative relaxation, behavioral distraction, and mitigation and management of patients’ anxiety and pain. According to Riva, Wiederhold, and Mantovani (2019), clinical VR exposure is ‘embodied medicine.’ Its advantages, according to Kenney (2018), include enhancing CB positive states within VR by introducing hypnotherapeutic imagery, messaging, and auditory responses. Felemban et al. (2021), Lopez-Valverde et al. (2020), Wiederhold et al. (2014), and Tanja-Dijkstra et al. (2014) have all found that using VR in clinical dentistry when in a dental chair, while applying a topical anesthetic, during needle insertion, after administering local anesthetic, and following many types of dental procedures/surgeries can: (1) reduce self-reported and physiological measures of a patient’s anxiety and pain after dental treatment, (2) increase a patient’s perceived control, and (3) help reduce patients’ vivid memories, which can in turn limit future dental fear and/or anxiety.
Implementation Plan
Although dental schools like ours have begun to teach about the merits, challenges, and opportunities afforded by such emerging XR technologies, we have not brought our students up to speed on how these technologies can be integrated within the flow of post-humanistic, patient-centered care. In other words, we may teach, for example, about the importance, use, benefits, and pitfalls associated with CAD/CAM 3D printing of dental materials and products (e.g., crowns) and the use of deep learning narrow artificial intelligence assessment/diagnostic systems (e.g., for detecting oral cancerous lesions). However, we do not yet add further knowledge or application of how such technological innovations best fit, complement, and integrate within a humanistic behavioral clinical patient management approach, in an ever-increasing transhumanistic clinical provider world that needs to learn how to blend person- and relationship-centered provider-patient skills with the use of XR technologies in clinical settings. Dental students need an XR clinical education lab space to safely begin to learn, discuss, and apply the principles of person-centered care (e.g., caring, empathy, compassion) when operationally and effectively using XR technologies within clinical encounters. The development of such an educational lab space will give students applicable experiential learning in how to use these newer emerging devices and the related relaxation, meditative, therapeutic, and hypnotherapeutic software for the betterment of relationship-centered care in clinical practice.
Associate Professor of Management – Division of Business IU Columbus W430 Organizations and Organizational Change
The W430 class project is designed to provide students with an opportunity to apply a design thinking framework to develop innovative solutions that improve the performance of an organization. The project involves creating an XR simulation of an Amazon Go-like store, a cashierless, frictionless retail store that uses computer vision, sensors, and artificial intelligence to track what customers buy and charge them automatically. The students use the Meta Quest 2 headset to create a virtual reality (VR) experience of their concept, which allows them to explore various options for enhancing the customer experience and to explore innovations in the retail space. Through this classroom experience and project, students develop critical thinking, creative problem-solving, and VR competencies.
The XRI grant request is to purchase a Meta Quest 3, a mixed reality (MR) headset that enables students to experience and innovate using augmented reality (AR) in addition to VR. The Meta Quest 3 allows students to create 3D designs for their Amazon Go-like store within the classroom, and potentially at the project site (a retail store), prior to creating a full VR experience. By using AR, students can overlay digital information and objects onto the physical environment, which helps them visualize and test their ideas more effectively and realistically. AR can also stimulate students to generate more novel and feasible solutions, as they can interact with the digital and physical elements simultaneously and collaboratively. The Meta Quest 3 also lets students experience the advancements in MR headsets, which can further inspire them to consider the possibilities and value of XR for improving organizational performance. The Meta Quest 3, coupled with the Meta Quest 2, enables students to learn about emerging technology and develop their digital literacy in all three areas of XR: VR, AR, and MR.
Clinical Assistant Professor – IU School of Dentistry IU Fort Wayne Using VR to Enhance Interprofessional Core Collaborative Competencies
This project aims to assess the effectiveness of Artificial Intelligence (AI)-powered virtual reality simulation (VRS) in enhancing interprofessional core collaborative competencies among health professional students. Using the Unity 3D game engine, a virtual clinic environment will be created, featuring non-player character patients and avatars representing various healthcare providers.
The key purpose is to compare the impact of AI healthcare provider avatars with human-controlled avatars on the development of collaborative skills. The implementation plan involves designing immersive scenarios where students can engage in synchronous interprofessional discussions, delegate tasks, and make collective decisions within a realistic virtual setting.
Anticipated outcomes include improved collaborative competencies, enhanced engagement through immersive learning experiences, and the ability to practice and apply skills safely in a controlled environment, thereby contributing to the advancement of interprofessional education.
Assistant Professor – Herron School of Art + Design IU Indianapolis MUS-A 107: Music Technology Fundamentals MUS-N 519: Digital Sound Design MUS-E 536: Independent Study-Special Project
The project’s main purpose is to expose students to the emerging use cases of virtual, augmented, and mixed reality, referred to in this document as “XR” or “extended reality,” within the field of music technology. This unique educational experience will foster a deeper connection between students and XR by enhancing existing curricula across both undergraduate and graduate modules in music production, sound design, and live performance. Our aim is that this will result in a meaningful contribution to the expanding field of XR music applications by combining XR technology with real-time musical engagement. The project seeks to cultivate a generation of musicians and technologists who can comfortably navigate and contribute to the integration of XR with music production, performance, and education.
Apps such as Mirror, PianoVision, and Paradiddle use overlays and gamification to create highly immersive learning experiences on acoustic instruments (‡1). The participants in Doganyigit & Islim’s 2021 case study (‡2) on the use of VR for vocal training noted that the experience produced “a generally accelerated learning process” and helped them more easily visualize concepts traditionally left to individual imagination. We theorize that applying visualizations and kinesthetic, spatialized manipulations to abstract music and audio concepts will provide more concrete avenues to understanding obscure and imaginative elements, resulting in a similar increase in learning velocity.
We have targeted three classes for implementation, and anticipate these classes will provide project access for 30-35 students:
MUS-A 107: Music Technology Fundamentals (Fall 2024):
The XR app will allow students to manipulate settings on multiple audio tracks in real time along three dimensions, using the positional and rotational data of the hand controllers. Over two days of active instruction, beginning with a general overview of extended reality, students will spend the first day learning how to use the XR app to interact in real time with Ableton Live, a popular digital audio software product. On the second day, students will put what they have learned into practice by working on a group project, experimenting with the XR app on their own, and taking part in a live performance activity. This hands-on approach will provide an understanding of the technology and its live music applications. At the end of the lesson plan, students will present their extended reality musical performances, demonstrating their technical skills and artistic discoveries.
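For readers curious how such a controller-to-DAW mapping might work at a low level, here is a minimal, illustrative sketch rather than the project's actual implementation: it assumes an OSC bridge (for example, a Max for Live device) listening inside Ableton Live, and the OSC addresses, port, and scaling below are hypothetical placeholders.

# Minimal sketch: map hand-controller position to mixer parameters over OSC.
# Assumes an OSC receiver (e.g., a Max for Live device) is listening in Ableton Live;
# the addresses and port below are hypothetical placeholders.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # host/port of the assumed OSC bridge

def send_track_controls(track: int, pos_x: float, pos_y: float, pos_z: float) -> None:
    """Translate controller position (in meters) into volume, pan, and send level."""
    volume = max(0.0, min(1.0, pos_y))   # controller height -> track volume (0..1)
    pan = max(-1.0, min(1.0, pos_x))     # left/right -> pan (-1..1)
    send_a = max(0.0, min(1.0, pos_z))   # depth -> send A amount (0..1)
    client.send_message(f"/track/{track}/volume", volume)
    client.send_message(f"/track/{track}/pan", pan)
    client.send_message(f"/track/{track}/send_a", send_a)

# Example: a controller held at shoulder height, slightly left, arm extended forward.
send_track_controls(track=1, pos_x=-0.2, pos_y=0.8, pos_z=0.6)

In practice the same mapping would be called continuously from the headset's tracking loop, so moving a hand smoothly sweeps the corresponding track parameters.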
MUS-N 519: Digital Sound Design (Fall 2024):
A three-dimensional audio lesson will be delivered over the course of three classes, and a small project will be presented at the conclusion of the lesson. The students will use spatial positioning techniques in XR to assist in creating a spatialized (three-dimensional surround) sound design, in a format such as Dolby Atmos, for a provided video clip. The integration of the XR app with digital audio production software such as Reaper and Ableton Live will give students a platform-agnostic understanding of the concepts of spatial audio, strengthened by the visualizations and gestural manipulations available within an XR environment. The XR app will replace the traditional ways positional audio data is visualized, such as faders and two-dimensional boxes on a toolbar, with an immersive environment that allows sound objects to be positioned in the scene directly analogous to their real-world locations.
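To make the spatialization concept concrete, the short sketch below shows the kind of coordinate conversion such an app would likely rely on: turning a sound object's Cartesian position in the XR scene into the azimuth, elevation, and distance values that object-based formats like Dolby Atmos and DAW panners expect. The listener-at-origin axis convention is an assumption made for illustration, not a detail taken from the proposal.

# Sketch: convert a sound object's Cartesian position (listener at the origin,
# x = right, y = up, z = forward) into azimuth/elevation/distance for a spatial panner.
import math

def to_spherical(x: float, y: float, z: float):
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(x, z))  # 0 degrees = straight ahead
    elevation = math.degrees(math.asin(y / distance)) if distance else 0.0
    return azimuth, elevation, distance

# A sound object one meter up and one meter ahead sits at about 45 degrees elevation.
print(to_spherical(0.0, 1.0, 1.0))  # -> (0.0, 45.0, 1.414...)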
As part of his PhD research, the student (John Best) will develop and implement the technology required to achieve the lessons outlined above under the instruction of Dr. Timothy Hsu, Faculty Fellow, and in collaboration with Jerelle Austin, Instructor and PhD candidate. The technology will allow for the further development of instructional content and activities to aid in the sustainability and growth of the project. Furthermore, it will adapt to a variety of music- and audio-specific applications, providing a platform for further pedagogical innovation and integration in these fields. Early user testing will be conducted on prototypes of the app prior to the fall semester to validate lesson objectives and strengthen the likelihood of successful student outcomes.
Two previous projects by the PhD student demonstrate the feasibility of our goals:
MUS-N 526 (FA23) Final Project: VR-XYZ https://youtu.be/dkCiwWevocI
MUS-N 525 (SP23) Final Project: XRelocomOcean https://edam2023.deck10.media/stages/STAGE-1#event-uz1oqvhob
Over the course of this project, we anticipate that student outcomes will include the ability to navigate the extended reality technology to accomplish a set of pre-defined, measurable tasks derived from lesson content. Students will also demonstrate proficiency in translating a set of defined gestural motions into a specified musical or audio effect. Students will then synthesize these concepts in a project in which they are encouraged to think creatively about how to manage "sound objects" in three-dimensional space to generate music and audio through individual expression. Through working on group projects and reflecting on their experiences, students will gain a heightened understanding of how XR and music technology interact. At the conclusion of the project, students should have demonstrated not just technical ability but also an increased awareness and understanding of the creative possibilities obtainable through music engagement in extended reality.
Lecturer – Kelley School of Business IU Bloomington T276 – Honors Course
Currently, one of the courses I teach is T276 (an Honors course), in which we facilitate career/professional development, including teaching the foundational skills for a professional interview. We provide time in class for informal interview practice, which leads to assessing students with a virtual (Zoom) mock interview conducted by Kelley’s Undergraduate Career Services. The overwhelming feedback from students in our career courses is that they want more interview practice. Our faculty team has tried several other programs for different interview mediums, such as Quantified Communications and StandOut. These are great tools for independent interview practice that also score students on their performance; however, the experience still feels one-sided and less realistic as preparation for an in-person interview.
The proposed assignment would add an interview simulation using augmented reality. With the rise of technologies such as ChatGPT and XR, companies are using innovative ways to recruit new talent (according to LinkedIn’s Talent Blog https://www.linkedin.com/business/talent/blog/talent-strategy/innovative-ways-companies-are-using-virtual-reality-to-recruit) and to train and evaluate current employees (according to Harvard Business Review https://hbr.org/2021/01/how-companies-are-using-vr-to-develop-employees-soft-skills). Some of these new recruitment strategies include playing skills-based recruiting games, providing VR opportunities to test other skills, using VR to experience office culture and tour headquarters, giving first-hand experiences of actual job roles, and more. Our students are likely to encounter XR technologies during recruiting, so being exposed to them in a low-risk class environment will set students up for greater familiarity with the technology and success interacting with it.
Lecturer – College of Arts + Sciences IU Bloomington History Courses
I am planning to incorporate a VR technology in my history courses to achieve two main goals:
1. To enhance the course content through visualization, helping students understand the course materials and adding a spatial dimension to the timespan of history. Space is traditionally considered a geographical category, but it is crucial for understanding historical developments, the role of the environment in the processes of colonization and urbanization, economic and cultural connections, political conflicts, and state formation. I will start by incorporating panoramic photos and videos into the course content, mostly using already created and published VR tours to introduce some historical sites and the possibility of exploring history with VR technology in class. I believe I need to curate my own VR tour based on the available photos and videos to become comfortable with using the technology in class before assigning it to students. Step one, then, will be to create a database of available visual resources to be used for VR tours of historic sites. Step two will be to identify course units where the use of VR is justified and will enrich students’ learning. Step three will be to create a VR tour to be used in the classroom for visualization of the course materials and for modeling an assignment for students. Thus, I am planning to use available VR tours (for example, a virtual museum visit or an exploration of a historical site or natural environment) and to curate my own in order to master the technology and prototype the VR assignment for students.
2. To enhance learning activities and develop student competencies that improve the quality of learning in the classroom and can be useful for their future careers. Among the competencies that can be developed through the VR assignment are digital literacy, creativity, collaboration, and critical thinking. I will create one VR assignment in which students work in small groups to gather textual and visual information, evaluate sources, develop a project design, and convert it into a VR tour or field trip. My course D320 Modern Ukraine is based on a combination of temporal and spatial approaches, as students explore different places to examine historical changes and cultural differences. I am planning to include elements of the VR project in the modules about Kyiv, Lviv, Kharkiv, and Odesa, where students will be able to zoom into streets and houses, walk virtually along one of the streets or squares, or visit one of the historical houses. I aim to make VR one of the means to “travel” to Ukraine and understand its complicated history and cultural diversity through the visualization of space and the natural and built environment. In this course we also discuss Russia’s war against Ukraine, following the news and reports from the front lines and different territories, so VR may be very useful for “seeing” places from the war stories. This VR assignment is also a form of creative storytelling, because the assignment prompts will include a requirement to write a short narrative reflecting on the sources and presenting a story of the place. Students will develop their creativity in writing a story, as well as critical thinking in assessing information, analyzing texts and images, and, finally, creating a virtual tour. I will keep the assignment small in scale, limiting it to one place (a house, a street, a square) to make it feasible to accomplish in two or three weeks. Students will also work collaboratively as teams, helping each other navigate a new technology and its tools and collect the information they need for the project. I will allocate one class for technical questions and practice to smooth the learning curve, and I will provide classroom space and time for teamwork. I will be teaching D320 Modern Ukraine in AC C102, one of the active-learning classrooms (I am a Mosaic Senior Fellow and have been teaching in the ALC for the last two years), so I will be able to use the classroom space for effective interaction and collaboration.
Associate Professor – Eskenazi School of Art, Architecture + Design IU Bloomington Digital Architectural Drawing
The architecture and interior design industries rely significantly on visual communication, and virtual reality (VR) helps designers better understand their designs by immersing them in a real-time, 1:1-scale rendered environment. For this reason, there are many advantages to VR over traditional 2-dimensional documentation. While VR technologies have developed rapidly in the professional architecture and interior design industries, academic institutions may not update their courses and technologies as frequently, leading to a gap in teaching the most current practices and trends in VR design. Currently, VR has not been implemented in any courses in the interior design program at Indiana University Bloomington. While it is difficult to completely revamp the current interior design curriculum, adding VR assignments to interior design and architectural drawing courses will be a first step toward preparing our students for a future that will be dominated by this technology.
The following two assignments will be incorporated into my Digital Architectural Drawing and Advanced Digital Architectural Drawing classes in Fall 2024. Digital Architectural Drawing is taught every semester, with four sections in the fall and two sections in the spring. Advanced Digital Architectural Drawing is taught every other semester.
1. Digital Architectural Drawing
Course Overview
This course will strive to further develop students’ graphic communication, 3D modeling, and rendering skills as designers. Through hands-on learning, students will develop the computer-aided design (CAD) graphic communication skills required for professional practice in the interior design field, primarily using AutoCAD and Revit.
Assignment: Miller House Visualization
In this assignment, you will be required to further develop the virtual model of the Miller House that you built in Revit. You will import the Revit model into TwinMotion and then add materials, architectural entourage, landscapes, weather, and sun settings in a VR environment. Each student will be expected to present a walkthrough of their individual VR model to the class.
2. Advanced Digital Architectural Drawing
Course Overview
This course aims to further develop students’ advanced digital design and modeling/visualization skills by considering the digital-physical workflow in the context of contemporary interior design. The main software will be Rhino, Grasshopper, and Unity. In addition to designing in Rhino, Grasshopper, and Unity, students will have hands-on experience with a range of digital fabrication tools, such as 3D printers, laser cutters, and digital cutters, as well as virtual reality headsets such as the Meta Quest Pro. Through a combination of exercises and projects, students will design a set of interior objects, from small-scale lighting and furniture to large-scale interior partitions and surfaces. Finally, students will be asked to create a metaverse gallery to display and share their work.
Assignment: Metaverse Gallery
In this project, you will be asked to create a Metaverse gallery to brand yourself as a designer and showcase all the models you have created in the class this semester. You will develop the virtual reality gallery in Unity and then publish the content to Spatial.IO to share with your peers.
Clinical Assistant Professor – IU School of Public Health IU Bloomington SPH-Y 397 Recreational Therapy Internship & Professional Preparation
Recreational therapists use a variety of intentionally designed recreational and leisure activities to help increase the physical, social, emotional, behavioral, and intellectual functioning of the clients they serve. The largest employer of recreational therapists is the VA medical system. Within that system, one of the fastest-growing and most effective forms of recreational therapy treatment is delivered through VR programming. This is not a treatment modality that we are able to instruct our students in, due to lack of funding.
The purpose of this proposal is to develop a VR project for our undergraduate recreational therapy students so they can learn the basics of using VR as a healthcare intervention for the clients they will work with after graduation.
Working with recreational therapists at the Indianapolis VA Medical Center, we will design a three-day project that introduces students to VR technology, teaches them how it can be used to achieve therapeutic benefits for a variety of clients with a range of diagnoses, and then offers practical, hands-on experience conducting VR treatment within the scope of practice of a recreational therapist.
2023 XRI Faculty Fellow Grant Recipients & Project Descriptions
MATH 13200 Mathematics for Elementary Teachers
I propose to develop an exploration of 3D shapes using XR technology. The course targeted is MATH 13200, the third of a 3-course sequence of Mathematics for Elementary Teachers. The focus of this course is geometry. There are 2 sections of this course taught each semester, and the content is included in the second half of MATH 13600, of which there is 1 section taught each semester.
An objective in elementary school mathematics is that students are able to visualize a 3D shape from a 2D representation on paper. The future elementary teachers in this course struggle with this visualization. There are computer models for this, but I would like the teachers to be able to enter and explore shapes to recognize their properties. One specific example is pyramids and cones. These objects have two "heights": an altitude and a slant height. My students struggle to see them and to relate them using a right triangle cross-section. I would like them to experience the difference and create the right triangle while inside the shape. I have a 3D model for this purpose, but students still cannot "see" the heights and triangle in the model.
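For reference, the right-triangle cross-section mentioned above encodes a simple relationship between the two "heights." The statement below assumes a right circular cone (a right pyramid would use the apothem of its base in place of the radius); the worked numbers are only an illustration:

\ell^{2} = h^{2} + r^{2}, \qquad \text{e.g., } h = 4,\; r = 3 \;\Rightarrow\; \ell = \sqrt{16 + 9} = 5,

where h is the altitude, r is the base radius, and \ell is the slant height. Seeing this triangle from inside the shape is exactly the experience the VR activity is meant to provide.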
As the coordinator of these courses, I would create at least two lesson plans using this technology and ask instructors to implement them in their sections. I would plan to use headsets available at the University Library. I anticipate that students will understand the construction and dimensions of solids and will be able to visualize 3D shapes from 2D representations more fluently. In addition, an (un-assessed but important) objective of these courses is that students experience a variety of pedagogical methods that they can apply in their classrooms. Experiencing XR technology will, I believe, create a desire in many of them to integrate similar technology into their own pedagogy in the future.
INFO B626 Human Factors Engineering for Health Informatics
I am teaching INFO-B626, Human Factors Engineering for Health Informatics, which focuses on how human factors affect the design, implementation, and use of health informatics systems. In the course, I would like to offer an activity/assignment focusing on human factors when using VR solutions in rehabilitation therapy settings. Research and clinical communities actively investigate and adopt VR solutions that provide rich stimuli to create immersive, engaging virtual environments during intensive, longitudinal rehabilitation therapies for individuals with chronic conditions (e.g., individuals with acquired brain injury, developmental disorders, etc.). Moreover, the performance of individuals with chronic conditions across diverse VR content can be used to estimate those individuals' clinically validated assessment scores.
An activity/assignment will be developed to give students hands-on experience collecting, analyzing, and interpreting VR-based data. More specifically, the activity/assignment will be composed of two parts. First, during an in-class activity, students will be given opportunities to experience diverse VR content, collect their own performance data (e.g., the number of successful executions of desired tasks), and assess their corresponding assessment scores using clinically validated batteries. Second, students will use their own data (and that of their classmates) collected during class to visualize the data (e.g., plotting a bar chart), develop machine learning-based estimation models that translate the VR-based performance data into clinically validated assessment scores, and interpret the results. While students' own data will not perfectly resemble the actual data of individuals with chronic conditions, the newly developed activity/assignment will give a clearer idea of how to collect, analyze, and interpret VR-based data.
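As a rough sketch of the second part of the activity, the snippet below fits an ordinary least-squares model that maps VR performance measures to a clinical assessment score. The feature set, the synthetic numbers, and the choice of linear regression are illustrative assumptions, not details from the course plan.

# Sketch: estimate a clinically validated assessment score from VR performance data.
# The numbers below are synthetic stand-ins for the data students would collect in class.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Each row: [successful task executions, mean completion time (s)] for one VR session.
X = np.array([[12, 41.0], [18, 33.5], [7, 55.2], [22, 28.1], [15, 37.9], [9, 50.4]])
y = np.array([34, 46, 22, 55, 40, 27])  # corresponding clinical battery scores

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)
model = LinearRegression().fit(X_train, y_train)

print("Predicted scores:", model.predict(X_test).round(1))
print("Mean absolute error:", round(mean_absolute_error(y_test, model.predict(X_test)), 1))

Students could then compare the model's error against the spread of the clinical scores themselves to discuss how trustworthy such VR-derived estimates are.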
Various Undergrad Courses
Extended reality (XR) presents a new opportunity for engaging students in learning additive manufacturing (AM), or 3D printing (3DP), knowledge. However, it is not clear how to design XR experiences that produce results comparable to real-world laboratory experiences. The objective of this Extended Reality Initiative (XRI) Faculty Fellows grant project is to develop best practices for integrating XR into AM education and to support adoption of those practices in classrooms. Building on the PI's previous virtual reality work, this XRI project includes: (1) development of a library of XR activities for AM processes and materials testing; (2) comparison of the effectiveness of students' XR experiences versus conventional hands-on laboratory experiences; and (3) a mini-workshop to engage the materials and manufacturing, computer science, and informatics communities in determining and disseminating best practices for XR use in additive manufacturing education.
PSY B203 Ethics and Diversity in Psychology
I teach an online gen ed/required undergraduate psychology course titled Ethics and Diversity in Psychology (Psy-B203), with an enrollment of ~180 students per term. One of the learning outcomes for that course is to apply, analyze, and evaluate psychological research and practice to discern bias and challenge claims that arise from myths, stereotypes, or untested assumptions. One of the course modules addressing this objective concerns prejudice reduction principles, which include the concepts of empathy and perspective taking. Currently, I have students watch a 10-minute video of a sexual harassment victim talking about her harassment experience and how it affected her, as a way to help students gain empathy and take perspective on the issue of sexual harassment. They then engage in an online small-group discussion about the video with guided discussion questions designed to elicit empathy and perspective taking and to share those feelings and perceptions with classmates.
I am applying for an XRI grant to begin developing this activity on a VR or other XR platform because of the potential for extended reality technology to substantially enhance empathy, which in turn may lead to pro-social behaviors, such as willingness to be an active, effective bystander should students witness similar situations, and may reduce their intentions to engage in sexual harassment or related conduct. My long-term goal would be to create a fully animated, interactive VR environment where students could either experience (mild) harassment or witness it, make various decisions and see how those might play out, and see how various courses of action might affect their empathy and perspective taking. For now, I would like to take my two existing videos and turn them into 3-D videos to get an initial assessment of how they affect students' empathy, perspective taking, and other outcomes, such as attitudes toward victims of sexual harassment and intentions to engage in sexual harassment. Should this initial step into XR technology show encouraging signs for these outcomes, I would seek funding to develop a more immersive XR experience for my students.
R110 Fundamentals of Speech Communication
The proposal that we have developed for XR technology includes the following elements: 1.) We want to include XR technology use within our R110 Fundamentals of Speech Communication course. 2.) This would be implemented as an assignment for all general course sections in FA2024, with a pilot period in SP2023. 3.) We are also developing a plan to use XR technology in our online course sections for FA2024. Our projected pilot period would be during SP23 and SU23. Since our course is standardized, we could create XR technology exposure for as many as 3,000 students per academic year when it is fully implemented into the course. Additionally, we intend to provide/install XR technologies in the IUPUI Speaker's Lab, which supports our R110 course and other University students, providing another campus source of exposure to this technology. The Speaker's Lab currently exceeds 1,000 unique visits per semester; when a visit is required by an R110 assignment, this number would increase dramatically. Having just completed the Digital Gardener Fellowship, we believe this would complement the assignments we are developing to enhance our university outcomes and objectives for our course, while bringing more current technology learning to our course and campus.
Virtual World Design & Development Course
In the fall, I teach a virtual world design and development class that touches on virtual reality exercises near the end of the semester. I would like to build this into a more robust offering with extra focus on augmented reality, but currently it is hard to do so because my device capabilities are limited to the four VR devices we have in the emerging technology lab. To teach more XR in that classroom, I would need several more devices capable of supporting XR development. For this particular project, I would implement an XR-focused section in my fall virtual worlds games course to train students to make multiplayer experiences in XR games, with a final project focused on building a proof of concept in this space (2 players in a virtual XR space, playing a game such as table tennis). Students would learn:
1. How to make XR experiences (in both virtual reality and augmented reality)
2. How to design for common XR interaction modes (gaze, controller tracking)
3. How to represent players in a multi-user XR space
4. How to create shared XR interactions so that two or more users can manipulate objects in a shared XR space (see the sketch after this description)
5. An understanding of the possibilities of the design space and where multi-user XR is particularly effective
In addition to the above, they would leave with one complete project that demonstrates capability in the above learning outcomes and a foundation to build upon for future classes. To reach these outcomes, students would be assigned an XR project that requires making a multiplayer experience and then provided a set of tutorials that trains them in the use of the technology, building the fundamentals through YouTube videos and hands-on laboratory assignments. During the project, weekly meetings and feedback from the instructor will keep the projects on track and developing.
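For a concrete, engine-agnostic picture of what outcomes 3 and 4 involve, the sketch below shows the core of a shared XR session: a small relay server that forwards each player's pose and interaction updates to the other connected players. The JSON message format, port, and transport are hypothetical, and an actual course project would more likely use an XR engine's built-in networking stack.

# Sketch: a tiny relay server for a two-player shared XR space.
# Each client sends newline-delimited JSON such as
#   {"player": "A", "head": [x, y, z], "grabbed": "paddle_1"}
# and the server forwards it to every other connected client.
import asyncio

clients = set()

async def handle(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    clients.add(writer)
    try:
        while line := await reader.readline():
            for other in list(clients):
                if other is not writer:
                    other.write(line)          # relay the pose/interaction update
                    await other.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main() -> None:
    server = await asyncio.start_server(handle, "0.0.0.0", 8765)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())

The key design point for students is that "shared" XR is just shared state: whoever owns an object broadcasts its new pose, and every peer applies the update locally.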
2022 XRI Faculty Fellow Grant Recipients & Project Descriptions
W430 Organizations and Organizational Design (Change)
The proposed project is a significant enhancement to an existing class project that currently focuses on developing 2D prototypes.
In this organizational change project, students, working in project teams, will apply their design thinking learning, using extended reality to visualize “What If” ideation and “What Wows” prototypes. Using extended reality, students will learn ways to enhance prototypes by creating spatial interactions and potentially multi-sensory experiences.
This extended reality experience will enable students to effectively test and critique their prototypes for organizational process change. The project concludes with the teams sharing their extended reality creations, enabling other students to provide feedback to further improve their change ideas.
D790, D791, and D890 Pediatric Dentistry Clinical Rotations
Artificial Intelligence (AI), Augmented Reality (AR), and Virtual Reality (VR) are some of the alternatives available today for dental education. These technologies allow the recreation of virtual scenarios, some of them in real time, facilitating real clinical procedures. The benefits of this model for student training will be enormous. These new training options will allow students to repeat the training for a specific clinical approach until they reach the competence needed to perform it on a patient.
This project aims to enhance our ability as faculty to prepare students to perform intraoral local anesthesia techniques in children. Building a virtual reality or augmented reality tool will allow us to practice the landmarks and methods useful for anesthetic placement in pediatric dentistry.
A150 Survey of the Culture of Black Americans
In a classroom of diverse students, addressing the manifestations of topics that Black/African Americans experience, such as racism, micro/macroaggressions, and colorblind racism, to name a few, can at times present challenges. Currently, instructors use various teaching techniques, such as scaffolding, reflections/personal narratives, group learning, and strategic course designs, to counter these challenges.
This course proposes using the assignment The Black Experience to address these challenges by allowing students to virtually experience the manifestations of such topics. The assignment incorporates a VR experience that takes the player through a "long history of restriction of movement for Black Americans and the creation of safe spaces in our communities." The experience enhances students' learning as they gain a more profound and relatable understanding of the manifestations of the topics addressed in the course.
X405 Topical Explorations in Business
The purpose of the project is to develop a fully immersive virtual reality (VR) program to improve students’ ability to act in a supportive manner that recognizes the feelings of another group. Students will understand Philippine history, values, communication styles, culture, beliefs and practices, and economic conditions by immersing themselves in a virtual reality environment where Filipinos talk about their culture, communication style, environment, economy, values, beliefs, and practices.
I want the final product to be as realistic as possible, so students are engaged in the virtual reality environment and motivated to learn. The use of visual and auditory cues in a fully immersive environment can provide deeper engagement, learning, and retention of cross-cultural knowledge and behaviors than a bricks-and-mortar classroom.
Various Chemistry Courses
I have been using 3D printing to create interactive models for use in chemistry classes for several years now. 3D printing is excellent at making simple models of concepts but fails when the concepts get too complex. For example, I have been able to make models of atoms and crystal structures but have failed to model nuclear reactions or solubility. These latter concepts are simply too difficult to model with pieces of plastic.
My plan is to study the effectiveness of using augmented and virtual reality to teach these complex topics. The topics I plan to start with are the solubility of ionic compounds, radioactive decay, and equilibrium. The goal is to make virtual overlays for 3D-printed models, as well as entirely virtual models, to express these concepts.
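As one hedged illustration of what an entirely virtual model could compute behind the scenes (the parameters below are placeholders, not a description of the planned overlays), radioactive decay can be driven by giving each virtual atom a per-step decay probability derived from the half-life:

# Sketch: per-atom stochastic decay, the kind of state an AR/VR overlay could animate.
# half_life, dt, and the atom count are illustrative placeholders.
import math
import random

def simulate_decay(n_atoms: int = 1000, half_life: float = 10.0, dt: float = 1.0, steps: int = 30):
    lam = math.log(2) / half_life          # decay constant
    p_decay = 1 - math.exp(-lam * dt)      # probability an atom decays in one step
    remaining = n_atoms
    counts = [remaining]
    for _ in range(steps):
        remaining -= sum(random.random() < p_decay for _ in range(remaining))
        counts.append(remaining)
    return counts

# After one half-life (10 steps of 1 s), roughly half of the atoms should remain.
print(simulate_decay()[10])

A virtual model driven this way lets students watch individual atoms "decide" to decay while the aggregate count traces the familiar exponential curve, something a fixed plastic model cannot show.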