MIT Sloan Executive Education innovation@work Blog

Shifting realities: Augmented, virtualized, and mixed realities in the classroom

Contributed by Paul McDonagh-Smith, Digital Capability Leader, MIT Sloan Office of Executive Education.


Let's be honest: reality isn't what it used to be. Right now, all around us, carbon-world experiences are being augmented, virtualized, and mixed by a series of emerging technologies and their offspring products. These new flavors of reality offer tantalizing opportunities to imagine and invent improved online learning interactions to complement what we're currently doing in classrooms and on campus.

In one corner, we've got augmented reality (AR), overlaying content such as computer-generated imagery, gesture-driven features, and 360-degree video onto the physical world. Aiming to add meaning and context to our carbon-based reality, this content floats over it like a butterfly, if you will.

In the other corner, virtual reality (VR) is loitering with some serious heavyweight intent, looking to replicate or simulate physical and imagined environments via hardware such as Google Cardboard and HTC Vive. Whether pre-rendered or rendered in real time, VR's computer-generated immersive experiences are getting ready to rumble.

Positioned somewhere in the middle of the ring, we've got the hybrid of mixed reality (MR), which in effect merges real and virtual worlds to produce new environments and visualizations where physical and digital objects can co-exist and interact in real time. MR hardware includes products such as Microsoft's HoloLens, which might be seen as ushering in a new era of holographic computing.

Here at the MIT Sloan Office of Executive Education, we're experimenting to learn more about how existing and emerging augmented, virtual, and mixed reality technologies and products can practically and effectively support remote and co-located learning interactions. We're asking learners (and ourselves) how we can use and perhaps combine these technologies to enhance classroom and online interactions by applying the right reality at the right time.

All of our experiments in this space aim to solve real teaching and learning problems. We're using telepresence robots to enable individuals with mobility challenges to attend classes they otherwise couldn't attend without significant difficulty. We've also built immersive virtual classrooms in an effort to bring the energy, engagement, and idea flow of MIT's physical places into digital spaces that can be accessed from wherever learners live and work.

Our next rounds of AR/VR/MR experiments aren't aimed at identifying which one works best for us, but rather which elements of their respective DNA will improve the only reality that matters to us: our learners' reality.


Paul McDonagh-Smith is Digital Capability Leader at MIT Sloan's Office of Executive Education. With a focus on driving digital transformation and harnessing emerging technologies, Paul works with the team to create learning programs to fit how we live and work in today's digital age. Paul's recent work includes the successful experimentation with and introduction of AR/VR, telepresence robotics, and AI technologies into MIT Sloan Executive Education programs and activities. In a series of upcoming blogs, Paul will share practical insights and ideas regarding the role and use of technology in education. To learn more about Paul's work, please contact him at mcdonagh@mit.edu.
