Using Augmented Reality for Contextual Mobile Learning

At DevLearn 13, I had the honor of speaking on an exciting topic: augmented reality (AR) and its uses in mobile learning. After the presentation I heard from a number of attendees who were equally excited about the possibilities and examples I shared, and many were also interested in discussing the technical and pedagogical challenges that we often dismiss because of the novelty associated with AR technology.

While I had collected numerous resources on AR in preparation for DevLearn, I wasn’t able to share all of them during my presentation. In addition, several attendees and peers have asked for links to the AR examples and resources I covered, so I’m making them available through this article.

Mobile augmented reality

What is “augmented reality”? AR comprises a live view of a real-world environment (“reality”) with computer-generated input (including sound, graphics, text, video, and GPS information) supplementing (“augmenting”) the visual elements in the view. In other words, AR provides us with an enhanced view of the real world. In spite of the impression of novelty that eLearning practitioners and their managers may have, AR has “been around” now for many years; it is not a new phenomenon.
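
To make that definition a bit more concrete for developers, here is a minimal sketch of the core idea (a live camera view with computer-generated content drawn on top), assuming Python with the opencv-python package and a webcam; the label text and layout are purely illustrative and not taken from any product discussed in this article.

```python
# Minimal sketch of the core AR idea: take a live camera frame ("reality")
# and draw computer-generated text and graphics on top of it ("augmentation").
# Assumes Python 3 with the opencv-python package installed and a working webcam.
import cv2

def annotate(frame, label, position=(30, 60)):
    """Overlay an illustrative backdrop and text label on a video frame."""
    cv2.rectangle(frame, (20, 20), (470, 80), (0, 0, 0), thickness=-1)  # backdrop
    cv2.putText(frame, label, position, cv2.FONT_HERSHEY_SIMPLEX,
                0.7, (255, 255, 255), 2)                                # generated text
    return frame

capture = cv2.VideoCapture(0)  # live view of the real world
try:
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        # In a real AR app the label would come from context: GPS position,
        # image recognition, a marker, or a content platform.
        frame = annotate(frame, "Example: nearby point of interest")
        cv2.imshow("Augmented view", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
finally:
    capture.release()
    cv2.destroyAllWindows()
```

A production mobile AR app would of course drive the overlay from sensors, image recognition, or a content platform rather than a hard-coded string; the point here is simply the compositing of generated content onto a live view.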

Why might AR matter to the field of eLearning? Allow me to cite a key point made in a paper from the Open University: “eLearning designers, developers, and educators often lack clarity regarding the impact that a learner’s situation has on their learning.” (See References at the end of this article.)

Mobile augmented reality provides learning designers and educators with a new opportunity to start thinking more deeply about the mobile learner’s context and situation. In fact, the key thing to remember about mobile augmented reality is that it is about augmenting experiences in real-world environments, wherever the learner happens to be. AR technologies can take any situation, location, environment, or experience to a whole new level of meaning and understanding. AR is uniquely changing the way people learn with mobile devices.

A note of caution is in order before I go further. During my session at DevLearn, Nancy Proctor, head of mobile strategy and initiatives at the Smithsonian Institution, pointed out that there are many examples of ineffective augmented reality. These often involve applications whose creators could easily have built them using more primitive forms of engagement, such as static graphics. I’m in agreement with Nancy about the misuse of AR, and we can expect to see the consumer world exploiting the novelty and commercialism of AR in the coming years. (See the story by Stephen Vagus in the References at the end of this article.)

Having said that, I believe that some of the best opportunities to leverage AR technology for learning are during situated activities or contextual experiences—in other words, where a person is and what that person is doing: mobile. How big will this mobile AR opportunity be? According to Semico’s report on augmented reality (see References), over 864 million high-end cell phones will be AR-enabled by 2014, with revenues related to AR technology approaching $600 billion by 2016. While still in its infancy, mobile AR is starting to drive innovation within the education, gaming, medical, mobile, automotive, and manufacturing markets.

Classifying AR

There are many forms of AR (see Sidebar 1, and also the Common Craft video listed in References). My interest for this article lies specifically with mobile augmented reality as one of the most powerful forms of contextual mobile learning. In addition to the many examples of AR that utilize smartphones and tablets, I’m also interested in mobile wearables such as Google Glass that will provide us with even more options for contextual learning in the mobile AR landscape.

Sidebar 1: Augmented virtuality: similar to, but not the same as, mobile AR.

I’d like to point out an important distinction between mobile AR and a similar application. It is possible to augment virtual and real-world environments and to merge them together. This falls under the category of “augmented virtuality” and is really outside the scope of contextual mobile learning. Much interesting work has been done in the areas of virtual reality, mixed reality, virtual worlds, and education; however, as I’ve probably made clear, I’m most intrigued with how learning takes place in an augmented real world.

While I was collecting a number of mobile AR examples during my research, the paper from the Open University (cited earlier) provided some much-needed clarity and guidance when examining augmented reality’s implications and unique affordances for mobile learning. The authors provided a working definition of AR that includes the fusion of any digital information within real-world settings, i.e., being able to augment one’s immediate surroundings with electronic data or information, in a variety of media formats that include not only visual and graphic media but also text, audio, video, and haptic overlays.

The authors also addressed AR in a broader “reality” context and provided several important distinctions between virtual reality and mixed reality. However, I found the following four aspects that are unique to mobile AR worth mentioning. Combining mobile with AR fosters use of the following types of information in support of learning (a short sketch after the list illustrates the idea):

  • The mobility of the user
  • The user’s geographical position
  • The physical place where learning can occur
  • Formal learning connections to informal learning
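
As a rough, hypothetical illustration of how those four types of information might drive a contextual learning experience, the sketch below picks overlay content from a learner’s position and mobility; the points of interest, coordinates, distance threshold, and formal/informal tags are all made up for the example and are not drawn from the Open University paper.

```python
# Hypothetical sketch: choose learning content to overlay based on the learner's
# geographical position, whether they are moving, and whether a nearby place is
# tied to a formal activity or to informal discovery. All data below is made up.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class PointOfInterest:
    name: str
    lat: float
    lon: float
    formal: bool  # True = part of a formal learning activity, False = informal

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def nearby_content(lat, lon, moving, pois, radius_m=150):
    """Return overlay labels for points of interest within radius of the learner.
    While the learner is moving, show only short informal prompts; once they
    stop, surface the formal activity as well."""
    hits = [p for p in pois if distance_m(lat, lon, p.lat, p.lon) <= radius_m]
    if moving:
        return [f"Nearby: {p.name}" for p in hits if not p.formal]
    return [f"{'Activity' if p.formal else 'Nearby'}: {p.name}" for p in hits]

# Illustrative, made-up locations.
pois = [
    PointOfInterest("Historic protest site", 43.0750, -89.4040, formal=True),
    PointOfInterest("Public mural", 43.0746, -89.4052, formal=False),
]
print(nearby_content(43.0748, -89.4045, moving=False, pois=pois))
```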

The authors also investigated a variety of device types that I wouldn’t think of as being truly “mobile.” However, the classification scheme they presented is well suited for analyzing today’s different forms of mobile AR. The authors of the paper classified AR according to these key aspects (Figure 1):

  1. Device/technology
  2. Mode of interaction
  3. Type of media used (sensory feedback method)
  4. Personal or shared experience
  5. Character of the experience
  6. Learning activities or outcomes

Figure 1: Classification table for different types of mobile augmented reality

Most of these classification categories are intuitive and don’t require much explanation. However, “modes of interaction” warrants some discussion. These modes relate to either providing passive information overlays to the learner, depending on their physical location, movements, and gestures, or engaging the learner in an exploratory mode where they are encouraged to actively discover or create media nearby in order to solve a problem or meet characters from a story.

The authors pointed out that more modes of interaction could evolve in the future, although some, such as the “constructionist” mode, may be more relevant to specific knowledge domains (e.g., architecture or structural engineering), while the “active/exploratory” mode is more relevant to AR games.
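
To show one way the classification scheme and these modes of interaction could be applied in practice, here is a minimal data-model sketch; the field names, enum values, and the example classification of an astronomy guide are my own interpretation, not something defined in the paper.

```python
# Hypothetical data model for the six classification aspects listed above,
# plus the modes of interaction, used to tag and compare mobile AR examples.
from dataclasses import dataclass
from enum import Enum
from typing import List

class InteractionMode(Enum):
    PASSIVE_OVERLAY = "passive information overlay"
    ACTIVE_EXPLORATORY = "active/exploratory"
    CONSTRUCTIONIST = "constructionist"

@dataclass
class MobileARClassification:
    device: str                 # 1. device/technology
    mode: InteractionMode       # 2. mode of interaction
    media: List[str]            # 3. type of media used (sensory feedback method)
    shared: bool                # 4. personal (False) or shared (True) experience
    character: str              # 5. character of the experience
    outcomes: List[str]         # 6. learning activities or outcomes

# One plausible (not authoritative) classification of an AR astronomy guide.
star_guide = MobileARClassification(
    device="smartphone with camera, GPS, and compass",
    mode=InteractionMode.PASSIVE_OVERLAY,
    media=["graphics", "text"],
    shared=False,
    character="in-place exploration of the night sky",
    outcomes=["identify constellations", "relate location and time to what is visible"],
)
print(f"{star_guide.device}: {star_guide.mode.value}")
```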

Many of the technical and pedagogical challenges identified in the paper are common concerns often associated with designing and developing for mobile. However, some of the key AR concerns identified by the authors include:

  • The novelty of AR technology may detract from the learning experience
  • Using the AR technology may require tech support (if not easy to use and install)
  • The overlay of labels and features could harm observation skills through excessive reinforcement

This paper really helped me begin thinking about the different characteristics of mobile augmented reality, and about how we as designers and developers might leverage augmented reality technologies for learning through this classification lens.

Mobile learning AR app examples

The classification scheme presented in the paper from the Open University was additionally useful in identifying the augmented reality examples I wanted to show at DevLearn. While there are many other examples of mobile AR out there, the following examples were primarily selected because they provide excellent models of using AR for contextual mobile learning experiences or performance support. I hope this list of examples provides some ideas for those who want to get started using AR for mobile learning. While looking at these examples, consider reflecting upon the categories in the classification scheme. What other examples of augmented reality for contextual mobile learning have you seen?

  1. Dow Day is one of the most widely known AR examples for mobile learning and was developed using the open-source Augmented Reality Interactive Storytelling (ARIS) platform. (ARIS video)
  2. Word Lens builds on the flash-card concept, providing real-time word translation using AR technology. (Word Lens video)
  3. Fun Maps for Kids augments a world map to provide more context when learning about the continents, geographical landmarks, and animals. (Fun Maps video)
  4. Star Walk is an augmented reality astronomy guide that provides a real-time view of stars, constellations, and satellites when you point the camera at the sky. (Star Walk video)
  5. Leafsnap is a free electronic field guide for trees that provides leaf-shape recognition and contains thousands of photos of tree species’ flowers, fruit, bark, and more. (Leafsnap video)
  6. Anatomy 4D allows learners to explore human anatomy and was built using Qualcomm’s Vuforia platform. (Anatomy 4D video)
  7. DASH Smart Instrument Technologies is a portable surgical navigation system designed to assist orthopedic surgeons in performing knee and hip joint replacement procedures. (DASH video)
  8. HP Support’s performance support app helps you change the ink cartridges in select HP printers. (HP Support video)
  9. Audi’s augmented owner’s manual app shows the range of functions the car offers without the owner having to read the printed manual. This app was developed using the Metaio platform. (Audi owner’s manual video)
  10. Volkswagen’s Mobile Augmented Reality Technical Assistance (MARTA) app provides service support. This app was developed using the Metaio platform. (MARTA video)
  11. Aurasma’s augmented reality app can be used to create auras for augmenting any object by triggering and loading a 3-D object, image, or pre-recorded video, and is ideal for creating learning opportunities for training or performance support. Check out the two different examples using Aurasma below.

    Combat Medic is a card game that is augmented by Aurasma and was created by the University of Central Florida’s METIL lab for the U.S. Army Research Lab. (Combat Medic video)

    The second example is a video captured by a mechanic while using Aurasma, showing how this AR technology combined with pre-recorded videos could be used for training novice mechanics as well as providing performance support to more experienced ones. (Aurasma demo: mechanic)

Mobile AR content creation and development platforms

In some of the examples above I shared links to the AR creation apps or development platforms that their authors used. I also concluded my presentation at DevLearn with a list of the AR creation apps, tools, and development platforms that I’ve been exploring.

Mobile wearables and the future of AR

While many of the examples in the paper from the Open University addressed more than just mobile AR, it still provided a good foundation for thinking about the attributes of new mobile-device types, such as wearables like Google Glass.

Believe it or not, there are many AR concepts already under development for Google Glass. In fact, Junaio, a well-known AR platform, recently announced support for Google Glass at the InsideAR conference. In terms of contextual mobile learning with Glass, there is the Field Trip app, which is also available on iOS and Android mobile devices. One of the most recent examples of contextual mobile learning I’ve seen on Glass utilizes a new feature called Vignettes. It’s a Word of the Day app that actually evolved out of combining the Glass app with social media.

The potential uses of Google Glass for performance support have been a hot topic this year, but it’s very compelling to see actual proofs of concept being developed for real-world situations. But what do wearables hold for the future of augmented reality and contextual mobile learning? While looking for wearable AR examples like Google Glass, I found this conceptual and futuristic video on Space Glasses. Future solutions like this seem to suggest a direction toward a contextually rich mixed-reality environment. While there are mixed-reality AR examples available today, such as zSpace, they don’t allow for mobility as wearables do. With the advent and adoption of wearable mobile devices, the future of AR could evolve into something I previously thought only Hollywood movies might portray.

References

Common Craft. “Augmented Reality—Explained by Common Craft” (free version). 10 June 2010. https://www.youtube.com/watch?v=D-A1l4Jn6EY

FitzGerald, Elizabeth, Anne Adams, Rebecca Ferguson, Mark Gaved, Yishay Mor, and Rhodri Thomas. “Augmented Reality and Mobile Learning: The State of the Art.” 11th World Conference on Mobile and Contextual Learning (mLearn 2012). 2012. https://oro.open.ac.uk/34281/1/ARpaper_FINAL.pdf

Semico Research Corporation. Augmented Reality: Envision a More Intelligent World. October 2012. https://www.semico.com/content/augmented-reality-envision-more-intelligent-world

Vagus, Stephen. “Mobile Augmented Reality May Have a Bright Future.” Mobile Commerce News. 12 November 2013. https://www.qrcodepress.com/mobile-augmented-reality-may-bright-future/8524103/
