Metafocus: Personalized Lifelong Learning

This winter, Lauren Della Bella and Dick Thomas will publish their book 9 Billion Schools: Why the World Needs Personalized Lifelong Learning for All. (Full disclosure: I wrote a chapter in this book on VR and the future of education, and I also co-wrote the chapter on AI and the future of education. The views and forecasts in this column represent the opinions of the author, not of 9 Billion Schools nor of Learning Solutions Magazine.)

As described in detail in the book, the 9 Billion Schools (9BS) organization maintains that:

  • Education shouldn’t end when we exit formal school systems (i.e., high schools or universities), but instead should be lifelong, life-wide (including every topic we wish to learn), and life-deep (going as deep into these topics as we wish)—abbreviated as L3
  • Education should be highly personalized
  • Educators and instructional designers should use Mind Brain Education (MBE) science to massively and continually improve educational models and methods

Augmented intelligence, which is artificial intelligence that communicates and interfaces with us through virtual or augmented reality, is the precise technological tool we need to make the 9BS vision a reality—a real reality, not a virtual one.

Homo augmentus

For simplicity’s sake, I’m lumping virtual, augmented, and mixed realities into the single, catchall term expanded reality (XR). XR will soon be omnipresent in our daily lives, much as smartphones, social media, and the internet are today. We will wear glasses or contacts that overlay images and data onto the real world. We will hear voices, sound effects, and music via hidden earphones (we already do, to lesser degrees). Haptic devices strapped to our bodies will vibrate, push, heat, and otherwise allow us to feel and interact with the virtual worlds around us. Cameras and other sensors on and in our bodies will record our environments, biometric data, and movements. We will continue to connect to and interact with our computers, the internet, the cloud, and one another more directly and continually every year. Step by step, we ourselves are becoming the cybernetic organisms envisioned in 20th-century science fiction.

(Editor’s note: See The eLearning Guild’s white paper Using Enhanced Realities for Learning: Are You Ready? as a first step in acquiring the knowledge and skills necessary for the new world of learning through enhanced realities.)

Homo cognitus

As this happens, artificial intelligence will also continue to evolve. Inevitably, software engineers will create ways for us to communicate with our ever-more-powerful devices and smart assistants even more directly using our myriad XR and internet of things (IoT) interfaces. Alexa, Google, and Siri are just a humble beginning, a mere whisper of what will be possible perhaps only five or ten years from now. The research and databases these AIs draw from will continue to grow, as will AI’s abilities to learn. Eventually, we’ll continuously and directly communicate with our smart assistants across our countless platforms and devices, and they’ll respond with the useful and intuitive information we need.

As all this becomes possible, mundane even, intelligent tutoring systems (ITS) will inevitably employ XR-enabled smart assistants, becoming augmented intelligence tutoring systems (AITSs). I define an AITS as an intelligent tutoring system, designed specifically to teach, that communicates agnostically through virtually all devices and all platforms. The only real difference between the two is that an AITS isn’t defined by platform; a typical ITS interacts with students only via a single computer and/or a mobile app. While wearing AR/MR glasses and earphones, however, we’d continually see and/or chat with an AITS all day long, as well as message with it via phone, tablet, and laptop. Even more advanced AITSs will collect data from various sensors and cameras, interact with other objects in our virtual environment, track tons of data, and communicate through multiple haptic devices in addition to the visual and aural. An AITS’s domain will include the entirety of the virtual world and all connected devices in the real world. Sort of like Mr. Clippy crossed with Alexa crossed with Yoda.
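
To make “communicates agnostically through virtually all devices” concrete, here is a minimal sketch in Python. It is my own illustration, not a description of any existing system or API: the tutoring core treats every device, from AR glasses to haptic vests, as an interchangeable channel it can both deliver content to and collect observations from.

```python
from abc import ABC, abstractmethod

class Channel(ABC):
    """One device or platform the AITS can reach (glasses, earphones, phone, haptic vest)."""

    @abstractmethod
    def deliver(self, lesson: str) -> None:
        """Present a piece of content on this device."""

    @abstractmethod
    def sense(self) -> dict:
        """Report whatever behavioral or biometric data this device can observe."""

class GlassesChannel(Channel):
    """Stand-in for AR/MR glasses; a real driver would render overlays."""

    def deliver(self, lesson: str) -> None:
        print(f"[AR glasses] overlaying: {lesson}")

    def sense(self) -> dict:
        return {"gaze_target": "storefront"}  # placeholder sensor reading

class AITS:
    """Device-agnostic tutor core: the lesson logic never knows which hardware it runs on."""

    def __init__(self) -> None:
        self.channels: list[Channel] = []

    def register(self, channel: Channel) -> None:
        self.channels.append(channel)

    def tutor(self, lesson: str) -> dict:
        observations: dict = {}
        for channel in self.channels:
            channel.deliver(lesson)               # push content to every device at once
            observations.update(channel.sense())  # pool whatever each device observes
        return observations

tutor = AITS()
tutor.register(GlassesChannel())
print(tutor.tutor("Gothic arches push their load outward and down."))
```

Under this design, adding a new device class (a phone, a haptic vest) means writing one new Channel subclass; the tutoring logic itself never changes.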

Because why wouldn’t educators and software engineers use the most powerful technologies of the day, just as today’s teachers and students regularly use computers and the internet? Further, AITSs will access all the MBE science and educational research, observe students’ behavior and learning patterns, and match psychology with methodology to create highly personalized lessons and curricula, tailored to each student like a fingerprint.
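
As a toy illustration of what “matching psychology with methodology” might look like in code, here is a hypothetical sketch; the profile fields and selection rules are my own assumptions, not a description of any real product:

```python
from dataclasses import dataclass

@dataclass
class LearnerProfile:
    """Hypothetical per-student signals an AITS might infer from observation."""
    attention_span_min: int  # typical sustained-focus time, in minutes
    prefers_visual: bool     # inferred preference for visual material
    mastery: dict            # skill name -> estimated mastery, 0.0 to 1.0

def next_activity(profile: LearnerProfile, skills: list) -> str:
    """Pick the weakest skill and wrap it in a format matched to the learner."""
    skill = min(skills, key=lambda s: profile.mastery.get(s, 0.0))
    medium = "interactive AR walkthrough" if profile.prefers_visual else "audio tutorial"
    length = "5-minute" if profile.attention_span_min < 10 else "20-minute"
    return f"{length} {medium} on {skill}"

student = LearnerProfile(
    attention_span_min=7,
    prefers_visual=True,
    mastery={"load-bearing arches": 0.4, "cantilevers": 0.8},
)
print(next_activity(student, ["load-bearing arches", "cantilevers"]))
# -> 5-minute interactive AR walkthrough on load-bearing arches
```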

AIs will soon create uniquely customized lesson plans for each student. The Stanford report Artificial Intelligence and Life in 2030: One Hundred Year Study on Artificial Intelligence predicts: “In the next fifteen years, it is likely that human teachers will be assisted by AI technologies with better human interaction, both in the classroom and in the home.” (See the Additional Resources section at the end of this column for a link to this report.) The report continues:

“AI techniques will increasingly blur the line between formal, classroom education and self-paced, individual learning. Adaptive learning systems, for example, are going to become a core part of the teaching process in higher education because of the pressures to contain cost while serving a larger number of students and moving students through school more quickly. While formal education will not disappear, the Study Panel believes that MOOCs and other forms of online education will become part of learning at all levels, from K-12 through university, in a blended classroom experience. This development will facilitate more customizable approaches to learning, in which students can learn at their own pace using educational techniques that work best for them. Online education systems will learn as the students learn, supporting rapid advances in our understanding of the learning process. Learning analytics, in turn, will accelerate the development of tools for personalized education.”

Students will be able to access their own AITS to keep learning at any time of day for decades on end. Learning can happen while sitting in class, while walking home from soccer practice, and perhaps even while sleeping. Meanwhile, the AITS will collect more and more data (from both the student’s behaviors and the latest MBE research) to further refine learning efficacy, retention, and engagement.
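
One concrete mechanism behind that kind of refinement already exists in today’s intelligent tutoring systems: Bayesian Knowledge Tracing, which revises an estimate of skill mastery after every observed answer. The sketch below shows the standard update; the parameter values are illustrative defaults, not drawn from any particular system.

```python
def bkt_update(p_known: float, correct: bool,
               slip: float = 0.1, guess: float = 0.2, learn: float = 0.15) -> float:
    """One Bayesian Knowledge Tracing step: revise the probability that a
    student has mastered a skill, given one observed answer."""
    if correct:
        # A correct answer is evidence of mastery, discounted by lucky guesses.
        posterior = p_known * (1 - slip) / (p_known * (1 - slip) + (1 - p_known) * guess)
    else:
        # A wrong answer is evidence against mastery, discounted by careless slips.
        posterior = p_known * slip / (p_known * slip + (1 - p_known) * (1 - guess))
    # The student may also have learned the skill during this practice step.
    return posterior + (1 - posterior) * learn

p = 0.3  # prior belief that the student knows the skill
for answer in [True, False, True, True]:
    p = bkt_update(p, answer)
    print(f"mastery estimate: {p:.2f}")
```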

For example, imagine a student waiting in a long line at an ice cream shop. Her AITS plays videos demonstrating ice cream being made, chemical structures of different ice cream flavors, and formulas for trajectories and impact forces of virtual ice cream scoops if thrown across the street. While she sits under a tree outside the shop eating the ice cream, the AITS identifies the tree species and age and shows an overlay of what the street looked like 150 years ago when the tree was planted, complete with a virtual reenactment of a famous Old West shootout that took place nearby. When she finishes her ice cream, the student returns her focus to studying for her architecture course exams. She walks past and through actual buildings that demonstrate the architectural design principles she’s learning. The AITS overlays data, illustrations, and videos upon the real environment she’s seeing around her.

When I finally have access to an AITS like this, I’ll never turn it off. I’ll want to keep learning day and night, for as long as I am able, even though I finished grad school years ago. Thus, an AITS will be perfect for lifelong learners. As the educational content expands to include lessons and tutorials on virtually everything—much as YouTube and MOOCs already have for 2-D screens—we’ll all be able to use AITSs to learn any topic we want (life-wide) to any degree of mastery we want (life-deep).

Science nonfiction

Notably, the technology for all this already exists; we simply need to create the content and applications and keep doing the MBE research. Here are a few examples of current projects (see the Additional Resources section):

  • Stanford’s Virtual Human Interaction Lab (VHIL) studies how people interact in VR. They use MBE and behavioral science in VR to examine racism, childhood development, empathy, sustainability, learning in virtual classrooms, and other topics. (See an article in Learning Solutions about VHIL here.)
  • The University of Texas–Dallas Virtual Reality Social Cognition Training Department studies cognitive development and VR. Its Brain Performance Institute VR training and development programs train children with autism, ADHD, and other social learning differences, with astounding results.
  • In yet another case study of MBE science applications in VR, the University of Washington HITLab and Harborview Burn Center developed a VR game called SnowWorld to help severe burn victims cope with and reduce pain. The perception of pain has such a strong psychological component that playing a fun VR game set in a snowy landscape can actually reduce the pain felt by burn victims. Other examples of VR games that reduce pain are here.
  • Variant, an adventure video game designed to teach calculus, and ARTé: Mecenas, an art history video game, were both created by Triseum and Texas A&M University. Students as young as six years old have taught themselves calculus by playing Variant. One student played a single game for more than 60 hours and even drove to the Variant headquarters to get the solution to the final level because he couldn’t finish the game on his own but didn’t want to give up. Note that this game was assigned as extra-credit homework. This level of dedication and engagement sounds ludicrous but is not uncommon with educational games. Similarly, students play ARTé: Mecenas 10 times for two to four hours per session, on average, with much higher retention rates compared with traditional teaching methods. Note that both Variant and ARTé: Mecenas are 2-D video games, not VR games. However, serious games like these could be orders of magnitude more immersive and engaging if created in VR. How diligently would students work on homework if they got to paint alongside Michelangelo and create a masterpiece stroke by stroke, or rocket through the solar system, using calculus to avoid crashing into Jupiter or to slingshot around Saturn with a planetary gravity assist for an up-close flyby of Pluto?
  • IBM is using AI to create personalized learning tools for educators. IBM Watson Element for Educators “provides teachers with a single 360-degree view of students by consolidating various academic, social, and behavioral data sources. These insights generate suggestions on how best to help each student so they receive targeted support in the classroom more quickly.” Similarly, IBM Watson Enlight (for Educators), “built for teachers, by teachers, is a planning tool that supports teachers with curated, personalized learning content and activities to align with each student’s needs. Teachers have access to key insights into students’ academic strengths and weaknesses as they create individualized learning experiences.”
  • The University of Southern California (USC) Institute for Creative Technologies has developed “virtual humans” that look, move, and speak like real humans, albeit on large screens. (See the link in the Additional Resources section below.) These virtual humans employ MBE science and ITS technology to create learning experiences in schools, museums, and medical research facilities. Virtual humans “add a rich social dimension to computer interaction,” answering questions at any time of day so students never feel completely stuck.
  • Pearson, in partnership with IBM Watson, is creating learning experiences that improve student engagement on a broad range of topics and provide insights to teachers. The AI assistant “will assess the student’s responses to guide them with hints, feedback, explanations, and help to identify common misconceptions, working with the student at their pace to help them master the topic.”

I could list many, many more examples of existing applications and research projects that push the boundaries of XR and AI in education. A number of them are included in the Additional Resources section at the end of this column.

Homo deus

If the 9 Billion Schools organization is right that the future of education will be lifelong, life-wide, and life-deep, highly personalized, and driven by Mind Brain Education science, and if my predictions hold about what will be possible, nay inevitable, when expanded reality, augmented intelligence, and intelligent tutoring systems combine into effectively omniscient and omnipresent augmented intelligence tutoring systems, then AITSs will soon become the single most powerful educational tool the world has ever known.

Additional resources

Carnegie Learning: Mika

CourseQ: The Helpful Chatbot for Higher Ed

Faggella, Daniel. “Examples of Artificial Intelligence in Education.” TechEmergence. 7 March 2017.

Harari, Yuval Noah. Homo Deus: A Brief History of Tomorrow. New York, NY: Harper, 2017.

Hardesty, Larry. “Explained: Neural networks.” MIT News. 14 April 2017.

IBM Education

IBM Watson Education

IBM Watson Education. “First IBM Watson Education App for iPad Delivers Personalized Learning for K-12 Teachers and Students.” 19 October 2016.

Letzter, Rafi. “IBM’s brilliant AI just helped teach a grad-level college course.” Business Insider. 27 May 2016.

Miles, Kathleen. “Ray Kurzweil: In The 2030s, Nanobots In Our Brains Will Make Us ‘Godlike.’” Huffington Post. 1 October 2015.

Muehlenbrock, Martin. “Learning Group Formation based on Learner Profile and Context.” German Research Center for Artificial Intelligence (DFKI). January 2006.

Netex: learningCloud and smartED

Newton, Casey. “Can AI fix education? We asked Bill Gates.” The Verge. 25 April 2016.

Perez, Sarah. “Sesame Workshop and IBM team up to test a new A.I.-powered teaching method.” TechCrunch. 7 June 2017.

Peterson, Dan. “AdmitHub launches first college chatbot with Georgia State.” AdmitHub Blog. 17 May 2016.

RAND Corporation, The. “Effectiveness Research: Carnegie Learning Study Flyer.” August 2017.

Rea, Shilo. “Teaching Science to the Brain: Carnegie Mellon Scientists Discover How the Brain Learns the Way Things Work.” Carnegie Mellon University News. 17 March 2015.

Singer, Natasha. “How Google Took Over the Classroom.” New York Times. 13 May 2017.

Slyker, Karen. “Professor honored for research on impact of virtual reality on learning.” Texas Tech Today. 19 May 2017.

Stanford University. Artificial Intelligence and Life in 2030: One Hundred Year Study on Artificial Intelligence. 2016.

Stanford University. Galileo Correspondence Project.

Swartout, William, et al. “Virtual Humans for Learning.” AI Magazine. Winter 2013.

von Radowitz, John. “Intelligent machines will replace teachers within 10 years, leading public school headteacher predicts.” Independent. 11 September 2017.
