This winter, Lauren Della Bella and Dick Thomas will publish their book 9 Billion Schools: Why the World Needs Personalized Lifelong Learning for All. (Full disclosure: I wrote a chapter in this book on VR and the future of education, and I also co-wrote the chapter on AI and the future of education. The views and forecasts in this column represent the opinions of the author, not of 9 Billion Schools or of Learning Solutions Magazine.)

As described in detail in the book, the 9 Billion Schools (9BS) organization maintains that:

  • Education shouldn’t end when we exit formal school systems (i.e., high schools or universities), but instead should be lifelong, life-wide (including every topic we wish to learn), and life-deep (going as deep into these topics as we wish)—abbreviated as L3
  • Education should be highly personalized
  • Educators and instructional designers should use Mind Brain Education (MBE) science to massively and continually improve educational models and methods

Augmented intelligence, which is artificial intelligence that communicates and interfaces with us through virtual or augmented reality, is the precise technological tool we need to make the 9BS vision a reality—a real reality, not a virtual one.

Homo augmentus

For simplicity’s sake, I’m lumping virtual, augmented, and mixed realities into the single, catchall term expanded reality (XR). XR will soon be omnipresent in our daily lives, much as smartphones, social media, and the internet are today. We will wear glasses or contacts that overlay images and data onto the real world. We will hear voices, sound effects, and music via hidden earphones (we already do, to a lesser degree). Haptic devices strapped to our bodies will vibrate, push, heat, and otherwise allow us to feel and interact with the virtual worlds around us. Cameras and other sensors on and in our bodies will record our environments, biometric data, and movements. We will continue to connect to and interact with our computers, the internet, the cloud, and one another more directly and continually every year. Step by step, we ourselves are becoming the cybernetic organisms envisioned in 20th-century science fiction.

(Editor’s note: See The eLearning Guild’s white paper Using Enhanced Realities for Learning: Are You Ready? as a first step in acquiring the knowledge and skills necessary for the new world of learning through enhanced realities.)

Homo cognitus

As this happens, artificial intelligence will also continue to evolve. Inevitably, software engineers will create ways for us to communicate with our ever-more-powerful devices and smart assistants even more directly using our myriad XR and internet of things (IoT) interfaces. Alexa, Google, and Siri are just a humble beginning, a mere whisper of what will be possible perhaps only five or ten years from now. The research and databases these AIs draw from will continue to grow, as will their ability to learn. Eventually, we’ll continuously and directly communicate with our smart assistants across our countless platforms and devices, and they’ll respond with the useful, intuitive information we need.

As all this becomes possible, mundane even, intelligent tutoring systems (ITSs) will inevitably employ XR-enabled smart assistants, becoming augmented intelligence tutoring systems (AITSs). I define an AITS as an intelligent tutoring system that communicates agnostically through virtually all devices and platforms. The only real difference between the two is that a typical ITS is bound to a platform, usually interacting with students via a single computer and/or mobile app, while an AITS isn’t defined by platform at all. While wearing AR/MR glasses and earphones, we’d continually see and/or chat with an AITS all day long, as well as message with it via phone, tablet, and laptop. Even more advanced AITSs will collect data from various sensors and cameras, interact with other objects in our virtual environments, track enormous amounts of data, and communicate through multiple haptic devices in addition to the visual and aural channels. An AITS’s domain will include the entirety of the virtual world and all connected devices in the real world. Sort of like Mr. Clippy crossed with Alexa crossed with Yoda.
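
To make “platform-agnostic” concrete, here is a minimal sketch of what an AITS’s plumbing might look like: a single tutor object that any device can push observations into and that any device can register with to render responses. The class names, channel labels, and canned hint logic are hypothetical placeholders for illustration, not a description of any existing system.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Observation:
    """An event arriving from any connected device: AR glasses, earphones,
    phone, haptic vest, biometric sensor, and so on (all hypothetical)."""
    channel: str   # e.g., "ar_glasses", "earphones", "phone"
    kind: str      # e.g., "speech", "gaze", "location", "heart_rate"
    payload: dict

@dataclass
class Response:
    channel: str   # which device should render this response
    kind: str      # e.g., "overlay", "audio", "haptic", "message"
    payload: dict

class AugmentedIntelligenceTutor:
    """A toy platform-agnostic tutor: any device registers as an output
    channel, and any device can push observations into the same loop."""

    def __init__(self):
        self.outputs: Dict[str, Callable[[Response], None]] = {}

    def register_output(self, channel: str, render: Callable[[Response], None]):
        self.outputs[channel] = render

    def observe(self, obs: Observation):
        # A real AITS would update a learner model and consult MBE research
        # here; this sketch just routes a canned hint to the AR glasses.
        if obs.kind == "speech" and "question" in obs.payload:
            hint = f"Here's a hint about {obs.payload['question']}"
            self.respond(Response("ar_glasses", "overlay", {"text": hint}))

    def respond(self, response: Response):
        render = self.outputs.get(response.channel)
        if render:
            render(response)

# The same tutor object serves glasses, phone, and haptic devices alike.
tutor = AugmentedIntelligenceTutor()
tutor.register_output("ar_glasses", lambda r: print("[glasses]", r.payload["text"]))
tutor.observe(Observation("earphones", "speech", {"question": "gravity assists"}))
```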

Because why wouldn’t educators and software engineers use the most powerful technologies of the day, just as today’s teachers and students regularly use computers and the internet? Further, AITSs will access all the MBE science and educational research, observe students’ behavior and learning patterns, and match psychology with methodology to create highly personalized lessons and curricula, tailored to each student like a fingerprint.
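
As one toy illustration of “matching psychology with methodology,” the heuristic below picks a next lesson by weighing topic weakness, preferred modality, and attention span. The learner fields, weights, and lesson pool are invented for the example; a real AITS would draw on MBE research and far richer behavioral data rather than three hand-picked attributes.

```python
# Hypothetical learner profile and lesson pool (illustrative values only).
learner = {"mastery": {"derivatives": 0.35, "integrals": 0.80},
           "prefers": "visual",          # inferred preferred modality
           "attention_span_min": 12}

lessons = [
    {"topic": "derivatives", "modality": "visual", "duration_min": 10},
    {"topic": "derivatives", "modality": "text",   "duration_min": 25},
    {"topic": "integrals",   "modality": "visual", "duration_min": 8},
]

def score(lesson, learner):
    """Prefer weak topics, a matching modality, and lessons short enough to
    fit the learner's attention span. Purely illustrative weights."""
    need = 1.0 - learner["mastery"].get(lesson["topic"], 0.0)
    fit = 1.0 if lesson["modality"] == learner["prefers"] else 0.5
    fits_time = 1.0 if lesson["duration_min"] <= learner["attention_span_min"] else 0.3
    return need * fit * fits_time

next_lesson = max(lessons, key=lambda l: score(l, learner))
print(next_lesson)  # -> the short visual derivatives lesson
```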

AIs will soon create uniquely customized lesson plans for each student. The Stanford report Artificial Intelligence and Life in 2030: One Hundred Year Study on Artificial Intelligence predicts: “In the next fifteen years, it is likely that human teachers will be assisted by AI technologies with better human interaction, both in the classroom and in the home.” (See the Additional Resources section at the end of this column for a link to this report.) The report continues:

“AI techniques will increasingly blur the line between formal, classroom education and self-paced, individual learning. Adaptive learning systems, for example, are going to become a core part of the teaching process in higher education because of the pressures to contain cost while serving a larger number of students and moving students through school more quickly. While formal education will not disappear, the Study Panel believes that MOOCs and other forms of online education will become part of learning at all levels, from K-12 through university, in a blended classroom experience. This development will facilitate more customizable approaches to learning, in which students can learn at their own pace using educational techniques that work best for them. Online education systems will learn as the students learn, supporting rapid advances in our understanding of the learning process. Learning analytics, in turn, will accelerate the development of tools for personalized education.”

Students will be able to access their own AITS to keep learning at any time of day for decades on end. Learning can happen while sitting in class, while walking home from soccer practice, and perhaps even while sleeping. Meanwhile, the AITS will collect more and more data (from both the student’s behaviors and the latest MBE research) to further refine learning efficacy, retention, and engagement.
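
One classic, well-documented way an ITS refines a learner model from a stream of observations is Bayesian knowledge tracing. The sketch below shows a single-skill update after each answer; the parameter values are arbitrary defaults chosen for illustration, and an actual AITS would combine many such signals across skills, sensors, and contexts.

```python
def bkt_update(p_know, correct, p_slip=0.10, p_guess=0.20, p_learn=0.15):
    """One step of Bayesian knowledge tracing (BKT): update the estimated
    probability that the student has mastered a skill after one answer."""
    if correct:
        posterior = (p_know * (1 - p_slip)) / (
            p_know * (1 - p_slip) + (1 - p_know) * p_guess)
    else:
        posterior = (p_know * p_slip) / (
            p_know * p_slip + (1 - p_know) * (1 - p_guess))
    # The student may also have learned something from the attempt itself.
    return posterior + (1 - posterior) * p_learn

# A stream of observed answers (True = correct), gathered in class, on the
# walk home, or anywhere else the AITS is paying attention.
p = 0.2  # initial estimate of mastery
for answer in [False, True, True, False, True, True, True]:
    p = bkt_update(p, answer)
    print(f"estimated mastery: {p:.2f}")
```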

For example, imagine a student waiting in a long line at an ice cream shop. Her AITS plays videos demonstrating how ice cream is made, the chemical structures of different ice cream flavors, and formulas for the trajectories and impact forces of virtual ice cream scoops thrown across the street. While she sits under a tree outside the shop eating her ice cream, the AITS identifies the tree’s species and age and shows an overlay of what the street looked like 150 years ago when the tree was planted, complete with a virtual reenactment of a famous Old West shootout that took place nearby. When she finishes her ice cream, the student returns her focus to studying for her architecture course exams. She walks past and through actual buildings that demonstrate the architectural design principles she’s learning, and the AITS overlays data, illustrations, and videos on the real environment around her.

When I finally have access to an AITS like this, I’ll never turn it off. I’ll want to keep learning day and night, for as long as I am able, even though I finished grad school years ago. Thus, an AITS will be perfect for lifelong learners. As the educational content expands to include lessons and tutorials on virtually everything—much as YouTube and MOOCs already have for 2-D screens—we’ll all be able to use AITSs to learn any topic we want (life-wide) to any degree of mastery we want (life-deep).

Science nonfiction

Notably, the technology for all this already exists; we simply need to create the content and applications and keep doing the MBE research. Here are a few examples of current projects (see the Additional Resources section):

  • Stanford’s Virtual Human Interaction Lab (VHIL) studies how people interact in VR. The lab applies MBE and behavioral science in VR to examine racism, childhood development, empathy, sustainability, learning in virtual classrooms, and other topics. (Learning Solutions has previously published an article about VHIL.)
  • The University of Texas at Dallas Virtual Reality Social Cognition Training program studies cognitive development and VR. Its Brain Performance Institute VR training and development programs train children with autism, ADHD, and other social learning differences, with astounding results.
  • In yet another case study of MBE science applications in VR, the University of Washington HITLab and Harborview Burn Center developed a VR game called SnowWorld to help severe burn victims cope with and reduce pain. The perception of pain has such a strong psychological component that playing a fun VR game set in a snowy landscape can actually reduce the pain burn victims feel. Other VR games have been developed to reduce pain as well.
  • Variant, an adventure video game designed to teach calculus, and ARTé: Mecenas, an art history video game, were both created by Triseum and Texas A&M University. Students as young as six years old have taught themselves calculus by playing Variant. One student played a single game for more than 60 hours and even drove to the Variant headquarters to get the solution to the final level because he couldn’t finish the game on his own but didn’t want to give up. Note that the game had been assigned only as extra-credit homework. This level of dedication and engagement sounds ludicrous but is not uncommon with educational games. Similarly, students play ARTé: Mecenas an average of 10 times, at two to four hours per session, with much higher retention rates compared with traditional teaching methods. Note that both Variant and ARTé: Mecenas are 2-D video games, not VR games. However, serious games like these could be orders of magnitude more immersive and engaging if created in VR. How diligently would students work on homework if they got to paint alongside Michelangelo and create a masterpiece stroke by stroke, or rocket through the solar system, using calculus to avoid crashing into Jupiter or to slingshot around Saturn with a planetary gravity assist for an up-close flyby of Pluto?
  • IBM is using AI to create personalized learning tools for educators. IBM Watson Element for Educators “provides teachers with a single 360-degree view of students by consolidating various academic, social, and behavioral data sources. These insights generate suggestions on how best to help each student so they receive targeted support in the classroom more quickly.” Similarly, IBM Watson Enlight (for Educators), “built for teachers, by teachers, is a planning tool that supports teachers with curated, personalized learning content and activities to align with each student’s needs. Teachers have access to key insights into students’ academic strengths and weaknesses as they create individualized learning experiences.”
  • The University of Southern California (USC) Institute for Creative Technologies has developed “virtual humans” that look, move, and speak like real humans, albeit on large screens. (See the link in the Additional Resources section below.) These virtual humans employ MBE science and ITS technology to create learning experiences in schools, museums, and medical research facilities. Virtual humans “add a rich social dimension to computer interaction,” answering questions at any time of day so students never feel completely stuck.
  • Pearson, in partnership with IBM Watson, is creating learning experiences that improve student engagement on a broad range of topics and provide insights to teachers. The AI assistant “will assess the student’s responses to guide them with hints, feedback, explanations, and help to identify common misconceptions, working with the student at their pace to help them master the topic.”

I could list many, many more examples of existing applications and research projects that push the boundaries of XR and AI in education. A number of them are included in the Additional Resources section at the end of this column.

Homo deus

If the 9 Billion Schools organization is right that the future of education will be lifelong, life-wide, and life-deep, highly personalized, and driven by Mind Brain Education science, and if my predictions are right about what will be possible, nay inevitable, when expanded reality, augmented intelligence, and intelligent tutoring systems combine into effectively omniscient and omnipresent augmented intelligence tutoring systems, then those systems will soon become the single most powerful educational tool the world has ever known.

Additional resources

Carnegie Learning: Mika

CourseQ: The Helpful Chatbot for Higher Ed

Faggella, Daniel. “Examples of Artificial Intelligence in Education.” TechEmergence. 7 March 2017.

Harari, Yuval Noah. Homo Deus: A Brief History of Tomorrow. New York, NY: Harper, 2017.

Hardesty, Larry. “Explained: Neural networks.” MIT News. 14 April 2017.

IBM Education

IBM Watson Education

IBM Watson Education. “First IBM Watson Education App for iPad Delivers Personalized Learning for K-12 Teachers and Students.” 19 October 2016.

Letzter, Rafi. “IBM’s brilliant AI just helped teach a grad-level college course.” Business Insider. 27 May 2016.

Miles, Kathleen. “Ray Kurzweil: In The 2030s, Nanobots In Our Brains Will Make Us ‘Godlike.’” Huffington Post. 1 October 2015.

Muehlenbrock, Martin. “Learning Group Formation based on Learner Profile and Context.” German Research Center for Artificial Intelligence (DFKI). January 2006.

Netex: learningCloud and smartED

Newton, Casey. “Can AI fix education? We asked Bill Gates.” The Verge. 25 April 2016.

Perez, Sarah. “Sesame Workshop and IBM team up to test a new A.I.-powered teaching method.” TechCrunch. 7 June 2017.

Peterson, Dan. “AdmitHub launches first college chatbot with Georgia State.” AdmitHub Blog. 17 May 2016.

RAND Corporation. “Effectiveness Research: Carnegie Learning Study Flyer.” August 2017.

Rea, Shilo. “Teaching Science to the Brain: Carnegie Mellon Scientists Discover How the Brain Learns the Way Things Work.” Carnegie Mellon University News. 17 March 2015.

Singer, Natasha. “How Google Took Over the Classroom.” New York Times. 13 May 2017.

Slyker, Karen. “Professor honored for research on impact of virtual reality on learning.” Texas Tech Today. 19 May 2017.

Stanford University. Artificial Intelligence and Life in 2030: One Hundred Year Study on Artificial Intelligence. 2016.

Stanford University. Galileo Correspondence Project.

Swartout, William, et al. “Virtual Humans for Learning.” AI Magazine. Winter 2013.

von Radowitz, John. “Intelligent machines will replace teachers within 10 years, leading public school headteacher predicts.” Independent. 11 September 2017.