Oversimplification is pernicious. I recently attended an event at which presenters offered short demonstrations of emerging technologies. Among these were several providing introductory content about artificial intelligence (AI), augmented reality (AR), virtual reality (VR), and the like. One speaker demonstrated how easy it is to get started building AI-powered chatbots. The base product turned out to be a tool meant for building branching simulations that could be accessed via mobile device. Touting his creation as a “leadership coach,” the presenter offered several examples that were nothing more than bad multiple choice questions with “gimme” answers:

“Jim, your employee, is coming to work later and later every day. While he is productive when he’s at work, being late means he sometimes misses important updates and changes, and can impact other staff. What should you do?”
A. Ignore the problem.
B. Speak to Jim about the problem, explaining the way his tardiness affects workflow and his coworkers.
C. Punch Jim in the nose.
D. Call in sick.

The “mentee” then received feedback on his choice (“Great answer!”) and another question based on his answer, with options to go back and try again if the chosen answer was not the best.

So what? Well, a few things:

This is not a good example of a chatbot. It’s not even a good example of a multiple choice question. The presenter did a quick demo of the tool, showing how easily one could create a similar interaction. Attendees new to the technology who really wanted to learn more about AI were not served by this oversimplified approach: They left thinking that a chatbot is just a scenario tool, built by inputting simple scripts. Rather than spend time showing examples of what a good chatbot might do, and maybe a bit of the background involved in programming one (see the eLearning Guild’s research report, The Human Side of Technology, for an example of the complexity behind Domino’s Pizza Pal ordering tool), the demonstrator chose to show how quickly you could program a simple multiple choice question that would work on a phone. It looked effortless—and the end result showed that. Remember this the next time you’re caught in an endless customer service chat loop with a bot programmed with only minimal scripting and little understanding of a customer’s needs.

[To be clear: There is nothing wrong with the product itself. If your goal is to build scenarios with branching decision making and feedback, it seems like a great solution. It was easy to use and launch. The company website touts it as a tool for creating stories and games. Given its capacity for outlining complex scenarios, I can even see it as a possible tool for prototyping or designing a bot, but the person demonstrating didn’t suggest that, either.]
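For readers who want to see the gap in concrete terms, here is a deliberately toy Python sketch. It is hypothetical code, not taken from the demoed product (which wasn’t named) or any vendor’s tool: the first half shows that the “chatbot” in the demo amounts to a hard-coded branching script, while the second half hints at what even the most minimal real bot has to do—interpret whatever text a user actually types.

```python
# Toy illustration only: hypothetical code, not the demoed product or any vendor's API.
# The "scenario" approach is a hard-coded decision tree keyed to lettered choices.
SCENARIO = {
    "start": {
        "prompt": "Jim is coming to work later and later every day. What should you do?",
        "choices": {
            "A": ("Ignore the problem.", "try_again"),
            "B": ("Speak to Jim about how his tardiness affects the team.", "great_answer"),
            "C": ("Punch Jim in the nose.", "try_again"),
            "D": ("Call in sick.", "try_again"),
        },
    },
}

def run_scenario(node: str, pick: str) -> str:
    """Return canned feedback for a lettered choice -- no language understanding involved."""
    _, outcome = SCENARIO[node]["choices"][pick]
    return "Great answer!" if outcome == "great_answer" else "Not quite. Go back and try again."

# Even a bare-bones chatbot must interpret free text the user types, which is where
# the real analysis and scripting effort begins (real bots use NLP, not keyword lists).
def match_intent(utterance: str) -> str:
    text = utterance.lower()
    if any(word in text for word in ("late", "tardy", "tardiness")):
        return "attendance_coaching"
    if any(word in text for word in ("conflict", "argument", "upset")):
        return "conflict_coaching"
    return "fallback"  # a good bot needs a graceful path for everything it doesn't understand

print(run_scenario("start", "B"))                         # -> Great answer!
print(match_intent("My employee keeps showing up late"))  # -> attendance_coaching
```

The first half is everything the demo showed. The second half is where the scoping, scripting, and testing work of a real chatbot actually lives.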

“Is There a Rapid Tool for This?”

I saw something similar in an “Introduction to Virtual Reality” session, this time flipped from speaker to learner. Before the presenter even got going, a hand shot up and a woman asked, “Is there a rapid tool for this?” This from an attendee who admitted having no experience with AR—not even Pokémon Go—and virtually no understanding of what VR can do or how we might use it in the workplace.

The Ultimate Cost

Yes, you can create a multiple choice question easily. I’m sure there’s some quick tool that will let you overlay hats and mustaches over photos of coworkers. But here’s the thing: Time and time again, people in learning and development latch on to the next new thing, don’t take time to fully understand it, and find a business only too happy to sell them a “rapid” tool for it. We do it badly, and then the powers that be say, “We tried chatbots/AR/VR/interactive video/social media tools/QR codes…and it didn’t work.” Or, as with the multiple choice click-an-answer approach, we see a new tool and leap on it only to replicate what we’ve always done. I remember the first time I saw Second Life showcased at a conference—featuring a virtual presenter standing behind a virtual podium presenting slides to a virtual audience.

The Better Investment

A better use of time and energy is investing in learning more about the tool that interests you—or, better yet, getting clear on a problem and looking for the best solution to it. Take yourself on a tour of good chatbots, like the cognitive behavior therapy-based Woebot, which offers those struggling with depression and anxiety an always-on listening ear, curated videos, daily check-ins, and more. Recent research showed that Woebot is therapeutic, and its anonymity is especially appealing to young people uncomfortable with disclosing a mental health problem.

Chatbots are becoming increasingly “human,” able to recognize emotion and use predictive analytics to help customers, clients, and learners. Take a look at “best chatbot” showcases, and pay attention to the level of analysis and the complexity of scripting that a good chatbot requires. For more on the basics of building a chatbot, check out this concise, realistic take from Forbes. Do the same when approaching AR, VR, and mixed realities. Then with interactive video. And social media tools. And QR codes. And don’t miss recent eLearning Guild reports on AI, AR and VR in the workplace, designing for AR and VR, and emerging technologies promising to change not only the future of training but the future of humans.
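To make “level of analysis” a little less abstract, here is a minimal, purely illustrative Python sketch of intent classification, one small ingredient behind bots that seem to understand what people want. The training phrases and intent names are invented for this example, and real platforms layer dialogue management, sentiment analysis, and much larger data sets on top of anything like this.

```python
# Requires scikit-learn (pip install scikit-learn). Illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# In practice, training data comes from analyzing how real users actually phrase things;
# a handful of made-up rows is only enough to show the mechanics.
training_utterances = [
    "my order never arrived", "where is my pizza",
    "I want to cancel my subscription", "please cancel my account",
    "I'm feeling really anxious today", "I can't stop worrying",
]
intents = [
    "order_status", "order_status",
    "cancel_account", "cancel_account",
    "emotional_support", "emotional_support",
]

# Turn free text into features, then learn which phrasing maps to which intent.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(training_utterances, intents)

print(classifier.predict(["has my pizza shipped yet"])[0])     # likely "order_status"
print(classifier.predict(["I've been so worried lately"])[0])  # likely "emotional_support"
```

Even this toy version needs labeled examples of how people really talk; collecting and curating those utterances is exactly the analysis work the quick-demo mindset skips.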

We all work under time and economic constraints, but when you’re tempted by a fast/cheap/easy approach, remember that at the end of the day our output still needs to be good. Taking the time to investigate and explore will pay off in better products for our learners.

Want more?

Listen to Jane Bozarth present a recap of recent eLearning Guild research at DevLearn in Las Vegas, October 23-25, and experience the return of her popular ukulele learning session, benefiting the Children’s Hospital of Nevada.