When I started this column, my editor had several suggested topics. What they had in common was that they all involved evaluating things with acronyms. I wasn’t too keen, because each of them was somewhat on the wane. However, I decided I could do them all in one swell foop (as the saying goes).

What she wanted was an update on three acronyms: ADDIE (Analysis – Design – Development – Implementation – Evaluation), SCORM (Shareable Content Object Reference Model), and MOOCs (Massive Open Online Courses). Since the request, their status has changed, but not significantly. So, here’s an update on these topics, and I’ll close with some off-the-cuff thoughts about trends.

Is ADDIE dead?

ADDIE was developed in 1975 by Florida State University for the US Army. Since then, it has become a mainstream design approach across instructional design. It hasn’t, of course, stayed static, evolving with new understandings and external pressures. Is it still relevant?

One issue needs to be made clear: ADDIE is not an instructional theory. It makes no claims about which practices best achieve learning ends. It’s not Elaboration Theory, Cognitive Load Theory, Four-Component Instructional Design, or any other approach to determining instructional requirements. Instead, it’s a process model: the steps to be taken to go from a need to an implemented solution. And you can slot any theory you want into the design phase!

When you look across design domains (interface, industrial, graphics, and more), you’ll see predominantly three-step and four-step models. Some fold the analysis into the design, or, more commonly, lump the implementation in with the development. ADDIE separates out each step, and with plausible reason. Design Thinking, the new umbrella term for design approaches, asks you to diverge and converge on the problem (analysis) before you do the same for design solutions. And the issues involved in implementation can plausibly be separated from those of development.

The big issue is whether ADDIE has kept up with the times. One of its downsides is that it originated as a ‘waterfall’ model: each phase leads to the next, and then you’re done. As interface design recognized in the 80s, an iterative approach tends to uncover flawed initial assumptions and adapt to increased user awareness through testing. Of course, one of the adaptations of ADDIE has been to become iterative. Still, approaches like Michael Allen’s Successive Approximation Model (SAM) or Torrance Learning’s LLAMA (Lot Like Agile Management Approach) more naturally emphasize iteration.

What does this mean for answering the question? In principle, ADDIE is just fine. If, however, it has made it easy for your organization to make one pass instead of testing and refining, then it’s a burden, not a boon. My inclination is to abandon ADDIE with its baggage and take up a new approach, just to keep the focus on iteration. As the saying goes: “your mileage may vary”.

What’s the status of MOOCs?

A term that’s more recent, and has ascended (and descended) more rapidly, is the MOOC. These started as higher-education, open-enrollment asynchronous courses for self-paced learning, designed so that many people could take them. There might be synchronous lectures, but at the scale at which they were being taken up (10K learners at a time), assessment was all auto-marked or self-evaluated. And they, too, changed over time.

There were two relatively immediate phenomena. One was that many folks started, but the completion rate tended to hover around 10 percent. This wasn’t viewed positively. Another was that they were free, but if you wanted certification for completion there typically was a fee to pay.

Another development was the emergence of MOOC platforms. While in theory any LMS could be used, the scale tended to require re-engineering, and dedicated platforms were built. Some were collaborations; others came out of particular laboratories. These platforms quickly combined capabilities with specific business models. Businesses grew up around the different offerings, including selling lists of students to potential employers (e.g., software course finishers to tech companies).

The proponents of MOOCs argued that 10 percent made sense; these were people who went in, found what they wanted, and didn’t care about completion. In fact, the constructivist versions of this were designed for self-directed learning. Others argued that such a rate meant that there were significant problems. And one of those problems soon became obvious.

The lack of external evaluation, owing to the scale of the courses, was problematic. Students quickly formed their own groups to at least share their understandings. Subsequently, many MOOC platforms added social capabilities, including peer review. This emerged as a way to manage scale, and instructors might supervise the peer review to keep it on track.

Still, responding to canned questions isn’t necessarily a good way to learn. Complex topics typically require complex responses, which are still hard to evaluate automatically. Critics rightly noted that such courses might teach you something about AI (a popular topic), but wouldn’t make you an AI engineer.

Ultimately, I think MOOCs have morphed into more traditional courses. The model of ‘free’ didn’t last, and the lack of interaction with instructors was too critical. Instead, we’re seeing a movement to more astute pedagogy and different approaches to meet urgent skilling needs. I think MOOCs are, thankfully, gone, replaced by more appropriate models for delivering learning experiences. They may work for relatively straightforward topics, but those are infrequently of meaningful interest.

Does xAPI replace SCORM?

Another acronym, xAPI (the Experience API), is a new standard from the same folks who gave us SCORM: the Advanced Distributed Learning (ADL) initiative of the US Department of Defense. This raises the legitimate question of whether there’s still a role for SCORM. Answering it requires examining the motivations and history of the two standards.

Tired of the endless ‘angels dancing on the head of a pin’ arguments that plagued efforts to create an interoperability standard, ADL finally chose a ‘good-enough’ interpretation. Labeled SCORM, and with the weight of the US government behind it, plus effort to generate awareness and uptake, it gained a foothold and became the de facto standard.

And it worked; while there were initial hiccups, eventually SCORM became a reasonable bet that content developed for one LMS would migrate to another. However, there was a gap: the reporting was at the ‘course’ level. If you wanted finer granularity—for instance, to see what people were accessing, or how independent elements were doing—you were out of luck. Or, rather, you had to create your own mechanisms.

Inspired by the detailed data available through web activity tracking, there was a push for finer granularity. The result was xAPI, a simple standard for reporting data in a <who> <did> <what> format. This required a new mechanism to aggregate the data, and the LRS (Learning Record Store) was born. The data alone isn’t necessarily useful, but correlating ‘who does what’ with outcomes from other business intelligence systems starts giving a richer picture of performance. xAPI isn’t the only such standard—IMS, for instance, has a similar standard called Caliper with a Sensor API—but xAPI is more workplace-focused while Caliper is more targeted at higher ed.
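As a concrete sketch of that <who> <did> <what> shape: an xAPI statement is really just JSON with an actor, a verb (identified by an IRI), and an object. Here it is built in Python; the learner name, email, and activity URL are made-up examples, and only the verb IRI follows ADL’s published vocabulary.

```python
import json

# A minimal xAPI statement: <who> (actor) <did> (verb) <what> (object).
statement = {
    "actor": {
        "name": "Pat Learner",               # hypothetical learner
        "mbox": "mailto:pat@example.com",    # hypothetical email
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",  # ADL verb IRI
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/safety-101",     # hypothetical activity
        "definition": {"name": {"en-US": "Safety 101"}},
    },
}

# An LRS would receive this statement as JSON over its statements endpoint.
print(json.dumps(statement, indent=2))
```

The point of the simple triple structure is that anything can be reported this way: course completions, but also page visits, coaching conversations, or on-the-job performance data.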

All this doesn’t answer the question: does xAPI replace SCORM? What does is the fact that ADL has released a specification that accomplishes the same thing SCORM does. The cmi5 specification is, at heart, a set of xAPI statements plus rules that amount to a simplified, better SCORM. Ultimately, xAPI is a richer format for more types of data, and cmi5 is set to supersede SCORM. Yes, SCORM is dead, but xAPI was only an enabler.
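To make the ‘xAPI statements plus rules’ idea concrete, here’s a hedged Python sketch: a course reports its lifecycle (launched, passed, and so on) using a small set of cmi5-defined verbs, and a checker enforces that rule. The learner and course identifiers are invented, and the verb list is illustrative of (not a complete copy of) the spec’s defined verbs.

```python
# A subset of cmi5's "defined" verbs: the constrained vocabulary a
# course must use to report lifecycle events, much as SCORM tracked
# a lesson's status. IRIs here are illustrative of the spec.
CMI5_DEFINED_VERBS = {
    "http://adlnet.gov/expapi/verbs/launched",
    "http://adlnet.gov/expapi/verbs/initialized",
    "http://adlnet.gov/expapi/verbs/completed",
    "http://adlnet.gov/expapi/verbs/passed",
    "http://adlnet.gov/expapi/verbs/failed",
    "http://adlnet.gov/expapi/verbs/terminated",
}

def is_cmi5_defined(statement: dict) -> bool:
    """Return True if the statement's verb is in the cmi5 defined set."""
    return statement.get("verb", {}).get("id") in CMI5_DEFINED_VERBS

# A course reporting a passing score, SCORM-style, as an xAPI statement:
passed = {
    "actor": {"mbox": "mailto:pat@example.com"},  # hypothetical learner
    "verb": {"id": "http://adlnet.gov/expapi/verbs/passed"},
    "object": {"id": "http://example.com/courses/safety-101"},
    "result": {"score": {"scaled": 0.9}, "success": True},
}
print(is_cmi5_defined(passed))  # True — "passed" is a defined verb
```

The rules are what make the statements interoperable: an LMS can launch any cmi5 course and know exactly which statements to expect, while arbitrary xAPI statements can still flow alongside for richer analytics.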

From acronym to buzzword

As a slight aside, it appears that there’s a movement away from acronyms. (Maybe we’ve hit acronym fatigue!) Regardless, we’re not missing out on buzzwords; they’re just becoming more phrase-like. Right now, the hot topic is workflow learning, though microlearning isn’t dead yet. And the aforementioned Design Thinking (even I’m guilty) is in vogue.

The quick point is that there will always be shiny new objects with the associated hype. Whether or not it’s a pendulum swing from acronyms to buzz phrases, we always need to be discerning about what’s real (and what’s not). It pays to track the trends, do your due diligence, understand the real opportunities, and engage when it makes sense. Make sense to you?