It wouldn’t be surprising if the attendees of this year’s Learning Solutions Conference & Expo and Eco 2016 looked at the theme for the show (“Sharing What Works”) and thought to themselves, “Right, the speakers are sharing effective strategies, we as attendees are sharing our own—yeah, I got it.”

But what if “Sharing What Works” is far more than a disposable, three-word tagline? What if it’s literally the key to future success in a rapidly changing L&D industry?

Clark Quinn said at the show, “L&D has the opportunity to be the most valuable part of your business.” And as controversial or far-fetched as that may sound, he’s absolutely correct—and moving the bulk of observation and sharing outside of training is how you get there.

Bill Nye kicked off Learning Solutions 2016 with a frenetic keynote that dove into everything from WWII codebreaking to mechanical engineering in the 1970s to the future of space exploration. But it was his statement, "Learning starts with observation," that I found more powerful than any of the amazing scientific breakthroughs he shared.

How much observation do we have into the reasons an eLearning course was designed the way it was, or the reasons it was produced at all? What barriers to availability keep that course from being observable and consumable by its learners? How much observation does the business side of your organization have into how training's efforts are truly impacting the business?

Many of the week’s speakers who followed Nye’s keynote (not an easy feat!) shared familiar challenges to scaling up observation, and some great ways to overcome them.

For observability at a granular level, Carrie Ann Desnoyers and Jennifer Hendryx tackled the challenges around making learning objects reusable, interoperable, durable, and accessible. As yesterday's bulky eLearning courses are replaced with easier-to-consume microlearning units, it will be increasingly important to build all of those characteristics into their original design. By doing so, your learning objects can be reused across a variety of courses, remain interoperable on any LMS, and ultimately stay accessible whenever and wherever designers, instructors, and learners need them.

Moving from learning objects up to the accessibility of entire courses, I had the chance to attend Nina Talley's conference session and to interview her about her role in providing technical support for virtual, instructor-led training deliveries around the globe—often to low-bandwidth areas. Nina shared not only the familiar headaches caused by latency and bandwidth constraints, but also the responsibility L&D shoulders for equipping learners to troubleshoot issues by working with them long before a virtual class ever begins, or even before the class is developed.

And this is just the start of the observability we're looking for, which amounts to what I'm calling "Sharing What Works 2.0." So many Learning Solutions speakers encouraged those of us in the L&D field to get out of our comfort zones, cast aside our silos, and align ourselves with everyone from IT (yes, even IT) to product management to sales, and to bring the business and customers into the eLearning design process.

This isn't just the L&D industry sharing what works for training alone. This is L&D collaborating with other departments in an organization and sharing what works for them. According to conference speaker Clark Quinn, too few L&D professionals are leveraging that kind of feedback and data to incorporate business impact into design. Doing so eliminates unjustified assumptions about the level of business impact that training will make, and it greatly increases the probability that the expected impact is delivered every time.

“We aren’t doing a good enough job,” said Quinn. “Only five percent of L&D organizations excel at using data to align and run efficiently.”

When Quinn asked attendees to name some of the metrics applied at their own organizations, his point was proven and immediately hit home for many. Participants named all-too-common metrics: "pass or fail," "number of courses created or completed," "time spent developing courses," "likes and shares." While these metrics can lead to insights around efficiency, they describe only L&D's own efficiency; they offer no insight into whether the learners themselves became more effective, and no proof of business impact made by anyone.

Quinn suggested that the measurements L&D should focus on, and focus on first, are those that reflect actual business impact. How long does it now take to close a sale? Has your code quality improved? How quickly can you now resolve defects? Are your call centers able to close support tickets with less escalation? This feedback doesn't come from learners sharing how much they enjoyed a course, or how quickly they were able to locate or complete it. It comes from non-learner sources such as xAPI data from the tools learners use, from learners' managers, and from customers directly.
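For readers curious what capturing that kind of non-learner signal looks like in practice, here is a minimal sketch of an xAPI statement. The actor/verb/object structure follows the xAPI specification; the email address, ticket activity, and "escalated" extension URI are hypothetical placeholders, not part of any real system.

```python
import json

def make_xapi_statement(actor_email, verb_id, verb_display,
                        object_id, object_name, extensions=None):
    """Build a minimal xAPI statement dict. Actor, verb, and object
    are the required parts of a statement in the xAPI specification."""
    statement = {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_display}},
        "object": {
            "objectType": "Activity",
            "id": object_id,
            "definition": {"name": {"en-US": object_name}},
        },
    }
    if extensions:
        statement["result"] = {"extensions": extensions}
    return statement

# Example: record that a support agent closed a ticket without escalation --
# a business-impact signal rather than a course-completion metric.
stmt = make_xapi_statement(
    actor_email="agent@example.com",                     # hypothetical learner
    verb_id="http://adlnet.gov/expapi/verbs/completed",  # standard ADL verb
    verb_display="completed",
    object_id="https://example.com/tickets/4217",        # hypothetical activity
    object_name="Support ticket 4217",
    extensions={"https://example.com/xapi/escalated": False},  # hypothetical extension
)
print(json.dumps(stmt, indent=2))
```

A statement like this would be sent from a support tool (not an LMS) to a Learning Record Store, which is exactly the kind of outside-of-training observation the speakers described.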

The goal here goes far beyond getting learners—and learning professionals—to share what works for them, or focusing only on meeting learners' expectations. "Sharing What Works," more often than not, means getting feedback from everyone outside of training to find which measurements they're looking to improve, and designing classes and eLearning accordingly.

I can’t speak for The eLearning Guild, but I have to assume that “Sharing What Works” was in no way intended to be limited to the confines of the conference, or to be kept private from anyone outside of L&D. And I hope that training does become the most valuable part of many organizations. Company-wide collaboration will be L&D’s key to making it happen—how many of us are already on the way? I’d love to hear your story.