Behavior-centered Design at Google: A Case Study

If employees self-select to attend training, how can we make it compelling enough that they will want to participate and, better yet, share it with their peers?

This year, we designed and developed a program for supplier managers, Googlers who partner with third parties to get work done. Studies show that once a partner signs a contract with Google, there is significant value leakage in cost, quality, and ease (see the References at the end of this article for the study citation). Thoughtful performance management can help minimize this value leakage. We want to have positive, enjoyable partnerships with our suppliers while making sure they continue to deliver quality and value to Google. Effective supplier managers know how to measure performance, manage costs, mitigate risks, and ensure contract compliance, as well as maintain positive supplier relationships.

There are thousands of supplier managers at Google, though this is not a primary job function for most of them. We needed to create a pragmatic and scalable solution that quickly shared best-known practices with any Googler who wanted to learn more about managing supplier relationships. To ensure we achieved our goal of preventing value leakage, we centered our design on business outcomes and the behaviors that would support those outcomes.

We focused on proven tools and directional improvement in business outcomes. This helped us to create a comprehensive training with robust ongoing support that uses scalable media for foundational knowledge and follows through with personalized support on the job. We integrated the training evaluation design with the learning program, and included a pre-survey, a community of practice, and on-demand consultation. These supports create an effective way for us to measure impact through continuous data collection.


Behavior-centered program design, not behaviorism

We avoid training behavior with rewards and punishments. We prefer to tap into the intrinsic rewards that motivate behavior change. We do this through shared business goals and values, as well as examples of excellence. We train on foundational methods, encouraging our audience to self-assess and apply the tools as they best fit the situations in their jobs. Then, we support them in making adjustments.

Our subject matter expert agreed to focus on specific skills, incidents, and impact on business to inform our design, rather than rely on a content-centric or policy-focused approach. We worked together through the entire design process in sessions where we cascaded business outcomes, learning objectives, activities, assessments, evaluation, delivery methods and, finally, content (storyboards). We chose to combine stepped and case-study instructional models that both integrate our evaluation plan and extend into our post-training supports.
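
One slice of that cascade can be pictured as structured data. The Python sketch below is purely illustrative; the outcome, objectives, and activity names are stand-ins rather than our actual design documents.

```python
# A minimal, illustrative sketch of how one slice of a design cascade
# might be captured as data. Every string below is a stand-in, not a
# record of the actual program's design documents.
design_cascade = {
    "business_outcome": "Reduce post-contract value leakage",
    "learning_objectives": [
        "Build a supplier scorecard to measure performance",
        "Run periodic business reviews with suppliers",
    ],
    "activities": ["case-study videos", "scorecard template walkthrough"],
    "assessments": ["end-of-module quiz", "scorecard follow-up assignment"],
    "evaluation": ["pre-survey", "post-survey", "rated assignment"],
    "delivery": ["eLearning module", "community of practice"],
    "content": "storyboards",
}
```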

Using the “best tool” to create a learning object for each step in this program design could result in a disconnected experience for learners. One of the challenges of mixing Google and third-party creative tools was delivering a coherent experience. We created a cohesive end-to-end user experience by arranging the program on a single path (Figure 1), using an internally developed curation tool that lets us combine multiple URLs into a single package.
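
The internal tool itself is not public, but the packaging concept is simple. Here is a minimal Python sketch of the idea, assuming a sequential list of learning-object URLs with per-learner progress; the class name and URLs are hypothetical.

```python
from dataclasses import dataclass, field

# Minimal sketch of the "single path" packaging idea behind our internal
# curation tool. All names and URLs here are hypothetical; the real tool
# is internal to Google.

@dataclass
class LearningPath:
    title: str
    steps: list                                   # ordered URLs of learning objects
    completed: set = field(default_factory=set)   # indices finished so far

    def next_step(self):
        """Return the first step the learner has not yet completed."""
        for i, url in enumerate(self.steps):
            if i not in self.completed:
                return url
        return None  # path finished

path = LearningPath(
    title="Introduction to Supplier Management",
    steps=[
        "https://sites.example.com/program-home",
        "https://elearning.example.com/module-1",
        "https://moodle.example.com/final-quiz",
    ],
)
path.completed.add(0)
print(path.next_step())  # -> https://elearning.example.com/module-1
```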

 

 

Figure 1. The Introduction to Supplier Management program is on a single path, in order to provide a coherent experience for learners.

Because we chose easy-to-use Google tools and a relatively simple approach to the look and feel of the training, we had very low hard costs (e.g., development software, artwork licensing), and we found that we had below-average soft costs (e.g., design and development time). This helps us achieve a good return on investment while we create flexible programs that are easy to change on the fly without expert technical knowledge.

Setting the stage for a Googley learning experience

Tone, quality, and usefulness are critical to the success of any training at Google. Our culture is highly collaborative and grassroots, and Google employees self-select into most of our trainings. To have an impact, our training teams must create fun, relevant experiences that Googlers want to share with each other. This is no small feat!

Our audience accepts and expects humor, variety, fun, efficiency, quality, practical value, and flexible formats in training. Although we see no evidence that designing for visual, auditory, or kinesthetic learning-style preferences leads to improved outcomes, we strongly believe in using multiple media formats appropriate to each piece of content, snappy pacing, meaningful interactivity, and a light-hearted tone to encourage interest and, therefore, consumption. In other words, we view active, positive attention as a prerequisite for knowledge transfer.

We designed for content quality using concise learning objectives, which helped the pacing, relevance, and credibility of our content. We also featured unscripted Googlers in our instructional and case study sections who not only exemplified excellence in supplier management but also spoke directly to the concerns of our audience from a shared point of view.

Bringing it all together

The use of primarily Google or open-source tools to develop this program was neither a coincidence nor a decision made from necessity. One of Google’s core operating principles is “eat your own dog food,” meaning, if you have confidence in your products, use them! We really take this to heart in the learning and development space.

We are using the following Google products for this program:

  • Internal curation tool: Web-based app that allows users to track their progress through multiple learning items as one cohesive presentation and connect with peers studying the same item.
  • CloudCourse (open sourced): We use a Learning Management System developed internally for course listing and discovery.
  • Google Sites: Serves as our main program page, where participants can go to learn about the program, launch the training, copy templates, and request individual support.
  • Google Forms: Used in our evaluation strategy to conduct pre- and post-learning surveys measuring participant satisfaction (Level 1), behavior change (Level 3), and business impact (Level 4).
  • Google Docs Video: Training trailers (teaser videos) we can embed or launch from QR codes for marketing campaigns (a minimal QR-code generation sketch follows this list).
  • Google Docs: Houses our design docs, storyboards, flat/screen-reader versions, and templates embedded into our Sites pages.
  • Internal tools: A user feedback tool to assess page quality, plus an internal support ticketing system to track the scorecard review process and other requests.
  • Google Analytics: Used to track Web site usage, content popularity, and eLearning hits.
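
As an aside on the trailer QR codes mentioned above: generating one takes only a couple of lines with, for example, the open-source qrcode package for Python. This is a sketch with a placeholder URL, not a record of our actual toolchain.

```python
# Sketch: generating a trailer QR code with the open-source "qrcode"
# package for Python (pip install qrcode[pil]). The URL is a placeholder,
# not an actual trailer location.
import qrcode

img = qrcode.make("https://video.example.com/supplier-mgmt-trailer")
img.save("trailer_qr.png")  # embed in posters, slides, or handouts
```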

We also used these free or open-source tools:

  • Moodle (“Noodle”): An internal instance used to deliver quiz questions at the conclusion of the training module.
  • Audacity: Audio recording for eLearning modules.

When there truly was no free Google or open-source option, we used licensed software and artwork to fill in the gaps (e.g., Articulate, iStockphoto, FlipCams, Camtasia, and Photoshop).

We wanted to include easy-to-use formats for all Googlers, who have a wide variety of preferences for information resources. We wanted a training that no one would have to request accommodation to use, that anyone could consume in their preferred format, and that anyone could read in their native language. We created a screen-reader-friendly version of the training, available from day one. This is also useful because a good portion of our audience rejects Flash-based trainings and wants a text-only, searchable version. For our learners who prefer to use Google Translate, this version of the training offers an easy-to-convert document.

“Fuzzy evaluation” and evidence of change

We’re “scrappy” and innovative in our approaches to learning and development, but we’re still highly analytical and data-focused. Our main goal with an evaluation design like this is to tell a story, even if the data is only directional rather than causal. We can still show a program’s general impact on the business in a compelling way.

We designed a comprehensive evaluation strategy for this training program, incorporating all four levels of the Kirkpatrick model and using only Google and open-source tools. Our learning evaluation strategies include multiple sources of data to increase validity.

Our evaluation methods include:

  • Level 1: Satisfaction surveys, administered immediately after the participant completes the online module. (Google Forms)
  • Level 2: A fifteen-question quiz, administered at the conclusion of the online module. (Moodle)
  • Level 3: Pre- and post-learning participant surveys to assess perceptions and current-state practices (frequency and quality), in addition to follow-up assignments that are evaluated against a standardized rating scale; a sketch of the directional pre/post comparison follows this list. (Google Forms)
  • Level 4: Pre- and post-learning participant surveys targeted to assess the perceived value supplier managers are receiving, founded on baseline measures crafted from industry research studies. (Google Forms and Sites)
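
To make the “directional” idea concrete, here is a minimal Python sketch of the kind of pre/post comparison the Level 3 surveys support. The practice names and 1-5 responses are placeholders, not program data.

```python
# Sketch of a directional Level 3 comparison: mean self-reported
# frequency of a practice before vs. after training, on a 1-5 scale.
# The practice names and responses below are placeholders, not real data.

def mean(xs):
    return sum(xs) / len(xs)

pre  = {"uses_scorecard": [2, 1, 3, 2], "runs_business_reviews": [3, 2, 2, 3]}
post = {"uses_scorecard": [4, 3, 4, 5], "runs_business_reviews": [4, 3, 4, 4]}

for practice in pre:
    delta = mean(post[practice]) - mean(pre[practice])
    print(f"{practice}: {delta:+.2f} points (directional, not causal)")
```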

Given our behavior-centered design approach, we wanted to incorporate learning reinforcement and support models into the program to further encourage change. To limit the participant’s time requirement, we layered our reinforcement strategy with our evaluation strategy, using follow-up assignments to reinforce learning and evaluate impact, thus providing value to our learners and increasing participation. The follow-up assignment asked participants to build a “supplier scorecard” of their own, using the methods taught in the training, to help them measure the performance of their suppliers. Participants then submit their scorecards to the “Supplier Center for Excellence” at Google, where a member of the Supplier Management team evaluates their work against a standardized rating scale and provides personalized feedback and coaching. (Figure 2.) This approach provides our learners with customized, one-on-one support, while allowing us to measure both the frequency and the quality with which they perform the desired behavior.

 

 

Figure 2. The evaluation and reinforcement strategies come together in a simple scorecard approach.
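
The standardized rating scale itself is internal, but scorecards of this kind typically roll category ratings up into a single weighted score. Here is a minimal sketch, using hypothetical weights over the cost, quality, and ease dimensions mentioned earlier.

```python
# Minimal sketch of a weighted scorecard roll-up. The categories, weights,
# and ratings are hypothetical; the actual standardized rating scale used
# by the Supplier Center for Excellence is internal to Google.

weights = {"cost": 0.3, "quality": 0.4, "ease": 0.3}  # weights sum to 1.0
ratings = {"cost": 4, "quality": 3, "ease": 5}        # 1-5 rating per category

overall = sum(weights[c] * ratings[c] for c in weights)
print(f"Overall supplier score: {overall:.2f} / 5")   # -> 3.90 / 5
```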

What we learned

The feedback and interaction we have received from the program have been very useful. For example, we asked in our Level 1 survey, “What have you NOT learned today that you expected to learn during this training?” We are using the answers as clear direction from our learners about gaps that we need to fill in the content published on our community of practice. We also asked questions before and then 90 days after the training about the perceived value of supplier relationships, metrics in performance management, and the use of periodic business reviews. This data gives us a picture of how behavior actually changes as a result of our training program.

We had strong responses that the training is a good use of time, is applicable, and is worth recommending. We attribute this success to a great subject-matter-expert partnership, clear business outcomes, specific learning objectives, and snappy design. Learners overwhelmingly thought that the delivery format was well suited to the content. They enjoyed the methods we presented, and even somewhat-experienced supplier managers felt that the training was a good use of an hour. One learner remarked, “I am brand new to handling suppliers, so I learned a lot! I think the main point I will take away is to think more about the supplier/Google relationship going both ways.” We want to have positive, enjoyable partnerships with our suppliers while making sure they continue to deliver quality and value to Google, and this program is supporting that goal.

References

Webb, M., & Hughes, J. (2009, Autumn). Building the case for SRM. CPO Agenda. https://www.cpoagenda.com/previous-articles/autumn-2009/features/building-the-case-for-srm/

Clark, R. C. (2010). Evidence-Based Training Methods: A Guide for Training Professionals. Alexandria, VA: American Society for Training & Development.

 
