The engineering education team’s staff meeting on May 2, 2012 began like any other: reports of new engineer orientation, computer science outreach efforts, and an updated mission statement: “To provide Google engineers and the world with relevant and timely technical content, learning resources, and tools.”
With five minutes remaining in the meeting, the director announced that she was recruiting team members who were willing to tackle an audacious goal: create an online course for ten million people in eight weeks. Many of us left the room that day with more questions than answers: Could we really create a course from scratch for that many people? In only eight weeks? What would we teach, and why? How would we know if we were successful?
Why MOOCs?
Many elements of Google culture have contributed to our experiments with massive open online courses (MOOCs), including the company’s mission, desire to think big, commitment to our users, and ability to launch and iterate. In the year and a half since that fateful staff meeting, we have served over 360,000 students by launching five courses for the general public, developed a handful of courses for our own engineers, and assisted numerous partners in launching courses on Google’s open-source Course Builder platform.
You might be wondering about Google’s interest in MOOCs. Our company mission statement is, “To organize the world’s information and make it universally accessible and useful.” Enabling educators to share their expertise with the world fits in this mission, as does expanding education to everyone. We had enthusiasm and a vision, but what should we teach?
After some brainstorming, we decided to start with what we know—our own products. We have worked with teams to enhance the user experience of Google tools through education. If people know how to better use our products, they will likely use them more, which helps the company meet its business goals. Google has also focused resources on helping individual professors, small colleges, and non-profit organizations scale their education efforts.
At Google we hope to help MOOCs evolve from their current implementation by encouraging others to build interesting courses and share their discoveries about effective pedagogy. Several strategies that have worked for us include experimentation (hypothesizing, testing, evaluating, and iterating), student community, more activities, short videos, alternative evaluation methods, and paying attention to student goals.
Experimentation
At Google we are encouraged to experiment by hypothesizing, gathering feedback, launching, evaluating data, and iterating. An intrepid team of content experts, designers, and engineers worked together to develop our first course. After eight weeks of development we launched Power Searching with Google. The course consisted of 28 lessons (each containing a video, text transcript, and activity), three assessments, and certificates of completion. The interface had some rough edges, and we spent late nights fixing bugs. However, by releasing an early version of the course with its imperfections, we were able to collect student interaction data that in turn guided future design decisions. We offered Power Searching a second time a few months later with clearer activity instructions, explicit links to the discussion forum, and new assessment questions. Because the community answered many questions organically in the course forum, we were also able to decrease the number of support staff answering questions.
Since the first course, we have experimented with numerous pedagogical elements including community, videos, activities, designing for student goals, and assessment strategies. These observations have helped inform the Course Builder technology.
Lessons learned: Release courses early, even if they are not perfect, and gather feedback to inform and improve course content. Given the size of many online courses, it’s impossible to predict every student experience.
Student community
Technology enables students to connect with hundreds or thousands of other students. This also presents a challenge for course developers: how do we ensure the same course is valuable for students with diverse backgrounds, varying levels of technical savvy, different experience with a topic, locations around the world, and different ways of applying the concepts?
Despite conducting several usability studies with members of what we assumed to be our target audience, we discovered that our actual audience was more diverse than we had anticipated. For example, we found that challenge activities at the end of each module motivated a small number of people but were too difficult for other students. We therefore offered these challenge activities as supplementary to the primary content. We also realized that we hadn’t considered how the Google search interface would appear differently in other regions. Students in Brazil, for example, saw slightly different search interfaces than peers in Japan and the US. We had developed the course with US-centric examples, resulting in student confusion in one lesson. Because these students could ask questions and share their experiences via the course forum, they were able to help each other achieve the lesson’s goals.
We also found that students shared examples of how they could apply the course content. A librarian in Power Searching commented in the forum that she had used color filtering in image search to help one of her patrons find a particular book with a green cover. Educators shared lesson plans for using Google Maps in their classrooms. Although we could not have anticipated all the different ways students would interact with each other, the course forum enabled students to differentiate the course for each other in ways that we did not expect. Furthermore, students crowd-sourced solutions to overcome learning barriers.
Lessons learned: Students will have diverse interests, backgrounds, and needs. Enable students to personalize the course for their own needs and share experiences with each other. Giving students the freedom to share how they will apply the course concepts enables them to help each other.
Videos and activities
Many MOOCs consist of videos with intermittent quizzes to maintain student interest and engagement. From our data, it’s not clear whether watching videos is the most effective way to learn the skills we taught; we have found that many students prefer clicking on a text lesson instead of, or in addition to, watching videos. In fact, when we featured the video prominently on the page, with a small button linking to the text version of the lesson, students clicked on the video about seventy percent of the time and the text lesson thirty percent of the time.
In the Advanced Power Searching course, we presented video links next to text-version links of the same lessons. In this course, which also gave students opportunities to try search challenges before viewing lessons, we found that students clicked on the text and video lessons in equal numbers. We have discovered that shorter, targeted videos seem to hold students’ interest better than longer videos. In our courses, videos shorter than five minutes have, on average, an eighty percent engagement rate (meaning that students watched an average of eighty percent of the video).
Online education enables us to give students opportunities to apply skills and receive instant feedback. In our courses we couple instructional videos with activities where students practice the skills and receive guidance about how well they are mastering the content. We have discovered that significantly more students complete activities than watch videos. One hypothesis is that students jump directly to activities, try them, and assess whether they need to review the relevant lessons. In fact, students who completed course activities had a higher course completion rate than students who did not do activities.
Lessons learned: Students appreciate control over their learning experiences; make it easy for students to choose activities, text lessons, or video lessons in their preferred order. We plan to use short videos for motivation, rationale, and authentic examples of the content.
Goals
We asked students to select a goal when they registered for the Mapping with Google and Introduction to Web Accessibility courses. We provided a list of goals including “Meet all course requirements in order to earn a certificate of completion,” “Learn one or two new things about Google Maps [or web accessibility] without achieving a certificate of completion,” and “I’m curious about how this online course is taught.”
Surprisingly, we found that only fifty-four percent of Mapping registrants (and fifty-six percent of Web Accessibility registrants) intended to complete course requirements to earn a certificate. The vast majority of the other registrants only wanted to learn one or two new things, either out of curiosity or for a work-related need. Based on what we know about student progress in the Mapping with Google course, we inferred that forty-two percent of active students did achieve the goals they set out to meet (compared to thirteen percent of all registrants who completed the course).
Lessons learned: We should consider changing course designs to meet a variety of student goals. Instead of assuming that all students will interact with all course materials from A to Z, make it easier to search for small nuggets of content. Publish clear learning objectives that enable students to self-select whether they will get what they want out of the course. Lastly, consider publicizing multiple paths that students could take through the course.
Self-evaluation
Although peer grading has become quite popular in MOOCs because it relieves professors of the burden of grading thousands of assignments, we believe that self-evaluation has greater benefit to the students. Self-grading helps build the metacognitive skills students will use when applying what they learned in the class. For example, when creating a map after the class, we want students to stop and think about the qualities of an effective Google Map. By having them evaluate their maps against these criteria, our hope is that they will continue to apply these skills after the class.
In Advanced Power Searching, students submitted two case studies that detailed how they solved complex challenges related to their lives in order to earn certificates of completion. Students provided great examples of how they used Google tools to research their family’s history, the origins of common objects, or trips they anticipated taking. In addition to listing their queries, they wrote details about how they knew websites were credible and what they learned along the way. They graded their own assignments based on a rubric we provided. Similarly, in Mapping with Google, students created maps and evaluated them based on a checklist.
Teaching assistants (TAs) graded a random sample of student assignments. We found a modest yet statistically significant correlation between TAs’ grades and students’ grades, a low incidence of cheating (duplicate assignments), and an overall high quality of work. In fact, the majority of students graded themselves within six percentage points of how an expert grader would assess their work. This is a positive result, since it suggests that self-graded project work in a MOOC can be a valuable assessment mechanism. Reading stories of how people used their new skills to plan vacations, find jobs, and research ordinary objects was one of the most inspiring aspects of this course for the TAs.
Lessons learned: Continue to explore self-evaluation as an assessment mechanism. Test rubrics against a broad sample of submitted assignments. Provide additional guidance to students on evaluating their own work.
Areas of future exploration and reflection
We have launched five courses and iterated to improve aspects of each course. Future product education courses will involve many more experiments, as many hands-on activities as possible, opportunities for students to connect with each other, short videos, and opportunities for students to evaluate their work.
We also anticipate continuing to experiment with motivation, community, and personalization. How do we inspire students to achieve their goals? How do we maximize the value of having tens of thousands of people working on the same content at roughly the same time? How do we help students collaborate with each other to further differentiate the content? How do we provide personalized learning experiences for all students?
Though much has changed since that staff meeting a year and a half ago, and over 45,000 students have completed our courses, we still have more questions than answers.