Tuesday, 9 April 2013

Advanced Power Searching with Google: Lessons Learned

Posted by Dan Russell, Uber Tech Lead, Search Quality & User Happiness and Maggie Johnson, Director of Education and University Relations



Large classes are something you normally want to avoid like the plague, so being in a class with tens of thousands of students might seem like a completely crazy idea.



But in early 2013, Google offered a free "MOOC" (a Massive Open Online Course) to teach Advanced Power Searching (APS) to a wide variety of information professionals.



The wholly online class ran for two weeks, covering advanced research skills in a challenge-based format, and a bit more than 35,000 students signed up.



In this case, the large class size was a boon to the students. Not only was there vigorous discussion of the material on social media, but with a class this large, anytime you had a question, someone else had almost certainly asked the same question already and had an answer ready. As in many MOOCs, the large online class size did not stress any lecture hall capacities, but it did give students the benefit of multicultural classmates who were effectively always present in the social spaces of the MOOC.



A typical Massive Open Online Course (MOOC) is a simple progression through a series of mini-lectures--usually a short video followed by reflective questions, problem sets and a few assessments. MOOCs can have huge numbers of students; dozens have been offered with over 150,000 students enrolled. Based on our experiments with Power Searching with Google in 2012, we wanted to do something different. When we offered Advanced Power Searching with Google (APS) in January of 2013, we decided to try out a number of new ideas.



Through this course, we wanted to enable our students to solve complex research questions using a variety of tools, such as Google Scholar, Patents, Books, and Google+. We defined complex problems that had more than one right answer and more than one way to find those answers.



Unlike a traditional MOOC, the APS course had twelve challenges that students could tackle in any order they liked: four easy, four medium, and four difficult. Part of the design of the class was to have students discover the skills they'd need to solve the challenges and select appropriate video or text lessons. Students could also access case studies that showed how others solved similar problems.



We called our MOOC design "Choose your own adventure." Each challenge presented a research question like this:






"You are in the city that is home to the House of Light. Nearby there is a museum in a converted school featuring paintings from the far-away Forest of Honey.

What traditional festival are you visiting?"





In this class, the large cohort of 36,000 students worked through the materials together, using online forums to ask questions as well as Google+ Hangouts to attend office hours and collaborate on solving challenges. Instructor Dan Russell and a group of teaching assistants monitored students' activities and provided support as needed.



If they needed additional help, students could post a question on the forum or see how others solved the challenge. Students could post their solutions in a special "Peer explanations" section, a feature many students appreciated because it let them see how others in the class approached the same problem in their own ways.



In analyzing the data, we found a decreasing number of views on each successive challenge page, indicating that students most likely tried the challenges in the order given. While some liked the ability to jump around, most tended to go through the content linearly. Most students who completed the course tried (or at least looked at) all twelve challenges. Many students who did not complete the course tried three or fewer challenges.
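One simple way to look for that pattern, sketched below in Python with invented view counts rather than the actual course logs, is to check whether views fall off from each challenge page to the next:

# Hypothetical sketch: do page views decline across the twelve ordered challenges?
# The counts below are invented for illustration; they are not APS course data.
views = [9000, 7400, 6800, 6100, 5500, 5200, 4700, 4300, 4000, 3800, 3600, 3500]

drops = [a - b for a, b in zip(views, views[1:])]
print("View drop from each challenge to the next:", drops)
print("Strictly decreasing:", all(d > 0 for d in drops))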



To earn a certificate of completion, students submitted two detailed case studies of how they solved a complex search challenge. Students provided great examples of how they used Google tools to research their family's history, the origins of common objects, or trips they anticipated taking. In addition to listing their queries, they wrote details about how they knew websites were credible and what they learned along the way.



To assess their work, we experimented with letting the students grade their assignments based on a rubric. We collected their scores and compared them with a random sample of assignments graded by TAs. There was a moderate yet statistically significant correlation (r=0.44) between student scores and TA scores. In fact, the majority of students graded themselves within two points of how an expert grader assessed their work. This is a positive result since it suggests that self-graded project work in a MOOC can be valuable as a source of insight into student performance.
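As a rough illustration of that kind of comparison, here is a short Python sketch that computes a Pearson correlation and the share of self-grades falling within two points of the TA grade; the scores are made up for the example and are not the actual course data:

# Hypothetical example: comparing self-graded scores with TA scores on a rubric.
# The numbers below are invented for illustration; they are not APS course data.
from scipy.stats import pearsonr

self_scores = [8, 6, 9, 7, 10, 5, 9, 8, 7, 6]   # student self-assessments (rubric points)
ta_scores   = [7, 6, 8, 9, 10, 4, 7, 8, 6, 7]   # expert (TA) assessments of the same work

r, p_value = pearsonr(self_scores, ta_scores)
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")

# Fraction of students whose self-score falls within two points of the TA score
within_two = sum(abs(s - t) <= 2 for s, t in zip(self_scores, ta_scores)) / len(ta_scores)
print(f"Self-graded within two points of TA grade: {within_two:.0%}")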



The challenge format seemed to be effective and motivating for a small, dedicated population of students. We had 35,000 registrants for this advanced course, and 12% earned a certificate of completion. This rate is somewhat lower than what we saw for Power Searching with Google, a more traditional MOOC. Students who did not complete the course reported a lack of time and the difficulty of the content as barriers.



One interesting point was that labeling the challenges as easy, medium, or difficult likely had an unintentional effect. The first challenge was marked as "easy," but many people found it difficult. This may have demotivated students from attempting the more difficult challenges. Next time, we plan to ask students whether the first challenge was too easy or too challenging, and then send them to a challenge at an appropriate level of difficulty.



Watch for more MOOCs on our products and services in the coming months. And watch for more experimentation as we apply what we have learned, and try more ideas and new approaches in future online courses.

