Work-Learning Research


Recommended Books

  • Turning Research into Results: A Guide to Selecting the Right Performance Solutions, by Richard E. Clark, Fred Estes
  • How People Learn: Brain, Mind, Experience, and School: Expanded Edition, by National Research Council, edited by John Bransford, Ann L. Brown, Rodney R. Cocking
  • Criterion-Referenced Test Development 2nd Edition, by Sharon Shrock, William Coscarelli, Patricia Eyres
  • Michael Allen's Guide to E-Learning, by Michael Allen
  • e-Learning and the Science of Instruction, by Ruth Colvin Clark, Richard E. Mayer
  • Efficiency in Learning, by Ruth Colvin Clark, Frank Nguyen, John Sweller (2006)

Thursday, 08 June 2006

Good Info, Bad Design

Here is an example of e-learning for e-learning's sake:

http://www.consumerreports.org/cro/food/organic-products-206/test-your-organic-iq/index.htm

The information may be good, but it's hard to get to. A table would have been much more valuable, and I'd like a search capability. Also, what if I want to know whether or not to buy an organic tomato?

Monday, 17 October 2005

Rewards for Learning? Be Careful!!

Jonathon Levy, currently Senior Learning Strategist at the Monitor Group, tells a story from his days as Vice President at Harvard Business School Publishing. The story is funny and sad at the same time, and it's instructive on several fronts.

Levy's client decided to award end-of-year Christmas bonuses based on how many of the Harvard online courses his employees completed. Levy advised against it, but the client did it anyway.

The results were predictable, but they might never have been noticed if Jonathon's Harvard team had not built a tracking system into all their courses to give themselves feedback on how learners actually used them. The tracking data showed that learners didn't read a thing; they just scanned each course and clicked wherever they were required to click. They simply wanted the completion credit so they could maximize their bonuses.
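
Levy doesn't say how the tracking system worked, so what follows is only a minimal, hypothetical sketch (in Python, with invented event data and an assumed reading-time threshold) of the kind of analysis such tracking makes possible: flagging learners who complete every page but spend too little time on each one to have plausibly read it.

    from statistics import median

    # Hypothetical page-view events: (learner_id, page_id, seconds_on_page).
    # None of this data comes from the Harvard courses; it is invented for illustration.
    page_views = [
        ("ann", "p1", 95), ("ann", "p2", 120), ("ann", "p3", 88),
        ("bob", "p1", 3),  ("bob", "p2", 2),   ("bob", "p3", 4),
    ]

    READING_THRESHOLD_SECONDS = 30  # assumed cutoff, not taken from Levy's system

    def flag_click_through(views, threshold=READING_THRESHOLD_SECONDS):
        """Return learners whose median time per page falls below the threshold."""
        seconds_by_learner = {}
        for learner, _page, seconds in views:
            seconds_by_learner.setdefault(learner, []).append(seconds)
        return sorted(learner for learner, secs in seconds_by_learner.items()
                      if median(secs) < threshold)

    print(flag_click_through(page_views))  # -> ['bob']

Without some instrumented feedback loop of this sort, the client's completion numbers would simply have looked like a success.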


Although very little learning took place, everyone was happy.

  • Learners were happy because they got their bonuses.
  • The client (training manager) was happy because he could show a remarkable completion rate.
  • Line managers were happy because the learners lost very little work time to training.
  • Senior management was happy because they could demonstrate high utilization of online learning.


What can we learn from this?

  1. Be careful what you reward, because you may get exactly the behavior you reinforce (in this case, course completion).
  2. Completion rates are a poor measure of training effectiveness; they tell us little more than counting butts in seats.
  3. We need authentic measures of training effectiveness to prevent this kind of silliness.
  4. Instructional design benefits when good tracking and feedback mechanisms are built into our courses.