Four types of software tools can be used to develop instruments for measuring learning: dedicated assessment-development tools, such as Questionmark's Perception; e-learning authoring tools that offer an assessment-development capability, such as Adobe's Captivate; learning content management systems, such as Blackboard's Academic Suite; and general-purpose software-development tools, such as Adobe Flash Professional. In list form:
- Dedicated Assessment-Development Tools
- E-Learning Authoring Tools
- Learning Content Management Systems
- General-Purpose Software-Development Tools
When we asked e-learning professionals from the eLearning Guild membership about their use of tools in developing learning-measurement instruments, they told us an interesting story.
Specifically, we asked them, "What PRIMARY tool do you use to develop your measurement instruments?"
The two most popular answers were (1) no tool at all, and (2) a tool developed in-house. See the graph below.
When we broke this down by corporate and education audiences and looked at product market share, other interesting findings emerged.
Take a look at the corporate results, excluding all education and government respondents:
Adobe's Captivate dominates with over 50% of the market share—that is, over 50% of respondents said they used Captivate to develop their measurement instruments (they may have used other tools). Even more telling is that six of the top seven items are authoring tools or part of authoring tool suites. You read that right. Authoring tools are by far, by far, by far the way people develop assessment items in the corporate e-learning space. Only Questionmark's Perception and Adobe's Flash Professional sneak into the top nine responses before "Other" takes the tenth spot.
This makes sense if we assume, like a dismal economist, that people do what is easy to do. Our authoring tools remind us to add questions, so we add questions. It also tells me that maybe our field puts very little value on measuring learning if our behavior is so controlled by our surroundings that we don't look further than our authoring tools. Or, could it be that our authoring tools provide us with all we need?
Let's take a look at the education (with some government) results. (Note to those using the eLearning Guild's Direct Data Access capability: I filtered only for students, interns, academics, and practitioners).
The education results are interesting as well, especially compared with the corporate results. Note how many dedicated assessment tools appear in the top ten: three (Respondus, StudyMate, and Questionmark's Perception). So perhaps educators care a little more about testing. Okay, that makes sense too. Still, there are a lot of e-learning authoring tools at the top, with Captivate dominating again.
The Leverage Point
The clearest conclusion I will draw from this data is that to improve our e-learning assessment practices, we need to intervene at the one clear leverage point, the one place where we seem to think about measurement the most: in our authoring tools. How might this work?
- Okay, we could just train people to create better measurement instruments with the idea that they'll use that information the next time they boot up their authoring tool.
- Better would be to train them to create better measurement instruments while they are using their authoring tool, and to give them practice as well, with feedback. You learning researchers will be chanting "encoding specificity" and "transfer-appropriate processing," and those of you who have ever taken one of my workshops on the learning research will be thinking of "aligning the learning and performance contexts" to "create spontaneous remembering."
- Better would be to develop job aids indexed to different screenshots of the authoring tool.
- Better would be for the authoring tools themselves to be seeded with performance-support tools that encourage people to use better measurement practices.
Oh crap. The best way to do this is to get the authoring-tool developers to take responsibility for better measurement and better product design. Entrepreneurially minded readers will be thinking about all kinds of business opportunities. Hey Silke, how about giving me a call? SMILE.
My guess is that not much of this is going to happen anytime soon. So, besides engaging someone like me to train your folks in how to create more authentic assessments, you're pretty much on your own.
And we know that's not going to happen either. At least that's what the data shows. Hardly anybody brings in outside experts to help with learning measurement.
I guess somebody thinks it's just not that important.
More on this as the series continues…
The data above was generated by a group of folks working through the eLearning Guild. The report we created is available by clicking here.
Here's some more detail:
The eLearning Guild Report
The eLearning Guild report, "Measuring Success," is FREE to Guild members and to anyone who completes the research survey, even non-members.
Disclaimer: I led the surveying and content efforts on the research report and was paid a small stipend for contributing my time; however, I will receive nothing from sales of the report. I recommend the report because it offers unique and valuable information, including wisdom from such stars as Allison Rossett (the Allison Rossett), Sharon Shrock and Bill Coscarelli (both of criterion-referenced testing fame), James Ong (at Stottler Henke, where he leads efforts to measure learning results through comprehensive simulations), Roy Pollock (Chief Learning Officer at Fort Hill Company, which provides innovative software and industry-leading ideas to support training transfer), Maggie Martinez (CEO of The Training Place, specializing in learning assessment and design), Brent Schlenker (a learning-technology guru at the eLearning Guild), and the incomparable Steve Wexler (the eLearning Guild's Research Director, research-database wizard, publishing magnate, and tireless calico cat herder).
How to Get the Reports
1. eLearning Guild Measuring Success (Free to Most Guild Members)
- If Member (Member+ or Premium): Just Click Here
- If Associate Member: take the measurement survey, then access the report.
- If Non-member: become an associate member, take the measurement survey, then access the report.
2. My Report, Measuring Learning Results: Click through to My Catalog
More tomorrow in the Learning Measurement Series...