In the eLearning Guild report I worked on with several other brilliant authors (SMILE), we asked e-learning professionals whether they were happy with the learning measurement they were able to do. Here's what they said. (All the data reported in this blog post are for respondents who create e-learning for workers in their own organizations; the Guild's powerful database technology makes it possible to split the data in different ways.)
In general, are people able to do the learning measurement they want to? See the graph below.
Only about 17 percent were happy with their current measurement practices. About 73 percent wanted to be able to do MORE or BETTER measurement. Clearly there is a lot of frustration.
In fact, one of the top reasons people say they can't do the measurement they want to is that they don't have the knowledge or expertise to do it. It's in a virtual dead heat for the third most important reason given. See the diagram below.
The question then becomes: if people don't have the expertise to do measurement the way they want to, do they hire expertise from outside their organizations? A full 88.8% said they did all their measurement themselves. Wow! The graph below could not be more striking.
When we asked people what kind of expertise they do utilize—whether it was in house or contracted—they told us the following (I added a color-coded legend at the top):
Most of the folks doing learning measurement are instructional designers and developers with no particular expertise in measuring learning. A full 84% of respondents indicated that non-expert instructional designers are doing measurement at their organizations. Only 51.7% of respondents said their organization uses instructional developers with some advanced education. Only 20% utilize people with masters-level degrees in measurement. Only 6.7% utilize doctoral-level experts.
Note that when we look only at organizations that claim to be getting "high value" from doing learning measurement versus all the others, the results are intriguing.
| Percentage of Respondents Saying They Utilize People with... | Less Than High Value | High Value for Measurement |
| --- | --- | --- |
| Masters degrees on staff | 16.4% | 31.4% |
| Doctoral degrees on staff | 5.6% | 10.1% |
| Masters degrees hired from outside | 3.7% | 11.8% |
| Doctoral degrees hired from outside | 3.4% | 6.1% |
Wow!! Those folks who think they are getting high value from their measurement efforts are more than 91% more likely to utilize people with masters degrees on staff than those reporting less than high value. The high-value respondents also utilize 80% more doctoral degrees on staff, 219% more masters degrees hired from outside, and 79% more doctoral degrees hired from outside. While these data are correlational and self-reported, they suggest some sort of relationship between the measurement expertise employed and the level of value an organization gets from its learning measurement.
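For readers who want to check those relative increases themselves, here is a short Python sketch that recomputes them from the table percentages above (the category labels are my shorthand, not the survey's exact wording):

```python
# Relative increase in expertise usage: "high value" vs. "less than high value"
# respondents, using the percentages reported in the table above.
rows = {
    "Masters on staff": (16.4, 31.4),
    "Doctoral on staff": (5.6, 10.1),
    "Masters hired from outside": (3.7, 11.8),
    "Doctoral hired from outside": (3.4, 6.1),
}

for label, (less_than_high, high) in rows.items():
    # Percentage increase of the high-value group over the other group
    increase = (high - less_than_high) / less_than_high * 100
    print(f"{label}: {increase:.0f}% more")
```

Running this reproduces the figures cited in the paragraph above: 91%, 80%, 219%, and 79%.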
Summary
To recap, in a survey of over 900 e-learning professionals, many are frustrated because they want to do more/better measurement. A significant portion of their frustration results from not having the expertise to do measurement correctly. They have very few high-level measurement experts on staff, and they hire almost nobody from the outside to help them.
What the hell is wrong with this picture?
It confirms for me that learning measurement is just not given the importance it deserves.
More to come tomorrow in the Learning Measurement series...
Will,
Thanks for breaking down some of the data for us.
I've just reviewed the series posts so far. The hard data is reassuring, but not yet very surprising.
Yes, better-educated practitioners can effect better learning evaluation and educate stakeholders, which leads to more valuable evaluation results. But we work in a non- (or anti-)intellectual context (country?). I think this is one of the reasons there are few Masters on the job and why they are so rarely hired as consultants to improve measurement. So many businesses, big ones in particular, succeed despite the things they do not do well. This is one of them.
I'm a big fan of evaluation and urge it on clients by linking its implications to action planning, recruiting, coaching, and talent management to make data work for managers. Even among people who buy into that logical link, it is an uphill road to convince them to do the hard work and use the data. It points toward a need for long-term planning and organizational development, all so far beyond the scope of where "training" is supposed to be playing.
Keep up the good work. Looking forward to the holy-crap,-I-didn't-see-that-coming moment.
Cheers, John
On the blog itself, fyi: I can see the width of the charts in my feed reader, but not in the post. How about a click-through to view the chart? Or something?
Posted by: John D Roberts | Wednesday, 19 December 2007 at 07:42 AM