We, the members of the workplace learning-and-performance field, think about our field from time to time. Of course we do.
Here's what I wonder. How much have our mental models changed in the last 50 years?
That's too big a question for me to answer right now, but it does point to something interesting. Today, as I was reading a scientific article (citation below) on how people have insights, the authors reported that, in their study of naturalistic insights, most sudden insights occurred when people looked at new data--after having spent a great deal of time thinking about the issue beforehand.
Here's the thing: We, in our field, haven't really created new data sets or methodologies very often. Yes, we have Jack Phillips's ROI methodology and Robert Brinkerhoff's Success Case Method--both of which got many of us to rethink what we're doing--but these methods sit at the results end of the causal chain from learning to performance to results. Important stuff--there is no doubt--but not enough.
When it comes to getting new data about learning engagement, remembering, and on-the-job application, we haven't seen much innovation in our field.
If the research on insight is right, then without new data (or, from an industry-wide perspective, new methods of gathering data) we will not have breakthrough insights about how to improve our training and other learning interventions.
We need to continue working toward better data-gathering methods.
I think the Performance-Focused Smile Sheet offers a glimmer of hope, but the recent infatuation with benchmarking does NOT.
What have you done recently to help your company get better data about your learning-and-performance initiatives?
Citation for the article that triggered this insight:
Klein, G., & Jarosz, A. (2011). A naturalistic study of insight. Journal of Cognitive Engineering and Decision Making, 5(4), 335-351.