Learning in the Citizenry
Learning is a many-splendored thing. Want evidence? Consider the overabundance of theories of learning. Greg Kearsley has a nice list. To me, this overabundance is evidence that the human learning system has not yet been lassoed and cataloged with any great precision. Ironic that DNA is easier to map than learning.
Being a political junkie, I'm fascinated with how a population of citizens learns about their government and the societal institutions of power. Democracy is rooted in the idea that we the citizenry have learned the right information to make good decisions. In theory this makes sense, while in practice imperfect knowledge is the norm. This discussion may relate to learning in the workplace as well.
Take one example from recent events. On September 11th, 2001, the United States was attacked by terrorists. The question arose: who were these terrorists? Who sent them? Who helped them? One particular question was asked: "Was Saddam Hussein (dictator of Iraq) involved?" I use this question because there is now generally accepted objective evidence that Saddam Hussein was not involved in the 9/11 attack in any way. Even President Bush has admitted this. On September 17th, 2003, Bush said, in answer to a question from a reporter, "No, we've had no evidence that Saddam Hussein was involved with September the 11th." Despite this direct piece of information, the Bush administration has repeatedly implied, before and after this statement, that the war in Iraq is a response to 9/11. We could discuss many specific instances of this (and we could argue about them), but I don't want to belabor the point. What I want to get at is how U.S. citizens learned about the reality of the question.
Take a look at the polling data, which I found at PollingReport.com. I've marked it up to draw your eyes toward two interesting realities. First, look at the "Trend" data. It shows that we the citizens have changed our answer to the question over time. In September of 2002, 51% of Americans incorrectly believed that Saddam was personally involved in September 11th. Last month, in October of 2005, that number had dropped to 33%. The flip side: 33% correctly denied any link between Saddam and 9/11 in October of 2002, while today a healthier 55% answer correctly, though that is still a relatively low number. If we think in terms of school-like passing-grade cutoffs, our country gets a failing grade.
The second interesting reality is how different groups of people have "Different Realities" about what is true. You'll notice the difference between how Republicans and Democrats answer these questions.
These data encourage me to conclude or wonder about the following:
- Even well-established facts can engender wide gaps in what is considered true. Again, this highlights the human reality of "imperfect knowledge."
- Stating a fact (or a learning point) will not necessarily change everyone's mind. It is not clear from the data whether the problem is one of information exposure or information processing. Some people may not have heard the news. People who heard the news may not have understood it, they may have rejected it, or they may have subsequently forgotten it.
- Making implied connections between events can be more powerful than stating things explicitly. It is not clear whether this is also a function of the comparative differences in the number of repetitions people are exposed to. This implied-connection mechanism reminds me of the "false-memory" research findings of folks like Elizabeth Loftus. Are the Republicans better applied psychologists than the Democrats?
- Why is it that so many citizens are so ill-informed? Why don't (or why can't) our societal information-validators do their jobs? If the media, if our trusted friends, if our political leaders, if our religious leaders, if opinion leaders can't persuade us toward the truth, is something wrong with these folks, is something wrong with us, is there something about human cognitive processing that enables this disenfranchisement from objective reality? (Peter Berger be damned).
- I'm guessing that many of the differences between groups depend upon which fishtank of stimuli we swim in. Anybody who has friends, coworkers, or family members in the opposing political encampment will recognize that the world the other half swims in looks completely different from the world we live in.
- It appears from the trend data that there was a back-and-forth movement. We didn't move inexorably toward the truth. What were the factors that pushed these swings?
These things are too big for me to understand. But lots of the same issues are relevant to learning in organizations---both formal training and informal learning.
- How can we better ensure that information flows smoothly to all?
- How can we ensure that information is processed by all?
- How can we ensure that information is understood in more-or-less the same way by all?
- How can we be sure that we are trusted purveyors of information?
- How can we speed the acceptance of true information?
- How can we prevent misinformation from influencing people?
- How can we use implied connections, as opposed to explicit presentations of learning points, to influence learning and behavior? Stories are one way, perhaps.
- Can we figure out a way to map our organizations and the fishtanks of information people swim in, and inject information into these various networks to ensure we reach everyone?
- What role can knowledge testing, performance testing, or management oversight (and the feedback mechanisms inherent in these practices) play in correcting misinformation?