Onboarding is ubiquitous. Every organization does it. Some do it with great fanfare. Some make a substantial investment. Some just let supervisors get their new hires up to speed. Unfortunately, most organizations make critical mistakes in onboarding—mistakes that increase turnover, raise costs, weaken employee loyalty, and lower productivity.
Fortunately, recent research highlights onboarding best practices. If organizations would just use the wisdom from the research, they’d save themselves money, time, and resources—and employees in those companies would have far fewer headaches to deal with.
Recent reviews of the research suggest that there are four key outcomes that enable onboarding success:
New hires have to quickly and effectively learn their new job role.
New hires have to feel a sense of self-efficacy in doing their job.
New hires have to learn the organizational culture.
New hires have to gain acceptance and feel accepted by their coworkers.
Recent research suggests that the following factors are helpful in ensuring onboarding success:
What New Hires Can Do
Be proactive in learning and networking
Be open to new ways of thinking and acting
Be active in seeking information and getting feedback
Be active in building relationships
What the Organization Can Do
Ensure that managers take a very active and effective role
Provide formal orientations that go beyond information dissemination
Provide realistic previews of the organization and the job
Proactively enable new hires to connect with long-tenured employees
Five Biggest Mistakes (In Reverse Order of Importance)
5—Providing an Information Dump during Orientation
The research shows that employee orientations can facilitate onboarding. However, too many organizations think their orientations should just cram tons of information down the throats of employees. Even worse are orientations that have employees sit and listen to presentation after presentation. Oh, the horror. New employees are excited to get going. Putting them into the prison of listening—even to great content—is a rudeness that shouldn’t be tolerated. The best orientations help build relationships. They get employees involved. They prepare new hires to learn, grow, and network on their own. They help new hires learn the organizational culture—both the good and the bad. They share the organization’s vision, passions, and strategic concerns.
4—Thinking that Training is Sufficient
Training can be essential to help get new employees competent in their new roles, but it is NEVER sufficient on its own. Training should be supported by prompting mechanisms (like job aids), structures and support for learning on the job, reinforcement and follow-through, and coaching to provide feedback, set goals, and lend emotional support.
3—Forgetting the Human Side of Onboarding
New hires are human beings, and, just like the rest of us, they too are influenced by the dynamics of social interaction. They don’t just learn to do a job. They also learn to love and trust a company, a work unit, or a group of coworkers—or they don’t. In return, new hires are either trusted and respected by their coworkers or they’re not. The research is very clear about this. One of the keys to successful onboarding is the strength of the relationships that are built in the first year of a person’s tenure. The stronger the bonds, the more likely it is that a person will stay and bring value to the organization.
2—Considering Onboarding as Something that Can Be Done Quickly
Some companies offer a one-week orientation and then cut loose their new hires to sink or swim. Enlightened companies, on the other hand, realize that onboarding is like relationship-building—it takes time. It takes time to really learn one’s job well. It takes time to integrate into the organizational culture. It takes time to connect with people. Realistic estimates suggest that fully integrating a person into a new organization can take 6, 12, or even 18 months.
1—Not Preparing Supervisors
Supervisors are the single most important leverage point for onboarding success. You’ve probably heard it said that people don’t quit their companies, they quit their supervisors. Well, the flip side can also be said: people don’t join a company, they join a supervisor and his/her workgroup. Unfortunately, most supervisors just have no idea about the importance of onboarding and how to do it correctly. Where best practices call for giving supervisors training and an onboarding checklist, too many supervisors just wing it. The real tragedy is that the investment in onboarding training and a checklist for supervisors is quite small in the greater scheme of things.
Final Thoughts on Onboarding
As a workplace learning-and-performance consultant, when I’ve been called in to advise companies on their onboarding programs, I often see incredibly dedicated professionals who are passionate about welcoming new people into their organizations. Unfortunately, too many times, I see organizations that have the wrong mental models about what makes onboarding successful. It’s a shame that our old mental models keep us from effectiveness—when the research on onboarding now gives us sound prescriptions for making onboarding successful.
I know I'm going completely against most training-industry practice in saying this, but it's the truth. Likert-like scales create poor data on smile sheets.
If you're using questions on your smile sheets with answer choices such as:
Strongly Agree
Agree
Neither Agree Nor Disagree
Disagree
Strongly Disagree
You're getting data that isn't that useful. Such questions will create data that your stakeholders--and you too--won't be able to decipher very well. What does it mean if we average a 4.2 rating? It may sound good, but it doesn't give your learners, your stakeholders, or your team much information to decide what to do.
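To make the averaging problem concrete, here is a small illustrative sketch (the response data is hypothetical, invented for this example, not drawn from any real smile sheet): two courses with very different response patterns can produce exactly the same mean rating.

```python
from statistics import mean

# Hypothetical smile-sheet responses on a 1-5 Likert scale.
# Course A: most learners are moderately positive.
course_a = [4, 4, 4, 4, 5, 4, 4, 5, 4, 4]
# Course B: polarized -- many love it, several are lukewarm or worse.
course_b = [5, 5, 5, 5, 5, 5, 2, 3, 4, 3]

print(mean(course_a))  # 4.2
print(mean(course_b))  # 4.2 -- same average, very different story
```

Both courses "average a 4.2," yet they call for very different actions--which is exactly why the averaged number alone tells stakeholders so little.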
Moreover, let's remember that our learners are making decisions with every smile-sheet question they answer. It's a lot tougher to decide between "Strongly Agree" and "Agree" than between two more-concrete answer choices.
Sharon Shrock and Bill Coscarelli, authors of the classic text, now in its third edition, Criterion-Referenced Test Development, offer the following wisdom on using Likert-type Descriptive Scales (of the kind that use response words such as “Agree,” “Strongly Agree,” etc.): “…the resulting scale is deficient in that the [response words] are open to many interpretations.” (p. 188)
So why do so many surveys use Likert-like scales? Answer: It's easy, it's tradition, and surveys often gain psychometric advantages from such scales because they repeat the same concepts across multiple items and aim to compare one category of response to another.
Smile sheets are different. On our smile sheets, we want the learners to be able to make good decisions, and we want to send clear messages about what they have decided. Anything that fuzzes that up hurts the validity of the smile-sheet data.
When asked for a simple heuristic for how to use the spacing effect--the finding that repetitions spaced in time are more effective in supporting remembering than repetitions squished narrowly in time--I've often told people that the ideal spacing interval is one that equals the retention interval you desire. If you want your learners to remember for a month, give them one-month spaced repetitions. If you can't do that, longer is better, and there seems to be something magical about repeating something overnight.
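The heuristic above can be sketched in a few lines of code. This is my own illustrative sketch of the rule of thumb, not a published algorithm; the function name and parameters are invented for the example.

```python
from datetime import date, timedelta

def spaced_schedule(start, retention_days, repetitions):
    """Sketch of the heuristic: space each repetition by roughly
    the interval over which we want learners to remember."""
    return [start + timedelta(days=retention_days * i)
            for i in range(repetitions)]

# Want learners to remember for about a month?
# Then space the repetitions about 30 days apart.
for d in spaced_schedule(date(2024, 1, 1), 30, 3):
    print(d)  # 2024-01-01, then 2024-01-31, then 2024-03-01
```

In practice the intervals need not be exact; the point of the heuristic is simply that the gap between repetitions should approximate the retention interval you care about.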
But new research suggests that even short spacings of only half a minute or so can have lasting benefits over non-spaced repetitions:
Rawson, K. A., & Dunlosky, J. (2012). Relearning Attenuates the Benefits and Costs of Spacing. Journal of Experimental Psychology: General. Advance online publication.
A few quick comments (before I've read the actual science that Popular Science cites and other related research, which, I must add, I find a fascinating topic):
1. If Popular Science is using only one or two studies to draw conclusions, they don't understand social science. Specifically, they don't understand that social science research generally requires--at a minimum--dozens of studies to draw firm conclusions, fence off boundaries, and discover contingencies. Not always, but usually.
2. I agree that society today is getting more and more anti-science, anti-evidence, and anti-wisdom.
3. I agree that there is justification for worrying about the effect of social-media pollution. As a guy who reads over 200 articles from scientific refereed journals on learning, memory, and instruction each year--and thus who probably knows more than the average bear in my field--I've seen lots of categorically wrong information floated in social-media comments in our field. Of course, I am not infallible, all-knowing, or omniscient. Anyone who reads the research knows how little of the whole he/she can possibly know. But still, I do know enough to know when some notions of learning are fundamentally flawed. AND in the workplace learning-and-performance field, there is much that is foolhardy, misinformed, and harmful--and social media has not stopped this from happening.
4. There are some victories, however, even if they are not complete. For example, social media and the internet have made it less likely that people in our field are spouting off about people learning 10% of what they see, etc. Maybe I have made a difference.
5. I know that comments have been helpful to me personally in other contexts. For example, the New York Times comments have been very helpful to me in seeing the strengths and weaknesses of the original article.
6. I got rid of unmoderated comments on my blog purely due to the large amount of spam that was being posted. Most commenters here have helpful things to say.
7. Popular Science argued specifically that people in general are misinformed about science, lending credence to the idea that comments on vetted scientific articles for a popular audience may be a special case.
8. Popular Science also argued (see the NPR interview) that they made the decision because they would rather put their resources into creating good articles in the first place than into moderating their comment sections.
9. I wish someone who studies this issue intensively would create a rubric, helping us understand when and how comments can be valuable--and when they cause more harm than good. It can't be black and white--comments are good, comments are bad. Most things in human nature don't work like that.
10. For the workplace learning-and-performance field, my recommendation to you is: Don't assume that comments are good or that comments are bad. Do assume, however, that you may need a way to regulate, monitor, or control comments to make them helpful. I'll never forget the time I was arguing with a social-media evangelist who claimed that social media is always corrective in time. A member of the audience interrupted with the story of how social media got a couple of soldiers killed when they used information from social media to attempt to deal with an improvised explosive device.
The Northern New Jersey ASTD Learning Leaders Forum has invited me to speak at their October 15th meeting. To whet the appetite, we created the following video interview, divided into several short videos.
I'd like to send special thanks to Tony Irace--a long-time colleague and a great learning leader--and his co-conspirator at the Northern New Jersey ASTD Learning Leaders Forum, Meg Paradise. Thanks for your interest in my work and for organizing this great event!!
We, the members of the workplace learning-and-performance field, think about our field from time to time. Of course we do.
Here's what I wonder. How much have our mental models changed in the last 50 years?
That's too big a question for me to answer right now, but it does raise an interesting question. Today, as I was reading a scientific article (citation below) on how people have insights, the authors reported that, in their study of naturalistic insights, most sudden insights occurred when people looked at new data--after having already spent a great deal of time thinking about the issue before encountering that data.
Here's the thing: We, in our field, haven't really created new data sets or methodologies very often. Yes, we have Jack Phillips's ROI methodology and Robert Brinkerhoff's Success Case Method--both of which got many of us to rethink what we're doing--but these methods sit at the results end of the causal chain from learning to performance to results. Important stuff--there is no doubt--but not enough.
When it comes to getting new data about learning engagement, remembering, and on-the-job application, we haven't seen much innovation in our field.
If the research on insight is right, then without new data (or new methods to gather data in the case of an industry-wide perspective) we will not have breakthrough insights about how to improve our training and other learning interventions.
We need to continue working toward better data-gathering methods.