It's almost impossible to read an HR or business publication without encountering an alarming statistic, such as the claim that only 10% of the global workforce is engaged. When you see these statistics, it's important to realize that you can't always take them at face value.
Context can change any statistic
Whenever you look at a figure, it's important to have some context for it. If a statistic says that 10% of the workforce is engaged, you need to understand how those people were classified. Alarming statistics can be manufactured simply by choosing your own definitions. Without any information about how the scores were calculated or how people (or their answers) were classified, it's hard to make even basic sense of the statistic.
A lot of statistics rely on thresholds - for example, to be classified as ‘engaged’ you might have to answer above a certain threshold on 12 separate questions. Under this definition, answering below the threshold on just one of those questions is enough to be considered not engaged. To create an artificially low or high score, all you need to do is change a question or the number of questions used to measure something. Does that really mean you are engaged or disengaged? Or is the definition designed to create an alarming statistic?
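To make this concrete, here's a minimal sketch with made-up survey data and a hypothetical "above the threshold on every question" rule (not any vendor's actual methodology). The same respondents produce wildly different "engaged" percentages depending solely on how many questions the definition requires:

```python
import random

random.seed(0)

# Hypothetical data: 1,000 respondents, each answering 12 Likert questions (1-5).
# Answers skew positive, so most individuals look reasonably engaged.
respondents = [
    [random.choices([2, 3, 4, 5], weights=[1, 2, 4, 3])[0] for _ in range(12)]
    for _ in range(1000)
]

def engaged_rate(respondents, n_questions, threshold=4):
    """Share of respondents scoring at or above `threshold`
    on ALL of the first `n_questions` questions."""
    engaged = [
        all(answer >= threshold for answer in person[:n_questions])
        for person in respondents
    ]
    return sum(engaged) / len(respondents)

# Same people, stricter definitions: the "engaged" share collapses
# as more questions are required to clear the bar.
for n in (1, 3, 12):
    print(f"{n} question(s): {engaged_rate(respondents, n):.0%} engaged")
```

With positively skewed answers, requiring all 12 questions to clear the bar shrinks the "engaged" share to a tiny fraction of what a one-question definition reports, which is exactly how a headline like "only 10% are engaged" can be manufactured.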
At Culture Amp, we always try to explain what our measures mean as transparently as possible. For example, you can read about our five key engagement questions and the specific way we interpret answers to questions using a Likert scale in our Academy.
It’s also important to keep in mind that your engagement score depends on which questions you ask. If your company changes a question, you can only compare your engagement score against companies that asked the same question. So if your company asks three specific questions, we’ll only compare you to the average scores for those three questions. This ensures you’re comparing apples to apples.
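The comparison logic above can be sketched in a few lines. The question names and scores here are entirely invented for illustration; the point is only that the benchmark average is computed over the intersection of questions both sides asked:

```python
# Hypothetical benchmark: industry-average scores per question
# (all names and numbers are made up for illustration).
benchmark_avgs = {"q1": 0.74, "q2": 0.69, "q3": 0.71, "q4": 0.66}

# Your company only asked three of those questions.
your_scores = {"q1": 0.80, "q2": 0.65, "q3": 0.75}

# Compare only on the questions both sides actually asked.
shared = sorted(your_scores.keys() & benchmark_avgs.keys())
your_avg = sum(your_scores[q] for q in shared) / len(shared)
bench_avg = sum(benchmark_avgs[q] for q in shared) / len(shared)

print(f"Your average on shared questions: {your_avg:.2f} "
      f"vs benchmark {bench_avg:.2f}")
```

Dropping q4 from the benchmark side is what keeps the comparison fair: including a question your company never asked would shift the benchmark average for reasons that have nothing to do with your results.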
Similarly, when you say a company has an average engagement score of 70%, that’s not the same as saying that 30% of its people are disengaged. To say that, we would first need to define what a disengaged person is and then measure it.
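A small numeric example (with invented scores and an arbitrary cutoff for "disengaged") shows why the two statements are different:

```python
# Hypothetical team: five people's engagement scores, as percentages.
scores = [65, 68, 70, 72, 75]
average = sum(scores) / len(scores)  # (65+68+70+72+75) / 5 = 70.0

# "Disengaged" needs its own definition - say, scoring below 50%.
disengaged = [s for s in scores if s < 50]

print(f"Average engagement: {average:.0f}%")          # 70%
print(f"Disengaged (score < 50%): {len(disengaged)}")  # 0 people, not 30%
```

Here the average is exactly 70%, yet nobody meets the disengaged definition: the "missing" 30% is just the gap between the average and a perfect score, not a count of people.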
The validity of a statistic comes from its results
The validity of a score doesn’t come from the score itself, but from being able to see results from it. Simply saying that your organization has 70% engagement is not particularly helpful, but if we can show that a higher engagement score predicts your Glassdoor rating or a rise in your stock price, then the score has validity and is more meaningful.
It’s important to create a compelling narrative with data if you want people to be spurred into positive action. So whenever you see an alarming statistic, take it with a grain of salt unless there’s transparency around how it was arrived at. Without that context or validity, there’s no reason to be alarmed.