Research often drives mission-critical decisions, so I hope you have taken care to get the previous steps correct before moving on to data analysis. If you came across this article in the series without reading the others, I suggest you go back and do so. If your survey questions are poorly worded or your data collection process is flawed, your employee survey analysis will be incorrect. Once you have collected a representative sample of data, the fun begins.
On to analysis:
Your hard work bears fruit. You have valid data. The responses to your survey questions are now represented by numbers, and the distribution of responses can be plotted to show you where most respondents lie on the response scale.
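If you want a quick look at that distribution, even a plain-text tally will do. Here is a minimal Python sketch; the response values below are invented for illustration, not real survey data:

```python
from collections import Counter

# Hypothetical responses to one question on a 1-5 agreement scale.
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4, 1, 5, 4, 3, 4]

# Tally how many respondents chose each rating.
counts = Counter(responses)

# Print a simple text histogram, one row per rating.
for rating in range(1, 6):
    print(f"{rating}: {'#' * counts[rating]}")
```

A chart like this immediately shows where most respondents cluster, before you do any deeper analysis.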
Now, what do you do with this data? How do you turn the data into the information that you need to make good decisions?
Or maybe the answers seem to jump right out at you. But can you trust them?
In general, you are searching for answers that address your clearly defined objectives (you did have clearly defined objectives, correct? See Creating the Survey).
Now you begin to dig through the data to look for the root cause of any issues you may be having. You may begin to identify problem areas or make comparisons between departments or recognize achievements.
However, you must beware of the pitfalls that trip up the uninitiated researcher. Here are a couple of simplified examples of how your data can be misinterpreted:
Compared to what?
X number of people liked your customer service
Y number of people didn’t like your customer service
If X is larger than Y, you are doing a good job, right? Well, maybe not.
Let’s say 70% of your customers liked your customer service. You might consider that a success. You may want to improve on that a little, but overall it’s a good number, right?
But what if 92% of your competitor’s customers liked their customer service?
Now your 70% number doesn’t look quite so good, does it? To avoid this mistake you need something to which you can objectively compare your results. This is where you need access to a normative or benchmarking database. This allows you to benchmark against other companies to see how you rate against the rest of your industry.
Not everyone has a normative database to measure against. In some cases, you can purchase norms specific to your industry or line of research. NBRI sells normative data (NBRINorms©) that allows you to make this comparison. Without this capability, you will never be able to assess the true “goodness or badness” of your survey scores.
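To make the 70%-versus-92% comparison above concrete, here is a minimal Python sketch of a two-proportion z-test, one standard way to check whether your score differs meaningfully from a benchmark rather than by chance. The sample sizes (200 respondents each) are hypothetical assumptions, not figures from this article:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Z statistic for the difference between two independent proportions."""
    # Pool the two proportions under the null hypothesis of no difference.
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    # Standard error of the difference between the proportions.
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Your 70% vs. the competitor's 92%, each from a hypothetical 200 respondents.
z = two_proportion_z(0.70, 200, 0.92, 200)
print(f"z = {z:.2f}")  # |z| > 1.96 means the gap is significant at the 5% level
```

With these assumed sample sizes, |z| comes out well above 1.96, so the gap would not be plausible as mere sampling noise; the point is that a benchmark plus a sample size, not the raw percentage alone, tells you whether a score is genuinely good or bad.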
Are you sure that’s the issue?
Having the answer to “do you like our customer service?” does not do you much good unless you can identify what is causing some customers to dislike your customer service. Are your customer service people gruff? Do you have long hold times on inbound service calls? There could literally be hundreds of causes.
What you are looking for is the smallest number of issues that, when addressed, will have the greatest impact on the largest number of problems. These issues are root causes. Sometimes a minor correction can have a large impact on every aspect of your business. Correctly identify the root causes behind your survey data, take corrective action, and you could completely turn a business around.
However, sometimes what looks like a root cause may not be. Here is an example using ice cream sales and swimming deaths that will drive this point home.
Example: Let’s say you collect survey data indicating that ice cream sales increase during the summer. You also have parallel data indicating that swimming deaths increase during the summer. Does this mean that ice cream sales are a direct (or root) cause of drowning deaths, or vice versa? Of course not; it is the summer heat that drives up both ice cream sales and swimming (and therefore swimming deaths). There is a correlation between ice cream sales and swimming deaths, but one does not cause the other. The root cause is summer heat.
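You can simulate this effect directly. In the Python sketch below (all numbers are invented), temperature is the hidden driver: ice cream sales and swimming deaths each depend only on temperature plus noise, yet the two series still correlate strongly with each other:

```python
import math
import random
import statistics

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(1)
# Temperature is the root cause; neither series depends on the other.
temps = [random.uniform(0, 35) for _ in range(200)]
ice_cream_sales = [50 + 10 * t + random.gauss(0, 20) for t in temps]
swim_deaths = [0.1 * t + random.gauss(0, 0.5) for t in temps]

# Strongly positive, even though neither variable causes the other.
print(f"correlation = {pearson(ice_cream_sales, swim_deaths):.2f}")
```

The correlation between sales and deaths is high here purely because both track temperature, which is exactly why a strong correlation in your survey data is not, by itself, evidence of a root cause.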
The above example may not relate to your business, but you can see how easy it could be to reach an incorrect conclusion that, when acted upon, could adversely affect your business, or at the very least waste your time and resources fixing a problem that does not address your research objectives.
Because every business and every survey is different, I cannot list all of the pitfalls you might come across in analyzing your data. Suffice it to say, take your time. Think long and hard. Ask other people; department managers or coworkers can often provide another viewpoint. If necessary, re-survey to clarify a particular question.
Or – hire a professional research firm to analyze your survey data.
The next installment in this series will discuss taking action to address the root causes identified in your survey data.