Survey design tip: how will you slice the data?

Courtesy geralt on pixabay

When it comes to survey design, things can get complicated quickly. (I think there’s an exponential curve to the level of complication in survey design based on the number of individuals involved.) The aim of these survey design tips is to help you reduce that complication, regardless of the number of people involved. So, let’s look at another pre-writing-the-survey tip: deciding on the analytics.

How will you slice it and dice it?

Last week, I talked about the importance of deciding on your audience for your survey. One element that can affect your audience is what types of data analysis you intend to use. The audience can also affect your data analysis (especially when it comes to data cleaning, as the linked recording from a recent webinar by Annie Pettit, Chief Research Officer at Peanut Labs shows).

Linking your survey purpose with analytics design

My very first survey tip had to do with focusing on the purpose of your survey. The purpose will naturally affect the audience you decide to target with your survey, as well as the data you want to collect. But just as important as deciding what data you need to answer your business question is deciding what analytics you’ll conduct. Many DIY survey software providers have built-in high-level analytics — quick bar charts and pie charts to give you a fast view of the data collected. These quick charts are great, but taking your data analysis even one step further can produce some real insights.

One survey I conducted about a year ago dealt with the public’s perception of Massive Open Online Courses, or MOOCs. When deciding on the content of the survey, one of the analyses I wanted to be able to run was whether age had anything to do with the way MOOCs were being used. I was also interested in how an individual’s current level of education affected their use of, or openness to taking, a MOOC. This affected a few things about the survey design: where I would put those demographic questions (I didn’t want them up front for fear people would think the answers affected whether they could continue with the survey); the granularity of the demographic questions (especially the age brackets); and the types of questions being asked (write-in versus select-one).
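The age-versus-usage analysis above boils down to a cross-tabulation. Here’s a minimal sketch of one in plain Python — the bracket labels and yes/no answers are invented for illustration, not taken from the actual MOOC survey:

```python
from collections import Counter

# Hypothetical responses: (age bracket, "have you taken a MOOC?") pairs.
responses = [
    ("18-24", "Yes"), ("18-24", "Yes"), ("18-24", "No"),
    ("25-34", "Yes"), ("25-34", "No"),
    ("35-44", "No"),  ("35-44", "No"), ("35-44", "Yes"),
]

def cross_tab(pairs):
    """Count responses for each (age bracket, answer) combination."""
    counts = Counter(pairs)
    brackets = sorted({age for age, _ in pairs})
    answers = sorted({ans for _, ans in pairs})
    return {b: {a: counts[(b, a)] for a in answers} for b in brackets}

table = cross_tab(responses)
# table["18-24"] -> {"No": 1, "Yes": 2}
```

Notice that this only works cleanly because age was collected in brackets (a select-one question) rather than as a free-text write-in — which is exactly the design decision the granularity point above is about.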

Deciding on those analytics

How can you decide what analytics to include? Take your survey purpose, decide what data you think will help you answer the question, and then, if you have stakeholders, think about what their questions will be when you present the findings. If you don’t have stakeholders to answer to, consider what you think the findings will say, and then decide how important it will be to do things like understand various groupings of survey participants, see where there is overlap between answers to questions, and decide how many variables you want to include in your analysis. For example, for the MOOC survey, I wanted to know things like: did people take courses from more than one provider; was education level or age a factor in the likelihood to participate in MOOCs; were MOOCs being used by students already in college, or more by people in careers to help fill in knowledge gaps? You might also be looking at more advanced analytics such as correlations, which can help answer the question: are these two variables correlated, and if so, is it a positive correlation (as one goes up, so does the other) or a negative correlation (as one goes up, the other goes down), and how strongly are they correlated?
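Most analytics tools will compute correlations for you, but the calculation itself is short. Here’s a sketch of the Pearson correlation coefficient (+1 is a perfect positive correlation, −1 a perfect negative one); the education and MOOC figures are made-up illustration data, not survey results:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical: education level (coded 1-5) vs. number of MOOCs taken.
education = [1, 2, 3, 4, 5]
moocs_taken = [0, 1, 1, 3, 4]
r = pearson_r(education, moocs_taken)  # close to +1: strong positive
```

If you plan this kind of analysis in advance, it tells you something about question design too: both variables need to be collected on scales you can meaningfully code as numbers.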

Data cleaning

An often-overlooked aspect of data analytics is data cleaning (at least for DIY survey designers or those new to the market research industry). Data cleaning is a very valuable part of preparing the data for analysis. For example, in a previous role, I would first check for responses from people within the company (something I could determine by a combination of location and IP address). (This was especially helpful when I suddenly saw the customer satisfaction rating drop dramatically; it turned out some company employees had been testing the survey and selecting the lowest scores on the rating scales!) Other tests include checking for straight-lining (in a matrix question, did the participant just select all the options in one line?); time taken to complete the survey (you should have a general idea of your survey’s length and can eliminate those who seem to have breezed right through it — an indication they likely weren’t being thoughtful as they completed it); and over-clicking (does it seem like too many of the “choose all that apply” options were selected?). These tests, when applied, help ensure you’re only including valid responses in your analysis, especially if you’ve sent your survey to a panel, where respondents might be more likely to finish surveys as fast as possible in order to obtain rewards.
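The three checks above (straight-lining, speeders, over-clicking) are easy to automate once responses are exported. A minimal sketch — the field names, thresholds, and records are all assumptions for illustration, not from any particular survey platform:

```python
# Hypothetical respondent records exported from a survey tool.
responses = [
    {"id": 1, "seconds": 240, "matrix": [3, 4, 2, 5], "multi_picked": 2},
    {"id": 2, "seconds": 35,  "matrix": [5, 5, 5, 5], "multi_picked": 9},
    {"id": 3, "seconds": 310, "matrix": [4, 2, 4, 3], "multi_picked": 3},
]

MIN_SECONDS = 60       # assumed minimum plausible completion time
MAX_MULTI_PICKS = 8    # assumed ceiling for "choose all that apply"

def is_straight_liner(r):
    """Straight-lining: every row of a matrix question got the same answer."""
    return len(set(r["matrix"])) == 1

def is_speeder(r):
    """Speeder: finished far faster than the survey plausibly allows."""
    return r["seconds"] < MIN_SECONDS

def is_over_clicker(r):
    """Over-clicking: selected implausibly many multi-select options."""
    return r["multi_picked"] > MAX_MULTI_PICKS

clean = [r for r in responses
         if not (is_straight_liner(r) or is_speeder(r) or is_over_clicker(r))]
# Respondent 2 fails all three checks and is dropped.
```

In practice you’d tune the thresholds to your own survey’s length and question types, and review flagged responses rather than dropping them blindly.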

Can I start writing my survey already?

Let’s see, we’ve focused the purpose of the survey. We’ve decided on the audience. And now we’ve decided on the data we’ll want to be collecting so that we can do some specific cuts of data afterwards. I’ve already shared some tips on how to write unbiased survey questions, but next week, I’ll also look at question types and some ideas for writing device-agnostic surveys.
