I am a big fan of public transportation. It gives me the chance to people-watch, nap, read, listen to podcasts, and generally avoid being stuck in traffic. Recently, I got a bit of a market research lesson on my bus ride to work.
He spent how long on that survey?
I sat on the packed bus, looking around to note the number of people on their phones. I happened to glance at the phone screen of the person next to me and noticed he was taking the “I side with” 2016 political quiz. I thought, “Ah, yes, I remember taking that last election and seeing my friends’ answers on Facebook. I didn’t realize they had updated it. I wonder how he’s answering.”
Then, as we continued our journey, I realized that the quiz was actually fairly long – or at least was taking him a long time.
Now, as market researchers, we hear time and time again that surveys need to be shorter, pack a punch, and ask only the most pertinent questions. When it comes to mobile, this refrain is even louder.
But here’s the thing: we were on that bus for about 30 minutes, and this individual spent 25 of them taking one survey.
What I noticed
I couldn’t help but notice a few things about this experience. (Please note: I did NOT read his answers as he was taking the quiz. I was far enough away that I could only make out general things.)
- The survey was mobile-friendly. It was easy to read, every question was single-choice, and ratings weren’t presented on sliding scales but on simple scales with a clearly marked button for each option.
- He was captivated by the survey. I noticed him scrolling up and down, reviewing his previous answers, and seeming to read through every possible option before answering. He was definitely an engaged respondent.
- The survey has a single purpose: to tell you which of the many political parties in the United States you might agree with most.
- To accomplish this purpose, the survey is split into very clearly defined sections. Each question within a section seemed to offer three options, one of which would expand to show more. When I later took the quiz myself, I was intrigued to find that each question presents the two polar-opposite opinions first, with “Other stances” as the option to reveal more nuanced variations.
- He never seemed to get tired of the survey.
- The results looked fairly extensive, using charts to plot where one’s stances fell across groups of social issues.
- If your survey is about something the respondent is passionate about, they are likely to be willing to spend more time on it.
- This survey was truly device-agnostic. It rendered differently on mobile than it did on my laptop, and both versions were well laid out.
- It’s possible to keep everything on a single page, and still keep someone interested in the survey.
- The results were easy to review.
My personal opinion is that this quiz/survey is popular in part because it tells us something about ourselves, and can even introduce us to political parties we weren’t previously aware of. But it did make me consider why people would never think of spending that sort of time on most market research studies. The answer is fairly simple: the topics typically aren’t nearly as compelling. Frankly, they don’t always have to be, either. But occasionally, when done right, and when the topic IS compelling, you can get people to spend more than 5 minutes taking a survey on any device.