Webinar recap: How Non-Native English Speakers Respond to English Surveys

Last week, Annie Pettit from Peanut Labs gave a really fascinating presentation on how non-native English speakers respond to English-language surveys. (For the remainder of this post, I’ll be referring to these two groups as ESL and native English speakers.) Here are my take-aways, with some of my own thoughts mixed in, since this really caused me to reflect on my market research work to date.

Why this topic is important

Annie shared some stats that surprised me and made me realize that, unless it’s a global study, I generally pay no attention to whether my respondents are ESL or not. My assumption has always been that my respondents are fluent enough in English to respond to the survey, and I never took the time to consider what might happen if an ESL respondent took it. Here are the numbers Annie presented: 20% of people in the United States don’t speak English at home, and 9% of US residents don’t speak English well. That should not have surprised me (my mother is from Central America), but it’s not something I’ve ever seen addressed when creating surveys (at least, not in my experience so far with product, satisfaction, and brand surveys). I have focused so much on making survey questions clear, concise, and so on, without really considering the full audience of my surveys.

While keeping the ESL audience in mind is important for any survey, it’s especially important for surveys on social policy topics, political polling, or any survey conducted to gather information that would directly impact this demographic.

Data cleaning tests

While clear, concise language is a rule of thumb for any survey, there are other elements of survey design that can make things a bit easier for ESL participants.

Annie reported on a 12-minute survey that included a question asking respondents to rate their own English-speaking skills. The questions were pretty generic, nothing super elaborate. She then applied a number of data quality tests to the results.

  • Over-clicking: native English speakers were fairly click-happy, especially compared to ESL respondents.
  • Red herrings: ESL speakers were more likely than native speakers to select red herrings that were included in check-all-that-apply lists.
  • Following instructions: ESL speakers were a little more likely than native speakers to not follow directions correctly. This was tested with a check-all-that-apply question that asked respondents to choose no more than 3 items but did not enforce that limit.
  • Satisficing: ESL participants were more likely to choose “None of the above” when it was offered as an option. This can be seen as “satisficing,” that is, settling for an easy, acceptable answer rather than working through the full list of options.
  • Open-ended: ESL participants were more likely to skip open-ended questions or to provide shorter answers than native English speakers.
  • Time spent taking the survey: ESL participants’ completion times were more spread out; they were more likely to be either faster or slower than native English speakers.

Straightlining was another data quality measure Annie applied, but it showed no significant differences between ESL and native English speakers.
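
None of these checks requires anything exotic to compute. As a rough illustration only (this is a minimal sketch, not the analysis from the webinar), here is how a few of these flags might be calculated in Python with pandas, assuming a hypothetical survey export with made-up column names like duration_seconds and open_end_text:

```python
import pandas as pd

# Hypothetical survey export: one row per respondent. Column names and
# values are illustrative, not taken from the study Annie presented.
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "duration_seconds": [700, 260, 2400, 750],
    "red_herring_selected": [False, True, False, False],  # ticked a made-up item
    "num_items_selected": [3, 5, 2, 3],                    # "select no more than 3"
    "open_end_text": ["Great product", "", "I liked the packaging a lot", "ok"],
    "grid_answers": [[3, 3, 3, 3, 3], [1, 4, 2, 5, 3], [2, 2, 2, 2, 2], [1, 2, 1, 3, 2]],
})

flags = pd.DataFrame({"respondent_id": responses["respondent_id"]})

# Red herring: selected an option that doesn't actually exist.
flags["red_herring"] = responses["red_herring_selected"]

# Following instructions: exceeded the stated maximum of 3 selections.
flags["ignored_max_3"] = responses["num_items_selected"] > 3

# Open-ended effort: skipped the question or gave a very short answer.
flags["weak_open_end"] = responses["open_end_text"].str.len() < 5

# Straightlining: identical answer on every row of a grid question.
flags["straightlined"] = responses["grid_answers"].apply(lambda a: len(set(a)) == 1)

# Speeding / dawdling: outside a plausible window for a ~12-minute survey.
flags["extreme_duration"] = ~responses["duration_seconds"].between(300, 2000)

print(flags)
```

The flags themselves aren’t the interesting part; what Annie’s analysis highlights is comparing how often each flag fires for ESL versus native English speakers before deciding which ones to use for exclusions.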

So, which tests won’t skew towards eliminating ESL participants from your data set?

  • Over-clicking.
  • Straight-lining.
  • Not following instructions.

These tests ended up not being strong indicators of language skill among respondents.

Final take-aways

When it comes to survey design, using clear language is always a good rule of thumb. Beyond that, consider whether you’re creating questions that will make the survey harder for ESL respondents to take. This includes things like including red herrings in your lists, asking too many open-ended questions, and skipping simple validation that makes sure respondents are following directions like “Select a maximum of X answers from the list.” Add to that making sure you aren’t using data quality tests that naturally skew against ESL participants, and you can create a better overall experience for the ESL respondents to your surveys.
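
On the validation point specifically, catching a broken “Select a maximum of X” rule at collection time takes very little code. The helper below is a hypothetical Python sketch, not tied to any particular survey platform, just to show how small that check is:

```python
def validate_max_selections(selected, max_allowed=3):
    """Return an error message if too many options were chosen, else None.

    `selected` is the list of options a respondent ticked; `max_allowed`
    mirrors the instruction shown on screen ("Select a maximum of 3").
    """
    if len(selected) > max_allowed:
        return (f"Please select no more than {max_allowed} options "
                f"(you selected {len(selected)}).")
    return None


# Example: a respondent ticks four boxes on a "choose up to 3" question.
error = validate_max_selections(["price", "quality", "brand", "packaging"])
if error:
    print(error)  # surface this to the respondent instead of silently accepting it
```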

About Zontziry (Z) Johnson

Z's passion is learning about and sharing best practices and new trends in market research (MR), from writing the best questionnaire possible to crafting a story that will resonate with stakeholders. Follow her musings on the MR industry on Twitter (@zontziry).
