Sometimes, the answer is a simple question away.
In this, the final part of the “Applying Research for Effectiveness” series, we focus on quantitative research, which collects information through surveys.
Popular online platforms now dominate other ways of conducting surveys (such as paper questionnaires), and for good reason. Not only do platforms like SurveyMonkey, Survey Gizmo, Survey Planet and SoGoSurvey automatically tabulate responses, they also provide solutions that are easy to build and manage at little to no cost.
Design the questionnaire
Your goal is for people to complete your survey. First rule: Keep it short and simple. Ask colleagues to review or test-answer the questions before distributing the survey to participants.
As you craft your survey, carefully consider the questionnaire’s structure and strategy to avoid inadvertent bias. Many things can skew responses, including the phrasing of questions and answers. Use these tips to avoid potential pitfalls.
Start simply. Avoid lengthy intros and questions with complex ratings; open-ended questions require deeper thought, and respondents tend to drop out when a survey starts hard. Instead, begin with a simple multiple-choice question to draw in your audience. According to SurveyMonkey, these are the most successful tactics:
- Begin with screening questions. If you’re targeting a specific group of people, use screening questions at the beginning of your survey. That way, you qualify or disqualify respondents early, reducing wasted efforts.
- End with demographic questions. To avoid early abandonment, build trust and engagement before asking sensitive personal demographic questions, e.g., household income and sexual orientation. Note: If you’re using a demographic question as a screener, it’s okay to include it at the beginning of the survey.
Limit page breaks. Page breaks mean extra clicks for respondents. However, there are times when page breaks are necessary, e.g., to separate the themes of your survey.
Hide the previous button. Surveys often evaluate first impressions or awareness, so it’s important that people can’t go back and change their initial answers.
Avoid confusing options. There should be no overlaps in answer options. Don’t repeat numbers in ranges.
- Example: How much time do you spend watching TV in a typical week? 1-3 hours? 3-6 hours? 6-9 hours?
- This question is poor because the options overlap, and it doesn’t account for people who watch TV less than one hour or more than 9 hours.
Balance answer options. For best results, give “how much do you agree or disagree with ...” (Likert) questions five to seven answer options, with equal numbers of positive and negative choices around a neutral option.
Give respondents an easy way out. Ensure that people who take your survey can answer every question, even one that is irrelevant to them. Include answers such as “other,” “none of the above,” “prefer not to answer” or “never.”
Avoid jargon. Never assume the people taking your survey understand common denominational acronyms or jargon. Misunderstood questions produce bad data. Craft wording as clearly and simply as possible, and explain any necessary non-standard terms with examples.
- Example: How often do you use streaming services?
- This question is poor because the respondent may not understand whether it’s audio or video or both. Ask instead: How often do you watch videos using streaming services like Netflix or Hulu?
Avoid double-barreled questions. Be careful when using conjunctions such as “and” and “or” in your survey questions. Using them implies that you are actually asking two questions in one. Stick to one topic per question, or split your question into two parts.
- Example: How useful would this app be for getting your daily news and accessing popular media?
- This question is poor because it asks about daily news and popular media. Does the respondent’s answer refer to daily news or popular media? Both? You don’t know.
Randomize the order of answers. Even the sequence of answers can influence responses, so reorder options to avoid bias. Exceptions: Don’t randomize strongly agree/disagree (Likert) scales, or lists where an alphabetical arrangement is more logical, e.g., U.S. states.
Launch the research
When: The best time to begin a survey depends on project goals and plans. However, statistics (and common sense) show that more people take surveys during the workday than at night or on weekends or holidays.
How: The most common way to distribute a questionnaire is to send an email with a link to the survey to your identified audience. For a broader reach, include a link to the survey on the church website and social media and in your e-newsletter or bulletin.
Analyze the data
Ensure that you’re working with quality data before you begin to evaluate it.
Wait until all surveys are returned. It’s exciting to peek at the number of completes and data as responses are tallied in the online survey tool. Be careful, though! Wait until the survey closes before beginning your analysis.
Clean your data. Once the survey closes, a quick cleanup is worth the time. Deleting or filtering out responses from people who don’t meet your criteria or who only partially completed the survey improves the remaining data’s reliability.
Things to look for when cleaning up survey data:
- Incomplete responses.
- Speeders: respondents who completed your survey too quickly.
- Unrealistic answers: 5,000 hours of TV per week? Yeah, right….
- Gibberish in open-ended responses: Remove nonsense entries, but remember that someone may simply not have an answer for the question. Write-ins like “none” or “n/a” and misspellings aren’t a sign of poor data quality. Keep those.
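If your survey tool lets you export raw responses, the cleanup steps above can even be scripted. Here is a minimal sketch in Python, assuming a simple list of response records with hypothetical field names (`q1`, `q2`, `seconds`); the 30-second speeder threshold is an assumption you would tune to your own survey’s length.

```python
# Minimal survey-cleaning sketch. Field names and the speeder
# threshold are hypothetical assumptions, not from any specific tool.

responses = [
    {"id": 1, "q1": "Yes", "q2": "Weekly", "seconds": 180},  # valid
    {"id": 2, "q1": "No",  "q2": "",       "seconds": 150},  # incomplete
    {"id": 3, "q1": "Yes", "q2": "Daily",  "seconds": 12},   # speeder
]

MIN_SECONDS = 30  # assumed cutoff for "completed too quickly"

def is_clean(r):
    # Keep only responses that answered every question...
    complete = all(r[k] != "" for k in ("q1", "q2"))
    # ...and took a plausible amount of time to finish.
    not_speeder = r["seconds"] >= MIN_SECONDS
    return complete and not_speeder

clean = [r for r in responses if is_clean(r)]
print([r["id"] for r in clean])  # only respondent 1 survives
```

Most online survey platforms support similar filtering in their own interfaces, so a script like this is only needed for larger or exported data sets.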
Write the report
Yes, you have to take the data (there will be a lot of it) and write a formal report. It should include background information, objectives, findings and recommendations. Also include a methodology summary to provide context for your results. A survey methodology summary should contain:
- The data collection method
- Timeline of survey (date of launch to date of completion)
- Sample size
- Country/location
- Age range of participants
For example: “This study was fielded using SurveyMonkey from June 20 to June 22, 2019, with a U.S. sample of 1,000 adults age 18+.”
Wrap up the research
Go on a research roadshow, presenting findings and recommendations to the project team and key stakeholders. Then follow up, follow up, follow up! One of the best ways to ensure your recommendations are implemented is to hold your team and stakeholders accountable. Make a plan to regroup with the team as needed to check in on action items. Ask stakeholders:
- What actions were taken?
- What was the business outcome?
- If no actions were taken, why not?
- How can our research be more actionable next time?
Quantitative research will help guide strategy, avoid mistakes, build credibility and show you where to go next. Give it a try.
Teresa Faust is the Senior Manager of Research and Metrics at United Methodist Communications. She spent most of her career in advertising research, predominantly working with consumer packaged goods companies at a large market research vendor. She grew up in Dayton, Ohio, and relocated to Nashville, Tennessee, when she joined UMCom in 2014. She has two adult sons and enjoys reading and walking.