The purpose of a program survey is to collect the data you need to improve decision-making, maximize client impact, and strengthen your performance. It is therefore important to follow survey best practices so you don't waste time or produce irrelevant data.

We’ve developed the following 7 key principles to help you maximize your survey response rate and collect data that is useful, unbiased, thorough, and actionable:

1. Set your objectives

Your programs have objectives and your surveys should be built to reflect and measure the achievement of these objectives. Don’t just blindly jump into the survey creation process without asking yourself a few questions first.

Before you begin writing your survey questions, you and your team should determine what your objectives are:

  • What are the objectives/goals of the program?
  • How are you going to evaluate the quality of your program?
  • What kinds of changes are you hoping to produce? How will you measure these changes?
  • What is the critical information you need to help improve service delivery?

2. Create performance measures first

Your surveys should be designed to help you evaluate the impact of your programs. Therefore, you need to determine how you’re going to measure this impact with the use of performance measures. Then, you can design your survey questions to collect the necessary data.

All performance measures fall into the following three categories:

  • How much are you doing? (e.g. How many people are you serving? How many tasks are you performing? How many people have you reached out to and successfully recruited?)
  • How well are you doing it? (e.g. How well are you delivering your services? Are your instructors certified? Is the program accessible to the maximum number of eligible people?)
  • Is anyone better off? (e.g. Is your program delivering the intended impact, no impact, or negative impact? Are there any tangible changes in the perceptions, behaviors, attitudes, and/or circumstances of your program participants?)

For example, if you’re running a job training program, one of your “better off” performance measures might be “% of program participants who gain full-time employment within 3 months of program completion.” In this scenario, you should include a survey question about the participant’s employment status and send follow-up surveys to monitor employment status at the appropriate intervals.
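
To make the job training example concrete, here is a minimal Python sketch of computing that “better off” measure, assuming hypothetical follow-up responses with an illustrative `employed_full_time` field (no particular survey tool is implied):

```python
# Hypothetical follow-up responses collected 3 months after program completion.
# Field names are illustrative only.
responses = [
    {"participant_id": 1, "employed_full_time": True},
    {"participant_id": 2, "employed_full_time": False},
    {"participant_id": 3, "employed_full_time": True},
    {"participant_id": 4, "employed_full_time": True},
]

# Percentage of participants who gained full-time employment.
employed = sum(1 for r in responses if r["employed_full_time"])
pct_employed = 100 * employed / len(responses)
print(f"{pct_employed:.0f}% of participants gained full-time employment within 3 months")
```

The same calculation can be rerun on each follow-up survey wave to track the measure over time.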

Go here to learn more about creating performance measures.

3. Keep your questions simple

Your survey data is critical to your continued success – don’t make survey completion any harder than it needs to be! Avoid using industry specific jargon as much as possible and write in a way that most people will understand. Many sources cite 8th grade as the average reading level for adults in the United States, but you may want to conduct research on your specific community.

Language and culture matter! Make sure your survey is written in language appropriate for the community, and make translations available for diverse program participants. This means using the appropriate spoken language (English, Spanish, etc.) and culturally appropriate words and phrases to communicate your ideas.

Finally, you may need to consider whether electronic or paper-based surveys will be effective at all. You may be dealing with literacy issues in your community. If so, be sure to offer alternative solutions for clients who may have difficulty reading and completing surveys on their own.

4. Don’t ask too many questions

Generally speaking, the longer a survey takes to complete, the fewer responses you will receive. For example, SurveyMonkey found that completion rates fall as the number of questions grows: 10-question surveys had an 89% completion rate on average, with the rate dropping by about 2 percentage points for every 10 additional questions.

To get a good response rate, ask only the important questions you need to meet your survey’s objectives, evaluate the quality of your programs, and evaluate impact.
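
The cited completion-rate figures can serve as a rough rule of thumb when deciding how many questions to include. A small sketch, treating the drop-off as linear (an assumption; the real relationship is only approximate):

```python
def estimated_completion_rate(num_questions: int) -> float:
    """Rough linear model of the figures cited above: ~89% completion
    at 10 questions, dropping ~2 percentage points per 10 additional
    questions. Illustrative only, not a SurveyMonkey formula."""
    base_rate = 89.0  # completion rate at 10 questions
    extra = max(0, num_questions - 10)
    rate = base_rate - 2.0 * extra / 10
    return max(rate, 0.0)

print(estimated_completion_rate(10))  # 89.0
print(estimated_completion_rate(40))  # 83.0
```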

5. Include relevant demographics

A lot of social programming is designed to promote equity in the community and close the opportunity gap between people of different races, genders, etc.

To apply an equity lens, disaggregate your data so you can develop strategies that ensure race and other factors do not predict one’s success, while also improving outcomes for all.

If equity is an important part of your work, make sure you collect the appropriate demographic data you need to disaggregate program results and impact.

The following are examples of demographics (not exhaustive):

  • Age
  • Gender
  • Sex
  • Household Income
  • Ethnicity
  • Race
  • Education
  • Marital status
  • Grade level
  • Employment status
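
Once demographic fields like these are collected alongside outcomes, disaggregation is a simple grouping operation. A minimal Python sketch using made-up records and an illustrative `gained_employment` outcome:

```python
from collections import defaultdict

# Made-up program results joined with demographic survey responses.
records = [
    {"race": "Black", "gained_employment": True},
    {"race": "Black", "gained_employment": False},
    {"race": "White", "gained_employment": True},
    {"race": "White", "gained_employment": True},
    {"race": "Latino", "gained_employment": True},
    {"race": "Latino", "gained_employment": False},
]

# Tally outcomes per demographic group.
totals = defaultdict(lambda: {"n": 0, "success": 0})
for r in records:
    group = totals[r["race"]]
    group["n"] += 1
    group["success"] += int(r["gained_employment"])

for race in sorted(totals):
    g = totals[race]
    print(f"{race}: {100 * g['success'] / g['n']:.0f}% gained employment (n={g['n']})")
```

Large gaps between groups are a signal that program strategies may need adjustment.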

6. Understand question types

Not every question can be answered with a multiple-choice response. Sometimes, you’ll need participants to elaborate on their experiences to fully understand the quality and impact of your services.

The following are examples of question types you can use to gather different kinds of qualitative and quantitative data:

  1. Open-ended questions
  2. Closed-ended questions
  3. Rating questions
  4. Likert scale questions
  5. Multiple choice questions
  6. Picture choice questions
  7. Demographic questions

Check out this article to learn more about the different types of survey questions and when to use each one.

7. Avoid bias 

A leading question is one that “prompts or encourages the desired answer.” For example, if someone wanted to evaluate the quality of their program and only included positive response choices, it might look something like this:

How well did we deliver our service to you?

  1. Somewhat well
  2. Generally well
  3. Extremely well

The problem with this question is that it only includes positive response choices. Even if clients are not satisfied with your performance, the worst possible scenario would be that you delivered the service “somewhat well.” This type of question produces unreliable data for staff, partners, funders, and the community.

To avoid leading questions, always include “negative” and “indifferent” response choices.

A better question might look like this:

How well did we deliver our service to you? 

  1. Very poorly
  2. Somewhat poorly
  3. Not sure/somewhere in the middle
  4. Somewhat well
  5. Extremely well 
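
A balanced scale like this is also easy to score. A minimal sketch, assuming the five choices are coded 1–5 (a common Likert convention, not a requirement):

```python
# Hypothetical 1-5 coding of the balanced scale above.
scale = {
    "Very poorly": 1,
    "Somewhat poorly": 2,
    "Not sure/somewhere in the middle": 3,
    "Somewhat well": 4,
    "Extremely well": 5,
}

# Made-up responses from four clients.
responses = ["Somewhat well", "Extremely well", "Somewhat poorly", "Somewhat well"]
scores = [scale[r] for r in responses]
average = sum(scores) / len(scores)
print(f"Average service-delivery rating: {average:.2f} / 5")
```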

Next Steps:

Survey creation is both a communication art and a communication science, so you may want to consider diving deeper into the research behind question development and survey structure.