Survey Says: Revealing the Intricacies of Survey Crafting
Have you ever whipped up a survey at the last minute to collect data, only to discover later that the data you collected wasn't very useful? You know what they say: garbage in, garbage out.
While many people believe that crafting a survey is as easy as pulling up SurveyMonkey and letting it rip, a TON of thinking and development goes into crafting an effective survey. In fact, there are entire professions dedicated solely to crafting surveys and questionnaires.
Professional evaluators know how complicated the survey-building process is, and will typically budget enough time to carefully consider every aspect of their design. But what does the crafting process look like? Here are a few of the steps that I, as an evaluator, take when constructing a survey for a client:
I usually start with conversations with various stakeholders about what they're interested in measuring and how the data might be used. The people I typically interview include program managers, coordinators, directors, funders, development departments, and occasionally program participants.
Then, I might follow with a literature review of similar studies or programs to identify whether any established measures are available, how other researchers or evaluators have approached the topic, best practices for designing with a certain audience in mind, and so on.
Next comes a period of thinking and alignment, where I (the evaluator) connect the client's goals with new or established tools, adapting or creating questions as needed. This stage is also when I start to think about the best way to ask each survey question. Do scale-based questions make the most sense for this client? Or do open-ended questions offer a better way to obtain the information needed? Many question types could capture partial or similar answers to what a client is looking for, but understanding the nuance of the client's goals is what ultimately makes the difference between a good instrument and a great one.
The next step is to work out logistics: when the data needs to be collected, how much is needed, and through what medium.
Finally, once I have a grasp on the content of the survey and how I want to approach each data point, I build out the instrument. I'm careful to avoid common survey mistakes like double-barreled questions (more on that later), and I take my time to ensure that questions are clear and specific. I pay attention to the length of the instrument, as well as the intro and outro language that explains the study and gives participants the information they need to provide informed consent.
There are a few more important things I consider as I build out a survey. If an Institutional Review Board is involved in the evaluation, I will need to manage that process, ensuring the project is designed to protect participants. I will also confer with the client to make sure their expertise on the program and their specific audience is reflected in the survey. Then I will pilot test the survey, make edits, and program it into a survey platform. Often I'll even create a copy the client can use to introduce the tool to their participants, either verbally or via email.
As you can tell, there are many steps and considerations that go into building a successful survey. The items outlined above are just the broad strokes I consider when crafting an instrument.
Next month, we will take a closer look at the content-development process, and talk more specifically about how I assess a client’s goals in order to create an effective tool for them.
If you enjoyed this post, follow along with Improved Insights by signing up for our monthly newsletter. Subscribers get first access to our blog posts, as well as Improved Insights updates and our 60-Second Suggestions. Join us!