
What makes a survey good?


Mike Davis | 25 February 2020 at 8:17 am

When TaskForce launched a customer demographic survey across its programs, there were a number of key learnings. Here, Mike Davis shares some of the things he found out about good survey design and implementation.



We rely heavily on surveys in the for-purpose sector. They are often our main way of understanding our clients’ attitudes, behaviours, circumstances and knowledge over time. But they are also fraught with bias and human error on a number of fronts: the survey designer (what outcomes are they seeking?), the client respondent (how do they feel on the day, and how do they relate to the question?), and the survey administrator (how do they promote, communicate and facilitate the survey experience for the client?).

At TaskForce, our clients – like most people – generally do not enjoy filling out surveys. Similarly, our clinical and non-clinical staff would prefer to spend their time providing more care rather than doing more “admin”. These human challenges are compounded by a technical one: each program at TaskForce only captures information specific to that program and its funders’ reporting requirements.

These were two barriers we were keen to overcome in implementing a universal, TaskForce-wide demographic survey across our programs: one that would enable us to collect quality information about demographics and social circumstances, and push us to understand service pathways and wellbeing.

This would enable us to move toward a “bird’s-eye view” of our clients, their service usage and patterns, common issues, and adherence to our wraparound model of care. Throughout this process we learned a lot about good survey design and implementation. Here are a few things we do to ensure we can respectfully and effectively collect survey data:

  • Keep it short and simple

Surveys should be short, direct and to the point, and written in plain English. They should take no longer than two to three minutes to complete and questions should follow a logical order. I would suggest a maximum of 15 questions. 

[Image: SurveyMonkey Genius]

If you are collecting demographic and behavioural information, you can easily get away with a checkbox format, where the client simply ticks the relevant box. “Yes”, “no” or “other” answers and ranges also work well as question types.

Limiting free text responses, so that the client can answer by ticking the right option, will likely increase your response rate a great deal.

We’ve managed to obtain a very high response rate (much higher than the predicted 73 per cent), probably because our survey demands so little time, effort and cognitive load.
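To make the idea concrete, here is a minimal sketch of how a closed-choice question set could be represented. The field names, question wording and categories below are hypothetical illustrations, not TaskForce’s actual survey:

```python
# Hypothetical sketch of a short, closed-choice demographic survey.
# Every question offers a fixed set of options, so the client only has to
# tick a box and free text entry is avoided.

SURVEY_QUESTIONS = [
    {
        "id": "age_range",
        "text": "What is your age?",
        "options": ["Under 18", "18-24", "25-34", "35-44", "45 and over"],
    },
    {
        "id": "housing",
        "text": "Are you currently in stable housing?",
        "options": ["Yes", "No", "Prefer not to say"],
    },
    {
        "id": "prior_use",
        "text": "How often have you used this service before?",
        "options": ["This is my first visit", "A few times", "Regularly"],
    },
]


def is_valid_answer(question: dict, answer: str) -> bool:
    """An answer counts only if it matches one of the fixed options."""
    return answer in question["options"]


if __name__ == "__main__":
    age_q = SURVEY_QUESTIONS[0]
    print(is_valid_answer(age_q, "25-34"))   # True
    print(is_valid_answer(age_q, "thirty"))  # False: free text is rejected
```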

  • Provide only relevant options

Your survey questions should offer a maximum of five choices. Any more than this and you risk putting your respondents into a state of choice fatigue. Use choice ranges that are usual for the type of question you are seeking to answer. 

There are standard age ranges, gender and sexual identity questions and even simple ways to ascertain regularity of service use. Standard scales can also be used to assess things like subjective wellbeing at a point in time. 

This gives us a simple way to make an informed guess at the average mental health of our clients when they first visit TaskForce.

[Image: survey question “How would you rate your mental health?”, answered on a scale from very good to very poor]

Avoid having respondents choose between a tick-box answer and a free text entry option, as this leads to decision fatigue and will reduce answer accuracy. Make drop-down boxes and multiple-choice options your friend.
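As a rough illustration of how a standard five-point scale turns into an “informed guess” at average wellbeing, the sketch below tallies responses and computes a simple mean score. The counts are invented for the example and are not TaskForce data:

```python
from collections import Counter

# Five-point scale for "How would you rate your mental health?"
SCALE = ["Very poor", "Poor", "Fair", "Good", "Very good"]

# Hypothetical responses, invented for illustration only.
responses = (
    ["Very poor"] * 40 + ["Poor"] * 110 + ["Fair"] * 300
    + ["Good"] * 350 + ["Very good"] * 200
)

# Tally how many clients picked each option.
counts = Counter(responses)
for label in SCALE:
    print(f"{label:>10}: {counts[label]}")

# Score the scale 1-5 and take the mean as a rough point-in-time estimate.
scores = {label: i + 1 for i, label in enumerate(SCALE)}
mean_score = sum(scores[r] for r in responses) / len(responses)
print(f"Mean wellbeing score (1-5): {mean_score:.2f}")
```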

  • Focus on coaching effective implementation

Designing a good survey is just the first part of getting useful data from your clients. You’ll need to think about how you will collect that data, who will administer the survey and how they will communicate the purpose of the survey and its importance. 

A key part of this is shifting your team from an “extra admin” burden mindset to a focus on gathering intelligence that will enable better care and service support. Doing this with your team, and championing your “why” before getting into the technical process detail, is essential.

Here is an excerpt from the email communication sent to all staff when implementing our new demographic survey:

[Image: extract from an all-staff email sent to TaskForce staff about the upcoming survey]

A few months ago we couldn’t tell you with any certainty who our clients were, where they came from, how they used our services or how they rated their wellbeing when visiting us.

Today, with close to 1,000 surveys having been completed across our programs we know a lot more:

  • The majority of our clients are young: 62 per cent are under 35 years old.
  • 9 per cent of our clients identify as LGBTI, 10 per cent are homeless or housing insecure, and 22 per cent are culturally or linguistically diverse.
  • 18 per cent of our clients have recently experienced family violence, with just 3 per cent of these clients receiving care other than from TaskForce.
  • The majority of our clients now attend TaskForce for education and training programs, including behaviour change (58 per cent).
  • Our next most popular programs are drug and alcohol counselling (31 per cent) and employment services (10 per cent), with family violence making up the fastest-growing remainder.

We are now turning our attention to the effectiveness of our wraparound model of care. From our demographic survey we know that at least 15 per cent of our clients are attending at least one other service at TaskForce in addition to their primary visit. 

We know that informal referrals (not captured in this data) are likely, and that the true figure is probably much higher across our service, particularly at our Youth Hub with our co-located partners. So our additional research will look at journey maps of our wraparound model, the client-level impact, and the community-level social and economic impact.

We will also be undertaking additional research with partners, including Our Community and their data science team, to explore complex questions around client journeys through our services, including:

  • Which clients are most likely to unexpectedly leave our programs? Can we build a predictive model and early intervention strategy around this? (A rough sketch of what such a model might look like follows this list.)
  • Which clients are most likely to benefit from our programs? Can we gain a better understanding of why this is and share this knowledge? Can we use this to enhance our wraparound model?
  • Do voluntary or forensic clients benefit more from our programs? Would this knowledge contribute to research and policy to produce better outcomes?
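Purely as a sketch of how the first of these questions could be approached, the example below fits a simple logistic regression to made-up survey features. The column names, data and library choice are assumptions for illustration, not TaskForce’s actual analysis:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Made-up survey extract: one row per client, with "left_early" marking
# whether they unexpectedly left a program. Illustrative only.
df = pd.DataFrame({
    "age_under_35":     [1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0],
    "housing_insecure": [1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0],
    "family_violence":  [0, 0, 1, 1, 0, 0, 0, 1, 1, 0, 0, 0],
    "other_services":   [0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1],
    "left_early":       [1, 0, 1, 1, 0, 0, 0, 1, 1, 0, 1, 0],
})

X = df.drop(columns="left_early")
y = df["left_early"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# A basic logistic regression as a first-pass risk model.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Predicted probability of leaving early for unseen clients; those with a
# high risk score could be prioritised for an early-intervention conversation.
for prob in model.predict_proba(X_test)[:, 1]:
    print(f"Predicted risk of leaving early: {prob:.2f}")
```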

What are your survey design and implementation challenges? Are your surveys producing helpful and actionable insights, or just churning out data?

“The goal is to turn data into information and information into insight.” – Carly Fiorina, former CEO, Hewlett Packard.


Mike Davis  |  @mikedav84

Mike Davis is a for-purpose executive leader, chief podcaster at Humans of Purpose and a board director at not for profits SIMNA Ltd and L2R Dance.

