Top Five Trends in Social Impact Measurement in 2017
1 February 2017 at 8:10 am
If 2016 was all about innovation and agility, then 2017 is about the convergence of being data-driven and, at the same time, human-centred, writes evaluation expert Jenny Riley, who reveals the top trends in social impact measurement.
I love the summer break for reflecting on trends in the measurement and evaluation space (no, really – I do!). So, here are the top five trends in social impact measurement for 2017.
1. Big data
Big data is everywhere. But what is it?
Big data refers to data sets so large that conventional tools cannot handle them; they require more sophisticated methods, applications and programs to analyse.
These are not your average data sets, eg a phone list. Big data is high volume, produced at such an exponential rate that we are at risk of not being able to keep up – and it is coming at us from a vast variety of sources. The data is both structured, eg from traditional databases, and unstructured – think Facebook, Twitter, Google searches, machine-to-machine data, video, audio and financial transactions.
How is it being used? Corporations have been using big data for some time now. One use is to forecast purchasing patterns so that products and services can be made available when people want them. Think Netflix – its original series are commissioned on the back of data analysis of what people are already watching. Walmart analysed its sales data and found that strawberry Pop-Tarts sold out in the lead-up to hurricanes.
What do you think they now stock when the weather turns bad? Imagine harnessing this power beyond TV shows and confectionery. In Indonesia, the UN tracks Twitter to keep a close eye on pandemics so it can react faster.
The possibilities for us in the social sector are immense, especially when big data is used as one of many approaches to understanding the community. In Bendigo, I am working on a project where Facebook groups are being tracked to get a pulse on community interests, needs and concerns – eg heat waves, gastro outbreaks or what to do when a local clinic is providing poor service.
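By way of illustration – and this is a toy sketch in Python with made-up topics and posts, not the actual Bendigo tooling – tracking community concerns can start as simply as counting topic mentions in exported group posts:

```python
from collections import Counter

# Hypothetical watch-list of community topics (a real project would
# refine these phrases with local knowledge).
TOPICS = {
    "heat": ["heat wave", "heatwave", "hot weather"],
    "gastro": ["gastro", "food poisoning", "stomach bug"],
    "clinic": ["clinic", "waiting room", "appointment"],
}

def tag_topics(post):
    """Return the set of watched topics a single post mentions."""
    text = post.lower()
    return {topic for topic, phrases in TOPICS.items()
            if any(phrase in text for phrase in phrases)}

def topic_counts(posts):
    """Count how many posts touch each watched topic."""
    counts = Counter()
    for post in posts:
        counts.update(tag_topics(post))
    return counts

# Hypothetical exported posts standing in for a community group feed.
posts = [
    "Another heatwave coming - please check on elderly neighbours",
    "Anyone else come down with gastro after Saturday's market?",
    "Two hours in the clinic waiting room again today",
]
print(topic_counts(posts))  # eg Counter({'heat': 1, 'gastro': 1, 'clinic': 1})
```

A real project would need permission to use the data, far better text matching and a way to watch trends over time, but the principle is the same: turn a stream of community chatter into a signal you can act on.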
The social sector needs to start using big data in two ways – data for action and data for impact. We need to better understand our world so we can produce innovative services and products (data for action), and we need to access and analyse big data to help us understand whether we are making an impact (data for impact).
2. Data science and business intelligence
With the explosion of data come tools and approaches to make sense of it. We are hearing job titles like data scientist, data analyst and data engineer.
Data science is a field that brings together maths and statistics, coding and subject matter expertise. These are the magicians who apply a whole range of analytics, such as regression analysis, along with coding – using programming languages such as Python and software such as Hadoop to make sense of all the big data out there.
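To give a flavour of that maths-plus-coding combination, here is a least-squares regression in plain Python – the numbers are entirely hypothetical, and a real analysis would bring far more rigour (and libraries such as pandas or scikit-learn) to the task:

```python
# Least-squares linear regression in plain Python, with hypothetical
# data: sessions attended in a program vs a wellbeing score.
attendance = [2, 4, 6, 8, 10]
wellbeing = [3.1, 4.0, 5.2, 5.9, 7.1]

n = len(attendance)
mean_x = sum(attendance) / n
mean_y = sum(wellbeing) / n

# Slope is the covariance of x and y divided by the variance of x.
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(attendance, wellbeing))
    / sum((x - mean_x) ** 2 for x in attendance)
)
intercept = mean_y - slope * mean_x

print(f"wellbeing ~ {slope:.2f} * sessions + {intercept:.2f}")
```

The fitted line suggests how the score moves with attendance; the data scientist's real work is testing whether that relationship is genuine rather than noise.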
Can I see not-for-profit organisations hiring data scientists and engineers to set up these systems? Not in the short term. For one, they are rare and therefore very expensive. Some of the big research institutes have, since the beginning of time, employed research experts to analyse trends and undertake data analytics, and government agencies such as the ABS are beginning to employ data scientists to explore big data – it would be great if they shared their insights.
Instead of data science, I believe our sector needs to adopt thinking from business intelligence (BI). Definitions vary, but essentially business intelligence is a set of strategies, processes, applications and data used to support the collection, analysis, presentation and dissemination of business information. Often this information is presented visually in a dashboard.
Simply put, BI provides an architecture for inputting, storing, analysing and reporting. The major difference between BI and data science is that BI focuses on internal data rather than external (big) data sources, although BI tools are increasingly integrating big data into their analytics to inform business decisions. We will see more data hubs and data sharing in 2017, and tools such as Microsoft Power BI and Tableau being used in our sector.
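At its smallest, that input-store-analyse-report loop looks something like this sketch – hypothetical internal service records rolled up into the kind of summary a dashboard tile would show, a toy stand-in for what Power BI or Tableau do at scale:

```python
from collections import defaultdict

# Hypothetical internal service records - the structured, internal
# data BI typically works with.
records = [
    {"program": "housing", "month": "2017-01", "clients": 42},
    {"program": "housing", "month": "2017-02", "clients": 51},
    {"program": "youth", "month": "2017-01", "clients": 18},
    {"program": "youth", "month": "2017-02", "clients": 27},
]

# Analyse: total clients per program - the kind of measure a
# dashboard tile would display.
totals = defaultdict(int)
for record in records:
    totals[record["program"]] += record["clients"]

# Report: a plain-text stand-in for the dashboard view.
for program, total in sorted(totals.items()):
    print(f"{program:<10} {total:>4} clients")
```

The BI tools add the storage, refresh and visual layers on top, but the underlying move is always the same: aggregate your own operational data into measures people can act on.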
3. Learning systems
What do we do with all this data and all these reports? Well, we need to learn and adapt. Peter Senge first talked about learning organisations in The Fifth Discipline, and the role of evaluators and measurement experts will be to work with facilitators and learning and development folk to create organisations and collaborations that can intentionally learn and adapt their work.
Learning systems require inputs in the form of qualitative and quantitative data that can be analysed, reported and actioned. Monitoring and evaluation is now often referred to as MEL – monitoring, evaluation and learning.
Learning systems will require:
- high levels of personal awareness of cognitive bias and mental models
- adaptive leadership skills, especially the skill to create “containers” for learning
- timely, appropriate and “best as we are going to get” data
- an appreciative reflection and inquiry mindset
- courage and skill to ask “strategic questions” and not know the answers.
4. Evaluating complexity
Over the past couple of years the social sector has embraced the concepts of complexity and systems thinking to inform our work. This shift has challenged us to move from best practice to emergent practice – to a "probe, sense, respond" approach as opposed to a "sense, categorise, respond" one. Part of this shift in thinking has been recognising that traditional design and evaluation approaches, such as logical frameworks, causal chains and program logic, are limited in supporting and accounting for emergent and developing work.
As we move to evaluating our work in complexity, some of the methods we will see more of will include:
- most significant change
- outcome harvesting
- contribution analysis
- bellwethers or sentinel indicators
- process monitoring of impacts.
5. Convergence of design and evaluation
Developmental evaluation was the new black in 2016 and it ruffled all sorts of feathers. A colleague and I recently designed a project using developmental evaluation, and we couldn't help but think that it was really just using evaluation for ongoing design.
With the explosion in 2016 of innovation, design thinking, human-centred design, design labs and the like, the evaluation community looked on and feared its own demise – but hang on – we do that! Design and evaluation were the key themes at both the American Evaluation Association conference and the Australasian Evaluation Society conference, and probably many others.
In response, the Australasian Evaluation Society has a new special interest group called Design and Evaluation – its first meeting, held by web-link, was on 31 January.
About the author: Jenny Riley is the founder and director of Navigating Outcomes and an associate with Collaboration for Impact. She specialises in evaluation and collaboration and has spent several years leading developmental evaluations and action research projects in complex settings. Riley has extensive experience establishing and supporting a range of collective impact initiatives in the areas of homelessness, youth unemployment, mental health and school readiness.