Blog: Impact for whom? Making big data work for the little guys

22 February 2017

Who does big data work for?

We’ve all been there. You’ve reached a point in your research where you have more data than you know what to do with, and you’re busy turning all those great results into papers, presentations and articles aimed at the people who really matter: other researchers, academics and policymakers in the countries either funding or (hopefully) benefitting from your evidence. In the world of big data, translating research into education policy and practice is a highly political, contested and complex process.

With all those documents to produce, who has the time – let alone the money – to go back to the actual places where that data was collected and share results with the local communities and stakeholders who gave you the information in the first place? And do those groups really need to know this information, anyway? It’s so complex, so difficult to understand, so academic.

This dialogue is common among many in the research community. But deciding when, how, where and with whom to share big data surely lies at the heart of the reason why we collect it in the first place. If research intends to create change and impact real lives, who most deserves to hear the results, and to be included centrally, not peripherally, in their dissemination?

These issues were central to many discussions taking place at a workshop I recently attended for Economic and Social Research Council (ESRC) and Department for International Development (DFID) grant holders of the Raising Learning Outcomes in Education Systems Research programme. A common theme emerged: do researchers focus their data-sharing at the national and international level (mainly targeting policymakers, government stakeholders and other academics and researchers) at the expense of downscaling for downstream audiences?

Many researchers faced common barriers to sharing data with local audiences: the time it takes to share information, the cost of preparing, packaging and disseminating data, questions about the relevance of the information collected during a study for downstream sharing, and the difficulty of communicating complex information effectively to local audiences. Yet if the challenges are so similar, is it possible to come up with shared solutions to them?

(Re)defining impact for downstream audiences

For me (it is worth pointing out here that I am not an academic, but a practitioner turned wannabe researcher), real impact lies in creating change in the homes, schools, communities and people that we capture information from. While policy and knowledge generation are incredibly important, if the information we capture and analyse does nothing to change the lives of the people who gave us the information in the first place, we have missed out on the most important area of impact we are responsible for creating. Policies only work if they are put into practice, and practices can only improve if we use results to help people make better decisions that advance their daily lives.

In order to do that, we must put downstream audiences – the very people we observe, survey and assess in schools, homes and communities – at the centre of our results-sharing efforts.

Use a language that is known and understood

So, how do you share data with these critical downstream audiences and communicate to them in a language they understand? If the results themselves are so complex, how do we make them digestible for the audiences who arguably deserve to see them the most?

When communicating with downstream audiences, what is communicated, how it is communicated and who it is communicated to matters. Results and information must be explained in a language and context that communities and beneficiaries can make sense of.

So how do we move from this:

 

| EGRA Score                    | Letter Name | Initial Sound |
| Full-cost program             | 0.545***    | 0.850***      | 0.654***  |
|                               | (0.122)     | (0.169)       | (0.140)   |
| Full-cost program * Any right | 0.225       | 0.402***      | -0.020    |
|                               | (0.173)     | (0.146)       | (0.187)   |
| Lower-cost                    | 0.095       | 0.316*        | 0.052     |
|                               | (0.110)     | (0.178)       | (0.110)   |
| Lower-cost * Any right        | 0.088       | 0.270**       | 0.045     |
|                               | (0.152)     | (0.129)       | (0.151)   |
| Any right                     | 0.019       | -0.055        | 0.190*    |
|                               | (0.105)     | (0.080)       | (0.100)   |
| Number of Students            | 1438        | 1453          | 1458      |
| Adjusted R-Squared            | 0.151       | 0.224         | 0.106     |
| Control Mean                  | 0.146       | 6.002         | 0.623     |

 

To this?

 

[Image: a pictorial report card, with a ‘Description of Marks’ key explaining each symbol]

In Uganda, I support an early primary literacy project implemented by Mango Tree, a local education communication and tools company. A research organisation and think tank I run, Ichuli Institute, helps Mango Tree track and share information related to student learning outcomes in literacy across schools in northern Uganda, a post-conflict region of the country where education service delivery was severely constrained for over 20 years due to war. Through a series of community and school dialogues, we created a simple but innovative way of sharing information on children’s literacy outcomes with parents.

When we communicate results to rural (many of them illiterate or semi-literate) audiences, we use situations and examples that make sense in their daily lives and local environment to explain difficult information. Rather than presenting data and information in a series of numbers, figures, tables and graphs, we share information using agricultural metaphors to explain to parents how their children are performing in key literacy tasks.

Using a series of tree images, each one corresponding to a different level of performance, we share complex information about student achievement in key literacy competencies. As our parents are nearly all rural farmers, we explain the ‘inputs’ required to help their children grow and thrive – weeding becomes school engagement, watering becomes the provision of school supplies and feeding, and sunshine becomes home learning and reading.

In order to make their child – their seedling – grow, all of these inputs are critical to their success and to making them ‘bear fruit’ on their path to literacy. By explaining these concepts to parents in this way, they are able to engage in dialogue about their children’s performance in a ‘language’ they know and understand.

3 ways to make big data work for everyone

Making big data work for multiple audiences is a complex process requiring time, energy and commitment on the part of everyone in your research team. However, there are a few things we have learned about downstream results communication that may help others share information with their local partners in a simple and engaging way:

1. Make it relevant: Big data must be relevant to the people receiving it. It must be important to them and to their lives and work in order for them to consider it important enough to discuss and understand. 

2. Make it accessible: Big data must be broken down into a series of manageable chunks for it to be useful. Presentation is critical in helping big data tell its own story. Different audiences often need the same information presented in very different ways.

3. Make it actionable: What we learn from our research must lead to changes in policy and practice. In order for results to make sense to the audiences we share them with, they must be structured in a way that allows recipients to take action and change behaviours that will lead to better lives, systems and policies.

Downstream sharing should be a key feature of any results-sharing initiative. By planning for it in our studies and making the people we collect data from central to the impact process, the evidence we gather really can transform lives and turn incredible research into real-world practice.

 

 

The Impact Initiative blog posts are either from individual researchers or from major research programmes. Some of the blog posts are original content, written by researchers and experts connected to the two research programmes jointly funded by ESRC and DFID: the Joint Fund for Poverty Alleviation Research and the Raising Learning Outcomes in Education Systems Research Programme. Other blog posts are imported from related websites and programmes.

The views expressed in these blogs reflect the opinions of each individual and may not represent the Institute of Development Studies, the University of Cambridge, ESRC or DFID.
