Blog | November 22, 2017 | Tom Walker

When we summarised findings from Making All Voices Count’s research on how civil society organisations design and use technology, we found that they often struggle to set realistic expectations for their tech projects, do the right amount of user research, connect with the people they want to reach, and adapt to unexpected changes. (Read more on the programme’s findings on other themes, like government responsiveness and public participation.)

We wanted to try setting these messages out in a way that made them useful and easy to read. So we developed a microsite (a standalone, single-page website that displays a specialised set of content) to summarise the messages, and point readers directly to the evidence behind them: https://researchfindings.tech/

[Image: microsite graphic]

Why a microsite?

Researchers have for some time been asking why existing research isn’t used. Recently, there has been a series of calls to understand what kinds of evidence practitioners are actually likely to use, and to address the tech-for-social-change sector’s ‘weak culture of evidence and accountability’.

These are key parts of the picture. But we think it’s also important to consider how that evidence is presented. When practitioners look for evidence, it may be hard for them to quickly work out what will be practically useful. The findings are there – they’re just buried in large blocks of text, or written in a style the practitioner isn’t used to.

We really need a focus on joining up lessons and making them digestible when designing anything that looks at technology and transparency in governance. - Amy O’Donnell, Oxfam GB

Maybe it’s about the format of the evidence itself – rather than expecting busy practitioners to read long blocks of text, what about asking them to listen to experts talking about the research in their own words? In many places around the world, radio is the main way that people get information. While we obviously couldn’t tap into radio stations or those networks, we wanted to try providing research information in digestible soundbites.

So, at Making All Voices Count’s final learning event, ‘Appropriating technology for accountability’, we interviewed people from a range of countries and types of organisation: Michael (Miko) Canares from the Web Foundation’s Open Data Labs, Amy O’Donnell from Oxfam GB, Koketso Moeti from amandla.mobi in South Africa, and the independent consultant Linda Raftree. Listen to the audio here:

Testing how practitioners read research online

Earlier in 2017, we at The Engine Room had put this idea to the test by asking practitioners to read and engage with research we’d written ourselves.

At the time, we were working on Alidade: an interactive tool that walks you through the process of choosing a technology tool, highlighting relevant resources and research findings along the way. We decided to run online walkthroughs with 10 project managers working on transparency and accountability initiatives in countries ranging from Nigeria to Indonesia.

We asked them to read a one-page online summary of the research while sharing their screens with us, so that we could see how they browsed. Our hopes were high: all of them had said they were interested in reading evidence about their work, and we’d tried to make the page as short as we could, split the findings into bullet points and highlighted key passages.

[Image: microsite graphic 2]

Perhaps unsurprisingly, we were disappointed. Most only skimmed the research findings. Those with a stronger interest in research said that their eyes were drawn to the most skimmable parts: text in larger type, and the few graphics we’d included – even when these weren’t an accurate reflection of the most important messages. Like all of us, they were dealing with an online information glut by trying to quickly assess whether any of the information was relevant – and ignoring the rest.

These observations won't be news to anyone familiar with research into how we read online.

People will often say that we don’t know where to find existing research. In a space like ours you’re exposed to so much of it, but you don’t make use of it. - Koketso Moeti

They also fit with the broader trend that we identified throughout Making All Voices Count’s research on the design of technology projects: that organisations weren’t doing enough research into their users’ needs. We asked Koketso, Linda, Miko and Amy why.


So, the microsite we produced aims to highlight five simple messages, the most important evidence backing them up, and some ideas on how to do things differently. It’s designed to show only the information the reader is specifically interested in, keeping what they have to take in to a minimum. Is this a useful format for presenting evidence? We’ve been encouraged by some of the feedback we’ve already had.

However, we’re also keen to hear more views - is this a useful way of sharing evidence? What could we do differently? Let us know at research@theengineroom.


About the author

Tom Walker is Research Lead at The Engine Room, an international organisation that helps activists and organisations working for social change to make the most of technology and data. The Engine Room has been commissioned by Making All Voices Count to review and communicate findings, especially those relating to the design and use of technology in projects.