Thursday 11 December 2014

The Impact and Learning blog will no longer be updated, but you can read all the latest opinions from the IDS community on our website

This will be the last post to be published on this blog. The Institute of Development Studies now publishes all our members’ and guest bloggers’ posts directly onto our website.

You can keep up to date with the latest opinion and expert analysis from your favourite development bloggers, and comment on and share their posts, by visiting our website, following us on social media or subscribing to updates from the following IDS units:


Alternatively, you may be interested in these IDS research and knowledge clusters:

We are sure you will continue to enjoy the conversation.

Best Wishes

The IDS Communications and Engagement Unit

Monday 1 December 2014

Realist Evaluation

By Rob D. van den Berg

On 20 November I attended a workshop on realist evaluations, organised by IDS Fellow Inka Barnett and co-sponsored by the Centre for Development Impact. The focus was on ongoing work in evaluations and how this could be improved. Realist evaluation experts Bruno Marchal and Sara van Belle, who came over especially from Antwerp, provided an excellent overview of the realist evaluation paradigm and actively engaged with workshop participants. Throughout the day it became clear that the realist paradigm is the most detailed and sophisticated of the theory of change / theory-based evaluation approaches. These are various theoretical frameworks that aim to examine the assumptions that underlie policies, programmes and interventions. These assumptions can then be evaluated to see what works, for whom, how and under which circumstances.



The examples of realist evaluations explored during the workshop were rich and varied. They ranged from using mobile phone technology for nutrition surveillance, to a realist synthesis of evaluative evidence on water and sanitation issues, to an evaluation of influencing the Chinese position on global health issues. Participants struggled a bit with some of the jargon: assumptions are framed in terms of ‘CMOs’ – context-mechanism-outcome configurations that describe how a specific mechanism is thought to bring about change (the desired outcome) in a specific context. However, this is the element that makes realist evaluation potentially richer than other theory-based approaches.
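As a purely illustrative aside, a CMO configuration can be pictured as a simple structured record. The sketch below is not part of the realist evaluation literature’s formal notation, and the example values are invented, loosely echoing the nutrition surveillance case mentioned above.

```python
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    """Illustrative context-mechanism-outcome configuration."""
    context: str    # the circumstances in which the intervention operates
    mechanism: str  # how the intervention is thought to bring about change
    outcome: str    # the change (desired or observed) in that context

# Hypothetical example, loosely inspired by the nutrition surveillance case above
example = CMOConfiguration(
    context="rural district with widespread mobile phone ownership",
    mechanism="frontline workers report child growth data by SMS, prompting rapid follow-up",
    outcome="earlier detection of deteriorating nutrition status",
)
print(example)
```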

As an IDS Visiting Fellow I am involved in supporting the Centre for Development Impact, so I was interested in how realist evaluation could be positioned in the range of impact evaluation methodologies that the Centre promotes. For me, the realist evaluation approach scores highly on a number of issues. It is the richest and theoretically most satisfying version of the theory-based approaches that have been developed over the past decades. The approach accommodates the greatest range of potential evaluation questions. This is immediately clear in the questions that are often used to characterise realist evaluation: what works, for whom, how, when, where and under which circumstances. The focus on causality – on the mechanisms that are supposed to bring change – makes it especially worthwhile for impact evaluations of complex interventions. And a great advantage is that realist evaluation is not dogmatic as regards evidence, counterfactuals, tools and methods, and has a theoretical underpinning for this. In other words, realist evaluators have a rigorous justification for their use of data and evidence through the realist evaluation framework.

Some drawbacks also need to be mentioned. The jargon seems somewhat forced – CMOs need to be explained to outsiders. As usual with these frameworks, the need to clearly define what you are talking about may actually turn against you – the best way to prevent this in evaluations is to maintain a strong focus on the questions that need to be answered. If the jargon stands between your question and your answer, the realist framework may not be the best to follow.

A second, more profound difficulty is that the realist evaluation framework is all about social change. This limitation becomes a bit of an obstacle when other types of change need to be taken into account. Evaluation is increasingly confronted with change processes in other domains: climate change, environmental degradation, use of natural resources, value chains, and so on. If the follow-up to the Millennium Development Goals in 2015 integrates the use of natural resources and environmental issues into new Sustainable Development Goals, the realist framework in its current incarnation will not help evaluators address them. A new challenge emerges: that of adapting realist evaluation to include non-social processes of change.

The evaluations that were discussed are very relevant to the work of the Centre for Development Impact. Compliments to Inka Barnett for organising this! The Centre for Development Impact should invite these evaluators to become part of a community of practice on impact evaluation, hosted by the Centre, where they can exchange questions, solutions, issues to explore further and learning on new developments. I hope to see more of this in the near future!

Rob D. van den Berg is a Visiting Fellow at the Institute of Development Studies. 

Friday 25 July 2014

It’s stakeholder mapping, Jim, but not as we know it

By James Georgalakis

What happens when you create fictitious organisations working on make-believe influencing scenarios and ask a bunch of people who have never worked together before to develop stakeholder maps for them? Quite cool stuff actually.

At a recent IDS short course designed to provide a broad overview of research and policy communications we included a section on stakeholder mapping tools. Over the past decade I have run many workshops that included some form of stakeholder mapping exercise. But whether the sessions took place with civil society organisations in Malawi, researchers in Nepal or social workers in Ukraine, context was always key. During all these capacity-building events we worked on real scenarios. So what were we to do in a single day with 25 participants from a broad range of think tanks, universities, NGOs and consultancies, two-thirds of whom had never undertaken any kind of stakeholder mapping before? Make it up, of course!

We simply created five pretend scenarios based on very different policy contexts, inspired by the range of participants we were expecting. We surveyed them first, checking what kind of policy actors they typically targeted. We then got pretty creative, making up an organisational profile and an influencing objective, and we even suggested some potential stakeholders they might want to kick things off with. The exercise was otherwise pretty much like any other network-mapping-style process. Participants identified further stakeholders, placed them on a large map to indicate their relationship to their made-up institution, and looked at their relationships to one another. They then scored each one in terms of their level of influence on the issue and the likelihood of them being allies, opponents or neutral in relation to the hoped-for influencing goal.
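For readers who like to capture the output of such an exercise digitally, here is a minimal sketch of how the scoring step might be recorded. The stakeholder names, scores and field names are made up for illustration and are not part of the IDS course materials.

```python
# Minimal, illustrative way of recording stakeholder-mapping scores.
# Each stakeholder gets an influence score (1-5) and a stance towards the influencing goal.
stakeholders = [
    {"name": "Parliamentary committee", "influence": 5, "stance": "neutral"},
    {"name": "National newspaper",      "influence": 4, "stance": "ally"},
    {"name": "Sector think tank",       "influence": 3, "stance": "opponent"},
]

# Sort the most influential non-opponents to the top of a draft engagement list.
priority = sorted(
    (s for s in stakeholders if s["stance"] != "opponent"),
    key=lambda s: s["influence"],
    reverse=True,
)
for s in priority:
    print(f"{s['name']}: influence {s['influence']}, {s['stance']}")
```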

One of the main differences from a more conventional session was that it was faster. Detailed maps were produced in under an hour and a half – although they would all happily have taken much longer if we’d let them. With less time spent on reviewing objectives (we provided these) and dissecting deep-rooted institutional issues around identity, legitimacy, power and profile, the participants quickly explored different external stakeholders and their potential usefulness. However, the discussions still contained much of the richness that more conventional sessions do. Our approach meant that we had mixed groups learning from one another’s institutional and sector perspectives. They quickly discovered they had quite different ideas about how change happens and the impact of research, and began exploring hidden power relationships. In the group I facilitated they challenged the narrow list of parliamentary and governmental stakeholders we had suggested and wanted to extend their map to include civil society organisations, social scientists and the media. This in turn helped them unpack what it means to try to influence the quality of a particular public policy discourse with evidence (they were pretending that they were going to try and engage with the controversial debate on the impact of immigration on the UK in the run-up to a general election!).

The participants really seemed to gain an appreciation of the importance of mapping your policy environment, and of how documenting the discussion is almost as useful as the map itself. They revisited their maps in the afternoon and used them to select priority audiences, for whom a communications plan was then developed. It all felt pretty real by the end, and the fictitious scenarios appeared to deliver much the same learning and tools – ones they could apply back in their own organisations – as any more realistically grounded exercise would have done.


You can download all the course materials, including the stakeholder mapping scenarios and facilitators’ notes, here. IDS is currently developing an Achieving Influence and Impact Series, so do let us know if you would like to be kept informed of future courses and free resources on this topic: training@ids.ac.uk

James Georgalakis is the Head of Communications at IDS
Follow James @ www.twitter.com/bloggs74

Other posts by James Georgalakis on research communications:

The Guardian
Has Twitter killed the media star?
Marketing: still the dirty word of development?

On Think Tanks
Is it wrong to herald the death of the institutional website?
How can we make research communications stickier? 

Impact and Learning 
Digital repositories – reaching the parts other websites cannot reach
Influencing and engagement: why let research programmes have all the fun?
Going for gold: why and how is IDS bringing our journal back in house and making it open access?



Monday 14 July 2014

Going for gold: Why and how is IDS bringing our journal back in-house and making it open access?

By James Georgalakis


The recent announcement by IDS that we are not planning to renew our contract with Wiley Blackwell for the publication of our journal, the IDS Bulletin, will have delighted some and baffled others. Re-launching our flagship publication as a gold open access digital journal means the end of subscription income and the end of a large publisher’s marketing support. From January 2016 the Bulletin will be produced in-house and will be available to all for free.


Since 1968 the IDS Bulletin has been an integral part of IDS’ research dissemination strategy, covering the major themes and influencing debates within international development. As we move forward we will build on its unique characteristics, including the thematic issues that mobilise scholars from multiple disciplines around key development issues. However, for the first time in its history, from 2016 there will be no paywall, no embargoes and few licensing restrictions to obstruct researchers, students, policy actors and activists from using the Bulletin to support their work.


This new open access IDS Bulletin will be supported by robust editorial and peer review processes, with an editorial steering group made up of IDS Fellows from all of our key research areas plus an advisory body to provide oversight. Academic editors of issues will be drawn from across the IDS community, including our partners, and a small in-house production team will provide a high-quality publication, available free digitally and in print for those that need it.

Who Pays?
Of course, the key conundrum of the open access movement has always been: who pays? If not the subscriber, then surely the researcher (via their funder, of course). What this means in practice for conventional social science journals is paying article processing charges (APCs) of around £2,000 per article to commercial publishers. In the case of the new IDS Bulletin we are dispensing with this process altogether: we are simply bringing production and distribution in-house, charging projects and programmes a fixed sum of just under £6,000 for a whole issue of up to nine articles, and then fundraising to meet the shortfall. With its long history, policy-focused thematic issues, not-for-profit financial model and full compliance with even the most stringent open access policies, we are confident that the open access IDS Bulletin will attract the financial support it needs to continue to provide fresh new thinking on key development issues.
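To see the scale of the difference, here is a back-of-the-envelope comparison using the figures quoted above; the £2,000 APC is the rough figure cited for conventional journals, not an exact price.

```python
# Rough comparison of the two funding routes, using the figures quoted in the post.
apc_per_article = 2000        # approximate commercial APC per article (GBP)
articles_per_issue = 9        # maximum number of articles in an issue
in_house_issue_charge = 6000  # "just under" GBP 6,000 charged per whole issue

apc_route_cost = apc_per_article * articles_per_issue  # 18,000 for a full issue
print(f"APC route for a full issue:  ~GBP {apc_route_cost:,}")
print(f"In-house whole-issue charge: ~GBP {in_house_issue_charge:,}")
print(f"Difference:                  ~GBP {apc_route_cost - in_house_issue_charge:,}")
```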

Why not just publish a hybrid?
A hybrid publishing route involves placing submitted versions (post-peer review but pre-editing and formatting) of your articles in a repository and making others gold open access with APCs. Many publishers, including Wiley, have developed hybrid systems. However, the Bulletin is quite different from other journals. With thematic issues that are often built around a specific research project or programme, the use of APCs is almost impossible: we need to fund each whole issue, otherwise it is game over. Hence the charge we apply to projects wanting to commission an issue. Furthermore, simply releasing the submitted versions of articles still fails to meet the strictest open access mandates, such as DFID’s. In fact, this hybrid route was originally recommended by the UK Government as part of a five-year transition (from 2012 to 2017) towards fully gold open access. It does not work for us, and it may not represent a long-term solution for other learned societies in the social science sector.

An open access strategy fit for fifty years in development studies
Our emerging open access strategy and our desire to pursue engaged excellence demand that we open access to as much of our evidence and knowledge as possible. The schemes to open up access to journals for southern institutions, such as HINARI, AGORA and OARE under the Research4Life programme, are good but no longer go far enough. The ongoing evolution of the IDS Bulletin is in part thanks to Wiley Blackwell themselves, who have over the last six years helped build its credibility and reach. However, the expiry of our contract with Wiley at the end of 2015 marked an opportunity to take the journal into the next exciting phase of its development. It will be re-launched in our fiftieth year and form a key part of our anniversary celebrations as we release the entire back catalogue. This major publishing event will form part of the narrative around the Institute’s fiftieth birthday, as we explore our future role in a changing world in which development knowledge is generated globally and we seek to share it with all those that need it.

James Georgalakis is the Head of Communications at IDS
Follow James @ www.twitter.com/bloggs74

Other posts by James Georgalakis on research communications:

The Guardian
Has Twitter killed the media star?
Marketing: still the dirty word of development?

On Think Tanks
Is it wrong to herald the death of the institutional website?
How can we make research communications stickier? 

Impact and Learning 
Digital repositories – reaching the parts other websites cannot reach
Influencing and engagement: Why let research programmes have all the fun?

Friday 14 March 2014

Hosting webinars: lessons from a recent ‘on air’ experience engaging stakeholders in real-time

By Adrian Bannister

As part of a recent e-dialogues week delivered by the Making All Voices Count programme team, colleagues from IDS Research and Knowledge Services departments worked to convene an entirely web-based audience of invited stakeholders for two online events.

These book-ended five days of asynchronous online discussion that took place on the Eldis Communities platform – you can read my top tips for facilitating online discussion here.

So what’s a webinar?
For the uninitiated, a webinar is simply ‘a seminar conducted over the Internet’. Like a real-world seminar, this generally involves one or more presenters speaking to, and receiving questions from, an audience.

It happens in real time and typically involves streaming audio and video. For the audience, it means being ‘virtually there’; for the project team, it means working like a television production unit.

One key distinctive feature, and advantage, is that nobody (from the panel of speakers or the audience) has to be in the same location. However, each participant must have a broadband Internet connection and knowledge of ‘where the virtual room is’, i.e. a web link. In some cases it is also necessary to download software, e.g. AT&T Connect.

Rationale and approach
We chose to open the e-dialogues week using a webinar event for three reasons:
  • to present the programme’s own thinking on the questions the e-dialogues week was posing (the focus was on the experience of Technology for Transparency and Accountability Initiatives – T4TAIs);
  • to provide an opportunity for programme stakeholders to ‘meet’ the programme staff and hear them speak directly; and
  • to begin to stimulate thinking and interaction among participants around aspects of the online discussion to come.

Both events were an hour long and followed a broadly similar format. A panel of IDS colleagues sat together in the same room – though they could each have joined from different locations – and each shared a short presentation that was live-streamed to the audience.

The audience were able to interact with each other via a chat room, and also to queue up questions for the panel. These were fed to the speakers during an extended question and answer session.

The closing webinar provided a mechanism for the panel to verbally summarise and provide commentary on the areas of the discussion that they had facilitated. 

A note about technology
To host these webinars we set up dedicated, private virtual rooms using the Click Meeting platform, which was provided and supported by Webgathering. Participants were able to access the space via a web browser, which required only a minimal software download: the popular Adobe Flash plug-in.

Though there were only 80 or so invitees for our events, Click Meeting can host up to 1,000 attendees for large webinars and has useful text translation features for a variety of languages. It allows meetings to be recorded as a video file (in MP4 format), which makes them accessible and shareable with most computer users afterwards.

Web Gathering has an extended facility with Click Meeting, so it can accommodate more presenters than a standard plan; a free trial is also available.

Key lessons
Based largely on these experiences, we’ve gathered our thoughts into the following 15 top tips for managing webinars, which may be useful for your own projects.


Adrian Bannister is the Web Innovations Convenor & Eldis Communities Coordinator at IDS

Thursday 27 February 2014

Influencing and engagement: Why let research programmes have all the fun?

By James Georgalakis

Is it time for us to start applying some of the strategies for bridging research to policy used so effectively at a project level to our institutions as a whole? Could a more coherent approach to policy engagement and research communications applied cross-organisationally lead to greater impact?

There is a chasm to be crossed in many research organisations and think tanks. It is the deep gap that exists between high-level institutional strategies and project-based impact plans. Many project funders pile pressure onto researchers to develop theories of change and detailed research communication strategies, but there is little incentive for research organisations to look at the policy and knowledge landscapes they operate in at an institutional level. Is it really only campaigning organisations, like international NGOs with their advocacy frameworks, that need to take a holistic approach to policy engagement?

The challenge for many research-producing organisations, made up of multiple centres, projects and programmes, is how they can be greater than the sum of their parts. Even academic institutions and think tanks with a clearly articulated mission of actively engaging in policy discourse risk entirely vacating key policy debates, or abandoning prime influencing opportunities, when certain projects come to an end.

Research programme vs institutional priorities

I recently co-facilitated a series of workshops in Nepal as part of the Think Tanks Initiative Policy Engagement and Communications South Asia Programme, with an inspiring group of researchers and communications professionals from fifteen South Asian think tanks. They were all interested in the development of institutional level engagement strategies and were simply not willing to restrict their planning to a specific project. Or as one participant put it, “Why should the research programmes have all the fun?”

They each developed a clear policy engagement goal, or set of goals, that reflected their vision of change. For some these were softer changes in the nature of the policy discourse; for others, quite specific changes in the direction of policy in a particular area. They then mapped their context and gained a real understanding of how change happens in relation to their hoped-for long-term impact. The lack of a specific set of project or research goals did not seem to dilute the richness of their discussions. But it did lead to a different set of answers. They each looked at their emerging institutional-level impact strategies in relation to an earlier exercise that had assessed their capacity in policy engagement and communications. Areas they needed to invest in at an institutional level – whether social media, publications or knowledge management skills – quickly emerged, as did key relationships, networks and knowledge of policy processes they needed to grow.

See the Center for Study of Science, Technology and Policy from India take a dizzying six-second journey around this strategic planning process: https://vine.co/v/MZzY3xhLrz6

Breaking down barriers

This experience also helped me to reflect on IDS’ approach to institutional level strategic planning. We too have been on this journey in trying to identify a wider set of engagement priorities. Take the Post2015 debate for instance. Here is a prime example of something that cuts across projects and programmes and research centres. By actively prioritising it in a cross institutional strategy and mapping out our strengths and weaknesses and the key areas of potential engagement, whether in the media, UN processes or the UK Government and Parliament, we have been able to add real value to the work of our project teams and their partners. Some of these groups are explicitly focused on this debate, such as Participate. Others find this framing essential, using it to push their research up the agendas of key policy audiences. We have been able to create a more enabling environment for their work by actively identifying key influencing and engagement opportunities (and challenges), building relevant networks and alliances and prioritising the timely profiling and intelligent framing of their outputs.

This process has also led to a great deal of cross-organisational collaboration, breaking down the barriers between research teams, projects and multi-sited research centres. So, whilst all our engagement and communications activities remain entirely based on our research (there is no retrofitting of evidence to advocacy objectives here), we are not wholly driven by the ubiquitous project logframe, which cannot always facilitate the type of policy entrepreneurship needed to engage effectively at a national or international level.

There is a wealth of academic papers, blogs, donor guides and other materials on effective research communications and the incorporation of impact strategies into projects. However, there is far less about cross-institutional approaches. Some commentators claim that cross-institutional strategies focused on policy outcomes are simply too broad, but is it time to challenge this? I would love to hear from those who have experience in this area. We need to share our learning and explore ways that researchers and communications professionals can work together to build a strategic framework at an institutional level, to support those committed to making sure their research makes a difference.

James Georgalakis is the Head of Communications at IDS
Follow James @ www.twitter.com/bloggs74

Other posts by James Georgalakis on research communications:

The Guardian
Has Twitter killed the media star?
Marketing: still the dirty word of development?

On Think Tanks
Is it wrong to herald the death of the institutional website?
How can we make research communications stickier? 

Impact and Learning 
Digital repositories – reaching the parts other websites cannot reach

Tuesday 18 February 2014

Open knowledge spells murky waters for M&E

By Ruth Goodman

In mid-January I ran a session on monitoring and evaluation (M&E) at the Eldis Open Knowledge Hub Partnerships Meeting. The meeting brought together a group of individuals united by a concern with opening up access to research evidence and, particularly, with increasing the visibility of research from developing countries.

The partnerships meeting was undertaken as part of the Global Open Knowledge Hub (GOKH) – a three-year, DFID-funded project. The vision for GOKH is that IDS and partners will build on their existing information services to create an open data architecture for the exchange and sharing of research evidence – the so-called Hub. For insight into the issues that need to be addressed in trying to set up an open knowledge hub, see Radhika Menon’s recent blog The Global Open Knowledge Hub: building a dream machine-readable world.

Our hope is that, through the open data approach, the partners and third-party users of the Hub will be in a position to extract and re-purpose information about research evidence that is relevant and contextual to their audiences. This in turn will contribute to research content being more visible, thereby enabling otherwise unheard voices to contribute to global debate and decision-making. My session on M&E, then, was concerned with how we can know if this is being achieved.

M&E is great. It allows you to track and evidence what works and what doesn’t, so that you can learn, improve and grow. In order to reach this end, though, you need to know how to evaluate your work. When it comes to approaching M&E for the Hub, the waters are murky.
Open data approaches are still relatively new, and the body of evidence on M&E when working with open data – let alone the specifics of evaluating and learning from this sort of Hub model – is sparse. The traditional technical methods of tracking information on the internet fall over when you make the data open. By making data open you give up most, if not all, of the control over how your data is used, implemented and displayed. There are ways to implement tracking, but these are easily circumvented, so the statistics you can obtain do not reliably represent the whole picture. So, depending on how the content is implemented, if organisation A is consuming data that organisation B has contributed to the Hub, then the ‘hits’ may register on organisation A’s web statistics, not organisation B’s.

Even if and when we do identify the most suitable metric for measuring impact in open knowledge, as we discussed at the workshop, numbers aren’t really enough. Indeed, web metrics are unreliable at the best of times and their value lies in spotting trends in behaviour, not in demonstrating impact. To engage with quantitative data, people need to be clear on what that data is telling them. If open knowledge data is not the most exciting thing in the world for you, or is something you don’t quite understand, then numbers are likely to do little to inspire understanding of, or perceived value in, open data initiatives such as the Hub.

However, if you can tell a story about what the Hub has allowed users to do, then people have something real to engage with. Not only will they have a better understanding of the nature and value of your work, but they are more likely to be motivated to care. At the workshop we discussed the potential of collating stories of use as one approach to M&E that might allow us to translate the value and challenges of open knowledge work to a wider audience.

Other possibilities we discussed were around helping and supporting each other. If partner organisation A is featuring content from organisation B, delivered by the Hub, then potentially A could tell B how many hits it is getting for that content. If doing some M&E of their own, could partner A even add a couple of questions about partner B’s data to their user survey? And what about the experiences and perceptions of those partners using the Hub? Partner organisations’ own reflections and observations are as important as those of users in gaining a full understanding of the value and potential of the initiative.

Moving forward, our aim is to convene an M&E working group which, among other things, could serve as a community of good practice where we can be open with each other about our evaluation efforts. By sharing our experiences of different M&E approaches and their challenges, we can work toward a position where we know the influence of this work, can translate it to others in a comprehensible way, and can start to identify what we need to do to realise the potential of this exciting new arena.


Ruth Goodman is Monitoring, Evaluation and Learning Officer at the Institute of Development Studies

Thursday 6 February 2014

The Global Open Knowledge Hub: building a dream machine-readable world

by Radhika Menon

The word ‘open’ has long been bandied about in development circles. We have benefited in recent years from advocacy to increase open access to research articles, and from open data shared by researchers or organisations. But open systems that enable websites to talk to each other (e.g. open application programming interfaces) have been a little harder to advance into greater use, simply because they are not built for non-technical users.

The International Initiative for Impact Evaluation (3ie) recently joined eight other partners in the new, DFID-funded and Institute of Development Studies-led Global Open Knowledge Hub project to discuss several issues related to open systems. It was no surprise that all the partners spent quite a bit of time coming to their own understanding of an ‘open system’ and an ‘open hub’.

Put simply, the Global Open Knowledge Hub project will build an open system for sharing data between participating partners and with the wider world. As each of the participating partners offers knowledge services, there are thousands of research documents, articles and abstracts on our websites. To facilitate the sharing of these knowledge products, an open, web-based architecture will be built so that we can all just go to one place, i.e. the hub, and find high-quality, diverse and relevant content on any chosen topic that is available from the partners.

To understand how the sharing works, step out of the human-readable world and into the machine-readable world. If a machine can be programmed to search and read through the data, then the amount of data that can be processed starts to boggle the mind. The hub is a place where huge amounts of data in machine-readable formats can be queried, accessed, used and combined with other data. If you are interested in climate change, one of the topics on which the hub project will focus, a huge amount of the research that exists on climate change – spread across continents, disciplines and sectors – can be accessed in a matter of seconds. The sheer scale of it is awe-inspiring. Think tons and tons of data, woven together in a kind of semantic web. This is what the web 3.0 world will look like.
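To make the ‘machine-readable’ idea a little more concrete, here is a minimal sketch of what a single research record might look like as structured data. The field names and values are invented for illustration; they are not the GOKH’s actual schema.

```python
import json

# Hypothetical machine-readable record for one research output (illustrative only).
record = {
    "title": "Community responses to climate variability",
    "abstract": "A short summary of the research findings...",
    "themes": ["climate change"],
    "country_focus": ["Bangladesh"],
    "publisher": "Example research partner",
    "licence": "CC-BY",
    "url": "https://example.org/research/1234",
}

# Serialised like this, thousands of records can be queried, combined and re-used by software.
print(json.dumps(record, indent=2))
```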

All of this might sound like a grand vision. And as partners involved in pioneering work, we are aware that we need to get several things right for this vision to be realised:

Understand demand

As Edwards and Davies say in this paper, the current understanding of open data is primarily from the supply-side perspective. It’s not enough to just put out large quantities of data; we also need to get a better sense of the demand for the data. Who are our potential users? What kind of data would they need? What will they use it for? These are questions that need some serious investigation.

The IDS Knowledge Services Open Application Programming Interface (API) is an example of a successful open system in the development sector. It provides open access to tens of thousands of development research documents in its repository. According to Duncan Edwards, IT Innovations Manager at IDS, there is good demand for the IDS open API from both Northern and Southern development organisations.

But the data has been accessed primarily via several applications – a mobile web application, a regional document visualisation application and a tag cloud generator. These have been built to make the data accessible to non-technical users. So, we need more of this to happen to make the data in the hub more user-friendly and to spur demand.
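As a rough illustration of what such an application does under the hood, here is a sketch of querying an open research API. The endpoint, parameters and response fields are hypothetical; this is not the actual IDS open API specification.

```python
import requests

# Hypothetical endpoint and parameters - purely illustrative, not the real IDS open API.
API_URL = "https://api.example.org/openapi/documents"

response = requests.get(API_URL, params={"theme": "climate change", "num_results": 10})
response.raise_for_status()

# Turn raw machine-readable metadata into something a reader can scan.
for doc in response.json().get("results", []):
    print(doc.get("title"), "-", doc.get("url"))
```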

Get IT and content providers to work together 

These open systems are not made for the ‘non-techie’ average user. When I first looked at the open API of a website, the programming language that came up on the screen did not make any sense to me. But there is clearly a lot the system can throw up to generate useful content for those with the technical skills to use it. For this to happen, researchers and communicators would have to work alongside a technical team and play a more active role in the curation of data. This is the only way the potential of the system can be fully explored.

Map taxonomies

Research is often labelled according to the needs and interests of its user. So the same piece of research may be tagged as agriculture development, rural development or farmers. In the machine-readable world, this becomes a crucial difference that prevents data on the same themes from linking to each other. The taxonomies we use to describe information change depending on the organisation, sector and country.

So for the hub, we need a system for classifying data that maps these different vocabularies, to ensure that data on the same theme and topic can find each other and hang together.
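A crude way to picture such a system is a mapping from each organisation’s own labels to a shared theme. The labels below come from the example above; the shared vocabulary term is invented for illustration.

```python
# Map each partner's own tags onto a shared theme so records on the same topic link up.
# The shared term "agriculture-and-rural-development" is invented for illustration.
TAXONOMY_MAP = {
    "agriculture development": "agriculture-and-rural-development",
    "rural development":       "agriculture-and-rural-development",
    "farmers":                 "agriculture-and-rural-development",
}

def normalise_tags(tags):
    """Translate organisation-specific tags into the shared vocabulary."""
    return sorted({TAXONOMY_MAP.get(tag.lower(), tag.lower()) for tag in tags})

print(normalise_tags(["Rural development", "Farmers"]))
# -> ['agriculture-and-rural-development']
```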

Work out branding, attribution, licensing and copyright

How open can we be about sharing content? When our content gets used in some way, e.g. featured on another website, will credit be given to the knowledge producer? If the knowledge service involves producing summaries and abstracts of research articles, then it is important to clarify with the original research producer how they license others to re-use their content (e.g. Creative Commons).

Since research producers, knowledge service providers and funders often use web analytics as a metric for measuring success, organisations are often concerned that if their content is ‘open’, users may never visit their website and they will be denied access to these important metrics. We therefore need to explore new ways of tracking how ‘open’ content is used beyond our own websites, or we need to agree to share enough data so that users are directed to the originator’s website for full information.

The partners contributing to the Global Open Knowledge Hub are working through these issues. All the partners believe that development research has a crucial contribution to make to poverty reduction, but only if it is easily available and quickly accessible to users. So, what we are building together needs to become the prototype of what open systems should look like.

Radhika Menon is Senior Communication Officer at 3ie.

Friday 17 January 2014

So who did join the global conversation last year? And will it continue this year?

Lessons learnt from IDS’ success on Facebook, by Digital Communications Officer Robin Coleman

Like most research organisations and think tanks, the Institute of Development Studies (IDS) has invested heavily in our use of social media to engage people around our work, and in 2013 we experienced very rapid growth in Facebook fans. We recently produced this short film providing an overview of our year in social media.


There is no doubt that some people within the international development community and beyond will be surprised that IDS now has over 50,000 Facebook fans. This is double the number who follow the University of Sussex, where we are based, and slightly more than follow either the UK’s Department for International Development or the OECD. So who are our fans and what does this milestone really mean in terms of the impact of our research communications?

Let’s break down who our fans are by comparing end-of-year results.

Top 5 countries were:

2012 (% of all fans)        2013 (% of all fans)
UK           17.64%         India        9.87%
India         7.32%         USA          6.77%
USA           6.96%         UK           5.85%
Pakistan      3.61%         Pakistan     5.32%
Bangladesh    2.95%         Egypt        3.59%

Being a UK-based organisation we expected to see strong home-crowd support, but within a year our Indian fan base had doubled. Further analysis of engagement and geographic spread (i.e. where our fans are based) reveals growth from 35 to 144 countries, suggesting that we are developing a truly international audience.

Breakdown by Gender and Age


Looking at the fan base data we can see that the majority of fans are aged between 18 and 34. OK, not that unusual – it would be rare to see anything different, with the social media industry reporting the same trend. Since IDS is an academic institute offering MA and PhD courses to a target audience aged between 18 and 34, this fan base shows how well this social network is working for us. As well as the majority of fans being students or recent graduates, it is encouraging to see an older audience using Facebook. This presents an opportunity to impress and influence those in professional positions and to generate a viral effect through the sharing of our content.

From a gender perspective, industry reports suggest that Facebook users in developing countries are in the majority female. However, nearly all the age categories for IDS fans show that males actually tip the balance, which matches recent figures from developing countries like India.

Who’s actually interacting?
Having more fans than some of our counterparts is great to brag about, but the real objective should be engagement, i.e. how many people are actually listening and responding to our content. Luckily, Facebook Insights allows us to see the breakdown by country, city, gender and age using the measurements People Talking About the Page and Positive Feedback (previously known as PTAT, or People Talking About This). These two measurements combine the number of fans (and non-fans reached through sharing to friends) liking, commenting on and sharing our Facebook content, be it posts, photos, links shared or mentions of our page.

Here’s how it breaks down for IDS fans on Facebook:

There’s an obvious correlation between age and interaction/engagement (expressed here as PTAT). The youngest groups show a high percentage of interaction, then the trend line declines as the age groups get older, with the exception of the over-65s. Could this just be the novelty for those new to social media? By gender, our male fans are also liking, sharing or commenting on IDS posts more than our female fans.
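For readers curious about the kind of calculation sitting behind these engagement figures, here is a rough sketch. The interaction counts are made up, and real PTAT counts unique people rather than summing interactions, so this is a simplification of the metric described above.

```python
# Simplified, illustrative engagement-rate calculation in the spirit of PTAT.
# Note: real PTAT counts unique people; summing interactions is an approximation.
likes, comments, shares = 420, 85, 160   # hypothetical weekly interaction counts
fans = 50_000                            # approximate size of the IDS fan base

people_talking = likes + comments + shares
engagement_rate = people_talking / fans * 100
print(f"Roughly {engagement_rate:.2f}% of fans interacting this week")
```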

Rule of 90 – 9 – 1
Studies have shown that online communities share a common engagement statistic, and this same ratio still rings true in social media. Think of it as a Pareto principle for the internet. Defined on Wikipedia as the 1% rule, it breaks down an internet community as follows:
  • 90% are ‘lurkers’ – those who are happy to just view content and follow silently
  • 9% are ‘contributors’ – those who interact with the content owner, mention you and comment occasionally
  • 1% are ‘creators’ – hardcore fans who produce great content and are always interacting, but who could also be problematic if handled incorrectly
Here’s a funny example of the 1% which we’d all love to deal with.

Does this ratio hold for our Facebook page?
Referring to the ‘Who’s Talking With or About IDS’ graph, yes it does. The trend line ratio, combined across all ages and genders, comes to 10.95%. Using the 90-9-1 ratio, those contributing (9%) and creating (1%) content would equal 10%, so this golden rule has broadly been matched.
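Here is that back-of-the-envelope check spelled out, using the figures quoted above.

```python
# Comparing the observed active share of IDS fans with the 90-9-1 expectation.
observed_active_share = 10.95   # combined trend-line ratio across ages and genders (%)
expected_active_share = 9 + 1   # contributors (9%) plus creators (1%) under the 1% rule

print(f"Observed active share: {observed_active_share}%")
print(f"Expected under 90-9-1: {expected_active_share}%")
print(f"Difference: {observed_active_share - expected_active_share:+.2f} percentage points")
```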

2014: changes are happening
There are a couple of inter-related reasons why social media and in particular Facebook could be on the wane.

Within the blogosphere of those studying and specialising in social media, there has been speculation that networks like Facebook and Twitter are experiencing a ‘cultural lag’. In other words, the honeymoon period has ended and social media is no longer a novelty. With so much content being shared and such competition for users’ attention, there is only so much time and effort fans and followers will spend.

This cultural lag for the major social networks may become global and affect users beyond the early-adopting countries like the USA. Although in 2013 social media proved popular in countries like India, Bangladesh and Egypt, this was fuelled by the availability of smartphones and mobile technology. The future may also see new ways to communicate online, if we are to learn from MySpace’s demise or the popularity of native-language networks like Sina Weibo, China’s own micro-blogging service.

Facebook as a company is responding to this issue, and to feedback about brand pages muscling in with their marketing content. It knows that the nature of a social network is to retain the people who make it. Being social animals, we want to converse with our friends, not with companies and organisations. Unless page owners can connect with the beliefs and interests close to users’ hearts, they will be – and have been – getting ignored.

The solution to this, Facebook believes, is to make changes to its news feed algorithm so that individuals see more posts in their feed that have more interactions (i.e. posts that are liked, commented on and shared). This could mean page owners having to consider using advertising – fine if you have the resources to pay for the service. For those in research communications it will be a challenge, in terms of either better content writing or looking for extra funding.

Despite Facebook having an extremely accurate targeting tool in place, based on its users’ data (age, gender, geographic location and what we Like), page owners will need to see proven results that this advertising works.

So, the challenge of creating interesting content continues. Keep your 10% stimulated and learn from the spikes in traffic.