Showing posts with label ResearchCommunication.

Wednesday, 23 October 2013

Digital repositories – reaching the parts other websites cannot reach

By James Georgalakis

If you write a blog about what’s trending on Twitter or the latest website design fad you normally get some good engagement. If you write about Open Access to research you can normally get a debate going.

But when you start writing, or talking for that matter, about institutional repositories, people's eyes tend to glaze over. This is unfortunate, because a key, perhaps essential, element of an innovative digital communications strategy that promotes Open Access to research is the use of a repository. Perhaps it is the name that puts people off, or perhaps people simply assume this is the sole preserve of librarians.

Whatever the blockage is, we need to get over it, and fast.

Many research funders have long expected repositories to host the outputs they have funded. Universities have long been well equipped in this area, but many members of the development research community, from think tanks to NGOs, are simply publishing their outputs on a traditional website.

This week, which is also Open Access Week 2013, IDS announced that it is digitising its entire back catalogue of almost 2,000 research reports, working papers, practice papers and other IDS Series Titles and publishing them on OpenDocs, its open access repository.

I will attempt to explain why I believe digital repositories are essential if you are serious about open access publishing and research uptake.

What is a digital repository, anyway?


Repositories are built on software that is international and interoperable, facilitating data exchange and re-use. In other words, they are highly compatible with other systems! The full text of each archived document is rapidly indexed by search engines and securely stored for the long term. In this way a repository like OpenDocs hugely increases the discoverability of IDS and our partners' research through search engines such as Google Scholar. This means more citations and, hopefully, more uptake and influence.
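To see what 'interoperable' means in practice: most repository platforms, DSpace included, expose their metadata through the OAI-PMH protocol so that other services can harvest records automatically. Below is a minimal, illustrative Python sketch of such a harvest; the endpoint URL is a placeholder rather than the real OpenDocs address, since the exact path varies by installation.

import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical OAI-PMH endpoint; DSpace instances typically expose one,
# but the real URL depends on the installation.
OAI_ENDPOINT = "https://repository.example.org/oai/request"
OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def list_records(endpoint, metadata_prefix="oai_dc"):
    """Fetch one page of records and return (title, identifier) pairs."""
    url = f"{endpoint}?verb=ListRecords&metadataPrefix={metadata_prefix}"
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    records = []
    for record in tree.iter(f"{OAI_NS}record"):
        title = record.find(f".//{DC_NS}title")
        identifier = record.find(f".//{DC_NS}identifier")
        records.append((
            title.text if title is not None else "(untitled)",
            identifier.text if identifier is not None else "",
        ))
    return records

if __name__ == "__main__":
    for title, identifier in list_records(OAI_ENDPOINT)[:10]:
        print(f"{title} -> {identifier}")

Any aggregator, indexer or partner site can run the same kind of harvest, which is how repository content ends up discoverable well beyond the repository's own pages.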

At IDS we have a good publications search area on our website, which is one of our most popular pages. What you may not notice is that most of the documents you download there are actually hosted on our repository. This is because OpenDocs adds value, securing additional hits from searches made outside the IDS site as well as those made within it. We also provide links to all our partners' and projects' websites, so that the research outputs they appear to host are in fact downloaded directly from OpenDocs, which is crucial for our monitoring systems.

Many institutions use repositories to profile special collections or archives. This can result in large numbers of downloads where topical themes or big-name academics are involved. Perhaps not surprisingly, the very early days of IDS' repository were marked by the launch of a Robert Chambers archive. Many repositories have built-in analytics so that you can view downloads of specific publications or whole collections. Again, due to the wider reach of repositories, this gives you a far fuller picture of usage than the Google Analytics reports you may have been producing on your own website's page views and downloads.

Of course, repositories vary hugely, and some can do quite different things from others. IDS uses a software package called DSpace, which represents a different approach to digital collections management from other popular systems such as Fedora.

IDS’ repository actually hosts two completely distinct collections:
  1. One is rather obviously the IDS Research Community  collection
  2. the other is the BLDS Digital Library of over 2,000 full-text publications from research organisations in Africa and Asia. 
Presently OpenDocs is mainly populated with text-only material but over time it may include datasets and multimedia content as well.

Warning: Digital repositories do not automatically meet all open access mandates


Open Access literature is digital, online, free of charge, and free of most copyright and licensing restrictions. Clearly repositories, with their ability to make research more widely available, should form a crucial part of any open access strategy.

However, just because publications and other outputs are freely downloadable from repositories, it does not mean they are free of all licensing restrictions. This means they may not meet the open access requirements of some funders. Every effort is now being made at IDS to ensure material in OpenDocs carries a Creative Commons Attribution licence.

So, if you are amongst those whose research and knowledge is not supported by a digital repository, you need to ask some hard questions. Does your institutional website offer the same benefits, and if not, what are you going to do about it? Failure to invest in this technology and promote its use across your networks may be undermining your potential reach.
Find out more about IDS' approach to open access publishing.
  
James Georgalakis is Head of Communications at the Institute of Development Studies. Follow James on Twitter



Thursday, 12 September 2013

#Hashtags, likes and maintaining followers and friends: what we have learnt using social media tools

By Fatema Rajabali

Social media has multiple recognised benefits: it not only enables the quick dissemination of information to a wide global audience, but it also encourages immediate feedback and engagement with other users. Many development-related institutions are present on these platforms and use them actively.

The Eldis Climate Change Resource Guide (CCRG) has been experimenting with social media tools in 2013, with the support of the Climate and Development Knowledge Network (CDKN). Viivi Erkkila captured our learning in a paper that we would like to share with the wider development community. Although the guide already has a wide global audience, a social media presence is considered a valuable addition for broadening the CCRG's global outreach and engaging directly with its users. Our social media work focused on:

•    Twitter (@EldisClimate)
•    Facebook
•    LinkedIn 

We had two primary objectives in experimenting with these social media tools:

1.    Increase the outreach of Eldis content to new audiences
2.    Engage directly with Eldis users to become more demand driven

So what did we learn from this pilot project?

Define your target audiences and research how and why they use social media

Even when using social media, it is important to define your target audiences: Who and where are they? How do they prefer to receive information? Which social media tools do they use, if any? This does require a little knowledge of how your user base seeks and accesses knowledge.

Social media platforms are not all alike and people use them for different purposes: make sure you adjust your updates according to each platform used. For example, we found that Facebook users respond more to visual content, so pairing posts with interesting images is important.

It is important to have adequate resources so that a sufficient level of activity takes place on the platforms being used, especially if you want to build a profile and following, which generally requires a reliable number of updates each day.

Define metrics for success

Define specific metrics for success: What does a certain number of followers or likes mean for you? Are you reaching your target audience? These metrics should not only include numbers of followers and likes, but also look at audience behaviour – for example, do they comment on or share posted content?

If one of your aims is to bring social media users and followers to your website, it is worth comparing how much time they spend on your website with the time spent by users who are directed to your content in other ways. Is social media contributing to an increase in return visits to your website?

Make sure you have a clear editorial policy

Define a clear editorial policy to ensure quality and consistency across platforms. If multiple people are using the accounts, make sure everyone knows these boundaries. Much social media content is opinion, and it is important to add a disclaimer if personal opinions are shared using an institutional social media account.

Social media is a two-way process: participating in discussions is more effective than simply disseminating your own material. Remember that your audience may be in different time zones.

Monitor activity and engagement

Monitor content posted by others and respond to comments in a  timely manner, because failure to reply may result in unfollows and/or unlikes.

Monitor current events and news to be able to offer relevant, engaging and well-timed contributions.
Set up M&E measures at the beginning and make sure you know how to use them. There are many online tools for tracking your posts and 'influence'.

Positive impact

We’re continuing to use and explore these mediums – the impacts to-date on the Eldis Climate Resource Guide has been positive. Since April 2013, we have had over 600 unique page views to various climate change resource guide resources via Twitter, Facebook and linkedIn – and we are hoping this number will keep growing as we interact and engage with new and current users via social media.

Fatema Rajabali is the Climate Change Convenor at the Institute of Development Studies. This is an adapted version of a blog originally published on Eldis Communities.

Wednesday, 21 August 2013

Has Twitter killed the media star? (or How I stopped worrying and learned to love social media)

By James Georgalakis

Back at the end of the last millennium my biggest preoccupation at work was how to secure more column inches for my employer.

As a press officer it was my job to feed carefully crafted press releases into the fax machine and get on the phone to pitch in the story. My success or failure in engaging the bored-sounding interns manning the phones of busy news desks with a new report or announcement determined my organisation's ability to set the news agenda, engage the attention of policymakers and raise its profile.

Ten years on, social media had firmly established itself as a channel through which you could tell your stories and engage key opinion formers. But this was a slow transition, and an organisation's ability to pitch stories into traditional media outlets remained of paramount importance.

However, last June, the balance seemed to tip.

Perhaps in other sectors and in other contexts this has occurred many times before but at least here at the Institute of Development Studies it felt like a significant moment.

IDS was busy launching the Hunger and Nutrition Commitment Index (HANCI). The index provided scores for donor countries on their commitment to reducing hunger and undernutrition in terms of their aid spending, policies and endorsement of international agreements. Our strategy had been to coincide the launch with the high-profile UK-hosted Nutrition for Growth summit to maximise media interest.

Results were mixed. Despite around twenty pieces of media coverage, many of the most highly valued outlets we had hoped would cover it ignored us. Save the Children and the other big NGOs, busy promoting their own stories for the hunger summit, had the likes of the BBC pretty much sewn up thanks to their enormous resources and access to media-friendly human interest stories and celebrities.
 
The tipping point: Social media versus traditional media 


But all was not lost. Our social media strategy was having a real and immediate impact in terms of engaging policymakers with the index.

We had carefully targeted key influencers on Twitter and key bloggers. Some of these were project partners; others were organisations we had identified through our stakeholder mapping as having a shared advocacy or research agenda. We shared advance links to our assets, including an infographic, a website and a short animated film, with these networks. We met with them, briefed them in nutrition network meetings or simply fired off an email a few days ahead of the launch asking if they could help. Once the launch was underway we also tweeted messages to some of our key followers.

The result of this strategy was that even if HANCI failed to make the grade in the newsroom, it was an instant hit on Twitter and in the blogosphere. Within hours of HANCI going live a shout went up from our Digital Communications Officer:

“Canada’s just tweeted HANCI!”

Sure enough, Julian Fantino, Canada's Minister of International Cooperation, had tweeted Canada's HANCI score.

This was followed up by the Canadian government's Nutrition Coordinator contacting IDS directly to get the full data. "The Dutch just tweeted," went another shout. Marcel Beukeboom, Head of Food Security and Financial Sector at the Dutch Ministry of Foreign Affairs, tweeted: "disappointing 16th place for Netherlands. We are more ambitious than that. Interested in indicators to learn how to improve." By the following day the index had received an official response from Irish Aid, with a press release quoting Ireland's Minister for Trade and Development Joe Costello welcoming the index.

In none of these cases could we find any obvious link to traditional media. There had been no Canadian press release and no Dutch or Irish media coverage.

Duncan Green’s influential blog and retweets from key development and nutrition influencers like DFID, Oxfam, ONE, Concern, Action Against Hunger, Save the Children and the IF campaign really stirred things up. The top five tweets and re-tweets alone received more than 134,000 potential impressions. The most popular content was the infographic swiftly followed by our animation which was watched a thousand times in just a few days.

Weeks later, the reverberations continue. Just in, the United Nations System Standing Committee on Nutrition has tweeted about the HANCI animation: "This video gave me goose bumps."

Targeted social media = real engagement

Am I heralding the death of traditional media and its usurpation by Twitter, Facebook and Google+?


Absolutely not, but this is still a watershed moment for me because it has provided such a blatant example of the primacy of social media in engaging relatively niche audiences with our research.

In this case Twitter was providing us with direct responses to HANCI from our target policymakers. This is exactly what we had set out to achieve. We want HANCI to be taken seriously by governments, and used in the advocacy of those that seek to hold them to account.

The newsroom intern and the Today Programme night editor are no longer the only means of getting government ministers and NGOs to respond to our research. This is more than just the blurring of boundaries between traditional media and social media. If we can be innovative enough we can bypass traditional media altogether.

Of course, we are still reliant on intermediaries, in the form of Twitter followers and bloggers and occasionally the media itself. Traditional media coverage can still be enormously impactful especially in relation to more mainstream or topical political issues. But much as marginalised citizens around the world have been able to harness social media to make their voices heard, research organisations can use non-traditional media to engage key audiences around their research without having to rely upon the overstretched editor or broadcaster.

IDS already has a digital communications strategy that places social media at its core. But this recent experience provides reassurance that we are moving in the right direction.  I will always look back fondly on those days of the busy press office and the excitement of a story really taking off but I am glad that social media provides another dimension to engaging key decision makers and influencers around our work. Social media may not have killed the media star yet but it has certainly dimmed it a little.

This blog was originally published on WonkComms - what is the future for think tank and research communications? James Georgalakis is Head of Communications at the Institute of Development Studies. 


Wednesday, 7 August 2013

Open data and increasing the impact of research? It's a piece of cake!

By Duncan Edwards

I talk to a lot of friends and colleagues who work in research, knowledge intermediary and development organisations about some of the open data work I've been doing in relation to research communications. Their usual response is "so it's about technology?" or "open data is about governance and transparency, right?". Well no, it's not just about technology, and it's broader than governance and transparency.

I believe that there is real potential for open data approaches in increasing the impact of research knowledge for poverty reduction and social justice. In this post I outline how I see Open Data fitting within a theory of change of how research knowledge can influence development.

Every year thousands of datasets, reports and articles are generated about development issues. Yet much of this knowledge is kept in ‘information silos’ and remains unreachable and underused by broader development actors. Material is either not available or difficult to find online. There can be upfront fees, concerns regarding intellectual property rights, fears that institutions/practitioners don’t have the knowhow, means or time to share, or political issues within an organisation that can mean this material is not used.

What is “Open data”? What is “Linked Open Data”? 

The Open Knowledge Foundation says “a piece of content or data is open if anyone is free to use, reuse, and redistribute it — subject only, at most, to the requirement to attribute and/or share-alike.”

The Wikipedia entry for Linked Data describes it as "a method of publishing structured data so that it can be interlinked and become more useful. It builds upon standard Web technologies such as HTTP and URIs, but rather than using them to serve web pages for human readers, it extends them to share information in a way that can be read automatically by computers. This enables data from different sources to be connected and queried…. the idea is very old and is closely related to concepts including database network models, citations between scholarly articles, and controlled headings in library catalogs."

So Linked Open Data can be described as Open Data which is published in a way that can be interlinked with other datasets. Think about two datasets with country categorisation – if you publish these as linked data, you can then link related content across the two datasets for any given country.
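To make the country example concrete, here is a small, hypothetical sketch using Python's rdflib library: two 'datasets' each describe a resource, and because both point to the same country URI, a single query pulls the related content together. All the URIs and data below are invented for illustration.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, RDF

EX = Namespace("http://example.org/schema/")          # made-up vocabulary
KENYA = URIRef("http://dbpedia.org/resource/Kenya")   # shared country identifier

g = Graph()

# Dataset 1: a research paper tagged with a country
paper = URIRef("http://example.org/papers/42")
g.add((paper, RDF.type, EX.ResearchOutput))
g.add((paper, DC.title, Literal("Drought resilience in East Africa")))
g.add((paper, EX.coversCountry, KENYA))

# Dataset 2: a statistical indicator tagged with the same country
indicator = URIRef("http://example.org/indicators/undernutrition-2013")
g.add((indicator, RDF.type, EX.Indicator))
g.add((indicator, DC.title, Literal("Prevalence of undernutrition, 2013")))
g.add((indicator, EX.coversCountry, KENYA))

# Because both records use the same country URI, one query links them.
query = """
    SELECT ?title WHERE {
        ?resource <http://example.org/schema/coversCountry>
                  <http://dbpedia.org/resource/Kenya> ;
                  <http://purl.org/dc/elements/1.1/title> ?title .
    }
"""
for row in g.query(query):
    print(row.title)

In practice the two datasets would be published separately by different organisations; the point is that a shared, resolvable identifier (here a DBpedia URI for Kenya) is what lets anyone join them up.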

For more definitions and discussion on data, see Tim Davies' post "Untangling the data debate: definitions and implications".


Why should Open Data be of interest to research producers? 

The way in which the internet and technology have evolved means that instead of simply producing a website from which people can consume your content, you can open up your content so that others can make use of it, and link to it, in new and exciting ways.

There are many theories of change which look to articulate how research evidence can affect development policy and practice. The Knowledge Services department at the Institute of Development Studies (IDS) works with a theory of change which views access to, and demand for, research knowledge, along with the capacity to engage effectively with it, as critical elements to research evidence uptake and use in relation to decision-making within development. Open Data has significant potential in relation to the ‘access to’ element of this theory of change.

Contextualisation and new spaces 

When we think about access to research knowledge, we should go beyond simply having access to a research document. Instead we must look at whether research knowledge is available in a suitable format and language, and whether it has been contextualised in a way which makes sense to an audience within a given environment.



I like to use a data cake metaphor developed by Mark Johnstone to illustrate this: if we consider research outputs to be the data, or ingredients, for the cake, then we organise, summarise and catalogue this (i.e. add metadata) to 'bake' our information cake. We then present this information in the way we feel is most useful and "palatable" to our intended audiences, with the intention that they will consume it and be able to make use of new knowledge. It's in this area that Open Data approaches can really increase the potential uptake of research – if you make your information and content open, it creates the possibility that other intermediaries can easily make use of this content to contextualise and present it to their own users in a way that is more likely to be consumed.

Essentially, by opening up datasets of research materials you can reduce duplication and allow people to reuse, repurpose and remix this content in many more spaces, thereby increasing the potential for research findings to be taken up and to influence change in the world.

While I see significant benefits in researchers making their outputs available and accessible in an open manner, we must also redress the dominance of knowledge generated in the global North. We need to continue to invest in strengthening intermediaries at local, national and international levels to make use of research material and Open Data to influence positive change.

Duncan Edwards is the ICT Innovations Manager at the Institute of Development Studies (IDS) - you can follow him on Twitter: @duncan_ids

NOTE: an admission on Open Access – the original article this post is based on, "Davies, T. and Edwards, D. (2012) 'Emerging Implications of Open and Linked Data for Knowledge Sharing in Development', IDS Bulletin 43 (5) 117-127", was published in the IDS Bulletin "New Roles for Communication in Development?", which, ironically considering its subject matter, is only partially open access (two free articles per issue). But you can access this article as green open access in OpenDocs: http://opendocs.ids.ac.uk/opendocs/handle/123456789/2247

Monday, 10 December 2012

Forget asking if policymakers understand evidence – do we understand policy?

By Emilie Wilson

I am setting aside my role as editor for this blog for a minute to share some reflections on a recent workshop I attended. It was called Beyond Communication: exploring approaches to research uptake; and was organised by the UK Collaborative on Development Sciences (UKCDS) and UK Department for International Development (DFID).

Is “Research uptake” more jargon, or a different way of understanding communication? 

The first time I came across the expression “research uptake” was in 2010, when the new UK coalition government was voted in, and a marketing and communication “freeze” across all UK departments was implemented.

I was new to my Communications job at the time, and would have been inclined to agree with my colleague Jeff Knezovich’s observation that “The term was specifically designed to obfuscate its purpose—ironic, given that the whole point was to help clarify and communicate research findings. This obfuscation was a result of a change of government in the UK, and a “communications” witch-hunt from the Conservative-led coalition, branding such activities as “wasteful Labour spending” in a time of austerity”

However, a quick Google search reveals that "research uptake" was around pre-2010, and a number of different sectors have used the term when grappling with the question 'how on earth do you get people to do something with the research?'. In health, for example, in the paper "Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science"; or in political economy, for example, The Politics of Trade (these examples were thrown up by the quick Google search, so I'm not holding them up as exemplary, merely indicative).

At the workshop, Kirsty Newman – Research Uptake Manager at DFID (and contributor to this blog) – described Research Uptake as “allowing us to take a holistic view” of this issue – here is a reproduction of the sketch I drew in my note-book to capture her description:



The assumption being that previously communicating research findings had focused on 'dissemination' or 'diffusion' – i.e. supply – and that research uptake was shifting the focus to 'demand': either 'what do they want?' or 'can we persuade them to want what we have?'. Or, more eloquently described by Jeff, as "stimulating an enabling environment among end users of research to commission and find appropriate information to support their own policy processes". Most people who work in marketing and communications will already be pretty familiar with the focus on stimulating demand, and I'm sure would be only too pleased to share their wealth of research in this area!

Evidence-based policy – one way to reduce poverty 

It seems we are still living in an Age of Reason (at least in the northern hemisphere), believing strongly that research validated by expert peers and based on tried and tested methodological approaches has something to offer to people in power. The offer is that research can demonstrate what works or doesn’t work, and it is not supposed to be tainted by money, politics or tribal affiliation – it’s a global public good, a credible source of authority. The assumption is that decisions based on this objective knowledge source will be of the greatest possible benefit, and not privilege a minority or prop up a flawed system.

So, should the 'demand' box in my diagram be renamed "creating demand" – i.e. is the process still supply-driven? Evidence (derived from research) – like vitamins, for example – is good for you (and your policymaking) and we need to develop increasingly sophisticated ways for people to engage with it. But do we know much about the demand other than through dialogue which merely engages our 'end users' with our research products and processes? Evidence literacy has been used as a term to describe the ability of policymakers and practitioners to understand and apply research evidence, but have we stopped to ask ourselves about our own literacy?

The Kirsty Newman policymaking quiz (PowerPoint)

Kirsty Newman did just that during the workshop, when she put the shoe on the other foot and asked those present some simple questions, mostly about the UK policy environment. Such as: what are the three principal functions of the UK Parliament? How would you define a "civil servant"? (this was a multiple-choice one). I'm ashamed to say that, even with a degree which includes a minor in politics, I only got 4/7 questions correct. And most people present scored less!

It may have been that people in the room would have been more familiar with other policy environments (and not specifically the UK's), but Kirsty's quiz did well to make the point: do you know the language of the people you are trying to communicate with?

Which is why I think we should consider sticking with the term “communication”, describing a two-way process (from the Latin verb to share) rather than “uptake”, implying a one-way process.

Emilie Wilson is Communications Officer at the Institute of Development Studies, and editor of the Impact and Learning blog. 

Thursday, 1 November 2012

Redefining the Researcher, and the Research

By Zachary Patterson

Tessa Lewin and I wrote an article entitled “Approaches to Development Research Communication” for the recent IDS Bulletin on New Roles for Communication in Development? We looked at the evolution of development research communications alongside the evolution of different development paradigms. The article points to a scattered and shifting communication field, which draws from a diverse range of development paradigms often in a way that is both incoherent and contradictory. Alongside the plurality of communication approaches, innovations in technology have contributed to changes in how researchers and practitioners both access and communicate research. However, despite the growth in platforms that allow open access to information, we remain limited by what many regard as anachronistic power structures.


Slide from Tessa Lewin's presentation at the Institute of Development Studies. The whole presentation can be viewed at: http://www.slideshare.net/idsuk/evidence-influence-agency-what-new-roles-for-communication-in-development-research

New technologies are transforming the field of development research communication and this process raises new ethical dilemmas. Nowhere is this more apparent than in the tensions around intellectual property. There is an inherent tension between the academic system, which relies on sole authorship and a profitable publication model, and the idea of development research serving the communal, public good. Much development research is practice and policy-based, further augmenting these complications. While development research positions itself as a public good that strives to be altruistic and value-free, the power dynamics of capital continue to disrupt the public availability of its findings. The current tug-of-war occurring within the arena of academic publishing can offer a useful lens to unmask and illuminate broader dynamics within the field.

Are new technologies helping Open Access campaigning gather momentum?

A confrontation over access to academic research has played out in the UK over the past few months, as academics continue to heavily criticise the publishers of scientific journals. Scientific and medical academic journals, which publish work largely funded by taxpayers, have charged UK universities around £200m annually for access in recent years. Supporters of what has become known as the 'academic spring' have argued that the findings of publicly funded research should be made openly available to academic institutions and the general public, for whatever purpose. Since the initial arguments of those involved in the 'academic spring', more than 12,000 academic researchers have signed a boycott of the Dutch publisher Elsevier, in an attempt to broaden the campaign against the pro-market model of academic research and publication. The campaign has since influenced the UK government to approve a plan to make all publicly funded scientific research immediately available for anyone to read. Meanwhile, the public availability of many academic publications that could aid the effectiveness and efficiency of development approaches and outcomes remains limited.

Paul Mason, Newsnight's economics editor, has argued that our current political landscape is shaped by a combination of innovations in contemporary communication technologies, shifts in global demographics, and the public realization of the power of networks over hierarchies.  In particular, technological developments have helped to consolidate local citizen awareness and action, while placing researchers in positions where they are doing more than collecting and sharing knowledge. However, while the new age of research communication technology has opened the way for unprecedented availability and distribution of knowledge, and the blurring of divisions  between academic, professional, and amateur researchers, the availability of development research remains varied.  Access to academic and scientific development research through ICTs and informative virtual spaces continues to be surrounded by conflict and tension in terms of ownership (or privatization) and openness.

In an article earlier this year for the London Review of Books, Slavoj Žižek suggests that the current communications and virtual space tug-of-war between global citizens and government and private interests began with an attempt by the powerful to 'privatize general intellect'. He warns that because the academic research and publication model is rooted in the capitalist market, there is a strong pull to continue the privatization of research – including that which serves the 'public good'. While the academic system struggles to survive in an increasingly market-driven environment, new communications approaches are creating more layered and complex options for accessing and sharing knowledge. Despite the hopes that we may have for information and communication technologies in development research communications, it is not yet clear how this current chapter of opportunities and challenges will play out.

Interested in reading more around this topic? Try..
The Internet and Democratic Citizenship by Stephen Coleman and Jay G. Blumler
Radical media: rebellious communication and social movements by John D.H. Downing

With many thanks to Tessa Lewin for her support in the writing of this blog post. 


More blogs on the IDS Bulletin New Roles for Communication in Development? 

  • Challenges in communicating co-constructed knowledge to influence policy (By Fran Seballos)
  • How are the roles of researchers and research communicators changing? (By Tessa Lewin)

Wednesday, 17 October 2012

    How are the roles of researchers and research communicators changing? Forthcoming blog series

    By Tessa Lewin

    What does validity mean in an environment where bloggers and journalists are often viewed as more credible, useful or accessible sources than researchers? How are the roles of researchers and research communicators changing? This landscape has been undergoing a significant shift in recent years.

    The emergence of new technologies has been accompanied by other shifts in the politics and business of development knowledge: the understanding of what constitutes ‘expert knowledge’, a growing emphasis on process over product in research, and new understandings of what drives social change and policy influence.

With the rise of participatory and co-constructed communications have come suggestions that the rigour and 'hard evidence' needed to influence policy have been neglected. As some have turned back to grassroots forms of communication such as community radio, they face ambivalence from others struggling to see what is new or innovative about such 'archaic' approaches.

Alongside colleagues Blane Harvey and Susie Page, I have written for and edited the latest edition of the IDS Bulletin, entitled New Roles for Communication in Development?

    We wanted to explore these interesting changes by drawing on the experiences of practitioners, theorists and community intermediaries from a wide range of disciplines.

We came from a range of disciplines and experiences ourselves: I'm Communications Manager for the Pathways of Women's Empowerment research programme, Blane is a Research Fellow in the Climate Change team at IDS and has recently been working on Climate Airwaves, a community radio project, while Susie Page was manager of the Impact and Learning team, focused on 'how communicating research brings about change'.

    The Bulletin's articles reflect the overlaps and disconnects within different fields (particularly on how new technologies, approaches and configurations of research communication are influencing the practice of development) and sit, at various points, in tension or consensus with one another. They reflect the unresolved nature of the politics and practice of research communication – and begin to map a complex picture of this arena.

    We outline our thinking on this in more detail in the Bulletin's Introduction: Is development research communication coming of age? (PDF)

    Over the next few months we will be inviting the contributors to this Bulletin to write a series of blog pieces, outlining and reflecting on their articles in the Bulletin.

    Watch this space….

Tessa Lewin is Research Officer in the Participation, Power and Social Change research team at the Institute of Development Studies. She's also Communications Manager for the Pathways of Women's Empowerment research programme consortium.

    Read the full blog series.. 

    Friday, 12 October 2012

    Comparing research and oranges: what can we learn from value chain analysis?


    By Elise Wach

    A conversation with a colleague the other day about how we would communicate our research findings for a nutrition initiative struck me as remarkably similar to the conversations I held under orange trees in eastern Uganda about market research and value chain analysis a few years ago.

    In Uganda, the government was promoting the cultivation of certain fruit trees based on studies that had shown which varieties were agriculturally viable.  Farmers transitioned their plots from cassava to orange trees on the assumption that there would be a market for their oranges once their trees started fruiting several years down the line. 

    Obviously, to us value chain analysts, this was crazy – it was necessary to do some market research first to find out where there were opportunities for these fruits in the national, regional, or international markets, and then grow and prepare the right crops accordingly. 

    What can we learn by applying value chain concepts to our research?
Our thinking was shaped by the countless instances of NGOs and donors promoting the production of something (whether oranges, soaps, water pumps, etc.) without doing their homework to find out if anyone might purchase them and under what conditions: whether there was an opportunity in the market for the product (e.g. will people buy the oranges to eat, or would a juicing company be interested in them?), whether the product could be improved to better meet consumer needs and preferences (e.g. are Navel oranges preferred over Valencia for juicing? What about for eating?), whether demand could be stimulated (e.g. can we promote orange juice as a healthy breakfast option to increase consumption?), and so on. Without doing this research first, there is a significant risk that the oranges that farmers produce will not bring them the returns they hoped for.

    So I wondered, is producing research first and then deciding how to communicate it afterwards the same as growing an orange and then deciding how and where it will be sold? 

    We invest a substantial amount of time and resources into producing our research and for most of us, having our research reach other people is our primary concern.  

    What does the value chain for research look like?

Our product, or 'oranges', is our research studies. Our 'market analysis' is our 'audience research'. Our 'marketing approach' is our 'research uptake strategy'. Our 'value chain analysis' is the research we do about 'evidence into policy' or 'knowledge into action'.

We work to strengthen the knowledge value chain. We build demand for our products by increasing the demand for research and evidence. We tailor our products to our consumers' needs by producing three-page policy briefs for some and working papers for others. And we create or strengthen bridges between our producers and consumers (e.g. individuals such as knowledge intermediaries and knowledge brokers, or systems such as the policy support unit that IFPRI is supporting within the Ministry of Agriculture in Bangladesh). We understand that policy decisions are complex, just as markets have long been recognised as being complex (the outputs from value chain analysis, when done well, never look like actual chains, just as a theory of change never fits into log frame boxes).

    Obviously, there are differences between research and oranges.  The shelf-life of research is clearly longer than the shelf-life of oranges, and research can be dusted off time and time again and used in a variety of ways, many of which we’re unable to anticipate.  But much of the impact of our research does rest on the timely communication of our findings.  While Andy Sumner’s research on the bottom billion will certainly facilitate a better historical understanding of poverty, I will venture to guess that he also hopes that this information will shape development policy so as to better tackle this issue. 

We face many of the same issues as our business-minded colleagues. When is audience research necessary, and when does the 'if we build it, they will come' assumption apply? Where is the line between research communication and advocacy? How can we create demand, and to what extent should we do so? Do our 'consumers' have balanced information about the products available, or did they only have access to the one that we produced (Catherine Fisher wrote an excellent blog about policy influence vs evidence-informed policy)? How much do we let the market dictate what we produce and how we produce it?

    Are there opportunities to apply lessons from our colleagues working in markets and value chains to our work on ‘evidence informed decision making’?  Should we be comparing research and oranges?

    Elise Wach is a Consultant Evaluation & Learning Advisor with the Impact and Learning Team, at the Institute of Development Studies


    Wednesday, 12 September 2012

    As good as gold? How and why to publish open access research

    By Rachel Playforth

     The scholarly publishing revolution that has been steadily building for the past decade may now have reached a tipping point - the UK Government has pledged that all publicly funded research will be open access by 2014; the World Bank, UNESCO and many other major international organisations and funding bodies are backing open access; and a new set of recommendations updating the original Budapest Open Access Initiative is due out this year. But the corresponding media interest in open access hasn’t necessarily increased understanding – we’re all talking about it but do we really know what it is, what it’s for, or how to do it?


    What is open access?
    The free and irrevocable availability of research outputs on the public internet, permitting any user to read, download, copy, distribute, print, search, or link to the full text of these outputs, without financial, legal, or technical barriers.


    What is not open access?
    Content that requires registration or is offered free for a limited period only. Formats that prevent downloading, saving, printing or copying. Arguably, content where text mining or indexing by web crawling tools is prevented.



    Why open access?
    Because removing access barriers will enrich and accelerate research. Because scholars in poorer institutions and poorer countries shouldn’t be excluded. Because publishers shouldn’t make huge profits from research, peer-reviewing and editing work done by academics for free. Because we shouldn’t have to pay twice for publicly funded (and potentially vital) research, once through our taxes and once through subscriptions and fees paid to commercial publishers of scholarly journals.

    Why else open access?
    Because many funding bodies, including the Wellcome Trust, RCUK and DFID, require it as a condition of funding. Even if you are half-hearted about the ideology, you may have to embrace the reality.


    Gold or green? 
There are two routes available to researchers who want (or need) to make their work open access, known as 'gold' and 'green'. The costs of publishing in peer-reviewed journals are currently met by the reader (probably via their library), through subscription charges and pay-per-view fees.

    Gold open access shifts the cost to the author, who pays (probably via their research funding or their institution) to publish in an open access journal. This was the approach most strongly recommended by the recent Finch report on expanding access to published research findings, and is the ultimate goal of the UK government. Based on the idea that full gold OA will eliminate the ‘paying twice’ problem with subscription journals, it’s been estimated that it could lead to whole system savings of around £80 million per year.

    The other route is known as green open access, represented by research repositories. The majority of commercial scholarly publishers allow some form of ‘self-archiving’ in subject or institutional repositories, usually but not always with an embargo period to protect their revenues for the first few months after publication. If all journals were open access, there would of course be no need for embargo periods, and arguably, no need for repositories. (The Finch report sees their role shifting more towards preserving/sharing research data and grey literature). But in the current transition period where the subscription model coexists with the OA model, repositories are working successfully with both.

    Repositories also offer advantages to researchers and institutions beyond open access policy compliance:
    1. Impact: research shows that open access articles tend to be more cited than comparable material behind paywalls 
    2. Discoverability: the protocols used by repository software are international and interoperable to facilitate data exchange and reuse, and the metadata standards mean the content is quickly indexed by Google and repository indexes. 
    3. Preservation: the repository can store copies of research for posterity in a way that is independent of the original format (which may become obsolete). 
    4. Reputation: a repository provides both an accurate record of, and shop window for, an institution’s (and an individual researcher’s) intellectual output. 
    5. Flexibility: repositories can contain all forms of work including peer-reviewed articles, book chapters, working papers, presentations, images, audio, and data. 
    But what about my intellectual property? 

    True open access is compatible with protecting copyright and intellectual property – the one restriction on reuse is that the work should be properly attributed. There are various Creative Commons licences that can help make this explicit. An author who retains their copyright and makes their work open access has more control over that work than if they had transferred the copyright or given exclusive rights to a publisher, as is standard in many publishing contracts.

    And my impact? 

    Many researchers worry about diluting the impact and credibility of their research by taking the open access route. The number of established open access journals is currently too small to rival the impact factors of the major subscription offerings, it’s true, but this will change as open access is mandated more widely. As for repositories, self-archiving a copy of your article does not necessarily have an adverse effect on citations of the published version. It will certainly increase the number of times it is read, and many repositories provide a DOI and specify that the published version should be cited.

    More information 
Find open access repositories on OpenDOAR
Find open access journals on DOAJ
Check journal self-archiving policies on SHERPA/RoMEO

    Rachel Playforth is Repository Coordinator at the British Library for Development Studies, based at IDS. To find out more about the IDS Repository, hosted by OpenDocs, contact Rachel.

    Tuesday, 28 August 2012

    Can a policy brief be an effective tool for policy influence?

    By Catherine Fisher


Popular wisdom suggests that busy policymakers don't want to read dense academic journal articles or books. Instead they want something short that summarises findings in accessible language and draws out the main implications. Consequently, policy briefs (short, jargon-free summaries of research findings) have become an increasingly popular tool for researchers trying to achieve policy influence. So popular that in a recent discussion on the Knowledge Brokers Forum, Nasreen Jesani commented "there is policy brief fever…. People feel like it is the silver bullet and so there is an upsurge of policy brief creation."

Yet anyone who has tried to produce one will know that policy briefs can be time-consuming and expensive to create and disseminate, and a study by the IDRC Think Tank Initiative (PDF) raised questions about their popularity with policymakers.


    So how effective are policy briefs at influencing beliefs and prompting people to act differently?

Penelope Beynon from the Impact and Learning Team joined forces with others at IDS, 3ie, and the Norwegian Agency for Development Cooperation (Norad) to try to find out. The team used a randomised controlled design to explore three research questions:
    • Do policy briefs influence readers?
    • Does the presence of an op-ed type commentary within the brief lead to more or less influence?
    • Does it matter if the commentary is assigned to a well known name in the field?
And it is on this last question that IDS Director Lawrence Haddad bravely stepped in to test what has been described as "the Haddad effect". Did having his name on the commentary make a difference?

    Full details of the study findings and methodology

    Download summary of study findings (PDF)

    Download full report (PDF)

    Key findings

    Funnily enough, the study did not find that policy briefs are a "silver bullet". However, the findings are striking and have implications for how communication experts design policy briefs and how we evaluate research communication. The study found that:

• The policy brief was more effective in creating 'evidence-accurate' beliefs amongst those with no prior opinion than among those who already held an opinion
• Messengers matter when it comes to readers' intended actions: the authority (Haddad) effect influenced the likelihood of taking certain actions, but not beliefs
• Gender and self-perceived levels of influence affect people's intention to act after reading the policy brief: women were less likely to report that they would act differently after reading the brief.

    The report offers some recommendations for those creating policy briefs. In particular they recommend including opinion and authority features as they may help to ensure briefs are shared and passed on. They also suggest that the startling difference in response to the brief between men and women in the study should be investigated further.

    The authors have acknowledged the methodological limitations of the study, not least that a policy brief is rarely consulted in isolation. However this study is a contribution towards understanding the effectiveness of different approaches to communicating research.

    It is important for those involved in trying to strengthen the connections between research and policy to think about the tools they are using and the change that they are trying to effect. As research uptake becomes increasingly important we need to invest more in understanding what works and how we can meaningfully test that.


This study is a contribution to that important debate – let us know what you think.

    Comments on this study:
    Is there a "Haddad" effect? Results from a randomised controlled trial Lawrence Haddad
Should think tanks write policy briefs? What an RCT can tell us Jeff Knezovich
    Summary of e-discussion on policy briefs and information needs of decision-makers Yaso Kunaratnam on KBF

    Wednesday, 30 May 2012

    Philosopher-craftsmen: interesting times for research communications professionals

Plato, the Greek philosopher (detail from Raphael's The School of Athens)
    By Emilie Wilson

Two exciting new publications have landed on my desk today:
(1) Knowledge, policy and power in international development: a practical guide
(2) the latest edition of the IDS Bulletin, Action research for development and social change.

    Knowledge, policy and power in international development: a practical guide, not a definitive model


    The first, a book by researchers at the Overseas Development Institute (ODI), aims to be a "practical guide to understanding how knowledge, policy and power interact to promote or prevent change". However, the authors are quick to put in a disclaimer:

    "...we acknowledge that, although some models provide useful analyses of some aspects of the interface between knowledge and policy, it is impossible to construct a single one size-fits-all template for understanding such a complex set of relationships".

    That is not to say the authors aren’t aiming high: "this book seeks to provide: 
    • a state-of-the-art overview of current thinking about knowledge, policy and power in international development 
    • present empirical case studies that provide concrete examples of how these issues play out in reality 
    • offer practical guidance on the implications of this knowledge base” 
    I’m looking forward to getting stuck in, and am particularly intrigued by their “Questions this section will help you to answer” approach to structuring some of the content. I’m also looking out for references to work by IDS Knowledge Services around knowledge intermediation (well, of course I am!).

    Action research for development and social change


    The second, edited by Danny Burns, who heads up the Participation, Power and Social Change team at IDS, is the latest edition of the IDS Bulletin.

    IDS Bulletins come in a variety of shapes and sizes – some very theoretical, others with more practical examples. This one appears to provide a nice balance of both, and has a stellar cast of leading lights at IDS on action research and participatory approaches.

Again, there is a disclaimer ("we have not sought to draw firm conclusions or a single 'theory of practice'"), but then a helpful identification of recurrent themes around which to hang your reflections as you read along: power and complex power relations, learning, and action.

Both these works, I think, demonstrate what an exciting time it is to be working in the realm of research uptake, weaving analysis into practice, and giving us communications professionals space to reflect on the impact of our work.

    I’m not a development practitioner, I’m a communications professional...


    In my early days at IDS, when I had more enthusiasm than experience, I remember a conversation with a colleague in which I referred to us as “development practitioners” and she responded “I am not a development practitioner, I am a librarian”. She’s quite right, in many ways – a librarian with a whole heap of experience in international development.

    I guess that description could apply to me too: a communications professional experienced in international development. Just as others are engineers, agronomists, doctors, project managers...experienced in international development.

That is, we should not forget, while we muse on power, complexity and social change, that we are also master craftsmen. Our understanding of communication, our craft, is based on an understanding of human behaviour. While it needs to be nuanced by people's culture, worldview, literacy and all manner of contextual factors, we remain craftsmen who understand what to look for and how to build it in different contexts. It provides us with a lens through which to see the world.

    Hopefully, with my bedside reading all set up now for the next month, the theory (and practical guidance) will percolate into my communications practice and I can aspire (grossly paraphrasing Plato) to being a ‘philosopher-communicator’...(albeit with less beard!)

    Thursday, 26 April 2012

    Policy influence or evidence-informed policy: what is the difference?

    By Catherine Fisher

    “We all want a culture of evidence informed policy making, don’t we?” asked Dr Ruth Nyokabi Musila from African Institute for Development Policy (AFIDEP) at the opening of her presentation at the International Conference on Evidence Informed Policy.

It was a commitment to this ideal that had united over 50 researchers from four continents, brought together in Ile Ife, Nigeria, earlier this year. I was attending under the auspices of the IDS Mobilising Knowledge for Development Programme (MK4D) and had been invited to present and chair a session.

    Policy influence is not the same as evidence informed policy
     
    Throughout the conference I was struck by a blurring between the (admittedly closely related) concepts of research having policy influence and evidence informed policy. The difference seems pretty obvious to me but I sometimes struggle to explain it.  

    Let me try this…  
    • Effective research communication (which aims to influence policy) is indicated by change in policy/process/discourse based on the research findings you are communicating.

• Effective evidence informed policy is demonstrated by a culture (systems, processes, attitudes and behaviours) that means that people in decision-making processes regularly engage with research from a wide range of sources when formulating, implementing and reviewing policy.

    And to illustrate this difference, here are two examples from the conference:


Firstly, Kakaire Ayub Kirunda shares his learning on how to influence policy. He observed that "while members of parliament might be an ultimate target, they hardly have time and it is their clerks and assistants who do the lion's share of their research..."

He adds that, in a conversation with the Ugandan MP the Honourable Obua Denis Hamson, who also chairs Parliament's Science and Technology Committee, about how he would want researchers to approach him with evidence, the MP suggested: "Probably the easiest way is to first give me a brief summary of your research findings. We can start from there."


    Ah yes, the ubiquitous policy brief. IDS' Impact and Learning Team recently conducted some research around the effectiveness of these as a communication tool, but that is for another blog.

By contrast, an example of supporting evidence informed policy was brilliantly illustrated by Jorge Barreto. He described the creation of an "Evidence Centre" in Piripiri, a town in a poor region of Brazil. The Centre promoted the use of health evidence locally to improve municipal decision-making processes.

Over a beer the night before, Jorge had told me that infant mortality rates in Piripiri were far lower than in other similar towns; his colleague added "20 babies survive a year because of these local policies".

    Jorge’s presentation concluded that “current efforts to improve local government’s capacity to use research evidence to define problems, find tested interventions, assessing the quality of global and local evidence and translating evidence to key stakeholders are worth continuing. This is our little contribution towards addressing the knowledge to action gap.” Not so little for those children who survive and their families.


    I feel it is worth maintaining the distinction between policy influence and evidence informed policy as the activities you undertake to influence policy with research will be different to those you might undertake if you wish to bring about a culture of evidence informed policy.

    Such as...Research communication versus knowledge brokering


    Two areas of activity which seek to either influence policy and/or support evidence informed policy are research communication (sometimes referred to as research uptake) and knowledge brokering (sometimes referred to as knowledge mobilisation). These distinct activities also often get confused (see my earlier post Buzzing about brokers).

    Working closely with IDS Knowledge Services, engaged in knowledge brokering activities, and the IDS Communications Team, focused on supporting IDS research, this is something we decided to explore in more depth at an Impact and Learning team ‘learning lab’, a reflective practice tool we’ve been using to create a space for shared learning.

    Here are some notes from the lab, which focused on "desired outcomes": 

    "Research Communication and Knowledge Brokering get confused because while they start from different places (one piece of evidence versus many pieces of evidence) they use similar methods and communication tools (e.g. policy briefs). However, they can be untangled again when you look at the outcomes they are trying to achieve:
    • Desired outcomes of ‘Research Communicators’ relate to a change in a specific/thematic policy or practice i.e. you know RC activities have succeeded if a specific policy decision is made
    • Desired outcome of ‘Knowledge Brokers’ relate to a change in the information-seeking and decision making behaviour of policy/practice actors i.e. you know KB activities have succeeded if decision makers consider a diverse range of evidence to inform their decisions

Importantly, power matters: in Research Communication, the relationship between the researcher (or research institution) and decision maker makes a difference to whether the decision maker gets to hear about a specific piece of evidence (e.g. informal encounters, 'Beer Buddies'), whereas knowledge brokers, such as the IDS Knowledge Services, can work to equalise that power imbalance for less powerful researchers (or research institutions). For example, the British Library for Development Studies' work around improving access to research published in the global South.

I will explore how 'politics' comes into play in these two strands of research uptake activity in my next blog. Meanwhile, you can follow me on Twitter @CatherineF_IDS; I'm currently at the K* Conference in Hamilton, Canada.