Showing posts with label Intermediaries.

Thursday, 19 September 2013

The 4 c's of Google Adwords – content, context, clicks and conversions

By Alan Stanley

I manage Eldis – an online platform providing free access to international development research and policy documents. We’re a global service with roughly 45% of our users in developing countries and a strong emphasis on highlighting research produced by the smaller research organisations and networks based in the so-called global “South”. We get about half a million visitors per year.

Like most other online knowledge platforms, Eldis relies heavily on Google as a source of traffic to our website (61% last year). To achieve this we rely on getting our links into the listings on search engine results pages that appear because of their relevance to users' search terms (referred to as natural or organic search).

Recently though, with the support of a small amount of funding from the Climate and Development Knowledge Network, we’ve been exploring the use of Google Adwords (pay-per-click adverts appearing prominently on Google results pages) to help us achieve some of our marketing and promotion objectives. This short article highlights some of what we learned from this process and links to a longer draft learning paper we’ve produced which describes the process we went through in more detail. We’re not experts - we started from pretty much zero knowledge and still have many unanswered questions. So my hope is that this short article might prompt others to share their experience from which we can all learn.

Jargon alert


Working in both international development and knowledge brokering requires a certain natural tolerance for (even a slight fondness of) jargon, but the world of pay-per-click advertising takes this to a whole new level. Working across a small team of three to five people, we discovered that it took a few weeks of meeting regularly before we could even have a straightforward conversation with each other about what we'd been doing! In the end we put together a basic glossary of terms to help us (see the learning paper for the full list).



Matching content to context is key


We experimented with Adwords campaigns promoting three different Eldis services, running similar campaigns in 18 different countries. What clearly generated the most clicks in the most cost-effective way was marketing our country-specific content (e.g. our Bangladesh Country Profile) to audiences in that country (Bangladesh).

This might seem obvious – Google users in Bangladesh interested in climate change are most likely to be interested in information about climate change in Bangladesh – but something called "quality score" also comes into play. In determining how prominently your ads will be displayed, and what you pay for that position, Google looks at how closely the text on the web page you are promoting matches the search terms you have chosen to target and the text of the ad that will be displayed in the search results. A strong match gives a higher quality score, which boosts the prominence and reduces the cost of your ad. Our country profiles clearly performed better in this regard.
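The interaction between bid and quality score is often summarised by the simplified, widely cited formula Ad Rank = max CPC bid × Quality Score (Google's real auction uses more signals than this). A toy sketch, with entirely hypothetical numbers, shows why a well-matched page can outrank a much higher bid:

```python
# Toy version of the classic AdWords ranking rule:
#   Ad Rank = max CPC bid x Quality Score
# This is a simplification for illustration; Google's actual auction
# incorporates additional signals.

def ad_rank(max_cpc_bid, quality_score):
    """Return a toy ad-rank value (higher = more prominent ad position)."""
    return max_cpc_bid * quality_score

# Two advertisers bidding on the same keyword (hypothetical figures):
generic_page = ad_rank(max_cpc_bid=1.00, quality_score=4)  # weak page match
matched_page = ad_rank(max_cpc_bid=0.60, quality_score=8)  # strong page match

# The well-matched page outranks a bid almost twice as high.
assert matched_page > generic_page
```

In other words, improving the match between keywords, ad text and landing page can buy prominence that extra budget alone cannot.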



Concentrate on conversions and not clicks


We began our Adwords campaigns with two broad objectives – firstly to boost traffic to our site (overall but also specifically from our priority countries) and secondly to increase the number of regular users (return visitors) using our services.

We soon found that generating large numbers of clicks was relatively straightforward and, with some tweaking of keywords, budget and how much we were willing to pay for each ad, it was possible to steadily reduce the cost-per-click and improve the cost efficiency of the campaigns.

Success! Well, no, because as we did this we were also looking at the behaviour of our new users when they arrived at our site, and found that the vast majority left again almost immediately and, worse, didn't appear to ever come back. In other words, we were paying to bring new users to our site and then disappoint them – hardly good value for money!

This led us to rethink our strategy. Firstly we refocused our campaigns to emphasise quality over quantity - to try to make sure the people that clicked on our ads were likely to be interested in what we were offering rather than just focusing on getting as many as possible within the limits of our budget. Secondly we focused on what we wanted the users to do on our site once they arrived – and re-worked the wording and presentation of our pages to reflect this.

We're sure we still have a long way to go with this. For example, one of our targets is to get new visitors to subscribe to our email newsletter (in the jargon this is known as a goal conversion). By adjusting our keywords and re-writing and re-organising our subscribe page we've managed to double the conversion rate. Success! Well, yes, but we're still only getting 6% of new visitors to subscribe (up from 3%!).
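The shift from counting clicks to counting conversions is ultimately simple arithmetic. A minimal sketch, where only the 3% and 6% conversion rates come from our experience and the spend and click figures are hypothetical:

```python
# Clicks-versus-conversions arithmetic. Only the 3% -> 6% conversion
# rates come from our campaigns; spend and click counts are hypothetical.

def cost_per_click(spend, clicks):
    """Average cost of bringing one visitor to the site."""
    return spend / clicks

def cost_per_conversion(spend, clicks, conversion_rate):
    """Average cost of one desired action (e.g. a newsletter subscription)."""
    return spend / (clicks * conversion_rate)

spend, clicks = 50.0, 500  # hypothetical campaign

print(cost_per_click(spend, clicks))             # 0.10 per click either way
print(cost_per_conversion(spend, clicks, 0.03))  # ~3.33 per subscriber before
print(cost_per_conversion(spend, clicks, 0.06))  # ~1.67 per subscriber after
```

The cost-per-click is identical in both scenarios; it is only when you measure cost-per-conversion that the improvement becomes visible, which is why optimising on clicks alone misled us.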

Google Adwords - useful but complex and time-consuming


We’ve found Google Adwords to be a useful tool but complex and time-consuming to use effectively. It’s particularly helped us to reach new audiences in countries where, without active partners or contacts, we would have struggled to use more conventional marketing approaches. It isn’t cheap – either in the cost of advertising or the level of staff time required – but we have found it to be broadly cost-effective compared to other approaches we might use. Adwords is highly geared towards the commercial world where success is measured in sales so for a non-profit operation just engaging with it has challenged us to think very differently about our whole approach to producing web-based services – from content to target audiences. That thinking in itself has been valuable and I’m pretty sure we run a better service now as a result.

* Read more about this experience in the draft IDS Knowledge Services learning paper "Learning from Google AdWords Marketing" by Viivi Erkkilä, Fatema Rajabali and Alan Stanley. This blog was originally published on the Knowledge Brokers Forum.

Alan Stanley is a Senior Thematic Convenor at the Institute of Development Studies, and manages the Eldis programme and services. 

Monday, 25 February 2013

Research does not automatically generate knowledge - rethinking product and process

By Catherine Fisher

The overwhelming conclusion I have drawn from my involvement as a contributor to the IDS Bulletin, New Roles for Communication in Development?, is the need for research communication to place a greater focus on process rather than product. I draw this insight from both the process (what I've learned from being involved) and the product itself (what I've learned from some of the great articles in the Bulletin). I'm not arguing that there is no role for product, far from it, but that there is value in re-examining the relationship between the two.
Product v Process? Baking Czech bread
Image credit: Chmee2 (Own work) CC-BY 3.0


Overall I would argue that articles in the Bulletin collectively encouraged us to challenge our assumptions around three related areas:
  • What is research? 
  • What is knowledge?
  • How does change happen?  

And to explore  links between these areas: how does research generate knowledge, whose knowledge produces the research, how does research or even knowledge lead to change?  All speak to a greater focus on process. I explore a couple of examples below.

Research changes those involved 

Patta Scott-Villiers' article, This research does not influence policy, explores how participation in a research process brought change for both the researchers and the 'researchees', even if the resulting output didn't shape high-level policy processes. So the process generated more change than the product. But maybe the product (or its story as described in Patta's article) will inspire someone somewhere else to undergo a similar process, which might itself generate change...

Research doesn’t necessarily build knowledge

In our article, Kirsty, Louise and I explore the factors that shape whether actors engage with research and are willing and able to draw on it. In doing so, we challenge the idea that research, even if it is shared at the right time and packaged in the right way, will somehow automatically generate "knowledge" in target audiences. Research does not automatically generate knowledge. Knowledge emerges through a process of sense-making in which the "knower" is an active participant, not a passive recipient, and may deliberately or inadvertently choose to reject the intended message of the product. As Penelope Beynon et al's article illustrated, even carefully constructed outputs such as a policy brief, often seen as the silver bullet of research communication, can lead to the creation of different knowledge than was intended.

Growing importance of “process architects”

One implication of this greater focus on process is a greater role for intermediaries, knowledge brokers or innovation brokers within the broad spectrum of research communication. The focus on process over product helps us to see a greater role for these actors which is not just about turning research outputs into attractive products but about seeing research  and research-based products as part of the processes of knowledge creation and change, and supporting the design of these processes. These processes will often draw on products from multiple sources, current or historical, local or from far away. Often these processes themselves produce products which then go on to inform other processes. This speaks to some of the points I make in the Introduction, Is Development Research Communication Coming of Age? (PDF).  

The value and opportunity for exposing the process behind the product

One final reflection is that I doubt anyone will learn as much from the papers that appear in the journal as I have learned from writing them. The process of writing a paper with a colleague forced us both to re-examine our assumptions and beliefs. An early version of our paper included a box outlining the differences in our positions about research and knowledge, and how it contributes to change, which attempted to share some of the discussions we had. Yet journal articles require a compelling single narrative and an authoritative position, and do not allow for divergence and discussion. They also have a strict word count. So this was dropped. While journal articles play an important role in communicating ideas and indeed validating the rigour of those ideas, particularly those from more formal research processes, perhaps there is also space – particularly in the social sciences – for us to be more open in our workings and less authoritative in our positions. New media such as blogs like this, or even tweets, enable "work in progress / emerging thinking" products to be shared more widely, which can trigger thinking and knowledge generation processes for others.

I hope that both this blog series and the Bulletin itself have prompted you to think and re-examine your ideas, as they have done for me.

Catherine Fisher contributed to the Introduction in the IDS Bulletin, entitled Is Development Research Coming of Age? (PDF) and to the Bulletin article entitled Stimulating Demand for Research Evidence: what role for capacity building?. Catherine is International Capacity Building Co-ordinator at Amnesty International. She was formerly Capacity Support Coordinator with the Impact and Learning Team at IDS.

More blogs on the IDS Bulletin New Roles for Communication in Development?

Friday, 12 October 2012

Comparing research and oranges: what can we learn from value chain analysis?


By Elise Wach

A conversation with a colleague the other day about how we would communicate our research findings for a nutrition initiative struck me as remarkably similar to the conversations I held under orange trees in eastern Uganda about market research and value chain analysis a few years ago.

In Uganda, the government was promoting the cultivation of certain fruit trees based on studies that had shown which varieties were agriculturally viable.  Farmers transitioned their plots from cassava to orange trees on the assumption that there would be a market for their oranges once their trees started fruiting several years down the line. 

Obviously, to us value chain analysts, this was crazy – it was necessary to do some market research first to find out where there were opportunities for these fruits in the national, regional, or international markets, and then grow and prepare the right crops accordingly. 

What can we learn by applying value chain concepts to our research?
Image: statesymbolsusa.org
Our thinking was shaped by the countless instances of NGOs and donors promoting the production of something (whether oranges, soaps, water pumps, etc.) without doing their homework to find out if anyone might purchase them and under what conditions: whether there was an opportunity in the market for the product (e.g. will people buy the oranges to eat, or would a juicing company be interested in them?), whether the product could be improved to better meet consumer needs and preferences (e.g. are Navel oranges preferred over Valencia for juicing? What about for eating?), whether demand could be stimulated (e.g. can we promote orange juice as a healthy breakfast option to increase consumption?), etc. Without doing this research first, there is a significant risk that the oranges that farmers produce will not bring them the returns they hoped for.

So I wondered, is producing research first and then deciding how to communicate it afterwards the same as growing an orange and then deciding how and where it will be sold? 

We invest a substantial amount of time and resources into producing our research and for most of us, having our research reach other people is our primary concern.  

What does the value chain for research look like?

Our product, or 'oranges', is our research studies. Our 'market analysis' is our 'audience research'. Our 'marketing approach' is our 'research uptake strategy'. Our 'value chain analysis' is the research we do about 'evidence into policy' or 'knowledge into action'.

We work to strengthen the knowledge value chain. We build demand for our products by increasing the demand for research and evidence. We tailor our products to consumer needs by producing 3-page policy briefs for some and Working Papers for others. And we create or strengthen bridges between our producers and consumers (e.g. individuals such as knowledge intermediaries / knowledge brokers, or systems such as the policy support unit that IFPRI is supporting within the Ministry of Agriculture in Bangladesh). We understand that policy decisions are complex, just as markets have long been recognised as being complex (the outputs from value chain analysis, when done well, never look like actual chains, just as a theory of change never fits into log frame boxes).

Obviously, there are differences between research and oranges.  The shelf-life of research is clearly longer than the shelf-life of oranges, and research can be dusted off time and time again and used in a variety of ways, many of which we’re unable to anticipate.  But much of the impact of our research does rest on the timely communication of our findings.  While Andy Sumner’s research on the bottom billion will certainly facilitate a better historical understanding of poverty, I will venture to guess that he also hopes that this information will shape development policy so as to better tackle this issue. 

We do face many similar issues as our business-minded colleagues.  When is audience research necessary, and when does the ‘if we build it, they will come’ assumption apply?  Where is the line between research communication and advocacy?   How can we create demand and to what extent should we do so?  Do our ‘consumers’ have balanced information about the products available or did they only have access to the one that we produced (Catherine Fisher wrote an excellent blog about policy influence vs evidence informed policy)?  How much do we let the market dictate what we produce and how we produce it?   

Are there opportunities to apply lessons from our colleagues working in markets and value chains to our work on ‘evidence informed decision making’?  Should we be comparing research and oranges?

Elise Wach is a Consultant Evaluation & Learning Advisor with the Impact and Learning Team, at the Institute of Development Studies


Thursday, 17 May 2012

Reflections on the K* summit: beyond K-Star-wars?

By Catherine Fisher

It was only a matter of time before someone made the KStarWars joke at the K* Conference that took place at the end of April in Canada. I’m only sorry it wasn’t me!

However, the K* Conference was notable not for its battles, but for the sense of commonality that emerged among the participants and for the momentum for future action it generated. 


The K* summit aimed to connect practitioners working across the knowledge-policy-practice interfaces to advance K* theory and practice. Its aim was to span the different sectors and contexts and different terms under which this kind of work is undertaken, for example Knowledge Mobilisation (KMb), Knowledge Sharing (KS), Knowledge Transfer and Translation (KTT).  Hence K*:  an umbrella term that attempts to bypass terminology discussions. 

This blog post provides links to some of the great reporting from the event, acknowledges some of the critiques that the event raised and points to the next steps for K*.    
The opening presentation highlighted how K* is about supporting processes of exchange and engagement across knowledge-policy-practice interfaces, not the achievement of particular outcomes. It was great to hear this point made by John Lavis, who has something of a guru status in K* in health. Other important points were about learning about context and what that means, not just saying it's important!
Another great metaphor courtesy of Charles Dhewa. The importance of multiple knowledges, knowledge hierarchies and the role of K* actors in helping to facilitate interactions between those knowledges was a recurring theme, e.g. see the video of Laurens Klerkx talking about multiple knowledges and innovation brokers.
As David Phipps explains in this video, participants from Canada, Ghana and Argentina were able to find considerable commonalities in their work with communities. This transnational comparison may be familiar to those of us who work in international development but it was a first for many of the Canadian participants who are doing really interesting work, for example, in government ministries or communities. I think this points to a strength of the K* movement in connecting people that might not otherwise talk.
The conference illustrated the range and scope of K* work. For example, Jacquie Brown of the National Implementation Research Network, who helps communities to implement science, has learnt how this piece fits within the broader scope of K*. For me, seeing how different kinds of K* roles are played and how they intersect is important.

In this video, I share some of my reflections at the time: brokering in the Canadian context, including an example of brokering at the point of research commissioning; power dynamics in brokering; and the way the informing role of knowledge brokering gets a "bum rap" compared to more relational knowledge brokering work. I also get distracted by bangs, crashes and the emergence of breakfast!

Critiques and the importance of engaging with them

The conference has generated some robust critiques. For example, Enrique Mendizabal sparked a discussion on his blog, On Think Tanks, with a range of critiques including whether knowledge brokers are required, how knowledge is shared, and a critique of elitist professionalisation of this field. Scroll to the bottom of his blog post to read the responses, including mine. Meanwhile, Jaap Pels argued that the nature of the debate at K* was pretty basic knowledge-sharing stuff.

I think both of these critiques raise interesting points, but they constitute arguments for K*, not against it. K* recognises that knowledge work is changing and proliferating, and that there is considerable experience and understanding that is not shared across the different spaces in which the role is played. It aims to bring together bodies of expertise (for example, that which Jaap Pels points to) to raise the game of all practitioners. It will hopefully provide spaces for debate and engagement with the kinds of critiques that Enrique raises.

So what next for K*?


The conference generated a range of areas for further collaborative action, and plans for where the K* initiative goes from here.

Areas for further collaborative action included:
  • Understanding impact: a group agreed to share the data collection tools they are already using. I’ll be participating in this group, building on the work of the Knowledge Brokers Forum
  • K* in developing countries: a predominantly African group explored the particular dimensions of K* work in their contexts generating a number of action points
A group of participants gathered on Saturday to work out what comes next for K* as a whole. Consolidation of the K* Green Paper is considered an important next step – co-organiser Louise Shaxson will be leading this work. There are also ideas for developing a more formalised network, which will be led by UNU-INWEH in the first instance.

UNU, who have led this process so far, remain committed and aim to get the support of UNU governance. The World Bank has already provided financial support. Support from such international bodies is important as it will embed the international nature of this initiative, though it is not without its risks!


So to borrow again from Star Wars, the force is, for now, with K*. The scale and ambition of the initiative, together with some indications of funding and high-profile support, suggest it has a future. However, it faces both practical and fundamental challenges.

Practical challenges include maintaining ownership and momentum among the largely volunteer force taking it forward for now, identifying its niche, and building connections across such a fragmented field of practice.

More fundamental challenges lie in ensuring that it really can generate value that will improve knowledge-policy-practice interfaces, rather than providing a talking shop for elitist actors.   




Catherine Fisher is a member of the K* Conference International Advisory Committee.

Thursday, 26 April 2012

Policy influence or evidence-informed policy: what is the difference?

By Catherine Fisher

“We all want a culture of evidence informed policy making, don’t we?” asked Dr Ruth Nyokabi Musila from African Institute for Development Policy (AFIDEP) at the opening of her presentation at the International Conference on Evidence Informed Policy.

It was a commitment to this ideal that had united over 50 researchers from 4 continents, brought together in Ile Ife, Nigeria, earlier this year. I was attending under the auspices of the IDS Mobilising Knowledge for Development Programme (MK4D) and had been invited to present and chair a session.

Policy influence is not the same as evidence informed policy
 
Throughout the conference I was struck by a blurring between the (admittedly closely related) concepts of research having policy influence and evidence informed policy. The difference seems pretty obvious to me but I sometimes struggle to explain it.  

Let me try this…  
  • Effective research communication (which aims to influence policy) is indicated by change in policy/process/discourse based on the research findings you are communicating.

  • Effective evidence informed policy is demonstrated by a culture (systems, processes, attitudes and behaviours) that means people in decision making processes regularly engage with research from a wide range of sources when formulating, implementing and reviewing policy.

And to illustrate this difference, here are two examples from the conference:


Firstly, Kakaire Ayub Kirunda shares his learning on how to influence policy. He observed that "while members of parliament might be an ultimate target, they hardly have time and it is their clerks and assistants who do the lion's share of their research..."

He adds that, in a conversation with Ugandan MP Honourable Obua Denis Hamson, who also chairs the Science and Technology Committee of Parliament, about how he would want researchers to approach him with evidence, the MP suggested: "Probably the easiest way is to first give me a brief summary of your research findings. We can start from there."


Ah yes, the ubiquitous policy brief. IDS' Impact and Learning Team recently conducted some research around the effectiveness of these as a communication tool, but that is for another blog.

By contrast, an example of supporting evidence informed policy was brilliantly illustrated by Jorge Barreto. He described the creation of an "Evidence Centre" in Piripiri, a town in a poor region of Brazil. The Centre promoted the use of health evidence locally to improve municipal decision-making processes.

Over a beer the night before, Jorge had told me that infant mortality rates in Piripiri were far lower than in other similar towns, his colleague added “20 babies survive a year because of these local policies”.  

Jorge’s presentation concluded that “current efforts to improve local government’s capacity to use research evidence to define problems, find tested interventions, assessing the quality of global and local evidence and translating evidence to key stakeholders are worth continuing. This is our little contribution towards addressing the knowledge to action gap.” Not so little for those children who survive and their families.


I feel it is worth maintaining the distinction between policy influence and evidence informed policy as the activities you undertake to influence policy with research will be different to those you might undertake if you wish to bring about a culture of evidence informed policy.

Such as...Research communication versus knowledge brokering


Two areas of activity which seek to either influence policy and/or support evidence informed policy are research communication (sometimes referred to as research uptake) and knowledge brokering (sometimes referred to as knowledge mobilisation). These distinct activities also often get confused (see my earlier post Buzzing about brokers).

Working closely with IDS Knowledge Services, which is engaged in knowledge brokering activities, and the IDS Communications Team, which focuses on supporting IDS research, this is something we decided to explore in more depth at an Impact and Learning Team 'learning lab' – a reflective practice tool we've been using to create a space for shared learning.

Here are some notes from the lab, which focused on "desired outcomes": 

"Research Communication and Knowledge Brokering get confused because while they start from different places (one piece of evidence versus many pieces of evidence) they use similar methods and communication tools (e.g. policy briefs). However, they can be untangled again when you look at the outcomes they are trying to achieve:
  • Desired outcomes of ‘Research Communicators’ relate to a change in a specific/thematic policy or practice i.e. you know RC activities have succeeded if a specific policy decision is made
  • Desired outcomes of ‘Knowledge Brokers’ relate to a change in the information-seeking and decision-making behaviour of policy/practice actors i.e. you know KB activities have succeeded if decision makers consider a diverse range of evidence to inform their decisions

Importantly, power matters: in Research Communication, the relationship between the researcher (or research institution) and the decision maker makes a difference to whether the decision maker gets to hear about a specific piece of evidence (e.g. informal encounters, ‘Beer Buddies’), whereas knowledge brokers, such as the IDS Knowledge Services, can work to equalise that power imbalance for less powerful researchers (or research institutions) – for example, the British Library for Development Studies' work around improving access to research published in the global South."

I will explore how ‘politics’ comes to play on these two strands of research uptake activity in my next blog. Meanwhile, you can follow me on Twitter @CatherineF_IDS; I'm currently at the K* Conference in Hamilton, Canada.

Monday, 16 April 2012

Digital information on the move: the rise of the Tablet

By Simon Batchelor

Here in the UK the ‘new’ devices – smartphones, tablet PCs and iPads – are very evident. Just take a train, and you will see at least half the people on it staring at a screen. While we may think they are working on their emails, some of them are just playing games or watching films – nevertheless, digital information on the move is becoming ever easier.

So is this change in device use becoming common in the countries where we work?


Our recent study (yet to be officially launched and published), suggests that for the policy environment these devices are changing access to information.  When asking 368 policy actors in six countries (Bangladesh, Ethiopia, Ghana, India, Kenya, Nepal) what devices they have access to we get the responses as shown in the figure below.

Emerging findings from Impact and Learning Team (IDS) research, full report will be available on www.ids.ac.uk


The figure illustrates that 90% of respondents have a desktop computer either at home or in the office, and 88% have a laptop for use in either the office or home. What becomes interesting, however, is the growing use of tablet computers: tablet use among policy makers in the South stands at 12%.


So how does this compare with the UK scene? While we don’t have the UK figures (if anyone has them, please add a comment), the Pew internet survey for the USA (June 2011) suggests that use of tablets across the USA has risen to 8% over the last 12 months. This means average use among policy actors in the South is slightly higher, at 12%. Perhaps interestingly, but unsurprisingly, desktop and laptop use among policy actors is considerably higher than the USA general public average, which stands at 58% and 52% respectively.

Indeed, in September 2011 India's Economic Times carried a story announcing that the computer allowance for MPs had been raised from Rs 150,000 to Rs 200,000. The extra Rs 50,000 was specifically to obtain a tablet device such as an iPad or an Android-powered Samsung Galaxy. “Owning a tablet is mandatory for all MPs, officials said.” The article states that “over 125 members from the total 245 have already bought the tablets”.

And how about smartphones? The graph shows that 40% of respondents had smartphones. Of these, 8% had iPhones, 12% Blackberries and 31% ‘other’ smartphones – where smartphone meant they got their email over the phone and could surf the internet.

What does this increased use of mobile technology mean for Knowledge Intermediaries?  

The information ecosystem is changing.  Policy actors do indeed have access to the latest technology, and the proportion of early adopters among the policy actor subset is approximately the same as the averages of the general public in the USA.  While much intermediary work is digital, the debate continues as to whether it is the best pathway for getting research in front of the key people.

Our forthcoming report explores the behaviour of policy actors, but in terms of potential digital access the data confirms that policy actors increasingly have access to this medium, and we should not miss the opportunity to develop "apps" that engage these early adopters. A real-time example of the potential here is the IDS Knowledge Services Open API, which allows developers to create apps for Android tablets and smartphones that tap into the BRIDGE and Eldis research databases containing over 32,000 summaries and documents.
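To give a flavour of what building on such an API involves, here is a minimal sketch of consuming a JSON search service. The URL pattern, query parameters and field names below are illustrative assumptions, not the documented Open API contract; a developer should consult the IDS Knowledge Services API documentation for the real endpoints. The response is a hard-coded sample so the sketch runs without a network connection:

```python
# Hedged sketch of consuming a JSON search API such as the IDS Open API.
# The base URL, query parameters and response fields are ASSUMPTIONS for
# illustration only; check the official API documentation for the real
# contract. SAMPLE_RESPONSE stands in for a live HTTP response.
import json
from urllib.parse import urlencode

def build_search_url(base, query, num_results=10):
    """Compose a hypothetical search URL with a urlencoded query string."""
    return f"{base}?{urlencode({'q': query, 'num_results': num_results})}"

# Illustrative payload in the shape such an API might return:
SAMPLE_RESPONSE = json.dumps({
    "results": [
        {"title": "Climate change adaptation in Bangladesh", "site": "eldis"},
        {"title": "Gender and climate policy", "site": "bridge"},
    ]
})

def titles(raw_json):
    """Pull document titles out of a (sample) search response payload."""
    return [r["title"] for r in json.loads(raw_json)["results"]]

url = build_search_url("https://api.ids.ac.uk/search", "climate change")
print(url)
print(titles(SAMPLE_RESPONSE))
```

An app for a tablet or smartphone would make a real HTTP request where the sample payload sits, then render the returned summaries in a mobile-friendly interface.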

Monday, 19 March 2012

(Still) Seeking a cure for Portal Proliferation Syndrome

By Susie Page

As co-editor to a forthcoming issue of the IDS Bulletin (autumn 2012) which will be focusing on ‘research communications’ (all facets of it!), I am currently reading about some very exciting work around this.

During our Call for Submissions, Geoff Barnard, former Head of the Information Department here at IDS, and now Head of Knowledge Management at the Climate and Development Knowledge Network (CDKN), sent me a link to his blog post on Seeking a Cure for Portal Proliferation Syndrome.

Geoff aptly captures a dilemma that anyone working in research communication and knowledge brokering will be familiar with – the temptation to solve some of the challenges around research communication and uptake in development policymaking and practice by gathering all the relevant research into one super, sophisticated website. The underlying assumption is that if only people could access the research (at the click of a button), the rest would follow.

He obviously hit a nerve, as there was a stream of responses to his blog, including one from Catherine Fisher who also contributes to this blog, highlighting her work “Ten Portal Pitfalls” – I would urge you to read Geoff's blog and contribute to the debate. 

Can we 'scientifically' test for what works when it comes to research uptake?

Coming from a medical background, the words “cure” and “syndrome” had immediate resonance for me: in the medical world, we are acutely aware of persistent diseases and syndromes with millions of pounds spent on testing for cures (although take it from me the health sector struggles equally with the business of research uptake! See Lomas on this, for example).

But does similar testing occur in this sector – the one that wants to get good research results out of the lab and into development policy and practice? Is it even feasible to conceive of a scientific test for something as amorphous as “knowledge” and “evidence”?


Going back to 'the cure', we should perhaps be asking whether portals are a syndrome or a symptom. If they are a symptom, the problem could be that we think research is not being used in policymaking and practice because people don’t have access to it. Yet surely the very proliferation of portals in itself highlights that this isn’t the problem – after all, how will one more portal succeed where others have failed? What do we know about the successes and failures of portals? What do we actually know about the relationship between portals and research uptake?

With people still wracking their brains over measuring the impact of research, there is room for some robust ‘scientific’ testing of what is and isn’t effective for supporting research uptake, and of the place (or otherwise) of portals within this. We recently teamed up with 3ie to carry out an experiment on the effectiveness of the ubiquitous ‘policy brief’ (even more ubiquitous than portals, I would argue). The results are just beginning to come through. Watch this space – we will of course be sharing our findings with you!

Friday, 24 February 2012

"The future is already here – it’s just not very evenly distributed"

By Emilie Wilson

I overheard this quote the other day while out buying my morning coffee. Overhearing conversations like this is part of the joy of working on a university campus. The quote is attributed to William Gibson, “the ‘noir prophet’ of the cyberpunk subgenre of science fiction”, according to WikiQuote.

Setting aside the fact that I know nothing about cyberpunk science fiction, the quote sparked a train of thought which had begun that morning when I was looking at an excellent graphic published on the Guardian website entitled “How Africa Tweets”.

The map shows the top 20 countries based on a three-month analysis of geo-located Twitter traffic in Africa – no one will be surprised to see that the biggest tweeters are South Africa, Kenya, Nigeria, Egypt and Morocco. What surprised me was the presence in the ‘top twenty tweeters’ of countries like Sudan, Gabon or Angola, and the absence of others such as Zimbabwe, Uganda, Botswana or Senegal (countries I associate, perhaps wrongly, with having a decent technological infrastructure and a vibrant civil society).

With the West Africa Cable System, a 14,000-kilometre fibre optic submarine cable with a capacity of 5.12 terabits per second (Tbps), due to be operational in March this year, these figures are no doubt set to increase.

But what does this mean for development, and more specifically, for sharing, communicating and assimilating research which supports development?

ICT4Development – still the new kid on the block?


The development sector has a reputation for getting excited about technological innovation and hoping it will yield a quick fix to some of the most intractable problems relating to poverty and social injustice. At IDS, there is a whole Research Team devoted to analysing the interface between human development and technological progress. And ICTs have joined the fray alongside agriculture, biology, engineering and medicine – the 3rd International Conference on Mobile Communication Technology for Development is being held in Delhi next week, closely followed by the International Conference on Information and Communication Technologies and Development.
 
Does E. M. Rogers' analysis of innovation adoption apply in this context?
IDS Knowledge Services has a small team dedicated to exploiting the opportunities afforded by innovative technology, and especially its use by people in the South, to improve access to, and demand for, research. For example, the Open Application Programming Interface (API) project is designed to allow technical developers to access the datasets that sit behind the renowned development research and information services Eldis and BRIDGE. This access means they can pick and choose the data they want and need, and repurpose it by developing their own applications.

And it’s not just geeks playing with geeks. In a concrete example of how this can be used, BRIDGE has been working with the Uruguayan NGO ciedur to develop online resources that bring together Spanish-language resources from BRIDGE with other relevant materials, in a way that is relevant to the Latin American policy context.

How is this being done? The Latin American resources are collected and shared online using an open source content management system called Drupal. A Drupal plug-in (developed by One World South Asia) allows those managing the system to automatically pull in data they want from the Eldis/BRIDGE dataset and repurpose it for their own website and online services.
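In spirit, what the plug-in does can be sketched as a simple filter-and-remap over the records it pulls in: keep only the records relevant to the local audience, and rename the fields to whatever the local site expects. A minimal illustration (the record fields and target field names are invented for the example; the real dataset schema and the Drupal plug-in's internals will differ):

```python
# Hypothetical records in the general shape a feed from the
# Eldis/BRIDGE dataset might take.
records = [
    {"title": "Género y trabajo", "language": "es", "theme": "gender"},
    {"title": "Gender and work", "language": "en", "theme": "gender"},
    {"title": "Política fiscal", "language": "es", "theme": "economics"},
]

def repurpose(records, language, theme):
    """Select only the records relevant to a local audience and
    re-map them to the field names a local CMS might expect."""
    return [
        {"titulo": r["title"], "tema": r["theme"]}
        for r in records
        if r["language"] == language and r["theme"] == theme
    ]

print(repurpose(records, language="es", theme="gender"))
# [{'titulo': 'Género y trabajo', 'tema': 'gender'}]
```

The point of the design is that the editorial choice – which records, in which language, under which themes – sits with the local partner, not with the data provider.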

And this is good because....?
  • It challenges the ‘top-down’, ‘North-South’ direction of technological innovation and editorial decision-making – IDS Knowledge Services partners develop their own applications and select the data that is relevant and useful for their contexts
  • It is relatively cheap: no need for high-tech R&D laboratories or factories, just creative minds and internet access. And these are active in abundance, not just in Africa, but in Asia (East, South and West!) and Latin America
  • “Open” also avoids duplication and the needless funding of multiple portals and websites which all do the same thing: collect and disseminate research in the hope that access alone is what we need to get research into policy and practice.
Handy technological short-cuts enable us to focus on emerging areas of research communications and knowledge brokering such as stimulating demand for research, or supporting non-academics to be ‘evidence-literate’ (see work of Kirsty Newman at INASP). 

Some challenges for this ICT4Development model


Having been in this sector for quite a while (BRIDGE is 21 years old, Eldis is 16 years old), we are quite aware of some of the challenges to this paradigm:
  • Quality, trust and credibility: traditionally, “closed access” models for sharing research, such as peer-reviewed journals, ‘paid-for’ materials, such as books, or resources only available via a reputable institution, are supposed to guarantee all three. What happens when the credible is aggregated with the less credible? Can we maintain trust in the resources divorced from the branding and reputation of their original source? How is quality maintained once access is ‘opened’?
  • Ownership, power and access: fundamental issues around funding and publishing research are not really addressed by the ICT4D model (e.g. incentive systems in which researchers gain more credit for promotion by publishing in prestigious closed access journals); there is still a digital divide, even with increasing internet access (e.g. urban v rural access); and many free internet services are privately owned and developed (e.g. Google or Twitter), particularly by companies based in the global North: should we be worried about our dependence on them? See ITforChange’s excellent thinking and research in this area
  • Cost and financial sustainability: the open source model is one in which developers innovate ‘for free’, but then are more likely to be employed on the basis of their innovative contribution – but is this sustainable in the development research context? Who will pay for it?
We are still grappling with these challenges – are you?

Wednesday, 2 November 2011

Exploring the black box together: evaluating the impact of knowledge brokers

Cartoon by Sidney Harris (2007)
By Catherine Fisher

I love this cartoon! 

It seems to capture the idea of the "black box" that lies between the activities knowledge brokers and intermediaries undertake and the outcomes and impacts they seek to achieve. That’s not to say that they don’t achieve outcomes in the real world, rather that the pathways by which their work brings about change are difficult to unpack and evaluate.

The Knowledge Broker’s Forum (KBF) has started exploring this "black box" of how to evaluate the impact of knowledge brokers and intermediaries in an e-discussion running from 31 October until 9 November. I am (lightly) facilitating this discussion, along with Yaso Kunaratnam from IDS Knowledge Services.

If you would like to participate, you can sign up on the forum's website; it's open to anyone with an interest in this area.

Challenges in evaluating impact

We know there are a lot of challenges to evaluating the impact of knowledge brokering. Some stem from the processes (psychological, social and political) through which knowledge and information bring about change, the contested nature of the relationship between research and better development results, and the difficulty of identifying contribution to any changes in real-world contexts. This is particularly challenging for actors that seek to convene, facilitate and connect rather than persuade or influence.

As well as these quite high level challenges, there are the very practical issues around lack of time and resources to dedicate to effectively understanding impact. These challenges are explored in a background paper (PDF) I prepared as food for thought for those taking part in the e-discussion.

Being an e-discussion amongst 400+ knowledge brokers from all over the world, I am not sure yet where discussions will go, but I am hoping that it will shed some light on the following areas:

Breadth and depth of impact and outcomes  

How far do people go to identify the ultimate outcomes of knowledge brokering work? I feel we can certainly go beyond immediate impact (e.g. personal learning) to push towards what that resulted in; however, I wonder whether it is meaningful to start looking at human development and wellbeing indicators. It will be interesting to see how far others are going.

Understanding behaviour change

If knowledge brokering is about behaviour changes that ensure greater engagement with research evidence, how are people defining those behaviour changes, and how are they measuring them? Are we too easily impressed by stories of information use when these could in fact hide some very poor decision-making behaviours?

Opportunities for standardisation of approaches and data collection

If people have come up with ways of doing this, is there any appetite for standardising approaches to enable greater comparison of data between different knowledge brokering initiatives? This would help us build a greater understanding of the contribution of knowledge brokers beyond the scope of any one broker’s evaluation.

I’ll also be interested to explore and challenge some of my assumptions – in particular that building some kind of theory or map of change is an important starting point for defining and then seeking to evaluate impact. This has been discussed previously on this blog and is a hot topic at the moment.

Our discussion will face challenges – not least the huge variety of types of knowledge brokering and contexts in which it is undertaken may mean there is not enough common interest. But I am sure that there is a lot of experience in the group that can be brought to bear on these questions and, in 10 days time, we will have a better idea of what is known, who is keen to explore this further and and hopefully how we could move forward to develop our understanding in this area.