Fast, Open and Free? Putting a value on Open Development Knowledge.
- In the rush towards open connected development information systems, how do we measure the impact of bringing together open content for development?
- How do grassroots organizations contributing to open platforms sustain their work when they can’t demonstrate usage of their content?
- Where traditional methods of tracking information on the Internet fail to record usage, what new metrics can we use?
- Looking at the example of the Land Portal, and the work of the knowledge services team at the Institute of Development Studies, we would like to explore the issues in making content open: whether we need new measures to assess value, and whether we need new, engaging ways to explain the benefits of what we do to funders and providers of content.
Presenters:
Peter Mason, Institute of Development Studies
Alan Stanley, Institute of Development Studies
Laura Meggiolaro, Land Portal
Facilitators:
Simon Colmer, Institute of Development Studies
Ruth Goodman, Institute of Development Studies: r.goodman@ids.ac.uk
- Challenges of measuring impact and feeding back to funders/donors
- How can we see whether we are making a change?
- Writing case studies about how people are using the data, and what this can tell us about impact
- How do southern partners justify opening up their data and convince their donors that what they are doing is of value?
- The data value chain is long, so impact is sometimes very far down the line
- How can we develop a standard for communicating the results of activities?
- Challenges of opening up data sets: are there different metrics we should be using? Different ways of looking at impact?
Case studies:
- Eldis: Eldis opened up its data in 2009, which fits with our broad ethos. Lots of big organisations have done this, but smaller southern research organisations have not, which increases the digital divide if only the big guns are able to open up their data. So we went through a call process and brought in a range of southern partners to open up their data. But this was a challenge for southern partners, as it didn't fit with their traditional business model, i.e. "we want more hits/traffic to our website - if we can't show this, then what is the point?" These partners need to go back to their bosses and donors and explain the value of this open data approach.
- Land Portal: LP is an information portal on land issues, gathering data from a variety of partners and organising the pooled data in a way that will be useful to all of them (NGOs, researchers, activist groups). It recently launched an open data platform, making the data reusable for the first time. The traditional view is that "by sharing my data I will get more traffic to my site and my visibility is increased", but this doesn't translate to open data. It is difficult to report actual use, or the value added / the difference that being part of the portal makes for small organisations in the south. What is the added value for small southern organisations of pooling their data with others? What do we report to funders? So: how do we show the value of the portal overall, and how do we show the value to individual partners?
Key questions:
- How do we show the impact of an open data system?
- What are we measuring: the number of publications, or the use of those publications? What is a measure of success? And what is value? We need to educate funders - what we measure should reflect the change we are looking for
- Sustainability: we are building this new world of interlinked data, but how is this sustainable, especially when opening up non-government data?
- What if open data isn't the thing anymore?
What would the case studies look like? We need to look at them differently - data starting a conversation. How do we frame the case studies?
Measuring the impact of the investment in open data
Putting new ideas out in the open generates new collaboration and new innovation
There is a risk question - you don't know what will happen with your data/how it will be used
Measuring is incongruent with open data if the whole point is to say: here is some data, do what you want with it, it is open, you don't need to tell me.
Other ways of measuring: what about Twitter, Google etc.? Is that where we should be looking to see impact?
Thinking of a torrent or a bit (??) for tracking [but is this like the image-beacon thing that people can strip out?]
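A minimal sketch of what that image-beacon idea looks like in practice (assuming a Python/Flask server purely for illustration - the /beacon.gif route, the "record" parameter and the log line are hypothetical, not drawn from Eldis or the Land Portal):

from flask import Flask, Response, request

app = Flask(__name__)

# Smallest valid 1x1 transparent GIF, served as the "beacon" image.
PIXEL = (b'GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff'
         b'!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01'
         b'\x00\x00\x02\x02D\x01\x00;')

@app.route('/beacon.gif')
def beacon():
    # Each fetch of the embedded image counts one "view" of a record,
    # even when the record is displayed on someone else's site.
    app.logger.info('beacon hit: record=%s referrer=%s',
                    request.args.get('record'), request.referrer)
    return Response(PIXEL, mimetype='image/gif')

And the catch raised in the brackets above still applies: anyone reusing the open content can simply strip out the <img src="https://example.org/beacon.gif?record=123"> tag, so the counts only ever cover unmodified copies.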
We can't put a barrier up to using the data, as then it would no longer be open; the alternative might be to ask people how they are using it.
Improving how people use metadata
We need to articulate that traditional tracking doesn't work and then look at different ways of illustrating value.
What is the thing to track? The investment?
The value of the thing is not just what it cost but how it gets used - and in the open data world it can be reused endlessly.
Isn't it ultimately about making someone's life better? For example, having access to a given data set makes Bob's life better, so the thing we should be reporting on is that Bob's life is better.
Ruth - But how do we find Bob?
Shouldn't openness become the norm? Then, rather than measuring extra things, it would just be a given.
Is the problem with analytics that we track things in the short term, when it can actually take years to have any impact?
But we still need to know that the money is spent well.
- Different ways of looking at case studies: we need to share examples of new forms of case study. If we can show great things are happening, and there is a story that captures the imagination of the public, then donors may be swayed
- How can we capture the M&E (monitoring and evaluation) of smaller organisations, collate evidence of change from different organisations, and build a more cohesive body of knowledge? But we need to remember that there is competition between organisations, so they may not be so keen on sharing
http://oerresearchhub.org/ - OER Research Hub gathers research on the impact of open educational resources (OER) on learning and teaching practices.
Things we can do
________________
1. Produce a coordinated "statement" to funders about why some metrics are meaningless -> the impacts are longer term
2. Number of people reached - is this a valid metric?
3. Sharing examples of new forms of case studies
4. Collecting and sharing evidence of change in a standardised way
5. Bring together tools which people are using