
User Feedback on the Open Data Index

Web Foundation · December 12, 2012

In September this year, we published a pilot version of the Open Data Index: a ranking of the 61 countries measured by the Web Index, focused only on indicators of their open data practices. We published the indicators and methodology alongside the Index.

There has been a lively debate about our methodology in the open data community, which we welcome. It will undoubtedly help us to improve the 2013 edition. All kinds of interesting issues have been raised and we’d like to respond to some of the most important.

By way of background, the Open Data Index is computed as a subset of the overall Web Index, a composite index covering many dimensions of the Web’s growth and social value. So, while a country may have scored poorly in one dimension, it could have scored well on several others, making its overall average higher than its worst score. At the same time, we can drill into the component questions to see how a country performs on each one separately. This approach allows for both an in-depth analysis and a summary of overall average performance, as the sketch below illustrates.
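To make the aggregation concrete, here is a minimal sketch of how averaging component scores behaves. The dimension names, scores, and equal weighting below are illustrative assumptions only, not the Web Index’s actual data or weighting scheme:

```python
# Illustrative sketch of composite-index aggregation. The dimensions,
# scores, and equal weighting are assumptions for demonstration only,
# not the actual Web Index data or methodology.

from statistics import mean

# Hypothetical component scores (0-100) for one country.
dimension_scores = {
    "readiness": 72.0,
    "web_use": 65.0,
    "impact": 80.0,
    "open_data": 35.0,  # weak in one dimension...
}

composite = mean(dimension_scores.values())

print(f"Composite score: {composite:.1f}")                        # 63.0
print(f"Worst dimension:  {min(dimension_scores.values()):.1f}")  # 35.0
# ...yet the overall average (63.0) sits well above the worst score
# (35.0), while drilling into dimension_scores still exposes the weak
# component.
```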

The Open Data Index is based on data collected through a wide-ranging expert assessment survey in each country. For every country, one person scored and one to three people checked and validated the scores, so each country was assessed by up to four people. The scoring is not just opinion-based: scorers were required to provide evidence to back their scores. This approach is not unique to our work; several respected organisations, including Global Integrity, the World Bank, and the Economist Intelligence Unit, use similar methods.

The questions asked are not binary (yes/no). Rather, they are questions of extent. For example, we do not ask “are there data on government spending?”. Instead, we ask “to what extent are there data on government spending?” on a scale of 1 to 10. We also provide guidance on what to consider in answering the question: for example, look for websites publishing certain data, and consider the data available on the fiscal accounts of the central government and the state governments. Clearly, the more data available, the higher the score.

An independent, expert statistical organisation in the EU applied rigorous statistical tests to the results of the Web Index and its component sub-Indexes. Their report (http://thewebindex.org/2012/09/EC-JRC-Assessment-of-Web-Index.pdf) shows that the Index and its results are robust overall, although four or five questions in the entire expert assessment survey appear to have been widely misunderstood, including one in the open data section.

The experts we engaged come from various fields, including academia, law, NGOs, and ICT, as well as some open data specialists. It is true that some of those who were not open data experts appear not to have fully understood the questions on open data, and this did lead to the weaknesses in scoring mentioned above. Ideally, we would have an economist for each country to assess the Web’s economic impact, a political scientist for the political impact, an open data specialist for the open data component, and so on; but our resources do not permit that.

To mitigate this, we are investigating how to provide clearer definitions and guidance on the open data questions for next year’s survey, and we would appreciate suggestions. In addition, our experience underlines a point that others have made in other contexts: we in the open data community face an important challenge in better communicating what open data means to those outside our community, who tend to assume it’s all about data availability.

We also found that assessing the quality of the data released presents further complications (for example, how do you evaluate the quality of the fiscal data released by governments if you don’t have expertise in public finance?).

Finally, the usual tension between de jure commitments (making an announcement, preparing an action plan, ratifying a UN convention) and de facto implementation of those commitments reared its head as we tried to measure progress on open data. While we tried to assess both commitment and implementation, we recognise this approach may need to be further refined.

We are still working on solutions to all of these challenges, and we hope that we will make good progress in this regard for the 2013 Index. We certainly welcome further dialogue and suggestions.

Our decision to allow scorers to remain anonymous has also been criticised. In some of the countries we surveyed, reviewers might be tempted to soften their scoring on some issues if they knew their participation would be made public; in a few countries, they might even face negative professional or personal consequences. Offering the option of anonymity therefore seemed ethically and methodologically important to us. Although we did encourage all of the people who participated to allow us to publish their names, we respected their wishes if they preferred to remain anonymous. We continue to discuss how best to balance confidentiality with transparency.

As we work to expand and strengthen the Web Index and the Open Data Index, we welcome your critical feedback every step of the way. Our aim with the Web Index is to provide a resource that others will value and use, so your views on how to make it better are incredibly important to us. Thank you for your feedback.
