Last Tuesday, I spoke at The Economist's Ideas Economy conference, entitled "Information: Making Sense of the Deluge." In the session "The Promise and Perils of Open Government", I had the pleasure of sharing the stage with the dynamic Lt. Governor of the State of California, Gavin Newsom. While he was mayor of San Francisco, Gavin launched pioneering efforts like DataSF, which continues to provide budget, housing, map, crime, job, and other data in machine-readable formats. Four minutes of our 20-minute discussion can be found on the fora.tv site and below.
Talking with Gavin in the green room before our session, I heard in more detail about the challenges he faced and the outcomes he achieved while pushing the Open Government Data agenda in San Francisco. I like the Scorecard featured on DataSF, which shows which city agencies are contributing data (stimulating healthy competition, perhaps). During the conference session, expertly moderated by the Economist's Vijay Vaitheeswaran, Gavin and I aimed to introduce the concept, costs and benefits of open government data to a largely business-focused audience. I wish now that I had emphasized that opening data properly can help businesses as well.
This talk was timely, in that the Web Foundation is currently accelerating activities around Open Data (government and other data), led by world expert and our new program manager, José Manuel Alonso, and by global leaders and WF Directors Tim Berners-Lee and Nigel Shadbolt.
The rest of the conference was packed with fascinating dialogue. Don Tapscott conveyed a commercial open data experience: Rob McEwen, CEO of Goldcorp, bought a large plot of land hoping to find, not surprisingly, gold. After some time, McEwen made the extremely unconventional move of releasing all of the geologic data about the property and challenged the public to identify where to look for veins of gold, offering a $575,000 prize for the best ideas. The crowd's suggestions eventually led to plenty of gold, helping turn the $100 million company into a $9 billion company, not a bad ROI.
Other sessions that resonated with me included those from Martin Hilbert (on the history of "big data"), James Manyika (see McKinsey's recent Big Data report), Tim O'Reilly (including the question: how would human behavior change if almost everything were public?), and PepsiCo's Bonin Bough (on the continued convergence of advertising, social media, products that measure data, etc.). Several themes permeated many of the sessions: privacy, security, authenticity, and how to move from data to information to knowledge to wisdom and, finally, to more effective, profitable operations.
I felt that there was an under-appreciation of the importance of data standards to unlocking the value of data. After all, there were data and documents on the Internet before the Web was invented by Tim in 1989. It was the introduction and use of the Web standards HTML, HTTP and URLs that immensely increased the usefulness, accessibility, quality and inter-connectedness of information on the Internet. The appropriate use of data standards (XML, RDF, OWL, etc.) will do the same for data on the Web.
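To make the machine-readability point concrete, here is a minimal sketch in Python, using an entirely hypothetical DataSF-style CSV extract (the column names and figures are invented for illustration). When data is published in a standard, structured format rather than locked inside a PDF or a web page, a few lines of code can aggregate and reuse it:

```python
import csv
import io
from collections import Counter

# Hypothetical machine-readable extract, in the spirit of a DataSF
# crime dataset. The categories and counts here are made up.
RAW = """category,district,count
Burglary,Mission,12
Theft,Mission,30
Burglary,Sunset,5
Theft,Sunset,18
"""

def totals_by_category(raw_csv: str) -> dict:
    """Aggregate incident counts per category from a CSV export."""
    totals = Counter()
    for row in csv.DictReader(io.StringIO(raw_csv)):
        totals[row["category"]] += int(row["count"])
    return dict(totals)

print(totals_by_category(RAW))  # {'Burglary': 17, 'Theft': 48}
```

The same figures published as prose or as a scanned table would require manual re-keying or brittle scraping; the standard format is what makes the data composable with other datasets.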