Welcome to the Spatial Reserves blog.
The GIS Guide to Public Domain Data was written to provide GIS practitioners and instructors with the essential skills to find, acquire, format, and analyze public domain spatial data. Some of the themes discussed in the book include open data access and spatial law, the importance of metadata, the fee vs. free debate, data and national security, the efficacy of spatial data infrastructures, the impact of cloud computing, and the emergence of the GIS-as-a-Service (GaaS) business model. Recent technological innovations have radically altered how both data users and data providers work with spatial information to help address a diverse range of social, economic, and environmental issues.
This blog was established to follow up on some of these themes, promote a discussion of the issues raised, and host a copy of the exercises that accompany the book. This storyboard provides a brief description of the exercises.
Just over a year ago we wrote about an OpenStreetMap project to support humanitarian aid with open UAV imagery following the destruction caused by Typhoon Haiyan in the Philippines in November 2013. Although there were some issues in coordinating the data collection, the benefits of having access to a managed resource of openly accessible aerial imagery were obvious.
One year on, the Humanitarian OpenStreetMap Team (HOT) has established the OpenAerialMap (OAM) project to host and share aerial imagery from a variety of sources, including ‘traditional and nano satellites, manned and unmanned aircraft, mapping drones, balloons and kites’. The project will not only provide access to the imagery but will also make the management software available for download, to help support local access to the imagery.
Although the project is still in its early stages, one of the first objectives is to establish an imagery catalog to facilitate search and display. One of the major problems in the Philippines was identifying which individual or agency had collected the imagery that humanitarian relief teams needed. The next priority will be to create the map engine, or OAM server, to make the imagery available as a web service.
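To make the catalog idea concrete, here is a minimal sketch of the kind of lookup such a catalog enables: given records of who collected what imagery and where, find every footprint that overlaps an area of interest. All record fields, provider names, and coordinates below are hypothetical illustrations, not OAM's actual schema or API.

```python
from dataclasses import dataclass

@dataclass
class ImageryRecord:
    # Hypothetical catalog entry: who collected the imagery, when, how, and where
    provider: str
    acquired: str   # ISO date, e.g. "2013-11-10"
    platform: str   # "uav", "satellite", "kite", ...
    bbox: tuple     # footprint as (min_lon, min_lat, max_lon, max_lat)

def bbox_intersects(a, b):
    """True if two (min_lon, min_lat, max_lon, max_lat) boxes overlap."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def search(catalog, area):
    """Return all records whose footprint overlaps the area of interest."""
    return [r for r in catalog if bbox_intersects(r.bbox, area)]

# Illustrative records only; providers and extents are made up
catalog = [
    ImageryRecord("NGO-A", "2013-11-10", "uav", (124.5, 10.8, 125.2, 11.4)),
    ImageryRecord("Agency-B", "2013-11-12", "satellite", (120.0, 14.0, 121.0, 15.0)),
]
area_of_interest = (124.9, 11.1, 125.1, 11.3)
hits = search(catalog, area_of_interest)
```

A real catalog would of course add acquisition time filters, resolution, and licensing metadata, and index the footprints spatially rather than scanning a list, but the core question it answers is exactly this one: whose imagery covers this place?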
One of my students recently shared something that I considered to be a thought-provoking analogy in the “fee vs. free” geospatial data debate that we included in our book and discuss on this blog. The debate, in sum, revolves around the issue, “Should government data providers charge a fee for their geospatial data, or should they provide the data for free?”
The student commented, “I tend toward the ‘at cost’ position of the debate for local governments and the free side of the debate for federal data. For me, the ‘tax dollars are used to create the data so it has already been paid for’ argument does not hold water. Taxpayers have no expectation (or shouldn’t have) of walking into the local parks department to borrow a shovel that in theory their tax dollars paid for. The same logic could be applied to spatial assets.” The student went on to say that this argument should be applied to local and regional government data, because “federal level data […] tends to be more directly reflective of the population and the federal government more directly benefits from the economic opportunities created by free data.”
While I have tended to advocate that geospatial data should be freely available, I believe the student’s snow shovel analogy for local governments has merit. Following this argument, a small fee for data requested over and above what a government agency already provides on its website seems reasonable. But I am still firmly on the side of governments providing at least some geospatial data for free on their websites, citing the numerous benefits documented in case studies in this blog and in our book. These benefits include positive public relations, saving lives and property in emergency situations, and saving time in processing requests from data users. Consider what one person can do with a snow shovel versus what one person could do with geospatial data such as a flood dataset. The shovel might help clear a small stretch so a few neighbors can get out of their houses, but the flood dataset could help identify hundreds of houses at risk and inform a permanent, effectively managed solution. There is an order of magnitude difference in the benefit to be gained from making geospatial data easily and freely available.
What are your thoughts on this important issue? We invite you to share your thoughts below.
Although there is perhaps a tendency to think that crowdsourcing data collection initiatives are a recent innovation, the practice of citizen science dates back to some of the earliest known recordings of natural and human-made phenomena. In a recent report by the BBC on the signs of spring ‘shifting’ in trees, the pioneering crowdsourcing work of English naturalist Robert Marsham, best known for his Indications of Spring, was acknowledged. Marsham’s interest was in what became known as phenology, the study of the periodic cycles of natural phenomena. His indications of those cycles, 27 altogether, included recordings of the first leafing of a number of trees such as elm, rowan, and oak, the first hearing of birds such as the cuckoo, swallow and nightingale, and the first croaks of certain amphibians. Marsham’s family continued with his observations after his death in 1797, providing almost 200 years of seasonal observations.
Today the same phenological surveys are supported through the Woodland Trust’s Nature’s Calendar survey, a resource for volunteers to record the signs of the changing seasons where they live. A number of live tracking maps are available, which allow visitors to the site to select a species, a year, and a particular event such as a first flowering, and plot the results. I chose snowdrops, one of the signature flowers of spring in many parts of Europe. As of 25 February, there had been 487 recorded sightings of snowdrops this year.
Although the spatial data are not available to download, summaries of the seasonal results are available as PDFs.
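The filter behind those live tracking maps is straightforward to sketch: select a species, an event type, and a year, optionally up to a cutoff date, and count or plot the matching observations. The records below are invented examples, not Nature's Calendar data, and the field layout is an assumption for illustration.

```python
from datetime import date

# Hypothetical observation records: (species, event, date, lat, lon)
observations = [
    ("snowdrop", "first flowering", date(2015, 1, 28), 51.5, -0.1),
    ("snowdrop", "first flowering", date(2015, 2, 14), 52.2, -1.9),
    ("oak", "first leafing", date(2015, 4, 20), 52.6, 1.3),
]

def sightings(records, species, event, year, cutoff=None):
    """Filter records by species, event type, and year,
    keeping only those on or before an optional cutoff date."""
    return [
        r for r in records
        if r[0] == species and r[1] == event and r[2].year == year
        and (cutoff is None or r[2] <= cutoff)
    ]

# e.g. snowdrop first flowerings recorded so far this year
count = len(sightings(observations, "snowdrop", "first flowering", 2015,
                      cutoff=date(2015, 2, 25)))
```

The latitude/longitude pairs in each record are what would drive the map plot; the same filtered list feeds both the count and the dots on the screen.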
A new article entitled “Facilitating open exchange of data and information” published in the January 2015 issue of Springer’s Earth Science Informatics journal has strong ties to the discussions we have had on this blog and in our book, namely to developments in and implications of open data. In the article, authors James Gallagher, John Orcutt, Pauline Simpson, Dawn Wright, Jay Pearlman, and Lisa Raymond are clear that while open data offers great value, there are “a number of complex, and sometimes contentious, issues that the science community must address.”
In the article, the authors examine the current state of the core issues of Open Data, including interoperability; discovery and access; quality and fitness for purpose; and sustainability. The authors also address topics of governance and data publication. I very much like the approach that the authors take: they don’t sugarcoat these issues, but acknowledge that “each of the areas covered are, by themselves, complex and the approaches to the issues under consideration are often at odds with each other.” Indeed, “any comprehensive policy on Open Data will require compromises that are best resolved by broad community input.”
The authors’ research stemmed from the activities of an Open Data Working Group as part of the NSF-funded OceanObs Research Coordination Network, and hence has an ocean and atmosphere focus. On a related note, in this blog, we recently wrote about crowd sourcing coastal water navigational data. However, the open data implications that the authors describe span all disciplines that care about location.
The authors cover many topics germane to the purpose of our book and blog, and cover them so well, from their treatment of copyright and Creative Commons to their down-to-earth, realistic recommendations for what the community must do to move forward, that I consider this article “required reading” for anyone interested in open geospatial data.
Billed as a stop-gap solution on the path towards emulating some of the larger data portals (such as data.gov.au and open-data.europa.eu), GovPond is an Australian public sector data portal providing access to over 3,600 hand-curated datasets and 11 Government catalogues, including:
- Landgate SLIP
- Australian Ocean Data Network
The motivation to develop the site stemmed from a previous exercise to collate public sector data sets after the site hosts discovered ‘an enormous number of tables and tools and maps and spreadsheets that were tucked away in dark, dusty corners of the internet, near-impossible to find with a quick search.’
For all the recent advances in liberating public sector data, it seems there’s still a niche for initiatives like these to get to those corners of the Internet and provide access to data resources that might otherwise elude all but the most determined data tracker.
The USGS National Elevation Dataset (NED) is transitioning to a Lidar-based elevation model. This transition is part of the 3D Elevation Program (3DEP) initiative, whose goal is to systematically collect enhanced elevation data in the form of Lidar data over the conterminous United States, Hawaii, and the U.S. territories, with data acquired over an 8-year period. Interferometric synthetic aperture radar (IFSAR) data will be collected over Alaska, where cloud cover and remote locations preclude the use of Lidar over much of the state (yes, physical geography still matters!).
This initiative was born in response to a study funded by the USGS named “The National Enhanced Elevation Assessment.” The study documented business uses for elevation data across 34 federal agencies, agencies from all 50 states, selected local government and Tribal offices, and private and not-for-profit organizations. Each need was characterized by the following:
- Data accuracy.
- A refresh cycle for the data.
- Coverage for geographic areas of interest.
Conservative annual benefits for flood risk management alone are $295 million; for infrastructure and construction management, $206 million; and for natural resources conservation, $159 million. Results are detailed in the Dewberry report on the National Enhanced Elevation Assessment, which details costs and benefits, how the data will be collected, standards and specifications, and organizations involved in the effort. An additional report details how the data could help in terms of taking action for climate change.
How will this affect us in the geospatial data community? The NED activities and website will continue until a full transition to 3DEP is completed. 3DEP planning and research is underway at the USGS to transition to a unified service that will provide both gridded bare earth data products and point cloud data, along with capabilities to produce other derived elevation surfaces and products from 3D data. When the data does appear, data users should notice the difference in resolution and quality. In our book, we detailed the rise of Lidar data, and since its publication, these data sets have greatly expanded in quality and availability.
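One of the most common derived surfaces produced from gridded bare-earth elevation data is a hillshade, and the higher-resolution Lidar-based grids make the difference immediately visible. Below is a minimal sketch of how a hillshade can be computed from an elevation array; it uses NumPy's generic gradient function rather than the Horn kernel that GIS packages such as GDAL typically apply, so treat it as an illustration of the idea, not a reference implementation.

```python
import numpy as np

def hillshade(dem, cellsize=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    """Shade a gridded bare-earth DEM from a given sun position.

    dem       : 2-D array of elevations (same linear units as cellsize)
    cellsize  : grid spacing
    Returns values in [0, 1]: 0 = fully shadowed, 1 = fully lit.
    """
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)
    # Elevation gradients: rows vary in y, columns in x
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# A tiny synthetic DEM for illustration: a plane dipping to the east,
# sampled on a 30 m grid (roughly the legacy NED resolution)
dem = np.tile(np.arange(5, 0, -1, dtype=float), (5, 1))
shade = hillshade(dem, cellsize=30.0)
```

The same pattern, with a point-cloud-derived 1 m grid substituted for the 30 m one, is where 3DEP users will see the payoff: the finer the cell size, the more terrain detail the derived surface reveals.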