Welcome to the Spatial Reserves blog.
The GIS Guide to Public Domain Data was written to provide GIS practitioners and instructors with the essential skills to find, acquire, format, and analyze public domain spatial data. Some of the themes discussed in the book include open data access and spatial law, the importance of metadata, the fee vs. free debate, data and national security, the efficacy of spatial data infrastructures, the impact of cloud computing, and the emergence of the GIS-as-a-Service (GaaS) business model. Recent technological innovations have radically altered how both data users and data providers work with spatial information to help address a diverse range of social, economic, and environmental issues.
This blog was established to follow up on some of these themes, promote discussion of the issues raised, and host a copy of the exercises that accompany the book. This storyboard provides a brief description of the exercises.
We created 10 exercises that help data users build skills in making decisions with public domain data. I recently assigned several of these hands-on exercises to university students in a GIS course on public domain data, and I was amazed at the high quality of their analysis and of the cartography on their final maps. A selection of these maps, from an exercise involving the creation of a database and map for an ecotourism company in New Zealand, is shown below. This is Exercise 7 in our set.
What is particularly impressive about the results of this assignment is that the activity is open ended. In earlier exercises, analysts are directed to specific websites to obtain data, but by this exercise, they are ready to tackle a problem without much guidance. Here, they determine the type of ecotourism they will focus on; the data they will need; the organizations from which they will obtain the data; how they will format, project, and analyze the data; the scale, cartography, and types of maps they will make; and the methods they will use to communicate their results. Try one of these exercises today and share your results!
Last week the European Union (EU) announced the launch of the Sentinel-1A satellite, the first of six missions that will provide the framework for the Copernicus Earth Observation project. Copernicus, formerly known as GMES (Global Monitoring for Environment and Security), aims to collect data from a variety of sources, including satellites, airborne sensors, and ground stations, to support a range of applications including:
Monitoring sea ice zones and the Arctic environment
Assimilation of sea ice observations into forecasting systems
Surveillance of the marine environment, including oil-spill monitoring and ship detection for maritime security
Monitoring land surface motion risks
Mapping of land surfaces: forest, water and soil, sustainable agriculture
Mapping in support of humanitarian aid in crisis situations
A second satellite, Sentinel-1B, will be launched next year. Once the system is fully operational, the aim is to provide almost daily coverage of high-priority areas such as Europe, Canada, and some shipping routes. The radar instrument on board the satellite means that data can be collected independent of weather conditions, day or night.
Image source: http://bit.ly/1el9g6M
All of the data products collected by the Sentinel satellites are to be made publicly available as open data, free of charge, to all data users. This also includes the use of the data for commercial purposes. The Sentinel-1A satellite is expected to be operational within three months.
As readers of this blog and our book are aware, when a geodata portal is confusing or inadequate, we are not afraid to say so. Conversely, when a resource comes along that contains a wealth of content and is intuitive to use at the same time, we share that as well. An example of a new, useful, and intuitive resource comes from the Demographic and Health Surveys program of USAID and the US Census Bureau. The site provides detailed demographic data primarily for countries that receive assistance via the President's Emergency Plan for AIDS Relief (PEPFAR). The data are available for single countries, and also for multiple countries through a data package, all of which the user chooses and customizes. Through the site, the US Census Bureau has added to and updated its online collection of subnational population data linked to maps.
To access the maps and data, begin at the main website for the project, select Data, select countries, select indicators (variables), select the format (shapefile or geodatabase), and indicate whether you want to download it now in a browser or receive an email when the package is ready. You can choose up to 25 variables at a time to be included in the package. I tested it and it worked marvelously. Also, in the near future, the US Census Bureau will release a seamless global map containing population estimates for tens of thousands of subnational administrative areas globally. Wouldn’t it be grand if all sites were this simple to use?
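Once the package downloads, the shapefile option delivers a zip archive in which each layer is really a family of sidecar files (.shp geometry, .shx index, .dbf attributes, .prj projection), all of which must stay together. As a minimal, stdlib-only sketch of sorting out such a package before loading it into your GIS (the archive contents and layer names here are made up for illustration, not taken from the actual site), you could group the members by layer:

```python
import io
import zipfile
from collections import defaultdict

# Build a stand-in for a downloaded package in memory so the sketch is
# self-contained; a real package's layer names and contents would differ.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    for layer in ("kenya_admin1", "kenya_admin2"):  # hypothetical layers
        for ext in (".shp", ".shx", ".dbf", ".prj"):
            z.writestr(layer + ext, b"")  # placeholder file contents

# Group archive members by shapefile layer name, so each layer's
# sidecar files can be extracted and kept together.
layers = defaultdict(list)
with zipfile.ZipFile(buf) as z:
    for name in z.namelist():
        stem, _, ext = name.rpartition(".")
        layers[stem].append("." + ext)

for stem, exts in sorted(layers.items()):
    print(stem, sorted(exts))
```

A layer missing its .prj would show up immediately in this listing, which is worth checking before you try to project or analyze the data.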
In a review of government open data initiatives, David Buxton, CEO of Arachnys, makes a good point: although more and more governments are making their data available online (World map of open government data initiatives), simply having open access to such data doesn't necessarily mean that the data will be better; there are no assurances that the data will be more accurate, current, useful, or even relevant. He does, however, point to growing evidence that opening up access to data is generally having a positive influence, and cites the success of an initiative in Mauritius to map land ownership across the entire country, which has resulted in a decrease in land grabs and better public scrutiny.
Added to these government initiatives, the sheer volume of data that is increasingly being collected and made available via the various resources we have discussed in earlier posts (the Internet of Things/Everything, UAVs (unmanned aerial vehicles), and crowdsourcing) means that it is all the more important for data analysts and end users to understand the provenance, quality, and relevance of the data. With an increasing choice of data to work with comes an increasing responsibility to make sure it's the appropriate data for any given application.
A legislative hearing was recently held to establish the National Geospatial Technology Administration within the United States Geological Survey to “enhance the use of geospatial data, products, technology, and services, to increase the economy and efficiency of Federal geospatial activities” and for other purposes. The associated proposed bill, H.R. 1604, is to improve federal land management by requiring the Secretary of the Interior to develop a multipurpose cadastre of Federal real property and identify “inaccurate, duplicate, and out-of-date Federal land inventories, and for other purposes.” For the full text of these bills, see this link at the Library of Congress.
While I wholeheartedly agree with the "plain language" title of the bill, which is "Map It Once, Use It Many Times", I am wary of creating yet another administrative body concerned with spatial data. As a former employee, I have great respect for the USGS, but I wonder how it would accomplish this with its limited staff resources. At the same time, I am hopeful that, if enacted, the bill would serve to improve decision making at all levels of government, academia, nonprofit organizations, and private enterprise through the coordination and dissemination of geospatial data. In particular, as Jill Clark and I have written in this blog and in our book, if some useful data portals could be established, and in particular a reworked and expanded National Map portal, we would, in a word, rejoice.
The bill details 10 data layers to be included in a "national geospatial database". I was encouraged to see text in the bill allowing for the acquisition of data from commercial sources, as I think such partnerships are critical, though I wonder how that data could be distributed to GIS analysts, and at what cost and under what restrictions. I also liked the language in the bill that encourages private enterprise, and that which encourages geospatial research and development. Also encouraging was the hearing (included in the link above) on H.R. 916, to improve federal land management and conservation by identifying inaccurate or duplicate federal land inventories. Time will tell, and these two bills are worth keeping track of.
Two recent releases, one app and a new phone, highlight a couple of issues we have discussed recently: personal location information and data privacy.
The Connect web/iOS app allows users to map the locations of not only the contacts in their address book but also connections in their social networks. In a TechCrunch review, Sarah Perez quotes co-founder Ryan Allis as saying the app aggregates data from social media rather than using GPS to track connections, which he considers a bit 'creepy'. App users grant access to their social media networks and, once configured, the app will display connections on a map based on their current address, a check-in via another application such as Facebook, or geo-tagged posts on other platforms such as Twitter. The app also provides some options for configuring how and when Connect alerts will be received (for example, when a favourite contact is within a certain range of your location).

I wonder if those same connections realise just who may be using their location information. When they update their location, maybe they don't want some of their connections to know they are in town? Will the more public social media platforms, such as Twitter, also provide options for users to broadcast their location information selectively, or will the default position remain that, by choosing to make your location available, you accept you will have little or no control over who has access to it and how they use it? Apps like Connect introduce new options for keeping track of contacts, but they are also a reminder to think carefully about posting location information online.
As for the phone, the recently launched Blackphone uses encrypted messaging and calls (both sender and receiver have to use the same device or app) to restrict access to data. A casual snooper would see the traffic but should not be able to access the content, although the phone's makers stress that the device isn't 100% hacker-proof and a determined individual or organisation would still be able to get at the data. Following on from our post on the secret lives of phones, the Blackphone also promises more control over both how and what data is transmitted wirelessly (often unknown to the phone user). Hopefully such default privacy settings will soon be the norm, not the exception.
I have created a series of 22 new videos describing decision making with GIS, using public domain data. The videos, which use the ArcGIS Spatial Analyst extension, are listed and accessible in this YouTube playlist. Over 108 minutes of content is included, in easy-to-understand short segments comprised almost entirely of demonstrations of the tools in real-world contexts. They make use of public domain data such as land cover, hydrography, roads, and a Digital Elevation Model.
The videos cover the topics listed below. Videos 10 through 20 include a real-world scenario of selecting optimal sites for fire towers in the Loess Hills of eastern Nebraska, an exercise that Jill Clark and I included in the Esri Press book The GIS Guide to Public Domain Data and that is available online.
1) Using the transparency and swipe tools with raster data.
2) Comparing and using topographic maps and satellite and aerial imagery stored locally to the same type of data in the ArcGIS Online cloud.
3) Analyzing land cover change with topographic maps and satellite imagery on your local computer and with ArcGIS Online.
4) Creating a shaded relief map using hillshade from a Digital Elevation Model (DEM).
5) Analyzing a Digital Elevation Model and a shaded relief map.
6) Creating contour lines from elevation data.
7) Creating a slope map from elevation data.
8) Creating an aspect (direction of slope) map from elevation data.
9) Creating symbolized contour lines using the Contour with Barriers tool.
10) Decision making using GIS: Introduction to the problem, and selecting hydrography features.
11) Decision making using GIS: Buffering hydrography features.
12) Decision making using GIS: Selecting and buffering road features.
13) Decision making using GIS: Selecting suitable slopes and elevations.
14) Decision making using GIS: Comparing Boolean And, Or, and Xor Operations.
15) Decision making using GIS: Selecting suitable land use.
16) Decision making using GIS: Selecting suitable land use, slope, and elevation.
17) Decision making using GIS: Intersecting vector layers of areas near hydrography and near roads.
18) Decision making using GIS: Converting raster to vector data.
19) Decision making using GIS: Final determination of optimal sites.
20) Creating layouts.
21) Additional considerations and tools in creating layouts.
22) Checking Extensions when using Spatial Analyst tools.
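The core idea behind videos 13 through 15, deriving slope from a DEM and combining criteria rasters with Boolean operations, can be sketched outside of ArcGIS as well. Here is a minimal, library-free illustration of that idea; the DEM values, cell size, land-cover codes, and thresholds are all made-up assumptions for demonstration, not the data from the fire-tower exercise:

```python
CELL = 30.0  # assumed cell size in metres

# A tiny 4x4 DEM (elevations in metres) -- purely illustrative values.
dem = [
    [100, 101, 103, 106],
    [100, 102, 105, 109],
    [101, 104, 108, 113],
    [103, 107, 112, 118],
]

# Land-cover codes for the same grid; suppose 1 = grassland (suitable).
landcover = [
    [1, 1, 2, 2],
    [1, 1, 1, 2],
    [3, 1, 1, 1],
    [3, 3, 1, 1],
]

def slope_percent(dem, cell):
    """Approximate slope (%) per cell from simple finite differences."""
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Central differences where possible, one-sided at the edges.
            c0, c1 = max(c - 1, 0), min(c + 1, cols - 1)
            r0, r1 = max(r - 1, 0), min(r + 1, rows - 1)
            dzdx = (dem[r][c1] - dem[r][c0]) / ((c1 - c0) * cell)
            dzdy = (dem[r1][c] - dem[r0][c]) / ((r1 - r0) * cell)
            out[r][c] = 100.0 * (dzdx ** 2 + dzdy ** 2) ** 0.5
    return out

slope = slope_percent(dem, CELL)

# Boolean AND of two criteria rasters: slope under 15% AND grassland.
suitable = [
    [slope[r][c] < 15.0 and landcover[r][c] == 1
     for c in range(len(dem[0]))]
    for r in range(len(dem))
]

for row in suitable:
    print(["Y" if v else "." for v in row])
```

Swapping `and` for `or` (or an exclusive-or) in the `suitable` expression mirrors the Boolean And, Or, and Xor comparison in video 14; Spatial Analyst performs the equivalent operations on full-resolution rasters.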
How might you be able to make use of these videos and the processes described in them in your instruction?