April 16, 2012

Welcome to the Spatial Reserves blog.

The GIS Guide to Public Domain Data was written to provide GIS practitioners and instructors with the essential skills to find, acquire, format, and analyze public domain spatial data. Some of the themes discussed in the book include open data access and spatial law, the importance of metadata, the fee vs. free debate, data and national security, the efficacy of spatial data infrastructures, the impact of cloud computing, and the emergence of the GIS-as-a-Service (GaaS) business model. Recent technological innovations have radically altered how both data users and data providers work with spatial information to help address a diverse range of social, economic, and environmental issues.

This blog was established to follow up on some of these themes, promote a discussion of the issues raised, and host a copy of the exercises that accompany the book.  This story map provides a brief description of the exercises.

Sharing Geoprocessing Tools on the Web

January 15, 2017

An article co-authored by Benjamin Pross, Christoph Stasch, and Albert Remke of the 52°North Initiative for Geospatial Open Source Software GmbH, and Satish Sankaran and Marten Hogeweg of Esri, describes a development that should interest anyone who uses geospatial data.  The 52°North Initiative has developed an open-source extension to ArcGIS for Desktop that enables access to Open Geospatial Consortium (OGC) Web Processing Services (WPS).  The result?  These services can be used in the same manner as native ArcGIS geoprocessing tools.  In other words, they appear in the list of tools just as a standard buffer or overlay tool would.  Yes, it could be just that easy.

The article explains that “while ArcGIS allows geoprocessing tools to be published as a WPS, [ArcGIS] does not offer a WPS client interface. Consequently, it is not easy to access external non-ArcGIS geoprocessing tools such as simulation models, rich data interfaces, or processing capabilities from any other legacy software that supports the WPS interface.”  This points to the reason why this initiative offers such promise:  “The 52°North Extensible WPS Client for ArcMap was implemented as an open-source extension to ArcGIS that fully integrates into the ArcGIS for Desktop environment. It enables OGC WPS to be accessed and used in the same manner as native ArcGIS geoprocessing tools. This makes it easy to run WPS-based processes and integrate the results of that processing into ArcMap for use with other applications.”

In plain language: the complex issues grappled with by GIS analysts often require major investments of time to generate models, services, and customized workflows and code, so why should each analyst have to create all of this from scratch?  Enormous time savings could be realized if there were an easy way to share these things.  The article not only explains recent progress in this area but also encourages the community to think creatively about how to pursue further collaborative methods.
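For readers curious what calling a WPS looks like under the hood, here is a minimal sketch of a WPS 1.0.0 key-value-pair Execute request.  The endpoint, process identifier, and input below are all hypothetical, chosen only to illustrate the shape of the request; a real 52°North or other deployment will publish its own process identifiers.

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint and process; real deployments vary.
endpoint = "https://example.org/wps"
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "org.example.BufferProcess",  # hypothetical process id
    "datainputs": "distance=100",               # hypothetical input
}
url = f"{endpoint}?{urlencode(params)}"
print(url)
```

The extension described in the article hides this plumbing entirely; the point of the sketch is simply that a WPS process is, at bottom, an HTTP request any client can issue.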


ArcGIS Web Processing Service client architecture.


Be a Wise Consumer of “Fun” Posts, too!

January 1, 2017

Versions of the following story have made their way around the internet recently:

The passenger steamer SS Warrimoo was quietly knifing its way through the waters of the mid-Pacific on its way from Vancouver to Australia. The navigator had just finished working out a star fix & brought the master, Captain John Phillips, the result. The Warrimoo’s position was LAT 0º 31′ N and LON 179º 30′ W.  The date was 31 December 1899.

“Know what this means?” First Mate Payton broke in, “We’re only a few miles from the intersection of the Equator and the International Date Line”.  Captain Phillips was prankish enough to take full advantage of the opportunity for achieving the navigational freak of a lifetime.  He called his navigators to the bridge to check & double check the ships position.  He changed course slightly so as to bear directly on his mark.  Then he adjusted the engine speed. The calm weather & clear night worked in his favor. 

At midnight the SS Warrimoo lay on the Equator at exactly the point where it crossed the International Date Line! The consequences of this bizarre position were many:  The forward part (bow) of the ship was in the Southern Hemisphere & the middle of summer. The rear (stern) was in the Northern Hemisphere & in the middle of winter.  The date in the aft part of the ship was 31 December 1899.  Forward it was 1 January 1900.  This ship was therefore not only in two different days, two different months, two different years, two different seasons, but in two different centuries – all at the same time.

I have successfully used many types of geographic puzzles with students and with the general public over the years, and I enjoy this story a great deal.  But in keeping with our reminders on this blog and in our book to “be critical of the data,” reflecting on the incorrect or absent aspects of this story can be instructive and can heighten interest. The SS Warrimoo was indeed an actual ship, built by Swan & Hunter Ltd in Newcastle upon Tyne, UK, in 1892, and sunk after a collision with a French destroyer in 1918, during World War I.  Whether it was sailing in the Pacific in 1899, I do not know.

The version of this story on CruisersForum states that it is “mostly true.”  What lends itself to scrutiny?  Let us investigate a few of the geographic aspects in the story.

First, the phrase “working out a star fix” glosses over the fact that longitude would have been worked out with a chronometer rather than a sextant alone.  (I highly recommend Dava Sobel’s book Longitude.)  Second, the International Date Line (IDL) as we know it today was not in place in 1899.  The nautical date line, which is not the same as the IDL, is a de jure construction determined by international agreement. It is the result of the 1917 Anglo-French Conference on Time-keeping at Sea, which recommended that all ships, both military and civilian, adopt hourly standard time zones on the high seas. The United States adopted this recommendation for its military and merchant marine ships in 1920 (Wikipedia).

Third, the distance from LAT 0º 31′ N, LON 179º 30′ W to LAT 0º, LON 180º is about 43 nautical miles, and the ship could have traveled at a speed of no more than 20 knots (23 mph).  Therefore, conceivably, the ship could have reached the 0/180 point in a few hours, but whether it could have been maneuvered so that the bow and stern lay in different hemispheres is doubtful, given the accuracy of measurement devices at the time.  Sextants have an error of at least 2 kilometers in latitude, and chronometers about 30 kilometers in longitude. Alternatively, the crew could already have reached the desired point earlier in the day and not have known it.  An interesting geographic fact: traveling due east or west along the Equator, it is possible to cross the date line three times (see map below).
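For anyone who wants to check the arithmetic themselves, the great-circle distance can be verified with a few lines of Python.  This is my own check, not part of the original story; it uses the standard haversine formula and assumes a mean Earth radius of 3440.065 nautical miles.

```python
import math

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles between two lat/lon points."""
    R_NM = 3440.065  # assumed mean Earth radius in nautical miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R_NM * math.asin(math.sqrt(a))

# The Warrimoo's reported fix: 0°31'N, 179°30'W; the target: 0°, 180°.
d = haversine_nm(31 / 60, -(179 + 30 / 60), 0.0, -180.0)
print(round(d, 1))  # roughly 43 nautical miles
```

At 20 knots, that distance takes a little over two hours to cover, which is consistent with the story's "a few hours."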

Our modern digital world is full of fragments that are interesting if not completely accurate, but I think as GIS professionals and educators, it is worth applying “be critical of the data” principles even to this type of information.  The story is still interesting as a hypothetical “what could have happened” and provides great teachable moments even if the actual event never occurred.


The International Date Line (CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=147175).

New Exercise Using Open Data Portals from Local Governments

December 18, 2016

Despite the growing volume of geospatial data available, and the ease of use of much of this data, finding and using data remains a challenge.  To assist data users with these ongoing challenges, I have written a new activity entitled “Key Strategies for Finding Content and Understanding What You’ve Found.”  The goal of the activity is to enable GIS data users to understand what spatial analysis is, to effectively find and use spatial data, and to become familiar with the ArcGIS platform in the process.  I tested the activity with a group of GIS educators and would now like to share it with the broader GIS community.

The document makes it clear that we are still in a hybrid world–still needing to download some data for our work in GIS, but increasingly able to stream data from online data services such as those in ArcGIS Online.  But these concepts don’t make as much sense unless one actually practices doing this–hence the activity.

In the activity, I ask the user to first practice search strategies in ArcGIS Online, using tags and keywords. Then I guide the user through the process of downloading and using a CSV file with real-time data.  After a brief review of data types and resources, I guide the user through downloading data from a local government agency to solve a problem about flood hazards.  The next step asks users to compare this download process with streaming the same data from the same local government’s site (in this case, using data from Boulder County, Colorado) into ArcGIS Online.  The activity concludes with resources for discovering more about these methods of accessing data.
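As a rough illustration of the CSV step described above, parsing a downloaded real-time feed can be as simple as the sketch below.  The column names and values are hypothetical, not taken from the activity itself; many real-time feeds (earthquakes, stream gauges, and so on) follow this general lat/lon/attribute shape.

```python
import csv
import io

# Hypothetical sample of a real-time CSV feed; columns are illustrative only.
raw = """latitude,longitude,magnitude
40.02,-105.25,2.1
39.99,-105.28,3.4
40.05,-105.20,1.7
"""

def read_points(text, min_magnitude=2.0):
    """Parse a lat/lon CSV and keep rows at or above a magnitude threshold."""
    rows = csv.DictReader(io.StringIO(text))
    return [
        (float(r["latitude"]), float(r["longitude"]), float(r["magnitude"]))
        for r in rows
        if float(r["magnitude"]) >= min_magnitude
    ]

points = read_points(raw)
print(len(points))  # two of the three rows pass the 2.0 threshold
```

Once parsed this way, the points can be mapped or joined to other layers; ArcGIS Online performs an equivalent parse when you add a CSV with coordinate fields.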

Jill Clark and I have created other hands-on activities on this theme of finding and understanding data as well, available here.  We look forward to hearing your comments and I hope this new activity is useful.


Accessing data from the Boulder County local government GIS portal through the lesson described above.

Lasers: The future of data capture and transmission?

December 12, 2016

Over the last four years we have discussed some of the many challenges posed by the volume of data now available online: issues of quality, determining provenance, privacy, identifying the most appropriate source for particular requirements, and so on. Being overwhelmed by the choice of data, or not knowing what resources are available or where to start looking, have been common responses from geospatial students and practitioners alike.

A recent report from the BBC on laser technology highlighted some current and future applications that have transformed, or will transform, geospatial data capture, including the use of LiDAR and ultra-precise atom interferometers that could be used to develop alternative navigation systems that do not rely on GPS. The article also discusses the inherent limitations of our current electronics-based computing infrastructure and the potential of silicon photonics, firing lasers down optical fibres, to help meet the demand for instant or near-instant access to data in the Internet-of-Everything world. If many feel overwhelmed now by the volumes of data available, what will technologies like silicon photonics mean for data practitioners in the future? Just because data may be available at unprecedented speeds and accessed more easily, that alone doesn’t guarantee the quality of the data will be any better or negate current concerns with respect to issues such as locational privacy. A critical understanding of these issues will be even more important if we are to make the most of these advances in digital data capture and transmission.

The Montana Digital Atlas

December 4, 2016

We wrote extensive reviews of local, regional, state, provincial, national, and international government data portals in our book and from time to time do so in this blog.  One of the finest state geospatial data portals in our judgment is the Montana Digital Atlas.

We have been critical here and in our book of data portals that were obviously set up simply to satisfy some organizational mandate, without regard for those who will actually use the portal.  I have spent time with the MAGIP (Montana Association of Geographic Information Professionals) community, and most recently was honored to give the keynote at their annual conference.  I am happy to report that they have built their data portal with the end user in mind.  What’s more, the Montana State Library has been a leader in the GIS community there for years, and I have found that when library information professionals are involved–people who really understand data–the resources will be extensive, the metadata will be rich, and the services will actually work.

The Digital Atlas features geographic databases, aerial photos, and topographic maps of lands in Montana.  The functionality begins with an interactive map, where you can select base maps, thematic map layers, and tabular data; from there you can draw on the screen, generate reports in XLS and CSV formats, and download data in various GIS formats.  The site features functionality that I wish all data portals had, such as the ability to move popup boxes to locations most convenient for you, choices of datums and projections for your data, the ability to clip data to specific geographic areas, and the ability to search the state library catalogs for articles, books, and other resources for the area you are investigating.  You can even copy the map link to get a web link to the current map that you have created on the site.  Furthermore, you can load some of the layers directly from the Montana State Library into ArcGIS Online, via the services in the ‘MSDI_Framework’ and ‘MSL’ folders, at https://gisservicemt.gov/arcgis/rest/services/MSDI_Framework and https://gisservicemt.gov/arcgis/rest/services/MSL.
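For those who prefer scripting, an ArcGIS REST services directory like the one above can also be queried programmatically by appending `f=json`.  The sketch below builds the query URL from the endpoint mentioned above and then parses a trimmed, hypothetical response in the shape such directories return; the service name shown is made up for illustration.

```python
import json
from urllib.parse import urlencode

# Endpoint from the post; appending f=json asks the directory for JSON.
base = "https://gisservicemt.gov/arcgis/rest/services/MSDI_Framework"
url = f"{base}?{urlencode({'f': 'json'})}"

# A trimmed, hypothetical response in the shape ArcGIS REST directories return:
sample = json.loads(
    '{"services": [{"name": "MSDI_Framework/Hydrography", "type": "MapServer"}]}'
)
names = [s["name"] for s in sample["services"] if s["type"] == "MapServer"]
print(url)
print(names)
```

Fetching the real URL (with `urllib.request` or similar) would return the actual list of map services, which can then be added to ArcGIS Online by URL.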

Lastly, the portal managers graciously say to contact them at MSLDA @ mt.gov if you experience difficulty with the application, if you see a problem with any of the data, or even–and this is wonderful–to suggest additional map layers.



The Montana Digital Atlas.  I have selected dams, wetlands, and riparian zones.  At this point, I can generate reports, download the data, or clip and otherwise modify my data search.


Aqua People? Reflections on Data Collection and Quality

November 20, 2016

Data quality is a central theme of this blog and our book.  Here, we focus on quality of geospatial information, which is most often in the form of maps.  One of my favorite maps in terms of the richness of information and the choice of symbology is this “simple map of future population growth and decline” from my colleague at Esri, cartographer Jim Herries.  Jim symbolized this map with red points indicating areas that are losing population and green points indicating areas that are gaining population.  This map can be used to learn where population change is occurring, down to the local scale, and, with additional maps and resources, help people understand why it is changing and the implications of growth or decline.

But the map can also be an effective tool to help people understand issues of data collection and data quality.  Pan and zoom the map until you see some rivers, lakes, or reservoirs, such as Littleton, Colorado’s Marston Reservoir, shown on the map below. If you zoom in to a larger scale, you will see points of “population” in this and nearby bodies of water. Why are these points shown in certain lakes and rivers?  Do these points represent “aqua people” who live on houseboats or who are perpetually on water skis, or could the points be something else?


The points are there not because people are living in or on the reservoir, but because the dots are randomly assigned within the statistical area that was used.  In this case, the statistical areas are census tracts or block groups, depending on the scale being examined.  The same phenomenon can be seen with dot density maps at the county, state, or country level.  And it is not confined to population data.  For example, dot density maps showing soybean bushels harvested by county could also show dots in the water, as could maps of the number of cows or pigs, or even soil chemistry from sample boreholes.  In each case, the dots do not represent the actual locations where people live, animals graze, or soil was tested.  They are randomly distributed within the data collection unit.  In this case, at the largest scale, the unit is the census block group, and randomly distributing the points means that some points fall “inside” the water polygons.
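To make the randomization concrete, here is a toy sketch of my own (not the renderer's actual code) of how dot-density placement works: dots are scattered uniformly at random across the whole statistical unit, with no knowledge of which parts of the unit are land and which are water.

```python
import random

random.seed(42)

# A toy census unit: the unit square, with a "lake" occupying part of it.
def in_lake(x, y):
    """Hypothetical water polygon covering 9% of the unit's area."""
    return 0.6 <= x <= 0.9 and 0.2 <= y <= 0.5

# Dot-density rendering: one dot per k people, placed uniformly at random
# anywhere inside the unit -- land or water.
dots = [(random.random(), random.random()) for _ in range(1000)]
wet = sum(1 for x, y in dots if in_lake(x, y))
print(wet)  # some dots inevitably land "in the water"
```

With a lake covering 9% of the unit, roughly 9% of the dots will fall in it, which is exactly the "aqua people" effect seen on the map.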

Helping your colleagues, clients, students, or some other audience you are working with understand concepts such as these may seem insignificant but is an important part of map and data interpretation.  It can help them to better understand the web maps that we encounter on a daily basis.  It can help people understand issues and phenomena, and better enable them to think critically and spatially.  Issues of data collection, quality, and the geographic unit by which the data was collected–all of these matter.  What other examples could you use from GIS and/or web based maps such as these?

Enhancements to Landsat Thematic Bands Web Mapping Application

November 6, 2016

Last year, we wrote about the Landsat Thematic Bands Web Mapping Application, an easy-to-use but powerful teaching and research tool and data set. It is a web mapping application with global coverage, whose mapping services are updated daily with new Landsat 8 scenes; access to selected bands allows the user to visualize agriculture, rock formations, vegetation health, and more.  The Time tool allows for the examination of changes over years, over seasons, or before and after an event.  The identify tool gives a spectral profile for each scene.  I have used this application dozens of times over the past year in remote sensing, geography, GIS, and other courses and workshops, and judging from the thousands of views that this blog has seen, many others have done the same.
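As a side note on what a renderer like "vegetation health" is doing under the hood, one common index is NDVI, which contrasts near-infrared against red reflectance (for Landsat 8, bands 5 and 4 respectively).  The sketch below shows the calculation; the reflectance values are made up purely for illustration.

```python
# NDVI from Landsat 8: band 5 is near-infrared (NIR), band 4 is red.
def ndvi(nir, red):
    """Normalized Difference Vegetation Index; ranges from -1 to 1."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

# Made-up (nir, red) reflectance pairs: healthy vegetation, bare soil, water.
pixels = [(0.45, 0.08), (0.30, 0.25), (0.10, 0.12)]
values = [round(ndvi(n, r), 2) for n, r in pixels]
print(values)  # [0.7, 0.09, -0.09]
```

High positive values indicate vigorous vegetation, values near zero suggest bare ground, and negative values typically indicate water, which is why the band renderers in the application make these land covers so easy to distinguish.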

As if that weren’t enough, the development team at Esri has recently made the tool even better: one can now save a time sequence or a band combination as a permanent URL that can be shared with others.  The flooding of 20 districts of Uttar Pradesh, India, in August and September 2016, for example, can easily be seen via this link that uses the application, with screenshots below.

Another example is the summer 2016 Fort McMurray wildfire in Alberta, Canada: the user can change the time to see the region’s vegetation cover before and after the fire, and the extent of the smoke during the fire.  Or you can analyze a different band combination, as seen here.

To do this, open the application.  Note that the application URL has been updated from the one we wrote about last year.  Move to an area of interest.  Select any one of the available thematic band renderers (such as agriculture, natural color, or color infrared), or create your own band combination using build.  Then turn on “time” to see your area of interest at different periods using your band combination.  Next, share this image with others: simply click on any one of the social platforms (Facebook or Twitter) in the upper right, which will create a short link that can be shared.  When the recipient opens the link, the Landsat app will open in exactly the same state it was in when the social platform tool was clicked.  Give it a try!


Landsat 8 Image for Allahabad India on 31 May 2016.