April 16, 2012 5 comments

Welcome to the Spatial Reserves blog.

The GIS Guide to Public Domain Data was written to provide GIS practitioners and instructors with the essential skills to find, acquire, format, and analyze public domain spatial data. Some of the themes discussed in the book include open data access and spatial law, the importance of metadata, the fee vs. free debate, data and national security, the efficacy of spatial data infrastructures, the impact of cloud computing, and the emergence of the GIS-as-a-Service (GaaS) business model. Recent technological innovations have radically altered how both data users and data providers work with spatial information to help address a diverse range of social, economic, and environmental issues.

This blog was established to follow up on some of these themes, promote a discussion of the issues raised, and host a copy of the exercises that accompany the book.  This story map provides a brief description of the exercises.


Ethics in Geospatial Decision-Making

September 11, 2017 Leave a comment

Our book and this blog frequently focus on the importance of making wise decisions when using geospatial data.  We often discuss the two-edged sword of the modern GIS era:  ‘Tis wonderful to have a plethora of geospatial data services at our fingertips, many of which are in real time, capable of being visualized in 3-D, and updated and curated with regularity.  Coupled with these services are a variety of easy-to-use spatial analysis tools built into desktop and web-based GIS software platforms.  But this availability of data and easy-to-use tools brings an increasing likelihood that decisions will be made without regard to the data’s sources, scales, update frequency, map projection, completeness of attributes, and other measures of quality.

Decisions are still in large part made by humans, and the human element has always been laden with ethical decisions, whether we realize it or not.  Adding to the ethical element is the fact that geospatial decisions involve land, which has economic but also personal and inherent value, and affects people who live on that land.  Geospatial decisions also affect the very air we breathe and water we drink.

How can we be more purposefully aware of ethical elements in our decisions based on geospatial data?  Some insightful chapters and articles will, I think, be of help.  One is the new chapter on Professional and Practical Ethics of GIS&T in the UCGIS GIS&T Body of Knowledge project by David DiBiase.  Another is a 7-Step guide to ethical decision-making, written in 1999 but still incredibly relevant.  I particularly like the tests that the author describes–the harm test, the publicity test, the defensibility test, the reversibility test, the colleague test, and the organization test.

Another excellent resource is Penn State’s ethics education resource for geospatial professionals, which lists interesting and pertinent case studies, codes of ethics, university course syllabi, and other resources.  In a recent article in Directions Magazine, Dr. Diana S. Sinton explores how ethics can be integrated into geospatial education.  She advocates that ethics be threaded throughout even an introductory GIS course rather than relegated to one lecture, as is often the case.

What are your thoughts regarding ethics in GIS?

Geospatial decisions are ethical decisions as well.

Categories: Public Domain Data

Evaluating GIS costs and benefits

August 28, 2017 1 comment

One of the themes in our book and this blog is to carefully evaluate the costs and benefits of geospatial data.  Consider this if you are a consumer of data, debating whether to purchase data that has been “cleaned up,” thereby saving you time, or to download a free, unprocessed version of that data, which saves you up-front money but may require quite a few hours of your time or your staff’s time.  A data-producing organization should likewise evaluate costs and benefits when deciding how to serve its data, and if and how to charge for it.

Chapter 4 of our book delves into these questions: “What is the true cost and value of spatial data?  How can the cost and value of spatial data be measured?  How do the policies determining cost and access ultimately affect the availability, quality, and use of spatial data?”

Other resources might be helpful.  One of my favorite pieces is this essay from Geospatial World on the Economic Value of Geospatial Data–The Great Enabler, as is this economic studies for GIS operations document from NSGIC.  A series of 10 case studies is summarized in an e-book from Esri entitled Return on Investment, and here are the results of research into 82 cost-benefit assessments across multiple countries.  One of my favorite “benefits from GIS implementation” pieces is this recent brief but pointed document from Ozaukee County.  A dated but still solid chapter on this topic from Obermeyer is here, with a case study in Ghana here.  The economic impact infographic that has probably received the most attention is from Oxera’s well-done “Economic Impact of Geo Services” study.


The top of the “Economic Impact of Geo Services” infographic from Oxera’s study.

What are your thoughts?  Should organizations still be charging for data in the 21st Century?  Should all geospatial data be open for anyone to use?  How should organizations pay for the creation and curation of geospatial data as the audience and uses for that data continue to expand?  Once geospatial data services are online, how can they best be updated and curated?

Best Available Data: “BAD” Data?

August 14, 2017 3 comments

You may have heard the phrase that the “Best Available Data” is sometimes “BAD” Data. Why?  As the acronym implies, BAD data is often used “just because it is right at your fingertips,” and is often of lower quality than the data that could be obtained with more time, planning, and effort.  We have made the case in our book and in this blog for 5 years now that data quality actually matters, not just as a theoretical concept, but in day to day decision-making.  Data quality is particularly important in the field of GIS, where so many decisions are made based on analyzing mapped information.

All of this daily-used information hinges on the quality of the original data.  Compounding the issue, the temptation to settle for the easily obtained grows as the web GIS paradigm, with its ease of use and plethora of data sets, makes it easier and easier to quickly add data layers and be off on your way.  To be sure, there are times when the easily obtained is also of acceptable or even high quality.  Judging whether it is acceptable depends on the data user and that user’s needs and goals: “fitness for use.”

One intriguing and important resource in determining the quality of your data is The Bad Data Handbook, published by O’Reilly Media, by Q. Ethan McCallum and 18 contributing authors.  They wrote about their experiences, methods, successes, and challenges in dealing with datasets that are “bad” in some key ways.  The resulting 19 chapters and 250-ish pages may make you want to put this on your “would love to but don’t have time” pile, but I urge you to consider reading it.  The book is written in an engaging manner; many parts are even funny, evident in chapter titles such as “When Databases Attack” and “Is It Just Me, or Does This Data Smell Funny?”

Despite the lively and often humorous approach, there is much practical wisdom here.  For example, many of us in the GIS field can relate to being somewhat perfectionist, so the chapter “Don’t Let the Perfect Be the Enemy of the Good” is quite pertinent.  In another example, the authors provide a helpful “Four Cs of Data Quality Analysis.”  These include:
1. Complete: Is everything here that’s supposed to be here?
2. Coherent: Does all of the data “add up?”
3. Correct: Are these, in fact, the right values?
4. aCcountable: Can we trace the data?
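The Four Cs lend themselves to simple programmatic checks.  The sketch below is my own toy illustration, not code from the book: the parcel field names and the rules (required fields, unique IDs, non-negative areas, a recorded source) are hypothetical.

```python
# A toy illustration of the Four Cs as programmatic checks.
# The field names and rules here are hypothetical, not from the book.
records = [
    {"parcel_id": "P-001", "area_m2": 480.0, "land_use": "residential", "source": "county"},
    {"parcel_id": "P-002", "area_m2": -15.0, "land_use": "commercial",  "source": "county"},
    {"parcel_id": "P-003", "area_m2": 910.0, "land_use": None,          "source": None},
]

def four_cs_report(rows):
    """Apply simple checks inspired by the Four Cs of data quality analysis."""
    required = ("parcel_id", "area_m2", "land_use", "source")
    # Complete: is everything here that's supposed to be here?
    incomplete = [r["parcel_id"] for r in rows if any(r.get(k) is None for k in required)]
    # Coherent: does the data "add up" internally (e.g., no duplicate IDs)?
    ids = [r["parcel_id"] for r in rows]
    coherent = len(ids) == len(set(ids))
    # Correct: are these, in fact, the right values (e.g., no negative areas)?
    suspect = [r["parcel_id"] for r in rows if r["area_m2"] is not None and r["area_m2"] <= 0]
    # aCcountable: can we trace the data back to a source?
    untraceable = [r["parcel_id"] for r in rows if not r.get("source")]
    return {"incomplete": incomplete, "coherent": coherent,
            "suspect_values": suspect, "untraceable": untraceable}

report = four_cs_report(records)
```

Even checks this simple would flag two of the three records above, which is the point: the Four Cs are questions you can operationalize, not just contemplate.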

Unix administrator Sandra Henry-Stocker wrote a review of the book here.  An online version of the book is here, from it-ebooks.info, but in keeping with the themes of this blog, you might ask whether reading it from that site, rather than purchasing the book, is fair to the author.  I think that purchasing the book would be well worth the investment.  Don’t let the 2012 publication date, the fact that it is not GIS-focused per se, and the frequent inclusion of code put you off; this really is essential reading–or at least skimming–for all who are in the field of geotechnology.


Bad Data book by Q. Ethan McCallum and others. 


Data Practitioner Profile Document Reviewed

July 31, 2017 2 comments

The recent document entitled “Profile of the Data Practitioner” (created by a panel with diverse backgrounds, published by EDC in Newton, Massachusetts, USA) is useful in several ways.  First, it succinctly outlines many of the issues we have focused on in this blog and in our book–data quality, critical thinking, domain knowledge, and others.  Second, it lists skills, knowledge, and behaviors, and is therefore an excellent though brief supplement to the Geospatial Technology Competency Model.  Third, it lists equipment, tools, and supplies, future trends, and industry concerns.  Fourth, page 2 of the document is a practical application of the Geographic Inquiry Model, as it describes how the data practitioner initiates a project, sources the data, transforms the data, analyzes the data, closes out the project, and engages in professional development.

The document should be helpful for those pursuing their own career path in GIS and data science, and for those designing and teaching courses and workshops in GIS in academia, nonprofit organizations, private companies, and government agencies.  I only wish the document were longer or linked to a longer report that would provide more detail.  Still, for a succinct document summarizing some key items that data practitioners need to have in place, this document is worth spending time reviewing and telling others about.

Data Portals for the Chesapeake Bay reviewed

July 17, 2017 3 comments

The Chesapeake Bay, situated along the east coast of the United States, is a rich source of geospatial data, in part because it has long been a focus of environmental restoration. One primary source is the Virginia Institute of Marine Science (VIMS), which maintains a Submerged Aquatic Vegetation (SAV) resource that includes an interactive GIS map.  The SAV maps and vegetation data contain information dating all the way back to 1971.

VIMS maps SAV because this vegetation is one of the best barometers of water quality: its beds filter polluted runoff, provide food for waterfowl, and provide habitat for blue crabs, juvenile rockfish (striped bass), and other aquatic species; the beds are associated with clear water, and their presence helps improve water quality.  Even if you are not interested in analyzing the vegetation per se, the site is an excellent resource for data and imagery on the Chesapeake Bay.

Some of my other recommended data sites in the region include the Chesapeake Bay Data Hub, the Maryland Open Map portal (which we reviewed here), the Susquehanna River Basin Commission, Virginia’s open data portal, other Virginia portals, and the USGS’s Chesapeake Bay site.  Try these resources, and we look forward to your comments below.


Portion of submerged aquatic vegetation imagery and mapped data from the Virginia Institute of Marine Science.

Reviewing the US City Open Data Census Portal of Geospatial Content

July 2, 2017 1 comment

The US City Open Data Census portal is “an ongoing, crowdsourced measure of the current state of access to a selected group of datasets in municipalities across the United States.”  The portal represents another example of a trend we have been noting in this blog for quite some time: a catalog that combines crowdsourced contributions with content created by the site’s maintainers.  In this case, “Any community member can contribute an assessment of these datasets in their municipality at any time. Census content will be peer-reviewed periodically by a volunteer team of Census librarians. [..]  The US City Open Data Census began as a partnership between Code for America, the Sunlight Foundation, and Open Knowledge International. It is maintained by Sunlight Foundation staff members, with technical support from Open Knowledge, local outreach by Code for America brigades, advising from the Open Government Data working group, and contributions from many members of the wider community.”

In the case of this site, don’t think “Census” in terms of demographic data gathered by statistical agencies, but rather “census” as a catalog of geospatial data for municipalities.  The 18 themes currently cataloged for urban areas include crime, parcels, zoning, and others, but also themes that fall outside the typically considered, and sometimes aspatial, categories, such as lobbyist activity, web analytics, and spending.  At this time, the site’s focus is on the U.S. only.  Cities are ranked by the variety and amount of data in the catalog, and at the time of this writing, Las Vegas had achieved the top score. Testing this site, I was able to find quite a volume of data, in many formats that I could use, and in some formats I was not familiar with but was able to learn more about.  If a data set I needed was not available, which occurred on more than one occasion, the site told me whom to contact.

If a data user wanted to obtain a set of data to compare across cities, this site would save that user quite a bit of time scouring each city’s GIS data site.  Therefore, even though the site’s ambitious list of themes is empty for many cities, and in many ways this project is just getting started, this resource may be valuable for your needs.  And in part because it is crowdsourced and curated, it could become even more valuable in the future.  Time will tell if it persists.  And, like any resource, be critical of its sources and use it if you deem that it will meet your needs.



Visualizing data cataloged by the US City Open Data Census portal, ranked by “score”, with a lower number indicating that a greater volume and wider variety of data is available for that city.

Data Quality on Live Web Maps

June 19, 2017 3 comments

Modern web maps and the cloud-based GIS tools and services upon which they are built continue to improve in richness of content and in data quality.  But as we have focused on many times in this blog and in our book, maps are representations of reality.  They are extremely useful representations, to be sure, particularly so in the cloud, but still representations.  These representations depend upon the data sources, accuracy standards, map projections, completeness, processing and rendering procedures used, regulations and policies in place, and much more.  A case in point is the offset between street data and satellite image data that I noticed in mid-2017 in Chengdu in south-central China.  The streets appear about 369 meters southeast of their position on the satellite image (below):
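Measuring such an offset amounts to reading the same feature’s coordinates from both layers and computing the great-circle distance between them.  A minimal sketch using the haversine formula follows; the two coordinate pairs are hypothetical values near Chengdu, chosen only to produce an offset of a few hundred meters, not measurements from any actual basemap.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# The same street intersection as read from the imagery layer and the street
# layer (hypothetical coordinates near Chengdu, for illustration only):
img_pt = (30.6586, 104.0647)
street_pt = (30.6565, 104.0675)
offset = haversine_m(img_pt[0], img_pt[1], street_pt[0], street_pt[1])
```

For points this close together, a flat-Earth approximation would give nearly the same answer, but the haversine form stays accurate at any separation.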


Puzzled, I panned the map to other locations in China.  The offsets varied, but they appeared everywhere in the country; for example, note the offset of 557 meters where a highway crosses the river at Dongyang, again to the southeast:


As of this writing, the offset appears in the same cardinal direction and only in China; after examining border towns with North Korea, Vietnam, and other countries, the offset appears to stop at those borders.  No offsets exist in Hong Kong or Macao.  Yahoo Maps and Bing Maps both show the same types of offsets in China (Bing Maps example, below):


MapQuest, which uses an OpenStreetMap base, showed no offset.  I then tested ArcGIS Online with a satellite image base and the OpenStreetMap base, and there was no offset there, either (below).  This offset is a datum issue related to national security that is documented in this Wikipedia article.  The same data restriction issues that we discuss in our book and in our blog touch on other aspects of geospatial data, such as fines for unauthorized surveys, the lack of geotagging information on many cameras when the GPS chip detects a location within China, and the seeming unlawfulness of crowdsourced mapping efforts such as OpenStreetMap.
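The datum in question is commonly known as GCJ-02.  A reverse-engineered approximation of the WGS-84 to GCJ-02 transformation has circulated in open-source projects for years; a sketch of it appears below.  To be clear, this is community code, not any provider’s official algorithm, and the exact constants vary between implementations, but it illustrates why the offset drifts smoothly across the country rather than being a single fixed shift.

```python
import math

# Constants used by the widely circulated open-source approximation of GCJ-02.
A = 6378245.0                # semi-major axis (meters)
EE = 0.00669342162296594323  # eccentricity squared

def _tlat(x, y):
    r = -100.0 + 2.0 * x + 3.0 * y + 0.2 * y * y + 0.1 * x * y + 0.2 * math.sqrt(abs(x))
    r += (20.0 * math.sin(6.0 * x * math.pi) + 20.0 * math.sin(2.0 * x * math.pi)) * 2.0 / 3.0
    r += (20.0 * math.sin(y * math.pi) + 40.0 * math.sin(y / 3.0 * math.pi)) * 2.0 / 3.0
    r += (160.0 * math.sin(y / 12.0 * math.pi) + 320.0 * math.sin(y * math.pi / 30.0)) * 2.0 / 3.0
    return r

def _tlon(x, y):
    r = 300.0 + x + 2.0 * y + 0.1 * x * x + 0.1 * x * y + 0.1 * math.sqrt(abs(x))
    r += (20.0 * math.sin(6.0 * x * math.pi) + 20.0 * math.sin(2.0 * x * math.pi)) * 2.0 / 3.0
    r += (20.0 * math.sin(x * math.pi) + 40.0 * math.sin(x / 3.0 * math.pi)) * 2.0 / 3.0
    r += (150.0 * math.sin(x / 12.0 * math.pi) + 300.0 * math.sin(x / 30.0 * math.pi)) * 2.0 / 3.0
    return r

def wgs84_to_gcj02(lat, lon):
    """Shift a WGS-84 point onto the GCJ-02 grid (approximate, community-derived)."""
    dlat = _tlat(lon - 105.0, lat - 35.0)
    dlon = _tlon(lon - 105.0, lat - 35.0)
    rlat = lat / 180.0 * math.pi
    magic = 1 - EE * math.sin(rlat) ** 2
    dlat = (dlat * 180.0) / ((A * (1 - EE)) / (magic * math.sqrt(magic)) * math.pi)
    dlon = (dlon * 180.0) / (A / math.sqrt(magic) * math.cos(rlat) * math.pi)
    return lat + dlat, lon + dlon

# Near Chengdu, the computed shift is on the order of a few hundred meters,
# consistent with the offsets observed in the web maps discussed above.
glat, glon = wgs84_to_gcj02(30.66, 104.06)
```

Because the shift is a smooth nonlinear function of position, the 369-meter offset seen at Chengdu and the 557-meter offset at Dongyang are exactly what a location-dependent datum would produce.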

Furthermore, as we have noted, the satellite images are themselves processed and tiled data sets, and like other data sets, they need to be critically scrutinized as well.  They should not be considered “reality” despite their appearance of being the “actual” Earth’s surface.  They too contain error: they may have been taken on different dates or in different seasons, may be projected on a different datum, and other data quality aspects need to be considered.


Another difference between these maps is the wide variation in the amount of detail in the streets data for China.  OpenStreetMap was the most complete; the other web mapping platforms offered varying levels of detail, some of which were seriously lacking, surprisingly so in 2017, in almost every type of street except major freeways.  The streets content was much more complete in other countries.

It all comes back to identifying your end goals in using any sort of GIS or mapping package.  Being critical of the data can and should be part of your decision-making process and your choice of tools and maps.  By the time you read this, the image offset problem could have been resolved.  Great!  But are there now new issues of concern?  Data sources, methods, and quality vary considerably among different countries.  Furthermore, the tools and data change frequently, along with the processing methods, and being critical of the data is not just something to practice one time, but rather is fundamental to everyday work with GIS.