Welcome

April 16, 2012

Welcome to the Spatial Reserves blog.

The GIS Guide to Public Domain Data was written to provide GIS practitioners and instructors with the essential skills to find, acquire, format, and analyze public domain spatial data. Some of the themes discussed in the book include open data access and spatial law, the importance of metadata, the fee vs. free debate, data and national security, the efficacy of spatial data infrastructures, the impact of cloud computing, and the emergence of the GIS-as-a-Service (GaaS) business model. Recent technological innovations have radically altered how both data users and data providers work with spatial information to help address a diverse range of social, economic, and environmental issues.

This blog was established to follow up on some of these themes, promote a discussion of the issues raised, and host a copy of the exercises that accompany the book.  This story map provides a brief description of the exercises.


Dangermond and Goodchild on building geospatial infrastructure

January 5, 2020

A new open access article from Dangermond and Goodchild on building geospatial infrastructure is germane to this blog and our book’s focus on geospatial data.  Moreover, at the dawn of the new decade, I regard this article as an important one to read and to reflect upon.

The article’s abstract states, “Many visions for geospatial technology have been advanced over the past half century. Initially researchers saw the handling of geospatial data as the major problem to be overcome. The vision of geographic information systems arose as an early international consensus. Later visions included spatial data infrastructure, Digital Earth, and a nervous system for the planet. With accelerating advances in information technology, a new vision is needed that reflects today’s focus on open and multimodal access, sharing, engagement, the Web, Big Data, artificial intelligence, and data science. We elaborate on the concept of geospatial infrastructure, and argue that it is essential if geospatial technology is to contribute to the solution of problems facing humanity.”

Besides providing a concise yet insightful history of the evolution of GIS and spatial data, one of the most thought-provoking statements in the article, in my opinion, is that “a digital twin should also replicate how the Earth works, by using software to reproduce the processes that modify the Earth’s physical and social systems.”  In other words, for us to solve the complex problems of our 21st-century world, GIS must be able to show how the Earth’s systems interact and work, and moreover, how they should work; that is, how can we use GIS and spatial data to plan a more resilient future?

I also found the following statement to be wonderfully useful, “Today we think of the basic element of geospatial technology as a tuple <x,y,z,t,a> where x, y, and z are location in three-dimensional space, t is time, and a is an attribute of that location in space-time.”  And I have personally used Jack Dangermond’s metaphor of GIS being an intelligent nervous system for the planet, mentioned in the article, dozens of times in my own presentations over the past four years.
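As a concrete illustration (my own sketch, not from the article), that tuple maps naturally onto a small record type:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class GeoObservation:
    """One <x, y, z, t, a> tuple: an attribute at a location in space-time."""
    x: float      # longitude (decimal degrees)
    y: float      # latitude (decimal degrees)
    z: float      # elevation (meters)
    t: datetime   # time of observation
    a: float      # attribute value at that location and time

# Example: an air temperature reading (degrees C) in Denver at noon
obs = GeoObservation(x=-104.99, y=39.74, z=1609.0,
                     t=datetime(2020, 1, 5, 12, 0), a=-2.5)
print(obs)
```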

But the article is much more than an account of history and a conceptualization of how to understand GIS–it offers challenges to us in the GIS community.  For example, it states that “little advance has been made in measuring, representing, and dealing with the uncertainty that is always present when geographic reality is measured and captured in digital data, but not always present in published maps.”  The article also takes special note of key progress that has led us to this very exciting moment in the use of geospatial technology, including (1) portals, (2) open data, (3) engagement, (4) collaboration, (5) story maps, (6) device independence, and (7) Cloud GIS.  These are not just ideas; they are happening now, with real tools and infrastructure that enable people to accomplish real tasks.  The article also highlights some advancements that could lead to the very real possibility that GIS will “break wide open” (my phrase) in the decade of the 2020s: (1) GeoAI, (2) scripting and workflows, (3) replicability, (4) predictive modeling, and (5) real-time analysis.

The article concludes with what I believe to be an excellent question that cuts to the heart of what we in the industry should be asking:  “What, then, should be the goals of geospatial infrastructure, and how should it be configured?”  In other words, the advancements are great, but we need to ask ourselves where we should be taking the technology if we are seeking a more sustainable future.  It’s not enough to ride on the ship; we need to steer it.  Dangermond and Goodchild lay out some challenges in this section, such as the following statement, which I think points to “think outside of the software box and re-engineer the software tool if necessary”:  “Decisions that were made during periods of very limited computing power become enshrined in practices that may be very hard to shake.”  They also discuss resilience, protecting biodiversity, collaboration, and ensuring individual privacy.  The authors end with this statement, which I believe is a challenge for all of us to take seriously:  “But what is missing in our view is a vision, a ‘moonshot,’ a statement of principles against which progress can be measured.”

–Joseph Kerski

 

Download Arctic area digital elevation data from ArcticDEM

December 22, 2019

Here we are at the winter solstice in the Northern Hemisphere and it seems appropriate to discuss polar data.  ArcticDEM is an NGA (National Geospatial-Intelligence Agency)–NSF (National Science Foundation) public-private initiative to produce a high-resolution (2 meter), high-quality digital surface model (DSM) of the Arctic using optical stereo imagery, high-performance computing, and open source photogrammetry software.  The majority of ArcticDEM data was generated from the panchromatic bands of the WorldView-1, WorldView-2, and WorldView-3 satellites. A small percentage of data was also generated from the GeoEye-1 satellite sensor.  The resource covers all land north of 60 degrees north latitude.  Yes!  Not only Alaska, but Scandinavia, Russia, Canada, and Iceland.  For more information, see this page.  For the Arctic DEM Explorer web mapping application from Esri, see this page, and for the bare-bones but useful file index for fast downloading, see this page.

In my opinion, the most useful site for downloading Arctic DEM data is this web mapping application, the ArcticDEM index and data download.  This application allows a user to select specific index tiles of digital elevation model data.  Each tile reveals information about the DEM and a download URL.  Each cell is about 2 GB, with over 18 TB on the entire site.  Truly a treasure trove of data!  For selecting multiple indices, use the ‘Query’ tool to draw an area and return information on intersecting DEM tile indices. You can export these results for your reference, and they also include the download URLs.

Click on any location for attribute information.  Find the “fileurl” attribute, click on More info, and then you will be able to download the 2 meter elevation data for that location.  The query widget allows for the retrieval of information from source data by executing an intersect query against either 2 m DEM strips or 2 m DEM mosaics.  The resource also includes a swipe tool with which you can compare the content of two different layers on the map, such as the index layer and the hillshade.
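For scripted or bulk retrieval, the following is a minimal Python sketch of that workflow, under stated assumptions: the fileurl below is a placeholder, so substitute the actual value copied from a tile’s attribute table.

```python
import requests

# Placeholder: copy the real value from a tile's "fileurl" attribute in the
# ArcticDEM index web map; this URL is illustrative only.
file_url = "https://example.com/arcticdem/tiles/xx_yy_2m_v3.0.tar.gz"

# Stream the download: tiles run to roughly 2 GB, so avoid holding one in memory.
with requests.get(file_url, stream=True, timeout=60) as response:
    response.raise_for_status()
    with open("arcticdem_tile.tar.gz", "wb") as out:
        for chunk in response.iter_content(chunk_size=1024 * 1024):
            out.write(chunk)

print("Download complete.")
```

The same loop can be pointed at each URL in an exported Query-tool results file to fetch an entire study area unattended.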

The best news about this resource, and consistent with our continued mantra about GIS-as-a-Service, may be that the site allows the data to be examined as an ArcGIS Online item and also consumed as an image service via a URL.


Interface of the Arctic DEM Index and Data Download resource. 


The Arctic DEM data streamed and viewed in ArcGIS Online. 

I look forward to hearing your reactions to this resource.

–Joseph Kerski

A Geodata Fabric for the 21st Century: article reflections

December 15, 2019

A recent article by Jeff de La Beaujardière, A Geodata Fabric for the 21st Century, touches on many themes in geospatial technology pertinent to this blog and beyond.

Jeff begins by reminding us of the 4 V’s of big data–volume, variety, velocity, and variability–telling us that we are firmly in the age of big data, with the NISAR satellite soon to be providing 85 TB of data per day, as just one example.  But he also states that geospatial and earth science is not the only field grappling with big data, citing impressive numbers coming out of astronomy and genomics (genome science).  Jeff says, “We need a more unified approach such that each data provider—whether in the atmosphere, land surface, seismology, hydrology, oceanography, or cryosphere domain—can contribute to a shared and commonly accessible framework.”  To build it, he says we need (1) a new type of storage (such as object storage); (2) to minimize the number of times we move data (I think of how many times in a typical project I move data around:  Can I reduce this number?); (3) to take advantage of the cloud; and (4) to keep things simple.  Jeff says, “A user should be able simply to ask for—or directly visualize—a desired data set, time range, and area of interest while software behind the scenes automatically provides what was requested.”  Amen to that!  And he makes a good tie to the role that machine learning could play.  Could the Esri geospatial cloud help enable this?
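To make that last vision concrete, here is a purely hypothetical sketch of what such a request could look like in code; geodata_fabric is an imagined library used only for illustration, not a real package.

```python
# Hypothetical: "geodata_fabric" is an imagined library sketching the kind of
# interface the article envisions; it is not a real, installable package.
from geodata_fabric import open_dataset

# The user states WHAT they want -- dataset, time range, area of interest --
# and the fabric resolves storage, format, and tiling behind the scenes.
ndvi = open_dataset(
    "modis/ndvi",                         # logical dataset name, not a file path
    time=("2019-06-01", "2019-09-30"),    # time range of interest
    bbox=(-109.05, 37.0, -102.05, 41.0),  # area of interest (Colorado)
)
ndvi.mean(dim="time").plot()              # summer mean NDVI, no downloads managed by hand
```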

Taking a step back from the technological and logistical aspects of collecting and managing large volumes of data, we also need to ask what we want from all this data in the short, medium, and long term. Our aspirations and expectations are sometimes harder to define and maintain. What do we want to do with all this data, and when do we need to do it? There are many great examples of some of the things we can do with spatial data, but sometimes they seem to focus more on the technology (the latest version of a particular software package, or an innovation in data management technology) than on progress towards a longer-term goal such as improved environmental and resource management.

The improvements in data collection, storage, and management over the last 50 years have revolutionised what we can capture and what we can do with the data. To make the most of these invaluable data assets, we must also avoid the distraction of the bright shiny lights of technology for technology’s sake and keep in mind what we are trying to achieve. Start with the desired end result: what data helps achieve it, and what is the best source, format, and currency, regardless of how the data are stored and whose server they sit on?

–Jill Clark, Joseph Kerski

 


Be a Wise Consumer of Fun Posts, too!

December 12, 2019

Around this time of year, versions of the following story seem to make their way around the internet:

The passenger steamer SS Warrimoo was quietly knifing its way through the waters of the mid-Pacific on its way from Vancouver to Australia. The navigator had just finished working out a star fix & brought the master, Captain John Phillips, the result. The Warrimoo’s position was LAT 0º 31′ N and LON 179º 30′ W.  The date was 31 December 1899.

“Know what this means?” First Mate Payton broke in, “We’re only a few miles from the intersection of the Equator and the International Date Line”.  Captain Phillips was prankish enough to take full advantage of the opportunity for achieving the navigational freak of a lifetime.  He called his navigators to the bridge to check & double check the ship’s position.  He changed course slightly so as to bear directly on his mark.  Then he adjusted the engine speed. The calm weather & clear night worked in his favor.

At midnight the SS Warrimoo lay on the Equator at exactly the point where it crossed the International Date Line! The consequences of this bizarre position were many:  The forward part (bow) of the ship was in the Southern Hemisphere & the middle of summer. The rear (stern) was in the Northern Hemisphere & in the middle of winter.  The date in the aft part of the ship was 31 December 1899.  Forward it was 1 January 1900.  This ship was therefore not only in two different days, two different months, two different years, two different seasons, but in two different centuries – all at the same time.

I have successfully used many types of geographic puzzles with students and with the general public over the years, and I enjoy this story a great deal.  But in keeping with our reminders on this blog and in our book to “be critical of the data,” reflections on the incorrect or absent aspects of this story can be instructive and can heighten interest. The SS Warrimoo was indeed an actual ship: it was built by Swan & Hunter Ltd in Newcastle Upon Tyne, UK, in 1892, and was sunk after a collision with a French destroyer during World War I, in 1918.  Whether it was sailing in the Pacific in 1899, I do not know.

The version of this story on CruisersForum states that it is “mostly true.”  What lends itself to scrutiny?  Let us investigate a few of the geographic aspects in the story.

First, the statement “working out a star fix” leaves out the fact that chronometers, rather than a sextant alone, were used to work out the longitude.  (And I highly recommend reading the book Longitude by Dava Sobel.)  Second, the International Date Line (IDL) as we know it today was not in place back in 1899.  The nautical date line, not the same as the IDL, is a de jure construction determined by international agreement. It is the result of the 1917 Anglo-French Conference on Time-keeping at Sea, which recommended that all ships, both military and civilian, adopt hourly standard time zones on the high seas. The United States adopted its recommendation for U.S. military and merchant marine ships in 1920 (Wikipedia).

Third, the distance from LAT 0º 31′ N and LON 179º 30′ W to LAT 0º 0′ N and LON 180º is about 42 nautical miles, and the ship could have traveled at a speed of no more than 20 knots (23 mph).  Therefore, conceivably, the ship could have reached the 0/180 point in a few hours, but whether it could have maneuvered in such a way to get the bow and stern in different hemispheres is unlikely, given the accuracy of measurement devices at the time.  Sextants have an error of at least 2 kilometers in latitude, and chronometers about 30 kilometers in longitude. Or, they could already have reached the desired point earlier in the day and not have known it.  Even 120 years later, in my own work with GPS receivers at intersections of full degrees of latitude and longitude, it is difficult to get “exactly” on the desired point:  Look carefully at the GPS receiver in my video at 35 North Latitude 81 West Longitude as an example.  An interesting geographic fact is that, going straight east or west along the Equator, it is possible to cross the date line three times (see map below).
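As a quick check of the transit arithmetic (my own back-of-the-envelope sketch, using the approximation that near the Equator one minute of arc is about one nautical mile in any direction):

```python
import math

# Near the Equator, one minute of arc is ~1 nautical mile in both latitude and
# longitude, so a flat-plane approximation is fine over this short distance.
dlat_nm = 31.0          # 0º 31' of latitude to cover, in nautical miles
dlon_nm = 30.0          # 179º 30' W to 180º: 30' of longitude, in nautical miles
distance_nm = math.hypot(dlat_nm, dlon_nm)

speed_kn = 20.0         # generous top speed for a steamer of that era, in knots
hours = distance_nm / speed_kn

print(f"{distance_nm:.1f} nm at {speed_kn:.0f} knots takes {hours:.1f} hours")
# -> 43.1 nm at 20 knots takes 2.2 hours, consistent with "a few hours" above
```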

Our modern digital world is full of fragments that are interesting if not completely accurate, but as GIS professionals and educators, we would do well to apply “be critical of the data” principles even to this type of information.  The story is still interesting as a hypothetical “what could have happened” and provides great teachable moments even if the actual event never occurred.


The International Date Line (CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=147175).

New free course on teaching and learning with the ArcGIS Living Atlas of the World

December 8, 2019

I am very pleased to announce that the course that my colleague and I created on Teaching with ArcGIS Living Atlas of the World is now available!

https://www.esri.com/training/catalog/5dc1b74ce4212b48e187e837/teaching-with-arcgis-living-atlas-of-the-world/

The course is free, fun, and rigorous!  It provides skills and perspectives for making effective use of the wonderful resources that are in the Living Atlas in teaching and learning.  We first wrote about the Living Atlas on this blog, here.  Yes, the course is geared toward educators, but it could be useful for non-educators who love data as well.

–Joseph Kerski

Front page of the Living Atlas course.

Application for Extracting and Exploring Analysis Ready Samples (AρρEEARS)

November 24, 2019

Imagine a data site where you can upload your own data for processing and spatial analysis, using tools that you do not own!  The Application for Extracting and Exploring Analysis Ready Samples (AρρEEARS) allows you to do just that.  I recently attended a presentation on AppEEARS at the Applied Geography Conference and was very impressed.  AppEEARS offers a simple, efficient way to access and transform geospatial data from a variety of federal data archives, and hence merits highlighting in this Spatial Reserves data blog. AppEEARS enables data users to subset and extract geospatial datasets using spatial, temporal, and band/layer parameters.

Two types of sample requests are available: point samples for geographic coordinates and area samples for spatial areas via vector polygons.  Results stay on the LP DAAC site for 30 days, during which time you can archive them somewhere else or download them to your own device or server.

You need a free Earthdata account to use the site, but once you get one here, you can be off and running.  AppEEARS is tied to the LP DAAC (Land Processes Distributed Active Archive Center), in which there is no shortage of data.  Sample requests submitted to AρρEEARS provide users not only with data values, but also with associated quality data values. Interactive visualizations with summary statistics are provided for each sample within the application, which allow users to preview and interact with their samples before downloading their data.

What’s more, you can also access the AρρEEARS API. This API allows users to write programs to interact with AρρEEARS. This is largely the same API that powers the AρρEEARS user interface.
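As an illustration, here is a minimal sketch of a scripted point-sample request, based on my reading of the AρρEEARS API documentation; the base URL, endpoint paths, product and layer names, and payload fields are assumptions to verify against the current docs.

```python
import requests

# Base URL as given in the AppEEARS API documentation; verify before use.
API = "https://appeears.earthdatacloud.nasa.gov/api"

# 1. Log in with free Earthdata credentials to obtain a bearer token.
login = requests.post(f"{API}/login", auth=("your_earthdata_user", "your_password"))
login.raise_for_status()
headers = {"Authorization": "Bearer {}".format(login.json()["token"])}

# 2. Submit a point sample: one year of MODIS NDVI at a single location.
#    Product and layer names are examples; list valid ones via GET /product.
task = {
    "task_type": "point",            # or "area", with a vector polygon
    "task_name": "ndvi-example",
    "params": {
        "dates": [{"startDate": "01-01-2018", "endDate": "12-31-2018"}],
        "layers": [{"product": "MOD13Q1.006", "layer": "_250m_16_days_NDVI"}],
        "coordinates": [{"latitude": 37.75, "longitude": -119.60}],  # Yosemite
    },
}
submitted = requests.post(f"{API}/task", json=task, headers=headers)
submitted.raise_for_status()
task_id = submitted.json()["task_id"]

# 3. Poll the task; once its status is "done", the bundle endpoint lists the
#    result files (data values plus the associated quality values).
status = requests.get(f"{API}/task/{task_id}", headers=headers).json()["status"]
print(task_id, status)
```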

My favorite part of AppEEARS is the e-learning resources zone, here.  Presentations, videos, and webinars are housed there, but the tutorials and lessons stand out.  These are detailed, clear, and can be used as self-contained lessons for you, your colleagues, or students to learn about analysis methods, spatial data, and earth phenomena such as wildfires.  For example, using a tutorial written by Danielle Golon from Innovate Inc (a USGS contractor), you can generate remote sensing-derived environmental descriptors to monitor Yosemite National Park without downloading the remotely sensed data itself:  All of your processing is done on the AppEEARS site, and you will use imagery, box-and-whisker plots of NDVI values, and other tools and data to analyze several fires from 2013 to 2018 over space and time.  You will use NASA Visible Infrared Imaging Radiometer Suite (VIIRS) and Moderate Resolution Imaging Spectroradiometer (MODIS) data.

Using another tutorial, you will generate environmental descriptors of bus stops in the Phoenix metro area to determine which bus stops could benefit from heat relief shelters.  This tutorial uses MODIS data and daily surface weather data.

 


Sample AppEEARS temporal data for fire analysis.  

I highly recommend giving the AppEEARS resources and tools a try.

–Joseph Kerski

An introduction to Ethics in GIS

November 10, 2019

One of the objectives of this blog and our book is not only to help you gain technical knowledge about GIS and data, but also to help you understand the societal issues surrounding data.  Ethics is central to many of these societal issues.  We have written about ethics in geospatial decision making, ethics in using images in mapping projects, company ethics vs. technical reputation, and ethics surrounding data quality issues.  But here, let us discuss one way of introducing ethics to co-workers and students, with an example of how I have integrated ethics into one of my own courses on cartography and geo-visualization.  The following is the actual text and readings that I use in this course.  I look forward to your reactions.

Ethics in GIS.  Ethics in science is an expansive topic; it is introduced here, but you will have the opportunity to explore it further later in this course.  Ethics matter in GIS because:  (1) Knowing that maps are powerful means of communication, you should take your responsibility as a map author seriously.  (2) Knowing from our brief discussion on crowdsourcing and citizen science that everyone is now a potential map producer, and no longer just a map consumer, there are more maps in existence than ever before–with a wide variety of quality and purposes–some well documented, some not so.  That said, maps still have an aura of authenticity–they tend to be believed.  Again, take that responsibility seriously, and do not intentionally mislead your audience.

The social implications of GIS began to be examined during the mid-1990s with books such as Ground Truth.  Another oft-cited book on this topic is How to Lie with Maps by geographer Mark Monmonier, which examined the ways that maps are distortions–intentional and unintentional–of reality.

Code of Ethics.  There are several key items that are generally thought to be included in a code of ethics for people working in the field of GIS.  The first is to have a straightforward agenda, ensuring that the purpose of your map is evident to the map reader.  It should not be deceptive or confusing, but rather transparent in its purpose.  The second code is to get to know your intended audience as much as you can, so you can effectively communicate through maps.  The third code is to not intentionally lie with data–do not symbolize or classify the data with the intent to deceive.  The fourth code is that a map should show all relevant data as completely as possible–do not intentionally leave out features or context that could help the reader understand the phenomenon, again balancing this with the guidelines about abstracting and generalizing.

The fifth code is that a map should not discard contrary data just because it is contrary.  Rather, your map should be, as much as possible, a neutral representation of reality, just as your research often should be.  The sixth code is that the map should strive for an accurate portrayal of the data, where the data is neither diminished nor exaggerated.  The seventh code is to avoid plagiarizing.  Just as in your research, you should always properly cite your sources of information. You can cite sources via the map’s metadata.  The eighth code is to select symbols that will not bias the map. The symbols should be neutral representations of features.  The classification and projection, too, should be chosen so that potential bias is minimized.  The ninth code is that the map should be repeatable, such that another GIS professional should be able to independently create a similar map using the same data and focusing on the same message. The tenth code is to be sensitive to different cultural values and principles when making your map, such as in your choice of colors and symbols.  In summary, when creating a map, you should strive to provide a truthful, neutral representation of reality targeted specifically for your audience’s level of knowledge so that your map can effectively convey your intended message.

(Source:  www.spatialquerylab.com for the ten cartographer’s codes of ethics in this document, with modifications by Joseph Kerski.)

For more on geospatial ethics, (1) see the GIS Certification Institute’s Code of Ethics:  https://www.gisci.org/Ethics/CodeofEthics.aspx and (2) see these articles:

(1) The GIS Professional Ethics Project:  Practical Ethics Education for GIS Pros, by David DiBiase et al. (2009):  http://lazarus.elte.hu/cet/academic/icc2009/dibiase.pdf

(2) A National Academy of Sciences report:  National Academy of Sciences.  2018.  Data Matters:  Ethics, Data, and International Research Collaboration in a Changing World: Proceedings of a Workshop.  https://www.nap.edu/catalog/25214/data-matters-ethics-data-and-international-research-collaboration-in-a

–Joseph Kerski


–Photograph by Joseph Kerski at a high school that is active in preparing students for business careers.  It is my hope that ethics are included in the discussion here and in science, business, GIS, and all other academic programs.
