Welcome

April 16, 2012

Welcome to the Spatial Reserves blog.

The GIS Guide to Public Domain Data was written to provide GIS practitioners and instructors with the essential skills to find, acquire, format, and analyze public domain spatial data. Some of the themes discussed in the book include open data access and spatial law, the importance of metadata, the fee vs. free debate, data and national security, the efficacy of spatial data infrastructures, the impact of cloud computing, and the emergence of the GIS-as-a-Service (GaaS) business model. Recent technological innovations have radically altered how both data users and data providers work with spatial information to help address a diverse range of social, economic, and environmental issues.

This blog was established to follow up on some of these themes, promote a discussion of the issues raised, and host a copy of the exercises that accompany the book.  This story map provides a brief description of the exercises.



New working lists of US Federal and State GIS portals

January 15, 2018

Joseph Elfelt of MappingSupport.com has compiled a very helpful working list of addresses for over 40 federal ArcGIS servers with open MapServer and ImageServer data:

https://mappingsupport.com/p/surf_gis/list-federal-GIS-servers.pdf

And a list of over 50 state server addresses:

https://mappingsupport.com/p/surf_gis/list-state-GIS-servers.pdf

The lists also contain some key caveats and tips for finding local GIS data.  Joseph welcomes suggestions of additional federal or state servers to add; his contact information is at the top of the lists.  It is very good news that these already excellent resources will continue to be updated.
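If you prefer to explore these servers programmatically, each address in the lists points to an ArcGIS Server REST services directory, which returns its catalog as JSON when you append f=json.  Below is a minimal Python sketch of that idea; the server URL is a placeholder, so substitute any address from the lists.

```python
# A minimal sketch of listing the services an ArcGIS Server exposes through its
# REST services directory. The server URL below is a placeholder; substitute any
# address from the federal or state lists.
import json
import urllib.request

SERVER = "https://example.gov/arcgis/rest/services"  # hypothetical server root

def list_services(url):
    """Return the folders and services advertised at an ArcGIS REST endpoint."""
    with urllib.request.urlopen(url + "?f=json") as response:
        catalog = json.load(response)
    return catalog.get("folders", []), catalog.get("services", [])

folders, services = list_services(SERVER)
for svc in services:
    # Each entry typically reports a name and a type such as MapServer or ImageServer.
    print(svc.get("name"), svc.get("type"))
```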


A section of the very helpful federal and state lists of servers with open MapServer or ImageServer data, compiled by Joseph Elfelt. 

Accessing and Using Lidar Data from The National Map

January 8, 2018

We have written about the USGS data portal NationalMap numerous times in this blog and in our book, but since the site keeps getting enhanced, a re-examination is warranted.  One of the enhancements over the past few years is the addition of Lidar data.  I recently tested searching for and downloading Lidar data on the site and wanted to report my findings.  For videos of some of these procedures, go to the YouTube channel geographyuberalles and search on Lidar.

From a user perspective, the site is still a bit challenging: the user encounters moments in the access and download process where it is not clear how to proceed.  However, (1) the site is slowly improving, and (2) it is worth investigating chiefly because of its wealth of data holdings:  It is simply too rich a resource to ignore.  One challenge with NationalMap, as with many other data portals, is how to effectively narrow a search that returns thousands of results.  This in part reflects the open data movement that we have been writing about, so it is a good problem to have, albeit still cumbersome in this portal.  Here are the procedures to access and download the Lidar data from the site:

  1. To begin, visit the National Map at https://nationalmap.gov/ and select “Elevation” from this page.
  2. Select “Get Elevation Data” from the bottom of the Elevation page.  This is one of several quirks about the site – why isn’t this link in a more prominent position or in a bolder font?
  3. In the left-hand column of the elevation products page, select “1 meter DEM.”
  4. Select the desired format, then select “Show Availability”.  Zoom to the desired area; several tools are provided for doing so.  In my example, I was interested in Lidar data for Grand Junction, in western Colorado.
  5. The list of available products will appear in the left-hand column.  Lidar is provided in 10,000 x 10,000-meter tiles.  In my example, 108 products exist for the Grand Junction Lidar dataset.  Use “Footprint” to help you identify the areas in which you need data–the footprints appear as helpful polygon outlines.  At this point, you can save your results as text or CSV, which I found to be quite handy (see the scripted sketch after this list for one way to use that export).
  6. You can select the tiles you need one by one to add them to your cart, or select “Page” to select all items.  Open the Cart to download the tiles manually, or select the “uGet Instructions” for details about downloading multiple files.  Your data will be delivered in zip format right away, though Lidar files are large and may require some time to download.
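As a convenience, here is a small, unofficial Python sketch of step 6 for those who saved their search results as CSV in step 5: it loops over the export and downloads each tile.  The column name “downloadURL” is an assumption; check the header of your own export before running it.

```python
# A sketch, not the portal's official workflow: bulk-download the tiles listed in
# the text/CSV export from step 5 instead of clicking each one. The column name
# "downloadURL" is an assumption; check the header of your own export and adjust.
import csv
import os
import urllib.request

EXPORT = "grand_junction_lidar.csv"   # hypothetical export file from The National Map
OUT_DIR = "lidar_tiles"
os.makedirs(OUT_DIR, exist_ok=True)

with open(EXPORT, newline="") as f:
    for row in csv.DictReader(f):
        url = row.get("downloadURL")  # assumed column name; may differ in your export
        if not url:
            continue
        target = os.path.join(OUT_DIR, os.path.basename(url))
        if not os.path.exists(target):   # skip tiles already downloaded
            print("Downloading", url)
            urllib.request.urlretrieve(url, target)
```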

 


The National Map interface as it appeared when I was selecting my desired area for Lidar data.

Unzip the LAS data for use in your chosen GIS package.  To bring the data into ArcGIS Pro, create a new blank project and name it.  Then go to Analysis > Tools and run Create LAS Dataset on your unzipped .las files, noting the projection (in this case, UTM) and other metadata.  Sometimes you can bring .las files directly into Pro without creating a LAS dataset, but with this NationalMap Lidar data, I found that I needed to create a LAS dataset first.
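If you prefer to script this step, the same thing can be done with arcpy in the ArcGIS Pro Python environment.  This is only a sketch with hypothetical paths; verify the Create LAS Dataset tool’s parameters against the documentation for your version of Pro.

```python
# A minimal arcpy sketch of the same step: build a LAS dataset from a folder of
# unzipped .las tiles. Paths are hypothetical; run inside an ArcGIS Pro Python
# environment where arcpy is available.
import arcpy

las_folder = r"C:\data\grand_junction\las"       # folder of unzipped .las files (assumed path)
las_dataset = r"C:\data\grand_junction\gj.lasd"  # output LAS dataset

# Create the LAS dataset; computing statistics makes the point cloud faster to browse.
# The compute_stats keyword is as documented for recent Pro releases; omit it if yours differs.
arcpy.management.CreateLasDataset(las_folder, las_dataset, compute_stats="COMPUTE_STATS")
```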

Then choose Insert > New Map and add your LAS dataset to the new map.  Zoom in to see the lidar points.  View your Lidar data in different ways using the Appearance tab: as elevation, slope, aspect (shown below), and contours.  Use the LAS Dataset To Raster tool to convert the Lidar data to a raster.  In a similar way, I added the World Hydro layer so I could see the watersheds in this area, and the USA Detailed Streams layer for the rivers.


Aspect view generated from Lidar data in ArcGIS Pro.

There are many things you can do with your newly downloaded Lidar data:  Let’s explore just a few of those.  First, create a Digital Elevation Model (DEM) and a Digital Surface Model (DSM).  To do this, in your .lasd LAS dataset, go to LAS Filters > Filter to ground, visualize the results, and then use LAS Dataset To Raster with Elevation as the value field.  The resulting raster is your digital elevation model (DEM).  Next, filter to first return and convert this to a raster as well:  This is your digital surface model (DSM).  After clicking on sections of each raster to compare them visually, go one step further and use the Raster Calculator to create a comparison raster, subtracting the ground raster from the first-return raster.  The result, the difference between the “bare earth” elevation and the “first return”, essentially shows the objects and features on the surface of the Earth:  the buildings, trees, shrubs, and other things both human-built and natural.  Symbolize and classify this comparison surface to more fully understand your vegetation and structures.  In my study area, the difference between the DEM and the DSM was much more pronounced on the north-facing (northeast, actually) slope, which is where the pinon and juniper trees are growing, as opposed to the barren south-facing (southwest, actually) slope, which is underlain by Mancos Shale (shown below).


Comparison of DEM and DSM as a “ground cover” raster in ArcGIS Pro.

My photograph of the ridgeline, from just east of the study area, looking northwest.  Note the pinon and juniper ground cover on the northeast-facing slopes as opposed to the barren southwest facing slope.
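For those who would rather script the DEM/DSM comparison just described, here is a hedged arcpy sketch of the same workflow.  The paths, cell size, and keyword spellings are assumptions to check against your own data and the documentation for the Make LAS Dataset Layer and LAS Dataset To Raster tools.

```python
# A hedged arcpy sketch of the DEM/DSM workflow: filter the LAS dataset to ground
# points and to first returns, rasterize each, and subtract to approximate feature
# heights. Paths, cell size, and keyword spellings are assumptions to verify.
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\grand_junction"   # hypothetical workspace
lasd = "gj.lasd"

# Ground points only (LAS class code 2) -> bare-earth DEM.
arcpy.management.MakeLasDatasetLayer(lasd, "ground_lyr", class_code=[2])
arcpy.conversion.LasDatasetToRaster("ground_lyr", "dem.tif", "ELEVATION",
                                    sampling_type="CELLSIZE", sampling_value=1)

# First returns (return number 1) -> digital surface model.
arcpy.management.MakeLasDatasetLayer(lasd, "first_lyr", return_values=["1"])
arcpy.conversion.LasDatasetToRaster("first_lyr", "dsm.tif", "ELEVATION",
                                    sampling_type="CELLSIZE", sampling_value=1)

# DSM minus DEM approximates the height of vegetation and structures.
heights = Raster("dsm.tif") - Raster("dem.tif")
heights.save("feature_heights.tif")
```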

Next, create a hillshade from your ground raster (DEM) using the Hillshade tool, and then create a slope map and an aspect map using the tools of those respective names.  The easiest way to find the tools is simply to search for them.  The hillshade, slope, and aspect outputs are all rasters.  Once the tools are run, they are saved as datasets inside your geodatabase; earlier, when you were simply visualizing your Lidar data as slope and aspect, you were not creating separate data files.
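The equivalent arcpy calls, assuming the Spatial Analyst extension and the bare-earth DEM created above, look something like this:

```python
# A short sketch of the hillshade, slope, and aspect steps using the Spatial Analyst
# functions of the same names; the DEM path is the bare-earth raster assumed above.
import arcpy
from arcpy.sa import Hillshade, Slope, Aspect

arcpy.CheckOutExtension("Spatial")
dem = r"C:\data\grand_junction\dem.tif"   # assumed path to the ground (bare-earth) raster

Hillshade(dem).save(r"C:\data\grand_junction\hillshade.tif")
Slope(dem, "DEGREE").save(r"C:\data\grand_junction\slope_deg.tif")
Aspect(dem).save(r"C:\data\grand_junction\aspect.tif")
```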

Next, create contours, a vector file, from your ground raster (DEM) using the Contour tool.  Change the basemap to imagery to visualize the contours against a satellite image.  To create index contours, use the Contour with Barriers tool:  you do not actually need to indicate a “barriers” layer; the tool is simply a convenient way to generate “index” contours, as I did (shown below).  I used 5 for the contour interval and 25 (every fifth contour) for the index contour interval.  This results in a polyline feature class with a field called “type”, which receives the value 2 for index contours and 1 for all other contours.  Now simply symbolize the lines with unique values on the type field, specifying a thicker line for the index contours (type 2) and a thinner line for all the others.
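Scripted, the index-contour step might look like the following sketch, which assumes the Spatial Analyst Contour with Barriers tool and its documented parameter names; adjust the output path and interval values for your own data.

```python
# A sketch of the index-contour step using the Spatial Analyst Contour with Barriers
# tool, with a 5-unit interval and every fifth contour flagged as an index contour.
# Keyword names follow the tool's documented parameters; verify them for your release.
import arcpy

arcpy.CheckOutExtension("Spatial")
dem = r"C:\data\grand_junction\dem.tif"                  # assumed bare-earth DEM
contours = r"C:\data\grand_junction\gj.gdb\contours_5m"  # hypothetical output feature class

arcpy.sa.ContourWithBarriers(dem, contours,
                             in_contour_interval=5,
                             in_indexed_contour_interval=25)
# The output polyline feature class carries a "type" field: 2 for index contours,
# 1 for all other contours, which is what the unique-value symbology keys on.
```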


Next, convert your 2D map to a 3D scene using the Catalog pane.  If you wish, undock the 3D scene and drag it to the right side of your 2D map so that the 2D map and 3D scene are side by side.  Use View > Link Views to synchronize the two.  Experiment with changing the basemap to topographic or terrain with labels.  Or, if your area is in the USA like mine is, use Add Data > USA Topographic to add the USGS topographic maps as another layer.  The topographic maps are at 1:24,000 scale in the most detailed view, and at 1:100,000 and 1:250,000 for smaller scales.

 


2D and 3D synced views of the contours symbolized with the Contours with Barriers tool in ArcGIS Pro. 

At this point, the sky’s the limit: you can conduct any other type of raster-based analysis, or combine it with vector analysis.  For example, you could run the Profile tool to generate a profile graph of a drawn line (as I did, shown below) or of an imported shapefile or line feature class, create a viewshed from specified point(s), trace downstream from specific points, determine which areas in your study site have slopes steeper than a certain threshold, or use the Lidar data and derived products in conjunction with vector layers to determine the optimal site for a wildfire observation tower or a cache for firefighters.  Two of these analyses are sketched in the short script after the figures below.

Profile graph of the cyan polyline that I created from the Lidar data from the National Map in ArcGIS Pro.


Tracing downstream using the rasters derived from the lidar data in ArcGIS Pro.


Slopes over 40 degrees using the slope raster derived from the lidar data in ArcGIS Pro.
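To give a flavor of how two of these analyses could be scripted, here is a short, assumption-laden arcpy sketch that flags slopes over 40 degrees and computes a viewshed from a hypothetical observer point layer.

```python
# A brief sketch of two follow-on analyses: flagging slopes over 40 degrees and
# computing a viewshed from an observer point. Paths and the observer feature
# class are hypothetical.
import arcpy
from arcpy.sa import Con, Slope, Viewshed

arcpy.CheckOutExtension("Spatial")
dem = r"C:\data\grand_junction\dem.tif"                # assumed bare-earth DEM
observers = r"C:\data\grand_junction\gj.gdb\lookout"   # hypothetical observer point(s)

# Cells with slope steeper than 40 degrees get the value 1; everything else is NoData.
slope = Slope(dem, "DEGREE")
steep = Con(slope > 40, 1)
steep.save(r"C:\data\grand_junction\slope_over_40.tif")

# Viewshed of the candidate observation tower site(s).
vis = Viewshed(dem, observers)
vis.save(r"C:\data\grand_junction\tower_viewshed.tif")
```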

I hope these procedures will be helpful to you.

 

 

 

Reflections on the “Why Open Data is Not as Simple as it Seems” article

December 25, 2017

Sabine de Milliano, in a relevant and thoughtful article in the GIS Professional newsletter entitled “Why Open Data is Not as Simple as it Seems,” eloquently raises several issues that have been running through this Spatial Reserves blog for the past five years.  She also takes to a new level concerns that have been aired at just about every data and GIS conference over that same period. Rather than settling for the statement “open data is great” and leaving it at that, Ms. de Milliano points out that “open data is much more complicated than simply collaborating on work and sharing results to help humanity move forward”.  She recognizes the “common good” of collaboration and innovation, and the transparency that results from open data. She states that access to open data is “only possible by solving the sum of technological, economic, political, and communication challenges.” Indeed.

In this blog and in our book, we have written extensively about the “fee vs. free” discussions that debate whether government agencies should charge for their data, and Ms. de Milliano sums up arguments on both sides. But she goes further and says that challenges to open data range from “ethical to practical”, and that there is a “large grey zone on what data should actually be shared and what should remain private.” What if someone creates a map based on your open data and someone else makes a fatal decision based on an error in this derivative product? Who is accountable?

For Ms. de Milliano, the biggest challenge of open data is discoverability and accessibility. She mentions open data portals including the Copernicus Open Access Hub, Natural Earth Data, USGS Earth Explorer, and the Esri ArcGIS Hub, and we have written about many others in this blog, such as here and here.  Ms. de Milliano holds an impressive set of GIS credentials and makes her points in an understandable and actionable manner.  Her article also points out that despite the advent of open data, some datasets remain “knowledge intensive”, meaning that only a limited number of users have sufficient technical background to understand how to process, analyze, and use them (such as SAR data) and therefore, they remain the domain of experts. I frequently touch on this point when I am teaching GIS workshops and courses, beginning with the thesis: “Despite data and technical advancements in GIS over the past 25 years, GIS is not easy. It requires technical expertise AND domain expertise.”  Effective use of GIS requires the user to be literate in what I see as three legs making up “geoliteracy”–content knowledge, skills, and the geographic perspective. I do not see skills as solely those of acquiring more competency in geotechnologies, but rather including equally important skills in critical thinking, dealing with data, being ethical, being organized, being a good communicator, and other skills.


Sabine de Milliano’s article about open data touches on many of the themes in this blog and in our book in an eloquent and thought-provoking way.

Categories: Public Domain Data

Potential positive or negative impacts on the GIS industry from the proposed Geospatial Data Act (GDA) bill (S.1253)

December 18, 2017

The provision of GIS data and services, and the consumption of those data and services in a modern economy, are linked to a system of laws about copyright, commerce, licensing, and much more.  In this blog and in our book, we discuss the ever-changing landscape of GIS, including laws and potential laws that have affected, or could affect, GIS in positive or negative ways.  The proposed Geospatial Data Act (GDA) bill (S. 1253) has been receiving attention in recent months for provisions that many in the GIS community say would limit federal government GIS and mapping contracts to licensed architects, engineers, and surveyors.  Opponents of the bill, including the American Association of Geographers, state that it “would result in a significant loss of jobs throughout the U.S., and would cripple the dynamic and innovative American GIS, IT, and mapping companies and communities that have developed GIS and internet mapping, and now power its continued innovation and growth in jobs and new technologies.”  Proponents of the bill say it will “improve coordination, reduce duplication, and increase data transparency in the acquisition of geospatial data”; it is a re-introduction of a bill originally proposed in 2015.

Aligned with our continued messaging here of “be critical of the data” and “conduct the research necessary so that you understand the issue”, we encourage the community to do just that.  The original bill as introduced by the US Senate is here, with the US House version here, along with position papers by the American Association of Geographers, URISA, and GITA, and an article with some recent tweets from GIS Lounge. 

The latest word, as reported by GIS Lounge, is that the parts of the Act that would limit federal contracts in GIS to licensed architectural and engineering (A&E) firms have been removed.  The folks at GIS Lounge point out that the newly revised bill promotes several good ideas:

  • Section 2 defines the term ‘geospatial data’ for the US federal government.
  • Section 3 clarifies the role of a Federal Geographic Data Committee (FGDC).
  • Section 4 clarifies the role of a National Geospatial Advisory Committee (NGAC).
  • Section 5 describes the importance of a National Spatial Data Infrastructure (NSDI).
  • Section 8 describes the creation and operation of the ‘GeoPlatform’ as an electronic service that provides access to geospatial data and metadata for geospatial data.


The Geospatial Data Act is a proposed bill that originally raised concerns about limiting the GIS industry but now may benefit it.  Investigate the bill yourself with these resources and, as always, make an informed decision.

Categories: Public Domain Data

Possible Changes to NAIP Imagery Licensing Model

November 27, 2017

As this blog and our book make clear, the world of geospatial data is in a continual state of change.  Much of this change has been toward more data in the public domain, but sometimes the change moves in the opposite direction. The National Agriculture Imagery Program (NAIP) has been a source for aerial imagery in the USA since 2003, and its imagery has been in the public domain, available here.  But recently, the Farm Service Agency (FSA) has proposed moving the imagery from the public domain to a licensing model.  The imagery has been collected under an innovative model wherein state governments and the federal government share the costs.

One reason for the proposed change is that the states have come up $3.1 million short over the past several years, and FSA cannot continue “picking up the tab.”  Furthermore, delays in releasing funding from cost-share partners force contract awards past the “peak agriculture growth” season, which thwarts one key reason the imagery is collected in the first place–to assess agricultural health and practices.  We have discussed this aspect of geospatial data frequently in this blog–that geospatial data comes at a cost.  Someone has to pay, and sometimes those payment models need to be reconsidered as funding and priorities change.  In this case, agencies and data analysts that rely on NAIP imagery would suffer adverse consequences, but with the expansion of the types and means by which imagery can be acquired nowadays, perhaps these developments will prompt those other sources to be explored more fully.  And possibly the model could be adjusted so that the data are paid for in a way that still allows everyone to benefit from them.

For more information, see the report by our colleagues at GIS Lounge, and the presentation housed on the FGDC site, here.

Two samples of NAIP imagery, for Texas, left, and North Dakota, right.

Potential Harm to Rare Species from Location-Tagged Data

November 20, 2017

In a new article from Yale University entitled “Unnatural Surveillance: How Online Data Is Putting Species at Risk,” author Adam Welz sounds an alarm about the harm that can come from the fact that location information is increasingly tied to data.  In the case of rare and endangered plants and animals, Welz points out that “poachers can use computers and smartphones to pinpoint the locations of rare and endangered species and then go nab them.”   The case highlighted in the article is that of a couple who researched the locations of rare African succulent plants, illegally gathered them, and then sold them through their own website.  In the past it might have taken an entire botanical career to gather information at this level of specificity, but “in 2015, a pair of poachers could acquire it in a short time from a desk on another continent.”  Unfortunately, this is not an isolated case, and Welz should know:  He has long written about, and has extensive experience in, international and African wildlife issues.

The author thoughtfully examines other ways, trends, and technologies that expose the locations of protected species to those with other motives, such as the increased publishing of scientific research in open access journals, VHF radio signals from animal collars, the rise of citizen science, and even geotagged social media posts from tourists who photograph wildlife.  Welz recognizes the positive impact that the growth of data has had on research, and on conservation in particular, but raises awareness of the real danger that location-tagged data can pose to the very things that many seek to study and protect.  As a member of the academic community, I have been working with open access journals for years, and I had not considered the potential misuse of this new publication avenue.

As a long-time member of another community, that of caving, I have for decades been sensitive to the related issue of publishing cave locations, and the harm that can result, and has resulted, from people entering caves without a permit or seeking to vandalize them.  I would love to see further research on the geospatial implications of the points that Welz raises.  Even lacking that, Welz’s article provides an affirmation of one of the themes of this blog and our book:  What matters is what people do with the data.  Data can be used for good and for ill.  It is my hope that articles such as this raise awareness so that data and tool providers build safeguards that make it difficult for those who would use data for ill to access it, while still moving toward the goal of open data access for enabling smart decisions.


A review of an article wherein Adam Welz sounds an alarm about harm that can come from the fact that location information is increasingly tied to data.