Search Results

Keyword: ‘tools’

Sharing Geoprocessing Tools on the Web

January 15, 2017 3 comments

An article co-authored by Benjamin Pross, Christoph Stasch, and Albert Remke of the 52°North Initiative for Geospatial Open Source Software GmbH, and by Satish Sankaran and Marten Hogeweg of Esri, describes a development that should interest anyone who uses geospatial data.  The 52°North Initiative has developed an open-source extension to ArcGIS for Desktop that enables access to Open Geospatial Consortium (OGC) Web Processing Services (WPS).  The result?  These services can be used in the same manner as native ArcGIS geoprocessing tools.  In other words, they appear in the list of tools just as a standard buffer or overlay tool would.  Yes, it could be just that easy.

The article explains that “while ArcGIS allows geoprocessing tools to be published as a WPS, [ArcGIS] does not offer a WPS client interface. Consequently, it is not easy to access external non-ArcGIS geoprocessing tools such as simulation models, rich data interfaces, or processing capabilities from any other legacy software that supports the WPS interface.”  This points to the reason why this initiative offers such promise:  “The 52°North Extensible WPS Client for ArcMap was implemented as an open-source extension to ArcGIS that fully integrates into the ArcGIS for Desktop environment. It enables OGC WPS to be accessed and used in the same manner as native ArcGIS geoprocessing tools. This makes it easy to run WPS-based processes and integrate the results of that processing into ArcMap for use with other applications.”
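To make the idea of a WPS client more concrete, here is a minimal sketch of what talking to an OGC WPS looks like from Python using the third-party OWSLib package.  This is not the 52°North extension itself, and the service URL and process identifier below are hypothetical placeholders; substitute a real WPS endpoint to run it.

```python
# Minimal sketch of an OGC WPS client using the third-party OWSLib package.
# The endpoint URL and process identifier are hypothetical placeholders.
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService('https://example.org/wps', version='1.0.0')
wps.getcapabilities()
print(wps.identification.title)

# List the processes the server advertises, much as the 52North client lists
# them alongside native ArcGIS geoprocessing tools.
for process in wps.processes:
    print(process.identifier, '-', process.title)

# Execute a hypothetical buffer process with a literal input and wait for it.
execution = wps.execute('gs:BufferFeatureCollection',
                        inputs=[('distance', '100')])
monitorExecution(execution)
print(execution.status)
```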

In plain language: the complex issues that GIS analysts grapple with often require major investments of time to build models, services, and customized workflows and code, so why should each analyst have to create all of this from scratch?  An enormous amount of time could be saved if there were an easy way to share these things.  The article not only explains recent progress in this area but also encourages the community to think creatively about how to pursue further collaborative methods.


ArcGIS Web Processing Service client architecture.

 


Using the Data Interoperability Extension to import SDTS DLG files into ArcGIS Pro

April 16, 2018 Leave a comment

One of the themes of this blog and our book has been the wide variety of spatial data formats in existence.  Some of these formats remain challenging to import into a GIS even today.  To meet this challenge, Esri’s Data Interoperability Extension has long been a useful set of tools for bringing a wide variety of spatial data formats into a GIS.  It is an integrated spatial ETL (Extract, Transform, and Load) toolset that runs within the geoprocessing framework using Safe Software’s FME technology.  It enables you to integrate data from multiple sources and formats, use that data with geoprocessing tools, and even publish it with ArcGIS Server.

I recently tested the Data Interoperability Extension in ArcGIS Pro and was thrilled with the results.  Read about how to install and authorize the extension here.  The extension does many things, but one particularly useful feature is that it creates a toolbox directly in ArcGIS Pro (graphic below).  I used this toolbox’s Quick Import tool to import an SDTS-format DLG (USGS Digital Line Graph) file directly to a file geodatabase.  The tool, like other ArcGIS Pro geoprocessing tools, walked me right through the process:  Data Interoperability > Quick Import > I pointed to my DLG files in SDTS format > I named the resulting file geodatabase (gdb).  Once imported, I was able to work with my hydrography, hypsography, roads, boundaries, and other data.
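For readers who prefer to script this step, below is a minimal sketch of the same Quick Import operation in Python.  It assumes the Data Interoperability extension is installed and licensed and that the tool is exposed under the interop toolbox alias; the paths are placeholders for your own SDTS files and output geodatabase.

```python
# Minimal sketch: scripting the Data Interoperability Quick Import tool with
# arcpy. Assumes the extension is installed and licensed; paths are placeholders.
import arcpy

arcpy.CheckOutExtension("DataInteroperability")

sdts_folder = r"C:\data\dlg_sdts"           # placeholder: unzipped, un-TAR'd SDTS files
output_gdb = r"C:\data\grand_junction.gdb"  # placeholder: file geodatabase to create

# Depending on the source, the first argument may need to include the FME
# format name in addition to the path; check the tool's documentation.
arcpy.interop.QuickImport(sdts_folder, output_gdb)
print("Import complete:", output_gdb)
```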

DLG files have existed since the early 1990s.  Why are we still working with them?  The reasons are that (1) they are dated but still useful vector data sets, and (2) many geospatial data portals, such as the USGS Earth Explorer, still host data only in this format.  Another way to import DLG files into ArcGIS Pro or ArcMap is the standalone DLG2SHP program that I wrote about in this set of guidelines.  See below for step-by-step instructions, with screen shots, for the Data Interoperability Extension.


1. Use Toolboxes > Data Interoperability Tools > Quick Import, as shown above.

2.  Using Quick Import pulls up a "Specify Data Source" dialog box, as shown above.


3.  In the specify data source dialog box, use “find other source” and then specify SDTS format.


4.  Selecting SDTS format.


5.  Pointing to the SDTS file (after it has been unzipped and un-TAR’d) and saving it into a geodatabase.


6. Once the file has been imported into a geodatabase, it can be added to a new map in ArcGIS Pro.  The data is now ready for use, as shown for this hydrography example, above. 

 

The Coastal Atlas from the Maryland Department of Natural Resources

February 5, 2018 Leave a comment

The Maryland Coastal Atlas serves up ocean use and resource data, coastal hazard and shoreline data, and near-shore and estuarine data.  The purpose of the atlas is to make coastal-related geospatial datasets available to agencies, researchers, and the general public for viewing and for performing basic overlays.  Tools are being added to make the atlas more versatile for analysis and to help users simplify or select the data most important for their needs.  The list of layers is extensive, with at least 100 items included.  Equally impressive is the ability to add dozens more layers from the MD iMap database on such themes as agriculture, housing, demographics, hydrology, and much more.

The Atlas uses the Esri Web App Builder for its interactive map capabilities.  One of my favorite things about the atlas is the user’s ability to add data to the web interface from ArcGIS Online, from a URL, or from a file of the user’s own creation.  The site features unexpectedly helpful touches, such as a palette of drawing tools that makes the atlas a rich teaching tool, and transects that can be drawn in the map to analyze such things as erosion rates.

A few enhancements could make the site more useful, such as expanding the fairly limited query tool and explaining how it can be used.  I was puzzled about how to close the transect results once I had created one, but this and other user interface questions were small; overall, the interface was intuitive.  The Maryland Coastal Atlas is an excellent addition to the other portals we have written about in this region, such as the Maryland iMap Data Catalog; we wrote about the state of Maryland’s GIS portal in the past, as well as selected other data portals for the Chesapeake Bay.

The atlas uses the map services available from the Maryland GIS Portal and the iMap Open Data Catalog that we reviewed above.  To obtain the data, go to the Maryland Data Catalog to download the data or get the API to use in an online mapping application.  All of the Maryland Coastal Hazard datasets on the atlas are available through the data catalog but not all are downloadable.  Here is an example of a dataset on the atlas shown in the iMap Data Catalog with the Download and API function available on the listing.  Every layer is a REST service hosted by Maryland iMap, managed by the Geographical Information Office (GIO) and the state IT group (DOIT).
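Because every layer is exposed as a REST service, the data can also be pulled programmatically.  The sketch below uses Python’s standard library to query a feature layer as GeoJSON; the service URL is a hypothetical placeholder, so substitute the endpoint of the Maryland iMap layer you are interested in.

```python
# Minimal sketch of querying an ArcGIS REST service for GeoJSON using only the
# Python standard library. The service URL is a hypothetical placeholder.
import json
import urllib.parse
import urllib.request

layer_url = "https://example.org/arcgis/rest/services/CoastalAtlas/MapServer/0"  # placeholder
params = urllib.parse.urlencode({
    "where": "1=1",     # return all features
    "outFields": "*",   # return all attributes
    "f": "geojson",     # request GeoJSON output
})

with urllib.request.urlopen(f"{layer_url}/query?{params}") as response:
    data = json.load(response)

print("Features returned:", len(data.get("features", [])))
```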


The Coastal Atlas from the Maryland Department of Natural Resources.

Privacy concerns from fitness maps and apps

January 31, 2018 1 comment

We frequently write about the need to teach about and be aware of location privacy, given the rapid advancement and web-enablement of GIS.  Thus it wasn’t a surprise when concerns recently arose over an amazing map from Strava Labs.  Maps generated from GPS-enabled fitness devices and other recreational uses of GPS such as GPS Drawing, as well as those from the fitness tracker market such as Fitbit and Garmin, have for several years been sharable and viewable.  Strava has been one of the leaders in helping people stay motivated to meet their fitness goals by providing tools such as apps and maps.  But perhaps the Strava map attracted more attention than others because it contains “over 1 billion activities and 13 trillion data points”, or perhaps because the map is so responsive and contains some stunning cartography that the web map user can customize.

Whatever the reason, as reported in USA Today, Popular Mechanics, Wired, and elsewhere, location privacy concerns have arisen recently over the new Strava map.  Specifically, “Security experts over the weekend questioned whether the user-generated map could not only show the locations of military bases, but specific routes most heavily traveled as military personnel unintentionally shared their jogging paths and other routes.”  Some of the posts have reported that it may even be possible to scrape the data to discover the person behind each of the tracks, and the Strava CEO has responded to these and other concerns.  Any GIS user knows that much can be discovered through mapped layers and satellite imagery these days, shedding new light on what is really “secret” in our 21st Century world, but maps aimed at the recreational user are bringing these discussions to the general public.  The particular concern with the Strava data is not so much the location information alone, but the temporal data tied to the location, and the potential identification of individuals.

Much of it comes down to what we have been saying in this blog–understand the defaults for whatever you are doing in GIS, whether it is the projection of your geospatial data or the location-based app on your phone.  Ask yourself, “What is the default–is my data public by default?  Is my projection Web Mercator by default?  Can I override the default, and if so, how?  What is the best way to represent this spatial information?  Do I need to share this information?  If I need to share the information, how should I do it?” and then act accordingly.  For more on this topic, I encourage you to read some of our short essays, such as Why Does a Calculator App need to know my location?, Making the Most of Our Personal Location Data, posting cat pictures, and The Invasion of the Data Snatchers.


A section of the Strava heat map, showing the results of people who have recorded and shared their fitness walks and runs.  As one might expect, a city park and a high school track stand out as places where more people conduct these activities.  As with other maps showing where people are now or where they have been, location privacy concerns have been raised.

Accessing and Using Lidar Data from The National Map

January 8, 2018 Leave a comment

We have written about the USGS data portal NationalMap numerous times in this blog and in our book, but since the site keeps getting enhanced, a re-examination of the site is warranted.  One of the enhancements over the past few years is the addition of Lidar data to the site.  I did some recent testing of searching for and downloading Lidar data on the site and wanted to report on my findings.  For videos of some of these procedures, go to the YouTube Channel geographyuberalles and search on Lidar.

From a user perspective, the site is still a bit challenging: there are moments in the access and download process where it is not clear how to proceed.  However, (1) the site is slowly improving, and (2) it is worth investigating chiefly because of its wealth of data holdings:  it is simply too rich a resource to ignore.  One challenge of using the site, as with many other data portals, is how to effectively narrow a search that returns thousands of results.  This in part reflects the open data movement that we have been writing about, so it is a good problem to have, albeit still cumbersome in this portal.  Here are the procedures to access and download the Lidar data from the site:

  1. To begin:  Visit the National Map:  https://nationalmap.gov/ > Select “Elevation” from this page.
  2. Select “Get Elevation Data” from the bottom of the Elevation page.  This is one of several quirks about the site – why isn’t this link in a more prominent position or in a bolder font?
  3. From the Data Elevation Products page left hand column:   Select “1 meter DEM.”
  4. Select the desired format.  Select “Show Availability”.   Zoom to the desired area using a variety of tools to do so.  In my example, I was interested in Lidar data for Grand Junction, in western Colorado.
  5. Note that the list of  available products will appear in the left hand column.  Lidar is provided in 10000 x 10000 meter tiles.  In my example, 108 products exist for the Grand Junction Lidar dataset.  Use “Footprint” to help you identify areas in which you need data–the footprints appear as helpful polygon outlines.  At this point, you could save your results as text or CSV, which I found to be quite handy.
  6. You can select the tiles needed one by one to add to your cart, or select “Page” to select all items.  Select the Cart, where you can download the tiles manually, or select the “uGet Instructions” for details about downloading multiple files (a scripted alternative using the saved CSV is sketched after this list).  Your data will be delivered in a zip format right away, though Lidar files are large and may require some time to download.
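As mentioned in step 6, if you saved your search results as a CSV in step 5, a short script can handle the bulk download instead of uGet.  The sketch below assumes the CSV contains a column of download URLs (the column name is an assumption, so check your file and adjust it); it simply fetches each zip file into a local folder.

```python
# Minimal sketch: bulk-downloading National Map Lidar tiles from the CSV of
# saved search results. The CSV path and the URL column name are assumptions.
import csv
import os
import urllib.request

csv_path = "national_map_results.csv"  # placeholder: your saved results file
url_column = "downloadURL"             # assumption: adjust to your CSV header
out_folder = "lidar_downloads"
os.makedirs(out_folder, exist_ok=True)

with open(csv_path, newline="") as f:
    for row in csv.DictReader(f):
        url = row.get(url_column, "").strip()
        if not url:
            continue
        filename = os.path.join(out_folder, url.rsplit("/", 1)[-1])
        print("Downloading", filename)
        urllib.request.urlretrieve(url, filename)
```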

 


The National Map interface as it appeared when I was selecting my desired area for Lidar data.

Unzip the LAS data for use in your chosen GIS package.  To bring the data into ArcGIS Pro, create a new blank project and name it.  Then go to Analysis > Tools > Create LAS Dataset and run it on your unzipped .las files, noting the projection (in this case, UTM) and other metadata.  Sometimes you can bring .las files directly into Pro without creating a LAS dataset, but with this National Map Lidar data, I found that I needed to create a LAS dataset first.
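This step can also be scripted.  Here is a minimal arcpy sketch of creating the LAS dataset, with placeholder paths; optional parameters (folder recursion, a spatial reference, statistics) can be added as needed.

```python
# Minimal sketch: creating a LAS dataset from the unzipped .las tiles with
# arcpy in ArcGIS Pro's Python environment. Paths are placeholders.
import arcpy

las_folder = r"C:\data\lidar\las_tiles"             # placeholder: unzipped .las files
las_dataset = r"C:\data\lidar\grand_junction.lasd"  # placeholder: output LAS dataset

arcpy.management.CreateLasDataset(las_folder, las_dataset)
print("Created", las_dataset)
```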

Then go to Insert > New Map and add your LAS dataset to the new map.  Zoom in to see the Lidar points.  View your Lidar data in different ways using the Appearance tab to see it as elevation, slope, aspect (shown below), and contours.  Use LAS Dataset To Raster to convert the Lidar data to a raster.  In a similar way, I added the World Hydro layer so I could see the watersheds in this area, and USA Detailed Streams for the rivers.


Aspect view generated from Lidar data in ArcGIS Pro.

There are many things you can do with your newly downloaded Lidar data; let’s explore just a few of them.  First, create a Digital Elevation Model (DEM) and a Digital Surface Model (DSM).  To do this, in your .lasd LAS dataset, use LAS Filters > Filter to ground, visualize the results, and then run LAS Dataset To Raster with Elevation as the value field.  The resulting raster is your digital elevation model (DEM).  Next, filter to first return and convert that to a raster as well:  this is your digital surface model (DSM).  After clicking on sections of each raster to compare them visually, go one step further and use the Raster Calculator to create a comparison raster with the formula 1streturn_raster minus ground_raster.  This difference between the “bare earth” elevation and the “first return” essentially shows the objects and features on the surface of the Earth, in other words, the buildings, trees, shrubs, and other things both human-built and natural.  Symbolize and classify this comparison surface to more fully understand your vegetation and structures.  In my study area, the difference between the DEM and the DSM was much more pronounced on the north (northeast, actually) facing slope, which is where the pinon and juniper trees are growing, as opposed to the barren south (southwest) facing slope, which is underlain by Mancos Shale (shown below).
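A scripted version of this DEM/DSM workflow might look like the sketch below, using arcpy with the Spatial Analyst extension.  The ground filter uses the standard ASPRS class code 2; the return-value strings and all paths are assumptions to verify against your own data and ArcGIS Pro version.

```python
# Minimal sketch of the DEM/DSM workflow: filter the LAS dataset to ground
# returns for the DEM, to first/single returns for the DSM, then subtract.
# Paths are placeholders; requires the Spatial Analyst extension.
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\lidar\lidar.gdb"  # placeholder geodatabase
lasd = r"C:\data\lidar\grand_junction.lasd"       # placeholder LAS dataset

# Ground returns only (ASPRS class code 2) -> bare-earth DEM.
arcpy.management.MakeLasDatasetLayer(lasd, "ground_lyr", class_code=[2])
arcpy.conversion.LasDatasetToRaster("ground_lyr", "dem", "ELEVATION",
                                    "BINNING AVERAGE LINEAR", "FLOAT",
                                    "CELLSIZE", 1)

# First and single returns -> digital surface model (DSM).
arcpy.management.MakeLasDatasetLayer(lasd, "first_lyr",
                                     return_values=["Single Return", "First of Many"])
arcpy.conversion.LasDatasetToRaster("first_lyr", "dsm", "ELEVATION",
                                    "BINNING AVERAGE LINEAR", "FLOAT",
                                    "CELLSIZE", 1)

# DSM minus DEM: heights of buildings, trees, and other above-ground features.
height = Raster("dsm") - Raster("dem")
height.save("surface_height")
```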


Comparison of DEM and DSM as a “ground cover” raster in ArcGIS Pro.

My photograph of the ridgeline, from just east of the study area, looking northwest.  Note the pinon and juniper ground cover on the northeast-facing slopes as opposed to the barren southwest facing slope.

Next, create a hillshade from your ground raster (DEM) using the Hillshade tool, and then create a slope map and an aspect map using the tools of those respective names.  The easiest way to find the tools is simply to search for them.  The hillshade, slope, and aspect outputs are all rasters.  Once the tools are run, these are saved as datasets inside your geodatabase; earlier, when you were simply visualizing your Lidar data as slope and aspect, you were not creating separate data files.
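If you prefer to script these three derivatives rather than search for each tool, a minimal arcpy sketch (Spatial Analyst, placeholder workspace) follows.

```python
# Minimal sketch: hillshade, slope, and aspect from the ground (DEM) raster
# using arcpy Spatial Analyst functions. The workspace path is a placeholder.
import arcpy
from arcpy.sa import Aspect, Hillshade, Slope

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\lidar\lidar.gdb"  # placeholder geodatabase

Hillshade("dem").save("dem_hillshade")    # default sun azimuth 315, altitude 45
Slope("dem", "DEGREE").save("dem_slope")  # slope in degrees
Aspect("dem").save("dem_aspect")          # compass direction of steepest downhill slope
```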

Next, create contours, a vector file, from your ground raster (DEM) using the Create Contours tool.  Change the basemap to imagery to visualize the contours against a satellite image.  To create index contours, use the Contour with Barriers tool.  To do this, do not actually supply a “barriers” layer; rather, use the tool to obtain “index” contours, as I did, shown below.  I used 5 for the contour interval and 25 (every fifth contour) for the index contour interval.  This results in a polyline feature class with a field called “type”, which receives a value of 2 for the index contours and 1 for all other contours.  Now simply symbolize the lines with unique values on the type field, specifying a thicker line for the index contours (type 2) and a thinner line for all the other contours.
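The equivalent geoprocessing call, as a minimal arcpy sketch with the same intervals (5 and 25) and no barrier features, is shown below; the keyword parameter names and paths are assumptions to verify against your version of ArcGIS Pro.

```python
# Minimal sketch: index contours via the Contour with Barriers tool in arcpy.
# No barrier features are supplied; every fifth contour is flagged as an index
# contour (type = 2). Parameter names and paths are assumptions.
import arcpy

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\lidar\lidar.gdb"  # placeholder geodatabase

arcpy.sa.ContourWithBarriers(
    "dem",                           # ground raster (DEM)
    "contours",                      # output polyline feature class
    in_contour_interval=5,           # contour interval
    in_indexed_contour_interval=25,  # index contour interval (every fifth contour)
)
```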


Next, convert your 2D map to a 3D scene using the Catalog pane.  If you wish, undock the 3D scene and drag it to the right side of your 2D map so that your 2D map and 3D scene are side by side.  Use View > Link Views to synchronize the two.  Experiment with changing the basemap to topographic or terrain with labels.  Or, if your area is in the USA like mine is, use Add Data > USA Topographic to add the USGS topographic maps as another layer.  The topographic maps are at 1:24,000 scale in the most detailed view, and at 1:100,000 and 1:250,000 for smaller scales.

 


2D and 3D synced views of the contours symbolized with the Contour with Barriers tool in ArcGIS Pro.

At this point, the sky’s the limit for you to conduct any other type of raster-based analysis, or combine it with vector analysis.  For example, you could run the profile tool to generate a profile graph of a drawn line (as I did, shown below) or an imported shapefile or line feature class, create a viewshed from your specified point(s), trace downstream from specific points, determine which areas in your study site have slopes over a certain degree, or use the Lidar and derived products in conjunction with vector layers to determine the optimal site for a wildfire observation tower or cache for firefighters.

Profile graph of the cyan polyline that I created from the Lidar data from the National Map in ArcGIS Pro.


Tracing downstream using the rasters derived from the lidar data in ArcGIS Pro.


Slopes over 40 degrees using the slope raster derived from the lidar data in ArcGIS Pro.
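For readers who want to reproduce the slopes-over-40-degrees layer shown above, here is a minimal arcpy sketch (Spatial Analyst, placeholder workspace) that extracts those cells from the slope raster created earlier.

```python
# Minimal sketch: extract cells steeper than 40 degrees from the slope raster
# derived from the Lidar DEM. The workspace path is a placeholder.
import arcpy
from arcpy.sa import Con, Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\lidar\lidar.gdb"  # placeholder geodatabase

slope = Raster("dem_slope")   # slope in degrees, created earlier
steep = Con(slope > 40, 1)    # 1 where slope > 40 degrees, NoData elsewhere
steep.save("slope_over_40")
```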

I hope these procedures will be helpful to you.

 

 

 

The ArcGIS Solutions Templates as a Data Source

November 6, 2017 Leave a comment

“ArcGIS solution templates,” located here, are ready-to-use maps and apps targeted at specific industries.  They are grouped into categories according to GIS industry sectors–local government, state government, emergency management, water, electric, gas, defense, telecommunication, and parks and gardens.  These solution templates cover a wide variety of themes and scales.  Under each category are subcategories, such as land records and water utilities under local government, transportation and fish and wildlife under state government, and so on.  Another convenient way to browse for the solutions of interest to you is to visit the gallery, which allows searching on keywords, industries, products, and implementation patterns (such as field mobility and operational awareness).

Why include these solution templates in our blog as a data source?  Particularly for professors of GIS in universities, and for those who conduct GIS training for a wide variety of audiences, these templates include sample data.  For example, the electric facility templates include a sample geodatabase with an electric utility network and a map document used to publish the service territory service.  The Tree Assessor template includes a living plant collection dataset, a map document with defined cartography, a toolbox containing analysis tools, and detailed documentation of the map properties.

Admittedly, the primary purpose of these templates is not data, per se, but to help avoid the age-old problem in GIS of requiring each organization to create everything it needs from scratch.  The solutions templates provide a practical framework–including maps, graphics, workflows, apps, and much more–for an organization to get started, or, if it has already started with GIS, to follow some best practices.  But the fact that the templates include data and tools makes them an excellent source of data, particularly for the GIS instructor.  And many of the data sets, such as water, parcels, and utilities, are those that are typically difficult to obtain, chiefly because they are not distributed by most data portals.  The solutions templates can also, of course, be used with other data sets that can be found via this blog or via other sources.


A section of the Esri Solutions Gallery.

Categories: Public Domain Data

Era of Big Data is Here: But Caution Is Needed

September 25, 2017 Leave a comment

As this blog and our book are focused on geospatial data, it makes sense that we discuss trends in data–such as laws, standards, attitudes, and tools that are gradually helping more users find the data they need more quickly.  But with all of these advancements, we continue to implore decision makers to think carefully about and investigate the data sources they are using.  This becomes especially critical–and at times difficult–when that data is in the “big data” category.  The difficulty arises when big data is seen as so complex that it is often cited and used in an unquestioned manner.

Equally challenging, and at times troublesome, is when the algorithms based on that data go unchallenged, and when access to those algorithms is blocked to those who seek to understand who created them and what data and formulas they are based on.  As these data and algorithms increasingly affect our everyday lives, this can become a major concern, as explained in data scientist Cathy O’Neil’s TED talk, in which she says “the era of blind faith in big data must end.”

In addition, the ability to gain information from mapping social media is amazing and has the potential to help in so many sectors of society.  This was clearly evident in the usefulness of the social media posts that emergency managers in Texas and Florida USA mapped during the August-September 2017 hurricanes there.  However, with mapping social media comes an equal if not greater need for caution, as this article points out in discussing the limitations of such data for understanding health and mitigating the flu.  And from a marketing standpoint, Paul Goad cautioned here against relying on data alone.

It is easy to overlook an important point in all this discussion of data, big data, and data science.  We tend to refer to these phenomena in abstract terms, but these data largely represent us – our lives, our habits, our shopping preferences, our choice of route on the way to work, the companies and organisations we work for, and so on.  Perhaps we need less data and data science, and more humanity and humanity science.  As Eric Schmidt of Google has said, “We must remember that technology remains a tool of humanity.”  How can we, and corporate giants, then use these big data archives as a tool to serve humanity?


Use caution in making decisions from data–even if you’re using “Big Data” and algorithms derived from it.    Photograph by Joseph Kerski. 

Categories: Public Domain Data

Ethics in Geospatial Decision-Making

September 11, 2017 Leave a comment

Our book and this blog frequently focus on the importance of making wise decisions when using geospatial data.  We often discuss the two-edged sword of the modern GIS era in which we are living:  ’Tis wonderful to have a plethora of geospatial data services at our fingertips, many of which are in real time, capable of being visualized in 3-D, and updated and curated with regularity.  Coupled with these services are a variety of easy-to-use spatial analysis tools that come bundled with desktop and web-based GIS software platforms.  But this availability of data and easy-to-use tools brings an increasing likelihood that decisions will be made based on them without regard to the data’s sources, scales, update frequency, map projection, completeness of attributes, and other measures of quality.

Decisions are still in large part made by humans, and the human element has always been laden with ethical decisions, whether we realize it or not.  Adding to the ethical element is the fact that geospatial decisions involve land, which has economic but also personal and inherent value, and affects people who live on that land.  Geospatial decisions also affect the very air we breathe and water we drink.

How can we be more purposefully aware of ethical elements in our decisions based on geospatial data?  Some insightful chapters and articles will, I think, be of help.  One is the new chapter on Professional and Practical Ethics of GIS&T in the UCGIS GIS&T Body of Knowledge project by David DiBiase.  Another is a 7-Step guide to ethical decision-making, written in 1999 but still incredibly relevant.  I particularly like the tests that the author describes–the harm test, the publicity test, the defensibility test, the reversibility test, the colleague test, and the organization test.

Another excellent resource is Penn State’s ethics education resource for geospatial professionals, which lists interesting and pertinent case studies, codes of ethics, university course syllabi, and other resources.  In a recent article in Directions Magazine, Dr Diana S. Sinton explores how ethics can be integrated into geospatial education.   She advocates that ethics be threaded throughout even an introductory GIS course rather than be relegated to one lecture, as is often the case.

What are your thoughts regarding ethics in GIS?

Geospatial decisions are ethical decisions as well.

Categories: Public Domain Data

Data Practitioner Profile Document Reviewed

July 31, 2017 2 comments

The recent document entitled “Profile of the Data Practitioner” (created by a panel with diverse backgrounds and published by EDC, Newton, Massachusetts, USA) is useful in several ways.  First, it succinctly outlines many of the issues we have focused on in this blog and in our book–data quality, critical thinking, domain knowledge, and others.  Second, it lists skills, knowledge, and behaviors, and is therefore an excellent though brief supplement to the Geospatial Technology Competency Model.  Third, it lists equipment, tools, and supplies, future trends, and industry concerns.  Fourth, page 2 of the document is a practical application of the Geographic Inquiry Model, as it describes how the data practitioner initiates a project, sources the data, transforms the data, analyzes the data, closes out the project, and engages in professional development.

The document should be helpful for those pursuing their own career path in GIS and data science, and for those designing and teaching courses and workshops in GIS in academia, nonprofit organizations, private companies, and government agencies.  I only wish the document was longer or linked to a longer report that would provide more detail.  Still, for a succinct document summarizing some key items that data practitioners need to have in place, this document is worth spending time reviewing and telling others about.