
Sharing Geoprocessing Tools on the Web

January 15, 2017

An article co-authored by Benjamin Pross, Christoph Stasch, and Albert Remke of the 52°North Initiative for Geospatial Open Source Software GmbH, and by Satish Sankaran and Marten Hogeweg of Esri, describes a development that should interest anyone who uses geospatial data. The 52°North Initiative has developed an open-source extension to ArcGIS for Desktop that enables access to Open Geospatial Consortium (OGC) Web Processing Services (WPS). The result? These services can be used in the same manner as native ArcGIS geoprocessing tools. In other words, they appear in the list of tools just as a standard buffer or overlay tool would. Yes, it could be just that easy.

The article explains that “while ArcGIS allows geoprocessing tools to be published as a WPS, [ArcGIS] does not offer a WPS client interface. Consequently, it is not easy to access external non-ArcGIS geoprocessing tools such as simulation models, rich data interfaces, or processing capabilities from any other legacy software that supports the WPS interface.”  This points to the reason why this initiative offers such promise:  “The 52°North Extensible WPS Client for ArcMap was implemented as an open-source extension to ArcGIS that fully integrates into the ArcGIS for Desktop environment. It enables OGC WPS to be accessed and used in the same manner as native ArcGIS geoprocessing tools. This makes it easy to run WPS-based processes and integrate the results of that processing into ArcMap for use with other applications.”

In plain language: the complex issues grappled with by GIS analysts often require major investments of time to generate models, services, and customized workflows and code, so why should each analyst have to create all of this from scratch? An enormous time savings could be realized if there were an easy way to share these things. This article both explains recent progress in this area and encourages the community to think creatively about how to pursue further collaborative methods.
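For readers curious about what a WPS client actually does under the hood, here is a minimal sketch of process discovery against a WPS endpoint, using the open-source OWSLib Python library rather than the 52°North ArcMap extension itself. The service URL is a placeholder, and the attribute names follow my understanding of OWSLib's WPS module, so treat the details as assumptions.

```python
# A minimal sketch of WPS process discovery using OWSLib (not the 52North
# ArcMap extension); the endpoint URL below is a hypothetical placeholder.
from owslib.wps import WebProcessingService

wps_url = "https://example.org/wps"  # hypothetical WPS endpoint

wps = WebProcessingService(wps_url, skip_caps=True)
wps.getcapabilities()  # fetch the service's GetCapabilities document

print(wps.identification.title)

# Each process the server advertises could appear as a geoprocessing
# tool once exposed through a WPS client such as the ArcMap extension.
for process in wps.processes:
    print(process.identifier, "-", process.title)

# Inspect one process's inputs before building an Execute request.
first = wps.describeprocess(wps.processes[0].identifier)
for data_input in first.dataInputs:
    print("input:", data_input.identifier, data_input.dataType)
```

Once a process's inputs are known, an Execute request can be assembled; wrapping that exchange in a familiar tool dialog is essentially what the 52°North extension provides inside ArcMap.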


ArcGIS Web Processing Service client architecture.

 


The ArcGIS Solutions Templates as a Data Source

November 6, 2017

“ArcGIS solution templates,” located here, are ready-to-use maps and apps targeted at specific industries. They are grouped into categories according to GIS industry sectors: local government, state government, emergency management, water, electric, gas, defense, telecommunications, and parks and gardens. These solution templates cover a wide variety of themes and scales. Under each category are subcategories, such as land records and water utilities under local government, transportation and fish and wildlife under state government, and so on. Another convenient way to browse for the solutions of interest to you is to visit the gallery, which allows searching on keywords, industries, products, and implementation patterns (such as field mobility and operational awareness).

Why include these solution templates in our blog as a data source? Particularly for professors of GIS in universities, and for those who conduct GIS training for a wide variety of audiences, it matters that these templates include sample data. For example, the electric facility templates include a sample geodatabase with an electric utility network and a map document used to publish the service territory service. The Tree Assessor template includes a living plant collection dataset, a map document with defined cartography, a toolbox containing analysis tools, and detailed documentation of the map properties.

Admittedly, the primary purpose of these templates is not data per se, but to help avoid the age-old problem in GIS of requiring each organization to create everything it needs from scratch. The solution templates provide a practical framework, including maps, graphics, workflows, apps, and much more, for an organization to get started with GIS or, if it has already started, to follow some best practices. But the fact that the templates include data and tools makes them an excellent source of data, particularly for the GIS instructor. And many of the data sets, such as water, parcels, and utilities, are those that are typically difficult to obtain, chiefly because they are not distributed by most data portals. The solution templates can also, of course, be used with other data sets that can be found via this blog or via other sources.


A section of the Esri Solutions Gallery.

Categories: Public Domain Data

Era of Big Data is Here: But Caution Is Needed

September 25, 2017

As this blog and our book are focused on geospatial data, it makes sense that we discuss trends in data, such as the laws, standards, attitudes, and tools that are gradually helping more users to more quickly find the data they need. But with all of these advancements, we continue to implore decision makers to think carefully about and investigate the data sources they are using. This becomes especially critical, and at times difficult, when that data falls into the “big data” category. The difficulty arises when big data is seen as so complex that it is often cited and used in an unquestioned manner.

Equally challenging and at times troublesome is when the algorithms based on that data go unchallenged, and when access to those algorithms is blocked to those who seek to understand who created them and what data and formulas they are based on. As these data and algorithms increasingly affect our everyday lives, this can become a major concern, as data scientist Cathy O’Neil explains in her TED talk, in which she says “the era of blind faith in big data must end.”

In addition, the ability to gain information from mapping social media is amazing and has the potential to help in so many sectors of society. This was clearly evident in the usefulness of the social media posts that emergency managers in Texas and Florida, USA, mapped during the August-September 2017 hurricanes there. However, with mapping social media comes an equal if not greater need for caution, as this article points out in describing the limitations of such data for understanding health and mitigating the flu. And from a marketing standpoint, Paul Goad cautioned here against relying on data alone.

It is easy to overlook an important point in all this discussion of data, big data, and data science. We tend to refer to these phenomena in abstract terms, but these data largely represent us: our lives, our habits, our shopping preferences, our choice of route on the way to work, the companies and organisations we work for, and so on. Perhaps we need less data and data science, and more humanity and humanity science. As Eric Schmidt, the former CEO of Google, has said, “We must remember that technology remains a tool of humanity.” How can we, and corporate giants, then use these big data archives as a tool to serve humanity?


Use caution in making decisions from data–even if you’re using “Big Data” and algorithms derived from it.    Photograph by Joseph Kerski. 

Categories: Public Domain Data

Ethics in Geospatial Decision-Making

September 11, 2017

Our book and this blog frequently focus on the importance of making wise decisions when using geospatial data. We often discuss the two-edged sword of the modern GIS era: ‘tis wonderful to have a plethora of geospatial data services at our fingertips, many of which are in real time, can be visualized in 3-D, and are updated and curated with regularity. Coupled with these services are a variety of easy-to-use spatial analysis tools that come bundled with desktop and web-based GIS software platforms. But this availability of data and easy-to-use tools brings an increasing likelihood that decisions will be made based on them without regard to the data’s sources, scales, update frequency, map projection, completeness of attributes, and other measures of quality.

Decisions are still in large part made by humans, and the human element has always been laden with ethical decisions, whether we realize it or not.  Adding to the ethical element is the fact that geospatial decisions involve land, which has economic but also personal and inherent value, and affects people who live on that land.  Geospatial decisions also affect the very air we breathe and water we drink.

How can we be more purposefully aware of ethical elements in our decisions based on geospatial data?  Some insightful chapters and articles will, I think, be of help.  One is the new chapter on Professional and Practical Ethics of GIS&T in the UCGIS GIS&T Body of Knowledge project by David DiBiase.  Another is a 7-Step guide to ethical decision-making, written in 1999 but still incredibly relevant.  I particularly like the tests that the author describes–the harm test, the publicity test, the defensibility test, the reversibility test, the colleague test, and the organization test.

Another excellent resource is Penn State’s ethics education resource for geospatial professionals, which lists interesting and pertinent case studies, codes of ethics, university course syllabi, and other resources.  In a recent article in Directions Magazine, Dr Diana S. Sinton explores how ethics can be integrated into geospatial education.   She advocates that ethics be threaded throughout even an introductory GIS course rather than be relegated to one lecture, as is often the case.

What are your thoughts regarding ethics in GIS?

Geospatial decisions are ethical decisions as well.

Categories: Public Domain Data

Data Practitioner Profile Document Reviewed

July 31, 2017

The recent document entitled “Profile of the Data Practitioner” (created by a diverse panel and published by EDC, Newton, Massachusetts, USA) is useful in several ways. First, it succinctly outlines many of the issues we have focused on in this blog and in our book: data quality, critical thinking, domain knowledge, and others. Second, it lists skills, knowledge, and behaviors, and therefore is an excellent though brief supplement to the Geospatial Technology Competency Model. Third, it lists equipment, tools, and supplies, along with future trends and industry concerns. Fourth, page 2 of the document is a practical application of the Geographic Inquiry Model, as it describes how the data practitioner initiates a project, sources the data, transforms the data, analyzes the data, closes out the project, and engages in professional development.

The document should be helpful for those pursuing their own career path in GIS and data science, and for those designing and teaching courses and workshops in GIS in academia, nonprofit organizations, private companies, and government agencies. I only wish the document were longer or linked to a longer report providing more detail. Still, as a succinct summary of some key items that data practitioners need to have in place, it is worth spending time reviewing and telling others about.

Data Quality on Live Web Maps

June 19, 2017

Modern web maps and the cloud-based GIS tools and services upon which they are built continue to improve in richness of content and in data quality. But as we have noted many times in this blog and in our book, maps are representations of reality. They are extremely useful representations, to be sure, particularly so in the cloud, but they are still representations. These representations depend upon the data sources, accuracy standards, map projections, completeness, processing and rendering procedures used, regulations and policies in place, and much more. A case in point is the offset between street data and satellite image data that I noticed in mid-2017 in Chengdu, in southwestern China. The streets are drawn about 369 meters southeast of where they appear on the satellite image (below):

Google Maps satellite view of Chengdu, China, with the street layer offset to the southeast of the imagery.

Puzzled, I panned the map to other locations in China.  The offsets varied, but they appeared everywhere in the country; for example, note the offset of 557 meters where a highway crosses the river at Dongyang, again to the southeast:

Google Maps view at Dongyang, China, showing a similar offset between the highway and the imagery.

As of this writing, the offset appears in the same cardinal direction and only in China; indeed, after examining border towns with North Korea, Vietnam, and other countries, the offset appears to stop at those borders. No offsets exist in Hong Kong or Macao. Yahoo Maps and Bing Maps both show the same types of offsets in China (Bing Maps example below):

Bing Maps view showing the same type of offset in China.

MapQuest, which uses an OpenStreetMap base, showed no offset. I then tested ArcGIS Online with a satellite image base and the OpenStreetMap base, and there was no offset there, either (below). This offset is a datum issue related to national security that is documented in this Wikipedia article. The same data restriction issues that we discuss in our book and in this blog touch on other aspects of geospatial data, such as fines for unauthorized surveys, the lack of geotagging information on many cameras when the GPS chip detects a location within China, and the seeming unlawfulness of crowdsourced mapping efforts such as OpenStreetMap.
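For readers who want to quantify such an offset themselves, below is a minimal sketch that uses the haversine formula to estimate the ground distance between where a feature sits in the imagery and where the street layer places it, assuming you have noted the coordinates of the same feature in both layers. The coordinates shown are made-up placeholders, not my Chengdu measurements.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two points given in degrees."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Hypothetical example: the same intersection as located on the imagery
# versus where the street layer draws it.
imagery_pt = (30.6570, 104.0660)   # placeholder latitude, longitude
street_pt = (30.6545, 104.0685)    # placeholder latitude, longitude

offset = haversine_m(*imagery_pt, *street_pt)
print(f"Apparent offset: {offset:.0f} meters")
```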

Furthermore, as we have noted, the satellite images are themselves processed, tiled data sets, and like other data sets they need to be critically scrutinized as well. They should not be considered “reality” despite their appearance of being the “actual” Earth’s surface. They too contain error, may have been taken on different dates or in different seasons, may be reprojected on a different datum, and involve other data quality aspects that need to be considered.

china-agol

Another difference between these maps is the wide variation in the amount of detail in the streets data for China. OpenStreetMap was the most complete; the other web mapping platforms offered varying levels of detail, some of which were seriously lacking, surprisingly so in 2017, in almost every type of street except major freeways. The streets content was much more complete in other countries.

It all comes back to identifying your end goals in using any sort of GIS or mapping package. Being critical of the data can and should be part of the decision-making process you use and of your choice of tools and maps. By the time you read this, the image offset problem could have been resolved. Great! But are there now new issues of concern? Data sources, methods, and quality vary considerably among different countries. Furthermore, the tools and data change frequently, along with the processing methods, so being critical of the data is not just something to practice one time, but rather fundamental to everyday work with GIS.

New LandViewer Tool for Quickly Finding and Analyzing Satellite Imagery

May 7, 2017

The LandViewer tool and data portal lets you quickly and painlessly browse and access satellite imagery for the planet. The tool, developed by Max Polyakov’s Earth Observing System Inc., currently features Landsat 8 and Sentinel 2 imagery, with more image sets soon to arrive. Landsat 8 carries two instruments: the Operational Land Imager (OLI), which includes refined heritage bands along with three new bands (a deep blue band for coastal/aerosol studies, a shortwave infrared band for cirrus detection, and a Quality Assessment band), and the Thermal Infrared Sensor (TIRS), which provides two thermal bands. Sentinel 2 is an Earth observation mission developed by ESA as part of the Copernicus Programme to perform terrestrial observations in support of services such as forest monitoring, land cover change detection, and natural disaster management.

Using the LandViewer tool, you can quickly zoom on an interactive web map to your area of interest. You can filter on geography and time, as well as cloudiness, sun angle, and other parameters. At the time of this writing, 18 filters, such as Atmospheric Removal, Panchromatic, NDVI, Thermal Infrared, False Color, and more, are available so that you can obtain the band combinations most suitable to your analysis in agriculture, geology, or other applications. A very helpful image interpretation screen is available to help you choose the combinations that are best for your analysis goals. You can do some contrast stretching in the web tool itself. Then, after signing in to the site, you can download the images as GeoTIFFs for further analysis using your favorite GIS tools.
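As a small taste of that further analysis, here is a minimal sketch, assuming you have downloaded a multi-band Landsat 8 GeoTIFF (the file name is a placeholder), that computes NDVI from the red (band 4) and near-infrared (band 5) bands using the rasterio and NumPy libraries.

```python
import numpy as np
import rasterio

# Placeholder file name for a multi-band Landsat 8 GeoTIFF downloaded from
# an image portal; for Landsat 8, band 4 is red and band 5 is near-infrared.
scene_path = "landsat8_scene.tif"

with rasterio.open(scene_path) as src:
    red = src.read(4).astype("float32")
    nir = src.read(5).astype("float32")
    profile = src.profile

# NDVI = (NIR - Red) / (NIR + Red), guarding against division by zero
denominator = nir + red
ndvi = np.where(denominator == 0, 0, (nir - red) / denominator)

# Write a single-band GeoTIFF that preserves the scene's georeferencing
profile.update(count=1, dtype="float32")
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)
```

The resulting ndvi.tif can then be styled and compared against the portal’s own NDVI rendering in your GIS of choice.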

The tool was also reviewed on the Geoawesomeness web site, and I wholeheartedly agree with the sentiments expressed there: this is one of the most useful and fastest satellite image portals I have used. It is valuable for research and, given its ease of use, can even be used effectively to teach concepts of remote sensing. Give it a try and let us know in the comments section what you think.


Landsat scenes with band combinations possible for an area on the southwest side of Costa Rica.

 

Geospatial Librarians and the GeoBlacklight Data Portal

March 12, 2017

The position of spatial data librarian is not commonplace at universities, but it is growing; I have met at least 10 new librarians in this position over the past several years. This small but expert and energetic group has been making headway on several key innovative projects germane to the themes of this blog and our book. These include creating useful data portals, moving the digital humanities field forward, and coordinating data production, dissemination, and use, not only between departments on their own campuses but also among universities, government agencies, industry, and nonprofit organizations. A group of these spatial data librarians recently met at a “Geo4Lib” camp, for example, and among other topics explored a solution called GeoBlacklight for hosting geospatial data.

One group from Colorado is considering the use of GeoBlacklight tools to host a statewide Colorado GIS data portal. Colorado is sorely in need of such a portal, as it has no curated, supported statewide data organization or portal like those in Texas (TNRIS) or Montana (NRIS), for example. To see GeoBlacklight in action, see Stanford University’s instance of it here, led by my colleague Stace Maples.

Try the Stanford University instance of GeoBlacklight.  What are your reactions to its usefulness, as a geospatial data professional?  Do you have a geospatial data librarian at your local or regional university?  What can the GIS community do to advocate that universities hire such staffpersons in the library?


EarthWorks: Stanford University Libraries’ geospatial data portal, the university’s instance of GeoBlacklight.

 

Harmonising UAS Regulations and Standards: Article Review

October 23, 2016

A recent article in GIM International about harmonising regulations and standards for UASs (Unmanned Aerial Systems, also known as UAVs, Unmanned Aerial Vehicles, or “drones”) is definitely worth reading, providing an excellent summary of this rapidly evolving sector of the geospatial industry. The article, beginning on page 6, appears in a special issue of GIM International dedicated exclusively to UAS, available here. Peter van Blyenburgh summarizes developments in regulations and standardization in Europe, the USA, Japan, and China, and then provides some down-to-earth advice for companies that see only the potential for profits but may not see the bigger picture about liability, regulations, and safety. The issue also includes articles about integrating UAS and multibeam echosounder data, multispectral and thermal sensors on UAVs, and UAS applications in agriculture, and the article “Airborne laser scanning” provides an excellent introduction to the two main platforms: fixed-wing and rotorcraft.

If I am reading the “tea leaves” correctly, in the world of education just about every GIS program offered at a technical college or university will include at least one course in UAS technology and data by this time next year. I would also expect a whole host of MOOCs and other online courses to appear from universities, companies, and GIS organizations to help people use these new tools and technologies effectively. I attended, for example, a multi-hour course on this topic at the recent Geo’Ed community college GIS conference. It reinforced my opinion that while online courses and programs will be helpful, the face-to-face component, actually working with the software and hardware, is particularly useful for UAS: there is no perfect substitute for rolling up one’s sleeves and working with these devices.

As publishing director Durk Haarsma states in his editorial for this special issue, UASs are disruptive technologies because they are influencing so many geospatial fields and subfields, such as cadastral surveying, cultural heritage, and precision agriculture, to name just a few. Because UASs influence how people in an increasing number of professions map and model the world, interpreting the data from those UASs is central to our book and this blog: understanding your data, and how they are obtained, is more critical than ever.


Launching a fixed-wing UAV at the Geo’Ed conference, Louisville Technical College, Kentucky. Photograph by Joseph Kerski. Video here; analysis of thermal imagery here.

Dusting off the spatial data hidden in museum collections

September 11, 2016

This installment of Spatial Reserves is authored by Shelley James and Molly Phillips of iDigBio, Florida Museum of Natural History. We thank these authors very much for their contribution!

If you’ve ever needed to document where a plant or animal species occurs today, or occurred 100 years ago, perhaps the 1 billion biological specimens housed in natural history collections across the USA, and the 5 billion around the world, can help! Each of these specimens imparts knowledge about its existence at a specific place and time. Fish, fossils, birds, skeletons, mushrooms, skins: all with a date and location of collection. The data, found on the labels attached to the specimens and in field notebooks and catalogues, are being transcribed by museum professionals and citizen scientists alike, revealing information about the world’s living organisms dating back to the 1600s, some with very accurate spatial data, others much less so, depending on the geographic knowledge of the collector at the time. iDigBio (Integrated Digitized Biocollections), a project supported by the US National Science Foundation, is collaborating with biological collections across the globe to help combine and mobilize voucher specimen data for research, education, and environmental management uses.

All of this biodiversity data is in a format known as Darwin Core, a standardized set of descriptors enabling biological data from different sources to be combined, indexed, and shared. The iDigBio data portal allows open access to this aggregated data, with filtering by type of organism, by spatial region (using latitude-longitude coordinates, polygons, or place descriptions), and by many other options. The data is delivered dynamically and can be downloaded for use. Currently about 50% of the biological records in iDigBio (over 30 million records) have a geopoint and error estimate, and georeferencing is something the collections community continues to work on in order to improve this valuable dataset. Any tools or data improvements the geospatial community can provide would be a great help as iDigBio expands beyond 65 million specimen records, and we invite you to join the conversation by participating in the iDigBio Georeferencing Working Group.
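As an illustration of what programmatic access might look like, here is a hedged sketch against the iDigBio search API: the endpoint URL, the rq record-query parameter, and the response fields reflect my reading of the public API documentation and should be treated as assumptions to verify rather than a definitive interface.

```python
import json
import requests

# Hedged sketch of an iDigBio search API call; endpoint, parameters, and
# response structure are assumptions based on the public API documentation.
SEARCH_URL = "https://search.idigbio.org/v2/search/records"

# Darwin Core-style record query: pigeon and dove specimens (family Columbidae)
params = {
    "rq": json.dumps({"family": "columbidae"}),
    "limit": 10,
}

response = requests.get(SEARCH_URL, params=params, timeout=30)
response.raise_for_status()

for item in response.json().get("items", []):
    terms = item.get("indexTerms", {})
    name = terms.get("scientificname", "unknown")
    geopoint = terms.get("geopoint")  # e.g. {"lat": ..., "lon": ...} when georeferenced
    print(name, geopoint)
```

Records returned this way, or downloaded from the portal, can then be mapped in a GIS alongside other layers, which is where the georeferencing quality discussed above really matters.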


Pigeons and doves from around the world.  The iDigBio Portal maps the distribution of species and provides specimen record details “on the fly” as filters are applied by the user.  The dataset can be downloaded, or data can be accessed through the iDigBio API.