Archive

Author Archive

Creating fake data on web mapping services

February 16, 2020 2 comments

In keeping with this blog’s theme of “be critical of the data,” consider a fascinating recent story: an artist pulled 99 smartphones around Berlin in a little red wagon to create fake traffic jams on Google Maps, tracking how the phones affected the traffic layer.  On each phone, he called up the Google Maps interface.  As we discuss in our book, traffic and other real-time layers depend in large part on data contributed by the citizen science network: ordinary people contributing data to the cloud, in this and other cases not intentionally.  Wherever the phones traveled, Google Maps showed a traffic jam for a while, displaying a red line and routing users around the area.

It wasn’t difficult to do, and it shows several things: (1) the Google Maps traffic layer was doing what it was supposed to do, reflecting what it perceived as true local conditions; (2) it can sometimes be easy to create fake data using web mapping tools, hence, be critical of data, including maps, as we have been stating on this blog for 8 years; and (3) the Internet of Things includes people, and at 7.5 billion strong, people have a great influence over that sensor network.

The URL of his amusing video showing him toting the red wagon around is here,  and the full URL of the story is below:
https://www.businessinsider.com/google-maps-traffic-jam-99-smartphones-wagon-2020-2

I just wonder how he was able to obtain permission from 99 people to use their smartphones.  Or did he buy 99 phones on sale somewhere?

–Joseph Kerski


Key Global Biodiversity and Conservation Data Sources

February 2, 2020 Leave a comment

Advances in the following two resources, and the sheer volume and diversity of data they contain, merit mention in this data blog, and I recommend investigating them as part of your own work.

  1. The Global Biodiversity Information Facility (www.gbif.org) contains point data on an amazing number and diversity of species.  It also contains over 12 million research-grade observations from the iNaturalist citizen science community.
  2. IUCN, the International Union for Conservation of Nature:  You can filter and download polygon boundary layers from the IUCN spatial data download portal at https://www.iucnredlist.org/resources/spatial-data-download.  The IUCN Red List of Threatened Species™ contains global assessments for 112,432 species, and more than 78% of these (>88,500 species) have spatial data.  The spatial data provided on the site are for comprehensively assessed taxonomic groups and selected freshwater groups.  The site indicates that some species (such as those listed as Data Deficient) are not mapped, and that subspecies, varieties, and subpopulations are mapped within the parental species.  The data are in Esri shapefile format and contain the known range of each species, although sometimes the range is incomplete.  Ranges are depicted as polygons, except for the freshwater HydroBASIN tables.

To use either resource, all you need is a free account.  The data sets can be combined, after which you can examine potential outliers, perform hot spot analysis, use the data in space time cubes, create habitat suitability models and risk models, and much more.
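
For those who prefer scripting to manual downloads, GBIF also exposes its occurrence records through a public REST API.  Below is a minimal Python sketch (the species name and filter values are just examples) that pulls a set of georeferenced records which could then be combined with IUCN range polygons for the kinds of analyses mentioned above:

```python
import requests

# Query the public GBIF occurrence search API for georeferenced records.
# The species and filter values here are illustrative only.
resp = requests.get(
    "https://api.gbif.org/v1/occurrence/search",
    params={
        "scientificName": "Puma concolor",  # example species
        "hasCoordinate": "true",            # only records with coordinates
        "limit": 100,
    },
    timeout=30,
)
resp.raise_for_status()

records = resp.json().get("results", [])
points = [
    (r.get("decimalLongitude"), r.get("decimalLatitude"))
    for r in records
    if r.get("decimalLatitude") is not None
]
print(f"Retrieved {len(points)} occurrence points")
# These points could be joined with IUCN range polygons for outlier checks,
# hot spot analysis, space time cubes, or habitat suitability modeling.
```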

Joseph Kerski

gbif.PNG

Some of the resources available from the Global Biodiversity Information Facility (GBIF).  

Curated list of thousands of ArcGIS server addresses

January 19, 2020 1 comment

Joseph Elfelt from mappingsupport.com recently added many government ArcGIS server addresses to his curated list. The list features over 2,200 addresses for ArcGIS servers, from the federal level down to the city level. All links are tested by his code once per week, bad links are fixed or flagged, and a new list is posted every Wednesday morning. The list is here.  While we have written about this very useful list in the past, such as here, it is a resource worth reminding the community about. And, as a geographer, I find the geographic organization of the list quite easy to follow.

While browsing the list recently, I found, among many other things, an Amtrak train route feature service (shown below), resources at the Wisconsin Historical Society, and water resources data from the USGS Oklahoma Water Science Center.

Joseph Elfelt is also actively maintaining his “GISsurfer” application, which allows the user community to examine GIS data in a map-centric manner.

amtrak

Amtrak routes data service, which I found to be fascinating and which I discovered on Joseph Elfelt’s server listing.
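
If you want to pull features from one of these servers into your own workflow, most of them support the standard ArcGIS REST API query operation.  Here is a short Python sketch; the layer URL is hypothetical, so substitute any feature service address from the list (GeoJSON output also depends on the server version, so f=json is a safe fallback):

```python
import requests

# Hypothetical feature service layer URL; replace with an address from the list.
layer_url = (
    "https://services.example.com/arcgis/rest/services/Amtrak_Routes/"
    "FeatureServer/0"
)

# Standard ArcGIS REST "query" operation: return all features as GeoJSON.
resp = requests.get(
    f"{layer_url}/query",
    params={"where": "1=1", "outFields": "*", "f": "geojson"},
    timeout=30,
)
resp.raise_for_status()

features = resp.json().get("features", [])
print(f"{len(features)} features returned")
```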

I highly recommend browsing this list if you need, or anticipate needing, geospatial data!

–Joseph Kerski

Dangermond and Goodchild on building geospatial infrastructure

January 5, 2020 2 comments

A new open access article from Dangermond and Goodchild on building geospatial infrastructure is germane to this blog and our book’s focus on geospatial data.  Moreover, at the dawn of the new decade, I regard this article as an important one to read and to reflect upon.

The article’s abstract states, “Many visions for geospatial technology have been advanced over the past half century. Initially researchers saw the handling of geospatial data as the major problem to be overcome. The vision of geographic information systems arose as an early international consensus. Later visions included spatial data infrastructure, Digital Earth, and a nervous system for the planet. With accelerating advances in information technology, a new vision is needed that reflects today’s focus on open and multimodal access, sharing, engagement, the Web, Big Data, artificial intelligence, and data science. We elaborate on the concept of geospatial infrastructure, and argue that it is essential if geospatial technology is to contribute to the solution of problems facing humanity.”

Besides providing a concise yet insightful history of the evolution of GIS and spatial data, the article offers what is, in my opinion, one of its most thought-provoking statements: “a digital twin should also replicate how the Earth works, by using software to reproduce the processes that modify the Earth’s physical and social systems.”  In other words, for us to solve the complex problems of our 21st Century world, GIS must be able to show how the Earth’s systems interact and work, and moreover, how they should work; that is, how can we use GIS and spatial data to plan a more resilient future?

I also found the following statement to be wonderfully useful: “Today we think of the basic element of geospatial technology as a tuple <x,y,z,t,a> where x, y, and z are location in three-dimensional space, t is time, and a is an attribute of that location in space-time.”  And I have personally used Jack Dangermond’s metaphor of GIS as an intelligent nervous system for the planet, mentioned in the article, dozens of times in my own presentations over the past four years.
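
As a toy illustration of that tuple (my own rendering, not anything from the article), the basic element might be expressed in code like this:

```python
from typing import NamedTuple

class GeoObservation(NamedTuple):
    """One <x, y, z, t, a> element: a location in space-time plus an attribute."""
    x: float  # longitude or easting
    y: float  # latitude or northing
    z: float  # elevation or depth
    t: str    # timestamp (ISO 8601 string, for simplicity)
    a: float  # the attribute measured at that location and time

obs = GeoObservation(x=-105.08, y=39.74, z=1655.0,
                     t="2020-01-05T12:00:00Z", a=3.2)
```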

But the article is much more than an account of history and a conceptualization of how to understand GIS: it offers challenges to us in the GIS community.  For example, it states that “little advance has been made in measuring, representing, and dealing with the uncertainty that is always present when geographic reality is measured and captured in digital data, but not always present in published maps.”  The article also takes special note of key progress that has led us to this very exciting moment in the use of geospatial technology, including (1) portals, (2) open data, (3) engagement, (4) collaboration, (5) story maps, (6) device independence, and (7) Cloud GIS.  These are not just ideas; they are happening now, with real tools and infrastructure that enable people to accomplish real tasks.  The article also highlights advancements that raise the very real possibility that GIS could “break wide open” (my phrase) in the decade of the 2020s: (1) GeoAI, (2) scripting and workflows, (3) replicability, (4) predictive modeling, and (5) real-time analysis.

The article concludes with what I believe to be an excellent question that cuts to the heart of what we in the industry should be asking:  “What, then, should be the goals of geospatial infrastructure, and how should it be configured?”  In other words, the advancements are great, but we need to ask ourselves where we should be taking the technology if we are seeking a more sustainable future.  It’s not enough to ride on the ship; we need to steer it.  Dangermond and Goodchild lay out some challenges in this section, such as the following statement, which I think calls on us to think outside the software box and re-engineer the software tool if necessary:  “Decisions that were made during periods of very limited computing power become enshrined in practices that may be very hard to shake.”  They also discuss resilience, protecting biodiversity, collaboration, and ensuring individual privacy.  The authors end with this statement, which I believe is a challenge for all of us to take seriously:  “But what is missing in our view is a vision, a ‘moonshot,’ a statement of principles against which progress can be measured.”

–Joseph Kerski


Download Arctic area digital elevation data from ArcticDEM

December 22, 2019 Leave a comment

Here we are at the winter solstice in the Northern Hemisphere, and it seems appropriate to discuss polar data.  ArcticDEM is an NGA (National Geospatial-Intelligence Agency) – NSF (National Science Foundation) public-private initiative to produce a high-resolution (2 meter), high-quality digital surface model (DSM) of the Arctic using optical stereo imagery, high-performance computing, and open source photogrammetry software.  The majority of ArcticDEM data was generated from the panchromatic bands of the WorldView-1, WorldView-2, and WorldView-3 satellites; a small percentage was also generated from the GeoEye-1 satellite sensor.  The resource covers all land north of 60 degrees north latitude.  Yes!  Not only Alaska, but Scandinavia, Russia, Canada, and Iceland.  For more information, see this page.  For the Arctic DEM Explorer web mapping application from Esri, see this page, and for the bare-bones but useful file index for fast downloading, see this page.

In my opinion, the most useful site for downloading the data is this web mapping application, the ArcticDEM index and data download.  This application allows a user to select specific index tiles of digital elevation model data.  Each tile reveals information about the DEM and a download URL.  Each cell is about 2 GB, with over 18 TB on the entire site.  Truly a treasure trove of data!  To select multiple indices, use the ‘Query’ tool to draw an area and return information on the intersecting DEM tile indices.  You can export these results, which also include the download URLs, for your reference.

Click on any location for attribute information.  Find the “fileurl” attribute, click on More info, and then you will be able to download the 2 meter elevation data for that location.  The query widget allows for the retrieval of information from source data by executing an intersect query against either the 2 m DEM strips or the 2 m DEM mosaics.  The resource also includes a swipe tool with which you can compare the content of two different layers on the map, such as the index layer and the hillshade.
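
Once you have copied a tile’s “fileurl” value, the download itself can easily be scripted.  The sketch below uses a made-up URL purely for illustration (real URLs come from the index application), and because the archives run to gigabytes, it streams the file to disk rather than holding it in memory:

```python
import requests

# Hypothetical value copied from a tile's "fileurl" attribute in the
# ArcticDEM index; real URLs come from the download application itself.
fileurl = "https://data.example.org/arcticdem/2m/tile_xx_yy_2m.tar.gz"

# Stream the multi-gigabyte archive to disk in 1 MB chunks.
with requests.get(fileurl, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    with open("arcticdem_tile.tar.gz", "wb") as out:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            out.write(chunk)
```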

The best news about this resource, and consistent with our continued mantra about GIS as a SaaS, may be that the site allows for the data to be examined as an ArcGIS Online item and also as an image service via a URL.

arcticdem.PNG

Interface of the Arctic DEM Index and Data Download resource. 

arcticdem2.PNG

The Arctic DEM data streamed and viewed in ArcGIS Online. 

I look forward to hearing your reactions to this resource.

–Joseph Kerski

A Geodata Fabric for the 21st Century, article reflections

December 15, 2019 4 comments

A recent article by Jeff de La Beaujardière, “A Geodata Fabric for the 21st Century,” touches on many themes in geospatial technology that are pertinent to this blog and beyond.

Jeff begins by reminding us of the 4 V’s of big data (volume, variety, velocity, and variability), telling us that we are firmly in the age of big data, with the NISAR satellite soon to be providing 85 TB of data per day, as just one example.  But he also states that geospatial and earth science are not the only fields grappling with big data, giving impressive numbers coming out of astronomy and genomics (genome science).  Jeff says, “We need a more unified approach such that each data provider—whether in the atmosphere, land surface, seismology, hydrology, oceanography, or cryosphere domain—can contribute to a shared and commonly accessible framework.”  To build it, he says we need (1) a new type of storage (such as object storage); (2) to minimize the number of times we move data (I think of how many times in a typical project I move data around: can I reduce this number?); (3) to take advantage of the cloud; and (4) to keep things simple.  Jeff says, “A user should be able simply to ask for—or directly visualize—a desired data set, time range, and area of interest while software behind the scenes automatically provides what was requested.”  Amen to that!  And he makes a good tie to the role that machine learning could play.  Could the Esri geospatial cloud help enable this?
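
To make that “ask for a data set, time range, and area of interest” idea concrete, here is a rough sketch of what the pattern can already look like with analysis-ready data in cloud object storage.  The bucket, variable, and coordinate names are hypothetical, and the sketch assumes xarray with a Zarr-backed store and the appropriate fsspec filesystem (for example, s3fs) installed:

```python
import fsspec
import xarray as xr

# Hypothetical Zarr store on cloud object storage; any analysis-ready,
# cloud-optimized dataset with time/lat/lon coordinates would work.
store = fsspec.get_mapper("s3://example-bucket/sea-surface-temp.zarr", anon=True)
ds = xr.open_zarr(store, consolidated=True)

# Ask only for the time range and area of interest; the library reads just
# the chunks it needs instead of copying the whole archive locally.
subset = ds["sst"].sel(
    time=slice("2019-01-01", "2019-12-31"),
    lat=slice(60, 90),    # Arctic latitudes (assumes an ascending latitude axis)
    lon=slice(-180, 180),
)

# Nothing is computed until the values are actually needed.
print(subset.mean(dim="time"))
```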

Taking a step back from the technological and logistical aspects of collecting and managing large volumes of data, we also need to ask what we want from all this data in the short, medium, and longer term. Our aspirations and expectations are sometimes harder to define and maintain. What do we want to do with all this data, and when do we need to do it? There are many great examples of things we can do with spatial data, but sometimes they seem to focus more on the technology, the latest version of a particular software package, or an innovation in data management, than on progress towards a longer-term goal such as improved environmental and resource management.

The improvements in data collection, storage and management over the last 50 years have revolutionised what we can capture and what we can do with the data. To make the most of these invaluable data assets, we must also avoid the distraction of the bright shiny lights of technology for technology’s sake and keep in mind what we are trying to achieve. Start with the desired end result: what data will help achieve it, and what are the best sources, formats, and currency, regardless of how the data are stored and whose server they sit on?

–Jill Clark, Joseph Kerski



Be a Wise Consumer of Fun Posts, too!

December 12, 2019 12 comments

Around this time of year, versions of the following story seem to make their way around the internet:

The passenger steamer SS Warrimoo was quietly knifing its way through the waters of the mid-Pacific on its way from Vancouver to Australia. The navigator had just finished working out a star fix & brought the master, Captain John Phillips, the result. The Warrimoo’s position was LAT 0º 31′ N and LON 179º 30′ W.  The date was 31 December 1899.

“Know what this means?” First Mate Payton broke in, “We’re only a few miles from the intersection of the Equator and the International Date Line”.  Captain Phillips was prankish enough to take full advantage of the opportunity for achieving the navigational freak of a lifetime.  He called his navigators to the bridge to check & double check the ships position.  He changed course slightly so as to bear directly on his mark.  Then he adjusted the engine speed. The calm weather & clear night worked in his favor. 

At midnight the SS Warrimoo lay on the Equator at exactly the point where it crossed the International Date Line! The consequences of this bizarre position were many:  The forward part (bow) of the ship was in the Southern Hemisphere & the middle of summer. The rear (stern) was in the Northern Hemisphere & in the middle of winter.  The date in the aft part of the ship was 31 December 1899.  Forward it was 1 January 1900.  This ship was therefore not only in two different days, two different months, two different years, two different seasons, but in two different centuries – all at the same time.

I have successfully used many types of geographic puzzles with students and with the general public over the years, and I enjoy this story a great deal.  But in keeping with our reminders on this blog and in our book to “be critical of the data,” reflecting on the incorrect or absent aspects of this story can be instructive and can heighten interest. The SS Warrimoo was indeed an actual ship: it was built by Swan & Hunter Ltd in Newcastle Upon Tyne, UK, in 1892, and was sunk after a collision with a French destroyer during World War I, in 1918.  Whether it was sailing in the Pacific in 1899, I do not know.

The version of this story on CruisersForum states that it is “mostly true.”  What lends itself to scrutiny?  Let us investigate a few of the geographic aspects in the story.

First, the statement “working out a star fix” leaves out the fact that chronometers, rather than a sextant alone, were used to work out the longitude.  (And I highly recommend reading the book Longitude by Dava Sobel.)  Second, the International Date Line (IDL) as we know it today was not in place back in 1899.  The nautical date line, not the same as the IDL, is a de jure construction determined by international agreement. It is the result of the 1917 Anglo-French Conference on Time-keeping at Sea, which recommended that all ships, both military and civilian, adopt hourly standard time zones on the high seas. The United States adopted its recommendation for U.S. military and merchant marine ships in 1920 (Wikipedia).

Third, the distance from LAT 0º 31′ N, LON 179º 30′ W to LAT 0º 0′, LON 180º is about 42 nautical miles, and the ship could have traveled at a speed of no more than 20 knots (23 mph).  Therefore, conceivably, the ship could have reached the 0/180 point in a few hours, but whether it could have maneuvered in such a way as to place the bow and stern in different hemispheres is unlikely, given the accuracy of measurement devices at the time.  Sextants have an error of at least 2 kilometers in latitude, and chronometers about 30 kilometers in longitude. Or, the crew could already have reached the desired point earlier in the day and not have known it.  Even 120 years later, in my own work with GPS receivers at intersections of full degrees of latitude and longitude, it is difficult to get “exactly” on the desired point:  look carefully at the GPS receiver in my video at 35 North Latitude 81 West Longitude as an example.  An interesting geographic fact is that, going straight east or west along the Equator, it is possible to cross the date line three times (see map below).
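
For readers who want to check the arithmetic, here is a quick haversine sketch (the exact figure depends on the Earth radius used) confirming the rough distance and steaming time:

```python
import math

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles between two lat/lon points."""
    r_nm = 3440.065  # mean Earth radius in nautical miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_nm * math.asin(math.sqrt(a))

# Reported position (0º 31' N, 179º 30' W) to the Equator/date line crossing.
dist = haversine_nm(31 / 60, -(179 + 30 / 60), 0.0, 180.0)
print(f"{dist:.1f} nautical miles")        # roughly 42 to 43 nautical miles
print(f"about {dist / 20:.1f} hours at 20 knots")
```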

Our modern digital world is full of fragments that are interesting if not completely accurate, but as GIS professionals and educators, we should apply “be critical of the data” principles even to this type of information.  The story is still interesting as a hypothetical “what could have happened,” and it provides great teachable moments even if the actual event never occurred.

international_date_line

The International Date Line (CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=147175).