
Reflections on the “Effective Use of Geospatial Big Data” article

Glyn Arthur, in a thought-provoking article in GIM International entitled “Effective Use of Geospatial Big Data”, raises several issues that have been running through the Spatial Reserves blog.  The first is that the “heart of any geospatial analysis system is increasingly becoming the server.” Glyn, a GIS professional with over 25 years’ experience, then dives into one of the chief challenges that this environment brings: dealing with the increasing quantity and variety of data that the world produces. Of particular importance are emerging sensor platforms, which must be incorporated into future GIS applications. The second point is the need to embrace, not avoid, the world of big data and its benefits, while also recognizing the challenges it brings. The third point is to carefully consider the true costs of the data server and decision-making solution when making a purchasing decision.

Frankly, I found the “don’t beat around the bush” theme of Glyn’s article refreshing. This is evident in such statements as, “for mission-critical systems, purposely designed software is required, tested in the most demanding environments. Try doing it cheaper and you only end up wasting money.”  Glyn also points out that the “maps gone digital” attitude “disables.” I think what Glyn means by this is that systems built around the view that GIS is just a digital means of doing what we used to do with paper maps will be unable to meet the needs of organizations in the future (or dare I say, even today). Server systems must move away from the “extract-transform-load” paradigm to meet the high-speed and large-data demands of users.  Indeed, in this blog we have praised those portals that allow for direct streaming into GIS packages from their sites, such as here in Utah and here in North Dakota.  The article also digs into the nuts and bolts of how to decide on what solution should be purchased, considering support, training, backwards compatibility, and the needs of the user and developer community. Glyn points out something that might not sit well with some, but that I think is relevant and needs to be grappled with by the GIS community: a weakness of open source software is the occasional lack of training from people with relevant qualifications and a direct relationship with the original coding team, particularly when lives and property are at stake.
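To make the streaming idea concrete, here is a minimal sketch of my own (not taken from Glyn’s article), using Python with geopandas: a feature service is read directly over the web into an analysis-ready dataframe, with no separate extract-transform-load step. The service URL is a hypothetical placeholder, not an actual Utah or North Dakota portal endpoint.

    # A minimal sketch of "stream, don't extract-transform-load":
    # read a remote feature service straight into a GIS-ready dataframe.
    import geopandas as gpd

    # Hypothetical ArcGIS REST endpoint returning GeoJSON (placeholder URL).
    SERVICE_URL = (
        "https://example-gis-portal.gov/arcgis/rest/services/"
        "Parcels/FeatureServer/0/query?where=1%3D1&outFields=*&f=geojson"
    )

    # geopandas reads the remote GeoJSON over the network; no local
    # download/convert/load step is needed before analysis begins.
    parcels = gpd.read_file(SERVICE_URL)
    print(parcels.crs, len(parcels))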

Glyn cites some examples from work at Luciad with big data users such as NATO, EUROCONTROL, Oracle, and Engie Ineo. Geospatial server solutions should be able to connect to a multitude of geographic data formats, publish data with a few clicks, and allow data to be accessed and represented in any coordinate system, especially temporal and 3D data that includes ground elevation and moving objects.
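As a small illustration of the any-coordinate-system requirement (again my own sketch, not drawn from Luciad’s software), pyproj can transform coordinates between reference systems on the fly, here from WGS84 longitude/latitude to UTM zone 12N. The sample coordinates are illustrative only.

    # A minimal sketch of on-the-fly coordinate transformation with pyproj.
    from pyproj import Transformer

    # always_xy=True keeps a consistent (longitude, latitude) axis order.
    transformer = Transformer.from_crs("EPSG:4326", "EPSG:32612", always_xy=True)

    # Transform a point near Salt Lake City from geographic coordinates
    # to UTM zone 12N easting/northing (metres).
    easting, northing = transformer.transform(-111.89, 40.76)
    print(f"{easting:.1f} E, {northing:.1f} N")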


Glyn Arthur’s article about effective use of geospatial big data is well-written and thought-provoking.
