Archive for July, 2012

Reflections on Spatial Data from the Esri International User Conference

Having just spent a week with over 14,000 people at the annual Esri GIS Education Conference and the Esri International User Conference, we found it plainly evident that the themes we examine in The GIS Guide to Public Domain Data remain at the forefront of the conversations the GIS community is having.  We would also argue that these themes need to be a part of those conversations.  First, spatial data is rapidly becoming ubiquitous on just about every electronic device that we use for work and for play.  Second, with ArcGIS Online and other tools, we are now firmly in an era where every data consumer is also a data producer.  Third, with this avalanche of data and citizen science capabilities comes an increasing responsibility to use and produce data wisely.  Lastly, and most importantly, as the slide that Esri President Jack Dangermond showed during the plenary (videos here) illustrates, we cannot afford to be complacent.  The world is changing, and pressing issues of biodiversity, climate, population, food, water, natural hazards, and others need to be solved.  We won’t be able to make effective decisions about these issues and plan for the future unless we understand spatial data.

Our World Is Changing Rapidly: Slide from Esri International User Conference

Categories: Public Domain Data

Geospatial Advances Drive the Big Data Problem but Also its Solution

In a recent essay, Erik Shepard claims that geospatial advances drive the big data problem but also its solution:  http://www.sensysmag.com/article/features/27558-geospatial-advances-drive-big-data-problem,-solution.html.  The expansion of geospatial data is estimated at 1 exabyte per day, according to Dr. Dan Sui.  Land use data, satellite and aerial imagery, transportation data, and crowd-sourced data all contribute to this expansion, but GIS also offers the tools to manage the very data to which it contributes.

We discuss these issues in our book, The GIS Guide to Public Domain Data.  These statements from Shepard are particularly relevant to the reflections we offer in our book:  “Today there is a dawning appreciation of the assumptions that drive spatial analysis, and how those assumptions affect results.  Questions such as what map projection is selected – does it preserve distance, direction or area? Considerations of factors such as the modifiable areal unit problem, or spatial autocorrelation.”

Indeed!  Today’s data users have more data at their fingertips than ever before.  But with that data comes choices about what to use, how, and why.  And those choices must be made carefully.

Categories: Public Domain Data

Going public with government data

Recent events in Colorado have once again highlighted just how important it is to have access to current and accurate spatial data when faced with extreme events such as wildfires.

As Aliya Sternstein describes in a post for the Nextgov newsletter, once a federal disaster has been declared, the US government can make certain datasets available that wouldn’t otherwise be in the public domain. When analysed alongside up-to-date mapping, live weather reports, and other bulletins, this combination of public and private data has proven invaluable in predicting the likely spread of the fires and ensuring resources are available as soon as possible. It also means less risk for emergency workers on site.
Information like this, such as the location of water pumps and power plants, is generally only available in exceptional circumstances, on a need-to-know basis. Aside from federal disasters, are there any other situations when such data are made available? Should this information be readily accessible unless there is a compelling need-to-not-know?
Aftermath of High Park wildfire, Colorado, 2012