Welcome to the Spatial Reserves blog.
The GIS Guide to Public Domain Data was written to provide GIS practitioners and instructors with the essential skills to find, acquire, format, and analyze public domain spatial data. Themes discussed in the book include open data access and spatial law, the importance of metadata, the fee vs. free debate, data and national security, the efficacy of spatial data infrastructures, the impact of cloud computing, and the emergence of the GIS-as-a-Service (GaaS) business model. Recent technological innovations have radically altered how both data users and data providers work with spatial information to address a diverse range of social, economic, and environmental issues.
This blog was established to follow up on some of these themes, promote discussion of the issues raised, and host a copy of the exercises that accompany the book. This storyboard provides a brief description of the exercises.
A new article in Earthzine entitled “Data Drives Everything, but the Bridges Need a Lot of Work” by Osha Gray Davidson seems to encapsulate one of the main themes of this blog and our book.
Dr. Francine Berman directs the Center for a Digital Society at Rensselaer Polytechnic Institute in Troy, New York, and, as the article states, “has always been drawn to ambitious ‘big picture’ issues” at the “intersection of math, philosophy, and computers.” Her project, the Research Data Alliance (RDA), has a goal of changing the way in which data are collected, used, and shared to solve specific problems around the globe. Those large and important tasks should sound familiar to most GIS professionals.
And the project seems to have resonated with others, too: 1,600 members from 70 countries have joined the RDA. Reaching across boundaries and breaking down the barriers that make data sharing difficult or impossible is one of the RDA’s chief goals. Solutions to real-world problems are pursued through Interest Groups, which in turn create more focused Working Groups. I was pleased to see Interest Groups such as Big Data Analytics, Data In Context, and Geospatial, but at this point a Geospatial Working Group is still needed; perhaps someone from the geospatial community needs to step up and lead that effort. I read the charter for the Geospatial Interest Group and, though brief, it seems solid, identifying some of the chief challenges and the major organizations to work with to make the group’s vision a reality.
I wish the group well, but wishing alone isn’t going to achieve data sharing for better decision making. As we point out in our book with regard to this issue, geospatial goals for an organization like this will not be realized without the GIS community stepping forward. Please investigate the RDA and consider how you might help this important effort.
We’ve previously written about the launch and progress of the Sentinel-1A satellite, part of the European Union’s Copernicus Earth observation programme. Although still being commissioned and not yet in full production mode, the satellite recently provided radar imagery of Northern California captured before and after the Napa Valley earthquake on 24 August.
Using a technique known as synthetic aperture radar interferometry (InSAR), two images of the same area were compared to identify areas of significant change. Changes to the ground surface modify the reflected radar signal detected by the satellite, and those modified signals can be plotted as an ‘interferogram’ (source: Radar vision maps Napa Valley earthquake). The result is both colourful and striking: the fault responsible for the magnitude-6.0 earthquake was confirmed as the West Napa Fault, and both the scale and the extent of the surface rupture were immediately apparent.
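The core idea is simpler than it sounds: an interferogram is formed by multiplying one complex radar image by the complex conjugate of the other, and the phase of that product encodes change at the surface. A minimal sketch with synthetic NumPy data (real Sentinel-1 processing also requires co-registration, flattening, and phase unwrapping, none of which is shown here):

```python
import numpy as np

def interferogram_phase(slc1, slc2):
    """Phase of the complex interferogram of two co-registered SAR images.

    Multiplying one single-look-complex (SLC) image by the complex
    conjugate of the other yields the interferogram; its phase encodes
    changes to the ground surface between the two acquisitions.
    """
    return np.angle(slc1 * np.conj(slc2))

# Synthetic example: two unit-amplitude 4x4 "images" whose phase
# differs by a uniform 0.3 radians (a stand-in for displacement).
before = np.exp(1j * 0.5) * np.ones((4, 4))
after = np.exp(1j * 0.2) * np.ones((4, 4))
phase = interferogram_phase(before, after)  # ~0.3 rad everywhere
```

In a real interferogram the phase varies across the scene, producing the coloured fringes seen in the Napa Valley image, with each full phase cycle corresponding to a fixed increment of ground movement along the radar's line of sight.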
Imagery like that captured for the Napa Valley quake looks set to transform how scientists and data analysts map and respond to earthquakes. With the launch of the second Sentinel satellite (Sentinel-1B) in 2016, the imagery update cycle will be reduced from 12 days to 6. The timely and open publication of high-resolution data to support activities on the ground and post-quake analysis after each event should provide unprecedented monitoring of the Earth’s surface.
A new web resource from Texas Tech University covering playas and wetlands in the southern High Plains region of Texas, Oklahoma, and New Mexico offers a wide variety of spatial data on this key resource and region. The playa and wetlands GIS data are available for download here in shapefile, geodatabase, and layer package formats. The data include 64,726 wetland features, of which 21,893 are identified as playas and another 14,455 as unclassified wetlands; that is, features that appear to be playas but show no evidence of a hydric soil. The remaining features include impoundments, riparian features, lakes, and other wetlands.
As we discuss in our book, many spatial data repositories seem to have been created without the GIS user in mind. Not this one: careful attention has been paid to the data analyst, and that’s good news. Resources such as this don’t appear without a great deal of time and expertise invested; here, approximately 5,000 person-hours went into creating the geodatabase and website. The project was made possible by Texas Tech University with funding from the USDA Agricultural Research Service – Ogallala Aquifer Program.
For users who only wish to view playas and other wetlands, a web map application can be launched via the playa viewer. In a “citizen science” touch, the viewer allows comments to be added interactively to the map for future consideration.
Southern Ogallala Aquifer Playa and Wetlands Geodatabase.
We recently came across the Moves app, an always-on data logger that records walking, cycling, and running, with the option to monitor over 60 other activities that can be configured manually. By keeping track of calorie burn during both activity and idle time, the app provides ‘an automatic diary of your life’ and, by implication, assuming location tracking is always enabled as well, an automatic log of your location throughout each day. While this highlights a number of privacy concerns we have written about in the past (including Location Privacy: Cellphones vs. GPS and Location Data Privacy Guidelines Released), it also opens up possibilities for insightful, real-time or near real-time analytical investigations into what wearers of a particular device or users of a particular app are doing at any given time.
Gizmodo reported today on an activity chart released by Jawbone, makers of the Jawbone UP wristband tracking device, which showed a spike in activity for UP users at the time a 6.0-magnitude earthquake struck the Bay Area of Northern California in the early hours of Sunday 24 August 2014. Analysis of the users’ data revealed some insight into the geographic extent of the quake’s impact, with the number of UP wearers active at the time of the quake decreasing with increasing distance from the epicentre.
Source: The Jawbone Blog
This example provides another timely illustration of just how much personal location data is being collected and how those data may be used in ways never anticipated by the end users. However, it also shows the potential for devices and apps like these to provide real-time monitoring of what’s going on at any given location, information that could help save lives and property. As with all innovations, there are pros and cons to consider; striking the right balance between respecting users’ privacy and reusing some of their location data will help ensure that data mining initiatives such as this are seen as positive and beneficial rather than invasive and creepy.
A recent article in the New York Times, discussing “What the Internet Can See from Your Cat Pictures”, began with the statement, “Your cat may never give up your secrets. But your cat photos might.” The article went on to describe a site named, appropriately, “iknowwhereyourcatlives.com”, built by Florida State University professor Owen Mundy. The site’s web map shows the locations and photographs of thousands of cats and, presumably, the locations of their owners. The site was created to demonstrate “the status quo of personal data usage by startups and international megacorps who are riding the wave of decreased privacy for all,” Professor Mundy wrote in describing the site.
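The “secrets” a photo gives up usually live in its EXIF GPS tags: many phones and cameras embed latitude and longitude as degrees/minutes/seconds triples in every image, and converting those to map coordinates takes a few lines. A sketch with invented tag values (reading them from an actual file would require an EXIF library such as Pillow, not shown here):

```python
def dms_to_decimal(dms, ref):
    """Convert an EXIF (degrees, minutes, seconds) triple plus its
    N/S/E/W reference letter to signed decimal degrees."""
    degrees, minutes, seconds = (float(v) for v in dms)
    value = degrees + minutes / 60 + seconds / 3600
    # South and West hemispheres are negative in decimal degrees.
    return -value if ref in ("S", "W") else value

# EXIF GPSInfo values as an EXIF reader would return them
# (hypothetical photo taken in San Francisco):
lat = dms_to_decimal((37, 46, 30), "N")    # 37.775
lon = dms_to_decimal((122, 25, 12), "W")   # -122.42
```

Plot those coordinates on a web map and you have, in effect, the cat-location site: no scraping of profiles required, just metadata the uploader never knew was there.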
We frequently write about location privacy in this blog, and for good reason. As the world becomes ever more monitored and measured, the 7+ billion humans inhabiting it are increasingly affected by these monitoring activities. They are also increasingly contributing to vast archives of data, often inadvertently, in part through the “Internet of Things.” An increasing proportion of the data collected can be mapped, and therefore so can people’s locations, movements, and habits. The Centre for Spatial Law and Policy site alone contains news and dozens of documents pointing to current issues of location privacy in our everyday lives, including a recent story about privacy at the Boston Marathon and frequent reflections about laws and expectations of location privacy. We recommend that geospatial professionals stay aware of the issues through this blog and sites such as the Centre for Spatial Law and Policy.
Geospatial technologies have proven to benefit our planet in ways unimaginable even a few years ago. However, those involved with the geospatial industry need to be included in the conversations about the privacy implications of the data collected to make those improvements. And even seemingly innocuous activities such as posting pictures of your cat carry their share of privacy implications.
Just as the open government data and free public access movement continues to go from strength to strength, it seems that personal data could soon be a new currency in the digital information markets, where companies and other interested parties bid for the right to use that data for their own purposes.
Jacopo Staiano at the University of Trento in Italy recently conducted an experiment to assess the perceived value of personal location information. The study, reported in the MIT Technology Review, involved 60 participants using smartphones that collected a variety of information, including the number of calls made, the applications used, the participant’s location throughout the day, and the number of photographs taken. Using an auction system, participants were given the opportunity to sell either the raw data or the data after it had been processed in some way to add value. Of all the information collected during the experiment, personal location data emerged as the most highly valued, and, perhaps not surprisingly, those who travelled more each day generally placed a higher value on their location data than those who didn’t.
The valuable insights into personal behaviour and preferences provided by such information are what compel marketers to find ever more pervasive ways to tap into that resource. Mobile location-aware applications and services are now commonplace, and for many of them recording location data is the default setting; users have to proactively opt out to avoid being tracked. During the experiment the participants were also asked whom they trusted most to manage their personal location data; the responses indicated concerns about the trustworthiness of financial institutions, telecom companies, and insurance companies when it came to collecting and using this information.
The research suggests the emergence of ‘…a decentralised and user-centric architecture for personal data management’, one that gives users more control over what data are collected, how they are stored, and who has access to them. The study also reports that several research groups are already starting to design and build such personal data repositories, and it is increasingly likely that some type of market for personal location information will soon emerge.