Author Archive

Lasers: The future of data capture and transmission?

December 12, 2016

Over the last four years we have discussed many of the challenges posed by the volume of data now available online: issues of quality, provenance, privacy, identifying the most appropriate source for particular requirements, and so on. Feeling overwhelmed by the choice of data, or not knowing what resources are available or where to start looking, have been common responses from geospatial students and practitioners alike.

A recent report from the BBC on laser technology highlighted some current and future applications that have transformed, or will transform, geospatial data capture, including the use of LiDAR and ultra-precise atom interferometers that could be used to develop alternative navigation systems that do not rely on GPS. The article also discusses the inherent limitations of our current electronics-based computing infrastructure and the potential of silicon photonics, firing lasers down optical fibres, to help meet the demand for instant or near-instant access to data in the Internet-of-Everything world. If many feel overwhelmed now by the volumes of data available, what will technologies like silicon photonics mean for data practitioners in the future? Just because data may be available at unprecedented speeds and accessed more easily doesn't guarantee the quality of the data will be any better, or negate current concerns about issues such as locational privacy. A critical understanding of these issues will be even more important if we are to make the most of these advances in digital data capture and transmission.

Public Domain Data Resources Page from Esri Press

October 3, 2016

Esri Press have now published a new resources page to complement The GIS Guide to Public Domain Data, cataloguing blog posts from Spatial Reserves that update and augment many of the themes discussed in the book.


The resources site also provides information on accessing the hands-on exercises that accompany the book. The exercises provide an opportunity for novice and experienced data users alike to work through some of the issues discussed in the book.


OpenAddresses: A global address collection

September 20, 2016

An integral part of many successful GIS projects is access to reliable address information, providing a consistent locational context for a variety of phenomena. One such address resource is OpenAddresses, a collection of authoritative address data sources. The address data are ‘open and free to use’ and made available under both attribution and share-alike (ODbL) licences. Although not yet a global resource, for those countries where address information is available the data sets may be downloaded as a .zip file containing .csv and GDAL .vrt files.
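For readers who want a quick look at a downloaded file, here is a minimal Python sketch that previews a few records from an extracted OpenAddresses .csv. The file name is hypothetical, and the column names (LON, LAT, NUMBER, STREET) follow the layout commonly used in OpenAddresses downloads, so verify them against the actual file.

```python
# Minimal sketch: preview the first few records of an extracted
# OpenAddresses CSV. The file name is hypothetical; the column names
# (LON, LAT, NUMBER, STREET) follow the common OpenAddresses layout
# and should be checked against the actual download.
import csv
from itertools import islice

with open("statewide.csv", newline="", encoding="utf-8") as f:
    reader = csv.DictReader(f)
    for row in islice(reader, 5):  # first five records only
        print(f"{row['NUMBER']} {row['STREET']} -> ({row['LAT']}, {row['LON']})")
```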

OpenAddresses – global coverage

Anyone interested in contributing an address dataset to the project can either submit a request to have the data added or submit the data themselves to the repository on GitHub.

Capturing the Great Indoors with Tango

June 27, 2016

With the recent announcement of the first Tango-enabled smartphone, Google have taken a big step towards providing a crowd-sourced, indoor mapping solution. The phone’s inbuilt sensors and cameras capture the dimensions of a location and everything inside it, including the furniture. Once captured, all that internal detail becomes a potential backdrop for a variety of augmented and virtual reality applications, including interior design and construction, shopping, education and gaming.

Although the data files collected are stored on each phone, Google hopes users will share their Tango data. Perhaps most appealing for Google, although not yet confirmed, the internal data collected and shared by Tango users could provide another platform for expanding its custom advertising and services.

As with other forms of location-based data, there are privacy implications to consider: it’s no longer just where you are or have been that’s being shared, but potentially detailed information about your home, your visits to other locations, and what you did and saw there. Just how far people will be prepared to trade this new source of location data for services remains to be seen, but given the success of Google Maps and the increasing demand for better indoor location information, Tango could help transform the indoor mapping scene.


DataPortals.org: A Global Catalogue of Public Domain Data Portals

May 17, 2016

The DataPortals.org site, hosted by Open Knowledge International in conjunction with the LOD2 project, provides a comprehensive repository of over 500 open data portals. The registered portals, published by local, regional and national governments, international organisations and a number of non-governmental organisations (NGOs), provide access to a variety of spatial data sources, including administrative boundaries, land use, economic activity and environmental indicators.


All data sets referenced by the DataPortals catalogue, including those that form part of a database collection, are published under the Open Data Commons Public Domain Dedication and Licence (PDDL). The data sets are available to download in a variety of formats, including .xls, JSON/GeoJSON and shapefile.
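As an illustration, the short Python sketch below inspects a GeoJSON download from one of the registered portals. The file name is hypothetical, but the FeatureCollection structure is standard GeoJSON.

```python
# Minimal sketch: inspect a GeoJSON dataset downloaded from a portal
# registered on DataPortals.org. The file name is hypothetical; the
# 'features' structure is standard for any GeoJSON FeatureCollection.
import json

with open("administrative_boundaries.geojson", encoding="utf-8") as f:
    collection = json.load(f)

print(f"{len(collection['features'])} features")
for feature in collection["features"][:3]:  # peek at the first three
    print(feature["geometry"]["type"], feature["properties"])
```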


LIDAR Point Cloud Published as Open Data

April 5, 2016

The UK Government’s Department for Environment, Food and Rural Affairs (Defra) recently announced the release of a LIDAR point cloud, the raw data used to generate a number of digital terrain models (DTMs) that were released last year. In addition to providing terrain models for flood modelling and coastline management, the LIDAR data have also been revealing much about long-buried Roman roads and buildings, such as the Vindolanda fort just south of Hadrian’s Wall in northern England.

Vindolanda Roman Fort. Courtesy of the Environment Agency and Defra

The point cloud data have been released as part of the #OpenDefra project, which aims to make 8,000 datasets publicly available by mid-2016. The first release of point cloud data contains over 16,000 km² of survey data and is available to download from:

http://environment.data.gov.uk/ds/survey/#/survey

The data are licensed under version 3.0 of the Open Government Licence.
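For anyone wanting to explore a downloaded tile, the following Python sketch reads a LAS file with the laspy library and summarises its extent. The file name is hypothetical, and compressed (.laz) tiles require an additional backend such as lazrs or laszip.

```python
# Minimal sketch: summarise a downloaded LIDAR tile with laspy
# (pip install laspy). The file name is hypothetical; compressed
# .laz tiles need an extra backend such as lazrs or laszip.
import laspy

las = laspy.read("lidar_tile.las")
print(f"{len(las.points)} points")
print("X range:", las.x.min(), "-", las.x.max())
print("Y range:", las.y.min(), "-", las.y.max())
print("Z range:", las.z.min(), "-", las.z.max())
```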


Three Lessons for Improving Data Access

March 21, 2016


This week’s guest post is courtesy of Brian Goldin, CEO of Voyager Search.

The Needle in the Haystack

Every subculture of the GIS industry is preaching the gospel of open data initiatives. Open data promises operational efficiencies and new innovation. In fact, the depth and breadth of geo-based content available rivals snowflakes in a blizzard. There are open data portals and FTP sites delivering content from the public sector. There are proprietary solutions with fancy mapping and charting applications from the private sector. There are open source and crowd-sourced offerings that grow daily in the volume of their data and the effectiveness of their solutions. There are standards for metadata. There are laws requiring that it all be made available. Even security stalwarts in the US and global intelligence communities are making the transition. It should be easier than ever to lay your hands on the content you need. Instead, we struggle to find the needle in a zillion proverbial haystacks.

Ironically, GIS users and data consumers must become explorers and researchers to find what they need. We remain fractured about how to reach the nirvana where the data are not only open, but also accurate, well documented, and available in any form. We can do better, and perhaps we can learn some lessons from the consumer applications that changed the way we find songs, buy books, or discover any piece of information on the web.

Lesson one: Spotify for data.

In 1999, Napster landed a punch, knocking the wind out of the mighty music publishing industry. When the dust settled, the music industry prevailed, but it did so in a weakened state with its market fundamentally changed. Consumers’ appetite for listening to whatever they wanted for free made going back to business as usual impossible. Spotify ultimately translated that demand into an all-you-can-eat music model. The result: in 2014 The New Yorker reported that Spotify’s user base was more than 50 million worldwide, with 12.5 million subscribers; by June 2015, it was reportedly 20 million subscribers. Instead of gutting the music publishers, Spotify helped them rebound.

Commercial geospatial and satellite data providers should take heed. Content may well be king, but expensive, complicated pricing models are targets for disruption. It is not sustainable to charge a handful of customers exorbitant fees for content, or to park vast libraries of historical data on the sidelines, while smaller players like Skybox gather more than a terabyte of data a day and open source projects assemble road maps of the world. Ultimately, we need a business model that offers users a reasonable all-you-can-eat price, rather than a complex model based on how much the publisher thinks each customer can pay.

Lesson two: Google for GIS.

We have many options for finding the data, which means that we have a zillion stovepipes to search. What we need is unification across those stovepipes so that we can compare and contrast their resources to find the best content available.

This does not mean we need one solution for storing the data and content. It just means we need one place for searching and finding all of the content, no matter where it exists, what it is, what software created it or how it is stored. Google does not house every bit of data in a proprietary solution, nor does it insist on a specific standard of complex metadata in order for a page to be found. If it did, Internet search would resemble the balkanised GIS search experience we have today. As it stands, when I want GIS content, I have to look through many different potential sources to discover which might be the right one.

What is required is the ability to crawl all of the data, content and services and return a search page that shows the content on a readable, well-formatted page with a normalised presentation of metadata, including the location, the author, a brief description and perhaps the date it was created, no matter where the content resides. We need to enable people to compare content with a quick scan and then dig deeper into whatever repository houses it. We need to use their search results to inform the next round of relevancy ranking and even to anticipate the answers to their questions. We need to enable sharing, commenting and rating on those pages to show how users feel about that content. This path is well worn in the consumer space, but in the GIS industry these developments lag years behind as limited initiatives sputter and burn out.
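To make the idea concrete, here is a Python sketch of the kind of normalised record such a cross-repository index might hold. The field names are illustrative assumptions, not drawn from any existing catalogue standard.

```python
# A sketch of a normalised search record for content indexed across
# many repositories. Field names are illustrative assumptions, not
# part of any existing metadata standard.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class CatalogRecord:
    title: str
    author: str
    description: str
    source_url: str  # where the content actually lives
    bbox: Optional[Tuple[float, float, float, float]] = None  # (min_lon, min_lat, max_lon, max_lat)
    created: Optional[str] = None  # ISO 8601 date, if known
    ratings: List[int] = field(default_factory=list)  # user feedback feeds relevancy

record = CatalogRecord(
    title="Administrative boundaries",
    author="Example County GIS",
    description="County administrative boundary polygons",
    source_url="https://data.example.gov/boundaries.zip",
    bbox=(-1.80, 53.70, -1.30, 54.10),
    created="2016-01-15",
)
print(record.title, record.bbox)
```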

Lesson three: Amazon for geospatial.

I can find anything I want to buy on Amazon, but it doesn’t all come from an Amazon warehouse, nor does Amazon manufacture it. All of the content doesn’t need to be in one place, one solution or one format, so long as it is discoverable in and deliverable from one place. Magically, anything I buy can be delivered through a handy one-click delivery mechanism. Sometimes delivery costs money and other times it’s free, but consumers aren’t forced to learn a new checkout system every time they buy from a new vendor, and they don’t have to call a help desk for assistance with delivery.

Today, getting your hands on content frequently requires a visit to an overburdened government GIS staffer who will deliver the content to you. Since you might not be able to see exactly what they have, you almost always ask for more than you need, and you’ll have no way of knowing when or how that data was updated. What should be as easy as clip-zip-and-ship delivery, the equivalent of gift-wrapping a package on Amazon, seems a distant dream. But why is this?

While agency leadership extols the virtues of open government initiatives, if content is essentially inaccessible the risk of being punished for causing frustration is minimal compared with the risk of exposing bad data or classified tidbits. So why bother, when your agency’s first mandate is some other goal entirely and your budget is limited? Government’s heart is certainly behind these initiatives, but it is easily outweighed by legitimate short-term risks and real-world constraints on human and financial resources.

The work of making public content discoverable in an open data site as bulletproof as Amazon’s seemingly limitless store can and should be done by industry, with the support of government, so that everyone may benefit. In the private sector, we will find a business model to support this important work. But here’s the catch: this task will never be perceived as truly open if it is done by a company that builds GIS software. The dream of making all GIS content discoverable and open requires that everyone’s products are equally discoverable. That is a huge marketing challenge all by itself. Consider that Amazon’s vision of being the world’s largest store does not include making all of the stuff sold there. There really is a place for a company to play this neutral role between the vendors, the creators of the content and the public that needs it.

On the horizon

We have come a long way in making content open and available, but the data are out there in a fractured world. What’s needed now isn’t another proprietary system or another set of standards from an open source committee. What’s really needed is a network of networks that makes a single search across all of this content, data and services possible, whether it’s free or for a fee. We should stop concerning ourselves with standards for this or that, and let the market drive us toward the inevitable best practices that help our content to be found. I have no doubt that the brilliant and creative minds in this space will conquer this challenge.

Brian Goldin, CEO of Voyager Search.