Search Results

Keyword: ‘standards’

Harmonising UAS Regulations and Standards: Article Review

October 23, 2016

A recent article in GIM International about harmonising regulations and standards for UAS (Unmanned Aerial Systems, also known as UAVs, Unmanned Aerial Vehicles, or "drones") is definitely worth reading, providing an excellent summary of this rapidly evolving sector of the geospatial industry.  The article, beginning on page 6, appears in a special issue of GIM International dedicated exclusively to UAS, available here.  Peter van Blyenburgh summarizes developments in regulations and standardization in Europe, the USA, Japan, and China, and then offers some down-to-earth advice for companies that see only the potential for profits but may miss the bigger picture of liability, regulations, and safety.  The issue also includes articles on integrating UAS and multibeam echosounder data, multispectral and thermal sensors on UAVs, and UAS applications in agriculture, while the article "Airborne laser scanning" provides an excellent introduction to the two main platforms:  fixed-wing and rotorcraft.

If I am reading the "tea leaves" correctly, by this time next year just about every GIS program offered at a technical college or university will include at least one course in UAS technology and data.  I would also expect a whole host of MOOCs and other online courses to appear from universities, companies, and GIS organizations to help people use these new tools and technologies effectively.  I attended, for example, a multi-hour course on this topic at the recent Geo'Ed community college GIS conference.  It reinforced my opinion that while online courses and programs will be helpful, the face-to-face component of actually working with the software and hardware is particularly valuable with UAS:  there is no perfect substitute for rolling up one's sleeves and working with these devices.

As publishing director Durk Haarsma states in his editorial for this special issue, UAS are disruptive technologies because they are influencing so many geospatial fields and subfields: cadastral surveying, cultural heritage, and precision agriculture, to name a few.  Because UAS influence how people in an increasing number of professions map and model the world, interpreting the data from those UAS is central to our book and this blog: understanding your data, and how they are obtained, is more critical than ever.


Launching a fixed wing UAV at the Geo’Ed conference, Louisville Technical College, Kentucky. Photograph by Joseph Kerski.  Video here and analyzing thermal imagery here.


A review of the North Dakota State GIS Portal

October 9, 2017

I recently had the honor of co-keynoting the North Dakota GIS conference.  While preparing for the conference, I re-acquainted myself with the North Dakota State GIS portal.  The timing was perfect: a team of dedicated and expert collaborators from many organizations had just completed work on a new portal that replaces their old Hub Explorer resource.  The new portal, accessible here, includes information on how to connect with the state's GIS community through events and networking.  More germane to the topic of this blog, though, it also links to the data sets themselves via the Hub Data Portal.  The portal is thoughtfully laid out, with the ability to browse data by content type and topic.  The North Dakota GIS Hub Data Portal uses DKAN, the Drupal-based counterpart to CKAN, the world's leading open-source open data publishing platform.  DKAN provides a complete open-source solution for data publishers and adheres to CKAN's API, data, and functionality standards; the goal of the project is to combine the utility of CKAN with the ease of maintenance and extensibility of Drupal.
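Because the Hub adheres to CKAN's API standards, the catalog can in principle be searched programmatically as well.  Here is a minimal Python sketch of a search against a CKAN-style Action API; the base URL is a placeholder, and the exact endpoint support on any given DKAN site is an assumption you should verify against that portal's documentation.

```python
import requests

# Placeholder base URL: substitute the actual portal address.
BASE = "https://example-dkan-portal.gov"

def search_datasets(query, rows=10):
    """Search a CKAN/DKAN catalog via the CKAN Action API (package_search)."""
    resp = requests.get(
        f"{BASE}/api/3/action/package_search",
        params={"q": query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    body = resp.json()
    # CKAN wraps results as {"success": ..., "result": {"count": ..., "results": [...]}}
    return [(ds["title"], ds.get("notes", "")) for ds in body["result"]["results"]]

for title, _notes in search_datasets("contours"):
    print(title)
```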

The portal is designed with the data user in mind:  it avoids bandwidth-consuming, unnecessary graphics and maps, and lets users quickly get to what they need.  The site also provides many options for the data user: the raw data to download, CSV, HTML, and XML formats, and even REST endpoints that allow the data to be consumed in web GIS platforms such as ArcGIS Online.  See the example for wildlife management areas here.  And the data sets can be very detailed, too, such as the recent addition of one-foot contours for Bismarck and Mandan.
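To give a sense of what those REST endpoints enable, here is a hedged sketch of querying an ArcGIS feature layer for GeoJSON.  The layer URL is hypothetical (the real endpoints are listed on the portal), but the query parameters are standard ArcGIS REST API options.

```python
import requests

# Hypothetical feature-layer URL; the portal lists the real endpoints.
LAYER = "https://services.example.com/arcgis/rest/services/WMA/FeatureServer/0"

params = {
    "where": "1=1",          # no attribute filter: return everything
    "outFields": "*",        # all attribute fields
    "f": "geojson",          # GeoJSON output, consumable by most web GIS tools
    "resultRecordCount": 5,  # just a small sample
}
resp = requests.get(f"{LAYER}/query", params=params, timeout=30)
resp.raise_for_status()
for feature in resp.json()["features"]:
    print(feature["properties"])
```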

This portal is unique in that it includes stories about interesting projects involving people and the land in the state, with links to infographics, maps, and data.  These stories, in my opinion, provide good "elevator speeches" about the benefits derived from the use of GIS, and they also provide good case studies to give students and others ideas for research projects.  The Groups tab offers useful links to "who's who" in the state.

The site also includes a "Visual ND" site with a rich set of applications, maps, data, documents, and web sites.  The historical aerial photographs of North Dakota are also being scanned, and are available here in TIFF format.  It is my hope that these photos will eventually have REST endpoints that will allow them to be displayed directly in ArcGIS Online and other web mapping applications, like the resource we reviewed in Iowa, here.

We have reviewed many data portals in our book and on this blog, some good, some not so useful.  The North Dakota GIS Hub Data Portal is one of the most useful I have ever seen.


The front page of the North Dakota GIS Hub Data Portal.

Categories: Public Domain Data

Era of Big Data is Here: But Caution Is Needed

September 25, 2017

As this blog and our book are focused on geospatial data, it makes sense that we discuss trends in data: the laws, standards, attitudes, and tools that are gradually helping more users find the data they need more quickly.  But with all of these advancements, we continue to implore decision makers to think carefully about, and investigate, the data sources they are using.  This becomes especially critical, and at times difficult, when the data fall into the "big data" category.  The difficulty arises because big data is often seen as so complex that it is cited and used without question.

Equally challenging, and at times troublesome, is when the algorithms based on that data go unchallenged, and when access to those algorithms is blocked to those who seek to understand who created them and what data and formulas they are based on.  As these data and algorithms increasingly affect our everyday lives, this becomes a major concern, as data scientist Cathy O'Neil explains in her TED talk: "the era of blind faith in big data must end."

In addition, the ability to gain information from mapping social media is amazing and has the potential to help many sectors of society.  This was clearly evident in the usefulness of the social media posts that emergency managers in Texas and Florida, USA, mapped during the August-September 2017 hurricanes there.  However, with mapping social media comes an equal if not greater need for caution, as this article points out in discussing the limitations of such data for understanding health and mitigating the flu.  And from a marketing standpoint, Paul Goad cautioned here against relying on data alone.

It is easy to overlook an important point in all this discussion of data, big data, and data science: we tend to refer to these phenomena in abstract terms, but the data largely represent us: our lives, our habits, our shopping preferences, our choice of route to work, the companies and organisations we work for, and so on.  Perhaps what we need is less data and data science, and more humanity and humanity science.  As former Google CEO Eric Schmidt has said, "We must remember that technology remains a tool of humanity."  How can we, and the corporate giants, then use these big data archives as a tool to serve humanity?

Understanding your data

Use caution in making decisions from data–even if you’re using “Big Data” and algorithms derived from it.    Photograph by Joseph Kerski. 

Categories: Public Domain Data

Data Quality on Live Web Maps

June 19, 2017

Modern web maps, and the cloud-based GIS tools and services upon which they are built, continue to improve in richness of content and in data quality.  But as we have noted many times in this blog and in our book, maps are representations of reality.  They are extremely useful representations, to be sure, particularly in the cloud, but representations nonetheless.  These representations depend on the data sources, accuracy standards, map projections, completeness, processing and rendering procedures used, regulations and policies in place, and much more.  A case in point is the offset between street data and satellite image data that I noticed in mid-2017 in Chengdu in southwestern China.  The streets are drawn about 369 meters southeast of where they appear on the satellite image (below):

Google Maps: streets in Chengdu, China, offset southeast from the satellite image.

Puzzled, I panned the map to other locations in China.  The offsets varied, but they appeared everywhere in the country; for example, note the offset of 557 meters where a highway crosses the river at Dongyang, again to the southeast:

Google Maps: the highway crossing the river at Dongyang, China, offset to the southeast.

As of this writing, the offset appears in the same cardinal direction, and only in China; indeed, after examining border towns with North Korea, Vietnam, and other countries, the offset appears to stop at those borders.  No offsets exist in Hong Kong or in Macao.  Yahoo Maps and Bing Maps both show the same types of offsets in China (Bing Maps example, below):

Bing Maps: the same type of offset between streets and imagery in China.

MapQuest, which uses an OpenStreetMap base, showed no offset.  I then tested ArcGIS Online with a satellite image base and the OpenStreetMap base, and there was no offset there, either (below).  This offset is a datum issue related to national security that is documented in this Wikipedia article: street data in China must be encoded in the GCJ-02 coordinate system, which applies a deliberate, spatially varying distortion to WGS-84 coordinates.  The same data restriction issues that we discuss in our book and in our blog touch on other aspects of geospatial data as well, such as fines for unauthorized surveys, the lack of geotagging information on many cameras when the GPS chip detects a location within China, and the seeming unlawfulness of crowdsourced mapping efforts such as OpenStreetMap.
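For readers curious about the mechanics, the GCJ-02 obfuscation has been reverse-engineered and circulates widely in open-source projects.  The following Python sketch is the community-reconstructed approximation of the WGS-84 to GCJ-02 shift, not an official algorithm, so treat its output as indicative only.

```python
import math

A = 6378245.0                 # Krasovsky 1940 semi-major axis (meters)
EE = 0.00669342162296594323   # eccentricity squared

def _dlat(x, y):
    r = -100.0 + 2.0*x + 3.0*y + 0.2*y*y + 0.1*x*y + 0.2*math.sqrt(abs(x))
    r += (20.0*math.sin(6.0*x*math.pi) + 20.0*math.sin(2.0*x*math.pi)) * 2.0/3.0
    r += (20.0*math.sin(y*math.pi) + 40.0*math.sin(y/3.0*math.pi)) * 2.0/3.0
    r += (160.0*math.sin(y/12.0*math.pi) + 320.0*math.sin(y*math.pi/30.0)) * 2.0/3.0
    return r

def _dlon(x, y):
    r = 300.0 + x + 2.0*y + 0.1*x*x + 0.1*x*y + 0.1*math.sqrt(abs(x))
    r += (20.0*math.sin(6.0*x*math.pi) + 20.0*math.sin(2.0*x*math.pi)) * 2.0/3.0
    r += (20.0*math.sin(x*math.pi) + 40.0*math.sin(x/3.0*math.pi)) * 2.0/3.0
    r += (150.0*math.sin(x/12.0*math.pi) + 300.0*math.sin(x/30.0*math.pi)) * 2.0/3.0
    return r

def wgs84_to_gcj02(lat, lon):
    """Apply the reconstructed GCJ-02 ('Mars coordinates') shift to a WGS-84 point."""
    dlat = _dlat(lon - 105.0, lat - 35.0)
    dlon = _dlon(lon - 105.0, lat - 35.0)
    radlat = lat / 180.0 * math.pi
    magic = 1 - EE * math.sin(radlat) ** 2
    sqrtmagic = math.sqrt(magic)
    dlat = (dlat * 180.0) / ((A * (1 - EE)) / (magic * sqrtmagic) * math.pi)
    dlon = (dlon * 180.0) / (A / sqrtmagic * math.cos(radlat) * math.pi)
    return lat + dlat, lon + dlon

# Example: the shift near Chengdu (about 30.66 N, 104.06 E) comes out to a few
# hundred meters, consistent in scale with the offsets observed above.
print(wgs84_to_gcj02(30.66, 104.06))
```

Street data in China are delivered in GCJ-02, while the imagery tiles on some platforms remain in WGS-84, which is why the two layers disagree.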

Furthermore, as we have noted, the satellite images are themselves processed, tiled data sets, and like other data sets they need to be critically scrutinized.  They should not be considered "reality" despite their appearance of being the "actual" Earth's surface.  They too contain error: they may have been taken on different dates or in different seasons, they may be reprojected on a different datum, and other data quality aspects need to be considered.

ArcGIS Online with an OpenStreetMap base: no offset in China.

Another difference between these maps is the wide variation in the amount of detail in the streets data for China.  OpenStreetMap was the most complete; the other web mapping platforms offered varying levels of detail, and some, surprisingly even in 2017, lacked almost every type of street except major freeways.  The streets content was much more complete in other countries.

It all comes back to identifying your end goals in using any sort of GIS or mapping package.  Being critical of the data can and should be part of your decision-making process and your choice of tools and maps.  By the time you read this, the image offset problem could have been resolved.  Great!  But are there now new issues of concern?  Data sources, methods, and quality vary considerably among different countries.  Furthermore, the tools and data change frequently, along with the processing methods; being critical of the data is not something to practice just once but is fundamental to everyday work with GIS.

A Review of the Gap Analysis Program’s Protected Areas Data Portal

March 19, 2017

Today’s guest blog essay comes from Linda Zellmer, Government Information & Data Services Librarian, Western Illinois University.  Linda can be contacted at LR-Zellmer @ wiu.edu.

Several years ago, I worked with a class in our Recreation, Parks and Tourism Administration department. The students in the class were getting their first exposure to GIS, and used it to analyze the populations served by a park in order to develop a plan for managing and expanding its services. At the time, students had to obtain geospatial data on park locations and boundaries from local or state government agencies, or download Federal lands data from the National Atlas of the United States. They then combined the park boundary data with data from the Census Bureau to learn about the population characteristics of the people in the area. Finally, they visited the park of interest to gather information on park usage and amenities. A new data set, the Protected Areas Database of the United States (PAD-US), will make this kind of class and related research much easier, because it provides data on all types of protected areas for the entire United States, by U.S. region, by landscape region, or by state or territory.  PAD-US data is available for download, for viewing, and as a web map service from the PAD-US website.

The PAD-US data was developed as part of the Gap Analysis Program of the U.S. Geological Survey. The Gap Analysis Program collects data on land cover, species distribution, and stewardship to determine whether a given species' habitat is protected, so that plans for further protection (if needed) can be developed. According to the PAD-US Standards and Methods Manual for Data Stewards, the data set contains geospatial data on "marine and terrestrial protected areas" that are "dedicated to the preservation of biological diversity and to other natural, recreation and cultural uses." The data set shows the extent and location of Federal, State, local, and private lands set aside for recreation and conservation. It also provides the owner name and type, whether the site is publicly accessible, and whether the site is being managed for conservation.
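As an illustration of how a class like the one described above might query these attributes, here is a minimal sketch using the open-source geopandas library.  The file name is hypothetical, and the field names (Pub_Access, GAP_Sts, Unit_Nm, Own_Type) are taken from the PAD-US standard's documented schema; verify them against the copy you download.

```python
import geopandas as gpd

# Hypothetical file name: a state or regional extract downloaded from PAD-US.
padus = gpd.read_file("PADUS_state_extract.shp")

# Open-access areas ("OA") managed for biodiversity (GAP status 1 or 2).
protected_open = padus[
    (padus["Pub_Access"] == "OA") & (padus["GAP_Sts"].isin(["1", "2"]))
]
print(protected_open[["Unit_Nm", "Own_Type"]].head())
```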

 


The Gap Analysis Program’s Protected Areas of the US Data Portal.

Connections between Geospatial Data and Becoming a Data Professional

September 25, 2016

Dr. Dawn Wright, Chief Scientist at Esri, recently shared a presentation she gave on the topic of “A Geospatial Industry Perspective on Becoming a Data Professional.”

How can GIS and Big Data be conceptualized and applied to solve problems?  How can the way we define and train data professionals move the integration of Big Data and GIS simultaneously forward?  How can GIS as a system and GIS as a science be brought together to meet the challenges we face as a global community?   What is the difference between a classic GIS researcher and a modern GIS researcher?   How and why must GIS become part of open science?

These issues and more are examined in the slides and in the thought-provoking text beneath each slide.  Geographic Information Science has long welcomed strong collaborations among computer scientists, information scientists, Earth scientists, and others to solve complex scientific questions, and in this way it parallels the emergence and acceptance of "data science."

But the researchers and developers in "data science" need to be encouraged and recruited from somewhere, and once they have arrived, they need a lifelong learning pathway.  Therefore, germane to any discussion of emerging fields such as data science is how students are educated, trained, and recruited: here, as data professionals within the geospatial industry.  Such discussion needs to include certification, problem solving, critical thinking, and subscribing to codes of ethics.

I submit that the integration of GIS and open science will not only be enriched by immersion in the issues that we raise in this blog and in our book, but actually depends in large part on researchers and developers who understand such issues and can put them into practice.  What issues?  Understanding geospatial data and knowing how to apply it to real-world problems; scale; data quality; crowdsourcing; data standards and portals; and the others we frequently raise here.  Nurturing these skills and abilities in geospatial professionals is a key way of helping GIS become a key part of data science, and of moving GIS from being a "niche" technology or perspective to one that all data scientists use and share.


This presentation by Dr. Dawn Wright touches on the themes of data and this blog from a professional development perspective.

 

Crowdsourcing Story Maps and Privacy

As we have pointed out in this blog, we have had the capability to create story maps (multimedia-rich, live web maps) for a few years now, and we have also been able to collect data via crowdsourcing and citizen science using a variety of methods.  But now the capability exists to do both at the same time; one way is with the new Crowdsource story map app from Esri.

The Crowdsource story map app joins the other story map apps listed here.  To get familiar with the new app, read this explanation.  You might also explore a new crowdsourced story map that, after you select "+ Participate", prompts you for your location, a photograph, and a sentence or two about attending, in this case, the Esri User Conference.  If you did not attend, examining the application will still give you a good sense of what this new app can do.

It's not just this story map that has me interested; it is that this long-awaited capability is now at our fingertips.  With this same app, you can create crowdsourced story maps to gather data on such things as tree cover, historic buildings, noisy places, litter, unusual architecture, or something else, on your campus or in your community.  The app is in beta, but feel free to give it a try.

We have also discussed location privacy concerns both here and in our book.  The Story Map Crowdsource app differs from the other Story Maps apps in that it enables people to post pictures and information onto your map without logging in to your ArcGIS Online organization.  Thus, the author does not have complete control over what content appears in a Crowdsource story.  Furthermore, a contributor's current location, such as their street address or locations they have visited, can be exposed in a Crowdsource app, appearing with their post as a point location and as text.  This may be fine if your map is collecting contributions about water quality, invasive plant species, or interesting places to visit in a city, where those locations are public places.  But it may not be desirable for other subject matter or scenarios, especially if people may be posting from their own residences.

Thus, it is up to you as the author of a Story Map Crowdsource app to ensure that your application complies with the privacy and data collection policies and standards of your organization, your community, and your intended audience.  You might set up a limited pilot or internal test of any Story Map Crowdsource project before deploying and promoting it publicly, to review whether it meets those requirements.  And as a user of these maps, be aware that you are potentially exposing the location of your residence or workplace, and make adjustments (generalizing your location to somewhere else in your city, for example, as in the sketch below) if exposing those locations is of concern to you.
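One simple way to generalize a location before contributing it is to snap the coordinates to a coarse grid, so the posted point identifies a neighborhood rather than a house.  A minimal Python sketch, with an illustrative cell size of roughly one kilometer:

```python
import math

def generalize(lat, lon, cell_deg=0.01):
    """Snap a point to the center of a grid cell about 1 km across
    (at mid-latitudes), so an exact address cannot be read off the map."""
    glat = (math.floor(lat / cell_deg) + 0.5) * cell_deg
    glon = (math.floor(lon / cell_deg) + 0.5) * cell_deg
    return round(glat, 6), round(glon, 6)

# A contributor's precise location becomes a neighborhood-level point.
print(generalize(46.81352, -100.78391))
```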

The new Crowdsource story map app is thus an excellent example of both the promise of citizen science and the challenges of location privacy.


Example of the new crowdsourcing story map app.

Three Lessons for Improving Data Access

March 21, 2016

 

This week’s guest post is courtesy of Brian Goldin, CEO of Voyager Search.

The Needle in the Haystack

Every subculture of the GIS industry is preaching the gospel of open data initiatives. Open data promises operational efficiencies and new innovation. In fact, the depth and breadth of available geo-based content rivals snowflakes in a blizzard. There are open data portals and FTP sites delivering content from the public sector. There are proprietary solutions with fancy mapping and charting applications from the private sector. There are open source and crowdsourced offerings that grow daily in volume of data and effectiveness of their solutions. There are standards for metadata. There are laws requiring that it all be made available. Even security stalwarts in the US and global intelligence communities are making the transition. It should be easier than ever to lay your hands on the content you need. Instead, we struggle to find the needle in a zillion proverbial haystacks.

Ironically, GIS users and data consumers must become explorers and researchers to find what they need. We remain fractured about how to reach the nirvana where the data is not only open, but also accurate, well documented, and available in any form. We can do better, and perhaps we can learn some lessons from the consumer applications that changed the way we find songs, buy books, or discover any piece of information on the web.

Lesson one: Spotify for data.

In 1999, Napster landed a punch that knocked the wind out of the mighty music publishing industry. When the dust settled, the music industry prevailed, but it did so in a weakened state, with its market fundamentally changed. Consumers' appetite for listening to whatever they wanted for free made going back to business as usual impossible. Spotify ultimately translated that demand into an all-you-can-eat music model. The result: in 2014, The New Yorker reported that Spotify's user base was more than 50 million worldwide, with 12.5 million subscribers; by June 2015, it was reportedly 20 million subscribers. Instead of gutting the music publishers, Spotify helped them rebound.

Commercial geospatial and satellite data providers should take heed. Content may well be king, but expensive, complicated pricing models are targets for disruption. It is not sustainable to charge a handful of customers exorbitant fees for content, or to park vast libraries of historical data on the sidelines, while smaller players like Skybox gather more than 1 terabyte of data a day and open source projects gather road maps of the world. Ultimately, we need a business model that gives users an all-you-can-eat price that is reasonable, rather than a complex model based on how much the publisher thinks each customer can pay.

Lesson two: Google for GIS.

We have many options for finding data, which means we have a zillion stovepipes to search. What we need is unification across those stovepipes, so that we can compare and contrast their resources and find the best content available.

This does not mean we need one solution for storing the data and content. It means we need one place for searching and finding all of the content, no matter where it exists, what it is, what software created it, or how it is stored. Google does not house every bit of data in a proprietary solution, nor does it insist on a specific standard of complex metadata for a page to be found. If it did, Internet search would resemble the balkanised GIS search experience we have today. As it stands, when I want GIS content, I have to look through many different potential sources to discover which might be the right one.

What is required is the ability to crawl all of the data, content, and services, and return a search page that shows the content in a readable, well-formatted form, with a normalised presentation of metadata that includes the location, the author, a brief description, and perhaps the date it was created, no matter where the content resides. We need to enable people to compare content with a quick scan and then dig deeper into whatever repository houses it. We need to use their search results to inform the next round of relevancy, and even to anticipate the answers to their questions. We need to enable sharing, commenting, and rating on those pages to show how users feel about that content. This path is well worn in the consumer space, but in the GIS industry these developments lag years behind as limited initiatives sputter and burn out.
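To make the idea concrete, here is a sketch, in Python, of what such a normalised search record might look like, with a mapper for one source type. The class and field choices are illustrative assumptions, not any existing product's schema.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class CatalogRecord:
    """One normalised search result, whatever system the content lives in."""
    title: str
    author: Optional[str]
    description: Optional[str]
    created: Optional[str]                              # ISO date string, if known
    bbox: Optional[Tuple[float, float, float, float]]   # min lon/lat, max lon/lat
    source_url: str                                     # deep link into the owning repository

def from_ckan(pkg: dict, base_url: str) -> CatalogRecord:
    """Map a CKAN package dict onto the common record; other mappers
    (OGC CSW, ArcGIS portal items, plain FTP listings) would sit alongside."""
    return CatalogRecord(
        title=pkg.get("title") or pkg["name"],
        author=pkg.get("author"),
        description=pkg.get("notes"),
        created=pkg.get("metadata_created"),
        bbox=None,  # CKAN spatial extras vary too much by portal to assume
        source_url=f"{base_url}/dataset/{pkg['name']}",
    )
```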

Lesson three: Amazon for geospatial.

I can find anything I want to buy on Amazon, but it doesn't all come from an Amazon warehouse, nor does Amazon manufacture it. The content doesn't all need to be in one place, one solution, or one format, so long as it is discoverable in, and deliverable from, one place. Magically, anything I buy can be delivered through a handy one-click mechanism. Sure, sometimes delivery costs money and sometimes it's free, but consumers aren't forced to learn a new checkout system every time they buy from a new vendor, and they don't have to call a help desk for assistance with delivery.

Today, getting your hands on content frequently requires a visit to an overburdened government GIS staffer who will deliver the content to you. Since you might not be able to see exactly what they have, you almost always ask for more than you need, and you have no way of knowing when or how that data was updated. What should be as easy as clip-zip-and-ship delivery (the equivalent of gift-wrapping a package on Amazon) seems a distant dream. But why is this?
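Technically, the clip step itself is trivial with open-source tools, which only sharpens the question. A hedged sketch using geopandas, with hypothetical file names and area of interest:

```python
import geopandas as gpd
from shapely.geometry import box

# Hypothetical statewide layer and a user's area of interest.
roads = gpd.read_file("statewide_roads.shp")
aoi = box(-101.0, 46.7, -100.6, 46.95)   # min lon, min lat, max lon, max lat

clipped = gpd.clip(roads, aoi)           # the "clip" of clip-zip-and-ship
clipped.to_file("roads_aoi.shp")         # zip this output and ship it
```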

While agency leadership extols the virtues of open government initiatives, if the content is essentially inaccessible, the risk of being punished for causing frustration is minimal compared with the risk of exposing bad data or classified tidbits. So why bother, when your agency's first mandate is to accomplish some other goal entirely and your budget is limited? Government's heart is certainly behind this initiative, but that is easily outweighed by legitimate short-term risks and real-world constraints on human and financial resources.

The work of making public content as discoverable in an open data site as it is in Amazon's seemingly limitless store can and should be done by industry, with the support of government, so that everyone may benefit. In the private sector, we will find a business model to support this important work. But here's the catch: this task will never be perceived as truly open if it is done by a company that builds GIS software. The dream of making all GIS content discoverable and open requires that everyone's products be equally discoverable. That is a huge marketing challenge all by itself. Consider that Amazon's vision of being the world's largest store does not include making all of the stuff sold there. There really is a place for a company to play a neutral role between the vendors, the creators of the content, and the public that needs it.

On the horizon

We have come so far in making content open and available, yet the data are out there in a fractured world. What's needed now isn't another proprietary system or another set of standards from an open source committee. What's really needed is a network of networks that makes a single search across all of this content, data, and services possible, whether it's free or for a fee. We should stop concerning ourselves with standards for this or that, and let the market drive us toward the inevitable best practices that help our content be found. I have no doubt that the brilliant and creative minds in this space will conquer this challenge.

Brian Goldin, CEO of Voyager Search.

Advancing Geographic Information Science: Report

March 13, 2016

A new report entitled Advancing Geographic Information Science:  The Past and Next Twenty Years has been published by GSDI Association Press, edited by Harlan Onsrud and Werner Kuhn.  The e-book's 30 chapters (you may also order a paperback here) include many themes that we focus on in this blog and in our book, The GIS Guide to Public Domain Data.  Because of these themes, and because the chapter authors include many recognized scholars in GIScience, we believe the book merits close attention.  Particularly germane to our focus on data sources, quality, crowdsourcing, privacy, and standards is Part One of the book:  GIScience Contribution, Influences, and Challenges.  Onsrud, a long-time and well-respected GIScience and Engineering professor at the University of Maine, and Kuhn, a noted professor of GIS at UC Santa Barbara, have carefully edited the book's content and pulled together some forward-thinking pieces.

Part One of the book includes "Contributions of GIScience over the Past Twenty Years" by Egenhofer, Clarke, Gao, Quesnot, Franklin, Yuan, and Coleman; "Technological and Societal Influences on GIScience" by Winter, Lopez, Harvey, Hennig, Jeong, Trainor, and Timpf; "Emerging Technological Trends Likely to Affect GIScience in the Next Twenty Years" by Nittel, Bodum, Clarke, Gould, Raposo, Sharma, and Vasardani; and "Emerging Societal Challenges Likely to Affect GIScience in the Next Twenty Years" by Ramasubramanian, Couclelis, and Midtbø.  The technological and societal influences on GIScience described here include databases, free and open source software, spatial data infrastructure, GPS, sensor data collection, the Web, web mapping services, mobile computing, social media and crowdsourced data, and linked data.  Privacy needs are among those described in the "likely to affect GIScience in the future" technical chapter.  I found the reflections on older populations, bioengineering, natural disasters, safer mobility, and sensors in the "coming societal trends" chapter to be thought provoking.

If you care about data and the other issues surrounding GIS in society, including where the field has been and where it is headed, this book will be worth your time.

 

The Changing Geospatial Landscape: Report

February 14, 2016

The US National Geospatial Advisory Committee recently released The Changing Geospatial Landscape:  A Second Look, which follows the committee's first report, from 2009.  The committee consists of 28 experts from academia, the private sector, and all levels of government:  Federal, Tribal, State, regional, and municipal.  The committee's stated goals for the new report are "to contribute its perceptions of incipient technologies that we expect will guide, define or determine the development of this industry in the near and medium term. Of even greater importance, the report highlights those aspects of innovation that bear directly on public policy and on individual privacy and security. The NGAC has also prepared this report to help inform the development of the next iteration of the strategic plan for the National Spatial Data Infrastructure (NSDI)."

In the first section, several near- and medium-term trends are noted and briefly described, including satellite imagery (including the small platforms we have written about in this blog), advances in GNSS, UAVs, 4G mobile telephone technology, indoor positioning, platform evolution, cloud storage, crowdsourced data, and communications.  Next, social, economic, and policy issues are noted, such as the rural-urban dichotomy in the availability of internet services, workforce development in the geospatial industry, data analytics, standards, privacy and health issues, and data access.

I believe that skimming the report would be useful for anyone wanting to know the main geospatial issues of concern to this committee and to the geospatial industry in general, although I admit that, seven years after the first report, I would have hoped for clearer recommendations.  The report seems rather disorganized, but it does point to the one constant in the geospatial industry:  change.

Your reactions?


The Changing Geospatial Landscape:  Report of the National Geospatial Advisory Committee.