
Reflections on recent GeoEthics webinar discussions

The GeoEthics webinar series from the American Association of Geographers and the University of California Santa Barbara, with support from Esri, shares many themes with this Spatial Reserves book and blog. These include surveillance (including location privacy) and governance (including regulation, data ownership, open data, and open software). I encourage you to watch the archived webinars, including ethical spatial analytics (February 2021) and responsible use of spatial data (May 2021), and to keep tabs on the page so you can watch an upcoming webinar live. The webinars are free and open to anyone; AAG membership is not required, but you need to register ahead of time to watch them live.

In the webinar focusing on ethical spatial analysis, Dr Rogerson discussed examples in which spatial dependence can confound the results of statistical testing; ignoring it raises significant ethical issues for public policy. Dr Vadjunec touched on an issue that we raised in this Spatial Reserves essay: potential harm from location-tagged data and crowdsourcing. She also touched on privacy, the quality of data provided by volunteers and citizen scientists, issues raised by ethnographic research on very small numbers of human subjects, and the broader question of whether representations of the world as big data can be trusted when compared with real conditions on the ground. Dr Alvarez and Dr Bennett discussed maps as social constructs, how remotely sensed images are processed, and other pertinent topics. Dr Sieber, in her discussion of artificial intelligence, covered doorbell cameras, facial recognition, and other topics that will only become more important as time passes, and on which communities, including law enforcement, will have to make important decisions about how, when, and why to use these tools.

Dr Goodchild, whom I, and I suspect many of us, have admired and followed for a long time, made comments that made me realize that while we have made great strides in documenting data, we still have a journey ahead of us if we truly want another person or organization to be able to use the data to address a problem with the same workflows and inputs we used, whether for their own area of the world, for the same problem with different variables, or at a different scale. This is reproducibility, and documenting data is only one part of enabling it. One key way this could move forward more rapidly is for software companies, including my own, Esri, to document even more fully the methods and models behind each analytical tool in their toolboxes. This could someday go as far as warning the user, while an analysis tool is running, about factors such as the presence of spatial dependence that may affect the results. The bottom line is that every stage at which spatial data is processed should be documented and replicable, and efforts need to be made to estimate the uncertainties that are introduced. Accomplishing this rigorously is a noble but difficult goal.
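
To make the idea of such a warning concrete, here is a minimal sketch, assuming the open-source geopandas, libpysal, and esda Python packages, of how an analyst (or a tool) could test an attribute for spatial dependence before running an aspatial statistical test. The file name, column name, and significance threshold are hypothetical placeholders, not a recommendation from any particular vendor.

```python
# Minimal sketch: flag significant spatial autocorrelation (spatial
# dependence) in an attribute before further statistical analysis.
# The file name and column name below are hypothetical placeholders.
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran

gdf = gpd.read_file("counties.geojson")   # hypothetical polygon layer
y = gdf["rate"].values                    # hypothetical attribute to test

# Build queen-contiguity spatial weights and row-standardize them.
w = Queen.from_dataframe(gdf)
w.transform = "r"

# Global Moran's I with a permutation-based pseudo p-value.
mi = Moran(y, w, permutations=999)
print(f"Moran's I = {mi.I:.3f}, pseudo p-value = {mi.p_sim:.3f}")

if mi.p_sim < 0.05:                       # illustrative threshold
    print("Warning: significant spatial dependence detected; "
          "standard (aspatial) tests may be confounded.")
```

A GIS package could run a check like this automatically and surface the warning in the tool's messages, rather than leaving it entirely to the analyst.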

But part of the responsibility will always rest with the data users. Some GIS software, such as ArcGIS Pro, records the history of the geoprocessing performed as part of a project (such as an .aprx file). How can we encourage data users to include this history when they share their results with others? And how can we encourage software developers to improve tools so that data sources and methods are easily discernible and transparent?
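
For workflows that run outside such environments, one possibility is to keep a simple, machine-readable provenance log that travels with the results. The sketch below is only an illustration of that idea; the function name, field names, and file layout are my own invention and are not part of ArcGIS Pro or any other package.

```python
# Illustrative sketch (not tied to any specific GIS package): append a
# provenance record for each processing step to a JSON sidecar file that
# can be shared alongside the output data. All names are hypothetical.
import datetime
import json
import pathlib

def log_step(history_path, tool_name, inputs, parameters, outputs):
    """Append one processing step to a JSON provenance log."""
    path = pathlib.Path(history_path)
    history = json.loads(path.read_text()) if path.exists() else []
    history.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "tool": tool_name,
        "inputs": inputs,
        "parameters": parameters,
        "outputs": outputs,
    })
    path.write_text(json.dumps(history, indent=2))

# Hypothetical usage for a buffer operation:
log_step(
    "results_history.json",
    tool_name="Buffer",
    inputs=["wells.shp"],
    parameters={"distance": "500 Meters"},
    outputs=["wells_buffer_500m.shp"],
)
```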

–Joseph Kerski

Categories: Public Domain Data
  1. Peg Gronemeyer
    November 15, 2021 at 2:49 pm

    I follow your blog closely and greatly appreciate your attention to data quality, accuracy, reproducibility, and related issues, as someone slightly obsessed with detailed data documentation and with the potential for inadvertent misuse of tools because of assumption violations, or because of a wrong projection that “moves” the data just enough to affect results (but not enough for anyone to notice unless they are watching for it).

    Your suggestion that the software could warn the user about the presence of spatial dependence or some other factor while an analysis tool is running would be amazing! I am reminded of a time when I was trying to use kriging and was unsure whether I was using it correctly: was I using the correct values and settings?
    With today’s global use of GIS and maps, including Google Maps and Google Earth, I also think about the unintentional errors that are becoming deeply embedded in data sets, especially, as you said, from ‘community science,’ which is becoming more and more common. Would there be a way to create an error circle around each point to show the larger area where a data point could really be located? How would we do that when we receive data from public volunteers? Should we put a minimum error on any community science data? Our lab relies heavily on community science, but I have not found a reasonable process to consistently assess the accuracy and usability of such data.

    Thank you for your comments and articles.

  2. josephkerski
    November 15, 2021 at 3:46 pm

    Thanks Peg for your thoughtful comments! Please spread the word about this blog to your colleagues. –Joseph Kerski

  3. valentinmik5st
    November 21, 2021 at 2:07 pm

    Should you tell you have deceived.

    • josephkerski
      December 7, 2021 at 6:08 pm

      Interesting… please expand!

