AI-generated maps: Copyright, ownership, metadata

Artificial Intelligence (AI) is already used in a variety of ways to create maps and geospatial data, including:

  1. Image Recognition: AI can analyze satellite imagery to recognize features such as buildings, roads, and bodies of water in a GIS context. I am amazed, for example, at what these AI feature extraction tools can do, and they are sitting in the ArcGIS Living Atlas of the World, ready for your use, right now! Land cover, powerlines, trees, vehicles, and much more.
  2. Natural Language Processing: AI can analyze large amounts of text data to identify locations, landmarks, and other geographic information.
  3. Predictive Analytics: AI can use data on traffic patterns, weather conditions, and other factors to make predictions about future changes in a given area, such as changes in population or road construction.
  4. Machine Learning: AI can be used to analyze large datasets of geographic information to identify patterns and make predictions about future changes.
  5. Combining citizen / community science mapping efforts with AI, as this Map with AI group is already doing in conjunction with the OpenStreetMap initiative.

Overall, AI can help create more accurate and up-to-date maps by analyzing data quickly and efficiently. However, it is important to note that human input and validation are still necessary to ensure that the maps are accurate and reflect real-world conditions.

How did I find the above information? I used an AI tool to do it: ChatGPT. I did so as a test, to use the very tools I am discussing here to actually generate some of the content of this blog. I did add two items: my comment in #1 above, and the entire entry #5. But the rest, including the paragraph that begins with "Overall…", came from ChatGPT.

True to the theme of our book and this blog, AI brings new emphasis to our "be critical of the data" discussions. AI will also have enormous impacts on data quality, both positive and potentially negative. AI is already changing the workflows and the day-to-day work of GIS professionals and non-GIS professionals alike, in many disciplines, enabling wise decisions to be made at a widening array of scales, daily. But it also has the capability of creating a great deal of data of poor or unknown quality. There are many implications. Right now, it is challenging to manage all of our GIS content: some is on our local devices, some is on our on-premises servers, some is online content that we have placed there, some is online content that our colleagues have placed there, some is generated by joining our content to online content, and many other combinations exist. Imagine some of that content being increasingly generated by trained AI techniques. Who owns that content? Where is it housed? What are the implications of combining real data with generated data? What happens someday when we start using a data set and don't realize until later that the data is not actually of a real place?

As of this writing, my tests using ChatGPT to generate a map proved fruitless. However, the field is rapidly changing. For example, Latent Diffusion is an AI-powered tool that can create images from a text prompt: a user tells Latent Diffusion what they want a picture of, and the tool generates a synthesized image based on that request. Could this image be a map? Why not? See my result below. My colleague Jill Clark received the following message when submitting a test request on the site: "Copyright discussion about AI generated art is ongoing".

And on a related note: who owns the copyright to AI-generated content? If a user requests an image of a rabbit on a motorbike, does the user own the copyright, or does whoever (or whatever) generated the output?

Maps generated from Latent Diffusion AI tool.

We have many data sources, as this blog and our book make clear. Do we, or will we, have the same for AI-generated content? As we have written many times in this blog, innovation precedes legislation. What safeguards will be in place, and when will they be in place, to tag all AI-generated content as just that: AI generated? If users know where the data came from, they can decide to use it, use it with caution, or not use it, as they see fit for their application (as we have written about before, for example, here).

There are many clever people already pushing the research frontiers rapidly forward. Given the pressing challenges in our world and in our own communities, I salute these efforts. Read more about art-meets-cartography-and-AI, here. Read about xMap in Singapore here, Batran’s data science and AI here, and AI and Xmind, here. There will no doubt be hundreds more tools and initiatives around mapping and AI. But with all of these tools, I also wanted to raise the “be critical of the data” mantra, now more than ever.

No doubt this is the first of many essays we will write in this data blog on this topic, but I am interested to hear your thoughts in the comments, below.

–Joseph Kerski

Categories: Public Domain Data

A review of the Wisconsin GeoData Portal

We have reviewed a great many data portals on this blog, some very useful, others not so much. One of the most useful is the state of Wisconsin's portal, GeoData@Wisconsin. I had the great pleasure recently of keynoting and attending the conference of the Wisconsin Land Information Association (WLIA) (my video is here), where I learned about the portal's new features and data sets. The state has long been a leader in GIS, cartography, and geography through its many fine community, technical, and tribal colleges and its public and private universities (indeed, many of my coworkers at Esri and USGS are graduates of its fine educational institutions), so it comes as no surprise that its state data portal is among the best. I believe Wisconsin is also one of the few states, perhaps the only one, with its very own state cartographer and State Cartographer's Office; I have known Dr. Howard Veregin there for many years and have great respect for him and the office's leadership.

GeoData@Wisconsin is an online geoportal that provides discovery and access to Wisconsin geospatial data, imagery, and scanned maps. It is developed and maintained by the UW-Madison Geography Department’s Robinson Map Library and State Cartographer’s Office. The geoportal combines a map-based spatial search with traditional keyword searching and faceted browsing options to locate and download geospatial data. The portal contains some wonderful digital current and historic maps and aerials. Of particular note, per the advice we frequently give in this blog about “know your data”, is the “Held By” facet within the geoportal–this is an excellent way to see where the content is housed and who created it.

Many datasets in the geoportal are housed in the UW-Madison Robinson Map Library's geospatial data archive, an absolutely fabulous place that I visited the same week as my trip to the WLIA (see it via my video here), while others are hosted by the original data producers. I also like the fact that metadata records from the curated sites that supply the GeoData portal are routinely re-indexed at regular intervals, promoting discovery of and access to this content alongside other resources in the geoportal. Searching and browsing for datasets hosted by individual counties and state agencies is available through the portal, while direct downloads are made possible from the data producers themselves. The site also includes WisconsinView, a treasure trove of remotely sensed imagery of all types, as well as the Wisconsin Coastal Atlas for the state's 15 coastal counties bordering Lake Michigan and Lake Superior.

This site is an excellent example of best practice in creating and maintaining a user-friendly data portal. The map-driven interface has all the elements that data users are seeking: many ways to browse and search, map-based ways of finding data, and options for both streaming and downloading. I highly recommend this site to data users, and if you are setting up your own data portal, I recommend drawing on some of the design elements of the Wisconsin portal for inspiration for your own site.

Part of the Wisconsin data portal–for historical aerials of Manitowoc County (one of my favorite counties in the entire USA).

–Joseph Kerski

Categories: Public Domain Data

Examining an innovative demographic explorer dashboard

As GIS continues to expand to include new audiences that consume geospatial data and make decisions from it, and as GIS tools continue to rapidly evolve, new ways of displaying and offering user interaction with that data are appearing daily. One example is a collaboration between North Carolina State University’s Center for Geospatial Analytics and the university’s Computer Science Department. This resulted in an innovative demographic explorer dashboard covering data at multiple scales for their state, and I had the pleasure of meeting its developers recently at a wonderful event called the North Carolina GIS conference. I invite you to explore the dashboard here:

To use the dashboard, first zoom or search to an area of interest, and then use the select tool in the upper left of the map pane. A show-selection tool and a deselect tool appear after you have selected some census polygons to analyze.

The developers of this tool are researchers who create experimental tools for their research partners. The application above is one they are testing to determine whether the North Carolina Department of Transportation (NCDOT), along with other state, regional, and local governments, will be able to use it for some of their needs. It is an attempt to standardize both the data used for this type of analysis and the way it is visualized. Much of the data came from the US Census Bureau's American Community Survey. The developers wrote R scripts to process the data into the format that the dashboard uses. These details, along with the tables they used for the dashboard, are listed on the "Methods" tab of the dashboard (under the map).

One innovative feature of this dashboard is the way all values for layers in the same category are displayed on the same scale, to meet the dashboard's data requirements; another is the use of layer transparency, whose additive effect gives the visualization a correct representation of the story that the data tells.
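The same-scale display described above depends on the preprocessing step. The developers wrote their scripts in R; the sketch below is a hypothetical Python equivalent, with made-up layer names, of rescaling every layer in a category to one shared range so they can share a single legend:

```python
# Hypothetical sketch: min-max normalize all layers in a category to a
# shared 0-1 range, so every layer can be drawn on the same scale.
def rescale_to_shared_range(layers):
    """Rescale all values across all layers using one common min and max."""
    all_values = [v for layer in layers.values() for v in layer]
    lo, hi = min(all_values), max(all_values)
    span = (hi - lo) or 1  # guard against division by zero for constant data
    return {name: [(v - lo) / span for v in layer]
            for name, layer in layers.items()}

# Illustrative values only (not real ACS figures):
layers = {"pct_no_vehicle": [2.0, 8.0], "pct_transit": [1.0, 5.0]}
scaled = rescale_to_shared_range(layers)
# Both layers are now expressed on the same 0-1 scale.
```

The key design point is that the minimum and maximum are computed across the whole category, not per layer, which is what makes the layers directly comparable.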

The North Carolina demographic data viewer.

Based on my discussions with data providers and users over the past few years, and evidenced by the volume of viewers being developed, these types of maps and dashboards will continue to swiftly expand. If you are a developer of a data portal, it is my hope that this example detailed here will be helpful in your thought and design process.

And, adhering to our theme of the Spatial Reserves blog, it is ever more important that data users understand where the data are coming from for these viewers, the quality of that data, and the scale at which the data are being shown. The data user also needs to determine whether the viewer is sufficient, whether they need access to the data services behind the viewer, and/or whether they need to download the data to their own device. It is also my hope that these same data users, empowered to be data producers with dashboards and other tools they have at their fingertips, will thoughtfully consider data users' needs when they are in the role of data producer.

We look forward to your comments.

–Joseph Kerski

10 Usability Heuristics for User Interface Design

April 17, 2023

This blog and our book often focus on the usability of spatial data portals. We have been honest in our reviews: some of these data portals are useful, some need work, and some are just plain user-unfriendly. The design of a portal has a great influence on whether and how it will be used by pressed-for-time data users.

The following 10 usability heuristics, developed by Jakob Nielsen, could, I believe, serve as a useful set of principles for designing data portal user interfaces. Jakob calls these 10 elements "heuristics" because they are broad rules of thumb rather than specific usability guidelines.

The 10 include keeping users informed about status (for example, of their data requests), using understandable phrases and words, and giving users control and freedom, among many others.

You can also download and print these heuristics as a very compelling set of visual reminder posters! Also useful are this video playlist from Jakob and the references included in the article linked above.

Which do you feel is most pertinent in our world of geotechnologies? Which do you feel is most often neglected in designing data portals, or even maps, for that matter?

Part of the 10 usability heuristics described in the essay referenced above.

Joseph Kerski

World Terrestrial Ecosystems Data and Maps

The World Terrestrial Ecosystems data is a wonderful new data set that is offered in several ways useful to the GIS data community. Terrestrial ecosystems can be defined by their climate, landform, and land cover. The World Terrestrial Ecosystems Map identifies areas with similar terrestrial ecosystem structure, and as such it can be considered a "synthesis" layer of climate, landforms, and land cover. I highly recommend it as incredibly useful in instruction and in research. An example of a World Terrestrial Ecosystems class that hints at the richness of this data is "Tropical Moist Forest on Mountains".
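As a toy illustration of the "synthesis" idea, each class label can be thought of as a composition of the three input layers. The function and category strings below are purely illustrative, not the project's actual methodology:

```python
# Illustrative sketch only: a World Terrestrial Ecosystems class label
# read as a composition of the climate, land cover, and landform inputs.
def ecosystem_class(climate, land_cover, landform):
    """Compose a synthesis class label from the three input layers."""
    return f"{climate} {land_cover} on {landform}"

# Reproduces the example class mentioned above:
label = ecosystem_class("Tropical Moist", "Forest", "Mountains")
# label == "Tropical Moist Forest on Mountains"
```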

A story map of the world terrestrial ecosystems data, with links to many data sets, and how it was created, is here:

Part of the World Terrestrial Ecosystems story map.

The data is provided in the following formats:

  1. As an ArcGIS Story Map, as shown above.
  2. As an ArcGIS Pro map package (566 MB); documentation is here:
  3. As an image service; see the link toward the end of the story map above.
  4. As a streaming feature service. Here is one, but at the time of this writing, I am not sure it is the latest:

This data and the research that spawned its creation extend far beyond simply using it in a GIS. In fact, "This dataset will raise awareness for existing ecosystems in global scale and draw a baseline of where we are by addressing the targets at SDGs -15 Life on Land Category," said Charlie Frye, Esri Chief Cartographer. Addressing SDG 15 most effectively requires data on protected land, forest land, biodiversity, and more, which is what this research provides and enables.

I look forward to hearing how you are using this data set!

Joseph Kerski

OpenTopography’s Lidar, topographic, and bathymetric data

March 20, 2023

OpenTopography is an effort that facilitates community access to high-resolution, Earth science-oriented, topography data, and related tools and resources.

The OpenTopography Facility is based at the San Diego Supercomputer Center at the University of California, San Diego, and is operated in collaboration with colleagues in the School of Earth and Space Exploration at Arizona State University and at UNAVCO. Its mission is to:

  1. Democratize online access to high-resolution (meter to sub-meter scale), Earth science-oriented topography data acquired with lidar and other technologies.
  2. Harness cutting-edge cyberinfrastructure to provide Web service-based data access, processing, and analysis capabilities that are scalable, extensible, and innovative.
  3. Promote discovery of data and software tools through community-populated metadata catalogs.
  4. Partner with public domain data holders to leverage OpenTopography infrastructure for data discovery, hosting, and processing.
  5. Provide professional training and expert guidance in data management, processing, and analysis.
  6. Foster interaction and knowledge exchange in the Earth science lidar user community.

As this blog is focused on spatial data, OpenTopography merits attention: it contains elevation data, lidar data, bathymetric data, and much more. The site includes an API and even some lessons and tutorials. You can even contribute data that you have gathered to OpenTopography. I have found the data catalog and map interface to be straightforward. The data are delivered in a suitable variety of formats, and the goal seems to be helping the end data user.
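As one example of the API access mentioned above, OpenTopography exposes a REST endpoint for extracting global DEM data by bounding box. The sketch below assembles such a request URL; the endpoint and parameter names follow the public API documentation as I understand it, while the bounding box and API key are placeholders (a free key is available from the OpenTopography portal):

```python
# Hedged sketch: build a request URL for OpenTopography's global DEM API.
# The coordinates and key below are illustrative placeholders only.
from urllib.parse import urlencode

BASE = "https://portal.opentopography.org/API/globaldem"

def build_dem_request(demtype, south, north, west, east, api_key, fmt="GTiff"):
    """Assemble a download URL for a bounding-box DEM extract."""
    params = {
        "demtype": demtype,        # e.g. "SRTMGL1" for 30 m SRTM
        "south": south, "north": north,
        "west": west, "east": east,
        "outputFormat": fmt,       # GeoTIFF output
        "API_Key": api_key,        # free key from the OpenTopography portal
    }
    return f"{BASE}?{urlencode(params)}"

# A roughly 1-degree box in the upper Midwest, as an example:
url = build_dem_request("SRTMGL1", 44.0, 45.0, -90.0, -89.0, "DEMO_KEY")
```

The resulting URL can then be fetched with any HTTP client and the returned GeoTIFF opened in your GIS package of choice.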

As a reader of this blog, I think you will find OpenTopography to be a useful resource and I encourage you to keep it in your oft-used bookmarks.

OpenTopography's front page.

–Joseph Kerski

Accessing Imagery from the Alaska Satellite Facility

The Alaska Satellite Facility (ASF) is committed to making remote sensing data accessible. The facility is a part of the Geophysical Institute of the University of Alaska Fairbanks, and is a NASA Distributed Active Archive Center (DAAC). It advances remote sensing to support national and international Earth science research, field operations, and commercial applications.

We have reviewed hundreds of data portals over the past dozen years on this blog and in our book as well, some very user-friendly, others less so. The ASF's data portal is among the easiest to use and the most useful of any that I have encountered over my entire career. Indeed, our earlier review is here and was also positive! Most of the data that the ASF provides is Synthetic Aperture Radar (SAR) data. SAR refers to a technique for producing fine-resolution images from a resolution-limited radar system. It requires that the radar be moving in a straight line, either on an airplane or, as in the case of NISAR, launching in 2024, orbiting in space. But Sentinel-1, AVNIR, RADARSAT, and many other image platforms and types are available on the site as well. SAR data is not affected by cloud cover, and it has many other advantages, including global change monitoring in 3D, accurate elevation models, reliable monitoring, and high resolution.

For example, see the interface below. You can search by image type, geography, time period, and much more. In searching for Sentinel-1 imagery in this area of North America, I am given options for delivery mechanisms and file formats, to download or stream, and much more. But not so many options that they become overwhelming; it is just right. The user experience is really incredible, and I salute the developers of this site. I had the opportunity to work with developers and scientists from ASF at the most recent AGU science conference and have the utmost respect for them.

User Interface for data acquisition from ASF.

I highly recommend using this data portal for many of your data needs!

–Joseph Kerski

New US Census Bureau Address Count Listing Files Released

February 20, 2023

I've worked with, and have had great respect for, Geographer Jim Castagneri of the US Census Bureau for many years now. When I asked him to write for the Spatial Reserves data blog about some of his favorite new data sets and services, I was thrilled that he agreed. Here, Jim shares an important geographic product release that I believe will be of interest to data analysts and GIS users. I've tested this resource myself and agree that it is extremely useful. —Joseph Kerski


The lack of detailed population counts available in the intercensal period has troubled data users since the introduction of the modern census.  Aside from local efforts to track and record housing and population change, the federal government now has a new dataset to assist in this effort. 

Called the Address Count Listing Files, these data represent the latest available count of housing units by census block updated bi-annually.  These files are created for all 50 States, Washington D.C., Puerto Rico, and U.S. Island Areas.  The latest data release was in January of 2023.  More information on these files and how to download them can be found here:

These files are relevant for several reasons.  First, they represent the first time the Census Bureau has released updated, block-level housing counts between decennial censuses.  Second, the housing counts are actual counts, not estimates, and they are not affected by the Bureau's new Differential Privacy non-disclosure effort.

Paired with the annually revised TIGER Partnership Shapefiles, local planners and regional emergency responders can now derive relatively accurate, detailed population figures without a need to conduct areal interpolation or the more rigorous dasymetric modeling with tangentially related datasets.  By considering local housing occupancy rates and persons-per-household figures, one can derive a fairly accurate population count at the block level independent from the decennial census count.
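That back-of-the-envelope derivation can be sketched in a few lines of code. The occupancy and household figures below are illustrative placeholders, not actual Census values:

```python
# Hypothetical sketch: estimate block-level population as
#   housing units x occupancy rate x persons per household.
def estimate_block_population(housing_units, occupancy_rate, persons_per_household):
    """Derive an approximate population count for one census block."""
    occupied_units = housing_units * occupancy_rate
    return occupied_units * persons_per_household

# e.g., a block with 120 units, 92% occupancy, 2.5 persons per household
pop = estimate_block_population(120, 0.92, 2.5)
# roughly 276 people
```

Local occupancy rates and household sizes vary considerably, so the quality of the estimate depends on how locally specific those two inputs are.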

The above site also contains a very helpful viewer app based on ArcGIS technology (shown below).

What would you like to see on this blog?

February 6, 2023

We have been writing the Spatial Reserves blog about all things data, where to find it, how to assess its quality, and its societal implications, for over a decade now. We started it because of frequent questions that the GIS community had, and still has, such as "How do I find geospatial data?", "How do I know if that data is any good?", and "What are the social and ethical implications of the use of geospatial data, including copyright, symbology, fee vs. free data, and location privacy?" We have enjoyed interacting with you, the readers, through your comments, emails, LinkedIn posts, and in other ways.

During these past 10 years, we have never conducted a survey of our community of readers. As we are considering writing a second edition of the book The GIS Guide to Public Domain Data, we thought now would be an ideal time to conduct such a survey.

Please participate in a short nine-question survey that should take only 3 minutes of your time but will be immensely helpful to us and the future readers of this data blog:

We look forward to hearing what you think!

–Joseph Kerski, Jill Clark

Categories: Public Domain Data