Due to a lack of mapping resources and difficulties in obtaining census statistics for many Asian countries, the AsiaPop project was set up in July 2011 to produce detailed and freely available population distribution maps for the whole of Asia. The project recently announced its latest release of population distribution datasets for 17 Asian countries, including Sri Lanka, Nepal, North and South Korea, and Tajikistan. The data, available in GeoTIFF format, are free to download (subject to registration) and provide population counts (persons per grid square) for 2010 and 2015.
A combination of high-resolution (100m) settlement maps derived from satellite imagery and land cover maps was used to reallocate contemporary census-based population data, producing more accurate national coverage than had previously been available. The datasets can be used to measure the impact of population growth, to monitor change, and to inform future development strategies.
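The reallocation step described above is a form of dasymetric mapping: the census total for each administrative unit is redistributed across its grid cells in proportion to weights derived from the settlement and land cover layers. The sketch below is a minimal illustration of that idea, not AsiaPop's actual method; the unit totals and weights are invented.

```python
def dasymetric_allocate(census_total, cell_weights):
    """Distribute an administrative unit's census total across its grid
    cells in proportion to settlement/land-cover weights."""
    total_weight = sum(cell_weights)
    if total_weight == 0:
        # No settled cells: fall back to spreading the population evenly.
        return [census_total / len(cell_weights)] * len(cell_weights)
    return [census_total * w / total_weight for w in cell_weights]

# Hypothetical unit of 1,000 people covering four 100m grid cells;
# a weight of 0 marks a cell the settlement map shows as unoccupied.
counts = dasymetric_allocate(1000, [0, 5, 3, 2])
print(counts)  # [0.0, 500.0, 300.0, 200.0]
```

Because the weights are normalised, the allocated counts always sum back to the unit's census total, which is what makes the resulting raster usable for aggregate analysis.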
A related project was established in Africa in 2009 (http://www.afripop.org/), also providing free population count data (in Esri FLOAT format).
AmeriPop, started in Oct 2012, aims to provide similar data for Central and South America (http://www.ameripop.org/).
My colleagues and I were thrilled with the arrival of Terraserver. Think back with me to 1998. While maps and images for use in GIS on the web are commonplace today, back then they were revolutionary. Suddenly, thanks to an agreement between the USGS and Microsoft, the GIS community had access to USGS topographic maps and aerial photographs down to 1 meter spatial resolution for the entire USA. Two additional features made this service extra special. First, these images were georeferenced, meaning that they could be easily used within a GIS environment. Second, these images were online: no CD-ROMs or other physical media were required! After downloading the maps and aerials for our area of interest, we could read them into our ArcInfo or ArcView GIS software. True, the header files often needed to be edited first, but this resource gave us a huge leap forward because we had terabytes of data at our fingertips via http://www.terraserver-usa.com, later http://msrmaps.com. Even better was when some enterprising folks at Esri wrote programs to automatically stream these images to ArcGIS.
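Those header edits typically involved the plain-text "world file" that accompanies a georeferenced image (a .tfw alongside a TIFF, for example): six lines giving the pixel size, two rotation terms, and the map coordinates of the centre of the upper-left pixel. The sketch below parses such a file and applies the affine transform; the coordinate values are made up for illustration.

```python
def parse_world_file(text):
    """Parse the six lines of an Esri world file (.tfw/.jgw), in order:
    A (x pixel size), D and B (rotation terms), E (negative y pixel
    size), then C and F (map x, y of the upper-left pixel's centre)."""
    a, d, b, e, c, f = (float(line) for line in text.split())
    return a, b, c, d, e, f

def pixel_to_map(col, row, params):
    """Affine transform from pixel (col, row) to map coordinates."""
    a, b, c, d, e, f = params
    return (a * col + b * row + c, d * col + e * row + f)

# Hypothetical world file for a 1 m resolution aerial (no rotation).
tfw = """1.0
0.0
0.0
-1.0
500000.5
4100000.5
"""
params = parse_world_file(tfw)
print(pixel_to_map(0, 0, params))    # (500000.5, 4100000.5)
print(pixel_to_map(10, 20, params))  # (500010.5, 4099980.5)
```

Note the negative y pixel size: image rows count downward while map y coordinates increase upward, a mismatch that was behind many of those hand edits.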
Now, 14 years later, Terraserver has been retired. As the National Atlas recently wrote, “We note its passing and salute all those who developed the service. Many people were involved in this groundbreaking effort. […] The National Atlas switched over to services provided by Esri so that Atlas users can continue to link from our maps to large-scale topo maps and aerial views. This takes us full circle. The National Atlas Map Maker was the first on-line, interactive mapper offered by the Federal government. It was partially developed under a joint research effort by the USGS and ESRI in 1997.”
A plethora of base maps, topographic maps, satellite images, and aerial photographs are now available to GIS users and the general public, for example via ArcGIS Online. Times have changed but the need for good base data lives on. While I don’t long for those days of tinkering with header files, I salute the early pioneers who made it all happen. The evolution of GIS data, and discussions about data sources, quality, and related issues, are topics Jill Clark and I discuss in our book The GIS Guide to Public Domain Data.
My colleagues and I frequently need old aerials for land use change studies, however, so I wish Terraserver had remained online. Why couldn’t it have done so? What are now the best sources for old aerial photographs?
There’s a saying that goes something along the lines of ‘Whoever wins the war gets to write the history’. Perhaps a similar saying could be applied to map making: ‘Whoever makes the map gets to interpret the location’.
A map, paper or digital, is a representation of the Earth’s surface. That representation is an interpretation of the location, based on a particular perspective. Although a great deal of modern map making is automated, a certain amount of cartographic interpretation is still involved. Recent years have also seen a huge increase in the volume of citizen-generated mapping, freely available to anyone with an internet connection. Different mapping algorithms, cartographers, or citizen map makers may choose to emphasise certain features at the expense of others, introducing a degree of bias in the final product.
In a recent article for the BBC, Why modern maps put everyone at the centre of the world, Simon Garfield observes that “… new maps are gridded by technicians and pixel masters, who may be more concerned with screen-loading speeds than the absence on a map of certain parts of, say, Manchester or Chicago.”
A map is a version of a location and, like versions of history, some are more reliable than others. As end users, few of us can check for ourselves, so we have to rely on map producers not only to minimize the bias, but also to document how the data were collected, so we can decide which version best suits our requirements.
One of the themes running through our book The GIS Guide to Public Domain Data is the great value inherent in geospatial data. These data increasingly help us make everyday decisions more efficiently in just about every walk of life, from health care to city planning to climate studies. The belief that better data will lead to better decisions, and the increasing value that people throughout society place on GIS, are fueling initiatives to make geospatial data more available, accessible, and open.
But how can this value be assessed quantitatively? According to a recent article in Earthzine, Putting a Value on Geospatial Data, geospatial data can help governments cut expenditures and increase efficiency. However, since the benefits are spread across multiple departments inside and outside an organization, they are very difficult to measure. Counting the number of downloads of a dataset hardly begins to tell the story of these benefits.
Nevertheless, the article does cite some interesting statistics from around the world. For example, in England and Wales between 2008 and 2009, GDP was an estimated £320 million (about $500 million) higher than it would have been if local governments had not made use of geospatial information for service delivery, according to a 2010 report co-produced by British firm ConsultingWhere and ACIL Tasman. In Australia, according to a 2008 report, the financial gains from using spatial information account for between 0.6 and 1.2 percent of that country’s gross domestic product (GDP). The report focused on sectors including agriculture, fisheries, property, mining, and government.
The article therefore provides some relevant information to back up the arguments we make in the book, and some fascinating reflections to consider as you teach and learn about geospatial data.
Nigel Thrift, vice-chancellor and president of the University of Warwick in England, recently published a blog On Being Able to Find Things, discussing the best ways to interpret and communicate the vast amounts of information that are instantly available today. He notes how important quantitative methods have become in analysing data but adds that there has been a growing appreciation of the value of a more qualitative approach, based on description and observation, to communicating the information derived from that data analysis.
Enter the map as an aid to storytelling. Not a new idea, but one that is enjoying something of a renaissance with the emergence of sites like Story Maps from Esri and MapStory from The MapStory Foundation. Both sites provide the tools and publishing framework for registered users to collate and present their spatial data on a particular theme, issue or story. Examples include the University of Minnesota’s Institute on the Environment story map on global crop production, and Everett Lasher’s time-lapse map of the distribution of undersea communication cables.
All the recent innovations in geospatial technologies – cloud computing, crowdsourcing, map applications embedded in web pages and so on – make sites like these possible. Best part? You don’t have to be an expert in any of these technologies to take advantage of them.