From V1: How will the geospatial data market evolve over the next ten years?

The ability to collect, distribute and access geospatial data continues to improve in terms of speed and precision of collection, timeliness of delivery, and affordability.

In addition to these expected improvements, brought about by the rising importance of geospatial data and advancements in geotechnology, there are a number of profound changes that will greatly impact the geospatial data market in the years to come.

The changes underway are a combination of increased individual data collection, more open government, the explosion of sensor data, technology that automatically extracts information from data, and the ability to synthesize information from many sources. While there are increasing business opportunities in each of these areas, there are also significant disruptions taking place that threaten the business models of many established geospatial data organizations. The next ten years will be a time of many changes, but will also bring greater empowerment of the GIS user, given the amount of data available, much of it for free.

Volunteered Map Data

The advent of crowd-sourced geospatial data, such as OpenStreetMap (OSM), continues to gain momentum and could easily replace the market for much commercial geospatial data. At present, the coverage of OSM data nearly matches that of the commercial providers, and in some cases surpasses it in accuracy and level of detail. By 2020, the market for navigation data will be much smaller, but there will still be a place for highly accurate and trusted-source data.
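To make the crowd-sourced data concrete, the sketch below parses a tiny, hand-written fragment in the OSM XML format (the `node` and `tag` elements OSM actually uses) and pulls out tagged points of interest. The fragment itself is invented for illustration, not a real extract.

```python
import xml.etree.ElementTree as ET

# A tiny, hand-written fragment in the OSM XML format: two nodes,
# one tagged as a cafe. (Illustrative data, not a real extract.)
OSM_XML = """<osm version="0.6">
  <node id="1" lat="51.5074" lon="-0.1278">
    <tag k="amenity" v="cafe"/>
    <tag k="name" v="Example Cafe"/>
  </node>
  <node id="2" lat="51.5080" lon="-0.1281"/>
</node_placeholder>""".replace("</node_placeholder>", "</osm>")

def points_of_interest(xml_text, key="amenity"):
    """Return (id, lat, lon, tag value) for every node carrying the given tag key."""
    root = ET.fromstring(xml_text)
    results = []
    for node in root.iter("node"):
        tags = {t.get("k"): t.get("v") for t in node.findall("tag")}
        if key in tags:
            results.append((node.get("id"),
                            float(node.get("lat")),
                            float(node.get("lon")),
                            tags[key]))
    return results

print(points_of_interest(OSM_XML))
# → [('1', 51.5074, -0.1278, 'cafe')]
```

This is the same pattern a consumer of a real OSM extract would follow, only at planet scale and with more robust tooling.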

Mobile platforms are of increasing importance for data collection. The better location precision of these devices will help greatly in both the collection of accurate geospatial data and the delivery of helpful location-aware applications. Mobile platforms are quickly dwarfing all other computing platforms in their numbers and their pace of innovation. This trend will continue to the point where we carry smaller, less powerful computing platforms, but with much greater connectivity to each other and to the details that interest us.

Open Government

The growing government transparency movement, with more open data and the advent of application-development initiatives such as “Code for America,” is placing the emphasis on what can be done with government data to improve decision making and offer greater services to constituents. The shift is away from services for citizens and toward collaboration with citizens.

The increased involvement of citizens with both the data and the services that are offered will mean a vested partnership in assuring the quality and accuracy of data. With more people accessing and using the data, this exposure will drive quality improvements, particularly if there is a means for the crowd to conduct quality control and submit updates. With an open data approach there will be less data drudge work, and this freed time and effort will enable governments to apply greater analytics to make sense of inputs and to predict and chart future courses.

Automated Collection and Extraction

Machine learning and automated extraction tools that pull information from data are on the rise. The ability to pull different data products from raw imagery or other sensor inputs will become a focus area for many. The users will have the capability to use and tune data inputs for their own purposes, and data providers will concentrate on creating more diverse data products. The more specialized data products that can be derived, the more value there is in the sensors themselves.
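A minimal sketch of what “pulling a data product from raw imagery” can mean in practice: deriving a vegetation mask from the red and near-infrared bands of an image using NDVI, a standard remote-sensing index. The band values and the 0.3 threshold here are invented for illustration.

```python
# Toy 2x2 image bands (reflectance values are invented for illustration).
RED = [
    [0.10, 0.40],
    [0.12, 0.45],
]
NIR = [
    [0.60, 0.42],
    [0.55, 0.44],
]

def ndvi(red, nir):
    """Per-pixel NDVI = (NIR - RED) / (NIR + RED)."""
    return [[(n - r) / (n + r) for r, n in zip(r_row, n_row)]
            for r_row, n_row in zip(red, nir)]

def vegetation_mask(red, nir, threshold=0.3):
    """Classify a pixel as vegetation when its NDVI exceeds the threshold."""
    return [[v > threshold for v in row] for row in ndvi(red, nir)]

print(vegetation_mask(RED, NIR))
# → [[True, False], [True, False]]
```

Real extraction pipelines run far richer models than a threshold, but the shape is the same: raw sensor bands in, a tuned derived product out, with the threshold (or model) being the knob the user adjusts for their own purposes.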

The demand for this specialized data is already high, but the ability to deliver real-time information to sophisticated programs that monitor and react to data inputs autonomously and adaptively will see this interest take off. The “app for that” mentality could easily give way to a “data for that” expectation, with software developers orchestrating the different data feeds in order to create custom solutions.

Synthesis and Fusion

Geospatial data interoperability plays a huge role in the ability to pull together a variety of data, particularly when moving toward real time. The more normalized the data are to each other, the faster and richer the synthesis of information.
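The normalization step above can be sketched with two hypothetical feeds that report the same kind of observation under different schemas and units; the feed names, field names, and values are all invented for illustration.

```python
# Two hypothetical feeds reporting point temperature observations
# under different schemas: feed A uses lat/lng and Fahrenheit,
# feed B uses y/x and Celsius.
feed_a = [{"lat": 40.71, "lng": -74.00, "temp_f": 68.0}]
feed_b = [{"y": 40.72, "x": -74.01, "temp_c": 21.0}]

def normalize_a(rec):
    """Map feed A's schema onto a common (lat, lon, temp_c) record."""
    return {"lat": rec["lat"],
            "lon": rec["lng"],
            "temp_c": round((rec["temp_f"] - 32) * 5 / 9, 1)}

def normalize_b(rec):
    """Feed B already uses Celsius; only the coordinate keys change."""
    return {"lat": rec["y"], "lon": rec["x"], "temp_c": rec["temp_c"]}

fused = [normalize_a(r) for r in feed_a] + [normalize_b(r) for r in feed_b]
print(fused)
# → [{'lat': 40.71, 'lon': -74.0, 'temp_c': 20.0},
#    {'lat': 40.72, 'lon': -74.01, 'temp_c': 21.0}]
```

Once every feed lands in the common schema, downstream analysis can treat the fused stream as a single source, which is exactly the speed-up interoperability buys when moving toward real time.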

Experts such as today’s geospatial technologists will evolve into more active developers of software, as well as synthesizers of data. The huge volumes of data available will require skilled technicians to verify, aggregate and analyze this information for rich insights. The connectivity of the Web will feed these specialists, and organizations and governments might simply subscribe to regular data-scrubbing and synthesis services. The ability to craft solutions that return reliable results and improve organizational efficiency means these knowledge workers will be in high demand.

Source V1