
The Evolution of Geospatial Technology Calls for Changes in Geospatial Research, Education and Government Management

(April) By Prof. Mike Jackson, David Schell and Prof. D.R. Fraser Taylor. DirectionsMag

1. Introduction

Over the last 15 years, geospatial technologies have evolved and dramatically converged with a variety of formal information technology disciplines. The discrete, even esoteric pursuits we referred to as GIS, Earth imaging, GPS, AM/FM, location-based services and navigation systems are no longer discrete. Now they are of a piece: they “talk to one another” and interact freely in a fertile communications environment of wireless broadband, portable cell phone/computers, sensor-webs and, of course, the dynamically evolving environment of the World Wide Web. In fact, geodata is rapidly becoming a conventional and pervasively familiar data type seen at once to underpin and significantly recharacterize the digital world, with broad implications for both technology and society.

The inescapable fact is that the “geospatial technology revolution” will continue at a fast pace which will require accommodation across multiple domains of research, education and government. The generally accepted idea that scientific advances create disruptive technologies which in turn cause “creative destruction” of commercial and industrial enterprises applies as well across a broad spectrum of institutional entities representing diverse cultural interests. Competent leaders understand that organizations must adapt continually to such change because it is unavoidable. Routinely, they learn from the change they observe occurring in the world and make it their business to reorganize and redirect resources and people to respond creatively, responsibly and aggressively to the most critical of the observable resulting challenges. At the present time, a similarly focused approach to leadership must be emphasized in both academia and government, just as it is now being addressed in industry.

Never in the history of public and private sector geospatial processing has there been a more urgent need for leadership that understands change. The most obvious challenge is that expansion of human population and industry has brought humanity to a point of converging crises, and diverse industry stakeholders see geospatial technology in this context as a critical factor in enabling humanity to avoid disaster.

Another obvious challenge is that the market forces that drive the evolution of technology do not meaningfully, or in any disciplined way, take into full account the needs of science and social processes. Leaders in research, education and government, therefore, share responsibility for understanding industrial development and the state of commercial applications if they wish to define more useful outcomes than those which would otherwise occur by default – outcomes which would in fact be inevitable should leading thinkers and policy planners persist in defending traditional institutional and economic intellectual practices instead of opening themselves to the evolved research and policy requirements of the increasingly complex modern world.

2. Deconstructing and reconstructing geospatial academia

Because our societal problems are so critical, we must build the best possible foundation for advancing the capabilities and wide use of geospatial technologies. Recent dramatic progress in the development of geospatial interoperability enables us to model increasingly complex aspects of our environment and social processes, and this significantly amplifies the power of geographic thinking. To make the most of this progress, it is timely to take a fresh and objective look at the positioning of geospatial issues in both academic and government contexts, and to evaluate critically whether present organizational and departmental structures adequately serve our actual critical needs.

It would be a useful academic exercise to write a history of academic GIS, putting it in the context of the evolution of academic geography and in the larger context of the evolution of technology, academia and science itself, but we don’t in fact need to go into such detail to understand our choices.

To begin with the obvious, let us first note that it has become conventional to locate the teaching of GIS and Earth imaging in GIS, geography, geomatics engineering or natural resource departments, and in so doing to focus curricular approaches initially on the acquisition of expertise in operating on spatial data largely by means of desktop GIS applications. Students learn about scanning and digitizing and converting digital spatial data between formats, systems and software. There are many other elements in the curriculum, of course, and hopefully as students broaden their exposure to geographic computing they learn to think in terms of the fundamental and pervasive relevance of spatial information.

This approach has worked well and has been successful to the extent that it has been driven by techniques able to address the expanding, but carefully limited, scope of issues requiring the use of diverse forms of location data and the integration of spatial information into mainstream information and communication technology processes. However, the authors believe it is increasingly important for university decision makers to appreciate the impact on geospatial research and education of interoperability standards organizations such as the Open Geospatial Consortium and ISO’s TC211 committee on “Geographic Information and Geomatics,” as well as the various platform standards organizations that have been instrumental in creating the enabling environment for information interoperability in the World Wide Web.

A significant consequence of this “geospatial interoperability” in the context of the Web is that increasingly the geodata and geoprocessing resources one requires to address a variety of information integration challenges need not be imported and stored permanently on one’s desktop, but may reside on a remote computer across the Web and yet be accessed as though attached to the local system. Service-oriented architectures and Web-based geoprocessing mean that the work of researchers and students increasingly need not involve the often inefficient import of whole “files” of data from which the limited subset of interest must be manually extracted. Their searches for data are now potentially more like searches using an Earth browser, in which one’s query specifies a region of interest and a remote server returns an “answer”. Using free Web services and having no GIS education, millions of people are in fact now able to perform operations that were very recently known and used only in the closed community of GIS, and this progress ushers in a new age of information integration and the harmonization of the wide variety of the world’s diverse collections of geospatial information.
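As a minimal sketch of this shift, the Python example below queries a hypothetical OGC Web Feature Service for only the features inside a bounding box of interest, rather than downloading and converting an entire file. The endpoint URL and layer name are illustrative assumptions, not references to any real service.

```python
import requests

# Hypothetical OGC Web Feature Service endpoint (illustrative only).
WFS_URL = "https://example.org/geoserver/wfs"

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "demo:land_cover",                 # assumed layer name
    "bbox": "-1.30,52.90,-1.10,53.00,EPSG:4326",    # region of interest only
    "outputFormat": "application/json",             # ask for GeoJSON rather than a whole file
    "count": 100,                                   # limit the size of the "answer"
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
features = response.json()["features"]
print(f"Received {len(features)} features for the region of interest")
```

The same request shape works against any server that implements the standard interface, which is precisely the point: the client never needs to know which vendor's software is answering.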

We will see, however, that such miracles of geospatial interoperability cannot be adequately exploited without our addressing the conventional restrictions imposed by our scientific disciplines themselves. Unfortunately, work with geospatial information is constrained by traditional thinking and the constraints are reinforced by the architecture of GIS technologies designed within the limitations of prevailing conceptual strategies. As a consequence, through the entire history of geospatial computing, “received wisdom” focused on geospatial software as “packages,” or proprietary applications, and geospatial data was universally viewed in the traditional file structure dictated by its earliest incarnation in the age of half-inch magnetic tape. Most unfortunately, data collection was conceived in terms of discrete projects yielding data for specific uses.

With such a heritage of dated technologies, one might ask, with justification, how the habits of “thinking in a box” that have characterized the last three decades of academic and government usage could have been otherwise. It is, of course, because of today’s network-based information technology paradigm that we are able to examine alternatives and, in particular, to conceptualize ways that our powers of thought might expand. Interoperability enables us to begin thinking about data as a vast Web resource whose components are easily discoverable, assessable, accessible and combinable in analysis despite their diverse file formats, coordinate reference systems, originally intended uses, heterogeneous data models, and naming schema. The million springs of this resource will evolve independently in response to real world conditions and constantly changing local and thematic information requirements. But they will be there, constantly accumulating, for us and our automated tools to explore and mine.

Understanding the Earth as a system requires that our scientific information systems be conceptualized in such a way that they are capable of interfacing with one another and ultimately able to function as a single unified system, one that is able to support more effective interdisciplinary collaboration. Such synergy – the true integration of diverse research data and the building of one study upon another – has not usually been possible in the modern IT context, much less realized, because of non-interoperability. Data are usually collected, used once, and forgotten – not catalogued and actively maintained online for discovery and ready access by others. Studies are published, but it is far less common to publish the data associated with them, even though this data represents an additive resource that could be used with other data in more comprehensive studies, multidisciplinary and longitudinal research projects, or decision-support efforts.

Most geospatial data never even enter a GIS. Sensors and field studies, for example, may employ location parameters, but data collected by sensors or in the context of field studies are seldom published online with machine-readable schemas documenting other parameters and data associated with the location parameters. As envisioned above, if both parameters and data were published in this way, Web-based systems could find and aggregate such data into rich and informative data layers in a GIS.
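A minimal sketch of what such machine-readable publication might look like, assuming nothing more than the widely adopted GeoJSON encoding: a single field observation is written out with its location and its other measured parameters attached as properties, so that a Web-based system could later discover and aggregate it into a data layer. The station, values and method description are illustrative.

```python
import json
from datetime import datetime, timezone

# One field observation, encoded as a GeoJSON Feature so that the location and
# the associated measured parameters travel together in machine-readable form.
observation = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-75.69, 45.42]},  # lon, lat (illustrative)
    "properties": {
        "observedProperty": "air_temperature",
        "value": 18.4,
        "uom": "Cel",  # unit of measure
        "resultTime": datetime.now(timezone.utc).isoformat(),
        "procedure": "handheld thermometer, field survey",  # assumed method description
    },
}

# Publishing a FeatureCollection (to disk here, or to a web server in practice)
# makes the data discoverable and aggregable by other systems.
with open("observations.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": [observation]}, f, indent=2)
```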

Open, internationally adopted XML-based standards now make it possible for geodata, geoprocessing operations, sensors, data reduction methods, modeling services, transducer parameters and sensor data repositories to be made publishable, discoverable, accessible and useable in the environment of the Web. The range of such resources increases as the standards effort continues to gain participants from new technology and application domains. Geospatial interoperability standards have in fact removed barriers imposed by “last generation” technical incompatibilities between diverse types of systems, such as GIS and Earth imaging, diverse vendor-proprietary systems, and diverse custom projects, and in so doing, standards have stimulated researchers to imagine innovative spatial relationships that broaden the scope of their research.

Not surprisingly, science, like any other human pursuit, clings conservatively to its formalities, and yet science by no means remains static. One of the most important causes of change in the structure of scientific thought occurs because of intellectual paradigm shifts which are both the cause and effect of technological evolution – evolution that is driven by industrial necessity as well as by restless human inventiveness. When one considers glass lenses, calculus, chronometers, X-rays, semiconductors, raster imaging devices, GIS, etc., one sees that the history of science is closely tied to the history of new tools for knowledge. The new tools provide new ways of gathering knowledge and new ways of looking.

Looking back, we tend to see the coordinated genius and logic of scientific progress, whereas in the present, we are more likely to see diversity and controversy which sometimes results in resistance to the transformative use of new technologies in science. This is partly because the old ways of gathering knowledge provide not only a familiar way of looking at reality but also a comfortable foundation for traditional funding arrangements and institutional stability. Nor should it be overlooked that professional standing is important to scientists, causing many to wish their research to be seen in conventional terms, as a contribution, not an affront to established institutions and practices.

In this context, academics concerned with geospatial issues face adoption of upsetting new technological advances represented by the emergence of a new generation of standards in the geospatial technology domain – a direct result of the recent ubiquitous developments in the fundamental information and communication technologies that drive advances in distributed computing and service architectures. Researchers are challenged to appreciate the fact that interoperability standards are positioned to bring important benefits to the formal sciences that advance tools such as GIS, remote sensing and modeling; the natural sciences that use these tools to understand geospatial features and phenomena; and the non-scientific communities that use geospatial technologies.

The theory most relevant to the Web that underpins the argument of this paper is Metcalfe’s Law. This basic tenet of information science states that the value of a node on a communications network grows (by some exponent) with the number of potential users of that node, or with the total number of nodes in the network. George Gilder popularized the formulation “the square of the number of connected users,” but the actual exponent depends on many factors. The law applies, for example, to the network of telephones, the network of Web servers, and the network of fax machines. It applies more generally, outside of technology, to, for example, the networks of people who can converse about genetics or who have money to spend.
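As a back-of-the-envelope illustration of why the exponent matters, the sketch below compares how a network's value scales under linear growth, the n-squared formulation, and the more conservative n·log(n) estimate some analysts prefer. The constant of proportionality is arbitrary and the function names are our own.

```python
import math

def network_value(n, model="n^2", k=1.0):
    """Toy comparison of proposed network-value laws (illustrative only)."""
    if model == "linear":
        return k * n
    if model == "n log n":
        return k * n * math.log(n)
    return k * n ** 2  # Metcalfe/Gilder formulation

for n in (10, 1_000, 100_000):
    print(n,
          round(network_value(n, "linear")),
          round(network_value(n, "n log n")),
          round(network_value(n, "n^2")))
```

Whatever the exact exponent, the value of adding one more interoperable node to the network grows far faster than the cost of adding it.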

Information networks depend on nodes that connect through common interfaces and encodings, like people speaking a common language. Interoperability has been documented to be a growing concern of government IT managers, and by the same trend, the huge COMDEX computer products tradeshow has been replaced by a show called INTEROP. This is because we are involved in a communications revolution on top of a computing revolution. A key social innovation of our era is the establishment of consensus standards organizations that manage the processes for agreeing on such frameworks, interfaces and encodings, which are usually released as free and open specifications for use by any developer. Interoperability, therefore, is the key to a connected world.

Academia, and government, as we shall see, need to be concerned with how Metcalfe’s Law applies to the network of online scientific or municipal geospatial data sets, sensors, Web services and data schemas that are accompanied by standard metadata and that can be discovered and accessed through Web catalogs and servers that implement open standard interfaces and encodings. Academia, in particular, needs also to be concerned with the connections between devices and technologies and the build-out of useful permutations among these to meet the dramatically increasing need for integrating geospatial information resources into the research techniques of a variety of traditional scientific disciplines.

3. Open Access to Geospatial Data

Academics and those who fund their research should be acutely interested in the proposition that geospatial data developed for scientific purposes can be, in a Web environment, a resource whose value increases with the number of researchers who use it. Geography has always been interdisciplinary and GIS has always been a tool for combining data from different sources. All geodata refers to some aspect of the same Earth. If researchers properly document, archive and publish their data and methodologies using available Web technologies, standards and best practices, many benefits accrue:

a) Improved opportunities for cross-disciplinary and longitudinal studies. This is key. Geography is inherently cross-disciplinary, and a main underlying theme in modern science is the relatedness of phenomena. We need interdisciplinary and longitudinal studies to help us address critical problems such as resource depletion, climate change, population, pollution, disaster management and adequate provision of food, water, shelter and energy.
b) Improved verifiability of results. Science requires that experiments be replicable, and therefore experimental data must be available. In the context of the standards regime described above, details of methods and semantics become more accessible for review. In this new environment, GIS and remote sensing studies will be less vulnerable to the accusation that pretty maps are being used to cover up poor experimental design or biased reasoning. Climate science made more verifiable will be harder to discredit or ignore.
c) Improved Web-based data publish/search capabilities. These make much more data available and enable more efficient assessment of data. In most cases, literature searches will be a much less efficient way to discover data than direct data searches using online catalogs, because researchers looking for previously collected data often do not know which bodies of literature to search. For example, data collected by an ornithologist may include temperature readings that would be valuable to a hydrologist. (A minimal catalog-search sketch follows this list.)
d) Improved ability to re-use or repurpose data for new investigations, reducing redundant data collection, increasing the value of data and creating opportunities for value-added data enhancement. The value of data increases when the data can be re-used or when they can be collected with the intention of serving multiple research purposes. This benefits the public and those who fund scientific research, and brings a greater return in terms of the general application and use of science. It benefits data owners, who may be able to charge for use of the data they generate or who may add value to it to provide more useful and saleable products.
e) Improved opportunities to collaboratively plan data collection/publishing efforts to serve multiple defined and undefined uses.
f) Improved rigor and transparency regarding data collection methods, processing methods and data semantics. The SensorML standard makes the processing chain transparent. Generally agreed upon conventions for describing data and methods (such as data reduction) contribute to clarity and rigor.
g) Improved ability to discover spatial relationships. Researchers will surely browse data and become curious about patterns they notice. Robust data descriptions and quick access to the data will enable more rapid exploration of hypothetical relationships.
h) Improved ability to characterize, in a standardized human-readable and machine-readable way, the parameters of sensors, sensor systems and sensor-integrated processing chains (including human interventions). This enables useful unification of many kinds of observations, including those that yield a term rather than a number.
i) Improved ability to “fuse” in-situ measurements with data from scanning sensors. This bridges a historical divide between research communities that have focused mainly on either unmediated raw spatial-temporal data or spatial-temporal data that is the result of a complex processing chain.
j) Improved ability to “chain” Web services for data reduction and analysis, and improved ability to introduce data into computer models that use multiple inputs from remote data stores or real-time data feeds.
k) Improved ability to encode sensor data in the ISO Feature Model (ISO 19109). This is just one of the ways in which the OGC Sensor Web Enablement (SWE) standards will enable scientists to leverage off-the-shelf, standards-based tools for data modeling and management, just as they currently use commercial spreadsheet and database programs.
l) Improved societal and institutional return on investment of research dollars, and improved ability of research funding institutions to do due diligence and policy development.
m) More efficient scientific debate and accelerated pace of scientific discovery, as automation and new institutional arrangements reduce the amount of time spent on data management, freeing researchers’ time for more creative work and more communication with other scientists.
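To make item c) above concrete, here is a minimal discovery sketch, assuming the open-source OWSLib package and a hypothetical OGC Catalogue Service for the Web (CSW) endpoint: a researcher searches the catalog by bounding box and keyword rather than by guessing which discipline's literature to comb through. Endpoint, region and keyword are illustrative.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import BBox, PropertyIsLike

# Hypothetical catalogue endpoint (illustrative only).
csw = CatalogueServiceWeb("https://example.org/csw")

# Search by region of interest and a thematic keyword, not by discipline.
region = BBox([-76.0, 45.0, -75.0, 46.0])            # minx, miny, maxx, maxy (assumed axis order)
keyword = PropertyIsLike("csw:AnyText", "%temperature%")

# A nested list combines the two constraints with a logical AND.
csw.getrecords2(constraints=[[region, keyword]], maxrecords=20)

for record_id, record in csw.records.items():
    print(record_id, "-", record.title)
```

The ornithologist's temperature readings in the example above would surface in exactly this kind of query, regardless of which journal, if any, they were published in.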

In the bioscience world, a similar vision has been put forward and steps toward it taken by the Science Commons, founded by Lawrence Lessig. The Science Commons champions “Open Access” (OA), and OA was a key enabler of the Human Genome Project.

Geospatial academics worldwide should also note the significance for the research community of the appointments made by the recently installed Obama administration in the US: Harold Varmus, co-founder of the Public Library of Science and former director of the US NIH, and Eric Lander, a lead researcher in the Human Genome Project and founding director of the Broad Institute (a joint MIT and Harvard institute that addresses the effectiveness of “a new, collaborative model of science focused on transforming medicine”), have been named co-chairs of the President’s Council of Advisors on Science and Technology. Varmus is one of the most high-profile advocates of Open Access and of the role of government in providing open access, and both the Human Genome Project and the Broad Institute are practitioners of open data. In this context, is it not then obvious and provocative to consider the potential importance to geospatial information science of recognizing GEOSS (the Global Earth Observation System of Systems), within the US federal government as well as the world scientific community, as an initiative that is similar to and as important as the Human Genome Project?

4. Academia can accelerate and guide geospatial technology evolution.

The authors support a movement toward Open Access in all the sciences that produce and use geospatial data. But in addition to Open Access, the authors seek a concerted global effort to study and evolve interoperability to make geospatial data and services a more important part of the rapidly evolving ICT environment, which is to say, to build a better connection between the real world and the digital world.

Geospatial educators should track and contribute to this progress in geospatial research as it unfolds, because it is essential to their future success. Students’ exposure to new geospatial technologies will both prepare them for a wide variety of professional paths that involve geospatial information and inspire educators to evolve new branches of geospatial research and study.

A certain inertia inevitably exists among busy academics who use and teach GIS when it comes to moving beyond familiar tools to embrace Web-based geoprocessing, yet the traditional curricular emphasis will have to shift as young people steeped in Web 2.0 environments become impatient with less powerful and less diverse tools and knowledge techniques. But even with Web-aware students, there is a disconnect that we need to recognize and accommodate.

Whether the technology is automobile engines, cosmetics or photogrammetry, most users have little interest in how their technologies were developed. That’s understandable and probably healthy, as our memories, school years and attention are limited. Civilization evolves through specialization and interdependence. The implication for geospatial academia and the commercial geo-technology community is that we have reached the point where the objectives of the two pursuits have diverged, defining a basic academic distinction between the formal study of geography as a cultural discipline and the study of geospatial information technology as a science. We believe that it is of the utmost importance to appreciate this distinction in order to more precisely and productively define these fields in both research and education.

Geospatial software vendors have done an excellent job of bringing the technologies and the standards framework to their present level, but the authors believe that universities have both an opportunity and a responsibility to build on what GIS departments and industry have provided. Geographers will provide important input and vendors will ultimately deliver the new capabilities, but universities and science funding agencies and foundations need to recognize that the defined curricula of geographers and the business models of their software vendors do not motivate the progress that needs to occur. Qualified academic leaders need to put forward thoughtful, aggressive proposals for advancing geospatial information technology as it sits within the new location-aware pervasive computing digital economy. Proposed projects and programs can easily be positioned as delivering job creation, business creation, and national competitive advantage. It should also be argued, of course, that these programs will arm researchers and agencies with powerful means to address societal problems and leverage for society the public dollars spent on creating data that, through Open Access, become a shared resource. These are not empty words. They frame our responsibility.

The authors believe that new research centers must be created and that existing mathematics, computer science, geography and human factors departments must create new subspecialties or new programs to explore and invent new information systems focused on terrestrial features and phenomena, space, and time. Computer science per se is too abstract to subsume the work we describe. The new discipline needs to be anchored to geodesy, positioning and location-aware technology, and it needs to co-evolve with the open standards framework for geospatial interoperability. It is a discipline for the study and advancement of processing that is bound to the physical and social environment in which we live our daily lives. The elements of the new “science of geospatial interoperability” are:

* Geodesy and positioning, to establish an accurate empirical reference system for our not-perfectly-spherical Earth. All our understanding of Earth observations and processes depends on being able to measure the Earth with precision and accuracy, and to precisely locate things and their trajectories in space and time. The science of geodesy is more complex and difficult than most people imagine, but this complexity has been mastered in the GIS field. (A minimal coordinate-transformation sketch follows this list.)
* Semantics and ontologies involve the names, meanings, and relationships of entities within a framework of discourse. Geospatial semantics further incorporate measurements and spatial/temporal relationships between objects. Semantics and ontologies are critical to interoperability because our digital data models are representations of many different human systems of symbols and signs, each data model reflecting the perspective of a different discipline, profession, jurisdiction, or individual researcher. Semantic variety contributes to non-interoperability, but Web technologies are increasingly providing tools for automatic translation between data models, which opens the door to data sharing on a very broad scale. These tools and related standards will also provide necessary information about data quality, suitability and ownership. The Geospatial Semantic Web is poised to develop rapidly on the platform of the World Wide Web Consortium’s Semantic Web.
* High performance pervasive computing makes full use of advanced fixed and mobile computing networks, facilitating real time data acquisition, modeling and simulation. The continuing rapid advance of computing technology is timely, because geospatial interoperability is particularly important for the hardest problems – those that deal with the multiple interrelatedness of spatially distributed events and phenomena in an asynchronous and complex world. Hardware and software constraints and the non-interoperability of data models have shaped and limited our perceptions of reality as it appears through the lens of legacy GIS and remote sensing. We need to aggressively expand our modeling of features, phenomena and relationships in time and space as computation and bandwidth constraints diminish, as real-time access to thousands of constantly updated data sources and services becomes practical, and as Web-accessible sensor nets proliferate. The systems envisioned can’t function unless very significant resources reside in the network and are accessible through standard interfaces and encodings. This requires new work on database and caching design, mobile ad-hoc networks, smart placement of high bandwidth connections, and judicious use of XML-based schemas and web services. All of these can be developed effectively in research settings focused on the new science of geospatial interoperability.
* User interface management and workflow – the conceptualization and management of human interaction with the digital world – comprise the fourth element of geospatial interoperability. This is potentially the most powerful driver of innovation in interoperability research, because it is the actual workshop of perception and human imagination as applied to the tools of science. Especially in the case of complex applications, the design of a digital environment within which a human is enabled to perceive and operate on data significantly influences the quality of the human-computer interaction. Through advances in bandwidth and processing speeds, it is possible in an increasing number of cases to model interactively. As in parametric design in CAD, or in process modeling where changes in the flow of events can be observed as a function of interactive tweaking of process parameters, it should be possible to manipulate vast frameworks of proven or hypothetical geospatial relationships with the same eye/hand/mind skill employed in using a pencil. Importantly, it is not industry’s developers but rather academics in fields such as cognition, perception, semantics, media studies and semiotics who will produce deeper understandings of how our relationships to the world are altered by these evolving technologies.
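As a small indication of what the geodesy and positioning element involves in practice, the sketch below uses the open-source pyproj library (our assumption, not something the article prescribes) to transform a WGS84 latitude/longitude into projected UTM coordinates, the kind of rigorous, well-defined reference-system handling on which everything else in the list depends. The coordinates are purely illustrative.

```python
from pyproj import Transformer

# Transform a WGS84 geographic coordinate (EPSG:4326) into UTM zone 30N
# metres (EPSG:32630); a point near Nottingham, UK is used for illustration.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:32630", always_xy=True)

lon, lat = -1.15, 52.95
easting, northing = transformer.transform(lon, lat)
print(f"Easting: {easting:.1f} m, Northing: {northing:.1f} m")
```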

These four elements define a domain of exploration, invention, and cooperation that is essential if we are to develop more efficient modes of observing and modeling our natural environment and our complex interactions with it. These technologies will shape our new modes of thinking about complex geospatial phenomena whilst being molded within an understanding of the ethical constraints and social acceptability that must at all times be considered and responded to within liberal democratic societies.

The new Centre for Geospatial Science (CGS) at the University of Nottingham in the UK and the Geomatics and Cartographic Research Centre at Carleton University in Ottawa, Canada exemplify the new approach that must be taken.

CGS was established in November 2005 with the objective of responding to the new forces shaping geospatial science and interoperability. It was framed around the building of a multi-disciplinary team of geographers, geodesists, computer scientists, mathematicians and human factors specialists. It directs its research at providing an enabling infrastructure and body of geospatial theory for other disciplines to integrate or build on. The development of generic platform components within a standards-based architecture has been a core goal, and this has led naturally to collaborative research programmes across an ever increasing range of application areas. Research objectives address the challenges of spatial data conflation, association and modeling; data analysis and prediction; semantic interoperability; and the communication of results, typically within the context of mobile location-based services and under the constraints of small, mass-consumer devices such as mobile phones. The end users of the CGS research are seen as being both other specialists and the general public who, armed with powerful GPS-enabled mobile phones with high-quality cameras and high-bandwidth data communications, are increasingly as much the providers of location and geo-referenced data as they are the consumers of it.

The Geomatics and Cartographic Research Centre at Carleton University is contributing to the science of geospatial interoperability in a number of ways. The theory and practice of cybercartography and the open source and open standards cybercartographic atlas framework are integrating geospatial information from a wide variety of online sources to produce a number of cybercartographic atlases on a variety of topics of interest to society. A cybercartographic atlas is a metaphor for all kinds of qualitative and quantitative information linked through location. An example is the Cybercartographic Atlas of Antarctica, which integrates information from a number of scientific disciplines in new ways and which draws on databases and sensors in a number of diverse locations to present information to the user in real time in a seamless manner using OGC standards and specifications. The work of the Centre involves a number of disciplines including, among others, cognitive science, human-computer interaction, geography, cartography, computer science, English, music and psychology. Special attention is given to the development of effective user interfaces and to the archiving of geospatial data. Cybercartography is multi-modal and multi-sensory, which poses additional challenges and opportunities for the creation, management and use of geospatial data. The interdisciplinary process, using geospatial data, is creating transdisciplinary knowledge which is giving new insights into some of the pressing problems of society in both national and international contexts. Different narratives on the same issue are presented in ways which engage users, and the distinction between the creators and users of geospatial data is becoming increasingly blurred as users can create their own narratives in a Web 2.0 environment.

5. Reinventing government’s Spatial Data Infrastructure Role

In the late 1980s and early 1990s, leaders in the US community of government users of GIS and remote sensing began developing the idea of the “National Spatial Data Infrastructure,” which is now a fixture in the vocabulary of geospatial technology users and policy-makers around the world. Corollary to the development of the NSDI concept in the US was the creation in 1990 of the interagency Federal Geographic Data Committee (FGDC) and then in 1994 the formation of the Open Geospatial Consortium, Inc. (OGC), a public-private partnership dedicated to the development of consensus IT standards.

The FGDC, like similar committees in other nations (and committees in NATO and the UN) tackled geodata coordination, that is, data and metadata standardization, among offices and agencies. In parallel to these widespread data coordination activities, the OGC developed the open interface and encoding standards needed to enable interoperability among the heterogeneous processing systems marketed by the diverse geospatial product vendors. As the Web became the dominant distributed computing platform, most of the OGC’s work focused on the Web-based interoperability of geospatial services.
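A minimal sketch of the kind of open Web interface the OGC standardized, assuming a hypothetical server: because the GetMap request below is encoded identically no matter which vendor's software answers it, a client can pull map layers from heterogeneous back ends without conversion. The endpoint and layer name are illustrative.

```python
import requests

# Hypothetical OGC Web Map Service endpoint (illustrative only).
WMS_URL = "https://example.org/wms"

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "demo:hydrology",          # assumed layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "52.90,-1.30,53.00,-1.10",   # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "width": 512,
    "height": 512,
    "format": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=30)
response.raise_for_status()
with open("map.png", "wb") as f:
    f.write(response.content)  # the same request works against any conformant server
```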

In 1990, the FGDC, with its symbolic positioning as the focus for US Federal Government geospatial processing, was housed in the US Department of the Interior (DOI). Nineteen years later, after years of technology development and experience with the realities of coordinating the government’s vastly diverse spatial information management requirements across nearly all departmental boundaries, it seems relevant to re-examine the organizational positioning of this vital function. Such a re-examination is particularly urgent in light of geoprocessing’s increased relevance to comprehensive, multidisciplinary government programs, and of equal importance, in recognition of the increasingly integral definition of standards-based “Interoperability Science” as distinct from the compilation of discipline-specific data repositories.

This is not just a US issue. Most countries are poised to generate vast quantities of new geodata. The technologies for producing and using geodata have become so inexpensive and cost-effective that the budgetary constraints that once faced government agencies and companies are increasingly not an issue. Perhaps of even greater significance, however, is the fact that the generators of the data will increasingly be mass consumers themselves. Through the use of what will be ubiquitous GPS-enabled mobile phones (supplemented with other positioning technology for indoor as well as outdoor coverage), the public will increasingly provide dense networks of location-tagged data and imagery as a by-product of their natural mobility and consumption of location-based services. The issue is particularly urgent for nations that are embarking on major SDI projects based on institutional data capture models. Throughout the lifecycles of these major capital infrastructure programmes, new needs and new suppliers of data will emerge that are probably impossible to predict more than a year or two ahead and that will demand a level of infrastructural flexibility and interoperability that has not previously been considered.

Putting geospatial resources on the Web and making those resources as discoverable as possible and as useful as possible for as many purposes as possible is what defines an SDI. Today, as in 1990, literally billions of dollars are spent each year in the US on redundant data collection. Billions more are wasted because needed data is not available, or at least, not discoverable. When sewer departments, utilities, bus companies, foresters, registries of deeds, conservation commissions, architects, and other non-academic as well as academic data producers publish their spatial data online using standards-based methods and appropriate access methods, their data becomes a highly leveraged public asset.

Even governments that do not face the prospect of having to pay off trillions of dollars in deficits are well advised, in the light of the new Web-based information infrastructure, to review and revise policies, procedures, and laws to make their NSDIs as valuable as possible.

A review in any nation involves looking at where in the government bureaucracy NSDI governance belongs. In the US, the USGS and other Department of Interior agencies were early users of geospatial technologies, along with the Department of Defense, but the technology is now used almost universally, and it has grown beyond those departments’ domains and purviews. It is now instructive to observe that the obstacles that may in fact prevent maturation of the NSDI seem to be characterized less by the traditional concerns of geospatial practitioners in such agencies than by the realities and regulation of both national and global commercial practice. Developers of data in the public and private sectors resist publishing and sharing their data for many good reasons as well as some bad reasons. Governments need to focus on the wide range of legal and commercial issues associated with geospatial technology and the collection, distribution and use of geospatial data. These issues include liability, privacy, national security and intellectual property rights (IPR) in spatial data, and they are complicated by the issue of data provenance, that is, an accounting of the source, history, integrity, currency and reliability of component data sets that comprise a composite data layer. The number of government, consumer and business applications for spatial technology is rapidly growing, and the issues involved have much more to do with money and law than natural resources.

The geospatial technology industry, in cooperation with its government and private sector customers, as well as its academic partners, needs to provide special technologies for IPR, security and access control if NSDIs are to mature. Policies need to account for different information communities’ different conceptual abstractions, classification schemes, data models, processing approaches and recording methods, as well as for the much more anarchic, unconstrained and loosely quality-controlled sources of data arising from crowd-sourcing and volunteered geographic information developments. The issues are complicated and they apply to law enforcement, civil protection and emergency response as much as they apply to hydrology and agriculture or the consumption of location services by the man in the street. In government, as in the Earth sciences, all things are connected, partly because jurisdictions in different functional areas overlap in their geographic areas. Geospatial information and technology issues thus deserve attention at a higher level of government than in the past.

This is particularly important if federal governments plan on spending money on base layer data development as part of their current efforts to stimulate the economy. An overriding goal of public policy is to empower, not burden, future generations. We thus want to avoid investing in outmoded centralized and proprietary data warehouses whose structure, function and cost are at odds with the open, distributed architecture model that the whole ICT industry is embracing.

Also, it is very important to consider that there are some elements of Spatial Law that are not well defined by any national establishment, and it would be of benefit to all to begin addressing these shortcomings through well-informed legislation and international negotiation rather than through the courts or through international crisis management. The point is that governments should be thinking strategically about how geospatial technology should be positioned, through policy and law, to contribute as fully as possible to social welfare and international cooperation, and should not allow themselves to be trapped in outmoded institutional approaches that are no longer capable of providing even the illusion of practical governance.

6. Conclusion

Just as the significance of the Web could not be widely appreciated until the necessary Web standards had been in place for a few years, we believe that all the domains of geospatial technology and application are about to experience a remarkable transformation due to global adoption of open standard geospatial Web service interfaces and encodings. The rich “network effects” made possible by chained Web services, GRID computing, sensor webs, geospatial semantics, and online catalogs for data, services and schemas hold great promise, but there is no guarantee that this promise will be fulfilled. The question is, can we find the institutional will – in academia and government – to make changes that enable societies around the world to make the most of these new tools?
