
Solar Energy. SolarSAT: solar radiation analysis, dimensioning, design, cost estimation, and control and management of a photovoltaic plant, all on a single website

SolarSAT is a new, fully customizable integrated system for companies that design, install and sell photovoltaic plants; companies and bodies in charge of their maintenance; inverter manufacturers; manufacturers and distributors of photovoltaic modules; and agencies and bodies that promote renewable energy systems.

By accessing both archived and real-time satellite data, the SolarSAT system offers two types of functionality that can be distributed either as independent or as combined services: PVPlanner and PVController.

SolarSAT PVPlanner

SolarSAT PVPlanner is a web service for evaluating the feasibility and economic attractiveness of a new plant, providing a cost/profit estimate for it. For a potential customer considering a photovoltaic plant, PVPlanner first provides the expected productivity, taking into account the typical solar energy available at the chosen location. This evaluation is made possible by historical satellite solar irradiation data covering the last 10 to 20 years.

With PVPlanner the potential customer, or the designer, can dimension and design the plant according to the required energy needs or the desired nominal power, combining in the best way the various components (modules, inverters, …) chosen from those made available by the company supplying the service. PVPlanner then takes into account the features and prices of the selected components, the fiscal parameters imposed by current law, the maintenance costs, and the incentives and savings linked to the producible energy. Finally, PVPlanner provides a series of economic parameters for evaluating the attractiveness of the investment (e.g. Net Present Value, Internal Rate of Return, breakeven point).
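The metrics mentioned above can be illustrated with a short sketch. All figures below are hypothetical examples, not SolarSAT data or algorithms:

```python
# Illustrative sketch of the kind of investment metrics PVPlanner reports.
# The cash flows and discount rate are made-up example values.

def npv(rate, cashflows):
    """Net Present Value of cashflows, where cashflows[0] is year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal Rate of Return via bisection (assumes one sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid       # NPV still positive: the root lies above mid
        else:
            hi = mid
    return (lo + hi) / 2

def breakeven_year(cashflows):
    """First year in which the cumulative (undiscounted) cashflow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None

# Example: 20,000 EUR installation, 2,500 EUR/year net savings for 20 years
flows = [-20_000] + [2_500] * 20
print(npv(0.05, flows))        # NPV at a 5% discount rate
print(irr(flows))              # internal rate of return
print(breakeven_year(flows))   # simple breakeven year
```

For this example the plant pays for itself in year 8 and yields an IRR of roughly 11%, which is the sort of summary a customer would weigh against alternative investments.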

SolarSAT PVController

SolarSAT PVController is an integrated hardware/software system that enables web-based remote control and management of photovoltaic plants of any size.

PVController monitors plant efficiency by comparing the energy produced each hour with the expected value. The latter is calculated from the effective solar irradiance, derived either from satellite data acquired in real time or from local measurements by dedicated sensors. If there is a significant difference between the produced and producible energies, the people in charge of plant maintenance are warned by e-mail or SMS and can then decide whether to intervene.
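The comparison logic described above can be sketched as follows. The 15% tolerance, the module parameters and the function names are illustrative assumptions, not documented SolarSAT behaviour:

```python
# Hedged sketch of a produced-vs-expected energy check of the kind
# PVController performs. Thresholds and parameters are assumptions.

def expected_energy_kwh(irradiance_kwh_m2, area_m2, efficiency):
    """Producible energy from effective irradiance (satellite or local sensor)."""
    return irradiance_kwh_m2 * area_m2 * efficiency

def check_plant(produced_kwh, irradiance_kwh_m2, area_m2, efficiency,
                tolerance=0.15):
    """Return an alert message if production falls short of expectation."""
    expected = expected_energy_kwh(irradiance_kwh_m2, area_m2, efficiency)
    if expected > 0 and (expected - produced_kwh) / expected > tolerance:
        return f"ALERT: produced {produced_kwh:.1f} kWh, expected {expected:.1f} kWh"
    return None  # within tolerance: no notification

# One hour: 0.8 kWh/m2 of irradiance, 50 m2 of modules at 15% efficiency
# gives ~6 kWh expected; producing only 4 kWh triggers the alert.
print(check_plant(produced_kwh=4.0, irradiance_kwh_m2=0.8,
                  area_m2=50, efficiency=0.15))
```

In a real deployment the alert string would be routed to the maintenance staff by e-mail or SMS, as the service description states.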

The PVController service is paired with a family of SolarSAT-series dataloggers and sensors. These can be adapted to plants of any kind and size and interfaced to any inverter. With PVController, a whole PV park can be managed through a single, simple and effective web interface.

Installing the dataloggers is simple and usually requires no configuration by the operator, since configuration is performed remotely via web/GPRS. Moreover, thanks to the GPRS data link, no fixed phone line is needed at the plant. The various datalogger models support both satellite monitoring and monitoring by local sensors that measure the plant's physical parameters.

Capitalizing on its background in environmental monitoring and in manufacturing spectroradiometers for solar radiation measurement, Flyby has selected components that are at once reliable, high-performing and easy to install.

The SolarSAT integrated system is available in several configurations and is fully customizable for any company or body that wishes to distribute it to its customers.


SOURCE FlyBy

On the nights of 2, 16, 17 and 18 March 2009, EUROSENSE carried out four aerial flights over the city of Antwerp and 20 surrounding municipalities to create a thermographic map. Covering an area of approximately 700 km², it is one of the largest thermographic projects in the context of building energy efficiency ever carried out in Europe.

The thermographic map, expected to be ready in early summer 2009, will give the inhabitants of Antwerp and the surrounding municipalities an indication of the state of their roof insulation. Using an easy-to-follow interpretation key and a corresponding legend, inhabitants can check for themselves whether their building's roof insulation has potential problems. Based on this information, and possibly supported by subsidies from the city or municipality, concrete actions can be taken to improve the insulation. This makes the map an excellent means of supporting local and national energy-efficiency policies.

The local, regional and national press (newspapers and television) reported actively on the project. The involvement of hundreds of volunteers (through questionnaires and night-time temperature measurements) was also highlighted extensively.

EUROSENSE has been very active in different types of thermographic projects for several decades. In the context of building energy efficiency, it is currently involved in several operational projects for European cities and regions (e.g. Brussels, Antwerp and Genk in Belgium, and three areas in France). Furthermore, EUROSENSE is continuously enhancing the service to meet specific customer needs and to exploit future sensors, within the ESA-funded DUE (Data User Element) project “Urban Heat Island and Urban Thermography”.

For more information on these services, please use the following contact information:
EUROSENSE Belfotop N.V., Belgium
Address: Nerviërslaan 54, B-1780 Wemmel, BELGIUM
Phone: +32 (0)2 460 70 00
Fax: +32 (0)2 460 49 58
Website: http://www.eurosense.com

Via the following link, you can find a video report from Flemish Television (VRT) on the ground measurements: http://www.deredactie.be/cm/de.redactie/mediatheek/1.490419?mode=popupplayer

Source EUROSENSE Belfotop

(April) By Prof. Mike Jackson, David Schell and Prof. D.R. Fraser Taylor. DirectionsMag

1. Introduction

Over the last 15 years, geospatial technologies have evolved and dramatically converged with a variety of formal information technology disciplines. The discrete, even esoteric pursuits we referred to as GIS, Earth imaging, GPS, AM/FM, location-based services and navigation systems are no longer discrete. Now they are of a piece: they “talk to one another” and interact freely in a fertile communications environment of wireless broadband, portable cell phone/computers, sensor webs and, of course, the dynamically evolving environment of the World Wide Web. In fact, geodata is rapidly becoming a conventional and pervasively familiar data type, seen at once to underpin and significantly recharacterize the digital world, with broad implications for both technology and society.

The inescapable fact is that the “geospatial technology revolution” will continue at a fast pace which will require accommodation across multiple domains of research, education and government. The generally accepted idea that scientific advances create disruptive technologies which in turn cause “creative destruction” of commercial and industrial enterprises applies as well across a broad spectrum of institutional entities representing diverse cultural interests. Competent leaders understand that organizations must adapt continually to such change because it is unavoidable. Routinely, they learn from the change they observe occurring in the world and make it their business to reorganize and redirect resources and people to respond creatively, responsibly and aggressively to the most critical of the observable resulting challenges. At the present time, a similarly focused approach to leadership must be emphasized in both academia and government, just as it is now being addressed in industry.

Never in the history of public and private sector geospatial processing has there been a more urgent need for leadership that understands change. The most obvious challenge is that expansion of human population and industry has brought humanity to a point of converging crises, and diverse industry stakeholders see geospatial technology in this context as a critical factor in enabling humanity to avoid disaster.

Another obvious challenge is that the market forces that drive the evolution of technology do not meaningfully, or in any disciplined way, take into full account the needs of science and social processes. Leaders in research, education and government, therefore, share responsibility for understanding industrial development and the state of commercial applications if they wish to define more useful outcomes than those which would otherwise occur by default – outcomes which would in fact be inevitable should leading thinkers and policy planners persist in defending traditional institutional and economic intellectual practices instead of opening themselves to the evolved research and policy requirements of the increasingly complex modern world.

2. Deconstructing and reconstructing geospatial academia

Because our societal problems are so critical, we must build the best possible foundation for advancing the capabilities and wide use of geospatial technologies. Recent dramatic progress in the development of geospatial interoperability enables us to model increasingly complex aspects of our environment and social processes, and this significantly amplifies the power of geographic thinking. To make the most of this progress, it is timely to take a fresh and objective look at the positioning of geospatial issues in both academic and government contexts, and to evaluate critically whether present organizational and departmental structures adequately serve our actual critical needs.

It would be a useful academic exercise to write a history of academic GIS, putting it in the context of the evolution of academic geography and in the larger context of the evolution of technology, academia and science itself, but we don’t in fact need to go into such detail to understand our choices.

To begin with the obvious, let us first note that it has become conventional to locate the teaching of GIS and Earth imaging in GIS, geography, geomatics engineering or natural resource departments, and in so doing to focus curricular approaches initially on the acquisition of expertise in operating on spatial data largely by means of desktop GIS applications. Students learn about scanning and digitizing and converting digital spatial data between formats, systems and software. There are many other elements in the curriculum, of course, and hopefully as students broaden their exposure to geographic computing they learn to think in terms of the fundamental and pervasive relevance of spatial information.

This approach has worked well and has been successful to the extent that it has been driven by techniques able to address the expanding, but carefully limited, scope of issues requiring the use of diverse forms of location data and the integration of spatial information into mainstream information and communication technology processes. However, the authors believe it is increasingly important for university decision makers to appreciate the impact on geospatial research and education of interoperability standards organizations such as the Open Geospatial Consortium and ISO’s TC211 committee on “Geographic Information and Geomatics,” as well as the various platform standards organizations that have been instrumental in creating the enabling environment for information interoperability in the World Wide Web.

A significant consequence of this “geospatial interoperability” in the context of the Web is that increasingly the geodata and geoprocessing resources one requires to address a variety of information integration challenges need not be imported and stored permanently on one’s desktop, but may reside on a remote computer across the Web and yet be accessed as though attached to the local system. Service oriented architectures and Web-based geoprocessing means that the work of researchers and students increasingly need not involve the often inefficient import of whole “files” of data from which the limited subset of interest must be manually extracted. Their searches for data are now potentially more like searches using an Earth browser, in which one’s query specifies a region of interest and a remote server returns an “answer”. Using free Web services and having no GIS education, millions of people in fact are now able to perform operations that were very recently known and used only in the closed community of GIS, and this progress ushers in a new age of information integration and the harmonization of the wide variety of the world’s diverse collections of geospatial information.
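The "query a region of interest, get an answer back" pattern described above is exactly what OGC web-service interfaces standardize. As a minimal sketch, the following builds a WFS GetFeature request for features inside a bounding box; the server URL and layer name are placeholders, not a real endpoint:

```python
# Minimal sketch of a region-of-interest query against an OGC Web Feature
# Service (WFS). Server and layer names below are hypothetical.
from urllib.parse import urlencode

def wfs_getfeature_url(base_url, type_name, bbox, version="1.1.0"):
    """Build a WFS GetFeature request URL for features inside a bounding box."""
    params = {
        "service": "WFS",
        "version": version,
        "request": "GetFeature",
        "typeName": type_name,
        "bbox": ",".join(str(c) for c in bbox),  # minx,miny,maxx,maxy
    }
    return f"{base_url}?{urlencode(params)}"

# Query river features over a small area (placeholder server and layer)
url = wfs_getfeature_url("https://example.org/wfs", "hydro:rivers",
                         (4.2, 50.7, 4.5, 50.9))
print(url)
```

Fetching that URL from a compliant server would return only the features intersecting the region of interest, rather than a whole file from which the subset must be extracted by hand.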

We will see, however, that such miracles of geospatial interoperability cannot be adequately exploited without our addressing the conventional restrictions imposed by our scientific disciplines themselves. Unfortunately, work with geospatial information is constrained by traditional thinking and the constraints are reinforced by the architecture of GIS technologies designed within the limitations of prevailing conceptual strategies. As a consequence, through the entire history of geospatial computing, “received wisdom” focused on geospatial software as “packages,” or proprietary applications, and geospatial data was universally viewed in the traditional file structure dictated by its earliest incarnation in the age of half-inch magnetic tape. Most unfortunately, data collection was conceived in terms of discrete projects yielding data for specific uses.

With such a heritage, one might, with justification, ask how the habits of “thinking in a box” which have characterized the last three decades of academic and government usage, with our legacy of dated technologies, could have been otherwise. It is, of course, because of today’s network-based information technology paradigm that we are able to examine alternatives and, in particular, to conceptualize ways that our powers of thought might expand. Interoperability enables us to begin thinking about data as a vast Web resource whose components are easily discoverable, assessable, accessible and combinable in analysis despite their diverse file formats, coordinate reference systems, originally intended uses, heterogeneous data models, and naming schema. The million springs of this resource will evolve independently in response to real-world conditions and constantly changing local and thematic information requirements. But they will be there, constantly accumulating, for us and our automated tools to explore and mine.

Understanding the Earth as a system requires that our scientific information systems be conceptualized in such a way that they are capable of interfacing with one another and ultimately able to function as a single unified system, one that is able to support more effective interdisciplinary collaboration. Such synergy – the true integration of diverse research data and the building of one study upon another – has not usually been possible in the modern IT context, much less realized, because of non-interoperability. Data are usually collected, used once, and forgotten – not catalogued and actively maintained online for discovery and ready access by others. Studies are published, but it is far less common to publish the data associated with them, even though this data represents an additive resource that could be used with other data in more comprehensive studies, multidisciplinary and longitudinal research projects, or decision-support efforts.

Most geospatial data never even enter a GIS. Sensors and field studies, for example, may employ location parameters, but data collected by sensors or in the context of field studies are seldom published online with machine-readable schemas documenting other parameters and data associated with the location parameters. As envisioned above, if both parameters and data were published in this way, Web-based systems could find and aggregate such data into rich and informative data layers in a GIS.

Open, internationally adopted XML-based standards now make it possible for geodata, geoprocessing operations, sensors, data reduction methods, modeling services, transducer parameters and sensor data repositories to be made publishable, discoverable, accessible and useable in the environment of the Web. The range of such resources increases as the standards effort continues to gain participants from new technology and application domains. Geospatial interoperability standards have in fact removed barriers imposed by “last generation” technical incompatibilities between diverse types of systems, such as GIS and Earth imaging, diverse vendor-proprietary systems, and diverse custom projects, and in so doing, standards have stimulated researchers to imagine innovative spatial relationships that broaden the scope of their research.

Not surprisingly, science, like any other human pursuit, clings conservatively to its formalities, and yet science by no means remains static. One of the most important causes of change in the structure of scientific thought occurs because of intellectual paradigm shifts which are both the cause and effect of technological evolution – evolution that is driven by industrial necessity as well as by restless human inventiveness. When one considers glass lenses, calculus, chronometers, X-rays, semiconductors, raster imaging devices, GIS, etc., one sees that the history of science is closely tied to the history of new tools for knowledge. The new tools provide new ways of gathering knowledge and new ways of looking.

Looking back, we tend to see the coordinated genius and logic of scientific progress, whereas in the present, we are more likely to see diversity and controversy which sometimes results in resistance to the transformative use of new technologies in science. This is partly because the old ways of gathering knowledge provide not only a familiar way of looking at reality but also a comfortable foundation for traditional funding arrangements and institutional stability. Nor should it be overlooked that professional standing is important to scientists, causing many to wish their research to be seen in conventional terms, as a contribution, not an affront to established institutions and practices.

In this context, academics concerned with geospatial issues face adoption of upsetting new technological advances represented by the emergence of a new generation of standards in the geospatial technology domain – a direct result of the recent ubiquitous developments in the fundamental information and communication technologies that drive advances in distributed computing and service architectures. Researchers are challenged to appreciate the fact that interoperability standards are positioned to bring important benefits to the formal sciences that advance tools such as GIS, remote sensing and modeling; the natural sciences that use these tools to understand geospatial features and phenomena; and the non-scientific communities that use geospatial technologies.

The theory most relevant to the Web that underpins the argument of this paper is Metcalfe’s Law. This basic tenet of information science states that the value of a node on a communications network is proportional (by some exponent) to the number of potential users of that node in the network, or the total number of nodes. George Gilder theorized “the square of the number of connected users,” but the actual exponent depends on many factors. This applies, for example, to the network of telephones, the network of Web servers, and the network of fax machines. It applies more generally, outside of technology, to, for example, the networks of people who can converse about genetics or who have money to spend.
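The square-law intuition behind Gilder's formulation is simple to demonstrate: the number of distinct pairwise connections among n nodes grows as n(n-1)/2, far faster than n itself. A toy illustration:

```python
# Toy illustration of the n**2 intuition behind Metcalfe's Law:
# pairwise connections among n networked nodes grow quadratically.

def pairwise_connections(n):
    """Distinct links possible among n nodes: n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(n, pairwise_connections(n))
# Doubling the nodes roughly quadruples the possible connections.
```

The same arithmetic applies to networks of interoperable geospatial data sets and services: each newly published, discoverable resource can potentially combine with every existing one.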

Information networks depend on nodes that connect through common interfaces and encodings, like people speaking a common language. Interoperability has been documented as a growing concern of government IT managers, and in the same vein, the huge COMDEX computer products tradeshow has been replaced by a show called INTEROP. This is because we are involved in a communications revolution on top of a computing revolution. A key social innovation of our era is the establishment of consensus standards organizations that manage the processes for agreeing on such frameworks, interfaces and encodings, which are usually released as free and open specifications for use by any developer. Interoperability, therefore, is the key to a connected world.

Academia and government, as we shall see, need to be concerned with how Metcalfe’s Law applies to the network of online scientific or municipal geospatial data sets, sensors, Web services and data schemas that are accompanied by standard metadata and that can be discovered and accessed through Web catalogs and servers that implement open standard interfaces and encodings. Academia, in particular, also needs to be concerned with the connections between devices and technologies and the build-out of useful permutations among them to meet the dramatically increasing need for integrating geospatial information resources into the research techniques of a variety of traditional scientific disciplines.

3. Open Access to Geospatial Data

Academics and those who fund their research should be acutely interested in the proposition that geospatial data developed for scientific purposes can be, in a Web environment, a resource whose value increases with the number of researchers who use it. Geography has always been interdisciplinary and GIS has always been a tool for combining data from different sources. All geodata refers to some aspect of the same Earth. If researchers properly document, archive and publish their data and methodologies using available Web technologies, standards and best practices, many benefits accrue:

a) Improved opportunities for cross-disciplinary and longitudinal studies. This is key. Geography is inherently cross-disciplinary, and a main underlying theme in modern science is the relatedness of phenomena. We need interdisciplinary and longitudinal studies to help us address critical problems such as resource depletion, climate change, population, pollution, disaster management and adequate provision of food, water, shelter and energy.
b) Improved verifiability of results. Science requires that experiments be replicable, and therefore experimental data must be available. In the context of the standards regime described above, details of methods and semantics become more accessible for review. In this new environment, GIS and remote sensing studies will be less vulnerable to the accusation that pretty maps are being used to cover up poor experimental design or biased reasoning. Climate science made more verifiable will be harder to discredit or ignore.
c) Improved Web-based data publish/search capabilities. These make much more data available and enable more efficient assessment of data. In most cases, literature searches will be a much less efficient way to discover data than direct data searches using online catalogs, because researchers looking for previously collected data often do not know which bodies of literature to search. For example, data collected by an ornithologist may include temperature readings that would be valuable to a hydrologist.
d) Improved ability to re-use or repurpose data for new investigations, reducing redundant data collection, increasing the value of data and creating opportunities for value-added data enhancement. The value of data increases when the data can be re-used or when they can be collected with the intention of serving multiple research purposes. This benefits the public and those who fund scientific research, and brings a greater return in terms of the general application and use of science. It benefits data owners, who may be able to charge for use of the data they generate or who may add value to it to provide more useful and saleable products.
e) Improved opportunities to collaboratively plan data collection/publishing efforts to serve multiple defined and undefined uses.
f) Improved rigor and transparency regarding data collection methods, processing methods and data semantics. The SensorML standard makes the processing chain transparent. Generally agreed upon conventions for describing data and methods (such as data reduction) contribute to clarity and rigor.
g) Improved ability to discover spatial relationships. Researchers will surely browse data and become curious about patterns they notice. Robust data descriptions and quick access to the data will enable more rapid exploration of hypothetical relationships.
h) Improved ability to characterize, in a standardized human-readable and machine-readable way, the parameters of sensors, sensor systems and sensor-integrated processing chains (including human interventions). This enables useful unification of many kinds of observations, including those that yield a term rather than a number.
i) Improved ability to “fuse” in-situ measurements with data from scanning sensors. This bridges a historical divide between research communities that have focused mainly on either unmediated raw spatial-temporal data or spatial-temporal data that is the result of a complex processing chain.
j) Improved ability to “chain” Web services for data reduction and analysis, and improved ability to introduce data into computer models that use multiple inputs from remote data stores or real-time data feeds.
k) Improved ability to encode sensor data in the ISO Feature Model (ISO 19109). This is just one of the ways in which the OGC Sensor Web Enablement (SWE) standards will enable scientists to leverage off-the-shelf, standards-based tools for data modeling and management, just as they currently use commercial spreadsheet and database programs.
l) Improved societal and institutional return on investment of research dollars, and improved ability of research funding institutions to do due diligence and policy development.
m) More efficient scientific debate and accelerated pace of scientific discovery, as automation and new institutional arrangements reduce the amount of time spent on data management, freeing researchers’ time for more creative work and more communication with other scientists.

In the bioscience world, a similar vision has been put forward, and steps taken toward it, by Science Commons, founded by Lawrence Lessig. Science Commons champions “Open Access” (OA), and OA was a key enabler of the Human Genome Project.

Geospatial academics worldwide should also note the significance for the research community of the recently installed Obama administration in the US, which has appointed as co-chairs of the President’s Council of Advisors on Science and Technology Harold Varmus, co-founder of the Public Library of Science and former director of the US NIH, and Eric Lander, a lead researcher in the Human Genome Project and founding director of the Broad Institute (a joint MIT-Harvard institute that addresses the effectiveness of “a new, collaborative model of science focused on transforming medicine”). Varmus is one of the most high-profile advocates of Open Access and of the role of government in providing it, and both the Human Genome Project and the Broad Institute are practitioners of open data. In this context, is it not both obvious and provocative to consider the potential importance to geospatial information science of recognizing GEOSS (the Global Earth Observation System of Systems), within the US federal government as well as the world scientific community, as an initiative similar to and as important as the Human Genome Project?

4. Academia can accelerate and guide geospatial technology evolution.

The authors support a movement toward Open Access in all the sciences that produce and use geospatial data. But in addition to Open Access, the authors seek a concerted global effort to study and evolve interoperability to make geospatial data and services a more important part of the rapidly evolving ICT environment, which is to say, to build a better connection between the real world and the digital world.

Geospatial educators should track and contribute to this progress in geospatial research as it unfolds, because it is essential to their future success. Students’ exposure to new geospatial technologies will both prepare them for a wide variety of professional paths that involve geospatial information and inspire educators to evolve new branches of geospatial research and study.

It is inevitable that a certain inertia exists among busy academics who use and teach GIS in moving beyond familiar tools to embrace Web-based geoprocessing, yet the traditional curricular emphasis will have to shift as young people steeped in the Web 2.0 environments become impatient with less powerful and diverse tools and knowledge techniques. But even with Web-aware students, there is a disconnect that we need to recognize and accommodate.

Whether the technology is automobile engines, cosmetics or photogrammetry, most users have little interest in how their technologies were developed. That’s understandable and probably healthy, as our memories, school years and attention are limited. Civilization evolves through specialization and interdependence. The implication for geospatial academia and the commercial geo-technology community is that we have reached the point where the objectives of the two pursuits have generically diverged, defining a basic academic distinction between the formal study of geography as a cultural discipline and the scientific study of geospatial information technology as a science. We believe that it is of the utmost importance to appreciate this distinction in order to more precisely and productively define these fields in both research and education.

Geospatial software vendors have done an excellent job of bringing the technologies and the standards framework to their present level, but the authors believe that universities have both an opportunity and a responsibility to build on what GIS departments and industry have provided. Geographers will provide important input and vendors will ultimately deliver the new capabilities, but universities and science funding agencies and foundations need to recognize that the defined curricula of geographers and the business models of their software vendors do not motivate the progress that needs to occur. Qualified academic leaders need to put forward thoughtful, aggressive proposals for advancing geospatial information technology as it sits within the new location-aware pervasive computing digital economy. Proposed projects and programs can easily be positioned as delivering job creation, business creation, and national competitive advantage. It should also be argued, of course, that these programs will arm researchers and agencies with powerful means to address societal problems and leverage for society the public dollars spent on creating data that, through Open Access, become a shared resource. These are not empty words. They frame our responsibility.

The authors believe that new research centers must be created and existing mathematics, computer science, geography and human factor departments must create new subspecialties or new programs to explore and invent new information systems focused on terrestrial features and phenomena, space, and time. Computer science per se is too abstract to subsume the work we describe. The new discipline needs to be anchored to geodesy, positioning and location-aware technology, and it needs to co-evolve with the open standards framework for geospatial interoperability. The new discipline is a discipline for the study and advancement of processing that is bound to the physical and social environment that we live our daily lives in. The elements of the new “science of geospatial interoperability” are:

* Geodesy and positioning, to establish an accurate empirical reference system for our not-perfectly-spherical Earth. All our understanding of earth observations and processes depends on being able to measure the earth with precision and accuracy, and to precisely locate things and their trajectories in space and time. The science of geodesy is more complex and difficult than most people imagine, but this complexity has been mastered in the GIS field.
* Semantics and ontologies involve the names, meanings, and relationships of entities within a framework of discourse. Geospatial semantics further incorporate measurements and spatial/temporal relationships between objects. Semantics and ontologies are critical to interoperability because our digital data models are representations of many different human systems of symbols and signs, each data model reflecting the perspective of a different discipline, profession, jurisdiction, or individual researcher. Semantic variety contributes to non-interoperability, but Web technologies are increasingly providing tools for automatic translation between data models, which opens the door to data sharing on a very broad scale. These tools and related standards will also provide necessary information about data quality, suitability and ownership. The Geospatial Semantic Web is poised to develop rapidly on the platform of the World Wide Web Consortium’s Semantic Web.
* High performance pervasive computing makes full use of advanced fixed and mobile computing networks, facilitating real time data acquisition, modeling and simulation. The continuing rapid advance of computing technology is timely, because geospatial interoperability is particularly important for the hardest problems – those that deal with the multiple interrelatedness of spatially distributed events and phenomena in an asynchronous and complex world. Hardware and software constraints and the non-interoperability of data models have shaped and limited our perceptions of reality as it appears through the lens of legacy GIS and remote sensing. We need to aggressively expand our modeling of features, phenomena and relationships in time and space as computation and bandwidth constraints diminish, as real-time access to thousands of constantly updated data sources and services becomes practical, and as Web-accessible sensor nets proliferate. The systems envisioned can’t function unless very significant resources reside in the network and are accessible through standard interfaces and encodings. This requires new work on database and caching design, mobile ad-hoc networks, smart placement of high bandwidth connections, and judicious use of XML-based schemas and web services. All of these can be developed effectively in research settings focused on the new science of geospatial interoperability.
* User interface management and workflow – the conceptualization and management of human interaction with the digital world – comprise the fourth element of Geospatial Interoperability. This area is potentially the most powerful driver of innovation in the area of interoperability research, because it is the actual workshop of perception and human imagination as applied to the tools of science. Especially in the case of complex applications, the design of a digital environment within which a human is enabled to perceive and operate on data significantly influences the quality of the human-computer interaction. Through advances in bandwidth and processing speeds, it is possible in an increasing number of cases to model interactively. As in parametric design in CAD, or in process modeling where changes in the flow of events can be observed as a function of interactive tweaking of process parameters, it should be possible to manipulate vast frameworks of proven or hypothetical geospatial relationships with the same eye/hand/mind skill employed in using a pencil. Importantly, it is not industry’s developers but rather academics in fields such as cognition, perception, semantics, media studies and semiotics who will produce deeper understandings of how our relationships to the world are altered by these evolving technologies.

These four elements define a domain of exploration, invention, and cooperation that is essential if we are to develop more efficient modes of observing and modeling our natural environment and our complex interactions with it. These technologies will shape our new modes of thinking about complex geospatial phenomena whilst being molded within an understanding of the ethical constraints and social acceptability that must at all times be considered and responded to within liberal democratic societies.

The new Centre for Geospatial Science (CGS) at the University of Nottingham in the UK and the Geomatics and Cartographic Research Centre at Carleton University in Ottawa, Canada exemplify the new approach that must be taken.

CGS was established in November 2005 with the objective of responding to the new forces shaping geospatial science and interoperability. It was framed around the building of a multi-disciplinary team of geographers, geodesists, computer scientists, mathematicians and human factors specialists. It directs its research at providing an enabling infrastructure and body of geospatial theory for other disciplines to integrate or build on. The development of generic platform components within a standards-based architecture has been a core goal, and this has led naturally to collaborative research programmes across an ever increasing range of application areas. Research objectives address the challenges of spatial data conflation, association and modeling, data analysis and prediction, semantic interoperability and communication of results, typically within the context of mobile location-based services and under the constraints of small, mass-consumer devices such as mobile phones. The end users of CGS research are seen as being both other specialists and the general public who, armed with powerful GPS-enabled mobile phones with high-quality cameras and high-bandwidth data communications, are increasingly as much the providers of location and geo-referenced data as they are the consumers of it.

The Geomatics and Cartographic Research Centre at Carleton University is contributing to the science of geospatial interoperability in a number of ways. The theory and practice of cybercartography and the open source and open standards cybercartographic atlas framework are integrating geospatial information from a wide variety of online sources to produce a number of cybercartographic atlases on a variety of topics of interest to society. A cybercartographic atlas is a metaphor for all kinds of qualitative and quantitative information linked through location. An example is the Cybercartographic Atlas of Antarctica (see “atlases”) which integrates information from a number of scientific disciplines in new ways and which draws on databases and sensors in a number of diverse locations to present information to the user in real time in a seamless manner using OGC standards and specifications. The work of the Centre involves a number of disciplines including, among others, cognitive science, human-computer interaction, geography, cartography, computer science, English, music and psychology. Special attention is given to the development of effective user interfaces and to the archiving of geospatial data. Cybercartography is multi-modal and multi-sensory which poses additional challenges and opportunities for the creation, management and use of geospatial data. The interdisciplinary process, using geospatial data, is creating transdisciplinary knowledge which is giving new insights into some of the pressing problems of society in both a national and international context. Different narratives on the same issue are presented in ways which engage users and the distinction between the creators and users of geospatial data is becoming increasingly blurred as users can create their own narratives in a Web 2.0 environment.

5. Reinventing government’s Spatial Data Infrastructure Role

In the late 1980s and early 1990s, leaders in the US community of government users of GIS and remote sensing began developing the idea of the “National Spatial Data Infrastructure,” which is now a fixture in the vocabulary of geospatial technology users and policy-makers around the world. Corollary to the development of the NSDI concept in the US was the creation in 1990 of the interagency Federal Geographic Data Committee (FGDC) and then in 1994 the formation of the Open Geospatial Consortium, Inc. (OGC), a public private partnership dedicated to the development of consensus IT standards.

The FGDC, like similar committees in other nations (and committees in NATO and the UN) tackled geodata coordination, that is, data and metadata standardization, among offices and agencies. In parallel to these widespread data coordination activities, the OGC developed the open interface and encoding standards needed to enable interoperability among the heterogeneous processing systems marketed by the diverse geospatial product vendors. As the Web became the dominant distributed computing platform, most of the OGC’s work focused on the Web-based interoperability of geospatial services.

In 1990, the FGDC, with its symbolic positioning as the focus for US Federal Government geospatial processing, was housed in the US Department of the Interior (DOI). Nineteen years later, after years of technology development and experience with the realities of coordinating the government’s vastly diverse spatial information management requirements across nearly all departmental boundaries, it seems relevant to re-examine the organizational positioning of this vital function. Such a re-examination is particularly urgent in light of geoprocessing’s increased relevance to comprehensive, multidisciplinary government programs, and of equal importance, in recognition of the increasingly integral definition of standards-based “Interoperability Science” as distinct from the compilation of discipline-specific data repositories.

This is not just a US issue. Most countries are poised to generate vast quantities of new geodata. The technologies for producing and using geodata have become so inexpensive and cost-effective that the budgetary constraints that faced government agencies and companies are increasingly not an issue. Perhaps of even greater significance, however, is the fact that the generators of the data will also increasingly be the mass consumer. Through the use of what will be ubiquitous GPS-enabled mobile phones (supplemented with other positioning technology for indoor as well as outdoor coverage) the public will increasingly provide dense networks of location-tagged data and imagery as a by-product of their natural mobility and consumption of location-based services. The issue is particularly urgent for nations that are embarking on major SDI projects based on institutional data capture models. Throughout the lifecycles of these major capital infrastructure programmes, new needs and new suppliers of data will emerge that are probably impossible to predict more than a year or two ahead and that will demand a level of infrastructural flexibility and interoperability that has not previously been considered.

Putting geospatial resources on the Web and making those resources as discoverable as possible and as useful as possible for as many purposes as possible is what defines an SDI. Today, as in 1990, literally billions of dollars are spent each year in the US on redundant data collection. Billions more are wasted because needed data is not available, or at least, not discoverable. When sewer departments, utilities, bus companies, foresters, registries of deeds, conservation commissions, architects, and other non-academic as well as academic data producers publish their spatial data online using standards-based methods and appropriate access methods, their data becomes a highly leveraged public asset.
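Publishing data “using standards-based methods” in practice usually means exposing it through OGC web service interfaces such as WMS, so that any conforming client can fetch it without vendor-specific code. As a minimal sketch (the endpoint URL and layer name below are hypothetical, chosen only for illustration), this is the shape of a WMS 1.3.0 GetMap request that any standards-based client could issue against any conforming server:

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    `bbox` is (min, min, max, max) in the axis order of the chosen CRS;
    note that WMS 1.3.0 with EPSG:4326 uses latitude,longitude order.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical municipal endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.gov/wms", "sewer_network",
                     (41.0, -71.5, 42.9, -69.9))
```

Because the request grammar is standardized, a sewer department, a utility and a conservation commission can each publish independently while remaining discoverable and usable through the same client software.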

Even governments that do not face the prospect of having to pay off trillions of dollars in deficits are well advised, in the light of the new Web-based information infrastructure, to review and revise policies, procedures, and laws to make their NSDIs as valuable as possible.

A review in any nation involves looking at where in the government bureaucracy NSDI governance belongs. In the US, the USGS and other Department of Interior agencies were early users of geospatial technologies, along with the Department of Defense, but the technology is now used almost universally, and it has grown beyond those departments’ domains and purviews. It is now instructive to observe that the obstacles that may in fact prevent maturation of the NSDI seem to be characterized less by the traditional concerns of geospatial practitioners in such agencies than by the realities and regulation of both national and global commercial practice. Developers of data in the public and private sectors resist publishing and sharing their data for many good reasons as well as some bad reasons. Governments need to focus on the wide range of legal and commercial issues associated with geospatial technology and the collection, distribution and use of geospatial data. These issues include liability, privacy, national security and intellectual property rights (IPR) in spatial data, and they are complicated by the issue of data provenance, that is, an accounting of the source, history, integrity, currency and reliability of component data sets that comprise a composite data layer. The number of government, consumer and business applications for spatial technology is rapidly growing, and the issues involved have much more to do with money and law than natural resources.

The geospatial technology industry, in cooperation with its government and private sector customers, as well as its academic partners, needs to provide special technologies for IPR, security and access control if NSDIs are to mature. Policies need to account for different information communities’ different conceptual abstractions, classification schemes, data models, processing approaches and recording methods, as well as the far more anarchic, loosely quality-controlled sources of data emerging from crowd-sourcing and volunteered geographic information. The issues are complicated and they apply to law enforcement, civil protection and emergency response as much as they apply to hydrology and agriculture or the consumption of location services by the man in the street. In government, as in the Earth sciences, all things are connected, partly because jurisdictions in different functional areas overlap in their geographic areas. Geospatial information and technology issues thus deserve attention at a higher level of government than in the past.

This is particularly important if federal governments plan on spending money on base layer data development as part of their current efforts to stimulate the economy. An overriding goal of public policy is to empower, not burden, future generations. We thus want to avoid investing in outmoded centralized and proprietary data warehouses whose structure, function and cost are at odds with the open, distributed architecture model that the whole ICT industry is embracing.

Also, it is very important to consider that some elements of Spatial Law are not well defined by any national establishment, and it would benefit all to begin addressing these shortcomings through well-informed legislation and international negotiation rather than through the courts or through international crisis management. The point is that governments should be thinking strategically about how geospatial technology should be positioned, through policy and law, to contribute as fully as possible to social welfare and international cooperation, and not allow it to be trapped in outmoded institutional approaches that are no longer capable of providing even the illusion of practical governance.

6. Conclusion

Just as the significance of the Web could not be widely appreciated until the necessary Web standards had been in place for a few years, we believe that all the domains of geospatial technology and application are about to experience a remarkable transformation due to global adoption of open standard geospatial Web service interfaces and encodings. The rich “network effects” made possible by chained Web services, GRID computing, sensor webs, geospatial semantics, and online catalogs for data, services and schemas hold great promise, but there is no guarantee that this promise will be fulfilled. The question is, can we find the institutional will – in academia and government – to make changes that enable societies around the world to make the most of these new tools?

SOURCE DirectionsMag

LandBase™ provides classification down to individual property level

Leicester, UK / Munich, Germany – March 5, 2008 – Infoterra Ltd, a leader in the provision of geospatial products and services, uses Definiens’ market-leading object-based image analysis to produce land cover classifications at the highest resolution, down to individual property level.

Infoterra’s new land cover mapping product, LandBase™ demonstrates the trend among geospatial data providers towards the provision of value-added products and services. Products such as LandBase are becoming an increasingly important differentiator in a competitive marketplace. Definiens technology is a key value-adding enabler in this process.

“Definiens software was an essential tool in our land cover classification process. The enterprise architecture allows us to develop solutions within our R&D team which can be easily adjusted in the production process. Together with the robust client-server architecture, the delivery of high quality value-added products is much faster,” stated Andrew Tewkesbury, Product Development, Infoterra Ltd. “The semi-automated solution we developed for LandBase provides us with a successful model for delivery of future value-added products.”

“Infoterra’s LandBase product demonstrates what can be accomplished with an effective deployment of the Definiens Enterprise Image Intelligence® Suite, which is specifically designed to enable the development of value-added products,” states Ralph Humberg, VP Earth Sciences at Definiens. “We believe that LandBase will provide an enormous advantage for Infoterra in the UK.”

Definiens Enterprise Image Intelligence® Suite – http://www.definiens.com/image-analysis-software-for-earth-sciences_45_7_9.html
Infoterra’s LandBase™ – http://www.infoterra.co.uk/data_landbase.php

About Infoterra Ltd
Infoterra Ltd is a leading provider of geographic information products and services. Its portfolio of geographic information solutions includes airborne and satellite data acquisition, geo-information creation, database management and outsourced hosting. Infoterra provides geospatial knowledge to companies worldwide to help them make informed decisions. The company has major customers in government, insurance, utilities, engineering, defence and oil, gas & mineral exploration.
Infoterra Ltd, which incorporates Imass Ltd, is part of the Infoterra Group, which also comprises companies in France, Germany, Hungary and Spain.

About Infoterra Group
Infoterra, wholly owned by Europe’s leading space company Astrium, is a leading provider of geo-information products and services for managing the development, environment and security of our changing world. With entities in the United Kingdom, Germany, France, Spain and Hungary, its customers include international corporations, governments and authorities around the globe, and organisations such as the European Commission (EC) and the European Space Agency (ESA).

Infoterra holds the exclusive commercial exploitation rights for the high-resolution radar satellite TerraSAR-X, and plays a leading role in geo-information services within the European GMES initiative of the EC and ESA. Infoterra, together with Spot Image, form the Earth Observation Division of Astrium Services.

Definiens in Earth Sciences
Definiens enables organizations involved in Earth Sciences to quickly extract accurate geo-information from any kind of remote sensing imagery. The company assists data collectors, service providers and end users in integrating earth observation and remote sensing data to generate accurate GIS-ready information. Definiens’ intelligent feature extraction capabilities accelerate mapping, change detection and object recognition–delivering standardized and reproducible image analysis results.

About Definiens
Definiens is the number one Enterprise Image Intelligence company for analyzing and interpreting images on every scale, from microscopic cell structures to satellite images. The Definiens Cognition Network Technology®, developed by Nobel laureate Prof. Gerd Binnig and his team, is an advanced and robust context-based technology designed to fulfill the image analysis requirements of the Medical, Life and Earth Sciences markets. The technology is modeled on the powerful human cognitive perception processes to extract intelligence from images. Definiens provides organizations with faster image analysis results, allowing deeper insights enabling better business decisions. The company is headquartered in Munich, Germany and has offices throughout the United States. Further information is available at www.definiens.com.

Definiens, Definiens Cellenger, Definiens Cognition Network Technology, Definiens eCognition, Enterprise Image Intelligence and Understanding Images are trademarks or registered trademarks of Definiens.

Press Contacts
Infoterra Ltd
Sarah Haslam
+44 (0)116 273 2300

Definiens
Eva Tietz, Manager Corporate Communications
etietz@definiens.com

SOURCE DEFINIENS

Coimbra, 19th March 2009 – After several delays, the European Space Agency has just put the GOCE satellite into orbit. Critical Software worked with ESA in the development of the GOCE satellite, which will measure the Earth’s gravity field and produce models of the Earth with extremely high accuracy and spatial resolution, allowing us to significantly improve our knowledge of solid Earth physics and climate research. The module tests on the satellite’s onboard software were performed by Critical.

The GOCE (Gravity field and steady-state Ocean Circulation Explorer) satellite is part of ESA’s fleet of Earth Explorers, which will perform research missions for the Living Planet Programme. These missions focus on the key components of our planet, such as the geosphere, hydrosphere, cryosphere, atmosphere and biosphere. The aim is to make space-based global measurements to advance our understanding of the interactions between these components and the impact that human activity is having on natural Earth processes. The GOCE mission will therefore improve our knowledge of Earth interior processes such as earthquakes and volcanism but also of ocean circulation, which plays a crucial role in energy exchanges around the globe, and sea-level change.

The GOCE satellite will improve knowledge about the impact that human activity is having on Earth

The development of the GOCE satellite was carried out by an industrial consortium of 45 companies distributed over 13 European countries, among them Critical Software, which was asked to support the GOCE Platform Application Software Module Test. “As a software testing expert company, Critical has once again demonstrated its ability to perform software verification and validation, an area in which it has earned respect from other space industry players in Europe,” says Paulo Guedes, Business Development Manager for the Aeronautics and Defence markets at Critical Software.

About Critical Software

Critical is a leading provider of solutions, services and technologies for mission and business critical information systems. From offices in Portugal, the USA, the UK, Romania and Brazil, our global market focus and multinational culture have enabled us to penetrate markets around the globe, to adapt to the most demanding contexts and to better understand customer needs and business requirements. Working within a diverse array of sectors, we deliver high-quality expert staff who help our customers achieve and sustain success in all their IT ventures. Time and again, our customers value our honesty, in-depth knowledge and partnership approach, which together ensure excellent results. Being vendor-independent enables us to select and recommend the most suitable and cost-effective third-party technologies for each customer’s specific requirements. The company operates a quality management system certified to CMMI® Level 3, ISO 9001:2000 TickIT, ISO 15504, NATO AQAP 150, AQAP 2120 and EN9100.

SOURCE CRITICAL

…Chelys SRRS (Satellite Rapid Response System)…

An extension has been added to the Chelys SRRS (Satellite Rapid Response System), making it possible to automatically orthorectify images. The SRRS is a multimission real-time data processing and quality control system that is able to simultaneously process data coming from several different missions. Even after the orthorectification addition, the system still performs in real time, as the entire correction process takes less than 20 seconds per product. Once the image has been geometrically corrected, its projection can be mapped with absolute precision using a DEM (Digital Elevation Model) to simulate the third dimension. The result is a new, more interactive way to navigate satellite images.
More details and examples can be found at the following link
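The geometric correction described above compensates for terrain-induced pixel displacement. As a simplified, flat-terrain sketch of the geometry involved (not Chelys’ actual algorithm): a point elevated h metres above the reference surface, viewed at an off-nadir angle θ, appears shifted horizontally by h·tan θ, and this is the shift that a DEM-driven orthorectification removes at each pixel:

```python
import math

def relief_displacement(height_m, off_nadir_deg):
    """Horizontal displacement (in metres) of a point `height_m` metres
    above the reference surface, imaged at the given off-nadir angle.

    Orthorectification removes exactly this terrain-induced shift,
    with a DEM supplying the height value for each pixel.
    """
    return height_m * math.tan(math.radians(off_nadir_deg))

# A 100 m hill seen 45 degrees off-nadir is displaced by ~100 m;
# at nadir (0 degrees) there is no displacement at all.
shift = relief_displacement(100.0, 45.0)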

Other Chelys IT Solutions

SRRS – Satellite Rapid Response System

The Satellite Rapid Response System (SRRS) is an advanced real-time processing system for satellite data. Among its main features are its scalability and its capability to manage different kinds of missions. Designed to generate images whose colours are as close as possible to those found in nature, the SRRS includes advanced functionalities for analysis and data quality control.
details…

Meridian – Meris Display and Analysis tool

Meridian is a visualization tool for displaying Meris products. Its main features are fast rendering of images at all levels (including Level 0) and advanced memory management: the methods used allow all Meris product images to be viewed very quickly, and huge numbers of products can be loaded without regard to the amount of memory installed in the computer.
details…

Envisat Control Panel

The first product developed by Chelys for the Envisat project, this tool was intended to support the Integration and Validation activities on the Payload Data Segment (PDS) of the Envisat satellite.
details…

OCD Management Tool

The OCD Management Tool provides the user with a graphical interface (Human-Machine Interface, HMI) to ease the management and modification of Envisat CMC and GSP facility configuration files (OCD).

MAS – Multimission Analysis System

The Multimission Analysis System is a tool that provides users with a graphical interface to display time-based events. The tool can analyse any kind of log/trace file in order to extract relevant information, which can then be displayed in several ways. General-purpose filters can be applied to any information, or data-specific filters can be developed and included in the application. Data can also be retrieved remotely via the FTP protocol.

Ground Segments Maintenance Site

Developed to support the integration, testing and maintenance phase of the PDS (Payload Data Segment) of Envisat, the tool evolved and is now the core of the maintenance and test group activities, supporting the configuration management of hardware/software items and all the project-related documentation.
details…

Weather Conditions

Developed to perform tests with SOAP interfaces, this small utility, distributed as freeware, is useful for monitoring the weather conditions at a specified location on Earth. The application also generates a set of statistics by sampling temperature, pressure and humidity.
details…
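The statistics step described above could be sketched as follows; the field names and the `summarize` helper are illustrative assumptions, not Chelys’ actual data model:

```python
from statistics import mean

def summarize(samples):
    """Basic statistics over a series of weather samples,
    e.g. periodic temperature readings in degrees Celsius."""
    return {
        "min": min(samples),
        "max": max(samples),
        "mean": round(mean(samples), 2),
        "n": len(samples),
    }

# Three hypothetical temperature samples for one location.
stats = summarize([12.0, 14.5, 13.0])
```

The same aggregation would apply unchanged to pressure or humidity series.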

SOURCE CHELYS

CONTACT
CHELYS srl
Via Erasmo Gattamelata, 3, 00176 Rome – ITALY
Luca Mellano, Director
Tel: +39 06 78 400 24 /Mob: +39 34 72 720 398 / Fax: +39 06 62 275 199

…investigate the new dynamic map applications of water quality products…

WAQSS (Water Quality Service System): a service for coastal management provided by Brockmann Consult

The service portfolio consists of three product categories:
-Standard products
-Special products
-Special requests

Standard products are basic products covering the North Sea and the Baltic Sea, defined by Brockmann Consult and offered to every user. Special products and special requests are tailored to the needs of individual end users. The users’ requirements are agreed between the users and the WAQSS team at Brockmann Consult and set down in so-called Service Level Agreements (SLAs) between the two parties.

More info at WAQSS

SOURCE Brockmann Consult

… (Feb, 2009. Spain) UAV services, workshop contributions…

On the 24th of January, 2009, a violent wind storm battered the north of Spain, with gusts of over 120 km/h in some areas according to METEOCAT, the Catalan Meteorological Service.

Catalonia in particular had some severely affected regions, the most dramatic incident occurring in Sant Boi de Llobregat, where a sports centre collapsed, killing four children.

In the storm’s aftermath, houses were cut off by hundreds of fallen trees across roads and tracks, developments and neighbourhoods were left without electricity for weeks, and there was considerable damage to personal property.

In order to produce a thorough report on the storm damage and to gain a full overview of the extent of the works to be carried out by local administrations, Aurensis was entrusted with a study of the area using its UAV Services.

Our UAV Services

Aurensis’ UAV Services are Earth Observation services carried out with small pilotless aircraft operated by remote ground control. These unmanned aerial vehicles obtain high-resolution images, video and thermal information and are especially suited to surveying emergencies such as fires and storm damage, locating people in areas that are difficult to access, surveying civil works and infrastructure, and producing cartography of delimited areas.

The result is real-time information: images of the area can be obtained while the aircraft is still flying so that the situation can be evaluated, georeferenced data and video can be delivered within a very short time, and orthophotos and cartography follow in just a week.

In the case of fires, or other situations where temperature differential is a key factor, a thermal map can be produced to locate burning points, distinguish types of vegetation cover, and detect illegal discharges into coastal waters and rivers or spontaneous combustion in rubbish dumps.

Results of UAV Services in Catalonia

Aurensis’ flight over the area of Catalonia affected by the wind storm produced high-resolution video and georeferenced photographs from which the damage caused by the wind could be easily assessed: roads and tracks blocked by fallen trees were identified, as were the municipal and private properties, greenhouses, crops, etc. that had been affected.

With just one click, those responsible for recovery programmes could locate on a map exactly where each photograph or video was taken, giving them a full aerial view of the scope of the damage.

Aurensis was later commissioned to produce a rapid orthophoto so that the area could be measured accurately and the recovery programmes optimized.

Benefits of UAV Services

The simplicity of UAVs (unmanned aerial vehicles) means the aircraft go largely unnoticed, are environmentally friendly, can be operated more frequently because they are easy to launch, and can perform dangerous missions without putting a pilot’s life at risk; as a whole, they are more economical than traditional alternatives.

Aurensis’ UAV Service is especially useful for managing emergencies in specific areas, where immediacy is a must. Photographs and video are taken during the flight itself, giving a full aerial view of the area. Likewise, orthophotos and cartography are produced in very short time, with the critical points to be surveyed located exactly on a map.

Video taken by UAV can also be reused for other purposes. In the case of civil works, for example, public administrations can use this footage to publicize their programmes.

About Aurensis

Aurensis is a leading European company in the sector of new technologies applied to the territory. Aurensis offers global solutions for fixed or mobile asset management: from digital cartography based on satellite imagery, photogrammetric flights or UAVs, to the development of computerized solutions combining GIS, location and telecommunications services, and the provision of satellite broadband telecommunication services.

Aurensis is part of the Telespazio Group, one of the world’s leading satellite operators. The group has more than 1,700 employees, sites in 25 countries, four space centres and a turnover of €394 million in 2007.

www.aurensis.com

Media Contacts
marketing@aurensis.com
Tel: +34 935 830 200

Projects in 2008-2009: Environmental Research Funders Forum Mapping Study of Coordination Bodies and Department of Energy and Climate Change Review of GECC


Environmental Research Funders Forum Mapping Study of Coordination Bodies (2008-2009)

Assimila is undertaking a review of coordination bodies in environmental science. The purpose of the review is to identify gaps and overlaps in coordination in order that ERFF may target its activities more effectively.

The Environmental Research Funders Forum (ERFF) is an initiative of 19 major UK public-sector funders of environmental research, created to maximize the coherence and effectiveness of UK funding for environmental research, monitoring and observation, policy development and training. Assimila is supporting ERFF in a study of coordination bodies for environmental science in the UK. The study has identified and categorized a large number of coordination bodies and consulted the major ones to establish a clear view of their scope, objectives, membership and methods of operation.

The study will include an analysis of gaps and overlaps which will enable ERFF to target its own activities more effectively.

More information can be found on the ERFF website.

Department of Energy and Climate Change Review of GECC (2008-2009)

Assimila is undertaking a review of the Global Environmental Change Committee (GECC) for DECC. The review is based on a consultation of GECC members and stakeholders and will identify how well the GECC has been meeting its aims and objectives, recommending changes where appropriate.

Source Assimila

Scientists and engineers from Surrey Satellite Technology Ltd, the University of Leicester and EADS Astrium have developed an air-quality measurement device that can act as a pollution radar over cities. Dr Roland Leigh of the University of Leicester explains the principle for Scitizen.

What technology does the pollution radar rely on?

CityScan, the pollution radar, uses novel imaging spectrometers and high-speed CCD detectors to analyse scattered sunlight to an unprecedented level of detail. The software that turns this spectral information into 3D gas concentrations contains a number of complex algorithm components developed at the University of Leicester to correctly apportion gas concentrations to given locations. The imaging spectrometer was designed by Surrey Satellite Technology Ltd; it is a novel concentric design, originally proposed for the space industry, recently demonstrated in a lab at the University of Leicester, and now being integrated into this operational air quality monitoring system. The spectrometers separate the sunlight into individual wavelengths. The intensity at each wavelength, in each viewing direction, is recorded on CCD detectors. These intensities are used to derive 3D gas concentrations. The high-speed detector systems used in this instrument are the result of 10 years of development at the University of Leicester in ground-based and space-borne detector electronics.
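The interview does not spell out CityScan's retrieval algorithm, but instruments of this family typically rely on a Beer–Lambert-style absorption fit (as in differential optical absorption spectroscopy, DOAS): the measured spectrum is compared to a reference, and the wavelength-dependent attenuation is matched against the gas's known absorption cross-section. The sketch below is purely illustrative, not CityScan's software; `retrieve_column` and the synthetic cross-section are invented for the demonstration.

```python
import numpy as np

# Illustrative only: a minimal Beer-Lambert retrieval of a gas column.
# Measured intensity I(lambda) relates to a reference I0(lambda) via
#   ln(I0 / I) = N * sigma(lambda)
# where sigma is the absorption cross-section and N the column density.

def retrieve_column(i_measured, i_reference, cross_section):
    """Least-squares estimate of column density N from spectra."""
    tau = np.log(i_reference / i_measured)  # optical depth at each wavelength
    # One-parameter least squares: N = (sigma . tau) / (sigma . sigma)
    return np.dot(cross_section, tau) / np.dot(cross_section, cross_section)

# Synthetic demo: fabricate a spectrum with a known column, then recover it.
wavelengths = np.linspace(425e-9, 450e-9, 100)          # NO2 visible band (m)
sigma = 5e-19 * (1 + 0.3 * np.sin(wavelengths * 3e8))   # toy cross-section (cm^2)
true_column = 2e16                                      # molecules / cm^2
i0 = np.ones_like(wavelengths)                          # flat reference spectrum
i = i0 * np.exp(-true_column * sigma)                   # Beer-Lambert attenuation

print(retrieve_column(i, i0, sigma))                    # recovers ~2e16
```

In the real instrument the fit is far more involved (multiple absorbers, scattering, viewing geometry), but the core step of mapping per-wavelength intensities to a gas amount has this shape.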

What are its advantages over currently available air quality measuring devices?

Current air quality monitoring devices measure the concentration of gases such as NO2 at a given location, usually by sucking air into an instrument. Such techniques provide accurate results for a given specific location, but do not in themselves accurately represent the widely variable concentrations over large urban areas. Usually models are employed in an attempt to fill in the gaps between these point measurements in a city. CityScan probes the whole atmosphere above a city, and takes measurements from every spot, allowing a direct measurement of concentrations of gases such as nitrogen dioxide. These measurements will allow us to directly measure emissions over particular areas, follow the transport of those emissions across the city, and to a certain extent, track the chemical changes in the air as it moves.
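The gap-filling between fixed stations mentioned above can be as simple as spatial interpolation. As a hedged illustration (not any specific monitoring network's model), inverse-distance weighting estimates a value between stations; the `idw` function and station data here are invented for the example:

```python
import numpy as np

# Illustrative only: inverse-distance weighting (IDW), one simple way to
# interpolate between point measurements from fixed monitoring stations --
# the kind of gap-filling that CityScan avoids by measuring everywhere.

def idw(stations, values, query, power=2.0):
    """Estimate a concentration at `query` from nearby station readings.

    stations: (n, 2) array of x/y positions (km)
    values:   (n,) readings at those stations
    """
    d = np.linalg.norm(stations - query, axis=1)
    if np.any(d == 0):                   # query coincides with a station
        return values[np.argmin(d)]
    w = 1.0 / d**power                   # closer stations weigh more
    return np.sum(w * values) / np.sum(w)

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
no2 = np.array([40.0, 20.0, 30.0])       # NO2 readings, ug/m^3
estimate = idw(stations, no2, np.array([2.0, 2.0]))
print(estimate)                          # between 20 and 40, nearest-biased
```

The weakness is exactly the one described in the interview: the interpolated value is a guess constrained only by the station readings, whereas a direct measurement over every spot needs no such model.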

You are currently developing two new instruments to create 3D maps of atmospheric gases. To what extent may all these new tools contribute to a public health policy in cities?

These two instruments will provide information on emissions, transport and chemistry of air quality gases within an urban environment. For policy-makers, accurate maps of emissions across a city are a vital tool in decision making, coupled with an understanding of how these emissions behave downwind, both chemically and dynamically. The measurements CityScan offers can provide many extra pieces of information, to help policy makers identify and prioritise emission sources which may have an impact on air quality and the health of residents in particular areas of the city. Such information is key to decision makers when evaluating the societal benefit of the emitting processes against the potential harm of the resultant pollution.

Roland Leigh is a researcher at the Earth Observation Science, Space Research Centre, Department of Physics & Astronomy, University of Leicester.

Source Scitizen