Japanese Weathernews will launch a satellite in September 2012 that will provide navigational services to ships travelling along the Russian and North American coasts in the Arctic Ocean, the newspaper Nikkei reported.

A 30 percent reduction in sea ice coverage over the last 30 years due to global warming has opened up the Arctic Ocean to shipping, including the shortest sea route between Europe and Asia.

The satellite will be launched from the Yasny launch base in Russia’s Orenburg region. The cost of development and launch will be about $1.7 million. It will circle the Earth 15 times a day.

The satellite will transmit images and information about sea ice in the Arctic Ocean. Weathernews will combine the information with available data on sea currents, weather and wave height to provide consumers with a finished product enabling safe navigation along the northern route.

“Even a one-week reduction in travel time will significantly reduce fuel costs and speed cargo delivery to the end point. Moreover, this route is much safer than other routes that expose ships to attack from Somali pirates,” Nikkei reported, citing a major Japanese shipping company.

Source: RIA Novosti and Spacedaily

New calibration satellite required to make accurate predictions, say scientists

A new paper published in Philosophical Transactions of the Royal Society A explains weaknesses in our understanding of climate change and how we can fix them. These weaknesses mean that predictions of how quickly temperatures will rise vary wildly, with serious implications for long-term political and economic planning. The paper’s lead author is Dr Nigel Fox of the National Physical Laboratory (NPL), the UK’s National Measurement Institution.

The Earth’s climate is undoubtedly changing, but how fast and what the implications will be are unclear. Our most reliable models rely on data acquired through a range of complex measurements. Most of the important measurements – such as ice cover, cloud cover, sea levels and temperature, chlorophyll (oceans and land) and the radiation balance (incoming to outgoing energy) – must be taken from space, and for constraining and testing the forecast models, made over long timescales. This presents two major problems.

  • Firstly, we have to detect small changes in the levels of radiation or reflection from a background fluctuating as a result of natural variability. This requires measurements to be made on decadal timescales – beyond the life of any one mission, and thus demands not only high accuracy but also high confidence that measurements will be made in a consistent manner.
  • Secondly, although the space industry adheres to high levels of quality assurance during manufacture, satellites, particularly optical instruments, usually lose their calibration during launch, and it drifts further over time. Comparable ground-based instruments would be regularly calibrated, traceable to a primary standard, to ensure confidence in the measurements. This is much harder to achieve in space.

The result is varying model forecasts. Estimates of global temperature increases by 2100 range from ~2-10°C. Which of these is correct matters for major decisions about mitigating and adapting to climate change: for instance, how quickly are we likely to see serious and life-threatening droughts, and in which parts of the world; or if and when do we need to spend enormous amounts of money on a new Thames barrier. The change forecast by all the models is very similar for many decades, only deviating significantly towards the latter half of this century.
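The scale of this detection problem can be illustrated with a toy calculation (the model and all the numbers below are invented for illustration; the paper’s actual analysis is more sophisticated and also accounts for the autocorrelation of natural variability). A trend only becomes detectable once its accumulated signal exceeds the combined noise from natural variability and instrument uncertainty:

```python
import math

def years_to_detect(trend_per_year, natural_sd, measurement_sd, k=2.0):
    """Years of observation needed before a linear trend exceeds
    k standard deviations of the combined noise. A crude model:
    real detection-time analyses weigh autocorrelation as well."""
    total_sd = math.sqrt(natural_sd**2 + measurement_sd**2)
    return k * total_sd / trend_per_year

# Hypothetical numbers: a 0.02 %/yr trend in reflected radiation and
# 0.3 % natural variability, compared at two instrument accuracies.
current = years_to_detect(0.02, 0.3, 0.3)   # today's calibration
truths  = years_to_detect(0.02, 0.3, 0.03)  # ten-fold improvement
print(round(current), round(truths))
```

Even in this crude sketch, tightening the instrument uncertainty shortens the wait substantially, with the residual floor set by natural variability itself.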

Dr Nigel Fox, head of Earth Observation and Climate at NPL, says: “Nowhere are we measuring with uncertainties anywhere close to what we need to understand climate change and allow us to constrain and test the models. Our current best measurement capabilities would require >30 yrs before we have any possibility of identifying which model matches observations and is most likely to be correct in its forecast of consequential potentially devastating impacts. The uncertainties needed to reduce this are more challenging than anything else we have to deal with in any other industrial application, by close to an order of magnitude. It is the duty of the science community to reduce this unacceptably large uncertainty by finding and delivering the necessary information, with the highest possible confidence, in the shortest possible time.”

The solution put forward by the paper is the TRUTHS (Traceable Radiometry Underpinning Terrestrial- and Helio- Studies) mission, a concept conceived and designed at NPL. The mission would see a satellite launched into orbit with the ability not only to make very high-accuracy measurements itself (a factor-of-ten improvement) but also to calibrate and upgrade the performance of other Earth Observation (EO) satellites in space. In essence it becomes “NPL in Space”.

The TRUTHS satellite makes spectrally resolved measurements of incoming solar radiation and of the radiation reflected from the ground, with a footprint similar in size to half a rugby field. The unprecedented accuracy allows benchmark measurements of key climate indicators, such as the amount of cloud, albedo (Earth’s reflectance) and solar radiation, at a level that will allow differences between climate models to be detected within a decade (a third of the time needed by existing instruments). Its data will also enable improvements in our knowledge of climate and environmental processes such as aerosols, land cover change, pollution and the sequestration of carbon in forests.

Not only will it provide its own comprehensive and climate-critical data sets, but it can also facilitate an upgrade in performance of much of the world’s Earth observing system as a whole, both satellite and ground data sets. By performing reference calibrations of other in-flight sensors through near-simultaneous observations of the same target, it can transfer its calibration accuracy to them. Similarly, its ability to make high-accuracy corrections for atmospheric transmittance allows it to calibrate ground networks measuring changes at the surface (e.g. flux towers and forests) and other reference targets currently used by satellites, such as the snowfields of Antarctica, deserts, oceans and the Moon. In this way it can even back-correct the calibration of sensors in flight today.

TRUTHS will be the first satellite to have high-accuracy traceability to SI units established in orbit. Its own measurements, and in particular the calibration of other sensors, will not only aid our understanding of climate change but also facilitate the establishment and growth of commercial climate and environmental services. One of the barriers to this market’s growth is customer confidence in the results and the long-term reliability of service. TRUTHS would enable a fully interoperable global network of satellites and data with robust, trustable guarantees of quality and performance.

The novelty of TRUTHS lies in its on-board calibration system. The instruments on the TRUTHS satellite will be calibrated directly against an on-board primary standard, an instrument called a CSAR (Cryogenic Solar Absolute Radiometer). This compares the heating effect of optical radiation with that of electrical power, transferring all the difficulties associated with existing space-based optical measurements (drift, contamination, etc.) to the more stable electrical SI units. In effect, this reproduces in orbit the traceability chain normally carried out on the ground.
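The electrical-substitution principle behind the CSAR can be sketched in a few lines. This is an illustrative model only (the function name and the readings are invented, not NPL’s): the unknown optical power is inferred from the electrical power that produces the same temperature rise in the absorbing cavity.

```python
def electrical_substitution(temp_rise_optical, heater_response):
    """Infer optical power from the electrical heating that reproduces
    the same temperature rise in the radiometer's absorbing cavity.

    heater_response : measured cavity temperature rise per watt of
                      electrical heating (K/W), derived from voltage
                      and current, which are traceable to electrical
                      SI units.
    """
    return temp_rise_optical / heater_response  # watts

# Hypothetical readings: the cavity warms 0.50 K under illumination,
# and electrical calibration shows it warms 2.5 K per watt.
p_optical = electrical_substitution(0.50, 2.5)
print(p_optical)
```

The design choice is the point: drift and contamination affect the optics, but the comparison itself rests on electrical quantities that stay stable in orbit.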

This would make climate measurements ten times more accurate and give us models on which we could base important decisions about the future.

The project, which would be led by NPL, is being considered by different organisations. The European Space Agency has recommended looking into ways to take it forward, possibly as a collaboration with other space agencies. NASA is also keen to collaborate formally.

Nigel concludes: “Taking this forward would be an excellent investment for the UK, or any other country which supports it. This is not only an effective way to address the problem of understanding climate change, but also an excellent opportunity for business. It would grow expertise in Earth Observation and showcase the UK’s leading space expertise – an industry which is growing by 10 per cent a year. It would also provide a platform to underpin some of the carbon trading which will be a big international business in the near future.”

###

The full reference for the paper is:
Phil. Trans. R. Soc. A (2011) 369, 4028-4063
doi:10.1098/rsta.2011.0246
The URL after publication will be: http://rsta.royalsocietypublishing.org/lookup/doi/10.1098/rsta.2011.0246
Nigel Fox delivered a lecture on this subject as part of NPL’s Celebrating Science lecture series, which can be viewed here: http://www.youtube.com/watch?v=BalCag7fQdE&feature=player_detailpage

More details can also be found at http://www.npl.co.uk/TRUTHS

About the National Physical Laboratory

The National Physical Laboratory (NPL) is the UK’s National Measurement Institute and one of the UK’s leading science facilities and research centres. It is a world-leading centre of excellence in developing and applying the most accurate standards, science and technology available.

NPL occupies a unique position as the UK’s National Measurement Institute and sits at the intersection between scientific discovery and real world application. Its expertise and original research have underpinned quality of life, innovation and competitiveness for UK citizens and business for more than a century.

PARIS — Astrium Services’ Geo-Information division announced Sept. 19 it will provide data from its optical and radar Earth observation satellites to the commission of the 27-nation European Union (EU) under a three-year contract valued at 17 million euros ($23.5 million).

The deal is part of the commission’s Global Monitoring for Environment and Security (GMES) program, which includes investment in European government-owned satellite systems as well as imagery purchases from commercial and non-European government satellites.

Under the contract, Astrium Services will provide data from its Spot 4 and Spot 5 optical satellites, and from its TerraSAR-X and TanDEM-X radar spacecraft, all of which are in orbit. The agreement also calls for Astrium to provide imagery from Taiwan’s Formosat-2 satellite, to which the company has access through its Spot Infoterra subsidiary.

The commission will also receive an undisclosed amount of data from the two Pleiades high-resolution optical satellites scheduled for launch in the coming months and financed mainly by the French government, and the medium-resolution Spot 6 and Spot 7 optical satellites that Astrium is building on its own to replace Spot 5.

Source

A NASA-led study has documented an unprecedented depletion of Earth’s protective ozone layer above the Arctic last winter and spring caused by an unusually prolonged period of extremely low temperatures in the stratosphere.

The study, published online Sunday, Oct. 2, in the journal Nature, finds the amount of ozone destroyed in the Arctic in 2011 was comparable to that seen in some years in the Antarctic, where an ozone “hole” has formed each spring since the mid-1980s. The stratospheric ozone layer, extending from about 10 to 20 miles (15 to 35 kilometers) above the surface, protects life on Earth from the sun’s harmful ultraviolet rays.

The Antarctic ozone hole forms when extremely cold conditions, common in the winter Antarctic stratosphere, trigger reactions that convert atmospheric chlorine from human-produced chemicals into forms that destroy ozone. The same ozone-loss processes occur each winter in the Arctic.

However, the generally warmer stratospheric conditions there limit the area affected and the time frame during which the chemical reactions occur, resulting in far less ozone loss in most years in the Arctic than in the Antarctic.

To investigate the 2011 Arctic ozone loss, scientists from 19 institutions in nine countries (United States, Germany, The Netherlands, Canada, Russia, Finland, Denmark, Japan and Spain) analyzed a comprehensive set of measurements. These included daily global observations of trace gases and clouds from NASA’s Aura and CALIPSO spacecraft; ozone measured by instrumented balloons; meteorological data and atmospheric models.

The scientists found that at some altitudes, the cold period in the Arctic lasted more than 30 days longer in 2011 than in any previously studied Arctic winter, leading to the unprecedented ozone loss. Further studies are needed to determine what factors caused the cold period to last so long.

“Day-to-day temperatures in the 2010-11 Arctic winter did not reach lower values than in previous cold Arctic winters,” said lead author Gloria Manney of NASA’s Jet Propulsion Laboratory in Pasadena, Calif., and the New Mexico Institute of Mining and Technology in Socorro.

“The difference from previous winters is that temperatures were low enough to produce ozone-destroying forms of chlorine for a much longer time. This implies that if winter Arctic stratospheric temperatures drop just slightly in the future, for example as a result of climate change, then severe Arctic ozone loss may occur more frequently.”

The 2011 Arctic ozone loss occurred over an area considerably smaller than that of the Antarctic ozone holes. This is because the Arctic polar vortex, a persistent large-scale cyclone within which the ozone loss takes place, was about 40 percent smaller than a typical Antarctic vortex.

While smaller and shorter-lived than its Antarctic counterpart, the Arctic polar vortex is more mobile, often moving over densely populated northern regions. Decreases in overhead ozone lead to increases in surface ultraviolet radiation, which are known to have adverse effects on humans and other life forms.

Although the total amount of Arctic ozone measured was much more than twice that typically seen in an Antarctic spring, the amount destroyed was comparable to that in some previous Antarctic ozone holes. This is because ozone levels at the beginning of Arctic winter are typically much greater than those at the beginning of Antarctic winter.

Manney said that without the 1989 Montreal Protocol, an international treaty limiting production of ozone-depleting substances, chlorine levels already would be so high that an Arctic ozone hole would form every spring. The long atmospheric lifetimes of ozone-depleting chemicals already in the atmosphere mean that Antarctic ozone holes, and the possibility of future severe Arctic ozone loss, will continue for decades.

“Our ability to quantify polar ozone loss and associated processes will be reduced in the future when NASA’s Aura and CALIPSO spacecraft, whose trace gas and cloud measurements were central to this study, reach the end of their operational lifetimes,” Manney said. “It is imperative that this capability be maintained if we are to reliably predict future ozone loss in a changing climate.”

Source

Arkon satellites


As of 2011, the launch, originally promised for 2009, had slipped to 2012-2013.

ARKON-2

The spacecraft is designed to perform round-the-clock global Earth surveillance with high resolution in the X-, L- and P-band radio-frequency ranges, regardless of meteorological conditions.

ARKON-2M

The spacecraft is designed to perform Earth surveillance in X-band. An on-board active phased-array antenna assures high resolution and performance.

ARKON-VICTORIA

The spacecraft is designed to perform frequent-revisit surveillance, providing imaging in panchromatic mode and in six spectral zones, including simultaneous imaging in any three zones.

SAR Systems Arkon-2 (Russia):

  • (a) The original Arkon-1 satellites were equipped with hi-res optical imagers.
  • (b) The two new Arkon-2 satellites are being built for Roskosmos by Lavochkin and will be equipped with SAR imagers.
  • (c) The SAR imagers are being built by Vega Radio Engineering Corp.
  • (d) These operate at 3 different wavelengths – in X, L and P bands.

Source

Other sources

The Lavochkin Science and Production Association (in Russian only)

Vega Radio Engineering Corp.: http://www.vega.su/ (in Russian only)

(3 October 2011) Small satellite manufacturer Surrey Satellite Technology Limited (SSTL) has today announced completion of the development phase of its new low-cost Synthetic Aperture Radar (SAR) satellite system. Called NovaSAR-S, the system offers customers coverage of any spot on Earth in all conditions – seeing through cloud cover across both day and night.

The 400-kg NovaSAR-S design combines SSTL’s flight-proven SSTL-300 platform with an innovative S-band SAR payload, developed in collaboration with Astrium Ltd, which will be responsible for supplying the payloads. The challenge has been accommodating the power and processing requirements within a small, low-cost satellite platform. NovaSAR-S’s 3m x 1m phased array antenna was developed by space-borne SAR specialists at Astrium Ltd, and has now been successfully trialled using an airborne demonstrator. The SSTL-300 platform hosting the payload is an adaptation of SSTL’s very-high-resolution imaging NigeriaSat-2 mission, which was launched in August.

“Based on highly efficient S-band solid state amplifier technology, NovaSAR-S has been designed to provide detailed imaging performance for a variety of orbits,” said Luis Gomes of SSTL. “It offers space-based radar capability to customers who might not have considered it possible – all for the equivalent cost of a traditional low cost optical Earth observation mission.”

NovaSAR-S acquires medium resolution radar imagery of 6-30 m ground sample distance, depending on the viewing mode being employed. Its four viewing modes are optimised for a wide range of applications, including flood monitoring, agricultural crop assessment, forest monitoring, land cover classification, disaster management and maritime applications – notably ship tracking and oil spill detection.

Radar images reveal surface textures instead of reflected light. A radar satellite illuminates its target with a microwave beam then records the signal bouncing back. In addition, the satellite takes advantage of its rapid motion relative to Earth’s surface to build up an image with sharpened resolution equivalent to that of a much larger ‘synthetic aperture’ antenna.
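The resolution gain from the synthetic aperture can be illustrated with the standard first-order formulas (the wavelength, range and antenna length below are round hypothetical values, not NovaSAR-S specifications): a real aperture of length D at range R resolves roughly λR/D in azimuth, while a focused SAR achieves about D/2 regardless of range.

```python
def real_aperture_res(wavelength_m, range_m, antenna_len_m):
    """Azimuth resolution of a conventional (real-aperture) radar."""
    return wavelength_m * range_m / antenna_len_m

def sar_azimuth_res(antenna_len_m):
    """First-order azimuth resolution of a focused SAR: about half
    the physical antenna length, independent of range."""
    return antenna_len_m / 2

# S-band (~0.1 m wavelength) viewed from ~600 km with a 3 m antenna:
real = real_aperture_res(0.1, 600e3, 3.0)  # tens of kilometres
sar = sar_azimuth_res(3.0)                 # metre-scale
print(real, sar)
```

The contrast, kilometres versus metres from the same physical antenna, is why SAR makes useful imaging possible from a small, low-cost platform.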

Intended for equatorial or polar low-Earth orbits, NovaSAR-S offers high data throughput of at least one million square km per day, observing in a variety of polarisation combinations to add ‘colour’ and detail to acquisitions.

The system is designed to function either independently, or as part of a constellation. A trio of NovaSAR-S satellites could image any point on the globe every day, regardless of local weather or time of day.

SSTL’s unique approach to engineering and project management means it can deliver a complete NovaSAR-S mission into orbit within 24 months. The platform has been specifically sized to facilitate low-cost launch opportunities. SSTL can also offer support on ground segment architecture, data processing and archiving and knowledge transfer, based on customer needs.

About SSTL

Surrey Satellite Technology Limited (SSTL) is the world’s leading small satellite company, delivering operational space missions for a range of applications including Earth observation, science and communications. The Company designs, manufactures and operates high performance satellites and ground systems for a fraction of the price normally associated with space missions, with over 400 staff working on turnkey satellite platforms, space-proven satellite subsystems and optical instruments.

Since 1981 SSTL has built and launched 36 satellites, as well as providing training and development programmes, consultancy services and mission studies for ESA, NASA, international governments and commercial customers, through an innovative approach that is changing the economics of space.

Based in Guildford, UK, SSTL is owned by EADS Astrium NV.
www.sstl.co.uk

Notes to editor:
SSTL is this week exhibiting at the International Astronautical Congress (IAC) in Cape Town. To discuss SSTL’s new technologies with an expert visit stand 66. For more information on the IAC 2011 congress and exhibition visit www.iac2011.com.
This press release can be downloaded as a Word or PDF document at the following URL: http://www.sstl.co.uk/news-and-events
SSTL Contact:
Joelle Sykes, Surrey Satellite Technology Limited
Tel: +44 (0)1483 804243 Email: j.sykes@sstl.co.uk
Press Contact:
Robin Wolstenholme, bcm public relations
Tel: +44 (0)1306 882288 Email: r.wolstenholme@bcmpublicrelations.com


MDA To Provide Radar Information In Support Of Maritime Safety To The U.S. National Geospatial-Intelligence Agency

(23 September 2011) MacDonald, Dettwiler and Associates Ltd., a provider of essential information solutions, announced today that it has signed a multi-million dollar sole source delivery order under its Indefinite Delivery/Indefinite Quantity contract with the U.S. National Geospatial-Intelligence Agency.

MDA will provide radar information to be used in creating ice charts and for maritime surveillance to improve the safety of maritime navigation.

MDA Receives Amendments To Design Phase Of Canada’s RADARSAT Constellation Mission

(23 September 2011) MacDonald, Dettwiler and Associates Ltd., a provider of essential information solutions, announced today that it has received two contract amendments from the Canadian Space Agency totaling CA$ 9.18 million, for the Design Phase of the RADARSAT Constellation Mission.

The amendments continue the long lead procurement of parts and equipment needed for the Build Phase.

MDA Continues To Provide Critical Information To Monitor Surface Deformation In Mines

(22 September 2011) MacDonald, Dettwiler and Associates Ltd., a provider of essential information solutions, announced today that it has signed contracts in excess of CA$990,000 with Luossavaara-Kiirunavaara AB (LKAB) and a leading potash producer to provide critical information from RADARSAT-2 for monitoring surface deformation at and in the vicinity of their mines.

The contract with LKAB extends an existing agreement for a further three years.

MDA will apply advanced RADARSAT-2 technology to detect subtle surface changes over a broad area, independent of weather. Early detection of changes, such as subsidence, allows remedial actions to happen before problems escalate and impact the environmental safety or production activities of the mine. MDA will also provide analytical reports that compare the historical to the current situation. These reports are valuable support documents required by regulatory organizations.

About MDA

MDA provides advanced information solutions that capture and process vast amounts of data, produce essential information, and improve the decision making and operational performance of business and government organizations worldwide.

Focused on markets and customers with strong repeat business potential, MDA delivers a broad spectrum of information solutions, ranging from complex operational systems, to tailored information services, to electronic information products.

(source: MDA)

Satellites are helping to forecast the location of urban areas most affected during heat waves, helping planners to design cooler, more comfortable cities.

The temperature in densely urbanised areas can be several degrees higher than in nearby rural areas – a phenomenon known as the ‘urban heat island’ effect.

These ‘heat islands’ are particularly noticeable at night. During the day, cities accumulate solar radiation and release the energy after the Sun sets.

The negative effects of this increase in urban temperatures are multiple: health problems, higher energy demand, air pollution and water shortages.

At their final review, held at ESA’s ESRIN site in Frascati, Italy, the Urban Heat Islands and Urban Thermography team presented its findings on how remote sensing allows the continuous monitoring of thermal radiation emitted by urban surfaces.

Monitoring thermal radiation can help city planners to design more ‘liveable’ cities, assist civil protection authorities in taking adequate measures during heat waves and create maps of energy efficiency.

The project analysed trends in heat distribution over 10 European cities – Athens, Bari, Brussels, Budapest, Lisbon, London, Madrid, Paris, Seville and Thessaloniki – over the last 10 years, using multiple sensors.

Satellite sensors played a large role in collecting data, providing thermal-infrared measurements that scientists then used to improve urban climate and weather prediction models that can forecast heat waves. Two airborne campaigns and multiple ground sensors also contributed.

Forecast for Thessaloniki
In mid-August 2010, two strong heat islands were correctly forecast in Thessaloniki, Greece, a day in advance. These heat islands lead to elevated temperatures and reduced thermal comfort.

At night, the most vulnerable areas of the city retained high temperatures, above 31°C.

According to the forecast, there was a small area within the urban centre where both temperature and risk were expected to be low – corresponding to an open space with abundant vegetation cover.

Parks cool areas of Madrid
Thermal radiation maps of Madrid compared to soil surface maps showed that the night air temperature in parks or areas of vegetation is significantly cooler than other areas. This demonstrates the important role that green areas play in the overall thermal radiation and circulation of urban areas.

The new urban climate models show a muggy outlook for the future. During the summer of 2003, large portions of Europe were struck by a major heat wave. Paris was affected severely as the urban heat island effect prevented the city from cooling during the night, leading to thousands of heat-related deaths.

Modelling results suggest that, in the future, heat waves of this intensity and duration might occur every 3–4 years.

Source

(23 September 2011) ESA’s Envisat observation satellite yesterday completed its 50 000th circuit of Earth – travelling 2.25 billion km since its launch nearly a decade ago.

Envisat orbits our planet every 100 minutes, speeding along at more than seven kilometres per second.
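Those figures are consistent with a circular orbit at roughly 770 km altitude (the altitude here is an assumed round number, not from the article); for a circular orbit, Kepler’s third law gives the period and speed directly:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371e3      # mean Earth radius, m

def circular_orbit(altitude_m):
    """Period (s) and speed (m/s) of a circular orbit at a given altitude."""
    r = R_EARTH + altitude_m
    period = 2 * math.pi * math.sqrt(r**3 / MU)
    speed = math.sqrt(MU / r)
    return period, speed

period_s, speed_ms = circular_orbit(770e3)
print(period_s / 60, speed_ms / 1000)  # ~100 minutes, ~7.5 km/s
```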

Launched on 28 February 2002, the lorry-sized Envisat is the largest Earth observation satellite ever built.

It is also the world’s most complex environmental satellite, with ten different instruments studying Earth’s land, oceans and atmosphere.

They include the Medium Resolution Imaging Spectrometer, which captures images of ocean colour and land cover, like the above scene of New Zealand.

The Advanced Synthetic Aperture Radar can be used day or night because it can see through clouds and darkness. This is particularly useful over polar regions, which are prone to long periods of bad weather and extended darkness.

Trace gases in the troposphere and stratosphere are measured globally by the Sciamachy imaging spectrometer, yielding maps of air pollution such as carbon dioxide, methane and nitrogen dioxide.

Nitrogen dioxide – a mainly man-made gas – can cause lung damage and respiratory problems.

It also plays an important role in atmospheric chemistry because it leads to the production of ozone in the troposphere, the lowest part of the atmosphere.

Envisat’s Michelson Interferometer for Passive Atmospheric Sounding monitors gases and pollutants. As one of three atmospheric instruments, it can map the levels of more than 20 trace gases, including ozone, as well as the pollutants that attack ozone.

The sensor is particularly useful for tracking the ozone hole over the Antarctic.

Other Envisat instruments include the Radar Altimeter, which measures surface height to an accuracy of a few centimetres, and the Advanced Along-Track Scanning Radiometer, which records global ground and sea-surface temperatures.

(source: ESA)

The European Environment Agency is responsible for developing, implementing and evaluating environmental policy for 32 member countries across Europe. As part of their mandate, they look closely at adapting environmental governance due to our rapidly changing world. V1 editor Matt Ball spoke with Jacqueline McGlade, executive director of this agency, at the recent Esri International User Conference, where she gave a keynote about her agency’s lead role on sensors and systems, and about the growing need to mitigate impacts on the environment.

V1: Is the open data movement and urge for transparency that is sweeping across government making its way into the environmental monitoring community?

McGlade: In the last few years, we’ve really been pushing countries to come online voluntarily with information that isn’t part of their regulatory requirement, but that citizens really want to know about. Air quality is a classic one, where countries don’t have to report all the time on air quality, but they are doing it, and for cities in particular.

This idea that you do yourself less harm by reporting everything to the citizen is gradually building up. In other words, it’s better to be open and transparent about what’s really happening, than to hide it away or to create information aggregates and to smooth out all the anomalies.

V1: Often we hear that one of the barriers to data sharing is because those holding the data don’t want to expose the poor quality of their data. Is there still a fear about how the data is utilised?

McGlade: People are very aware of environmental conditions. There are things that no matter how well you try, you cannot overcome. Air quality is one of those. The Netherlands, for example, is often struggling to meet the air quality requirements, but it’s not for want of trying, it’s because of the wind-borne natural circulation of air across the troposphere that brings pollutants from the East (from Central Asia and from China). It just so happens that it gets dumped in that part of Europe.

You can have the cleanest and best policies on the ground, but if you get delivered a whole extra load of pollutants that is transported a long way from somewhere else, then it’s very difficult to cope with. You try to understand what the burden is, and what you have to assimilate, and you have to react to it. It’s not that it’s bad data, it’s telling you important things about the exposure issues related to your population. You might have to have other actions, such as urging people to stay indoors.

I think the preemptive and reactive response is much more relevant today than in the past. It is important to see all that the data shows, because otherwise you’re not credible.

V1: You spoke a good deal about mankind’s role in global change in your keynote address. Your agency’s role is very much in the impact detection business, and it continually strikes me how little understanding we have about our long-term impacts.

McGlade: I think that another role that we play, and it’s linked very much to our mission, centers around sustainability. We develop and deliver information that enables government and citizens to have a strategy that is sustainable. We are strong advocates of the precautionary principle, and have often invoked this principle, maybe sometimes against the wishes of industry and governments, but unfortunately in most cases we have proven to be right over time. The power of the precautionary principle is that it makes people think.

There is a huge responsibility for an agency like the one that I run, where you have people trusting you, and they trust us because we are very transparent, because we challenge countries, and we try as best as we can to validate data. We do it on the basis of no surprises. If we find fault with data and reports, we always go back and ask for these to be rechecked and verified.

The public looks to us, as does the legislature, to really put together the evidence that will help make good policy in the future and also to tell them which policies are not working. Some policies become almost irrelevant, and become superseded by other policy that is happening such as with transport, agriculture and climate change.

We have a lookout role on global trends, we do a lot of scenario and futures work, and we do this preemptive work out of precaution. We tend to see that there are absolute gaps in our knowledge, and we have flagged those, but we look to the research community to fill those gaps.

V1: The movement into sensor development seems to be a bold move that requires investment. Is cost justification a part of the outreach effort?

McGlade: I would say that our information technology is very much at the cutting edge, and we try to encourage countries to really come with us, because we are seeking solutions that reduce costs. We have moved to the use and development of cheap sensors and to the cloud to reduce the operating cost of services. We are always looking out for the countries to make sure they can do the job in a cost-effective way.

When I first wanted to push sensors, I found, as you might expect, that the American continent had the bulk of sensor developers. We really didn’t have so many in the European setting. What I can see now is a very interesting phenomenon where there is fantastic sensor development, but it hasn’t connected to the user. There are really smart approaches, with nanoscale sizes, but in many cases they haven’t found who is going to use them. On the other hand, there are many people who haven’t even thought of using a sensor, because they didn’t think it was possible.

While we can find one or two sensors that are fit for purpose, we need to think ourselves about shaping that whole industry. We have started doing that with contracts out for sensor development to ruggedize sensors, to prove the telecommunications package, and battery life, etc. I used to work a lot in the marine sector on instruments, and I think we can learn a lot from that technology development in many areas.

Sensors are the future. They are a way in which citizens, if properly priced, can participate. I’m after adding to what countries can do by having citizens bring in data and see it being used. It’s not just crowdsourcing, but about professionalizing the ways in which citizens can participate.

V1: Is the volume of data, and making sense of the data, a concern? How do we unlock all the data that has been collected, and that continually expands?

McGlade: Our next big step is to work with partners on data tagging. If you have your own data, you will be assured that your intellectual property is attached to it, like a fingerprint. You’ll also be able to search for your data and see who is using it.

As a corollary, having approached some national institutes of science, I’m working toward creating a new career structure in academic research for people who collect data. They don’t necessarily write scientific papers, but they fill an incredibly important role by gathering information about the environment, whether it is monitoring data or one-off observations or collections.

I think we can unleash an enormous amount of knowledge that is currently sitting in shoe boxes and computers where there isn’t a home for it. I’m hoping to create a sort of marketplace for environmental information for sharing data about a place, and where you get credit if you collected the data.

V1: You’ve recently accepted an appointment to the Board of Directors of the Open Geospatial Consortium, and your discussion of the assurance of intellectual property ties into their work as well as their work on sensor standards. How important are standards for your vision?

McGlade: Sensor Web Enablement (SWE) is very important, and we need to always ensure that what we do with geospatial technology is SWE consistent. We need to bring data from sensors and automatically put it together with mapping, which requires these standards. What we want to see are very simple standards, and some of the OGC standards are very complicated. If I had one plea to that community, it would be to simplify standards; otherwise people won’t adopt and deploy them and put them into practice.

You don’t want a standard to be so generic that it loses relevance, but maybe we should be very careful about becoming too thematic and dogmatic. I do think standards are very important for data exchange and interoperability, but we need to use them very carefully; otherwise we become obsessed with the standard as opposed to why we collect the data in the first place.

V1: On the mitigation side, is your organization involved much in issues of resilience, and bringing environments back to good health?

McGlade: We’re very involved. We’re writing the methods book for the United Nations on experimental accounts for ecosystems and ecosystem services, which was just approved in New York in June. This is very important because the system of national accounts—where we get our figures for GDP, employment figures, and so on—will now be linked to a system of environmental accounting. We are a very strong part of validating those figures. Eurostat partners and other statistical bodies have asked our agency to write the second part, for experimental accounts for ecosystem services.

Through linking to the system of national accounts, you cannot so much put a value on ecosystem services, but you can put a cost on restoration. If you have a cost of restoration, it helps you identify, in a dynamical sense, when and how interventions can cause the loss of resilience. So there’s a combination of how the ecosystem functions (the services), and of how far, through cost-benefit analysis and the cost of restoration, you can let it go before you cross the threshold. That is the identification we will be looking for as part of the accounting systems.

It’s a very fast-track piece of work that has to be completed by the Spring of 2013. We have a very strong sense that it will be used by a lot of countries, and I’m very hopeful that this will give us another layer of understanding on how the world works.

It is spatially explicit, and it is linked to the system of accounts that are spatial and to social and demographic mapping. It becomes a new accounting tool for both ecosystems and the environment.

V1: We’re very interested in the interface with the built world and the natural world, and have focused a good deal of our coverage on cities as it is a place where we can be most efficient. Is there a strong urban focus at your agency?

McGlade: This idea of resilience is very important for urban settings, because there is clearly a critical mass when it comes to being effective around resources, facilities and services. There is also a sense in which the fragmentation patterns are equally important, such as urban spread and sprawl, which has its own spatial pattern. We’re looking at how that fragmentation process, which causes a loss of resilience, can be averted by anticipating how urban sprawl could occur and what its implications are. We are working on understanding how spreading sprawl in certain areas affects particular ecosystems, and putting together the right policies and decision making to avoid that.

It also leads to the idea that within the urban setting there is a certain resilience, such as how many people you pack into a certain area, how much green space you provide, etc. I think that is an as yet untapped area, and it is why I believe that the three-dimensional tools are fascinating. They will enable architects to really play around with concepts such as maximum density, and also what that means for the resilience of facilities, family structures, and so on. I can see that we need to do a lot more both on urban patterns of living and on how we intersperse and create different living environments within the urban setting.

V1: I live in a redevelopment community, and I’m constantly amazed at the capacity for natural resilience if we return a balance. For instance, daylighting a stream in my community has returned an abundance of nature to an area that was once covered by concrete.

McGlade: We started a wonderful project to return bees to the tops of buildings, and it’s a social project that started with a young lad who wanted to help homeless and disenfranchised people. He worked with a hotel out by the Copenhagen airport to put the first hives on their roof. He trains disenfranchised people to look after bees. We made a small honey factory, and the honey from the beehives is harvested and sold. We harvested 60 litres of honey in one day.

What’s fascinating when you make cities think about bees is that they create these rivers of nectar. The idea is that connected parks allow bees to move from one park to another. It’s really triggering thinking about what parks can do to support different pollinators. As you know, in many parts of the world there is artificial pollination, where bees are delivered and then moved on. What happens if you have a healthy bee population in a city is that the productivity in the gardens goes up by as much as 30 percent, because there is far more pollination success. People’s gardens become much more productive if there is a healthy bee and pollinator population.

It makes the whole city more resilient, because you have this reinforcement of natural processes going on. It doesn’t matter that it’s urban. The other thing that people don’t understand about bees is that they have a way to detect and reject bees that are contaminated or polluted, though they do allow bees that are only slightly contaminated. During the regurgitation process the honey gets processed and stored by additional bees, so that the honey that comes out is absolutely pure.

The one thing they can’t deal well with though is pesticide. That’s why the honey in the countryside is often so much more contaminated than in the cities, because pesticides aren’t used in city parks. The honey from bees in cities is really clean.

We had a great breakthrough at a restaurant in Copenhagen called Noma that is supposed to be the world’s best restaurant. The chef there has led this whole movement on foraging for food. They are going to sell our honey, because they see it as the purest and best honey that they can have. What’s really nice is that there are five or six socially disadvantaged people who have a proper living harvesting our honey. It’s a great story, and I think it links to how cities can take on another capacity.

V1: It’s really interesting how our cities are now looking more closely at livability, walkability and quality of life issues in the planning process.

McGlade: Working together on projects like the bees has a real transformative place in our lives, especially for people who live on their own, which they often do in cities. Most of the successes that we’ve seen, particularly in Scandinavia, are in places where people can come together to grow things in communal gardens. Even in very northern climates, where it’s very difficult to grow things, having greenhouses annexed to buildings where people can grow tomatoes together has a completely positive effect on people who live in those flats. Cities can be pretty good places to live, not always, but they can be.

It’s primarily about how people are brought together, and creating the infrastructure to bring people together is more the norm in some countries. In Finland, for instance, they create these common areas. I think it’s learning by experience, which is why our agency sets a lot of store by success stories, because you can look at those and understand how easy it is to do these things yourself. It’s about telling a story so that others can draw inspiration, in the hope that it gets replicated.

V1: With all these little positive steps, it’s also important to think with the seriousness that your talk started out with in terms of the impacts of climate change. You related that we’re in a dire situation, but that we can turn it around.

McGlade: We are in deep trouble, and we have to pay attention, even though we’re in the midst of a financial crisis. You only have to go to Greenland once to understand, and I go there a lot because we have an accord to support the government. Every time I go, our guide, who goes out onto the ice every day, relates the speed at which the ice is melting. We have to adapt; we can still mitigate, and we must mitigate. We’re going to have to learn a lot about how to live on a planet where things are very different.

Copenhagen, for example, was pretty much wiped out just a few weeks ago because more rain than ever fell in a very short span of time. The agency was flooded, and the whole city was flooded. We were within just a couple of inches of the high-voltage lines being flooded, which would have meant a very big explosion. These things are happening all the time now, all over Europe and the world.

In the end, it is about helping people know what they rely on. If I were a business, I would want to know where my closest electric substation is, and whether it is protected from flooding. In return, I’d hope to know that if I located my business in a safe area and took care of my infrastructure, my insurance premium would be different. I think that’s where the business sector needs to be more upbeat and paying attention, to manage these impacts proactively. That’s why my agency is reaching out more to the small business community, letting them know that it is their duty of care to be informed. We will help you, we’ll make the data publicly available and transparent, and we’ll tell you about the infrastructure, but then you need to act on it in a way that shows you’ve thought about what climate change will mean.

Written by Matt Ball Monday, 15 August 2011