
Europe must, as a matter of priority, establish a space situational awareness system to monitor orbiting debris, solar radiation and asteroids, according to a “reinforced” space policy set out by the European Commission.

Brussels is anxious to reduce the €332 million ($475 million) annual damage done to European assets by radiation and debris collisions, and to mitigate the unquantifiable but potentially massive problems caused by complete loss of a satellite or Earth impact of an asteroid.

Other key priorities are completion of the Galileo navigation constellation and implementation of the GMES Earth monitoring programme.

Underlining its belief in the social and economic value of space – and the need to preserve Europe’s independent access to Earth orbit and beyond – the EC also sees as a priority supporting research and development to increase Europe’s technological non-dependence and to ensure that innovation benefits both citizens and non-space sectors. Communication satellites will be key to this effort.

Other priorities include working with European Union member states to identify EU-level action that can be taken in space exploration and to develop a space industrial policy with member states and the European Space Agency.

Internationally, the EC wants to continue working with the USA and Russia, and initiate dialogue with other space-faring nations, such as China. It says space should be an “integral part of the EU’s external policy, in particular to the benefit of Africa”.

EC vice-president for industry and entrepreneurship Antonio Tajani stresses: “Space activities create highly skilled jobs, innovation, new commercial opportunities and improve citizens’ well-being and security.”

By Dan Thisdell
Source

(April 8, 2011)

Privacy

Personal ‘geo data’ as sensitive as private genetic information Daily) Is the geospatial community losing the debate on privacy without realizing it?

No More German Street View? Earth blog) Have not confirmed whether this is true.

Swiss Court Restricts Google Street View Mag)

Personal ‘geo data’ as sensitive as private genetic information, expert argues Daily). Is location really as sensitive as our DNA?

CA legislator introduces state ‘Do Not Track’ bill (PC World) The bill is modeled after a federal bill introduced in February, and would regulate ‘precise geolocation information’. (You may remember that a CA state court recently ruled that a zip code is personally identifiable information.)

Tell-all telephone (Zeit online) This story about a German politician who asked for a record of all the location information his carrier had collected on him has been receiving a lot of media attention.

The risks of digital mapping (the Muse)

Mobile-app companies receive subpoenas (WSJ) Some companies had been collecting users’ location without notice and/or permission.

Spatial Data Infrastructures

Best Practices for Local Governments (GeoData Policy) Two-page report prepared by the National Geospatial Advisory Committee (U.S.).

Public Sector Mapping Agreement starts (England and Wales) The Ordnance Survey is doing some innovative work with respect to spatial data licensing agreements.

Law Enforcement/National Security

Justice Department opposes privacy reforms (cnet) The Justice Department takes the position that a warrant is not needed for ‘less precise’ location information derived from cell towers, and that the warrant requirement should apply only to GPS- or multilateration-derived location information.

Va Court Upholds Conviction Based Upon Warrantless GPS Use (NBC Washington)

Minn to initiate pilot program to track drivers’ mileage (Twin Cities Daily Pilot) Note proposed legislation to protect consumers’ data.

Smart Grid

Weather data: another smart grid opportunity? (Electricity Policy)

Crowd-sourcing

Academics Join Relief Efforts Around the World as Crisis Mappers (The Chronicle of Higher Education)

GPS

LightSquared’s Planned Wireless Internet Network Threatens GPS (Huffington Post) This issue continues to receive a good deal of attention.

Miscellaneous

NOAA CIO Tackles Big Data (Information Week)

Geospatial Information Law (hokumonline.com) Law enacted in Indonesia. Note discussion of data accuracy and dissemination.

Who Wants To Not Get Stabbed? Interesting ‘game’ combining Google Street View and crime data. Raises all sorts of possibilities.

(April 2011) Understanding changes in the global atmospheric concentration of carbon dioxide (CO2) is as much a political imperative as a scientific one. But are these demands realistic with the instruments available?

For both scientists and policy-makers, one of the biggest challenges is finding out where and how CO2 is released and absorbed by natural and human-influenced land and ocean processes.

Inventories of emissions from each country suggest that approximately two thirds of human-induced greenhouse gas (GHG) emissions arise from the production and use of fossil fuels, and one third from agriculture, forestry and other changes in how land is used. This includes around 18 per cent from deforestation.

Based on accurate measurements at ground level, we know that on average less than half of the CO2 emitted by human activities remains in the atmosphere. The rest is apparently being taken up by the world’s oceans, plants and soils.

However, there are substantial uncertainties associated with these natural components of the carbon cycle – where is carbon being taken up, how much, and will this situation last? Without understanding these natural fluxes – the difference between the amount emitted and the amount absorbed – we cannot reliably predict the future climate, nor can we establish a robust emission verification scheme. Knowing how much each country is emitting is critical if we are to reduce global GHG emissions.

The main focus of activities under the UN Framework Convention on Climate Change (UNFCCC) has been to manage GHG emissions from the production and use of energy. Emissions from agriculture, forestry and other land uses have traditionally been the poor relation, receiving less attention because of the temporary nature and complexity of the way carbon is stored in terrestrial ecosystems – a process known as ‘sequestration’. However, the international policy framework aims to increase the sequestration of CO2 by encouraging beneficial changes in land use.

Central to these efforts is the need to measure, report and verify carbon emissions and sequestration from land-use projects and policies in different countries, so we can compile national emissions inventories and comply with emission reduction market mechanisms. Currently, emissions are calculated by ‘inputs’, based on how the land is being used, rather than from measuring levels of emissions in the atmosphere.
[Figure: The current network of carbon dioxide measurements. Based on data provided by the Global Monitoring Division of the US National Oceanic and Atmospheric Administration’s Earth System Research Laboratory.]

The importance of reducing emissions from deforestation was largely ignored until recently, when the REDD (Reducing Emissions from Deforestation and forest Degradation) agenda took centre stage in the run-up to the UNFCCC meeting in Cancun in November 2010.

The debate focuses on how to value carbon stored in forests, offering financial incentives to slow down deforestation. But this depends on being able to make robust measurements to ensure countries are following the rules.

Recent analysis of satellite observations has reported that annual deforestation rates over the Amazon have fallen steadily since a peak in 2004. Since that peak the mean annual reduction is approximately 4,000km2, equating to an annual emissions reduction of 117 million tonnes of CO2.
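As a back-of-envelope check (a sketch only: the area and tonnage figures come from the paragraph above, and the per-hectare carbon density is derived from them, not independently sourced), the implied conversion factor can be made explicit:

```python
# Back-of-envelope check on the Amazon figures quoted above (illustrative only).
area_km2 = 4_000                      # mean annual reduction in deforested area
co2_tonnes = 117_000_000              # quoted annual CO2 emissions reduction

t_co2_per_km2 = co2_tonnes / area_km2   # ~29,250 t CO2 per km2
t_co2_per_ha = t_co2_per_km2 / 100      # ~292 t CO2 per hectare
t_c_per_ha = t_co2_per_ha * 12 / 44     # ~80 t carbon per hectare (CO2 -> C)
print(round(t_co2_per_ha), round(t_c_per_ha))
```

In other words, the quoted figures assume roughly 80 tonnes of biomass carbon are committed to the atmosphere for each hectare of deforestation avoided.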

Scientists and policy-makers both need to measure CO2. But how good is our current observing system? The existing network of carbon-cycle measurements, taken with flasks, continuous sensors, tall towers and aircraft flights, provides an excellent measure of how global carbon fluxes vary over time, in different places and from season to season.

However, these measurements are mostly taken over North America, Europe and the remote oceans, whereas vital, vulnerable carbon stores like the tropics and the boreal zone – the subpolar forests and tundra that cover much of Canada and Russia – are essentially unobserved.

Shifting our strategy

This means these regions remain poorly understood, and we urgently need dense and frequent observations to improve our understanding of the global carbon cycle. There is a compelling need for a radical shift in our measurement strategy, and satellite observations will be important in making that shift.

The technology needed to measure the tiny variations in atmospheric CO2 caused by differences in its uptake and release at ground level has progressed rapidly, with sensors already flying on a number of satellites. Compared to the ground-based network, these space-based observations monitor the global atmosphere frequently (though they are less precise).

Current space-borne instruments work by measuring either thermal emission or reflected sunlight from the Earth’s surface and atmosphere, in the infrared (IR) parts of the electromagnetic spectrum where atmospheric CO2 is absorbed.

Sensors measuring thermal infrared (IR) emission, such as NASA’s AIRS and TES, and the European IASI instruments, are most sensitive to changes in CO2 in the mid-to-upper troposphere, roughly 6-10km above Earth’s surface. But they are less sensitive nearer the surface, so they provide limited information about how the land is emitting and absorbing CO2.

In contrast, sensors measuring reflected sunlight at the shorter IR wavelengths are most sensitive to CO2 in the lower troposphere. Three sensors of this kind have been launched so far: the European Space Agency’s (ESA) SCIAMACHY in 2002, Japan’s GOSAT in 2009 and NASA’s OCO, also in 2009, which failed to reach orbit owing to a problem with the launch vehicle.

SCIAMACHY provided the first tantalising glimpse of the total amount of CO2 in the troposphere – known as the tropospheric column – and how this varies around the globe, but the instrument was not optimised for CO2.

GOSAT was the first successfully launched sensor dedicated to measuring CO2 in the lower troposphere. Initial data are promising and improvements are still being made. OCO-2, a near-exact copy of OCO, is scheduled for launch in 2013, and several European concepts are being considered that would provide the necessary sensitivity to CO2 near the surface. They will let us estimate carbon release and uptake over areas smaller than a continent and at weekly or monthly timescales.

The ongoing efforts to measure CO2 accurately in the lower atmosphere reflect the technical challenge of measuring minute changes in CO2. The variations of interest are of the order of a few parts per million (ppm – roughly equivalent to 1 per cent of the tropospheric column), against a background level of about 385ppm; consequently CO2 measurements from satellites need to be extremely precise.

It is these small variations in CO2 that we can use, via computer models of the exchange of gases between land and atmosphere, and of how gases move around the atmosphere, to estimate how much CO2 is being absorbed and emitted across a whole region.
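To make that inference concrete, here is a minimal sketch of the linear ‘flux inversion’ such models perform. Everything in it is synthetic: the sensitivity matrix stands in for a real atmospheric transport model, and the noise level loosely echoes the few-ppm precision discussed above.

```python
import numpy as np

# Toy linear flux inversion: y = H @ x + noise, where
#   x = unknown regional surface fluxes (n_regions, arbitrary units)
#   H = Jacobian of a transport model (ppm of column CO2 per unit flux)
#   y = observed column-CO2 anomalies (ppm) at n_obs locations
rng = np.random.default_rng(0)
n_regions, n_obs = 4, 50
H = rng.uniform(0.0, 0.1, size=(n_obs, n_regions))   # synthetic sensitivities
x_true = np.array([2.0, -1.0, 0.5, -0.3])            # synthetic fluxes
y = H @ x_true + rng.normal(0.0, 0.05, n_obs)        # ~0.05 ppm observation noise

# Bayesian (regularised) least squares with a loose zero-centred prior
prior_var, obs_var = 10.0, 0.05 ** 2
A = H.T @ H / obs_var + np.eye(n_regions) / prior_var
b = H.T @ y / obs_var
x_hat = np.linalg.solve(A, b)
print("true fluxes:     ", x_true)
print("estimated fluxes:", x_hat.round(2))
```

Real inversions work the same way in principle, but with transport models, prior flux estimates and error covariances that are vastly more elaborate.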

Whilst current and planned instruments are a major step forward for improving models of the carbon cycle, the data they provide do not yet satisfy politicians’ needs. International legislation requires us to measure annual CO2 emissions to within one tonne in order to calculate emission inventories by states, or to calculate emissions from individual emission-reduction projects.

Nevertheless, these developments do open the door to independent assessments of the impact on global emissions of regional changes in land use, such as deforestation in Southeast Asia, central Africa and South America.

To address concurrently the science and policy questions that require knowledge of carbon fluxes on a wide range of temporal and spatial scales, and with limited financial resources, we must strive both for improved ground-based networks and for satellite systems with denser and more frequent sampling over the regions we know least about.

A space-borne instrument that orbited only above the tropics, for instance, would provide unprecedented coverage of tropical land ecosystems, and would contribute significantly to verifying emissions from human activities as part of international efforts such as REDD.

Such a mission would also complement the global survey orbits adopted by many other satellite instruments, and also the ground-based measurement networks which are mostly outside the tropics. Such efforts need to be integrated with existing surface and aircraft measurements of trace gases and land-surface properties. Watch this space.

Paul Palmer, Hartmut Bösch and Andy Kerr explore the science and politics of measuring CO2 from space.
Professor Paul Palmer is a member of the School of GeoSciences at the University of Edinburgh, Dr Hartmut Bösch is a member of the Department of Physics and Astronomy at the University of Leicester, and Dr Andy Kerr is Director of the Edinburgh Centre on Climate Change. Email: paul.palmer@ed.ac.uk

Source

WAYLAND, MA—(Marketwire – April 7, 2011) – The Group on Earth Observations (GEO) has announced a Call for Participation (CFP) in the 4th phase of the GEOSS Architecture Implementation Pilot (AIP-4).

The Open Geospatial Consortium, Inc. (OGC®) provides leadership in AIP-4 and invites OGC members and other organizations to respond to the CFP. The CFP document is available at: http://earthobservations.org/geoss_call_aip.shtml.

AIP-4 will improve access to GEOSS datasets that support the “Critical Earth Observation Priorities” that have been identified by the GEO User Interface Committee. It will increase the use of these data by building on the accomplishments of prior AIP phases. AIP-4 aims to:

- Increase on-line access to “Critical Earth Observation Priorities Data Sources”;
- Ensure datasets are discoverable through the GEOSS Common Infrastructure (see the discovery sketch below); and
- Demonstrate the effectiveness of general and specialized software tools for using the data.
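As an illustration of the discoverability goal, GEOSS catalogue discovery is typically exposed through OGC Catalogue Service for the Web (CSW) interfaces. The sketch below uses the owslib Python library; the endpoint URL is a placeholder, not a real GEOSS address, and the search term is arbitrary.

```python
# Minimal catalogue-discovery sketch using an OGC CSW endpoint via owslib
# (pip install OWSLib). The URL below is hypothetical.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://catalogue.example.org/csw")  # placeholder
query = PropertyIsLike("csw:AnyText", "%drought%")
csw.getrecords2(constraints=[query], maxrecords=10)

for record in csw.records.values():
    print(record.title)
```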

Responses to this CFP are requested by 8 May 2011. Discussion and clarification of the CFP and the initiation of AIP-4 will be the topic of weekly teleconferences. Agenda and logistics for these telecons are posted at http://www.ogcnetwork.net/AIPtelecons

The Point of Contact for the AIP task is George Percivall, percivall@opengeospatial.org.

The OGC® is an international consortium of more than 420 companies, government agencies, research organizations, and universities participating in a consensus process to develop publicly available geospatial standards. OGC Standards empower technology developers to make geospatial information and services accessible and useful with any application that needs to be geospatially enabled. Visit the OGC website at http://www.opengeospatial.org.

GEO (Group on Earth Observations) is a voluntary partnership of 148 governments and international organizations, launched in response to calls for action by the 2002 World Summit on Sustainable Development and by the G8 (Group of Eight) leading industrialized countries. GEO is coordinating efforts to build a Global Earth Observation System of Systems, or GEOSS. See http://earthobservations.org/about_geo.shtml.

Contact
Steven Ramage
Executive Director, Marketing and Communications
Open Geospatial Consortium (OGC)
Bergen, Norway
sramage@opengeospatial.org
Phone: +47 9862 6865

Source Open Geospatial Consortium, Inc.

28-29 June, Izmir, Turkey


International Symposium on Environmental Protection and Planning: Geographic Information Systems (GIS) and Remote Sensing (RS) Applications (ISEPP)

Date & Venue
28-29 June 2011, Gediz University, Izmir, Turkey

ISEPP 2011 Symposium Topics
- Examples of successful GIS/RS applications in environmental protection and planning
- Pollution Prevention and Control, Waste Management
- Natural Resources, Biodiversity and Habitat
- Watershed and Water Resources
- Flooding and Erosion
- Hazardous Waste and Brownfields Redevelopment
- Special Protection Areas
- Historic and Archaeological Sites
- Land Use/Land Cover
- Sustainable Development
- Regulatory Compliance

Organizers
CEVKOR Foundation
The Turkish American Society of Civil and Environmental Engineers (TASCEE)
Turkish Environmental Protection Agency for Special Areas (EPASA)
Ege University
Dokuz Eylul University
Gediz University

ISEPP Contact
Address: CEVKOR Foundation, Murselpasa Bulvari, 1265 Sokak, No:10/10, 35230, Konak / Izmir / Turkey
Phone: +90 232 445 99 99
Fax: +90 232 445 31 31
Symposium Secretary: Dr Muavviz Ayvaz, CEVKOR Foundation
E-mail: info@cevkorconferences.com

(March 31, 2011) A new NASA-funded study has revealed widespread reductions in the greenness of Amazon forests caused by last year’s record-breaking drought.


“The greenness levels of Amazonian vegetation — a measure of its health — decreased dramatically over an area more than three and one-half times the size of Texas,” said Liang Xu, the study’s lead author from Boston University. “It did not recover to normal levels, even after the drought ended in late October 2010.”

The drought sensitivity of Amazon rainforests is a subject of intense study. Computer models predict that a changing climate, with warmer temperatures and altered rainfall patterns, could cause moisture stress that leads to rainforests being replaced by grasslands or woody savannas. This would release the carbon stored in rotting wood into the atmosphere, which could accelerate global warming. The United Nations’ Intergovernmental Panel on Climate Change has warned that similar droughts could become more frequent in the Amazon region in the future.

The comprehensive study was prepared by an international team of scientists using more than a decade’s worth of satellite data from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) and Tropical Rainfall Measuring Mission (TRMM). Analysis of these data produced detailed maps of vegetation greenness declines from the 2010 drought. The study has been accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union.

The authors first developed maps of drought-affected areas using thresholds of below-average rainfall as a guide. Next, they identified affected vegetation using two different greenness indexes as surrogates for green leaf area and physiological functioning.
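The two indexes are not named in this passage; MODIS’s standard greenness products are NDVI and EVI, so a sketch of those formulas (with synthetic reflectances) shows the kind of quantity involved:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from surface reflectances."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced Vegetation Index, using the standard MODIS coefficients."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Synthetic reflectances: a healthy pixel vs. a drought-stressed one.
print(ndvi(nir=0.45, red=0.05), evi(nir=0.45, red=0.05, blue=0.03))
print(ndvi(nir=0.30, red=0.09), evi(nir=0.30, red=0.09, blue=0.05))
```

Lower values in a drought year, relative to a multi-year baseline, are what show up as reduced greenness in maps like those described here.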

The maps show the 2010 drought reduced the greenness of approximately 965,000 square miles of vegetation in the Amazon — more than four times the area affected by the last severe drought in 2005.

“The MODIS vegetation greenness data suggest a more widespread, severe and long-lasting impact to Amazonian vegetation than what can be inferred based solely on rainfall data,” said Arindam Samanta, a co-lead author from Atmospheric and Environmental Research Inc. in Lexington, Mass.

The severity of the 2010 drought was also seen in records of water levels in rivers across the Amazon basin, including the Rio Negro, whose level reflects rainfall over the entire western Amazon.

Water levels started to fall in August 2010, reaching record low levels in late October. Water levels only began to rise with the arrival of rains later that winter.

“Last year was the driest year on record based on 109 years of Rio Negro water level data at the Manaus harbor,” said Marcos Costa, co-author from the Federal University in Vicosa, Brazil. “For comparison, the lowest level during the so-called once-in-a-century drought in 2005 was only eighth lowest.”

As anecdotal reports of a severe drought began to appear in the news media last summer, the authors started near-real time processing of massive amounts of satellite data. They used a new capability, the NASA Earth Exchange (NEX), built for the NASA Advanced Supercomputer facility at the agency’s Ames Research Center in Moffett Field, Calif. NEX is a collaborative supercomputing environment that brings together data, models and computing resources.

With NEX, the study’s authors quickly obtained a large-scale view of the impact of the drought on the Amazon forests and were able to complete the analysis by January 2011. Similar reports about the impact of the 2005 drought were published about two years after the fact.

“Timely monitoring of our planet’s vegetation with satellites is critical, and with NEX it can be done efficiently to deliver near-real time information, as this study demonstrates,” said study co-author Ramakrishna Nemani, a research scientist at Ames. An article about the NEX project appears in this week’s issue of Eos, the weekly newspaper of the American Geophysical Union.

Source

Interviews from the 2010 Geospatial Conference in Vienna with geospatial experts who attended the conference.

Fleming Europe’s Second Annual Geospatial Summit, which takes place on 1-2 June 2011 in Budapest, Hungary, will again provide an insight into the latest strategies and technological advances for using geospatial information in society.

We have asked some of our speakers for their views about the industry.

Q: What areas should be developed in order to improve interoperability?

Commodore Pat Tyrrell: Interoperability is required on a number of different levels: it is vital within a single service, it is required for joint service co-operation and, increasingly, important for coalition operations. I would suggest, however, that there is little appetite for interoperability with those nations with whom you may find yourself in conflict!

Traditionally, interoperability has been achieved by setting robust and well defined standards. The problem is that, as the data requirements get more complex, standardisation becomes more of a behemoth. Complexity and standards are not happy bedfellows: if you standardise a thread on a bolt that is one thing but to standardise a complex environment such as GIS you need to be able to “flex” the building blocks to accommodate new technologies and opportunities. The key here is to understand those building blocks and the key one is the data with which we work. Expressing the data in a common format can be extremely difficult but, if we use language such as that offered by XML, we can provide equivalences of meaning in a highly flexible and dynamic fashion.

The key to effective interoperability is to take a holistic view of intelligence rather than one which looks at different collection methodologies in isolation. To this end we require an approach that links GIS with signals intelligence, human intelligence with open source intelligence etc. Here we will need some dictionary of key terms to ensure that we are talking about the same thing.
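A toy illustration of those two ideas together – data expressed in XML plus a dictionary of equivalent terms. The tag names are entirely hypothetical; the point is only that a shared vocabulary lets differently structured records be read as the same thing.

```python
import xml.etree.ElementTree as ET

# Two producers describe the same observation with different element names.
# A small equivalence dictionary maps both onto a shared vocabulary.
EQUIVALENCES = {"lat": "latitude", "y": "latitude",
                "lon": "longitude", "x": "longitude"}

def normalise(xml_text):
    root = ET.fromstring(xml_text)
    return {EQUIVALENCES.get(child.tag, child.tag): child.text
            for child in root}

print(normalise("<obs><lat>51.5</lat><lon>-0.1</lon></obs>"))
print(normalise("<obs><y>51.5</y><x>-0.1</x></obs>"))   # same result
```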

Col John Fitzgerald: Metadata standards need to be further developed to improve geospatial intelligence interoperability across communities and nations. As more information and functionality are delivered through services across diverse interconnected networks, then standardised and relevant metadata which describes the quality of information and aids in its management is essential for effective discovery, fusion and exploitation.

Lt Col Neil Marks:
• A continued drive forwards on the great standards work to date.
• Increased Member State burden sharing on projects, especially given the current financial climate. This would also see greater and wider use of multi-national systems, such as the EU Global Monitoring for Environment and Security (GMES) project.
• Member States should look to reduce and remove release constraints on geospatial information by working through the bi-, tri- and quadrilateral agreements and arrangements, particularly given the increasing amount of ‘online’ open-source data becoming available.

Mr Pascal Legai: In the recent past, the EU Satellite Centre (EUSC) has experienced a remarkable increase in the demand for its products. This evolution is largely due to the growing number of European Common Security and Defence Policy missions and operations.

Those missions and operations are the field where interoperability plays the most significant role: the Centre’s products and services need to be fully integrated into the political planning process of the EU and Member States, and must be readily available to the commanders of missions and operations in the field.

We expect that the creation of the EEAS will be a major step forward in the direction of interoperability, and we look forward to further integrating our processes with the workflow of this important new EU organ. This calls for several concrete steps: to fully integrate the EUSC’s capabilities into CFSP/CSDP operations and the direct support of operations, especially with regard to the integration of civil and military planning capabilities at the Council General Secretariat / EEAS; to play a key role in the security dimension of the EU Global Monitoring for Environment and Security (GMES) programme in support of the European Security Strategy; and to continue to explore cooperative opportunities that could further improve EU crisis response capabilities.

Mr Francesco Pisano: UNOSAT is an international entity providing operational satellite analysis to various international and national actors in specific areas such as humanitarian aid and emergency response, damage assessment, human security, human rights and territorial development planning. As such UNOSAT does not perform any work that resembles intelligence or surveillance. Even so, we are experts in several of the same fields in which geospatial intelligence operates.

I answered this question earlier in my career, during a talk I gave in San Antonio at a large gathering of the US Geospatial Intelligence Foundation. Interoperability is too often defined as the capacity to work in inter-operable ways within the same compartment of a production chain, especially in intelligence. I see interoperability instead as a way to connect the various compartments and harvest the benefits of synergy across sectors rather than within them. With regard to the work we do, interoperability would ideally stretch to providing my team with valuable data to help the international community respond to disasters and emergencies in developing countries.

Mr John Tate: I believe that the wider adoption of open standards, in particular for data and web services, will lead to improved interoperability. It will improve the ability to share data in both service and non-service ways, reducing the number of data silos / towers of excellence (depending on your point of view). Coupled with an ‘enable and enhance’ approach to current capability and equipment, rather than wholesale replacement, this will offer cost benefits.

To read the full interview and to find out more about the upcoming 2nd Annual Geospatial Summit, visit www.flemingeurope.com or request the programme directly from usman.koroma@flemingeurope.com

About Fleming Europe

Fleming Europe conferences are events linking business with intelligence. Carefully designed to provide key strategic business information and the best networking opportunities for participants, our B2B conferences are highly interactive. Delegates from specialized industry sectors – brought together by Fleming Europe – become part of a premium community discussing the questions of the day and enjoying the value of a five-star event.

SeaWiFS – Sea-viewing Wide Field-of-view Sensor

Mary Cleave left the NASA astronaut corps in the early 1990s to make a rare jump from human spaceflight to Earth science. She was going to work on an upcoming mission to measure gradations in ocean color – something she had actually seen from low-Earth orbit with her own eyes. From space, differing densities of phytoplankton and algae and floating bits of plant life reveal themselves as so many blues and greens. For Cleave, a former environmental engineer, the attraction was simple.

“We were going to measure green slime on a global scale,” said Cleave, now retired from her varied NASA career.

That is exactly what SeaWiFS – the Sea-viewing Wide Field-of-view Sensor – did for over 13 years, until it stopped communicating with ground-based data stations and, after several months of intensive recovery efforts, was declared unrecoverable in February.

This seemingly simple measurement offers a window into the oceans’ basic ability to support life. SeaWiFS’ long, well-calibrated data record gives scientists one of the best benchmarks available to study the planet’s biological response to a changing environment.

The OrbView-2 spacecraft, which carried the SeaWiFS instrument, stopped communicating with Earth-based data stations in December 2010. After several months of attempts to revive the link, GeoEye, the company that operated the spacecraft, officially ended any further attempts at recovery.

SeaWiFS was NASA’s first “data buy” mission, in which a private company, Orbital Sciences, designed and built the instrument and spacecraft to NASA specifications and NASA agreed to purchase the data as long as it met its scientific requirements. Like all spacecraft, the one that carried SeaWiFS had several fundamental back-up systems, which allowed the spacecraft to operate well past its planned five-year mission life.

“The hope was to induce it to go into ‘Phoenix mode’ – a self-protective state that would have allowed the ground controllers to recover the spacecraft. Unfortunately they were never able to get a return signal, and without the ability to communicate with it, chances for recovery faded,” said Gene Carl Feldman, SeaWiFS project manager, based at NASA’s Goddard Space Flight Center, Greenbelt, Md.

“Unfortunately, we’ll never absolutely know what went wrong. Many people said it would never get off the ground. Some said it wouldn’t last a year. The mission was planned for five years. We got 13 years of incredible data out of this amazing little satellite.”

For centuries, oceanographers were limited in their study of the highly variable and incredibly vast ocean by what they could physically sample from the deck of a slow moving ship.

As in so many scientific fields, satellites changed that. The oceans, once thought homogenous and boring, have been revealed as far more dynamic, changing and varied from region to region and season to season. Quantifying this diversity in time and space would be impossible without long-operating satellites.

Since its launch in 1997, SeaWiFS has made outsized contributions to observing how the oceans pulse with life through changing seasons and a changing climate.

SeaWiFS was designed to measure ocean color. This seemingly narrow measurement captures the fundamental biological activity at the ocean surface, the surging and depleting life cycle of phytoplankton, the microscopic floating ocean plant life. Phytoplankton forms the base of the oceanic food web, and its abundance is a direct indicator of the seas’ ability to support life.

It also plays a central role in the oceans’ carbon uptake, a key component of the planet’s climate system, particularly as the level of carbon dioxide in the atmosphere continues to increase. SeaWiFS was used to offer real-time monitoring of red tides and other harmful algae, which can bloom in polluted waters and be deadly to fish and oysters. Ocean color also offers a window into the constantly changing interplay between the ocean’s physical and chemical processes such as temperature and nutrient levels, the atmosphere and the biological life of the seas.
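For a sense of how ‘color’ becomes a chlorophyll estimate: SeaWiFS retrievals used empirical blue-green band-ratio algorithms such as OC4. The sketch below uses the widely published OC4v4 polynomial coefficients with made-up reflectance values, and should be read as illustrative rather than as the operational processing chain.

```python
import math

# Band-ratio chlorophyll sketch in the style of the SeaWiFS OC4v4 algorithm.
# Coefficients are the published OC4v4 values; the remote-sensing reflectances
# (units of sr^-1) passed in below are synthetic.
A = (0.366, -3.067, 1.930, 0.649, -1.532)

def chlor_a(rrs443, rrs490, rrs510, rrs555):
    r = math.log10(max(rrs443, rrs490, rrs510) / rrs555)
    return 10 ** sum(a * r ** i for i, a in enumerate(A))

print(chlor_a(0.004, 0.005, 0.004, 0.003))  # mg chlorophyll-a per m^3
```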

Feldman remembers watching the first dramatic example of SeaWiFS’ ability to capture this unfold. The satellite reached orbit and started collecting data in the middle of the 1997-98 El Nino. An El Nino typically suppresses the nutrients in surface waters that are critical for phytoplankton growth, keeping the ocean surface in the equatorial Pacific relatively barren.

Then in the spring of 1998, as the El Nino began to fade and trade winds picked up, the equatorial Pacific Ocean bloomed with life, changing “from a desert to a rain forest,” in Feldman’s words, in a matter of weeks. “Thanks to SeaWiFS, we got to watch it happen,” he said. “It was absolutely amazing – a plankton bloom that literally spanned half the globe.”

Originally designed to measure only ocean plant life, modifications made to SeaWiFS before launch allowed it to make a similar kind of measurement of plant color on land. This ability to see all of the planet’s plant life with a single, well-calibrated instrument produced a first-of-its-kind snapshot of the Earth’s biosphere in 1998. Then-Vice President Al Gore was so impressed he asked for a poster of the image to hang in his office, Feldman said.

From the broadest perspective, Feldman sees this measurement as an observation of what makes Earth different from all the other celestial bodies NASA studies. “None of the other planets we have studied so far seem to have the combination of factors that result in life. We do. That SeaWiFS image of the global biosphere is the picture of the things that set us apart.”

But the mission’s lasting contribution will no doubt be its study of the life of the oceans. This record stretches over a long enough period to produce a critical record of both natural variability and the planet’s biological response to a changing climate.

Given the length and breadth of the SeaWiFS data record, it may help oceanographers test the question of just how resilient the oceans are. SeaWiFS has observed a decline in plant life productivity in some of the larger “gyres,” large-scale ocean current patterns, said Jim Yoder, a senior scientist at the Woods Hole Oceanographic Institution, Woods Hole, Mass., who also worked on ocean color at NASA in two stints in the 1980s and 1990s.

It’s the kind of observation over time a satellite can make, and it has created a debate in ocean science circles about whether this is a natural cycle or an effect of climate change.

“Everyone who looks at the data sees the decline, it’s just a question of why,” Yoder said. “This has set off a lot of interest and debate. The climate implication would be the global ocean is declining in productivity due to increasing temperatures, which suppresses nutrients so there is less phytoplankton. Is it a cycle or is it a trend? Part of SeaWiFS’ legacy may be related to trying to resolve this. You can’t do that if your global change data has gaps between years and different satellites.”

Cleave, a Space Shuttle astronaut who had been an environmental engineer with the Utah Water Research Laboratory before joining the astronaut corps, is one of the few people to have seen ocean color gradations from space with her own eyes.

“It really is amazing and beautiful,” said Cleave, who worked as SeaWiFS project manager for much of the 1990s before becoming NASA’s Associate Administrator for the Science Mission Directorate at NASA Headquarters, Washington. She retired in 2007. “You can see plants growing in the ocean, which is basically what ocean color is. Particularly around the mouths of big, dirty rivers.”

Cleave was drawn to questions about both ocean and land plants’ role in the global carbon cycle. How much carbon dioxide were the aquatic plants drawing from the atmosphere for photosynthesis? How much were land plants using? Before SeaWiFS, these fundamental questions were unanswered. With its ability to measure both land and ocean primary productivity, SeaWiFS also revealed to scientists the balance between the two as a percentage of global plant life.

“Turns out it’s about 50-50,” Cleave said. “But that’s something we didn’t know.”

NASA continues to make ocean color measurements with its MODIS (MODerate resolution Imaging Spectroradiometer) instruments. Others have followed in recognizing the importance of ocean color observations, both for understanding the global carbon cycle and for more immediate societal benefits such as monitoring harmful algae and helping commercial fisheries locate potential feeding grounds.

Since the launch of SeaWiFS, the Indian Space Research Organization has launched its Oceansat satellites and the European Space Agency launched Envisat, which measures ocean color with its MERIS (Medium-resolution Imaging Spectrometer) instrument.

Yoder said SeaWiFS has made great contributions, but its ultimate legacy remains unwritten. Scientists will likely use its data record for decades, both for new research and as a baseline to measure the biosphere’s response to climate change.

“It’s hard to pin this down, but it changed the field of biological productivity from one that focused on individual bays or estuaries. It changed the field from a local to global perspective. Until then you didn’t have that ability.”

Said Feldman, “The international scientific community certainly could not have asked for a more tenacious little spacecraft and instrument that has served us so well for the past 13-plus years. There is no question that the Earth is changing. SeaWiFS enabled us for the first time to monitor the biological consequences of that change – to see how the things we do, and how natural variability, affect the Earth’s ability to support life.”

by Patrick Lynch, Greenbelt MD (SPX) Apr 06, 2011

Source

The UK Space Agency has officially become an executive agency of the Department for Business, Innovation and Skills, with an annual budget of £240 million.

The agency will be responsible for devising and implementing British space policy and will target areas that can deliver the greatest economic benefits, scientific excellence and national security.

Priority areas include developing scientific advancements in space technologies, gaining a better understanding of our planet through Earth observation spacecraft and nurturing the next generation of space scientists and researchers.

“The UK space industry is worth an estimated £7.5 billion and is an important driver for economic growth,” said David Willetts, Minister for Universities and Science. “This is why we’ve earmarked £10 million in the Budget to start a national space technology programme and committed to reducing the regulatory burden on industry.

“The establishment of the UK Space Agency will provide a focal point for this work and bring together our very best talent. This will help us concentrate efforts on advancements in space science and satellite technology and ultimately give us a better understanding of our own planet.”

UK Space Agency Chief Executive David Williams added: “Today represents 12 months of hard work from agency staff and from colleagues at the Science and Technology Facilities Council, Technology Strategy Board, Natural Environment Research Council and other partners in industry and across Whitehall.

“Now established, we need to use the upcoming months to set the direction of travel for the UK space sector. That means concentrating on encouraging growth and engaging with industry, academia and other Government departments to make sure we’re developing in the right way.”

As part of the Budget and Growth Review, the Government also highlighted its commitment to reforming the Outer Space Act, which will introduce an upper limit on the third party liability of UK satellite operators. It will continue to work with the international regulatory authorities to enable space tourism operations in the UK and define regulations for novel space vehicles that could offer low cost access to space.

Author, Laura Hopperton
Supporting Information
http://www.bis.gov.uk/

(April 2011) Space technology and exploration could be used to help mitigate loss of life and damage caused by natural disasters, a conference in Brussels was told.

The event on Monday heard that last summer’s devastating fires in Russia, which killed thousands of people, would have been “even worse” but for the use of technology normally used in outer space.

Musa Manarov, a Russian cosmonaut, said, “The international community ought to be giving more attention to the possible application of space technology in such circumstances. It can be used not only to mitigate the consequences of earthquakes, floods and other natural disasters but, possibly, also predict these events. It has great potential”.

He was speaking by satellite from Moscow in a debate in Brussels with senior officials from the EU institutions to mark the 50th anniversary of the first manned spaceflight.

Manarov, who is now a member of the defence committee of the Russian Duma, or parliament, said the fires in Russia, caused by one of the country’s hottest heatwaves on record, were “catastrophic”.

He added, “Even so, without the system of monitoring, prediction and warning that was deployed through space technology the situation would have been much worse”.

He said applying space technology in this way “knows no boundaries or national borders”. As such, he said, the benefit of space exploration “fully justifies the enormous efforts and costs” involved.

Speaking at the same event, Thomas Brandtner, of the Council of the EU, agreed that space technology could “increasingly” be used to help predict both natural and man-made disasters, such as the recent earthquake in Japan.

“This, and last year’s Russian fires, are a good example of how space exploration can impact on everyday life back on earth and how ordinary people can benefit from the use of satellites to transmit data.”

Brandtner, head of unit for competitiveness and industry policy in the council’s general secretariat, said that Europe already had its own system for monitoring such events with the GMES (Global Monitoring for Environment and Security), the European programme for the establishment of a European capacity for earth observation.

He added, “Once GMES is fully operational we hope to gradually increase our cooperation on these issues with countries like Russia.

“But I believe that we are already on the same wavelength with Russia when it comes to the use of space technology,” said Brandtner.

Also participating in the video conference was Theo Pirard, director of the Space Information Centre in Belgium, who said, “The ability to map the earth from space has totally changed what society is capable of”.

Source
