
Data quality is a problem we need to address if we in the geospatial industry expect to be a part of the enterprise IT picture. Our most pressing need is a simple, reliable way to answer: “Are these data fit for this purpose?” each time spatial data are merged or shared in an enterprise system.

Here’s the problem. In the past, users captured individual spatial data sets for specific and often independent uses. Today, the spatial data used in enterprise systems flow in from many sources. Often the origins and capabilities of the data are unknown. Making the problem tougher, data have to be integrated and used quickly while they are still relevant.
The data quality problem is multi-faceted. To understand the issues, it helps to categorize the problem into general domains. Here’s a rough cut at a few potentially useful categories.
* Geometric domain – topology, proximity, directionality, alignment, coordinates, including the method of collection (GPS or non-GPS). Incompatible topology models, for example, can induce significant errors as data are merged.
* Data domain – geocode/addressing, vector, raster, elevations. Each data type has its own unique characteristics and capabilities.
* Application domain – topography, cartography, transportation, utilities, localized eCommerce. Each application has a unique set of data quality requirements.
* Data management domain – database management; extract, transform, load; data merge, search and Web service. Data quality problems may be resolved or exacerbated within each data management function. Careful design of data management workflows can minimize problems.
* Temporal domain – The time element can make spatial data much more useful. With thoughtful design, spatial and temporal data together offer a wide range of powerful capabilities. But spatiotemporal data carry significant complexity, making data management and access tricky.
* Political/cultural domain – Often political and cultural issues are the most difficult to resolve. If the people involved in a system are unwilling or unable to share data, data quality suffers. Changing this kind of problem requires a deep understanding of organizational dynamics and information behaviors.
* Economic/financial domain – Capturing and managing spatial data are generally expensive and labor-intensive. There has to be some kind of economic system that allows the people doing the work to be compensated. However, current intellectual property (IP) models tend to create problems if data with different IP constraints are merged.
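To make the domain checks above concrete, here is a minimal sketch of a rules-based fit-for-purpose test of the kind a data management workflow might run before merging records from different sources. All field names, rules and thresholds below are hypothetical illustrations, not any vendor's actual rule set:

```python
# Minimal sketch of a rules-based "fit for purpose" check, run before
# spatial records from different sources are merged. Field names and
# rules are hypothetical illustrations.
RULES = [
    ("has coordinates", lambda r: r.get("lon") is not None and r.get("lat") is not None),
    ("lon/lat in range", lambda r: -180 <= r["lon"] <= 180 and -90 <= r["lat"] <= 90),
    ("capture date known", lambda r: r.get("captured") is not None),       # temporal domain
    ("license allows merge", lambda r: r.get("license") in {"open", "internal"}),  # IP constraints
]

def fit_for_purpose(record):
    """Return the list of rule names the record fails (empty list = fit)."""
    failures = []
    for name, check in RULES:
        try:
            ok = check(record)
        except (KeyError, TypeError):
            ok = False  # a rule that cannot even be evaluated counts as a failure
        if not ok:
            failures.append(name)
    return failures

good = {"lon": 1.44, "lat": 43.6, "captured": "2007-04-01", "license": "open"}
bad = {"lon": 200.0, "lat": 43.6, "license": "proprietary"}
print(fit_for_purpose(good))  # []
print(fit_for_purpose(bad))
```

The value of this pattern is that the rule set, not the code, carries the definition of "fit": each application domain can supply its own rules while the merge workflow stays the same.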
There are undoubtedly many other useful categorizations and examples. The point is that each domain has a set of data quality issues that need to be addressed. The logical place to address those issues is within standards bodies and industry groups. OGC is just starting to work on spatial data quality and will likely become a focal point. The Infrastructure for SPatial InfoRmation in Europe (INSPIRE) outlines specific requirements for data that are to be aggregated into central EU systems.
While the spatial data quality problem still needs a lot of work, we are seeing some progress in two areas: geocoding/addressing and vector data. The geocoding area is relatively mature because large enterprises have long needed to optimize mass marketing, billing and other mail-related functions. A number of established vendors address enterprise requirements for converting addresses to explicit locations. Examples: SAS Institute/Dataflux, Pitney Bowes/Group 1, Trillium Software, QAS, SRC and Cquay. These companies generally compete on the basis of their capabilities and processing throughput for address standardization and geocoding.
One new startup, Proxix, is addressing the need for high-precision geocoding by collecting and using parcel geometry and data for the US. Proxix also provides a capability for selecting the best data source for each geocode based on a user-defined set of rules. A long-established company, DMTI Spatial, now offers “Location Hub,” a product that broadly simplifies spatial data quality and management tasks.
Companies like Proxix and DMTI Spatial are addressing needs for high-precision location intelligence and spatial data quality. These new requirements will drive additional innovation from both established vendors and startups.
With a few notable exceptions, spatial data quality in the vector domain is less mature. The basic reason is that, historically, vector data were generally gathered for a specific purpose by a user. That user managed editing and error correction until the data were fit for their particular purpose. In spite of lengthy discussions and arguments about data sharing, users didn’t have much real incentive to design or manage vector data for external uses.
Today, that is starting to change. Concepts like master spatial data management and location hubs are being implemented within enterprise systems. Spatial information infrastructures like Ordnance Survey’s Master Map are well established. Whether you call it master data management, location hubs, or spatial information infrastructure, use of vector data across different applications is increasing.
One company, 1Spatial (formerly Laser Scan), has been automating vector data quality management for years. Its main product, Radius Studio, offers automated, rules-based data quality management tools for high-volume processes. Radius Studio also manages the integration of vector data from multiple sources. As enterprises increase their cross-process use of vector data, we will see companies like 1Spatial gain traction and spawn a new wave of innovation.
But our industry is a long way from simple, cheap, standard and versatile data quality solutions that address the many spatial data quality problem domains. Looking forward about 12 to 18 months, expect to see enterprise users focus on spatial data quality. Companies that can address these issues with innovative solutions (rather than cleverly re-packaging existing stuff) will do well. Also, expect to see users demand standardized interfaces and services. OGC and INSPIRE have important roles to play.
To summarize, enterprise users require effective, standard, predictable data quality. Those users increasingly want to use spatial data within their information systems. This situation creates a demand for broadly effective spatial data quality management – a demand that our industry has yet to address. But, there’s a pony in there somewhere. We need to find it. Soon.
(Source DirectionsMag)

DMC International Imaging plans to launch a high-resolution optical imaging satellite named UK DMC-2 in 2008 to provide continuous continent-level imaging with direct downlink of data to customers’ ground stations.

At just 120 kg, UK DMC-2 will be a very capable, low-cost Earth Observation (EO) satellite, carrying a higher-resolution (22 metre) multi-spectral DMC imager with an ultra-wide 660km imaging swath.
The enhanced micro-satellite will be able to image continuously while broadcasting data in real-time to licensed ground stations.
UK DMC-2 will be launched into the existing international Disaster Monitoring Constellation (DMC) with Deimos-1, built by Surrey Satellite Technology Ltd (SSTL) for Deimos Imaging SL, Spain.
The two new-generation DMC satellites will be coordinated with the five existing DMC satellites to provide data continuity and a new level of imaging output.
The rapid revisit capability of the constellation is especially valuable for remote sensing applications that monitor fast-changing phenomena such as fires, floods and crops. It also covers large areas quickly, including the Amazon Basin, Australia and Europe, providing multi-season coverage of major agricultural regions.
David Hodgson, Managing Director DMCii, commented: “The next generation of DMC will serve our existing and new customers with higher resolution imagery and allow high frequency continent-wide environmental monitoring. The commercial success of the DMC demonstrates the advantage of the small satellite constellation approach to data supply and is the result of DMCii tailoring services to the individual needs of customers.”
DMCii works with the Members of the DMC to coordinate international commercial imaging campaigns, and coordinates rapid disaster response through the International Charter: Space and Major Disasters.
DMCii’s extensive calibration and process development programme generates high quality products to meet the demanding requirements of many remote sensing applications from precision agriculture to illegal logging.
DMC International Imaging Ltd. (DMCii) is a UK company that supplies satellite imagery products and services to a wide range of international customers. DMCii supplies programmed and archived optical satellite imagery from the multi-satellite Disaster Monitoring Constellation (DMC).
DMCii works with customers to tailor a supply service to meet specific customer and application needs. DMC images are used in a wide variety of commercial and government applications including agriculture, forestry and environmental mapping.
DMC imagery is used extensively by organisations such as the European Commission, Brazilian Space Agency, United Nations, and the US Geological Survey.
The small satellites of the DMC provide daily revisit combined with an unmatched 660km imaging swath width for frequent broad area coverage. Multispectral image products feature a pixel ground sample distance (GSD) of 32 metres today and 22 metres from 2008. DMCii’s panchromatic image products feature a very high-resolution 4-metre pixel GSD. All DMC images are calibrated and processed to a variety of product levels according to customer requirements.
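For a rough sense of what those figures imply, the quoted swath and GSD values translate directly into pixel counts; the ground speed used in the area-rate estimate below is an assumed typical LEO value, not a quoted DMC figure:

```python
# Back-of-the-envelope figures from the quoted 660 km swath and pixel GSDs.
swath_m = 660_000            # imaging swath width, metres
for gsd_m in (32, 22):       # multispectral GSD today vs. from 2008
    pixels_across = swath_m // gsd_m
    print(f"{gsd_m} m GSD -> ~{pixels_across:,} pixels across the swath")

# Area imaged per second of along-track motion, assuming a typical LEO
# ground speed of ~7 km/s (an assumption, not a quoted DMC figure):
ground_speed_m_s = 7_000
print(f"~{swath_m * ground_speed_m_s / 1e6:,.0f} km^2 imaged per second")
```

The wide swath is what makes the daily-revisit, broad-area coverage claim plausible: each second of imaging covers thousands of square kilometres.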
DMCii provides imagery from satellites it owns and operates, UK-DMC and UK-DMC2 (2008) and also coordinates the commercial activity of the DMC satellite constellation.
The DMC is independently owned and operated by a cooperating consortium of organisations representing member nations:
+ Centre National des Techniques Spatiales, Algeria
+ Beijing Landview Mapping Information Technology Ltd, China
+ National Space Research and Development Agency, Nigeria
+ TÜBİTAK BİLTEN, Turkey
+ Surrey Satellite Technology Ltd, UK
+ Deimos Space S.L., Spain
The DMC Consortium members work together through DMCii for commercial sales and for collaborative Earth Observation campaigns.
DMCii coordinates the DMC for humanitarian use in the event of major international disasters. Working in partnership with the British National Space Centre (BNSC) and the DMC Consortium members, DMCii provides services and imagery to the International Charter: “Space and Major Disasters”.
DMCii provides 24-hour emergency on-call officer services and, in the event of a major disaster, tasks the global fleet of satellites made available by the world’s space agencies. DMCii uses its expertise to select appropriate satellites for individual disaster circumstances.
DMC imagery is supplied to organisations such as the United Nations and the US Geological Survey during disasters such as tsunamis, forest fires and flooding.
DMCii was formed in October 2004 and is a wholly-owned subsidiary of the world leader in small satellite technology, Surrey Satellite Technology Ltd. SSTL designed and built the DMC with the support of the BNSC and in conjunction with the DMC member nations Algeria, China, Nigeria, Turkey and the UK.
(Source DMCii and Spacedaily)

China’s director of the State Oceanic Administration, Sun Zhihui, has confirmed that the country plans to build five more oceanic satellites, including ocean color remote sensing satellites, ocean dynamic environment satellites and ocean surveillance satellites.

(April 2007) China launched the ocean color remote sensing satellite Haiyang-1B from the Taiyuan Satellite Launching Center.
The Haiyang-2 ocean dynamic environment satellite is planned for 2009.
Oceanic research relies heavily on satellite ocean remote sensing technology for monitoring the maritime environment, supporting disaster relief and other research.
According to “China’s Space Activities in 2006,” a white paper issued by the Information Office of the State Council, the country will begin developing a high-resolution Earth observation system, oceanic satellites, Earth resources satellites, and small satellites for environmental protection over the next five years.
“Although China is one of only five countries in the world able to independently launch ocean color remote sensing satellites, we still lag behind developed countries in this field,” said Sun.
Sun said oceanic satellites are important for the marine economy and for monitoring China’s marine rights.
He added that China should be on par with other countries in terms of satellite launching and observation technologies by 2015.
Josephine Roque

The Open Geospatial Consortium, Inc. (OGC) has issued a Call for Participation (CFP) in the “Architecture Implementation Pilot,” a coordinated interoperability initiative of the GEOSS, FedEO and Tri-Lateral initiatives. Responses are due by 11 May 2007, and the “Pilot Kickoff Meeting” will be held 5-6 June 2007 at ESRIN, the European Space Agency’s establishment in Italy.

This CFP seeks participants in a coordinated Architecture Implementation Pilot. A Pilot is a collaborative effort that applies open standards for interoperability to achieve user objectives in an environment representative of operational use. Outcomes include best practices and interoperability arrangements suitable for an operational capability.
This CFP seeks proposals from organizations involved with Earth Observation systems to:
* Identify components with services, e.g., portals, catalogs and other services;
* Participate in confirming the interoperability of those identified services using standards and interoperability arrangements as identified in the preliminary architecture of the CFP; and,
* Participate in the collaborative development of societal benefit scenarios to guide testing and demonstrations of the identified interoperable services.
The CFP was initiated to solicit responses for the GEOSS Architecture Implementation Pilot. The Pilot aims to incorporate contributed components consistent with the GEOSS Architecture – using a solicited GEO Web Portal and a GEOSS Clearinghouse search facility – to access services through GEOSS Interoperability Arrangements in support of the GEO Societal Benefit Areas.
The Pilot benefits from the collaborative support of two OGC Interoperability Program Pilots:
* The Tri-Lateral Interoperability Pilot is a collaborative, operational test of open standards deployment, supporting collective requirements of organizations responsible for national and regional “Spatial Data Infrastructures” in Europe (INSPIRE), Canada (GeoConnections), and the U.S. Federal Geographic Data Committee (FGDC).
* The Federated Earth Observation Missions (FedEO) Pilot provides a broad international venue for operational prototyping and demonstration of Earth Observation (EO) data access harmonization, interoperability requirements and protocols as defined by the European Space Agency (ESA), together with other space agencies and other OGC members.
The Architecture Implementation Pilot CFP documents can be downloaded from the opengeospatial portal.
The OGC® is an international industry consortium of more than 335 companies, government agencies and universities participating in a consensus process to develop publicly available interface specifications. OpenGIS® Specifications support interoperable solutions that “geo-enable” the Web, wireless and location-based services, and mainstream IT. The specifications empower technology developers to make complex spatial information and services accessible and useful with all kinds of applications.
(Source opengeospatial)

The Eurisy conference “Future Challenges for Local and Regional Authorities: How can Space Technology help?” will be held on 29-30 May 2007 in Barcelona, Spain.

The aim of the Eurisy programme dedicated to Local and Regional Authorities is to facilitate the use by European regions and cities of the existing services providing solutions to some of the challenges they face (such as monitoring of the natural environment, pollution, land use, real estate, traffic management, tracking of goods and people, natural disaster management, etc).
For further information on this programme (Final Announcement and Outline Programme) and to register (Registration Form, Hotel Reservation Form, Exhibition Stand), visit the EURISY website.

(Source EURISY)

VITO News
The VEGETATION Image Processing Toolbox (“VIP-Toolbox”) is a collection of executable tools and an application programming interface (API) which has been developed to facilitate the utilization and processing of VEGETATION data. The purpose of the Toolbox is not to duplicate existing commercial packages, but to complement them with functionality dedicated to the handling of VEGETATION products.
The Toolbox is released as Open Source to give users the chance to get the most out of the Toolbox and the VEGETATION products.
(Source VITO)

VEGA Group Creates a Simulation Platform for Galileo Data

Customer Challenge

The European Union (EU) and the European Space Agency (ESA) needed to add powerful visualization capabilities to a simulation environment at the Galileo System Simulation Facility (GSSF).

Solution Achieved

In 2000, the European Union (EU) and the European Space Agency (ESA) initiated Galileo, an international project to design and deploy a global satellite navigation and positioning system specifically for the civilian applications of today and of the future. The Galileo project will include a combination of ground-based systems, a constellation of 30 satellites, and a host of support services that will provide advanced technologies for transportation, energy, personal navigation, surveying, environmental and emergency management applications.

Advancing Technology Development Using Simulations

From the beginning of the project, ESA has focused its attention on developing advanced technologies to provide a broad spectrum of tools and capabilities to support present day and future civilian applications.

Along with a full array of payload systems and ground station services, ESA has commissioned a comprehensive simulation environment that will provide sophisticated models of Galileo’s operations. These simulation models will be indispensable to companies and organizations providing consumer services that depend on accurate assessment of navigational accuracy and data integrity for the Galileo system.

This vision for a simulation environment is now being realized as the Galileo System Simulation Facility (GSSF). GSSF provides a simulation environment that reproduces the functional and performance behavior of the Galileo system and provides a robust, easy-to-use platform for companies to develop simulation applications to ensure the integrity of products and services being developed using the Galileo, GPS or EGNOS systems.

The Galileo System Simulation Facility

GSSF is being developed for ESA/ESTEC by a multinational team led by VEGA IT GmbH of Darmstadt, Germany, a wholly owned subsidiary of the British VEGA Group Plc. GSSF gives users the ability to create models that simulate Galileo functions such as navigational accuracy, data integrity, the effect of the environment on navigation performance, and ground segment performance for the Galileo, GPS and EGNOS systems.

Developed in C++ and C# (.NET) on the Windows XP platform, GSSF offers a single simulator environment that allows users to set up and configure a simulation scenario, run the simulation, analyze and visualize the data and import or export data and reports. VEGA has incorporated the advanced visualization capabilities of IDL into GSSF to provide users with additional simulation and 3D processing capabilities. IDL allows users to create powerful visualizations of the data, including models, contour plots, map plots and more.

Galileo’s GSSF version 2.0 allows the user to perform advanced modeling of data, create contour plots and much more. – Images credit Galileo Project

Performance Analysis & Ground Station Validation – GSSF version 2.0

GSSF 2.0 has been formally accepted by ESA and is now available for free download from www.gssf.eu. Currently over 300 licenses have been downloaded and many core institutions are making use of the platform to model data from Galileo, GPS and EGNOS.

The current version of GSSF (2.0) includes the Service Volume Simulation (SVS) to create models for the analysis of navigational performance and integrity over long periods of time and over large geographic areas. It also includes the Raw Data Generation (RDG) from Galileo and GPS for experimental purposes.
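The long-period, wide-area navigational performance analysis that the SVS performs ultimately rests on satellite-receiver geometry. As a hedged illustration (not GSSF's actual algorithm), the classic geometric dilution of precision (GDOP) figure can be computed from satellite positions like this:

```python
import numpy as np

def gdop(sat_positions_m, receiver_m=np.zeros(3)):
    """Geometric dilution of precision from satellite ECEF positions (metres).

    Lower GDOP means better satellite geometry and hence better achievable
    positioning accuracy at the receiver.
    """
    # Unit line-of-sight vectors from the receiver to each satellite
    los = sat_positions_m - receiver_m
    los /= np.linalg.norm(los, axis=1, keepdims=True)
    # Geometry matrix: one row [-ux, -uy, -uz, 1] per satellite
    # (three position unknowns plus the receiver clock bias)
    H = np.hstack([-los, np.ones((len(los), 1))])
    Q = np.linalg.inv(H.T @ H)  # covariance factor of the solution
    return float(np.sqrt(np.trace(Q)))

# Four illustrative satellites at MEO-like distances (positions are made up,
# not real Galileo or GPS ephemerides)
sats = np.array([
    [20e6, 5e6, 15e6],
    [-18e6, 8e6, 14e6],
    [3e6, -21e6, 13e6],
    [2e6, 19e6, 16e6],
])
print(f"GDOP: {gdop(sats):.2f}")
```

A service-volume analysis of the SVS kind would evaluate a figure like this over a grid of receiver locations and time steps, using the real constellation geometry, to map navigational performance over large areas and long periods.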

The latest version of GSSF (2.1), released to ESA at the end of 2006, adds integrity simulations and improvements to the existing version 2.0 functionality.

The Real Power – Extending GSSF to Support Custom Applications

The GSSF offers a variety of standard simulation models and analyses for many applications; however, the true advantage of the GSSF comes from its ability to be customized and extended, via the additional use of IDL as a development platform, to meet a wide variety of custom application needs.

In addition to the flexibility of the Windows XP platform, the IDL engine generating the advanced visualizations in the system offers developers the ability to build additional functionality for application-specific purposes such as user-defined analyses. Equipment providers and service organizations can use this extensibility to develop customized models, post-processing routines and 3D visualizations to help gain a better understanding of the performance and accuracy of Galileo, GPS and EGNOS, and the relative performance of their particular commercial offering.

Application developers can also embrace the underlying IDL engine as a platform from which to develop third party, commercial software applications using the navigation system data.

Customizing the GSSF Platform with VEGA and CREASO

The support of the team at CREASO GmbH, the German IDL vendor and engineering support provider, was essential. Frank Zimmermann, the GSSF Project Manager at VEGA, said, “The GSSF development did significantly benefit from the use of IDL. Due to the support provided by Bernhard Kortmann and his team, we achieved a seamless and smooth integration of IDL into the GSSF infrastructure, greatly enhancing the visualization capabilities of GSSF.”

As primary providers to the ESA/ESTEC effort, both VEGA and its IDL application development partner, CREASO GmbH, are in a unique position to be able to consult with companies and organizations looking to offer services and products using Galileo data. These services can also include the development of customized GSSF simulation applications to suit the particular needs of each company.

Benefits

IDL allowed the Galileo team to add advanced data visualization capabilities to their ground-based system

Using IDL allowed developers to increase the value and accuracy of data simulations

IDL easily integrated with the C++-based application

(Source Vega-Group)

The business divisions “Remote Sensing Technology” and “SpaceCom” have operated successfully for more than 15 years in niche markets within the space business, each with a different focus.

One of our main goals is to make our business activities and service portfolios more accessible to our customers by concentrating existing competencies within the SpaceCom Division.
From 1 January 2007, our new SpaceCom Division offers professional solutions through the business segments “Space and Security” and “Earth Observation and Science” for:
-Ground segment and control centres
-Data distribution and dissemination services
-System monitoring and control
-Integrated management of complex infrastructures
-Service centre solutions
-Professional applications
-Satellite receiving stations
Dr.-Ing. Horst Wulf will be Director of the SpaceCom Division, assisted by Mr. Ulli Leibnitz as Deputy Director, who will be responsible for the “Space and Security” business segment.
For all customers of the former “Remote Sensing Technology” Division, Dr. Peter Scheidgen will be responsible for the “Earth Observation and Science” business segment.
Mr. Oliver Harrmann will become Head of the “System Engineering” department.
Of course, we will keep you informed about all current and future developments. In case of any additional questions, please do not hesitate to contact us.
For more information please click here to get to the new SpaceCom website.
Contact Wichmann
E-Mail: spacecom@vcs.de
(Source VCS)

SciSys, a leading provider of software services, has announced a major contract from EADS Astrium GmbH to develop on-board satellite software for the three-satellite Swarm mission, a key part of the European Space Agency’s Living Planet programme.

(February 2007) The objective of Swarm is to provide the best ever survey of the geomagnetic field and its evolution over time, and so gain new insights into improving our knowledge of the Earth’s interior and climate. Resulting geomagnetic field models will improve our understanding of atmospheric processes related to climate and weather and will also have practical applications in many different areas, such as for example, ocean circulation, space weather, and radiation hazards.
The Swarm concept consists of a constellation of three satellites in three different polar orbits between 400 and 550 km altitude. High-precision and high-resolution measurements of the strength and direction of the magnetic field will be provided by each satellite. The satellites will be launched in 2009.
The SciSys on-board software will be responsible for controlling the precise orbit and attitude of each Swarm satellite as well as handling all of their communications with the ground segment. SciSys will be involved throughout all the stages of the Swarm development, from the initial software architecture and definition of the requirements, through integration of the software on-site with the processor hardware supplier in Italy, to supporting system level testing.
“We are delighted to have been chosen to provide the key software for this important programme and welcome the opportunity to continue our close relationship with the teams at EADS Astrium and ESA” said Alan Batten, the Space Business Manager at SciSys.
The SciSys team also confirmed that they have been contracted by Laben to develop the Test Application Software as part of the On-board Computer development for the Swarm mission.
About SciSys:
SciSys plc (formerly Science Systems) is a leading global developer of IT services, e-Business and advanced technology solutions, renowned for quality and efficiency. The company operates in a broad spectrum of market sectors including space, defence, public sector, communication, business services and transport. Within these markets, SciSys has been involved in significant developments in key technologies that have changed the way people do their jobs. SciSys clients are predominantly blue chip, government and quasi-government organisations. Customers include the Environment Agency, Ministry of Defence, Astrium, Rural Payments Agency, European Space Agency and the Metropolitan Police.
For more information, please contact:
Chris Lee
SciSys Ltd
Tel: +44 (0)1249 466466
Chris.Lee@scisys.co.uk
(Source SciSys)

Training course in surveillance and reconnaissance

Spot Image is holding a one-week training course in June 2007 for image analysts wishing to learn more about the contribution of the new FORMOSAT-2 and KOMPSAT-2 Earth observation satellites in the areas of surveillance and reconnaissance.

Course objectives:
By the end of the course, trainees will have gained the knowledge they need to:
• Manage the fundamental aspects of image analysis for surveillance and reconnaissance
• Analyse high- and very high-resolution imagery
• Interpret KOMPSAT-2 imagery to detect and identify objects
• Interpret FORMOSAT-2 revisit imagery to detect changes
Course requirements:
The course is aimed at people who already have a good grasp of the basic principles of Earth observation
Dates and venue:
Monday 11 to Friday 15 June 2007
Spot Image’s premises in Toulouse, France
Cost:
2,000 euros per trainee for the week
For more information and enrolment:
training@spotimage.fr
(deadline to enrol: 12 May 2007)
(Source Spotimage)