
18 September 2017. CLS, the French satellite services provider, has announced the acquisition of SIRS (Systèmes d’Information à Référence Spatiale [Spatially Referenced Information Systems]), a French company founded in 1989 specialising in the production of geographical data from optical satellite or aerial imagery.

CLS is a subsidiary of CNES (Centre National d’Etudes Spatiales), Ardian (the independent private investment company) and IFREMER (Institut Français de Recherche pour l’Exploitation de la Mer).

Thanks to this acquisition, CLS – which already specialises in environmental monitoring – has further broadened its service offer and can now provide customers with enhanced solutions in the field of continental monitoring. Through its 25 offices and subsidiaries, CLS will consolidate its offer in Europe and develop its activity around the world with new services such as support for agriculture, regional planning, sustainable forest management, water resource management, and emergency mapping (in the event of flooding, forest fires, volcanic eruptions, etc.). With the acquisition of SIRS, CLS consolidates its position as an international provider of satellite services for the protection of the environment.

A strategic partnership

Christophe Vassal, Chairman of the CLS Management Board: “The future services will benefit from the synergy between CLS and SIRS and we expect them to be a great success. We have extremely complementary products, solutions and services. CLS already provides its customers with 30 years of experience in the field of environmental monitoring. SIRS complements CLS’s offer of oceanographic services with continental solutions and with its expertise in the field of optical satellite data.”

Jean-Paul Gachelin, Chairman and founder of SIRS: “This merger is a great opportunity for SIRS. Joining the CLS Group should boost our growth. CLS will help us develop our services abroad through its network of 25 offices and subsidiaries around the world. All the SIRS employees are pleased to be joining CLS. The Group shares our values and we have the same goal, which is to improve our understanding and management of the environment in order to make Earth a sustainable planet.”

The CLS Group had a turnover of nearly €115 million in 2016 and forecasts revenues of more than €125 million in 2017. In recent years, the group has experienced strong growth and set ambitious targets thanks to new markets.

Download the press release

24 July 2017. Sinergise has become a reseller of Airbus Defence and Space satellite imagery – Pléiades, which will soon be integrated into Sentinel Hub.

Pléiades Satellite Image – Andasol Solar Power Station, Spain © CNES 2014, distribution Airbus DS

Sinergise has partnered with Airbus Defence and Space to provide Pléiades imagery, which will be available for purchase and distributed through Sentinel Hub services. We have already started integrating the new datasets into Sentinel Hub and will keep you posted on updates.

Pléiades satellite imagery is provided by the twin satellites Pléiades 1A and Pléiades 1B, which operate as a constellation in the same orbit. They deliver very-high-resolution optical data products and offer daily revisit capability for the whole globe.

The Pléiades constellation is designed to obtain data very quickly. Its capability to acquire imagery in less than 24 hours makes it an ideal source for response to a crisis or natural disaster. You can use Pléiades products for regular monitoring, precision mapping and photointerpretation. Airbus’ satellite tasking service further shortens both acquisition time and access to new images (down to just a few hours in certain cases).

Pléiades 50-cm-resolution products are available at three processing levels, Primary, Projected and Ortho, in all spectral combinations. Continue reading for more information on spectral combinations and spectral bands.

Imagine a field where 10,000 varieties are growing in tightly, regularly spaced plots. Welcome to the daily life of plant breeders. Plant breeders observe, record and analyse key features in these fields every day to select the next generation of varieties.

Until recently, plant breeders would visit their experimental fields with pen and paper to acquire the necessary information. Needless to say, it is a huge challenge to exclude human errors and obtain all information in a consistent and objective manner.
Let me explain how mapEO can add objectivity and automation to these measurement campaigns by retrieving various kinds of information through drone imagery.

IT’S ALL ABOUT INTEGRATING IMAGE ANALYTICS INTO DAILY DECISION MAKING

Drone uptake for regular monitoring in agriculture is limited because drone flight endurance is short (10-20 minutes). But these short flights are just perfect for plant breeding plots. The collected images are so detailed that they can be used to generate metrics automatically. On top of that, these measurements provide unbiased and consistent information throughout the growing season.

In the field of plant breeding, neither the drone nor the camera is the most important aspect. It’s all about the image analytics: how they relate to existing agrometrics and how well this new source of information can be integrated into your daily decision making.

MAPEO, A DRONE IMAGE PROCESSING TOOL

We’ve been collaborating intensively with plant breeders from all over the world to develop mapEO, a drone image processing service dedicated to experimental fields.
Thanks to mapEO, you can obtain agrometrics that match field-based measurements of:

• Plant height
• Disease detection
• Plant emergence
• Biomass
• Plant cover
• …
Want to know more about mapEO and what it can do for you? Feel free to contact VITO; they’ll help you get started.

KEEPING THE MASSIVE AMOUNTS OF DATA USABLE

Although field sizes are rather limited, the resulting data is not. Counting emerging spinach plants, for example, requires millimetre pixel resolution, which for a complete field can add up to 10 GB of data.
To process, access and use all this data correctly and easily, specific infrastructure is required.
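The 10 GB figure is easy to reproduce with back-of-envelope arithmetic; in this sketch the one-hectare field size and the single 8-bit band are illustrative assumptions, not mapEO specifics:

```python
# Back-of-envelope data volume for a drone survey at millimetre resolution.
# Field size, band count and bit depth below are illustrative assumptions.

def survey_bytes(field_m2: float, gsd_mm: float,
                 bands: int = 1, bytes_per_sample: int = 1) -> float:
    """Raw image volume in bytes for a field surveyed at a given
    ground sampling distance (GSD)."""
    pixels_per_m2 = (1000.0 / gsd_mm) ** 2   # at 1 mm GSD, 1 m2 = 10^6 pixels
    return field_m2 * pixels_per_m2 * bands * bytes_per_sample

# One hectare (10,000 m2) at 1 mm GSD, single 8-bit band:
gb = survey_bytes(10_000, 1.0) / 1e9
print(f"{gb:.0f} GB")  # -> 10 GB
```

Coarsening the resolution by a factor of ten cuts the volume a hundredfold, which is why millimetre-level plant counting is what pushes the data into this range.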

mapEO helps you as a plant breeder fully adopt and integrate drone technology and image processing techniques into your daily life.

You are able to:
• derive agrometric statistics over a vegetation cycle
• analyse synergies with experiments all over the world
• derive standards that keep today’s drone flights valuable in 5 years’ time

Source

When you look at a satellite image, although you may not see it, every pixel in the image is subtly contaminated by the atmosphere.

If you want to derive quantitative information or study evolutions through a series of satellite images, it is essential that the atmospheric disturbance is removed and the object surface reflectance is retrieved before any other analysis is made. iCOR does just that, and it is based on years of experience in the field at VITO Remote Sensing.

AVAILABLE TO THE BROAD USER COMMUNITY

The atmospheric correction software, iCOR (previously known as OPERA), is now available to the broad user community through the ESA Sentinel Application Platform (SNAP) for the atmospheric correction of Sentinel-2 and Landsat-8 data.

By implementing iCOR in SNAP, researchers can test iCOR for their own study areas and experiment with the different functionalities the new tool has to offer. The iCOR SNAP plug-in can be freely downloaded.

iCOR has been extensively validated in different research projects (HIGHROC, INFORM and SPONGE) and within the ESA-NASA ACIX, Atmospheric Correction Inter-comparison Exercise (https://earth.esa.int/web/sppa/meetings-workshops/acix).
It is important to keep improving our work on atmospheric correction and expanding iCOR’s functionalities. User feedback is invaluable here, so we encourage all iCOR users to send feedback, comments, suggestions, wishes, etc.

ICOR, A SCENE-GENERIC TOOL THAT APPLIES DEDICATED CORRECTIONS

The strength of iCOR is that it is scene generic: it works just as well on land as on water targets (coastal, transitional and inland waters). The tool identifies whether a pixel is water or land and applies a dedicated correction. iCOR runs without user interaction and all the input parameters can be retrieved from the image itself.
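The per-pixel land/water dispatch can be pictured with a toy sketch; this is only an illustration of the scene-generic idea, not iCOR’s actual radiative-transfer correction (the NIR threshold and the scaling factors are made-up placeholders):

```python
import numpy as np

def correct_scene(toa: np.ndarray, nir: np.ndarray,
                  water_threshold: float = 0.05) -> np.ndarray:
    """Apply a dedicated correction per pixel: land vs water.

    toa: top-of-atmosphere reflectance band (2-D array)
    nir: near-infrared band used to flag water (water has low NIR reflectance)
    The threshold and the 'corrections' are illustrative placeholders
    standing in for the real radiative-transfer steps.
    """
    water = nir < water_threshold          # crude water mask
    surface = np.empty_like(toa)
    surface[water] = toa[water] * 0.8      # dedicated 'water' branch
    surface[~water] = toa[~water] * 0.9    # dedicated 'land' branch
    return surface
```

The point of the sketch is the branching itself: each pixel is classified first, then routed to the correction suited to its surface type, with no user interaction required.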

The extra SIMEC module, which corrects for adjacency effects, is part of iCOR, and the SNAP implementation allows you to switch it on and off. Inland waters and estuaries in particular can be severely affected by these effects, so correction is strongly recommended. The iCOR implementation in SNAP corrects Sentinel-2 and Landsat-8 satellite imagery and is easily configurable through the SNAP user interface. It works on Linux and Windows platforms alike.

WHERE IT ALL STARTED

The development of iCOR started many years ago and was originally used to correct airborne hyperspectral imagery for atmospheric effects. We soon added an extra module, SIMEC (Similarity Environment Correction), because we experienced severe contamination of the water pixels by light coming from the surrounding land and vegetation.

We extended the first features so that it would be possible to correct satellite imagery as well and made the tool fully operational without any need for user interaction. iCOR was tested and validated extensively for MERIS, Landsat-8, Sentinel-2 and PROBA-V. We use iCOR ourselves in several operational processing chains.

LET’S GET YOU STARTED

The iCOR SNAP plug-in can be freely downloaded. After download and installation you can open it as a plugin in SNAP. The manual describes in detail the installation and handling of iCOR.
iCOR was developed through funding from the Belgian Science Policy Office, the European Space Agency and the European Commission.

Source

Small differences in the content of light, far more subtle than what we can perceive as different colours, carry valuable information about the health status of vegetation.

Hyperspectral imaging reveals this by splitting light into a large number of spectral components. Remote sensing applications use this spectral data to derive accurate information about even minor changes in the state of vegetation, water or other objects. With compact hyperspectral cameras, vegetation can be monitored from multiple platforms, including small drones and satellites. VITO Remote Sensing wants to make the extraction of accurate information at high update rates more accessible, in order to support precision agriculture and environmental and vegetation monitoring.

EXPLOITING THE POTENTIAL OF HYPERSPECTRAL IMAGING AND SMALL AIRCRAFT

Small drones have made high-resolution local monitoring accessible. By equipping them with miniature hyperspectral imagers, their potential can be increased further, but this requires new technologies. A particularly well-suited technology is based on thin-film interference filters (linear variable filters, LVF) that can be deposited directly onto imaging sensors. This creates a very compact design that can be produced cost-effectively. Fundamentally different from traditional hyperspectral imagers, such imagers acquire 2-dimensional images by sensing different parts of the spectrum at different pixel locations. The technology was developed by our partner imec, the Belgian research and innovation centre in nano-electronics and digital technology, and VITO Remote Sensing has been involved in optimising it for remote sensing requirements.
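Because each sensor row sits under a different filter wavelength, a full spectrum of a ground point is assembled over successive frames as the platform moves. A toy reconstruction sketch, assuming one sensor row per band, an advance of exactly one row per frame and perfect co-registration (real processing relies on photogrammetric methods):

```python
import numpy as np

def build_cube(frames: np.ndarray) -> np.ndarray:
    """Assemble a hyperspectral cube from LVF pushbroom frames.

    frames: (n_frames, n_rows, n_cols) - each sensor row senses one band.
    Toy geometry: the platform advances exactly one row per frame and the
    frames are perfectly co-registered.
    Returns a cube of shape (n_ground_lines, n_cols, n_bands).
    """
    n_frames, n_rows, n_cols = frames.shape
    n_lines = n_frames - n_rows + 1        # ground lines seen by every row
    cube = np.zeros((n_lines, n_cols, n_rows), dtype=frames.dtype)
    for band in range(n_rows):
        # Ground line g is imaged by sensor row `band` in frame g + band.
        cube[:, :, band] = frames[band:band + n_lines, band, :]
    return cube
```

In practice the platform motion is irregular, so the per-frame geometry must be solved photogrammetrically before the bands can be stacked, but the scanning principle is the same.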

GET THE INFO OUT OF YOUR DATA

A compact sensor or camera is only part of the solution. Users also need to get the information out of the acquired data. By adding a user-friendly image processing flow you create a useful remote sensing tool. This is where VITO Remote Sensing steps in. Thanks to our knowledge and experience in image processing, image quality and remote sensing applications, we have created a unique hyperspectral imaging solution, COSI (compact hyperspectral imaging).

THE DARK HORSE AMONG MINIATURE HYPERSPECTRAL CAMERAS

Some other hyperspectral cameras have been developed based on thin-film filters, but with filters arranged in conventional mosaic patterns similar to those in consumer colour cameras. Such imagers are stuck with a fundamental compromise in which only low spectral and spatial resolutions can be obtained. By changing to the LVF design, we can use photogrammetric methods and image processing to generate a hyperspectral data product with a wider spectral range and vastly improved spectral and spatial resolution. After the development of COSI, a prototype camera using a 2 Mpixel sensor (developed by imec), a second-generation product was developed in collaboration with the German camera builder Cubert GmbH. COSI became ButterflEYE LS and has been available on the market since the end of 2016.

A GROWING FAMILY OF LVF BASED HYPERSPECTRAL IMAGING SOLUTIONS

VITO Remote Sensing has also been leading a Belgian consortium (VITO (prime), imec, Deltatec, AMOS, CMOSIS, under the authority of ESA) to develop a larger (12 Mpixel) hyperspectral imager based on the same technology. The goal is to build an image sensor chip that acquires both hyperspectral and broad-band panchromatic images and can be used on a wider range of remote sensing platforms, both airborne and spaceborne. A prototype camera will be available in mid-2017.

An important milestone for thin-film-based hyperspectral systems is the in-orbit demonstration of the HyperScout instrument on board a CubeSat to be launched in mid-2017. This system is developed by Cosine in collaboration with VITO, S&T and TU Delft. To overcome limited downlink capacity (a very pressing limitation for hyperspectral imagers on small platforms), the system contains a powerful on-board processing module that converts the raw data into derived map products. Only these need to be transmitted to Earth, which makes the system vastly more efficient.
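The downlink gain from on-board processing is straightforward arithmetic: a many-band raw cube collapses into one or a few map products. An illustrative calculation (the band counts and bit depths are assumptions, not HyperScout specifications):

```python
# Illustrative downlink-reduction arithmetic for on-board processing.
# Band counts and bit depths are assumptions, not HyperScout specs.

raw_bands, raw_bits = 45, 12          # hypothetical raw hyperspectral cube
product_layers, product_bits = 1, 8   # e.g. a single derived index map

raw_per_pixel = raw_bands * raw_bits          # bits downlinked per pixel, raw
product_per_pixel = product_layers * product_bits  # bits per pixel, processed

print(f"reduction factor: {raw_per_pixel / product_per_pixel:.1f}x")  # -> 67.5x
```

Under these assumed numbers the transmitted volume shrinks by almost two orders of magnitude, which is what makes hyperspectral sensing viable on a CubeSat downlink.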

Source

Land is an essential natural resource for humanity and all terrestrial ecosystems, but human demands exceed available resources. This leads to deforestation, drought, decrease of biodiversity and loss of wildlife. The Copernicus Global Land Service assists public and private organizations in the preservation of our ecosystems, where understanding is the first step.

COPERNICUS GLOBAL LAND COVER MAP

To tackle deforestation or loss of biodiversity, for example, organizations first have to know the physical coverage of the Earth’s surface, its use and its dynamics. The Copernicus Global Land Service has therefore extended its portfolio and released its first Global Land Cover Map to provide spatial information about land for a diversity of applications, ranging from global forest monitoring, global crop monitoring, biodiversity and nature conservation to climate modelling.

By merging remote sensing imagery with other ancillary data sources, a highly automated, accurate and cost-efficient Land Use and Land Cover (LULC) and Land Use and Cover Change (LUCC) solution generates yearly 100 m land cover maps, following the classification scheme of the FAO Land Cover Classification System (LCCS).

TUNE THE MAP TO YOUR APPLICATION

Next to this basic map, a set of continuous cover layers for tree, grass, shrub and bare soil is generated based on a novel approach. Each cover layer provides the fraction of the pixel that belongs to the given class. With this information, users can combine the layers and tune the default land cover classes for their application, thus supporting the United Nations’ Sustainable Development Goals (SDGs).
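Combining the fraction layers comes down to simple per-pixel rules; a minimal NumPy sketch of the idea (the thresholds and class names are illustrative choices a user might make, not service defaults):

```python
import numpy as np

def forest_mask(tree_fraction: np.ndarray, tree_min: float = 0.70) -> np.ndarray:
    """User-defined 'forest' class: pixels with at least 70% tree cover.
    The 70% threshold is an arbitrary example, not a service default."""
    return tree_fraction >= tree_min

def woody_mask(tree_fraction: np.ndarray, shrub_fraction: np.ndarray,
               threshold: float = 0.5) -> np.ndarray:
    """Mixed class built by summing fraction layers, e.g. 'woody vegetation':
    pixels where tree + shrub cover together exceed the chosen threshold."""
    return (tree_fraction + shrub_fraction) >= threshold
```

Because the fractions are delivered per class, each user can draw their own class boundaries rather than being locked into the default discrete legend.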

Image of Tanzania showing the land cover classification on the left, a Google Earth image in the middle and the cover fraction compilation on the right.

THE DATA BEHIND THE MAP

The map is generated and validated by IIASA, Wageningen University and VITO Remote Sensing. It is derived from PROBA-V 100 m and 300 m images and uses training data collected at 10 m resolution as well as several ancillary data layers.

The map and its cover layers were generated on the PROBA-V Mission Exploitation Platform, a scalable Hadoop/Spark platform with 1200 executors and 5 TB of memory. Such a platform can handle large amounts of data and enables us to perform multiple iterations for quality checks and product improvement, as all processing steps take only a few days.

LAND COVER CLASSIFICATION FOR AFRICA

The first map covers the African continent in 2015. An independent validation shows accuracies up to 10% higher than other global land cover maps. A qualitative comparison shows significant improvements in spatial accuracy compared to other global maps, especially in the Sahel zone, the border regions of Sudan, Ethiopia and Kenya, as well as eastern Botswana and most of Madagascar. The map is assessed against both its generic use and its use in different applications. The continuous cover fractions provide information at 100 m resolution with an accuracy of 85-95% (mean absolute error of 5-15%).

WHAT’S STILL TO COME?

The generation of the land cover maps is part of the Copernicus Service, which guarantees sustained delivery of yearly updates. The next step, planned for mid-2018, is to scale up to global coverage, deliver three yearly maps and integrate the higher spatial detail of Sentinel-2 for selected regions.

More information can be found on the Copernicus Global Land portal. Users are encouraged to provide feedback through the Geo-WIKI validation tool or through the Copernicus Global Land helpdesk.

Source

This study proposes a synergistic landslide detection framework using Sentinel-1 and Sentinel-2 data and applying several change detection algorithms.

As a case study, the landslide at the Amyntaio mine in Greece on 10 June 2017 was selected. The landslide was detected rapidly and effectively through operational and automatic processes, so rapid and reliable conclusions can be extracted for decision making and risk monitoring.

Impact

Operational, rapid and automatic landslide detection provides vital information for decision making and risk monitoring. Copernicus/ESA data can contribute here, as they are free to use and provide efficient information, high temporal frequency and wide area coverage. Multimodal approaches, such as a synergy of Sentinel-1 and Sentinel-2 processing, can lead to proper detection of landslide phenomena.

Concept

The main goal of this research is the rapid and automatic detection of large landslide phenomena using Sentinel-1 and Sentinel-2 data and applying several change detection algorithms.

Technical Details

Several change detection techniques were used to map the large landslide at the Amyntaio mine. The time series data used from the Copernicus programme (i.e. before and after the landslide) were: two SAR images (IW/SLC) from Sentinel-1 (4/6/2017 and 10/6/2017) and two optical images (MSIL2A) from Sentinel-2 (1/6/2017 and 21/6/2017). For the Sentinel-1 data, change detection maps were extracted using the magnitude and the phase layers. For the Sentinel-2 data, change detection maps were extracted through direct image processing and through machine learning using spectral information and annotated data. The landslide detection was performed via change detection techniques in ERDAS IMAGINE 2016.1 (Imagine SAR Feature tool, Imagine SAR Interferometry tool, Imagine Objective tool, and Change Detection tools).
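As an illustration of the optical branch, direct change detection often reduces to differencing a spectral index between the pre- and post-event acquisitions. A generic sketch of such index differencing (the NDVI drop threshold is illustrative; this is not the specific ERDAS IMAGINE workflow used in the study):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalised Difference Vegetation Index; small epsilon avoids /0."""
    return (nir - red) / (nir + red + 1e-9)

def change_mask(pre_nir, pre_red, post_nir, post_red, drop: float = 0.3):
    """Flag pixels whose NDVI dropped by more than `drop` after the event.

    A landslide typically strips vegetation, so a sharp NDVI decrease is a
    plausible change signal. The 0.3 threshold is illustrative; real
    workflows calibrate it against reference data.
    """
    delta = ndvi(post_nir, post_red) - ndvi(pre_nir, pre_red)
    return delta < -drop
```

The SAR branch works analogously on magnitude and phase layers instead of spectral indices, which is what makes the two sensors complementary.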


Time series data used from Sentinel-1 and Sentinel-2


Change detection strategy and results using Sentinel 1 and Sentinel 2 data

Contact Info
o Contact person: Maltezos Evangelos, Betty Charalampopoulou
o E-mail: mail@geosystems-hellas.gr

Extreme weather events are causing significant shifts in the productivity of agricultural activities, posing a danger to food security and yield. Improving weather forecasts is essential for dealing with the effects of climate change and saving agricultural investments.

In response to these threats, the UK Space Agency and a consortium of UK companies led by RHEA Group are working in partnership with the Ugandan Ministry of Water and Environment and the Uganda National Meteorological Authority (UNMA) to develop and implement a solution that can anticipate and proactively respond to these weather irregularities.

The Drought and Flood Mitigation Service (DFMS) uses localised weather forecasts combined with the latest satellite and ground-based data and improved drought and flood models. The application delivers high-quality, timely geo-information to its users, enabling them to respond efficiently to both the possible negative and positive effects of forecasts on agriculture and livestock production.

The Technology Behind DFMS

DFMS uses the Environment Early Warning Platform (EWP) to communicate with local farmers on actions to be taken throughout the growing season to maximise crop yields and protect their livestock. The EWP will assimilate a range of diverse data sources, from satellite and meteorological data to community/mobile sources obtained in the field. The system will use cloud technologies for flexible deployment and processing.

The EWP includes data from seasonal forecasts, linked hydrological modelling for drought and flood, satellite imagery from Copernicus and other missions, soil moisture, land surface temperature, water level extents, radar images, and land cover classification. The accumulated data will be displayed on a secure, reliable platform, accessible via an open data cube.

The region: Karamoja, Uganda

DFMS will initially work in the Karamoja region of Uganda, an area with high levels of poverty and vulnerability where 10% of the total population lives (more than one million people). Farmers there currently receive weather forecast information via the radio, and they also use indigenous methods to predict rainfall. Unlike current weather forecasting methods, DFMS provides parish-level detail, ensuring more specific and accurate data.

Currently, the effects of climate change impact an estimated 80% of the Ugandan population. Irregular weather patterns, such as the timing of the onset of the rainy season and the reliability and intensity of precipitation, are increasing crop failure, soil erosion and land degradation, disrupting small-holder livelihoods as well as agricultural businesses.

Drought and Flood Mitigation Service
info@dfms.co.uk
www.dfms.co.uk

Dr Sam Lavender from Pixalytics Ltd was a speaker at ‘One Step Beyond’, a TEDx Leicester event at the National Space Centre in Leicester, UK, which took place on 21 September. Sam gave a talk entitled “Beyond The Blue Ocean”, which demonstrated through satellite imagery the different aspects of the ocean that can be seen from space.

The imagery was displayed within the Space Centre’s planetarium and so the screen was huge!

TEDx events are independent events within the global TED organisation, dedicated to spreading ideas and sparking conversations through short talks. Sam’s video will join the TEDx library (https://www.youtube.com/user/TEDxTalks/videos) in the near future.

AgriMonitor is a web app that allows the monitoring of agricultural areas through time, either by using interactive graphs of vegetation and soil indices or image snapshots that describe the intra-field crop growth conditions at various spatial and temporal resolutions.

NEUROPUBLIC, an active member of the European Association of Remote Sensing Companies, develops cloud-based EO services for the agricultural sector. We do business using Copernicus and IoT data, promoting the deeper penetration of EO into the CAP line of business and the establishment of the digital farming market (figure 1).

AgriMonitor’s information context is enriched by processed data coming from our own network of telemetric stations.

For most areas of the Greek territory, agronomists and farmers are now able to follow the evolution of their crops at parcel level. It’s a time machine that enables decisions in a straightforward way (figure 2).

End users may upload their agricultural parcels or digitise an Area of Interest (AOI) and see the results in real time. The AOI polygons may also represent large areas such as wetlands, grasslands or even lakes and forests. In this way, AgriMonitor can serve as a valuable way to monitor the evolutionary change of whole ecosystems through time.

Methodologically, at NEUROPUBLIC we have adopted an object-based approach, aiming to extract objective and well-documented information in an automated way. Each agricultural parcel is an object under monitoring. We assign extracted indicator values to agricultural parcels, with Earth Observation-based layers, ground-sensor measurements and even farm logs as the main input streams.

The assigned information creates a multi-dimensional signature of the object that describes the parcel from different aspects. We present the assigned information using graphs or images. We can also use these datasets to create models that allow the extraction of insights, or classification using various criteria. The services as well as the processing workflows have been created using open-source software. We mainly use Python in the backend, as there are great packages available to handle spatial data along with zonal operations and arrays. For the maps and charts presented in the UI, we focused on interactive libraries such as Leaflet and Plotly. The existing infrastructure of AgriMonitor has already been tested in the development of Smart Farming and CAP services. Our future steps include the integration of Sentinel-1 and Sentinel-3 data as well as the extension of coverage to EU level. NEUROPUBLIC will demonstrate AgriMonitor at the upcoming 23rd MARS Conference, which will take place in Gormanston, Ireland, on 28 and 29 November 2017.
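The zonal operations mentioned above boil down to aggregating raster values inside each parcel’s footprint; a minimal NumPy sketch of the idea (illustrative only, not NEUROPUBLIC’s production code; parcels are assumed already rasterised to an integer ID grid):

```python
import numpy as np

def zonal_means(index_raster: np.ndarray, parcel_ids: np.ndarray) -> dict:
    """Mean index value per parcel (a basic zonal statistic).

    index_raster: 2-D array, e.g. an NDVI layer for one acquisition date
    parcel_ids:   2-D integer array of the same shape; 0 = background,
                  positive integers label parcels (rasterised polygons)
    """
    means = {}
    for pid in np.unique(parcel_ids):
        if pid == 0:
            continue  # skip background pixels
        means[int(pid)] = float(index_raster[parcel_ids == pid].mean())
    return means
```

Running this per acquisition date yields the time series behind the interactive parcel graphs: one indicator value per parcel per date.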