Forum for Science, Industry and Business

New tools for processing satellite images

Researchers in the Remote Sensing Group at the Universidad Politécnica de Madrid’s School of Computing (FIUPM) have developed a number of satellite image fusion techniques that make the images more useful in fields such as urban planning, natural resource management and precision farming.

To ease the use of these techniques, the Remote Sensing Group has built add-ons for ImageJ, a public-domain image processing program written in Java. These add-ons are needed to process multi-spectral images; they include an image read/write module with specific formats for multi-band images (usable with any number of spectral bands).

Additionally, the Remote Sensing Group has developed a module of generic utilities. Another module implements the image transforms most commonly used in image fusion. And yet another includes different multi-spectral and panchromatic image fusion algorithms, including algorithms developed by the Remote Sensing Group, already published in specialized journals.

Information requirements

Some of our planet’s most pressing problems, like population growth, the need for sustainable intensive agriculture, safe food and energy production and distribution, soil management, climate change, public health and social conflicts, cannot be solved without immediate information on a local and a global scale.

Remote sensing is at present the only available technology for acquiring most of this information almost immediately and on any scale, as it supplies a tremendous amount of high-quality (accurate, consistent and reliable) geographic, spatial and spectral data.

One of the most commonly used products in remote sensing is imagery from remote optical sensors: multi-spectral and panchromatic images. A multi-spectral image is recorded by a sensor that picks up information contained in different bands of the electromagnetic spectrum. It is composed of as many images of the same area of the earth’s surface as there are spectral bands recorded by the sensor (its spectral resolution). Each of these images provides information about the amount of radiation reflected by the different surfaces in the observed area within one spectral band. Another feature of these images is their spatial resolution, which determines the size of the smallest detail that can be observed in the image.

On the other hand, panchromatic images provide the same type of information for a single, normally the broadest, spectral band.
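As a minimal illustration of these two data products (the band names, pixel values and resolution ratio below are made up for the example), a multi-spectral image can be modelled as a stack of 2-D grids, one per spectral band, while a panchromatic image is a single 2-D grid at a finer spatial resolution:

```python
# A multi-spectral image: one 2-D grid per spectral band
# (here 3 hypothetical bands, each 2x2 pixels). Each value stands for
# the radiation reflected in that band by the observed surface.
multispectral = {
    "green": [[0.10, 0.12], [0.11, 0.13]],
    "red":   [[0.20, 0.22], [0.21, 0.23]],
    "nir":   [[0.60, 0.62], [0.61, 0.63]],
}

# A panchromatic image: one broad band, recorded here at twice the
# spatial resolution (4x4 pixels covering the same ground area).
panchromatic = [[0.3] * 4 for _ in range(4)]

# Spectral resolution = number of bands; spatial resolution = grid size.
print(len(multispectral))   # 3 bands
print(len(panchromatic))    # 4 rows: finer spatial detail, single band
```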

Information extraction methods

The volume of data to be managed and interpreted can vary significantly depending on the sensor type and the features of the images it records. Even so, data throughput is always very high and calls for objective and precise information extraction methods.

Traditionally these methods have been based on automatic classification and interpretation techniques. On the whole, these techniques output qualitative information that is not always easily quantifiable. Spectral indices or model inversion are other approaches that have recently proved to be more effective for extracting quantitative information from satellite images.
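A simple example of the spectral-index approach is the widely used normalised difference vegetation index (NDVI), computed per pixel from the red and near-infrared bands. The sketch below uses made-up reflectance values:

```python
def ndvi(red, nir):
    """Normalised difference vegetation index for one pixel.

    NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1;
    healthy vegetation reflects strongly in the near-infrared,
    so dense green cover yields values close to 1.
    """
    return (nir - red) / (nir + red)

# Hypothetical reflectance values for two pixels.
print(round(ndvi(red=0.05, nir=0.50), 3))  # vegetated pixel -> 0.818
print(round(ndvi(red=0.30, nir=0.35), 3))  # bare soil -> 0.077
```

Unlike a qualitative class label, the index is a quantitative, per-pixel measurement that can be compared across dates and scenes.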

Whatever approach is taken, the precision of the results provided by remote sensing techniques depends on a series of factors. Current sensor technology (ETM+, ASTER, MERIS, MOS, SPOT, IKONOS, QUICKBIRD...) and, consequently, the features (spatial, spectral and temporal resolution) of the images recorded are two key factors. Because of the limits on transmitting data from space platforms, there is at present a trade-off between these three types of resolution: sensors that provide a high spatial resolution tend to have a lower spectral resolution and vice versa. This is why panchromatic images have a higher spatial resolution than multi-spectral images. Multi-spectral images from space programmes like LANDSAT, SPOT or IRS, for example, have a high spectral resolution compared with the respective panchromatic images, but a lower spatial resolution.

Image fusion techniques

Still, there is a wide range of remote sensing applications that require satellite images combining both features (high spatial and spectral resolution).

Even though some of today’s satellites, like QUICKBIRD and IKONOS, do deliver such high-resolution images, they are costly and not always affordable for ordinary users. One way of getting images with both high spatial and high spectral resolution at a reasonable cost is to use image fusion techniques. What’s more, image fusion can output very high-resolution images when applied to data recorded by latest-generation sensors. The goal of image fusion is to consistently integrate information from different images, ensuring that the final fused image retains the key information from the source images.

There are now many optical image fusion methodologies and algorithms. The most common are based on a number of different transforms. Some, like the methodologies based on the Brovey transform, principal component analysis or the IHS transformation method (intensity, hue, saturation), are conceptually very simple.
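The Brovey transform, for instance, fuses each multi-spectral band by scaling it with the ratio of the panchromatic value to the sum of the bands. A minimal per-pixel sketch (the three-band values are illustrative) is:

```python
def brovey_pixel(bands, pan):
    """Brovey fusion for one pixel: each band is rescaled so that the
    fused bands inherit the panchromatic intensity while keeping the
    original ratios between bands."""
    total = sum(bands)
    return [b * pan / total for b in bands]

# Hypothetical low-resolution band values and a high-resolution pan value.
fused = brovey_pixel([0.2, 0.3, 0.5], pan=0.8)
print([round(v, 2) for v in fused])  # [0.16, 0.24, 0.4]
```

In a full implementation the multi-spectral bands are first resampled to the panchromatic grid, and the ratio is applied pixel by pixel; the simplicity of the scheme is also the source of the colour distortion discussed next.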

As demonstrated in many papers, however, the colour of the fused images output by these methodologies is quite distorted compared with the original multi-spectral images. This means that they cannot be used in a variety of routine tasks in the field of remote sensing, like image classification or change detection.

On the other hand, many methods are based on multi-resolution analysis techniques, most of which use the discrete wavelet transform (DWT). Generally, these methods introduce only slight distortion of the colour of the fused images with respect to the multi-spectral image. They outperform the techniques described above, but offer no way of controlling the trade-off between the spatial and spectral quality of the fused images.

Remote Sensing Group Techniques

The School of Computing’s Remote Sensing Group has proposed a weighted version of DWT-mediated fusion calculated using the à trous algorithm. This algorithm can efficiently separate the background information from the detail of an image, avoiding the decimation process characteristic of other algorithms used to calculate the DWT.
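A one-dimensional sketch of the à trous ("with holes") decomposition may help illustrate the idea; this is a generic textbook version, not the group's published code. It uses the common B3-spline kernel, dilated with holes at each level. Unlike decimated wavelet transforms, every level keeps the original number of samples, and the signal is recovered exactly by adding the wavelet planes back to the final smooth background (kernel choice, border handling and level count are illustrative):

```python
B3 = [1 / 16, 4 / 16, 6 / 16, 4 / 16, 1 / 16]  # B3-spline smoothing kernel

def smooth(signal, level):
    """Convolve with the B3 kernel dilated by 2**level (holes between
    taps). Borders are handled by clamping indices. No decimation:
    output length equals input length at every level."""
    step = 2 ** level
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(B3):
            j = i + (k - 2) * step        # tap offset, with "holes"
            j = min(max(j, 0), n - 1)     # clamp at the borders
            acc += w * signal[j]
        out.append(acc)
    return out

def a_trous(signal, levels):
    """Return (wavelet_planes, residual): the detail at each scale plus
    the final smooth background."""
    planes, current = [], list(signal)
    for level in range(levels):
        smoothed = smooth(current, level)
        planes.append([c - s for c, s in zip(current, smoothed)])
        current = smoothed
    return planes, current

signal = [0.0, 0.0, 1.0, 5.0, 1.0, 0.0, 0.0, 0.0]
planes, residual = a_trous(signal, levels=3)

# Exact reconstruction: original = sum of wavelet planes + residual.
rebuilt = [sum(vals) + r for vals, r in zip(zip(*planes), residual)]
print(all(abs(a - b) < 1e-9 for a, b in zip(signal, rebuilt)))  # True
```

The wavelet planes isolate the spatial detail at each scale; in fusion, detail planes extracted from the panchromatic image are injected into the multi-spectral bands.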

The Remote Sensing Group has also proposed other fusion techniques based on multidirection-multiresolution transforms (MDMR). These methods output better quality fused images than DWT-based techniques and, at the same time, have a built-in facility for controlling the trade-off between the images’ spatial quality and colour distortion.
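The trade-off idea can be illustrated with a simple weighted detail-injection scheme (a generic sketch, not the group's actual MDMR algorithm): spatial detail extracted from the panchromatic image is added to each multi-spectral band scaled by a weight w, where w = 0 preserves the original colours exactly and larger values of w favour spatial sharpness at the cost of more colour distortion:

```python
def inject_detail(ms_band, pan_detail, w):
    """Fuse one multi-spectral band with panchromatic detail.

    w = 0.0 -> fused band equals the original (no colour distortion,
               no added spatial detail);
    w = 1.0 -> full detail injection (sharpest, most colour distortion).
    """
    return [m + w * d for m, d in zip(ms_band, pan_detail)]

band = [0.20, 0.22, 0.21, 0.23]        # hypothetical band values
detail = [0.05, -0.04, 0.03, -0.02]    # detail plane extracted from pan

print(inject_detail(band, detail, w=0.0))  # [0.2, 0.22, 0.21, 0.23]
print(inject_detail(band, detail, w=0.5))  # halfway compromise
```

Sweeping w between 0 and 1 traces out the spectral-versus-spatial quality curve that the weighted fusion methods make controllable.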

Accessible tools

With the aim of easing the use of the proposed algorithms, already published in reputable international journals, the Remote Sensing Group set itself the goal of integrating the methods into a single tool.

To do this, it set out to build the tools required to process multi-spectral images into public domain image processing software developed in Java: ImageJ.

As mentioned earlier, the modules that are now up and running are:

• Image read/write module with specific formats for multi-band images (any number of spectral bands)
• Generic utilities module
• Implementation of the image transforms most commonly used in image fusion
• Multi-spectral and panchromatic fusion algorithms, including the algorithms developed within the School of Computing’s Remote Sensing Group, published in international journals and described above

All these ImageJ add-ons have been packaged as IJFusion.

Eduardo Martínez | alfa