
Hyperspectral Imaging for Finer Surface Characterization

  • Writer: Arpit Shah
  • May 31, 2022
  • 12 min read

Updated: Dec 11, 2025

  1. Introduction


It is the investigative nature of Satellite Imagery Analytics that, I believe, draws enthusiasts to this field—the ability to extract meaningful insights from remotely sensed data acquired by spaceborne instruments hundreds of kilometres above the Earth’s surface.


In this post, you will get familiar with a relatively novel technique of acquiring Earth Observation data known as Hyperspectral Imaging (technically, Imaging Spectroscopy) through a workflow demonstration video. Before that, I will outline some Remote Sensing fundamentals along with the characteristics of the more commonly used Multispectral and Radar imaging techniques. These will provide the right foundation for understanding where Hyperspectral Imaging truly shines.


  2. Remote Sensing Background


Objects, materials, and surfaces on Earth can be imaged in two ways:

  1. Passively – by capturing reflected solar radiation

  2. Actively – by transmitting a signal and capturing the backscatter received by the satellite sensor

Figure 1: Natural-color Optical image of the Earth's surface. Source: gsitechnology.com

Figure 1 shows a natural-color rendition of a passively acquired optical dataset. The imagery forms when the satellite captures reflected solar radiation—sunlight is composed of visible light, infrared, and ultraviolet energy wavelengths of the Electromagnetic Spectrum. If the instrument captures reflectance across more than three spectral bands (see types here), the dataset qualifies as Multispectral Imagery.



Figure 2: Electromagnetic Spectrum Infographic - Multispectral Satellite Imagery for Earth Observation is acquired by capturing the reflection of Solar Radiation i.e. visible light, a portion of infrared, and a portion of ultraviolet. Radar Satellite Imagery for Earth Observation is acquired by capturing the reflection of a specific range of Microwaves, originally transmitted by the satellite itself. Image Source: NASA ARSET

Different surfaces respond differently to solar radiation. Some wavelengths reflect upon interacting with the object, while others may be absorbed or transmitted. The extent of reflectance depends on:

  • Reflective properties of the object (rough vs. smooth surfaces)

  • Geometric effects (angle of illumination and angle of reflection)

  • Biogeochemical characteristics (moisture content, mineral composition, structure, etc.)


That being said, even before the signal reaches the surface, it must pass through the atmosphere, where it can be influenced by clouds and gases. The same applies when the reflected energy exits the atmosphere and travels toward the sensor.


Figure 3: How Soil, Vegetation and Water respond to Solar Radiation. Source: seos-project.eu

As the example in Figure 3 demonstrates, the reflectance behaviour of Soil, Vegetation, and Water varies significantly across solar wavelengths.


For instance:

  • Water reflects almost nothing (0%) in the Near Infrared region

  • Soil reflects around 30–35%

  • Vegetation shows the highest reflectance—up to 50% in the Near Infrared


This makes spectral analysis possible: multispectral datasets can be processed to detect, delineate, and classify broad land-cover features such as farmlands, forests, and water bodies.
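To make that concrete, here is a minimal sketch (Python with NumPy; the reflectance values and thresholds are illustrative assumptions, not calibrated figures) of how the NIR contrast described above can drive a simple rule-based classification:

```python
import numpy as np

# Hypothetical per-pixel reflectance (as fractions) in the Red and Near-Infrared bands
# for three pixels: water, bare soil, and healthy vegetation.
red = np.array([0.04, 0.25, 0.05])
nir = np.array([0.01, 0.33, 0.50])   # water ~0%, soil ~30-35%, vegetation ~50% in NIR

# NDVI exploits exactly this contrast: (NIR - Red) / (NIR + Red).
ndvi = (nir - red) / (nir + red)

def label(value: float) -> str:
    """Very coarse, illustrative thresholds for a broad land-cover split."""
    if value < 0.0:
        return "water"
    if value < 0.3:
        return "soil / bare ground"
    return "vegetation"

for v in ndvi:
    print(f"NDVI {v:+.2f} -> {label(v)}")
```

Real workflows would apply such indices per pixel across an entire atmospherically corrected scene, but the underlying logic is exactly this contrast between bands.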

However, sunlight—the source of illumination for Multispectral Imaging—comes with drawbacks:

  • No imaging is possible at night

  • Daytime imagery can be obstructed by clouds, aerosols, and haze

  • Fine-grained distinctions are limited

For example, if one intends to differentiate palm trees from coconut trees, multispectral sensors may not provide enough spectral detail. Both may reflect similarly in the broad wavelength bands that multispectral instruments capture.


This limitation highlights the growing importance of imaging techniques with higher spectral resolution—i.e., those that capture reflectance across many more and much narrower spectral bands. Such sensors enable finer surface discrimination, something Multispectral imagers (as of today) struggle with at regional scales.

Beyond spectral resolution, Remote Sensing also involves other important types of resolution. Here is a concise summary:


  1. Spectral Resolution - The range and width of energy wavelengths that a sensor can detect. For example, Cartosat-1 captures reflectance from 500–850 nm (a single, broad spectral band of width 350 nm).


  2. Spatial Resolution - The size of the smallest surface feature the sensor can detect—effectively the pixel size. For example, Sentinel-2's R, G, B bands have pixels measuring 10 × 10 metres, giving them a spatial resolution of 10 m.


  3. Temporal Resolution - The frequency with which a satellite revisits the same location. For example, Landsat 9 has a 16-day repeat cycle, meaning the same area is imaged once every 16 days.


  4. Radiometric Resolution - The sensor’s ability to detect slight differences in energy, represented by the number of possible digital values per pixel. For example, the OLI-2 multispectral instrument on Landsat 9 has a 14-bit radiometric resolution, meaning each pixel can take on 16,384 (2¹⁴) distinct reflectance values.

More information on these can be accessed here. These types of resolution are not mutually exclusive. Increasing one often forces a trade-off in another—for example, improving spectral resolution may require sacrificing spatial coverage or radiometric sensitivity.
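As a quick worked example of the radiometric point above, the snippet below (plain Python) shows how bit depth maps to the number of distinct pixel values, and how a raw digital number (DN) is commonly rescaled to reflectance with a linear gain and offset; the gain/offset figures here are placeholders for illustration, not coefficients of any particular product.

```python
# Distinct values a pixel can take for a given radiometric resolution.
for bits in (8, 12, 14, 16):
    print(f"{bits}-bit sensor -> {2 ** bits:>6} levels")   # 14-bit -> 16384, as for Landsat 9 OLI-2

# Products typically ship raw digital numbers (DN) plus rescaling coefficients;
# a common linear form is: reflectance = gain * DN + offset.
gain, offset = 2.0e-5, -0.1   # placeholder coefficients, for illustration only
dn = 8250                      # hypothetical pixel value
print(f"DN {dn} -> reflectance {gain * dn + offset:.3f}")
```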

Figure 4: Radar Satellite Imagery. Source: BreakingDefense.com

Synthetic Aperture Radar (SAR), or simply Radar Satellite Imagery, often serves as a valuable alternative in workflows where Multispectral Imagery proves insufficient. It can also be used in a complementary or supplementary manner to validate outputs derived from processing Multispectral datasets.





Radar satellites acquire data using an active imaging sensor that transmits microwaves toward the Earth and captures the portion scattered back toward the source—technically known as backscatter. Sentinel-1, one of the most widely used Earth Observation radar constellations, transmits pulses of C-band microwaves (wavelengths of roughly 3.75–7.5 cm; about 5.5 cm at Sentinel-1's 5.405 GHz centre frequency).
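As a quick check on those figures, the wavelength follows directly from the transmit frequency (wavelength = c / frequency); the short snippet below uses Sentinel-1's published C-band centre frequency of 5.405 GHz.

```python
SPEED_OF_LIGHT = 299_792_458      # metres per second
frequency_hz = 5.405e9            # Sentinel-1 C-band centre frequency

wavelength_cm = SPEED_OF_LIGHT / frequency_hz * 100
print(f"Sentinel-1 wavelength ~ {wavelength_cm:.2f} cm")   # ~5.55 cm, well within C-band
```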


One major advantage Radar satellites hold over Multispectral (and even Hyperspectral) satellites is that, owing to their active illumination capability, they can acquire imagery during night-time.

A distinction worth noting is that Multispectral and Hyperspectral imaging for spaceborne Earth Observation is inherently passive, relying on reflected solar radiation. The same measurement techniques can also be implemented from the air, using drones or aircraft sensors such as AVIRIS-NG, where they remain passive, or on the ground and in the laboratory, where artificial (active) light sources often stand in for the Sun. Astronomical observatories such as the Chandra X-ray Observatory (a spaceborne instrument designed for outer-space research) operate in yet another regime, recording radiation emitted by distant sources rather than reflected sunlight.

Another advantage is that microwaves remain largely unaffected by cloud cover and atmospheric aerosols, thanks to their long wavelengths. This makes Radar an extremely reliable tool for Earth Observation, especially in regions with persistent cloudiness. Additionally, surface materials interact with microwaves very differently from how they interact with solar wavelengths—a property that can be exploited exclusively through Radar data or in combination with Multispectral or Hyperspectral imagery.


A technical comparison table summarizing the characteristics, advantages, and disadvantages of Multispectral vs. Radar Remote Sensing can be accessed here and some of the applications that use one or both can be accessed here.

So, if Multispectral and Radar techniques have different strengths and serve a wide range of often mutually exclusive applications, what exactly is Hyperspectral Remote Sensing, and where does its utility lie?

The next section addresses this question directly.

  3. Hyperspectral Imaging


As described in the introductory course by EO College-

“Imaging spectroscopy refers to imaging sensors measuring the spectrum of solar radiation reflected by Earth surface materials in many contiguous wavebands—whether the sensors are ground-based, airborne, or spaceborne. With up to hundreds of reflectance bands available, the technique enables detection and quantification of materials based on the shape of their spectral curve.”

Contrary to popular perception, Hyperspectral Imaging is not a new technology. The first imaging spectrometer became operational as early as 1982, although it was mounted on research aircraft and therefore limited to small-area acquisitions and select locations. As a result, only a small subset of researchers had access to such data. It was only in the 2000s, when spectrometers began to be launched onboard Satellites, that Hyperspectral Imaging started gaining mainstream adoption.


As with any emerging technology, subsequent generations refined earlier shortcomings and benefited from advancements in the surrounding ecosystem. Since 2019, several spaceborne Hyperspectral sensors have been launched, and new algorithms have been developed. These developments promise to usher in a new era in Earth Observation—one focused on obtaining deeper insights into the geochemical, biochemical, and biophysical properties of the Earth’s surface and atmosphere.

How is Hyperspectral imagery different from Multispectral imagery?


Consider the depiction below:

Figure 5: Spectral Resolution comparison of Multispectral and Hyperspectral Imagery. Source: Edmundoptics.com

Spaceborne Hyperspectral sensors for Earth Observation (for example, EnMAP, the Environmental Mapping and Analysis Program) acquire reflectance across many narrow and contiguous spectral bands—indicating high spectral resolution. In contrast, Multispectral sensors capture reflectance across comparatively fewer and wider, often non-contiguous, spectral bands—indicating lower spectral resolution.


The charts in Figure 5 above illustrate the distinction:

  • Multispectral imagery produces categorical reflectance values across a few discrete bands—something you can plot as a bar chart.

  • Hyperspectral imagery, with reflectance measured across a continuous wavelength range, allows you to plot a smooth curve—a spectral signature—which reveals how a material responds across the spectrum.


Figure 6: Hyperspectral 'Data Cube'. Source: University of Texas at Austin, Center for Space Research

The data cube in Figure 6 represents hundreds of stacked Hyperspectral images of the same region, each corresponding to a narrow and contiguous spectral band. Together, these layers form a spectral fingerprint of the region.
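In code, such a cube is simply a three-dimensional array (rows × columns × bands), and a pixel's spectral signature is a slice along the band axis. Here is a minimal sketch with NumPy, using a synthetic cube since no particular dataset is assumed:

```python
import numpy as np

# Synthetic hyperspectral cube: 100 x 100 pixels, 224 contiguous bands
# (224 is typical of AVIRIS-class sensors; the values are random placeholders).
rows, cols, bands = 100, 100, 224
cube = np.random.default_rng(42).random((rows, cols, bands)).astype(np.float32)

# Band-centre wavelengths spread over 400-2500 nm (illustrative spacing).
wavelengths = np.linspace(400, 2500, bands)

# The spectral signature of one pixel is the 1-D slice across all bands.
signature = cube[42, 17, :]                    # shape: (224,)

# Mean signature of a small region of interest, e.g. a suspected crop patch.
roi_mean = cube[40:45, 15:20, :].reshape(-1, bands).mean(axis=0)

print(signature.shape, roi_mean.shape, wavelengths[:3])
```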

Thus, high spectral resolution and continuous reflectance acquisition across a broad spectral range—enabling the derivation of spectral signatures—are the defining features that distinguish spaceborne Hyperspectral Imaging from spaceborne Multispectral Imaging in Earth Observation.


Let me elaborate on the utility of Hyperspectral Imaging with a simple analogy.

Imagine a transparent container stored at cryogenic temperatures (−150°C) containing five substances in solid form. As you gradually heat the container to a scorching 150°C, you note the exact temperature at which each substance changes its state of matter. Once charted (Figure 7), you’ve essentially created the Temperature Signature of the five substances.

Figure 7: Hypothetical Example of Temperature Signature of a Mixture; Source: Mapmyops

Which of the five substances is H₂O (water)?


Very simple: only Substance C. Water exists as ice below 0°C, as liquid between 0–100°C, and as vapour above 100°C. Since the remaining four substances have temperature signatures that differ from this known pattern, they can easily be ruled out.

Now imagine maintaining a database of temperature signatures for ten substances relevant to your study. With this reference in hand, you can identify which, if any, of the remaining four substances inside the container match the ones of interest. This is exactly how Hyperspectral Imaging works. Instead of a temperature signature, we compare the spectral or energy signature captured by a spaceborne sensor against a database of verified spectral signatures. This helps detect a particular surface feature, distinguish between surface types, or classify materials with far greater precision than conventional methods.
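One widely used way to perform this kind of library matching is the Spectral Angle Mapper (SAM), which scores a measured spectrum against each reference by the angle between them (the smaller the angle, the closer the match). Below is a minimal sketch; the six-band "library" spectra and the measured pixel are toy values standing in for a real spectral library.

```python
import numpy as np

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle in radians between two spectra; 0 means identical spectral shape."""
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Toy reference library (placeholder values, not laboratory-measured spectra).
library = {
    "healthy vegetation": np.array([0.05, 0.08, 0.06, 0.45, 0.50, 0.30]),
    "dry soil":           np.array([0.15, 0.20, 0.25, 0.30, 0.33, 0.35]),
    "open water":         np.array([0.06, 0.05, 0.03, 0.01, 0.005, 0.002]),
}

measured = np.array([0.06, 0.09, 0.07, 0.42, 0.48, 0.28])   # hypothetical pixel spectrum

scores = {name: spectral_angle(measured, ref) for name, ref in library.items()}
best_match = min(scores, key=scores.get)
print(best_match, {name: round(angle, 3) for name, angle in scores.items()})
```

Because SAM compares spectral shape rather than magnitude, it is relatively tolerant of illumination differences between the measured pixel and the library entries.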

For example, in Figure 8 below, note the spectral responses of healthy, stressed and dry vegetation to Solar Radiation at lower wavelengths (400–800 nm, largely the Visible Light region). Their reflectance behaviour varies sharply—a difference strongly linked to pigment composition. Healthy vegetation appears green due to substantial chlorophyll, which absorbs red and blue wavelengths while reflecting green. As a result, its reflectance in the visible spectrum is comparatively low.


When vegetation becomes stressed, its ability to produce new chlorophyll diminishes, and paler pigments increase. This yellowing is accompanied by a noticeable rise in reflectance in visible wavelengths.


When plants wither and die—say during drought—the reflectance drops again. Both chlorophyll and carotenoids stop developing, leaving behind tannins that impart a woody colour and reflect mainly red light, thereby reducing overall visible reflectance.


Thus, visible-wavelength responses reveal a wealth of insight about vegetation health and composition.

Figure 8: Spectral Signature of Healthy, Stressed & Dry Vegetation; Source: HYPERedu, EnMAP education

Similarly, reflectance patterns in higher wavelengths (1400–2400 nm — the shortwave infrared, or SWIR, region) vary in a predictable manner depending on moisture content. Unlike pigments, moisture influences reflectance more linearly: the lower the moisture, the higher the infrared reflectance, because water absorbs infrared energy. This makes these infrared wavelengths especially valuable for quantifying plant health and dryness.
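To show how those SWIR measurements are used in practice, here is a small sketch of a normalized-difference moisture-type index built from one NIR and one SWIR band; the band choices (~860 nm and ~1640 nm) and the reflectance values are illustrative assumptions rather than readings from a specific sensor.

```python
import numpy as np

# Hypothetical canopy reflectance for three pixels: wet, moderately dry, very dry.
nir_860nm   = np.array([0.48, 0.45, 0.40])   # near-infrared band
swir_1640nm = np.array([0.18, 0.28, 0.36])   # SWIR reflectance rises as moisture drops

# Normalized-difference moisture index: higher values indicate wetter vegetation.
ndmi = (nir_860nm - swir_1640nm) / (nir_860nm + swir_1640nm)
print(ndmi.round(2))   # decreasing values track decreasing canopy moisture
```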


All of this demonstrates the value of capturing continuous energy readings across a spectral range. Hyperspectral sensors enable this level of detail, making it possible to detect, delineate and classify vegetation and many other surface types with high precision.


These strengths also highlight the limitations of Multispectral imagery. Consider the spectral range of Sentinel-2 MSI (Multispectral Imager) in Figure 9 below-

Figure 9: Spectral Range of Sentinel-2 Multispectral Imagery. Source: Geosage.com

While Sentinel-2 allows broad differentiation between healthy, stressed and dry vegetation, it becomes challenging to determine how stressed or how dry the vegetation is because:

  1. Bands are non-contiguous, creating gaps in the spectrum, and

  2. Bands are broader, causing fine variability in reflectance to be averaged out (simulated in the sketch below).
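The averaging effect is easy to simulate. In the sketch below (NumPy; the synthetic spectrum, absorption feature and band edges are all illustrative assumptions), a narrow absorption dip that is obvious in a ~10 nm hyperspectral band nearly vanishes once it is averaged into a 120 nm multispectral-style band:

```python
import numpy as np

# Synthetic signature: 200 narrow bands between 400 and 2400 nm, with a narrow
# absorption dip centred at 860 nm (all values are illustrative, not measured).
wavelengths = np.linspace(400, 2400, 200)            # ~10 nm sampling
signature = 0.35 + 0.05 * np.sin(wavelengths / 400.0)
signature -= 0.15 * np.exp(-((wavelengths - 860.0) ** 2) / (2 * 10.0 ** 2))

# A broad, multispectral-style NIR band (hypothetical 780-900 nm edges) spanning the dip.
in_band = (wavelengths >= 780.0) & (wavelengths <= 900.0)

narrow_value = signature[np.argmin(np.abs(wavelengths - 860.0))]  # narrow band at dip centre
broad_value = signature[in_band].mean()                           # broad band averages it away

print(f"narrow band near 860 nm: {narrow_value:.3f}")   # absorption feature clearly visible
print(f"broad 780-900 nm band:   {broad_value:.3f}")    # feature diluted by the surrounding spectrum
```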


Remember that surface reflectance is driven by more than one factor: surface roughness, geometric effects and biogeochemical properties (in addition to atmospheric influences and wavelength characteristics). Alterations in any of these parameters modify the spectral signature.


Video 1 shows how the signature changes with variations in chlorophyll content, leaf water content and leaf area index—three key influencers of plant reflectance. Such subtle interactions are precisely why Multispectral imagery struggles with fine-grained surface characterization, and where Hyperspectral imaging excels.

Video 1: How Spectral Signature of Vegetation changes upon iterating biogeochemical (chlorophyll, water content) and geometric (leaf area index) influencers
Figure 10: Spectral Signature of Open & Coastal Water; Source: HYPERedu, EnMAP education initiative

Likewise, Figure 10 shows the spectral signatures of open versus coastal water. While Multispectral data (e.g., Sentinel-2’s B1 and B3 bands) can help distinguish them, the difference in reflectance is subtle and easily masked. Hyperspectral imaging, with its detailed, continuous signatures, makes the task considerably easier.



Everything sounds so rosy about Hyperspectral Imaging. Do we really need Multispectral Imaging?


More information is not always better—it can sometimes be counterproductive. Because hyperspectral bands are narrow and contiguous, researchers often encounter interference from neighbouring bands, a phenomenon analogous to cross-talk in telecommunications. This can compromise classification accuracy, and filtering out such noise is computationally intensive and requires advanced correction techniques.


Multispectral imagery avoids this issue because its bands are wider and non-contiguous, greatly reducing the risk of spectral leakage. It is also far more accessible, less expensive and easier to process.


Furthermore, some limitations of multispectral data also apply to hyperspectral data: neither can be acquired at night (unlike radar), and both are susceptible to atmospheric influences.
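In practice, the band-to-band redundancy and noise described above are often tamed with dimensionality reduction before classification. Here is a minimal sketch, assuming scikit-learn is available and using a synthetic cube with deliberately correlated bands; PCA compresses the 200 bands into the handful of components that carry nearly all of the variance.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for a hyperspectral scene: 50 x 50 pixels, 200 bands.
# Neighbouring bands are made highly correlated to mimic spectral redundancy.
rng = np.random.default_rng(0)
base = rng.random((50, 50, 10))                     # 10 underlying spectral patterns
cube = np.repeat(base, 20, axis=2)                  # replicated into 200 correlated bands
cube += 0.01 * rng.standard_normal(cube.shape)      # small band-to-band noise

# Flatten to (pixels, bands) and keep components explaining ~99% of the variance.
pixels = cube.reshape(-1, cube.shape[2])
pca = PCA(n_components=0.99)
reduced = pca.fit_transform(pixels)

print(f"{pixels.shape[1]} bands -> {reduced.shape[1]} principal components "
      f"({pca.explained_variance_ratio_.sum():.3f} of variance retained)")
```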

Hyperspectral data can be acquired using different techniques, and the choice of method depends on factors such as the biogeochemical characteristics of the surface, sensitivity to external influences, desired resolution, cost constraints and coverage requirements.


Some acquisitions are best carried out using ground-based sensors, either in the field or in a controlled laboratory setting—particularly when the object or surface is highly sensitive to environmental conditions or when extremely high spatial resolution is required.


Have a look at the two videos below—you’ll notice that acquiring hyperspectral data on the ground demands considerable expertise.

Video 2: On-ground hyperspectral data acquisition in a field environment. Source: HYPERedu, EnMAP education initiative
Video 3: On-ground hyperspectral data acquisition in a lab environment. Source: HYPERedu, EnMAP education initiative

Another example of laboratory-based hyperspectral imaging can be viewed here.


In contrast, spaceborne sensors are preferred when wider and faster coverage is required, when budgets are tighter, when very high spatial resolution is not essential, and when the surface under study is less sensitive to external conditions.


Airborne sensors, mounted on research aircraft or drones, are also widely used. They are particularly effective when a study requires more coverage than ground-based acquisition can provide, and/or when higher spatial resolution is needed than what a spaceborne sensor can offer.


Video 4: Airborne hyperspectral data acquisition using a research flight. Source: HYPERedu, EnMAP education initiative

Beyond the mode of acquisition, hyperspectral sensors also differ in their scanning technologies. The video below gives a clear overview.

Video 5: Hyperspectral sensor technologies / data acquisition techniques

Some research studies may even require hyperspectral datasets acquired using multiple modes and scanning techniques, either to strengthen the analysis or to validate the results.

The scientific scope of Hyperspectral Imaging is immense. It is already used extensively in agricultural, soil, mining, coastal, hazard, archaeological and military applications, among many others (explore ongoing hyperspectral missions and applications here).


I was pleasantly surprised to learn that a technology originally designed for remote sensing is now being applied in crime scene detection, forensic medicine and even the biomedical sector, where imaging spectroscopy serves as a non-invasive method to distinguish cancerous tissues from healthy ones by using wavelengths that penetrate human skin.

Now that you’re familiar with the foundations and utility of Hyperspectral Imaging, I’ll leave you with a demonstration video showcasing its practical use in land-cover classification. You’ll see how to process hyperspectral datasets using open-source software and how to interpret the spectral signature output.

Video 6: Exploring Hyperspectral Imagery and Analyzing Spectral Signatures using the EnMAP-Box plugin on QGIS

Video Time Stamps

00:04 - Video Details

00:19 - Getting familiar with the Datasets

01:09 - Exploring Airborne & Spaceborne Hyperspectral Imagery

03:23 - Visualizing the Spectral Signature

04:50 - Visualizing the Spectral Library

06:05 - Using Regression Analysis to generate Land-Cover Map


Thanks for reading this post. Feel free to share your feedback here.

ABOUT US - OPERATIONS MAPPING SOLUTIONS FOR ORGANIZATIONS


Intelloc Mapping Services, Kolkata | Mapmyops.com offers a suite of Mapping and Analytics solutions that seamlessly integrate with Operations Planning, Design, and Audit workflows. Our capabilities include — but are not limited to — Drone Services, Location Analytics & GIS Applications, Satellite Imagery Analytics, Supply Chain Network Design, Subsurface Mapping and Wastewater Treatment. Projects are executed pan-India, delivering actionable insights and operational efficiency across sectors.


My firm's services can be split into two categories - Geographic Mapping and Operations Mapping. Our range of offerings is listed in the infographic below-

Range of solutions that Intelloc Mapping Services (Mapmyops.com) offers

A majority of our Mapping for Operations-themed workflows (50+) can be accessed from this website's landing page. We respond well to documented queries/requirements. Demonstrations/PoC can be facilitated on a paid basis. Looking forward to being of service.


Regards,

Mapmyops | Intelloc Mapping Services
