
Multispectral payload

Payload design

Mark IV and Mark V satellites carry multispectral payloads with a camera system that captures images at approximately 1 m GSD with a 5 km swath width and at 0.7 m GSD with a 6.7 km swath width, respectively, covering wavelengths between 450 nm and 900 nm (see Spectral Response). The system uses a closed-loop, real-time stabilization system to compensate for Earth's motion and reduce blur. In front of the sensor there is a four-band filter that enables capture of the four spectral bands.
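
As a quick sanity check of these figures, the swath width divided by the GSD gives the approximate number of across-track pixels at nadir. The helper below is purely illustrative and uses only the values quoted above.

```python
def across_track_pixels(swath_km: float, gsd_m: float) -> int:
    """Approximate number of across-track pixels implied by swath width and GSD."""
    return round(swath_km * 1000 / gsd_m)

print(across_track_pixels(5.0, 1.0))   # Mark IV: ~5000 px across the swath
print(across_track_pixels(6.7, 0.7))   # Mark V:  ~9571 px across the swath
```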

All products are made from raw frames like the following image.

Raw frame.

By capturing consecutive frames, the acquisition system maps the scene in all four spectral bands.

Blue, Green, Red and NIR coverage.

Spectral Response


Imagery is captured in visible and near-infrared bands (Red, Green, Blue and NIR), with each band covering a wide range of wavelengths to maximize object characterization.

  • Blue: 450 - 510 nm
  • Green: 510 - 580 nm
  • Red: 590 - 690 nm
  • Near-IR: 750 - 900 nm

The following archive contains the spectral response functions of every satellite in our active fleet: newsat_srfs.zip. Alternatively, the SRFs for each platform are also available individually in CSV format.
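
As an illustration of how these SRF files might be used, the sketch below weights a target reflectance spectrum by one band's spectral response to obtain a band-averaged reflectance. The column names (wavelength_nm, response) are assumptions; check the files extracted from newsat_srfs.zip for the actual layout.

```python
import numpy as np
import pandas as pd

def band_averaged_reflectance(srf_csv: str, wl_nm: np.ndarray, reflectance: np.ndarray) -> float:
    """Weight a target reflectance spectrum by one band's spectral response.

    Column names 'wavelength_nm' and 'response' are assumed, not confirmed.
    The target spectrum is expected on a uniform wavelength grid wl_nm (nm).
    """
    srf = pd.read_csv(srf_csv)
    # Interpolate the SRF onto the target spectrum's wavelength grid.
    response = np.interp(wl_nm, srf["wavelength_nm"], srf["response"], left=0.0, right=0.0)
    # Simple weighted average; valid because wl_nm is assumed uniformly spaced.
    return float(np.sum(response * reflectance) / np.sum(response))
```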

Spatial Resolution

Satellogic's multispectral imagery products have a very high native spatial resolution across all spectral bands. Images are captured at approximately 1 m native resolution at nadir in every band. Basic products are resampled to a consistent 1 m GSD. Satellogic applies proprietary algorithms to enhance images to 70 cm GSD in all bands, providing considerable advantages in pixel-level radiometry.
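
This is not Satellogic's processing chain; purely as an illustration of resampling a delivered raster to a uniform target GSD, the sketch below uses rasterio's decimated read. The file name is hypothetical and a projected CRS with metre units is assumed.

```python
import rasterio
from rasterio.enums import Resampling

target_gsd = 1.0  # metres; 0.7 for the enhanced products

# "scene.tif" is a hypothetical file name for a delivered raster.
with rasterio.open("scene.tif") as src:
    out_height = round(src.height * src.res[1] / target_gsd)
    out_width = round(src.width * src.res[0] / target_gsd)
    data = src.read(
        out_shape=(src.count, out_height, out_width),
        resampling=Resampling.bilinear,   # interpolated read onto the target grid
    )
```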

Radiometric Calibration

Satellogic imagery’s radiometric accuracy is attained through a combination of laboratory measurements and on-orbit vicarious calibrations. Satellogic performs on-orbit calibration by processing data retrieved from crossovers with a well-calibrated source (Sentinel-2). This allows continuous tracking of radiometric stability and refinement of the calibration when needed. The main targets used for vicarious campaigns are the Railroad Valley Playa (USA), Gobabeb (Namibia), Baotou Sand (China) and La Crau (France) calibration sites, where the radiometric accuracy of Satellogic’s L1 products was evaluated at 10%.
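
As a loose illustration of the crossover approach (not Satellogic's actual calibration procedure), the sketch below fits a linear gain/offset between matched TOA reflectance samples from a NewSat band and the corresponding Sentinel-2 band over a calibration site. All inputs and names are placeholders.

```python
import numpy as np

def fit_cross_calibration(newsat_toa: np.ndarray, sentinel2_toa: np.ndarray):
    """Fit sentinel2_toa ~ gain * newsat_toa + offset over matched samples.

    Inputs are 1-D arrays of spatially averaged TOA reflectance from
    coincident acquisitions over a calibration site (illustrative only).
    """
    gain, offset = np.polyfit(newsat_toa, sentinel2_toa, deg=1)
    residual = sentinel2_toa - (gain * newsat_toa + offset)
    rel_error = np.abs(residual).mean() / sentinel2_toa.mean()
    return gain, offset, rel_error
```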

Using calibration data retrieved from these two methods, raw values as collected by the sensor (DNs) are converted to Top of Atmosphere (TOA) reflectance, producing images that are free from sensor and top-of-atmosphere distortions.
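
The conversion itself typically follows the generic two-step scheme sketched below: DNs are first scaled to at-sensor radiance with a calibration gain and offset, and radiance is then converted to TOA reflectance using the band's solar irradiance, the Earth-Sun distance and the solar zenith angle. The formula and coefficient names here are generic placeholders, not the product-specific values.

```python
import numpy as np

def dn_to_toa_reflectance(dn, gain, offset, esun, d_au, sun_elev_deg):
    """Generic DN -> TOA reflectance conversion (illustrative only).

    radiance = gain * DN + offset                      [W m-2 sr-1 um-1]
    toa_refl = pi * radiance * d^2 / (esun * cos(sza))
    """
    radiance = gain * np.asarray(dn, dtype=np.float64) + offset
    sza = np.deg2rad(90.0 - sun_elev_deg)              # solar zenith angle
    return np.pi * radiance * d_au**2 / (esun * np.cos(sza))
```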

Known Limitations

The four filter bands of the sensor form four stripes stacked along-track. This means that the four bands of a scene are imaged at different times and from slightly different satellite positions. This configuration results in certain uncorrectable features in the images. The most common and best-known limitations are listed in this section.

Moving objects

Moving objects, such as vehicles on the ground or at sea, appear at different positions in different color bands. This feature can be used to estimate the object's velocity, as illustrated below.


An airplane on the runway and cars on a motorway show their colors split across different positions in the raster, as a consequence of their movement during the acquisition.
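
As a rough illustration of the velocity estimate mentioned above: if an object appears displaced by a few pixels between two bands, its ground speed is approximately the displacement divided by the band-to-band time separation. That time separation is not specified in this document and must be taken from the product metadata; the numbers below are placeholders.

```python
def ground_speed_kmh(offset_px: float, gsd_m: float, band_dt_s: float) -> float:
    """Estimate ground speed from the apparent band-to-band displacement.

    offset_px : displacement of the object between two bands, in pixels
    gsd_m     : ground sampling distance of the product, in metres
    band_dt_s : time separating the two band acquisitions, in seconds
                (take this from the product metadata; not fixed here)
    """
    return offset_px * gsd_m / band_dt_s * 3.6

# e.g. a car displaced by 10 px at 1 m GSD with a 0.35 s band separation
print(ground_speed_kmh(10, 1.0, 0.35))   # ~103 km/h (illustrative numbers)
```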

High altitude objects

Similarly to moving objects, high-altitude objects, such as clouds, haze and airplanes, appear at different positions in different color bands, usually due to a parallax effect, which may combine with the object's own movement.


From left to right, parallax effects on clouds, haze and airplane trails.


Example with haze showing rainbow colors and also a compositing artifact between two consecutive frames.

Because of the parallax effect, the tops of very tall buildings can also appear at different positions in different bands, as can be seen in the following image.


The apparent position of the top of Burj Khalifa, the world’s tallest building, differs between the three color bands because of the parallax effect.
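
For a rough sense of the geometry, the apparent ground displacement d of an object at height h scales with the along-track baseline B between band acquisitions and the orbital altitude H as h ≈ d·H/B (for h much smaller than H). The sketch below applies this relation with placeholder geometry values; the actual baseline and altitude must come from the acquisition metadata.

```python
def object_height_m(parallax_px: float, gsd_m: float,
                    baseline_m: float, sat_altitude_m: float) -> float:
    """Rough height of a cloud or building top from band-to-band parallax.

    Uses h ~= d * H / B, valid when h << H. Baseline and altitude are
    placeholders here and must come from the acquisition geometry.
    """
    d = parallax_px * gsd_m
    return d * sat_altitude_m / baseline_m

# e.g. 8 px of parallax at 1 m GSD, ~2.5 km along-track baseline, ~500 km altitude
print(object_height_m(8, 1.0, 2_500, 500_000))   # ~1600 m (illustrative numbers)
```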

Parasitic light sensitivity

Many digital sensors are sensitive to light even outside the exposure time; the sensor is then said to be sensitive to parasitic light. This sensitivity is much lower than the sensor's sensitivity during the exposure, but it is not negligible. If a high-reflectance object is captured and causes saturation at its position during the exposure time, additional photoelectrons are generated while waiting for the readout. In the case of Newsat Mark IV satellites, the satellite moves during the readout time, causing a track of saturated pixels in the image, like the ones shown in the following images.


Saturation and parasitic light in a single-band raster (note the vertical line originating from a very high reflectance object, on the left) and in an RGB composite (on the right).

Ground level band misalignment

The design of the payload implies that band alignment is height-specific. Normally, the geometric corrections optimize alignment at ground level, which implies band misalignment for high-altitude objects. However, inaccurate geometric corrections can sometimes cause misalignment at ground level.

Such inaccuracies tend to appear in dark or featureless regions, where the algorithm cannot find sufficient matches to ground features. In particular, the presence of nearby clouds and haze can trigger this anomaly, because matching of points on clouds may push some correction algorithms towards better alignment at cloud height at the expense of ground-level alignment.

Dark area next to clouds. The road and the buildings on the ground display significant color misalignment, while the cloud does not.
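
As an illustration of how a residual ground-level offset between bands could be measured and removed in post-processing (this is not the algorithm used in the production pipeline), the sketch below uses phase correlation from scikit-image.

```python
import numpy as np
from skimage.registration import phase_cross_correlation
from scipy.ndimage import shift as nd_shift

def coregister_band(reference: np.ndarray, band: np.ndarray) -> np.ndarray:
    """Estimate and remove the residual offset of one band against a reference band.

    Generic phase-correlation sketch over cloud-free, ground-level imagery;
    not the correction algorithm used in production.
    """
    offset, _, _ = phase_cross_correlation(reference, band, upsample_factor=10)
    return nd_shift(band, offset, order=1, mode="nearest")
```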

Low signal-to-noise ratio

Some captures may have a low signal-to-noise ratio and appear noisy. This can occur for targets with intrinsically low reflectance values and in shaded areas of low sun-elevation captures. In some cases the exposure time may not be optimized for the entire capture.

Example of a noisy image due to low sun elevation.

Sensor defective pixels

A few sensors present some grouped dead pixels (appearing as low-gain pixels in the flat-field frame) which usually cannot be fully corrected by the flat-field frame, but they may be mitigated or smoothed in the L1 composite, depending on the size of the dead-pixel aggregation. Usually they are so small that they appear corrected in the L1 images. However, there might be some additional noise in and around these dead pixels in the L1 composite. Additionally, depending on their position in the filter band (e.g. at the edge of the band), they may fall below the overlapping consecutive frame and thus not be used in the L1 composite. A list of uncorrectable defective pixels will be provided in a future version of this document.

A raw image with an aggregation of dead pixels (on the left) is compared with the corresponding L1 product (on the right), showing that in the processed image there is arguably an increase of noise at the location of the sensor artifact.
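
A minimal sketch of how low-gain pixels could be flagged from a normalized flat-field frame and patched from their neighbours is shown below; the threshold is an arbitrary placeholder and this is not the production correction.

```python
import numpy as np
from scipy.ndimage import median_filter

def flag_and_patch_dead_pixels(frame: np.ndarray, flat: np.ndarray,
                               low_gain_threshold: float = 0.5):
    """Flag pixels whose flat-field gain is abnormally low and patch them.

    flat is a normalized flat-field frame (mean ~ 1); the 0.5 threshold is
    an arbitrary illustration, not a calibrated value.
    """
    dead = flat < low_gain_threshold
    patched = frame.copy()
    patched[dead] = median_filter(frame, size=5)[dead]   # fill from neighbours
    return patched, dead
```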

Inhomogeneous tonal distortion along the track

An outdated flat-field frame can introduce tonal distortions in the tile center. In the stripe, this is usually noticeable along the track; in an RGB composite, a color artifact will appear. This occurs very rarely (roughly two events per year across the entire fleet), as the flat-field frame is computed frequently (bi-weekly or monthly). This artifact can be corrected by post-processing with an updated flat-field frame. An example is provided below.

Color artifact along the track caused by a payload change not caught by the flat-field frame, which was outdated (left). Same image after correction with an up-to-date flat-field frame (right).

Blurred images

Although rare, some captures may present blur. This effect can be caused, on limited occasions, by anomalies in the thermal conditions, by unexpected behavior of the stabilization system or by an anomaly in the automatic refocusing system. This issue is uncorrectable.

Example of a blurred image.

Filter artifacts

Sometimes new dust particles appear on the filter. They usually affect small portions of a capture, acting like a light blocker. These artifacts can be corrected by reprocessing the raw frames with a new flat-field frame. To reduce the occurrence of these artifacts in the imagery, the flat-field frames are periodically recomputed for each satellite, usually every 2–4 weeks. The following images show how these artifacts commonly appear in a single-band image and in an RGB composite.

Example of a dust particle acting like a light blocker (left). Note that the artifact is periodically repeated in the image since it is present in each raw frame. After detecting the artifact, a new flat-field frame was generated and the scene reprocessed. On the right the reprocessed version of the same image is shown.

Example of a dust artifact in an RGB composite: in this case the artifact was in the blue filter, resulting in a yellow donut-like shape (left). This artifact was then corrected by reprocessing the image with a new flat-field frame (right).
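
The correction applied in these examples is conceptually the standard flat-field (gain) correction; a simplified sketch is shown below. Dark-frame handling and normalization are reduced to the essentials and may differ from the production chain.

```python
import numpy as np

def apply_flat_field(raw, flat, dark=None):
    """Standard flat-field (gain) correction: (raw - dark) / normalized flat.

    Simplified illustration; the production processing chain may differ.
    """
    frame = np.asarray(raw, dtype=np.float64) - (dark if dark is not None else 0.0)
    flat_norm = np.asarray(flat, dtype=np.float64)
    flat_norm = flat_norm / flat_norm.mean()           # normalize gain map to mean 1
    return frame / np.clip(flat_norm, 1e-6, None)      # avoid dividing by ~0 at dead pixels
```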

Rarely, foreign object debris can cause small, superficial damage to the filter, allowing unfiltered light to reach the corresponding pixels on the sensor. This means that the spectral information is lost in the affected areas. The resulting artifacts (usually bright donut-like shapes) cannot be corrected with further processing.

This artifact may not be noticeable, depending on the spectral signature of the target. The filter defects causing these artifacts are known and will be listed in a future edition of this document (including affected satellites and positions in the raw frame).

Artifact created by debris on the red band of the filter, causing light to reach the sensor unfiltered.

Payload artifacts in saturated images

In areas of frames with very high reflectance values, such as bright clouds or snow, the raw frame values can saturate. The flat-field (gain) correction does not work in these saturated areas because of non-linearity. As a consequence, the final products will most likely present payload artifacts, such as sensor defects, pixel response non-uniformity (PRNU) and residual dust particles on the filter, as well as processing artifacts such as color bands. In the future, the pixels affected by saturation will be masked.


Example of a target with very high reflectance: a field entirely covered by snow. The target is overexposed and the image cannot be corrected by the flat-field frame. As a result, the payload artifacts are not corrected (notice the high-frequency parallel lines and the donut-like shapes due to dust on the filter). Additionally, large tonal differences between adjacent frames create strong color artifacts which cannot be corrected.
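
Until a saturation mask is delivered with the products, a simple mask can be built by thresholding the digital numbers where raw or near-raw values are available, as sketched below. The full-well value is a placeholder that depends on the sensor bit depth.

```python
import numpy as np

def saturation_mask(dn: np.ndarray, full_well_dn: int = 4095) -> np.ndarray:
    """Flag saturated pixels so downstream users can exclude them.

    The full-well value (a 12-bit placeholder here) depends on the sensor and
    product; check the actual bit depth of the delivered data.
    """
    return dn >= full_well_dn
```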

Color artifacts due to image compositing

Some images can show a sharp intensity change between two adjacent frames resulting from image composition (note: the composition process concerns only the L1B basic, L1C and L1D/L1D_SR products). This usually corresponds to a difference of 3% or less between the mean brightness levels of the frames in the overlapping pixels. In flat and homogeneous targets, such as water surfaces, deserts and snow, as well as in clouds, this can be visually noticeable.


Example of tonal differences (image composition artifact) between adjacent frames, mostly noticeable in homogeneous targets, such as water.

The following image shows an example of an uncorrected image composition artifact in an RGB composite.


Example of an uncorrected image composition artifact in an RGB composite, imparting a violet shade to the image in alternating bands along the satellite track.
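
For intuition, the sketch below shows a simple gain match between two consecutive frames over their overlapping rows. It is not the compositing algorithm used in production, and the overlap geometry is assumed to be purely along-track.

```python
import numpy as np

def equalize_overlap(frame_a: np.ndarray, frame_b: np.ndarray, overlap_rows: int) -> np.ndarray:
    """Scale frame_b so its mean brightness matches frame_a in the overlap.

    overlap_rows is the number of rows shared by the two consecutive frames;
    this is a simple gain match, not the production compositing algorithm.
    """
    mean_a = frame_a[-overlap_rows:].mean()
    mean_b = frame_b[:overlap_rows].mean()
    return frame_b * (mean_a / mean_b)

# A residual difference of <= 3 % corresponds to a gain factor between ~0.97 and ~1.03.
```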

Spectral Bands Misalignment

Cloudy images or images containing moving objects, such as waves, may exhibit misalignment effects between the spectral bands.

Each of our spectral bands is collected at a slightly different moment in time and from a slightly different position (separated by seconds and by a small angular difference), then combined into a single multiband image. For static, flat features at sea level, such as low-rise buildings and roads, this has no effect on the visualization, since nothing changed between the different band acquisitions. However, a rainbow effect might be seen in the following scenarios:

  • Moving objects, such as cars, planes, and even clouds
  • Objects that are at high altitude with respect to background scenery
  • Objects that have a significant vertical extent, such as tall buildings

In these scenarios you might see the object slightly misaligned between bands, for example because it moved between band acquisitions. This is expected behavior in the imagery.


Misalignment caused by clouds.


Misalignment caused by moving waves.

Last update: 2024-11-07