Planet Combined Imagery Product Specs


Fagradalsfjall Volcano,

Reykjavik, Iceland

PLANET IMAGERY PRODUCT


SPECIFICATIONS

PLANET.COM MAY 2022


TABLE OF CONTENTS

1. OVERVIEW OF DOCUMENT 11

1.1. COMPANY OVERVIEW 12

1.2 DATA PRODUCT OVERVIEW 12

2. SATELLITE CONSTELLATION AND SENSOR OVERVIEW 12

2.1 PLANETSCOPE SATELLITE CONSTELLATION AND SENSOR CHARACTERISTICS 13

Table 1-A: PlanetScope Constellation and Sensor Specifications 13

2.2 SKYSAT SATELLITE CONSTELLATION AND SENSOR CHARACTERISTICS 14

Table 1-C: SkySat Constellation Overview 15

Table 1-D: SkySat Pointing 15

Table 1-E: SkySat Sensor Specifications 16

2.2.1 SKYSAT STEREO IMAGING CAPABILITY 16

3. PLANETSCOPE IMAGERY PRODUCTS 16

Table 2-A: PlanetScope Satellite Image Product Processing Levels 17

3.1 RADIOMETRIC INTERPRETATION 17

3.1.1 PLANETSCOPE NORMALIZATION AND HARMONIZATION 18

3.2 PLANETSCOPE BASIC SCENE PRODUCT SPECIFICATION 19

Table 2-B: PlanetScope Analytic Basic Scene Product Attributes 19

3.3 PLANETSCOPE ORTHO SCENES PRODUCT SPECIFICATION 20

Table 2-C: PlanetScope Ortho Scene Product Attributes 21

3.3.1 PlanetScope Visual Ortho Scene Product Specification 21

Table 2-D: PlanetScope Visual Ortho Scene Product Attributes 22

3.3.2 PlanetScope Analytic Ortho Scene Product Specification 22

Table 2-E: PlanetScope Analytic Ortho Scene Product Attributes 22

3.4 PLANETSCOPE ORTHO TILE PRODUCT SPECIFICATION 23



Figure 2: PlanetScope Scene to Ortho Tile Conversion. 24

Table 2-F: PlanetScope Ortho Tile Product Attributes 24

3.4.1 PlanetScope Visual Ortho Tile Product Specification 25

Table 2-G: PlanetScope Visual Ortho Tile Product Attributes 25

3.4.2 PlanetScope Analytic Ortho Tile Product Specification 26

Figure 3: PlanetScope Analytic Ortho Tiles with RGB (left) and NIR False-Color Composite (right) 27

Table 2-H: PlanetScope Analytic Ortho Tile Product Attributes 27

3.4.3 PlanetScope Analytic 5B Ortho Tile Product Specification 28

Figure 4: PlanetScope Analytic Bands 29

4. RAPIDEYE IMAGERY PRODUCTS 29

Table 3-A: RapidEye Satellite Image Product Processing Levels 30

4.1 RADIOMETRIC INTERPRETATION 30

4.2 RAPIDEYE BASIC SCENE PRODUCT SPECIFICATION 32

Table 3-B: RapidEye Basic Scene Product Attributes 32

4.3 RAPIDEYE ORTHO TILE PRODUCT SPECIFICATION 33

Table 3-C: RapidEye Ortho Tile Product Attributes 34

4.3.1 RapidEye Visual Ortho Tile Product Specification 34

Figure 5: RapidEye Visual Ortho Tile 35

4.3.2 RapidEye Analytic Ortho Tile Product Specification 36

5. SKYSAT IMAGERY PRODUCTS 37

5.1 SKYSAT BASIC SCENE PRODUCT SPECIFICATION 38

Table 1: SkySat Basic Scene Product Attributes 38

5.2 SKYSAT VIDEO PRODUCT SPECIFICATION 39

Table 2: SkySat Video Product Attributes 40

5.# SKYSAT ALL-FRAMES PRODUCT SPECIFICATION 41



Table #: SkySat All-Frames Product Attributes 41

5.# PINHOLE CAMERA MODEL 42

5.# FRAME INDEX FILE 43

5.3 RADIOMETRIC INTERPRETATION 44

Table 11: Skysat Analytic Ortho Scene ESUN values, resampled from Thuillier irradiance spectra 45

5.4 SCENE METADATA 45

Table 8: Skysat Basic Scene Geojson Metadata Schema 45

5.3 BASIC SCENE RPC METADATA 46

5.4 SKYSAT VIDEO METADATA 47

Table 10: Skysat Video JSON file Metadata Schema 48

Table 11: Frame Index (csv) 49

5.5 SKYSAT ORTHO SCENE PRODUCT SPECIFICATION 50

Table 10: SkySat Ortho Scene Product Attributes 51

Table 11: SkySat Ortho Scene Asset Attributes 52

5.6 SKYSAT ANALYTIC SCENE GEOTIFF PROPERTIES 54

Table 12: Properties included in the GeoTIFF Header, under ‘TIFFTAG_IMAGEDESCRIPTION’ 54

5.7 SKYSAT ORTHO SCENE GEOJSON METADATA 54

Table 13: Skysat Ortho Scene Geojson Metadata Schema 54

5.8 SKYSAT ORTHO COLLECT PRODUCT SPECIFICATION 55

Table 4-H: SkySat Ortho Collect Attributes 56

5.9 SKYSAT ANALYTIC COLLECT GEOTIFF PROPERTIES 57

Table 14: Properties included in the GeoTIFF Header, under ‘TIFFTAG_IMAGEDESCRIPTION’ 57

5.10 SKYSAT COLLECT METADATA 57

Table 14: Skysat Ortho Collect Geojson Metadata Schema 57

5.11 SKYSAT BASEMAP MOSAIC TILES PRODUCT SPECIFICATION 59



Table 4-I: Individual Quad Specifications 59

6. OTHER PROVIDER IMAGERY PRODUCTS 59

6.1 LANDSAT 8 60

Table 5-A: Landsat 8 data properties 60

6.2 SENTINEL-2 61

Table 5-B: Sentinel-2 Data Properties 61

7. PRODUCT PROCESSING 62

7.1 PLANETSCOPE PROCESSING 63

Table 6-A: PlanetScope Processing Steps 63

Figure 6: PlanetScope Image Processing Chain 64

7.2 RAPIDEYE PROCESSING 64

Table 6-B: RapidEye Processing Steps 65

Figure 7: RapidEye Image Processing Chain 65

7.3 SKYSAT PROCESSING 66

Table 6-C: SkySat Processing Steps 67

Figure 8: SkySat Image Processing Chain 68

8. PRODUCT METADATA 68

8.1 ORTHO TILES 69

8.1.1 PlanetScope 69

Table 7-A: PlanetScope Ortho Tile GeoJSON Metadata Schema 69

Table 7-B: PlanetScope Ortho Tile Surface Reflectance GeoTIFF Metadata Schema 71

8.1.2 RapidEye 73

Table 7-C: RapidEye Ortho Tile GeoJSON Metadata Schema 73

Table 7-D: RapidEye Ortho Tile Surface Reflectance Metadata Schema 74

8.2 ORTHO SCENES 76



8.2.1 PlanetScope 77

Table 7-E: PlanetScope Ortho Scene GeoJSON Metadata Schema 77

Table 7-F: PlanetScope Ortho Scene Surface Reflectance GeoTIFF Metadata Schema 79

8.2.2 SkySat 81

Table 7-G: Skysat Ortho Scene Geojson Metadata Schema 81

8.3 BASIC SCENES 82

8.3.1 PlanetScope 82

Table 7-H: PlanetScope Basic Scene GeoJSON Metadata Schema 82

8.3.2 RapidEye 84

Table 7-I: RapidEye Basic Scene GeoJSON Metadata Schema 84

8.3.3 SkySat 85

Table 7-J: Skysat Basic Scene Geojson Metadata Schema 85

8.4 ORTHO COLLECT 86

8.4.1 SkySat 86

Table 8: Skysat Ortho Collect Geojson Metadata Schema 86

9. PRODUCT DELIVERY 88

9.1 PLANET APPLICATION PROGRAMMING INTERFACES (APIS) 89

9.2 PLANET EXPLORER GRAPHICAL USER INTERFACE (GUI) 89

9.3 PLANET ACCOUNT MANAGEMENT TOOLS 90

APPENDIX A – IMAGE SUPPORT DATA 90

1. GENERAL XML METADATA FILE 91

Table A-1: General XML Metadata File Field Descriptions 91

2. UNUSABLE DATA MASK FILE 97

3. USABLE DATA MASK FILE 98

APPENDIX B - TILE GRID DEFINITION 99



Figure B-1: Layout of UTM Zones 100

Figure B-2: Layout of Tile Grid within a single UTM zone 101

Figure B-3: Illustration of grid layout of Rows and Columns for a single UTM Zone 102

No part of this document may be reproduced in any form or any means without the prior written consent of Planet.
Unauthorized possession or use of this material or disclosure of the proprietary information without the prior written consent
of Planet may result in legal action. If you are not the intended recipient of this report, you are hereby notified that the use,
circulation, quoting, or reproducing of this report is strictly prohibited and may be unlawful.



GLOSSARY

The following list defines terms used to describe Planet’s satellite imagery products.

Alpha Mask
An alpha mask is an image channel with binary values that can be used to render areas of the image product
transparent where no data is available.

Application Programming Interface (API)


A set of routines, protocols, and tools for building software applications.

Atmospheric Correction
The process of correcting at-sensor radiance imagery to account for effects related to the intervening
atmosphere between the earth’s surface and the satellite. Atmospheric correction has been shown to
significantly improve the accuracy of image classification.

Blackfill
Non-imaged pixels or pixels outside of the buffered area of interest that are set to black. They may appear as
pixels with a value of “0” or as “noData” depending on the viewing software.

Digital Elevation Model (DEM)


The representation of continuous elevation values over a topographic surface by a regular array of z-values,
referenced to a common datum. DEMs are typically used to represent terrain relief.

GeoJSON
A standard for encoding geospatial data using JSON (see JSON below).

GeoTIFF
An image format with geospatial metadata suitable for use in a GIS or other remote sensing software.

Ground Sample Distance (GSD)


The distance between pixel centers, as measured on the ground. It is mathematically calculated based on
optical characteristics of the telescope, the altitude of the satellite, and the size and shape of the CCD sensor.

Graphical User Interface (GUI)


Web-based interfaces that enable users to interact with Planet's imagery products without needing to know
how to use APIs (Application Programming Interfaces).

JavaScript Object Notation (JSON)


Text-based data interchange format used by the Planet API.

Landsat 8
Freely available dataset offered through NASA and the United States Geological Survey.

Metadata
Data delivered with Planet’s imagery products that describes the product’s content and context and can be
used to conduct analysis or further processing.



Nadir
The point on the ground directly below the satellite.

Near-Infrared (NIR)
Near Infrared is the region of the electromagnetic spectrum with wavelengths just longer than those of visible red light.

Orthorectification
The process of removing and correcting geometric image distortions introduced by satellite collection
geometry, pointing error, and terrain variability.

Ortho Tile
Ortho Tiles are one of Planet’s core product lines of high-resolution satellite imagery. Ortho tiles are available in two
different product formats, Visual and Analytic, each offered in GeoTIFF format.

PlanetScope
The first three generations of Planet’s optical systems are referred to as PlanetScope 0, PlanetScope 1, and
PlanetScope 2.

Radiometric Correction
The correction of variations in data that are not caused by the object or image being scanned. These include
correction for relative radiometric response between detectors, filling non-responsive detectors and scanner
inconsistencies.

Reflectance Coefficient
The reflectance coefficient provided in the metadata is used as a multiplicative factor to convert Analytic TOA
Radiance values to TOA Reflectance.

RapidEye
RapidEye refers to the five-satellite constellation operating between 2009 and 2020.

Scene
A single image captured by a PlanetScope satellite.

Sensor Correction
The correction of variations in the data that are caused by sensor geometry, attitude and ephemeris.

Sentinel-2
Copernicus Sentinel-2 is a multispectral imaging satellite constellation operated by the European Space Agency.

SkySat
SkySat refers to the 15-satellite constellation in operation since 2014.

Sun Azimuth
The angle of the sun as seen by an observer located at the target point, as measured in a clockwise direction
from the North.

Sun Elevation
The angle of the sun above the horizon.



Sun Synchronous Orbit (SSO)
A geocentric orbit that combines altitude and inclination in such a way that the satellite passes over any given
point of the planet’s surface at the same local solar time.

Surface Reflectance (SR)


Surface reflectance is the amount of light reflected by the surface of the earth. It is a ratio of surface radiance to
surface irradiance, and as such is unitless, and typically has values between 0 and 1. The Surface Reflectance (SR)
Product is derived from the standard Planet Analytic (Radiance) Product and is processed to top of atmosphere
reflectance and then atmospherically corrected to (bottom of atmosphere or) surface reflectance. Planet uses
the 6S radiative transfer model with ancillary data from MODIS to account for atmospheric effects on the
observed signal at the sensor for the PlanetScope constellation.

Tile Grid System


Ortho tiles are based on a worldwide, fixed UTM grid system. The grid is defined by 24 km by 24 km tile centers;
each tile extends an additional 500 m on every side, giving 1 km of overlap with adjacent tiles and resulting in
25 km by 25 km tiles.

Unusable Data Mask

The unusable data mask is a raster image having the same dimensions as the image product, indicating on a
pixel-by-pixel basis which pixels are unusable because they are cloud filled, outside of the observed area and
therefore blackfilled, or the pixel value is missing or suspect (due to saturation, blooming, hot pixels, dust,
sensor damage, etc). The unusable data mask is an 8-bit image, where each pixel contains a bit pattern
indicating conditions applying to the imagery pixel. A value of zero indicates a "good" imagery pixel.

● Bit 0: Black fill - Identifies whether the area contains blackfill in all bands (this area was not imaged by
the spacecraft). A value of “1” indicates blackfill.
● Bit 1: Cloud - This pixel is assessed to likely be an opaque cloud.
● Bit 2: Blue is missing or suspect.
● Bit 3: Green is missing or suspect.
● Bit 4: Red is missing or suspect.
● Bit 5: Red Edge is missing or suspect (RapidEye only).
● Bit 6: NIR is missing or suspect.
● Bit 7: Unused.
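
For illustration, the following minimal Python sketch decodes a UDM pixel value into the conditions listed above; the flag names are ours and not part of the product.

```python
# Minimal sketch: decode an 8-bit UDM pixel value into the conditions listed above.
# Flag names are illustrative, not part of the product specification.
UDM_BITS = {
    0: "blackfill",
    1: "cloud",
    2: "blue_missing_or_suspect",
    3: "green_missing_or_suspect",
    4: "red_missing_or_suspect",
    5: "red_edge_missing_or_suspect",  # RapidEye only
    6: "nir_missing_or_suspect",
    7: "unused",
}

def decode_udm(pixel_value: int) -> list[str]:
    """Return the names of the conditions set in a UDM pixel (0 = good pixel)."""
    return [name for bit, name in UDM_BITS.items() if pixel_value & (1 << bit)]

print(decode_udm(0))  # [] -> usable pixel
print(decode_udm(3))  # ['blackfill', 'cloud']
```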

Usable Data Mask

The usable data mask is a raster image having the same dimensions as the image product, comprised of 8
bands, where each band represents a specific usability class mask. The usability masks are mutually exclusive,
and a value of one indicates that the pixel is assigned to that usability class.

● Band 1: clear mask (a value of “1” indicates the pixel is clear, a value of “0” indicates that the pixel is not
clear and is one of the 5 remaining classes below)
● Band 2: snow mask
● Band 3: shadow mask
● Band 4: light haze mask
● Band 5: heavy haze mask
● Band 6: cloud mask
● Band 7: confidence map (a value of “0” indicates a low confidence in the assigned classification, a value
of “100” indicates a high confidence in the assigned classification)



● Band 8: unusable data mask (see Unusable Data Mask above)
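
As an example of working with these bands, the sketch below (using the rasterio library; the file name is hypothetical) reads a UDM2 file and estimates the usable fraction of the product.

```python
# Sketch: estimate the usable fraction of an image from its UDM2 file.
# The band order follows the list above; the file name is hypothetical.
import rasterio

with rasterio.open("example_udm2.tif") as src:
    clear = src.read(1)        # Band 1: clear mask (1 = clear)
    confidence = src.read(7)   # Band 7: confidence map (0-100)
    udm = src.read(8)          # Band 8: original unusable data mask bits

clear_fraction = clear.mean()           # share of pixels classified as clear
high_confidence = (confidence >= 80).mean()  # share of pixels with high confidence
print(f"clear: {clear_fraction:.1%}, high confidence: {high_confidence:.1%}")
```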



1. OVERVIEW OF DOCUMENT
This document describes Planet satellite imagery products. It is intended for users of satellite imagery
interested in working with Planet’s product offerings.

1.1. COMPANY OVERVIEW

Planet uses an agile aerospace approach for the design of its satellites, mission control, and operations systems,
and for the development of its web-based platform for imagery processing and delivery. Planet employs an “always
on” image capturing method, as opposed to the traditional tasking model used by most satellite companies
today.

1.2 DATA PRODUCT OVERVIEW

Planet operates the PlanetScope (PS) and SkySat (SS) Earth-imaging constellations. Imagery is collected and
processed in a variety of formats to serve different use cases, be it mapping, deep learning, disaster response,
precision agriculture, or simple temporal image analytics to create rich information products.

PlanetScope satellite imagery is captured as a continuous strip of single frame images known as “scenes.”
Scenes are derived from multiple generations of PlanetScope satellites. Older generations of PlanetScope
satellites acquired a single RGB (red, green, blue) frame or a split-frame with an RGB half and a NIR
(near-infrared) half, depending on the capability of the satellite. The newer generations of PlanetScope satellites
(PS2.SD and PSB.SD) acquire images as a multi-stripe frame, with bands covering red, green, blue, and NIR (PS2.SD)
or those four plus coastal blue, green I, yellow, and red edge (PSB.SD).

Planet offers three product lines for PlanetScope imagery: a Basic Scene product, an Ortho Scene product, and
an Ortho Tile product. The Basic Scene product is a scaled Top of Atmosphere Radiance (at sensor) and
sensor-corrected product. The Basic Scene product is designed for users with advanced image processing and
geometric correction capabilities. The product is not orthorectified or corrected for terrain distortions. Ortho
Scenes represent the single-frame image captures as acquired by a PlanetScope satellite with additional post
processing applied. Ortho Tiles are multiple orthorectified scenes in a single strip that have been merged and
then divided according to a defined grid.

SkySat imagery is captured, similarly to PlanetScope, as a continuous strip of single frame images known as
“scenes,” all acquired in the blue, green, red, near-infrared, and panchromatic bands. SkySat data is
available in four product lines: the Basic Scene, Ortho Scene, Basemap, and SkySat Collect products.



2. SATELLITE CONSTELLATION AND SENSOR OVERVIEW

2.1 PLANETSCOPE SATELLITE CONSTELLATION AND SENSOR CHARACTERISTICS

The PlanetScope satellite constellation consists of multiple launches of groups of individual satellites. Therefore,
on-orbit capacity is constantly improving in capability or quantity, with technology improvements deployed at a
rapid pace.

Each PlanetScope satellite is a CubeSat 3U form factor (10 cm by 10 cm by 30 cm). The complete PlanetScope
constellation of approximately 130 satellites is able to image the entire land surface of the Earth every day
(equating to a daily collection capacity of 200 million km²). This capacity varies with the number of
satellites in orbit and with the season, as satellites image less in the northern hemisphere during winter
because of fewer daylight hours.

PlanetScope satellites launched starting in November 2018 have sensor characteristics that enable improved
spectral resolution. The second generation of PlanetScope satellites (known as Dove-R or PS2.SD) have a sensor
plane consisting of four separate stripes organized vertically along the track of the flight path. PlanetScope
images from PS2.SD satellites are available from March 2019 (sparsely) to April 22, 2022.

A third generation of PlanetScope sensors (known as SuperDove or PSB.SD) is currently in orbit and is
producing daily imagery with 8 spectral bands (coastal blue, blue, green I, green, red, yellow, red edge and
near-infrared). These satellites were launched in early 2020 and started producing imagery in mid-March 2020.
PSB.SD PlanetScope satellites reached near daily cadence in August 2021. Starting on April 29, 2022, all new
PlanetScope images have 8 bands and come from the PSB.SD satellites (SuperDoves). The 8-band PlanetScope
images can be obtained through all Planet platforms, integrations, and APIs. The item type is PSScene.

Composite images with the second and third generation PlanetScope sensors are produced by an image
registration process involving multiple frames ahead and behind an anchor frame. The band alignment is
dependent on ground-lock in the anchor frame and will vary with scene content. For example, publication yield
is expected to be lower in scenes over open water, mountainous terrain, or cloudy areas.

The band alignment threshold is based on across-track registration residuals, currently set to 0.3 pixels for
“standard” PlanetScope products (instruments PS2.SD and PSB.SD) and 0.5 pixels to qualify as “test.” Whether a
PlanetScope image is classified as “standard” or “test” can be determined from the image GeoJSON
metadata property “quality_category.”
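
For example, assuming search results shaped like the GeoJSON features returned by Planet's Data API, scenes can be filtered on this property as in the following sketch (feature structure and IDs are illustrative).

```python
# Sketch: keep only "standard" quality scenes from a list of GeoJSON features,
# e.g. search results from Planet's Data API (feature structure assumed here).
def standard_scenes(features):
    return [f for f in features
            if f.get("properties", {}).get("quality_category") == "standard"]

features = [
    {"id": "scene_a", "properties": {"quality_category": "standard"}},
    {"id": "scene_b", "properties": {"quality_category": "test"}},
]
print([f["id"] for f in standard_scenes(features)])  # ['scene_a']
```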

Table 1-A: PlanetScope Constellation and Sensor Specifications

CONSTELLATION OVERVIEW: PLANETSCOPE

Mission Characteristics: Sun-synchronous orbit

Instrument: PS2 | PS2.SD | PSB.SD

Orbit Altitude (reference):
PS2, PS2.SD: 450 - 580 km (~98° inclination)
PSB.SD: 475 - 525 km (~98° inclination)

Max/Min Latitude Coverage: ±81.5° (dependent on season)

Equator Crossing Time: 7:30 - 11:30 am (local solar time)

Sensor Type:
PS2: Four-band frame imager with a split-frame VIS+NIR filter
PS2.SD: Four-band frame imager with butcher-block filter providing blue, green, red, and NIR stripes
PSB.SD: Eight-band frame imager with butcher-block filter providing coastal blue, blue, green I, green, yellow, red, red-edge, and NIR stripes

Spectral Bands:
PS2: Blue 455 - 515 nm; Green 500 - 590 nm; Red 590 - 670 nm; NIR 780 - 860 nm
PS2.SD: Blue 464 - 517 nm; Green 547 - 585 nm; Red 650 - 682 nm; NIR 846 - 888 nm
PSB.SD: Coastal Blue 431 - 452 nm; Blue 465 - 515 nm; Green I 513 - 549 nm; Green 547 - 583 nm; Yellow 600 - 620 nm; Red 650 - 680 nm; Red-Edge 697 - 713 nm; NIR 845 - 885 nm

Ground Sample Distance (nadir):
PS2, PS2.SD: 3.0 m - 4.1 m (approximate, altitude dependent)
PSB.SD: 3.7 m - 4.2 m (approximate, altitude dependent)

Off-Nadir Angle: 0° - 5° (latitude dependent)

Frame Size (approximate):
PS2: 24 km x 8 km
PS2.SD: 24 km x 16 km
PSB.SD: 32.5 km x 19.6 km

Maximum Image Strip per Orbit: 20,000 km²

Revisit Time: Daily at nadir

Image Capture Capacity: 200 million km²/day

Imagery Bit Depth: 12-bit

Availability Dates:
PS2: July 2014 - April 2022
PS2.SD: March 2019 - April 2022
PSB.SD: March 2020 - present

2.2 SKYSAT SATELLITE CONSTELLATION AND SENSOR CHARACTERISTICS

The SkySat-C generation satellite is a high-resolution Earth imaging satellite, first launched in 2016. Fourteen are
currently in orbit, all collecting thousands of square kilometers of imagery. Each satellite is 3-axis stabilized and agile



enough to slew between different targets of interest. Each satellite has four thrusters for orbital control, along
with four reaction wheels and three magnetic torquers for attitude control.

All SkySats contain Cassegrain telescopes with a focal length of 3.6m, with three 5.5 megapixel CMOS imaging
detectors making up the focal plane.

Table 1-C: SkySat Constellation Overview

CONSTELLATION OVERVIEW: SKYSAT

Attribute Value

Mass 110 kg

Dimensions 60 x 60 x 95 cm

Total DeltaV 180 m/s

Onboard Storage 360 GB + 360 GB cold spare storage

RF Communication X-band downlink (payload): variable, up to 580 Mbit/s


X-band downlink (telemetry): 64 Kbit/s
S-band uplink (command): 32 Kbit/s

Design Life ~6 years

Table 1-D: SkySat Pointing

SKYSAT POINTING

Geolocation Knowledge: 30 - 50 m [SkySat-3 - SkySat-21]

Pixel Size (Orthorectified): All assets: 0.50 m

Ground Sample Distance:
[SkySat-1, SkySat-2] Panchromatic: 0.86 m; Multispectral: 1.0 m
[SkySat-3 - SkySat-15] Panchromatic: 0.65 m; Multispectral: 0.81 m
[SkySat-16 - SkySat-21] Panchromatic: 0.58 m; Multispectral: 0.72 m

Revisit (per satellite): 4 - 5 days
*Reference altitude 500 km

Equatorial Crossing (local time):
10:30 - SkySat-3 - 7, SkySat-14 - 15
13:00 - SkySat-1 and SkySat-2
13:00 - SkySat-8 - 13
SkySat-16 - 21 crossing times vary daily

Table 1-E: SkySat Sensor Specifications

SKYSAT SENSOR SPECIFICATIONS

Product Attribute Description

Image Configurations Multispectral Sensor (Blue, Green, Red, NIR)

Panchromatic Sensor

Product Framing SkySat satellites have three cameras per satellite, which capture overlapping strips.
Each of these strips contains overlapping scenes. One scene is approximately 2560 x
1080 pixels.

Sensor Type CMOS Frame Camera with Panchromatic and Multispectral halves

Spectral Bands Blue: 450 - 515 nm


Green: 515 - 595 nm
Red: 605 - 695 nm
NIR: 740 - 900 nm
Pan: 450 - 900 nm

2.2.1 SKYSAT STEREO IMAGING CAPABILITY

The SkySats are currently capable of capturing imagery in the traditional satellite stereo or tri-stereo approach.
Stereo pairs are captured by a single SkySat, in a single pass, symmetrically about nadir with a total convergence
angle between ~27 and 50 degrees. Tri-stereos are captured similarly, with the middle capture collected as close
to nadir as possible and a ~27 degree convergence angle between consecutive collects. Hence the total
convergence angle of a triplet is ~53 degrees between the first and last collect.



3. PLANETSCOPE IMAGERY PRODUCTS
PlanetScope imagery products are available as either individual Basic Scenes, Ortho Scenes, or Ortho Tile
products. The Basic and Ortho Scenes can be obtained from the Planet API through the PSScene item type.

Table 2-A: PlanetScope Satellite Image Product Processing Levels

PLANETSCOPE SATELLITE IMAGE PRODUCT PROCESSING LEVELS

PlanetScope Basic Scene Product (Level 1B): Scaled Top of Atmosphere Radiance (at sensor) and sensor corrected
product. The Basic Scene product is designed for users with advanced image processing and geometric
correction capabilities. This product has scene based framing and is not projected to a cartographic projection.
Radiometric and sensor corrections are applied to the data.

PlanetScope Ortho Scene Product (Level 3B): Orthorectified, scaled Top of Atmosphere Radiance (at sensor) or
Surface Reflectance image product suitable for analytic and visual applications. This product has scene based
framing and is projected to a cartographic projection.

PlanetScope Ortho Tile Product (Level 3A): Radiometric and sensor corrections applied to the data. Imagery is
orthorectified and projected to a UTM projection.

The name of each acquired PlanetScope image is designed to be unique and allow for easier recognition and
sorting of the imagery. It includes the date and time of capture, as well as the id of the satellite that captured it.
The name of each downloaded image product is composed of the following elements:

<acquisition date>_<acquisition time>_<satellite_id>_<productLevel><bandProduct>.<extension>
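
As an illustration of this naming convention, the sketch below splits a product file name into its elements; the example file name and the field labels are hypothetical.

```python
# Sketch: split a downloaded PlanetScope product file name into the elements
# described above. The example file name and field labels are hypothetical.
def parse_product_name(filename: str) -> dict:
    stem, _, extension = filename.rpartition(".")
    date, time, satellite_id, *rest = stem.split("_")
    return {
        "acquisition_date": date,
        "acquisition_time": time,
        "satellite_id": satellite_id,
        "level_and_band_product": "_".join(rest),
        "extension": extension,
    }

print(parse_product_name("20220514_102431_1030_3B_AnalyticMS.tif"))  # hypothetical name
```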

3.1 RADIOMETRIC INTERPRETATION

Analytic products are scaled to Top of Atmosphere Radiance. Validation of radiometric accuracy of the on-orbit
calibration has been measured at 5% using vicarious collects in the Railroad Valley calibration site.



All PlanetScope satellite images are collected at a bit depth of 12 bits and stored on-board the satellites with a
bit depth of up to 12 bits. Radiometric corrections are applied during ground processing and all images are
scaled to a 16-bit dynamic range. This scaling converts the (relative) pixel DNs coming directly from the sensor
into values directly related to absolute at-sensor radiances. The scaling factor is applied to minimize
quantization error and the resultant single DN values correspond to 1/100th of a W/(m²*sr*μm). The DNs of the
PlanetScope image pixels represent the absolute calibrated radiance values for the image.

Converting to Radiance and Top of Atmosphere Reflectance

To convert the pixel values of the Analytic products to radiance, it is necessary to multiply the DN value by the
radiometric scale factor, as follows:

RAD(i) = DN(i) * radiometricScaleFactor(i), where radiometricScaleFactor(i) = 0.01

The resulting value is the at-sensor radiance of that pixel in watts per square meter per steradian per micrometer (W/(m²*sr*μm)).

To convert the pixel values of the Analytic products to Top of Atmosphere Reflectance, it is necessary to multiply
the DN value by the reflectance coefficient found in the XML file. This makes the complete conversion from DN
to Top of Atmosphere Reflectance to be as follows:

REF(i) = DN(i) * reflectanceCoefficient(i)
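
A short numerical sketch of both conversions follows; the DN values and the reflectance coefficient are illustrative placeholders (per-band reflectance coefficients are read from the product's XML metadata file), while the 0.01 scale factor is the one given above.

```python
# Sketch of the two conversions described above. The DN array and the
# reflectance coefficient are illustrative; the 0.01 scale factor is from the text.
import numpy as np

dn = np.array([[1250, 1308], [1190, 1275]], dtype=np.uint16)  # example 16-bit DNs

radiometric_scale_factor = 0.01    # converts DN to W/(m^2*sr*um)
reflectance_coefficient = 2.15e-5  # placeholder; read per band from the XML file

radiance = dn * radiometric_scale_factor         # at-sensor radiance
toa_reflectance = dn * reflectance_coefficient   # top of atmosphere reflectance

print(radiance)
print(toa_reflectance)
```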

Atmospheric Correction

Surface reflectance is determined from top of atmosphere (TOA) reflectance, calculated using coefficients
supplied with the Planet Radiance product.

The Planet Surface Reflectance product corrects for the effects of the Earth's atmosphere, accounting for the
molecular composition and variation with altitude along with aerosol content. Combining the use of standard
atmospheric models with the use of MODIS water vapor, ozone and aerosol data, this provides reliable and
consistent surface reflectance scenes over Planet's varied constellation of satellites as part of our normal,
on-demand data pipeline. However, there are some limitations to the corrections performed:

● In some instances there is no MODIS data overlapping a Planet scene or the area nearby. In those cases,
AOD is set to a value of 0.226 which corresponds to a “clear sky” visibility of 23km, the aot_quality is set
to the MODIS “no data” value of 127, and aot_status is set to ‘Missing Data - Using Default AOT’. If there
is no overlapping water vapor or ozone data, the correction falls back to a predefined 6SV internal
model.
● The effects of haze and thin cirrus clouds are not corrected for.
● Aerosol type is limited to a single, global model.
● All scenes are assumed to be at sea level and the surfaces are assumed to exhibit Lambertian scattering
- no BRDF effects are accounted for.
● Stray light and adjacency effects are not corrected for.

3.1.1 PLANETSCOPE NORMALIZATION AND HARMONIZATION

Planet provides a “harmonization” tool in all Planet platforms to perform an approximate transform of the
Surface Reflectance measurements of the PS2-instrument PlanetScope satellites to the Surface Reflectance
equivalents from the PS2.SD- and PSB.SD-instrument PlanetScope satellites. This is done by using Sentinel-2 as the



target sensor. Read technical details of normalizing data in Scene Level Normalization and Harmonization of
Planet Dove Imagery.

To convert the PS2 instrument PlanetScope Surface Reflectance values to a PSB.SD equivalent measurement,
use the “harmonization” tool. This tool is available in Planet Explorer, ArcGIS Pro Add-In and QGIS Plug-In when
placing an order. Use the “harmonization” tool in the Orders API, Subscriptions API, and Google Earth Engine if
you are downloading data through the API.

Note: The harmonization process only applies to bands with a PS2 equivalent—specifically Blue, Green, Red, and
Near-infrared—and only for Surface Reflectance values.
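
As one possible way to request harmonized data programmatically, the sketch below posts an order with a harmonize tool targeting Sentinel-2; the endpoint, product bundle, item ID, and tool fields are assumptions that should be verified against Planet's Orders API documentation.

```python
# Sketch only: order a PSScene surface reflectance product with harmonization applied.
# Endpoint, bundle name, item ID, and tool fields are assumptions to verify against
# Planet's Orders API documentation.
import os
import requests

ORDERS_URL = "https://api.planet.com/compute/ops/orders/v2"  # assumed endpoint

order = {
    "name": "harmonized_example",
    "products": [{
        "item_ids": ["20220514_102431_1030"],   # hypothetical PSScene item id
        "item_type": "PSScene",
        "product_bundle": "analytic_sr_udm2",    # assumed bundle name
    }],
    "tools": [{"harmonize": {"target_sensor": "Sentinel-2"}}],  # assumed tool schema
}

response = requests.post(ORDERS_URL, json=order,
                         auth=(os.environ["PL_API_KEY"], ""))
response.raise_for_status()
print(response.json()["id"])
```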

3.2 PLANETSCOPE BASIC SCENE PRODUCT SPECIFICATION

The PlanetScope Basic Scene product is a Scaled Top of Atmosphere Radiance (at sensor) and sensor corrected
product, providing imagery as seen from the spacecraft without correction for any geometric distortions
inherent in the imaging process. It has a scene based framing, and is not mapped to a cartographic projection.
This product line is available in GeoTIFF and NITF 2.1 formats.

The PlanetScope Basic Scene product is a multispectral analytic data product from the satellite constellation.
This product has not been processed to remove distortions caused by terrain and allows analysts to derive
information products for data science and analytics.

The Basic Scene product is designed for users with advanced image processing capabilities and a desire to
geometrically correct the product themselves. The imagery data is accompanied by Rational Polynomial
Coefficients (RPCs) to enable orthorectification by the user.

The geometric sensor corrections applied to this product correct for:

● Optical distortions caused by sensor optics


● Co-registration of bands

The table below describes the attributes for the PlanetScope Basic Scene product:

Table 2-B: PlanetScope Analytic Basic Scene Product Attributes

PLANETSCOPE BASIC SCENE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format The PlanetScope Basic Scene product consists of the following file components:
● Image File – GeoTIFF format
● Metadata File – XML format
● Rational Polynomial Coefficients (RPC) - XML format
● Thumbnail File – GeoTIFF format
● Unusable Data Mask (UDM) File – GeoTIFF format
● Usable Data Mask (UDM2) File - GeoTIFF format

Information Content

Analytic Bands 4-band multispectral image (blue, green, red, near-infrared)
8-band multispectral image (coastal blue, blue, green I, green, red, yellow, red edge,
and near-infrared - PSB.SD only)

Ground Sample Distance Approximate, satellite altitude dependent


PS2: 3.0 m-4.1 m
PS2.SD: 3.0 m-4.1 m
PSB.SD: 3.7 m-4.2 m

Processing

Pixel Size Approximate, satellite altitude dependent


PS2: 3.0 m-4.1 m
PS2.SD: 3.0 m-4.1 m
PSB.SD: 3.7 m-4.2 m

Bit Depth Analytic (DN): 12-bit


Analytic (Radiance - W/(m²*sr*μm)): 16-bit

Product Size Nominal scene size is approximately (at 475 km altitude):


PS2: 24 km by 8 km
PS2.SD: 24 km by 16 km
PSB.SD: 32.5 km by 19.6 km
with some variability by satellite altitude.

Geometric Corrections Spacecraft-related effects are corrected using attitude telemetry and best available
ephemeris data, and refined using GCPs.

Positional Accuracy Less than 10 m RMSE

Radiometric Corrections ● Conversion to absolute radiometric values based on calibration coefficients


● Radiometric values scaled by 100 to reduce quantization error
● Calibration coefficients are regularly monitored and updated with on-orbit
calibration techniques.

Map Projection N/A

3.3 PLANETSCOPE ORTHO SCENES PRODUCT SPECIFICATION

PlanetScope satellites collect imagery as a series of overlapping framed scenes, and these Scene products are
not organized to any particular tiling grid system. The Ortho Scene products enable users to create seamless
imagery by stitching together PlanetScope Ortho Scenes of their choice and clipping it to a tiling grid structure
as required.

The PlanetScope Ortho Scene product is orthorectified and designed for a wide variety of
applications that require imagery with an accurate geolocation and cartographic projection. It has been
processed to remove distortions caused by terrain and can be used for cartographic purposes. The Ortho Scenes
are delivered as visual (RGB) and analytic products. Ortho Scenes are radiometrically-, sensor-, and
geometrically-corrected (optionally atmospherically corrected) products that are projected to a cartographic map
projection. The geometric correction uses fine Digital Elevation Models (DEMs) with a post spacing of between
30 and 90 meters.

Ground Control Points (GCPs) are used in the creation of every image and the accuracy of the product will vary
from region to region based on available GCPs. Computer vision algorithms, such as OpenCV’s STAR keypoint
detector and FREAK descriptor extractor, are used to extract feature points. The GCP and tiepoint matching
is done using a combination of RANSAC, phase correlation, and mutual information.

The table below describes the attributes for the PlanetScope Ortho Scene product:

Table 2-C: PlanetScope Ortho Scene Product Attributes

PLANETSCOPE ORTHO SCENE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format PlanetScope Ortho Scene product consists of the following file components:
● Image File – GeoTIFF format
● Metadata File – XML format
● Thumbnail File – GeoTIFF format
● Unusable Data Mask (UDM) file – GeoTIFF format
● Usable Data Mask (UDM2) file - GeoTIFF format

Product Orientation Map North up

Product Framing Scene Based

Pixel Size (orthorectified) 3m

Bit Depth Visual: 8-bit


Analytic (Radiance - W/(m²*sr*μm)): 16-bit
Analytic SR (Surface Reflectance): 16-bit

Product Size Nominal scene size is approximately (at 475km altitude):


PS2: 25 km by 11.5 km
PS2.SD: 25 km by 23.0 km
PSB.SD: 32.5 km by 19.6 km
with some variability by satellite altitude.

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model.
Orthorectification uses GCPs and fine DEMs (30 m to 90 m posting).

Atmospheric Corrections Atmospheric effects are corrected using 6SV2.1 radiative transfer code. AOD, water
vapor and ozone inputs are retrieved from MODIS near-real-time data (MOD09CMA,
MOD09CMG and MOD08-D3).

Horizontal Datum WGS84

Map Projection UTM

Resampling Kernel Cubic Convolution

3.3.1 PlanetScope Visual Ortho Scene Product Specification

The PlanetScope Visual Ortho Scene product is orthorectified and color-corrected (using a color curve). This
correction attempts to optimize colors as seen by the human eye providing images as they would look if viewed
from the perspective of the satellite. This product has been processed to remove distortions caused by terrain
and can be used for cartographic mapping and visualization purposes. This correction also eliminates the



perspective effect on the ground (not on buildings), restoring the geometry of a vertical shot. Additionally, a
correction is made to the sun angle in each image to account for differences in latitude and time of acquisition.

The Visual Ortho Scene product is optimal for simple and direct use of an image. It is designed and made
visually appealing for a wide variety of applications that require imagery with an accurate geolocation and
cartographic projection. The product can be used and ingested directly into a Geographic Information System.

Table 2-D: PlanetScope Visual Ortho Scene Product Attributes

PLANETSCOPE VISUAL ORTHO SCENE PRODUCT ATTRIBUTES

Product Attribute Description

Information Content

Visual Bands 3-band natural color (red, green, blue)

Ground Sample Distance Approximate, satellite altitude dependent


PS2: 3.0 m-4.1 m
PS2.SD: 3.0 m-4.1 m
PSB.SD: 3.7 m-4.2 m

Processing

Pixel Size (orthorectified) 3.0 m

Bit Depth 8-bit

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model.
Spacecraft-related effects are corrected using attitude telemetry and best available
ephemeris data. Orthorectified using GCPs and fine DEMs (30 m to 90 m posting) to
<10 m RMSE positional accuracy.

Positional Accuracy Less than 10 m RMSE

Color Enhancements Enhanced for visual use and corrected for sun angle

3.3.2 PlanetScope Analytic Ortho Scene Product Specification

The PlanetScope Analytic Ortho Scene product is orthorectified, multispectral data from the satellite
constellation. Analytic products are calibrated multispectral imagery products that have been processed to
allow analysts to derive information products for data science and analytics. This product is designed for a wide
variety of applications that require imagery with an accurate geolocation and cartographic projection. The
product has been processed to remove distortions caused by terrain and can be used for many data science
and analytic applications. It eliminates the perspective effect on the ground (not on buildings), restoring the
geometry of a vertical shot. The PlanetScope Analytic Ortho Scene is optimal for value-added image processing
such as land cover classifications. The imagery has radiometric corrections applied to correct for any sensor
artifacts and transformation to at-sensor radiance.

Table 2-E: PlanetScope Analytic Ortho Scene Product Attributes

PLANETSCOPE ANALYTIC ORTHO SCENE PRODUCT ATTRIBUTES



Product Attribute Description

Information Content

Analytic Bands 3-band multispectral image (red, green, blue) - only available for PS2 images
4-band multispectral image (blue, green, red, near-infrared)
8-band multispectral image (coastal blue, blue, green I, green, red, yellow, red edge
and near-infrared) - only available for PSB.SD images

Ground Sample Distance Approximate, satellite altitude dependent


PS2: 3.0 m-4.1 m
PS2.SD: 3.0 m-4.1 m
PSB.SD: 3.7 m-4.2 m

Processing

Pixel Size (orthorectified) 3.0 m

Bit Depth Analytic (Radiance - W/(m²*sr*μm)): 16-bit


Analytic SR (Surface Reflectance): 16-bit

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model.
Spacecraft-related effects are corrected using attitude telemetry and best available
ephemeris data. Orthorectified using GCPs and fine DEMs (30 m to 90 m posting) to
<10 m RMSE positional accuracy.

Positional Accuracy Less than 10 m RMSE

Radiometric Corrections ● Conversion to absolute radiometric values based on calibration coefficients


● Radiometric values scaled by 100 to reduce quantization error
● Calibration coefficients are regularly monitored and updated with on-orbit
calibration techniques.

Atmospheric Corrections ● Conversion to top of atmosphere (TOA) reflectance values using at-sensor
radiance and supplied coefficients
● Conversion to surface reflectance values using the 6SV2.1 radiative transfer code
and MODIS NRT data
● Reflectance values scaled by 10,000 to reduce quantization error
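
Because surface reflectance values are delivered scaled by 10,000 (as noted above), a short sketch of recovering unitless reflectance follows (the file name is hypothetical).

```python
# Sketch: convert scaled surface reflectance DNs (scaled by 10,000 as noted above)
# back to unitless reflectance. The file name is hypothetical.
import numpy as np
import rasterio

with rasterio.open("example_AnalyticMS_SR.tif") as src:
    scaled = src.read().astype(np.float32)  # all bands, 16-bit scaled reflectance

reflectance = scaled / 10000.0              # unitless, typically 0-1
print(reflectance.min(), reflectance.max())
```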

3.4 PLANETSCOPE ORTHO TILE PRODUCT SPECIFICATION

The PlanetScope Ortho Tile products offer PlanetScope Satellite imagery orthorectified as individual 25 km by
25 km tiles referenced to a fixed, standard image tile grid system. This product was designed for a wide variety
of applications that require imagery with an accurate geolocation and cartographic projection. It has been
processed to remove distortions caused by terrain and can be used for cartographic purposes.

For PlanetScope split-frame satellites, imagery is collected as a series of overlapping framed scenes from a
single satellite in a single pass. These scenes are subsequently orthorectified and an ortho tile is then generated
from a collection of consecutive scenes, typically 4 to 5. The process of conversion of framed scene to ortho tile is
outlined in the figure below.



The PlanetScope Ortho Tile products are radiometrically-, sensor-, and geometrically-corrected and aligned to a
cartographic map projection. The geometric correction uses fine DEMs with a post spacing of between 30 and
90 meters. GCPs are used in the creation of every image and the accuracy of the product will vary from region
to region based on available GCPs.

Figure 1: PlanetScope Scene to Ortho Tile Conversion.

The table below describes the attributes for the PlanetScope Ortho Tile product:

Table 2-F: PlanetScope Ortho Tile Product Attributes

PLANETSCOPE ORTHO TILE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format PlanetScope Ortho Tile product consists of the following file components:
● Image File – GeoTIFF format
● Metadata File – XML format
● Thumbnail File – GeoTIFF format
● Unusable Data Mask (UDM) File – GeoTIFF format
● Usable Data Mask (UDM2) File - GeoTIFF format

Product Orientation Map North Up

Product Framing PlanetScope Ortho Tiles are based on a worldwide, fixed UTM grid system. The grid
is defined by 24 km by 24 km tile centers; each tile extends an additional 500 m on
every side, giving 1 km of overlap with adjacent tiles and resulting in 25 km by 25 km tiles.

Pixel Size (orthorectified) 3.125 m

Bit Depth 16-bit



Product Size Tile size is 25 km (8000 lines) by 25 km (8000 columns). 5 to 500 Mbytes per Tile for 4
bands at 3.125 m pixel size after orthorectification.

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model.
Orthorectified using GCPs and fine DEMs (30 m to 90 m posting).

Atmospheric Corrections Atmospheric effects are corrected using 6SV2.1 radiative transfer code. AOD, water
vapor and ozone inputs are retrieved from MODIS near-real-time data (MOD09CMA
and MOD09CMG).

Horizontal Datum WGS84

Map Projection UTM

Resampling Kernel Cubic Convolution

3.4.1 PlanetScope Visual Ortho Tile Product Specification

The PlanetScope Visual Ortho Tile product is orthorectified and color-corrected (using a color curve). This
correction attempts to optimize colors as seen by the human eye providing images as they would look if viewed
from the perspective of the satellite. It has been processed to remove distortions caused by terrain and can be
used for cartographic mapping and visualization purposes. It eliminates the perspective effect on the ground
(not on buildings), restoring the geometry of a vertical shot. Additionally, a correction is made to the sun angle
in each image to account for differences in latitude and time of acquisition.

The Visual product is optimal for simple and direct use of the image. It is designed and made visually appealing
for a wide variety of applications that require imagery with an accurate geolocation and cartographic projection.
The product can be used and ingested directly into a Geographic Information System.

Table 2-G: PlanetScope Visual Ortho Tile Product Attributes

PLANETSCOPE VISUAL ORTHO TILE PRODUCT ATTRIBUTES

Product Attribute Description

Information Content

Visual Bands 3-band natural color (red, green, blue)

Ground Sample Distance Approximate, satellite altitude dependent


PS2: 3.0 m-4.1 m
PS2.SD: 3.0 m-4.1 m
PSB.SD: 3.7 m-4.2 m

Processing

Pixel Size (orthorectified) 3.125 m

Bit Depth 8-bit

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model,
bands are co-registered, and spacecraft-related effects are corrected using attitude
telemetry and best available ephemeris data. Orthorectified using GCPs and fine
DEMs (30 m to 90 m posting) to < 10 m RMSE positional accuracy.



Positional Accuracy Less than 10 m RMSE

Color Enhancements Enhanced for visual use and corrected for sun angle

3.4.2 PlanetScope Analytic Ortho Tile Product Specification

The PlanetScope Analytic Ortho Tile product is orthorectified, multispectral data from the satellite constellation.
Analytic products are calibrated multispectral imagery products that have been processed to allow analysts to
derive information products for data science and analytics. This product is designed for a wide variety of
applications that require imagery with an accurate geolocation and cartographic projection. It has been
processed to remove distortions caused by terrain and can be used for many data science and analytic
applications. It eliminates the perspective effect on the ground (not on buildings), restoring the geometry of a
vertical shot. The orthorectified imagery is optimal for value-added image processing including
vegetation indices, land cover classifications, etc. In addition to orthorectification, the imagery has radiometric
corrections applied to correct for any sensor artifacts and transformation to scaled at-sensor radiance.

Figure 2: PlanetScope Analytic Ortho Tiles with RGB (left) and NIR False-Color Composite (right)

Table 2-H: PlanetScope Analytic Ortho Tile Product Attributes

PLANETSCOPE ANALYTIC ORTHO TILE PRODUCT ATTRIBUTES

Product Attribute Description

Information Content

Analytic Bands 4-band multispectral image (blue, green, red, near-infrared)

Ground Sample Distance Approximate, satellite altitude dependent


PS2: 3.0 m-4.1 m
PS2.SD: 3.0 m-4.1 m
PSB.SD: 3.7 m-4.2 m

Processing



Pixel Size (orthorectified) 3.125 m

Bit Depth Analytic (DN): 12-bit


Analytic (Radiance - W/(m²*sr*μm)): 16-bit
Analytic SR (Surface Reflectance): 16-bit

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model,
bands are co-registered, and spacecraft-related effects are corrected using attitude
telemetry and best available ephemeris data. Orthorectified using GCPs and fine
DEMs (30 m to 90 m posting) to <10 m RMSE positional accuracy.

Positional Accuracy Less than 10 m RMSE

Radiometric Corrections ● Conversion to absolute radiometric values based on calibration coefficients


● Radiometric values scaled by 100 to reduce quantization error
● Calibration coefficients are regularly monitored and updated with on-orbit
calibration techniques.

Atmospheric Corrections ● Conversion to top of atmosphere (TOA) reflectance values using at-sensor
radiance and supplied coefficients
● Conversion to surface reflectance values using the 6SV2.1 radiative transfer code
and MODIS NRT data
● Reflectance values scaled by 10,000 to reduce quantization error

3.4.3 PlanetScope Analytic 5B Ortho Tile Product Specification

The PlanetScope Analytic 5B Ortho Tile product is identical to the Analytic Ortho Tile above except with the
PlanetScope red-edge band included.

PLANETSCOPE ANALYTIC 5B ORTHO TILE PRODUCT ATTRIBUTES

Product Attribute Description

Information Content

Analytic Bands 5-band multispectral image (blue, green, red, red-edge, near-infrared)

Ground Sample Distance Approximate, satellite altitude dependent


PS2: 3.0 m-4.1 m
PS2.SD: 3.0 m-4.1 m
PSB.SD: 3.7 m-4.2 m

Processing

Pixel Size (orthorectified) 3.125 m

Bit Depth Analytic (Radiance - W/(m²*sr*μm)): 16-bit

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model,
bands are co-registered, and spacecraft-related effects are corrected using attitude
telemetry and best available ephemeris data. Orthorectified using GCPs and fine
DEMs (30 m to 90 m posting) to <10 m RMSE positional accuracy.



Positional Accuracy Less than 10 m RMSE

Radiometric Corrections ● Conversion to absolute radiometric values based on calibration coefficients


● Radiometric values scaled by 100 to reduce quantization error
● Calibration coefficients are regularly monitored and updated with on-orbit
calibration techniques.

Figure 3: PlanetScope Analytic Bands



4. RAPIDEYE IMAGERY PRODUCTS
RapidEye imagery products are available in two different processing levels.

Table 3-A: RapidEye Satellite Image Product Processing Levels

RapidEye Basic Scene Product (Level 1B): Radiometric and sensor corrections applied to the data. On-board
spacecraft attitude and ephemeris applied to the data.

RapidEye Ortho Tile Product (Level 3A): Radiometric and sensor corrections applied to the data. Imagery is
orthorectified using the RPCs and an elevation model.

The name of each acquired RapidEye image is designed to be unique and allow for easier recognition and
sorting of the imagery. It includes the date and time of capture, as well as the id of the satellite that captured it.
The name of each downloaded image product is composed of the following elements:

RapidEye Ortho Tiles:

<tileid>_<acquisition_date>_<satellite_id>_<productLevel>_<productType>.<extension>

RapidEye Basic Scenes:

<acquisition_date>T<acquisition_time>_<satellite_id>_<productLevel>_<productType>.<extension>

4.1 RADIOMETRIC INTERPRETATION

Analytic products are scaled to Top of Atmosphere Radiance. Validation of radiometric accuracy of the on-orbit
calibration has been measured at 5% using vicarious collects in the Railroad Valley calibration site. Furthermore,
each band is maintained within a range of +/- 2.5% from the band mean value across the constellation and over
the satellite’s lifetime.

All RapidEye satellite images were collected at a bit depth of 12 bits; on board the satellites, the least
significant bit was removed, so 11 bits were stored and downlinked. On the ground, the bit shift is reversed by
a multiplication factor of 2. The bit depth of the original raw imagery can be determined from the “shifting”
field in the XML metadata file. During on-ground processing, radiometric corrections are applied and all images
are scaled to a 16-bit dynamic range. This scaling converts the (relative) pixel DNs coming directly from the
sensor into values directly related to absolute at sensor radiances. The scaling factor is applied so that the
resultant single DN values correspond to 1/100th of a W/(m²*sr*μm). The DNs of the RapidEye image pixels
represent the absolute calibrated radiance values for the image.



Converting to Radiance and Top of Atmosphere Reflectance

To convert the pixel values of the Analytic products to radiance, it is necessary to multiply the DN value by the
radiometric scale factor, as follows:

RAD(i) = DN(i) * radiometricScaleFactor(i), where radiometricScaleFactor(i) = 0.01

The resulting value is the at-sensor radiance of that pixel in watts per square meter per steradian per micrometer (W/(m²*sr*μm)).

Reflectance is generally the ratio of reflected radiance to incoming radiance. Note that this ratio
has a directional aspect. To turn radiance into reflectance it is necessary to relate the radiance values (i.e. the
pixel DNs multiplied by the radiometric scale factor) to the radiance with which the object is illuminated. This is
often done by applying atmospheric correction software to the image, because this also eliminates the impact of the
atmosphere on the radiance values at the same time. It is also possible to neglect the
influence of the atmosphere by calculating the Top Of Atmosphere (TOA) reflectance, taking into consideration
only the sun distance and the geometry of the incoming solar radiation. The formula to calculate the TOA
reflectance not taking into account any atmospheric influence is as follows:

REF(i) = RAD(i) * π * SunDist² / (EAI(i) * cos(SolarZenith))

with:

● i = Number of the spectral band


● REF = reflectance value
● RAD = Radiance value
● SunDist = Earth-Sun Distance at the day of acquisition in Astronomical Units. Note: This value is not
fixed, it varies between 0.983 289 8912 AU and 1.016 710 3335 AU and has to be calculated for the image
acquisition point in time.
● EAI = Exo-Atmospheric Irradiance
● SolarZenith = Solar Zenith angle in degrees (= 90° – sun elevation)

For RapidEye, the EAI values for the 5 bands are (based on the “New Kurucz 2005” model):

● Blue: 1997.8 W/m²µm


● Green: 1863.5 W/m²µm
● Red: 1560.4 W/m²µm
● RE: 1395.0 W/m²µm
● NIR: 1124.4 W/m²µm
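
A worked sketch of the TOA reflectance formula using the EAI values above follows; the radiance, Earth-Sun distance, and solar zenith angle are illustrative placeholders.

```python
# Sketch: TOA reflectance for one RapidEye band using the formula above.
# The radiance, Earth-Sun distance, and solar zenith values are illustrative.
import math

EAI = {"blue": 1997.8, "green": 1863.5, "red": 1560.4, "red_edge": 1395.0, "nir": 1124.4}

rad = 85.0           # at-sensor radiance, W/(m^2*sr*um), illustrative
sun_dist = 1.0035    # Earth-Sun distance in AU at acquisition time, illustrative
solar_zenith = 35.0  # degrees (= 90 - sun elevation), illustrative

ref = (math.pi * rad * sun_dist**2) / (EAI["red"] * math.cos(math.radians(solar_zenith)))
print(f"TOA reflectance (red band): {ref:.4f}")
```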

Atmospheric Correction

Surface reflectance is determined from top of atmosphere (TOA) reflectance, calculated using coefficients
supplied with the Planet Radiance product.

The Planet Surface Reflectance product corrects for the effects of the Earth's atmosphere, accounting for the
molecular composition and variation with altitude along with aerosol content. Combining the use of standard
atmospheric models with the use of MODIS water vapor, ozone, and aerosol data, this provides reliable and



consistent surface reflectance scenes over Planet's varied constellation of satellites as part of our normal,
on-demand data pipeline. However, there are some limitations to the corrections performed:

● In some instances there is no MODIS data overlapping a Planet scene or the area nearby. In those cases,
AOD is set to a value of 0.226 which corresponds to a “clear sky” visibility of 23km, the aot_quality is set
to the MODIS “no data” value of 127, and aot_status is set to ‘Missing Data - Using Default AOT’. If there
is no overlapping water vapor or ozone data, the correction falls back to a predefined 6SV internal
model.
● The effects of haze and thin cirrus clouds are not corrected for.
● Aerosol type is limited to a single, global model.
● All scenes are assumed to be at sea level and the surfaces are assumed to exhibit Lambertian scattering
- no BRDF effects are accounted for.
● Stray light and adjacency effects are not corrected for.

4.2 RAPIDEYE BASIC SCENE PRODUCT SPECIFICATION

The RapidEye Basic product is the least processed of the available RapidEye imagery products. This product is
designed for customers with advanced image processing capabilities and a desire to geometrically correct the
product themselves. This product line is available in GeoTIFF and NITF formats.

The RapidEye Basic Scene product is radiometrically- and sensor-corrected, providing imagery as seen from the
spacecraft without correction for any geometric distortions inherent in the imaging process, and is not mapped
to a cartographic projection. The imagery data is accompanied by all spacecraft telemetry necessary for the
processing of the data into a geo-corrected form, or when matched with a stereo pair, for the generation of
digital elevation data. Resolution of the images is 6.5 meters GSD at nadir. The images are resampled to a
coordinate system defined by an idealized basic camera model for band alignment.

The radiometric corrections applied to this product:

● Correction of relative differences of the radiometric response between detectors


● Non-responsive detector filling which fills null values from detectors that are no longer responding (This
isn’t currently done because there are no non-responsive detectors)
● Conversion to absolute radiometric values based on calibration coefficients

The geometric sensor corrections applied to this product correct for:

● Internal detector geometry which combines the two sensor chipsets into a virtual array
● Optical distortions caused by sensor optics
● Registration of all bands together to ensure all bands line up with each other correctly

The table below lists the product attributes for the RapidEye Basic Scene product.

Table 3-B: RapidEye Basic Scene Product Attributes

RAPIDEYE BASIC SCENE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format RapidEye Basic Scene product consists of the following file components:



● Image File – Image product delivered as a group of single-band NITF or GeoTIFF
files with associated RPC values. Bands are co-registered.
● Metadata File – XML format metadata file and GeoJSON metadata available
● Unusable Data Mask (UDM) File – GeoTIFF format
● Spacecraft information (SCI) file - XML format and contains additional
information related to spacecraft attitude, spacecraft ephemeris, spacecraft
temperature measurements, line imaging times, camera geometry, and
radiometric calibration data.
● Browse Image - GeoTIFF format (also referred to as “Quicklook”)

Product Orientation Spacecraft/Sensor Orientation

Product Framing Geographic based framing – a geographic region is defined by two corners. The product
width is close to the full image swath as observed by all bands (77 km at nadir, subject to
minor trimming of up to 3 km during processing), with a product length that does not
exceed 300 km, a minimum length of 50 km, and around a 10 km overlap.

Ground Sample Distance (nadir) 6.5 m

Bit Depth 16-bit unsigned integers

Pixel Size (orthorectified) 6.5m at Nadir

Radiometric Accuracy Absolute accuracy less than +/- 5.0%


Inter-satellite Accuracy less than +/- 2.5% of the band mean across the constellation

Geometric Corrections Idealized sensor, orbit and attitude models. Bands are co-registered.

Positional Accuracy Less than 10 m RMSE


Band-to-Band Registration Less than 0.2 pixels (1-sigma) for terrain with slope below 10°

Horizontal Datum WGS84

Resampling Kernel Cubic Convolution

4.3 RAPIDEYE VISUAL ORTHO TILE PRODUCT SPECIFICATION

The RapidEye Ortho Tile products are orthorectified as individual 25 km by 25 km tiles. This product was
designed for a wide variety of applications that require imagery with an accurate geolocation and cartographic



projection. It has been processed to remove distortions caused by terrain and can be used for many
cartographic purposes.

The RapidEye Ortho Tile products are radiometrically-, sensor- and geometrically-corrected and aligned to a
cartographic map projection. The geometric correction uses fine DEMs with a post spacing of between 30 and
90 meters. GCPs are used in the creation of every image and the accuracy of the product will vary from region
to region based on available GCPs. RapidEye Ortho Tile products are output as 25 km by 25 km tiles referenced
to a fixed, standard RapidEye image tile grid system.

The table below lists the product attributes for the RapidEye Ortho Tile product.

Table 3-C: RapidEye Ortho Tile Product Attributes

RAPIDEYE ORTHO TILE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format RapidEye Ortho Tile product consists of the following file components:
● Image File – GeoTIFF file that contains image data and geolocation information
● Metadata File – XML format metadata file and GeoJSON metadata available
● Unusable Data Mask (UDM) File – GeoTIFF format

Product Orientation Map North Up

Product Framing RapidEye Ortho Tiles are based on a worldwide, fixed UTM grid system. The grid is defined by 24 km by 24 km tile centers; each tile extends an additional 500 m beyond its grid cell on every side (1 km of overlap between adjacent tiles), resulting in 25 km by 25 km tiles.

Pixel Size (orthorectified) 5m

Bit Depth Visual: 8-bit
Analytic (Radiance – W·m⁻²·sr⁻¹·μm⁻¹): 16-bit

Product Size Tile size is 25 km (5000 lines) by 25 km (5000 columns). 250 Mbytes per Tile for 5
bands at 5 m pixel size after orthorectification.

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model,
bands are co-registered, and spacecraft-related effects are corrected using attitude
telemetry and best available ephemeris data. Orthorectified using GCPs and fine
DEMs (30 m to 90 m posting).

Horizontal Datum WGS84

Map Projection UTM

Resampling Kernel Cubic Convolution

4.3.1 RapidEye Visual Ortho Tile Product Specification

The RapidEye Visual Ortho Tile product is orthorectified and color-corrected (using a color curve). This
correction optimizes colors as seen by the human eye, providing images as they would look if viewed from the
perspective of the satellite. It has been processed to remove distortions caused by terrain and can be used for
cartographic mapping and visualization purposes. It eliminates the perspective effect on the ground (not on



buildings), restoring the geometry of a vertical shot. Additionally, a correction is made to the sun angle in each
image to account for differences in latitude and time of acquisition.

The visual product is optimal for simple and direct use of the image. It is designed and made visually appealing
for a wide variety of applications that require imagery with an accurate geolocation and cartographic projection.
The product can be used and ingested directly into a Geographic Information System.

Figure 4: RapidEye Visual Ortho Tile

Table 3-D: RapidEye Visual Ortho Tile Product Attributes

RAPIDEYE VISUAL ORTHO TILE PRODUCT ATTRIBUTES

Product Attribute Description

Information Content

Visual Bands 3-band natural color (red, green, blue)

Ground Sample Distance 6.5 m (at reference altitude 630 km)

Processing

Pixel Size (orthorectified) 5m

Bit Depth 8-bit



Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model, bands are co-registered, and spacecraft-related effects are corrected using attitude telemetry and best available ephemeris data. Orthorectified using GCPs and fine DEMs (30 m to 90 m posting) to < 10 m RMSE positional accuracy.

Positional Accuracy Less than 10 m RMSE


Band-to-Band Registration Less than 0.2 pixels (1-sigma) for terrain with slope below 10°

Radiometric Corrections ● Correction of relative differences of the radiometric response between detectors.
● Non-responsive detector filling which fills null values from detectors that are no
longer responding.
● Conversion to absolute radiometric values based on calibration coefficients.

Color Enhancements Enhanced for visual use and corrected for sun angle

4.3.2 RapidEye Analytic Ortho Tile Product Specification

The RapidEye Analytic Ortho Tile product is orthorectified, multispectral data. This product is designed for a
wide variety of applications that require imagery with an accurate geolocation and cartographic projection. It
has been processed to remove distortions caused by terrain and can be used for many data science and analytic
applications. It eliminates the perspective effect on the ground (not on buildings), restoring the geometry of a
vertical shot. The orthorectified imagery is optimal for value-added image processing including vegetation
indices, land cover classifications, etc. In addition to orthorectification, the imagery has radiometric corrections
applied to correct for any sensor artifacts and transformation to at-sensor radiance.

Table 3-E: RapidEye Analytic Ortho Tile Product Attributes

RAPIDEYE ANALYTIC ORTHO TILE PRODUCT ATTRIBUTES

Product Attribute Description

Information Content

Analytic Bands 5-band multispectral image (blue, green, red, red edge, near-infrared)

Ground Sample Distance 6.5 m (at reference altitude 630 km)

Processing

Pixel Size (orthorectified) 5m

Bit Depth 16-bit

Radiometric Accuracy Absolute accuracy less than +/- 5.0%


Inter-satellite Accuracy less than +/- 2.5% of the band mean across the constellation

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model, bands are co-registered, and spacecraft-related effects are corrected using attitude telemetry and best available ephemeris data. Orthorectified using GCPs and fine DEMs (30 m to 90 m posting) to < 10 m RMSE positional accuracy.



Positional Accuracy Less than 10 m RMSE
Band-to-Band Registration Less than 0.2 pixels (1-sigma) for terrain with slope below 10°

Radiometric Corrections ● Correction of relative differences of the radiometric response between detectors.
● Non-responsive detector filling which fills null values from detectors that are no
longer responding.
● Conversion to absolute radiometric values based on calibration coefficients.

Atmospheric Corrections ● Conversion to top of atmosphere (TOA) reflectance values using at-sensor
radiance and supplied coefficients
● Conversion to surface reflectance values using the 6SV2.1 radiative transfer code
and MODIS NRT data
● Reflectance values scaled by 10,000 to reduce quantization error



5. SKYSAT IMAGERY PRODUCTS

5.1 SKYSAT BASIC SCENE PRODUCT SPECIFICATION

The SkySat Basic Scene product includes Analytic, Analytic DN, Panchromatic, Panchromatic DN, and L1A Panchromatic DN imagery; the DN assets are uncalibrated and delivered in raw digital number format. The Basic Scene product is not corrected for any geometric distortions inherent in the imaging process.

Imagery data is accompanied by Rational Polynomial Coefficients (RPCs) to enable orthorectification by the
user. This product is designed for users with advanced image processing capabilities and a desire to
geometrically correct the product themselves.

The Basic L1A Panchromatic DN assets (basic_l1a_panchromatic_dn, basic_l1a_panchromatic_dn_rpc) are made


available for download immediately after production, before the remaining imagery assets, which may require
super-resolution and orthorectification. Hence the L1A Pan browse image will be visible in Explorer and the API
before all other image assets are ready for download.

The SkySat Basic Scene Product has a sensor-based framing, and is not mapped to a cartographic projection.

● Analytic - unorthorectified, radiometrically corrected, multispectral BGRN


● Analytic DN - unorthorectified, multispectral BGRN
● Panchromatic - unorthorectified, radiometrically corrected, panchromatic (PAN)
● Panchromatic DN - unorthorectified, panchromatic (PAN)
● L1A Panchromatic DN - unorthorectified, pre-super resolution, panchromatic (PAN)

Table 4-A: SkySat Basic Scene Product Attributes

SKYSAT BASIC SCENE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format SkySat Basic Scene product consists of the following file components:
● Image File – GeoTIFF format
● Metadata File – JSON format
● Rational Polynomial Coefficients – Text File
● UDM File – GeoTIFF format

Information Content

Image Configurations 4-band Analytic DN Image (Blue, Green, Red, NIR)

1-band Panchromatic DN Image (Pan)

Product Orientation Spacecraft/Sensor Orientation

Product Framing Scene based:

SkySat satellites have three cameras per satellite, which capture overlapping strips. Each of these strips contains overlapping scenes.
One scene is approximately 2560 px x 1080 px.

Sensor Type CMOS Frame Camera with Panchromatic and Multispectral halves

Spectral Bands Blue: 450 - 515 nm


Green: 515 - 595 nm
Red: 605 - 695 nm
NIR: 740 - 900 nm
Pan: 450 - 900 nm

Processing Basic Scene

Product Bit Depth 16-bit Unsigned Integer Multispectral and Panchromatic Imagery

Radiometric Corrections Cross-Sensor Non Uniformity Correction (1%)


Conversion to absolute radiometric values based on calibration coefficients
Calibration coefficients regularly monitored and updated with on-orbit calibration techniques

Geometric Corrections Idealized sensor model and Rational Polynomial Coefficients (RPC)
Bands are co-registered

Horizontal Datum WGS84

Map Projection N/A

Resampling Kernel Resampling of Analytic Multispectral Data to > 1.0m GSD

Ground Sample Distance [SkySat-1, SkySat-2]


Panchromatic: 0.86m
Multispectral: 1.0m

[SkySat-3 - SkySat-15]
Panchromatic: 0.65m
Multispectral: 0.81m

[SkySat-16 - SkySat-21]
Panchromatic: 0.58m
Multispectral: 0.72m

Pixel Size (Orthorectified) All assets: 0.50 m

Geometric Accuracy <50m RMSE



5.2 SKYSAT VIDEO PRODUCT SPECIFICATION

Full-motion videos are collected for between 30 and 120 seconds by a single camera on any of the SkySats. Videos
are collected using the panchromatic half of the camera, so all videos are PAN only.

Videos are packaged and delivered with a video mpeg-4 file, plus all image frames with accompanying video
metadata and a frame index file (reference Product Types below).

● L1A Panchromatic DN - unorthorectified, pre-super resolution, panchromatic (PAN)

Table 4-B: SkySat Video Product Attributes

SKYSAT VIDEO SCENE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format Video file – MP4
Video frames – folder
- Image Frame File – TIFF format
- Rational Polynomial Coefficients – Text File
- Frame Index – CSV File
Metadata File – JSON format

Information Content

Image Configurations 1-band L1A Panchromatic DN Image (Pan)

Product Orientation Spacecraft/Sensor Orientation

Sensor Type CMOS Frame Camera with Panchromatic and Multispectral halves

Spectral Bands Pan: 450 - 900 nm

Video Duration 30 - 120 seconds

Processing Basic Video Scene

Bit Depth 16-bit Unsigned Integer

Radiometric Corrections Cross-Sensor Non Uniformity Correction (1%)

Geometric Corrections Idealized sensor model and Rational Polynomial Coefficients (RPC)

Horizontal Datum WGS84

Map Projection N/A

Resampling Kernel N/A

Ground Sample Distance [SkySat-3 - SkySat-15]


Panchromatic: 0.81m



[SkySat-16 - SkySat-21]
Panchromatic: 0.72m

Geometric Accuracy <50m RMSE

5.3 SKYSAT ALL-FRAMES PRODUCT SPECIFICATION

The SkySats capture up to 50 frames per second per Collect. The All-frames asset includes all of the originally
captured frames in a Collect, uncalibrated and in a raw digital number format. It is delivered as a zip file
containing all frames as basic L1A panchromatic DN imagery files, with accompanying RPC text files and a JSON
pinhole camera model.

Table 4-C: SkySat All-Frames Product Attributes

SKYSAT ALL-FRAMES SCENE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format All frames – folder
- Image Frame File – TIFF format
- Rational Polynomial Coefficients – Text File
- Pinhole camera model – JSON format
Metadata File – JSON format
Frame Index – CSV file

Information Content

Image Configurations 1-band L1A Panchromatic DN Image (Pan)

Product Orientation Spacecraft/Sensor Orientation

Sensor Type CMOS Frame Camera with Panchromatic and Multispectral halves

Spectral Bands Pan: 450 - 900 nm

Processing Basic L1A Scene

Bit Depth 16-bit Unsigned Integer

Radiometric Corrections Cross-Sensor Non Uniformity Correction (1%)

Geometric Corrections Idealized sensor model and Rational Polynomial Coefficients (RPC)

Horizontal Datum WGS84

Map Projection N/A

Resampling Kernel N/A

Ground Sample Distance [SkySat-3 - SkySat-15]


Panchromatic: 0.81m

[SkySat-16 - SkySat-21]



Panchromatic: 0.72m

Geometric Accuracy <50m RMSE

5.4 PINHOLE CAMERA MODEL

Described here is the JSON pinhole model that accompanies each all-frames asset. The pinhole model is based
on projective matrices, omitting the optical distortion model. As built, the SkySat telescopes have ~1 pixel or less
of distortion across all three sensors.

Projective Model

Note that this model uses 3D and 2D homogeneous coordinates.

Let X_ECEF be a position in ECEF coordinates, with values in meters, expressed in 3D homogeneous coordinates.

Let im = (u, v, w) be a position in imaging plane coordinates, with values in pixels (or fractional pixels), expressed in 2D homogeneous coordinates.

The projective model is described by three matrices, P_extrinsic ∈ R^(4x4), P_intrinsic ∈ R^(4x4), and P_camera ∈ R^(3x4), such that

im = P_camera · P_intrinsic · P_extrinsic · X_ECEF

For efficiency, all three components can also be combined into a single projective matrix P_projective ∈ R^(3x4) such that

P_projective = P_camera · P_intrinsic · P_extrinsic

A given value of im describes a projective ray in the pinhole camera frame, representing the projection of X_ECEF onto the camera sensor. Note that w = 0 indicates a ray parallel to the imaging plane that will never intersect the sensor. For w ≠ 0, the pixel coordinates are obtained by dividing through by w: (u/w, v/w).

Exterior Orientation

Let t = (t_x, t_y, t_z) describe the satellite position at a particular time, in ECEF coordinates and with values in meters.

Let q = (q_w, q_x, q_y, q_z) be a quaternion describing the rotation from the ECEF frame to the boresight frame (positive z-axis aligned with the telescope boresight), with R(q) the corresponding rotation matrix (following the conventions in https://en.wikipedia.org/wiki/Conversion_between_quaternions_and_Euler_angles#Rotation_matrices).

P_extrinsic is constructed from the exterior orientation by translating the origin to the satellite position and then applying the ECEF-to-boresight rotation:

P_extrinsic = | R(q)    -R(q)·t |
              | 0  0  0      1  |

Interior Orientation

P_intrinsic and P_camera are based on the rigorous model from "SkySat Imaging Geometry." Their derivation involves multiple frame changes and axis flips and is not described here. We expect these to remain nearly constant over time for each satellite and camera. P_extrinsic is unique to each satellite and imaging time, but shared across cameras for each capture event.
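
The sketch below illustrates, under stated assumptions, how the matrices described above can be composed to project an ECEF position to pixel coordinates. The JSON key names ("P_extrinsic", "P_intrinsic", "P_camera") are illustrative placeholders; consult the delivered pinhole camera model file for the actual field names.

# Minimal sketch (Python + numpy), assuming the pinhole model JSON exposes the
# three matrices under keys "P_extrinsic", "P_intrinsic", and "P_camera"
# (placeholder names - check the delivered file for the actual keys).
import json
import numpy as np

def project_ecef_to_pixel(model_path, x_ecef_m):
    """Project an ECEF position (meters) to fractional pixel coordinates."""
    with open(model_path) as f:
        model = json.load(f)

    p_ext = np.array(model["P_extrinsic"])   # 4x4
    p_int = np.array(model["P_intrinsic"])   # 4x4
    p_cam = np.array(model["P_camera"])      # 3x4

    # Combine the three components into a single 3x4 projective matrix.
    p_proj = p_cam @ p_int @ p_ext

    # 3D homogeneous ECEF position.
    x_h = np.append(np.asarray(x_ecef_m, dtype=float), 1.0)

    u, v, w = p_proj @ x_h
    if w == 0:
        raise ValueError("ray is parallel to the imaging plane")
    return u / w, v / w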

5.5 FRAME INDEX FILE

FRAME INDEX (CSV)

Field Value Sample

name Frame image filename(w/o file extension) 1207431805.69566202_sc00110_c2_PAN

datetime Time of frame capture 2018-04-10T21:43:07Z

gsd Ground Sample Distance 0.964506

sat_az Avg satellite azimuth for frame 48.3168

sat_elev Avg satellite elevation for frame 55.477

x_sat_eci_km X-axis aligned ECI coordinate 3074.73

y_sat_eci_km Y-axis aligned ECI coordinate 3057.87

z_sat_eci_km Z-axis aligned ECI coordinate 5338.56

qw_eci First pointing quaternion coordinate in ECI 0.28172862


coordinate system

qx_eci Second pointing quaternion coordinate in ECI -0.55973753


coordinate system



qy_eci Third pointing quaternion coordinate in ECI -0.74397115
coordinate system

qz_eci Fourth pointing quaternion coordinate in ECI -0.23201253


coordinate system

x_sat_ecef_km X-axis aligned ECEF coordinate 3816.34769

y_sat_ecef_km Y-axis aligned ECEF coordinate 4718.37789

z_sat_ecef_km Z-axis aligned ECEF coordinate 2999.05659

qw_ecef First pointing quaternion coordinate in ECEF 0.3396504


coordinate system

qx_ecef Second pointing quaternion coordinate in -0.3795945


ECEF coordinate system

qy_ecef Third pointing quaternion coordinate in ECEF 0.85021459


coordinate system

qz_ecef Fourth pointing quaternion coordinate in -0.13296909


ECEF coordinate system

bit_dpth Pixel bit depth of frame 16

geom Frame footprint polygon POLYGON((-123.132 49.2933,-123.089


49.294,-123.092 49.2825,-123.135 49.2818))

integration_time_ms Capture integration time, in ms 433.59375

filename Full filename with 12 digit timestamp 1289391430.33374000_sc00114_c3_PAN_i0000000


604.tif
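
The frame index can be read with standard CSV tooling; the short Python sketch below is an illustration only (not part of the delivered product) and prints the capture time and GSD of each frame using the column names listed above.

# Minimal sketch: iterate over a frame index CSV using only the Python
# standard library. Column names follow the table above.
import csv

def summarize_frame_index(csv_path):
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            print(row["name"], row["datetime"], float(row["gsd"]))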

5.6 RADIOMETRIC INTERPRETATION

To convert the pixel values of the Analytic products to radiance, it is necessary to multiply the DN value by the
radiometric scale factor, as follows:

RAD(i) = DN(i) * radiometric_scale_factor(i), where radiometric_scale_factor(i) = 0.01

The resulting value is the Top of Atmosphere Radiance of that pixel in watts per square meter per steradian per
micrometer (W/(m² · sr · µm)).

To convert the pixel values of the Analytic products to Top of Atmosphere Reflectance, it is necessary to multiply
the DN value by the reflectance coefficient found in the GeoTiff header. This makes the complete conversion
from DN to Top of Atmosphere Reflectance to be as follows:

REF(i) = DN(i) * reflectance_coefficient(i)

Alternatively, the customer may perform the TOA Reflectance conversion on their own using the following
equation, with the ESUN values given in Table 4-D below.



TOAR = (π × Radiance × d²) / (ESUN × cos(90° − sun elevation))

d = Earth-Sun distance in astronomical units

Table 4-D: Skysat Analytic Ortho Scene ESUN values, resampled from Thuillier irradiance spectra

PAN BLUE GREEN RED NIR

SkySat-1 1587.94 1984.85 1812.88 1565.83 1127

SkySat-2 1587.94 1984.85 1812.88 1565.83 1127

SkySat-3 1585.89 2000.7 1821.8 1584.13 1120.33

SkySat-4 1585.89 2000.7 1821.8 1584.13 1120.33

SkySat-5 1573.42 2009.23 1820.33 1584.84 1104.96

SkySat-6 1573.42 2009.23 1820.33 1584.84 1104.96

SkySat-7 1573.42 2009.23 1820.33 1584.84 1104.96

SkySat-8 1582.79 2009.28 1820.25 1583.3 1114.22

SkySat-9 1583.61 2009.29 1821.04 1583.83 1109.44

SkySat-10 1583.88 2008.61 1820.87 1583.5 1112.3

SkySat-11 1586.89 2009.26 1821.14 1583.66 1113.77

SkySat-12 1581.65 2009.5 1821.24 1584.91 1109.01

SkySat-13 1580.89 2009.43 1821.7 1583.77 1108.74

SkySat-14 1581.65 2009.5 1821.24 1584.91 1109.01

SkySat-15 1580.89 2009.43 1821.7 1583.77 1108.74

SkySat-16 - 21 1582.43 2005.51 1817.55 1580.98 1113.57
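
The following Python sketch illustrates the two conversions described above: scaled DN to TOA radiance via the radiometric scale factor, and radiance to TOA reflectance using an ESUN value from Table 4-D, the Earth-Sun distance, and the sun elevation. It is an illustrative example only; band ordering, ESUN values, and per-scene angles must be taken from the product metadata.

# Minimal sketch of the DN -> TOA radiance -> TOA reflectance conversions.
import math

RADIOMETRIC_SCALE_FACTOR = 0.01  # constant, per the product specification

def dn_to_toa_radiance(dn):
    """Scaled DN to TOA radiance in W/(m^2 * sr * um)."""
    return dn * RADIOMETRIC_SCALE_FACTOR

def radiance_to_toa_reflectance(radiance, esun, sun_elevation_deg, d_au=1.0):
    """TOA radiance to unitless TOA reflectance."""
    solar_zenith_rad = math.radians(90.0 - sun_elevation_deg)
    return (math.pi * radiance * d_au ** 2) / (esun * math.cos(solar_zenith_rad))

# Example with illustrative numbers (SkySat-3 blue band ESUN from Table 4-D):
# rad = dn_to_toa_radiance(12345)
# ref = radiance_to_toa_reflectance(rad, esun=2000.7,
#                                   sun_elevation_deg=56.98, d_au=1.0)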

5.7 SCENE METADATA

Basic Scene GeoJSON metadata

Table 4-E: Skysat Basic Scene Geojson Metadata Schema

SKYSAT BASIC SCENE GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

camera_id The specific detector used to capture the String (e.g. “d1”, “d2”)
scene.

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.



ground_control If the image meets the positional accuracy boolean
specifications this value will be true. If the
image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.

item_type The name of the item type that models string (e.g. “PSScene3Band”, ”SkySatScene”)
shared imagery data schema.

provider Name of the imagery provider. string ("planetscope","rapideye", “skysat”)

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage Stage of publishing for an item. Both "l1a" string (“preview”, “finalized”)
assets and SkySatScenes with
fast-rectification applied will have a
publishing_stage = "preview".
Fast-rectification refers to the initial
rectification of the orthorectified product, to
enable faster publication. Once
full-rectification is applied, all assets will be
updated to publishing_stage = "finalized".

quality_category Metric for image quality. To qualify for string (“standard”, “test”)
“standard” image quality an image must
meet a variety of quality standards, for
example: PAN motion blur less than 1.15
pixels, compression bits per pixel less than 3.
If the image does not meet these criteria it
is considered “test” quality.

satellite_azimuth Angle from true north to the satellite vector number (0 - 360)
at the time of imaging, projected on the
horizontal plane in degrees.

satellite_id Globally unique identifier of the satellite string


that acquired the underlying imagery.

strip_id Globally unique identifier of the image strip string


this scene was collected against

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (0 - 90)


angle used for imaging, in degrees.
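
As an illustration of how these fields might be used downstream, the hedged Python sketch below loads a Basic Scene GeoJSON metadata file and applies a simple quality filter; it assumes the schema fields above appear under the standard GeoJSON "properties" member.

# Minimal sketch: screen a scene by cloud cover, ground control, and quality
# category using the metadata schema above.
import json

def passes_quality_filter(metadata_path, max_cloud_cover=0.2):
    with open(metadata_path) as f:
        props = json.load(f)["properties"]
    return (props["cloud_cover"] <= max_cloud_cover
            and props["ground_control"]
            and props["quality_category"] == "standard")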



5.8 BASIC SCENE RPC METADATA

Table 9: Skysat Basic Scene Text file Metadata Schema

Parameter Description Sample

LINE_OFF Row offset of center point 534.896219421794

SAMP_OFF Column offset of center point 1267.3960612691

LAT_OFF Latitude coordinate of center point -18.1132

LONG_OFF Longitude coordinate of center point 178.4441

HEIGHT_OFF Altitude of center point 123

LINE_SCALE Scaling factor for row coordinate 534.896219421794

SAMP_SCALE Scaling factor for column coordinate 1267.39606126914

LAT_SCALE Scaling factor for latitude coordinates -0.0264

LONG_SCALE Scaling factor for longitude coordinates 0.0331

HEIGHT_SCALE Scaling factor for altitude coordinates 77

LINE_NUM_COEFF_ Numerator coefficient in row RPC equation (1-20) 4.27902854674

LINE_DEN_COEFF_ Denominator Coefficient in row RPC equation(1-20) 0.00174493132019

SAMP_NUM_COEFF_ Numerator coefficient in column RPC 0.0110620153979


equation(1-20)

SAMP_DEN_COEFF_ Denominator coefficient in column RPC equation 0.00174477677906


(1-20)
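
For readers who want to apply the RPCs directly, the sketch below shows a standard rational-polynomial ground-to-image mapping using the fields above. The 20-term polynomial ordering is assumed to follow the common RPC00B convention and should be verified against the delivered RPC text file; the dictionary keys mirror the field names in the table.

# Minimal sketch of an RPC ground-to-image projection (assumed RPC00B term order).
def _poly20(coeffs, P, L, H):
    terms = [1, L, P, H, L*P, L*H, P*H, L*L, P*P, H*H,
             P*L*H, L**3, L*P*P, L*H*H, L*L*P, P**3, P*H*H,
             L*L*H, P*P*H, H**3]
    return sum(c * t for c, t in zip(coeffs, terms))

def rpc_ground_to_image(rpc, lat_deg, lon_deg, height_m):
    """Map geodetic coordinates to (row, column) pixel coordinates."""
    P = (lat_deg - rpc["LAT_OFF"]) / rpc["LAT_SCALE"]
    L = (lon_deg - rpc["LONG_OFF"]) / rpc["LONG_SCALE"]
    H = (height_m - rpc["HEIGHT_OFF"]) / rpc["HEIGHT_SCALE"]
    row = _poly20(rpc["LINE_NUM_COEFF"], P, L, H) / _poly20(rpc["LINE_DEN_COEFF"], P, L, H)
    col = _poly20(rpc["SAMP_NUM_COEFF"], P, L, H) / _poly20(rpc["SAMP_DEN_COEFF"], P, L, H)
    return (row * rpc["LINE_SCALE"] + rpc["LINE_OFF"],
            col * rpc["SAMP_SCALE"] + rpc["SAMP_OFF"])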

5.9 SKYSAT VIDEO METADATA

Table 9: Skysat Video Geojson Metadata Schema

SKYSAT VIDEO GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

camera_id The specific detector used to capture the String (e.g. “d1”, “d2”)
scene.

item_type The name of the item type that models string (e.g. “PSScene3Band”, ”SkySatScene”)
shared imagery data schema.



provider Name of the imagery provider. string ("planetscope","rapideye", “skysat”)

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage SkySatVideo assets will always have a string (“finalized”)


“finalized” publishing_stage

quality_category Metric for image quality. To qualify for string (“standard”, “test”)
“standard” image quality an image must
meet a variety of quality standards, for
example: PAN motion blur less than 1.15
pixels, compression bits per pixel less than 3.
If the image does not meet these criteria it
is considered “test” quality.

satellite_azimuth Angle from true north to the satellite vector number (0 - 360)
at the time of imaging, projected on the
horizontal plane in degrees.

satellite_id Globally unique identifier of the satellite string


that acquired the underlying imagery.

strip_id Globally unique identifier of the image strip string


this scene was collected against

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (0 - 90)


angle used for imaging, in degrees.

Video Scene metadata

Table 4-F: Skysat Video JSON file Metadata Schema

VIDEO PRODUCT METADATA

Field Value Sample

Satellite Satellite ID 00110

Camera Camera used for imaging 2

Geometry Composite of the geospatial extent of all frames in the video GeoJson Polygon



Time

Start Start time of video capture 2018-04-10T21:43:07

End End time of video capture 2018-04-10T21:44:07

Duration (s) Duration of video in seconds 59.976592063903809

Angle

Start Satellite collection elevation of first frame in video 55.476973516035933

End Satellite collection elevation of last frame in video 61.410026752389307

Convergence Convergence angle between first and last frames 5.9330532363533734

Azimuth

Start Satellite azimuth angle of first frame in video 48.316762122631033

End Satellite azimuth angle of last frame in video 143.12580513942621

Delta Difference between start and end satellite azimuth angle 94.809043016795172

Exposure

Panchromatic Gain Sensor amplification of the signal 1.0, 10.0, or 30.0

Panchromatic Integration Time Integration time, in ms 433.59375

Compression Ratio The ratio comparing an image's true size to its size on the file system 4.0

Scan Rate Kms The ground speed at which the SkySat captures image frames 433.59375



Table 4-G: Frame Index (csv)

FRAME INDEX (CSV)

Field Value Sample

name Frame image filename(w/o file extension) 1207431805.69566202_sc00110_c2_PAN

datetime Time of frame capture 2018-04-10T21:43:07Z

gsd Ground Sample Distance 0.964506

sat_az Avg satellite azimuth for frame 48.3168

sat_elev Avg satellite elevation for frame 55.477

x_sat_eci_km X-axis aligned ECI coordinate 3074.73

y_sat_eci_km Y-axis aligned ECI coordinate 3057.87

z_sat_eci_km Z-axis aligned ECI coordinate 5338.56

qw_eci First pointing quaternion coordinate in ECI 0.28172862


coordinate system

qx_eci Second pointing quaternion coordinate in ECI -0.55973753


coordinate system

qy_eci Third pointing quaternion coordinate in ECI -0.74397115


coordinate system

qz_eci Fourth pointing quaternion coordinate in ECI -0.23201253


coordinate system

x_sat_ecef_km X-axis aligned ECEF coordinate 3816.34769

y_sat_ecef_km Y-axis aligned ECEF coordinate 4718.37789

z_sat_ecef_km Z-axis aligned ECEF coordinate 2999.05659

qw_ecef First pointing quaternion coordinate in ECEF 0.3396504


coordinate system

qx_ecef Second pointing quaternion coordinate in -0.3795945


ECEF coordinate system

qy_ecef Third pointing quaternion coordinate in ECEF 0.85021459


coordinate system

qz_ecef Fourth pointing quaternion coordinate in -0.13296909


ECEF coordinate system

bit_dpth Pixel bit depth of frame 16

geom Frame footprint polygon POLYGON((-123.132 49.2933,-123.089


49.294,-123.092 49.2825,-123.135 49.2818))

integration_time_ms Capture integration time, in ms 433.59375

filename Full filename with 12 digit timestamp 1289391430.33374000_sc00114_c3_PAN_i0000000


604.tif



5.10 SKYSAT ORTHO SCENE PRODUCT SPECIFICATION

The SkySat Ortho Scene product includes Visual, Analytic DN, Analytic, Panchromatic, and Pansharpened
Multispectral imagery. The Ortho Scene product is sensor- and geometrically-corrected, and is projected to a
cartographic map projection. The geometric correction uses fine Digital Elevation Models (DEMs) with a post
spacing of between 30 and 90 meters.

Ground Control Points (GCPs) are used in the creation of every image and the accuracy of the product will vary
from region to region based on available GCPs. Also note, ortho accuracy is not guaranteed for scenes with a
view angle greater than 30 degrees, captured above +/-85 degrees latitude, with low solar angles, varying
terrain, or with a large concentration of clouds, snow, or water within the scene or full collect.

Additionally, publication is not guaranteed for collections with very large view angles (i.e. greater than 45
degrees) and very low solar elevation (i.e. lower than 20 degrees).

● Visual - orthorectified, pansharpened, and color-corrected (using a color curve) 3-band RGB Imagery
● Pansharpened Multispectral - orthorectified, pansharpened 4-band BGRN Imagery
● Analytic SR - orthorectified, multispectral BGRN. Atmospherically corrected Surface Reflectance
product.
● Analytic - orthorectified, multispectral BGRN. Radiometric corrections applied to correct for any sensor
artifacts and transformation to top-of-atmosphere radiance
● Analytic DN - orthorectified, multispectral BGRN, uncalibrated digital number imagery product. Radiometric
corrections applied to correct for any sensor artifacts
● Panchromatic - orthorectified, radiometrically corrected, panchromatic (PAN)
● Panchromatic DN - orthorectified, panchromatic (PAN), uncalibrated digital number imagery product

Table 4-H: SkySat Ortho Scene Product Attributes

SKYSAT ORTHO SCENE PRODUCT ATTRIBUTES

Product Attribute Description

Product Components and Format Image File – GeoTIFF format
Metadata File – JSON format
Rational Polynomial Coefficients – Text File
UDM File – GeoTIFF format

Information Content

Product Framing Scene Based:

SkySat satellites have three cameras per satellite, which capture overlapping strips.
Each of these strips contains overlapping scenes.
One scene is approximately 2560 px x 1080 px.

Sensor Type CMOS Frame Camera with Panchromatic and Multispectral halves

Spectral Bands Blue: 450 - 515 nm


Green: 515 - 595 nm
Red: 605 - 695 nm
NIR: 740 - 900 nm
Pan: 450 - 900 nm

Processing

Radiometric Corrections Cross-Sensor Non Uniformity Correction (1%)


Conversion to absolute radiometric values based on calibration coefficients
Calibration coefficients regularly monitored and updated with on-orbit calibration techniques
Conversion to surface reflectance values using the 6SV2.1 radiative transfer code and MODIS
NRT data

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model.
Orthorectification uses GCPs and fine DEMs (30 m to 90 m posting).

Horizontal Datum WGS84

Map Projection UTM

Resampling Kernel Cubic Convolution

Geometric Accuracy <10 m RMSE

Table 4-I: SkySat Ortho Scene Asset Attributes

Product Attribute Description

Bands Visual: 3-band Pansharpened (PS Red, PS Green, PS Blue)
Pansharpened Multispectral: 4-band Pansharpened (PS Blue, PS Green, PS Red, PS NIR)
Analytic, Analytic DN, Analytic SR: 4-band Multispectral (B, G, R, N)
Panchromatic, Panchromatic DN: 1-band Panchromatic

Pixel Size (Orthorectified) All assets: 0.50 m

Bit Depth Visual: 8-bit Unsigned Integer
Pansharpened Multispectral, Analytic, Analytic DN, Panchromatic, Panchromatic DN: 16-bit Unsigned Integer

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model.
Orthorectification uses GCPs and fine DEMs (30m to 90m posting).

Radiometric Calibration Accuracy
Visual, Pansharpened Multispectral, Analytic DN, Panchromatic DN:
● No correction applied; pixel values are digital numbers
Analytic, Panchromatic:
● Absolute Radiance derived using vicarious calibration methods
● Product is radiometrically calibrated to radiance units [W/(µm * m² * sr)] and scaled by 100 to reduce quantization errors
● Calibration is regularly monitored and updated with on-orbit calibration techniques
● Conversion to surface reflectance values using the 6SV2.1 radiative transfer code and MODIS NRT data

Radiometric Accuracy (Analytic, Panchromatic) +/- 5% relative accuracy at < 10 degrees off-nadir angle

Color Enhancements (Visual) Enhanced for visual use

Atmospheric Correction

Surface reflectance is determined from top of atmosphere (TOA) reflectance, calculated using coefficients
supplied with the Planet Radiance product.

The Planet Surface Reflectance product corrects for the effects of the Earth's atmosphere, accounting for the
molecular composition and variation with altitude along with aerosol content. Combining the use of standard
atmospheric models with the use of MODIS water vapor, ozone and aerosol data, this provides reliable and
consistent surface reflectance scenes over Planet's varied constellation of satellites as part of our normal,
on-demand data pipeline. However, there are some limitations to the corrections performed:

● In some instances there is no MODIS data overlapping a Planet scene or the area nearby. In those cases,
AOD is set to a value of 0.226 which corresponds to a “clear sky” visibility of 23km, the aot_quality is set
to the MODIS “no data” value of 127, and aot_status is set to ‘Missing Data - Using Default AOT’. If there
is no overlapping water vapor or ozone data, the correction falls back to a predefined 6SV internal
model.
● The effects of haze and thin cirrus clouds are not corrected for.
● Aerosol type is limited to a single, global model.
● All scenes are assumed to be at sea level and the surfaces are assumed to exhibit Lambertian scattering
- no BRDF effects are accounted for.

● Stray light and adjacency effects are not corrected for.

5.11 SKYSAT ANALYTIC SCENE GEOTIFF PROPERTIES

Table 4-J: Properties included in the GeoTIFF Header, under ‘TIFFTAG_IMAGEDESCRIPTION’

Field Value Sample

radiometric_scale_factor Provides the parameter to convert the 0.01


scaled radiance pixel value to radiance.
Multiplying the scaled radiance pixel
values by the scale factor, derives the Top
of Atmosphere Radiance product. This
value is a constant, set to 0.01

reflectance_coefficients Multiplicative coefficients, one per band; when [0.0019093447035360626,
multiplied with the DN values, they provide 0.0021074819723268657,
the unitless Top of Atmosphere Reflectance 0.002420630889355243, 0.003471901841411239]
values.

satellite_azimuth Angle from true north to the satellite 103.22169693


vector at the time of imaging, projected
on the horizontal plane in degrees.

satellite_elevation Angle between the satellite pointing 61.32334041


direction and the local horizontal plane in
degrees.

sun_azimuth Angle from true north to the sun vector 136.7200917


projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. 56.98039498
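
These properties can be read from the GeoTIFF header with common raster tooling; the hedged sketch below uses GDAL's Python bindings and assumes the tag contents parse as JSON, as the bracketed sample values above suggest.

# Minimal sketch: read TIFFTAG_IMAGEDESCRIPTION from an Analytic Scene GeoTIFF
# and apply the first (blue) reflectance coefficient to a DN value.
import json
from osgeo import gdal

def read_scene_properties(geotiff_path):
    ds = gdal.Open(geotiff_path)
    description = ds.GetMetadataItem("TIFFTAG_IMAGEDESCRIPTION")
    return json.loads(description)

# props = read_scene_properties("skysat_analytic_scene.tif")
# blue_toa_reflectance = blue_dn * props["reflectance_coefficients"][0]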

5.12 SKYSAT ORTHO SCENE GEOJSON METADATA

Table 4-K: Skysat Ortho Scene Geojson Metadata Schema

SKYSAT ORTHO SCENE GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

camera_id The specific detector used to capture the String (e.g. “d1”, “d2”)
scene.

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.

ground_control If the image meets the positional accuracy boolean


specifications this value will be true. If the



image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.

item_type The name of the item type that models string (e.g. “PSScene3Band”, ”SkySatScene”)
shared imagery data schema.

provider Name of the imagery provider. string ("planetscope","rapideye", “skysat”)

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage Stage of publishing for an item. Both "l1a" string (“preview”, “finalized”)
assets and SkySatScenes with
fast-rectification applied will have a
publishing_stage = "preview".
Fast-rectification refers to the initial
rectification of the orthorectified product, to
enable faster publication. Once
full-rectification is applied, all assets will be
updated to publishing_stage = "finalized".

quality_category Metric for image quality. To qualify for string (“standard”, “test”)
“standard” image quality an image must
meet a variety of quality standards, for
example: PAN motion blur less than 1.15
pixels, compression bits per pixel less than 3.
If the image does not meet these criteria it
is considered “test” quality.

satellite_azimuth Angle from true north to the satellite vector number (0 - 360)
at the time of imaging, projected on the
horizontal plane in degrees.

satellite_id Globally unique identifier of the satellite that string


acquired the underlying imagery.

strip_id Globally unique identifier of the image strip string


this scene was collected against

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (-90 - +90)


angle used for imaging, in degrees with +
being east and - being west.



5.13 SKYSAT ORTHO COLLECT PRODUCT SPECIFICATION

The Ortho Collect product is created by composing SkySat Ortho Scenes along an imaging strip into segments
that typically unify ~60 SkySat Ortho Scenes. The product may contain artifacts resulting from the composing
process, particularly offsets in areas where source scenes are stitched together. In a future version, artifacts
caused by scene misalignment will be hidden by cutlines. This is particularly important for the appearance of
objects in built-up areas and their accurate extraction.

● Visual - pansharpened, orthorectified, color corrected RGB


● Pansharpened Multispectral - pansharpened, orthorectified, color corrected BGRN
● Analytic - orthorectified, radiometrically corrected, multispectral BGRN
● Analytic DN - orthorectified, multispectral BGRN
● Panchromatic - orthorectified, radiometrically corrected, panchromatic (PAN)
● Panchromatic DN - orthorectified, panchromatic (PAN)

*Asset attributes match those of the Scene counterparts listed above

Table 4-L: SkySat Ortho Collect Attributes

SKYSAT ORTHO COLLECT ATTRIBUTES

Attribute Description

Product Framing Strip Based

SkySat satellites have three cameras per satellite, which capture overlapping strips. Each of
these strips contains overlapping scenes. One Collect product composes up to 60 scenes (up
to 20 per camera) and is approximately 20 km x 5.9 km.

Assets Visual: 3-band Pansharpened Image (8-bit Unsigned Integer)


Multispectral: 4-band Pansharpened Image (16-bit Unsigned Integer)
4-band Analytic DN Image (B, G, R, N) (16-bit Unsigned Integer)
1-band Panchromatic Image (16-bit Unsigned Integer)



Projection UTM WGS84

Geometric Corrections Sensor-related effects are corrected using sensor telemetry and a sensor model.
Orthorectification uses GCPs and fine DEMs (30m to 90m posting).

Positional Accuracy Less than 10 m RMSE

Radiometric Corrections No correction applied; pixel values are digital numbers

5.14 SKYSAT ANALYTIC COLLECT GEOTIFF PROPERTIES

Table 4-M: Properties included in the GeoTIFF Header, under ‘TIFFTAG_IMAGEDESCRIPTION’

Field Value Sample

radiometric_scale_factor Provides the parameter to convert the 0.01


scaled radiance pixel value to radiance.
Multiplying the scaled radiance pixel
values by the scale factor, derives the Top
of Atmosphere Radiance product. This
value is a constant, set to 0.01

reflectance_coefficients Multiplicative coefficients, one per band; when [0.0019093447035360626,
multiplied with the DN values, they provide 0.0021074819723268657,
the unitless Top of Atmosphere Reflectance 0.002420630889355243, 0.003471901841411239]
values.

satellite_azimuth Angle from true north to the satellite 103.22169693


vector at the time of imaging, averaged
across the full SkySatCollect, projected
on the horizontal plane in degrees.

satellite_elevation Angle between the satellite pointing 61.32334041


direction and the local horizontal plane in
degrees, averaged across the full
SkySatCollect.

sun_azimuth Angle from true north to the sun vector 136.7200917


projected on the horizontal plane in
degrees, averaged across the full
SkySatCollect.

sun_elevation Elevation angle of the sun in degrees, 56.98039498


averaged across the full SkySatCollect.

5.15 SKYSAT COLLECT METADATA

Ortho Collect GeoJSON metadata



Table 4-N: Skysat Ortho Collect Geojson Metadata Schema

SKYSAT ORTHO COLLECT GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

camera_id The specific detector used to capture the String (e.g. “d1”, “d2”)
scene.

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.

ground_control_ratio The ratio of scenes that make up the Collect float
with ground_control = true

gsd The ground sampling distance of the image number


acquisition.

item_type The name of the item type that models string (e.g. “PSScene”, ”SkySatCollect”)
shared imagery data schema.

provider Name of the imagery provider. string ("planetscope","rapideye", “skysat”)

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage Stage of publishing for an item. Both "l1a" string (“preview”, “finalized”)
assets and SkySatScenes with
fast-rectification applied will have a
publishing_stage = "preview".
Fast-rectification refers to the initial
rectification of the orthorectified product, to
enable faster publication. Once
full-rectification is applied, all assets will be
updated to publishing_stage = "finalized".

quality_category Metric for image quality. To qualify for string (“standard”, “test”)
“standard” image quality an image must
meet a variety of quality standards, for
example: PAN motion blur less than 1.15
pixels, compression bits per pixel less than 3.
If the image does not meet these criteria it
is considered “test” quality.

satellite_azimuth Angle from true north to the satellite vector number (0 - 360)
at the time of imaging, projected on the
horizontal plane in degrees.

satellite_id Globally unique identifier of the satellite that string


acquired the underlying imagery.

strip_id Globally unique identifier of the image strip string


this scene was collected against



sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (0 - 90)


angle used for imaging, in degrees.

5.16 SKYSAT BASEMAP MOSAIC TILES PRODUCT SPECIFICATION

All basemaps can be viewed at full resolution within the Planet graphical user interface (up to Zoom Level 18 in
the Web Mercator Projection), giving a resolution of 0.597 m at the Equator. The projection used in Planet
basemaps has been selected to match what is typically used in web mapping applications. The ground resolution
of the basemap becomes finer at latitudes farther from the Equator. The Alpha Mask indicates areas of the quad
where there is no imagery data available.
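
The 0.597 m figure follows directly from the Web Mercator tile pyramid geometry; a short sketch of the calculation (assuming 256 x 256 pixel tiles) is shown below.

# Minimal sketch: Web Mercator ground resolution per pixel at a given zoom
# level and latitude; reproduces ~0.597 m at Zoom Level 18 on the Equator.
import math

WEB_MERCATOR_CIRCUMFERENCE_M = 2 * math.pi * 6378137.0  # ~40,075,017 m

def web_mercator_resolution_m(zoom, latitude_deg=0.0, tile_size=256):
    equator_resolution = WEB_MERCATOR_CIRCUMFERENCE_M / (tile_size * 2 ** zoom)
    return equator_resolution * math.cos(math.radians(latitude_deg))

# web_mercator_resolution_m(18)  ->  ~0.597 m per pixel at the Equator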

Table 4-O: Individual Quad Specifications

INDIVIDUAL QUAD SPECIFICATIONS

Attribute Description

Sensors SkySat

Pixel Size (resolution) 0.597 m

Image Bit Depth 8 bits per pixel

Bands Red, Green, Blue, Alpha

Projection WGS84 Web Mercator (EPSG:3857)

Size 4096 x 4096 pixels

Processing Pansharpened. Geometrically aligned. Seam lines are minimized with tonal balancing, and cutlines
are used to minimize visual breaks.



6. OTHER PROVIDER IMAGERY PRODUCTS
Planet provides access to two other freely available datasets: Landsat 8, operated by NASA and the United
States Geological Survey, and Sentinel-2, operated by the European Space Agency. The goal is to make these
products easily available to Planet users to augment their analyses.

6.1 LANDSAT 8

For detailed characteristics of the Landsat 8 sensor and mission please refer to the official Landsat 8
documentation which can be found here: https://landsat.usgs.gov/landsat-8

Table 5-A: Landsat 8 data properties

LANDSAT 8 L1G PRODUCT ATTRIBUTE

Product Attribute Description

Information Content

Analytic Bands

Pan Band 8

Visible, NIR, SWIR Band 1-7 and Band 9 (Coastal/Aerosol, Blue, Green, Red, NIR, SWIR 1, SWIR 2, Cirrus)

Processing

Pixel Size

Pan 15 m

Visible, NIR, SWIR 30 m

TIR 100 m

Bit Depth 12-bit data depth, distributed as 16-bit data for easier processing

Geometric Corrections The Geometric Processing Subsystem (GPS) creates L1 geometrically corrected
imagery (L1G) from L1R products. The geometrically corrected products can be
systematic terrain-corrected (L1Gt) or precision terrain-corrected products (L1T). The
GPS generates a satellite model, prepares a resampling grid, and resamples the
data to create an L1Gt or L1T product. The GPS performs sophisticated satellite
geometric correction to create the image according to the map projection and
orientation specified for the L1 standard product.

Positional Accuracy 12 m CE90

Radiometric Corrections ● Converts the brightness of the L0R image pixels to absolute radiance in
preparation for geometric correction.



● Performs radiometric characterization of L0R images by locating radiometric
artifacts in images.
● Corrects radiometric artifacts and converts the image to radiance.

Metadata Landsat 8 MTL text file

6.2 SENTINEL-2

For detailed characteristics of the Sentinel-2 sensor and mission please refer to the official Sentinel-2
documentation which can be found here:

https://earth.esa.int/web/sentinel/user-guides/sentinel-2-msi/product-types/level-1c

Table 5-B: Sentinel-2 Data Properties

SENTINEL-2 LEVEL 1C PRODUCT ATTRIBUTE

Product Attribute Description

Information Content

Analytic Bands

Visible, NIR 4 bands at 10 m: blue (490 nm), green (560 nm), red (665 nm) and near infrared (842
nm).

RedEdge and NIR 4 narrow bands for vegetation characterisation (705 nm, 740 nm, 783 nm and 865
nm)

SWIR 2 larger SWIR bands (1610 nm and 2190 nm)

Aerosol, Water Vapor, Cirrus 443 nm for aerosols, 945 nm for water vapor and 1375 nm for cirrus detection

Processing

Pixel Size

Visible, NIR (4 bands) 10 m

RedEdge, NIR (6 bands) 20 m

SWIR (2 bands) 20 m

Cirrus, Aerosol, Water Vapor (3 60 m


bands)

Bit Depth 12

Geometric Corrections ● Resampling on the common geometry grid for registration between the Global
Reference Image (GRI) and the reference band.
● Collection of the tie-points from the two images for registration between the GRI
and the reference band.



● Tie-points filtering for image-GRI registration: filtering of the tie-points over
several areas. A minimum number of tie-points is required.
● Refinement of the viewing model using the initialized viewing model and GCPs.
The output refined model ensures registration between the GRI and the
reference band.
● Resampling grid computation: enabling linking of the native geometry image to
the target geometry image (ortho-rectified).
● Resampling of each spectral band in the geometry of the ortho-image using the
resampling grids and an interpolation filter.

Positional Accuracy 20 m 2σ without GCPs; 12.5 m 2σ with GCPs

Radiometric Corrections ● Dark Signal Correction


● Pixel Response non-uniformity correction
● Crosstalk correction
● Defective pixels identification
● High Spatial resolution bands restoration (deconvolution and de-noising)
● Binning of the 60m spectral bands
● TOA reflectance calculation

MetaData/Data Structure ● Level-1C_Tile_Metadata_File (Tile Metadata): XML main metadata file (DIMAP
mandatory file) containing the requested level of information and referring to all
the product elements describing the tile.
● IMG_DATA: folder containing image data files compressed using the JPEG2000
algorithm, one file per band.
● QI_DATA: folder containing QLQC XML reports of quality checks, mask files and
PVI files.
● Inventory_Metadata.xml: inventory metadata file (mandatory).
● manifest.safe: XML SAFE manifest file (Mandatory)
● rep-info: folder containing the XSD schema provided inside a SAFE Level-0
granule



7. PRODUCT PROCESSING

7.1 PLANETSCOPE PROCESSING

Several processing steps are applied to PlanetScope imagery products, listed in the table below.

Table 6-A: PlanetScope Processing Steps

PLANETSCOPE PROCESSING STEPS

Step Description

Darkfield/Offset Correction Corrects for sensor bias and dark noise. Master offset tables are created by
averaging on-orbit darkfield collects across 5-10 degree temperature bins and
applied to scenes during processing based on the CCD temperature at acquisition
time.

Flat Field Correction Flat fields are collected for each optical instrument prior to launch. These fields are
used to correct image lighting and CCD element effects to match the optimal
response area of the sensor. Flat fields are routinely updated on-orbit during the
satellite lifetime.

Camera Acquisition Parameter Determines a common radiometric response for each image (regardless of exposure
Correction time, number of TDI stages, gain, camera temperature and other camera
parameters).

Absolute Calibration As a last step, the spatially and temporally adjusted datasets are transformed from
digital number values into physical based radiance values (scaled to
W/(m²*str*μm)*100).

Visual Product Processing Presents the imagery as natural color, optimizing colors as seen by the human eye.
This process is broken down into 4 steps:
● Flat fielding applied to correct for vignetting.
● Nominalization - Sun angle correction, to account for differences in latitude and
time of acquisition. This makes the imagery appear to look like it was acquired at
the same sun angle by converting the exposure time to the nominal time (noon).
● Two filters applied: an unsharp mask for improving local dynamic range, and a
sharpening filter for accentuating spatial features.
● Custom color curve applied post warping.

Orthorectification This process is broken down into 2 steps:


● The rectification tiedown process wherein tie points are identified across the
source images and a collection of reference images (ALOS, NAIP, OSM, Landsat)
and RPCs are generated.
● The actual orthorectification of the scenes using the RPCs, to remove terrain
distortions. The terrain model used for the orthorectification process is derived
from multiple sources (SRTM, Intermap, and other local elevation datasets)
which are periodically updated. Snapshots of the elevation datasets used are
archived (helps in identifying the DEM that was used for any given scene at any
given point).



Atmospheric Correction Removes atmospheric effects. This process consists of 3 steps:
● Top of Atmosphere (TOA) reflectance calculation using coefficients supplied with
the at-sensor radiance product.
● Lookup table (LUT) generation using the 6SV2.1 radiative transfer code and
MODIS near-real-time data inputs.
● Conversion of TOA reflectance to surface reflectance for all combinations of
selected ranges of physical conditions and for each satellite sensor type using its
individual spectral response as well as estimates of the state of the atmosphere.

The figure below illustrates the processing chain and steps involved to generate each of PlanetScope’s imagery
products.

Figure 5: PlanetScope Image Processing Chain



7.2 RAPIDEYE PROCESSING

For RapidEye imagery products, the processing steps are listed in the table below.

Table 6-B: RapidEye Processing Steps

RAPIDEYE PROCESSING STEPS

Step Description

Flat Field Correction (also referred Correction parameters to achieve the common response of all CCD elements when
to as spatial calibration) exposed to the same amount of light have been collected for each optical
instrument prior to launch. During operations, these corrections are adjusted every
quarter or more frequently on an as-needed basis when effects become visible or
measurable. The corrections are derived using side slither or statistical methods.
This step additionally involves statistical adjustments of the read-out channel gains
and offsets on a per image basis.

Temporal Calibration Corrections are applied so that all RapidEye cameras read the same DN (digital
number) regardless of when the image has been taken in the mission lifetime.
Additionally with this step a cross calibration between all spacecraft is achieved.

Absolute Calibration As a last step the spatially and temporally adjusted datasets are transformed from
digital number values into physical based radiance values (scaled to
W/(m²*str*µm)*100).

Visual Product Processing Presents the imagery as natural color, optimizing colors as seen by the human eye.
This process is broken down into 3 steps:
● Nominalization - Sun angle correction, to account for differences in latitude and
time of acquisition. This makes the imagery appear to look like it was acquired at
the same sun angle by converting the exposure time to the nominal time (noon).
● Unsharp mask (sharpening filter) applied before the warp process.
● Custom color curve applied post warping.

Orthorectification Removes terrain distortions. This process is broken down into 2 steps:
● The rectification tiedown process wherein tie points are identified across the
source images and a collection of reference images (ALOS, NAIP, Landsat) and
RPCs are generated.
● The actual orthorectification of the scenes using the RPCs, to remove terrain
distortions. The terrain model used for the orthorectification process is derived
from multiple sources (Intermap, NED, SRTM and other local elevation datasets)
which are periodically updated. Snapshots of the elevation datasets used are
archived (helps in identifying the DEM that was used for any given scene at any
given point).

Atmospheric Correction Removes atmospheric effects. This process consists of 3 steps:


● Top of Atmosphere (TOA) reflectance calculation using coefficients supplied with
the at-sensor radiance product.
● Lookup table (LUT) generation using the 6SV2.1 radiative transfer code and
MODIS near-real-time data inputs.
● Conversion of TOA reflectance to surface reflectance for all combinations of
selected ranges of physical conditions and for each satellite sensor type using its
individual spectral response as well as estimates of the state of the atmosphere.

The figure below illustrates the processing chain and steps involved to generate each of RapidEye’s imagery
products.



Figure 6: RapidEye Image Processing Chain



7.3 SKYSAT PROCESSING

For SkySat imagery products, the processing steps are listed in the table below.

Table 6-C: SkySat Processing Steps

SKYSAT PROCESSING STEPS

Step Description

Darkfield/Offset Correction Corrects for sensor bias and dark noise. Master offset tables are created by
averaging ground calibration data collected across 5-10 degree temperature bins
and applied to scenes during processing based on the CCD temperature at
acquisition time.

Flat Field Correction Flat fields are created using cloud flats collected on-orbit post-launch. These fields
are used to correct image lighting and CCD element effects to match the optimal
response area of the sensor.

Camera Acquisition Parameter Determines a common radiometric response for each image (regardless of exposure
Correction time, TDI, gain, camera temperature and other camera parameters).

Inter Sensor Radiometric Cross calibrates the 3 sensors in each camera to a common relative radiometric
Response (Intra Camera) response. The offsets between each sensor are derived using on-orbit cloud flats
and the overlap regions between sensors on SkySat spacecraft.

Super Resolution (Level 1B Processing) Super resolution (SR) is the process of creating an improved-resolution
image by fusing information from multiple lower-resolution frames; the resulting higher-resolution image is a
better description of the scene.

Visual Product Processing Presents the imagery as natural color, optimizing colors as seen by the human eye.
Custom color curves applied post warping to deliver a visually appealing image.

Orthorectification Removes terrain distortions. This process is broken down into 2 steps:
● The rectification tiedown process, wherein tie points are identified across the source
images and a collection of reference images (NAIP, ALOS, Landsat, and high
resolution image chips) and RPCs are generated.
● The actual orthorectification of the scenes using the RPCs, to remove terrain
distortions. The terrain model used for the orthorectification process is derived from
multiple sources (SRTM, Intermap, and other local elevation datasets) which are
periodically updated. Snapshots of the elevation datasets used are archived, which
helps identify the DEM that was used for any given scene at any given point in time.



The figure below illustrates the processing chain and steps involved to generate SkySat’s Basic and Ortho Scene
products.

Figure 7: SkySat Image Processing Chain



8. PRODUCT METADATA

8.1 ORTHO TILES

8.1.1 PlanetScope

As mentioned in earlier sections, the Ortho Tile data in the Planet API contains metadata in machine-readable
GeoJSON, supported by standards-compliant GIS tools (e.g. GDAL and derivatives, JavaScript libraries). See
APPENDIX A for information on the general product XML metadata.
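For example, the GeoJSON item metadata described in the table below can be read with standard tooling; the
file name in this sketch is a placeholder for an item returned by the Data API:

    # Sketch only: read a few Ortho Tile metadata properties from a locally saved
    # GeoJSON item (field names follow the schema documented below).
    import json

    with open("ortho_tile_item.json") as f:   # placeholder file name
        item = json.load(f)

    props = item["properties"]
    print("Acquired:      ", props["acquired"])
    print("Cloud cover:   ", props["cloud_cover"])
    print("Grid cell:     ", props["grid_cell"])
    print("Ground control:", props["ground_control"])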

The table below describes the GeoJSON metadata schema for PlanetScope Ortho Tile products:

Table 7-A: PlanetScope Ortho Tile GeoJSON Metadata Schema

PLANETSCOPE ORTHO TILE GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

anomalous_pixel Percentage of anomalous pixels. Pixels that number


have image quality issues documented in
the quality taxonomy (e.g. hot columns).
This is represented spatially within the UDM.

black_fill Ratio of image containing artificial black fill number (0 - 1)


due to clipping to actual data.

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.

columns Number of columns in the image. number

epsg_code The EPSG code identifying the coordinate number
reference system (UTM zone) of the grid cell
that the Ortho Tile is delivered in (not used if
Scene).

grid_cell The grid cell identifier of the gridded item. string

ground_control If the image meets the positional accuracy boolean


specifications this value will be true. If the
image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.



item_type The name of the item type that models string (e.g. “PSOrthoTile”)
shared imagery data schema.

origin_x ULX coordinate of the extent of the data. number


The coordinate references the top left
corner of the top left pixel.

origin_y ULY coordinate of the extent of the data. number


The coordinate references the top left
corner of the top left pixel.

pixel_resolution Pixel resolution of the imagery in meters. number

provider Name of the imagery provider. string (e.g. "planetscope", "rapideye")

published The RFC 3339 timestamp at which this item string


was added to the API.

quality_category Metric for image quality. To qualify for string: “standard” or “test”
“standard” image quality an image must
meet the following criteria: sun altitude
greater than or equal to 10 degrees, off nadir
view angle less than 20 degrees, and
saturated pixels fewer than 20%. If the
image does not meet these criteria it is
considered “test” quality.

rows Number of rows in the image. number

satellite_id Globally unique identifier of the satellite string


that acquired the underlying imagery.

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (-25 - +25)


angle used for imaging, in degrees with +
being east and - being west.



The table below describes the metadata schema for Surface Reflectance products stored in the GeoTIFF header:

Table 7-B: PlanetScope Ortho Tile Surface Reflectance GeoTIFF Metadata Schema

PLANETSCOPE ORTHO TILE SURFACE REFLECTANCE GEOTIFF METADATA SCHEMA

Parameter Description Example

aerosol_model 6S aerosol model used continental

aot_coverage Percentage overlap between MODIS data 0.5625


and the scene being corrected

aot_method Method used to derive AOD value(s) for an fixed


image. ‘Map’ indicates that per-pixel AOD
values are used based on an interpolated
map over the scene; 'fixed' indicates a single
value for the entire image used when there
is not enough data coverage to produce a
map.

aot_mean_quality Average MODIS AOD quality value for the 1.0


overlapping NRT data in the range 1-10. This
is set to 127 when no data is available

aot_source Source of the AOD data used for the mod09cma_nrt


correction

aot_std Standard deviation of the averaged MODIS 0.033490001296168699


AOD data

aot_status A text string indicating state of AOD Missing Data - Using Default AOT
retrieval. If no data exists from the source
used, a default value 0.226 is used

aot_used Aerosol optical depth used for the 0.061555557780795626


correction

atmospheric_correction_algorithm The algorithm used to generate LUTs 6SV2.1

atmospheric_model Custom model or 6S atmospheric model used water_vapor_and_ozone

luts_version Version of the LUTs used for the correction 3

ozone_coverage Percentage overlap between MODIS data 0.53125


and the scene being corrected

ozone_mean_quality Average MODIS ozone quality value for the 255
overlapping NRT data. This will always be
255 if data is present



ozone_method Method used to derive ozone value(s) for an fixed
image. Currently only 'fixed' is used,
indicating a single value for the entire
image

ozone_source Source of the ozone data used for the mod09cmg_nrt


correction

ozone_status A text string indicating state of ozone Data Found


retrieval. If no ozone data is available for the
scene being corrected, the correction falls
back to a 6SV built-in atmospheric model

ozone_std Standard deviation of the averaged MODIS 0


ozone data.

ozone_used Ozone concentration used for the 0.255


correction, in cm-atm

satellite_azimuth_angle Always defined to be 0.0 degrees; the solar 0.0
azimuth angle is measured relative to it

satellite_zenith_angle Satellite zenith angle, fixed to nadir pointing 0.0

solar_azimuth_angle Sun azimuth angle relative to satellite, in degrees 111.42044562850029

solar_zenith_angle Solar zenith angle in degrees 30.26950393461825

sr_version Version of the correction applied. 1.0

water_vapor_coverage Percentage overlap between MODIS data 0.53215
and the scene being corrected

water_vapor_mean_quality Average MODIS water vapor quality value for 1.5294
the overlapping NRT data in the range 1-10.
This is set to 127 when no data is available

water_vapor_method Method used to derive water vapor value(s) fixed
for an image. Currently only 'fixed' is used,
indicating a single value for the entire image

water_vapor_source Source of the water vapor data used for the correction mod09cma_nrt

water_vapor_status A text string indicating state of water vapor Data Found
retrieval. If no water vapor data is available
for the scene being corrected, the correction
falls back to a 6SV built-in atmospheric model



water_vapor_std Standard deviation of the averaged MODIS 0.0587
water vapor data

water_vapor_used Water vapor concentration used for the 4.0512


correction in g/cm^2

8.1.2 RapidEye

The table below describes the GeoJSON metadata schema for RapidEye Ortho Tile products:

Table 7-C: RapidEye Ortho Tile GeoJSON Metadata Schema

RAPIDEYE ORTHO TILE GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

catalog_id The catalog ID for the RapidEye Basic Scene string


product.

anomalous_pixel Percentage of anomalous pixels. Pixels that number


have image quality issues documented in
the quality taxonomy (e.g. hot columns).
This is represented spatially within the UDM.

black_fill Ratio of image containing artificial black fill number (0 - 1)


due to clipping to actual data.

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.

columns Number of columns in the image. number

epsg_code The EPSG code identifying the coordinate number
reference system (UTM zone) of the grid cell
that the Ortho Tile is delivered in (not used if
Scene).

grid_cell The grid cell identifier of the gridded item. string

ground_control If the image meets the positional accuracy boolean


specifications this value will be true. If the
image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.

item_type The name of the item type that models string (e.g. “REOrthoTile”)
shared imagery data schema.



origin_x ULX coordinate of the extent of the data. number
The coordinate references the top left
corner of the top left pixel

origin_y ULY coordinate of the extent of the data. number


The coordinate references the top left
corner of the top left pixel

pixel_resolution Pixel resolution of the imagery in meters. number

provider Name of the imagery provider. string (e.g. "planetscope", "rapideye")

published The RFC 3339 timestamp at which this item string


was added to the API.

rows Number of rows in the image. number

satellite_id Globally unique identifier of the satellite string


that acquired the underlying imagery.

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

usable_data Ratio of the usable to unusable portion of number (0 - 1)


the imagery due to cloud cover or black fill.

view_angle Spacecraft across-track off-nadir viewing number (-25 - +25)


angle used for imaging, in degrees with +
being east and - being west.

The table below describes the metadata schema for Surface Reflectance products stored in the GeoTIFF header:

Table 7-D: RapidEye Ortho Tile Surface Reflectance Metadata Schema

RAPIDEYE ORTHO TILE SURFACE REFLECTANCE METADATA SCHEMA

Parameter Description Example

aerosol_model 6S aerosol model used continental

aot_coverage Percentage overlap between MODIS data 0.5625


and the scene being corrected



aot_method Method used to derive AOD value(s) for an fixed
image. ‘Map’ indicates that per-pixel AOD
values are used based on an interpolated
map over the scene; 'fixed' indicates a single
value for the entire image used when there
is not enough data coverage to produce a
map.

aot_mean_quality Average MODIS AOD quality value for the 1.0


overlapping NRT data in the range 1-10. This
is set to 127 when no data is available

aot_source Source of the AOD data used for the mod09cma_nrt


correction

aot_std Standard deviation of the averaged MODIS 0.033490001296168699


AOD data

aot_status A text string indicating state of AOD Missing Data - Using Default AOT
retrieval. If no data exists from the source
used, a default value 0.226 is used

aot_used Aerosol optical depth used for the 0.061555557780795626


correction

atmospheric_correction_algorithm The algorithm used to generate LUTs 6SV2.1

atmospheric_model Custom model or 6S atmospheric model used water_vapor_and_ozone

luts_version Version of the LUTs used for the correction 3

ozone_coverage Percentage overlap between MODIS data 0.53125


and the scene being corrected

ozone_mean_quality Average MODIS ozone quality value for the 255
overlapping NRT data. This will always be
255 if data is present

ozone_method Method used to derive ozone value(s) for an fixed


image. Currently only 'fixed' is used,
indicating a single value for the entire
image

ozone_source Source of the ozone data used for the mod09cmg_nrt


correction

ozone_status A text string indicating state of ozone Data Found


retrieval. If no ozone data is available for the
scene being corrected, the correction falls
back to a 6SV built-in atmospheric model

ozone_std Standard deviation of the averaged MODIS 0


ozone data.



ozone_used Ozone concentration used for the 0.255
correction, in cm-atm

satellite_azimuth_angle Always defined to be 0.0 degrees; the solar 0.0
azimuth angle is measured relative to it

satellite_zenith_angle Satellite zenith angle, fixed to nadir pointing 0.0

solar_azimuth_angle Sun azimuth angle relative to satellite, in degrees 111.42044562850029

solar_zenith_angle Solar zenith angle in degrees 30.26950393461825

sr_version Version of the correction applied. 1.0

water_vapor_coverage Percentage overlap between MODIS data 0.53215
and the scene being corrected

water_vapor_mean_quality Average MODIS water vapor quality value for 1.5294
the overlapping NRT data in the range 1-10.
This is set to 127 when no data is available

water_vapor_method Method used to derive water vapor value(s) fixed
for an image. Currently only 'fixed' is used,
indicating a single value for the entire image

water_vapor_source Source of the water vapor data used for the correction mod09cma_nrt

water_vapor_status A text string indicating state of water vapor Data Found
retrieval. If no water vapor data is available
for the scene being corrected, the correction
falls back to a 6SV built-in atmospheric model

water_vapor_std Standard deviation of the averaged MODIS 0.0587
water vapor data

water_vapor_used Water vapor concentration used for the 4.0512


correction in g/cm^2



8.2 ORTHO SCENES

8.2.1 PlanetScope

The table below describes the GeoJSON metadata schema for PlanetScope Ortho Scene products:

Table 7-E: PlanetScope Ortho Scene GeoJSON Metadata Schema

PLANETSCOPE ORTHO SCENE GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

anomalous_pixel Percentage of anomalous pixels. Pixels that number


have image quality issues documented in
the quality taxonomy

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.

ground_control If the image meets the positional accuracy boolean


specifications this value will be true. If the
image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.

instrument The generation of the satellite telescope. string (e.g.”PS2”, “PS2.SD”)

item_type The name of the item type that models string (e.g. “PSScene”)
shared imagery data schema.

pixel_resolution Pixel resolution of the imagery in meters. number

provider Name of the imagery provider. string (e.g. "planetscope", "rapideye")

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage Stage of publishing for an item. string


Scenes are first published in a
"preview" stage and graduate to a "finalized"
stage.



quality_category Metric for image quality. To qualify for string: “standard” or “test”
“standard” image quality an image must
meet the following criteria: sun altitude
greater than or equal to 10 degrees, off nadir
view angle less than 20 degrees, and
saturated pixels fewer than 20%. If the
image does not meet these criteria it is
considered “test” quality.

satellite_azimuth Spacecraft off track pointing direction, in float


degrees (0-360).

satellite_id Globally unique identifier of the satellite string


that acquired the underlying imagery.

strip_id The unique identifier of the image strip string


that the item came from.

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (-25 - +25)


angle used for imaging, in degrees with +
being east and - being west.

The PlanetScope Ortho Scenes Surface Reflectance product is provided as a 16-bit GeoTIFF image with
reflectance values scaled by 10,000. Associated metadata describing inputs to the correction is included in a
GeoTIFF TIFFTAG_IMAGEDESCRIPTION metadata header as a JSON encoded string.
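As an illustration, the scaled reflectance values and the embedded correction metadata can be read as in the
sketch below; it assumes GDAL/rasterio expose the ImageDescription tag under the key
TIFFTAG_IMAGEDESCRIPTION, and the file name is a placeholder:

    # Sketch only: unscale a Surface Reflectance GeoTIFF and parse the JSON-encoded
    # correction metadata stored in the TIFF ImageDescription tag.
    import json
    import rasterio

    with rasterio.open("scene_SR.tif") as src:   # placeholder file name
        sr = src.read().astype("float64") / 10000.0   # reflectance values scaled by 10,000
        description = src.tags().get("TIFFTAG_IMAGEDESCRIPTION", "{}")

    correction_metadata = json.loads(description)
    print(correction_metadata.get("aot_used"), correction_metadata.get("sr_version"))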



The table below describes the metadata schema for Surface Reflectance products stored in the GeoTIFF header:

Table 7-F: PlanetScope Ortho Scene Surface Reflectance GeoTIFF Metadata Schema

PLANETSCOPE ORTHO SCENE SURFACE REFLECTANCE GEOTIFF METADATA SCHEMA

Parameter Description Example

aerosol_model 6S aerosol model used continental

aot_coverage Percentage overlap between MODIS data 0.5625


and the scene being corrected

aot_method Method used to derive AOD value(s) for an fixed


image. ‘Map’ indicates that per-pixel AOD
values are used based on an interpolated
map over the scene; 'fixed' indicates a single
value for the entire image used when there
is not enough data coverage to produce a
map.

aot_mean_quality Average MODIS AOD quality value for the 1.0


overlapping NRT data in the range 1-10. This
is set to 127 when no data is available

aot_source Source of the AOD data used for the mod09cma_nrt


correction

aot_std Standard deviation of the averaged MODIS 0.033490001296168699


AOD data

aot_status A text string indicating state of AOD Missing Data - Using Default AOT
retrieval. If no data exists from the source
used, a default value 0.226 is used

aot_used Aerosol optical depth used for the 0.061555557780795626


correction

atmospheric_correction_algorithm The algorithm used to generate LUTs 6SV2.1

atmospheric_model Custom model or 6S atmospheric model used water_vapor_and_ozone

luts_version Version of the LUTs used for the correction 3

ozone_coverage Percentage overlap between MODIS data 0.53125


and the scene being corrected

ozone_mean_quality Average MODIS ozone quality value for the 255
overlapping NRT data. This will always be
255 if data is present



ozone_method Method used to derive ozone value(s) for an fixed
image. Currently only 'fixed' is used,
indicating a single value for the entire
image

ozone_source Source of the ozone data used for the mod09cmg_nrt


correction

ozone_status A text string indicating state of ozone Data Found


retrieval. If no ozone data is available for the
scene being corrected, the correction falls
back to a 6SV built-in atmospheric model

ozone_std Standard deviation of the averaged MODIS 0


ozone data.

ozone_used Ozone concentration used for the 0.255


correction, in cm-atm

satellite_azimuth_angle Always defined to be 0.0 degrees; the solar 0.0
azimuth angle is measured relative to it

satellite_zenith_angle Satellite zenith angle, fixed to nadir pointing 0.0

solar_azimuth_angle Sun azimuth angle relative to satellite, in degrees 111.42044562850029

solar_zenith_angle Solar zenith angle in degrees 30.26950393461825

sr_version Version of the correction applied. 1.0

water_vapor_coverage Percentage overlap between MODIS data 0.53215
and the scene being corrected

water_vapor_mean_quality Average MODIS water vapor quality value for 1.5294
the overlapping NRT data in the range 1-10.
This is set to 127 when no data is available

water_vapor_method Method used to derive water vapor value(s) fixed
for an image. Currently only 'fixed' is used,
indicating a single value for the entire image

water_vapor_source Source of the water vapor data used for the correction mod09cma_nrt

water_vapor_status A text string indicating state of water vapor Data Found
retrieval. If no water vapor data is available
for the scene being corrected, the correction
falls back to a 6SV built-in atmospheric model



water_vapor_std Standard deviation of the averaged MODIS 0.0587
water vapor data

water_vapor_used Water vapor concentration used for the 4.0512


correction in g/cm^2

8.2.2 SkySat

The table below describes the GeoJSON metadata schema for SkySat Ortho Scene products:

Table 7-G: SkySat Ortho Scene GeoJSON Metadata Schema

SKYSAT ORTHO SCENE GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

camera_id The specific detector used to capture the String (e.g. “d1”, “d2”)
scene.

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.

ground_control If the image meets the positional accuracy boolean


specifications this value will be true. If the
image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.

item_type The name of the item type that models string (e.g. “PSScene3Band”, ”SkySatScene”)
shared imagery data schema.

provider Name of the imagery provider. string ("planetscope","rapideye", “skysat”)

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage Stage of publishing for an item. Both "l1a" string (“preview”, “finalized”)
assets and SkySatScenes with
fast-rectification applied will have a
publishing_stage = "preview".
Fast-rectification refers to the initial
rectification of the orthorectified product, to
enable faster publication. Once
full-rectification is applied, all assets will be
updated to publishing_stage = "finalized".



quality_category Metric for image quality. To qualify for string (“standard”, “test”)
“standard” image quality an image must
meet a variety of quality standards, for
example: PAN motion blur less than 1.15
pixels, compression bits per pixel less than 3.
If the image does not meet these criteria it
is considered “test” quality.

satellite_azimuth Angle from true north to the satellite vector number (0 - 360)
at the time of imaging, projected on the
horizontal plane in degrees.

satellite_id Globally unique identifier of the satellite that string


acquired the underlying imagery.

strip_id Globally unique identifier of the image strip string


this scene was collected against

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (0 - 90)


angle used for imaging, in degrees.

8.3 BASIC SCENES

8.3.1 PlanetScope

The table below describes the GeoJSON metadata schema for PlanetScope Basic Scene products:

Table 7-H: PlanetScope Basic Scene GeoJSON Metadata Schema

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

anomalous_pixel Percentage of anomalous pixels. Pixels that number


have image quality issues documented in
the quality taxonomy

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.



ground_control If the image meets the positional accuracy boolean
specifications this value will be true. If the
image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.

instrument The generation of the satellite telescope. string (e.g.”PS2”, “PS2.SD”)

item_type The name of the item type that models string (e.g. “PSScene”)
shared imagery data schema.

pixel_resolution Pixel resolution of the imagery in meters. number

provider Name of the imagery provider. string (e.g. "planetscope", "rapideye")

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage Stage of publishing for an item. string


Scenes are first published in a
"preview" stage and graduate to a "finalized"
stage.

quality_category Metric for image quality. To qualify for string: “standard” or “test”
“standard” image quality an image must
meet the following criteria: sun altitude
greater than or equal to 10 degrees, off nadir
view angle less than 20 degrees, and
saturated pixels fewer than 20%. If the
image does not meet these criteria it is
considered “test” quality.

satellite_azimuth Spacecraft off track pointing direction, in float


degrees (0-360).

satellite_id Globally unique identifier of the satellite string


that acquired the underlying imagery.

strip_id The unique identifier of the image strip string


that the item came from.

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)



updated The RFC 3339 timestamp at which this item string
was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (-25 - +25)


angle used for imaging, in degrees with +
being east and - being west.

8.3.2 RapidEye

The table below describes the GeoJSON metadata schema for RapidEye Basic Scene products:

Table 7-I: RapidEye Basic Scene GeoJSON Metadata Schema

RAPIDEYE BASIC SCENE GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The time that image was taken in ISO 8601 string
format, in UTC.

anomalous_pixel Count of any identified anomalous pixels number

cloud_cover The estimated percentage of the image number (0 - 100)


covered by clouds.

gsd The ground sample distance (distance number


between pixel centers measured on the
ground) of the image in meters.

black_fill The percent of image pixels without valid number (0)


image data. It is always zero.

catalog_id The catalog ID for the RapidEye Basic Scene string


product.

satellite_id A unique identifier for the satellite that string


captured this image.

view_angle The view angle in degrees at which the number


image was taken.

strip_id The RapidEye Level 1B catalog id for older string


L1B products or the ImageTake ID for newer
versions.

sun_elevation The altitude (angle above horizon) of the number


sun from the imaged location at the time of
capture in degrees.



sun_azimuth The azimuth (angle clockwise from north) of number
the sun from the imaged location at the
time of capture in degrees.

updated The last time this asset was updated in the string
Planet archive. Images may be updated
after they are originally published

usable_data Amount of image that is considered usable Number (0-1)


data, for example non-cloud cover pixels,
expressed as a percentage. Applies only to
RapidEye data.

columns The number of columns in the image number

rows The number of rows in the image number

published The date the image was originally published string

provider The satellite constellation String: “rapideye”

item_type The item type as cataloged in the Planet String: “REScene”


Archive

8.3.3 SkySat

The table below describes the GeoJSON metadata schema for SkySat Basic Scene products:

Table 7-J: SkySat Basic Scene GeoJSON Metadata Schema

SKYSAT BASIC SCENE GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

camera_id The specific detector used to capture the String (e.g. “d1”, “d2”)
scene.

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.

ground_control If the image meets the positional accuracy boolean


specifications this value will be true. If the
image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.



item_type The name of the item type that models string (e.g. “PSScene3Band”, ”SkySatScene”)
shared imagery data schema.

provider Name of the imagery provider. string ("planetscope","rapideye", “skysat”)

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage Stage of publishing for an item. Both "l1a" string (“preview”, “finalized”)
assets and SkySatScenes with
fast-rectification applied will have a
publishing_stage = "preview".
Fast-rectification refers to the initial
rectification of the orthorectified product, to
enable faster publication. Once
full-rectification is applied, all assets will be
updated to publishing_stage = "finalized".

quality_category Metric for image quality. To qualify for string (“standard”, “test”)
“standard” image quality an image must
meet a variety of quality standards, for
example: PAN motion blur less than 1.15
pixels, compression bits per pixel less than 3.
If the image does not meet these criteria it
is considered “test” quality.

satellite_azimuth Angle from true north to the satellite vector number (0 - 360)
at the time of imaging, projected on the
horizontal plane in degrees.

satellite_id Globally unique identifier of the satellite string


that acquired the underlying imagery.

strip_id Globally unique identifier of the image strip string


this scene was collected against

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (0 - 90)


angle used for imaging, in degrees.

8.4 ORTHO COLLECT

8.4.1 SkySat

The table below describes the GeoJSON metadata schema for SkySat Ortho Collect products:



Table 7-K: SkySat Ortho Collect GeoJSON Metadata Schema

SKYSAT ORTHO COLLECT GEOJSON METADATA SCHEMA

Parameter Description Type

acquired The RFC 3339 acquisition time of the image. string

camera_id The specific detector used to capture the string (e.g. “d1”, “d2”)
scene.

cloud_cover Ratio of the area covered by clouds to that number (0 - 1)


which is uncovered.

ground_control If the image meets the positional accuracy boolean


specifications this value will be true. If the
image has uncertain positional accuracy,
this value will be false.

gsd The ground sampling distance of the image number


acquisition.

item_type The name of the item type that models string (e.g. “PSScene3Band”, ”SkySatScene”)
shared imagery data schema.

provider Name of the imagery provider. string ("planetscope","rapideye", “skysat”)

published The RFC 3339 timestamp at which this item string


was added to the API.

publishing_stage Stage of publishing for an item. Both "l1a" string (“preview”, “finalized”)
assets and SkySatScenes with
fast-rectification applied will have a
publishing_stage = "preview".
Fast-rectification refers to the initial
rectification of the orthorectified product, to
enable faster publication. Once
full-rectification is applied, all assets will be
updated to publishing_stage = "finalized".

quality_category Metric for image quality. To qualify for string (“standard”, “test”)
“standard” image quality an image must
meet a variety of quality standards, for
example: PAN motion blur less than 1.15
pixels, compression bits per pixel less than 3.
If the image does not meet these criteria it
is considered “test” quality.

satellite_azimuth Angle from true north to the satellite vector number (0 - 360)
at the time of imaging, projected on the
horizontal plane in degrees.

satellite_id Globally unique identifier of the satellite that string


acquired the underlying imagery.



strip_id Globally unique identifier of the image strip string
this scene was collected against

sun_azimuth Angle from true north to the sun vector number (0 - 360)
projected on the horizontal plane in
degrees.

sun_elevation Elevation angle of the sun in degrees. number (0 - 90)

updated The RFC 3339 timestamp at which this item string


was updated in the API.

view_angle Spacecraft across-track off-nadir viewing number (0 - 90)


angle used for imaging, in degrees.

ground_lock_ratio The percentage of SkySat frames that make Number (0 - 1)


up the full Collect product that have good
ground control



9. PRODUCT DELIVERY
All imagery products are made available via an Application Programming Interface (API) and a Graphical User
Interface (GUI).

9.1 PLANET APPLICATION PROGRAMMING INTERFACES (APIS)

Planet offers REST API access that allows listing, filtering, and downloading of data to anyone using a valid API
key. The metadata features described in this document are all searchable via our Data API and downloadable
via our Orders API.

Details on searching and ordering via Planet APIs are available in Planet’s Developer Center. Links are also
available below.

● Catalog Overview (Item Types & Asset Types)
● Search with Planet’s Data API
● Order with Planet’s Orders API
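As an illustration, a minimal Data API quick-search request might look like the sketch below; the endpoint,
environment variable, and filter values shown are examples, and the authoritative request format is documented
in Planet’s Developer Center:

    # Sketch only: search for recent, low-cloud PSScene items via the Data API.
    import os
    import requests

    API_KEY = os.environ["PL_API_KEY"]   # assumed environment variable holding the API key
    SEARCH_URL = "https://api.planet.com/data/v1/quick-search"

    search_request = {
        "item_types": ["PSScene"],
        "filter": {
            "type": "AndFilter",
            "config": [
                {"type": "DateRangeFilter", "field_name": "acquired",
                 "config": {"gte": "2022-01-01T00:00:00Z"}},
                {"type": "RangeFilter", "field_name": "cloud_cover",
                 "config": {"lte": 0.1}},
            ],
        },
    }

    # The API key is passed as the username of an HTTP Basic Auth pair.
    response = requests.post(SEARCH_URL, auth=(API_KEY, ""), json=search_request)
    response.raise_for_status()
    for feature in response.json()["features"]:
        print(feature["id"], feature["properties"]["acquired"])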

9.2 PLANET EXPLORER GRAPHICAL USER INTERFACE (GUI)

Planet Explorer is a web-based tool that can be used to search Planet’s catalog of imagery, view metadata, and
download full-resolution images. The interface and all of its features are built entirely on the externally available
Planet API.

Planet Explorer allows users to:

1. View Timelapse Mosaics: A user can view Planet’s quarterly and monthly mosaics, and can zoom in up
to zoom level 12 (38 m / pixel per OpenStreetMap)

2. Search: A user can search for any location or a specific area of interest by entering it into the input box OR
by uploading a geometry file (Shapefile, GeoJSON, KML, or WKT).

3. Save Search: The Save functionality allows a user to save search criteria based on area of interest, dates,
and filters.

4. Filter: A user can filter by a specific date range and/or by customizing metadata parameters (e.g. estimated
cloud cover, GSD).

5. Zoom and Preview Imagery: Zoom and Preview allows a user to zoom in or out of the selected area and
preview imagery.

6. View Imagery Details: A user can review metadata details about each imagery product.

7. Download: The Download icon allows a user to download imagery based on subscription type.



8. Draw Tools: These tools allow you to specify an area to see imagery results. The draw tool capabilities
available are drawing a circle, drawing a rectangle, drawing a polygon, and/or limiting the size of the
drawing to the size of loadable imagery.

9. Imagery Compare Tool: The Compare Tool allows you to compare sets of Planet imagery from different
dates.

Planet will also enable additional functionality in the form of “Labs,” which are demonstrations of capability
made accessible to users through the GUI. Labs are active product features and will evolve over time based on
Planet technology evolution and user feedback.

9.3 PLANET ACCOUNT MANAGEMENT TOOLS

As part of the Planet GUI, an administration and account management tool is provided. This tool is used to
change user settings and to see past data orders. In addition, users who have administrator privileges will be
able to manage users in their organization as well as review usage statistics.

The core functionality provided by account management tools is outlined below, and Planet may evolve
Account Management tools over time to meet user needs:

1. User Accounts Overview: Every user account on the Planet Platform is uniquely identified by an email
address. Each user also has a unique API key that can be used when interacting programmatically with
the Platform.

2. Organization and Sub-organization Overview: Every user on the Planet Platform belongs to one
organization. The Platform also supports “sub-organizations,” which are organizations that are attached
to a “parent” organization. An administrator of a parent organization is also considered an administrator
on all sub-organizations.

3. Account Privileges: Every user account on the Planet Platform has one of two roles: user or
administrator. An administrator has elevated access and can perform certain user management
operations or download usage metrics that are not available to standard users. An administrator of a
parent organization is also considered an administrator on all sub-organizations. Administrators can
enable or disable administrator status and enable or disable users’ access to the platform altogether.

4. Orders and Usage Review: This tool records all orders made and allows users and administrators to
view and download past orders. Usage metrics are also made available, including imagery products
downloaded and bandwidth usage. Usage metrics are displayed for each individual API key that is part
of the organization.



APPENDIX A – IMAGE SUPPORT DATA
All PlanetScope Ortho Tile Products are accompanied by a set of image support data (ISD) files. These ISD files
provide important information regarding the image and are useful sources of ancillary data related to the
image. The ISD files are:

1. General XML Metadata File
2. Unusable Data Mask File
3. Usable Data Mask File

Each file is described along with its contents and format in the following sections.

1. GENERAL XML METADATA FILE

All PlanetScope Ortho Tile Products will be accompanied by a single general XML metadata file. This file
contains a description of basic elements of the image. The file is written in Geographic Markup Language (GML)
version 3.1.1 and follows the application schema defined in the Open Geospatial Consortium (OGC) Best
Practices document for Optical Earth Observation products version 0.9.3, see
http://www.opengeospatial.org/standards/gml.

The contents of the metadata file will vary depending on the image product processing level. All metadata files
will contain a series of metadata fields common to all imagery products regardless of the processing level.
However, some fields within this group of metadata may only apply to certain product levels. In addition, certain
blocks within the metadata file apply only to certain product types. These blocks are noted within the table.

The table below describes the fields present in the General XML Metadata file for all product levels.

Table A-1: General XML Metadata File Field Descriptions

GENERAL XML METADATA FILE FIELD DESCRIPTIONS

Field Description

“metaDataProperty” Block

EarthObservationMetaData

Identifier Root file name of the image

acquisitionType Nominal acquisition

productType Product level listed in product filename

status Status type of image, if newly acquired or produced from a previously


archived image

downlinkedTo



acquisitionStation X-band downlink station that received image from satellite

acquisitionDate Date and time image was acquired by satellite

archivedIn

archivingCenter Location where image is archived

archivingDate Date image was archived

archivingIdentifier Catalog ID of image

processing

processorName Name of ground processing system

processorVersion Version of processor

nativeProductFormat Native image format of the raw image data

license

licenseType Name of selected license for the product

resourceLink Hyperlink to the physical license file

versionIsd Version of the ISD

orderId Order ID of the product

tileId Tile ID of the product corresponding to the Tile Grid

pixelFormat Number of bits per pixel per band in the product image file

“validTime” Block

TimePeriod

beginPosition Start date and time of acquisition for source image take used to create
product, in UTC

endPosition End date and time of acquisition for source image take used to create
product, in UTC

“using” Block

EarthObservationEquipment

platform

shortName Identifies the name of the satellite platform used to collect the image

serialIdentifier ID of the satellite that acquired the data

orbitType Orbit type of satellite platform

instrument

shortName Identifies the name of the satellite instrument used to collect the image



sensor

sensorType Type of sensor used to acquire the data.

resolution Spatial resolution of the sensor used to acquire the image, units in meters

scanType Type of scanning system used by the sensor

acquisitionParameters

orbitDirection The direction the satellite was traveling in its orbit when the image was
acquired

incidenceAngle The angle between the view direction of the satellite and a line
perpendicular to the image or tile center

illuminationAzimuthAngle Sun azimuth angle at center of product, in degrees from North (clockwise)
at the time of the first image line

illuminationElevationAngle Sun elevation angle at center of product, in degrees

azimuthAngle The angle from true north at the image or tile center to the scan (line)
direction at image center, in clockwise positive degrees.

spaceCraftViewAngle Spacecraft across-track off-nadir viewing angle used for imaging, in
degrees with “+” being East and “-” being West

acquisitionDateTime Date and Time at which the data was imaged, in UTC. Note: the imaging
times will be somewhat different for each spectral band. This field is not
intended to provide accurate image time tagging and hence is simply the
imaging time of some (unspecified) part of the image.

“target” Block

Footprint

multiExtentOf

posList Position listing of the four corners of the image in geodetic coordinates in
the format:
ULX ULY URX URY LRX LRY LLX LLY ULX ULY
where X = latitude and Y = longitude

centerOf

pos Position of center of product in geodetic coordinate X and Y, where X =


latitude and Y = longitude

geographicLocation

topLeft

latitude Latitude of top left corner in geodetic WGS84 coordinates

longitude Longitude of top left corner in geodetic WGS84 coordinates

topRight

latitude Latitude of top right corner in geodetic WGS84 coordinates

longitude Longitude of top right corner in geodetic WGS84 coordinates



bottomLeft

latitude Latitude of bottom left corner in geodetic WGS84 coordinates

longitude Longitude of bottom left corner in geodetic WGS84 coordinates

bottomRight

latitude Latitude of bottom right corner in geodetic WGS84 coordinates

longitude Longitude of bottom right corner in geodetic WGS84 coordinates

“resultOf” Block

EarthObservationResult

browse

BrowseInformation

type Type of browse image that accompanies the image product as part of the
ISD

referenceSystemIdentifier Identifies the reference system used for the browse image

fileName Name of the browse image file

product

fileName Name of image file.

productFormat File format of the image product

spatialReferenceSystem

epsgCode EPSG code that corresponds to the datum and projection information of
the image

geodeticDatum Name of datum used for the map projection of the image

projection Projection system used for the image

projectionZone Zone used for map projection

resamplingKernel Resampling method used to produce the image. The list of possible
algorithms is extendable

numRows Number of rows (lines) in the image

numColumns Number of columns (pixels) per line in the image

numBands Number of bands in the image product

rowGsd The GSD of the rows (lines) within the image product

columnGsd The GSD of the columns (pixels) within the image product

radiometricCorrectionApplied Indicates whether radiometric correction has been applied to the image

geoCorrectionLevel Level of correction applied to the image



elevationCorrectionApplied Indicates the production elevation model used for ortho

atmosphericCorrectionApplied Indicates whether atmospheric correction has been applied to the image

atmosphericCorrectionParameters

mask

MaskInformation

type Type of mask file accompanying the image as part of the ISD

format Format of the mask file

referenceSystemIdentifier EPSG code that corresponds to the datum and projection information of
the mask file

fileName File name of the mask file

cloudCoverPercentage Estimate of cloud cover within the image

cloudCoverPercentageQuotationMode Method of cloud cover determination

unusableDataPercentage Percent of unusable data within the file

The following group is repeated for each spectral band included in the image product

bandSpecificMetadata

bandNumber Number (1-5) by which the spectral band is identified.

startDateTime Start time and date of band, in UTC

endDateTime End time and date of band, in UTC

percentMissingLines Percentage of missing lines in the source data of this band

percentSuspectLines Percentage of suspect lines (lines that contained downlink errors) in the
source data for the band

binning Indicates the binning used (across track x along track)

shifting Indicates the sensor applied right shifting

masking Indicates the sensor applied masking

radiometricScaleFactor Provides the parameter to convert the scaled radiance pixel value to
radiance. Multiplying the scaled radiance pixel values by this value
derives the Top of Atmosphere Radiance product. This value is a constant,
set to 0.01

reflectanceCoefficient A multiplicative coefficient which, when multiplied with the DN values,
provides the Top of Atmosphere Reflectance values


harmonizationTransform Provides coefficients to transform the Next-Generation PlanetScope


sensor values to match those of the previous PlanetScope satellites

sourceSensor The new instrument to be transformed to the targetSensor

targetSensor The target instrument that the transform harmonizes values to

targetMeasure The physical unit that the harmonization transform is valid for

bandCoefficients Matrix of coefficients to transform band values to match those of the


targetSensor, in combination with the finalOffset



finalOffset An offset value for each band used in combination with the
bandCoefficients to perform the transform

The remaining metadata fields are only included in the file for L1B RapidEye Basic products

spacecraftInformationMetadataFile Name of the XML file containing attitude, ephemeris and time for the 1B
image

rpcMetadataFile Name of XML file containing RPC information for the 1B image
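The exact form of the harmonization is defined by the harmonizationTransform fields above; as an illustration,
the sketch below assumes each harmonized band is a linear combination of the source bands (bandCoefficients)
plus the corresponding finalOffset, with placeholder coefficient values:

    # Sketch only: apply a per-band linear harmonization (assumed interpretation of
    # bandCoefficients and finalOffset; use the values from the product XML metadata).
    import numpy as np

    band_coefficients = np.array([   # rows: target bands, columns: source bands (placeholders)
        [1.00, 0.00, 0.00, 0.00],
        [0.00, 0.98, 0.02, 0.00],
        [0.00, 0.01, 0.99, 0.00],
        [0.00, 0.00, 0.00, 1.01],
    ])
    final_offset = np.array([0.0, 0.001, 0.002, 0.0])

    def harmonize(pixels):
        """pixels: array of shape (bands, rows, cols) in the targetMeasure units."""
        bands, rows, cols = pixels.shape
        flat = pixels.reshape(bands, -1)
        harmonized = band_coefficients @ flat + final_offset[:, None]
        return harmonized.reshape(bands, rows, cols)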

File Naming Example: Ortho Tiles

The General XML Metadata file will follow the naming conventions as in the example below.

Example: 2328007_2010-09-21_RE4_3A_visual_metadata.xml

2. UNUSABLE DATA MASK FILE

The unusable data mask file provides information on areas of unusable data within an image (e.g. cloud and
non-imaged areas).

The pixel size after orthorectification will be 3.125 m for PlanetScope OrthoTiles, 3.0 m for PlanetScope Scenes,
5.0 m for RapidEye, and 0.8 m for SkySat. It is suggested that when using the file to check for usable data, a buffer
of at least 1 pixel should be considered. Each bit in the 8-bit pixel identifies whether the corresponding part of
the product contains useful imagery:

● Bit 0: Identifies whether the area contains blackfill in all bands (this area was not imaged). A value of “1”
indicates blackfill.

● Bit 1: Identifies whether the area is cloud covered. A value of “1” indicates cloud coverage. Cloud
detection is performed on a decimated version of the image (i.e. the browse image) and hence small
clouds may be missed. Cloud areas are those that have pixel values in the assessed band (Red, NIR or
Green) that are above a configurable threshold. This algorithm will:

○ Assess snow as cloud

○ Assess cloud shadow as cloud free

○ Assess haze as cloud free

● Bit 2: Identifies whether the area contains missing (lost during downlink) or suspect (contains downlink
errors) data in band 1. A value of “1” indicates missing/suspect data. If the product does not include
this band, the value is set to “0”.

● Bit 3: Identifies whether the area contains missing (lost during downlink and hence blackfilled) or
suspect (contains downlink errors) data in the band 2. A value of “1” indicates missing/suspect data. If
the product does not include this band, the value is set to “0”.



● Bit 4: Identifies whether the area contains missing (lost during downlink) or suspect (contains downlink
errors) data in the band 3. A value of “1” indicates missing/suspect data. If the product does not include
this band, the value is set to “0”.

● Bit 5: Identifies whether the area contains missing (lost during downlink) or suspect (contains downlink
errors) data in band 4. A value of “1” indicates missing/suspect data. If the product does not include this
band, the value is set to “0”.

● Bit 6: Identifies whether the area contains missing (lost during downlink) or suspect (contains downlink
errors) data in band 5. A value of “1” indicates missing/suspect data. If the product does not include this
band, the value is set to “0”.

● Bit 7: Is currently set to “0”.

The UDM information is found in band 8 of the Usable Data Mask file.
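For example, the blackfill and cloud bits can be decoded from the mask with simple bitwise operations; the file
name in this sketch is a placeholder:

    # Sketch only: decode unusable-data bits from the UDM band (band 8 of a udm2 file).
    import rasterio

    with rasterio.open("scene_udm2.tif") as src:   # placeholder file name
        udm = src.read(8)

    blackfill = (udm & 0b00000001) > 0   # Bit 0: not imaged (blackfill)
    cloud = (udm & 0b00000010) > 0       # Bit 1: cloud covered
    usable = ~(blackfill | cloud)
    print("Usable fraction:", usable.mean())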

3. USABLE DATA MASK FILE

The usable data mask file provides information on areas of usable data within an image (e.g. clear, snow,
shadow, light haze, heavy haze and cloud).

The pixel size after orthorectification will be 3.125 m for PlanetScope OrthoTiles and 3.0m for PlanetScope
Scenes. The usable data mask is a raster image having the same dimensions as the image product, comprised
of 8 bands, where each band represents a specific usability class mask. The usability masks are mutually
exclusive, and a value of one indicates that the pixel is assigned to that usability class.

● Band 1: clear mask (a value of “1” indicates the pixel is clear, a value of “0” indicates that the pixel is not
clear and is one of the 5 remaining classes below)
● Band 2: snow mask
● Band 3: shadow mask
● Band 4: light haze mask
● Band 5: heavy haze mask
● Band 6: cloud mask
● Band 7: confidence map (a value of “0” indicates a low confidence in the assigned classification, a value
of “100” indicates a high confidence in the assigned classification)
● Band 8: unusable data mask
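For example, a conservative clear-sky mask can be built from these bands as in the sketch below; the file name
and confidence threshold are placeholders:

    # Sketch only: combine the clear mask (band 1) with the confidence map (band 7).
    import rasterio

    with rasterio.open("scene_udm2.tif") as src:   # placeholder file name
        clear = src.read(1)        # Band 1: clear mask (1 = clear)
        cloud = src.read(6)        # Band 6: cloud mask (1 = cloud)
        confidence = src.read(7)   # Band 7: confidence map (0-100)

    high_confidence_clear = (clear == 1) & (confidence >= 80)
    print("Clear, high-confidence pixels:", int(high_confidence_clear.sum()))
    print("Cloudy pixels:", int((cloud == 1).sum()))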

File Naming

The UDM2 file will follow the naming conventions as in the example below.

Example: 20180921_102852_0f34_1A_udm2.tif (basic_udm2 asset)


20180921_102852_0f34_3B_udm2.tif (ortho_udm2 asset)



APPENDIX B - TILE GRID DEFINITION
PlanetScope Ortho Tile imagery products are based on the UTM map grid as shown in Figures B-1 and B-2. The
grid is defined with 24 km by 24 km tile centers and 1 km of overlap, resulting in 25 km by 25 km tiles.

Figure B-1: Layout of UTM Zones

An Ortho Tile imagery product is named by the UTM zone number, the grid row number, and the grid column
number within the UTM zone in the following format:

<ZZRRRCC>

Where:

ZZ = UTM Zone Number (This field is not padded with a zero for single digit zones in the tile
shapefile)
RRR = Tile Row Number (increasing from South to North, see Figure B-2)
CC = Tile Column Number (increasing from West to East, see Figure B-2)



Example: Tile 547904 = UTM Zone = 5, Tile Row = 479, Tile Column = 04

Tile 3363308 = UTM Zone = 33, Tile Row = 633, Tile Column = 08

Figure B-2: Layout of Tile Grid within a single UTM zone



Due to the convergence at the poles, the number of grid columns varies with grid row as illustrated in Figure
B-3.

Figure B-3: Illustration of grid layout of Rows and Columns for a single UTM Zone

The center points of the tiles within a single UTM zone are defined in the UTM map projection, to which standard
transformations from UTM map coordinates (x, y) to WGS84 geodetic coordinates (latitude and longitude) can
be applied.

col = 1..29
row = 1..780
Xcol = False Easting + (col - 15) x Tile Width + Tile Width / 2
Yrow = (row - 391) x Tile Height + Tile Height / 2

Where (X and Y are in meters):

False Easting = 500,000 m
Tile Width = 24,000 m
Tile Height = 24,000 m

The numbers 15 and 391 are needed to align to the UTM zone origin.
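As a worked illustration of the formulas above, the sketch below computes a tile's center coordinates within its
UTM zone and composes the <ZZRRRCC> identifier:

    # Sketch only: tile-center math and <ZZRRRCC> naming for the Ortho Tile grid.
    FALSE_EASTING = 500_000.0   # meters
    TILE_WIDTH = 24_000.0       # meters
    TILE_HEIGHT = 24_000.0      # meters

    def tile_center(col, row):
        """Return (easting, northing) of a tile center within its UTM zone, in meters."""
        x = FALSE_EASTING + (col - 15) * TILE_WIDTH + TILE_WIDTH / 2
        y = (row - 391) * TILE_HEIGHT + TILE_HEIGHT / 2
        return x, y

    def tile_id(zone, row, col):
        """Compose the <ZZRRRCC> identifier; the zone is not zero-padded."""
        return f"{zone}{row:03d}{col:02d}"

    print(tile_id(5, 479, 4))    # "547904", matching the example above
    print(tile_center(4, 479))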

