Saving coral reefs from space: Allen Coral Atlas satellite-based data
February 4, 2019
 

pt. 1:  from satellites to the bottom of the ocean

 

Two core products – global maps and monitoring


In response to the many crises facing the world’s coral reefs, Vulcan, together with our scientific and technical partners from the Arizona State University (ASU), the Hawaiʻi Institute of Marine Biology, Planet, and the University of Queensland, formed a partnership in late 2017 with the lofty goal of mapping and monitoring all of the world’s coral reefs using a globally consistent methodology. 
 
While there have been several attempts to map all of the world’s reefs, none of those have been completed using a globally comparable methodology.  And, while there are many very good maps of individual coral reefs, the techniques used to create those maps often do not lend themselves to global scaling, usually due to data gathering methods that would be cost-prohibitive at global scale.  The partnership with Planet allows us to leverage their large constellation of Dove satellites to achieve mapping and monitoring at global scale, using a comparable methodology and at an affordable cost.  
 
dove.png
A Planet Dove satellite.  © 2019 Planet Labs Inc.
 
We launched the Allen Coral Atlas with the first products of this partnership in October 2018.  These first products demonstrated the scientific viability of the data processing and mapping methodologies.  Throughout 2019 we will scale up those products geographically, demonstrate our monitoring products and initiate integrations with complementary data products.  In 2020 we will deliver global coverage of all of these products.
 
This is the first of several articles in which we’ll follow the data streams from the satellites, where we collect much of our source data, to the products produced by the partnership.   
 

Imagery from satellites to the bottom of the ocean

Both the mapping and monitoring products depend on calculating bathymetry, the measurement of water depth, and bottom reflectance, which is an approximation of what would be measured if you held an imaging sensor (much like that in a digital camera) just above the bottom of the water.  In our case, that would be just above the part of the coral reef we intend to map and monitor.  To arrive at bottom reflectance, we need to correct for all of the effects of the atmosphere, the water surface and the water column in the radiance that was measured by the sensor on the satellite.  Because the effects of the atmosphere and water change over time, and from place to place, removing them provides for comparability over time and space.
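The correction chain can be sketched as a sequence of functions, each stripping one layer of effects from the measured signal.  The toy formulas below are illustrative stand-ins only, not the Atlas’s actual algorithms, and the coefficient values are made up for the example:

```python
import math

# Toy model of the correction chain: each function removes one layer of
# effects from the signal.  The real algorithms (developed by ASU and
# Planet) are far more sophisticated; treat these as illustration only.

def atmospheric_correction(toa_reflectance, path_reflectance, transmittance):
    """Remove atmospheric path reflectance and transmission losses."""
    return (toa_reflectance - path_reflectance) / transmittance

def remove_sun_glint(surface_reflectance, nir_reflectance, glint_slope):
    """Subtract sun glint estimated from the near-infrared band."""
    return surface_reflectance - glint_slope * nir_reflectance

def water_column_correction(subsurface_reflectance, k_d, depth_m):
    """Invert two-way Beer-Lambert attenuation through the water column."""
    return subsurface_reflectance * math.exp(2.0 * k_d * depth_m)

# Chaining the corrections, at-sensor signal -> bottom reflectance:
surface = atmospheric_correction(0.15, path_reflectance=0.05, transmittance=0.8)
subsurface = remove_sun_glint(surface, nir_reflectance=0.01, glint_slope=1.0)
bottom = water_column_correction(subsurface, k_d=0.1, depth_m=5.0)
```

The key point is the ordering: atmospheric effects are removed first, then surface effects, then water-column effects, because each correction assumes the layers above it have already been accounted for.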
 
For the Allen Coral Atlas, the atmospheric correction, sea surface effect correction, water column correction and depth retrieval algorithms were developed by the team at ASU’s new Center for Global Discovery & Conservation Science (who were formerly at the Carnegie Institution for Science’s Department of Global Ecology).  Other image pre-processing algorithms were developed by Planet.  The processing of these algorithms is currently being hosted by Planet, but as we scale throughout 2019, operational implementations of all of the algorithms will be collaboratively developed by Vulcan, ASU and Planet, and parts of the processing pipeline will be migrated to new hosting.
 
ACA-simple-image-processing-flow-1.png
Allen Coral Atlas data processing to bottom reflectance.  (Green border = product, blue border = process.)
 
Here is a more detailed look at the processing steps:
  • The process starts with the top-of-atmosphere radiance scenes from the Dove satellites. These are four-band images with 16-bit depth for each band (blue, green, red and near-infrared).  Dove satellites have a spatial resolution of approximately 3.7 meters per pixel.  (For more information, here’s a technical PDF describing the Planet imagery products, and there is a description of the imagery and processing on the Atlas website.) 
  • Pre-processing steps are then performed on the imagery, including orthorectification and radiometric calibrations and corrections:
    • Orthorectification is the removal of geometric distortions in an image, like those caused by variations in terrain, to create an image with consistent scale.
    • The radiometric calibrations include converting from the strength of the signal as measured by the satellite’s sensor to a measure of radiance and then normalizing the measure of radiance to a measure of reflectance.
    • The radiometric corrections include correcting for sensor irregularities and atmospheric correction, where the reflectance values that were measured at the top of the atmosphere are “corrected” to a measure of reflectance on the surface of the earth, or “surface reflectance”.  Atmospheric correction accounts for the scattering and absorption effects of the gases and particles in the atmosphere.      
  • We then select scenes based on criteria that optimize for the possibility of deriving accurate bathymetry and maps.  The criteria include selecting for minimum cloud cover, minimum breaking waves and consistent tide state, high or low. 
  • We then composite the scenes into mosaics for the geographic areas we will be mapping.  If the scenes will be analyzed as part of the monitoring product, there is no need to mosaic them, as the pixels of interest can be analyzed scene-by-scene.  Foregoing mosaicking for the monitoring product also allows us to sample at a higher frequency, because gathering a sufficient number of scenes that satisfy our selection criteria across the mosaic may require a longer period of time than selecting a single scene that satisfies the criteria.
  • Following the mosaicking, or with the individual scenes, we can estimate the amount of chlorophyll a in the water, which we later use to determine how much light is attenuated in the water column.  After this, the value of the near-infrared band can be used to remove sun glint from the images, and the below-surface reflectance value can be derived.  
  • Given the below-surface reflectance, and with the chlorophyll a estimation we made above, we can derive the depth.  These values are validated by field measurements at select sites.  The depth values are stored in a single-band raster, and since very little reflected light can be measured beyond a depth of 20 meters with the instruments we are using, we only consider the range 0 to 20 meters.  For the benthic map products, where we want a higher threshold of confidence, our maximum depth is 10 meters.
bathymetry-kari.png
Sample bathymetry data from Karimunjawa, Indonesia.  Lighter tints of blue are shallower water, darker shades are deeper.  Darkest is >=20m.   © 2019 Allen Coral Atlas
  • Finally, we correct for the effects of the water column, including the scattering of light and absorption of light by particles suspended in the water column, and arrive at the bottom reflectance values. 
  • See the “Correction Models” section of the Methods page on the Atlas website for more information and references.
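As a small illustration of the depth thresholds in the steps above, here is a sketch (assuming the single-band depth raster has been loaded as a NumPy array; the function name is ours, not part of the Atlas pipeline) of masking depths to the usable ranges:

```python
import numpy as np

# Sketch only: limit a single-band depth raster to the ranges described
# above -- 0-20 m for the bathymetry product, and 0-10 m where the higher
# confidence required for the benthic map products is wanted.
def usable_depth_mask(depth_m, max_depth_m=20.0):
    """True where depth falls within the range the instruments can resolve."""
    depth_m = np.asarray(depth_m)
    return (depth_m >= 0.0) & (depth_m <= max_depth_m)

depths = np.array([0.5, 8.0, 12.0, 25.0])
bathymetry_mask = usable_depth_mask(depths)      # [True, True, True, False]
benthic_mask = usable_depth_mask(depths, 10.0)   # [True, True, False, False]
```

Pixels outside the mask would simply be excluded from the respective product rather than assigned an unreliable depth.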

What’s next?

This article took us from the source satellite data to depth and bottom reflectance data, but these both serve more as “intermediate” products than as “final” products that we would put in front of Atlas users.  In future articles we’ll describe how we get from here to mapping and monitoring, as well as complementary data streams that we will be integrating into the Atlas: 
  • We’ll explore the rules and models the University of Queensland team is developing for the map products, which include maps of the geomorphology of the reefs and maps of the benthic habitat of the reefs, and describe how field data is used for calibrating and validating the mapping processes.
  • We will describe the models and detectors that the ASU team is developing for the monitoring system.
  • We’ll investigate collaborations with complementary data providers and data modelers, such as Coral Reef Watch.
About the Author
Kirk L.
Senior Software Engineer
Kirk has been with Vulcan for 17 years and has contributed to many of the diverse projects Vulcan has undertaken in that time.  Recently he has led the engineering efforts for the Sea Around Us project, the Great Elephant Census project and the Global Finprint project.  He currently leads the engineering effort for the Allen Coral Atlas project.

Category Tags
Coral Conservation
Digital Mapping
Ocean Health
Remote Sensing