Eyeing the Deep from Orbit
By Jake Linsky
Imagine you are strolling along a dock’s edge. As you cross the water line, you gaze down and are greeted by a vibrant reef in the coastal shallows. Manini dart in and out of the latticed structure, and a rainbow assortment of nudibranch sea slugs grazes among the coral. As you wander farther, the seafloor drops, and the colors fade and converge into shades of teal. You see a flicker of movement. Was it a fish? Perhaps it was just wind wavering over the water, or your eyes playing tricks… You continue your walk and wonder how deep the water is beneath you, and what fantastic creatures lie just out of sight.
Lately, I’ve been pondering the same thing, although my eyes are focused on a computer monitor displaying imagery captured 450 km above the ocean’s surface by a satellite in Earth’s orbit. At 50 cm per pixel, the small, colorful inhabitants of the shallow reefs are far beyond my sight. Yet as I pan through the imagery, every now and again, the profile of one of the ocean's most gargantuan inhabitants emerges — a baleen whale.
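The scale here is worth a quick back-of-envelope check. The numbers below are a sketch using the 50 cm pixel size mentioned above and a typical adult humpback length; they are illustrative, not project measurements.

```python
# Rough pixel footprint of a whale in very high-resolution satellite imagery.
GSD_M = 0.5           # ground sample distance: meters of ocean per pixel
WHALE_LENGTH_M = 13   # typical adult humpback length (illustrative)
FISH_LENGTH_M = 0.2   # a small reef fish, for contrast

pixels_along_whale = WHALE_LENGTH_M / GSD_M
pixels_along_fish = FISH_LENGTH_M / GSD_M

print(pixels_along_whale)  # -> 26.0  (a resolvable profile)
print(pixels_along_fish)   # -> 0.4   (lost within a single pixel)
```

A 13-meter whale spans a couple dozen pixels along its body, enough to form a recognizable shape, while reef fish vanish into sub-pixel noise.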
Satellite imagery is rapidly emerging as a tool to monitor whale populations. From imagery collected in 2022 in Maui Nui alone, we've detected over 100 whales! Mosaic made from sourced imagery: Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC, 2022 Planet PBC
Our Wildlife from Space project, at the Marine Conservation Innovation Group and Marine Mammal Research Program, envisions using satellite platforms to globally monitor marine ecosystems and the whale populations that inhabit them. To accomplish this, we are designing cutting-edge tools to detect animals by training artificial intelligence to comb through swaths of very high-resolution satellite imagery. So why am I so concerned about what lies at the edge of the water's visibility? The answer is simple: the depth at which we can see a whale is the key to converting animal counts into robust population estimates.
Whales, like all other mammals, are constrained to the surface by their need to breathe air. Unlike most other mammals, however, whales are marine predators that inhabit some of our planet’s most extreme environments. Diving champions, the beaked and sperm whales, routinely descend to depths in the thousands of meters. Baleen whales, such as humpbacks and blues, don’t seem to push their diving limits quite as far. Even so, a whale feeding a few hundred meters below the surface is far beyond what we can see from space. Luckily, when whales migrate from resource-rich high latitudes into the tropics to breed, they spend far more time near the surface, often crossing into our window of visible detection. If we can accurately identify where in the water column that boundary lies, we can use advanced biologging tools, such as suction-cup tags, to determine the proportion of time whales spend at observable depths.
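The logic of that conversion can be sketched simply. If tag data tell us a whale is within detectable depth some fraction of the time, an instantaneous satellite count undercounts the population by roughly that factor. The function and numbers below are a minimal, hypothetical illustration of this availability correction, not project results.

```python
# Availability-bias correction: a minimal sketch with illustrative numbers.
# If whales are visible from above only a fraction p of the time,
# a snapshot count must be scaled up by 1/p to estimate true abundance.

def correct_count(detected: int, p_available: float) -> float:
    """Scale a raw detection count by the probability a whale is at observable depth."""
    if not 0.0 < p_available <= 1.0:
        raise ValueError("availability must be in (0, 1]")
    return detected / p_available

# Hypothetical example: 100 whales detected in imagery, with tag data
# suggesting whales spend 40% of their time above the detection depth.
estimate = correct_count(100, 0.40)
print(estimate)  # -> 250.0
```

This is why the detection-depth boundary matters so much: the tag data only yield the right availability fraction if we know which depths count as "observable" in the imagery.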
As whales descend, detection becomes more challenging. Both of these photos highlight whales at various depths in clear water: the image on the left was taken by drone, and the one on the right is very high-resolution satellite imagery. Drone photograph taken under permit in QLD, Australia by Jacob Linsky (2020); satellite image from 2022 Planet PBC
The problem with finding this boundary lies in the tools we use to detect it. We could sink objects on a measured line and observe when they disappear from view. Better yet, we could fly a drone overhead to measure detection depth. But neither approach quite replicates the way a satellite sensor gathers information. Very high-resolution satellite systems capture only a small portion of the light reflected from the surface. Unlike our eyes, and cameras designed for human vision, many satellite systems are sensitive to a broad range of light frequencies, extending well beyond the colors we can see. On top of this, photons reflected from the ocean's surface must make a chaotic journey through Earth's atmosphere before reaching the sensor, careening into particles and distorting as they pass into outer space. To truly understand what a satellite sensor sees, there is no substitute for using the sensor itself.
With this in mind, we are designing a delightfully challenging set of experiments, supported by the U.S. Marine Mammal Commission, to ground-truth ocean visibility across diverse conditions using actual satellite imagery. Our approach is to image objects at known depths and determine whether they are visible in the imagery. Working with intelligence provider Vantor, we plan to leverage the WorldView-3 satellite for initial experiments, precisely timing image captures at known locations while accounting for any weather that may obstruct our view. Unfortunately, coordinating these experiments with the whales themselves is one logistical challenge we have yet to resolve — so instead, we are building model 'whales' that can be submerged to known depths at the exact time and place of each image capture. By sinking four models to different depths, we can home in on the depth we lose sight of the whales.
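The four-model design amounts to bracketing the detection limit: it must lie between the deepest model we can still see and the shallowest one we cannot. The helper and depths below are a hypothetical sketch of that logic, not our actual deployment plan.

```python
# Bracketing the detection-depth limit from model 'whale' deployments
# (hypothetical depths and outcomes, for illustration only).
# Each trial is a (depth_m, visible_in_image) pair; the true limit lies
# between the deepest visible depth and the shallowest non-visible depth.

def bracket_limit(trials):
    """Return (lower, upper) bounds in meters on the detection-depth limit."""
    visible = [depth for depth, seen in trials if seen]
    hidden = [depth for depth, seen in trials if not seen]
    lower = max(visible) if visible else 0.0
    upper = min(hidden) if hidden else float("inf")
    return lower, upper

# Hypothetical capture: models sunk to 2, 5, 10, and 20 meters.
trials = [(2, True), (5, True), (10, False), (20, False)]
print(bracket_limit(trials))  # -> (5, 10): the limit is between 5 and 10 m
```

Repeating this across water conditions narrows the bracket, and spacing the depths around the expected limit makes each image capture as informative as possible.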
As you may expect, these models must be whale-sized! While the models themselves are 'only' about 7 meters long, we designed them to approximate the visible surface area of a 13-meter humpback whale as seen from above. To get a prototype ready for deployment, I was joined by our project intern, Kyoko Suzuki — a University of Hawaiʻi at Mānoa oceanography undergraduate who first joined us as a Sea Grant Summer Undergraduate Research Fellow with a keen interest in ocean conditions and remote sensing. Kyoko proved a quick learner in the workshop, and together we sawed and drilled our way through a prototype PVC skeleton.
Our next step is to fill these frames with dark canvas to match the reflective properties of a humpback whale. To accomplish this, graduate student researcher and savvy sailor Kerri Luttrell reached out to her skipper Emma Waldman and the Stardust racing team, who kindly donated used sail material for us to repurpose into life-sized whales. We are immensely grateful to Stardust racing, and to Derek LeVault and the Papahānaumokuākea Marine Debris Project team, who have generously offered to help hem the sails for sturdy deployment!
Our first set of experiments will provide critical, if coarse, information. In turbid Hawaiian waters, near estuaries and coastlines after heavy rain, we may lose sight of the models beyond 5 meters. This contrasts with clear water conditions, where (with any luck) we will see whales far deeper, perhaps at the 20–30 meters where reef shapes remain distinguishable over a sandy bottom. Even this broad distinction between clear and murky water represents a massive advancement over what we currently know. But we hope for more.
As the experiments unfold, Kyoko will collect water samples in near-real-time within each satellite image footprint. By analyzing key water properties, such as sediment turbidity and indicators of primary productivity, we can begin to map the relationship between ocean composition, the color profile of satellite pixels, and the depth limits of whale visibility. As these data replace the guesswork currently filling our model parameters, we will increasingly be able to resolve visibility and ocean composition at fine scales. These efforts serve as a blueprint that can be replicated across diverse geographies and ocean conditions, ultimately providing a powerful space-borne tool to simultaneously monitor whale abundance and quantify features of the environments they inhabit.
We couldn't be more excited to get out there and start imaging from space! We hope you'll follow along as we report our challenges and successes. No matter how cold and wet we get, we promise to remain stubborn in our pursuit of new ways to observe and protect our ocean giants.
https://www.oceansphere.org/donate