Karen Bakker is a professor at the University of British Columbia, a Guggenheim fellow and a 2022-23 fellow of the Harvard Radcliffe Institute for Advanced Studies.
On a chilly morning in January 1952, Alan Baldridge witnessed a murder. Sailing off the coast of California in pursuit of a pod of migrating whales, he heard screams in the distance. The pod abruptly vanished. Scanning the horizon, he spotted a large gray whale “spy-hopping,” swimming vertically and raising its head above the surface. Baldridge, a marine biologist at Stanford, decided to investigate; drawing closer, he saw seven orcas emitting hunting cries, circling a small gray whale calf. As its mother watched nearby, the orcas began devouring the lips, tongue and throat of the dead baby.
Baldridge’s story inspired a controversial research agenda. Soon after his encounter, the Navy began using orca sounds in an attempt to control cetaceans. Their hypothesis: Whales could decode information from sound, a contrarian claim in an era when most researchers believed that animal noise was devoid of meaning. One of the Navy’s first experiments involved sailing a catamaran off the coast of San Diego, playing recorded orca screams to gray whales swimming south on their annual migration. The results were “spectacular”: The whales whirled around and fled north or hid deep in nearby kelp beds, slowly popping their heads above the surface to search for predators. When they finally resumed swimming south, the whales were in stealth mode: sneaking past, with little of their bodies showing above the surface, their breathing scarcely audible.
The Navy’s next experiment was in Alaska, where a local fish and game official named John Vania was at war with beluga whales along the Kvichak River, home to the largest red salmon run in the world. While bears and eagles feasted on the shore, belugas would surf the mighty tide up the muddy brown estuary, feeding on the endless conveyor belt of salmon swimming toward the sea. After the fishermen complained that belugas were eating too many fish, Vania tried chasing the whales with motorboats, blaring rock music, even throwing small charges of explosives — all in vain.
But when he pumped the Navy’s orca recordings through jerry-rigged underwater speakers, every single beluga immediately turned and fled. On the Alaska coast, some tides are strong enough to fling large boulders into the forest, but the belugas would battle even the strongest tide surge in order to escape. And although they responded to hunting screams, the belugas appeared most frightened of orca “clicks,” as if a warning was encoded in the staccato sounds.
At the time, industrial whaling and dolphin hunting were still permitted. Whalers killed tens of thousands of bowhead, sperm and right whales annually, their oil and other parts rendered into lubricant, perfume and lipstick. Canadian government officials mounted a .50-caliber Browning machine gun on a promontory north of Vancouver with the sole aim of slaughtering orcas, which were viewed as pests by local fishermen — although they never fired the gun. In pursuit of tuna, fishermen killed an estimated 6 million dolphins in the Eastern Pacific in a few short decades following the Second World War.
But a ragtag organization named Greenpeace was beginning to send protestors on small inflatable boats into the northern Pacific and Alaskan waters to protect whales from bullets and harpoons. Their efforts, caught on camera, inspired public outrage. A global movement to save the whales led to a commercial moratorium on industrial whaling, and to “dolphin safe” fishing legislation.
Now no longer permitted to kill cetaceans, fishers began using acoustic deterrents; the devices were often mandated by national governments on fishing boats, fish farms and even fishing nets. A truce of sorts was declared between cetaceans and humanity.
The apparent benefits were short-lived. Acoustic deterrence creates damaging side effects, including hearing impairment. In the underwater world, where sound travels more than four times faster than it does through air, cetaceans use echolocation (also called biosonar) to “see” the world through sound. Human noise pollution renders cetaceans and other marine organisms nearly deaf and blind, unable to echolocate, communicate or find prey.
When the din of motors and seismic blasts is added to acoustic deterrence devices, cetaceans can find themselves caught in a blinding acoustic fog, unable to detect approaching ships. Marine traffic accidents are now a primary cause of whale deaths.
Although no longer using bullets and bombs, humans are still killing cetaceans by the tens of thousands every year. Could digital technologies provide a solution?
The Santa Barbara Channel — through which the world’s largest whales, on one of the world’s longest migrations, move past some of the busiest ports in the world — is a global nexus for marine roadkill. No better place, then, for developing a Waze for whales.
Whale Safe is the creation of scientists at UC Santa Barbara and is funded by Marc Benioff, who was apparently inspired to create Salesforce while swimming with dolphins off the coast of Hawaii. An AI-powered monitoring system, Whale Safe creates virtual whale lanes to enable safe passage for cetaceans and prevent ship strikes in near real-time.
The system incorporates five digital technologies: an underwater acoustic monitoring system that detects whale calls; AI algorithms that detect and identify blue, humpback or fin whales in near real-time; oceanographic modeling combining satellite and digital buoy data with ocean circulation models and animal tags; whale sighting data reported by citizen scientists, mariners and whale-watchers using mobile apps; and locational data from ships’ automatic information systems (a mandatory global system of satellite tracking that enables precise monitoring of ships’ locations at all times).
The output: a whale presence rating overlaid on a map, similar to a weather report, which is relayed in near real-time to ship captains, who can decide to slow down or leave the area altogether. The Whale Safe team also tracks ships to see if they are complying with slow-speed zones and publishes public report cards tracking compliance, naming and shaming ships that fail to comply.
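How such heterogeneous inputs might collapse into a single rating can be imagined with a short sketch. Every function name, weight and threshold below is hypothetical and purely illustrative; the actual Whale Safe system uses far richer models than this toy scoring rule:

```python
# Hypothetical sketch of Whale Safe-style data fusion: combine acoustic
# detections, a habitat-model forecast and recent sightings into one
# qualitative whale presence rating. All numbers here are invented.

def whale_presence_rating(acoustic_detections: int,
                          model_probability: float,
                          sightings: int) -> str:
    """Return a rating of "low", "medium", "high" or "very high"."""
    score = 0.0
    score += min(acoustic_detections, 10) * 0.05  # hydrophone calls, last 24h
    score += model_probability * 0.5              # model probability, 0 to 1
    score += min(sightings, 5) * 0.1              # confirmed recent sightings
    if score >= 0.8:
        return "very high"
    if score >= 0.5:
        return "high"
    if score >= 0.2:
        return "medium"
    return "low"
```

A captain receiving, say, a "very high" rating for the channel ahead can then decide to slow down or reroute, much as a driver responds to a storm warning.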
Scientists are also developing infrared thermal imaging cameras to mount on the bows of ships to detect whales — and whale strikes. Killing cetaceans used to happen out of sight, but with dashcams mounted on ships, whale sightings could be automatically reported and whale deaths automatically recorded. Initial studies indicate that such systems can reduce whale strikes by at least half.
Similar digital whale protection systems have been implemented on the east coast of North America, where aquatic drones now roam the Atlantic, searching for endangered right whales in the Gulf of St. Lawrence, through which more than 100 million metric tons of cargo move each year. The whales’ location is pinpointed using digital bioacoustics, their trajectory is forecast using AI algorithms trained on datasets of whale movements, and the information is conveyed to ships’ captains and fishing boats, which face stiff fines of several hundred thousand dollars if they fail to slow down and leave the area.
Digitally enabled whale lanes now trump shipping lanes in some places: Only a few decades ago, North Atlantic right whales were hunted to the brink of extinction, but today, fewer than 400 of them have been empowered to control the movements of thousands of ships in a region home to 45 million people.
Scientists are now advocating for similar systems to be created around the world. Some are proposing an ambitious agenda: a digitally enabled global network of Marine Protected Areas (MPAs) whose boundaries change position as endangered species migrate through the oceans, and which literally “follow the fish.” Endangered tuna off the coast of Australia and turtles off the coast of Hawaii are now being protected by an array of digital tracking devices — sensors, acoustic drones, satellites — which feed data to machine learning algorithms in order to precisely forecast the location of endangered species. Mobile protected areas provide a flexible, responsive web of protection — a digital cloak of inviolability in a changing sea.
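The core geometry of a mobile protected area is simple to sketch: redraw a protection zone each day around the forecast positions of the animals. The function below is a deliberately crude illustration of that idea; the names, the circular zone and the 20-kilometer buffer are all assumptions, and real systems rely on full oceanographic and movement models rather than flat-earth arithmetic:

```python
# Illustrative "mobile protected area": given forecast (lat, lon) positions
# of tracked animals, return a circular zone (center + radius in km) that
# covers them all plus a safety buffer. Purely a geometric toy.
import math

def mobile_protected_area(positions, buffer_km=20.0):
    """Return (center_lat, center_lon, radius_km) covering all positions."""
    lat = sum(p[0] for p in positions) / len(positions)
    lon = sum(p[1] for p in positions) / len(positions)

    def dist_km(a, b):
        # Rough flat-earth distance, adequate over tens of kilometers.
        dlat = (a[0] - b[0]) * 111.0
        dlon = (a[1] - b[1]) * 111.0 * math.cos(math.radians(lat))
        return math.hypot(dlat, dlon)

    radius = max(dist_km((lat, lon), p) for p in positions) + buffer_km
    return lat, lon, radius
```

Recomputing the zone as new forecasts arrive is what lets the boundary “follow the fish” instead of sitting fixed on a map.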
In an era of rapid, global warming-induced changes in the world’s oceans, in which many marine species are becoming climate refugees, policymakers are now debating how we might apply these systems at the planetary scale. This near real-time, mobile and potentially spatially ubiquitous form of ocean governance relies on digital hardware that collects data from various sources (like nano-satellites, aerial and underwater drones, environmental sensor networks, digital bioacoustics and marine tags), combined with machine learning algorithms, computer vision and ecological informatics.
But its most novel aspect is its agility: responsive, near real-time adaptation to environmental variability, species mobility and disturbance dynamics. It’s a fitting governance model for an increasingly unpredictable world of environmental hazards and extreme events, and a timely response to the new global commitment, reached at the U.N.’s global biodiversity conference in late 2022, to protect at least 30% of Earth’s land and water by 2030.
These digitally enabled ocean conservation schemes benefit humans as well as the whales. When ships slow down, they not only reduce whale strikes but also release fewer pollutants and emit less carbon dioxide. Moreover, whales’ nutrient-rich waste acts like a fertilizer for phytoplankton, which sequester enormous amounts of carbon. IMF economists have estimated the value of the ecosystem services provided by each individual whale (of the largest species) at over $2 million and called for a new global program of economic incentives to return whale populations to pre-industrial whaling levels as a “nature-based solution” to climate change.
Perhaps the most novel aspect of these digital ocean governance schemes is their inclusion of nonhumans into decision-making. Simply by singing, a whale can turn aside a container ship: a digitally mediated decentering of the human.
Marine navigation becomes a matter of interspecies cooperation, as whales influence and constrain human action by controlling the decisions and movements of ship captains and fishers. Nonhumans, enabled by digital computation, are being enrolled in ocean governance, in stark contrast to the way that humans treated these species only a few decades ago, a grounded example of what Dipesh Chakrabarty calls the extension of “ideas of politics and justice to the nonhuman”: multispecies environmental regulation.
Whale Safe illustrates the remarkable change underway in planetary environmental governance. Confronted with accelerating biodiversity loss, scientists and conservationists are adapting digital tools to achieve conservation goals. Digital environmental monitoring and decision-making platforms are operational on every continent, in every major biome on Earth. Repurposed cellphones, hidden high in the tree canopy of tropical forests, are surveilling illegal loggers. Anti-terrorism software is being used to help predict and prevent poaching. Artificial intelligence algorithms use facial recognition to identify individual animals — from zebras to whale sharks — helping to track members of endangered species.
At some point in the past 24 hours, a flock of nano-satellites called Doves flew over your head: the first system able to image the entire surface of the Earth every day. Its developers — a team of ex-NASA engineers — are building a search engine for the entire surface of the planet that will operate in near real-time. One day soon, you will be able to search the surface of the Earth just like you search the web for images or text. Satellites are also being used to identify greenhouse gas emissions like methane; NGOs are publishing “name and shame” lists of the world’s biggest climate polluters.
These technologies are akin to those used in “smart cities” but articulated across a much wider range of ecosystems and land use types: a “Digital Earth,” monitored by systems of satellites and sensors that are increasingly instrumented, interconnected and intelligent. Digital Earth networks undertake a form of nested planetary computation, incorporating not only climate but also living beings, both biotic and abiotic elements of Gaia.
Digital Earth technologies have several implications for environmental governance. First, environmental data is becoming super-abundant rather than scarce. Second, environmental data is becoming ubiquitous: automated sensors, satellites and drones collect data continuously, even in remote places that humans find difficult to access, sensing and managing the environment everywhere, all the time. This creates time-space compression (governance is temporally and spatially ubiquitous) and time-space agility (governance is spatially and temporally dynamic).
Third and most powerful of all: Rather than responding to environmental crises after they occur, digital technologies enable near real-time responses and may even predict hazards and catastrophes before they happen. Environmental governance can thus be preventive rather than reactive, and environmental criminals will find it harder to hide. Crowdsourcing and citizen science can be used to involve the public in conservation efforts; sites such as Zooniverse herald a resurgence of public engagement in science akin to that of the Victorian era. Although the incursions of Big Tech into this space are cause for concern, the engagement of thousands of not-for-profit conservation groups in this agenda raises the likelihood that digital environmental governance might evolve to be more inclusive, enabling new patterns of inclusion, subsidiarity and solidarity.
Digital Earth also has the potential to be a multispecies affair, enrolling what Achille Mbembe refers to as “le vivant” (the living, enlivened, lively world) into planetary governance. Digital technologies may allow nonhumans to participate as active subjects in environmental management. Planetary computation, in other words, is not merely a set of tools for monitoring and manipulating the planet, but also a potential means of extending political voice to nonhumans, akin to what Isabelle Stengers terms “cosmopolitics.”
Planetary computation and planetary governance are thus not merely extensions of the old engineering mantra of “command and control.” Instead, they offer us a new paradigm: “communicate and cooperate,” which extends a form of voice to nonhumans, who become active subjects co-participating in environmental regulation, rather than passive objects. The environmental becomes inescapably political, but the political is not solely human. Digital Earth technologies offer the possibility of creating what Bruno Latour once called the “Parliament of Things”: a digitally enabled Parliament of Earthlings.
Can The Planet Speak?
Humans might choose to listen to nonhumans, but do they have anything meaningful to say? Here, too, Digital Earth technologies offer insights. In the past two decades, digital bio- and eco-acoustic networks have been deployed from the Arctic to the Amazon, recording and decoding nonhuman sounds — many of which occur beyond human hearing range in the high infrasound or low ultrasound. The proliferation of these digital listening systems reveals that much more meaningful information is encoded in acoustic communication within and between species than humans suspected.
Many species that scientists once thought to be mute or relatively vocally inactive actually make sound. To give just one example, researchers have recorded sounds made by over 50 fish and turtle species — once thought to be voiceless — making hundreds of different sounds, revealing complex coordination behaviors, evidence of parental care and a remarkable ability of embryos of at least one turtle species to time the moment of their collective birth through vocal communication. Peacocks emit loud, low infrasound during their mating dances. Elephants use similar frequencies to communicate across long distances, seemingly telepathically.
Nearly every species to which scientists have listened makes some form of sound. Nonhumans were once believed to be largely deaf and mute, but now we are realizing that in nature, silence is an illusion.
What else are we learning through digitally mediated listening? There is much more ecologically complex information contained in nonhuman vocalizations than we realized. Elephants have specific signals for different threats such as honeybees and humans, and their vocalizations even distinguish between humans from different tribes; researchers are now building an elephant dictionary with thousands of sounds. Honeybees also have hundreds of distinct sounds; although we have only deciphered a few, we know that there are specific sounds in honeybee language — which is spatial and vibrational as well as acoustic — with specific meanings, such as a “stop” signal and a begging signal. Queens even have their own distinct vocabulary.
Joining this biophony is a resounding geophonic chorus from the planet itself: the low frequencies of volcanoes and hurricanes, calving glaciers and earthquakes, ringing the atmosphere like a quiet bell. The world resonates with nature’s sounds, which human ears cannot detect. But our computers can.
Digital listening is also revealing that interspecies communication is much more widespread than scientists previously understood. Moths can detect and even jam bat sonar. When buzzing bees approach flowers, the flowers flood themselves with nectar within minutes. Plants can detect the sound of specific insect predators and distinguish threatening from non-threatening insects with astonishing precision. Corn, tomato and tobacco plants emit high-pitched ultrasound that we can’t hear, but insects likely can; in one experiment, researchers trained an AI algorithm on the distinct sounds emitted by healthy, dehydrated and wounded plants — and the algorithm was soon able to diagnose the plants’ condition, simply by listening. Although these ultrasounds are beyond human hearing range, we know that some insects can hear them. Could other creatures be listening to the plants and detecting their state of health?
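The logic of that plant-diagnosis experiment can be gestured at with a toy classifier. The feature values and the nearest-centroid approach below are illustrative stand-ins of my own invention, not the actual study’s recordings or neural network:

```python
# Toy sketch of diagnosing a plant's condition from features of its
# ultrasonic clicks. Centroid values (clicks per hour, mean peak frequency
# in kHz) are made up; a real system would learn them from recordings.

CENTROIDS = {
    "healthy":    (2.0, 50.0),
    "dehydrated": (35.0, 52.0),
    "wounded":    (25.0, 57.0),
}

def classify_plant(clicks_per_hour: float, peak_khz: float) -> str:
    """Return the condition whose centroid is closest to the observation."""
    def dist(condition):
        rate, freq = CENTROIDS[condition]
        return (clicks_per_hour - rate) ** 2 + (peak_khz - freq) ** 2
    return min(CENTROIDS, key=dist)
```

A quiet plant clicking twice an hour lands nearest the “healthy” centroid; a plant clicking dozens of times an hour drifts toward “dehydrated” or “wounded,” depending on the pitch of its cries.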
Sound is only one modality of nonhuman communication; other species use many mechanisms — from the gestural to the biochemical to the electrostatic — to communicate information and even emotions. Digital technologies can detect these multimodal forms of information, whether from forests or honeybees. As humans deploy digital technologies to enhance our ability to monitor and decode this information, we create the potential for a new type of environmental governance, and a new type of multispecies politics.
Complex communication is ubiquitous in nature, and thus many nonhumans could be said to possess a form of political “voice.” Modern humans have been hard of hearing, yet Digital Earth technologies offer new ways of listening to nonhuman preferences. This is by no means novel (Indigenous traditions offer powerful ways of nonhuman listening) nor neutral (digital technologies can be misused and abused). But with caveats and safeguards, digital technologies offer humanity a powerful new window into the nonhuman world.
Think of planetary computation as one means of eavesdropping on multispecies conversations, in which nonhumans can use digital technologies to convey information, influence human action and thus express a grounded form of voice. How might nonhuman preferences be incorporated into our decision-making frameworks, into new forms of Earthly politics? To begin formulating an answer, we’ll have to listen more closely to our nonhuman kin.