Dr Sandra Brooke, Florida State University Coastal and Marine Lab, explores whether the over-exploitation of the deep oceans can be averted as deep-sea science continues to be outpaced by resource extraction
Humans have been exploiting the oceans for thousands of years, but focused scientific enquiry into the deep sea did not begin until the mid-19th century. In 1843, Edward Forbes proposed his Azoic Hypothesis, which posited that life could not exist below 300 fathoms (550 m).
The Challenger expedition (1872-1876) was the first global survey of the deep sea, and its dredging operations revealed a great abundance and diversity of life far below 300 fathoms. The Azoic Hypothesis became obsolete, and dredges became the tool of choice for deep-sea naturalists.
The mid-20th century heralded a leap forward in deep-sea research. The advent of acoustic systems for generating seafloor maps allowed scientists to locate topographic features and plumes from chemosynthetic ecosystems. Rugged environments such as submarine canyons became accessible to science through the use of underwater vehicles.
Over the past few decades, technological advances have created sophisticated mapping, navigation, and imaging systems that have greatly increased our ability to study the deep. New technologies such as autonomous data collectors, new sensors, artificial intelligence (AI) and automated sample processing will continue to accelerate the pace of marine research.
Large research vessels and sophisticated technologies are extremely expensive, and their use is generally limited to wealthy nations. The deep North Atlantic Ocean is fairly well characterised due to significant investments in deep-sea research by Europe and North America.
Over the past decade, over 500 putative cold seeps were discovered off the western Atlantic margin, miles of deep coral reefs were mapped and explored, dense faunal assemblages were documented within submarine canyons, and many new species were described. By contrast, regions such as the Indian Ocean remain virtually unknown. Vast areas of the deep seafloor have yet to be mapped or explored and we can only guess at the kinds of secrets they hold.
With increasing pressure on financial resources, one could question the wisdom of massive expenditures in deep-sea research. Do we really need to understand a system that most people will never even see? One that is so vast and inaccessible that we couldn’t possibly impact it? Why not use our precious funding to study our coastal areas, which are being overexploited, eroded and polluted within sight of human population centres?
The deep sea is not as safe from human activities as one might think. As coastal resources become depleted, nations are moving further offshore in search of unexploited fish stocks, oil reserves, and the materials needed to supply the insatiable global demand for technology.
Industrial bottom trawling is the most destructive deep-sea fishing practice, indiscriminately harvesting large quantities of target and non-target species, most of which are discarded. Deep-sea species are slow-growing and long-lived, which makes them highly vulnerable to overfishing.
In addition, large swaths of ancient deep coral habitats have been destroyed by bottom trawling, and recovery has been slow or non-existent. In return for this ecological damage, deep-sea fisheries contribute less than 0.5% of global fisheries landings, and most are heavily subsidised, suggesting they are also economically unsustainable.
Seabed mining for valuable metals is an emerging threat to deep-sea ecosystems such as manganese nodule fields, seamounts and hydrothermal vents. Much of this mining activity will occur on the high seas, where it is regulated by the International Seabed Authority (ISA). The ISA is responsible for ‘protecting the marine environment from harmful effects’, and is working with interested parties to develop mining regulations.
It remains to be seen, however, how seamount crusts and active vents can be removed without causing harm. Superimposed on these physical impacts is the looming spectre of climate change, the effects of which we are just beginning to understand.
It is easy to place a value on the exploitation of an ecosystem, but much more challenging to calculate the value of preserving it. For example, some deep-sea ecosystems, particularly isolated hydrothermal vent systems and seamounts, have high levels of endemism. Many benthic species use ‘chemical weapons’ for feeding or defence, and human medicines are often derived from these bio-active compounds. Microbes and metazoans that live in extreme environments can hold keys to the evolution of life on Earth or inspire new technologies. What then, is the ultimate cost of not knowing what we destroy?
Research has accomplished much over the past few decades, but it cannot keep up with resource exploitation. Virgin fish stocks can be depleted faster than scientific support for their protection can be generated. Exploration contracts for mineral extraction have been granted in areas where no research exists. Terrestrial and coastal ecosystems have been mined, overharvested and contaminated for short-term economic gain. Unless something changes, the deep oceans will suffer the same fate. As human populations increase and food security becomes a global issue, sustainable resource management will be critical.
The Magnuson-Stevens Act (MSA) is the primary law governing marine fisheries management in U.S. federal waters. One of the tenets of MSA is the ‘precautionary principle’, which stipulates that if an activity threatens marine resources, protective measures should be taken, even in the absence of scientific cause-and-effect evidence. This approach removes the burden on science to prove harm from industry activities and recognises the value of caution in an uncertain world.
Unless ‘precautionary principles’ are applied to ecosystem management globally, we risk squandering priceless and irreplaceable resources. Whether humans can radically change their behaviour on a sufficient scale to prevent an environmental crisis remains to be seen.
Please note: This is a commercial profile