The Ocean Decade Depends on Cloud Bathymetry
By Norman Barker
Global, detailed knowledge of the shape of the seafloor, also known as bathymetry, is critical to humankind: it underpins more informed decision-making and enables groundbreaking scientific discoveries. In April 2023, for instance, the discovery of more than 19,000 undersea volcanoes helped advance studies in ecology, ocean mixing and plate tectonics while improving our ability to protect and sustainably manage the ocean.
The United Nations has proclaimed 2021 to 2030 as the Ocean Decade, with researchers sharing the goal of mapping the world’s entire ocean floor by the end of 2030. Critical to this aim will be the evolution of database technology to deliver the capacity and compute power for such a massive endeavor.
Only about 25 percent of the world’s ocean floor has been properly mapped, and only about 6 percent at high resolution. The rest has been predicted from satellite altimeter data, which provide only an approximation of the shape of the seafloor.
There are several reasons for this gap in our knowledge. One is the high cost of mapping expeditions. Another is the absence of a unified database. Bathymetric, or marine geospatial, data exist in various unstructured forms: primarily sound navigation and ranging (sonar) point clouds, as well as videos, satellite-derived bathymetry, light detection and ranging (LiDAR) and satellite altimetry. Different types of data often reside in specialty databases, and siloing them makes it difficult to do overlays, that is, to superimpose multiple data sets to reveal real-time insights. It can take months to cobble these systems together for analysis.
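As a minimal sketch of what an overlay involves (the surveys, coordinates and gridding scheme below are invented for illustration, not drawn from any real data portal): two heterogeneous point data sets, such as sonar and LiDAR soundings, can be binned onto a common grid and then combined, preferring the higher-resolution source wherever it has coverage.

```python
import numpy as np

# Hypothetical example: two bathymetry surveys delivered as separate
# (lon, lat, depth) point sets. Overlaying them means binning both onto
# one common grid so their depths can be compared cell by cell.
rng = np.random.default_rng(0)
sonar = rng.uniform([150.0, -40.0, -5000.0], [151.0, -39.0, -10.0], (500, 3))
lidar = rng.uniform([150.0, -40.0, -50.0], [151.0, -39.0, -1.0], (200, 3))

def grid_mean_depth(points, lon_edges, lat_edges):
    """Average depth per grid cell; NaN where the survey has no coverage."""
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                  bins=[lon_edges, lat_edges])
    sums, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                bins=[lon_edges, lat_edges],
                                weights=points[:, 2])
    return np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)

lon_edges = np.linspace(150.0, 151.0, 11)   # 10 x 10 grid over the region
lat_edges = np.linspace(-40.0, -39.0, 11)

sonar_grid = grid_mean_depth(sonar, lon_edges, lat_edges)
lidar_grid = grid_mean_depth(lidar, lon_edges, lat_edges)

# The overlay: use high-resolution LiDAR where present, else fall back to sonar.
combined = np.where(np.isnan(lidar_grid), sonar_grid, lidar_grid)
```

When each survey lives in its own specialty system, even this simple fallback rule requires first exporting, reprojecting and aligning the data, which is the months-long "cobbling together" described above.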
Because mapping an area as large as the ocean floor requires international collaboration and partnership among different research groups, this siloing challenge can increase exponentially. The increasing volume, variety and velocity of data are putting more raw information at our fingertips than ever before, but the data’s heterogeneity makes it very difficult to analyze them in aggregate and extract valuable observations.
In addition, the seafloor is a dynamic canvas: its shape constantly shifts, so accurate mapping requires collecting and analyzing vast amounts of data as close to real time as possible. An example is the seabed mapping initiative of Geoscience Australia and its AusSeabed Marine Data Portal, which generate massive volumes of data that are analyzed in high-performance computing environments in the cloud. This system requires data to be immediately shareable from the vessel to the cloud.
It’s clear that bathymetric data, and the ultimate goal of global ocean bathymetry, need more advanced database approaches, coupled with a collaborative environment that works for all users. Geoscience Australia demonstrates the power of a unified system that streamlines all data into a single, contextual hub. Such a hub enhances data sharing and collaboration among multifunctional teams and facilitates data governance as data are modified and shared among different groups, dramatically simplifying work on large projects.
As we move forward in the Ocean Decade, data need to be cloud- and analysis-ready, right from the boat. There should be one data engine used on the vessel that can interoperate directly with the cloud in near real time. Given the sheer volume of seafloor data, compute power should exist right alongside the data to streamline the process of ocean discovery.
Norman Barker is the vice president of geospatial at TileDB. Previously, he focused on spatial indexing and image processing and held engineering positions at IBM Cloudant and Mapbox.
