Melanie Olsen with the Blue ROV2. (Credit: AIMS, via communication)
The Great Barrier Reef (GBR) is composed of over 3,000 individual reefs, making it the largest reef system on Earth. The system, along with its 900-plus islands, covers more than 344,400 square kilometers, about half the size of Texas. That’s a lot of ground to cover if you’re a scientist or research team studying the reef; add in the difficulties inherent to conducting research underwater, from sharks to bad weather, and you have a serious challenge on your hands.
Now, a research team from the Australian Institute of Marine Science (AIMS) is trialing an underwater robot with a hyperspectral camera that works in tandem with aerial drones, testing their ability to monitor the GBR.
The ROV the team is working with is built on a base vehicle rated to dive to depths of up to 100 meters: the Blue Robotics BlueROV2.
“We redesigned the power supply system and some of the connectors to make them more field friendly, such as being able to disconnect the tether and battery without breaking seals,” explains Melanie Olsen, the AIMS Technology Transformation leader.
The robot navigates through the water semi-autonomously as a team member at the surface uses a gaming controller to assist the robot.
“We also have the ability to load guidance, navigation, and control algorithms to meet our purpose,” details Olsen. “One example of this is a scientific use case where a diver swims along a transect tape. We want our robot to automatically lock onto the transect tape and swim along.”
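The lock-on behavior Olsen describes is, at its core, a visual-servoing problem: detect the transect tape in the camera frame and steer to keep it centered. The sketch below is not AIMS's code; it is a minimal, hypothetical proportional-derivative (PD) steering loop, with illustrative gains and a toy plant model standing in for the real vehicle dynamics.

```python
# Hypothetical sketch of transect-tape lock-on steering (not AIMS's code).
# Assumes a separate vision module reports the tape's horizontal offset
# (in pixels) from the image center; a PD controller converts that offset
# into a normalized yaw-rate command.

IMAGE_WIDTH = 640          # example sensor width in pixels
KP, KD = 0.004, 0.0005     # illustrative gains, tuned per vehicle

def yaw_rate_command(offset_px: float, prev_offset_px: float, dt: float) -> float:
    """Return a normalized yaw-rate command in [-1, 1].

    offset_px: tape centerline offset from image center (+ = tape to the right).
    """
    derivative = (offset_px - prev_offset_px) / dt
    cmd = KP * offset_px + KD * derivative
    return max(-1.0, min(1.0, cmd))  # clamp to thruster limits

# Simulated lock-on: vehicle starts 200 px off the tape and steers back.
offset, dt = 200.0, 0.1
prev = offset
for _ in range(100):
    cmd = yaw_rate_command(offset, prev, dt)
    prev = offset
    offset -= cmd * 500.0 * dt   # toy plant: yawing reduces the pixel offset
print(f"final offset: {offset:.3f} px")
```

In practice the vision front end (segmenting the tape against sand and coral) is the hard part; the control law itself can stay this simple.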
The robot also carries a Headwall Nano-Hyperspec hyperspectral camera covering the 400 to 1000 nanometer spectral range, extending the vehicle’s monitoring abilities. The camera achieves frame rates up to 350 fps with its 640 x 480 CMOS image sensor and 7.4 µm pixel size. The Nano-Hyperspec also offers a 17 mm lens, a GigE Vision interface, 640 spatial pixels and 270 spectral bands, and onboard data processing and storage, with a storage capacity of 480 GB.
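Those specs imply a substantial data rate. The back-of-envelope calculation below uses only the figures above plus one assumption not in the article: samples stored as 2 bytes each (12-bit data padded to 16-bit is a common convention for such sensors).

```python
# Back-of-envelope data-rate estimate from the published specs.
# BYTES_PER_SAMPLE is an assumption (12-bit data stored as 16-bit),
# not a figure from the article.

SPATIAL_PIXELS = 640
SPECTRAL_BANDS = 270
LINES_PER_SEC = 350        # maximum frame (line) rate
BYTES_PER_SAMPLE = 2       # assumed
STORAGE_GB = 480

rate_mb_s = SPATIAL_PIXELS * SPECTRAL_BANDS * BYTES_PER_SAMPLE * LINES_PER_SEC / 1e6
minutes_to_fill = STORAGE_GB * 1e9 / (rate_mb_s * 1e6) / 60
print(f"~{rate_mb_s:.0f} MB/s; 480 GB fills in ~{minutes_to_fill:.0f} min at full rate")
```

Roughly 120 MB/s at full rate, so the onboard 480 GB would hold on the order of an hour of continuous full-rate capture; real missions would typically scan at lower line rates.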
“Hyperspectral cameras provide next level datasets for science when it comes to things like monitoring change over time (e.g. health),” Olsen describes. “They also interface well with machine learning augmented data processing workflows to enable us to more accurately and quickly make automated assessments of benthic health and biodiversity.”
The hyperspectral camera captures 270 bands of color information—far more than the three broad color channels the human eye perceives—allowing it to provide robust, detailed information on reef health. The camera can tell the team about the water’s depth, map the ocean floor, and identify stretches of bleached coral.
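Bleach detection illustrates why the extra bands matter: bleached coral reflects brightly and fairly flatly across the visible range, while healthy coral's pigmented symbionts absorb strongly at specific wavelengths. The sketch below is a simplified, hypothetical classifier, not AIMS's pipeline; the band mapping, thresholds, and example spectra are all illustrative.

```python
# Illustrative sketch (not AIMS's actual pipeline): flagging likely
# bleached pixels from a single hyperspectral reflectance spectrum.
# Wavelength-to-band mapping and thresholds are hypothetical.

def band_index(wavelength_nm: float, start=400.0, step=600.0 / 270) -> int:
    """Map a wavelength to one of 270 bands spanning 400-1000 nm."""
    return int((wavelength_nm - start) / step)

def looks_bleached(spectrum, bright_thresh=0.5, flat_thresh=0.1) -> bool:
    """spectrum: 270 reflectance values in [0, 1], one per band.

    Bleached coral tends to be bright (high mean reflectance) and
    spectrally flat (small spread) across the visible range.
    """
    visible = spectrum[band_index(450):band_index(650)]
    mean = sum(visible) / len(visible)
    spread = max(visible) - min(visible)
    return mean > bright_thresh and spread < flat_thresh

# Toy spectra: healthy coral is dark and sloped; bleached is bright and flat.
healthy = [0.15 + 0.002 * i for i in range(270)]
bleached = [0.65] * 270
print(looks_bleached(healthy), looks_bleached(bleached))
```

A production workflow would replace these hand-set thresholds with the machine-learning classifiers Olsen mentions, trained on labeled spectra, but the underlying signal is the same.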
Using this kind of ROV can free up a research team, keep them safer, and make their labor hours go further.
“The ROV can definitely stay underwater for much longer and go deeper than a diver,” comments Olsen. “The research team can plan their missions to make better use of their special skills without having to have a full dive team (with their safety overheads) in the water.”
The ROV can also be deployed at night, offering a unique opportunity for monitoring and data collection.
“The underwater world of the Great Barrier Reef is very different at night, much like nocturnal versus daytime animals on land,” remarks Olsen. “By being able to conduct routine monitoring missions at night, we open up the ability to build long-term monitoring datasets of new reef/fish behavioral and biodiversity aspects. There are also other things we can measure at night such as bioluminescence. We successfully tested a bioluminescence sensor on the recent trial too.”
Deploying the aerial drone—also equipped with a hyperspectral camera—in tandem with the ROV also adds a new layer of sophistication to monitoring operations for the team. This was the first time the AIMS team tested the drone and ROV simultaneously during a night-time mission.
“We want to do coordinated missions between the air, surface, and subsurface layers in order to get the most value from the time we spend in the field,” states Olsen. “In the video, we had one of our normal imagery drones monitoring the track of the BlueROV2 at night.”
For Olsen, the ability to customize a monitoring system for a specific research need is one of the most important takeaways from this work.
“I think the most interesting thing is that we are trying to integrate off the shelf systems into capable maritime unmanned scientific platforms, and then deploy them together into one layered marine observation system,” comments Olsen. “The unmanned vehicle sector is advancing rapidly, so we want to be able to integrate new tech cost-effectively when it hits the market. The most expensive platform doesn’t always equate to the best tool for the job when you need to sustain an operational fleet at sea.”