Last Pool Test With Beluga MK1

Silje Greidung Head of Marketing

(Member since 2021)

This weekend we were ready for the last pool test with Beluga MK1. The plan for the weekend was to record ROS-bags of the calibration board in water, to test the pipeline, color detection and the sonar, and finally to run the mock pre-qualifier where we test everything together.

The Perception group met at the Marine Cybernetics lab at Tyholt at 9 am. Since last time, the Perception group has been working with the ROS-bags from the previous pool test in January. The ROS-bags contain all the data from the runs – video, depth data, position data etc. – and make it possible to replay the pool test back in the office. The group started the day with camera calibration, but there were some issues with the live calibration. Simon has made a node that can calibrate on the fly with OpenCV, but, as we have experienced before, things that work well in the office can be a hassle when tested live at the pool. This time the live calibration didn't work, probably due to path issues on the Xavier.
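What the calibration node is fitting, whether live or from ROS-bags, is the camera intrinsics together with lens-distortion coefficients. As a rough sketch of the radial part of that distortion model in plain NumPy (the coefficient values below are made up for illustration, not Beluga's calibration results):

```python
import numpy as np

def apply_radial_distortion(points, k1, k2):
    """Apply a two-coefficient radial distortion model to normalized
    image points - the kind of model camera calibration estimates.
    points: (N, 2) array of normalized coordinates."""
    pts = np.asarray(points, dtype=float)
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)  # squared radius per point
    factor = 1.0 + k1 * r2 + k2 * r2 ** 2         # radial scaling factor
    return pts * factor

# Illustrative coefficients, not real calibration output
k1, k2 = -0.2, 0.05
center = apply_radial_distortion([[0.0, 0.0]], k1, k2)
edge = apply_radial_distortion([[1.0, 0.0]], k1, k2)
# A point at the optical center is unchanged; points near the edge shift.
```

Calibrating against the board gives enough point correspondences to solve for these coefficients, which is why we record the board even when the live node fails.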

ROS-bags of the calibration board were therefore recorded so that we can calibrate the camera later. The goal is still to calibrate in real time – because the lighting differs from pool to pool, we will have to recalibrate the camera in every pool Beluga enters.

Benjaminas trying to keep the calibration board from floating to the surface.
How many engineering students does it take to make the calibration board sink?


Henrik and Jana worked on the "RCFA" – Red Channel Frequency Analysis. The pixel histogram previously used for depth has been adapted to work for color instead. This works well for the color red, but the goal now is to detect orange, as both the gate and the paths that have to be followed in the RoboSub competition are orange. A test has also been added that decides whether a gate is present based on the color intensity.
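The gate test boils down to asking how much of the frame is orange. A minimal sketch of that idea – counting "orange-ish" pixels in an RGB frame and flagging a gate when the fraction passes a threshold; the channel ranges and threshold here are invented for illustration, not our tuned values:

```python
import numpy as np

def orange_fraction(rgb):
    """Fraction of pixels that look orange: strong red, moderate green,
    little blue. Channel thresholds are illustrative, not tuned values."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    mask = (r > 150) & (g > 50) & (g < 140) & (b < 100)
    return mask.mean()

def gate_detected(rgb, threshold=0.02):
    """Flag a possible gate if enough of the frame is orange."""
    return orange_fraction(rgb) > threshold

# Synthetic frame: bluish "water" with an orange horizontal bar
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[..., 2] = 120              # bluish background
frame[40:60, :] = (255, 100, 0)  # orange bar across the frame
```

In practice the thresholds would be set per pool, for the same lighting reasons that force per-pool camera calibration.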

Jana and Henrik discussing the code for color detection.

Then there was the sonar testing. We are using a Gemini imaging sonar, which will give a 2D map of the surroundings – when it works as intended. We were able to get raw data (read: a lot of numbers that did not make any sense) from the sonar, but we were not able to visualize it. Therefore, most of this weekend's sonar work consisted of debugging the sonar processor.
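The "lot of numbers" is essentially an intensity for each (beam angle, range bin) pair, and visualizing it means resampling that polar fan onto a Cartesian grid. A rough NumPy sketch of that resampling step – the field of view, beam count and grid size are illustrative, not the Gemini's actual specifications:

```python
import numpy as np

def polar_to_cartesian(scan, fov_deg=120.0, out_size=64):
    """Resample a (beams, range_bins) sonar intensity matrix onto a
    top-down Cartesian grid using nearest-neighbor lookup."""
    beams, bins = scan.shape
    half_fov = np.radians(fov_deg) / 2.0
    img = np.zeros((out_size, out_size))
    # x spans [-1, 1] across the fan, y spans [0, 1] away from the sonar
    ys, xs = np.meshgrid(np.linspace(0, 1, out_size),
                         np.linspace(-1, 1, out_size), indexing="ij")
    r = np.hypot(xs, ys)        # range of each output pixel
    theta = np.arctan2(xs, ys)  # bearing, 0 = straight ahead
    inside = (r <= 1.0) & (np.abs(theta) <= half_fov)
    beam_idx = ((theta + half_fov) / (2 * half_fov)
                * (beams - 1)).round().astype(int)
    bin_idx = (r * (bins - 1)).round().astype(int)
    img[inside] = scan[beam_idx[inside], bin_idx[inside]]
    return img

# Synthetic scan: a strong return on the middle beams at mid-range
scan = np.zeros((32, 50))
scan[15:18, 20:30] = 1.0
img = polar_to_cartesian(scan)
```

Pixels outside the fan stay zero, which is what produces the familiar wedge-shaped sonar image once this works.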

The sonar before testing.

On Sunday the Autonomous group started off with Beluga at 9 am. The plan for the day was tuning the localization and control systems, testing the Finite State Machine and finally running the pre-qualifier.

The group experienced some problems with the visualization of the directions, which Finn and Tarek worked tirelessly to fix. More tests were needed to get an accurate measure of the different accelerations. After the visualization problem was fixed, they worked on virtually controlling the drone: the program used lets us set a path that the drone will follow. Loud cheering from the Autonomous group filled the room when Beluga perfectly followed a preset square route!
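Conceptually, following a preset route like that square comes down to stepping toward one waypoint at a time and switching to the next when close enough. A toy NumPy sketch of that pattern (the step size, tolerance and coordinates are invented, and the real drone of course runs a proper controller in 3D):

```python
import numpy as np

def follow_waypoints(start, waypoints, step=0.1, tol=0.05, max_steps=2000):
    """Toy waypoint follower: move a fixed step toward each waypoint in
    turn, switching when within tolerance. Returns the visited path."""
    pos = np.asarray(start, dtype=float)
    path = [pos.copy()]
    for wp in waypoints:
        wp = np.asarray(wp, dtype=float)
        for _ in range(max_steps):
            delta = wp - pos
            dist = np.linalg.norm(delta)
            if dist < tol:
                break  # close enough, move on to the next waypoint
            pos = pos + delta / dist * min(step, dist)
            path.append(pos.copy())
    return np.array(path)

# A preset square route, like the one Beluga followed
square = [(1, 0), (1, 1), (0, 1), (0, 0)]
path = follow_waypoints((0, 0), square)
```

The path ends back at the starting corner after visiting all four waypoints.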

The route set for Beluga to follow.

At the end of the day we ran pre-qualifier tests for RoboSub 2022. We discovered some data transfer issues in the communication between the FSM (Finite State Machine) and the drone's camera, which we are now working diligently to fix.
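The FSM's job is to sequence the mission: stay in the current task until it reports done, then move to the next. A minimal, hypothetical sketch of that pattern (the state names are invented for illustration; the real system runs as ROS nodes, and it is the camera data feeding these states that hit the transfer issue):

```python
class MissionFSM:
    """Tiny linear mission state machine. State names are illustrative,
    not the actual states in Beluga's FSM."""

    TRANSITIONS = {
        "SEARCH_GATE": "ALIGN_WITH_GATE",
        "ALIGN_WITH_GATE": "PASS_THROUGH_GATE",
        "PASS_THROUGH_GATE": "SURFACE",
        "SURFACE": "DONE",
    }

    def __init__(self):
        self.state = "SEARCH_GATE"

    def step(self, task_complete):
        """Advance to the next state once the current task reports done."""
        if task_complete and self.state != "DONE":
            self.state = self.TRANSITIONS[self.state]
        return self.state

fsm = MissionFSM()
```

Each state here would depend on camera input (is the gate visible? are we aligned?), which is why a broken camera-to-FSM link stalls the whole pre-qualifier run.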

It has been two long days at the MC lab, which have given us a lot of data and many lines of code. After this weekend we are even closer to making Beluga fully autonomous, and we are excited to test Beluga MK2 in a few weeks!