February Update

Børge Pahlm, Project Manager (Member since Sept. 2019)

This Sunday most of our members got together to present their progress on the AUV (Autonomous Underwater Vehicle) and ASV (Autonomous Surface Vehicle) projects to each other. A lot has been accomplished since the start of the semester, including advances in camera calibration, solutions to core problems in our software, and clearer approaches to our acoustics and mechanical challenges.

Some notable events from the last few weeks:

  • We're officially registered for the AUV competition RoboSub 2022!

  • We have registered the team for the ASV competition Njord - The Autonomous Ship Challenge.

  • Norway lifted all corona restrictions which made working on the project much more enjoyable!

  • Our software group now consists of three independent groups, each with its own group leader, instead of one big group assigning tasks as needed. The new groups are Perception, Embedded and Autonomous.

Mechanical upgrades!

This year we're continuing work on our AUV Beluga, with the goal of redesigning it while keeping the core concepts. The mechanical group has been working hard on a whole new chassis for the drone, which will be unveiled later this spring. Our current frame was built from aluminium profiles to give a modular structure in case the hardware configuration needed to change. That modularity was the big hardware challenge when we built an autonomous drone from scratch for the first time, given the uncertainty about what equipment we can get ahold of as a student organization. This year's mechanical AUV design solidifies the result of that decision in a more rigid and better-looking upgrade.

One thing that will stay identical to last year's design is the pneumatic actuator system. Today the mechanical team tested out a new mobile testing device for the pneumatic actuators. The image shows the white test box connected to the pneumatic glass housing, which in turn is connected to our 10-bar tank. That's another great quality-of-life tool added to our repertoire.

The ASV is nearing its final design, and our current task has been to find a company to machine it from our buoyancy material. This is our first time developing a surface vehicle, and the situation is similar to when we created Beluga last year: focus on the practical concepts that make it work, then continue development from there. I'll say this much: it's a lot of power for such a small boat drone.

Additionally, we have performed corrosion tests on critical components of the current AUV, also known as Beluga MK1, to prepare for more testing in saltwater. The Norwegian AUV competition we competed in last summer, the TAC Challenge, took place entirely in the open sea at one of Norway's biggest test sites for subsea equipment, and that experience has influenced the material choices for Beluga MK2. There's also a wide assortment of actuator upgrades to look forward to!

Debugging the EKF

We only needed to use our smallest available pool today to work on our current software problems. The Autonomous group spent the day debugging the EKF (Extended Kalman Filter) which provides pose estimates in the world frame.

We have struggled with the EKF spitting out unreasonable data. It turned out that the variance (a squared deviation, which should never be negative) of one of the frames came out negative. How a squared quantity can end up negative is one for the history books of software errors that make no sense at first glance. It was, however, a welcome breakthrough: with a temporary workaround for the variance issue, the filter now produces much more sensible data.
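For readers curious about the numerics: a negative diagonal entry in an EKF covariance is usually floating-point round-off in the standard update P = (I - KH)P, which preserves neither symmetry nor positive semi-definiteness. One standard fix is the Joseph-form update plus re-symmetrization. Here's a hedged sketch (illustrative names, not our actual codebase, and not necessarily the workaround we used):

```python
import numpy as np

def joseph_update(P, K, H, R):
    """Joseph-form EKF covariance update.

    Unlike P = (I - K @ H) @ P, this form keeps P symmetric and
    positive semi-definite even under floating-point round-off,
    so diagonal entries (variances) stay non-negative.
    """
    I = np.eye(P.shape[0])
    A = I - K @ H
    P_new = A @ P @ A.T + K @ R @ K.T
    # Scrub any residual asymmetry from round-off
    return 0.5 * (P_new + P_new.T)
```

The extra matrix products cost a little more per update, which is why the cheaper non-Joseph form is so common, and why this kind of bug is so classic.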

Camera Calibration and Underwater Computer Vision

Imagine wanting a camera to recognize your everyday office supplies. That's a fun challenge, and you can even use ready-made solutions from OpenCV to get started. Now imagine doing this with a stereo camera. That's two cameras, which means we can also estimate the position of the object, much like our own eyes do. Of course, to make this work you'd have to spend a few evenings working out the math and calibrating the cameras too. And then you'd want to add your own objects to identify, to make it more relevant for your needs. Easy, right? Now imagine putting that camera underwater, with both the waterproof glass layer in front of the lenses and the refraction of the water messing with your results. Now that's a challenge!

Our Perception members working with computer vision are dealing with this exact problem. The last weeks, or even months, have been spent trying out different ways of calibrating the camera to get satisfying results. Normally, to identify an object like the gate in the RoboSub competition, you might try to recognize the gate based on its color and straight lines. The picture on the right shows how badly water refraction can interfere with the recognition algorithm: the water makes the gate appear bent, which in turn hurts the reliability of this traditional object recognition method. It serves as a great example of how well-calibrated equipment and well-tuned controllers are indispensable.
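The position estimate a stereo pair gives you comes from disparity: the same point lands at slightly different pixel columns in the left and right images, and depth falls out of the geometry. A minimal sketch of that idea, assuming an ideal rectified and calibrated pair in air (underwater, refraction bends the rays and breaks this simple pinhole model, which is exactly the calibration headache described above; the numbers are illustrative, not our camera's):

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth of a point from a rectified stereo pair.

    x_left, x_right: pixel x-coordinate of the same point in the
                     left and right images
    focal_px:        focal length in pixels
    baseline_m:      distance between the two cameras in metres
    """
    disparity = x_left - x_right  # larger disparity = closer object
    if disparity <= 0:
        raise ValueError("point must be closer than infinity")
    return focal_px * baseline_m / disparity

# A point 20 px apart between the views, 700 px focal length,
# 10 cm baseline: depth = 700 * 0.1 / 20 = 3.5 m
print(stereo_depth(400.0, 380.0, 700.0, 0.1))
```

Getting `focal_px` (and the lens distortion that this sketch ignores) right is what camera calibration is all about, which is why a refraction-bent calibration poisons everything downstream.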

Localizing Acoustic Pingers

Localizing the underwater pingers needed for the competition has for years been a cross-group challenge involving hydrophones, custom amplifier boards, ADCs and robust software. This year we're closer than ever to solving it. Our current solution uses multilateration in 3D space, which requires five hydrophones to acquire the desired localization results. The proof of concept is currently being translated from Python to C for our Teensy boards, the amplifier PCB is almost ready, and we have placed a new order of hydrophones. We're all excited to see this progress on a problem that has been with us for some time.
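To illustrate where the five-hydrophone requirement comes from: linearizing the time-difference-of-arrival (TDOA) range equations leaves four unknowns (the pinger's x, y, z and the range to a reference hydrophone), so you need four independent equations, i.e. five hydrophones. A hedged Python sketch of that linear solve, in the spirit of our Python proof of concept but with illustrative names and positions, not our actual implementation:

```python
import numpy as np

def multilaterate(hydrophones, tdoas, c=1500.0):
    """Estimate a pinger position from TDOAs.

    hydrophones: (N, 3) array of hydrophone positions (metres)
    tdoas:       (N-1,) arrival-time differences relative to
                 hydrophone 0 (seconds)
    c:           speed of sound in water (m/s)

    Linearizing |x - p_i| = |x - p_0| + c*tdoa_i gives, for each i:
        2*(p_i - p_0)·x + 2*d_i*r_0 = |p_i|^2 - |p_0|^2 - d_i^2
    with unknowns x (3D) and r_0 (range to hydrophone 0), so
    N = 5 hydrophones give a solvable 4x4 system.
    """
    p0 = hydrophones[0]
    d = c * np.asarray(tdoas)  # range differences r_i - r_0
    A = np.hstack([2.0 * (hydrophones[1:] - p0), 2.0 * d[:, None]])
    b = np.sum(hydrophones[1:] ** 2, axis=1) - np.sum(p0**2) - d**2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]  # estimated pinger position
```

With more than five hydrophones the same least-squares solve averages out timing noise, which is presumably why real arrays tend to be over-determined; and avoiding degenerate (e.g. coplanar) hydrophone placements keeps the system well-conditioned.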