The Perception software group is a team of students focused on developing software that extracts features and positions from sensor data. The team uses sensors such as cameras, lidar, and sonar. Its primary objective is to provide filtered object positions to the guidance and navigation systems of autonomous drones, enabling the drone to make informed decisions based on the data it receives.
What we have done so far:
# Created a Docker container
A Docker container is a lightweight and standalone executable package that includes all the necessary components to run an application: the code, runtime environment, libraries, and system tools. Together with DevOps, we have developed a Docker container for the perception stack, which makes it possible to test code without compatibility issues.
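As an illustration, a minimal Dockerfile for a ROS-based perception stack could look something like the following. This is a sketch only; the base image, package names, and launch file are placeholders, not our actual setup:

```dockerfile
# Hypothetical example: base image, packages, and launch file are placeholders
FROM ros:humble-ros-base

# System dependencies the perception nodes might need
RUN apt-get update && apt-get install -y \
    ros-humble-pcl-ros \
    && rm -rf /var/lib/apt/lists/*

# Copy and build the (hypothetical) perception workspace
COPY ./src /ws/src
WORKDIR /ws
RUN . /opt/ros/humble/setup.sh && colcon build

# Launch the stack when the container starts
CMD ["bash", "-c", ". install/setup.sh && ros2 launch perception perception.launch.py"]
```

Because the image pins the runtime and all dependencies, every team member builds and runs the exact same environment regardless of their host machine.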
# Implemented first draft of PDAF
A Probabilistic Data Association Filter (PDAF) is a type of filtering algorithm used in Multi-Target Tracking (MTT) systems. MTT systems involve tracking multiple targets, such as other vehicles, pedestrians, or objects, simultaneously. The PDAF handles uncertainty and noise in the data received from sensors, enabling it to make association decisions based on probability and to track multiple targets simultaneously.
Here is an example of the PDAF on dummy data. Each red x is a detection. The blue line is the ground truth of an object moving in the world. Each green x is a measurement filtered by the PDAF. The filtered measurements can be seen to coincide with the ground-truth track.
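To make the idea concrete, here is a minimal sketch of a single-target PDAF measurement update (the core step the filter runs for each scan). This is an illustrative simplification, not our implementation; the parameter values, such as the detection probability and clutter density, are made-up assumptions:

```python
import numpy as np

def pdaf_update(x_pred, P_pred, z_list, H, R,
                P_D=0.9, P_G=0.99, clutter_density=1e-3):
    """One PDAF measurement update for a single target in clutter.

    x_pred, P_pred : predicted state mean/covariance from a Kalman filter
    z_list         : gated measurements from this scan (may be empty)
    H, R           : measurement matrix and measurement noise covariance
    P_D, P_G       : detection and gating probabilities (assumed values)
    """
    if not z_list:
        return x_pred, P_pred          # no detection: keep the prediction

    S = H @ P_pred @ H.T + R           # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    z_hat = H @ x_pred                 # predicted measurement

    # Likelihood of each measurement under the predicted Gaussian
    innovations = [z - z_hat for z in z_list]
    d = len(z_hat)
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(S))
    likelihoods = np.array(
        [norm * np.exp(-0.5 * v @ np.linalg.solve(S, v)) for v in innovations]
    )

    # Association probabilities; beta_0 covers "all measurements are clutter"
    weights = P_D * likelihoods / clutter_density
    beta_0 = 1.0 - P_D * P_G
    betas = np.append(weights, beta_0)
    betas /= betas.sum()

    # Combined innovation: a probability-weighted mix of all candidates
    v_comb = sum(b * v for b, v in zip(betas[:-1], innovations))
    x_upd = x_pred + K @ v_comb

    # Moment-matched covariance: blends the "missed detection" and
    # "detected" cases and adds the spread of the innovations
    P_c = P_pred - K @ S @ K.T
    spread = sum(b * np.outer(v, v) for b, v in zip(betas[:-1], innovations)) \
        - np.outer(v_comb, v_comb)
    P_upd = betas[-1] * P_pred + (1 - betas[-1]) * P_c + K @ spread @ K.T
    return x_upd, P_upd
```

The key difference from a plain Kalman filter is that instead of committing to one measurement, the update hedges across all gated measurements weighted by how likely each is to have originated from the target.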
# First draft of Lidar detection algorithm
Lidar detection algorithms using DBSCAN and Euclidean clustering have been implemented. This allows the ASV to locate objects in the world using lidar.
Here is a picture showing point cloud data from the Vortex office. The white box marks an object recognized by the lidar detector.
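As a simplified illustration of the Euclidean clustering idea (not our actual lidar pipeline), a cluster can be grown by repeatedly absorbing any point within a distance tolerance of a point already in the cluster; the threshold and minimum cluster size below are made-up values:

```python
import numpy as np

def euclidean_cluster(points, tolerance=0.5, min_size=3):
    """Simple Euclidean clustering by region growing.

    points : (N, 3) array of lidar returns
    A point joins a cluster if it lies within `tolerance` of any point
    already in the cluster; clusters smaller than `min_size` are dropped
    as noise.
    """
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, members = [seed], [seed]
        while queue and unvisited:
            i = queue.pop()
            idx = np.fromiter(unvisited, dtype=int)
            dists = np.linalg.norm(points[idx] - points[i], axis=1)
            for j in idx[dists <= tolerance]:
                unvisited.remove(j)
                queue.append(j)
                members.append(j)
        if len(members) >= min_size:
            clusters.append(points[members])
    return clusters
```

Each resulting cluster can then be fitted with a bounding box, like the white box shown in the picture, and published as a detected object.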
What is in progress:
# Pipeline following
For the Tau Autonomy Challenge’s pipeline inspection task, we need to be able to follow a pipeline. We currently have path following from RoboSub, but we need to adapt this algorithm so that it can be used on a pipeline marked with ArUco markers.
Here is an example of the old path following algorithm finding the contour of a path.
# Detecting a power puck on a docking plate with ArUco markers
We can currently detect ArUco markers in an image. Work is now focused on estimating the depth of the power puck relative to the camera.
An ArUco marker being detected.
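Because ArUco markers have a known physical size, a first estimate of depth can come from the pinhole relation Z = f · X / x, where X is the real side length and x the apparent side length in pixels. A sketch, with made-up focal length and marker size, is:

```python
import numpy as np

def depth_from_marker(corners_px, marker_size_m, focal_px):
    """Estimate distance (m) to a square marker of known side length
    using the pinhole model Z = f * X / x.

    corners_px    : (4, 2) array of the marker's corner pixel coordinates
    marker_size_m : physical side length of the marker in meters
    focal_px      : camera focal length in pixels
    """
    # Average the four side lengths for a more stable apparent size
    sides = np.linalg.norm(np.roll(corners_px, -1, axis=0) - corners_px,
                           axis=1)
    return focal_px * marker_size_m / sides.mean()
```

In practice one would use the full camera intrinsics and marker pose estimation rather than this one-axis approximation, but the relation above captures why known marker size yields depth.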
# Buoy detection and path following
We are currently investigating how to robustly detect buoys and pass the detections to the guidance and navigation systems of the drone.
Goals for the semester:
The goal for the perception team is to complete all the tasks in both Njord and TAC. However, the team is focusing on the autonomous docking and pipeline following tasks in TAC, while also working on COLAV and buoy following for Njord.