The 2nd Workshop on Maritime Computer Vision (MaCVi) 2024 addresses maritime computer vision for Unmanned Aerial Vehicles (UAV) and Unmanned Surface Vehicles (USV). Three challenge categories are considered: (i) UAV-based Maritime Object Tracking with Re-identification, (ii) USV-based Maritime Obstacle Segmentation and Detection, and (iii) USV-based Maritime Boat Tracking. The USV-based Maritime Obstacle Segmentation and Detection category features three sub-challenges, including a new embedded challenge addressing efficient inference on real-world embedded devices. This report offers a comprehensive overview of the findings from the challenges. We provide both statistical and qualitative analyses, evaluating trends from over 195 submissions. All datasets, evaluation code, and the leaderboard are publicly available at https://macvi.org/workshop/macvi24.
Fish cage inspection, a necessary maintenance task at any fish farm, whether small-scale or industrial, has the potential to be fully automated. Replacing trained divers who perform regular inspections with autonomous marine vehicles would lower manpower costs and remove the risks associated with humans performing underwater inspections. Achieving such a level of autonomy requires an image processing algorithm capable of estimating the state of biofouling buildup. The aim of this work is to propose a complete solution for automating the inspection process: from developing an autonomous control algorithm for an ROV, to automatically segmenting images of fish cages and accurately estimating the state of biofouling. The first part is achieved by fitting a commercially available ROV with an acoustic SBL positioning system and developing a closed-loop control system. The second part is realized by implementing the proposed biofouling estimation framework, which relies on AI to perform image segmentation and on established computer vision methods to obtain a rough estimate of the ROV's distance from the fish cage. This also involved developing a labeling tool to create a dataset of images on which the neural network performing the semantic segmentation could be trained. The experimental results show the viability of using an ROV fitted with an acoustic transponder for autonomous missions, and demonstrate the biofouling estimation framework's ability to provide accurate assessments alongside satisfactory distance estimation capabilities. In conclusion, the achieved biofouling estimation accuracy shows clear potential for use in the aquaculture industry.
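As an illustration of the kind of estimate such a framework can produce, the following minimal Python sketch computes a biofouling ratio from a per-pixel segmentation mask. The class indices and the toy mask are assumptions for the example, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): given a per-pixel class mask
# produced by a semantic segmentation network, estimate the biofouling level as
# the fraction of net pixels classified as fouled. Class indices are assumptions.
import numpy as np

CLEAN_NET = 1    # hypothetical class id for clean netting
FOULED_NET = 2   # hypothetical class id for biofouled netting
# any other id (e.g. 0) is treated as background / open water

def biofouling_ratio(mask: np.ndarray) -> float:
    """Return fouled-net pixels divided by all net pixels (0.0 if no net visible)."""
    net_pixels = np.isin(mask, (CLEAN_NET, FOULED_NET)).sum()
    if net_pixels == 0:
        return 0.0
    return float((mask == FOULED_NET).sum()) / float(net_pixels)

if __name__ == "__main__":
    # Toy 4x4 mask: 4 of the 10 visible net pixels are fouled.
    toy_mask = np.array([[0, 0, 1, 1],
                         [0, 0, 1, 1],
                         [2, 2, 1, 1],
                         [2, 2, 0, 0]])
    print(f"estimated biofouling: {biofouling_ratio(toy_mask):.0%}")  # 40%
```

In practice the mask would come from the trained segmentation network rather than being hand-written, and the ratio could be averaged over consecutive frames of an inspection run.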
In the interest of both enabling long-term autonomous monitoring of at-risk marine environments and raising awareness and capabilities among citizens, a heterogeneous system of marine robots was developed, integrated, and deployed on a mission in the Adriatic Sea. This paper details a use-case scenario for a team of marine robotic agents for the purpose of cooperative marine litter detection and mapping, while also including interested citizens in the loop and allowing them to serve as operators. Two Autonomous Surface Vehicles (ASVs), a Remotely Operated Vehicle (ROV), and a Smart Buoy were deployed in a real marine environment to demonstrate the cooperative abilities of this system.
This paper presents an overview of advances in the estimation of the biofouling state of fish cages, carried out as part of the HEKTOR (Heterogeneous Autonomous Robotic System in Viticulture and Mariculture) project. First, the developed framework for biofouling estimation is briefly presented. A method using k-means clustering for labeling images of fish cages is outlined, followed by the results of machine learning approaches that automatically infer the semantic meaning of pixels in an image, trained on a dataset recorded and labeled with the outlined method. Furthermore, a brief overview and results of contour detection on images classified by the trained machine learning models are given. Moreover, a method of feature-based monocular camera distance estimation, constrained by assumptions on the viewing angle, is presented. All of these methods and algorithms fit together to produce an estimate of how biofouled the observed net is. The quality of the estimate depends on the viewing conditions while filming the cages; in good conditions the results are satisfactory and could be used by industrial fisheries in place of human labour. All the developed algorithms are fast enough that the entire process, from start to finish, takes less than one second.
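The k-means labeling step can be pictured with the short Python sketch below, which clusters pixels by colour and leaves the mapping from clusters to semantic classes to the operator. The number of clusters and the synthetic test frame are assumptions rather than details taken from the paper.

```python
# Hedged sketch of colour-based k-means pre-labelling, in the spirit of the
# labelling method described above (k=3 and the cluster-to-class mapping step
# are assumptions).
import numpy as np
from sklearn.cluster import KMeans

def kmeans_label_image(image_rgb: np.ndarray, n_clusters: int = 3) -> np.ndarray:
    """Cluster pixels by colour and return an HxW array of cluster indices.

    The operator still has to map each cluster index to a semantic class
    (e.g. water / clean net / fouled net) before training a segmentation network.
    """
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(np.float32)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(pixels)
    return km.labels_.reshape(h, w)

if __name__ == "__main__":
    # Synthetic stand-in for a fish-cage frame (real use would load a video frame).
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 256, size=(120, 160, 3), dtype=np.uint8)
    labels = kmeans_label_image(frame)
    print("cluster sizes:", np.bincount(labels.ravel()))
```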
Aquaculture net pen inspection and monitoring are important to ensure net stability and fish health in fish farms. Remotely operated vehicles (ROVs) offer a low-cost and sophisticated solution for the regular inspection of underwater fish net pens due to their visual sensing capabilities and autonomy in a challenging and dynamic aquaculture environment. In this paper, we report the integration of an ROV with a visual servoing scheme for regular inspection and tracking of the net pens. We propose a vision-based positioning scheme that consists of an object detector, a pose generator, and a closed-loop controller. The system employs a modular approach: first, two easily identifiable parallel ropes attached to the net are detected through traditional computer vision methods; second, the reference positions of the ROV relative to the net plane are extracted using a vision triangulation method; third, a closed-loop control law instructs the vehicle to traverse from top to bottom along the net plane to inspect its status. The proposed vision-based scheme has been implemented and tested both in simulations and in field experiments. The extensive experimental results allowed an assessment of the scheme's performance, which proved satisfactory and can supplement traditional aquaculture net pen inspection and tracking systems.
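A rough idea of such a rope-based visual servoing loop is given by the following Python/OpenCV sketch, which detects near-vertical line segments and converts their mean horizontal offset into a proportional sway command. The gain, thresholds, and synthetic test image are illustrative assumptions, not the paper's actual detector or control law.

```python
# Illustrative sketch only: classical line detection on the rope pattern plus a
# simple proportional correction. All numeric values are assumptions.
import cv2
import numpy as np

K_SWAY = 0.002  # hypothetical proportional gain [m/s per pixel of offset]

def rope_offset(frame_gray: np.ndarray):
    """Return mean horizontal offset (pixels) of detected vertical lines, or None."""
    edges = cv2.Canny(frame_gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=60, maxLineGap=10)
    if lines is None:
        return None
    xs = []
    for x1, y1, x2, y2 in lines[:, 0]:
        if abs(x2 - x1) < abs(y2 - y1):        # keep near-vertical segments
            xs.append(0.5 * (x1 + x2))
    if not xs:
        return None
    return float(np.mean(xs)) - frame_gray.shape[1] / 2.0

if __name__ == "__main__":
    # Synthetic frame with two bright vertical "ropes".
    img = np.zeros((240, 320), dtype=np.uint8)
    cv2.line(img, (100, 0), (100, 239), 255, 3)
    cv2.line(img, (180, 0), (180, 239), 255, 3)
    offset = rope_offset(img)
    if offset is None:
        print("no ropes detected")
    else:
        print(f"offset={offset:+.1f} px, sway command={-K_SWAY * offset:+.3f} m/s")
```

A full implementation would additionally fuse the triangulated distance to the net plane and a depth reference into the closed-loop controller; the sketch only covers the lateral image-plane error.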
There are activities in viticulture and mariculture that require extreme physical endurance from human workers, making them prime candidates for automation and robotization. This paper presents a novel, practical, heterogeneous, autonomous robotic system divided into two main parts, each dealing with respective scenarios in viticulture and mariculture. The robotic components and the subsystems that enable collaboration were developed as part of the ongoing HEKTOR project, and each specific scenario is presented. In viticulture, this includes vineyard surveillance, spraying and suckering with an all-terrain mobile manipulator (ATMM) and a lightweight autonomous aerial robot (LAAR) that can be used in very steep vineyards where other mechanization fails. In mariculture, scenarios include coordinated aerial and subsurface monitoring of fish net pens using the LAAR, an autonomous surface vehicle (ASV), and a remotely operated underwater vehicle (ROV). All robotic components communicate and coordinate their actions through the Robot Operating System (ROS). Field tests demonstrate the great capabilities of the HEKTOR system for the fully autonomous execution of very strenuous and hazardous work in viticulture and mariculture, while meeting the necessary conditions for the required quality and quantity of the work performed.
This paper deals with the development of control software for a remotely operated vehicle (ROV) as part of an automated fish cage inspection system in aquaculture. The ROV navigates autonomously around the fish cages, streaming video to a topside computer that runs the algorithms for vertical rope detection. The topside computer sends velocity references back to the ROV in real time in order to successfully complete the inspection task. These images are also used to determine the state of net biofouling with a pre-trained convolutional neural network, helping determine which nets need cleaning. A Robot Operating System (ROS) framework is developed to enable the topside computer to access the video stream from the ROV, process the images, and send back velocity references that result in a complete inspection of the fish cages. The inspection is planned by following the recognizable rope segments of the outer structure of the fish cage downwards, controlling the vehicle's yaw, heave, and depth.
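A minimal sketch of such a topside ROS loop is given below, assuming hypothetical topic names and a placeholder image-processing step; it only illustrates the subscribe-process-publish pattern described above, not the project's actual software.

```python
# Hedged ROS 1 (rospy) sketch of the topside loop: subscribe to the ROV camera
# stream, run rope detection / biofouling processing (placeholder here), and
# publish velocity references back to the vehicle. Topic names are assumptions.
import rospy
from cv_bridge import CvBridge
from geometry_msgs.msg import Twist
from sensor_msgs.msg import Image

bridge = CvBridge()
cmd_pub = None

def process_frame(frame):
    """Placeholder for rope detection and biofouling classification."""
    return 0.0, -0.1   # (yaw_rate, heave): slowly descend along the rope

def image_cb(msg: Image):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    yaw_rate, heave = process_frame(frame)
    cmd = Twist()
    cmd.angular.z = yaw_rate
    cmd.linear.z = heave
    cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("net_inspection_topside")
    cmd_pub = rospy.Publisher("/rov/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/rov/camera/image_raw", Image, image_cb, queue_size=1)
    rospy.spin()
```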
This paper presents an overview and preliminary results of the HEKTOR (Heterogeneous Autonomous Robotic System in Viticulture and Mariculture) project. A survey of applications of a heterogeneous cooperative autonomous robotic system, consisting of aerial, surface, and underwater vehicles, in mariculture scenarios is presented. Target mariculture applications of the HEKTOR robotic system are autonomous fish net cage inspection (biofouling and damage detection) as well as biomass estimation. Furthermore, a detailed description of the acquired autonomous vehicles is given, namely the unmanned aerial vehicle and the remotely operated vehicle, as well as the catamaran-shaped autonomous surface vehicle developed within the scope of the project.
This paper presents an overview and preliminary results of the HEKTOR (Heterogeneous Autonomous Robotic System in Viticulture and Mariculture) project. HEKTOR is divided into two main parts, each dealing with specific scenarios in viticulture and mariculture. The robots used in the project and each specific scenario considered are presented. In viticulture, this includes vineyard surveillance, spraying, and bud rubbing using an all-terrain mobile manipulator and an unmanned aerial vehicle (UAV). In mariculture, scenarios include coordinated monitoring of fish net cages from below the surface and from the air, using the UAV, an unmanned surface vehicle (USV), and a remotely operated underwater vehicle (ROV).
Underwater cultural heritage sites are subject to constant change, whether due to natural forces such as sediments, waves, and currents, or to human intervention. Until a few decades ago, the documentation and research of these sites was mostly done manually by diving archaeologists. This paper presents the results of integrating remote sensing technologies with autonomous marine vehicles in order to make the task of site documentation faster, more accurate, more efficient, and more precisely georeferenced. This includes the integration of multibeam sonar, side-scan sonar, and various cameras into autonomous surface and underwater vehicles, a remotely operated vehicle, and an unmanned aerial vehicle. In total, case studies for nine underwater cultural heritage sites around the Mediterranean region are presented. Each case study contains a brief archaeological background of the site, the methodology of using autonomous marine vehicles and sensors for its documentation, and the results in the form of georeferenced side-scan sonar mosaics, bathymetric models, or reconstructed photogrammetric models. It is important to mention that this was the first time any of the selected sites were documented with sonar technologies or autonomous marine vehicles. The main objective of these surveys was to document and assess the current state of the sites and to establish a basis on which future monitoring operations can be built and compared. Beyond mere documentation and physical preservation, examples of the use of these results for the digital preservation of the sites in augmented and virtual reality are presented.
Plitvice Lakes National Park is the largest national park in Croatia and, established in 1949, also the oldest. It was added to the UNESCO World Natural Heritage List in 1979 due to the unique physicochemical and biological conditions that have led to the creation of 16 named and several smaller unnamed lakes, cascading one into the next. Previous scientific research has shown that an increased amount of dissolved organic matter (pollution) stops the travertine processes in the Plitvice Lakes. Therefore, this complex, dynamic, but also fragile geological, biological, and hydrological system required a comprehensive limnological survey. Thirteen of the sixteen lakes were initially surveyed from the air by an unmanned aircraft equipped with survey-grade GNSS and a full-frame, high-resolution camera. From these recordings, a georeferenced, high-resolution orthophoto was generated, on which the subsequent multibeam sonar surveys were based. It is important to mention that this was the first time these lakes had been surveyed either with the multibeam sonar technique or with such a high-resolution camera. Because these thirteen lakes are difficult to reach and often too shallow for a boat-mounted sonar, a special autonomous surface vehicle was developed. The lakes were surveyed by the autonomous surface vehicle, equipped with a multibeam sonar, to create detailed bathymetric models. The missions for the surface vehicle were planned based on the orthophoto from the preliminary studies. A detailed description of the methodology used to survey the different lakes is given here. In addition, the resulting high-resolution bathymetric maps are presented and analysed together with an overview of average and maximum depths and the number of data points. Numerous interesting depressions, phenomena consistent with previous studies of the Plitvice Lakes, are noted at the lake beds and their causes are discussed. This study shows the huge potential of remote sensing technologies integrated into autonomous vehicles in terms of much faster surveys, several orders of magnitude more data points (compared to manual surveys of a few decades ago), and improved data accuracy, precision, and georeferencing.
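For illustration, the short Python sketch below generates a simple lawnmower (boustrophedon) waypoint pattern over a rectangular lake section, the kind of coverage plan that could be drawn on a georeferenced orthophoto. The coordinates, local frame, and line spacing are assumptions; this is not the survey team's planning software.

```python
# Illustrative coverage-planning sketch: boustrophedon waypoints over a rectangle
# in a local metric frame. Spacing would in practice be tied to sonar swath width.
from typing import List, Tuple

def lawnmower(x_min: float, x_max: float, y_min: float, y_max: float,
              spacing: float) -> List[Tuple[float, float]]:
    """Return lawnmower waypoints covering the rectangle with the given line spacing."""
    waypoints = []
    x = x_min
    going_up = True
    while x <= x_max + 1e-9:
        ys = (y_min, y_max) if going_up else (y_max, y_min)
        waypoints.extend([(x, ys[0]), (x, ys[1])])
        going_up = not going_up
        x += spacing
    return waypoints

if __name__ == "__main__":
    # Hypothetical 100 m x 60 m lake section with 10 m line spacing.
    for wp in lawnmower(0.0, 100.0, 0.0, 60.0, 10.0):
        print(f"{wp[0]:6.1f}, {wp[1]:6.1f}")
```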