GNV2-SLAM: vision SLAM system for cowshed inspection robots

Simultaneous Localization and Mapping (SLAM) has emerged as one of the foundational technologies enabling mobile robots to achieve autonomous navigation, garnering significant attention in recent years. To address the limitations of traditional SLAM systems operating in dynamic environments, this paper proposes a new SLAM system named GNV2-SLAM, built on ORB-SLAM2, as a solution for cowshed inspection. The system incorporates a lightweight object detection network, GNV2, based on YOLOv8, with GhostNetv2 as its backbone; the CBAM attention mechanism and the SCDown downsampling module are introduced to reduce model complexity while preserving detection accuracy. Experimental results indicate that the GNV2 network achieves strong model compression while maintaining high performance: mAP@0.5 increased by 1.04% to 95.19%, model parameters decreased by 41.95%, computational cost fell by 36.71%, and model size shrank by 40.44%. Moreover, GNV2-SLAM incorporates point and line feature extraction, effectively mitigating the loss of feature points caused by excessive dynamic targets or blurred images. Testing on the TUM dataset demonstrates that GNV2-SLAM significantly outperforms the traditional ORB-SLAM2 system in positioning accuracy and robustness in dynamic environments: the root mean square error (RMSE) of the absolute trajectory error (ATE) was reduced by 96.13%, and the translation and rotation drift of the relative pose error (RPE) decreased by 88.36% and 86.19%, respectively. In tracking evaluation, GNV2-SLAM completes the tracking of a single frame within 30 ms, demonstrating competitive real-time performance. After deploying the system on inspection robots and conducting experimental trials in a cowshed environment, the results show that at operating speeds of 0.4 m/s and 0.6 m/s the pose trajectory output by GNV2-SLAM closely matches the robot's actual movement trajectory. This study systematically validated the system's advantages in target recognition and positioning accuracy, providing a new technical solution for the full automation of cattle barn inspection tasks.
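The abstract does not spell out how GNV2's detections feed into the SLAM front end, but a common pattern in detection-assisted dynamic SLAM, and a plausible reading of the system described here, is to discard features that fall inside bounding boxes of movable objects before pose estimation. The Python sketch below illustrates only that generic idea; the `Detection` class, the `filter_dynamic_keypoints` helper, and the label set are hypothetical and not taken from the paper.

```python
# Minimal sketch (not the paper's implementation): drop keypoints that fall
# inside confident dynamic-object detections before matching/pose estimation.
from dataclasses import dataclass
from typing import List, Tuple

Keypoint = Tuple[float, float]  # (u, v) pixel coordinates

@dataclass
class Detection:
    x1: float
    y1: float
    x2: float
    y2: float
    label: str    # e.g. "cow" or "person"
    score: float  # detector confidence

# Assumed set of classes treated as movable in a cowshed scene.
DYNAMIC_LABELS = {"cow", "person"}

def in_box(kp: Keypoint, det: Detection) -> bool:
    u, v = kp
    return det.x1 <= u <= det.x2 and det.y1 <= v <= det.y2

def filter_dynamic_keypoints(keypoints: List[Keypoint],
                             detections: List[Detection],
                             min_score: float = 0.5) -> List[Keypoint]:
    """Keep only keypoints that lie outside confident dynamic-object boxes."""
    dynamic = [d for d in detections
               if d.label in DYNAMIC_LABELS and d.score >= min_score]
    return [kp for kp in keypoints
            if not any(in_box(kp, d) for d in dynamic)]

# Example: one keypoint sits inside a detected cow box, one on static background.
kps = [(120.0, 200.0), (400.0, 60.0)]
dets = [Detection(100, 150, 300, 350, "cow", 0.92)]
print(filter_dynamic_keypoints(kps, dets))  # -> [(400.0, 60.0)]
```

In such a scheme the mask is applied per frame, and only the surviving static features drive tracking; the paper's additional line features would then help keep tracking stable when few point features remain after filtering.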
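For context on the reported 96.13% reduction: in the TUM RGB-D benchmark convention, the ATE RMSE is the root mean square of the translational distances between time-aligned (and trajectory-aligned) estimated and ground-truth poses. The sketch below shows that standard computation only; `ate_rmse` is an illustrative helper, not the paper's evaluation code.

```python
# Conventional ATE RMSE over already-aligned trajectories (background sketch).
import math
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def ate_rmse(estimated: List[Vec3], ground_truth: List[Vec3]) -> float:
    """Root mean square of per-frame translational errors between
    time-aligned estimated and ground-truth positions."""
    assert len(estimated) == len(ground_truth) and estimated
    squared_errors = [
        sum((e - g) ** 2 for e, g in zip(p_est, p_gt))
        for p_est, p_gt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Toy example; a 96.13% reduction means the new RMSE is ~0.0387x the old one.
rmse = ate_rmse([(0.0, 0.0, 0.0), (1.1, 0.0, 0.0)],
                [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)])
print(rmse)  # ~0.0707 for this toy pair
```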

Related Results

Eyes on Air
Abstract We at ADNOC Logistics & Services have identified the need for a Fully Integrated Inspection and Monitoring Solution to meet our operational, safety and ...
Information metrics for localization and mapping
Decades of research have made possible the existence of several autonomous systems that successfully and efficiently navigate within a variety of environments under certain conditi...
Agricultural Robots for Harvesting and Planting
The agricultural sector is at the forefront of technological innovation, seeking sustainable solutions to address the increasing demand for food production in the face of populatio...
Parallel robots with unconventional joints to achieve under-actuation and reconfigurability
The aim of the thesis is to define, analyze, and verify through simulations and practical implementations, parallel robots with unconventional joints that allow them to be under-ac...
Dynamic SLAM: A Visual SLAM in Outdoor Dynamic Scenes
Abstract Simultaneous localization and mapping (SLAM) has been widely used in augmented reality(AR), virtual reality(VR), robotics, and autonomous vehicles as the theoretic...
Depth-aware salient object segmentation
Object segmentation is an important task which is widely employed in many computer vision applications such as object detection, tracking, recognition, and ret...
Performance Analysis of the Microsoft Kinect Sensor for 2D Simultaneous Localization and Mapping (SLAM) Techniques
This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a...
Vision-specific and psychosocial impacts of low vision among patients with low vision at the eastern regional Low Vision Centre
Purpose: To determine vision-specific and psychosocial implications of low vision among patients with low vision visiting the Low Vision Centre of the Eastern Regional Hospital in ...
