Abstract
For manufacturing robots equipped with 3D vision sensors, environmental interference significantly impedes the precision of edge extraction. Existing edge feature extraction methods often improve adaptability to interference at the expense of final extraction precision. This paper introduces a novel 3D visual edge detection method that achieves greater precision while maintaining adaptability, and that can address the various forms of interference found in real manufacturing scenarios. To meet this challenge, data-driven and traditional visual approaches are integrated, with deep groove edge feature extraction and guidance tasks used as a case study. An R-CNN and an improved OTSU algorithm with an adaptive threshold are combined to identify groove features. A scale-adaptive average slope sliding window algorithm is then devised to extract groove edge points, together with a corresponding continuity evaluation algorithm. Real data are used to validate the performance of the proposed method. The experimental results show an average error of 0.29 mm on interfered data, with an average maximum error of 0.54 mm, demonstrating superior overall performance and precision compared with traditional and data-driven methods.
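As background to the thresholding step named in the abstract, the sketch below shows the classic Otsu method, which selects the threshold maximizing between-class variance of a 1-D value distribution (e.g. depth or intensity values from a 3D scan). This is only the standard algorithm for illustration; the paper's improved adaptive-threshold variant is not described in the abstract, and the function name and toy data here are illustrative assumptions.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Classic Otsu thresholding: return the split value that maximizes
    between-class variance of a 1-D sample (illustrative sketch, not the
    paper's improved adaptive variant)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()       # probability mass per bin
    omega = np.cumsum(p)                      # cumulative weight of class 0
    mu = np.cumsum(p * np.arange(bins))       # cumulative (bin-index) mean
    mu_t = mu[-1]                             # global mean
    # Between-class variance for every candidate split; 0/0 bins become NaN.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    k = np.nanargmax(sigma_b)                 # bin index with max variance
    return edges[k + 1]                       # threshold in value units

# Bimodal toy data: two clusters centred at 0.2 and 0.8.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.2, 0.05, 500),
                       rng.normal(0.8, 0.05, 500)])
t = otsu_threshold(data)  # lands between the two clusters
```

On clean bimodal data the threshold falls between the modes; the adaptivity the paper adds is presumably what keeps this robust when interference flattens or skews the histogram.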
| Original language | English |
|---|---|
| Article number | 116082 |
| Number of pages | 19 |
| Journal | Sensors and Actuators A: Physical |
| Volume | 381 |
| Early online date | 22 Nov 2024 |
| DOIs | |
| Publication status | Published - 1 Jan 2025 |
Bibliographical note
Publisher Copyright: © 2024 Elsevier B.V.
Keywords
- Deep groove edge feature
- Edge feature detection
- Interference environment
- Robotic 3D visual guidance
- Visual detection and guidance framework
ASJC Scopus subject areas
- Electronic, Optical and Magnetic Materials
- Instrumentation
- Condensed Matter Physics
- Surfaces, Coatings and Films
- Metals and Alloys
- Electrical and Electronic Engineering