SAM-based Structural Surface Damage Detection

Zehao Ye, Lucy Lovell, Asaad Faramarzi, Jelena Ninic*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Traditional visual inspection of infrastructure assets includes the interpretation of structural surface defects and their severity, which is a labour-intensive, time-consuming, and highly subjective task. Hence, automation is crucial for increasing productivity in infrastructure asset management while ensuring consistency. To tackle this problem, we introduce the application of the largest vision foundation model to date, the prompt-based Segment Anything Model (SAM), as a backbone for instance segmentation of surface defects. By fine-tuning its backbone with LoRA and integrating an advanced decoder, we achieved state-of-the-art performance on two datasets, covering masonry cracks and concrete cracks. We distilled this model at various points, resulting in a student model with fewer than 1/120 of the parameters of SAM's backbone, yet maintaining advanced performance. Finally, we introduced a monocular perspective transformation method on the masonry crack dataset, evaluated crack sizes across multiple dimensions, and validated the measurements against laser scanning. This research further advances automated damage detection methods.
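The abstract does not reproduce the fine-tuning details, but the core idea of adapting SAM's backbone with LoRA can be sketched as follows. This is a minimal illustrative sketch, assuming the official `segment_anything` package and a ViT-B checkpoint; the rank, target layers, checkpoint filename, and helper names are illustrative assumptions, not the paper's exact configuration or decoder.

```python
# Sketch: injecting LoRA adapters into SAM's image encoder (illustrative only).
import torch
import torch.nn as nn
from segment_anything import sam_model_registry


class LoRALinear(nn.Module):
    """Wraps a frozen nn.Linear with a trainable low-rank update: W x + (alpha/r) * B A x."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # keep the pretrained weights frozen
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)  # start as a zero update of the frozen layer
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))


def add_lora_to_sam_encoder(sam, rank: int = 8):
    """Freeze the ViT backbone and wrap each attention qkv projection with LoRA."""
    for p in sam.image_encoder.parameters():
        p.requires_grad = False
    for block in sam.image_encoder.blocks:
        block.attn.qkv = LoRALinear(block.attn.qkv, rank=rank)
    return sam


if __name__ == "__main__":
    # Hypothetical checkpoint path; the weights are distributed with the SAM repository.
    sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth")
    sam = add_lora_to_sam_encoder(sam, rank=8)
    trainable = sum(p.numel() for p in sam.parameters() if p.requires_grad)
    total = sum(p.numel() for p in sam.parameters())
    print(f"trainable: {trainable / 1e6:.1f}M of {total / 1e6:.1f}M parameters")
```

Only the low-rank adapters (and, in the paper, the replacement decoder) would be updated during training, which keeps the number of trainable parameters a small fraction of the full backbone.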
Original language: English
Title of host publication: Proceedings of the 31st International Workshop on Intelligent Computing in Engineering
Editors: Belén Riveiro, Pedro Arias
Publisher: University of Vigo
Pages: 176-185
Number of pages: 10
Publication status: Published - 3 Jul 2024
Event: 31st International Workshop on Intelligent Computing in Engineering - School of Industrial Engineering, Vigo, Spain
Duration: 1 Jul 2024 – 5 Jul 2024
https://3dgeoinfoeg-ice.webs.uvigo.es/home

Conference

Conference: 31st International Workshop on Intelligent Computing in Engineering
Abbreviated title: EG-ICE 2024
Country/Territory: Spain
City: Vigo
Period: 1/07/24 – 5/07/24
Internet address: https://3dgeoinfoeg-ice.webs.uvigo.es/home
