An Effective Survey for Detecting and Classifying Road Damage using Deep Learning Algorithms
Abstract - Detecting road damage is essential for keeping driving conditions safe and avoiding accidents. Roads have traditionally been maintained with the help of expensive high-performance sensors. Deep learning has seen great success in recent decades and has been successfully employed to solve a range of object detection problems. Various deep-learning-based image processing algorithms can detect and categorize different types of road damage, allowing for more efficient maintenance and resource management. However, region-based object detection typically fails to discover cracks reliably and is inefficient. Furthermore, in crack-like object detection, networks face a serious data imbalance problem, which can cause training to fail. Deep-learning-based approaches are also domain-specific, resulting in poor model generalization. In this study, we survey various models for detecting and classifying real-world road images, along with their experimental results and challenges. A Bi-level Grayscale enhanced Threshold Optimal Segmentation (BGTOS) method is built and tested on several datasets to detect and classify images more quickly and effectively, enabling fully automated crack identification that requires no human intervention. Crack detection datasets of road surface photographs are used to validate the proposed approach. BGTOS is a cutting-edge method for visual object detection, localization, instance segmentation, and classification that can detect and classify various types of damage in real-world road images. The model achieves an accuracy of 97.6%, outperforming various existing models. These benefits make the proposed methodology an effective assistive method for detecting and classifying road damage.
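To make the bi-level thresholding idea behind BGTOS concrete, the sketch below implements a generic Otsu-style bi-level grayscale segmentation for crack candidates. The specific BGTOS enhancement steps are not detailed in this abstract, so this is only an illustrative baseline under the assumption that cracks appear as dark pixels on a brighter road surface; the function names (`otsu_threshold`, `segment_cracks`) are hypothetical.

```python
# Illustrative sketch only: generic bi-level (Otsu-style) threshold
# segmentation for crack candidates. BGTOS itself is not specified here;
# this shows the underlying idea of choosing the grayscale threshold that
# maximizes between-class variance, then labeling dark pixels as cracks.

def otsu_threshold(pixels):
    """Return the grayscale level (0-255) maximizing between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg = 0.0       # cumulative intensity sum of the background class
    weight_bg = 0      # number of pixels at or below the candidate threshold
    best_t, best_var = 0, -1.0
    for t in range(256):
        weight_bg += hist[t]
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        var_between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def segment_cracks(image):
    """Binarize a 2-D grayscale image: 1 = crack candidate (dark), 0 = road."""
    flat = [p for row in image for p in row]
    t = otsu_threshold(flat)
    return [[1 if p <= t else 0 for p in row] for row in image]

# Tiny synthetic example: a dark crack line across bright road pixels.
img = [
    [200, 210, 205, 198],
    [ 30,  25,  35,  28],   # crack row
    [207, 199, 212, 203],
]
mask = segment_cracks(img)
```

In the example above, the dark middle row is labeled 1 (crack candidate) and the bright rows 0. A full pipeline would, of course, add preprocessing and classification stages on top of this binarization step.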
Keywords - Region-based object detection, Bi-level Grayscale Enhanced Threshold Optimal Segmentation, BGTOS, Crack Detection, Instance Segmentation, Road Damage.