This paper presents a comparative analysis of three deep learning models: Densely Connected Convolutional Networks (DenseNet) for image classification, Residual Network-Long Short-Term Memory (ResNet-LSTM) for video classification, and the Vision Transformer (ViT) for image classification. The aim is to develop an underwater video classification system with an emphasis on the trade-offs between accuracy and computational efficiency. Underwater video classification plays a critical role in applications such as marine life monitoring, autonomous underwater vehicles (AUVs), and underwater exploration, where choosing an optimal model requires balancing performance against resource constraints. The three models were evaluated on the FishCLEF2015 dataset, which contains annotated images and videos of fish species, using key metrics including classification accuracy, training time, and memory usage. The findings show that DenseNet provides efficient image-based classification with moderate memory usage, making it suitable for real-time applications. The ResNet-LSTM hybrid achieves the highest classification accuracy by leveraging both spatial and temporal features, although at a higher computational cost. ViT demonstrates strong performance on image classification, particularly under challenging underwater visibility conditions, but with increased computational requirements. The study highlights the need for context-specific model selection based on application requirements for accuracy, speed, and computational resources. It further discusses potential optimizations, such as lightweight architectures and hardware accelerators, to enhance model performance in underwater environments. These findings provide valuable insights for future research and practical applications in marine exploration.
Funding
Jazan University; Saudi Arabia Cultural Bureau in the UK, London