Image anomaly detection and prediction scheme based on SSA optimized ResNet50-BiGRU model

Authors

  • Qianhui Wan — Department of Mathematics, University of California Davis, Davis, California, 95616, United States
  • Zecheng Zhang — New York University, Brooklyn, New York, 11201, United States
  • Liheng Jiang — New York University, Manhattan, New York, 10012, United States
  • Zhaoqi Wang — Viterbi School of Engineering, University of Southern California, Los Angeles, California, 90089, United States
  • Yan Zhou — Northeastern University San Jose, San Jose, California, 95131, United States

Keywords:

ResNet50, BiGRU, SSA, Anomaly Detection, Damage Analysis

Abstract

Image anomaly detection is an active research direction, with many methods emerging in recent years thanks to rapid advances in computing. The use of artificial intelligence for image anomaly detection has been widely studied. By analyzing images of athlete posture and movement, it is possible to predict injury status and suggest necessary adjustments. Most existing methods rely on convolutional networks that extract features directly from raw pixels, where irrelevant pixel information limits model accuracy. This paper introduces a network combining a Residual Network (ResNet) and a Bidirectional Gated Recurrent Unit (BiGRU), which can predict potential injury types and provide early warnings by analyzing changes in muscle and bone poses across video frames. To address the high complexity of this network, the Sparrow Search Algorithm (SSA) was used for optimization. Experiments on four datasets showed that our model achieves the smallest image anomaly detection error among the compared models, demonstrating strong adaptability. This provides a new approach for anomaly detection and predictive analysis in images, contributing to the sustainable development of human health and performance.

Published

2024-07-18

How to Cite

Image anomaly detection and prediction scheme based on SSA optimized ResNet50-BiGRU model. (2024). Journal of Intelligence Technology and Innovation, 2(2), 35-52. https://itip-submit.com/index.php/JITI/article/view/JITI-51-DOI-008.pdf