Computational Analysis of Table Tennis Matches from Real-Time Videos Using Deep Learning
Item type
Conference Contribution
Publisher
Springer
Abstract
In this paper, utilizing a multiscale training dataset, YOLOv8 demonstrates rapid inference and high accuracy in detecting visual objects, particularly small ones. This outperforms transformer-based deep learning models and makes it a leading algorithm in its domain. Typically, the efficacy of visual object detection is gauged using pre-trained models based on augmented datasets; yet for specific situations such as table tennis matches and coaching sessions, fine-tuning is essential. Challenges in these scenarios include rapid ball movement, ball color, lighting conditions, and bright reflections caused by intense illumination. In this paper, we introduce a motion-centric algorithm into the YOLOv8 model, aiming to boost the accuracy of predicting ball trajectories, landing spots, and ball velocity in the context of table tennis. Our adapted model not only enhances real-time applications in sports coaching but also shows potential for other fast-paced environments. The experimental results indicate improved detection rates and reduced false positives.
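To illustrate the motion-centric idea sketched in the abstract, the following is a minimal, hedged example (not the paper's actual implementation; function names, the frame-differencing approach, and all thresholds are assumptions) of how a simple motion mask could gate a detector's candidate boxes, suppressing static false positives such as bright reflections:

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, thresh=25):
    """Binary mask of pixels whose grayscale intensity changed by more than thresh
    between two consecutive frames (a crude stand-in for a motion model)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > thresh

def motion_score(box, mask):
    """Fraction of pixels inside box (x1, y1, x2, y2) flagged as moving."""
    x1, y1, x2, y2 = box
    region = mask[y1:y2, x1:x2]
    return float(region.mean()) if region.size else 0.0

def filter_detections(boxes, mask, min_score=0.2):
    """Keep only candidate boxes that overlap the motion mask sufficiently.
    In practice the boxes would come from a YOLOv8 forward pass."""
    return [b for b in boxes if motion_score(b, mask) >= min_score]
```

For example, a stationary highlight on the table produces a candidate box with a motion score near zero and is discarded, while a fast-moving ball changes the pixels under its box and survives the filter.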
Source
In: Image and Video Technology: 11th Pacific-Rim Symposium, PSIVT 2023, Auckland, New Zealand, November 22–24, 2023, Proceedings, edited by Yan W., pp. 69–81. Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14403).
Rights statement
This is the Author's Accepted Manuscript of a conference paper published in the Proceedings of the 11th Pacific-Rim Symposium on Image and Video Technology. This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer’s AM terms of use, but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record can be found at doi: 10.1007/978-981-97-0376-0_6
