A Computational Framework for Sports Analytics

Document Type : Original Article

Authors

1 Department of Artificial Intelligence and Data Science, St. Joseph's College of Engineering

2 Department of Electrical and Communication Engineering, St. Joseph's College of Engineering

3 Department of Electrical and Communication Engineering, St. Joseph's College of Engineering

4 Department of Artificial Intelligence and Data Science, St. Joseph's College of Engineering

Abstract
This paper presents a computational framework for estimating stride rate in football using advanced computer vision and deep learning techniques, integrating modules for player detection, tracking, team identification, pitch mapping, and performance analysis. Convolutional Neural Networks (CNNs) are used for spatial feature extraction, YOLOv8 provides accurate player detection, and a Kalman filter supports robust multi-object tracking by modeling player motion as continuous trajectories in two-dimensional Euclidean space. Player kinematics are derived by computing velocity as the time derivative of position and total distance as the cumulative displacement between frames. The system incorporates AlphaPose for anatomical keypoint detection, enabling precise motion capture, and models periodic movement with sinusoidal functions to estimate stride frequency. To enhance accuracy, a Savitzky–Golay filter is applied for trajectory smoothing. Experimental evaluation on broadcast football footage demonstrates strong performance: 93.1% consistency in stride rate estimation, a 90.3% success rate, and an error margin below 2%. In addition, the integration of digital twinning technology enables real-time visualization of player movements, supporting applications in performance optimization, fatigue monitoring, and injury prevention, thereby advancing automated sports analytics through data-driven decision-making.
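The kinematics step described in the abstract — smoothing the tracked trajectory with a Savitzky–Golay filter, then taking velocity as the time derivative of position and total distance as the cumulative displacement between frames — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the function name, window length, polynomial order, and frame rate are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

def player_kinematics(positions, fps=25.0, window=11, polyorder=3):
    """Estimate per-frame speed and cumulative distance from a 2-D track.

    positions : (N, 2) array of pitch coordinates in metres, one row per frame.
    fps       : frame rate of the footage (assumed value, not from the paper).
    """
    positions = np.asarray(positions, dtype=float)
    # Savitzky-Golay smoothing of each coordinate suppresses detection
    # jitter before differentiating, as the abstract describes.
    smoothed = savgol_filter(positions, window_length=window,
                             polyorder=polyorder, axis=0)
    # Velocity as the time derivative of position (finite differences).
    dt = 1.0 / fps
    velocity = np.gradient(smoothed, dt, axis=0)      # (N, 2) in m/s
    speed = np.linalg.norm(velocity, axis=1)          # scalar speed per frame
    # Total distance as the cumulative displacement between frames.
    steps = np.linalg.norm(np.diff(smoothed, axis=0), axis=1)
    total_distance = steps.sum()
    return speed, total_distance
```

On a clean linear track this recovers the constant speed exactly, since a cubic Savitzky–Golay fit reproduces linear data; on real broadcast tracks the smoothing is what keeps the frame-to-frame displacement sum from inflating the distance estimate.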
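The abstract also models the periodic motion of pose keypoints with sinusoidal functions to obtain stride frequency. A minimal sketch of that idea, assuming one AlphaPose ankle keypoint track as input, is to take the dominant spectral peak of the (approximately sinusoidal) vertical displacement signal; the function and its parameters are hypothetical, not the paper's method in detail.

```python
import numpy as np

def stride_rate_from_keypoints(ankle_y, fps=25.0):
    """Estimate stride frequency (strides per second) from a periodic
    vertical ankle-keypoint signal, treated as roughly sinusoidal.

    ankle_y : 1-D array of vertical ankle positions per frame
              (e.g. one keypoint track; illustrative assumption).
    """
    y = np.asarray(ankle_y, dtype=float)
    y = y - y.mean()                      # remove the DC offset
    spectrum = np.abs(np.fft.rfft(y))     # magnitude spectrum
    freqs = np.fft.rfftfreq(len(y), d=1.0 / fps)
    # Under the sinusoidal-motion assumption, the dominant non-zero
    # peak of the spectrum is the stride frequency.
    peak = np.argmax(spectrum[1:]) + 1    # skip the DC bin
    return freqs[peak]
```

Multiplying the returned frequency by 60 gives strides per minute, the stride-rate unit commonly reported in gait analysis.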


Articles in Press, Accepted Manuscript
Available Online from 08 May 2026

  • Receive Date: 14 November 2025
  • Revise Date: 01 May 2026
  • Accept Date: 08 May 2026