Transformer-based Explainable AI for Biomechanical Prediction Analysis: Optimized Kinematics and Kinetics Prediction (MSc Thesis, Guided Research)
17.11.2025, Diploma, Bachelor's and Master's Theses
The Human-Centered Computing and Extended Reality Lab of the Professorship for Machine Intelligence in Orthopedics seeks applicants for a Master's thesis or Guided Research project, starting now until a suitable candidate is found.
Abstract
This project aims to analyze the contribution of different input features to biomechanical prediction models based on wearable sensor data. The focus lies on applying and evaluating explainable AI (XAI) techniques on a transformer-based architecture. By integrating post-hoc interpretability methods with a state-of-the-art transformer model [1], the project seeks to systematically quantify how individual sensor features affect the prediction of lower-body kinematics and kinetics. The overall objective is to derive insights for sensor reduction, model optimization, and improved interpretability in human motion analysis.
Background & Motivation
Accurate motion tracking is essential for biomedical modeling, clinical diagnostics, and performance monitoring. While marker-based motion capture systems remain the gold standard in terms of precision, their use is limited by high costs, controlled laboratory settings, and complex preparation procedures. Wearable sensors offer a practical and scalable alternative, enabling movement analysis in real-world environments. Inertial Measurement Units (IMUs) are widely used in modern wearable devices. Recent research demonstrates that combining IMU and electromyography (EMG) data provides strong predictive power for estimating joint kinematics and kinetics. However, achieving state-of-the-art accuracy typically requires extensive feature sets, which may limit model efficiency, interpretability, and usability. This project addresses these limitations by applying XAI methods to a transformer model to better understand feature relevance. Insights from this work can guide feature selection, reduce computational cost, and support future applications.
Student’s Tasks
- Training and evaluation of a transformer model
- Investigating XAI strategies for transformer models
- Applying these to a recently developed transformer architecture
- Evaluating the outcomes and providing suggestions for future changes
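To illustrate the kind of post-hoc, model-agnostic analysis the tasks above involve, the sketch below computes permutation feature importance per sensor channel: shuffle one channel across samples, re-run the model, and measure how much the prediction error grows. Everything here is an assumption for illustration only — the synthetic data, the channel layout, and the linear stand-in `model` take the place of the project's actual transformer and IMU/EMG dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 200 windows x 100 timesteps x 4 sensor channels
# (channel roles, e.g. IMU vs. EMG, are purely illustrative).
X = rng.normal(size=(200, 100, 4))

# Ground truth depends strongly on channel 0 and weakly on channel 2;
# channels 1 and 3 are irrelevant by construction.
y = 2.0 * X[:, :, 0].mean(axis=1) + 0.5 * X[:, :, 2].mean(axis=1)


def model(X):
    """Stand-in for a trained predictor (here it reproduces the true rule)."""
    return 2.0 * X[:, :, 0].mean(axis=1) + 0.5 * X[:, :, 2].mean(axis=1)


def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))


baseline = rmse(model(X), y)  # 0.0 for this perfect stand-in model


def channel_importance(channel):
    """Increase in RMSE after destroying one channel's information."""
    Xp = X.copy()
    # Shuffle this channel across windows (axis 0), keeping all others intact.
    Xp[:, :, channel] = rng.permutation(Xp[:, :, channel], axis=0)
    return rmse(model(Xp), y) - baseline


importances = [channel_importance(c) for c in range(X.shape[2])]
for c, imp in enumerate(importances):
    print(f"channel {c}: importance {imp:.4f}")
```

On this toy setup, channel 0 receives the largest importance, channel 2 a smaller positive one, and the irrelevant channels 1 and 3 score zero, which is exactly the kind of per-feature ranking that can inform sensor reduction. For the actual thesis, gradient-based transformer attribution methods (e.g. integrated gradients or attention-based attribution, cf. [3], [5]) would complement this model-agnostic baseline.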
Requirements
- Python, including libraries such as NumPy, Pandas, PyTorch, and TensorFlow
- Explainable AI methods
- Interest in optimization, explainability and development of machine learning models
- Interest in motion analysis
Please send your transcript of records, CV, and motivation letter to daniel.homm@tum.de, with CC to hex-thesis.ortho@mh.tum.de.
You can find more information and other topics for theses on our website: https://hex-lab.io
Literature
[1] Daryakenari, F. H., & Farizeh, T. (2025). A novel transformer-based method for full lower-limb joint angles and moments prediction in gait using sEMG and IMU data. doi:10.48550/ARXIV.2506.04577
[2] Chotikunnan, P., Khotakham, W., Chotikunnan, R., Roongprasert, K., Pititheeraphab, Y., Puttasakul, T., … Thongpance, N. (2025). Enhanced angle estimation using optimized artificial neural networks with temporal averaging in IMU-based motion tracking. Journal of Robotics and Control (JRC), 6(2), 1069–1082. doi:10.18196/jrc.v6i2.26345
[3] Xiang, L., Gao, Z., Yu, P., Fernandez, J., Gu, Y., Wang, R., & Gutierrez-Farewik, E. M. (2025). Explainable artificial intelligence for gait analysis: advances, pitfalls, and challenges - a systematic review. Frontiers in Bioengineering and Biotechnology, 13(1671344). doi:10.3389/fbioe.2025.1671344
[4] Kalasampath, K., Spoorthi, K. N., Sajeev, S., Kuppa, S. S., Ajay, K., & Maruthamuthu, A. (2025). A literature review on applications of explainable artificial intelligence (XAI). IEEE Access: Practical Innovations, Open Solutions, 13, 41111–41140. doi:10.1109/access.2025.3546681
[5] Cheng, Q., Xing, J., Xue, C., & Yang, X. (2025). Unifying prediction and explanation in time-series transformers via Shapley-based pretraining. doi:10.48550/ARXIV.2501.15070
Contact: daniel.homm@tum.de, hex-thesis.ortho@mh.tum.de
More Information: Thesis-XAI, "Explainable AI in Transformer-based motion analysis" (PDF, 128.1 kB)