[Master Thesis] On-body AI-guided visualization of medical imaging volumes
02.10.2025, Bachelor's and Master's Theses
This research investigates the visualization of medical imaging volumes, such as computed tomography (CT) scans, directly on a patient's body. Using virtual reality headsets, the scan is overlaid on a virtual representation of the user's body. The goal is to give doctors and patients an easy-to-understand, visually supported way to discuss upcoming medical procedures.
The Human-Centered Computing and Extended Reality Lab of the Professorship for Machine Intelligence in Orthopedics seeks applicants for a Bachelor's or Master's thesis in the Summer Semester 2025.
Project Description
Doctors and surgeons usually talk to patients about upcoming procedures to clarify options and explain details. Research has repeatedly shown that patients often fail to understand what is communicated in these meetings, for several reasons: the situations are stressful, the topics are complex, and time and supporting technologies are lacking. This project investigates options to support patients and doctors in their communication.
The goal is to display available medical volumetric data, such as CT scans of patients' pathologies (for example, tumors or broken bones), in the anatomically correct place on the user's body using virtual reality headsets. This poses two main challenges: (A) creating a visualization environment in which these scans, along with a (stylized) body of the user, can be loaded and displayed in virtual reality; and (B) automatically positioning and aligning the medical volumes to the body. The latter could, for example, be done using AI-guided segmentation to understand what the volumes contain and where the matching counterpart is on the user's body. Preliminary work on both challenges is available at the professorship.
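Challenge (B) is at its core a registration problem. As one illustration (a sketch, not the professorship's existing pipeline), corresponding anatomical landmarks could be extracted from the segmented CT volume and from the avatar, and a rigid transform fitted between the two point sets with the Kabsch algorithm. A minimal NumPy version, where the function name and landmark inputs are hypothetical:

```python
import numpy as np

def kabsch_align(source, target):
    """Fit a rigid transform (rotation R, translation t) mapping the (N, 3)
    source landmarks onto the corresponding (N, 3) target landmarks,
    minimizing least-squares error (Kabsch algorithm)."""
    src_c = source.mean(axis=0)          # centroids
    tgt_c = target.mean(axis=0)
    A = source - src_c                   # centered point sets
    B = target - tgt_c
    H = A.T @ B                          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t
```

Applied to this project, the source points would come from an AI segmentation of the CT scan (e.g., anatomical structures as in TotalSegmentator [3]) and the target points from the avatar's matching anatomy; the resulting transform places the volume on the virtual body.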
Key Research Areas
- Investigating alignment strategies for anatomical medical data
- Developing a visualization environment for the display of virtual avatar and anatomy
- Evaluating options to capture the wearer's pose to correctly reflect it in the virtual environment
- Investigating multi-user options so the doctor and patient can experience the visualization together
Requirements
- Experience with the Unity game engine and C# programming
- Experience with mixed reality visualization and shader programming
- Experience with selecting, running, and adapting AI models to a specific task
- Interest in working with medical data
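For challenge (A), the core of direct volume rendering (cf. [4]) is sampling the volume along each viewing ray and compositing color and opacity front to back. A minimal CPU sketch in NumPy, assuming axis-aligned rays through a synthetic intensity volume and a deliberately trivial grayscale transfer function (a real implementation would run in a shader, as the requirements above suggest):

```python
import numpy as np

def composite_front_to_back(volume, opacity_scale=0.05):
    """Render a (Z, Y, X) intensity volume (values in [0, 1]) along the z axis
    using front-to-back alpha compositing, one ray per (y, x) pixel."""
    color = volume                                   # toy transfer function: gray = intensity
    alpha = np.clip(volume * opacity_scale, 0.0, 1.0)
    image = np.zeros(volume.shape[1:])
    transmittance = np.ones(volume.shape[1:])        # remaining visibility per ray
    for z in range(volume.shape[0]):                 # front (z = 0) to back
        image += transmittance * alpha[z] * color[z]
        transmittance *= 1.0 - alpha[z]              # light absorbed by this slice
    return image
```

Early ray termination (stopping once transmittance is near zero) and empty-space skipping are the standard optimizations on top of this loop when rendering full CT volumes at interactive rates.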
Please send your transcript of records, CV, and letter of motivation to: Constantin Kleinbeck (constantin.kleinbeck@tum.de), with CC to hex-thesis.ortho@mh.tum.de
Literature
[1] A. Halbig, S. K. Babu, S. Gatter, M. E. Latoschik, K. Brukamp, and S. von Mammen, “Opportunities and Challenges of Virtual Reality in Healthcare – A Domain Experts Inquiry,” Front. Virtual Real., vol. 3, p. 837616, Mar. 2022, doi: 10.3389/frvir.2022.837616.
[2] O. Kutter et al., “Real-time Volume Rendering for High Quality Visualization in Augmented Reality”.
[3] J. Wasserthal et al., “TotalSegmentator: Robust Segmentation of 104 Anatomic Structures in CT Images,” Radiology: Artificial Intelligence, vol. 5, no. 5, p. e230024, Sep. 2023, doi: 10.1148/ryai.230024.
[4] K. Engel, M. Hadwiger, J. M. Kniss, and C. Rezk-Salama, “Real-Time Volume Graphics,” 2006, Accessed: Dec. 30, 2021. [Online]. Available: https://diglib.eg.org:443/xmlui/handle/10.2312/egt.20061064.0595-0748
[5] N. Hofmann, J. Hasselgren, P. Clarberg, and J. Munkberg, “Interactive Path Tracing and Reconstruction of Sparse Volumes,” Proc. ACM Comput. Graph. Interact. Tech., vol. 4, no. 1, pp. 1–19, Apr. 2021, doi: 10.1145/3451256.
Contact: hex-thesis.ortho@mh.tum.de, constantin.kleinbeck@tum.de


