Technische Universität München


Master Thesis: Efficient 6DoF Surgical Robotic Grasping Based on Contact Heatmap

01.05.2026, Final Theses (Bachelor's and Master's)

Project Director: Prof. Dr. Daniel Roth

Project Advisors: Shiyu Li, Victor Schaack

Valid From: 01.05.2026

Valid Until: 30.06.2026

Contact: shiyu.li@tum.de, vic.schaack@tum.de

Project Description

Surgical pick-and-place tasks of instruments remain a cost- and labor-intensive, yet cognitively trivial, endeavor in the operating room. Robotic scrub nurses can reduce the manual workload of human scrub nurses by taking over these tasks. While robotic scrub nurse concepts exist, they usually do not focus on picking up instruments for handover to a human operator.

The use of contact areas for robotic grasping remains an underexplored approach, especially in the context of surgical processes. In this thesis, the student will use a pre-recorded, proprietary dataset of real instrument-handling scenes, recorded from real medical staff during simulated surgical procedures, to develop a novel grasping policy that accounts for the subsequent handover to the human operator in its grasping pose selection. Isaac Lab will be used to build a physics simulation of the tools, robot, and operator, and to train a policy, which is then validated on a real robot using real surgical instruments.

Goals

  • Generate grasping candidates using the CAD data of surgical tools and existing grasping networks.
  • Develop a filtering algorithm to reliably select the most successful grasping candidate, weighting candidates based on contact maps of their use during real surgical procedures.
  • Validate the pipeline in an Isaac Lab-based simulation and a real-life experiment.
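A possible starting point for the candidate-weighting step in the goals above: if each grasp candidate is represented by the surface vertices it touches, and the recorded contact maps are aggregated into a per-vertex heatmap on the tool, candidates can be ranked by the mean heatmap weight at their contact points. The sketch below is illustrative only; the function name, data layout, and the choice to maximize (rather than avoid) overlap with high-contact regions are assumptions, not part of the project specification.

```python
import numpy as np

def rank_grasp_candidates(contact_heatmap, candidate_contact_ids):
    """Rank grasp candidates by the mean heatmap weight of their contacts.

    contact_heatmap: (V,) array with one weight per tool-surface vertex,
        e.g. normalized contact frequency from the recorded procedures.
    candidate_contact_ids: list of integer index arrays, one per
        candidate, giving the surface vertices that grasp touches.

    Returns candidate indices sorted best-first.
    """
    scores = np.array([contact_heatmap[ids].mean()
                       for ids in candidate_contact_ids])
    return np.argsort(-scores)  # descending by score

# Toy example: 5 surface vertices, contact concentrated near vertices 0/2/4.
heatmap = np.array([0.9, 0.1, 0.8, 0.05, 0.7])
candidates = [np.array([0, 2]), np.array([1, 3]), np.array([2, 4])]
order = rank_grasp_candidates(heatmap, candidates)
# order[0] is the candidate whose contacts best match the heatmap
```

Note that the scoring direction is a design choice: for handover, one might instead penalize overlap with regions the human operator grips, so those stay free for the receiving hand.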

Technical Requirements

  • Student in robotics, computer science, or a related field
  • Good proficiency with C/C++ and Python
  • Good proficiency with computer vision
  • Good proficiency with programming in PyTorch
  • Prior knowledge of robotic kinematics, computer vision, and deep learning

Literature

Depierre, A., Dellandréa, E., & Chen, L. (2018). Jacquard: A large-scale dataset for robotic grasp detection. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

Brahmbhatt, S., Tang, C., Twigg, C. D., Kemp, C. C., & Hays, J. (2020). ContactPose: A dataset of grasps with object contact and hand pose. European Conference on Computer Vision (ECCV).

Fang, H. S., Wang, C., Gou, M., & Lu, C. (2020). GraspNet-1Billion: A large-scale benchmark for general object grasping. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).

Vuong, A. D., Vu, M. N., Le, H., Huang, B., Binh, H. T. T., Vo, T., & Nguyen, A. (2024). Grasp-Anything: Large-scale grasp dataset from foundation models. IEEE International Conference on Robotics and Automation (ICRA).

Li, S., et al. (2025). RoboNurse-VLA: Robotic scrub nurse system based on vision-language-action model. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

Song, X., Li, Y., Zhang, Y., Liu, Y., & Jiang, L. (2025). An overview of learning-based dexterous grasping: Recent advances and future directions. Artificial Intelligence Review.

Contact: shiyu.li@tum.de, hex-thesis.ortho@mh.tum.de
