Abstract
PURPOSE: As medical education adopts competency-based training, experts are spending substantial amounts of time instructing trainees and assessing their competence. In this study, we seek to develop a computer-assisted training platform that can provide instruction and assessment of open inguinal hernia repairs without needing an expert observer. We propose recognizing workflow tasks based on tool-tissue interactions, which first requires a method to identify tissues. This study aims to train a neural network to identify tissues in a low-cost phantom as we work towards identifying the tool-tissue interactions needed for task recognition. METHODS: Eight simulated tissues were segmented throughout five videos of experienced surgeons performing open inguinal hernia repairs on phantoms. A U-Net was trained using leave-one-user-out cross-validation. The average F-score, false positive rate, and false negative rate were calculated for each tissue to evaluate the U-Net's performance. RESULTS: Higher F-scores and lower false positive and false negative rates were recorded for the skin, hernia sac, spermatic cord, and nerves, while slightly lower metrics were recorded for the subcutaneous tissue, Scarpa's fascia, external oblique aponeurosis, and superficial epigastric vessels. CONCLUSION: The U-Net performed better at recognizing tissues that were relatively larger and more prevalent, while struggling to recognize smaller tissues that were only briefly visible. Since workflow recognition does not require perfect segmentation, we believe our U-Net is sufficient for recognizing the tissues of an inguinal hernia repair phantom. Future studies will explore combining our segmentation U-Net with tool detection as we work towards workflow recognition.
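The paper's implementation is not reproduced here, so the following is only a minimal sketch of the evaluation procedure the abstract describes: leave-one-user-out splits over the surgeons' videos, and per-tissue F-score, false positive rate, and false negative rate computed from per-pixel confusion counts. All identifiers (`TISSUE_CLASSES`, `videos_by_surgeon`, the function names) are illustrative assumptions, not the authors' code.

```python
import numpy as np

# The eight simulated tissues named in the abstract; the ordering here
# (class index -> tissue) is an assumption for illustration only.
TISSUE_CLASSES = [
    "skin", "subcutaneous tissue", "Scarpa's fascia",
    "external oblique aponeurosis", "superficial epigastric vessels",
    "spermatic cord", "hernia sac", "nerves",
]


def leave_one_user_out_splits(videos_by_surgeon):
    """Yield (train_videos, test_videos) pairs, holding out one surgeon's
    video(s) per fold, as in leave-one-user-out cross-validation."""
    for held_out in sorted(videos_by_surgeon):
        train = [v for s, vids in sorted(videos_by_surgeon.items())
                 if s != held_out for v in vids]
        yield train, list(videos_by_surgeon[held_out])


def per_class_metrics(pred, label, num_classes=len(TISSUE_CLASSES)):
    """Per-tissue F-score, FPR, and FNR from two same-shaped integer
    arrays of per-pixel class indices (prediction vs. ground truth)."""
    results = {}
    for c in range(num_classes):
        tp = np.sum((pred == c) & (label == c))
        fp = np.sum((pred == c) & (label != c))
        fn = np.sum((pred != c) & (label == c))
        tn = np.sum((pred != c) & (label != c))
        f_score = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 0.0
        fpr = fp / (fp + tn) if (fp + tn) else 0.0
        fnr = fn / (fn + tp) if (fn + tp) else 0.0
        results[TISSUE_CLASSES[c]] = {
            "f_score": f_score, "fpr": fpr, "fnr": fnr,
        }
    return results
```

Given per-pixel class-index maps from a trained U-Net, calling `per_class_metrics(pred, label)` on each test frame and averaging over the held-out videos would yield per-tissue averages of the kind reported in the abstract.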
Original language | English |
---|---|
Title of host publication | Medical Imaging 2022 |
Subtitle of host publication | Image-Guided Procedures, Robotic Interventions, and Modeling |
Editors | Cristian A. Linte, Jeffrey H. Siewerdsen |
Publisher | SPIE |
Number of pages | 7 |
ISBN (Electronic) | 9781510649439 |
DOIs | |
Publication status | Published - Apr 2022 |
Event | Medical Imaging 2022: Image-Guided Procedures, Robotic Interventions, and Modeling - Virtual, Online
Duration | 21 Mar 2022 → 27 Mar 2022
Publication series
Series | Progress in Biomedical Optics and Imaging - Proceedings of SPIE |
---|---|
Volume | 12034 |
ISSN | 1605-7422 |
Conference
Conference | Medical Imaging 2022: Image-Guided Procedures, Robotic Interventions, and Modeling |
---|---|
City | Virtual, Online |
Period | 21/03/22 → 27/03/22 |
Bibliographical note
Funding Information: E. Klosa is funded by the NSERC Undergraduate Summer Research Award. This work was funded, in part, by NIH/NIBIB and NIH/NIGMS (via grant 1R01EB021396-01A1 - Slicer+PLUS: Point-of-Care Ultrasound) and by CANARIE's Research Software Program. This work was also financially supported as a Collaborative Health Research Project (CHRP #127797), a joint initiative between the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canadian Institutes of Health Research (CIHR). R. Hisey is supported by NSERC as a Canada Graduate Scholar. G. Fichtinger is supported as a Canada Research Chair in Computer-Integrated Surgery. Financial support was received from the Southeastern Ontario Academic Medical Association (SEAMO), Educational Innovation and Research Fund.
Publisher Copyright:
© 2022 SPIE.