Tissue segmentation for workflow recognition in open inguinal hernia repair training

Elizabeth Klosa, Rebecca Hisey, Tahmina Nazari, Theo Wiggers, Boris Zevin, Tamas Ungi, Gabor Fichtinger

Research output: Chapter/Conference proceeding › Conference proceeding › Academic › peer-review

2 Citations (Scopus)

Abstract

PURPOSE: As medical education adopts competency-based training methods, experts are spending substantial amounts of time instructing and assessing trainees' competence. In this study, we aim to develop a computer-assisted training platform that can provide instruction and assessment of open inguinal hernia repairs without needing an expert observer. We propose recognizing workflow tasks based on tool-tissue interactions, which first requires a method for identifying tissues. This study aims to train a neural network to identify tissues in a low-cost phantom as we work towards identifying the tool-tissue interactions needed for task recognition.

METHODS: Eight simulated tissues were segmented throughout five videos from experienced surgeons who performed open inguinal hernia repairs on phantoms. A U-Net was trained using leave-one-user-out cross-validation. The average F-score, false positive rate, and false negative rate were calculated for each tissue to evaluate the U-Net's performance.

RESULTS: Higher F-scores and lower false negative and false positive rates were recorded for the skin, hernia sac, spermatic cord, and nerves, while slightly lower metrics were recorded for the subcutaneous tissue, Scarpa's fascia, external oblique aponeurosis, and superficial epigastric vessels.

CONCLUSION: The U-Net performed better in recognizing tissues that were relatively larger in size and more prevalent, while it struggled to recognize smaller tissues that were only briefly visible. Since workflow recognition does not require perfect segmentation, we believe our U-Net is sufficient for recognizing the tissues of an inguinal hernia repair phantom. Future studies will explore combining our segmentation U-Net with tool detection as we work towards workflow recognition.
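The evaluation described above uses standard per-class segmentation metrics. As a minimal sketch, not the authors' code, the per-tissue F-score, false positive rate, and false negative rate can be computed from predicted and ground-truth label maps as follows (the function name and the random example data are illustrative assumptions):

```python
import numpy as np

def per_class_metrics(pred, truth, class_id):
    """F-score, false positive rate, and false negative rate for one class.

    pred, truth: integer label maps of identical shape, where each pixel
    holds a tissue class index (e.g. 0 = background, 1 = skin, ...).
    """
    p = pred == class_id    # pixels predicted as this tissue
    t = truth == class_id   # ground-truth pixels for this tissue

    tp = np.sum(p & t)      # pixels correctly labeled as this tissue
    fp = np.sum(p & ~t)     # pixels wrongly labeled as this tissue
    fn = np.sum(~p & t)     # tissue pixels the model missed
    tn = np.sum(~p & ~t)    # pixels correctly labeled as other classes

    f_score = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
    fpr = fp / (fp + tn) if (fp + tn) else 0.0
    fnr = fn / (fn + tp) if (fn + tp) else 0.0
    return f_score, fpr, fnr

# Illustrative usage with random label maps for 8 tissue classes plus background:
rng = np.random.default_rng(0)
pred = rng.integers(0, 9, size=(128, 128))
truth = rng.integers(0, 9, size=(128, 128))
for c in range(1, 9):
    print(c, per_class_metrics(pred, truth, c))
```

In a leave-one-user-out protocol such as the one described, these metrics would be computed on each held-out surgeon's video and then averaged per tissue.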

Original language: English
Title of host publication: Medical Imaging 2022
Subtitle of host publication: Image-Guided Procedures, Robotic Interventions, and Modeling
Editors: Cristian A. Linte, Jeffrey H. Siewerdsen
Publisher: SPIE
Number of pages: 7
ISBN (Electronic): 9781510649439
Publication status: Published - Apr 2022
Event: Medical Imaging 2022: Image-Guided Procedures, Robotic Interventions, and Modeling - Virtual, Online
Duration: 21 Mar 2022 – 27 Mar 2022

Publication series

Series: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 12034
ISSN: 1605-7422

Conference

Conference: Medical Imaging 2022: Image-Guided Procedures, Robotic Interventions, and Modeling
City: Virtual, Online
Period: 21/03/22 – 27/03/22

Bibliographical note

Funding Information:
E. Klosa is funded by the NSERC Undergraduate Summer Research Award. This work was funded, in part, by NIH/NIBIB and NIH/NIGMS (via grant 1R01EB021396-01A1 - Slicer+PLUS: Point-of-Care Ultrasound) and by CANARIE's Research Software Program. This work was also financially supported as a Collaborative Health Research Project (CHRP #127797), a joint initiative between the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canadian Institutes of Health Research (CIHR). R. Hisey is supported by NSERC as a Canada Graduate Scholar. G. Fichtinger is supported as a Canada Research Chair in Computer-Integrated Surgery. Financial support was received from the Southeastern Ontario Academic Medical Association (SEAMO), Educational Innovation and Research Fund.

Publisher Copyright:
© 2022 SPIE.
