Recognition of Instrument Passing and Group Attention for Understanding Intraoperative State of Surgical Team
Koji Yokoyama, Goshiro Yamamoto, Chang Liu, Osamu Sugiyama, Luciano HO Santos, Tomohiro Kuroda
Vol. 11 (2022) p.37-47
Appropriate evaluation of the intraoperative state of a surgical team is essential for improving teamwork and hence maintaining a safe surgical environment. Traditional methods for evaluating intraoperative team states, such as interviews and self-check questionnaires for each surgical team member, require considerable human effort, are time-consuming, and can be biased by individual recall. One effective solution is to analyze surgical video and track important team activities, such as whether the members are complying with the surgical procedure or are being distracted by unexpected events. However, due to the complexity of situations in an operating room, identifying team activities without any human effort remains challenging. In this work, we propose a novel approach that automatically recognizes and quantifies intraoperative activities from surgical videos. As a first step, we focus on recognizing two activities that especially involve multiple individuals: (a) the passing of clean-packaged surgical instruments, a representative interaction between surgical technologists such as the circulating nurse and the scrub nurse, and (b) group attention that may be attracted by unexpected events. We record surgical videos as input and apply pose estimation and particle filters to extract each individual's face orientation, body orientation, and arm raises. These results, coupled with individual IDs, are then fed into an estimation model that outputs the probability of each target activity. In parallel, a person model is generated and bound to each individual, describing all of that individual's activities along the timeline. We tested our method using videos of simulated activities. The results showed that the system was able to recognize instrument passing and group attention with F1 = 0.95 and F1 = 0.66, respectively. We also implemented a system with an interface that automatically annotates intraoperative activities along the video timeline, and invited feedback from surgical technologists. The results suggest that the quantified and visualized activities can help improve understanding of the intraoperative state of the surgical team.
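
To illustrate the kind of pipeline the abstract describes, the following is a minimal Python sketch of how per-person cues (face orientation, body orientation, arm raise) bound to individual IDs might be turned into activity probabilities and accumulated into a per-person timeline. All class names, thresholds, and heuristic rules here are assumptions for illustration only; they are not the paper's actual estimation model, which is not specified in the abstract.

```python
from dataclasses import dataclass, field

@dataclass
class PersonFrame:
    """Per-frame cues for one tracked individual (hypothetical feature set)."""
    person_id: str
    face_yaw_deg: float   # face orientation in the room frame, degrees
    body_yaw_deg: float   # body orientation in the room frame, degrees
    arm_raised: bool      # whether an arm is raised (e.g., to pass an instrument)

@dataclass
class PersonModel:
    """Timeline of recognized activities bound to one individual."""
    person_id: str
    timeline: list = field(default_factory=list)  # (frame_index, activity, probability)

def instrument_passing_prob(a: PersonFrame, b: PersonFrame) -> float:
    """Heuristic (assumed) probability that a and b are passing an instrument:
    they face roughly toward each other and at least one arm is raised."""
    facing = abs(((a.face_yaw_deg - b.face_yaw_deg) % 360) - 180) < 45
    return 0.9 if facing and (a.arm_raised or b.arm_raised) else 0.1

def group_attention_prob(frames: list) -> float:
    """Heuristic (assumed) probability of group attention: face orientations
    of all tracked individuals converge within a narrow angular range."""
    yaws = [f.face_yaw_deg for f in frames]
    return 0.8 if max(yaws) - min(yaws) < 30 else 0.2

# Example frame: circulating nurse and scrub nurse facing each other, arms raised.
frame = [
    PersonFrame("circulating_nurse", face_yaw_deg=90, body_yaw_deg=95, arm_raised=True),
    PersonFrame("scrub_nurse", face_yaw_deg=270, body_yaw_deg=265, arm_raised=True),
    PersonFrame("surgeon", face_yaw_deg=180, body_yaw_deg=180, arm_raised=False),
]
print(instrument_passing_prob(frame[0], frame[1]))  # high: facing, arm raised
print(group_attention_prob(frame))                  # low: orientations diverge
```

In a full system, such probabilities would be computed per frame from pose-estimation and particle-filter outputs and appended to each PersonModel timeline, which is what the annotated-video interface would then visualize.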