Purpose: Providing manual formative feedback to trainees in minimally invasive surgery is time-consuming and requires the observation of experts whose availability is often limited. Existing automatic assessment methods typically provide a coarse, unidimensional evaluation that lacks formative value. Method: We trained a multi-layer perceptron to establish a multiple non-linear regression mapping from a set of kinematic metrics to five scores representing six surgical technical criteria. We tested our method on two datasets: a new in-house laparoscopy dataset and a well-known robotic laparoscopy dataset. Results: Our results show that a simple deep learning model using motion metrics as inputs outperforms the state-of-the-art method based on a more elaborate end-to-end neural network. We also show that our method is suitable for both robotic and non-robotic laparoscopic skills assessment. Moreover, the assessment is formative: by rating specific technical skills, it shows students where to direct their training efforts. Conclusions: We developed a rather simple method for the automatic assessment not only of the global surgical technical level but also of specific surgical technical skills. Our method shows that descriptive motion metrics combined with a simple deep learning model are powerful enough to capture and extract high-level concepts such as surgical flow of operation or tissue handling. Such a method can simplify access to formative surgical training in both robotic and non-robotic laparoscopy.
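The architecture described above (a multi-layer perceptron performing multiple non-linear regression from kinematic metrics to per-criterion scores) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the metric names, hidden-layer size, learning rate, and the synthetic training data are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for kinematic metrics (e.g. path length, mean speed,
# motion smoothness) and per-criterion technical scores. The real work
# would use metrics computed from recorded instrument trajectories and
# expert-assigned ratings as targets.
n_samples, n_metrics, n_scores = 200, 6, 5
X = rng.normal(size=(n_samples, n_metrics))
true_W = rng.normal(size=(n_metrics, n_scores))
Y = np.tanh(X @ true_W) + 0.05 * rng.normal(size=(n_samples, n_scores))

# One-hidden-layer perceptron: a multi-output non-linear regressor.
hidden = 32
W1 = rng.normal(scale=0.1, size=(n_metrics, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=(hidden, n_scores))
b2 = np.zeros(n_scores)

def forward(X):
    H = np.tanh(X @ W1 + b1)      # hidden activations
    return H, H @ W2 + b2         # predicted scores, one per criterion

_, pred0 = forward(X)
loss_before = ((pred0 - Y) ** 2).mean()

# Plain gradient descent on the mean-squared-error loss.
lr = 0.05
for epoch in range(500):
    H, pred = forward(X)
    g_pred = 2 * (pred - Y) / pred.size       # dLoss/dpred
    gW2 = H.T @ g_pred
    gb2 = g_pred.sum(axis=0)
    g_H = (g_pred @ W2.T) * (1 - H ** 2)      # backprop through tanh
    gW1 = X.T @ g_H
    gb1 = g_H.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred1 = forward(X)
loss_after = ((pred1 - Y) ** 2).mean()
```

Each output column plays the role of one technical criterion's score, so a single trained network rates all criteria at once; in practice one would train on a held-out split and report per-criterion agreement with expert ratings.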
KEYWORDS: Medical imaging, Ultrasonography, Mobile devices, Video, Data modeling, Imaging systems, 3D image reconstruction, Medicine, Complex systems, Dielectrophoresis
Introduction: Medical imaging technology has revolutionized health care over the past 30 years. This is especially true for ultrasound, a modality that an increasing number of medical personnel are starting to use. Purpose: The purpose of this study was to develop and evaluate a platform for improving medical image interpretation skills, independent of time and place and without the need for expensive imaging equipment or a patient to scan. Methods, results and conclusions: A stable web application with the functionality needed for image interpretation training and evaluation has been implemented. The system has been extensively tested internally and was used during an international course in ultrasound-guided neurosurgery. The web application was well received and obtained very good System Usability Scale (SUS) scores.