Evaluation of user‐interfaces for controlling movements of virtual minimally invasive surgical instruments

Recent tele-mentoring technologies for minimally invasive surgery (MIS) augment the operative field with movements of virtual surgical instruments as visual cues. The objective of this work is to assess different user interfaces that effectively transfer the mentor's hand gestures to the movements of virtual surgical instruments.

Tele-mentoring enables the mentor to transfer realistic visual cues in the form of virtual surgical instrument motion and has emerged as an effective mode of transferring information pertaining to tool-tissue interaction. [12][13][14][15] The surgical instruments used for MIS are articulated in nature, with multiple degrees-of-freedom, and exhibit movement in three-dimensional (3D) space with constraints imposed by incision points. 16 To control these virtual surgical instruments, high degree-of-freedom (DOF) input devices are generally needed to accurately capture human hand movement in the real world and translate it to the movement of virtual surgical instruments overlaid onto the operative field. Thus, a suitable user-friendly input device is necessary to facilitate efficient acquisition of the information that the mentor wants to convey to the mentee.
In MIS tele-mentoring, the mentor demonstrates the required tool-tissue interaction to the mentee using virtual surgical instrument motions. The mentee mentally grasps these visual cues (augmented on the operative field) and performs the surgical substep as demonstrated by the mentor. The study performed by Shabir et al. 17 shows that a path (projected on a two-dimensional operative field) defined by the mentor's virtual surgical instrument movement, when compared to a predefined path, varies with a Dynamic Time Warping (DTW) distance of 1176.5 ± 331.8. Dynamic Time Warping distance is a similarity measure between two paths 18 and is used to assess the similarity between the paths defined by motions of surgical instruments. 19,20 In the operating room, when the mentee replicates the motion of the virtual instrument performed by the mentor, the average DTW distance further increases to 3195.3 ± 971.4 between the paths defined by the mentor's instrument movements and those of the mentee. Therefore, the selection of a suitable user interface is vital to reduce the error that may be introduced into the tele-mentoring system by the mentor while manipulating the virtual surgical instrument through the user interface. This ensures that the information rendered to the mentee is accurate from the mentor's side.
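The DTW comparison above can be sketched with the standard dynamic-programming recurrence. The following is a minimal illustration, not the implementation used in the cited studies; representing paths as sequences of 2D points with a Euclidean point-to-point cost is an assumption:

```python
import math

def dtw_distance(path_a, path_b):
    """Dynamic Time Warping distance between two 2D paths."""
    n, m = len(path_a), len(path_b)
    INF = float("inf")
    # D[i][j] = DTW distance between the first i points of A and first j of B
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(path_a[i - 1], path_b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Identical paths have zero DTW distance; a vertically shifted copy does not.
a = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
b = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
print(dtw_distance(a, a))  # 0.0
print(dtw_distance(a, b))  # 3.0
```

Unlike a pointwise distance, DTW aligns the two trajectories before accumulating cost, which is why it is suited to comparing instrument paths traced at different speeds.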
Several previous works have studied and compared user interfaces for tele-robotic surgery and tele-mentoring scenarios during MIS. [21][22][23] These studies included quantification of human-machine interactions via user interfaces for tele-robotic surgery, 21 comparison of user interfaces of robotic surgical platforms based on degrees-of-freedom and force feedback, 22 and perception and interpretation of the transmitted operative field video on different visualisation interfaces for tele-mentoring. 23 Although the notion of using a user interface to control virtual surgical instrument motion for tele-mentoring has been explored and demonstrated in both laparoscopic 12,13,17,24 and robotic surgery, 13 a systematic assessment of the user interfaces themselves has not been reported.

| MATERIALS AND METHODS
To evaluate the user interfaces for surgical tele-mentoring during MIS, a tele-mentoring framework proposed by Shabir et al. 13 was used.

| Interface devices used in the study
Three interfaces were used in the study: the SpaceMouse, the Oculus Rift, and the Touch Haptic Device; the Touch Haptic Device's stylus is presented in Figure 2. From a cost perspective, the SpaceMouse and the Oculus Rift cost less (in the range of $300-$700) than the Touch Haptic Device (over $1000). The higher cost of the Touch Haptic Device is due to the motors used for rendering feedback forces in the virtual environment. In this study, no feedback forces were rendered using the Touch Haptic Device.

| Interfacing algorithm
Algorithm 1, RenderInstrument, describes the rendering of the overlaid virtual surgical instrument motion controlled by the user interface device during our experiments. In the case of the robotic instrument type, the instrument has two tooltips contributing to the pinching mechanism, and the tooltips are rendered at θ3 + θ4 and θ3 − θ4. In the case of the laparoscopic instrument type, a two degree-of-freedom surgical instrument (Figure 3E) is used. It is considered a special case of the four degree-of-freedom robotic instrument type, where θ3 and θ2 are constant (180°).
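The tooltip rendering described above can be sketched as follows. Only the θ3 ± θ4 tooltip relation and the fixed 180° laparoscopic special case come from the text; the joint names, the InstrumentPose container, and the function itself are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class InstrumentPose:
    """Joint angles (degrees) of the virtual instrument; field names are illustrative."""
    theta1: float  # first articulation angle
    theta2: float  # second articulation angle (fixed for laparoscopic type)
    theta3: float  # wrist articulation (fixed for laparoscopic type)
    theta4: float  # half-opening of the pinching tooltips

def tooltip_angles(pose, laparoscopic=False):
    """Return the two rendered tooltip angles, theta3 + theta4 and theta3 - theta4.

    The laparoscopic (2-DOF) instrument type is treated as a special case of
    the 4-DOF robotic type, with theta2 and theta3 held constant at 180 degrees.
    """
    if laparoscopic:
        pose.theta2 = 180.0
        pose.theta3 = 180.0
    return (pose.theta3 + pose.theta4, pose.theta3 - pose.theta4)

robotic = InstrumentPose(theta1=10.0, theta2=30.0, theta3=45.0, theta4=15.0)
print(tooltip_angles(robotic))  # (60.0, 30.0)
```

With the laparoscopic flag set, only theta4 (the pinch opening) still affects the rendered tooltips, which is the reduced articulation the scenarios account for.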

| Setup for the user study
For the study, four surgical scenarios, namely A, B, C, and D, were simulated. The details of each scenario are presented in Table 2.
Laparoscopic tooltips have fewer degrees-of-freedom than robotic tooltips, resulting in limited articulation. This constraint was taken into consideration while designing the scenarios.

Scenario A Laparoscopic (manual) and robotic
Only the right hand was used. The participant was asked to move the virtual surgical instrument along a path using the right hand (Figure 4A). The task measured the ease of using the interface with the right hand.

Scenario B Laparoscopic (manual) and robotic
Only the left hand was used. The participant was asked to move the virtual surgical instrument along a path using the left hand. The task measured the ease of using the interface with the left hand.

Scenario C Laparoscopic (manual) and robotic
Both left and right hands were used concurrently. The participant was asked to move two virtual surgical instruments together along a path using both the left and right hands. The task measured the ease of using the interface for dexterous maneuvering of the virtual surgical instruments with both hands.

Scenario D Robotic
Both left and right hands were used consecutively. The participant was asked to orient a virtual surgical instrument (by matching it to a rendered V-shape, as shown in Figure 4B) while traversing it along a path. The tool was first traversed in one direction using the right hand and then traversed back in the opposite direction using the left hand.
| RESULTS

As shown in Figure 6, the participants were able to traverse the path with better accuracy (smaller deviations) using the Oculus Rift, compared to the other two interfaces, in all three scenarios. In Figure 7, the participants performed better using the Oculus Rift for both manual and robotic virtual surgical instruments.

Total duration: the total time taken by a participant to complete the task.

Average distance: the average distance maintained by the tool from the path, that is, the time-weighted summation of the distance of the tooltip from the curve divided by the total time to complete the task:

average distance = (Σ_i Δt_i · d_i) / T,

where Δt_i is the change in time for a given instance i, d_i is the Euclidean distance between the tooltip and the point on the path closest to the tooltip's position, and T is the total duration.
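The time-weighted average distance can be computed directly from sampled timestamps and tooltip-to-path distances. A minimal sketch follows; the sampling convention (taking d_i at the end of each interval) is an assumption:

```python
def average_distance(timestamps, distances):
    """Time-weighted average distance of the tooltip from the path:
    sum(dt_i * d_i) / total duration."""
    assert len(timestamps) == len(distances) >= 2
    total = timestamps[-1] - timestamps[0]
    weighted = 0.0
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]  # duration of this interval
        weighted += dt * distances[i]           # distance weighted by dwell time
    return weighted / total

# Tooltip 2 mm away for the first second, 4 mm for the next second.
print(average_distance([0.0, 1.0, 2.0], [2.0, 2.0, 4.0]))  # 3.0
```

Weighting by Δt makes the metric independent of the sampling rate: lingering far from the path for a long time raises the average more than briefly straying the same distance.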

Percentage of time spent in a bin: four bins of incremental size were defined (0-5, 5-10, 10-15, and >15 mm). Each bin represents a range of distances maintained by the tooltip of the virtual surgical instrument from the path. The time spent in a bin corresponds to the time the tooltip spends within the range of distances defined for that bin. The summation of the individual times in all bins gives the total duration. The percentage of time spent in a bin is the time spent in that bin with respect to the total duration.
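The binning metric can be sketched as follows; the bin edges in millimetres follow the ranges above, while the sampling convention is an assumption:

```python
def bin_percentages(timestamps, distances, edges=(0.0, 5.0, 10.0, 15.0)):
    """Percentage of task time the tooltip spends in each distance bin
    (0-5, 5-10, 10-15, and >15 mm)."""
    times = [0.0] * len(edges)  # one slot per bin; the last is the open >15 mm bin
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        d = distances[i]
        idx = len(edges) - 1            # default to the open-ended bin
        for k in range(len(edges) - 1):
            if edges[k] <= d < edges[k + 1]:
                idx = k
                break
        times[idx] += dt
    total = timestamps[-1] - timestamps[0]
    return [100.0 * t / total for t in times]

# 1 s at 3 mm, 1 s at 12 mm, 2 s at 20 mm over a 4 s task.
ts = [0.0, 1.0, 2.0, 4.0]
ds = [3.0, 3.0, 12.0, 20.0]
print(bin_percentages(ts, ds))  # [25.0, 0.0, 25.0, 50.0]
```

The four percentages sum to 100, since the per-bin times partition the total duration.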

Orientation time: the time required to orient the tooltip to match the rendered V-shape. The next V-shape along the path is rendered only when the user properly orients the virtual tooltip for the current V-shape.
While the results presented in Figures 5, 6, and 7 compare the user interfaces on the basis of two metrics (the average duration to complete the task and the average distance maintained by the tooltip from the path during the task), Figure 8 combines them with the percentage of time spent in each bin.

FIGURE 5: The average duration to complete the task and the average distance maintained by the tooltip during the task using the three user interfaces for scenario A (right hand only), B (left hand only), and C (both hands) in the case of laparoscopic (manual) virtual surgical instruments. The comparison is tagged based on the p-value using a star-type categorisation: '**' if p-value ≤ 0.05, '*' if 0.05 < p-value ≤ 0.1, and no star if p-value > 0.1.

Figure 9 shows the time required to orient the tooltip of a robotic surgical instrument to match the rendered V-shape. While the user study was tailored to assess the user interfaces for MIS tele-mentoring scenarios, future work would be geared towards assessing the user interfaces in three broader directions.

FIGURE 7: Comparison of the average duration to complete the task and the average distance maintained by the tooltip during the task for manual and robotic virtual surgical instruments. The comparison is tagged based on the p-value using the star-type categorisation above.

FIGURE 8: Percentage of time spent by the virtual surgical instrument in the four bins of incremental size: 0-5 mm, 5-10 mm, 10-15 mm, and greater than 15 mm. Each bin represents a range of distances. The shortest distance between the tooltip of the virtual surgical instrument and the path is computed and categorised into one of the four bins. The percentage of time spent in a bin is calculated as the time spent by the tooltip in the given range of distances (corresponding to the bin) with respect to the total duration of the task.

FIGURE 9: Comparison of the average orientation time taken with each interface in scenario D when using robotic tools. The comparison is tagged based on the p-value using the star-type categorisation above.

SHABIR ET AL.
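The star-type categorisation used in the figure captions is a simple threshold mapping from p-value to label; a minimal sketch:

```python
def significance_stars(p):
    """Star-type categorisation used in the figures:
    '**' if p <= 0.05, '*' if 0.05 < p <= 0.1, '' otherwise."""
    if p <= 0.05:
        return "**"
    if p <= 0.1:
        return "*"
    return ""

print([significance_stars(p) for p in (0.01, 0.07, 0.2)])  # ['**', '*', '']
```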