Published on May 3, 2019, in Vol 21, No 5 (2019): May

Preprints (earlier versions) of this paper are available; first published August 12, 2018.


    Use of Commercial Off-The-Shelf Devices for the Detection of Manual Gestures in Surgery: Systematic Literature Review

    1Faculty of Health Sciences, Universitat Oberta de Catalunya, Barcelona, Spain

    2Faculty of Health Sciences, Universidad de Manizales, Caldas, Colombia

    3Faculty of Psychology and Education Sciences, Universitat Oberta de Catalunya, Barcelona, Spain

    *all authors contributed equally

    Corresponding Author:

    Francesc Saigí-Rubió, PhD

    Faculty of Health Sciences

    Universitat Oberta de Catalunya

    Avinguda del Tibidabo 39-43

    Barcelona, 08035

    Spain

    Phone: 34 933263622



    Background: The increasingly pervasive presence of technology in the operating room raises the need to study the interaction between the surgeon and the computer system. A new generation of tools known as commercial off-the-shelf (COTS) devices, which enable touchless gesture–based human-computer interaction, is currently being explored as a solution in surgical environments.

    Objective: The aim of this systematic literature review was to provide an account of the state of the art of COTS devices in the detection of manual gestures in surgery and to identify their use as a simulation tool for motor skills teaching in minimally invasive surgery (MIS).

    Methods: For this systematic literature review, a search was conducted in PubMed, Excerpta Medica dataBASE, ScienceDirect, Espacenet, OpenGrey, and the Institute of Electrical and Electronics Engineers databases. Articles published between January 2000 and December 2017 on the use of COTS devices for gesture detection in surgical environments and in simulation for surgical skills learning in MIS were evaluated and selected.

    Results: A total of 3180 studies were identified, 86 of which met the search selection criteria. Microsoft Kinect (Microsoft Corp) and the Leap Motion Controller (Leap Motion Inc) were the most widely used COTS devices. The most common intervention was image manipulation in surgical and interventional radiology environments, followed by interaction with virtual reality environments for educational or interventional purposes. The possibility of using this technology to develop portable low-cost simulators for skills learning in MIS was also examined. As most of the articles identified in this systematic review were proof-of-concept or prototype user testing and feasibility testing studies, we concluded that the field was still in the exploratory phase in areas requiring touchless manipulation within environments and settings that must adhere to asepsis and antisepsis protocols, such as angiography suites and operating rooms.

    Conclusions: COTS devices applied to hand and instrument gesture–based interfaces in the field of simulation for skills learning and training in MIS could open up a promising field to achieve ubiquitous training and presurgical warm up.

    J Med Internet Res 2019;21(5):e11925





    The increasingly pervasive presence of technology in the operating room raises the need to study the interaction between the surgeon and the computer system. In sterile environments, using the hand to operate a mouse, keyboard, or touchscreen is unacceptable, as it alters the normal pace of surgery and breaks asepsis and antisepsis protocols [1-6]. Using a physical barrier between the surgeon’s gloves and the interaction device [7], or using the foot for manipulation, is not a practical solution either, as neither allows fine interaction and both carry risks of contamination [8]. Moreover, relying on another person to manipulate images in accordance with the surgeon’s verbal instructions has proven difficult and is prone to misunderstandings when the visualization of specific areas of an image is requested [9,10].

    Early solutions to circumvent any contact between the surgeon and the computer were based on voice recognition, such as the Automated Endoscopic System for Optimal Positioning (AESOP) and HERMES (Stryker Europe) [11,12], but these systems were impractical, as they were difficult to use when performing complex tasks [13]. Natural user interfaces were first developed in the 1990s to enable interaction with the computer through natural human movements, for example, to manipulate radiological images in sterile surgical environments [14]. Gesture-based interfaces were another variant [15]. These enabled touchless manipulation and held great promise as a viable solution in operating rooms and autopsy suites [10,16-19]. However, they could not be employed in sterile environments, as they required some contact when gloves or position sensors were used [20-24].

    Early attempts to use touchless gestures in minimally invasive surgery (MIS) involved hand and facial gestures [9,25]. Gesture recognition systems with Web and video cameras were later described [26,27] using the time-of-flight principle [28] and achieving interaction with the OsiriX viewer [17,29]. However, these systems were very expensive and inaccurate and required calibration and a complex setup, making them impractical for use in the operating room [30].

    A new generation of tools known as commercial off-the-shelf (COTS) devices, enabling touchless gesture–based human-computer interaction, is currently being explored as a solution in surgical environments. The term COTS refers to a device that can be taken from a shelf, that is, sold over the counter. In addition to being low-cost, wireless, and ergonomic, these devices facilitate real-time interactivity and allow the user to point to and manipulate objects with 6 degrees of freedom [31]. Hansen et al described the use of the Wii Remote (Nintendo) for the intraoperative modification of resection planes in liver surgery [32], whereas Gallo et al used it for pointing to and manipulating 3-dimensional (3D) medical data in a number of ways [31,33-36]. However, intraoperative manipulation of the device required it to be wrapped in a sterile bag, thus negating its contactless nature. In November 2010, the Microsoft Kinect (MK) 3D depth camera system (Microsoft Corp) was launched as a device for the Xbox 360 games console. The first descriptions of MK for medical use related to physical and cognitive rehabilitation [37]. Subsequent experiences in this field showed that additional studies were required on issues such as effectiveness, engagement, and usability [38-40]. Its use in an operating room was first reported in 2011, at Sunnybrook Hospital in Toronto, where it was used to view magnetic resonance imaging and computed tomography scans, eventually giving rise to the GestSure system [13]. In 2012, the Leap Motion Controller (LMC; Leap Motion Inc) was launched, followed in July 2013 by the Myo armband (Thalmic Labs).

    Construct validity [41,42], concurrent validity [43,44], and predictive validity [45,46] studies, as well as systematic reviews [47,48], have shown that simulation in virtual reality environments is an effective tool for motor skills learning in MIS. However, the high cost of virtual reality and augmented reality simulators calls for the development of new, portable low-cost solutions enabling ubiquitous learning. New COTS technologies that allow hand gestures and instrument movements to be detected open up an interesting field of exploration for the development and validation of new simulation models in virtual environments. One of the objectives of this systematic review was to identify developments in this area.


    The aim of this systematic review was to provide an account of the state of the art of COTS devices in the detection of manual gestures in surgery and to identify their use as a simulation tool for motor skills teaching in MIS.


    Article Retrieval

    A search was conducted in the electronic databases PubMed, Excerpta Medica database (EMBASE), ScienceDirect, Espacenet, OpenGrey, and the Institute of Electrical and Electronics Engineers (IEEE) for articles published between January 2000 and December 2017, using combinations of the following Medical Subject Headings (MeSH) terms: surgery, computer simulation, simulation training, laparoscopy, minimally invasive surgical procedures, robotic surgical procedures, and virtual reality. The following were used as free terms: commercial off-the-shelf, COTS, surgical education, surgical simulation, Wii, Microsoft Kinect, Xbox Kinect, Leap Motion, Leap Motion Controller, Myo armband, and gesture control. The search strategy used a combination of MeSH terms and free terms. Boolean operators (AND and OR) were used to expand, exclude, or join keywords in the search. The devised strategy was applied first to PubMed and then to the remaining databases.

    The search was limited to English-language publications and was complemented using the snowballing technique to identify relevant articles in the references of articles returned by our search [49]. A manual search was also conducted on the indices of the following publications: Surgical Endoscopy, Surgical Innovation, Minimally Invasive Therapy and Allied Technologies, the Journal of Medical Internet Research, and the Journal of Surgical Education. The snowballing search and the manual reviews enabled the retrieval of conference proceedings, letters to the editor, and simple concept descriptions. A MeaSurement Tool to Assess systematic Reviews (AMSTAR) [50] and Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) [51] checklists were used to ensure the quality of the review. In total, 3 authors assessed the risk of bias. Disagreement on bias assessment and the interpretation of results was resolved by consensus discussions.

    Study Selection

    A total of 3180 studies were identified, and the abstracts were reviewed to determine whether they met the inclusion and exclusion criteria. The inclusion criteria were (1) original research articles, (2) proof-of-concept or prototype user testing and feasibility testing studies, (3) studies conducted in surgical environments (preoperative, intraoperative, or postoperative), and (4) studies carried out in real or simulated surgical settings. The exclusion criteria were (1) studies on COTS devices requiring hand contact, (2) studies conducted in nonsurgical clinical environments, and (3) studies on the technical description of devices that did not include criteria of clinical usability, feasibility, or acceptance as an outcome. Studies on COTS devices requiring hand contact (ie, Wii) were excluded from the analysis. After the first review of the titles and abstracts, 361 studies were selected, 220 of which corresponded to the Wii device and were therefore discarded. Of the 141 remaining articles, 55 were duplicate references. After reading the full texts of these studies, 86 were deemed to have met the search selection criteria. The search and selection processes are summarized in Figure 1.
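The selection counts reported above can be checked arithmetically. A minimal sketch (variable names are ours, not taken from the paper):

```python
# Sanity check of the PRISMA-style selection flow described in the text.
identified = 3180        # records identified across all databases
after_screening = 361    # retained after title/abstract review
wii_discarded = 220      # Wii studies excluded (hand contact required)
remaining = after_screening - wii_discarded
duplicates = 55          # duplicate references among the remainder
included = remaining - duplicates

print(remaining)  # 141
print(included)   # 86
```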

    Figure 1. Flow diagram of studies through the review.

    We used a standardized form for data extraction, which included the following items: study, device on which the study was conducted, year of publication, aim, type of study, intervention, metrics, sample, results and conclusions, clinical areas in which the study was conducted, and types of surgical intervention (Tables 1-4; see Multimedia Appendices 1-3 for the full Tables 1-3), as well as the use of gesture-based COTS devices in surgery (Table 5). In total, 2 authors (FAL and MM) screened all the articles individually. Discrepancies were resolved through discussion with the senior author (FSR) whenever necessary. All the data were analyzed qualitatively and quantitatively.


    Of the 86 articles identified, 43 (50%) were on MK, 31 (36%) were on the LMC, 2 compared MK with the LMC [77,113], 1 compared the LMC with the Myo armband [58], 1 compared MK with the LMC and the Myo armband [52], 6 (7%) were on web, video, or commercial cameras, and 2 reviewed gesture interaction in general [59,65]. The data and detailed information on the studies reviewed are shown in Tables 1-3 (see Multimedia Appendices 1-3 for the full Tables 1-3). The results are organized by the type of COTS device used (Tables 1-3; see Multimedia Appendices 1-3 for the full Tables 1-3), by the surgical specialties in which COTS devices were used (Table 4), and by the type of use made of COTS devices in surgery, including simulation for motor skills learning (Table 5).

    Table 1. Summary of included studies evaluating Microsoft Kinect.
    Table 2. Summary of included studies evaluating the Leap Motion Controller.
    Table 3. Summary of included studies evaluating other devices.
    Table 4. Clinical areas and types of surgical intervention in which gesture-based commercial off-the-shelf devices were used.
    Table 5. Use of gesture-based commercial off-the-shelf devices in surgery.

    Aims, Types of Study, Metrics, Samples, Results and Conclusions

    In 78% (67/86) of the articles, the aim was to develop, create, present, describe, propose, examine, or explore a COTS-based system for gesture recognition in surgery. Most of the articles [65] identified in this systematic review were proof-of-concept or prototype user testing and observational and feasibility testing studies (Tables 1-3, see Multimedia Appendices 1-3 for the full Tables 1-3). In the 5 ethnographic studies included, the aim was to identify interactions between the staff and gesture-based COTS systems in interventional radiology departments or in the operating room [19,59,65,78,114]. In 4 studies, the aim was to compare the performance of MK with that of a mouse [5,79,80,96]; in 1 study, it was to compare the performance of the LMC with that of a mouse [81]; and in 4 studies, it was to compare different COTS devices [52,58,77,113]. In 10 studies, the aim was to evaluate face validity [97,120], content validity [97], construct validity [66,110,111,120,121,126,127,132], or concurrent validity of the devices [66,71,121,126]. A total of 7 studies involved experiments [19,26,113,115,122,123,131] and there was 1 patent application for an LMC-based application [124] and 1 interrater reliability study [72]. In addition, 1 study was a quasi-experimental prospective, blinded study with test-retest reliability [121]. Only 2 randomized controlled trials were identified [80,98], and when a tool for assessing risk of bias in randomized trials [133] was applied to them, it was found to be low in both.

    In total, 25 out of 86 (29%) articles failed to describe the metric used, whereas 23 out of 86 (27%) used time as the main metric. Given the varied nature of the study designs, the remaining 38 articles described multiple metrics, such as performance rates, percentage of gesture recognition, accuracy of gesture recognition and/or the speed of its transmission, measures of volume or distance, and questionnaires or interviews. Similarly, the sample types and sizes were highly variable: 17.4% of the articles did not describe the sample type, and the remainder stated that the samples comprised medical or veterinary students or specialists in several radiological or surgical specialties (Table 4).


    The most common intervention (42 studies) was image manipulation in general radiology, ultrasound imaging, interventional radiology, angiography, computed tomography, magnetic resonance imaging, and real-time elastography (in the operating room, in the operative dentistry setting, or in the interventional radiology suites; Tables 1-3; see Multimedia Appendices 1-3 for the full Tables 1-3). Table 5 shows other uses identified for gesture-based COTS devices in surgical environments.

    Use of Commercial Off-The-Shelf Devices as Simulation Tools for Motor Skills Teaching in Minimally Invasive Surgery

    In the field of skills learning in MIS, in 2013, Pérez et al first described the tracking of laparoscopic instruments using webcams, with encouraging results [122]. From 2016, several authors proposed the interesting possibility of using COTS devices for tracking laparoscopic instruments. Such devices include both the LMC [108,121,123,124] and MK [125]. In 2017, a portable low-cost simulator using the LMC [120] for basic motor skills learning in MIS was described, and so too were a simulator for endoscopic third ventriculostomy learning [66] and a head-mounted display system using Oculus Rift and the LMC to guide neuroendoscopic surgery by manipulating 3D images [70]. Others used the approach of tracking hand movements during MIS training [109,126]. Only 1 study explored the use of the LMC to assess surgical dexterity in tying surgical knots in open surgery [127].

    Furthermore, 1 study compared 3 natural user interfaces (MK, the LMC, and the Myo armband) in combination with voice control to perform 2 hepatectomies and 2 partial nephrectomies on an experimental porcine model [52]; similar to the studies by Wright [66] and Xu [70], this study used 3D reconstructions of preoperative images of the patient, which were manipulated by gestures during surgery. However, in these cases, gesture control technology was applied not for training purposes but for surgical assistance and planning.


    Principal Findings

    Using commercial devices to detect manual gestures in surgery is a very topical issue, given the need to manipulate medical images and real-time 3D reconstructions during procedures without breaking asepsis and antisepsis protocols. Early studies published on this possibility used COTS systems with webcams, complementary metal-oxide-semiconductor (CMOS) sensor cameras, and commercial digital cameras [26,27,53,82]. These pioneering studies showed that contactless interaction with images and medical information in environments such as operating rooms was possible using low-cost devices.

    In this systematic review, MK and the LMC were identified as the most widely used COTS systems. MK was rated as a useful tool for the manipulation of medical data in sterile environments, with a positive rate of acceptance in 85% (39/46) of the studies on it. The LMC had a positive rate of acceptance in 83% (29/35) of the studies on it. The Myo armband was used to manipulate interventional neuroradiology images [58]. In addition, in a comparative study of the Myo armband, MK, and the LMC, they were used to manipulate images while hepatectomies and partial nephrectomies were being performed on an animal model [52]. In both cases, the device was rated highly. The main positive characteristics identified for the devices were the following: there was no need for contact; they were low-cost and portable; there was no need for calibration at the time of use; the gesture learning curve was easy; and the gesture recognition rates were high.

    Performance of Individual Devices

    MK [30] and the LMC [14,81,87,134,135] both use infrared cameras. The MK system is based on the time-of-flight principle [61], whereas the LMC is based on an infrared optical tracking sensor with stereo vision accuracy. The MK depth sensor works at a distance of between 0.8 m and 3.5 m, and the interface tracks the skeleton of the system operator. The wide range of distances at which the device recognizes gestures presents problems when it is used for close interaction. The LMC detects the positions of fine objects, such as fingertips or pen tips, in a Cartesian coordinate system. Its interaction zone is an inverted cone of approximately 0.23 m³, and its motion detection range fluctuates between 20 mm and 600 mm [91,129]. The manufacturer reports an accuracy of 0.01 mm for fingertip detection, although 1 study showed an accuracy of 0.7 mm, which is still considered superior to that achieved using MK [134,136]. The MK device measures 280 mm (width) × 71 mm (depth) × 66 mm (height) and weighs 556 g, whereas the LMC measures 76 mm (width) × 30 mm (depth) × 13 mm (height) and weighs 45 g.
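The working ranges quoted above explain why the two devices suit different interaction distances. A minimal sketch (these helper functions are ours, for illustration only; they are not part of either vendor's SDK):

```python
# Illustrative range checks using the distances reported in the text.
# All values are in millimetres.

def in_kinect_range(depth_mm: float) -> bool:
    """MK depth sensor operates between roughly 0.8 m and 3.5 m."""
    return 800.0 <= depth_mm <= 3500.0

def in_lmc_range(distance_mm: float) -> bool:
    """LMC motion detection fluctuates between about 20 mm and 600 mm."""
    return 20.0 <= distance_mm <= 600.0

# A hand 30 cm from the sensor is inside the LMC's zone but too close for MK,
# which is why the LMC suits close-range fingertip interaction and MK suits
# whole-body skeleton tracking.
print(in_lmc_range(300))     # True
print(in_kinect_range(300))  # False
```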

    Only 5 of the 46 (11%) studies that evaluated MK identified disadvantages relating to a longer latency time, difficulty in recreating an image when compared with a keyboard or mouse [5], limited gesture recognition, interference between the movements of different people in small environments [85,89,130], and the users’ preference for a mouse in a comparative study [96]. Various studies have highlighted the inaccuracy of MK in detecting finger movements [5,17,85,137], and the system also requires the use of large format screens [14,24,54,85,90]. The system was taken off the market in October 2017.

    With regard to the LMC, once the 6 studies on robotics had been discarded, 4 articles were identified that presented limitations derived from using the device (18%). These studies noted alterations in performance when there was dirt on the surface of the device, as well as the limited number of gestures recognized owing to the occlusion phenomenon [87], alterations caused by ambient lighting [129], fatigue in some users [90], and a lack of studies validating the device for medical use [77].

    The Myo armband was launched in 2013. This wearable wireless device records electromyography via 8 stainless steel dry surface electrodes. It has a 9-axis inertial measurement unit, haptic feedback, and Bluetooth communication capability. Its main disadvantage is its limited sampling frequency of 200 Hz [138-140]. In total, 2 studies on the Myo armband were identified. The first concluded that the combination of the Myo armband and voice commands provided the most intuitive and accurate natural user interface [141]. The second compared the Myo armband and the LMC with traditional image manipulation methods in surgery and concluded that the new input modalities had the potential to become more efficient than the traditional ones [58].

    Commercial Off-The-Shelf Devices in Robotic Surgery

    Studies on the application of gesture-based COTS devices in robot-assisted surgery failed to demonstrate usefulness, owing to either the high cost of the robotic arm when using commercial cameras in surgical instrumentation [115] or, in the case of the LMC, the need for a more robust Application Programming Interface [116,117] and the lack of sufficient accuracy and robustness for manipulating a medical robot [113]. However, an ethnographic study found that MK was useful for workflow monitoring and for avoiding collisions between medical robots and operating room staff [114]. A simulation study of endonasal pituitary surgery comparing the LMC with the Phantom Omni showed that surgeons achieved a very similar percentage of tumor mass resection and procedure duration using the LMC to control the robot [118]. Another study found that the robotic tools could be controlled by gestures for training purposes but that the level of control had yet to reach that of a contact-based robotic controller [119].

    Commercial Off-The-Shelf Devices in Training and Simulation

    Studies on the use of COTS devices for hand gesture–based interfaces in the field of surgical education refer to the use of virtual reality and augmented reality for teaching anatomy or for providing an immersive experience within a virtual operating room. A total of 3 studies explored the possibility of using MK as a simulation-based tool for skills learning in bronchoscopy and colonoscopy [110-112].

    Various authors explored the possibility of hand tracking [109,126] or instrument tracking [108,121-125] using COTS devices to assess performance in MIS training. Building on these 2 approaches, Lahanas [120] eventually presented a portable low-cost virtual reality simulator for basic motor skills learning in MIS, which was based on the LMC and capable of tracking instruments, together with face and construct validity studies. The forceps tracking problems originally noted by the author were probably due to the instruments’ black color. Problems caused by this color were also described in the study by Oropesa. This issue had already been raised by our group [108].

    In the field of simulation for robotic surgery learning, the first studies published [113,115-117] found that the interfaces did not allow robots to be manipulated by gestures. However, the most recent publications [118,119] have suggested that the LMC could be a low-cost solution for creating control interfaces for surgical robots for the purposes of performing operations or training by means of simulation.

    Ethnographic Studies

    Ethnographic studies [59,65,78,83,114] deserve a separate mention as they transcend proofs-of-concept and user and prototype testing and approach gesture-based touchless interaction from a holistic viewpoint that includes the social practices of surgery, as well as the way in which medical images and manipulation devices are embedded and made meaningful within the collaborative practices of the surgery [10].

    Requirements for the Future

    We found a shortage of objective validation studies (face validity: 1 study; concurrent validity: 3 studies; construct validity: 3 studies; discriminant validity: none; and predictive validity: none) of the different applications developed and presented as prototypes or proofs-of-concept for use in the clinical or teaching field. In teaching, the field of hand gesture–based interfaces should prioritize the following research objectives: first, to transcend studies on technical feasibility and individual hand gesture–based interaction with medical images so as to tackle the issue systematically within a framework of collaborative discussion, as happens in real surgical environments; and second, to conduct experimental studies in simulated surgical environments that allow hand gestures to be validated as a useful tool for touchless interaction in real operating rooms. To that end, the language of hand gestures for medical use would have to be standardized so that the surgeons’ cognitive load can be reduced. In turn, algorithms should be developed to differentiate between intentional and unintentional gestures (spotting) in the small spaces of the operating room. Finally, the problems of temporal segmentation ambiguity (how to define the gesture start and end points) and spatial-temporal variability (gestures can vary significantly from one individual to another) must be resolved.
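As a concrete illustration of the spotting and temporal segmentation problems described above, consider a minimal dwell-time sketch (entirely hypothetical; production systems use trained gesture classifiers, and the function name and threshold values here are our assumptions):

```python
# Separating intentional gestures from idle movement by thresholding hand
# speed over a minimum number of consecutive frames. A movement is "spotted"
# as a gesture only if it stays fast enough for long enough.

def spot_gestures(speeds, threshold=50.0, min_frames=3):
    """Return (start, end) frame index pairs where speed stays at or above
    threshold for at least min_frames consecutive frames."""
    segments, start = [], None
    for i, s in enumerate(speeds):
        if s >= threshold and start is None:
            start = i                      # gesture candidate begins
        elif s < threshold and start is not None:
            if i - start >= min_frames:    # long enough to count as intentional
                segments.append((start, i))
            start = None                   # too short: treat as noise
    if start is not None and len(speeds) - start >= min_frames:
        segments.append((start, len(speeds)))
    return segments

# Hand speeds (mm/s) per frame: an idle stretch, one deliberate gesture,
# and a single-frame blip that should be ignored.
speeds = [5, 8, 60, 70, 65, 80, 10, 4, 55, 6]
print(spot_gestures(speeds))  # [(2, 6)]
```

Even this toy version exposes the open questions the text raises: the segment boundaries depend entirely on arbitrary threshold and dwell parameters, which is precisely why standardized gesture vocabularies and per-user variability remain unsolved.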

    From the range of evidence found, it is possible to infer that, with regard to the use of COTS devices, there is a very interesting field of study in the development and objective validation (construct, concurrent, discriminant, and predictive validity) of portable low-cost virtual reality simulators for motor skills learning in MIS and robotic surgery. Such simulators will enable surgeons to do presurgical warm-ups anywhere at any time, based on 3D reconstructions of specific patients’ images [52,66,70,108]. Thus, surgeons will be able to practice a surgery the night before they are due to perform it from the comfort of their own homes.

    Despite the fact that MK was taken off the market in 2017 and that the LMC software only allows tool tracking up to V2 Tracking, the use of interaction with gesture-based virtual environments in the field of simulation identified in this review will enable new COTS devices (ie, the Myo armband) to be explored for skills learning in MIS and robotic surgery.


    A number of potential methodological limitations in our systematic review should be discussed. First, our inclusion criteria were limited to English-language publications. Second, although we used the most commonly used search engines in the health field (PubMed, EMBASE, ScienceDirect, Espacenet, OpenGrey, and IEEE) and complemented that by using the snowballing technique to identify relevant articles in the results generated by our search, we may have missed a few articles related to our research question. Finally, there may have been some potential for subjectivity in analyzing the findings, although 2 authors carefully reviewed each study independently and then discussed the results while double-checking each process and subsequently resolved any discrepancies through discussions with the third author whenever necessary.


    As most of the articles identified in this systematic review are proof-of-concept or prototype user testing and feasibility testing studies, we can conclude that the field is still in the exploratory phase in areas requiring touchless manipulation within environments and settings that must adhere to asepsis and antisepsis protocols, such as angiography suites and operating rooms.

    Without doubt, COTS devices applied to hand and instrument gesture–based interfaces in the field of simulation for skills learning and training in MIS could open up a promising field to achieve ubiquitous training and presurgical warm-up.

    The withdrawal of MK from the market and suspension of the instrument tracking function in the latest LMC software versions constitute threats to the new developments identified in this review. Nevertheless, gesture-based interaction devices are clearly useful for manipulating images in interventional radiology environments or the operating room and for the development of virtual reality simulators for skills training in MIS and robotic surgery.

    Authors' Contributions

    All the authors contributed substantially to the study conception and design, data analysis and interpretation of the findings, and manuscript drafting. FAL participated in the collection and assembly of data. FSR is the guarantor of the paper. All the authors have read, revised, and approved the final manuscript.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Summary of included studies evaluating Microsoft Kinect.

    PDF File (Adobe PDF File), 176KB

    Multimedia Appendix 2

    Summary of included studies evaluating the Leap Motion Controller.

    PDF File (Adobe PDF File), 132KB

    Multimedia Appendix 3

    Summary of included studies evaluating other devices.

    PDF File (Adobe PDF File), 66KB


    1. Bures S, Fishbain JT, Uyehara CF, Parker JM, Berg BW. Computer keyboards and faucet handles as reservoirs of nosocomial pathogens in the intensive care unit. Am J Infect Control 2000 Dec;28(6):465-471. [CrossRef] [Medline]
    2. Schultz M, Gill J, Zubairi S, Huber R, Gordin F. Bacterial contamination of computer keyboards in a teaching hospital. Infect Control Hosp Epidemiol 2003 Apr;24(4):302-303. [CrossRef] [Medline]
    3. Hartmann B, Benson M, Junger A, Quinzio L, Röhrig R, Fengler B, et al. Computer keyboard and mouse as a reservoir of pathogens in an intensive care unit. J Clin Monit Comput 2004 Feb;18(1):7-12. [CrossRef] [Medline]
    4. Lu P, Siu LK, Chen T, Ma L, Chiang W, Chen Y, et al. Methicillin-resistant Staphylococcus aureus and Acinetobacter baumannii on computer interface surfaces of hospital wards and association with clinical isolates. BMC Infect Dis 2009 Oct 1;9:164 [FREE Full text] [CrossRef] [Medline]
    5. Ebert LC, Hatch G, Ampanozi G, Thali MJ, Ross S. You can't touch this: touch-free navigation through radiological images. Surg Innov 2012 Sep;19(3):301-307. [CrossRef] [Medline]
    6. D'Antonio NN, Rihs JD, Stout JE, Yu VL. Computer keyboard covers impregnated with a novel antimicrobial polymer significantly reduce microbial contamination. Am J Infect Control 2013 Apr;41(4):337-339. [CrossRef] [Medline]
    7. Ionescu AV. A mouse in the OR. Ambidextrous: Stanford University Journal of Design 2006;30:2 [FREE Full text]
    8. van Veelen MA, Snijders CJ, van Leeuwen E, Goossens RH, Kazemier G. Improvement of foot pedals used during surgery based on new ergonomic guidelines. Surg Endosc 2003 Jul;17(7):1086-1091. [CrossRef] [Medline]
    9. Grätzel C, Fong T, Grange S, Baur C. A non-contact mouse for surgeon-computer interaction. Technol Health Care 2004;12(3):245-257 [FREE Full text] [Medline]
    10. O'Hara K, Dastur N, Carrell T, Gonzalez G, Sellen A, Penney G, et al. Touchless interaction in surgery. Commun ACM 2014 Jan 1;57(1):70-77 [FREE Full text] [CrossRef]
    11. El-Shallaly GE, Mohammed B, Muhtaseb MS, Hamouda AH, Nassar AH. Voice recognition interfaces (VRI) optimize the utilization of theatre staff and time during laparoscopic cholecystectomy. Minim Invasive Ther Allied Technol 2005;14(6):369-371. [CrossRef] [Medline]
    12. Nathan CO, Chakradeo V, Malhotra K, D'Agostino H, Patwardhan R. The voice-controlled robotic assist scope holder AESOP for the endoscopic approach to the sella. Skull Base 2006 Aug;16(3):123-131 [FREE Full text] [CrossRef] [Medline]
    13. Strickland M, Tremaine J, Brigley G, Law C. Using a depth-sensing infrared camera system to access and manipulate medical imaging from within the sterile operating field. Can J Surg 2013 Jun;56(3):E1-E6 [FREE Full text] [CrossRef] [Medline]
    14. Rosa GM, Elizondo ML. Use of a gesture user interface as a touchless image navigation system in dental surgery: case series report. Imaging Sci Dent 2014 Jun;44(2):155-160 [FREE Full text] [CrossRef] [Medline]
    15. Wachs J. Purdue Equestrian Team. 2007. Optimal Hand-Gesture Vocabulary Design Methodology for Virtual Robotic Control   URL: [accessed 2019-04-01] [WebCite Cache]
    16. Yanagihara Y, Hiromitsu H. System for selecting and generating images controlled by eye movements applicable to CT image display. Med Imaging Technol 2000;18:725 [FREE Full text] [CrossRef]
    17. Gallo L, Placitelli A, Ciampi M. Controller-free exploration of medical image data: experiencing the Kinect. In: Proceedings of the 2011 24th International Symposium on Computer-Based Medical Systems. 2011 Jun 27 Presented at: CBMS'11; June 27-30, 2011; Bristol, UK p. 1-6. [CrossRef]
    18. Coddington J, Xu J, Sridharan S, Rege M, Bailey R. Gaze-based image retrieval system using dual eye-trackers. 2012 Jan 12 Presented at: 2012 IEEE International Conference on Emerging Signal Processing Applications; January 12-14, 2012; Las Vegas, NV, USA p. 37. [CrossRef]
    19. Jacob MG, Wachs JP, Packer RA. Hand-gesture-based sterile interface for the operating room using contextual cues for the navigation of radiological images. J Am Med Inform Assoc 2013 Jun;20(e1):e183-e186 [FREE Full text] [CrossRef] [Medline]
    20. Tani B, Maia R, von Wangenheim A. A gesture interface for radiological workstations. In: Proceedings of the Twentieth IEEE International Symposium on Computer-Based Medical Systems. 2007 Jun 20 Presented at: CBMS'07; June 20-22, 2007; Maribor, Slovenia. [CrossRef]
    21. Zudilova-Seinstra E, de Koning P, Suinesiaputra A, van Schooten B, van der Geest R, Reiber J, et al. Evaluation of 2D and 3D glove input applied to medical image analysis. Int J Hum Comput Stud 2010 Jun;68(6):355-369 [FREE Full text] [CrossRef]
    22. Kirmizibayrak C. Interactive Volume Visualization and Editing Methods for Surgical Applications. Washington, DC: George Washington University; 2001.
    23. Bigdelou A, Schwarz A, Navab N. An adaptive solution for intra-operative gesture-based human-machine interaction. In: Proceedings of the 2012 ACM international conference on Intelligent User Interfaces. New York, NY, USA: ACM; 2012 Presented at: IUI'12; February 14-17, 2012; Lisbon, Portugal p. 75-84. [CrossRef]
    24. Ren G, O'Neill E. 3D selection with freehand gesture. Comput Graph 2013 May;37(3):101-120. [CrossRef]
    25. Nishikawa A, Hosoi T, Koara K, Negoro D, Hikita A, Asano S, et al. FAce MOUSe: a novel human-machine interface for controlling the position of a laparoscope. IEEE Trans Robot Autom 2003 Oct;19(5):825-841. [CrossRef]
    26. Wachs JP, Stern HI, Edan Y, Gillam M, Handler J, Feied C, et al. A gesture-based tool for sterile browsing of radiology images. J Am Med Inform Assoc 2008;15(3):321-323 [FREE Full text] [CrossRef] [Medline]
    27. Wachs J, Stern H, Edan Y, Gillam M, Feied C, Smithd M, et al. Real-time hand gesture interface for browsing medical images. Int J Intell Comput Med Sci Image Process 2008 Jan;2(1):15-25 [FREE Full text] [CrossRef]
    28. Soutschek S, Penne J, Hornegger J, Kornhuber J. 3-D gesture-based scene navigation in medical imaging applications using Time-of-Flight cameras. 2008 Presented at: 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops; June 23-28, 2008; Anchorage, AK, USA. [CrossRef]
    29. Kipshagen T, Tronnier V, Bonsanto M, Hofmann UG. Touch- and marker-free interaction with medical software. Berlin, Heidelberg: Springer; 2009 Sep 7 Presented at: World Congress on Medical Physics and Biomedical Engineering; September 7-12, 2009; Munich, Germany   URL: [CrossRef]
    30. Ruppert GC, Reis LO, Amorim PH, de Moraes TF, da Silva JV. Touchless gesture user interface for interactive image visualization in urological surgery. World J Urol 2012 Oct;30(5):687-691. [CrossRef] [Medline]
    31. Gallo L, De Pietro G, Marra I. 3D interaction with volumetric medical data: experiencing the Wiimote. In: Proceedings of the 1st international conference on Ambient media and systems. Quebec, Canada: ICTS, editor; 2008 Presented at: Ambi-Sys'08; February 11-14, 2008; Brussels, Belgium. [CrossRef]
    32. Hansen C, Köhn A, Schlichting S, Weiler F, Zidowitz S, Kleemann M, et al. Intraoperative modification of resection plans for liver surgery. Int J CARS 2008 Jun 4;3(3-4):291-297 [FREE Full text] [CrossRef]
    33. Gallo L, De Pietro G, Coronato A. Toward a natural interface to virtual medical imaging environments. In: Proceedings of the working conference on Advanced visual interfaces. New York: ACM; 2008 Presented at: AVI'08; May 28-30, 2008; Napoli, Italy   URL:
    34. Gallo L, Pietro G. Input devices and interaction techniques for VR-enhanced medicine. In: Jeong J, Damiani E, editors. Multimedia Techniques for Device and Ambient Intelligence. Boston, MA: Springer US; 2009:115.
    35. Gallo L, Minutolo A, de Pietro G. A user interface for VR-ready 3D medical imaging by off-the-shelf input devices. Comput Biol Med 2010 Mar;40(3):350-358. [CrossRef] [Medline]
    36. Gallo L. A glove-based interface for 3D medical image visualization. In: Tsihrintzis G, Damiani E, Virvou M, Howlett R, Jain L, editors. Intelligent Interactive Multimedia Systems and Services. Berlin Heidelberg: Springer; 2010:221.
    37. Chang Y, Chen S, Huang J. A Kinect-based system for physical rehabilitation: a pilot study for young adults with motor disabilities. Res Dev Disabil 2011;32(6):2566-2570. [CrossRef] [Medline]
    38. Leiker AM, Miller M, Brewer L, Nelson M, Siow M, Lohse K. The relationship between engagement and neurophysiological measures of attention in motion-controlled video games: a randomized controlled trial. JMIR Serious Games 2016 Apr 21;4(1):e4 [FREE Full text] [CrossRef] [Medline]
    39. Simor FW, Brum MR, Schmidt JD, Rieder R, de Marchi AC. Usability evaluation methods for gesture-based games: a systematic review. JMIR Serious Games 2016 Oct 4;4(2):e17 [FREE Full text] [CrossRef] [Medline]
    40. Dimaguila GL, Gray K, Merolli M. Person-generated health data in simulated rehabilitation using Kinect for stroke: literature review. JMIR Rehabil Assist Technol 2018 May 8;5(1):e11 [FREE Full text] [CrossRef] [Medline]
    41. Gallagher A, Satava RM. Virtual reality as a metric for the assessment of laparoscopic psychomotor skills. Learning curves and reliability measures. Surg Endosc 2002 Dec;16(12):1746-1752. [CrossRef] [Medline]
    42. Korndorffer J, Clayton J, Tesfay S, Brunner W, Sierra R, Dunne J, et al. Multicenter construct validity for southwestern laparoscopic videotrainer stations. J Surg Res 2005 Sep;128(1):114-119. [CrossRef] [Medline]
    43. Ritter E, Kindelan T, Michael C, Pimentel EA, Bowyer MW. Concurrent validity of augmented reality metrics applied to the fundamentals of laparoscopic surgery (FLS). Surg Endosc 2007 Aug;21(8):1441-1445. [CrossRef] [Medline]
    44. Hennessey I, Hewett P. Construct, concurrent, and content validity of the eoSim laparoscopic simulator. J Laparoendosc Adv Surg Tech A 2013 Oct;23(10):855-860. [CrossRef] [Medline]
    45. Seymour NE, Gallagher AG, Roman SA, O'Brien MK, Bansal VK, Andersen DK, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002 Oct;236(4):458-63; discussion 463. [CrossRef] [Medline]
    46. Schijven M, Jakimowicz J, Broeders IA, Tseng L. The Eindhoven laparoscopic cholecystectomy training course--improving operating room performance using virtual reality training: results from the first E.A.E.S. accredited virtual reality trainings curriculum. Surg Endosc 2005 Sep;19(9):1220-1226. [CrossRef] [Medline]
    47. Gurusamy K, Aggarwal R, Palanivelu L, Davidson B. Systematic review of randomized controlled trials on the effectiveness of virtual reality training for laparoscopic surgery. Br J Surg 2008 Sep;95(9):1088-1097. [CrossRef] [Medline]
    48. Larsen C, Oestergaard J, Ottesen B, Soerensen J. The efficacy of virtual reality simulation training in laparoscopy: a systematic review of randomized trials. Acta Obstet Gynecol Scand 2012 Sep;91(9):1015-1028. [CrossRef] [Medline]
    49. Greenhalgh T, Peacock R. Effectiveness and efficiency of search methods in systematic reviews of complex evidence: audit of primary sources. Br Med J 2005 Nov 5;331(7524):1064-1065 [FREE Full text] [CrossRef] [Medline]
    50. Shea BJ, Grimshaw JM, Wells GA, Boers M, Andersson N, Hamel C, et al. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol 2007 Feb 15;7:10 [FREE Full text] [CrossRef] [Medline]
    51. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JP, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. Br Med J 2009 Jul 21;339:b2700 [FREE Full text] [CrossRef] [Medline]
    52. Sánchez-Margallo FM, Sánchez-Margallo JA, Moyano-Cuevas J, Pérez EM, Maestre J. Use of natural user interfaces for image navigation during laparoscopic surgery: initial experience. Minim Invasive Ther Allied Technol 2017 Oct;26(5):253-261. [CrossRef] [Medline]
    53. Grange S, Terrence W, Fong T, Baur C. M/ORIS: A medical/operating room interaction system. In: Proceedings of the 6th international conference on Multimodal interfaces. 2004 Presented at: ICMI'04; October 13-15, 2004; State College, PA, USA p. 159-166. [CrossRef]
    54. Bizzotto N, Costanzo A, Bizzotto L, Regis D, Sandri A, Magnan B. Leap motion gesture control with OsiriX in the operating room to control imaging: first experiences during live surgery. Surg Innov 2014 Dec;21(6):655-656. [CrossRef] [Medline]
    55. Bizzotto N, Costanzo A, Maluta T, Dall'Oca C, Lavini F, Sandri A. Preliminary experience with the use of leap motion gesture control to manage imaging in the operating room. J Orthopaed Traumatol 2014 Nov;15(Suppl 1):19-20 [FREE Full text]
    56. Streba C, Gheonea I, Streba L, Sandulescu L, Saftoiu A, Gheonea D. Virtual palpation model combining spiral CT and elastography data: a proof-of-concept study. Gastroenterology 2014;146(5):344-345 [FREE Full text] [CrossRef]
    57. Nouei M, Kamyad A, Soroush A, Ghazalbash S. A comprehensive operating room information system using the Kinect sensors and RFID. J Clin Monit Comput 2015 Apr;29(2):251-261. [CrossRef] [Medline]
    58. Hettig J, Saalfeld P, Luz M, Becker M, Skalej M, Hansen C. Comparison of gesture and conventional interaction techniques for interventional neuroradiology. Int J Comput Assist Radiol Surg 2017 Sep;12(9):1643-1653. [CrossRef] [Medline]
    59. Johnson R, O'Hara K, Sellen A, Cousins C, Criminisi A. Exploring the potential for touchless interaction in image-guided interventional radiology. 2011 Presented at: CHI'11; May 7-12, 2011; Vancouver, BC, Canada p. 3323-3332   URL: [CrossRef]
    60. Hötker AM, Pitton MB, Mildenberger P, Düber C. Speech and motion control for interventional radiology: requirements and feasibility. Int J Comput Assist Radiol Surg 2013 Nov;8(6):997-1002. [CrossRef] [Medline]
    61. Tan JH, Chao C, Zawaideh M, Roberts AC, Kinney TB. Informatics in Radiology: developing a touchless user interface for intraoperative image control during interventional radiology procedures. Radiographics 2013;33(2):E61-E70. [CrossRef] [Medline]
    62. Iannessi A, Marcy P, Clatz O, Fillard P, Ayache N. Touchless intra-operative display for interventional radiologist. Diagn Interv Imaging 2014 Mar;95(3):333-337 [FREE Full text] [CrossRef] [Medline]
    63. Bercu Z, Patil V, Patel RS, Kim E, Nowakowski F, Lookstein R. Abstracts of the BSIR 2013 Annual Scientific Meeting, November 13-15, 2013, Manchester, England. Cardiovasc Intervent Radiol 2014 Jan;37(Suppl 1):1-82 [FREE Full text] [CrossRef] [Medline]
    64. Bercu Z, Patil VV, Patel R, Kim E, Nowakowski S, Lookstein R, et al. Use of hands free gesture-based imaging control for vessel identification during hepatic transarterial chemoembolization and selective internal radiotherapy procedures. J Vasc Interv Radiol 2015 Feb;26(2):S186-S187 [FREE Full text] [CrossRef]
    65. Mentis H, O'Hara K, Sellen A, Rikin TR. Interaction proxemics and image use in neurosurgery. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, NY, USA: ACM Conference on Computer-Human Interaction; 2012 Presented at: CHI'12; May 5-10, 2012; New York, NY, USA p. 927-936. [CrossRef]
    66. Wright T, de Ribaupierre S, Eagleson R. Design and evaluation of an augmented reality simulator using leap motion. Healthc Technol Lett 2017 Oct;4(5):210-215 [FREE Full text] [CrossRef] [Medline]
    67. Yoshimitsu K, Muragaki Y, Maruyama T, Saito T, Suzuki T, Ikuta S. Clinical trials of the non-touch intraoperative image controllable interface system using KINECT(TM). Int J Comput Assist Radiol Surg 2012;7(Suppl 1):S209-S210.
    68. Yoshimitsu K, Muragaki Y, Maruyama T, Yamato M, Iseki H. Development and initial clinical testing of "OPECT": an innovative device for fully intangible control of the intraoperative image-displaying monitor by the surgeon. Neurosurgery 2014 Mar;10 Suppl 1:46-50; discussion 50. [CrossRef] [Medline]
    69. di Tommaso L, Aubry S, Godard J, Katranji H, Pauchot J. A new human machine interface in neurosurgery: The Leap Motion(®). Technical note regarding a new touchless interface. Neurochirurgie 2016 Jun;62(3):178-181. [CrossRef] [Medline]
    70. Xu X, Zheng Y, Yao S, Sun G, Xu B, Chen X. A low-cost multimodal head-mounted display system for neuroendoscopic surgery. Brain Behav 2018 Dec;8(1):e00891 [FREE Full text] [CrossRef] [Medline]
    71. Henseler H, Kuznetsova A, Vogt P, Rosenhahn B. Validation of the Kinect device as a new portable imaging system for three-dimensional breast assessment. J Plast Reconstr Aesthet Surg 2014 Apr;67(4):483-488. [CrossRef] [Medline]
    72. Wheat JS, Choppin S, Goyal A. Development and assessment of a Microsoft Kinect based system for imaging the breast in three dimensions. Med Eng Phys 2014 Jun;36(6):732-738. [CrossRef] [Medline]
    73. Pöhlmann ST, Harkness E, Taylor C, Gandhi A, Astley S. Preoperative implant selection for unilateral breast reconstruction using 3D imaging with the Microsoft Kinect sensor. J Plast Reconstr Aesthet Surg 2017 Aug;70(8):1059-1067. [CrossRef] [Medline]
    74. Klumb F, Dubois-Ferriere V, Roduit N, Barea C, Strgar T, Ahmed K. CARS 2017-Computer Assisted Radiology and Surgery Proceedings of the 31st International Congress and Exhibition Barcelona, Spain, June 20-24, 2017. Int J Comput Assist Radiol Surg 2017 Jun;12(Suppl 1):1-286 [FREE Full text] [CrossRef] [Medline]
    75. Pauly O, Diotte B, Fallavollita P, Weidert S, Euler E, Navab N. Machine learning-based augmented reality for improved surgical scene understanding. Comput Med Imaging Graph 2015 Apr;41:55-60. [CrossRef] [Medline]
    76. Jacob MG, Wachs JP. Context-based hand gesture recognition for the operating room. Pattern Recognit Lett 2014 Jan;36:196-203. [CrossRef]
    77. Hughes P, Nestorov N, Healy N, Sheehy N, O'Hare N. Comparing the utility and usability of the Microsoft Kinect and Leap Motion sensor devices in the context of their application for gesture control of biomedical images. 2015 Presented at: ECR 2015; March 4–8, 2015; Vienna. [CrossRef]
    78. O’Hara K, Gonzalez G, Penney G, Sellen A, Corish R, Mentis H, et al. Interactional order and constructed ways of seeing with touchless imaging systems in surgery. Comput Supported Coop Work 2014 May 7;23(3):299-337 [FREE Full text] [CrossRef]
    79. Kirmizibayrak C, Radeva N, Wakid M, Philbeck J, Sibert J, Hahn J. Evaluation of gesture based interfaces for medical volume visualization tasks. Int J Virtual Real 2012:1-13 [FREE Full text] [CrossRef]
    80. Wipfli R, Dubois-Ferrière V, Budry S, Hoffmeyer P, Lovis C. Gesture-controlled image management for operating room: a randomized crossover study to compare interaction using gestures, mouse, and third person relaying. PLoS One 2016;11(4):e0153596 [FREE Full text] [CrossRef] [Medline]
    81. Ogura T, Sato M, Ishida Y, Hayashi N, Doi K. Development of a novel method for manipulation of angiographic images by use of a motion sensor in operating rooms. Radiol Phys Technol 2014 Jul;7(2):228-234. [CrossRef] [Medline]
    82. Wachs J, Stern H, Edan Y, Gillam M, Feied C, Smith M. A Real-Time Hand Gesture Interface for Medical Visualization Applications. In: Tiwari A, Roy R, Knowles J, Avineri E, Dahal K, editors. Applications of Soft Computing, Volume 36 of Advances in Intelligent and Soft Computing. Berlin Heidelberg: Springer; 2006:153.
    83. Jacob M, Cange C, Packer R, Wachs J. Intention, context and gesture recognition for sterile MRI navigation in the operating room. In: Alvarez L, Mejail M, Gomez L, Jacobo J, editors. Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications, Volume 7441 of Lecture Notes in Computer Science. Berlin Heidelberg: Springer; 2012:220-227.
    84. Frame M. A novel system for hands free manipulation of digital X-rays in a sterile environment using consumer electronics and software. Int J Comput Assist Radiol Surg 2012;7(Supplement 1):S208 [FREE Full text]
    85. Ebert L, Hatch G, Thali M, Ross S. Invisible touch—Control of a DICOM viewer with finger gestures using the Kinect depth camera. J Forensic Radiol Imaging 2013 Jan;1(1):10-14 [FREE Full text] [CrossRef]
    86. Ogura T, Sato M, Ishida Y, Hayashi N, Doi K. Development of a novel method for manipulation of angiographic images by use of a motion sensor in operating rooms. Radiol Phys Technol 2014 Jul;7(2):228-234. [CrossRef] [Medline]
    87. Ebert L, Flach P, Thali M, Ross S. Out of touch – a plugin for controlling OsiriX with gestures using the leap controller. J Forensic Radiol Imaging 2014 Jul;2(3):126-128 [FREE Full text] [CrossRef]
    88. Rossol N, Cheng I, Shen R, Basu A. Touchfree medical interfaces. Conf Proc IEEE Eng Med Biol Soc 2014;2014:6597-6600. [CrossRef] [Medline]
    89. Iannessi A, Marcy PY, Clatz O, Ayache N, Fillard P. Touchless user interface for intraoperative image control: almost there. Radiographics 2014;34(4):1142-1144. [CrossRef] [Medline]
    90. Widmer A, Schaer R, Markonis D, Müller H. Gesture interaction for content-based medical image retrieval. In: Proceedings of the ACM International Conference on Multimedia Retrieval. New York, NY, USA: ACM; 2014 Presented at: ICMR'14; 2014; Glasgow, United Kingdom. [CrossRef]
    91. Ogura T, Sato M, Kadowaki Y, Yasumoto Y, Okajima M, Tsutsumi S. Development of a new method for manipulation of dental images using a motion sensor in dentistry. 2015 Presented at: ECR 2015; March 4-8, 2015; Vienna, Austria. [CrossRef]
    92. Ogura T, Sato M, Ishida Y, Hayashi N, Doi K. Development of a novel method for manipulation of angiographic images by use of a motion sensor in operating rooms. Radiol Phys Technol 2014 Jul;7(2):228-234. [CrossRef] [Medline]
    93. Mewes A, Saalfeld P, Riabikin O, Skalej M, Hansen C. A gesture-controlled projection display for CT-guided interventions. Int J Comput Assist Radiol Surg 2016 Jan;11(1):157-164. [CrossRef] [Medline]
    94. Nainggolan F, Siregar B, Fahmi F. Anatomy learning system on human skeleton using Leap Motion Controller. 2016 Aug 15 Presented at: 2016 3rd International Conference on Computer and Information Sciences (ICCOINS); August 15-17, 2016; Kuala Lumpur, Malaysia. [CrossRef]
    95. Virag I, Stoicu-Tivadar L, Crisan-Vida M. Gesture-based interaction in medical interfaces. 2016 Jul 11 Presented at: 2016 IEEE 11th International Symposium on Applied Computational Intelligence and Informatics (SACI); May 12-14, 2016; Timisoara, Romania. [CrossRef]
    96. Juhnke B, Berron M, Philip A, Williams J, Holub J, Winer E. Comparing the Microsoft Kinect to a traditional mouse for adjusting the viewed tissue densities of three-dimensional anatomical structures. 2013 Presented at: Medical Imaging 2013: Image Perception, Observer Performance, and Technology Assessment; 2013; Baltimore, Maryland, USA p. 86731   URL: [CrossRef]
    97. Pulijala Y, Ma M, Pears M, Peebles D, Ayoub A. An innovative virtual reality training tool for orthognathic surgery. Int J Oral Maxillofac Surg 2018 Sep;47(9):1199-1205. [CrossRef] [Medline]
    98. Pulijala Y, Ma M, Pears M, Peebles D, Ayoub A. Effectiveness of immersive virtual reality in surgical training-a randomized control trial. J Oral Maxillofac Surg 2018 May;76(5):1065-1072. [CrossRef] [Medline]
    99. Placitelli A, Gallo L. 3D point cloud sensors for low-cost medical in-situ visualization. 2011 Presented at: 2011 IEEE International Conference on Bioinformatics and Biomedicine Workshops (BIBMW); November 12-15, 2011; Atlanta, GA, USA p. 596. [CrossRef]
    100. Samosky JT, Wang B, Nelson DA, Bregman R, Hosmer A, Weaver RA. BodyWindows: enhancing a mannequin with projective augmented reality for exploring anatomy, physiology and medical procedures. Stud Health Technol Inform 2012;173:433-439. [Medline]
    101. Blum T, Kleeberger V, Bichlmeier C, Navab N. Mirracle: an augmented reality magic mirror system for anatomy education. 2012 Presented at: 2012 IEEE Virtual Reality Workshops (VRW); March 4-8, 2012; Costa Mesa, CA, USA p. 433-439. [CrossRef]
    102. Dargar S, Nunno A, Sankaranarayanan G, De S. Microsoft Kinect based head tracking for Life Size Collaborative Surgical Simulation Environments (LS-CollaSSLE). Stud Health Technol Inform 2013;184:109-113. [CrossRef] [Medline]
    103. Juhnke B. Iowa State University. 2013. Evaluating the Microsoft Kinect compared to the mouse as an effective interaction device for medical imaging manipulations   URL:
    104. Guo X, Lopez L, Yu Z, Steiner KV, Barner K, Bauer T, et al. A portable immersive surgery training system using RGB-D sensors. Stud Health Technol Inform 2013;184:161-167. [CrossRef] [Medline]
    105. Yang Y, Guo X, Yu Z, Steiner KV, Barner KE, Bauer TL, et al. An immersive surgery training system with live streaming capability. Stud Health Technol Inform 2014;196:479-485. [CrossRef] [Medline]
    106. Hochman JB, Unger B, Kraut J, Pisa J, Hombach-Klonisch S. Gesture-controlled interactive three dimensional anatomy: a novel teaching tool in head and neck surgery. J Otolaryngol Head Neck Surg 2014;43:38 [FREE Full text] [CrossRef] [Medline]
    107. Kocev B, Ritter F, Linsen L. Projector-based surgeon-computer interaction on deformable surfaces. Int J Comput Assist Radiol Surg 2014 Mar;9(2):301-312. [CrossRef] [Medline]
    108. Alvarez-Lopez F, Maina MF, Saigí-Rubió F. Natural user interfaces: is it a solution to accomplish ubiquitous training in minimally invasive surgery? Surg Innov 2016 Aug;23(4):429-430. [CrossRef] [Medline]
    109. Juanes JA, Gómez JJ, Peguero PD, Ruisoto P. Digital environment for movement control in surgical skill training. J Med Syst 2016 Jun;40(6):133. [CrossRef] [Medline]
    110. Svendsen MB, Preisler L, Hillingsoe JG, Svendsen LB, Konge L. Using motion capture to assess colonoscopy experience level. World J Gastrointest Endosc 2014 May 16;6(5):193-199 [FREE Full text] [CrossRef] [Medline]
    111. Colella S, Svendsen MB, Konge L, Svendsen LB, Sivapalan P, Clementsen P. Assessment of competence in simulated flexible bronchoscopy using motion analysis. Respiration 2015;89(2):155-161 [FREE Full text] [CrossRef] [Medline]
    112. Coles T, Cao C, Dumas C. SAGES. 2014. ETrack: An affordable Ergonomic assessment tool for surgical settings   URL: http://www.sages.org/meetings/annual-meeting/abstracts-archive/etrack-an-affordable-ergonomic-assessment-tool-for-surgical-settings/ [accessed 2019-04-02] [WebCite Cache]
    113. Kim Y, Kim P, Selle R, Shademan A, Krieger A. Experimental evaluation of contact-less hand tracking systems for tele-operation of surgical tasks. 2014 May 31 Presented at: 2014 IEEE International Conference on Robotics and Automation (ICRA); May 31-June 7, 2014; Hong Kong, China. [CrossRef]
    114. Beyl T, Schreiter L, Nicolai P, Raczkowsky J, Wörn H. 3D perception technologies for surgical operating theatres. Stud Health Technol Inform 2016;220:45-50. [Medline]
    115. Jacob M, Li Y, Akingba G, Wachs JP. Gestonurse: a robotic surgical nurse for handling surgical instruments in the operating room. J Robot Surg 2012 Mar;6(1):53-63. [CrossRef] [Medline]
    116. Despinoy F, Sánchez A, Zemiti N, Jannin P, Poignet P. Comparative assessment of a novel optical human-machine interface for laparoscopic telesurgery. In: Stoyanov D, editor. Information Processing in Computer-Assisted Interventions. Cham: Springer; 2014:21.
    117. Vargas H, Vivas O. Gesture recognition system for surgical robot's manipulation. 2014 Presented at: 2014 XIX Symposium on Image, Signal Processing and Artificial Vision; September 17-19, 2014; Armenia, Colombia. [CrossRef]
    118. Travaglini TA, Swaney PJ, Weaver KD, Webster RJ. Initial experiments with the leap motion as a user interface in robotic endonasal surgery. Robot Mechatron (2015) 2016;37:171-179 [FREE Full text] [CrossRef] [Medline]
    119. Despinoy F, Zemiti N, Forestier G, Sánchez A, Jannin P, Poignet P. Evaluation of contactless human-machine interface for robotic surgical training. Int J Comput Assist Radiol Surg 2018 Jan;13(1):13-24. [CrossRef] [Medline]
    120. Lahanas V, Loukas C, Georgiou K, Lababidi H, Al-Jaroudi D. Virtual reality-based assessment of basic laparoscopic skills using the Leap Motion controller. Surg Endosc 2017 Dec;31(12):5012-5023. [CrossRef] [Medline]
    121. Kowalewski K, Hendrie JD, Schmidt MW, Garrow CR, Bruckner T, Proctor T, et al. Development and validation of a sensor- and expert model-based training system for laparoscopic surgery: the iSurgeon. Surg Endosc 2017 Dec;31(5):2155-2165. [CrossRef] [Medline]
    122. Pérez F, Sossa H, Martínez R, Lorias D. Video-based tracking of laparoscopic instruments using an orthogonal webcams system. Acad Sci Eng Technol Int J 2013;7(8):440-443 [FREE Full text] [CrossRef]
    123. Oropesa I, de Jong T, Sánchez-González P, Dankelman J, Gómez E. Feasibility of tracking laparoscopic instruments in a box trainer using a Leap Motion Controller. Measurement 2016 Feb;80:115 [FREE Full text] [CrossRef]
    124. Beck P. Free Patents Online. 2016. Accurate Three-dimensional Instrument Positioning   URL: [accessed 2019-04-02] [WebCite Cache]
    125. Owlia M, Khabbazan M, Mirbagheri MM, Mirbagheri A. Real-time tracking of laparoscopic instruments using kinect for training in virtual reality. Conf Proc IEEE Eng Med Biol Soc 2016 Dec;2016:3945-3948. [CrossRef] [Medline]
    126. Partridge RW, Brown FS, Brennan PM, Hennessey IA, Hughes MA. The LEAPTM gesture interface device and take-home laparoscopic simulators: a study of construct and concurrent validity. Surg Innov 2016 Feb;23(1):70-77. [CrossRef] [Medline]
    127. Sun X, Byrns S, Cheng I, Zheng B, Basu A. Smart sensor-based motion detection system for hand movement training in open surgery. J Med Syst 2017 Feb;41(2):24. [CrossRef] [Medline]
    128. Hartmann F, Schlaefer A. Feasibility of touch-less control of operating room lights. Int J Comput Assist Radiol Surg 2013 Mar;8(2):259-268. [CrossRef] [Medline]
    129. Mauser S, Burgert O. Touch-free, gesture-based control of medical devices and software based on the leap motion controller. Stud Health Technol Inform 2014;196:265-270. [CrossRef] [Medline]
    130. Schröder S, Loftfield N, Langmann B, Frank K, Reithmeier E. Contactless operating table control based on 3D image processing. Conf Proc IEEE Eng Med Biol Soc 2014;2014:388-392. [CrossRef] [Medline]
    131. Jacob M, Wachs J. Context-based hand gesture recognition for the operating room. Pattern Recognit Lett 2014 Jan;36:196-203 [FREE Full text] [CrossRef]
    132. Sweet R, Kowalewski T, Oppenheimer P, Weghorst S, Satava R. Face, content and construct validity of the University of Washington virtual reality transurethral prostate resection trainer. J Urol 2004 Nov;172(5 Pt 1):1953-1957. [CrossRef] [Medline]
    133. Higgins JP, Altman DG, Gøtzsche PC, Jüni P, Moher D, Oxman AD, Cochrane Bias Methods Group, Cochrane Statistical Methods Group. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. Br Med J 2011 Oct 18;343:d5928 [FREE Full text] [CrossRef] [Medline]
    134. Weichert F, Bachmann D, Rudak B, Fisseler D. Analysis of the accuracy and robustness of the Leap Motion Controller. Sensors (Basel) 2013 Jan;13(5):6380-6393. [Medline]
    135. Bachmann D, Weichert F, Rinkenauer G. Evaluation of the leap motion controller as a new contact-free pointing device. Sensors (Basel) 2014 Dec 24;15(1):214-233 [FREE Full text] [CrossRef] [Medline]
    136. Guna J, Jakus G, Pogačnik M, Tomažič S, Sodnik J. An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking. Sensors (Basel) 2014 Feb 21;14(2):3702-3720 [FREE Full text] [CrossRef] [Medline]
    137. Khoshelham K, Elberink SO. Accuracy and resolution of Kinect depth data for indoor mapping applications. Sensors (Basel) 2012;12(2):1437-1454 [FREE Full text] [CrossRef] [Medline]
    138. Mendez I, Hansen B, Grabow C, Smedegaard E, Skogberg N, Uth X, et al. Evaluation of the Myo armband for the classification of hand motions. IEEE Int Conf Rehabil Robot 2017 Dec;2017:1211-1214. [CrossRef] [Medline]
    139. Li C, Ren J, Huang H, Wang B, Zhu Y, Hu H. PCA and deep learning based myoelectric grasping control of a prosthetic hand. Biomed Eng Online 2018 Aug 6;17(1):107 [FREE Full text] [CrossRef] [Medline]
    140. Ur Rehman MZ, Waris A, Gilani S, Jochumsen M, Niazi IK, Jamil M, et al. Multiday EMG-based classification of hand motions with deep learning techniques. Sensors (Basel) 2018 Aug 1;18(8):1-16 [FREE Full text] [CrossRef] [Medline]
    141. Sánchez-Margallo JA, Sánchez-Margallo FM, Pagador Carrasco JB, Oropesa García I, Gómez Aguilera EJ, Moreno del Pozo J. Usefulness of an optical tracking system in laparoscopic surgery for motor skills assessment. Cir Esp 2014;92(6):421-428. [CrossRef] [Medline]


    3D: 3-dimensional
    COTS: commercial off-the-shelf
    EMBASE: Excerpta Medica dataBASE
    IEEE: Institute of Electrical and Electronics Engineers
    LMC: Leap Motion Controller
    MeSH: Medical Subject Headings
    MIS: minimally invasive surgery
    MK: Microsoft Kinect

    Edited by G Eysenbach; submitted 12.08.18; peer-reviewed by K Kowalewski, JA Sánchez Margallo, B Davies; comments to author 13.10.18; revised version received 04.01.19; accepted 25.01.19; published 03.05.19

    ©Fernando Alvarez-Lopez, Marcelo Fabián Maina, Francesc Saigí-Rubió. Originally published in the Journal of Medical Internet Research, 03.05.2019.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication, as well as this copyright and license information must be included.