This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
Language mapping during awake brain surgery is currently a standard procedure. However, mapping is rarely performed for other cognitive functions that are important for social interaction, such as visuospatial cognition and nonverbal language, including facial expressions and eye gaze. The main reason for this omission is the lack of tasks that are fully compatible with the restrictive environment of an operating room and awake brain surgery procedures.
This study aims to evaluate the feasibility and safety of a virtual reality headset (VRH) equipped with an eye-tracking device able to provide an immersive visuospatial and social virtual reality (VR) experience for patients undergoing awake craniotomy.
We recruited 15 patients with brain tumors near language and/or motor areas. Language mapping was performed with a naming task, DO 80, presented on a computer tablet and then in 2D and 3D via the VRH. Patients were also immersed in a visuospatial and social VR experience.
None of the patients experienced VR sickness. Two patients had an intraoperative focal seizure without consequence, but there was no reason to attribute these seizures to virtual reality headset use. The patients were able to perform the VR tasks. Eye tracking was functional, enabling the medical team to directly analyze the patients’ attention and their exploration of the visual field of the virtual reality headset.
We found that it is possible and safe to immerse the patient in an interactive virtual environment during awake brain surgery, paving the way for new VR-based brain mapping procedures.
ClinicalTrials.gov NCT03010943; https://clinicaltrials.gov/ct2/show/NCT03010943.
Brain mapping by direct electrical stimulation (DES) during awake craniotomy is currently a standard procedure that reduces the risk of permanent neurological deficits and increases the extent of tumor resection and the success of epilepsy surgery [
Verbal language, which is controlled by the dominant hemisphere, is widely mapped in this way [
A few years ago, we began exploring the feasibility of testing cognitive functions during awake craniotomy by immersing the patient in virtual situations with a virtual reality headset (VRH). We have developed several different approaches using different types of headsets and software. The first virtual reality (VR) tasks were developed with the aim of preventing postoperative hemianopsia and unilateral neglect [
We performed a single-center, prospective, and open-label study, and the study protocol was evaluated and approved by the Agence Nationale de Sécurité du Médicament et des produits de santé, the local ethics committee, and Commission Nationale de l'Informatique et des Libertés. All patients signed a written informed consent form before inclusion in the study. This study was registered at ClinicalTrials.gov (NCT03010943). As indicated above, an amendment was requested and accepted to assess the feasibility of using eye tracking with a VRH during awake craniotomy and to explore the possibilities and limitations of the visuospatial and social VR experience. During the extension of the study, we continued to use questionnaires completed by the patient and medical professionals to assess tolerance (discomfort, nausea, vomiting, and visual-vestibular-somatosensory conflict) and satisfaction (
The inclusion criteria were as follows: patients aged >18 years hospitalized for a brain tumor near language and/or motor areas (determined by neuropsychological evaluation and resting-state functional magnetic resonance imaging [fMRI]) in the left or right hemisphere who gave written informed consent. The exclusion criteria were all contraindications for awake surgery (cognitive impairment, whether related to the surgical lesion, aphasia, or morbid anxiety). A total of 15 patients were included in the extension study.
This study was performed with a Tobii Pro VR Integration, an HTC VIVE headset retrofitted with eye tracking and wired to a computer connected to a neuronavigational system (Brainlab). The VRH has a 110° visual field, an adjustable interpupillary distance, a latency of <20 milliseconds, a refresh rate of 90 Hz, a resolution of 2160×1200 pixels, and adjustable focus. The VRH includes the eye-tracking system developed by Tobii Pro for research purposes, which collects various types of eye movement data, such as gaze origin and direction, pupil position, and absolute pupil size, with an accuracy of 0.5° of visual angle at a sampling rate of 120 Hz. What the patient sees in the VRH is visualized on one of the screens of the neuronavigational system.
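To illustrate the kind of gaze data described above, the sketch below shows how an angular deviation between a measured gaze direction vector and a target direction can be computed and compared against the headset's stated 0.5° accuracy. The data structure and field names are hypothetical illustrations, not the Tobii Pro SDK's actual API.

```python
import math
from dataclasses import dataclass

# Hypothetical record mirroring the kinds of fields the eye tracker
# reports (timestamp, gaze direction, pupil size); the names are
# illustrative, not the Tobii Pro SDK's actual API.
@dataclass
class GazeSample:
    t: float                               # timestamp in seconds (120 Hz stream)
    direction: tuple[float, float, float]  # unit gaze direction vector
    pupil_mm: float                        # absolute pupil diameter in mm

def angle_deg(u, v):
    """Angle in degrees between two direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    # Clamp to guard against floating-point overshoot before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

# A 0.5° accuracy means a sample aimed at a target should deviate from
# the true target direction by roughly that much:
sample = GazeSample(t=0.0, direction=(0.0087, 0.0, 0.99996), pupil_mm=3.2)
target = (0.0, 0.0, 1.0)
print(round(angle_deg(sample.direction, target), 2))  # ~0.5 degrees
```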
The picture-naming task, DO 80, was implemented in the VRH in 2 versions [
The new visuospatial and social VR experience that we have developed uses animated synthetic characters (avatars;
(A) Patient wearing the virtual reality headset. (B) and (C) Example of the item “phone” in the DO 80 naming task presented in 2D (B) and 3D (C) with the virtual reality headset. The green spot indicates the patient’s gaze.
Left: view of the operating room during the procedure. (A) Head of the patient wearing the virtual reality headset; (B) application of direct electrical stimulation to the exposed brain; (C) screen showing what the patient sees in the virtual reality headset, his gaze materialized by a green spot; (D) neuronavigational system showing brain white matter fascicles and the position of the electrode. Right: example of a layout after the virtual reality task simulating a visuospatial and social experience. (E) The image that is visualized and analyzed on the screen (C). The movement of the patient’s gaze is visualized as a blue line (with the starting point in green and the endpoint in pink). The green box indicates the avatar making eye contact. The white arrow indicates the avatar on which the patient focuses for more than 0.6 seconds (triggering the expression of a dynamic facial emotion). In this example, the patient identified the avatar making eye contact in 2.53 seconds and indicated the emotion expressed 3.77 seconds later.
The procedure has been described in detail elsewhere [
Brain mapping for language was performed with the picture-naming task, DO 80, using a computer tablet. Sites were identified as language sites if interference (speech arrest, anomia, dysarthria, semantic or phonemic paraphasia, or delayed naming >5 seconds) was detected in at least 3 careful tests (not necessarily consecutive), and these sites were tagged on the cortex. A second round of mapping was then performed using the VRH, first with the 2D DO 80 task and then with the 3D DO 80 task. Differences in responses were carefully noted. Depending on the location of the tumor, other tests were proposed on a computer tablet (spontaneous speech production, counting, reading, etc).
The visuospatial and social VR experience with avatars was included in brain mapping by DES when considered necessary to test these functions. In other situations, this VR task was proposed for patients without DES, generally during the closure period. The 4 quadrants of the visual field and all emotions (joy, surprise, or anger) were presented randomly to the patient. As for language mapping, a site was identified as eloquent if interference through DES (difficulties exploring the space, difficulties locating the avatar making eye contact, or failure to recognize the facial emotion, resulting in a delay or lack of response from the patient) was detected 3 times. Once the task was completed, the gaze layout, the time to perform the task, and the patient’s answer were recorded (
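The dwell rule described above (fixating an avatar for more than 0.6 seconds triggers its dynamic facial emotion) can be sketched as follows. This is an illustrative reconstruction under assumed data shapes, not the study's actual software; `dwell_trigger` and its input format are hypothetical.

```python
# Dwell threshold from the task description: fixating one avatar for
# more than 0.6 s triggers that avatar's facial emotion animation.
DWELL_THRESHOLD_S = 0.6

def dwell_trigger(samples):
    """samples: iterable of (timestamp_s, avatar_id or None) pairs,
    e.g. gaze hits resolved from the 120 Hz tracker stream.
    Returns the first avatar whose continuous dwell reaches the
    threshold, or None if no avatar is fixated long enough."""
    current, start = None, None
    for t, avatar in samples:
        if avatar != current:
            # Gaze moved to a different avatar (or off all avatars):
            # restart the dwell timer.
            current, start = avatar, t
        elif avatar is not None and t - start >= DWELL_THRESHOLD_S:
            return avatar
    return None

# Gaze jumps from avatar "A" to "B", then settles on "B" long enough:
stream = [(0.0, "A"), (0.2, "A"), (0.3, "B"), (0.5, "B"), (0.95, "B")]
print(dwell_trigger(stream))  # "B"
```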
The entire procedure was performed in the presence of an engineer and a neuropsychologist. Heart rate, blood pressure, and EEG signals were recorded continuously during the procedure. Spontaneous or stimulation-induced afterdischarges recorded on EEG were defined as 2 consecutive spikes or sharp waves distinct from background activity. Any drug administration differing from the predefined protocol was noted. Tolerance was also assessed with a questionnaire completed by the patient, the anesthetist, the neuropsychologist, and the neurosurgeon.
Baseline characteristics of the 15 patients are presented in
Baseline characteristics of the 15 patients and the virtual reality tasks they performed.
Patient | Sex | Age (years) | Handedness | Diagnosis | Hemisphere | Lobe | Preoperative training | Brain mapping |
1 | Male | 68 | Left | Metastasis | Left | Parietal | Task 1a and task 2b | Task 1 and task 2 |
2 | Male | 41 | Right | Oligodendroglioma II | Right | Frontal | Task 1 and task 2 | Motor and task 2 |
3 | Female | 25 | Right | Astrocytoma III | Left | Frontal | Task 1 | Motor and task 1 |
4 | Female | 66 | Right | Oligodendroglioma III | Right | Frontal | Task 2 | Motor |
5 | Male | 39 | Left | Astrocytoma III | Right | Frontal | Task 1 and task 2 | Motor and task 1 and task 2 |
6 | Female | 60 | Right | Glioblastoma | Left | Temporoparietal | Task 1 | Task 1 |
7 | Male | 48 | Right | Oligodendroglioma III | Left | Frontal | Task 1 and task 2 | Task 1 and task 2 |
8 | Female | 53 | Right | Glioblastoma | Left | Parietal | Task 1 | Task 1 |
9 | Male | 68 | Right | Glioblastoma | Left | Frontal | Task 1 and task 2 | Task 1 |
10 | Male | 73 | Right | Metastasis | Left | Frontal | Task 1 and task 2 | Task 1 and task 2 |
11 | Female | 47 | Right | Astrocytoma III | Left | Parietal | Task 1 and task 2 | Motor and task 1 |
12 | Male | 61 | Right | Metastasis | Left | Temporoparietal | Task 1 and task 2 | Task 1 |
13 | Male | 43 | Right | Astrocytoma III | Left | Parietal | Task 1 and task 2 | Task 1 and task 2 |
14 | Female | 53 | Right | Astrocytoma III | Left | Frontotemporal insular | Task 1 and task 2 | Task 1 |
15 | Female | 41 | Right | Astrocytoma III | Right | Frontal | Task 2 | Task 2 |
aTask 1: DO 80 (tablet, 2D virtual reality, and 3D virtual reality).
bTask 2: visuospatial and social virtual reality experience.
Only 3 patients had experienced VR before inclusion. Before surgery, 13 patients were trained with the DO 80 task (tablet, 2D VR, and 3D VR), and 12 patients were trained with the VR task simulating a visuospatial and social experience (
The mean duration of surgery was 4 hours 23 minutes (range 3 h 6 min to 5 h 30 min), with a mean awake-phase duration of 2 hours 20 minutes (range 25 min to 4 h). The mean intensity of DES was 1.9 mA (range 1-4 mA), and the mean total duration of VRH use per patient was 11 minutes, split over 2 to 4 sessions.
For the 13 patients for whom brain mapping was performed for language, the same eloquent language areas were identified regardless of the DO 80 presentation used (computer tablet, 2D VR, or 3D VR). However, for 1 patient (patient 13), the results of the DO 80 on the computer tablet were unclear in some areas (hesitation or delayed naming) that clearly were not eloquent according to assessment with the VRH. Eye tracking was functional, making it possible to trace the patient’s gaze during the task. During the DO 80 task, we noted that patients did not read the sentence “this is...,” instead saying it automatically.
Among the 10 patients who were able to perform the visuospatial and social VR experience without difficulty before surgery, 7 patients performed this task during brain mapping by DES and 2 patients (patients 11 and 14) during closure without DES (
Despite the discomfort associated with the awake surgery procedure, none of the patients experienced vertigo or any vegetative signs of VR sickness. EEG modifications (afterdischarges or spike-and-wave discharges) were observed in 27% (4/15) of the patients during the standard brain mapping procedure (without the VRH). The same abnormalities persisted during brain mapping with the VRH in 3 of these patients. Intraoperative seizures (IOSs) occurred in 13% (2/15) of the patients. Epilepsy had been the presenting symptom for both of these patients, and neither displayed EEG modifications during the brain mapping procedure. The IOSs observed were short motor seizures that resolved rapidly after cortical irrigation with iced saline. The IOSs occurred during DES, before VRH use in one patient and during the VR task in the other.
According to the questionnaire completed after surgery by the patient, the neurosurgeon, and the anesthetist, the use of the VRH was not an issue during surgery. During 1 operation, the neuropsychologist found it difficult to position the VRH. All participants agreed to continue studying this approach.
VR is a domain with growing applications in the field of neuroscience. This computer technology generates realistic images, sounds, and other sensations that simulate a user’s physical presence in a virtual or imaginary environment. A person using a VRH can look around the artificial world,
The extension of our initial prospective trial confirmed that a VRH with eye tracking and immersive virtual experiences is safe for patients undergoing awake craniotomy and brain mapping using DES. None of the patients experienced VR sickness, and we observed none of the signs of sympathetic nervous activity reported for this syndrome [
All these data indicate that the use of a VRH during brain mapping in awake surgery does not specifically increase the rate of IOSs. Nevertheless, we recommend several precautions to prevent seizures during the use of a VRH for brain mapping procedures, including a well-trained team and, although there is no consensus regarding its usefulness, intraoperative monitoring of brain electrical activity.
This trial also demonstrated the feasibility of eye tracking in patients undergoing awake craniotomy and brain mapping using DES. One of our concerns was potential interference with gaze tracking from devices in the operating room that emit infrared light, such as the neuronavigation system. No such interference was observed, and we were able to follow eye movements on one of the screens of the neuronavigational system. Eye tracking revealed that the patients never read the sentence “this is...” associated with the image in the DO 80 VR task. This observation suggests that the patients said the sentence automatically, focusing only on the naming task. In the future, it would be interesting to develop a new VR task with eye tracking for the specific exploration of reading. We recognize that eye-tracking sensors can also be used with regular computer screens or tablets. However, the eye tracker in the VRH, which combines features of mobile and remote setups, prevents loss of calibration and improves the success rate of measurements of eye positions and movements. Furthermore, the VRH immerses the patient in the VR task, completely isolated from the surrounding operating room.
This trial also explored the possibilities and limitations of the visuospatial and social VR experiment developed with animated synthetic avatars. Avatars are perceived in a similar manner to real human beings and can be used to explore the complex processes of nonverbal language, empathy, and theory of mind [
One of the difficulties in the field of VR research is the rapid progress of technology and the regular release of new VRHs. At the beginning of our research on the use of VR in the operating room, intending to detect hemianopsia and unilateral neglect during DES, we used the Oculus VRHs DK1 and DK2 (visual field 100°, resolution 1280×800 pixels, and refresh rate 60 Hz; Oculus) [
This study extended an initial prospective trial designed to confirm the feasibility and safety of VRH use and immersive virtual experiences for patients undergoing awake craniotomy and brain mapping using DES. Its added value lies in the use of a latest-generation VRH that includes eye tracking, together with a VR task designed to test visuospatial and social functions simultaneously.
Questionnaires completed by the patients and medical professionals.
DES: direct electrical stimulation
ECoG: electrocorticogram
EEG: electroencephalogram
fMRI: functional magnetic resonance imaging
IOS: intraoperative seizure
VR: virtual reality
VRH: virtual reality headset
This project received funding from Fondation de l’Avenir, Paris, France (AP-RM 18-032), and Angers University Hospital. The authors thank Doctor Jean-Michel Lemée, Doctor Florian Bernard (Département de Neurochirugie, Centre Hospitalier Universitaire d’Angers, Angers, France), Doctor Jérémy Besnard, and Professor Philippe Allain (LPPL-EA4638, Université d’Angers, Angers, France) for their help in the design of the new clinical trial (ClinicalTrials.gov NCT04288505). The authors also thank Alex Edelman and Associates (La Crouzille, France) for correcting the manuscript.
RS is a cofounder of Dynamixyz, which markets the facial expression transfer tool used to animate avatars. He reports personal fees from Dynamixyz. None of the other authors have any conflicts of interest to declare.