This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
Producing a rich, personalized Web-based consultation tool for plastic surgeons and patients is challenging.
(1) To develop a computer tool that allows individual reconstruction and simulation of 3-dimensional (3D) soft tissue from ordinary digital photos of breasts, (2) to implement a Web-based, worldwide-accessible preoperative surgical planning platform for plastic surgeons, and (3) to validate this tool through a quality control analysis by comparing 3D laser scans of the patients with the 3D reconstructions made with this tool from original 2-dimensional (2D) pictures of the same patients.
The proposed system uses well-established 2D digital photos for reconstruction into a 3D torso, which is then available to the user for interactive planning. The simulation is performed on dedicated servers, accessible via the Internet. It allows the surgeon, together with the patient, to previsualize the impact of the proposed breast augmentation directly during the consultation, before a surgery is decided upon. We retrospectively conducted a quality control assessment of available anonymized pre- and postoperative 2D digital photographs of patients undergoing breast augmentation procedures. The method presented above was used to reconstruct 3D pictures from 2D digital pictures. We used a laser scanner capable of generating a highly accurate surface model of the patient’s anatomy to acquire ground truth data. The quality of the computed 3D reconstructions was then compared with the ground truth data in both qualitative and quantitative evaluations.
We evaluated the system on 11 clinical cases for surface reconstructions and 4 clinical cases of postoperative simulations, using laser surface scan technologies, and found a mean reconstruction error between 2 and 4 mm and a maximum outlier error of 16 mm. Qualitative and quantitative analyses from plastic surgeons demonstrate the potential of these new emerging technologies.
We tested our tool for 3D, Web-based, patient-specific consultation in the clinical scenario of breast augmentation. This example shows that the current state of development allows for the creation of responsive and effective Web-based 3D medical tools, even those requiring highly complex and time-consuming computation, by off-loading the computation to a dedicated high-performance data center. The efficient combination of advanced technologies, based on analysis and understanding of human anatomy and physiology, will allow the development of further Web-based reconstruction and predictive interfaces at different scales of the human body. The consultation tool presented herein exemplifies the potential of combining advancements in the core areas of computer science and biomedical engineering with the evolving area of Web technologies. We are confident that future developments based on a multidisciplinary approach will further pave the way toward personalized Web-enabled medicine.
Since the creation of the World Wide Web in the early 1990s, its use for medical applications has attracted much attention due to the possibilities of centralized storage and the efficient sharing of information. The creation of the picture archiving and communication system and related Web-enabled interfaces for the Internet demonstrates the interest from the medical community in accessing information in a reliable, economical, and convenient way [
A field in which Internet capabilities can be used for medical purposes is 3-dimensional (3D) human anatomy. In contrast to Web-enabled medical tools for educational purposes, where standard data models are employed, the scenario is more complex when considering confidential, patient-specific medical imaging data reconstructed from 2-dimensional (2D) pictures. Therefore, the tool presented in this paper was developed and tested in a multidisciplinary effort by a team of experts consisting of surgeons, biomedical engineers, computer graphics specialists, and Web developers and designers.
In breast augmentation surgery, surgeon–patient communication is vital, as the diagnosis, treatment, and outcome are dominated by the patient’s subjective assessment of the visual results of the elective surgical procedure. Failure to meet the patient’s expectations (augmentation volume, breast projection, etc) can lead to the need for reoperations and ultimately to legal action. It is therefore essential that patients be personally involved in the process of implant selection, supported by a realistic visual representation of their body, the previsualization of the final result. The success of the surgical outcome depends significantly on the choice of implant shape, size, projection, and anatomical placement, and these are key factors in the decision process.
Available computerized 3D anatomical visualization tools can be divided into the following categories. First, image-morphing techniques are software solutions working exclusively in 2 dimensions, where a patient’s photograph might or might not be the basis for the projected postoperative result (eg, Photoshop, ReShapr [
The above-mentioned techniques have inherent limitations for application in the daily clinical work of a surgeon.
We propose a patient-specific system for breast augmentation previsualization that uses well-established 2D digital photos of the patient’s body taken with a digital camera (together with a few extra body measurements for scaling) and transforms them into a 3D, interactive, visual surface representation of the upper torso, which is then available to the user for interactive planning. The simulation is performed on dedicated servers, accessible via the Internet. It allows the surgeon, together with the patient, to previsualize the impact of the proposed breast augmentation directly during the consultation, before a surgery is decided upon. The hypothesis of this study was thus that the proposed Web-based system would allow previsualization of results with varying implant sizes and locations, acting as a guide for the preoperative planning and decision process.
The 3 goals of this study were thus to (1) develop a computer tool that allows the individual reconstruction and simulation of 3D soft tissue from ordinary digital photos of breasts, (2) implement a Web-based, worldwide-accessible preoperative surgical planning platform for plastic surgeons, and (3) validate this tool through a quality control analysis by comparing 3D laser scans of the patients with the 3D reconstructions made using this tool from original 2D pictures of the same patients.
The following subsections describe the adopted strategies, with particular emphasis on Web-related components and data management.
General overview of the developed system. An Internet-based solution combining advanced technologies enables a realistic, patient-specific, simulated clinical scenario. 2D = 2-dimensional; 3D = 3-dimensional.
The real-time generation of a 3D model of the patient’s anatomy is based on the extraction of patient-specific information, provided by the user in the form of 2D digital pictures, taken at 3 different angles (frontal and lateral images). In addition, and in order to create a plausible model, 2 physical distance measurements of the patient’s anatomy are requested, such as nipple-to-nipple and nipple-to-submammary fold. The set of 2D images and sparse measurements allow for calibration of images to the actual patient’s anatomy and for reconstruction of a realistic model of the patient’s anatomy.
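The calibration from sparse measurements can be illustrated with a minimal sketch; the landmark coordinates, function names, and numbers below are hypothetical and not the system’s actual code. Two annotated landmarks with a known real-world separation (eg, nipple to nipple) yield a millimeter-per-pixel factor that scales any other pixel measurement on the same photo.

```python
import math

def mm_per_pixel(p1, p2, known_mm):
    """Scale factor from two landmark pixel coordinates and a known real-world distance."""
    px = math.dist(p1, p2)  # Euclidean distance in pixels
    return known_mm / px

# Hypothetical example: nipple landmarks 400 px apart on the frontal photo,
# with a measured nipple-to-nipple distance of 210 mm.
scale = mm_per_pixel((310, 520), (710, 520), 210.0)

# Any other pixel distance on the same photo can now be expressed in mm,
# eg, a nipple-to-submammary-fold span of 180 px.
fold_mm = 180.0 * scale
```

In practice the system would combine such per-view scale factors across the 3 calibrated views to constrain the 3D reconstruction; this sketch shows only the single-photo scaling principle.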
An important aspect is to provide the user with understandable information regarding the way digital pictures need to be taken. This is indicated to the user as guidelines for taking suitable patient photos; generally, plain white hospital walls offer sufficient contrast. Conventional fluorescent lighting found in offices and hospitals is perfectly acceptable for the system to be able to reliably detect the patient’s contour from the 2D pictures.
To initiate the body extraction algorithm, the user is required to define a few anatomical landmarks on each of the 3 pictures (see
Three landmarked photos of a patient. Visual aids on where to place landmarks and a simple Web interface guide the user through the annotation of images. Cropped screenshot taken from the Web-based interface.
Once body extraction and breast type characterization are finished, a specialized image-based 3D/2D reconstruction algorithm is used to estimate the 3D shape of the patient’s anatomy from the imaging and morphometric information provided by the user. The user is presented with a 3D visual representation of the 3 views, in the form of a textured surface model (see
Web-based 3-dimensional (3D) annotations on a reconstructed patient model. A set of tools including 3D distances, text, and body drawing enable a personalized virtual clinical analysis. Cropped screenshot taken from the Web-based interface.
3D annotations, such as floating text, lines, and landmarks, can be added directly to the models (
A widely accepted breast augmentation procedure consists of choosing among 3 implant placement techniques: subglandular, submuscular, and dual plane. The implants themselves come in a plethora of different widths, heights or projections, lengths, and shapes. Given this large array of choices, a physics-based implant simulator applied to the virtual patient is important to quickly and decisively give an idea of the final postoperative result. Furthermore, interpatient anatomical variability adds to the complex decision-making process.
Consequently, a Web-enabled simulator for breast augmentation needs to consider the current breast augmentation techniques while adapting them to the Web. The following subsections introduce the Web-based planning and biomechanical simulator.
In the planning process, implant positions and diameters are defined for use during the simulation. Positioning and sizing can be indicated directly on the photo or defined numerically (
Selection of implant position and diameter. Cropped screenshot taken from the Web-based interface.
In the pursuit of realism, many viable simulation pathways have been explored, such as fluidics [
Instead of creating a simulator that builds from prefabricated examples, our platform is based on the physical properties of human tissue using the tissue elastic model (TEM), which closely resembles the finite element method. The most important features differentiating the two methods are that in TEM
Constraints on deformation, such as torsion, volume, and angular constraints, are more relaxed.
Speed of execution is emphasized, such that simulations of thousands of iterations on large and complex aggregates of voxelized tissue (defined below) are handled in a few seconds.
A domain-specific implementation involving a biomechanics model focuses solely on the breasts.
Tissue elasticity is inherent in the model.
The TEM engine includes an elasticity module that takes into account the patient’s degree and type of skin elasticity, which is chosen by the surgeon who submits the pictures. We took into consideration 4 elasticity types: loose, moderate, tight, and very tight. Based on the selected skin elasticity, the biomechanical engine modulates the global outcome of the simulation. Systems with a similar basis of gridlike structures can be found [
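One simple way such categorical modulation can work is to map each elasticity type to a scalar stiffness that scales the tissue restoring forces. The mapping and values below are a hypothetical sketch, not the TEM engine’s actual parameters.

```python
# Hypothetical stiffness multipliers for the 4 surgeon-selected elasticity types.
ELASTICITY_STIFFNESS = {
    "loose": 0.6,
    "moderate": 0.8,
    "tight": 1.0,
    "very tight": 1.2,
}

def spring_force(rest_len, cur_len, base_k, elasticity="moderate"):
    """Hookean restoring force between two tissue nodes, scaled by elasticity type."""
    k = base_k * ELASTICITY_STIFFNESS[elasticity]
    return -k * (cur_len - rest_len)

# A stretched element (12 mm vs a 10 mm rest length) pulls back harder
# for tighter skin types.
f = spring_force(10.0, 12.0, 5.0, "tight")
```

With this design, a single surgeon-facing choice tunes the whole aggregate of voxel elements without exposing per-element parameters.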
The simulator starts by defining the volume constraints of the breasts and by subdividing the breast tissue into tiny 3D cubes called voxels (volumetric pixels). This process, called voxelization, volumetrically approximates the different tissue layers of the breasts, such as muscles, fat, glands, and skin. The innermost layers are the torso and bones, followed by the muscle layer of the pectorals, then fat and glands, and finally skin. This biomechanical model must be medically relevant in order to be faithful to the surgeon’s expected outcome (see
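The layered voxelization idea can be sketched as labeling cells of a regular grid by nested regions, outermost first, so inner labels overwrite outer ones. The nested-spheres geometry, grid size, and labels below are illustrative stand-ins, not the simulator’s anatomy model.

```python
import numpy as np

def voxelize(shape=(40, 40, 40), radii=(18, 14, 9)):
    """Label grid cells by tissue layer: 0 = outside, 1 = skin, 2 = fat, 3 = muscle.

    Nested spheres stand in for the real layered anatomy; labeling proceeds
    from the outermost shell inward, so inner layers overwrite outer ones.
    """
    zz, yy, xx = np.indices(shape)
    c = np.array(shape) / 2.0
    r2 = (zz - c[0]) ** 2 + (yy - c[1]) ** 2 + (xx - c[2]) ** 2  # squared distance to center
    labels = np.zeros(shape, dtype=np.int8)
    for label, r in enumerate(radii, start=1):
        labels[r2 <= r * r] = label
    return labels

grid = voxelize()
```

The resulting integer grid is the kind of structure a voxel-based engine can iterate over: each cell knows its tissue type, and forces or constraints are applied per type.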
The simulator requires that the surgeon place the 2 implants in an image of the front part of the patient with the center part of the implant as visual guidance (see
Voxelized breasts and implants. The fat layer is seen in yellow, the skin in orange, and the muscle layer in red. The implants are shown as white voxels or particles. This screenshot is taken from the simulation developer’s point of view and is not visible in the Web-based interface.
We retrospectively conducted a quality control assessment on available anonymized pre- and postoperative 2D digital photographs of patients undergoing breast augmentation procedures. The above-presented method was used to reconstruct 3D pictures from 2D digital pictures. A laser scanner (EScan3D [
For a qualitative evaluation, an overlay of the reconstruction is superimposed on the laser scan and presented to 4 plastic surgeons for direct visual comparison. The scanner is capable of submillimeter precision and can capture the breast field in 3 overlapping sweeps, each sweep lasting between 5 and 10 seconds. Slight patient motion during laser acquisition of the surfaces introduces errors in the evaluation. The patient was required to hold her breath to minimize chest motion. Eder et al [
Laser technology relies on several acquisitions, since the field of view is insufficient to cover the entire thorax, and the patient is likely to move between acquisitions. To account for this potential motion, the patient is repositioned to stand with her back against the wall, aligned with a patient-specific template, the head tilted back and the elbow and scapula in contact with the wall. Testing of this protocol on several patients showed that patient motion due to breathing was minimized, thus ensuring a good laser reconstruction. The alignment of the arms against the template also enables breast deformations to be easily matched during reconstruction of the individual 3D surface scans. These findings are also in agreement with Eder et al [
The laser scans are considered, for the purposes of this study, to be the criterion standard, since they incorporate true depth information of the surface. To validate the algorithm, surface scans of the patient were taken at the same time as the photographs used for the 3D reconstruction.
To construct an overlay of the resulting 3D reconstruction and the corresponding ground truth, the reconstructed breasts were aligned to the laser scan surface using surface-matching techniques in Amira 5.3.3 software (Visage Imaging GmbH, Berlin, Germany). The surface matching of the laser scan and the 3D reconstruction is required in order to bring meshes from different sources into the same coordinate system. During this surface-matching step, the shape of the breasts was preserved so as not to bias the results.
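The surface matching performed in Amira is proprietary, but its core principle, rigid alignment of two point sets into one coordinate system, can be sketched with the classic Kabsch algorithm. This sketch assumes paired (corresponding) points; a full pipeline would typically iterate correspondence estimation and alignment (eg, iterative closest point). All names and test values are hypothetical.

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch: best-fit rotation R and translation t mapping paired src points onto dst."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)            # cross-covariance of the centered clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t

# Hypothetical check: recover a known rotation about the z-axis plus a shift.
rng = np.random.default_rng(0)
pts = rng.random((50, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
moved = pts @ Rz.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_align(pts, moved)
residual = np.abs((pts @ R.T + t) - moved).max()
```

Crucially, and in line with the text above, a rigid transform only rotates and translates the reconstruction; it cannot deform the breast shape, so the comparison is not biased.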
As a quantitative metric, the surface-to-surface distance between the overlaid reconstructed surface and the baseline scan was computed for each point of the reconstructed surface.
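A simplified version of this metric can be sketched as nearest-vertex distances: for each reconstructed point, the distance to the closest scan point. The true computation would measure point-to-triangle distance on the scan mesh rather than snapping to vertices; the toy point clouds below are hypothetical.

```python
import numpy as np

def surface_distances(recon_pts, scan_pts):
    """For each reconstructed point, distance to its nearest scan point (brute force).

    A vertex-level simplification of point-to-surface distance, which would
    project onto the scan's triangles rather than snap to its vertices.
    """
    # Pairwise differences: (N, 1, 3) - (1, M, 3) broadcasts to (N, M, 3).
    diff = recon_pts[:, None, :] - scan_pts[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)

# Toy data: scan vertices on a flat plane, reconstruction offset 3 mm above it.
scan = np.array([[x, y, 0.0] for x in range(5) for y in range(5)])
recon = scan + np.array([0.0, 0.0, 3.0])
d = surface_distances(recon, scan)
```

Summarizing such per-point distances by their mean and maximum yields figures of the kind reported in the Results (mean and outlier reconstruction errors), and the full distribution lends itself to the box plots shown there.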
The quality control evaluation was based on a single outpatient follow-up visit and had no implications for the patients’ therapy planning. The routine photographic 2D documentation was supplemented by a 3D laser surface scan to allow quality control of the efficiency of the Web-based 3D reconstruction system. Following anonymization of the data, the analysis was performed in a blinded manner. This retrospective photographic comparison had no prospective intent or implications. In total, we present 11 datasets, of which 10 are postoperative.
Overlapping scanner views (left) and the resulting surface scan generated by the commercial scanner software (right).
To ensure that the 3D reconstruction algorithm was evaluated fairly, the breasts were cropped from the reconstructed surface in order to directly compare against the baseline laser scan, since this is the most important part of the model to present to users. The direct comparison is illustrated by the red wire-frame mesh (reconstruction) on the gray surface (laser scan) in the last column of
Composite figure sets showing, from left to right: patient photographs, corresponding 3-dimensional (3D) surface reconstruction, laser scan ground truth, and overlaid reconstructed surfaces. The laser scan textures were acquired in the absence of flash photography; hence, their illumination appears slightly different from that in the patient photos. The laser scan surface is shown as a transparent surface in the last column, and the 3D reconstruction is displayed as a superimposed red wire frame. The preoperative simulated images are screenshots from the Web-based interface, as seen in the Proposed System column.
Visually, the overlaid reconstructions appear to correlate well with the laser scan. The results of these surface distances are shown as errors on a box plot, as displayed in
A recent paper [
It should be noted that the aim of the system is not to compete against the very accurate 3D laser scanner technologies, but to propose a Web-based patient-specific tool to aid surgeons with the consultation process and patients with their preoperative choice.
Box plots of left and right breast: 3-dimensional (3D) surface reconstructions compared with laser scans. The patients in
To further control the quality of the proposed system as a breast implant consultation tool, simulations of the postoperative result were computed from the patients’ preoperative photos, along with knowledge of the implants chosen for the surgery. The predictions were validated against laser scans of postoperative patients. Routine photographic documentation was taken according to common surgical practice. None of these patients were presented preoperatively with a 3D reconstruction or simulation, and no clinical therapeutic decision was based on these data.
Qualitative results are shown in
Composite figure showing pre- and postoperative photos, pre- and postoperative 3-dimensional (3D) reconstruction, and implant simulation surface renderings from the simulation visualization. The reconstructed and simulated surfaces were computed from the preoperative photos and hence show similarities in the textures. The pre- and postoperative simulated images are screenshots from the Web-based interface (middle columns).
Postoperative simulation results predicted from preoperative images compared with postoperative laser scans.
We present a tool developed for 3D, Web-based, patient-specific consultation in the clinical scenario of breast augmentation. The main finding of this study is that the current state of development allows for the creation of a responsive and effective Web-enabled 3D consultation tool for breast augmentation surgery, based on 3D image reconstruction from 2D pictures, even with highly complex and time-consuming computation, by off-loading it to a dedicated high-performance data center.
The efficient combination of advanced technologies, based on analysis and understanding of human anatomy and physiology, will allow for the development of further Web-based reconstruction and predictive interfaces at different scales of the human body.
This consultation tool exemplifies the potential of combining advancements in the core areas of computer science and biomedical engineering with the evolving area of Web technologies. We are confident that future developments based on a multidisciplinary approach will further pave the way toward personalized Web-enabled medicine.
This technology has potential for other medical applications, such as reconstructive surgery of facial malformations, aesthetic facial and anti-aging procedures, or preoperative volume evaluations in breast reduction surgery. These areas can be targeted by modeling the different anatomical and physiological processes.
For the future development of the presented system, optimization plans include improving texturing of the reconstruction from the patient’s photographs, user retargeting of geometry [
2D: 2-dimensional
3D: 3-dimensional
TEM: tissue elastic model
We acknowledge the KTI Promotion Agency (grant number 10263.1 PFLS-LS) for their support of this study and thank Heather Gray for proofreading this paper.
Dr Garcia is co-owner of, and receives income from, Crisalix SA, which is developing products related to the research described in this paper, developed through the Swiss agency KTI for the promotion of medical technologies. The terms of this arrangement have been reviewed and approved by the University of Bern, Switzerland, in accordance with its conflict of interest policies.