Central Message
This systematic review explores Augmented Reality (AR) techniques in minimally invasive surgery of deformable organs, aiming to identify suitable AR applications for pulmonary Video- or Robotic-Assisted Thoracic Surgery (VATS/RATS). Surface-based registration methods show promise for accurate navigation. Clinical feasibility assessment is crucial, focusing on registration accuracy and impact on surgical outcomes.
Introduction
Background
In pulmonary surgery, minimally invasive surgery (MIS) techniques reduce tissue trauma and allow for accelerated postoperative recovery. (Robotic) Video-assisted Thoracoscopic Surgery (RATS/VATS) is a minimally invasive procedure that is currently widely performed. Despite these advantages in surgical outcomes, the limited field of view (FoV) and restricted surgical target access in VATS remain challenging. Moreover, patient-specific anatomical variations in bronchial or vascular structures can elevate procedural complexity. This complexity is further exacerbated when zooming in on specific anatomical details, potentially leading to a loss of contextual awareness. This emphasizes the need for exact knowledge and evaluation of the patient-specific anatomy, such as the location of the tumor in relation to the intrathoracic vessels, bronchi, and segmental borders. To obtain a better understanding of anatomical relationships and to plan surgery beforehand, three-dimensional (3D) visualization has become popular, and advanced technologies are available, such as 3D computed tomography (CT) based reconstruction and simulation, combined with immersive extended reality techniques such as Virtual Reality (VR), Mixed Reality (MR), and Augmented Reality (AR). These immersive techniques enable visualization, manipulation, and interaction with patient-specific 3D models in a fully virtual or hybrid simulated world. VR has limited purpose for intraoperative use, as the user is immersed in a completely virtual environment, resulting in full occlusion of the actual operative field. By contrast, AR and MR are suitable for intraoperative use by augmenting 3D models onto the physical environment, eg, the surgical field. This hybrid simulated world is perceived either through a head-mounted display or indirectly via a monitor. The superimposition of 3D overlays on the surgical field may enhance intraoperative perception and can serve as guidance during surgical procedures, navigating towards anatomical objects of interest and maintaining orientation regarding critical landmarks. AR/MR-guided MIS could potentially shorten the learning curve for young surgeons, decrease surgical exploration and procedural time, and increase the safety of the procedure for both surgeon and patient. However, disadvantages of direct overlays in a surgical context include compromised spatial perception, restricted bandwidth of the surgical video, potential obscuring of critical aspects of the actual surgical scene, and delayed responsiveness to intraoperative changes. In practice, the terminology for MR varies among studies and, depending on the definition, may include applications of AR and vice versa. In the remainder of this article, AR and MR technologies are both referred to as AR.
Correct alignment and visualization of the 3D models onto the surgical field necessitates several steps, together identified as the AR-workflow: 1. 3D reconstruction, 2. initial registration, 3. dynamic tracking, and 4. AR visualization (Figure 1). While the AR-workflow has been increasingly implemented for surgical navigation, most applications consider target organs that maintain a constant spatial relationship with adjacent anatomical structures. As such, most research has been performed for neurosurgery, otolaryngology, and orthopedic surgery. It is therefore essential to understand that registration is further divided into two subtypes: rigid and non-rigid registration. Rigid registration considers translation, rotation, and scaling, while deformations of the object are not considered; it is particularly suited for the aforementioned specialties, eg, orthopedic surgery. Registration of deformable organs with their rigid preoperative 3D models is more challenging due to inherent organ movement and deformation, requiring more extensive non-rigid registration that does acknowledge deformation to achieve correct alignment. Therefore, the application of AR in pulmonary surgery, and in surgery of other deformable (non-rigid) organs, remains limited. During RATS/VATS procedures, the lungs undergo further deformation, including intraoperative collapse of the lung, altered effects of gravity due to differences in patient positioning, and manipulation by the surgeon. These factors impose additional challenges for AR alignment of the 3D models onto the surgical view. Successful application of AR should meet the specific needs of pulmonary surgery, focusing on correct alignment of the complex broncho-vascular anatomy and its relationship with the tumor, particularly considering the extensive deformation of a collapsed, highly movable, and pliable lung.
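In formal terms, the difference is whether a single global transform suffices. As a generic illustration (our notation, not a formula from any included study), rigid (similarity) registration estimates one scale s, rotation R, and translation t mapping each preoperative model point to its intraoperative counterpart:

```latex
% Rigid (similarity) registration: one global transform for all points;
% deformation of the organ itself is not modeled.
\[
  p_i' = s\,R\,p_i + t,
  \qquad
  (\hat{s},\hat{R},\hat{t}) = \arg\min_{s,R,t}
  \sum_{i=1}^{N} \bigl\lVert q_i - (s\,R\,p_i + t) \bigr\rVert^{2},
\]
```

where \(p_i\) are preoperative model points and \(q_i\) the corresponding intraoperative points. Non-rigid registration replaces this single transform with a spatially varying deformation field, which is what deformable organs require.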

Figure 1
AR workflow pipeline. The steps needed for AR visualization are summarized in this figure. First, imaging (for example, CT imaging) is acquired and a 3D segmentation is created. Then registration is performed, which refers to the correct alignment of the coordinate systems of the virtual model and the true patient anatomy. Registration consists of two steps: initial registration and dynamic alignment. Initial registration refers to the initial alignment of the preoperative 3D model to the intraoperative anatomy. Dynamic alignment refers to constantly updating the alignment to account for intraoperative changes such as manipulation by the surgeon. Registration is performed using a transformation matrix, which includes translation (movement in the x, y, and z directions), rotation, and scaling. This registration can be based on anatomical landmarks, fiducials, or surfaces. Anatomical landmark-based registration calculates the transformation matrix between a paired set of anatomical landmark points, for example on the ribs or sternum. The use of anatomical landmarks in pulmonary surgery can be difficult because of the altered anatomical distribution due to intraoperative collapse of the lung. Fiducial-based registration implies the use of artificial landmarks or fiducial markers that are attached to the patient for registration. The placement of these fiducials carries a risk of tissue damage and could complicate the workflow. Finally, surface-based registration is based on minimizing the distance between two 3D surfaces. In surface-based registration, a point cloud is created by finding corresponding features in image pairs and calculating the disparity between the two; the disparity can be used to calculate the depth of the object in the image. Subsequently, the surface of the point cloud is registered to the surface of the preoperative 3D model. In surface-based registration, insufficient detection of features, poor texture appearance, and partial visibility are challenges to be tackled. After registration, AR visualization is performed using one of several visualization methods, such as a transparent overlay of the 3D model onto the intraoperative view.
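As a concrete illustration of the landmark- and fiducial-based registration described above, the following minimal sketch estimates the rigid transformation between a paired set of points using the standard SVD-based (Kabsch) solution; the coordinates are made up for the example and do not originate from any included study.

```python
# Minimal sketch of paired-point rigid registration (Kabsch/Umeyama),
# the core computation behind anatomical-landmark and fiducial-based
# initial registration. All coordinates below are illustrative.
import numpy as np

def paired_point_registration(P, Q):
    """Least-squares rotation R and translation t mapping points P onto Q.

    P, Q: (N, 3) arrays of corresponding pre- and intraoperative landmarks.
    """
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)                # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Illustrative landmark pairs (eg, points on vascular bifurcations):
P = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
t_true = np.array([5.0, -2.0, 3.0])
Q = P + t_true                                        # known ground truth
R, t = paired_point_registration(P, Q)
print(np.allclose(P @ R.T + t, Q))                    # True
```

For the surface-based variant, the stereo depth mentioned in the caption follows from disparity as depth = focal length × baseline / disparity, after which the resulting point cloud is aligned to the model surface (see the surface-registration sketch in the Discussion).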
Aim
The purpose of this review is to systematically assess current applications of AR within minimally invasive surgery of deformable (non-rigid) organs, focusing on the employed registration, dynamic tracking, and visualization methods. The objective is to explore the adaptability of these insights to the specific requirements of a dedicated pulmonary AR-workflow. Through this approach, we aim to identify the optimal combination of technologies, considering both the complexity of pulmonary surgery and the technical possibilities of AR applications.
Methods
Search Strategy
A systematic search was conducted on May 5, 2023, and updated on April 16, 2024, in three databases: Embase, Medline (Ovid), and Web of Science. The results from the three database searches were combined. For this systematic review, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. The search was based on three characteristics: the use of extended reality (AR or MR), minimally invasive intraoperative application on deformable organs, and an image-guided or navigational purpose. The search terms are included in the appendix (Appendix A). Articles involving animals or humans (including cadaveric material or in vivo clinical data) were included if the eventual purpose was human use. The search was restricted to articles written in English.
Exclusion of articles was performed along the following criteria: 1. Review articles; 2. Registration/Navigation based on bony structures; 3. Training/Simulation/Educational purpose; 4. No application of AR/MR; 5. Comment/Erratum to article; 6. User-experience; 7. Non intraoperative application; 8. Open surgery; 9. Ultrasound as imaging modality; 10. No information of the (XR) technology/application; and 11. No full text available.
Article selection was performed by two authors (JP, MD). First, articles were excluded based on eligibility of the title and abstract, according to selection criteria predefined by all authors. Additional full-text analysis was performed on articles for which uncertainty about in- or exclusion remained. Any remaining uncertainty in the study selection process was resolved by consensus between the two authors (JP, MD).
Data Extraction and Quality Assessment
Data extraction consisted of author, surgical discipline, main topic, preoperative imaging modality, intraoperative imaging modality, initial registration, tracking registration, registration method, type of research, sample type, sample size, and main outcome measures. For further analysis the included studies were sub-grouped according to the AR workflow, considering the type and method of initial registration and dynamic tracking.
Depending on the study design, the following scales were used for quality assessment: the Newcastle-Ottawa scale for case-control studies, the Joanna Briggs Institute checklist for case reports and case series, and the QUACS (Quality Appraisal for Cadaveric Studies) scale for pre-clinical animal studies.
Results
Systematic Search
The initial search yielded 2081 records, of which 1101 were duplicates. After title and abstract analysis, 847 articles were excluded, leaving 133 articles for full-text assessment. Eventually, 33 articles were found eligible and were included for further evaluation in the systematic review, as shown in the PRISMA flowchart (Figure 2).

Figure 2
PRISMA flowchart of the included and excluded studies. AR – Augmented Reality, MR – Mixed Reality, US – Ultrasound.
Study Characteristics
The articles included in this systematic review were published between 2009 and 2024 (Figure 3A). All study characteristics are summarized in Supplemental Table S1. The majority of the included studies investigated urological surgical procedures (45%, n = 15), concerning prostatectomy or (partial) nephrectomy, followed by gastroenterological (39%, n = 13), gynecological (9%, n = 3), general laparoscopic (3%, n = 1), and thoracic (3%, n = 1) surgical applications (Figure 3B). Of the reviewed studies, 76% (n = 25) used the AR technology intraoperatively in patients; 18 of these (55% of all included studies) included fewer than 10 subjects. Furthermore, some studies performed experiments using in- or ex-vivo animal material (18%, n = 6).

Figure 3
(A) Number of publications on AR for minimally invasive deformable organ surgery per year according to our search strategy, between 2009 and 2024, and (B) per surgical specialty.
3D Reconstruction
As presented in Supplemental Table S1, 31 (94%) of the included articles used CT or magnetic resonance imaging (MRI) to construct a 3D model based on preoperative imaging data. Intraoperative imaging was required in 12% of the studies (n = 4). Subsequently, to identify and extract organs and structures of interest, segmentation was performed. Overall, segmentation was most often accomplished manually or semi-automatically, using software such as 3D Slicer, ITK-SNAP, Blender, or Meshmixer. After segmentation, structures were reconstructed through conversion to a 3D model, for example using triangle meshes. In one study, fully automatic segmentation and reconstruction was pursued through the holographic Visual3D modelling software.
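To illustrate the reconstruction step, the snippet below converts a synthetic thresholded CT volume into a triangle mesh via marching cubes, loosely analogous to what tools such as 3D Slicer or ITK-SNAP produce; the Hounsfield threshold and volume are assumptions made for the example.

```python
# Hedged sketch: isosurface extraction from a CT volume into a triangle
# mesh. The HU threshold and synthetic volume are illustrative only.
import numpy as np
from skimage import measure

def reconstruct_mesh(ct_volume, threshold_hu=-300.0, voxel_spacing=(1.0, 1.0, 1.0)):
    """Extract a triangle mesh (vertices, faces, normals) from a CT volume."""
    verts, faces, normals, _ = measure.marching_cubes(
        ct_volume, level=threshold_hu, spacing=voxel_spacing
    )
    return verts, faces, normals      # vertices in mm if spacing is in mm

# Synthetic example: a sphere of soft tissue (~40 HU) surrounded by air (~-1000 HU)
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
volume = np.where(z**2 + y**2 + x**2 < 20**2, 40.0, -1000.0)
verts, faces, normals = reconstruct_mesh(volume)
print(verts.shape, faces.shape)
```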
Registration
Articles were sorted according to initial registration and dynamic tracking. The three classified types of registration are surface-based, anatomical landmark-based, and fiducial-based registration. Surface-based registration was applied in 16 studies (48%), anatomical landmark-based registration in 12 (36%), and fiducial-based registration in only 5 studies (15%). An overview of the pros and cons of the different registration techniques is shown in Table 1. For initial registration, 52% (n = 17) of the studies performed manual registration, 9% (n = 3) semi-automatic, and 39% (n = 13) fully automatic (Table 2). For dynamic tracking, 36% (n = 12) of the studies used manual, 18% (n = 6) semi-automatic, and 45% (n = 15) fully automatic methods. Manual registration for initial registration, tracking, or both was performed in 88% (n = 29) of all studies. Complete manual registration was performed in 8 studies (24%).
Outcome Measures
The included studies investigated varying outcome measures. Eleven (33%) studies based the evaluation of their results on surgical outcomes such as operative time, blood loss, hospitalization time, and postoperative complications. Other studies assessed registration time (n = 7, 21%), resection margin accuracy (n = 3, 9%), robustness to noise or occlusion of view (n = 2, 6%), calibration accuracy (n = 1, 3%), identification precision (n = 1, 3%), and spatial perception (n = 1, 3%). Furthermore, 36% (n = 12) of the articles evaluated registration accuracy (Table 3), using varying metrics. In general, registration accuracy refers to the distance error between the aligned target and the real object. Target registration error (TRE) is used for different types of registration methods, whilst fiducial registration error (FRE) refers to fiducial-based methods and surface registration error (SRE) to surface-based applications. Root mean square distance (RMSD) was measured in two of the 12 (17%) studies assessing accuracy. A large proportion of the studies evaluating accuracy measured TRE (n = 5, 42%). A registration accuracy below 5 mm was achieved by eight of the 12 (67%) studies, and three articles (25%) presented submillimeter accuracy.
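For reference, these accuracy metrics reduce to simple point-distance computations. The minimal sketch below (with placeholder coordinates) shows how a per-target TRE and an RMSD are typically computed; it is a generic illustration, not the evaluation code of any included study.

```python
# Generic accuracy metrics: target registration error (TRE) per target and
# root mean square distance (RMSD) over corresponding points, both in mm.
import numpy as np

def target_registration_error(targets_true, targets_registered):
    """Euclidean error per target; the mean of this vector is the mean TRE."""
    return np.linalg.norm(targets_true - targets_registered, axis=1)

def rmsd(points_a, points_b):
    """Root mean square distance between corresponding point sets."""
    return np.sqrt(np.mean(np.sum((points_a - points_b) ** 2, axis=1)))

true = np.array([[0.0, 0, 0], [50.0, 20, 10]])       # placeholder targets
reg  = np.array([[1.0, 0, 0], [50.0, 22, 10]])       # after registration
print(target_registration_error(true, reg))          # [1. 2.]
print(rmsd(true, reg))                                # ~1.58
```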
Quality Assessment
Quality assessment was performed for 17 (52%) studies evaluating human patients or animal specimens. Of the assessed studies, two (6%) included animal specimens for pre-clinical research. Among the clinical studies, three (9%) case reports were identified, with an average quality score of 88% (range 81-94%), six (18%) case series, with an average score of 83% (range 70-95%), and six (18%) prospective case-control studies, with an average score of 98%. The pre-clinical studies were rated 77% and 85%. Due to diversity in evaluation metrics, the individual studies could not be compared reliably. In addition, due to heterogeneity of surgical specialty, study type, sample type, and sample size, no valid comparisons could be performed.
Discussion
This systematic review provides an overview of the current literature on the application of AR for MIS of deformable organs and the underlying registration, tracking, and visualization methods. Across the included studies, a considerable amount of diverse research has been conducted on the use of AR. However, most of these studies are early stage, and no large-scale randomized controlled trials have been conducted yet. In addition, the studies cannot be compared reliably, and no straightforward correlation can be established between the accuracies of the different registration methods, due to considerable heterogeneity in registration approaches, applications, and evaluation metrics. Finally, the different AR methods are often custom-tailored to a specific application and vary from entirely manual support to fully automated systems, using either fiducial, anatomical landmark, or surface-based registration methods. In the next sections, we briefly discuss and highlight our understanding of the current state-of-the-art techniques for the eventual realization of AR for MIS of deformable organs and their translatability to pulmonary surgery.
To obtain patient-specific 3D anatomical information, segmentation of preoperative imaging can be performed. Next, these models can be leveraged for preoperative planning and intraoperative guidance during MIS procedures. While segmentation was traditionally performed manually or semi-automatically, recent advances in artificial intelligence (AI) have introduced automated algorithms for this purpose. Furthermore, biomechanical properties of deformable organs can be considered to simulate some extent of intraoperative deformation. Additionally, deformable models are being developed to enable dynamic simulations of surgical procedures. Moreover, in the context of MIS, it is important to consider that deformable organs are also subject to plastic deformation due to intraoperative traction and dissection. The next phase of development will involve integrating this plastic deformation into the dynamic models, ensuring the coherence of the intraoperative 3D model with the real-time changing anatomy. These deformable models are a promising innovation, potentially facilitating the replication of intraoperative deformations for AR visualization.
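As a simplified stand-in for such deformable models, a non-rigid warp can be represented by a smooth displacement field fitted to a handful of control points. The sketch below uses a thin-plate-spline interpolant for this purpose; the control points are illustrative rather than patient data, and a biomechanically realistic model would go well beyond this.

```python
# Hedged sketch of a non-rigid warp: a thin-plate-spline displacement field
# fitted to control points, then applied to arbitrary model vertices.
import numpy as np
from scipy.interpolate import RBFInterpolator

# Preoperative control points and their observed intraoperative positions
src = np.array([[0.0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50], [25, 25, 25]])
dst = src + np.array([[0.0, 0, 0], [2, 0, 0], [0, 3, 0], [0, 0, -2], [1, 1, 1]])

# Fit one smooth displacement field over all three coordinates at once
warp = RBFInterpolator(src, dst - src, kernel="thin_plate_spline")

def deform(points):
    """Apply the fitted deformation field to model vertices."""
    return points + warp(points)

model_vertices = np.array([[10.0, 10, 10], [40.0, 5, 5]])
print(deform(model_vertices))    # vertices displaced by the fitted field
```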
Besides the creation of (dynamic/deformable) 3D models, registration is a critical step to enable AR-based visualization during MIS (Figure 1). Unfortunately, to date, no automatic and robust solution for dynamic and deformable AR organ registration is available for routine clinical use. In this systematic review, we identified three different methods that have been explored previously. Fiducial-based registration, using various multi-modality artificial landmarks, eg, infrared reflection or fluorescence, appears least promising for registration of deformable organs. This is attributed to potential registration inaccuracies caused by the fluctuating relationship between the fiducial and the target deformable organ, insufficient attachment, and the invasive nature of fiducial placement.
The second potentially interesting method for registration is anatomical landmark registration. Varying anatomical landmark techniques can be employed, based on organ ridges and shapes or vascular structures. While the use of anatomical landmarks is intuitive and non-invasive, it is also potentially unreliable. Anatomical landmark registration is fully dependent on optical perception, is highly influenced by adequate visibility of the structures concerned, and often requires manual intervention, which may explain the frequent absence of registration accuracy measurements. Nevertheless, AR based on manual anatomical landmark registration was associated with improved surgical outcomes for both prostatectomy and nephrectomy procedures. In pulmonary resections, the identification of the origin and bifurcations of the broncho-vascular anatomy is most crucial for achieving an adequate anatomic resection, surpassing the significance of focusing purely on the lung parenchyma. This parallels the importance of understanding renal hilar and biliary-vascular anatomy in nephrectomy and hepatectomy procedures. Therefore, the "vascular bifurcation labelling" method, considering bifurcations as anatomical landmarks, may prove particularly relevant for AR registration of these deformable organs, because the bifurcation landmarks remain relatively stable in deformable organ procedures.
Finally, surface-based registration can be employed. Surface-based registration requires pre- and intraoperative point clouds of the organ surface, obtained through various methods, providing 3D shape information. To detect and match these surface point clouds, various feature mapping algorithms have been used. The main challenge for surface-based registration remains partial-to-whole registration, as the intraoperative surface is often only a fraction of the complete preoperative model. Most of the time, the intraoperative surface reconstruction is incomplete, since only a small surface region of the organ can be reconstructed when the laparoscope is close to the target. Nonetheless, surface-based approaches demonstrated the most promising registration accuracies, holding significant potential for the marker-less alignment of a 3D stereoscopic scene with a preoperative 3D model.
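One common way to cope with this partial-to-whole problem is a trimmed variant of iterative closest point (ICP), which keeps only the best-matching fraction of correspondences at each iteration. The following minimal sketch illustrates that idea under simplifying (rigid) assumptions; it is a generic implementation, not the algorithm of any specific included study.

```python
# Minimal trimmed-ICP sketch: rigidly align a partial intraoperative point
# cloud to a dense preoperative model surface, discarding the worst 20% of
# correspondences per iteration to tolerate partial overlap.
import numpy as np
from scipy.spatial import cKDTree

def trimmed_icp(partial, model, iters=50, keep=0.8):
    """Return rotation R and translation t mapping `partial` onto `model`."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(model)
    src = partial.copy()
    for _ in range(iters):
        dist, idx = tree.query(src)                        # closest model points
        order = np.argsort(dist)[: int(keep * len(src))]   # trim outliers
        P, Q = src[order], model[idx[order]]
        p_m, q_m = P.mean(axis=0), Q.mean(axis=0)
        U, _, Vt = np.linalg.svd((P - p_m).T @ (Q - q_m))
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ D @ U.T                            # incremental rotation
        t_step = q_m - R_step @ p_m
        src = src @ R_step.T + t_step                      # update source cloud
        R, t = R_step @ R, R_step @ t + t_step             # accumulate transform
    return R, t
```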
After (advanced) 3D modeling and registration, the next step is dynamic alignment and tracking. In general, the main challenge of AR is the discrepancy between preoperative imaging and the intraoperative situation, due to continuous organ tissue deformation. Across the included studies, several (semi-)automatic tracking methods were explored to continuously update the registration in response to intraoperative changes. These tracking methods can be unreliable when organs are moving and deforming, surface textures are repetitive, or the view is occluded or blurred, as is often the case in deformable organ MIS. In most included articles, some type of manual intervention was required for initial registration and tracking. This may be attributed to the absence of cross-modality similarities between the preoperative and intraoperative situations, complicating automation. Manual registration often requires the use of a 3D mouse and is highly dependent on the surgeon's or remote assistant's anatomical and technical expertise. As a result, the accuracy of manual registration and tracking methods is confined to a visual level and is subject to large inter- and intra-user variability. Consequently, several studies applied semi-automatic registration methods, which combine automatic initial registration with manual correction of misalignments or errors, or vice versa. Fully automatic frameworks, as proposed by Zhang et al, have investigated marker-less non-rigid registration and are thus pioneers for future fully automatic and deformable AR systems.
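To illustrate what a basic automatic tracking component can look like, the sketch below tracks 2D feature points between consecutive endoscopic frames with pyramidal Lucas-Kanade optical flow. The video path is a placeholder, and a clinical system would additionally need to detect the failure modes noted above (occlusion, blur, deformation, repetitive texture).

```python
# Hedged sketch: sparse feature tracking between two endoscopic frames,
# one possible building block for dynamic overlay updates. The video file
# name is a placeholder.
import cv2

def track_features(prev_gray, next_gray, prev_pts):
    """Track 2D feature points from one frame to the next; drop failures."""
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_pts, None, winSize=(21, 21), maxLevel=3
    )
    ok = status.ravel() == 1
    return prev_pts[ok], next_pts[ok]

cap = cv2.VideoCapture("endoscope.mp4")            # placeholder source
ok1, frame1 = cap.read()
ok2, frame2 = cap.read()
if ok1 and ok2:
    g1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(frame2, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(g1, maxCorners=200, qualityLevel=0.01, minDistance=7)
    old_pts, new_pts = track_features(g1, g2, pts)
    # The 2D motion (old_pts -> new_pts) can drive an update of the overlay pose.
```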
Next to the technical challenges of registration and dynamic tracking, validation of the approaches and methods remains a critical step in the deployment of any workflow for routine clinical practice. Most of the studies included in this review validated their registration and tracking methods on short video sequences. Meanwhile, the overarching goal remains to achieve real-time dynamic augmentation, prompting numerous studies to investigate algorithms for longer-duration sequences. However, it is crucial to critically assess the necessity of continuous real-time intraoperative overlays throughout the entire MIS procedure. Given that the added value of AR navigation lies in the precise identification of anatomical structures, which is mostly important during specific surgical phases, a continuous real-time overlay may not even be necessary.

Among the included studies, the quality of the AR technology was often assessed in terms of registration accuracy, ranging from submillimeter to >5 mm. However, throughout the studies it remains unclear what level of registration accuracy is necessary for safe implementation in a clinical setting. The required level of accuracy need not be submillimeter, as it can vary depending on the specific procedure and the intended purpose of the AR navigation. For instance, lower registration accuracy might be acceptable for identifying visible vascular branches than for localizing a tumor concealed within the parenchyma. Furthermore, the primary focus is currently on technical aspects, such as registration accuracy and error. However, it is of greater importance to also incorporate clinical validation methods before implementing AR technology in clinical practice.

Finally, following the registration of preoperative models onto the surgical field, it is crucial to ensure the appropriate visualization of both. While most studies use semi-transparent overlays, research focused on AR visualization methods is limited, and no consensus on the best visualization method has been established. Nevertheless, the TilePro™ application (Intuitive Surgical, California, USA) enables intraoperative display of 3D virtual overlays directly within the surgical console. In AR visualization, an ongoing challenge is the establishment of realistic depth perception and the occlusion of surgical instruments by the AR overlay. Current research is addressing these issues by focusing on real-time automated instrument delineation and luminance, adding a sense of depth to the 3D model-tissue interaction. While these two studies offered interesting insights into current AR visualization methods, they were excluded from this review due to a lack of information on the registration technique. Simultaneously, considerable research is being conducted on other approaches aimed at improving intraoperative guidance and navigation, including (automatic) intraoperative anatomy recognition and surgical phase recognition. Whether these methods can facilitate or contribute to the application of AR during MIS remains to be investigated.
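For completeness, the semi-transparent overlay used by most studies amounts to simple alpha blending of a rendered 3D model onto the video frame, as in the brief sketch below; the frame and render are placeholders and must share the same resolution.

```python
# Hedged sketch: semi-transparent AR overlay by alpha blending. `frame` is
# the laparoscopic image and `model_render` the rendered 3D model, both
# HxWx3 uint8 images of identical size (placeholders here).
import cv2

def blend_overlay(frame, model_render, alpha=0.4):
    """Blend the rendered model onto the surgical frame with opacity alpha."""
    return cv2.addWeighted(model_render, alpha, frame, 1.0 - alpha, 0.0)
```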
Future Potential and Outlook for AR Applications in Pulmonary Surgery
It is evident that the extent of deformation varies significantly among different deformable organs, and methods remain relatively organ-specific. The phenomenon of intraoperatively induced pneumothorax during pulmonary MIS imposes additional challenges compared with other deformable organs, as the intraoperative anatomy does not directly translate to the preoperatively created 3D reconstruction, impacting intraoperative orientation and correct structure identification. Therefore, compensation for lung collapse may be crucial for accurate AR-based guidance in pulmonary surgery. Several studies are investigating this intraoperative deformation by simulating the deformation that the lung experiences during a pneumothorax, often utilizing intraoperative cone-beam CT. However, research on this topic remains limited due to minimal usage, a lack of clinical datasets, and the lower accuracy of intraoperative CT of the lungs resulting from atelectasis and bronchial collapse. Furthermore, the actual impact of this deformation on the accuracy of the AR overlay requires investigation as well. Anticipating future advancements, the incorporation of a 3D deformable lung model into AR technologies promises intriguing applications. This integration would allow realistic intraoperative simulations, offering a dynamic portrayal of pulmonary manipulations across varying orientations, such as an interlobar view of the fissure.
Given the potentially clinically applicable registration accuracy of surface-based registration in nephrectomy procedures, surface-based registration emerges as a promising and relevant method for further exploration. In addition, exploring a vascular bifurcation-based method is highly intriguing, given that (broncho-)vascular bifurcations are critical landmarks for pulmonary resections and maintain relatively stable relationships with each other. Using these methods, the eventual goal is an AR system that can automatically and accurately achieve intraoperative 3D augmentation, even during rapid motion, occlusion of view, and pulmonary deformation. Additionally, the applied AR technology should be user-friendly, should not significantly delay or interrupt a procedure, and should ideally even improve the intraoperative workflow. Achieving these goals can contribute greatly to the future of pulmonary surgery, where AR can provide enhanced visibility while navigating the complex and highly vulnerable anatomy of the lung, offering guidance and confidence in critical surgical decisions.
To accomplish this, it is necessary to validate the time-saving impact of AR navigation, by assisting surgeons in the localization of specific anatomical structures. The enhanced anatomical insight and confirmation of the surgeon's anatomical knowledge may improve patient safety by minimizing unnecessary exploration, potentially reducing blood loss and postoperative air leaks. Upon validation of these aspects, a phased integration of AR technology into the clinical workflow can be contemplated. Additionally, the objective is to demonstrate that the enhanced intraoperative anatomical insight provided by AR navigation has the potential to expedite the learning curve of pulmonary RATS/VATS procedures for novice surgeons. Anatomical AR-based 3D overlays can therefore potentially contribute to the education and training of novice surgeons, residents, and medical students when integrated early in their training.
Conclusion
Increasing interest in, and development of, AR systems and the underlying registration and tracking methods for MIS of deformable organs is observed. Although AR has not been implemented in large patient studies, current research shows great potential to facilitate and improve the intraoperative use of AR across a wide range of surgical disciplines. With regard to the specific pulmonary requirements, surface-based registration combined with anatomical vascular labeling was identified as the most promising and applicable method for pulmonary application, as the parenchyma is highly deformable and the broncho-vascular and tumor relationships are the most important for intraoperative navigation (Table 1). These methods should be explored and adapted further for pulmonary MIS purposes. Moreover, before clinical AR implementation during RATS/VATS pulmonary procedures, the potential added value during pulmonary procedures has to be assessed, and a clinically feasible registration accuracy should be investigated and defined.
Acknowledgements
The authors wish to thank S.T.G. Meertens-Gunput and W. Bramer from the Erasmus MC Medical Library for developing and updating the search strategies.
Author Contributions MD: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Visualization, Writing-original draft; JP: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Visualization, Writing-original draft, AM: Supervision, Writing – review & editing, JR: Supervision, Writing – review & editing, PB: Supervision, Writing – review & editing; BC: Conceptualization, Methodology, Supervision, Writing – review & editing, EM: Supervision, Writing – review & editing, AS: Conceptualization, Methodology, Supervision, Visualization, Writing – review & editing; JK: Supervision, Writing – review & editing.
Declaration of Conflicting Interests The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding The author(s) received no financial support for the research, authorship, and/or publication of this article.
Supplemental Material Supplemental material for this article is available online.
References
- 1. Tokuno J, Chen-Yoshikawa TF, Nakao M, et al. Creation of a video library for education and virtual simulation of anatomical lung resection. Interact Cardiovasc Thorac Surg. 2022;34:808–813.
- 2. Lim E, Batchelor T, Shackcloth M, et al. Study protocol for VIdeo assisted thoracoscopic lobectomy versus conventional Open LobEcTomy for lung cancer, a UK multicentre randomised controlled trial with an internal pilot (the VIOLET study). BMJ Open. 2019;9:e029507. doi:.
- 3. Kim D, Woo W, Shin JI, Lee S. The uncomfortable truth: open thoracotomy versus minimally invasive surgery in lung cancer: a systematic review and meta-analysis. Cancers (Basel). 2023;15:2630. doi:.
- 4. Bakhuis W, Sadeghi AH, Moes I, et al. Essential surgical plan modifications after virtual reality planning in 50 consecutive segmentectomies. Ann Thorac Surg. 2022;115:1247–1255. doi:.
- 5. den Boer RB, de Jongh C, Huijbers WTE, et al. Computer-aided anatomy recognition in intrathoracic and -abdominal surgery: a systematic review. Surg Endosc. 2022;36:8737–8752. doi:.
- 6. Tokuno J, Chen-Yoshikawa TF, Nakao M, Matsuda T, Date H. Resection Process Map: a novel dynamic simulation system for pulmonary resection. J Thorac Cardiovasc Surg. 2020;159:1130–1138.
- 7. Li C, Zheng B, Yu Q, Yang B, Liang C, Liu Y. Augmented reality and 3-dimensional printing technologies for guiding complex thoracoscopic surgery. Ann Thorac Surg. 2021;112:1624–1631.
- 8. Sadeghi AH, Mathari SE, Abjigitova D, et al. Current and future applications of virtual, augmented, and mixed reality in cardiothoracic surgery. Ann Thorac Surg. 2022;113:681–691.
- 9. Vervoorn MT, Wulfse M, Mohamed Hoesein FAA, Stellingwerf M, van der Kaaij NP, de Heer LM. Application of three-dimensional computed tomography imaging and reconstructive techniques in lung surgery: a mini-review. Front Surg. 2022;9:1079857. doi:.
- 10. Dho YS, Park SJ, Choi H, et al. Development of an inside-out augmented reality technique for neurosurgical navigation. Neurosurg Focus. 2021;51:E21.
- 11. Andrews C, Southworth MK, Silva JNA, Silva JR. Extended reality in medical practice. Curr Treat Options Cardiovasc Med. 2019;21:18.
- 12. Bol Raap G, Koning AH, Scohy TV, et al. Virtual reality 3D echocardiography in the assessment of tricuspid valve function after surgical closure of ventricular septal defect. Cardiovasc Ultrasound. 2007;5:8.
- 13. Sadeghi AH, Taverne Y, Bogers A, Mahtab EAF. Immersive virtual reality surgical planning of minimally invasive coronary artery bypass for Kawasaki disease. Eur Heart J. 2020;41:3279.
- 14. Nam HH, Herz C, Lasso A, et al. Simulation of transcatheter atrial and ventricular septal defect device closure within three-dimensional echocardiography-derived heart models on screen and in virtual reality. J Am Soc Echocardiogr. 2020;33:641–644.e2.
- 15. Verhey JT, Haglin JM, Verhey EM, Hartigan DE. Virtual, augmented, and mixed reality applications in orthopedic surgery. Int J Med Robot. 2020;16:e2067.
- 16. Zhang X, Wang J, Wang T, et al. A markerless automatic deformable registration framework for augmented reality navigation of laparoscopy partial nephrectomy. Int J Comput Assist Radiol Surg. 2019;14:1285–1294.
- 17. Rad AA, Vardanyan R, Lopuszko A, et al. Virtual and augmented reality in cardiac surgery. Braz J Cardiovasc Surg. 2022;37:123–127.
- 18. Ma L, Huang T, Wang J, Liao H. Visualization, registration and tracking techniques for augmented reality guided surgery: a review. Phys Med Biol. 2023;68. doi:.
- 19. Collins T, Pizarro D, Gasparini S, et al. Augmented reality guided laparoscopic surgery of the uterus. IEEE Trans Med Imaging. 2021;40:371–380.
- 20. Su LM, Vagvolgyi BP, Agarwal R, Reiley CE, Taylor RH, Hager GD. Augmented reality during robot-assisted laparoscopic partial nephrectomy: toward real-time 3D-CT to stereoscopic video registration. Urology. 2009;73:896–900.
- 21. Cannizzaro D, Zaed I, Safa A, et al. Augmented reality in neurosurgery, state of art and future projections. a systematic review. Front Surg. 2022;9:864792.
- 22. Rose AS, Kim H, Fuchs H, Frahm JM. Development of augmented-reality applications in otolaryngology-head and neck surgery. Laryngoscope. 2019;129(Suppl 3):S1–S11.
- 23. Joeres F, Heinrich F, Schott D, Hansen C. Towards natural 3D interaction for laparoscopic augmented reality registration. Comput Methods Biomech Biomed Eng Imaging and Visualization. 2020;9:384–391.
- 24. Ieiri S, Uemura M, Konishi K, et al. Augmented reality navigation system for laparoscopic splenectomy in children based on preoperative CT image using optical tracking device. Pediatr Surg Int. 2012;28:341–346.
- 25. Teber D, Guven S, Simpfendörfer T, et al. Augmented reality: a new tool to improve surgical accuracy during laparoscopic partial nephrectomy? Preliminary in vitro and in vivo results. Eur Urol. 2009;56:332–338.
- 26. Lerotic M, Chung AJ, Mylonas G, Yang GZ. Pq-space based non-photorealistic rendering for augmented reality. Med Image Comput Comput Assist Interv. 2007;10:102–109.
- 27. Shamseer L, Moher D, Clarke M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015;350:g7647. doi:.
- 28. Wells GA, Shea B, O'Connell D, Peterson J, Welch V, Losos M, Tugwell P. The Newcastle-Ottawa Scale (NOS) for assessing the quality of nonrandomised studies in meta-analyses. 2021. https://www.ohri.ca/programs/clinical_epidemiology/oxford.asp
- 29. Moola S, Munn Z, Tufanaru C, Aromataris E, Sears K, Sfetcu R, Currie M, Qureshi R, Mattis P, Lisy K, Mu P-F. Chapter 7: Systematic reviews of etiology and risk. In: Aromataris E, Munn Z, eds. Joanna Briggs Institute Reviewer's Manual; 2017.
- 30. Wilke J, Krause F, Niederer D, et al. Appraising the methodological quality of cadaveric studies: validation of the QUACS scale. J Anat. 2015;226:440–446. doi:.
- 31. Wang R, Geng Z, Zhang ZX, Pei R, Meng X. Autostereoscopic augmented reality visualization for depth perception in endoscopic surgery. Displays. 2017;48:50–60.
- 32. Zhang X, Wang T, Zhang X, Zhang Y, Wang J. Assessment and application of the coherent point drift algorithm to augmented reality surgical navigation for laparoscopic partial nephrectomy. Int J Comput Assist Radiol Surg. 2020;15:989–999.
- 33. Adballah M, Espinel Y, Calvet L, et al. Augmented reality in laparoscopic liver resection evaluated on an ex-vivo animal model with pseudo-tumours. Surg Endosc. 2022;36:833–843.
- 34. Puerto-Souza GA, Cadeddu JA, Mariottini GL. Toward long-term and accurate augmented-reality for monocular endoscopic videos. IEEE Trans Biomed Eng. 2014;61:2609–2620.
- 35. Singla R, Edgcumbe P, Pratt P, Nguan C, Rohling R. Intra-operative ultrasound-based augmented reality guidance for laparoscopic surgery. Healthc Technol Lett. 2017;4:204–209.
- 36. Chen L, Tang W, John NW. Real-time geometry-aware augmented reality in minimally invasive surgery. Healthc Technol Lett. 2017;4:163–167.
- 37. Schiavina R, Bianchi L, Lodi S, et al. Real-time augmented reality three-dimensional guided robotic radical prostatectomy: preliminary experience and evaluation of the impact on surgical planning. Eur Urol Focus. 2021;7:1260–1267.
- 38. Li G, Dong J, Wang J, et al. The clinical application value of mixed-reality-assisted surgical navigation for laparoscopic nephrectomy. Cancer Med. 2020;9:5480–5489.
- 39. Porpiglia F, Checcucci E, Amparore D, et al. Three-dimensional elastic augmented-reality robot-assisted radical prostatectomy using hyperaccuracy three-dimensional reconstruction technology: a step further in the identification of capsular involvement. Eur Urol. 2019;76:505–514.
- 40. Porpiglia F, Checcucci E, Amparore D, et al. Three-dimensional augmented reality robot-assisted partial nephrectomy in case of complex tumours (PADUA ≥10): a new intraoperative tool overcoming the ultrasound guidance. Eur Urol. 2020;78:229–238.
- 41. Puliatti S, Sighinolfi MC, Rocco B, et al. First live case of augmented reality robot-assisted radical prostatectomy from 3D magnetic resonance imaging reconstruction integrated with PRECE model (Predicting Extracapsular extension of prostate cancer). Urol Video J. 2019;1:100002.
- 42. Bianchi L, Chessa F, Angiolini A, et al. The use of augmented reality to guide the intraoperative frozen section during robot-assisted radical prostatectomy. Eur Urol. 2021;80:480–488.
- 43. Paulus CJ, Haouchine N, Kong SH, Soares RV, Cazier D, Cotin S. Handling topological changes during elastic registration: application to augmented reality in laparoscopic surgery. Int J Comput Assist Radiol Surg. 2017;12:461–470. doi:.
- 44. Marzano E, Piardi T, Soler L, Diana M, Mutter D, Marescaux J, Pessaux P. Augmented reality-guided artery-first pancreatico-duodenectomy. J Gastrointest Surg. 2013;17:1980–1983. doi:.
- 45. Tao HS, Lin JY, Luo W, et al. Application of real-time augmented reality laparoscopic navigation in splenectomy for massive splenomegaly. World J Surg. 2021;45:2108–2115.
- 46. Zhu W, Zeng X, Hu H, et al. Perioperative and disease-free survival outcomes after hepatectomy for centrally located hepatocellular carcinoma guided by augmented reality and indocyanine green fluorescence imaging: a single-center experience. J Am Coll Surg 2023; 236: 328–337. doi:.
- 47. Wang Z, Tao H, Wang J, et al. Laparoscopic right hemi-hepatectomy plus total caudate lobectomy for perihilar cholangiocarcinoma via anterior approach with augmented reality navigation: a feasibility study. Surg Endosc. 2023;37:8156–8164. doi:.
- 48. Amparore D, Checcucci E, Piramide F, et al. Robotic vena cava thrombectomy with three-dimensional augmented reality guidance. Eur Urol Open Sci. 2024;62:43–46. doi:.
- 49. Wang D, Hu H, Zhang Y, et al. Efficacy of augmented reality combined with indocyanine green fluorescence imaging guided laparoscopic segmentectomy for hepatocellular carcinoma. J Am Coll Surg. 2024;238:321–330.
- 50. Bourdel N, Chauvet P, Calvet L, Magnin B, Bartoli A, Canis M. Use of augmented reality in gynecologic surgery to visualize adenomyomas. J Minim Invasive Gynecol. 2019;26:1177–1180.
- 51. Bourdel N, Collins T, Pizarro D, et al. Use of augmented reality in laparoscopic gynecology to visualize myomas. Fertil Steril. 2017;107:737–739.
- 52. Singh P, Alsadoon A, Prasad PWC, et al. A novel augmented reality to visualize the hidden organs and internal structure in surgeries. Int J Med Robot. 2020;16:e2055.
- 53. Conrad C, Fusaglia M, Peterhans M, Lu H, Weber S, Gayet B. Augmented reality navigation surgery facilitates laparoscopic rescue of failed portal vein embolization. J Am Coll Surg. 2016;223:e31–e34.
- 54. Chauvet P, Collins T, Debize C, et al. Augmented reality in a tumor resection model. Surg Endosc. 2018;32:1192–1201.
- 55. Wild E, Teber D, Schmid D, et al. Robust augmented reality guidance with fluorescent markers in laparoscopic surgery. Int J Comput Assist Radiol Surg. 2016;11:899–907.
- 56. Bernhardt S, Nicolau SA, Agnus V, Soler L, Doignon C, Marescaux J. Automatic localization of endoscope in intraoperative CT image: a simple approach to augmented reality guidance in laparoscopic surgery. Med Image Anal. 2016;30:130–143.
- 57. Gribaudo M, Piazzolla P, Porpiglia F, Vezzetti E, Violante MG. 3D augmentation of the surgical video stream: toward a modular approach. Comput Methods Programs Biomed. 2020;191:105505.
- 58. Luo HL, Yin DL, Zhang SG, et al. Augmented reality navigation for liver resection with a stereoscopic laparoscope. Comput Meth Programs Biomed. 2020;187:105099.
- 59. Schneider C, Thompson S, Totz J, et al. Comparison of manual and semi-automatic registration in augmented reality image-guided liver surgery: a clinical feasibility study. Surg Endosc. 2020;34:4702–4711.
- 60. Sadeghi AH, Maat A, Taverne Y, et al. Virtual reality and artificial intelligence for 3-dimensional planning of lung segmentectomies. JTCVS Tech. 2021;7:309–321.
- 61. Chen X, Wang Z, Qi Q, et al. A fully automated noncontrast CT 3-D reconstruction algorithm enabled accurate anatomical demonstration for lung segmentectomy. Thorac Cancer. 2022;13:795–803. doi:.
- 62. Yang G, Gu J, Chen Y, et al. Automatic kidney segmentation in CT images based on multi-atlas image registration. In: 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 26-30 Aug 2014; 2014:5538–5541.
- 63. Wang Y, Fang B, Pi J, Wu L, Wang PSP, Wang H. Automatic multi-scale segmentation of intrahepatic vessel in CT images for liver surgery planning. Int J Pattern Recognit Artif Intell. 2013;27:1357001.
- 64. Bakhuis W, Max SA, Nader M, et al. Video-assisted thoracic surgery S7 segmentectomy: use of virtual reality surgical planning and simulated reality intraoperative modelling. Multimed Man Cardiothorac Surg. 2023;2023. doi:.
- 65. Wild E, Teber D, Schmid D, Simpfendörfer T, Müller M, Baranski AC, Kenngott H, Kopka K, Maier-Hein L. Robust augmented reality guidance with fluorescent markers in laparoscopic surgery. Int J Comput Assist Radiol Surg. 2016;11:899–907. doi:.
- 66. Detmer FJ, Hettig J, Schindele D, Schostak M, Hansen C. Virtual and augmented reality systems for renal interventions: a systematic review. IEEE Rev Biomed Eng. 2017;10:78–94. doi:.
- 67. He J, Xu X. Thoracoscopic anatomic pulmonary resection. J Thorac Dis. 2012;4:520–547. doi:.
- 68. Bijlstra OD, Broersen A, Oosterveer TTM, et al. Integration of three-dimensional liver models in a multimodal image-guided robotic liver surgery cockpit. Life (Basel). 2022;12:667. doi:.
- 69. Mountney P, Fallert J, Nicolau S, Soler L, Mewes PW. An augmented reality framework for soft tissue surgery. Med Image Comput Comput Assist Interv. 2014;17:423–431.
- 70. Simpfendörfer T, Baumhauer M, Müller M, et al. Augmented reality visualization during laparoscopic radical prostatectomy. J Endourol. 2011;25:1841–1845.
- 71. Paulus CJ, Haouchine N, Kong SH, Soares RV, Cazier D, Cotin S. Handling topological changes during elastic registration: application to augmented reality in laparoscopic surgery. Int J Comput Assist Radiol Surg. 2017;12:461–470.
- 72. Samei G, Tsang K, Kesch C, et al. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med Image Anal. 2020;60:101588.
- 73. Reissis A, Yoo S, Clarkson MJ, Thompson S. The effect of luminance on depth perception in augmented reality guided laparoscopic surgery. Proc SPIE Int Soc Opt Eng. 2023;12466:12466.
- 74. De Backer P, Van Praet C, Simoens J, et al. Improving augmented reality through deep learning: real-time instrument delineation in robotic renal surgery. Eur Urol. 2023;84:86–91. doi:.
- 75. Garrow CR, Kowalewski K-F, Li L, et al. Machine learning for surgical phase recognition: a systematic review. Ann Surg. 2021;273:684–693. doi:.
- 76. Maekawa H, Nakao M, Mineura K, Chen-Yoshikawa TF, Matsuda T. Model-based registration for pneumothorax deformation analysis using intraoperative cone-beam CT images. Annu Int Conf IEEE Eng Med Biol Soc. 2020;2020:5818–5821. doi:.
- 77. Nakao M, Maekawa H, Mineura K, et al. Kernel-based modeling of pneumothorax deformation using intraoperative cone-beam CT images. SPIE. 2021.
- 78. Nakao M, Kobayashi K, Tokuno J, Chen-Yoshikawa T, Date H, Matsuda T. Deformation analysis of surface and bronchial structures in intraoperative pneumothorax using deformable mesh registration. Med Image Anal. 2021;73:102181. doi:.
- 79. Uneri A, Nithiananthan S, Schafer S, et al. Deformable registration of the inflated and deflated lung in cone-beam CT-guided thoracic surgery: initial investigation of a combined model- and image-driven approach. Med Phys. 2013;40:017501. doi:.
Appendix: Abbreviations
3D: Three dimensional
AR: Augmented reality
CBCT: Cone beam computed tomography
CT: Computed tomography
FoV: Field of view
FRE: Fiducial registration error
GE: Gastroenterology
HMD: Head mounted display
MIS: Minimally invasive surgery
MR: Mixed reality
MRI: Magnetic resonance imaging
RATS: Robotic assisted thoracic surgery
RMS: Root mean square
RMSD: Root mean square distance
SRE: Surface registration error
TRE: Target registration error
VATS: Video assisted thoracic surgery
VR: Virtual reality
XR: Extended reality


