Image guided navigation

The use of minimally invasive and flexible access surgery has imposed significant challenges on surgical navigation as the operator can no longer have direct access to the surgical site with unrestricted vision, force and tactile feedback. How to combine prior knowledge of the anatomical model with subject specific information derived from pre- and intra-operative imaging is a significant research challenge. Furthermore, effective surgical guidance is essential to the clinical application of dynamic active constraints for robotically assisted MIS and the development of new surgical tools for emerging flexible access surgical approaches such as Natural Orifice Transluminal Endoscopic Surgery (NOTES) and Single Incision Laparoscopic Surgery (SILS).

For image-guided surgery, another key requirement is the augmentation of the exposed surgical view with pre- or intra-operatively acquired images or 3D models. The inherent challenge is accurate registration of pre- and intra-operative data to the patient, especially for soft tissue where there is large-scale deformation. With the increasing use of intra-operative imaging techniques, our work has been focussed on the development of high-fidelity Augmented Reality (AR) techniques combined with real-time surgical vision and new robotic instruments for further enhancing the accuracy, safety, and consistency of MIS procedures.

Research themes

Surgical workflow analysis

Surgical episode segmentation based on vision, instrument usage and kinematics, and video-oculography (eye tracking); workflow recovery for intra-operative CAD (computer-aided decision support) and machine learning; analysis and modelling of surgical workflow with consideration of perceptual and cognitive factors, as well as team interaction, for detecting disorientation, hesitation and precursors to surgical errors.
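As a loose illustration of episode segmentation from instrument-usage signals, the sketch below decodes a discrete instrument stream into workflow phases with a small hidden Markov model and Viterbi decoding. The phase names, transition/emission probabilities and observation stream are all illustrative assumptions, not the group's published models.

```python
import numpy as np

# Hypothetical 3-phase workflow observed through a discrete
# "instrument in use" signal taking values 0, 1 or 2.
phases = ["dissection", "resection", "suturing"]

# Transition matrix: phases tend to persist and progress forward
# (illustrative values only).
A = np.array([[0.90, 0.09, 0.01],
              [0.01, 0.90, 0.09],
              [0.01, 0.01, 0.98]])

# Emission matrix: P(instrument | phase), again illustrative.
B = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])

pi = np.array([0.98, 0.01, 0.01])  # the procedure starts in phase 0

def viterbi(obs):
    """Most likely phase sequence for an instrument-usage stream."""
    T, N = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)   # scores[i, j]: i -> j
        back[t] = scores.argmax(axis=0)      # best predecessor of j
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):            # backtrack
        path.append(int(back[t, path[-1]]))
    return path[::-1]

obs = [0, 0, 0, 1, 0, 1, 1, 1, 2, 2, 1, 2, 2]
labels = [phases[s] for s in viterbi(obs)]
```

Because the self-transitions dominate, isolated observation glitches (the stray 0 and 1 mid-stream) are smoothed into coherent episodes rather than causing spurious phase changes.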

3D tissue deformation recovery

Real-time tissue deformation recovery based on computer vision, combining multiple depth cues with statistical shape models and linear/non-linear shape instantiation techniques. Image Constrained Biomechanical Models – inverse finite element modelling based on anatomical constraints derived from surface fiducials or intra-operative imaging data, incorporating tissue feature tracking, spatiotemporal registration, real-time finite element simulation, force recovery and AR visualisation for surgical guidance.
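A minimal sketch of linear shape instantiation from sparse surface fiducials, assuming a PCA-based statistical shape model. The synthetic training shapes, the number of retained modes and the fiducial selection matrix are all invented for illustration; a real system would build the model from segmented patient anatomy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: 20 "shapes", each 30 surface points in 3-D,
# flattened to 90-vectors (a stand-in for segmented organ meshes).
n_pts = 30
base = rng.normal(size=3 * n_pts)
train = np.stack([base + 0.3 * rng.normal(size=3 * n_pts)
                  for _ in range(20)])

# Statistical shape model: mean shape plus principal deformation modes.
mean = train.mean(axis=0)
U, S, Vt = np.linalg.svd(train - mean, full_matrices=False)
modes = Vt[:5]                           # keep the 5 dominant modes

# Selection matrix picking out the sparse tracked fiducials
# (here, the first 8 of the 30 surface points).
idx = np.arange(8)
sel = np.zeros((8 * 3, 3 * n_pts))
for k, i in enumerate(idx):
    sel[3 * k:3 * k + 3, 3 * i:3 * i + 3] = np.eye(3)

true_shape = train[0]
observed = sel @ true_shape              # sparse intra-operative measurements

# Linear instantiation: least-squares mode weights such that
# observed ~= sel @ (mean + modes.T @ w), then rebuild the full shape.
Aw = sel @ modes.T
w, *_ = np.linalg.lstsq(Aw, observed - sel @ mean, rcond=None)
recon = mean + modes.T @ w               # full instantiated shape
```

The key property is that a handful of tracked points constrains the whole surface, because plausible deformations are restricted to the low-dimensional mode subspace.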

Augmented reality visualisation

High-fidelity AR visualisation based on our patented ‘inverse realism’ technology and real-time instrument tracking for intra-operative imaging probes (e.g., pCLE, OCT and US).
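The general idea of keeping salient features of the live view visible through an overlay can be sketched crudely as gradient-weighted blending: the rendered model fades out wherever the surgical image has strong edges. This is an illustrative stand-in of my own devising, not the patented pq-space ‘inverse realism’ algorithm.

```python
import numpy as np

def overlay_with_edge_preservation(frame, model, alpha=0.6):
    """Blend a rendered model into the surgical view, letting strong
    image gradients (vessels, instrument edges) from the live frame
    show through.  frame, model: float arrays in [0, 1], shape (H, W, 3).
    alpha is the maximum overlay opacity (illustrative default)."""
    gray = frame.mean(axis=2)
    gy, gx = np.gradient(gray)                    # image gradients
    edges = np.sqrt(gx ** 2 + gy ** 2)
    edges = edges / (edges.max() + 1e-8)          # saliency in [0, 1]
    w = alpha * (1.0 - edges)[..., None]          # opacity drops at edges
    return (1.0 - w) * frame + w * model
```

With a featureless frame the overlay blends at uniform opacity `alpha`; as edge saliency rises, the live view dominates locally so depth and instrument cues are not occluded.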

Dynamic active constraints

Patient-specific model generation and adaptation, with real-time proximity queries and haptic rendering for prescribing dynamic active constraints during robotically assisted MIS.
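A toy version of the proximity-query-plus-haptic-rendering loop, assuming the anatomy is represented as a point-sampled surface and the constraint as a spring-like repulsion inside a safety margin. The gains, margin and units are illustrative only; a real implementation would query a mesh or distance field updated with tissue motion.

```python
import numpy as np

def constraint_force(tip, surface_pts, d_safe=5.0, k=0.8):
    """Proximity query against a point-sampled anatomical surface and a
    spring-like repulsive force once the tool tip enters the safety
    margin d_safe (mm).  Gain k and units are illustrative only."""
    diff = tip - surface_pts                 # vectors surface -> tip
    dists = np.linalg.norm(diff, axis=1)
    i = dists.argmin()                       # nearest surface sample
    d = dists[i]
    if d >= d_safe:
        return np.zeros(3), d                # outside margin: no force
    n = diff[i] / (d + 1e-9)                 # unit vector away from surface
    return k * (d_safe - d) * n, d           # force grows as margin shrinks

surface = np.array([[0.0, 0.0, 0.0],
                    [10.0, 0.0, 0.0]])
f_in, d_in = constraint_force(np.array([0.0, 0.0, 3.0]), surface)
f_out, d_out = constraint_force(np.array([0.0, 0.0, 8.0]), surface)
```

Rendering this force on the master device resists, but does not forbid, motion towards the protected region, which is the defining behaviour of an active constraint / virtual fixture.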

Selected publications

Hughes-Hallett A, Pratt P, Mayer E, Di Marco A, Yang GZ, Vale J, Darzi A
Intraoperative Ultrasound Overlay in Robot-assisted Partial Nephrectomy: First Clinical Experience
Eur Urol. 2013 Nov 12


Comparative Effectiveness of 3-D versus 2-D and HD versus SD Neuroendoscopy: A Preclinical Randomized Crossover Study
Neurosurgery. 2013 Nov 11



Lopez E, Kwok KW, Payne CJ, Giataganas P, Yang GZ
Implicit Active Constraints for Robot-Assisted Arthroscopy
ICRA 2013; 5370-5375



Kwok KW, Tsoi KH, Vitiello V, Clark J, Chow GCT, Luk W, Yang GZ
Dimensionality Reduction in Controlling Articulated Snake Robot for Endoscopy Under Dynamic Active Constraints
IEEE Transactions on Robotics. 2013;29(1):15-31


Thiemjarus S, James A, Yang GZ
An eye-hand data fusion framework for pervasive sensing of surgical activities
Pattern Recognition 45(8): 2855-2867 (2012)



Pratt P, Marco AD, Payne C, Darzi A, Yang GZ
Intraoperative ultrasound guidance for transanal endoscopic microsurgery
MICCAI 2012; 15(Pt 1):463-70


Pratt P, Stoyanov D, Visentini-Scarzanella M, Yang GZ
Dynamic guidance for robotic surgery using image-constrained biomechanical models
MICCAI 2010; 13 (Pt 1): 77-85



Giannarou S, Yang GZ
Content-Based Surgical Workflow Representation Using Probabilistic Motion Modeling
MIAR 2010: 314-323


Lee SL, Lerotic M, Vitiello V, Giannarou S, Kwok KW, Visentini-Scarzanella M, Yang GZ
From medical images to minimally invasive intervention: Computer assistance for robotic surgery
Comput Med Imaging Graph. 2010 Jan;34(1):33-45



Lerotic M, Lee SL, Keegan J, Yang GZ
Image Constrained Finite Element Modelling For Real-Time Surgical Simulation and Guidance
ISBI 2009, 1063-1066


James A, Vieira D, Lo B, Darzi A, Yang GZ
Eye-gaze driven surgical workflow segmentation
MICCAI 2007;10(Pt 2):110-7



Lerotic M, Chung AJ, Mylonas G, Yang GZ
pq-space based non-photorealistic rendering for augmented reality
MICCAI 2007;10(Pt 2):102-9