Surgical Imaging and Vision

The use of surgical robots has helped to realise the full potential of minimally invasive surgery (MIS), with improved consistency, safety and accuracy. The development of articulated, precision tools to enhance the surgeon's dexterity has evolved in parallel with advances in imaging and human-robot interaction. Together, these have improved hand-eye coordination and manual precision, and made it possible to navigate complex anatomical pathways.

At the Hamlyn Centre, we believe that future medical robotics research should focus on the design of lightweight, cost-effective, flexible manipulators with a minimal footprint in the operating theatre. Such surgical robots are intrinsically complex and intelligent, yet should remain simple, lightweight and natural to use, with seamless user control. They should enhance the current surgical workflow, rather than alter it radically or become a hindrance to normal procedures. To this end, we are currently working towards the development of a new generation of miniaturised and intelligent mechatronic devices and robots for flexible access surgery, as well as investigating new techniques for providing synergistic control between the surgeon and the robot.


Research themes

Surgical Imaging

Fundamentals and current state-of-the-art in surgical imaging (including interactive MRI, CT, fluoroscopy and ultrasound guidance), real-time image reconstruction and integration with surgical instruments and robotic platforms.
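
As a minimal illustration of how intra-operative guidance ties a tracked instrument to pre-operative imaging, the sketch below maps a tool-tip position from tracker coordinates into CT voxel indices through a rigid registration. All numbers (the registration matrix, voxel spacing and volume origin) are hypothetical placeholders, not values from any specific platform.

```python
import numpy as np

# Hypothetical rigid registration (tracker frame -> CT patient space), 4x4 homogeneous.
# In practice this would come from fiducial- or surface-based registration.
T_tracker_to_ct = np.array([
    [0.0, -1.0, 0.0, 120.0],
    [1.0,  0.0, 0.0, -35.0],
    [0.0,  0.0, 1.0, 210.0],
    [0.0,  0.0, 0.0,   1.0],
])

# Assumed CT geometry: voxel spacing (mm) and volume origin in patient space (mm).
spacing = np.array([0.8, 0.8, 1.5])
origin = np.array([-200.0, -200.0, 0.0])

def tip_to_voxel(tip_tracker_mm):
    """Map an instrument tip (tracker frame, mm) to CT voxel indices."""
    tip_h = np.append(tip_tracker_mm, 1.0)      # homogeneous coordinates
    tip_ct = (T_tracker_to_ct @ tip_h)[:3]      # patient/CT space, mm
    return np.round((tip_ct - origin) / spacing).astype(int)

print(tip_to_voxel(np.array([10.0, 42.0, 305.0])))
```

In a real system the registration would be derived from the imaging device itself or from fiducial/surface alignment, and refreshed as the patient or equipment moves.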

Surgical Vision

Machine vision algorithms and real-time systems for tracking, 3D scene reconstruction, tissue deformation recovery, intra-operative registration and retargeting, as well as hardware platforms for vision including the use of structured illumination and robotic guidance.
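
As a rough, illustrative sketch of the tracking component only, the snippet below follows salient features on the tissue surface across endoscopic video frames using OpenCV's pyramidal Lucas-Kanade optical flow. It is not the affine-invariant or context-specific trackers described in the publications below, and the input file name is a placeholder.

```python
import cv2

# Placeholder input; any monocular endoscopic video clip would do.
cap = cv2.VideoCapture("endoscope.mp4")

ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect salient corners on the tissue surface in the first frame.
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok or points is None or len(points) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pyramidal Lucas-Kanade optical flow gives per-feature displacements,
    # a crude proxy for soft-tissue motion between consecutive frames.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    points = new_points[status.flatten() == 1].reshape(-1, 1, 2)
    prev_gray = gray

    for x, y in points.reshape(-1, 2):
        cv2.circle(frame, (int(x), int(y)), 3, (0, 255, 0), -1)
    cv2.imshow("tracked tissue features", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
```

Robust tissue tracking additionally has to handle deformation, specular highlights and occlusion by instruments, which is where the probabilistic and context-specific methods listed below come in.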

Robot-assisted in vivo microscopic imaging

Robot-assisted large area, in vivo optical biopsy based on microscopic imaging and mosaicking; in vivo localisation, mapping, retargeting and visualisation for optical biopsy and the development of manifold based techniques for scene association and retargeting.
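
The sketch below illustrates the basic idea behind mosaicking for large-area optical biopsy: estimate frame-to-frame homographies from matched features and chain them to place each probe image on a common canvas. It uses generic OpenCV ORB matching, hypothetical file names and naive blending, not the Hamlyn Centre's mosaicking pipeline.

```python
import cv2
import numpy as np

def pairwise_homography(img_a, img_b):
    """Estimate the homography mapping img_b into img_a's frame (ORB + RANSAC)."""
    orb = cv2.ORB_create(1000)
    kps_a, des_a = orb.detectAndCompute(img_a, None)
    kps_b, des_b = orb.detectAndCompute(img_b, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_b, des_a)
    src = np.float32([kps_b[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kps_a[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H

# Placeholder frame list; real probe video would be read frame by frame.
frames = [cv2.imread(f"probe_{i:03d}.png", cv2.IMREAD_GRAYSCALE) for i in range(3)]

canvas = np.zeros((2000, 2000), dtype=np.uint8)
offset = np.array([[1, 0, 800], [0, 1, 800], [0, 0, 1]], dtype=np.float64)

H_to_first = np.eye(3)
for i, frame in enumerate(frames):
    if i > 0:
        # Chain pairwise homographies so every frame maps into the first frame.
        H_to_first = H_to_first @ pairwise_homography(frames[i - 1], frame)
    warped = cv2.warpPerspective(frame, offset @ H_to_first, canvas.shape[::-1])
    canvas = np.maximum(canvas, warped)  # naive blending: keep brightest pixel

cv2.imwrite("mosaic.png", canvas)
```

For probe-based endomicroscopy the same principle applies at microscopic scale, with drift correction and robotic scanning used to cover clinically useful areas.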

Multi-modal image fusion and visualisation

Multi-scale image fusion and real-time intra-operative augmented reality systems for surgical navigation and online computer-aided diagnosis (CAD).
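
As a simplified sketch of the overlay step in intra-operative augmented reality, the code below projects a registered pre-operative model (here, a synthetic boundary curve) into a calibrated camera view and alpha-blends it over the frame. The intrinsics, pose and geometry are made-up stand-ins; a real system would obtain them from camera calibration and image-to-patient registration.

```python
import cv2
import numpy as np

# Assumed (made-up) intrinsics for a calibrated endoscopic camera.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
dist = np.zeros(5)

# Hypothetical boundary points from a pre-operative model (metres), already
# registered into the camera frame; rvec/tvec encode the camera pose.
boundary_3d = np.array([[x, 0.01 * np.sin(8 * x), 0.12]
                        for x in np.linspace(-0.02, 0.02, 50)])
rvec = np.zeros(3)
tvec = np.zeros(3)

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in endoscopic frame

# Project the registered model into the image and draw it as a guidance cue.
pts_2d, _ = cv2.projectPoints(boundary_3d, rvec, tvec, K, dist)
overlay = frame.copy()
cv2.polylines(overlay, [np.int32(pts_2d).reshape(-1, 1, 2)], isClosed=False,
              color=(0, 0, 255), thickness=2)

# Alpha-blend so the underlying tissue remains visible beneath the overlay.
augmented = cv2.addWeighted(overlay, 0.6, frame, 0.4, 0)
cv2.imwrite("augmented_view.png", augmented)
```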

Imaging informatics and big data analytics

Integration of image and bio-informatics into peri-operative settings, and the development of real-time analysis techniques for data mining and knowledge discovery.


Selected publications

Giannarou S, Visentini-Scarzanella M, Yang GZ
Probabilistic tracking of affine-invariant anisotropic regions
IEEE Trans Pattern Anal Mach Intell. 2013 Jan;35(1):130-43



Mountney P, Yang GZ
Context specific descriptors for tracking deforming tissue
Med Image Anal. 2012 Apr;16(3):550-61



Giannarou S, Zhang Z, Yang GZ
Deformable Structure From Motion by Fusing Visual and Inertial Measurement Data
IROS 2012, 4816-4821



Ye M, Giannarou S, Patel N, Teare J, Yang GZ
Pathological Site Retargeting under Tissue Deformation using Geometrical Association and Tracking
MICCAI (2) 2013: 67-74



Atasoy S, Mateus D, Meining A, Yang GZ, Navab N
Endoscopic video manifolds for targeted optical biopsy
IEEE Trans Med Imaging. 2012 Mar;31(3):637-53



James DR, Leff DR, Orihuela-Espina F, Kwok KW, Mylonas GP, Athanasiou T, Darzi AW, Yang GZ
Enhanced frontoparietal network architectures following "gaze-contingent" versus "free-hand" motor learning
Neuroimage. 2013 Jan 1;64:267-76



Hughes M, Yang GZ
Robotics and smart instruments for translating endomicroscopy to in situ, in vivo applications
Comput Med Imaging Graph. 2012 Dec;36(8):589-90



Pratt P, Mayer E, Vale J, Cohen D, Edwards E, Darzi A, Yang GZ
An effective visualisation and registration system for image-guided robotic partial nephrectomy
Journal of Robotic Surgery, 2012; 6(1), 23-31