C-arm-based surgical data visualization and repositioning using augmented reality

dc.contributor.committeeMemberNavab, Nassir
dc.creatorSong, Tianyu
dc.creator.orcid0000-0002-8428-9651
dc.date.accessioned2019-07-30T03:04:33Z
dc.date.available2019-07-30T03:04:33Z
dc.date.created2019-05
dc.date.issued2019-05-02
dc.date.submittedMay 2019
dc.date.updated2019-07-30T03:04:34Z
dc.description.abstractPurpose: As the trend towards minimally invasive and percutaneous interventions continues, the importance of appropriate surgical data visualization becomes more evident. Ineffective interventional data display techniques yield poor ergonomics that hinder hand-eye coordination and therefore promote frustration, which can compromise on-task performance and, ultimately, patient outcome. A very common example of ineffective visualization is monitors attached to the base of mobile C-arm X-ray systems. Methods: We present a spatially- and imaging-geometry-aware paradigm for visualization of fluoroscopic images using Interactive Flying Frustums (IFFs) in a mixed reality environment. We exploit the fact that the C-arm imaging geometry can be modeled as a pinhole camera, giving rise to an 11-degree-of-freedom view frustum along which the X-ray image can be translated while remaining geometrically valid. Visualizing IFFs to the surgeon in an augmented reality environment intuitively unites the virtual 2D X-ray image plane and the real 3D patient anatomy. To achieve this visualization, the surgeon and the C-arm are tracked relative to the same coordinate frame using image-based localization and mapping, with the augmented reality environment delivered to the surgeon via a state-of-the-art optical see-through head-mounted display. Results: We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when 50 or more pose pairs are used. The mean translation and rotation errors at convergence are 5.7 mm and 0.26°, respectively. The root-mean-squared error of C-arm source tracking after hand-eye calibration was 0.43° ± 0.34° in rotation and 4.6 ± 2.7 mm in translation. Finally, we demonstrate the application of spatially-aware data visualization to internal fixation of pelvic fractures and percutaneous vertebroplasty.
Conclusion: Our spatially-aware approach to transmission image visualization effectively unites patient anatomy with X-ray images by enabling spatial image manipulation that abides by the image formation geometry. Our proof-of-principle findings indicate potential applications to surgical tasks that rely mostly on orientational information, such as placing the acetabular component in total hip arthroplasty, and make us confident that the proposed augmented reality concept can pave the way for improved surgical performance and visuo-motor coordination in fluoroscopy-guided surgery.
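The abstract's central geometric property — that an X-ray image slid along the pinhole view frustum of the C-arm remains a valid central projection — can be sketched as follows. This is a minimal illustration, not code from the thesis; the intrinsic matrix and image dimensions below are hypothetical placeholders.

```python
import numpy as np

def frustum_corners(K, w, h, d):
    """Back-project the four pixel corners of a w x h image to depth d
    along the pinhole view frustum defined by intrinsic matrix K.
    Returns a (4, 3) array of 3D corner positions in camera coordinates."""
    K_inv = np.linalg.inv(K)
    # Homogeneous pixel coordinates of the image corners.
    px = np.array([[0, 0, 1], [w, 0, 1], [w, h, 1], [0, h, 1]], float).T
    rays = K_inv @ px        # viewing rays, normalized so that z = 1
    return (rays * d).T      # slide the image plane to depth d

# Hypothetical intrinsics for illustration (focal length 1000 px,
# principal point at the center of a 512 x 512 detector).
K = np.array([[1000.0, 0.0, 256.0],
              [0.0, 1000.0, 256.0],
              [0.0,    0.0,   1.0]])

near = frustum_corners(K, 512, 512, 300.0)
far  = frustum_corners(K, 512, 512, 600.0)

# Corners scale linearly with depth, so the translated image stays a
# valid central projection: doubling the depth doubles every corner.
assert np.allclose(far, 2.0 * near)
```

Because every corner (and every pixel) scales linearly with depth from the X-ray source, the rendered image plane can be moved anywhere along the frustum in the augmented reality scene without violating the projective geometry of the original acquisition.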
dc.format.mimetypeapplication/pdf
dc.identifier.urihttp://jhir.library.jhu.edu/handle/1774.2/61682
dc.language.isoen_US
dc.publisherJohns Hopkins University
dc.publisher.countryUSA
dc.subjectAugmented reality
dc.subjectFrustum
dc.subjectC-arm
dc.subjectX-ray
dc.subjectSurgical data visualization
dc.titleC-arm-based surgical data visualization and repositioning using augmented reality
dc.typeThesis
dc.type.materialtext
thesis.degree.departmentnot listed
thesis.degree.disciplineRobotics
thesis.degree.grantorJohns Hopkins University
thesis.degree.grantorWhiting School of Engineering
thesis.degree.levelMasters
thesis.degree.nameM.S.E.
Files
Original bundle (2 files):
- SONG-THESIS-2019.pdf (2.25 MB, Adobe Portable Document Format)
- Johns Hopkins University Master Essay.zip (2.42 MB, unknown data format)
License bundle (1 file):
- LICENSE.txt (2.67 KB, Plain Text)