Author(s):
Lap-Heng Keung
Medical College of Wisconsin Class of 2022
lkeung@mcw.edu

Zuhayr Shaikh
University of Virginia SOM Class of 2024
zs8zfm@virginia.edu

George Sidrak
Howard University COM Class of 2022
george.sidrak@bison.howard.edu

Teresa Varghese
MBBS, KMC Manipal
teresamvarghese@gmail.com

Editor(s):
John T. Moon, M.D., Integrated Interventional Radiology Resident (PGY-2), Emory University
jtmoon@emory.edu

Brief History of Extended Reality Technologies
The growing processing power of computers in smaller and more accessible packages has created a space for technologies such as virtual (VR), augmented (AR), and mixed reality (MR), collectively known as extended reality (XR). Virtual reality aims to provide a user with sensorimotor experiences in a synthetic world, allowing them to feel fully immersed in and able to interact with a virtual environment. In practice, this is achieved by replacing the user's visual field with a screen, whether through a mobile device as in Google's Cardboard or through a sophisticated headset connected to a powerful PC, such as the Oculus. Augmented and mixed reality are more ambiguous terms that generally refer to the projection of a virtual image into reality or an alteration of how the user sees or interacts with their environment, such as the "placement" of virtual furniture in a room by many mobile shopping applications before purchase. The umbrella term, extended reality, encompasses this growing spectrum of technologies, which are constantly being iterated upon and toeing the line between science fiction and reality.

VR has come a long way since its earliest commercial appearances in high-tech science-fiction movies and video gaming. In 1960, filmmaker Morton Heilig, credited as the Father of VR, patented the "Telesphere Mask," the first head-mounted display (HMD), which showed a series of stereoscopic 3D images. Two years later he invented the "Sensorama," the first commercially available 3D immersive simulator, in which the observer experienced a motorcycle ride through New York City (NYC). The observer could sit in a booth, use the HMD to ride through the streets of NYC, feel the vibrations of the chair, hear the sound of wind and leaves passing by, and smell the aroma of fresh NY pizza! Less than a decade later, Ivan Sutherland, credited as the Father of Computer Graphics, and his student Bob Sproull created the first computer-driven VR HMD, the Sword of Damocles. It was a large device suspended from the ceiling and connected to a computer, projecting primitive wireframe images into the room that moved with each head movement (1,2). The technology progressed over the decades as hardware continued to advance, most notably for aircraft simulation training and remote operation of NASA's exploration vehicles. Due to its large form factor, expensive custom components, and the substantial research and development costs involved, the technology continued to fly under the radar in consumer markets.

Although lagging behind the public sector, the film and entertainment industries began to embrace VR, culminating in its first peak in popularity with the sci-fi action-adventure film Tron, commercial VR arcade machines, and portable VR gaming consoles. Healthcare bought into the technology soon after, and the first medical application of AR was introduced by Dr. Loomis and his colleagues in 1993. They developed an assisted navigation tool with integrated spatial audio information, using an AR GPS-based system to help the visually impaired. VR was also trialed as a treatment modality for post-traumatic stress disorder (PTSD) and phobias through exposure therapy (3). These early attempts at mass-market commercialization, however, largely failed as public dissatisfaction grew over the widening gap between bold expectations and technical limitations. User complaints revolved around bulky devices, high initial investment, and motion sickness with prolonged use.

However, XR technologies did not remain sleeping giants for long. Technology industry titans, along with grassroots developers and enthusiasts, revisited XR soon after the turn of the 21st century. This time the technology was better received from all angles: consumers contributed to crowdfunded projects, and industry titans invested in fledgling companies, one of which was Oculus. The medical field was also ready and enthusiastically welcomed XR for education, training, and treatment.

Current Applications in Medicine and Interventional Radiology
Enhanced engagement within XR-driven virtual environments offers unique potential in interventional radiology, a field where a strong spatial understanding of often complex patient anatomy is essential for diagnosis and procedures. For these minimally invasive, image-guided specialists, XR can greatly complement healthcare delivery, better ensure patient safety, and support quality improvement processes moving forward. Several field-relevant applications of VR and AR developed in recent years show promise for shifting paradigms within the field.

At the 2018 Society of Interventional Radiology (SIR) Annual Meeting, Dr. Ziv Haskal demonstrated such XR applications in medical training with a 360-degree video walkthrough of a transjugular intrahepatic portosystemic shunt (TIPS) case, which he uploaded to YouTube. Dr. Haskal made this immersive demonstration available to anyone in the world with a phone, which can be converted into a stereoscopic VR viewer using affordable mobile accessories such as Google Cardboard (4). This creative showcase served as a salient promotion to attract aspiring trainees and gave patients a better understanding of their planned procedures.

From a medical student standpoint, there are continuing efforts to use these technologies to introduce and advocate for the field of interventional radiology. Applications already exist that display 3D projections of anatomical dissections for education. Newer iterations include DICOM viewers compatible with VR headsets that display interactive 3D reconstructions of preloaded or uploaded images as supplementary educational alternatives. Patient imaging can also be used, allowing for collaborative surgical planning and practical scenarios in medical education. For many medical students, and even interested pre-medical students, IR is an abstract and nebulous specialty, so being able to immerse oneself in a virtual IR suite, interact with different instruments, or even view procedures would improve outreach and spread awareness of the field (5). For residents, fellows, and even attending staff, it is important to maintain exposure to a variety of IR tools, skills, and procedures. Not all institutions and IR training programs see the same volume of procedures, nor will those procedures necessarily fall within the IR scope of practice at every institution. Virtual training lets trainees experience and learn those procedures and lets attending staff continue to hone their skills as a form of continuing professional education. An added benefit is that there are no lasting real-world consequences, while users can still practice recognizing and managing complications.
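One of the basic reconstructions such DICOM viewers perform is a maximum intensity projection (MIP), which collapses a stack of cross-sectional slices into a single 2D view by keeping the brightest voxel along a viewing axis. The sketch below is a minimal illustration of that operation using NumPy on a synthetic volume; it is not drawn from any particular viewer's implementation.

```python
import numpy as np

def max_intensity_projection(volume: np.ndarray, axis: int = 0) -> np.ndarray:
    """Collapse a 3D image volume to 2D by keeping the brightest
    voxel along the chosen axis (a standard MIP rendering step)."""
    return volume.max(axis=axis)

# Synthetic stand-in for a stack of axial CT slices: (slices, rows, cols).
volume = np.zeros((64, 128, 128), dtype=np.int16)
volume[32, 40:60, 40:60] = 1000   # a bright "lesion" on a single slice

mip = max_intensity_projection(volume)   # 2D view, shape (128, 128)
print(mip.shape, mip[50, 50])            # the lesion survives the projection
```

In a real viewer the same reduction runs over Hounsfield-unit voxels loaded from DICOM slices; high-attenuation structures such as contrast-filled vessels are what survive the projection, which is why MIP views are useful for vascular anatomy.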

XR has also been used in patient education and communication, notably around chemoradiation therapy for cancer patients. A cancer diagnosis is understandably shocking and anxiety inducing, so many studies have used XR technologies to augment patient education and introduce patients to the treatment process and what they can expect (6,7). Professor Levin at Penn Medicine uses VR programs to visually educate patients about their disease and treatment plans. Oncomfort is an app that offers younger patients an interactive game in which players shoot and destroy cancer cells. This facilitates an easier-to-understand discussion between doctors and younger patients, helping alleviate the fears and anxieties associated with cancer therapy (8). Similar efforts have taken place in vascular surgery, where VR has been used for patient education about abdominal aortic aneurysms and what the treatment process entails (9). IR could adopt these use cases by developing patient education modules for its more advanced procedures, especially in the interventional oncology, hepatobiliary, and vascular realms.

To date, VR has been used in a number of settings, primarily for psychiatric disorders, but new studies in recent years have established VR as a positive distractor for acute pain during medical procedures and hospitalizations. Relying on its immersive environment, VR is thought to overwhelm the patient with sensory input, limiting the processing of nociceptive pain (10). These findings have spurred further research into whether VR can also treat neuropathic pain, with promising initial findings (11). Studies have found the technology to be an effective adjunct in multimodal pain management, with the hope that it can help decrease the use of pain medications, specifically opioids. Interventional radiology has an ever-expanding repertoire of techniques for neoplastic pain management, including cement and/or screw fixation and ablations (12). These techniques have been applied to minimize opioid dependence, and pain management could be further supported with VR technologies. In the acute setting, VR can be an important tool pre- and post-procedurally, reducing anxiety before a procedure and aiding post-operative pain management. Minimally invasive procedures already carry less risk and less surgical pain than their open counterparts; with VR as additional pain management, more procedures may be completed entirely in outpatient settings or with fewer hospitalization days.

Other applications have been in the quality improvement sphere, in pursuit of goals such as decreasing radiation exposure for all involved in procedures. One team in Japan, for example, approached this challenge by designing an intuitive mixed-reality real-time dose monitoring visualization that is directly overlaid onto patients (13). This minimizes the need for clinicians to look away at a separate screen during procedures. Another approach explored improving CT-guided procedure efficiency for out-of-plane lesions using AR, demonstrated by a needle pass task completed on an abdominal phantom (14).
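An out-of-plane pass is harder precisely because the needle must leave the axial scan plane, and AR guidance essentially renders this geometry for the operator. The sketch below is a hypothetical helper, not taken from the cited study: given entry and target points in patient coordinates, it computes the needle path length and its craniocaudal angle out of the axial plane.

```python
import math

def needle_path(entry, target):
    """Given entry and target points (x, y, z) in mm, return the needle
    path length and its angle out of the axial (x-y) scan plane."""
    dx, dy, dz = (t - e for e, t in zip(entry, target))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    # dz is the craniocaudal component; its ratio to path length
    # gives the tilt out of the slice plane.
    out_of_plane_deg = math.degrees(math.asin(dz / length))
    return length, out_of_plane_deg

# Hypothetical pass: target lies 30 mm cranial to the entry slice.
length, angle = needle_path((0.0, 0.0, 0.0), (40.0, 30.0, 30.0))
print(f"{length:.1f} mm at {angle:.1f} degrees out of plane")
```

Without AR, the operator must hold this angle mentally while watching a 2D slice; overlaying the planned trajectory on the patient removes that mental reprojection step, which is the efficiency gain the phantom study measured.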

3D-printed, patient-specific anatomical models have found recent relevance for patient education, surgical planning, and medical device design in other specialties. Oftentimes, the complex architecture of patient vasculature (often thin and tortuous) and surrounding anatomy cannot be readily recreated without extensive model refinement by outside teams. Extended reality technologies offer promise in streamlining such efforts to produce engaging demonstrations directly from imaging. In addition, free of material constraints, users can manipulate and navigate imagery in real time, a marked functional advantage over 3D-printed models when tactile feedback is not essential. Such patient-facing demonstrations have been explored by vascular neurologists (15). Similarly, IR could adopt these use cases when discussing minimally invasive treatment options for malignancies with patients, with visual walkthroughs of embolization or ablation. Other examples include, but are not limited to, visual walkthroughs of endovascular procedures such as stent and graft placements, filter placements, and angioplasties. In recent years IR has grown from consulting proceduralists into comprehensive disease specialists, as evidenced by post-procedural rounding and follow-up clinic appointments. As a result, this technology could be used to show patients in follow-up visits the progression of their treatment, be it shrinking fibroids, post-ablation malignancies, or stabilization of endografts.

What It Means for Interventional Radiology
From its beginnings in the mid-20th century to its iterative innovations throughout the 21st, these many realities have seen their fair share of use cases globally. Current applications include continued professional training, patient education, procedural planning and augmentation, and pre- and post-operative symptom management, among many others. Without diving too deep into speculative territory, it is important to consider how these innovations and use cases can be further refined and improved for IR.

The most important concern moving forward is the economy of scale and how to encourage wider adoption of this technology. A systematic review of wearable heads-up display devices in the operating room identified IR as having the lowest number of published studies among the procedural specialties surveyed (16). As minimally invasive, image-guided proceduralists, interventional radiologists should support further experimentation, development, and adoption of such technologies. While high financial costs are a current barrier to adoption, costs should fall proportionately as demand for the technology increases. Another hurdle is that the burden of creating institution-specific content will inevitably fall on each institution itself in order to provide the most appropriate and familiar experience. While that may seem a tall order, institutions have already taken the initiative to integrate VR/AR technology within their health systems (5).

AR smartphone applications already exist that teach by prompting users through the specific steps necessary to perform a procedure from beginning to end. However, the available IR training modules have so far covered only basic procedures such as paracentesis, thoracentesis, and central line placement. As this technology continues to be integrated into trainee education, modules for more advanced IR procedures that teach finer motor skills will hopefully follow. Currently, AR and VR technologies within interventional radiology have been predominantly visual, with little to no ability to interact with and obtain visual-tactile feedback from the environment. Existing hand-tracking devices provide only point-and-click functionality. Moving forward, it will be important to increase the realism of these simulations with accurate haptic feedback and improved immersion within the environment (17).

Technologies already exist that help refine patient anatomy by using existing radiographic imaging modalities in ways that help physicians navigate tortuous vasculature to their target locations. Improving the efficiency of reaching those locations and better visualizing the pathways creates safer treatments by reducing radiation exposure and bolstering treatment confidence. Moving forward, it will be paramount to refine these technologies to account for common problems such as needle and wire bending, patient respirations, soft tissue deformation, and organ movement and mobilization (18).

Another common barrier to entry is the ergonomic limitation of prolonged use, otherwise known as cybersickness. It is a multifactorial problem, without clear causes or solutions, that researchers are still studying. Each subsequent generation of devices attempts to mitigate cybersickness through hardware that better accounts for physical differences, such as pupil distance and accommodation and convergence distances. Hardware solutions also serve to maintain high frame rates and reduce latency. Even with good hardware, the other aspect of cybersickness relates to the software and the content designed for viewing. Visual-spatial-motor differences in acceleration, field of view, environmental and contextual blurring, and dynamic focus points, among many others, can also contribute to cybersickness. This occurs via sensory mismatch, sensory overload, and postural instability, in the sense that the virtual environment does not dynamically adjust to how the user intends to interact with it or expects to perceive their surroundings based on their movements (19,20). Initial research has led to the development of GingerVR, an open-source toolkit of cybersickness reduction packages for the Unity game engine (21). Nonetheless, ongoing research and development will hopefully allow a consistently comfortable viewing experience, especially in situations where periodic breaks are not practical, such as during procedures.
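To make the frame-rate point concrete, the time available to render each frame shrinks quickly at the refresh rates typical of consumer headsets; every missed frame means the image briefly lags head motion, a direct source of the sensory mismatch described above. The sketch below is simple arithmetic, not tied to any particular device.

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render one frame before the display refreshes."""
    return 1000.0 / refresh_hz

# Refresh rates commonly seen in consumer headsets.
for hz in (72, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
# 72 Hz -> 13.9 ms, 90 Hz -> 11.1 ms, 120 Hz -> 8.3 ms
```

At 90 Hz, the entire pipeline from head tracking through rendering to display has roughly 11 ms per frame, which is why both hardware (latency reduction) and software (simplified scenes, reprojection) are attacked in parallel.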

Interventional radiology has been at the forefront of medicine in integrating cutting-edge technology into its scope of practice. The sheer utility these display technologies can provide is appealing on many levels: training, education, procedure planning, perioperative management, and symptom control. As these digital technologies improve, the promising future of IR looks more virtual and augmented. Although our base reality was used to create XR, the roles may reverse in the future as XR redefines our reality.

References:

1. Poetker B. The Very Real History of Virtual Reality (+ A Look Ahead). Learning Hub. https://learn.g2.com/history-of-virtual-reality. Published September 26, 2019. Accessed April 12, 2021.

2. Harrison D. Infographic: The History and Future of Augmented & Virtual Reality. Robotics Business Review. https://www.roboticsbusinessreview.com/news/infographic-the-history-and-future-of-augmented-virtual-reality/. Published December 19, 2019. Accessed April 12, 2021.

3. Cipresso P, Giglioli IAC, Raya MA, Riva G. The Past, Present, and Future of Virtual and Augmented Reality Research: A Network and Cluster Analysis of the Literature. Front Psychol. 2018;9:2086. doi:10.3389/fpsyg.2018.02086

4. Davis J. Going Virtual: How VR is Guiding Interventional Radiology. Elsevier Connect. https://www.elsevier.com/connect/going-virtual-how-vr-is-guiding-interventional-radiology. Published September 10, 2018. Accessed April 12, 2021.

5. Uppot RN, Laguna B, McCarthy CJ, et al. Implementing Virtual and Augmented Reality Tools for Radiology Education and Training, Communication, and Clinical Care. Radiology. 2019;291(3):570-580. doi:10.1148/radiol.2019182210

6. Jimenez YA, Cumming S, Wang W, Stuart K, Thwaites DI, Lewis SJ. Patient education using virtual reality increases knowledge and positive experience for breast cancer patients undergoing radiation therapy. Support Care Cancer. 2018;26(8):2879-2888. doi:10.1007/s00520-018-4114-4

7. Wang LJ, Casto B, Luh JY, Wang SJ. Virtual Reality-Based Education for Patients Undergoing Radiation Therapy. J Cancer Educ. 2020. doi:10.1007/s13187-020-01870-7

8. Malmo K. Transforming Reality: Using Virtual Reality to Treat Patients with Cancer. Cure. https://www.curetoday.com/view/transforming-reality-using-virtual-reality-to-treat-patients-with-cancer. Published May 5, 2020. Accessed April 13, 2021.

9. Pandrangi VC, Gaston B, Appelbaum NP, Albuquerque FC, Levy MM, Larson RA. The Application of Virtual Reality in Patient Education. Ann Vasc Surg. 2019;59:184-189. doi:10.1016/j.avsg.2019.01.015

10. Spiegel B, Fuller G, Lopez M, et al. Virtual reality for management of pain in hospitalized patients: A randomized comparative effectiveness trial. Gupta V, ed. PLoS One. 2019;14(8). doi:10.1371/journal.pone.0219115

11. Austin PD, Siddall PJ. Virtual reality for the treatment of neuropathic pain in people with spinal cord injuries: A scoping review. J Spinal Cord Med. 2021;44(1):8-18. doi:10.1080/10790268.2019.1575554

12. Key BM, Scheidt MJ, Tutton SM. Advanced Interventional Pain Management Approach to Neoplastic Disease Outside the Spine. Tech Vasc Interv Radiol. 2020;23(4). doi:10.1016/j.tvir.2020.100705

13. Takata T, Nakabayashi S, Kondo H, et al. Mixed Reality Visualization of Radiation Dose for Health Professionals and Patients in Interventional Radiology. J Med Syst. 2021;45(4):1-8. doi:10.1007/s10916-020-01700-9

14. Park BJ, Hunt SJ, Nadolski GJ, Gade TP. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: a phantom study using HoloLens 2. Sci Rep. 2020;10(1):1-8. doi:10.1038/s41598-020-75676-4

15. The Future of Surgery: Virtual Reality – JFK Medical Center, Neuroscience – Hackensack Meridian Health. Hackensack Meridian Health. https://www.hackensackmeridianhealth.org/patient-perspectives/the-future-of-surgery-virtual-reality/. Published April 21, 2021. Accessed April 22, 2021.

16. Park BJ, Hunt SJ, Martin C, Nadolski GJ, Wood BJ, Gade TP. Augmented and Mixed Reality: Technologies for Enhancing the Future of IR. J Vasc Interv Radiol. 2020;31(7):1074-1082. doi:10.1016/j.jvir.2019.09.020

17. Mashar M, Nanapragasam A, Haslam P. Interventional radiology training: where will technology take us? BJR|Open. 2019;1(1):20190002. doi:10.1259/bjro.20190002

18. Solbiati LA. Augmented Reality: Thrilling Future for Interventional Oncology? Cardiovasc Intervent Radiol. 2021;44(5):782-783. doi:10.1007/s00270-021-02801-9

19. Weech S, Kenny S, Barnett-Cowan M. Presence and cybersickness in virtual reality are negatively related: A review. Front Psychol. 2019;10:158. doi:10.3389/fpsyg.2019.00158

20. Stanney K, Lawson BD, Rokers B, et al. Identifying Causes of and Solutions for Cybersickness in Immersive Technology: Reformulation of a Research and Development Agenda. Int J Hum Comput Interact. 2020;36(19):1783-1803. doi:10.1080/10447318.2020.1828535

21. Nazir M. New open-source software aims to reduce cybersickness in VR use. Tech Xplore. https://techxplore.com/news/2020-06-open-source-software-aims-cybersickness-vr.html. Published June 29, 2020. Accessed April 12, 2021.