Where and how is extended reality already applied in surgery?
Virtual reality (VR) is often used in education and training, e.g. anatomy teaching, where students can examine virtual models. In medical school, students train on artificially created anatomical models. Unless such a model is a cadaver or derived from patient images, it is standardized; in reality, however, patients have anatomical variations. In VR, you can create highly detailed, realistic, patient-specific 3D models. Users can also be placed in life-like environments and simulations, and use controllers that provide haptic feedback during interactions to train before the actual procedure.
Augmented reality (AR) comes closer to the actual surgery as virtual images can be overlaid on top of the real-world object. At the Intervention Centre at Oslo University Hospital, we have tested AR for laparoscopic liver surgery. Here, the user can place 3D models and see these virtual elements in the 3D camera view. However, there are still some limitations related to the complex workflow, visualization and how these virtual elements should be presented on the laparoscopic camera view.
In mixed reality (MR), the user views the digital images in the physical world and interacts with them. At our hospital, we’re working with Microsoft’s HoloLens 2, which allows us to put virtual objects into our environment. We create holograms of patients to walk around and look at their anatomy and pathology individually or in groups of clinicians. It enables us to visualize and explore different treatment strategies in several disciplines – liver surgery, congenital heart surgery, orthopedic surgery, etc.
One of the advantages of using MR is the fact that the clinicians are more accessible and not bound by the physical location of the hardware. With a head-mounted mixed reality device, one can see a model with depth as if it were in real life and can point and interact using one’s hands. You can toggle the visibility of the virtual elements or peek inside to better understand the spatial distribution of the anatomy and pathology.
What are the challenges of implementing XR in surgery?
It is crucial to remember that patients are three-dimensional, yet medical professionals still rely on flat images for decision-making. For education and case reporting, documentation is commonly limited to text or annotations on medical images. This is limiting and requires a lot of imagination to understand where these locations are inside the patient. By creating 3D volume renderings and holograms, you can point at and show virtual models, and combine all these media and information for more detailed documentation.
However, one major challenge is the preparation of XR, the data that needs to be processed into 3D models and holograms. Hospitals are sitting on a lot of unprocessed data they still use for the traditional way of examining. I believe that doctors, surgeons, and even patients should start asking for 3D representations of their anatomy and pathology.
Then again, to create 3D models of every patient, you need sufficient evidence that it will help, that you cannot make the best decision for the patient without it. The hospitals are sitting on this resource, but too few are processing it; too few are bold enough to invest without solid evidence that having that data processed will be important in the future. But that proof can only be established if someone spends enough time on this and publishes the research results.
What different navigation and visualization methods are used in XR?
For example, medical images can be used to create virtual 3D models that are overlaid on the patient during surgery. Once patient-specific 3D models are created, physical and virtual spaces can be combined with tracking technology. You can, for example, have an optical tracking system with cameras in the OR that detect spheres placed on the objects to be tracked. This enables real-time tracking of those objects and the ability to place virtual elements in the OR. The fused scenes and images can be presented on screens, in augmented reality, or in mixed reality.
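As a rough illustration of the tracking step described above, and assuming a much-simplified setup (the tracking system reports a single rigid pose, i.e. a rotation and a translation; all names here are hypothetical, not the actual navigation software), placing a virtual model into OR coordinates amounts to applying that transform to each model vertex:

```python
import math

def apply_rigid_transform(points, rotation, translation):
    """Map model-space points into OR space: p' = R * p + t.

    points      -- list of (x, y, z) model vertices
    rotation    -- 3x3 rotation matrix (list of rows)
    translation -- (tx, ty, tz) offset in OR coordinates
    """
    out = []
    for (x, y, z) in points:
        px = rotation[0][0] * x + rotation[0][1] * y + rotation[0][2] * z + translation[0]
        py = rotation[1][0] * x + rotation[1][1] * y + rotation[1][2] * z + translation[1]
        pz = rotation[2][0] * x + rotation[2][1] * y + rotation[2][2] * z + translation[2]
        out.append((px, py, pz))
    return out

# Illustrative pose: the tracker reports the marker frame rotated
# 90 degrees about the Z axis and offset by (10, 0, 5) cm in the OR.
theta = math.pi / 2
R = [[math.cos(theta), -math.sin(theta), 0.0],
     [math.sin(theta),  math.cos(theta), 0.0],
     [0.0,              0.0,             1.0]]
t = (10.0, 0.0, 5.0)

model_points = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
or_points = apply_rigid_transform(model_points, R, t)
print(or_points)
```

In a real navigation system this pose is updated every frame from the detected marker spheres, so the virtual model follows the tracked object in real time.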
This process can require some setup of the navigation system and a certain amount of user input. The HoloLens offers eye-gaze control, so some functions can be controlled by looking at them. For example, when the surgeon's hands are occupied during surgery, they can look at certain buttons to activate them, reducing the need to physically press buttons.
What does your ‘perfect’ operating room of the future look like?
Today, we lack a complete solution for integration. Existing and new technologies in the OR are mostly independent; they have different systems and standards.
I see a bright future where all systems talk to each other, where images and information are synergistically flowing between devices; I see decision-support mechanisms using AI that provide automatic patient- and user-specific workflow in which next steps are predicted and shown.
Maybe the patient lies on a table and complete patient data is shown holographically in the surroundings. Treatment is then performed by a robot-assisted surgical system with up-to-date and accurate visual guidance that supports surgeons in their intraoperative decision-making.
Whatever new gadgets or technologies are brought into the OR, I hope these will seamlessly integrate into the whole imaging, diagnostic and treatment workflow.
You are part of the session “Will XR become the new standard for surgery?”. What is your opinion on that question?
First, we should remain focused on the patient by improving patient care from a value-based healthcare perspective, and not get distracted by futuristic emerging technologies that resonate with our sci-fi fantasies from childhood.
Secondly, I think XR will stay. Various modalities will find their place in the healthcare systems. Surgeons, and clinicians in general, will use what they have available and what works best. We are always looking for solutions to solve problems, to treat the patient. XR is one potential candidate for solving particular communication, visualization and navigation challenges.
At first, it might have been curiosity about the gadgets and cool holograms. However, research is starting to gather knowledge and examples of where XR could improve certain aspects of care. We still need more research, e.g. randomized controlled trials, to provide evidence that interventions performed in three dimensions on a patient should also be seen and planned in 3D using XR devices.
In addition, the definition of XR could evolve as new devices and gadgets are developed. There are also unexplored territories such as projections and technologies beyond wearable devices; I can imagine AR with screenless, glasses-free 3D coming to the OR in the near future.
Egidijus Pelanis is a medical doctor and PhD candidate working at The Intervention Centre, Oslo University Hospital in Norway. He is working with technologies in healthcare and is part of both Section of Clinical Research, led by professor Bjørn Edwin, and Section of Medical Cybernetics and Image Processing, led by professor Ole Jakob Elle. Egidijus is researching the use of various navigation and visualization methods for minimally invasive surgery.
Shift Medical 2021
Friday, September 24, 3.30 - 6.00 pm (CEST)
Session 3: Extended Reality in Surgery
Will XR become the new standard for surgery?
Egidijus Pelanis: “Mixed reality in minimally invasive liver surgery.”
Tectales is a media partner of Shift Medical 2021 that will take place September 23-25.
You are a professional from the emtech ecosystem and interested in sharing your tech tale with a guest contribution at Tectales? Just contact us: firstname.lastname@example.org.