Robotic catheter navigates autonomously inside body

Surgeons have used robots operated by joysticks for more than a decade, and teams have shown that tiny robots can be steered through the body by external forces such as magnetism. Now, a paper in Science Robotics describes a robotic catheter that can navigate autonomously — the surgical equivalent of a self-driving car. Bioengineers at Boston Children’s Hospital demonstrated a robotic catheter that found its way along the walls of a beating, blood-filled heart to a leaky valve in an animal model, without a surgeon’s involvement.

A self-driving robotic catheter, inserted at the base of the heart, arrives at a leaky valve.
Source: Randal McKenzie

Senior investigator Pierre Dupont, PhD, chief of Pediatric Cardiac Bioengineering at Boston Children’s, believes this is the first instance of autonomous navigation inside the body. He envisions self-driving robots assisting surgeons in complex operations, reducing fatigue and freeing surgeons to focus on the most difficult maneuvers — ultimately improving outcomes. “The right way to think about this is through the analogy of a fighter pilot and a fighter plane,” he says. “The fighter plane takes on the routine tasks like flying the plane, so the pilot can focus on the higher-level tasks of the mission.”

The team’s robotic catheter navigated using an optical touch sensor developed in Dupont’s lab, informed by a map of the cardiac anatomy and preoperative scans. The touch sensor uses artificial intelligence and image processing algorithms to enable the catheter to figure out where it is in the heart and where it needs to go.

For the demo, the team performed a highly technically demanding procedure known as paravalvular aortic leak closure, which repairs replacement heart valves that have begun leaking around the edges. (The team constructed its own valves for the experiments.) Once the robotic catheter reached the leak location, an experienced cardiac surgeon took control and inserted a plug to close the leak.

In repeated trials, the robotic catheter successfully navigated to heart valve leaks in roughly the same amount of time as the surgeon (using either a hand tool or a joystick-controlled robot).

Biologically inspired navigation

Heart valve surgery, reimagined.
Source: Randal McKenzie

Through a navigational technique called “wall following,” the robotic catheter’s optical touch sensor sampled its environment at regular intervals, in much the way insects’ antennae or the whiskers of rodents sample their surroundings to build mental maps of unfamiliar, dark environments. The sensor told the catheter whether it was touching blood, the heart wall or a valve (through images from a tip-mounted camera) and how hard it was pressing (to keep it from damaging the beating heart).

Data from preoperative imaging and machine learning algorithms helped the catheter interpret visual features. In this way, the robotic catheter advanced by itself from the base of the heart, along the wall of the left ventricle and around the leaky valve until it reached the location of the leak. “The algorithms help the catheter figure out what type of tissue it’s touching, where it is in the heart, and how it should choose its next motion to get where we want it to go,” Dupont explains.
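The decision cycle described above — sample the environment, classify the contact, bound the contact force, choose the next motion — can be sketched as a simple control loop. This is an illustrative sketch only, not the team’s actual algorithm; the `Contact` classes, `next_move` function, and force threshold are all hypothetical stand-ins for the sensor classification and motion planning the article describes.

```python
from enum import Enum, auto

class Contact(Enum):
    """Hypothetical contact classes the touch sensor might report."""
    BLOOD = auto()   # tip is free in the bloodstream
    WALL = auto()    # tip is touching the heart wall
    VALVE = auto()   # tip is touching valve tissue

# Illustrative safety threshold (newtons); the real limit would be
# tuned to avoid damaging the beating heart.
MAX_FORCE = 0.1

def next_move(contact: Contact, force: float) -> str:
    """Pick the next motion for one step of wall following."""
    if force > MAX_FORCE:
        return "retract"              # back off before damaging tissue
    if contact is Contact.BLOOD:
        return "steer_toward_wall"    # regain contact with the wall
    if contact is Contact.VALVE:
        return "stop"                 # target region reached; hand off to surgeon
    return "advance_along_wall"       # keep following the wall toward the valve
```

The key design point mirrored here is that the catheter never plans a long free-space path: each step only needs the current contact class and force reading, which is what lets the insect-antenna analogy work in a dark, blood-filled environment.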

The autonomous robot took slightly longer than the surgeon to reach the leaky valve, but its wall-following technique also meant that it traveled the longest path. “The navigation time was statistically equivalent for all, which we think is pretty impressive given that you’re inside the blood-filled beating heart and trying to reach a millimeter-scale target on a specific valve,” says Dupont. He adds that the robot’s ability to visualize and sense its environment could eliminate the need for fluoroscopic imaging, which is typically used in this operation and exposes patients to ionizing radiation.

Dupont says the project was the most challenging of his career. While the cardiac surgical fellow, who performed the operations on swine, was able to relax while the robot found the valve leaks, the project was taxing for Dupont’s engineering fellows, who sometimes had to reprogram the robot mid-operation as they perfected the technology. “I remember times when the engineers on our team walked out of the OR completely exhausted, but we managed to pull it off,” says Dupont. “Now that we’ve demonstrated autonomous navigation, much more is possible.”

A vision of the future?

Some cardiac interventionalists who are aware of Dupont’s work envision using robots for more than navigation — performing routine heart-mapping tasks, for example. Others see the technology providing guidance during particularly difficult or unusual cases, or assisting in operations in parts of the world that lack highly experienced surgeons.

As the Food and Drug Administration begins to develop a regulatory framework for AI-enabled devices, Dupont envisions the possibility of autonomous surgical robots all over the world pooling their data to continuously improve performance over time — much like self-driving vehicles in the field send their data back to Tesla to refine its algorithms. “This would not only level the playing field, it would raise it,” says Dupont. “Every clinician in the world would be operating at a level of skill and experience equivalent to the best in their field. This has always been the promise of medical robots. Autonomy may be what gets us there.”


Related articles

Augmented intelligence advances robotic surgery

TransEnterix, Inc. announced that a hospital in New Jersey successfully completed its first surgical procedures using the Intelligent Surgical Unit™.

Smart surgery - from technical fascination to clinical reality

Artificial intelligence is developing at an enormous speed and intelligent instruments will profoundly change surgery and medical interventions.

AI system for recognition of hand gestures

Scientists have developed an AI system that recognises hand gestures by combining skin-like electronics with computer vision.

OR without surgeon: science fiction or realistic scenario?

Dr Jan Stallkamp, Professor for Automation in Healthcare and Biotechnology, has a vision: robots that can treat patients more efficiently and more precisely than any human physician.

VisionBlender generates computer vision datasets for robotic surgery

Researchers at the Hamlyn Centre, Imperial College London, have introduced a novel tool for generating accurate endoscopic datasets.

Surgery robots set to learn how to better assist

The objective of the AIMRobot project is to pave the way for the next generation of robotic surgery systems capable of autonomy.

Surgical data science: data-driven assistance in the OR

In the next-generation operating room interconnected sensors will collect data, analyse it in real-time and make it available to digital assistance functions.

Origami-inspired miniature manipulator for microsurgery

Researchers have developed a surgical robot that improves precision and control of teleoperated surgical procedures.

Mouth and throat cancer: Robotic surgery may improve outcomes

Robotic surgery for patients with early stage, oropharyngeal squamous cell cancer is associated with improved health outcomes, including better long-term survival.
Robotic surgery for patients with early stage, oropharyngeal squamous cell cancer is associated with improved health outcomes, including better long-term survival.