Artificial intelligence gets real in the OR

Since the start of the year, some surgeons and residents at UC San Diego Health have had access to a new surgical resource: reams of video recordings of them performing operations, parsed by artificial intelligence.
Video recordings of procedures are uploaded to the cloud for quick analysis. The five surgeons involved in the project and their residents then receive videos of their minimally invasive procedures, divided into critical steps, along with a dashboard that compares an operation against previous procedures. The system pixelates distinguishing features of patients and staff, such as faces and tattoos, to de-identify them.
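The de-identification step described here is, at its core, a standard computer-vision task. As a rough sketch only, and not a description of Digital Surgery's proprietary pipeline, a frame-by-frame pixelation pass using OpenCV's stock face detector might look like this in Python:

import cv2

def pixelate(region, blocks=12):
    # Downscale then upscale so identifying detail is destroyed.
    h, w = region.shape[:2]
    small = cv2.resize(region, (blocks, blocks), interpolation=cv2.INTER_LINEAR)
    return cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)

def deidentify(in_path, out_path):
    # Hypothetical stand-in for the system's de-identification pass:
    # detect faces in each frame and pixelate them before upload.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
            frame[y:y + h, x:x + w] = pixelate(frame[y:y + h, x:x + w])
        writer.write(frame)
    cap.release()
    writer.release()

A production system would need a far more robust detector, since faces in an OR are often masked, angled or partly out of frame, and would also have to catch tattoos and other identifiers, but the principle is the same.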
All done with the assistance of AI. “It’s giving active feedback on how your operation performed,” said Dr. Santiago Horgan, chief of the minimally invasive surgery division and director of the Center for the Future of Surgery at UC San Diego School of Medicine.
UC San Diego Health, which went live with the AI tool in two of its operating rooms in February, is one of a growing number of health systems introducing AI into the OR.
AI-assisted surgery has been an area of gradual growth, starting with tools that support care teams with preoperative planning and postoperative evaluation—but it’s laying the groundwork for the next phase of surgical innovation, which could include real-time intra-operative clinical decision support and even some automation, experts say.
“Surgery and AI will go hand-in-hand,” said Dr. Vipul Patel, medical director of AdventHealth’s Global Robotics Institute. “I think you’re going to have to have artificial intelligence (in order) for surgery to evolve.”
AI in recent years has been used to identify problems in medical images as part of planning for surgery and to review procedures after the fact, but use in the OR has been more limited.

Meanwhile, the last major technological advancement in surgery was arguably 20 years ago, with the growth of robotic surgery. The practice, popularized by Intuitive Surgical’s da Vinci system, advanced what’s now known as minimally invasive surgery, in which a surgeon performs a procedure using tiny cuts, rather than traditional open surgery that often requires large incisions. 
The so-called robots—tools entirely operated by a human surgeon—are designed to make a user’s movements more precise.
The da Vinci system “is really what defines this era of robotic surgery,” said Marissa Schlueter, healthcare senior intelligence analyst at CB Insights, a firm that analyzes data on venture capital and startups. But recently, there’s been a spate of new entrants, with the number of new groups filing patents related to robotic surgery “skyrocketing” in recent years, she said.
In the past 15 years, the number of applicants for patents related to robotic surgery has increased more than thirtyfold, according to a CB Insights report published last month.
But while robotic surgery gained traction roughly two decades ago, it hasn’t changed much since that early stage of adoption.
“The robotic systems that we currently use don’t use AI yet—there’s no automation of the systems,” Patel said. “AI is really our next frontier.”
Patel said he envisions a future when a surgical robot will operate autonomously. Though a human surgeon would still oversee the process, they wouldn’t manually dictate each movement the robot takes.
That future wouldn’t just be flashy tech, according to Patel. AI and automation could help to standardize procedures so they follow established best practices, leading to outcomes that are more consistent. In the long term, that could help reduce medical errors.
There’s no nationwide system for reporting adverse events that cause death or serious harm, making it difficult to pin down the frequency and cost of preventable surgical errors. But some studies estimate that medical errors are the third-leading cause of death in the U.S. “Robots don’t get tired, they don’t need coffee in the morning,” Patel said. That said, “I think that’s still many years away.”
Patel is on the advisory board of Activ Surgical, a digital surgery company. Dr. Peter Kim, a pediatric surgeon who is the company’s co-founder and chief science officer, played a role in performing the world’s first autonomous robotic surgery of soft tissue—albeit on a pig.
The software company is working on creating intra-operative decision-support tools, starting with one that connects to laparoscopic and arthroscopic systems and helps surgeons see aspects of blood flow and tissues not usually visible. The company is kicking off its first human trials for the product this year and plans to submit data to the Food and Drug Administration for clearance at the end of 2020.
The company’s next step will involve developing real-time decision support based on the tool’s insights, Kim said.

Dr. Ahmed Ghazi, a urologist and director of the simulation innovation lab at the University of Rochester (N.Y.) Medical Center, once thought autonomous robotic surgery wasn’t possible. He changed his mind after seeing a research group successfully complete a running suture on one of his lab’s tissue models with an autonomous robot.
It was surprisingly precise—and impressive, Ghazi said. But “what’s missing from the autonomous robot is the judgment,” he said. “Every single patient, when you look inside to do the same surgery, is very different.” Ghazi suggested thinking about autonomous surgical procedures like an airplane on autopilot: the pilot’s still there. “The future of autonomous surgery is there, but it has to be guided by the surgeon,” he said.
It’s also a matter of ensuring AI surgical systems are trained on high-quality and representative data, experts say. Before implementing any AI product, providers need to understand what data the program was trained on and what data it considers to make its decisions, said Dr. Andrew Furman, executive director of clinical excellence at ECRI. Providers must also weigh what data were input for the software or product to make a particular decision, and ask, “are those inputs comparable to other populations?” he said.
To create a model capable of making surgical decisions, developers need to train it on thousands of previous surgical cases. That could be a long-term outcome of using AI to analyze video recordings of surgical procedures, said Dr. Tamir Wolf, co-founder and CEO of Theator, another company that does just that.
While the company’s current product is designed to help surgeons prepare for a procedure and review their performance, its vision is to use insights from that data to underpin real-time decision support and, eventually, autonomous surgical systems.
UC San Diego Health is using a video-analysis tool developed by Digital Surgery, an AI and analytics company Medtronic acquired earlier this year. The acquisition is part of Medtronic’s strategy to bolster its AI capabilities, said Megan Rosengarten, vice president and general manager of surgical robotics at Medtronic.
“There’s a lot of places where we’re going to build upon that,” Rosengarten said. She described a likely evolution from AI providing recommendations for nonclinical workflows, to offering intra-operative clinical decision support, to automating aspects of nonclinical tasks, and possibly to automating aspects of clinical tasks.
Autonomous surgical robots aren’t a specific end goal Medtronic is aiming for, she said, though the company’s current work could serve as building blocks for automation.
Intuitive Surgical, creator of the da Vinci system, isn’t actively looking to develop autonomous robotic systems, according to Brian Miller, the company’s senior vice president and general manager for systems, imaging and digital. Its AI products so far use the technology to create 3D visualizations from images and extract insights from how surgeons interact with the company’s equipment.
To develop an automated robotic product, “it would have to solve a real problem” identified by customers, Miller said, which he hasn’t seen. “We’re looking to augment what the surgeon or what the users can do,” he said.

In a world with autonomous surgeries, the surgeon wouldn’t become obsolete, experts say. They’d take a different role, overseeing the process. That’s in part because human anatomy is unpredictable. An abdomen can be like Pandora’s box, said UC San Diego Health’s Horgan. Even experienced surgeons aren’t entirely sure what’s inside a human body until the surgery begins.
Horgan thinks it’s likely that smaller and more repetitive steps within surgery—such as suturing—may one day be automated. A robot won’t be able to complete a full operation on its own, he said, but those steps toward automation could improve outcomes by reducing variability.
Using AI in tandem with video recordings could one day provide surgical teams with an “instant replay,” like in sports, said Dr. Carla Pugh, a professor of surgery at Stanford University School of Medicine and director of the Technology Enabled Clinical Improvement Center, which is researching how insights pulled from video, motion tracking and other data could be used to evaluate skills.
That includes using AI to analyze video recordings of surgeries, which her research has found can make reviewing procedures after the fact more efficient for surgeons. The main way surgeons receive feedback today is by others observing and then commenting on their procedures; even if an operation is recorded, without AI the surgeon is left with hours of unannotated video.
With AI, “you can fast-forward to the one critical step,” Pugh said. “At the end of a four-hour case, you could spend 10 minutes reviewing all of the critical decisions.”
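To make that arithmetic concrete, here is a hypothetical sketch, not an actual Theator or Stanford interface, of how AI-generated step annotations could let a reviewer skip straight to the flagged moments of a recorded case:

from dataclasses import dataclass

@dataclass
class Segment:
    label: str
    start_s: float  # offset into the recording, in seconds
    end_s: float
    critical: bool  # flagged by the AI as a critical decision point

def review_plan(segments):
    # Return the critical segments and the minutes needed to watch only them.
    critical = [s for s in segments if s.critical]
    minutes = sum(s.end_s - s.start_s for s in critical) / 60
    return critical, minutes

# Illustrative annotations for a four-hour case (values are made up).
case = [
    Segment("port placement", 0, 600, False),
    Segment("dissection of critical view", 3600, 3960, True),
    Segment("clipping and division", 3960, 4200, True),
    Segment("closure", 13800, 14400, False),
]
steps, minutes = review_plan(case)
print(f"{len(steps)} critical steps, about {minutes:.0f} minutes of review")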
The project’s ultimate goal, in her mind, is for AI to be able to review, analyze and predict steps a surgeon is taking and what anatomy they’re seeing in real-time. But she doesn’t expect to see autonomous surgical systems anytime soon.
Pugh offered an analogy: imagine trying to program a self-driving car, except that the roads can change direction unexpectedly.
“That’s what we deal with with the human body,” she said, noting tissues can vary depending on whether a patient is a young athlete or a senior who’s out of shape, for example. It’s unlikely a robot would be able to figure that out on its own.
On the question of autonomous surgeries, she said, “let’s answer that question in 20 years.”

Source: https://www.modernhealthcare.com/care-delivery/artificial-intelligence-gets-real-or