Surgeon and researcher innovate with mixed reality and AI for safer surgeries

A University of Oklahoma researcher and a surgeon at OU Health in Oklahoma City shared a vision: using AI to visualize superimposed, anatomically aligned 3D CT scan data during surgery. Their mission was to augment every surgery.

THE PROBLEM

“Compared to a pilot flying a plane or even a regular Google Maps user on his way to work, surgeons today have their instruments clustered behind them hanging on the wall,” said Mohammad Abdul Mukit, an MS student in electrical and computer engineering at the University of Oklahoma, and a graduate fellow and research assistant. His research focuses on applications of computer vision, extended reality and AI in medical surgeries.

“The Google Maps user or the pilot gets constant, real-time updates regarding where they are, what to do next, and other vital data that helps them make split-second decisions,” he explained. “They don’t have to plan the trip for days or memorize every turn and detail of every landmark along the way. They just do it.”

On the other hand, surgeons today have to do rigorous surgical planning, memorize the specifics of each unique case, and know all the necessary steps to ensure the safest possible surgery. Then they engage in complex procedures for several hours, with no targeting or homing devices or head-mounted displays to assist them.

“They have to feel their way to their objective and hope everything goes as they planned,” Mukit said. “Through our research, we aim to change this process forever. We are making the ‘Google Maps for surgery.’”

PROPOSAL

To turn this vision into reality, Mukit and OU Health plastic and reconstructive surgeon Dr. Christian El Amm have been working together since 2019. The journey, however, began in 2018 with El Amm’s collaboration with the energy technology company Baker Hughes.

Baker Hughes (BH) specializes in using augmented reality/mixed reality and computed tomography scans to create 3D reconstructions of rock specimens. For geologists and oil and gas companies, this visualization is extremely valuable, as it helps them plan and execute drilling operations efficiently.

The technology caught El Amm’s attention. He envisioned that, combined with AI, it could allow him to visualize superimposed and anatomically aligned 3D CT scan data during surgery, and to review the reconstruction steps he had planned without ever losing sight of the patient.

However, several key challenges needed to be solved to get a prototype mixed reality system ready for use in surgery.

MEETING THE CHALLENGE

“During the year-long collaboration, the BH team created solutions for those challenges that, until that time, were unsolved,” Mukit recalled. “They implemented a client/server system. The server, a high-end PC equipped with RGBD cameras, would do all the computer vision work to estimate the six-degrees-of-freedom (6-DoF) pose of the patient’s head.

“It would then stream the stored CT scan data to the client device, a Microsoft HoloLens 1, for anatomically aligned visualization,” he continued. “BH developed a proprietary compression algorithm that enabled them to stream a high volume of CT scan data. BH also integrated a proprietary AI engine to do the pose estimation.”
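The article does not include the team's code, and the BH pipeline, codec and AI engine are proprietary, but the division of labor described suggests a shape like the minimal sketch below: the server estimates the head pose from RGBD frames and streams it to the headset, which re-renders a locally cached CT volume. The wire format, the one-time volume transfer and every function name here are illustrative assumptions, not the actual implementation.

```python
# Hypothetical sketch of the client/server split described above; the actual
# BH pipeline, compression codec and pose-estimation engine are proprietary.
# Here the server ships the (already compressed) CT volume once, then streams
# a 6-DoF head pose per frame for the headset to align its overlay against.
import socket
import struct

import numpy as np

POSE_FMT = "<7f"  # x, y, z translation + w, x, y, z quaternion


def capture_rgbd() -> np.ndarray:
    """Stand-in for reading one frame from the RGBD camera rig."""
    raise NotImplementedError("camera SDK goes here")


def estimate_head_pose(rgbd_frame: np.ndarray) -> np.ndarray:
    """Stand-in for the pose model: returns a 7-vector (translation +
    quaternion) locating the patient's head relative to the cameras."""
    raise NotImplementedError("pose-estimation engine goes here")


def serve(compressed_ct: bytes, host: str = "0.0.0.0", port: int = 9000) -> None:
    """Accept one headset client, ship the CT volume, then stream poses."""
    with socket.create_server((host, port)) as server:
        conn, _ = server.accept()
        with conn:
            # One-time, length-prefixed transfer of the CT volume.
            conn.sendall(struct.pack("<I", len(compressed_ct)) + compressed_ct)
            while True:
                pose = estimate_head_pose(capture_rgbd())
                # 28 bytes per frame; the headset re-renders the cached
                # volume at this pose instead of receiving new imagery.
                conn.sendall(struct.pack(POSE_FMT, *pose))
```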

This was a complex engineering project done in a very short time. After this prototype was completed, the team had a better understanding of the limitations of such a setup and the need for a better system.

“The prototype system was somewhat impractical for a surgical setting, but it was essential for better understanding our needs,” Mukit said. “First, the system couldn’t estimate the head pose in surgical settings, when most of the patient’s body except the head was covered in clothing. Next, the system needed time-consuming camera calibration steps every time we exited the app.

“This was a problem since, in our experience, surgeons accept only those devices that just work from the get-go,” he continued. “They don’t have the time to fiddle around with technology while they are concentrating on life-altering procedures. We also deeply felt the need for a way to control the system via voice commands. This is essential in surgical settings, as surgeons will always have their hands busy.”

With voice control, surgeons would not contaminate their hands by touching a computer to operate the system or by taking off the device for recalibration. The team realized that a new, more convenient and seamless system was essential.

“I started working on building a better system from scratch in 2019, once the official collaboration with BH ended,” Mukit said. “Since then, we have moved most of the essential tasks to the edge: the head-mounted display itself. We also leveraged CT scan data to train and deploy machine learning models, which are more robust at head-pose estimation than the earlier approach.

“We developed ‘marker-less tracking,’ which allows the CT scan or other images to be superimposed using artificial intelligence instead of cumbersome markers to guide the way,” he added. “We then eliminated the need for any manual camera calibration.”

Finally, they added voice commands. Together, these changes made the apps plug-and-play for surgeons, Mukit said.
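A minimal sketch of those two conveniences, markerless pose tracking on the device and hands-free voice control, might look like the following. The pose model and the command keywords are invented stand-ins, not the team's actual components.

```python
# Illustrative sketch of markerless tracking plus voice commands. The pose
# regressor and the keyword set are assumptions for the sake of the example.
from typing import Callable, Dict

import numpy as np


def markerless_head_pose(frame: np.ndarray,
                         model: Callable[[np.ndarray], np.ndarray]) -> np.ndarray:
    """Run a learned pose regressor directly on a headset camera frame:
    no fiducial markers and no per-session camera calibration."""
    return model(frame)  # e.g. a 4x4 transform aligning CT space to the patient


def dispatch_voice_command(transcript: str,
                           commands: Dict[str, Callable[[], None]]) -> None:
    """Map a recognized utterance to an action, so a scrubbed-in surgeon
    never has to touch the device."""
    for keyword, action in commands.items():
        if keyword in transcript.lower():
            action()
            return


commands = {
    "show scan": lambda: print("overlaying CT volume"),
    "next step": lambda: print("advancing to the next planned step"),
    "hide overlay": lambda: print("hiding CT volume"),
}
dispatch_voice_command("Next step, please", commands)  # -> advancing to the next planned step
```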

“Due to their convenience and usefulness, the apps were very warmly welcomed by the OU Medicine surgeons,” he noted. “Suddenly, ideas, feature requests and queries were pouring in from different medical experts. I realized then that we had something really special in our hands, and that we had only scratched the surface. We started developing these features for each unique genre of surgery.”

Gradually, this enriched the system with useful features and led to unique innovations, he added.

RESULTS

El Amm has begun using the device during surgical cases to enhance the safety and efficiency of complex reconstructions. Many of his patients come to him for craniofacial reconstruction after a traumatic injury; others have congenital deformities.

Thus far, he has used the device for several cases, including reconstructing a patient’s ear. The system took a mirror image of the patient’s intact ear, and the device overlaid it on the reconstruction side, allowing El Amm to attach the reconstructed ear precisely. In the past, he would cut a template of the ear and gauge placement with the naked eye.
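The mirroring step itself is geometrically simple. A toy sketch, assuming the ear's surface points are already registered in a head-centered frame with x as the left/right axis (real CT data would require registration first):

```python
# Toy sketch of the mirroring idea: reflect the intact ear's surface points
# across the mid-sagittal plane to get an overlay template for the opposite
# side. Coordinates and points below are invented for illustration.
import numpy as np


def mirror_across_sagittal(points: np.ndarray) -> np.ndarray:
    """Reflect an (N, 3) point cloud across the x = 0 plane."""
    mirrored = points.copy()
    mirrored[:, 0] *= -1.0  # flip the left/right coordinate
    return mirrored


ear = np.array([[0.07, 0.00, 0.02],  # a few sample surface points (meters)
                [0.07, 0.01, 0.03]])
template = mirror_across_sagittal(ear)  # render this on the reconstruction side
```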

In another surgical case, which required an 18-step reconstruction of the face, the device overlaid the patient’s CT scan on top of his real bones.

“Each one of those bones needed to be cut and moved in a precise direction,” El Amm said. “The device allowed us to see the bones individually, then it displayed each of the cuts and each of the movements, which allowed the surgeon to verify that he had gone through all those steps. It’s basically walking through the steps of surgery in virtual reality.”
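One plausible way to represent such a staged plan on the device is an ordered list of per-segment rigid transforms that can be replayed one step at a time for verification. The sketch below is a hypothetical data model; the segment names, notes and transforms are invented, not the OU system's actual plan format.

```python
# Hypothetical data model for a staged plan like the 18-step case above: an
# ordered list of per-segment rigid transforms the headset can replay step
# by step. Every value here is illustrative.
from dataclasses import dataclass

import numpy as np


@dataclass
class PlanStep:
    segment: str           # which bone piece this step cuts or moves
    transform: np.ndarray  # 4x4 rigid transform planned for the piece
    note: str              # human-readable description of the cut/move


def replay(plan: list[PlanStep]) -> None:
    """Narrate each step; on the headset this would render the segment
    at its planned pose so the surgeon can verify it was completed."""
    for i, step in enumerate(plan, start=1):
        print(f"Step {i}/{len(plan)}: {step.segment}: {step.note}")


plan = [PlanStep("zygoma", np.eye(4), "osteotomy, advance 4 mm"),
        PlanStep("maxilla", np.eye(4), "Le Fort I cut, reposition")]
replay(plan)
```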

ADVICE FOR OTHERS

“When you change the way you see the world, you change the world you see,” Mukit said. “That is what mixed reality was made for. MR is the next general-purpose computer. Powerful technology will no longer be in your pocket or on your desk.

“Through MR, it will be integrated with your human self,” he continued. “It will change how you solve problems, which in turn will lead to new creative ways of solving problems with AI. I think that within the next few years we are going to see another technology revolution, especially after a mixed reality headset, reported to be lighter than any other visor on the market, is unveiled in 2023.”

Currently, almost every industry is integrating mixed reality headsets into their businesses – rightly so, as the gains are evident, he added.

“This technology is now mature enough for countless possible applications in almost every industry, and especially in healthcare,” he concluded. “Mixed reality has not yet fully made its way into this industry. We have only scratched the surface, and already, in a few months, we have seen an overwhelming tsunami of ideas from experts, ideas that can now be implemented with ease.

“These application scenarios range from education and training to making surgeries safer, faster and more economical for both the surgeons and patients. The time to jump into mixed reality is now.”

Twitter: @SiwickiHealthIT
Email the writer: [email protected]
Healthcare IT News is a HIMSS Media publication.
