MediView XR has raised $4.5 million to equip surgeons with augmented reality imagery that effectively gives them 3D X-ray-like vision.

With Microsoft HoloLens or other AR goggles, surgeons can insert an instrument into a patient and see an animation that shows exactly where the instrument is going under the skin. The AR Surgical Navigation Platform will help surgeons remove cancerous tumors in a way that is similar to controlling a video game.

Healthcare is proving to be one of the promising areas where new AR and virtual reality technologies — often dubbed mixed reality — can be used to change the way that long-established procedures are done.

“What we’re doing is creating a way to navigate to a cancerous lesion in a way that’s never been done before with augmented reality,” said John Black, CEO of MediView, in an interview with VentureBeat. “It’s like using a kind of GPS with augmented reality to navigate to a specific point in the human body. You’ve probably seen a little bit of noise out in the world about augmented reality. We are really one of the only companies using it for true surgical navigation. So tracking both the patient, the hologram, and the instrument to get there.”

How AR delivers benefits for cancer surgery

Above: What a doctor sees during surgery with AR goggles and MediView software.

Image Credit: MediView

The new technology came out of the Lerner Research Institute at the Cleveland Clinic. It has moved into its second round of in-human evaluation surgeries.

The system leverages the capabilities of Microsoft’s HoloLens (the company is developing a version for HoloLens 2 as well) to let the surgeon look directly into a patient and see tumors in 3D through the patient’s skin, even allowing the surgeon to walk around and view the tumor, organs, and other anatomy from any angle. The system lets the surgeon perform the procedure without taking their hands off the patient or the tool they are manipulating to cut, or ablate, the tumor away.

The seed funding comes from Inside View, a Northwest Ohio venture group based in Findlay, Ohio. Other investors include Plug and Play Ventures and the Northwest Ohio Tech Fund. The money will be used to further develop the system and get it through Food and Drug Administration (FDA) approval, which the company expects to happen in 2021. MediView has already used the system on five liver tumor patients in a first set of trials and started a new nine-patient human trial in August 2019.

Charles Martin III, an interventional radiologist at the Cleveland Clinic, was the first in the world to use the system. He was motivated to do so by seeing first-hand the limitations of traditional medical imaging technology and all the radiation exposure associated with those procedures.

“Since we are able to use a patient’s own anatomy to make these images, the image quality is actually quite good,” Martin said in an interview. “The resolution will continue to improve as processors get faster and newer versions of the devices improve. The rendered images do have a color code, which assists in distinguishing everything when we make the holograms. What’s nice is this lets us perform a minimally invasive therapy without having to open the patient up more. We have this additional information, right there, right in front of us.”

Martin said that using a Microsoft HoloLens with its “XR” holographic visualizations has the potential to make his job much easier, because he can now see the tumor in three dimensions during surgery, while the more traditional fluoroscopy technique offers only 2D displays.

Black said, “As the CEO, when I saw this technology, I was visibly shaken. We’ve seen that same response from everybody in healthcare.”

How it works

Above: Surgeons can plan their approach based on images of a patient’s real body ahead of time. Then they can execute the surgery in real-time with the same view in AR.

Image Credit: MediView

The Real-Time, Fused Holographic Visualization system, which puts a Microsoft HoloLens onto the face of the lead surgeon, works like a missile-guidance system. It lets the surgeon plan out a digital path to the patient’s tumor, which improves the patient’s experience and understanding of what will be done during their procedure. During surgery, like in a comic book or Star Trek, a surgeon who puts on the HoloLens can look directly at their patient and see all of the patient’s internal anatomy under the skin.


CT or MRI scans are used to build the initial 3D map of the patient (sort of like the map Google Maps uses to show you where you are), which then allows the surgeon to navigate to the desired location within the patient. With this in place, surgeons can see the correct position of the patient’s organs, blood vessels, bones, and other structures.

Surgeons can see the cancerous lesions within the organs. They can peel back layers of anatomy or make different anatomy appear or disappear. As the surgeon moves around the patient, the internal anatomy remains in its correct anatomical location. When the surgeon picks up a surgical instrument, he or she will see a lightsaber-like light-ray emanating from the end of the tool. As the surgeon brings the instrument toward the patient, the surgeon can see that light-ray extending into the patient and how it will intersect each piece of the anatomy. As the surgeon reaches the targeted location, the system alerts him or her to proper placement.
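
In geometric terms, that guidance amounts to checking how the tool’s pointing ray relates to the target lesion and raising an alert when the tip gets close enough. Here is a minimal sketch of that idea; the function name, thresholds, and coordinates are hypothetical illustrations, not MediView’s actual code.

```python
import numpy as np

def ray_guidance(tip, direction, target, alert_radius_mm=3.0):
    """Check how the tool's pointing ray relates to a target lesion.

    tip       -- 3D position of the instrument tip (mm)
    direction -- vector the instrument is pointing along
    target    -- 3D position of the lesion center (mm)
    Returns (distance of the target from the ray, distance from tip to target,
             True if the tip is within the alert radius of the target).
    """
    direction = direction / np.linalg.norm(direction)
    to_target = target - tip
    # Closest point on the ray to the target (clamped so it lies ahead of the tip).
    t = max(np.dot(to_target, direction), 0.0)
    closest = tip + t * direction
    off_axis = np.linalg.norm(target - closest)   # how far the ray misses the lesion
    depth_remaining = np.linalg.norm(to_target)   # how far the tip still has to travel
    return off_axis, depth_remaining, depth_remaining <= alert_radius_mm

# Example: tool 40 mm above the lesion, pointing almost straight at it.
off_axis, depth, arrived = ray_guidance(
    tip=np.array([0.0, 0.0, 40.0]),
    direction=np.array([0.0, 0.0, -1.0]),
    target=np.array([1.0, 0.0, 0.0]))
print(f"off-axis {off_axis:.1f} mm, depth remaining {depth:.1f} mm, arrived={arrived}")
```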

“We love our surgeons, but the environment they’re forced to work in, with their head and body and hands, is oriented away from the surgical site,” Black said. “They can look at flat-panel monitors to make sense of a 2D image that is really a three-dimensional problem. This is a beautiful way, almost gamification of a surgical procedure that we’ve never seen before.”

A safer system

Above: MediView provides feedback in real-time in the middle of a procedure.

Image Credit: MediView

Surgeons and patients will see another gain from MediView’s Real-Time, Fused Holographic Visualization, or RTFHV, system: dramatically reduced radiation exposure. MediView’s extended reality (“XR”) medical imaging system no longer requires surgeons to use as much X-ray radiation while operating on just about any type of cancer.

Black said the technology has the capability to tremendously reduce radiation exposure for clinicians and patients but will not completely eliminate the need for some radiation-based imaging. With MediView’s system, radiation-based imaging will still be used for preoperative diagnosis and planning but will be greatly reduced during actual procedures. Ultrasound uses sound waves to image the human body and does not use harmful radiation like X-rays do. MediView leverages ultrasound technology for intraoperative verification of imaging alignment and tool placement to further reduce radiation.

All of this is done without the radiation emitted by the X-ray machines currently used for the same procedures. Instead, the system “fuses” sensor readings from ultrasound devices with CT or MRI scans, along with positioning data from the tool the surgeon is using to cut away tumors, and presents them on the most advanced 3D holographic display used in surgery rooms today.

“One of the things that I kept seeing that we were running into with complications was just a visualization piece of it,” said Karl West, Director of Medical Devices at Cleveland Clinic and Director of the Lerner Research Institute, where MediView’s device was developed, in an interview. “Physicians were very challenged to place a complex three-dimensional device into a complex three-dimensional anatomy using 2D fluoroscopy. Fluoroscopy uses ionizing radiation for imaging. And so everyone in the room is being exposed to ionizing radiation.”


A revolutionary improvement in visualization

Above: MediView can speed up surgery and training too.

Image Credit: MediView

At the Cleveland Clinic, four critical pieces of technology were refined and synchronized in 3D to bring X-ray vision and guidance to surgery rooms. The first is active anatomic CT/MRI registration, a core technology that gives the surgeon 3D X-ray-like vision into their patient.

Using a patient’s own CT or MRI, the surgeon can see beneath the skin to the patient’s organs, blood vessels, bones, and other unique structures. The surgeon can identify critical anatomy and risk structures to achieve optimal guidance and a definitive location of the cancerous tumor.

“Every time you go into the doctor now we get a CAT scan or MRI. We take that MRI, which has a tremendous amount of information in it, and we put it to use. We extrapolate out of that data set the anatomy of interest. Through a process called segmentation, we then export it into our unique algorithms that we’ve developed here within the lab,” said West. “And we fuse that patient-specific data to the patient through a registration process. And then using EM tracking ultrasound and the HoloLens, we were able to put this holographic guidance platform together. So if you can imagine, we put the HoloLens on and look at the patient like having X-ray vision. You can see it inside the patient. It’s a trick but really, you’re seeing the anatomy just as it is within the patient. And then all our tools that we use for minimally invasive procedures are also tracked and rendered in the 3D format within that hologram. So it allows you to navigate to your target in real-time.”
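
The registration step West describes, aligning landmarks segmented from a CT or MRI with the same landmarks located on the patient, is commonly solved as a least-squares rigid fit. The sketch below shows one standard way to do that (a Kabsch/SVD fit on matched landmarks); it is an illustration of the general technique on made-up data, not MediView’s proprietary algorithm.

```python
import numpy as np

def rigid_registration(ct_points, patient_points):
    """Least-squares rigid transform (rotation R, translation t) such that
    R @ ct_point + t ≈ patient_point, given matched 3D landmark sets (N x 3)."""
    ct_centroid = ct_points.mean(axis=0)
    pt_centroid = patient_points.mean(axis=0)
    H = (ct_points - ct_centroid).T @ (patient_points - pt_centroid)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = pt_centroid - R @ ct_centroid
    return R, t

# Toy example: the "patient" landmarks are the CT landmarks rotated and shifted.
rng = np.random.default_rng(0)
ct = rng.random((6, 3)) * 100.0                      # six landmarks in CT space (mm)
angle = np.deg2rad(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
patient = ct @ R_true.T + np.array([5.0, -12.0, 30.0])

R, t = rigid_registration(ct, patient)
residual = np.linalg.norm(ct @ R.T + t - patient, axis=1).max()
print(f"worst landmark error after registration: {residual:.6f} mm")
```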

The tech also uses Preoperative Plan with Intraoperative Display. Other AR companies let you review 3D images to plan surgery. MediView takes this a leap further by letting the physician actually plan their trajectory with the light-ray tool targeting the tumor during the procedure.

And MediView uses intraoperative tool tracking. MediView’s system tracks the surgeon’s tools throughout the procedure. When a surgeon picks up an instrument and brings it into the operative workspace, the tool is transformed into a holographic image, allowing it to be visualized prior to entering and while inside the patient.

This allows the surgeon to continuously see if that tool is in the right place and whether they need to advance or adjust trajectory based on real-time navigation.
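
Once a registration transform is known, rendering the tracked instrument inside the hologram is essentially a matter of chaining coordinate frames: tool-to-tracker, then tracker-to-patient. A minimal sketch with hypothetical frame names and poses (not MediView’s API):

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed inputs: the EM tracker reports the tool pose in tracker coordinates,
# and registration gives the tracker-to-patient (CT/hologram) transform.
tool_in_tracker = make_transform(np.eye(3), np.array([10.0, 0.0, 250.0]))
tracker_in_patient = make_transform(np.eye(3), np.array([-200.0, 50.0, 0.0]))

# Chain the frames: the tool pose expressed in the patient's (hologram) coordinates.
tool_in_patient = tracker_in_patient @ tool_in_tracker

# The tool tip sits 150 mm down the instrument's own z-axis (hypothetical geometry).
tip_local = np.array([0.0, 0.0, 150.0, 1.0])
tip_in_patient = tool_in_patient @ tip_local
print("tool tip in hologram coordinates (mm):", tip_in_patient[:3])
```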

If the tool is properly aligned with the planned path, proper placement is confirmed by changing the color of the display from red to green. Similar to a self-driving car that follows a map, this system will turn back to red to warn a surgeon that they are off their planned path.
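
That red/green cue can be modeled as a simple tolerance check against the planned entry-to-target line: how far the tip sits off the line and how much the tool’s heading deviates from it. A hedged sketch of that check, with made-up thresholds and names:

```python
import numpy as np

def path_feedback(tip, heading, entry, target,
                  max_offset_mm=2.0, max_angle_deg=5.0):
    """Return 'green' if the tool tip and heading stay within tolerance of the
    planned entry-to-target path, otherwise 'red'."""
    path_dir = (target - entry) / np.linalg.norm(target - entry)
    heading = heading / np.linalg.norm(heading)

    # Lateral distance of the tip from the planned line.
    offset = np.linalg.norm(np.cross(tip - entry, path_dir))
    # Angle between the tool's heading and the planned direction.
    angle = np.degrees(np.arccos(np.clip(np.dot(heading, path_dir), -1.0, 1.0)))

    aligned = offset <= max_offset_mm and angle <= max_angle_deg
    return "green" if aligned else "red"

# On the planned path and pointed along it -> green; drifting sideways -> red.
entry, target = np.array([0.0, 0.0, 50.0]), np.array([0.0, 0.0, 0.0])
print(path_feedback(np.array([0.0, 0.5, 30.0]), np.array([0.0, 0.0, -1.0]), entry, target))
print(path_feedback(np.array([4.0, 0.0, 30.0]), np.array([0.2, 0.0, -1.0]), entry, target))
```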

“If you’re standing there and you stick something through your skin inside your body, you can no longer see where it’s going,” West said. “What we’re doing with the HoloLens technology and EM tracking is that now, when you place something inside through the skin, you can still see it, you can still see where it’s going. You can see the tip of it, you can still see your lesion. So what we’re doing is guiding the physician, the caregiver, to the spot he needs to be using the 3D holographic technology.”

And the tech uses a Real-Time Holographic Ultrasound Overlay. The final piece of MediView’s system is a first-of-its-kind holographic ultrasound technology. This technology produces a separate 3D hologram using ultrasound. This hologram is created during the procedure as the surgeon scans over the patient with the ultrasound probe. This provides real-time confirmation that all imaging, anatomy and tools are properly aligned.
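
Conceptually, such an overlay maps each pixel of the 2D ultrasound image into 3D space using the probe’s tracked pose, so the slice appears in the hologram where it was acquired. A simplified sketch of that mapping, with an invented probe pose and pixel spacing rather than real calibration data:

```python
import numpy as np

def ultrasound_pixel_to_patient(pixel_uv, probe_pose, mm_per_pixel=0.2):
    """Map a 2D ultrasound pixel (u, v) into 3D patient/hologram coordinates.

    pixel_uv   -- (column, row) in the ultrasound image
    probe_pose -- 4x4 transform from the image plane's frame to patient coordinates
                  (would come from tracking plus probe calibration)
    """
    u, v = pixel_uv
    # Place the pixel on the image plane: x across the probe face, y along depth.
    point_in_image = np.array([u * mm_per_pixel, v * mm_per_pixel, 0.0, 1.0])
    return (probe_pose @ point_in_image)[:3]

# Hypothetical probe pose: image plane rotated 90 degrees about x, 100 mm above origin.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
probe_pose = np.array([[1.0, 0.0, 0.0,   0.0],
                       [0.0,   c,  -s,   0.0],
                       [0.0,   s,   c, 100.0],
                       [0.0, 0.0, 0.0,   1.0]])

print(ultrasound_pixel_to_patient((128, 200), probe_pose))
```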

Why is this a breakthrough? Because for the first time these holographic images are “tracked” to moving soft tissues with what is called “active registration.” Other systems can only track bones, since they are rigid structures that are easier to track and visualize. The MediView system registers the images and the guidance system not just to the tumor itself, but to the other soft tissues that move as the patient breathes or shifts.


Bones are rigid structures that do not individually move. Soft tissues, however, are dynamic and can continuously move throughout a procedure. The simple breathing of the patient can move organs. MediView’s proprietary system follows soft tissue movement and continuously aligns the surgeon’s visualizations with the patient’s moving tissues. It is this dynamic or active registration of the hologram with mobile soft tissue anatomy that is the initial key differentiator for MediView. The synchronized interplay of the hologram with the guidance technologies separates MediView from the remainder of this emerging technology space.
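
Active registration can be pictured as re-running the alignment with freshly tracked landmark positions every frame, so the hologram keeps following tissue that moves with breathing. The toy sketch below shows a translation-only version of that update loop with simulated breathing motion; a real system would have to solve for full, potentially deformable, soft-tissue motion.

```python
import numpy as np

def breathing_landmarks(t_seconds):
    """Fake tracked landmarks that drift up and down ~10 mm with breathing."""
    base = np.array([[0.0, 0.0, 0.0], [40.0, 0.0, 0.0], [0.0, 60.0, 0.0]])
    return base + np.array([0.0, 0.0, 10.0 * np.sin(2 * np.pi * t_seconds / 4.0)])

reference = breathing_landmarks(0.0)   # landmark positions when the CT was registered
hologram_offset = np.zeros(3)

for frame in range(5):                 # pretend we re-track the landmarks every second
    current = breathing_landmarks(float(frame))
    # Translation-only correction: shift the hologram by the mean landmark displacement.
    hologram_offset = (current - reference).mean(axis=0)
    print(f"t={frame}s  hologram shifted by {hologram_offset.round(2)} mm")
```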

To provide comprehensive and accurate guidance, the surgeon is able to see all of MediView’s data and system capabilities synchronized together in real-time 3D as they move around the patient in the surgical suite. Using hand gestures and voice commands, the surgeon has complete autonomous control of all data and imaging being presented to them to enhance clinical decision making. The operator can reposition, rotate, scale, and select components of the holograms to simplify their job and improve clinical workflow.


In the future, Black sees numerous other uses and applications where this technology can transform care. MediView will use the funding to develop the technology further to help surgeons with spine, neurosurgery, breast, ENT, orthopedic, and more general surgeries as well. Black is convinced that “this holographic technology will transform a wide variety of procedures and help countless surgeons deliver better care to their patients.”

Expanding the company

Above: Fluoroscopy machines can cost $3 million.

Image Credit: Siemens Healthcare

Black started out in construction engineering, and today he applies that experience to building devices that improve healthcare.

Mina Fahim and Greg Miller will be joining MediView as chief technology officer and chief information officer, respectively, in October. Fahim said that “a substantial portion of complications in surgeries could be avoided with better visualization technology.” Fahim, formerly of Medtronic and St. Jude Medical, is a research and development engineer by trade.

Miller co-founded two startups, most recently CentraComm, where he had oversight of day-to-day operations and has worked in vision and manufacturing automation. He said, “MediView’s use of futuristic technology will save many lives.”

MediView has licensed two key patents from the Cleveland Clinic. The final cost of the system is still being evaluated, as it is still in development. A study demonstrating potential cost advantages for hospital systems and surgery centers is pending publication.

At the Lerner Research Institute at the Cleveland Clinic, Karl West led the development team, which used a HoloLens to create 3D holographic representations of a donor’s skull and other anatomy to assess and refine surgical plans. Jeffrey Yanof created the software. West has been developing medical devices for more than 25 years.

Black, who has performed more than 2,000 surgeries to date, co-founded MediView with an investor in April of 2017 specifically to license and commercialize the technology. About 10 people work on the project full time.

“What we really look forward to with this technology as well is that it’s going to open up opportunities for smaller centers to do these more complex procedures that they can’t do now, because they don’t have access to some of the technologies,” Black said. “Some of the smaller hospitals will be able to treat their patients using this new technology. It’s far cheaper than buying a $3 million fluoroscopy system.”
