
Seeing inside the colon … with virtual reality

Bringing together 3D virtual reality, machine learning and the human body.


Imagine that you are inside a colon. A giant colon that you can navigate, travelling forward and looking from side to side in search of polyps and other tumors. This is not science fiction but an innovative, potentially revolutionary anatomical display tool, still a prototype, embedded in a virtual reality headset.

The tool, still in development, is one of the innovations and research projects emerging from the GE Healthcare Global Center of Excellence in Medical Imaging Software in Buc, France, which brings together 200 specialists who have been exploring the field since the 1990s.

“The research center is launching new applications every year,” said Jerome Knoplioch, Principal Software Architect at GE Healthcare.

Their specialty is software that helps radiologists, doctors and surgeons read the human body, make diagnoses and reach their decisions. Their main work today focuses on 3D modelling, virtual reality and the use of big data.

The human body has three dimensions

The center of excellence in Buc is a pioneer in anatomical 3D modelling based on images acquired with General Electric scanners. 3D modelling provides a targeted view of an organ: a doctor can, for example, examine a coronary artery stenosis closely and see inside the cavity itself. The organ can be inspected from every angle: in cross-section, from the inside in 3D, or in a flattened view.

Multiplying the points of view helps doctors confirm their diagnoses and avoid being misled by optical illusions. The model can also be made tangible with a 3D printer, allowing the surgeon to handle a replica of the organ they are going to treat. The next step is projection inside the organ using a virtual reality headset.
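
To illustrate the multi-view idea in code, here is a minimal Python sketch, not the center's actual software: given a CT volume (synthetic here, stored as a NumPy array), it extracts the three orthogonal cross-sections a radiologist might review alongside a 3D rendering.

```python
# Minimal sketch: extract axial, coronal and sagittal cross-sections
# from a CT volume. The volume here is synthetic random data standing
# in for a real scan.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical volume: 128 axial slices of 256 x 256 voxels.
volume = np.random.rand(128, 256, 256)

# Take the centre slice along each axis.
z, y, x = (s // 2 for s in volume.shape)

fig, axes = plt.subplots(1, 3, figsize=(12, 4))
axes[0].imshow(volume[z, :, :], cmap="gray")   # axial (cross-sectional) view
axes[0].set_title("Axial")
axes[1].imshow(volume[:, y, :], cmap="gray")   # coronal view
axes[1].set_title("Coronal")
axes[2].imshow(volume[:, :, x], cmap="gray")   # sagittal view
axes[2].set_title("Sagittal")
plt.show()
```

In practice the same volume also feeds a 3D surface or volume rendering, which is what the VR headset and the 3D printer consume; the slices above are the "flat" counterparts of that rendering.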


Cloud and machine learning

Researchers at the center of excellence are also exploring big data techniques that rely on cloud-connected equipment and machine learning. Machine learning is a form of artificial intelligence in which the machine "learns" from its own "experience", that is, from the data it processes.

“This is an ongoing development process. Today, we are collecting data from the way that we are using our equipment so that we can improve it. Tomorrow, we can teach the machine to recognize cancer by itself,” explained Jérôme Gonichon, Software Director at GE Healthcare.

Of course, this type of innovation requires an accurate and rigorous validation process. The "intelligent" machine will not replace the physician in the decision-making process but will assist them, helping to establish a quicker and more accurate diagnosis and, in turn, to treat more patients in a shorter period of time.
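
As an illustration of the "learning from experience" principle, and not GE's actual pipeline, the following Python sketch trains a simple classifier on hypothetical, radiologist-labelled image features so that it can flag suspicious scans for a physician to review. All names and data here are synthetic assumptions.

```python
# Minimal sketch: a classifier "learns" from labelled examples and then
# suggests which new cases look suspicious. Entirely synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical feature vectors extracted from prior scans (e.g. texture
# or shape descriptors), with labels assigned by radiologists.
features = rng.normal(size=(1000, 16))
labels = (features[:, 0] + 0.5 * features[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# The model's output is only a suggestion; the physician makes the final call.
print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

The held-out test set stands in for the rigorous validation mentioned above: performance is measured on cases the machine has never seen before it is trusted with any assistive role.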

Once connected to the cloud, the procedure takes on a whole other dimension. Some fifty people have been assigned specifically to research in this key area. Rather than storing data locally, GE Healthcare imaging equipment connects to the network (30,000 machines are already on it) and shares information. The scanners can also connect to other data concerning the patients themselves.

Automating numerous tasks means physicians will be able to concentrate on decision-making and treatment. The challenge is to achieve greater knowledge of the human body and how it works.