
Q&A: Software – The Leading Driver in Healthcare Innovation


Mike Harsh, Vice President and Chief Technology Officer (CTO) of GE Healthcare

In many respects, the story of GE Healthcare is a story about software. GE is, of course, known as an engineering and electronics powerhouse, and its healthcare division is one of the world’s leading medical technology providers. But the driving force behind so many of the company’s innovations in recent decades has been software. Software has also fundamentally changed the business of healthcare, driving innovations that are transforming how care is delivered.

We spoke to Mike Harsh, GE Vice President and GE Healthcare Chief Technology Officer (CTO), who shared a perspective shaped by more than 30 years’ experience creating new products for the company. As CTO, Harsh oversees product development across all of GE Healthcare’s product lines — diagnostic imaging, instrumentation, healthcare IT, life sciences and medical diagnostics, which includes contrast agents, molecular diagnostics and high-value in vitro diagnostic products.

This interview is the first in a series of GE Healthcare newsroom articles looking at the company’s leadership in software, and its use in products that are delivering higher-quality care that is accessible to more people at a reasonable cost.

You’ve seen a lot of progress during your time at GE.
Having been with GE for nearly 33 years, I have seen quite a bit of history unfold, particularly with GE Healthcare. And when you consider the role of software in that history, well, it’s really what has enabled our products to do what they do today. Because of software, our medical imaging products are able to analyze acquired data in ways we hadn’t even imagined in the early days, when people were basically looking at shadows of anatomy on pieces of film.

In what ways has software been an engine of healthcare innovation, then?
The revolution in minicomputers and microprocessors gave rise to technology we already knew, at least mathematically, could be achieved. The hardware foundation had arrived to enable us to develop a computed tomography (CT) system. And that’s what led us to build out our image processing lab here in Wisconsin, which is something we pulled over from the GE Global Research Center, GE’s corporate research and development labs. Until then, there was no way of looking at an image other than throwing pieces of film on a light box. Today I look at a high-quality image one of our systems generates, and I’m aware that it comes from the algorithms we use, the signal processing we use, our knowledge of the physics of our systems — all brought together through software.

We were the first to come up with the image correction algorithms that could actually make a fan beam CT system practical, when there was a lot of talk in the industry about whether it was even possible. After that, there was no looking back. The mathematics and theory had been there for decades, so it was a matter of when the technology would catch up to theory.
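To make that concrete: the decades-old mathematics behind CT reconstruction is, at its core, filtered back-projection. The following is a purely educational sketch (parallel-beam geometry for simplicity; real fan-beam systems add rebinning and weighting steps, and none of this reflects GE's actual correction algorithms):

```python
import numpy as np

def ramp_filter(n_det):
    # Ram-Lak (ramp) filter in the frequency domain
    return np.abs(np.fft.fftfreq(n_det))

def filtered_back_projection(sinogram, angles_deg):
    """Reconstruct an image from a parallel-beam sinogram.

    sinogram: shape (n_angles, n_det), one projection per row.
    angles_deg: view angle of each projection, in degrees.
    """
    n_angles, n_det = sinogram.shape
    H = ramp_filter(n_det)
    # Sharpen each projection along the detector axis
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * H, axis=1))

    # Back-project: smear each filtered projection across the image grid
    image = np.zeros((n_det, n_det))
    center = n_det // 2
    ys, xs = np.mgrid[:n_det, :n_det] - center
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        # Detector coordinate that each pixel maps to under this view angle
        t = xs * np.cos(theta) + ys * np.sin(theta) + center
        t = np.clip(t.round().astype(int), 0, n_det - 1)
        image += proj[t]
    return image * np.pi / (2 * n_angles)
```

Each projection is sharpened with a ramp filter in the frequency domain, then smeared back across the image grid along its view angle; summing over all angles recovers the scanned object.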

How does software fit into GE Healthcare’s business?
People tend to think about software as one thing, but you really can’t look at it that way. Consider a smartphone. On the system level, you have software that enables the phone to function — sending and receiving signals through a mobile network. Then you have the user interface, which makes it easy and efficient for people to operate the phone. And then there’s the ability to use the phone in other ways by downloading apps and exchanging data. The software in our healthcare systems follows similar concepts.

Embedded software enables our equipment to perform everyday tasks. For example, an MR system can have various acquisition sequences to generate medical images. That process is governed by embedded software, and it may run on hundreds of small chips within the equipment.

Application software is one level up from that, governing image analysis or image segmentation, for example. So that software enables us to take acquired image data and transform it into something that’s clinically useful. The same type of software is used on devices such as electrocardiographs that do analysis on an ECG waveform.
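As a toy illustration of what application-level image analysis means (emphatically not GE's algorithms), a classic automatic-threshold segmentation such as Otsu's method shows the basic idea of turning raw pixel intensities into a two-class mask, e.g. tissue versus background:

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Pick the intensity threshold that best separates two classes.

    Otsu's method maximizes the between-class variance over all
    candidate splits of the intensity histogram.
    """
    hist, edges = np.histogram(image.ravel(), bins=nbins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)            # probability of the "low" class
    w1 = 1 - w0                     # probability of the "high" class
    mu = np.cumsum(hist * centers)  # cumulative class mean, unnormalized
    mu_t = mu[-1]                   # global mean intensity
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    between[~np.isfinite(between)] = 0
    return centers[np.argmax(between)]

# A segmentation mask is then simply: mask = image > otsu_threshold(image)
```

Real clinical segmentation involves far more (anatomy models, machine learning, 3-D connectivity), but the shape of the task is the same: pixels in, clinically meaningful labels out.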

Lastly, we have enterprise-level software that runs across a broad network. Our picture archiving and communication systems (PACS) fall into this category. At this level, I’m able to take images from the diagnostic imaging system, review them on the system, do some image analysis at a workstation near the system or elsewhere in the hospital or care setting, and then archive them on a server where they can be accessed by a patient’s healthcare providers around the world, anytime, anywhere. And I mean ‘anywhere’ literally, as we’ve pushed strongly into mobile devices and cloud solutions, and this innovation continues today. What is very important overall is user-experience (UX) design, which ensures our interfaces work the way users work, and the way they think.

How do software developers fit in, organizationally, among GE Healthcare’s other technical areas?
Software today is truly a discipline of engineering. In the early days, there was no software discipline within engineering; it has formalized as tools, techniques and processes have become more advanced.

Our technical community at GE Healthcare breaks down into four areas: mechanical and material technologists, electronic technologists, bioscience/life science technologists and software technologists. All of these disciplines can be represented on any project team, and, depending on the product, one of those technical groups will take the lead. GE Healthcare has over 3,000 software engineers, with over 5,000 in GE overall, developing digital offerings that solve GE customers’ most pressing challenges.

Clearly, software plays different roles across GE Healthcare’s range of products and services. On the one hand, you might have an algorithm that interprets CT image data in a new and improved way, with real medical significance. On the other hand, you might have a software application such as the MUSE Cardiology Information System directing and monitoring the flow of cardiac ECG information, solving a particular clinical problem, improving workflow and ultimately allowing superior patient care. Or, you might simply be talking about an optimized user interface that enables a technician to operate a piece of equipment correctly, easily and efficiently.

What do you recall as some of the milestones in software at GE Healthcare?
Embedded software is certainly one of the biggest, when we could actually put digital signal processing and programmability into our products. That was followed very quickly with our CT scanners and image processing capability. After that came our digital X-ray systems, including digital subtraction angiography, which changed angiography in a big way.

And then MR, which was another big revolution. It was technology we had understood since the mid-1950s, and then we were able to apply it to whole body imaging in the 1980s, as the software became more advanced and as the computational systems were able to handle it. Ultrasound had also been around for some time, but real advances in a medical setting were made in the late 1980s and early 1990s with the advent of the digital beamformer and real-time digital signal processing.

Those are some of the highlights, but at the end of the day, the hardware has really enabled the explosion of the software. Speed, memory and miniaturization of electronics — all of these have been key factors in the amount of software we’ve been able to put in our medical products.

How do you bring clinician needs and perspectives into the software development process?
You can only do what we do with strong partnerships with clinicians and academic institutions. We hold medical advisory boards. We put the technology in clinicians’ hands. We make sure we have engineers embedded with scientists and clinicians around the world, working with medical professionals. That’s how these ideas come back in. It’s an area that continues to be exciting because, as computational systems become more powerful and as algorithms continue to advance, there’s no end to the things we can do. There’s never been a shortage of ideas for as long as I’ve been here.

In what way has software been a competitive differentiator for GE Healthcare?
GE Healthcare has demonstrated real strength in diagnostic instrumentation and software. Our image and signal processing capabilities are just outstanding, not to mention our reconstruction techniques and how we are able to model complete physical systems. And this software technology has enabled us to break new ground in many areas, such as lowering the radiation dose required for certain exams and increasing the speed of MR acquisition by dealing with massive amounts of MR data in new ways.

In other areas of our business, our ability to move mission-critical data around in a care setting wirelessly to a central station in a hospital is something we’re really proud of, and that sets us apart. Our Centricity PACS systems, for example, are used with one of every five radiology exams in the United States and have led to a 70 percent improvement in radiology productivity over the past 15 years. That’s a big number.

Our MUSE system, which is totally software-based and which we’ve had for 15-20 years, brings all our cardiology products together. It gives cardiologists a way to analyze ECG waveforms, including automated ST-segment analysis in software.

In Life Sciences, our division producing technology for drug discovery, biopharmaceutical manufacturing and cellular technologies, we have some incredible tools. One example is the super-high-resolution microscopy equipment we have now for drug discovery and medical research. To illustrate, users are able to examine the behavior of single insulin molecules in a living organism, allowing researchers to see the effects of a drug on cells in a way not previously possible. Our technology is being used to understand how HIV infects a cell. The resolution is so high it’s possible to see how individual HIV particles infect a cell — how they travel down the microtubules in a cell to the nucleus. That type of understanding and visualization has never been possible before, and it will help clinicians design better drugs to treat and even prevent some of the very tough diseases we face. This to me is just amazing.

What do you see as the greatest opportunity for GE Healthcare in the future?
As a company, we have gone from imaging and instrumentation to diagnostics. With GE technologies we can now understand the workings of the human body, not only through hardware and software but our laboratory ‘wetware’ as well. So we’re well positioned to connect the dots — all the way from radiology at an anatomic level through to physiology, metabolics, pathology and all the ‘ologies’ to drive better clinical decisions. Software gives us a way to integrate the data from multiple sources. And if we can bring these data together, we’re going to be able to drive better clinical decisions and prognostics about disease.

At a purely technical level, we need to be increasing computational power through the intelligent design of software. When we first began developing software, we were writing code to deal with array processors to achieve computation speed. But after a while, the hardware got so fast and inexpensive that software advances could ride on the back of that speed — we didn’t need to focus on extremely efficient and parallel programming strategies. But there’s a limit to miniaturization as the technology approaches nanoscale feature sizes. Now it’s software’s turn to step up and drive the next level of performance. So it occurs to me that we need to draw from those early days of software and relearn how to architect our software into chunks that can be processed simultaneously by multi-core systems.
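The chunked, multi-core style he describes can be sketched in a few lines. This is an illustrative toy, with an invented per-slice filter and function names, not any GE codebase: split a volume into independent slices and hand them to a worker pool.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def smooth_slice(sl):
    # Stand-in for per-slice work: a 1-pixel vertical box blur via shifts
    return (sl + np.roll(sl, 1, axis=0) + np.roll(sl, -1, axis=0)) / 3

def smooth_volume(volume, workers=4):
    """Process each 2-D slice of a 3-D volume concurrently.

    The slices are independent chunks, so they can be scheduled on
    separate cores with no coordination beyond the final reassembly.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return np.stack(list(pool.map(smooth_slice, volume)))
```

A thread pool suffices here because NumPy releases the GIL inside its vectorized kernels, letting threads occupy multiple cores; for pure-Python per-chunk work, `ProcessPoolExecutor` is the drop-in swap.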

We’re also getting to the stage where the hardware platform for application software no longer matters. Like so many aspects of modern life, we’re moving toward the Internet of Things. So virtualization and the interconnectivity of our systems are going to drive how we look at our systems in the next five to ten years.