Keep Physicians in the Healthcare Cockpit

by Balu Nair | October 18, 2018

Boardrooms across industries, from manufacturing to aviation and healthcare, are grappling with similar questions that will shape their future.

How can we best use artificial intelligence (AI)? Where’s the balance between computers and humans? And how much automation is too much?

For some industries, finding these answers has been a bumpy, public journey. Think self-driving cars or your latest miscommunication with Alexa.

We need companies that are willing to push the boundaries of technology. And in some industries, we can afford to let those companies take the lead (if not always get behind the wheel).

But the benefits of technology come with inherent risks, and those risks can carry serious consequences.

Take autopilot technology on airplanes, and two events that vividly illustrate how automation can be incredibly effective, but also lure us into a false — and fatal — sense of security.

First, the “Miracle on the Hudson.” The role of pilot Sully Sullenberger has been much celebrated. But as William Langewiesche’s article states, the robotic brain of the Airbus itself deserves more of the credit than it often gets.

Sullenberger made split-second decisions at the controls that saved hundreds of lives. But the airplane also kept a steady pitch, maintained perfect wing levels and made crucial calculations on speed and angle as it descended toward the river.

It was an example of how powerful the synergy between man and machine can be.

But the power of AI in airplanes is so great that it can make some folks forget the value of humans.

After an Asiana flight crashed while trying to land in San Francisco a few years ago, investigators discovered that an over-reliance on automated systems was largely to blame.

The pilots assumed that the plane would maintain the right speed for landing, but they had accidentally turned off that function earlier in the flight. They trusted the technology so much that they stopped paying attention, and three people died in the crash.

The lesson here is particularly important for healthcare, where even small errors can result in immediate adverse effects.

Technology businesses have made no secret of their healthcare ambitions in the past year. And that’s great for everyone, if their ingenuity, processing power and entrepreneurial spirit are deployed in support of the physicians and medical professionals in the proverbial cockpit.

It’s never easy to draw a clear line where the computer’s job ends and the human’s work begins. Whatever line does exist is forever in flux.

AI can analyze huge amounts of data and surface relationships or gaps. But physicians must decide what to do with that information.

Technology can assist in reading body scans or testing blood samples. But we still want humans making the diagnosis, doing the surgery and ultimately delivering care.

As with Sullenberger and the Airbus examples, physicians must make the decisions that will define the course of a patient’s care. Cutting-edge tools and information must enable them to do so with the best chance at success.

Even in areas where AI is already pervasive — for example, in tracking a patient’s medication regimen — human oversight is essential to avoid the Asiana effect. Patients are too important to trust machines, or the companies that make them, to get it right all the time.

The data revolution and the remarkable pace of technological change are disrupting healthcare, but they must not blind us to the industry's most valuable asset: the clinical care team.

As technology takes on an ever-expanding role in the healthcare industry, the most successful companies will be those that let clinical care providers lead the change. These companies ensure the latest devices and systems enable better care, rather than forcing medical professionals to adapt to changes dictated by the technology team.

There’s a place for the Googles and Amazons of the world to help lift the healthcare industry, but it’s not in the pilot’s seat. That role belongs to the people devoting their full attention to the needs of the patient — the machines can be their co-pilots.

Balu Nair is CTO of Gray Matter Analytics, a healthcare analytics solutions company.
