March 8, 2018

Still Waiting for IT to Create Value in Healthcare

by Chas Roades


We will only fully capture the vast promise of information technology in healthcare when we use it to meet the needs of consumers, not just those of payers and providers of care.


  • This week, the annual Health Information and Management Systems Society (HIMSS) conference is taking place in Las Vegas, and is expected to attract more than 50,000 attendees to its showcase of the latest health information technology (IT) trends and products
  • Early headlines from HIMSS focused on the announcement by CMS Administrator Seema Verma that the Federal government is planning to re-examine its approach to interoperability and meaningful use
  • Other notable events at HIMSS so far have included a keynote speech from former Google CEO Eric Schmidt of Alphabet, an unexpected appearance by White House advisor Jared Kushner, and a gigantic pile of paper documents assembled by Athenahealth to demonstrate the problems we still face


1. Amid the annual excitement of HIMSS, it’s worth acknowledging that health IT has mostly been a huge disappointment.

This is not a new observation. Guess when the first HIMSS conference was held. 1962! That’s right, we’ve been trying to figure out how to make IT work in healthcare longer than we’ve had a space program. (In fairness, back then HIMSS was just HMSS, centered on systems and industrial processes in healthcare. IT was added as a focus in 1981, and HMSS became HIMSS in 1986—the Walkman era, not the Apollo era.)

Our industry has notoriously lagged behind the rest of the economy when it comes to harnessing the power of IT and data to drive performance. Think of the impact that IT has had on manufacturing, banking, travel, and countless other industries since 1962. Process efficiency, quality improvement, cost reduction, and labor productivity have all been radically transformed by IT in other parts of the economy—and in the era of the Internet that impact has grown exponentially.

Healthcare IT, by contrast, has always followed the money. Early investment was mostly limited to applications that helped us get paid more, or faster—think of revenue-cycle technologies to support coding, billing, and collections. When I first started working with hospitals back in the 1990s, the revolutionary technology was the “electronic bed board”—pioneering organizations figured out that tracking patient flow through the hospital electronically made sense in a world of case-rate reimbursement. Otherwise, we were still squarely in the world of the paper chart and the fax machine—at a time when factory robots were already making cars, and we could all buy pet food and diapers on this new thing called the Internet.

For most hospitals, IT has become the “thing that ate the capital budget”

Only in the last 10 years have meaningful incentives driven real investment in information-powered care. Since the HITECH Act and the advent of Federal programs around meaningful use and value-based purchasing, our industry has been on an IT spending spree, racing to make progress on “wiring” the health system. By one estimate, we’ve spent as much as $3T in aggregate on implementing health IT since 2009, including $30B of Federal incentives. For most hospitals, IT has become the “thing that ate the capital budget.”

What have we gotten for our money?

  • There’s little evidence that health IT has improved efficiency or reduced cost at all. Rather, the massive installations of expensive electronic health record (EHR) systems have contributed significantly to hospital overhead, and thus to the ever-rising cost of care—someone has to pay for it, and “meaningful use” was, at best, a partially funded mandate from the Federal government.
  • EHRs were largely built for billing and recordkeeping purposes, so an additional layer of costly IT solutions has emerged to “capture the value” of the data these systems generate. Tying those solutions together has created another challenge for providers.
  • Far from enhancing clinicians’ work, we’ve turned the doctor’s gaze away from the patient and toward a computer screen. By one estimate, only 27% of physician time is now spent on direct patient care. This is a key driver of the epidemic of physician “burnout” that now vexes American healthcare.
  • Perhaps most ironically, to lighten the added burden that EHRs add to physicians’ lives, many health systems have now begun to hire “scribes” to keep their doctors happy, adding a whole new labor cost to healthcare just to handle the IT work that was supposed to increase productivity.

Meanwhile, the EHRs themselves have become a drag on innovation. In pursuit of “meaningful use” incentives from the Federal government, hospital systems found themselves choosing from among a handful of proprietary, mutually exclusive platforms (Epic, Cerner, Meditech, and so on). Wiring the system, collecting data, and reporting on metrics was the primary focus early on—true interoperability was left for a later “stage” of meaningful use.

The inability of EHRs to talk to each other has limited our ability to integrate care delivery for the patient. We’ve ended up with “walled gardens” of data, a tower of Babel for providers, and patients still forced to fill out redundant paper forms, with little assurance that their data will be available to the next provider they encounter.

2. What’s wrong with healthcare IT is what ails healthcare generally—the consumer has been an afterthought.

What’s going on here? Was this inevitable? Of course not, but it’s hardly surprising we’ve ended up where we are. These are all symptoms of a larger problem—American healthcare is built around the needs of the wholesale purchaser (third-party payers) and the producer (hospitals and doctors), not the consumer. Until very recently, there has been little impetus—beyond the basic good intentions of those who choose to practice medicine—to invest in anything beyond what the reimbursement system rewards. Little surprise that revenue-cycle technology was for so many years the pinnacle of healthcare IT. Only once payers became interested in (or providers became embarrassed by) quality and safety did the industry turn its attention to solutions like EHRs and “clinical analytics” platforms.

If our industry had been truly built around making care better and more usable for patients and consumers, interoperability would have been first on the list of must-haves for “wiring the health system”—think how fundamental that issue is for patients. We would already have figured out how to use IT for distributed, easy consumer access, as the banking industry did ages ago with ATMs. And rather than gazing longingly at non-healthcare innovations and straining to imagine their application in our industry (“Uber for doctors;” “OpenTable for clinics;” “the App Store for analytics”), the rest of the economy would be trying to steal great innovations from healthcare. There’s a reason no one is clamoring to find the next “Epic for grocery stores.”

When consumers tune in on their iPads for HIMSS 2020, then we’ll know we’re headed in the right direction

It’s laudable that much of what’s being discussed this week at HIMSS (at least judging from tweets and press releases) is more in tune with a consumer-centric view of healthcare. Technologies to support telemedicine, to streamline patient access, to ease scheduling, and to (finally) give the patient control of their own data are dominating the HIMSS headlines. The fact that the “Blue Button” initiative is again being touted by CMS is welcome news…but a little frustrating. How, in 2018, are we still debating whether and how to let patients “own their data?”

Just for contrast, recognize that the biggest annual technology conferences in the rest of the economy—Apple’s Worldwide Developers’ Conference, the Consumer Electronics Show, and so forth—are so eagerly anticipated by consumers that they’ve taken on the look of rock concerts, streamed live and widely covered by mainstream media. When consumers tune in on their iPads for HIMSS 2020, then we’ll know we’re headed in the right direction.

3. Real innovation may ultimately come from players outside our industry—not from our healthcare IT natives.

The other vision of the future, perhaps closer to where we’re going now, is that healthcare IT—and much of the healthcare industry beyond it—just gets swallowed by players from the outside, who are much more attuned to consumers and much better prepared to deploy technology to address real consumer needs.

It’s obvious why our industry erupts in paroxysms of anticipation when Apple announces a new healthcare initiative or Google makes a nod toward patients. Those firms have the one thing that our homegrown, inward-facing technology community has never had to have, until very recently: consumers are in their DNA. Their approach to innovation, investment, and technology deployment starts with consumer needs.

It’s a telling piece of unintentional irony that Jared Kushner—having presumably been excused from solving Middle East peace due to his lack of a security clearance—was sent to HIMSS this week as a herald of the Administration’s renewed focus on addressing the issue of IT interoperability for healthcare consumers. Meanwhile back in New York City, Jared’s brother Josh sits at the helm of Oscar Health, the upstart health insurance company. While Oscar is still experiencing growing pains, what makes their foray into healthcare different from all their larger, legacy competitors is their relentless focus on the consumer. Talk to executives at Oscar, and you immediately realize that they don’t think of themselves as a healthcare company. They think of themselves as a consumer services company, in the healthcare business. Whether or not they succeed, there’s an important insight in what they’re trying to build.

Maybe HIMSS got the wrong Kushner?

Subscribe to the Weekly Gist for executive-level commentary and insights from the week in healthcare, delivered to your inbox every Friday