Over the last few decades, the United States (U.S.) has established one of the most sophisticated and advanced healthcare industries in the world. This has occurred through substantial investments in research, talent, and technology across the industry, from electronic medical records to design thinking, robotics, artificial intelligence (AI), and data analytics. Furthermore, researchers in the U.S. have pioneered groundbreaking therapies, novel methods of care delivery, digitization of health records, treatments for chronic (sometimes terminal) diseases, and population health management. These advancements have tremendously improved human health and life expectancy both in the U.S. and around the world.
Unfortunately, this progress has been challenged by an inability to control the cost of care. In a system that incentivizes volume and disease treatment over prevention and wellness, it is no surprise that the U.S. has the highest per capita cost of care, roughly two and a half times that of comparable developed nations, and that this cost continues to grow every year. Even though the U.S. spends more on healthcare innovation and research than any other industrialized nation in the world, it ranks near the bottom (27th in the world) on key health outcomes tracked by the World Health Organization, such as mortality, chronic health conditions, and mental health. Similar developed nations spend less per capita yet achieve better health outcomes.
Nevertheless, there is a significant disparity in healthcare outcomes between people with insurance coverage and access to quality care and people in countries without strong healthcare infrastructure and expertise. Consider the difference in child mortality rate in the U.S. (10-49 per 1,000) and that of Ethiopia (200 per 1,000); Ethiopia's lack of resources helps to explain its worse outcomes. Further, while the U.S. spends more than $10,000 per capita on healthcare, Ethiopia spends $25 per capita.
Underlying the issue of healthcare costs is the unexpected inverse correlation between the cost and quality of healthcare. This finding has stumped patients, policy makers, doctors, and healthcare administrators for decades. Additionally, the inertia and politics surrounding healthcare have perpetuated the status quo, making it extremely challenging to precipitate change. In contrast to other industrialized nations, the U.S. does not have a uniform healthcare ecosystem, and there is no universal healthcare coverage for its citizens. Rather than operating a national health service, a single-payer national health insurance system, or a multi-payer universal health insurance fund, the U.S. healthcare system can best be described as a hybrid system. In 2014, 48% of U.S. healthcare spending came from private funds, with 28% coming from households and 20% coming from private businesses.
The federal government, on the other hand, accounts for 45% of healthcare spending. Most healthcare, even if publicly financed, is delivered privately. Another significant cost factor is the administrative overhead tied to managing and delivering healthcare. A large portion of the U.S. healthcare workforce consists of clerks, adjudicators, and adjusters, among others, none of whom have a direct role in care delivery.
This white paper explores the historical events, context, and constructs that shaped the current U.S. healthcare system. Its goal is to highlight a path forward that leverages the underlying strengths of the healthcare market, including its people, data, analytics, research, technology, and innovation, to foster value-based preventive care, streamline costs, and elevate the overall quality of care delivered.