This article was first published in the New York State Bar Association’s Health Law Journal, Vol. 17, No. 3 (Summer/Fall 2012).
Due to the sensitive nature of the industry it serves, the American hospital rightly operates under copious federal and state regulations, in addition to volumes of rules and ordinances established by separate, non-governmental entities. Though policing mechanisms such as accreditation, certification and periodic review come from a variety of both public and private sources, the goal is generally consistent: develop uniform standards to ensure that hospitals in the U.S. operate at an acceptable level of safety while delivering quality patient care.
The Many Paths to Accreditation
Though its primary function is without question the delivery of accurate and effective medical treatment, health care is also big business. In an attempt to promote constant vigilance among America’s hospitals, any one institution may be subject to accreditation review at any time from private, non-governmental organizations such as the Joint Commission, the Healthcare Facilities Accreditation Program (HFAP), Accreditation Commission for Health Care (ACHC), Community Health Accreditation Program (CHAP), the Compliance Team, Inc., Healthcare Quality Association on Accreditation (HQAA), or DNV Healthcare, Inc. (DNV), among others.
By and large, each private entity governs through its own set of rules. For example, the Joint Commission surveys hospitals by following more than 276 standards and reviewing 1,612 elements of performance. HFAP does largely the same thing pursuant to its 1,100 or more individual standards. Focusing on home medical equipment as well as durable medical equipment, prosthetics, orthotics and supplies (“DMEPOS”), HQAA has developed a review process consistent with federal standards.
Hospital Accreditation and the Joint Commission
Should a hospital wish to treat Medicare beneficiaries (with the expectation of payment), it must first enter into a provider agreement with Medicare. As a condition precedent to such participation, hospitals must meet certain requirements established by the Social Security Act or imposed by the Secretary of the Department of Health and Human Services (HHS), more commonly referred to as “conditions of participation” (CoPs). Hospitals can satisfy this statutory requirement by certification through a state agency, or alternatively the provider can seek “accreditation by an approved national accreditation organization that all applicable Medicare conditions are met or exceeded.” The federal government recognizes the Joint Commission – as well as certain other organizations that have been confirmed as capable of providing appropriate oversight – as a national accreditation program for hospitals participating in Medicare or Medicaid.
Formed December 15, 1951, as an independent, non-profit entity, the Joint Commission (known until 2007 as the Joint Commission on Accreditation of Hospitals) began as a collaboration between the American College of Physicians, the American Hospital Association, the American Medical Association, the Canadian Medical Association, and the American College of Surgeons. The Joint Commission started its process of administering hospital accreditations in January 1953, evolving over the years from a one-page set of requirements in 1919 (known as “The Minimum Standard”) to a 152-page manual for standards in 1970 (known as the 1970 Accreditation Manual for Hospitals) to the approximately 500-page manual that exists today.
The Joint Commission provides the following mission statement for the organizations with which it partners: “To continuously improve health care for the public, in collaboration with other stakeholders, by evaluating health care organizations and inspiring them to excel in providing safe and effective care of the highest quality and value.” As with all acute care hospital accreditation entities, the Joint Commission must confirm that these providers meet specific and extensive criteria set forth by the federal government.
As part of the rigorous set of standards reviewed in any hospital survey, the Joint Commission integrates performance measures into hospital accreditation oversight through its ORYX® initiative (a term unique to the Joint Commission). First deployed by the Joint Commission in 1997, ORYX core measure data are among the key data elements in the Joint Commission’s focus on improvement. In its original form, ORYX had no industry standard detailing the type or amount of data hospitals should collect, and in fact more hospitals initially resisted than participated in the approach. Today, however, this institutionalized method for gathering information based on quality measures is a federal requirement, and the Joint Commission now accumulates data from hospitals for approximately 60 different inpatient measures. Moreover, not only does the federal government penalize hospitals for non-compliance, but the 2010 Patient Protection and Affordable Care Act (PPACA) may soon make quality and performance the core foundation of health care’s future reimbursement structure.
In November 2010, the Joint Commission outlined a five-year plan to continue its monitoring of the changing health care climate as the organization addresses areas for improvement:
“• Refinement of the process for electronic receipt of high quality standardized performance measure data that cover all aspects of care delivery within and across the various types of health care organizations (e.g., hospitals, long term care, home care, etc.). Approaches to refining this process will include exploration of the potential to expand the capability of the electronic health record to capture measured data as a byproduct of the health care delivery process.
• Expansion of the scope of measure sets available for selection by health care organizations. This includes increasing the complement of measure sets for hospitals to provide a broader menu for measure selection. . . .
• Creation of sophisticated applications of measurement data use for accreditation, accountability and public reporting purposes.
• Coordination of data demands and prioritization of critical measurement areas by the various public and private sector entities to minimize data collection burden and eliminate redundancies for health care organizations, while maximizing the consistency and usefulness of the data. Coordination activities will focus on the amalgamation of data demands by large national entities including CMS, the QIOs, NQF, AHRQ, IOM and others.
• Continued, proactive support for the leadership role of the National Quality Forum in the identification of national measurement objectives and the establishment of a long-term collaborative relationship.
• Continued proactive support for, and participation in, the work of the Hospital Quality Alliance, the AQA, and their combined efforts to harmonize these activities.”
Contemporary Performance Standards in the Context of Modern Health Care
Today, the Joint Commission requires hospitals to collect and submit certain data that fall under its “core measure sets,” covering areas including but not limited to heart attack, heart failure, pneumonia, and the Joint Commission’s Surgical Care Improvement Project. Last month, the Joint Commission released its Annual Report on Quality and Safety, entitled Improving America’s Hospitals (the “Report”), as a means to showcase the commendable achievements of hospitals identified by the Joint Commission as “Top Performers on Key Quality Measures,” as well as to provide a comprehensive analysis of how hospitals accredited by the Joint Commission fared across all measures.
Joint Commission accountability measures connect evidence-based care with positive patient results. The Joint Commission contends that implementation is more effective when it relates to programs wherein the public or an outside regulatory agency holds the provider accountable, similar to the proposed federal regulations for value-based purchasing. The Joint Commission has established four criteria for assessing the success of these evidence-based measures:
“Research: Strong scientific evidence demonstrates that performing the evidence-based care process improves health outcome (either directly or by reducing risk of adverse outcomes).
Proximity: Performing the care process is closely connected to the patient outcome; there are relatively few clinical processes that occur after the one that is measured and before the improved outcome occurs.
Accuracy: The measure accurately assesses whether or not the care process has actually been provided. That is, the measure should be capable of indicating whether the process has been delivered with sufficient effectiveness to make improved outcomes likely.
No Adverse Effects: Implementing the measure has little or no chance of inducing unintended adverse consequences.”
The tables in Appendix A summarize the Report in three areas: (1) heart attack care accountability composite; (2) pneumonia care accountability composite; and (3) joint replacement, just one example contained within the surgical care accountability composite. These tables show a steady increase in the care measure results (the “Care Composite”) for each medical condition and surgical procedure.
When taken at face value in relation to the examples set forth in Appendix A, the Report is difficult to fault, as are the ways in which hospitals have improved the delivery of care in these areas. And yet, viewing these successes in the context of health care in its totality, while not in itself undercutting the Report or its significance as a means to gauge the effectiveness of the accreditation process, portrays a somewhat different image.
The United States spent an estimated $2.6 trillion on national health in 2010 (17.6 percent of U.S. GDP). Some estimates expect this figure to reach $4.64 trillion by 2020 (nearly 20% of U.S. GDP). Trends in California, the nation’s biggest spender, are of special concern as health care expenses continue to grow steadily along with the state’s population, even though California lost approximately 10% of its hospital beds between 2002 and 2009.
While few dispute the statistical information proving that we as a nation spend more on health care every year, the nexus between health care spending and actual revenue trends calls into question the sustainability of a system that finds itself locked into a self-perpetuating spending binge in its bid for survival.
A Comparison Between the Report and Correlating Medicare DRGs
With respect to Tables 1 and 2 in Appendix A (heart attack care measure results), the Joint Commission’s Care Composite was compared with the Medicare diagnosis-related group (DRG) information in 2006 and 2007 for DRG numbers 127 (heart failure and shock) and 140 (angina pectoris), and in 2008 and 2009 for MS-DRG numbers 291 (heart failure and shock with major complication/comorbidity (MCC)), 292 (heart failure and shock with complication/comorbidity (CC)), 293 (heart failure and shock without CC or MCC), and 311 (angina pectoris).
The Medicare revenue percentage for each respective DRG description was calculated by taking the annual national revenue for such acute care and dividing it by the number of patient days for the same year. This data was taken from the Medicare Provider Analysis and Review (MEDPAR) files, which contain information pertaining to 100% of Medicare beneficiaries using hospital inpatient services, compiled as national data for short-stay, inpatient DRGs. From these figures, Appendix A, Table 1 compares the Medicare revenue percentage for heart failure and shock with the Report’s Care Composite in the area of heart attack care for the years 2006 through 2009. Appendix A, Table 2 compares the Medicare revenue percentage for angina pectoris with the Report’s Care Composite in the same area and for the same time frame (2006 to 2009).
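The per-DRG calculation described above reduces to a simple ratio and can be sketched in a few lines. The figures below and the helper name revenue_per_patient_day are hypothetical illustrations only, not actual MEDPAR values or the author’s working files:

```python
# A minimal sketch of the metric described above: annual national DRG
# revenue divided by patient days for the same year. All figures are
# invented placeholders, not actual MEDPAR data.

medpar = {
    # year: (annual national DRG revenue in dollars, patient days)
    2006: (1_000_000_000, 2_500_000),
    2007: (1_020_000_000, 2_400_000),
}

def revenue_per_patient_day(revenue, patient_days):
    """Annual DRG revenue divided by patient days for the same year."""
    return revenue / patient_days

for year, (revenue, days) in sorted(medpar.items()):
    print(year, round(revenue_per_patient_day(revenue, days), 2))
```

With the placeholder figures, this prints 400.0 for 2006 and 425.0 for 2007; the article’s tables would then track the year-over-year movement of this ratio against the Care Composite.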
A similar approach was employed to create Appendix A, Table 3, comparing the Joint Commission’s Pneumonia Care Composite with the appropriate DRGs. For 2006 and 2007, DRG numbers 89 (simple pneumonia and pleurisy, age 18 and older, with CC), 90 (simple pneumonia and pleurisy, age 18 and older, without CC), and 91 (simple pneumonia and pleurisy, under age 18) were used for the study; for 2008 and 2009, MS-DRG numbers 193 (simple pneumonia and pleurisy with MCC), 194 (simple pneumonia and pleurisy with CC), and 195 (simple pneumonia and pleurisy without CC or MCC) were used. The Medicare revenue percentage for each respective DRG description was again calculated by taking the annual national revenue for such acute care and dividing it by the number of patient days for the same year. The source of the data is also the MEDPAR files.
Appendix A, Table 4 (addressing joint replacement, a single example from the Report’s surgical care composite) was created through a compilation of data from within the Report (page 22, Table 6). Using information from three separate line items – (1) “Antibiotics within 1 hour of first cut – For hip joint replacement surgery,” (2) “Appropriate Prophylactic Antibiotics – For hip joint replacement surgery,” and (3) “Stopping Antibiotics within 24 hours – For hip joint replacement surgery” – Appendix A, Table 4 represents the average. The Care Composite for joint replacement was then compared with the appropriate MS-DRG numbers from 2008 and 2009, including 469 (major joint replacement or reattachment of lower extremity with MCC) and 470 (major joint replacement or reattachment of lower extremity without MCC). The Medicare revenue percentage for each respective MS-DRG description was calculated in the same manner: annual national revenue for such acute care divided by the number of patient days for the same year. The source of the data is also the MEDPAR files.
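The averaging step described above is a simple mean of the three measure rates. As an illustration only, with invented percentages that are not taken from the Report:

```python
# Hypothetical measure rates for hip joint replacement surgery; these
# percentages are invented for demonstration and do not reproduce the
# Report's actual values.
measures = {
    "antibiotics_within_1_hour_of_first_cut": 92.0,
    "appropriate_prophylactic_antibiotics": 95.0,
    "antibiotics_stopped_within_24_hours": 89.0,
}

# The joint replacement Care Composite is the simple average of the
# three surgical care measure rates.
care_composite = sum(measures.values()) / len(measures)
print(round(care_composite, 1))  # prints 92.0
```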
If our nation’s track record on health care funding since the inception of Medicare is any indication, it should come as no surprise that hospital reimbursements do not share the same trajectory as Joint Commission quality standards. Indeed, factoring additional variables such as annual inflation and a struggling economy into the equation only further distinguishes the historical paths of performance and payment. As Medicare prepares for a massive shift from cost-based to performance-based reimbursement, a move likely to be followed in quick succession by other payer groups, the contradictory manner in which health care regulations reward annual improvement by reducing reimbursements speaks volumes about a system not just in transition, but in a state of confusion.
To be certain, the evolution of the reimbursement system has been shaped as much by innovation and advancement as it has by politics and a constantly changing definition of the public interest. But in this age of technology, it may be prudent to take stock of the data we have amassed as a means to understand and refine the delicate infrastructure of health care in the U.S. Ultimately, future congressional focus should be directed toward creating a self-sustaining system that improves the delivery of health care throughout the nation and is fair to both the individuals and the institutions that participate in it. This hardly seems like an unreasonable place to start.
Appendix A, Table 1
Joint Commission Heart Attack Care Measure Results Compared with Medicare Revenue for Heart Failure and Shock, 2006 to 2009
Appendix A, Table 2
Joint Commission Heart Attack Care Measure Results Compared with Medicare Revenue for Angina Pectoris, 2006-2009
Appendix A, Table 3
Joint Commission Pneumonia Care Measure Results Compared with Medicare Revenue for Pneumonia and Pleurisy, 2006-2009
Appendix A, Table 4
Joint Commission Surgical Care Measure Results (Joint Replacement) Compared with Medicare Revenue for Joint Replacement, 2008-2009
Hospitals in California (2002 to 2009)
*Number of community hospitals only, which represent 85% of all hospitals according to American Hospital Association data for each year. Federal hospitals, long-term care hospitals, psychiatric hospitals, and other similar institutions are not included.
**Numbers based on the 2010 U.S. Census, the 2000 U.S. Census, and estimates based on a comparison of data from the years 2001 through 2009.
***U.S. Census Bureau’s annual survey of state and local government finances.
 According to Congressional Budget Office estimates, major health programs accounted for 2.9 percent of the nation’s GDP between 1971 and 2010 (averaged). Under the 2010 Patient Protection and Affordable Care Act, this figure may increase to as much as 7.1 percent by 2021. See, e.g., Presentation by Douglas W. Elmendorf, Director, Congressional Budget Office, Federal Budget Math: We Can’t Repeat the Past (June 16, 2011).
 The Joint Commission is an independent, not-for-profit organization that accredits and certifies more than 19,000 health care organizations and programs in the United States. See www.jointcommission.org.
 Established in 1945 to conduct objective reviews of osteopathic hospitals and the services they provide, HFAP surveys hospitals for compliance with the Medicare Conditions of Participation and Coverage. See www.hfap.org.
 ACHC is a national health care accrediting organization designed to create a system catering to small providers. See www.achc.org.
 CHAP is an independent, not-for-profit accrediting body for community based health care organizations. See www.chapinc.org.
 Since 2006, the Compliance Team, Inc. has been a nationally recognized, CMS-approved accrediting body for providers of durable medical equipment, prosthetics, orthotics, and supplies. See www.exemplaryprovider.com.
 HQAA provides home medical or durable medical equipment accreditation programs. See www.hqaa.org.
 The newest accreditation organization for hospitals, DNV received deemed authority from the Centers for Medicare & Medicaid Services in 2008. See www.dnvaccreditation.com.
 This regulatory infrastructure exists in addition to the labyrinth of federal and state laws. See, e.g., 42 U.S.C. Section 1395x(e)(ii).
 See 42 CFR § 424.58. The 2003 Medicare Prescription Drug, Improvement, and Modernization Act (Pub. L. 108-173) required the federal government to implement quality standards for DMEPOS.
 Originally P.L. 74-271, approved August 14, 1935, 49 Stat. 620, and all subsequent amendments thereto.
 See, e.g., 42 U.S.C. §§ 1302, 1395hh, 1395rr; 42 C.F.R. part 482.
 See 74 Federal Register (227) 62333 (Nov. 27, 2009).
 Id. (approving the Joint Commission’s status through July 15, 2014); see also Medicare Improvements for Patients and Providers Act of 2008 (“MIPPA”), § 125 (Pub. L. 110-275) (changing the process of accreditation in 2008 by revoking the Joint Commission’s statutorily-guaranteed “deeming authority” for hospitals and requiring that the Joint Commission apply to, and obtain approval from, the Centers for Medicare & Medicaid Services (CMS)).
 See Roberts, James S., MD, Coale, Jack G., MA, Redman, Robert R., MA, A History of the Joint Commission on Accreditation of Hospitals, 258(7) JAMA 936, 938 (Aug. 21, 1987). The article notes that in 1958, the Canadian Medical Association withdrew from the Joint Commission. Id.
 Comprehensive Accreditation Manual for Hospitals: The Official Handbook (Joint Commission Resources, Inc., March 2011).
 See id. at FW-1 (the Joint Commission revised its mission statement in 2009).
 The author neither addresses nor opines upon the scope of the Joint Commission’s influence in the hospital accreditation process, and does not attempt to compare the Joint Commission with other entities providing similar and/or comparable oversight. Between 2002 and 2011, the author was the chief executive officer of an acute care hospital in Norwalk, California, accredited at all times by both the Joint Commission and HFAP.
 See, e.g., 42 C.F.R. §§ 482.1, 482.2, 482.11, 482.12, 482.13, 482.21, 482.22, 482.23, 482.24, 482.25, 482.26, 482.27, 482.28, 482.30, 482.41, 482.42, 482.43, 482.45, 482.51, 482.52, 482.53, 482.54, 482.55.
 Id. at PM-1 (“ORYX measurement requirements are intended to support Joint Commission – accredited hospitals in their quality improvement efforts. Performance measures are essential to the credibility of any modern evaluation activity for hospitals.”).
 See Chassin, Mark R., M.D., Loeb, Jerod M., Ph.D., et al., Accountability Measures – Using Measurement to Promote Quality Improvement, 363(7) NEJM 683 (Aug. 12, 2010).
 Pub. L. 111-148.
 PPACA, § 3022; 42 C.F.R. § 425 (proposed rules as of April 7, 2011).
 Evolution of Performance Measurement at the Joint Commission 1986-2010: A Visioning Document (available at www.jointcommission.org/assets/1/18/SIWG_Prologue_web_version.pdf).
 Improving America’s Hospitals: The Joint Commission’s Annual Report on Quality and Safety, p. 4 (2011).
 See supra, note 24.
 The Report, p. 29.
 The Report, p. 20, Table 3.
 The Report, p. 21, Table 5.
 The Report, p. 22, Table 6.
 But cf. McCannon, Joseph, AB, Berwick, Donald M., MD, MPP, A New Frontier in Patient Safety, 305(21) JAMA 2221 (June 1, 2011) (concluding that despite the investment in the nation’s health care system since the 1999 report To Err Is Human, medical errors continue to harm hospital patients to such an extent that further change is necessary); Wachter, Robert M., Patient Safety at Ten: Unmistakable Progress, Troubling Gaps, 29(1) Health Affairs 165, 172 (January 2010) (summarizing the success in efforts to enforce safety standards over the past five years as slightly above average).
 See Centers for Medicare & Medicaid Services, Office of the Actuary, National Health Statistics Group; Department of Commerce, Bureau of Economic Analysis and Bureau of the Census; Keehan, Sean P., Sisko, Andrea M., et al, National Health Spending Projections Through 2020, 30 (8) Health Affairs 1594 (August 2011).
 Keehan, supra, National Health Spending Projections Through 2020, p. 1595.
 See Appendix B: Hospitals in California – 2002 to 2009. Between 2002 and 2009, health care spending increased by 34%, and there were 40 fewer hospitals available to treat approximately 2.7 million additional residents.
 As of fiscal year 2008, CMS changed the Medicare inpatient prospective payment system by introducing Medicare Severity Diagnosis Related Group (“MS-DRGs”), thereby creating an entirely new numbering system for DRGs in 2008 and 2009. See 42 U.S.C. § 1395ww; TMA, Abstinence, Education, and QI Programs Extension Act of 2007, P.L. 110-90 (approved Sept. 29, 2007, 121 Stat. 984), § 7(a). Information for DRG and MS-DRG descriptions obtained from the CMS website for fiscal years 2008 and 2009 (https://www.cms.gov/MedicareFeeforSvcPartsAB/).
 See, e.g., id.
 See, e.g., id.
 See, e.g., id.
 See, e.g., id.
 A recent study of the growth in family income in the U.S. over the past decade concluded that the estimated increase from $76,000 in 1999 to $99,000 in 2009 was practically erased by the increase in household spending on monthly health insurance premiums, out-of-pocket health care costs, and tax-related expenses directed toward health care. See Auerbach, David I., Kellerman, Arthur L., A Decade of Health Care Cost Growth Has Wiped Out Real Income Gains for an Average U.S. Family, 30 (9) Health Affairs 1630 (Sept. 2011).
 See supra, note 24.