AGENCY:
Centers for Medicare & Medicaid Services (CMS), HHS.
ACTION:
Proposed rule.
SUMMARY:
In this proposed rule, we are proposing to implement a Hospital Value-Based Purchasing program (“Hospital VBP program” or “the program”) under section 1886(o) of the Social Security Act (“Act”), under which value-based incentive payments will be made in a fiscal year to hospitals that meet performance standards with respect to a performance period for the fiscal year involved. The program will apply to payments for discharges occurring on or after October 1, 2012, in accordance with section 1886(o) of the Social Security Act (as added by section 3001(a) of the Patient Protection and Affordable Care Act (Pub. L. 111-148), enacted on March 23, 2010, as amended by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152), enacted on March 30, 2010 (collectively known as the Affordable Care Act)). The measures we are proposing to initially adopt for the program are a subset of the measures that we have already adopted for the existing Medicare Hospital Inpatient Quality Reporting Program (Hospital IQR program), formerly known as the Reporting Hospital Quality Data for the Annual Payment Update Program (RHQDAPU). Based on whether a hospital meets or exceeds the performance standards that we are proposing to establish with respect to the measures, we are proposing to reward the hospital based on its actual performance, rather than simply its reporting of data for those measures.
DATES:
To be assured consideration, comments must be received at one of the addresses provided below, no later than 5 p.m. on March 8, 2011.
ADDRESSES:
In commenting, please refer to file code CMS-3239-P. Because of staff and resource limitations, we cannot accept comments by facsimile (FAX) transmission.
You may submit comments in one of four ways (please choose only one of the ways listed):
1. Electronically. You may submit electronic comments on this regulation to http://www.regulations.gov. Follow the “Submit a comment” instructions.
2. By regular mail. You may mail written comments to the following address only:
Centers for Medicare & Medicaid Services, Department of Health and Human Services, Attention: CMS-3239-P, P.O. Box 8010, Baltimore, MD 21244-8010.
Please allow sufficient time for mailed comments to be received before the close of the comment period.
3. By express or overnight mail. You may send written comments to the following address only: Centers for Medicare & Medicaid Services, Department of Health and Human Services, Attention: CMS-3239-P, Mail Stop C4-26-05, 7500 Security Boulevard, Baltimore, MD 21244-1850.
4. By hand or courier. If you prefer, you may deliver (by hand or courier) your written comments before the close of the comment period to either of the following addresses:
a. For delivery in Washington, DC— Centers for Medicare & Medicaid Services, Department of Health and Human Services, Room 445-G, Hubert H. Humphrey Building, 200 Independence Avenue, SW., Washington, DC 20201.
(Because access to the interior of the Hubert H. Humphrey Building is not readily available to persons without Federal government identification, commenters are encouraged to leave their comments in the CMS drop slots located in the main lobby of the building. A stamp-in clock is available for persons wishing to retain a proof of filing by stamping in and retaining an extra copy of the comments being filed.)
b. For delivery in Baltimore, MD—Centers for Medicare & Medicaid Services, Department of Health and Human Services, 7500 Security Boulevard, Baltimore, MD 21244-1850.
If you intend to deliver your comments to the Baltimore address, please call telephone number (410) 786-8691 in advance to schedule your arrival with one of our staff members.
Comments mailed to the addresses indicated as appropriate for hand or courier delivery may be delayed and received after the comment period.
For information on viewing public comments, see the beginning of the SUPPLEMENTARY INFORMATION section.
FOR FURTHER INFORMATION CONTACT:
Allison Lee, (410) 786-8691.
SUPPLEMENTARY INFORMATION:
Inspection of Public Comments: All comments received before the close of the comment period are available for viewing by the public, including any personally identifiable or confidential business information that is included in a comment. We post all comments received before the close of the comment period on the following Web site as soon as possible after they have been received: http://www.regulations.gov. Follow the search instructions on that Web site to view public comments.
Comments received timely will also be available for public inspection as they are received, generally beginning approximately 3 weeks after publication of a document, at the headquarters of the Centers for Medicare & Medicaid Services, 7500 Security Boulevard, Baltimore, Maryland 21244, Monday through Friday of each week from 8:30 a.m. to 4 p.m. To schedule an appointment to view public comments, phone 1-800-743-3951.
Table of Contents
I. Background
A. Overview
B. Hospital Inpatient Quality Data Reporting Under Section 501(b) of Public Law 108-173
C. Hospital Inpatient Quality Reporting Under Section 5001(a) of Public Law 109-171
D. 2007 Report to Congress: Plan To Implement a Medicare Hospital Value-Based Purchasing Program
E. Provisions of the Affordable Care Act
II. Provisions of the Proposed Regulations
A. Overview of the Proposed Hospital Value-Based Purchasing Program
B. Proposed Performance Period
C. Proposed Measures
D. Proposed Performance Standards
E. Proposed Methodology for Calculating the Total Performance Score
F. Applicability of the Value-Based Purchasing Program to Hospitals
G. The Exchange Function
H. Proposed Hospital Notification and Review Procedures
I. Proposed Reconsideration and Appeal Procedures
J. Proposed FY 2013 Validation Requirements for Hospital Value-Based Purchasing
K. Additional Information
L. QIO Quality Data Access
III. Collection of Information Requirements
IV. Response to Comments
V. Regulatory Impact Statement
A. Overall Impact
B. Anticipated Effects
C. Alternatives Considered
D. Accounting Statement
Acronyms
Because of the many terms to which we refer by acronym in this proposed rule, we are listing the acronyms used and their corresponding meanings in alphabetical order below:
AHRQ Agency for Healthcare Research and Quality
AMI Acute Myocardial Infarction
CCN CMS Certification Number
CMS Centers for Medicare & Medicaid Services
DRG Diagnosis-Related Group
FISMA Federal Information Security Management Act
HCAHPS Hospital Consumer Assessment of Healthcare Providers and Systems
HF Heart Failure
HIPAA Health Insurance Portability and Accountability Act
HOP QDRP Hospital Outpatient Quality Data Reporting Program
IPPS Inpatient Prospective Payment System
IQR Inpatient Quality Reporting
NQF National Quality Forum
PN Pneumonia
PQRI Physician Quality Reporting Initiative
PRRB Provider Reimbursement Review Board
PSI Patient Safety Indicator
QIO Quality Improvement Organization
QRS Quality Review Study
RFA Regulatory Flexibility Act
RHQDAPU Reporting Hospital Quality Data for the Annual Payment Update Program
RIA Regulatory Impact Analysis
SCIP Surgical Care Improvement Project
VBP Value-Based Purchasing
I. Background
A. Overview
The Centers for Medicare & Medicaid Services (CMS) promotes higher quality and more efficient health care for Medicare beneficiaries. In recent years, we have undertaken a number of initiatives to lay the foundation for rewarding health care providers and suppliers for the quality of care they provide by tying a portion of their Medicare payments to their performance on quality measures. These initiatives, which include demonstration projects and quality reporting programs, have been applied to various health care settings, including physicians' offices, ambulatory care facilities, hospitals, nursing homes, home health agencies, and dialysis facilities. The overarching goal of these initiatives is to transform Medicare from a passive payer of claims to an active purchaser of quality health care for its beneficiaries.
This effort is supported by our adoption of an increasing number of widely-agreed upon quality measures for purposes of our existing quality reporting programs. We have worked with stakeholders to define measures of quality in almost every setting. These measures assess structural aspects of care, clinical processes, patient experiences with care, and, increasingly, outcomes.
We have implemented quality measure reporting programs that apply to various settings of care. With regard to hospital inpatient services, we implemented the Hospital IQR program. In addition, we have implemented quality reporting programs for hospital outpatient services through the Hospital Outpatient Quality Data Reporting Program (HOP QDRP), and for physicians and other eligible professionals through the Physician Quality Reporting Initiative (PQRI). We have also implemented quality reporting programs for home health agencies and skilled nursing facilities based on conditions of participation, and an end-stage renal disease quality reporting program based on conditions for coverage.
This new program will necessarily be a fluid model, subject to change as knowledge, measures and tools evolve. We view the Hospital VBP program under section 1886(o) of the Social Security Act (the Act) as the next step in promoting higher quality care for Medicare beneficiaries and transforming Medicare into an active purchaser of quality health care for its beneficiaries.
In developing this rule as well as other value-based payment initiatives, CMS applied the following principles for the development and use of measures and scoring methodologies.
Purpose:
CMS views value-based purchasing as an important step to revamping how care and services are paid for, moving increasingly toward rewarding better value, outcomes, and innovations instead of merely volume.
Use of Measures:
- Public reporting and value-based payment systems should rely on a mix of standards, process, outcomes, and patient experience measures, including measures of care transitions and changes in patient functional status. Across all programs, CMS seeks to move as quickly as possible to the use of primarily outcome and patient experience measures. To the extent practicable and appropriate, outcomes and patient experience measures should be adjusted for risk or other appropriate patient population or provider characteristics.
- To the extent possible and recognizing differences in payment system maturity and statutory authorities, measures should be aligned across Medicare's and Medicaid's public reporting and payment systems. CMS seeks to evolve to a focused core-set of measures appropriate to the specific provider category that reflects the level of care and the most important areas of service and measures for that provider.
- The collection of information should minimize the burden on providers to the extent possible. As part of that effort, CMS will continuously seek to align its measures with the adoption of meaningful use standards for health information technology (HIT), so the collection of performance information is part of care delivery.
- To the extent practicable, measures used by CMS should be nationally endorsed by a multi-stakeholder organization. Measures should be aligned with best practices among other payers and the needs of the end users of the measures.
Scoring Methodology:
- Providers should be scored on their overall achievement relative to national or other appropriate benchmarks. In addition, scoring methodologies should consider improvement as an independent goal.
- Measures or measurement domains need not be given equal weight, but over time, scoring methodologies should be more weighted towards outcome, patient experience and functional status measures.
- Scoring methodologies should be reliable, as straightforward as possible, and stable over time and enable consumers, providers, and payers to make meaningful distinctions among providers' performance.
CMS welcomes comments on these principles.
B. Hospital Inpatient Quality Data Reporting Under Section 501(b) of Public Law 108-173
Section 501(b) of the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA), Public Law 108-173, added section 1886(b)(3)(B)(vii) to the Act. This section established the original authority for the Hospital IQR program and revised the mechanism used to update the standardized payment amount for inpatient hospital operating costs. Specifically, section 1886(b)(3)(B)(vii)(I) of the Act provided for a reduction of 0.4 percentage points to the annual percentage increase (sometimes referred to at that time as the market basket update) for FY 2005 through FY 2007 for a subsection (d) hospital if the hospital did not submit data on a set of 10 quality indicators established by the Secretary as of November 1, 2003. It also provided that any reduction applied only to the fiscal year involved, and would not be taken into account in computing the applicable percentage increase for a subsequent fiscal year. The statute thereby established an incentive for many subsection (d) hospitals to submit data on the quality measures established by the Secretary.
We implemented section 1886(b)(3)(B)(vii) of the Act in the FY 2005 IPPS final rule (69 FR 49078) and codified the applicable percentage change in § 412.64(d) of our regulations. We adopted additional requirements under the Hospital IQR program in the FY 2006 IPPS final rule (70 FR 47420).
C. Hospital Inpatient Quality Reporting Under Section 5001(a) of Public Law 109-171
1. Change in the Reduction to the Annual Percentage Increase
Section 5001(a) of the Deficit Reduction Act of 2005 (DRA), Public Law 109-171, further amended section 1886(b)(3)(B) of the Act to, among other things, revise the mechanism used to update the standardized payment amount for hospital inpatient operating costs by adding new section 1886(b)(3)(B)(viii) to the Act. Specifically, sections 1886(b)(3)(B)(viii)(I) and (II) of the Act as added by the DRA originally provided that the annual percentage increase for FY 2007 and each subsequent fiscal year shall be reduced by 2.0 percentage points for a subsection (d) hospital that does not submit quality data in a form and manner, and at a time, specified by the Secretary. Section 1886(b)(3)(B)(viii)(I) of the Act also provided that any reduction in a hospital's annual percentage increase will apply only with respect to the fiscal year involved, and will not be taken into account for computing the applicable percentage increase for a subsequent fiscal year.
In the FY 2007 IPPS final rule (71 FR 48045), we amended our regulations at § 412.64(d)(2) to reflect the 2.0 percentage point reduction required under the DRA.
2. Selection of Quality Measures
Section 1886(b)(3)(B)(viii)(V) of the Act, before it was amended by section 3001(a)(2)(B) of the Affordable Care Act, required that, effective for payments beginning with FY 2008, the Secretary add other measures that reflect consensus among affected parties, and to the extent feasible and practicable, have been set forth by one or more national consensus building entities. The National Quality Forum (NQF) is a voluntary consensus standard-setting organization with a diverse representation of consumer, purchaser, provider, academic, clinical, and other health care stakeholder organizations. The NQF was established to standardize health care quality measurement and reporting through its consensus development process. We have generally adopted NQF-endorsed measures for purposes of the Hospital IQR program. However, we believe that consensus among affected parties also can be reflected by other means, including consensus achieved during the measure development process, consensus shown through broad acceptance and use of measures, and consensus achieved through public comment.
Section 1886(b)(3)(B)(viii)(VI) of the Act authorizes the Secretary to replace any quality measures or indicators in appropriate cases, such as when all hospitals are effectively in compliance with a measure, or the measures or indicators have been subsequently shown to not represent the best clinical practice. We interpreted this provision to give us broad discretion to replace measures that are no longer appropriate for the Hospital IQR program.
We have adopted 45 measures under the Hospital IQR program for the FY 2011 payment determination. Of these measures, 27 are chart-abstracted process of care measures, which assess the quality of care furnished by hospitals in connection with four topics: Acute Myocardial Infarction (AMI); Heart Failure (HF); Pneumonia (PN); and Surgical Care Improvement Project (SCIP) (75 FR 50182). Fifteen of the measures are claims-based measures, which assess the quality of care furnished by hospitals on the following topics: 30-day mortality and 30-day readmission rates for Medicare patients diagnosed with either AMI, HF, or PN; Patient Safety Indicators/Inpatient Quality Indicators/Composite Measures; and Patient Safety Indicators/Nursing Sensitive Care. Three of the measures are structural measures that assess hospital participation in cardiac surgery, stroke care, and nursing sensitive care systemic databases. Finally, the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) patient experience of care survey is included as a measure for the FY 2011 payment determination.
The technical specifications for the Hospital IQR program measures, or links to Web sites hosting technical specifications, are contained in the CMS/The Joint Commission Specifications Manual for National Hospital Inpatient Quality Measures (Specifications Manual). This Specifications Manual is posted on the CMS QualityNet Web site at https://www.QualityNet.org/. We maintain the technical specifications by updating this Specifications Manual semiannually, or more frequently in unusual cases, and include detailed instructions and calculation algorithms for hospitals to use when collecting and submitting data on required measures. These semiannual updates are accompanied by notifications to users, providing sufficient time between the change and the effective date in order to allow users to incorporate changes and updates to the specifications into data collection systems.
3. Public Display of Quality Measures
Section 1886(b)(3)(B)(viii)(VII) of the Act, before it was amended by section 3001(a)(2)(C) of the Affordable Care Act, required that the Secretary establish procedures for making data submitted under the Hospital IQR program available to the public after ensuring that a hospital has the opportunity to review the data before it is made public. To meet this requirement, we have displayed most Hospital IQR program data on the Hospital Compare website, http://www.hospitalcompare.hhs.gov, after a 30-day preview period. This Web site is an interactive Web tool that assists beneficiaries by providing information on hospital quality of care to those who need to select a hospital. It further serves to encourage beneficiaries to work with their doctors and hospitals to discuss the quality of care hospitals provide to patients, thereby providing an additional incentive to hospitals to improve the quality of care that they furnish. The Hospital Compare website currently makes public data on clinical process of care measures, risk adjusted outcome measures, the HCAHPS patient experience of care survey, and structural measures. However, data that we believe are not suitable for inclusion on Hospital Compare because they are not salient or will not be fully understood by beneficiaries, as well as data for which there are unresolved display or design issues, may be made available on other CMS Web sites that are not intended to be used as an interactive Web tool, such as http://www.cms.hhs.gov/HospitalQualityInits/. In such circumstances, affected parties are notified via CMS listservs, CMS e-mail blasts, national provider calls, and QualityNet announcements regarding the release of preview reports followed by the posting of data on a Web site other than Hospital Compare.
D. 2007 Report to Congress: Plan To Implement a Medicare Hospital Value-Based Purchasing Program
Section 5001(b) of the DRA required the Secretary to develop a plan to implement a value-based purchasing program for payments made under the Medicare program for subsection (d) hospitals. In developing the plan, we were required to consider the on-going development, selection, and modification process for measures of quality and efficiency in hospital inpatient settings; the reporting, collection, and validation of quality data; the structure, size, and sources of funding of value-based payment adjustments; and the disclosure of information on hospital performance.
In 2007, we submitted to Congress a report that discusses options for a plan to implement a Medicare hospital VBP program that builds on the Hospital IQR program. We recommended replacing the Hospital IQR program with a new program that would include both a public reporting requirement and financial incentives for better performance. We also recommended that a hospital VBP program be implemented in a manner that would not increase Medicare spending.
To calculate a hospital's total performance score under the plan, we analyzed a potential performance scoring model that incorporated measures from different quality “domains,” including clinical process of care and patient experience of care. We examined ways to translate that score into an incentive payment by making a portion of the base diagnosis-related group (DRG) payment contingent on performance. We analyzed criteria for selecting performance measures and considered a potential phased approach to transition from Hospital IQR to value-based purchasing. In addition, we examined redesigning the current data transmission process and validation infrastructure, including making enhancements to the Hospital Compare Web site, as well as an approach to monitor the impact of value-based purchasing.
E. Provisions of the Affordable Care Act
Section 3001(a) of the Patient Protection and Affordable Care Act (Pub. L. 111-148), enacted on March 23, 2010, as amended by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152), enacted on March 30, 2010 (collectively known as the Affordable Care Act), added a new section 1886(o) to the Social Security Act (the Act), which requires the Secretary to establish a hospital value-based purchasing program under which value-based incentive payments are made in a fiscal year to hospitals meeting performance standards established for a performance period for such fiscal year. Both the performance standards and the performance period for a fiscal year are to be established by the Secretary. Section 1886(o)(1)(B) of the Act directs the Secretary to begin making value-based incentive payments under the Hospital VBP program to hospitals for discharges occurring on or after October 1, 2012. These incentive payments will be funded for FY 2013 through a 1.0 percent reduction to FY 2013 base operating DRG payments for each discharge, as required by section 1886(o)(7). Section 1886(o)(1)(C) provides that the Hospital VBP program applies to subsection (d) hospitals (as defined in section 1886(d)(1)(B)), but excludes from the definition of the term “hospital,” with respect to a fiscal year: (1) a hospital that is subject to the payment reduction under section 1886(b)(3)(B)(viii)(I) for such fiscal year; (2) a hospital for which, during the performance period for the fiscal year, the Secretary cited deficiencies that pose immediate jeopardy to the health and safety of patients; and (3) a hospital for which there is not a minimum number (as determined by the Secretary) of applicable measures for the performance period for the fiscal year involved, or for which there is not a minimum number (as determined by the Secretary) of cases for the applicable measures for the performance period for such fiscal year.
II. Provisions of the Proposed Regulations
A. Overview of the Proposed Hospital VBP Program
This proposed rule proposes to implement a Hospital Value-Based Purchasing program (“Hospital VBP program” or “the program”) under section 1886(o) of the Social Security Act (“Act”), under which value-based incentive payments will be made in a fiscal year (beginning FY 2013) to hospitals that meet performance standards established with respect to a performance period ending prior to the beginning of such fiscal year. This proposed rule was developed based on extensive research we conducted on hospital value-based purchasing, including research that formed the basis of a 2007 report we submitted to Congress, entitled “Report to Congress: Plan to Implement a Medicare Hospital Value-Based Purchasing Program” (November 21, 2007), a copy of which is available on the CMS Web site, and takes into account input from both stakeholders and other interested parties. As described more fully below, we are proposing to initially adopt for the FY 2013 Hospital VBP program 18 measures that we have already adopted for the Hospital IQR Program, categorized into two domains, as follows: 17 of the proposed measures will be clinical process of care measures, which we will group into a clinical process of care domain, and 1 measure will be the HCAHPS survey, which will fall under a patient experience of care domain. With respect to the clinical process of care and HCAHPS measures, we are proposing to use a three-quarter performance period from July 1, 2011 through March 31, 2012 for the FY 2013 payment determination and to determine whether hospitals meet the proposed performance standards for these measures by comparing their performance during the proposed performance period to their performance during a proposed three-quarter baseline period from July 1, 2009 through March 31, 2010. We are also proposing to initially adopt for the FY 2014 Hospital VBP program three outcome measures. With respect to the proposed outcome measures, we are proposing to use an 18-month performance period from July 1, 2011 to December 31, 2012. Furthermore, for the proposed outcome measures, we are proposing to establish performance standards and to determine whether hospitals meet those standards by comparing their performance during the proposed performance period to their performance during a proposed baseline period of July 1, 2008 to December 31, 2009.
In general, we are proposing to implement a methodology for assessing the total performance of each hospital based on performance standards, under which we will score each hospital based on achievement and improvement ranges for each applicable measure. Additionally, we are proposing to calculate a total performance score for each hospital by combining the greater of the hospital's achievement or improvement points for each measure to determine a score for each domain, multiplying each domain score by a proposed weight (clinical process of care: 70 percent, patient experience of care: 30 percent), and adding together the weighted domain scores. We are proposing to convert each hospital's total performance score into a value-based incentive payment utilizing a linear exchange function. All of these proposals are addressed in greater detail below.
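To make the proposed scoring mechanics easier to follow, the sketch below illustrates, in Python, how a total performance score and a corresponding incentive payment percentage might be computed under this proposal. It is an illustration only, not CMS's official calculation: the point scale assumed for each measure and the slope of the linear exchange function are placeholders, and the scoring methodology is described fully in section II.E of this proposed rule.

    # Illustrative sketch only; the per-measure point scale and the exchange-function
    # slope are assumptions, not values established in this proposed rule.
    CLINICAL_WEIGHT = 0.70  # proposed clinical process of care domain weight
    HCAHPS_WEIGHT = 0.30    # proposed patient experience of care domain weight

    def domain_score(measure_points, max_points_per_measure=10):
        """For each applicable measure, take the greater of the achievement and
        improvement points, then express the total as a share of the points possible."""
        earned = sum(max(achievement, improvement)
                     for achievement, improvement in measure_points)
        possible = max_points_per_measure * len(measure_points)
        return earned / possible

    def total_performance_score(clinical_points, hcahps_points):
        """Combine the two proposed domains using the proposed 70/30 weighting."""
        return (CLINICAL_WEIGHT * domain_score(clinical_points)
                + HCAHPS_WEIGHT * domain_score(hcahps_points))

    def incentive_payment_percentage(tps, slope):
        """Linear exchange function: the incentive payment percentage rises in direct
        proportion to the total performance score. The slope is a placeholder; it would
        be set so that aggregate incentive payments equal the funds available."""
        return slope * tps

For example, a hospital earning 7 of 10 points on average across the clinical process of care measures and 5 of 10 points on the HCAHPS measure would receive a total performance score of 0.70 × 0.7 + 0.30 × 0.5 = 0.64 under these assumed scales.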
B. Proposed Performance Period
Section 1886(o)(4) of the Act requires the Secretary to establish a performance period for a fiscal year that begins and ends prior to the beginning of such fiscal year. In considering various performance periods that could apply for purposes of the fiscal year 2013 payment adjustments, we recognized that hospitals submit data on the chart-abstracted measures adopted for the Hospital IQR Program on a quarterly basis, and for that reason, we would propose that the performance period commence at the beginning of a quarter. We also recognize that we must balance the length of the period for collecting measure data with the need to undertake the rulemaking process in order to establish the performance period and provide the public with an opportunity to meaningfully comment on that proposal. With these considerations in mind, we concluded that July 1, 2011 is the earliest date that the performance period could begin.
We then considered how long the performance period should be. Our preference would have been to propose to use a full year as the performance period for the clinical process of care and HCAHPS measures we are proposing to initially adopt for the FY 2013 Hospital VBP program, consistent with our analysis that using a full year performance period provides high levels of data accuracy and reliability for scoring hospitals on these measures. We concluded, however, that this would not give us sufficient time to calculate the total performance scores, calculate the value-based incentive payments, notify hospitals regarding their payment adjustments, and implement the payment adjustments. We subsequently analyzed how a shorter performance period might affect a hospital's performance score. Using the most recent clinical process of care and HCAHPS measure data available, we examined the feasibility of proposing to adopt a one quarter, two quarter, or three quarter performance period by comparing each of these periods to a four quarter baseline period. We did this to determine how closely a hospital's total performance score calculated using one, two, or three quarters of data would approximate what the hospital's total performance score would be if we proposed to use four quarters of data. Under our analysis, the total performance scores approximated using three quarters of data closely correlated with total performance scores approximated using four quarters of data. Specifically, our analysis showed that the three quarter performance period would have a correlation coefficient of 0.96815 (p-value .0001), while a two quarter performance period would have a correlation coefficient of 0.90358 (p-value .0001).
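The comparison described above can be summarized as a simple correlation check between hospital scores computed from a shortened period and scores computed from four quarters of data. The sketch below shows, in Python, one way such an analysis could be carried out; the hospital-level score arrays are hypothetical inputs, the scipy library is assumed to be available, and the generic Pearson correlation shown here is not necessarily CMS's exact analytic method.

    # Hypothetical illustration: correlate scores from a shortened performance
    # period with scores based on four quarters of data, for the same hospitals.
    from scipy.stats import pearsonr

    def period_correlation(scores_shorter_period, scores_four_quarters):
        """Return the Pearson correlation coefficient and p-value between hospital
        scores approximated from a shorter period (one, two, or three quarters)
        and scores approximated from a full four-quarter period."""
        return pearsonr(scores_shorter_period, scores_four_quarters)

A correlation coefficient near 1, such as the 0.96815 reported below for the three quarter period, indicates that the shortened period ranks hospitals in nearly the same way that a full year of data would.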
We also recognize that under the Hospital IQR program, hospitals have 135 days to submit chart-abstracted data following the close of each quarter. Because we are proposing to implement a Hospital VBP program that builds on the Hospital IQR program, we would like, to the extent possible, to maintain our existing Hospital IQR program requirements. We believe that the 135-day time lag supports the adoption of a three quarter performance period based on the analysis discussed above, and that a one or two quarter performance period would provide lower data accuracy for scoring hospitals and adjusting their payments.
Therefore, we propose to use the fourth quarter of FY 2011 (July 1, 2011-September 30, 2011) and the first and second quarters of FY 2012 (October 1, 2011-March 31, 2012) as the performance period for the clinical process of care and HCAHPS measures we are proposing to initially adopt for the FY 2013 Hospital VBP program. Hospitals will be scored based on how well they perform on the proposed clinical process of care and HCAHPS measures during this proposed performance period. We note that we anticipate proposing to use a full year as the performance period for the clinical process of care and HCAHPS measures in the future. For the three mortality outcome measures currently specified for the Hospital IQR program for the FY 2011 payment determination (MORT-30-AMI, MORT-30-HF, MORT-30-PN) that we propose below to adopt for the FY 2014 Hospital VBP program payment determination, we are proposing to establish a performance period of July 1, 2011 to December 31, 2012. An eighteen-month performance period for mortality measures is intended to ensure the measures' reliability by capturing more cases than could be observed over one year of measurement. We plan to add additional measures to the Hospital VBP program, including but not limited to AHRQ and HAC measures that have been specified for the Hospital IQR program, and we propose that, for the reasons discussed below, the performance period for those measures would begin one year after they have been displayed on the Hospital Compare Web site.
C. Proposed Measures
Section 1886(o)(2)(A) of the Act requires the Secretary to select for the Hospital VBP program measures, other than readmission measures, from the measures specified for the Hospital IQR program. Section 1886(o)(2)(B)(i) requires the Secretary to ensure that the selected measures include measures on six specified conditions or topics: Acute Myocardial Infarction (AMI); Heart Failure (HF); Pneumonia (PN); Surgeries, as measured by the Surgical Care Improvement Project (SCIP); Healthcare-Associated Infections (HAI); and the Hospital Consumer Assessment of Healthcare Providers and Systems survey (HCAHPS). Section 1886(o)(2)(C)(i) provides that the Secretary may not select a measure with respect to a performance period for a fiscal year unless the measure has been specified under section 1886(b)(3)(B)(viii) of the Act and included on the Hospital Compare website for at least one year prior to the beginning of the performance period. Section 1886(o)(2)(C)(ii) provides that a measure selected under section 1886(o)(2)(A) shall not apply to a hospital if the hospital does not furnish services appropriate to the measure.
Our measure development and selection activities for the Hospital IQR Program take into account national priorities, such as those established by the National Priorities Partnership,[1] and the Department of Health and Human Services,[2] as well as other widely accepted criteria established in medical literature.[3] Because we must select measures for the Hospital VBP program from the pool of measures that have been adopted for the Hospital IQR program, the measures to be selected for inclusion in Hospital VBP would also reflect these priorities.
In the FY 2011 IPPS/RY 2011 LTCH PPS final rule, we stated that in future expansions and updates to the Hospital IQR program measure set, we would be taking into consideration several important goals. These goals include: (a) Expanding the types of measures beyond process of care measures to include an increased number of outcome measures, efficiency measures, and patients' experience of care measures; (b) expanding the scope of hospital services to which the measures apply; (c) considering the burden on hospitals in collecting chart-abstracted data; (d) harmonizing the measures used in the Hospital IQR program with other CMS quality programs to align incentives and promote coordinated efforts to improve quality; (e) seeking to use measures based on alternative sources of data that do not require chart abstraction or that utilize data already being reported by many hospitals, such as data that hospitals report to clinical data registries, or all payer claims databases; and (f) weighing the relevance and utility of the measures compared to the burden on hospitals in submitting data under the Hospital IQR program. In addition, we believe that we must act with all speed and deliberateness to expand the pool of measures used in the Hospital VBP program. This goal is supported by at least two Federal reports documenting that tens of thousands of patients do not receive safe care in the nation's hospitals.[4] For this reason, we believe that we need to adopt measures for the Hospital VBP program relevant to improving care, particularly as these measures are directed toward improving patient safety, as quickly as possible. We believe that speed of implementation is a critical factor in the success and effectiveness of this program.
The Hospital VBP program that we are proposing to implement has been developed with the focused intention to motivate all subsection (d) hospitals to which the program applies to take immediate action to improve the quality of care they furnish to their patients. Because we view as urgent the necessity to improve the quality of care furnished by these hospitals, and because we believe that hospitalized patients in the United States currently face patient safety risks on a daily basis, we are proposing in this proposed rule to adopt an initial measure set for the Hospital VBP program. However, we are also proposing to add additional measures to the Hospital VBP program in the future so that their performance period would begin immediately after they have been displayed on Hospital Compare for at least one year, without the need for further notice and comment rulemaking. We propose this because of the urgency to improve the quality of hospital care, and in order to minimize any delay in taking substantive action to improve patient safety. The details of this proposal are discussed below.
We have stated that for the Hospital IQR Program, we give priority to quality measures that assess performance on: (a) Conditions that result in the greatest mortality and morbidity in the Medicare population; (b) conditions that are high volume and high cost for the Medicare program; and (c) conditions for which wide cost and treatment variations have been reported, despite established clinical guidelines. In addition, we stated that we seek to select measures that address the six quality aims of effective, safe, timely, efficient, patient-centered, and equitable healthcare. Current and long term priority topics include: Prevention and population health; safety; chronic conditions; high cost and high volume conditions; elimination of health disparities; healthcare-associated infections and other adverse healthcare outcomes; improved care coordination; improved efficiency; improved patient and family experience of care; effective management of acute and chronic episodes of care; reduced unwarranted geographic variation in quality and efficiency; and adoption and use of interoperable health information technology.
We have also stated that these criteria, priorities, and goals are consistent with section 1886(b)(3)(B)(viii)(X) of the Act, as added by section 3001(a)(2)(D) of the Affordable Care Act, which requires the Secretary, to the extent practicable and with input from consensus organizations and other stakeholders, to take steps to ensure that the Hospital IQR program measures are coordinated and aligned with quality measures applicable to physicians and other providers of services and suppliers under Medicare.
Currently, there are 45 measures specified under the Hospital IQR program for the FY 2011 payment determination. We view all of these measures (with the exception of the measures of readmission) as “candidate measures” for the Hospital VBP program. We recognize that we cannot add any measure to the program unless it meets the requirements of section 1886(o). In determining what measures to initially propose for the FY 2013 Hospital VBP program, we considered several factors. First, a measure must be specified under the Hospital IQR program and included on the Hospital Compare Web site for at least one year prior to the beginning of the performance period. The SCIP-Inf-9 and SCIP-Inf-10 measures do not meet this requirement, nor do any of the nine Agency for Healthcare Research and Quality (AHRQ) measures (previously ten; the Nursing Sensitive Care—Failure to Rescue measure was harmonized with the Death Among Surgical Patients with Serious, Treatable Complications measure). Therefore, these measures were not considered candidate measures. It is our intention to add measures to the Hospital VBP program as soon as this requirement is met in order to help improve patient care as quickly as possible.
As noted above, we recognize that we cannot include in the measure set any readmission measures in accordance with section 1886(o)(2)(A) of the Act. We also are not proposing at this time to adopt the current Hospital IQR structural measures because we believe that these measures require further development if they are to be used for the Hospital VBP program. We seek public comment at this time on the possible utility of adopting structural measures for the Hospital VBP program measure set and how these measures might contribute to the improvement of patient safety and quality of care. Table 1 contains a list of the remaining initial eligible measures.
Table 1—Initial Eligible Measures for the FY 2013 Hospital VBP Program
Measure ID   Measure description

Process Measures
AMI-1   Aspirin at Arrival.
AMI-2   Aspirin Prescribed at Discharge.
AMI-3   ACE/ARB Inhibitor.
AMI-4   Adult Smoking Cessation Advice/Counseling.
AMI-5   Beta Blocker Prescribed at Discharge.
AMI-7a   Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival.
AMI-8a   Primary PCI Received Within 90 Minutes of Hospital Arrival.
HF-1   Discharge Instructions.
HF-2   Evaluation of LVS Function.
HF-3   ACEI or ARB for LVSD.
HF-4   Adult Smoking Cessation Advice/Counseling.
PN-2   Pneumococcal Vaccination.
PN-3b   Blood Cultures Performed in the Emergency Department Prior to Initial Antibiotic Received in Hospital.
PN-4   Adult Smoking Cessation Advice/Counseling.
PN-5c   Timing of Receipt of Initial Antibiotic Following Hospital Arrival.
PN-6   Initial Antibiotic Selection for CAP in Immunocompetent Patient.
PN-7   Influenza Vaccination.
SCIP-Inf-1   Prophylactic Antibiotic Received Within One Hour Prior to Surgical Incision.
SCIP-Inf-2   Prophylactic Antibiotic Selection for Surgical Patients.
SCIP-Inf-3   Prophylactic Antibiotics Discontinued Within 24 Hours After Surgery End Time.
SCIP-Inf-4   Cardiac Surgery Patients with Controlled 6AM Postoperative Serum Glucose.
SCIP-Inf-6   Surgery Patients with Appropriate Hair Removal.
SCIP-Card-2   Surgery Patients on a Beta Blocker Prior to Arrival That Received a Beta Blocker During the Perioperative Period.
SCIP-VTE-1   Surgery Patients with Recommended Venous Thromboembolism Prophylaxis Ordered.
SCIP-VTE-2   Surgery Patients Who Received Appropriate Venous Thromboembolism Prophylaxis Within 24 Hours Prior to Surgery to 24 Hours After Surgery.

Outcome Measures
MORT-30-AMI   Acute Myocardial Infarction (AMI) 30-Day Mortality Rate.
MORT-30-HF   Heart Failure (HF) 30-Day Mortality Rate.
MORT-30-PN   Pneumonia (PN) 30-Day Mortality Rate.

Survey Measures
HCAHPS   Hospital Consumer Assessment of Healthcare Providers and Systems Survey.

To determine which measures we would propose to initially adopt for the FY 2013 Hospital VBP program, we then examined whether any of the eligible Hospital IQR measures (table above) should be excluded from the Hospital VBP program measure set because hospital performance on them is “topped out,” meaning that all but a few hospitals have achieved a similarly high level of performance on them. We believe that measuring hospital performance on topped-out measures will have no meaningful effect on a hospital's total performance score. Scoring a topped-out measure for purposes of the Hospital VBP program would also present a number of challenges. First, as we discuss below, we are proposing that the benchmark performance standard for all measures will be the performance at the mean of the top decile (defined in section II.E of this proposed rule). Applied to a topped-out measure, the benchmark would be statistically indistinguishable from the highest attainable score for the measure and, in our view, could lead to unintended consequences as hospitals strive to meet the benchmark. Examples of unintended consequences could include, but are not limited to, inappropriate delivery of a service to some patients (such as delivery of antibiotics to patients without a confirmed diagnosis of pneumonia), unduly conservative decisions on whether to exclude some patients from the measure denominator, and a focus on meeting the benchmark at the expense of actual improvements in quality or patient outcomes.
Second, we have found that for topped-out measures, it is significantly more difficult to differentiate among hospitals performing above the median. Third, because a measure cannot be applied to a hospital unless the hospital furnishes services appropriate to the measure, data reporting under the Hospital VBP program will not be the same for all hospitals. To the extent that a hospital can report a higher proportion of topped-out measures, for which its scores would likely be high, we believe that such a hospital would be unfairly advantaged in the determination of its total performance score.
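For clarity on the benchmark concept referenced above, the short sketch below shows one straightforward way that "the performance at the mean of the top decile" could be computed from a set of hospital measure rates. It is a simplified illustration under assumed inputs, not the precise definition, which is set out in section II.E of this proposed rule.

    # Illustrative only: mean of the top decile of hospital performance on a measure.
    def benchmark(hospital_rates):
        """Sort hospital rates from highest to lowest and average the top 10 percent."""
        ordered = sorted(hospital_rates, reverse=True)
        top_decile_count = max(1, len(ordered) // 10)
        top_decile = ordered[:top_decile_count]
        return sum(top_decile) / len(top_decile)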
To determine whether an eligible Hospital IQR measure is topped out, we initially focused on the top distribution of hospital performance on each measure and noted whether the 75th and 90th percentiles were statistically indistinguishable. Based on our analysis, we identified 7 topped-out measures: AMI-1 Aspirin at Arrival; AMI-5 Beta Blocker at Discharge; AMI-3 ACEI or ARB at Discharge; AMI-4 Smoking Cessation; HF-4 Smoking Cessation; PN-4 Smoking Cessation; and SCIP-Inf-6 Surgery Patients with Appropriate Hair Removal. We then observed that two of these measures identified as topped out (AMI-3 ACEI or ARB at Discharge and HF-4 Smoking Cessation) had significantly lower mean scores than the others, which led us to question whether our analysis was too focused on the top ends of distributions and whether additional criteria that could account for the entire distribution might be more appropriate. To address this, we analyzed the truncated coefficient of variation for each of the measures. The coefficient of variation (CV) is a common statistic that expresses the standard deviation as a percentage of the sample mean in a way that is independent of the units of observation. Applied to this analysis, a large CV would indicate a broad distribution of individual hospital scores, with large and presumably meaningful differences between hospitals in relative performance. A small CV would indicate that the distribution of individual hospital scores is clustered tightly around the mean value, suggesting that it is not useful to draw distinctions between individual hospital performance scores. We used a modified version of the CV, namely a truncated CV, for each measure, in which the five percent of hospitals with the lowest scores and the five percent of hospitals with the highest scores were first truncated (set aside) before calculating the CV. This was done to avoid undue effects of the highest and lowest outlier hospitals, which, if included, would tend to greatly widen the dispersion of the distribution and make the measure appear to be more reliable or discerning. For example, a measure for which most hospital scores are tightly clustered around the mean value (a small CV) might actually reflect a more robust dispersion if there were also a number of hospitals with extreme outlier values, which would greatly increase the perceived variance in the measure. Accordingly, the truncated CV was added as an additional criterion requiring that a topped-out measure also exhibit a truncated CV < 0.10. Using both the truncated CV and data showing whether hospital performance at the 75th and 90th percentiles was statistically indistinguishable, we reexamined the available measures and determined that the same seven measures continue to meet our proposed definition for being topped-out.
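As a concrete illustration of the screening approach just described, the sketch below shows, in Python, how the truncated CV criterion could be computed and combined with the percentile check. The five percent trim on each tail and the 0.10 threshold come from the discussion above; the percentile comparison is passed in as a flag because the statistical test used to establish indistinguishability is not reproduced in this simplified sketch.

    # Illustrative sketch of the topped-out screening criteria described above.
    import statistics

    def truncated_cv(hospital_scores, trim_fraction=0.05):
        """Set aside the lowest and highest five percent of hospital scores, then
        return the standard deviation of the remainder as a fraction of its mean."""
        ordered = sorted(hospital_scores)
        trim = int(len(ordered) * trim_fraction)
        trimmed = ordered[trim:len(ordered) - trim] if trim else ordered
        return statistics.stdev(trimmed) / statistics.mean(trimmed)

    def is_topped_out(hospital_scores, percentiles_indistinguishable):
        """A measure is treated as topped out only if the 75th and 90th percentiles
        are statistically indistinguishable and the truncated CV is below 0.10."""
        return percentiles_indistinguishable and truncated_cv(hospital_scores) < 0.10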
Our priorities for the Hospital VBP program are to transform how Medicare pays for care and to encourage hospitals to continually improve the quality of care they furnish. Our analysis of the impact of including the topped-out measures discussed above shows that their use would mask true performance differences among hospitals and, as a result, would fail to advance these priorities. Therefore, we are proposing to not include these 7 topped-out measures (AMI-1 Aspirin at Arrival; AMI-5 Beta Blocker at Discharge; AMI-3 ACEI or ARB at Discharge; AMI-4 Smoking Cessation; HF-4 Smoking Cessation; PN-4 Smoking Cessation; and SCIP-Inf-6 Surgery Patients with Appropriate Hair Removal) in the list of measures we are proposing to initially adopt for the FY 2013 Hospital VBP program.
We examined whether the following outcome measures adopted for the Hospital IQR program are appropriate for inclusion in the FY 2013 Hospital VBP program. These measures are as follows: (1) AHRQ patient safety indicators (PSIs), inpatient quality indicators (IQIs) and composite measures; (2) AHRQ PSI and nursing sensitive care measure; and (3) AMI, HF, and PN mortality measures (Medicare patients). We believe that these outcome measures provide important information relating to treatment outcomes and patient safety. We also believe that adding these outcome measures would significantly improve the correlation between patient outcomes and Hospital VBP performance. However, because under section 1886(o)(2)(C)(i) of the Act, we may only select measures if they have been included on the Hospital Compare Internet website for at least one year prior to the beginning of the performance period, we believe that the AHRQ Patient Safety Indicators (PSI) and Inpatient Quality Indicators (IQI) and composite measures, and the AHRQ Nursing Sensitive Care measure are not yet eligible for inclusion in the FY 2013 Hospital VBP program. These measures are currently specified for the Hospital IQR program but have not yet been included on Hospital Compare. Because of the urgency to act quickly to improve patient safety, we plan to adopt them for use in the Hospital VBP Program as rapidly as possible and will continue working to develop additional robust outcome measures for the Hospital VBP program. We invite comments on the addition of the AHRQ PSI, IQI, and Nursing Sensitive Care measures for Hospital VBP program inclusion in FY 2014 and future years.
We considered whether the current publicly-reported 30-day mortality claims-based measures (MORT-30-AMI, MORT-30-HF, MORT-30-PN) should be included in the FY 2013 Hospital VBP program. The mortality measures assess hospital-specific, risk-standardized, all-cause 30-day mortality rates for patients hospitalized with a principal diagnosis of heart attack, heart failure, and pneumonia. All-cause mortality is defined for purposes of these measures as death from any cause within 30 days after the index admission date, regardless of whether the patient died while still in the hospital or after discharge. On July 1, 2009, the specifications for these measures were changed from a one-year reporting period to a three-year rolling average. This was done to address concerns regarding the reliability of the measures, and the three-year rolling average allows us to include a larger number of cases in the measure calculations, although our analysis shows that eighteen months of these data are also reliable. We do not believe that the three-quarter performance period we are proposing to use for the initial clinical process of care and HCAHPS measures for the FY 2013 Hospital VBP program would be appropriate to use for these mortality outcome measures because we do not believe that the data collected for these mortality measures during those three quarters would provide us with sufficiently accurate information about a hospital's outcomes on which to score hospitals on these measures and base payment. The detailed methodology for the 30-day risk standardized mortality measures is available on http://www.qualitynet.org.
However, we propose to adopt these currently reported 30-day mortality claims-based measures (MORT-30-AMI, MORT-30-HF, and MORT-30-PN) as measures for the FY 2014 Hospital VBP program and, as proposed above, to establish a performance period with respect to these measures of July 1, 2011 to December 31, 2012.
The eligible clinical process of care measures that have not been excluded for reasons previously discussed cover acute myocardial infarction, heart failure, pneumonia, and surgeries (as measured by the Surgical Care Improvement Project (SCIP)). Therefore, we believe that they meet the requirements in section 1886(o)(2)(B)(i)(I)(aa)-(dd) of the Act. Section 1886(o)(2)(B)(i)(I)(ee) of the Act requires the Secretary to also select for purposes of the FY 2013 Hospital VBP program measures that cover healthcare-associated infections (HAI) “as measured by the prevention metrics and targets established in the HHS Action Plan to Prevent Healthcare-Associated Infections (or any successor plan) of the Department of Health and Human Services.” The SCIP measures that we discuss above were developed to support practices that have demonstrated an ability to significantly reduce surgical complications such as HAIs. Compliance with these SCIP infection measures is also included as a targeted metric in the HHS Action Plan to Prevent Healthcare-Associated Infections issued in 2009, available on the HHS website. As a result, we believe that the SCIP-Inf-1, SCIP-Inf-2, SCIP-Inf-3, and SCIP-Inf-4 measures we have adopted for the Hospital IQR program meet the requirement in section 1886(o)(2)(B)(i)(I)(ee), and we propose to categorize them under a HAI condition topic instead of under the SCIP condition topic.
Under section 1886(o)(2)(B)(i)(II), the Secretary must select measures for the FY 2013 Hospital VBP program related to the Hospital Consumer Assessment of Healthcare Providers and Systems survey (HCAHPS). CMS partnered with the Agency for Healthcare Research and Quality (AHRQ) to develop HCAHPS. The HCAHPS survey is the first national, standardized, publicly reported survey of patients' experiences of hospital care, and we propose to adopt it for the FY 2013 Hospital VBP program. HCAHPS, also known as the CAHPS® Hospital Survey, is a survey instrument and data collection methodology for measuring patients' perceptions of their hospital experience.
The HCAHPS survey asks discharged patients 27 questions about their recent hospital stay that are used to measure the experience of patients across 10 dimensions in the Hospital IQR program. The survey contains 18 core questions about critical aspects of patients' hospital experiences (communication with nurses and doctors, the responsiveness of hospital staff, the cleanliness and quietness of the hospital environment, pain management, communication about medicines, discharge information, overall rating of the hospital, and whether they would recommend the hospital). The survey also includes four items to direct patients to relevant questions if a patient did not have a particular experience covered by the survey, such as taking new medications or needing medicine for pain. Three items in the survey are used to adjust for the mix of patients across hospitals, and two items related to race and ethnicity support congressionally-mandated reports on disparities in health care.
The HCAHPS survey is administered to a random sample of adult patients across medical conditions between 48 hours and six weeks after discharge; the survey is not restricted to Medicare beneficiaries. Hospitals must survey patients throughout each month of the year. The survey is available in official English, Spanish, Chinese, Russian and Vietnamese versions. The survey and its protocols for sampling, data collection and coding, and file submission can be found in the HCAHPS Quality Assurance Guidelines, Version 5.0, which is available on the official HCAHPS website, http://www.hcahpsonline.org.
AHRQ carried out a rigorous, scientific process to develop and test the HCAHPS instrument. This process entailed multiple steps, including: A public call for measures; literature review; cognitive interviews; consumer focus groups; stakeholder input; a three-state pilot test; small-scale field tests; and soliciting public comments via several Federal Register notices. In May 2005, the HCAHPS survey was endorsed by the National Quality Forum (NQF). CMS adopted the entire HCAHPS survey as a measure in the Hospital IQR program in October 2006, and the first public reporting of HCAHPS results occurred in March 2008. The survey, its methodology, and the results it produces are available on the HCAHPS website at http://www.hcahpsonline.org/home.aspx. With respect to our display of the HCAHPS measure on Hospital Compare for purposes of the Hospital IQR program, we publicly report the measure as 10 separate items. The “cleanliness of hospital environment,” “quietness of hospital environment,” “overall rating of the hospital,” and “recommend the hospital” survey items are displayed as stand-alone items. The remaining 6 items (communication with nurses, communication with doctors, responsiveness of hospital staff, pain management, communication about medicines, discharge information) are composites of the remaining survey items.
Finally, we propose not to include the PN-5c measure in the Hospital VBP program. We do not believe that this measure is appropriate for inclusion because it could lead to inappropriate antibiotic use. We intend to propose to retire this measure, as well as several other measures that we are not proposing to adopt for the Hospital VBP program, from the Hospital IQR program in the near future.
Accordingly, we propose to initially select the following 17 clinical process of care measures, and the HCAHPS measure, for inclusion in the FY 2013 Hospital VBP program. The proposed list of initial measures is provided in Table 2.
Table 2—Proposed Initial Measures for FY 2013 Hospital VBP Program
Clinical Process of Care Measures

Acute myocardial infarction:
- AMI-2: Aspirin Prescribed at Discharge.
- AMI-7a: Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival.
- AMI-8a: Primary PCI Received Within 90 Minutes of Hospital Arrival.

Heart Failure:
- HF-1: Discharge Instructions.
- HF-2: Evaluation of LVS Function.
- HF-3: ACEI or ARB for LVSD.

Pneumonia:
- PN-2: Pneumococcal Vaccination.
- PN-3b: Blood Cultures Performed in the Emergency Department Prior to Initial Antibiotic Received in Hospital.
- PN-6: Initial Antibiotic Selection for CAP in Immunocompetent Patient.
- PN-7: Influenza Vaccination.

Healthcare-associated infections:
- SCIP-Inf-1: Prophylactic Antibiotic Received Within One Hour Prior to Surgical Incision.
- SCIP-Inf-2: Prophylactic Antibiotic Selection for Surgical Patients.
- SCIP-Inf-3: Prophylactic Antibiotics Discontinued Within 24 Hours After Surgery End Time.
- SCIP-Inf-4: Cardiac Surgery Patients with Controlled 6AM Postoperative Serum Glucose.

Surgeries:
- SCIP-Card-2: Surgery Patients on a Beta Blocker Prior to Arrival That Received a Beta Blocker During the Perioperative Period.
- SCIP-VTE-1: Surgery Patients with Recommended Venous Thromboembolism Prophylaxis Ordered.
- SCIP-VTE-2: Surgery Patients Who Received Appropriate Venous Thromboembolism Prophylaxis Within 24 Hours Prior to Surgery to 24 Hours After Surgery.

Survey Measures

- HCAHPS: Hospital Consumer Assessment of Healthcare Providers and Systems Survey.[5]

5 Proposed dimensions of the HCAHPS survey for use in the FY 2013 Hospital VBP program include: Communication with Nurses, Communication with Doctors, Responsiveness of Hospital Staff, Pain Management, Communication about Medicines, Cleanliness and Quietness of Hospital Environment, Discharge Information, and Overall Rating of Hospital.
We solicit public comments on these proposed measures and also on our intention to add additional measures to the Hospital VBP program as rapidly as possible going forward. To that end, we are proposing to implement a subregulatory process to expedite the timeline for adding measures to the Hospital VBP program beginning with the FY 2013 program. Under this process, we could add any measure to the Hospital VBP program if that measure has been adopted under the Hospital IQR program and has been included on the Hospital Compare Web site for at least one year. We are proposing that the performance period for all of these measures would start exactly one year after the date these measures are publicly posted on Hospital Compare, consistent with section 1886(o)(2)(C)(i). Under this proposed subregulatory process, we would solicit comments from the public on the appropriateness of adopting one or more Hospital IQR measures for the Hospital VBP program. We would also assess the Hospital IQR measure rates using the criteria we used to select the proposed measures for the initial FY 2013 Hospital VBP measure set and notify the public regarding our findings. We would propose performance period end dates for any measure we selected for the Hospital VBP program in rulemaking. We are also proposing to implement a subregulatory process to retire Hospital VBP measures. Under this process, we would post our intention to retire measures on the CMS Web site at least 60 days prior to the date on which we would retire the measure. We would also, as we do with respect to Hospital IQR measures that we believe pose immediate patient safety concerns if reporting on them is continued, notify hospitals and the public of the retirement of the measure and the reasons for its retirement through the usual hospital and QIO communication channels used for the Hospital IQR program, which include e-mail blasts to hospitals and the dissemination of Standard Data Processing System (SDPS) memoranda to QIOs, as well as posting the information on the QualityNet Web site. We would then confirm the retirement of the measure from the Hospital VBP program measure set in a rulemaking vehicle. We make this proposal because it will allow us to ensure that the Hospital VBP program measure set focuses on the most current quality improvement and patient safety priorities. We are seeking public comment on our proposals and on other methods that would allow for the addition of measures to the Hospital VBP program as rapidly as possible in order to improve quality and safety for patients.
For value-based incentive payments made with respect to discharges occurring during FY 2014 or a subsequent fiscal year, CMS is required by statute to ensure that the measures selected for the Hospital VBP program include efficiency measures, including measures of “Medicare spending per beneficiary.” CMS solicits public comment as to what services should be included in, and what should be excluded from, a “Medicare spending per beneficiary” calculation. For example, the calculation could include outlier payments and/or Part B payments for services furnished during an inpatient hospital stay, or could include Part A and Part B payments for services received by a beneficiary during some window of time prior to the admission and/or after the discharge. We also solicit public comment on what, if any, type(s) of hospital segmentation or adjustment should be considered.
In addition, we are considering different approaches for measuring internal hospital efficiency. Internal hospital efficiency measures could assess hospital spending per admission, as determined using cost reports or other sources. CMS seeks comment on this and other approaches for measuring internal hospital efficiency.
D. Proposed Performance Standards
Section 1886(o)(3)(A) requires the Secretary to establish performance standards with respect to the measures selected under the Hospital VBP program for a performance period for a fiscal year. The performance standards must include levels of achievement and improvement (section 1886(o)(3)(B)), and must be established and announced not later than 60 days prior to the beginning of the performance period for the fiscal year involved (section 1886(o)(3)(C)). Achievement and improvement levels are discussed more fully in section II. E. of this proposed rule. In addition, as part of the process for establishing the performance standards, the Secretary must take into account appropriate factors, such as: (1) Practical experience with the measures, including whether a significant proportion of hospitals failed to meet the performance standard during previous performance periods; (2) historical performance standards; (3) improvement rates; and (4) the opportunity for continued improvement (section 1886(o)(3)(D)).
To determine what the proposed performance standard for each proposed clinical process of care measure and the proposed HCAHPS measure should be for purposes of the FY 2013 Hospital VBP program, we analyzed the most reliable and current hospital data that we have on each of these measures by virtue of the Hospital IQR program. Because we are proposing to adopt a performance period that is less than a full year for FY 2013, we were also sensitive to the fact that hospital performance on the proposed measures may be affected by seasonal variations in patient mix, case severity, and other factors.
To address this potential variation and ensure that hospital scores reflect actual performance on the measures, we believe that the performance standard for each clinical process of care measure and HCAHPS should be based on how well hospitals performed on the measure during the same three quarters in a baseline period. In determining what three-quarter baseline period would be the most appropriate to propose to use for the FY 2013 Hospital VBP program, we wanted to ensure that the baseline would be as close in time to the proposed performance period as possible. We believe that selecting a three-quarter baseline period from July 1, 2009 to March 31, 2010 will enable us to achieve this goal. Although the proposed baseline period has ended, we are still in the process of validating these data and expect the validation process to be complete by the end of January 2011.
We also believe that an essential goal of the Hospital VBP program is to provide incentives to all hospitals to improve the quality of care that they furnish to their patients. In determining what level of hospital performance would be appropriate to select as the performance standards for each measure, we focused on selecting levels that would challenge hospitals to continuously improve or maintain high levels of performance. As required by section 1886(o)(3)(D), we specifically considered hospitals' practical experience with the measures, particularly through the Hospital IQR program, examining how different achievement and improvement thresholds would have historically impacted hospitals, how hospital performance may have changed over time, and how hospitals could continue to improve. For these reasons, we propose to set the achievement performance standard (achievement threshold) for each proposed measure at the median of hospital performance (50th percentile) during the baseline period of July 1, 2009 through March 31, 2010. As proposed in section II. E. of this proposed rule, hospitals would receive achievement points only if they exceed the achievement performance standard and could increase their achievement score based on higher levels of performance. We believe these achievement performance standards represent achievable standards of excellence. We also propose to set the improvement performance standard (improvement threshold) for each proposed measure at each specific hospital's performance on the measure during the proposed baseline period of July 1, 2009 through March 31, 2010. We believe that these improvement performance standards ensure that hospitals will be adequately incentivized to improve.
Because our process for validating the proposed baseline period data is not yet complete, we are unable at this time to provide the precise achievement threshold values for these performance standards. These values will be specified in the final rule. We specify example achievement performance standards, using data from July 1, 2008 through March 31, 2009, in Table 3 below.
Table 3—Example Achievement Performance Standards for FY 2013 Hospital VBP Proposed Measures
Process Measures
- AMI-2 (Aspirin Prescribed at Discharge): 0.987
- AMI-7a (Fibrinolytic Therapy Received Within 30 Minutes of Hospital Arrival): 0.673
- AMI-8a (Primary PCI Received Within 90 Minutes of Hospital Arrival): 0.856
- HF-1 (Discharge Instructions): 0.872
- HF-2 (Evaluation of LVS Function): 0.983
- HF-3 (ACEI or ARB for LVSD): 0.944
- PN-2 (Pneumococcal Vaccination): 0.929
- PN-3b (Blood Cultures Performed in the Emergency Department Prior to Initial Antibiotic Received in Hospital): 0.951
- PN-6 (Initial Antibiotic Selection for CAP in Immunocompetent Patient): 0.909
- PN-7 (Influenza Vaccination): 0.909
- SCIP-Inf-1 (Prophylactic Antibiotic Received Within One Hour Prior to Surgical Incision): 0.955
- SCIP-Inf-2 (Prophylactic Antibiotic Selection for Surgical Patients): 0.978
- SCIP-Inf-3 (Prophylactic Antibiotics Discontinued Within 24 Hours After Surgery End Time): 0.927
- SCIP-Inf-4 (Cardiac Surgery Patients with Controlled 6AM Postoperative Serum Glucose): 0.912
- SCIP-VTE-1 (Surgery Patients with Recommended Venous Thromboembolism Prophylaxis Ordered): 0.938
- SCIP-VTE-2 (Surgery Patients Who Received Appropriate Venous Thromboembolism Prophylaxis Within 24 Hours Prior to Surgery to 24 Hours After Surgery): 0.913

Survey Measures
- HCAHPS (Hospital Consumer Assessment of Healthcare Providers and Systems Survey; dimensions: Communication with Nurses, Communication with Doctors, Responsiveness of Hospital Staff, Pain Management, Communication About Medicines, Cleanliness and Quietness of Hospital Environment, Discharge Information, and Overall Rating of Hospital): 0.500

We also propose to use an 18-month performance period of July 1, 2011 to December 31, 2012, with a baseline period of July 1, 2008 to December 31, 2009, for the mortality measures (MORT-30-AMI, MORT-30-HF, MORT-30-PN) we are proposing to initially include in the FY 2014 Hospital VBP program. Like the proposed clinical process of care and HCAHPS measures, we propose to set the achievement performance standard (achievement threshold) for each proposed outcome measure at the median of hospital performance (50th percentile) during the proposed baseline period. Similarly, we propose to set the improvement performance standard (improvement threshold) for each proposed outcome measure at each specific hospital's performance on each measure during the proposed baseline period of July 1, 2008 to December 31, 2009. We provide the following sample achievement thresholds (displayed as survival rates), derived from July 2006 through July 2009 data, as examples of the achievement performance standards for that period:
- MORT-30-AMI: 83.7%
- MORT-30-HF: 88.8%
- MORT-30-PN: 88.5%.
We solicit public comments on the proposed performance standards as described above.
E. Proposed Methodology for Calculating the Total Performance Score
1. Statutory Provisions—Proposed Methodology for Calculating the Total Performance Score
Section 1886(o)(5)(A) of the Act requires the Secretary to develop a methodology for assessing each hospital's total performance based on performance standards with respect to the measures selected for a performance period. Using such methodology, the Secretary must provide for an assessment for each hospital for each performance period. Section 1886(o)(5)(B) of the Act sets forth five additional requirements related to the scoring methodology developed by the Secretary under section 1886(o)(5)(A). Specifically, section 1886(o)(5)(B)(i) requires the Secretary to ensure that the application of the scoring methodology results in an appropriate distribution of value-based incentive payments among hospitals receiving different levels of hospital performance scores, with hospitals achieving the highest hospital performance scores receiving the largest value-based incentive payments. Section 1886(o)(5)(B)(ii) provides that under the methodology, the hospital performance score must be determined using the higher of its achievement or improvement score for each measure. Section 1886(o)(5)(B)(iii) requires that the hospital scoring methodology provide for the assignment of weights for categories of measures as the Secretary deems appropriate. Section 1886(o)(5)(B)(iv) prohibits the Secretary from setting a minimum performance standard in determining the hospital performance score for any hospital. Finally, section 1886(o)(5)(B)(v) requires that the hospital performance score for a hospital reflect the measures that apply to the hospital.
2. Additional Factors for Consideration—Proposed Methodology for Calculating the Total Performance Score
In addition to the statutory requirements, we also considered several additional factors when developing the proposed performance scoring methodology for the Hospital Value-Based Purchasing program. First, we believe it is important that the performance scoring methodology be straightforward and transparent to hospitals, patients, and other stakeholders. Hospitals must be able to clearly understand performance scoring methods and performance expectations to maximize quality improvement efforts. The public must understand performance scoring methods to utilize publicly reported information when choosing hospitals. Second, we believe the scoring methodologies for all Medicare Value-Based Purchasing programs, including (but not limited to) the End Stage Renal Disease Quality Incentive Program (42 CFR Part 413), should be aligned as appropriate given their specific statutory requirements. This alignment will facilitate the public's understanding of quality information disseminated in these programs and foster more informed consumer decision making about health care. Third, we believe differences in performance scores must reflect true differences in performance. In order to ensure this in the proposed Hospital Value-Based Purchasing program, we assessed the quantitative characteristics of the measures we are proposing to use to calculate a performance score, including the current state of measure development, the distribution of current hospital performance in the proposed measure set, the number of measures, and the number and grouping of measure domains. Fourth, we must appropriately measure both quality achievement and improvement in our Hospital Value-Based Purchasing program. Section 1886(o)(5)(B)(ii) of the Act specifies that performance scores under the Hospital Value-Based Purchasing program be calculated utilizing the higher of the achievement and improvement scores for each measure, and that explicit direction has implications for the design of the performance scoring methodology. We must also consider the impact that performance scores incorporating achievement and improvement will have on hospital behavior, given their payment implications. Fifth, we wish to avoid unintended consequences in our performance scoring methodology, such as rewarding inappropriate hospital behavior or poor outcomes for patients. Sixth, we wish to utilize the most currently available data to assess hospital improvement in a performance score methodology. We believe that more current data would result in a more accurate performance score, but recognize that hospitals require time to abstract and collect quality information. We also require time to process this information accurately.
This proposed rule's method for calculating the improvement score relies on a comparison of the current payment year's performance period with a “baseline” period of July 1, 2008 through December 31, 2009 for the three 30-day mortality measures, rather than a comparison of the current year with the previous year (as outlined in the 2007 report to Congress). We propose this baseline period because these data are the most currently available data at this time for public comment. We plan to propose future annual updates to the baseline period through future rulemaking. We recognize that comparing a payment year's performance period with the previous year's performance period may be a better estimate of incremental improvement. As noted above, we solicit comment on the merits and impact of all of the factors related to our performance score methodology alternatives, including the choice of how to define the baseline year.
We solicit comment on the merits and impact of all of these factors related to our performance score methodology alternatives described in the next section of this proposed rule. Specifically, we welcome suggestions on improving the simplicity of the Hospital Value-Based Purchasing program performance score methodology and its alignment with other CMS Value-Based Purchasing programs. We recognize that statutorily mandated differences may require differences in performance score methodologies among the CMS Value-Based Purchasing programs.
3. Background—Proposed FY 2013 Hospital VBP Program Scoring Methodology
In November 2007, CMS published a report entitled, “Report to Congress: Plan to Implement a Medicare Hospital Value-Based Purchasing Program” (referred to in this proposed rule as the “2007 Report to Congress”).[6] In addition to laying the groundwork for hospital value-based purchasing, the 2007 Report to Congress analyzed and presented a potential performance scoring methodology (called the Performance Assessment Model) for the Hospital VBP program. The Performance Assessment Model combines scores on individual measures across different quality categories or “domains” (for example, clinical process of care, patient experience of care) to calculate a hospital's total performance score. The Performance Assessment Model provides a methodology for evaluating a hospital's performance on each quality measure based on the higher of an attainment score in the measurement period or an improvement score, which is determined by comparing the hospital's current measure score with a baseline period of performance. The use of an improvement score is intended to provide an incentive for a broad range of hospitals that participate in a hospital VBP program by awarding points for showing improvement on quality measures, not solely for outperforming other hospitals.
Under the Performance Assessment Model, measures are grouped into domains, for example, clinical process of care (which could include AMI, HF, PN, and SCIP) and patient experience of care (for example, HCAHPS). A score is calculated for each domain by combining the measure scores within that domain, weighting each measure equally. The domain score reflects the percentage of points earned out of the total possible points for which a hospital is eligible. A hospital's total performance score is determined by aggregating the scores across all domains. In aggregating the scores across domains, the domains could be weighted equally or unequally, depending on the policy goals. The total performance score is then translated into the percentage of Hospital VBP incentive payment earned using an exchange function, which aligns payments with desired policy goals.
4. Proposed FY 2013 Hospital VBP Program Scoring Methodology
We believe that the Performance Assessment Model presented and analyzed in the 2007 Report to Congress provides a useful foundation for developing a FY 2013 Hospital VBP program performance scoring methodology that comports with the requirements in section 1886(o) of the Act. The Performance Assessment Model outlines an approach that we believe is well-understood by patient advocates, hospitals and other stakeholders, was developed during a year-long process that involved extensive stakeholder input, and was presented by us to Congress. Since issuing the report, we have conducted further, extensive research on a number of important methodology issues for the Hospital VBP program, including the impact of topped-out measures on scoring, appropriate case minimum thresholds for measures, appropriate measure minimum thresholds per domain, and other issues required to ensure a high level of confidence in the scoring methodology (all of which we discuss in this proposed rule).
After carefully reviewing and evaluating a number of potential performance scoring methodologies for the Hospital VBP program, we propose to use a Three-Domain Performance Scoring Model, although only two domains will receive weight in FY 2013. This methodology is very similar to the Performance Assessment Model; however, it incorporates an outcome measures domain in addition to the clinical process of care and patient experience of care domains. While we do not propose to adopt any outcome measures for the FY 2013 Hospital VBP program, we propose to adopt these measures as part of an outcome measures domain for FY 2014. Therefore, we refer to the proposed methodology as the Three-Domain Performance Scoring Model and describe how the outcome measures would apply when the domain is eventually given weight.
We present below the proposed Three-Domain Performance Scoring Model, which includes setting benchmarks and thresholds, scoring hospitals on achievement and improvement for three domains (clinical process of care, patient experience of care, and outcomes), weighting the domains, and calculating the hospital total performance score. In the discussion, we highlight any differences between the Three-Domain Performance Scoring Model and the Performance Assessment Model, along with our reasons for the departure.
a. Clinical Process of Care and Outcome Measures Scoring Under the Three-Domain Performance Scoring Model: Setting Performance Benchmarks and Thresholds
As stated above, section 1886(o)(5)(B)(ii) of the Act requires that under the Hospital VBP performance scoring methodology, hospital performance scores be determined using the higher of achievement or improvement scores for each measure. With respect to scoring hospital performance on the proposed clinical process of care and outcome measures, we propose to use a methodology based on the scoring methodology set forth in the 2007 Report to Congress Performance Assessment Model. Under this methodology, a hospital's performance on each quality measure is evaluated based on the higher of an attainment score (herein, “achievement score”) in the performance period or an improvement score, which is determined by comparing the hospital's score in the performance period with its score during a baseline period of performance. In determining the achievement score, we propose that hospitals would receive points along an achievement range, which is a scale between the achievement threshold (the minimum level of hospital performance required to receive achievement points) and the benchmark (the mean of the top decile of hospital performance during the baseline period). In determining the improvement score, we propose that hospitals would receive points along an improvement range, which is a scale between the hospital's prior score on the measure during the baseline period and the benchmark.
Under this methodology, we propose to establish the benchmarks and achievement thresholds using national data from a three-quarter baseline period of July 1, 2009 through March 31, 2010. We discuss our rationale for proposing to use this baseline period in section D. of this proposed rule.
To define a high level of hospital performance on a given measure, we propose to set the benchmark at the mean of the top decile of hospital scores on the measure during the baseline period. We believe this will ensure that the benchmark represents demonstrably high but achievable standards of excellence; in other words, the benchmark will reflect observed scores for the group of highest-performing hospitals on a given measure.
We considered several options for setting the achievement threshold, including the 25th, 50th (median), and 75th percentile scores. The higher and lower options were rejected as too stringent and too lenient, respectively. Setting the achievement threshold at the 50th percentile, however, balances the agency's goal to reward only those hospitals that can demonstrate a certain level of quality with the desire to set the bar at an attainable level. We decided that the median score (that is, the point at which the hospital's performance is better than the performance of half of all hospitals during the baseline period) would be an appropriate threshold for earning some merit, that is, for earning one or more points for achievement. The further a hospital's performance exceeds the achievement performance standard, the higher its score, until the hospital reaches what we believe to be an empirical standard of excellence (that is, the benchmark). Therefore, we propose to set the achievement threshold at the 50th percentile of hospital performance on the measure during the baseline period. Hospitals will have to score at or above this threshold to earn achievement points.
We believe that these proposed definitions are in keeping with the statutory requirements and reflect the evidence-based approach for determining thresholds and benchmarks set forth in the 2007 Report to Congress.
b. Clinical Process of Care and Outcome Measures Scoring Under the Three-Domain Performance Scoring Model: Scoring Hospital Performance Based on Achievement
Like the Performance Assessment Model set forth in the 2007 Report to Congress, for each of the proposed clinical process and outcome measures that apply to the hospital, we propose that a hospital would earn 0-10 points for achievement based on where its performance for the measure fell relative to the achievement threshold (which we propose above to define as performance during the baseline period at the 50th percentile) and the benchmark (which we propose above to define as performance during the baseline period at the mean of the top decile), according to the following formula:
[9 * ((Hospital's performance period score − achievement threshold)/(benchmark − achievement threshold))] + 0.5, where the hospital's performance period score falls in the range from the achievement threshold to the benchmark
All achievement points would be rounded to the nearest whole number (for example, an achievement score of 4.5 would be rounded to 5). If a hospital's score was:
- Equal to or greater than the benchmark, the hospital would receive 10 points for achievement
- Equal to or greater than the achievement threshold (but below the benchmark), the hospital would receive a score of 1-9 based on a linear scale established for the achievement range (which distributes all points proportionately between the achievement threshold and the benchmark so that the interval in performance between the score needed to receive a given number of achievement points and one additional achievement point is the same throughout the range of performance from the achievement threshold to the benchmark).
- Less than the achievement threshold (that is, the lower bound of the achievement range), the hospital would receive 0 points for achievement.
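To make the arithmetic of the achievement formula and the three cases above concrete, the following is a minimal, illustrative sketch in Python; it is not part of the proposed methodology. The function and helper names are ours, the half-up rounding helper reflects the rounding convention described above (for example, 4.5 rounds to 5), and the example values (achievement threshold 0.47, benchmark 0.87, performance period scores 0.70 and 0.91) are taken from the pneumonia measure illustrations in section d. below.

    import math

    def round_half_up(x):
        # Round to the nearest whole number, with .5 rounding up (e.g., 4.5 -> 5).
        return int(math.floor(x + 0.5))

    def achievement_points(score, threshold, benchmark):
        # Achievement points (0-10) for one clinical process of care or outcome measure.
        # score: hospital's rate on the measure during the performance period
        # threshold: achievement threshold (50th percentile of baseline performance)
        # benchmark: mean of the top decile of baseline performance
        if score >= benchmark:
            return 10  # at or above the benchmark
        if score < threshold:
            return 0   # below the achievement threshold
        # Within the achievement range: linear scale from threshold to benchmark.
        return round_half_up(9 * ((score - threshold) / (benchmark - threshold)) + 0.5)

    # Example values from the pneumonia illustrations below.
    print(achievement_points(0.70, 0.47, 0.87))  # 6
    print(achievement_points(0.91, 0.47, 0.87))  # 10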
c. Clinical Process of Care and Outcome Measures Scoring Under the Three-Domain Performance Scoring Model: Scoring Hospital Performance Based on Improvement
In keeping with the approach analyzed for the 2007 Report to Congress, for the proposed clinical process of care and outcome measures, we propose that a hospital would earn 0-9 points based on how much its performance on the measure during the performance period improved from its performance on the measure during the baseline period. A unique improvement range for each measure would be established for each hospital that defines the distance between the hospital's baseline period score and the national benchmark for the measure (the mean of the top decile), according to the following formula:
[10 * ((Hospital performance period score − Hospital baseline period score)/(Benchmark − Hospital baseline period score))] − 0.5, where the hospital's performance period score falls in the range from the hospital's baseline period score to the benchmark
All improvement points would be rounded to the nearest whole number. If a hospital's score on the measure during the performance period was:
- Greater than its baseline period score but below the benchmark (within the improvement range), the hospital would receive a score of 0-9 based on the linear scale that defines the improvement range
- Equal to or lower than its baseline period score on the measure, the hospital would receive 0 points for improvement.
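As a companion to the achievement sketch above, the following illustrative Python sketch (again, not part of the proposed methodology) computes improvement points and then takes the higher of the achievement and improvement scores for a measure, as section 1886(o)(5)(B)(ii) requires. The handling of the case in which a hospital's baseline score already equals or exceeds the benchmark, and the clamping of the result to the 0-9 range, are our assumptions for completeness; the example values are Hospital I's figures from the illustrations that follow.

    import math

    def round_half_up(x):
        return int(math.floor(x + 0.5))

    def improvement_points(score, baseline, benchmark):
        # Improvement points (0-9) for one clinical process of care or outcome measure.
        # score: performance period rate; baseline: the same hospital's baseline period rate;
        # benchmark: mean of the top decile of national baseline period performance.
        if score <= baseline or baseline >= benchmark:
            return 0  # no improvement, or no improvement range can be defined (assumption)
        raw = 10 * ((score - baseline) / (benchmark - baseline)) - 0.5
        return max(0, min(9, round_half_up(raw)))  # assumed clamp to the 0-9 range

    def measure_score(score, baseline, threshold, benchmark):
        # Measure score is the higher of achievement (0-10) and improvement (0-9) points.
        if score >= benchmark:
            achievement = 10
        elif score < threshold:
            achievement = 0
        else:
            achievement = round_half_up(9 * ((score - threshold) / (benchmark - threshold)) + 0.5)
        return max(achievement, improvement_points(score, baseline, benchmark))

    # Hospital I example below: baseline 0.21, performance 0.70, threshold 0.47, benchmark 0.87.
    print(measure_score(0.70, 0.21, 0.47, 0.87))  # 7 (achievement 6, improvement 7)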
d. Examples To Illustrate Clinical Process of Care and Outcome Measures Scoring Under the Three-Domain Performance Scoring Model
Three examples are presented to illustrate how the proposed Three-Domain Performance Scoring Model would be applied in the context of the proposed clinical process of care and outcome measures. The hospitals were selected from an empirical database created from 2004-2005 data to support the development of the Performance Assessment Model, and all performance scores are calculated for the pneumonia measure, “patients assessed and given pneumococcal vaccine.” Figure 1 shows the scoring for Hospital B. The benchmark calculated for the pneumonia measure in this case was 0.87 (the mean value of the top decile in 2004), and the achievement threshold was 0.47 (the performance of the median or the 50th percentile hospital in 2004). Hospital B's 2005 performance rate of 0.91 during the performance period for this measure exceeds the benchmark, so Hospital B would earn 10 (the maximum) points for achievement. The hospital's performance rate on a measure is expressed as a decimal. In the illustration, Hospital B's performance rate of 0.91 means that 91 percent of applicable patients admitted for pneumonia were assessed and given the pneumococcal vaccine. (Because Hospital B has earned the maximum number of points possible for this measure, its improvement score would be irrelevant.)
Figure 2 shows the scoring for another hospital, Hospital I. As can be seen below, the hospital's performance on this measure went from 0.21 (below the achievement threshold) in the baseline period to 0.70 (above the achievement threshold) in the performance period. Applying the achievement scale, Hospital I would earn 6 points for this measure, calculated as follows:
[9 * ((0.70 − 0.47)/(0.87 − 0.47))] + 0.5 = 5.175 + 0.5 = 5.675, rounded to 6 points.
However, because Hospital I's performance during the performance period is also greater than its performance during the baseline period, it would be scored based on improvement as well. According to the improvement scale, based on Hospital I's period-to-period improvement, from 0.21 to 0.70, Hospital I would earn 7 points, calculated as follows:
[10 * ((0.70 − 0.21)/(0.87 − 0.21))] − 0.5 = 6.92, rounded to 7 points.
Because the higher of the two scores is used for determining the measure score, Hospital I would receive 7 points for this measure (rounded to the nearest whole number).
In Figure 3, shown below, Hospital L's performance on the pneumonia measure drops from 0.57 to 0.46 (a decline of 0.11). Because this hospital's performance during the performance period is lower than the achievement threshold of 0.47, it receives 0 points based on achievement. It would also receive 0 points for improvement, because its performance during the performance period is lower than its performance during the baseline period. In this example, Hospital L would receive 0 points for the measure.
e. Calculation of the Overall Clinical Process of Care and Outcome Measure Domain Scores Under the Three-Domain Performance Scoring Model
We propose that both a hospital's overall clinical performance score and outcome performance score would be based on all measures that apply to the hospital. We propose that a measure applies to a hospital if, during the performance period, the hospital treats a minimum number of cases (which we propose to define as 10 cases in section F of this proposed rule) that meet the technical specifications for reporting the measure. We also propose that at least 4 measures within a domain must apply to the hospital in order for the hospital to receive a performance score on that domain (this proposal is also discussed more fully in section F of this proposed rule). Thus, the number and type of measures that apply to each hospital will vary, depending on the services the hospital provides (for example, some hospitals may not perform percutaneous coronary intervention; therefore, this measure would not apply to them). As proposed above, for each applicable measure, a hospital would receive a score based on the higher of its achievement and improvement scores. Because the clinical process of care and outcome measure performance scores will be based only on the measures that apply to the hospital, we propose to normalize the domain scores across hospitals by converting the points earned for each domain to a percentage of total points.
With respect to the clinical process of care and outcome domains, we propose that the points earned for each measure that applies to the hospital would be summed (weighted equally) to determine the total earned points for the domain:
Total earned points for domain = Sum of points earned for all applicable domain measures
Under the proposed approach, each hospital would also have a corresponding universe of total possible points for each of the clinical process and outcome domains calculated as follows:
Total possible points for domain = Total number of domain measures that apply to the hospital multiplied by 10 points
We also propose that the hospital's clinical process of care and outcome domain scores would each be a percentage, calculated as follows:
Domain score = (Total earned points divided by Total possible points) multiplied by 100
As an example, four clinical process of care measures apply to Hospital E, and Hospital E reports data on at least 10 cases for each of these measures. Under the proposed scoring methodology discussed above, Hospital E is awarded 9, 5, 3, and 10 points, respectively, for these measures. Hospital E's total earned points for the clinical process of care measure domain would be calculated by adding together all the points Hospital E was awarded, resulting in a total of 27 points. Hospital E's total possible points would be the total number of measures that apply to the hospital (four measures) and for which the hospital had the minimum number of cases, multiplied by 10 points, for a total of 40 points. Hospital E's clinical process of care domain score would be the total earned points (that is, 27 points) divided by the total possible points (that is, 40 points) multiplied by 100, which yields a result of 67.5.
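The Hospital E calculation can be expressed compactly as follows. This is an illustrative sketch only; the function name and the decision to return no score when fewer than four measures apply are our assumptions based on the minimum-measure proposal discussed above and in section F.

    def domain_score(measure_points):
        # Clinical process of care or outcome domain score, expressed as a percentage.
        # measure_points: one 0-10 score per measure that applies to the hospital
        # (each measure must have at least 10 reported cases).
        if len(measure_points) < 4:
            return None  # fewer than 4 applicable measures: domain not scored (assumption)
        total_earned = sum(measure_points)          # measures weighted equally
        total_possible = 10 * len(measure_points)   # 10 possible points per measure
        return 100.0 * total_earned / total_possible

    # Hospital E: four applicable measures scored 9, 5, 3, and 10 points.
    print(domain_score([9, 5, 3, 10]))  # 67.5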
5. Scoring Patient Experience of Care Measures (HCAHPS) Under the Three-Domain Performance Scoring Model
Since the 2007 Report to Congress was published, we have performed additional analyses on methods of scoring HCAHPS measures for purposes of the Hospital VBP program using data collected from a greater number of hospitals and over a longer period of time. We have found that the model laid out in the 2007 Report to Congress has good measurement properties and functions as intended with respect to achievement, consistency, and improvement. We believe that the scoring approach proposed here, which is based on the HCAHPS model set forth in the 2007 Report to Congress, reflects both the interrelated nature of HCAHPS dimensions and the importance of providing incentives to hospitals to improve on each of eight dimensions of patient experience.
The scoring approach we propose for HCAHPS performance for the FY 2013 Hospital VBP program captures eight HCAHPS dimensions (seven composites and one global rating of care) and would seek to incentivize hospitals to improve on each of the eight dimensions of patient experience (see Table 4). We propose that the 8 dimensions will be structured similarly to the 10 HCAHPS items that we currently report on Hospital Compare, except that we are proposing to combine the cleanliness of hospital environment and quietness of hospital environment items into a single dimension and not to include the recommend the hospital item. We are proposing these changes because we do not want to give more weight to the two items capturing environmental issues by treating them as separate dimensions, and because the “Recommend the hospital” item is very similar to the included “Overall rating” item.
We are proposing to score each of the eight HCAHPS dimensions using an approach that parallels the one we are proposing to use to score the clinical process measures, using an achievement point range of 0-10 and an improvement point range of 0-9, with the total score on each HCAHPS dimension being the higher of the achievement or improvement score. In order to ensure statistical reliability, we are also proposing that, for inclusion in the Hospital VBP program for FY 2013, hospitals report a minimum of 100 HCAHPS surveys during the performance period (we discuss this proposal further in section F of this proposed rule).
In order to be consistent with what we do under the Hospital IQR program, we are also proposing to give hospitals that have 5 or fewer HCAHPS-eligible discharges in a month the option to not submit HCAHPS surveys for that month as part of their quarterly data submission. However, in contrast to the proposed clinical process of care measure scoring methodology, under which different numbers of measures might apply to different hospitals, all hospitals that report HCAHPS data would be expected to report the complete survey.
As we are proposing to do with respect to scoring the proposed clinical process of care measures, we are proposing that achievement thresholds and benchmarks would be used to score hospital performance during the performance period, and these achievement thresholds and benchmarks would be established using data from the proposed baseline period. Thus, a hospital's achievement score would be based on a fixed standard rather than on its current standing relative to its peers. The achievement threshold for each HCAHPS dimension would correspond to median performance in the baseline period (50th percentile performance). Therefore, hospitals would earn points for achievement if they performed at least as well in the performance period as the mid-performing hospital performed during the baseline period. The benchmark corresponds to excellent performance observed in the baseline period and we are proposing to set it such that the maximum achievement points (10 points) would be awarded if the hospital performed at least at the 95th percentile of performance during the baseline period. We are proposing to set the actual benchmarks and achievement thresholds for the FY 2013 Hospital VBP program using data from the proposed baseline period (July 1, 2009 through March 31, 2010).
Similar to the proposed clinical process measures, we are proposing that each of the eight HCAHPS dimensions would be given equal weight in calculating the overall HCAHPS score. However, unlike the proposed scoring approach for the proposed clinical process of care measures, we are proposing to construct the patient experience of care measures score for the FY 2013 Hospital VBP using three elements: Achievement points, improvement points, and consistency points.
As shown in Table 4, for each of the eight HCAHPS dimensions we propose for the FY 2013 Hospital VBP program, scores would be based on the publicly reported adjusted proportions of best category (“top-box”) responses. (Top-box responses, as publicly reported on the Hospital Compare website, are the most positive responses to HCAHPS survey questions.) Please note that the “Cleanliness and Quietness” dimension is the average of the publicly reported stand-alone “Cleanliness” and “Quietness” ratings.
Table 4—Eight Proposed HCAHPS Dimensions for the FY 2013 Hospital VBP Program
Dimension (composite or stand-alone item) and constituent HCAHPS survey items:
1. Nurse communication (% “Always”): Nurse-Courtesy/Respect; Nurse-Listen; Nurse-Explain.
2. Doctor communication (% “Always”): Doctor-Courtesy/Respect; Doctor-Listen; Doctor-Explain.
3. Cleanliness and quietness (% “Always”): Cleanliness; Quietness.
4. Responsiveness of hospital staff (% “Always”): Bathroom Help; Call Button.
5. Pain management (% “Always”): Pain Control; Help with Pain.
6. Communication about medications (% “Always”): New Medicine-Reason; New Medicine-Side Effects.
7. Discharge information (% “Yes”): Discharge-Help; Discharge-Systems.
8. Overall rating: Overall Rating.

a. Patient Experience of Care Measure (HCAHPS) Scoring Under the Three-Domain Performance Scoring Model: Scoring Hospitals on Achievement
Section 1886(o)(3)(A) requires the Secretary to establish performance standards with respect to the measures selected under the Hospital VBP program for a performance period for a fiscal year. The performance standards must include levels of achievement and improvement (section 1886(o)(3)(B)). The scoring methodology we are proposing to implement for HCAHPS includes achievement, improvement, and consistency points. The achievement and improvement points are very similar to what is proposed for the clinical measures. The consistency points measure whether hospitals are meeting the achievement thresholds across the eight proposed HCAHPS dimensions, which we believe will encourage hospitals to meet those thresholds for all of them. Consistency points are an additional form of achievement measurement that complements achievement points earned through hospital performance on individual HCAHPS dimensions.
The first proposed component of the patient experience of care/HCAHPS Hospital VBP program scoring algorithm is achievement points, which reward hospital performance at or above the proposed baseline median on each of the eight HCAHPS dimensions. A minimum score of 0 corresponds to all eight dimensions being below the baseline median (that is, the dimension-specific achievement threshold), while a maximum score of 80 corresponds to all eight dimensions being at or greater than the 95th percentile from the baseline period (that is, the dimension-specific benchmark). We propose to assign 0 to 10 points for each of the eight HCAHPS dimensions as follows:
- If the hospital's score on a dimension is equal to or greater than the benchmark (that is, the baseline 95th percentile performance), the hospital would receive 10 points for achievement on that dimension
- If the hospital's score on a dimension is within the achievement range (that is, equal to or greater than the achievement threshold of 50th percentile performance but below the benchmark of 95th percentile performance), the hospital would receive a score of 1-9, based on a linear scale established for the achievement range and rounding to the nearest whole point according to the following formula:
((Hospital HCAHPS performance period dimension score − 50)/5) + 0.5, where the dimension score is expressed as a percentile of hospital performance during the baseline period.
For example, if performance on a given dimension is at the 60th percentile, the hospital would receive 3 achievement points, calculated as follows: ((60 − 50)/5) + 0.5 = 2 + 0.5 = 2.5, which would be rounded to 3.
- If the hospital's score on a dimension is less than the achievement threshold for the dimension (that is, less than the 50th percentile of performance), the hospital would receive 0 points for achievement.
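Because each HCAHPS dimension score is expressed as a percentile of baseline period hospital performance, with the achievement threshold at the 50th percentile and the benchmark at the 95th, the achievement calculation can be sketched as follows. This is an illustrative sketch only; the function name and the half-up rounding helper are ours.

    import math

    def round_half_up(x):
        return int(math.floor(x + 0.5))

    def hcahps_achievement_points(dimension_percentile):
        # Achievement points (0-10) for one HCAHPS dimension.
        # dimension_percentile: the hospital's performance period score on the dimension,
        # expressed as a percentile of baseline period hospital performance.
        if dimension_percentile >= 95:
            return 10  # at or above the benchmark (95th percentile)
        if dimension_percentile < 50:
            return 0   # below the achievement threshold (50th percentile)
        return round_half_up((dimension_percentile - 50) / 5 + 0.5)

    print(hcahps_achievement_points(60))  # 3, matching the example above
    print(hcahps_achievement_points(96))  # 10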
b. HCAHPS Performance Scoring Under the Three-Domain Performance Scoring Model: Scoring Hospitals on Improvement
The second proposed component of the HCAHPS Hospital VBP scoring algorithm is improvement points. For each HCAHPS dimension, a hospital could earn from 0-9 improvement points depending on how much its performance on the dimension during the performance period improved from its performance on the dimension during the baseline period. This proposed approach would recognize and encourage improvement for each of the eight HCAHPS dimensions. A unique improvement range for each hospital on each HCAHPS dimension would be established. Improvement points would be awarded proportionately and would be rounded to the nearest whole number. The score is based on the proportion of possible improvement achieved in the performance period, measured from the hospital's baseline period score on a given dimension to the benchmark on the same dimension. We propose to calculate improvement points for each of the eight dimensions according to the following formula:
[10*((Hospital performance period score − Hospital baseline period score)/(Benchmark − Hospital baseline period score))] − 0.5, where the hospital performance score falls in the range from the hospital's baseline period score to the benchmark
All improvement points would be rounded to the nearest whole number. If a hospital's score on the measure during the performance period was:
- Greater than its baseline period score but below the benchmark (within the improvement range), the hospital would receive a score of 0-9 based on the linear scale that defines the improvement range
- Equal to or lower than its baseline period score on the measure, the hospital would receive 0 points for improvement.
- If there is no improvement, or if the hospital's score during the baseline period was already at or above the benchmark (so that no improvement range can be defined), the improvement score is 0.
For example, if a hospital's baseline score on a given dimension was at the 45th percentile and the hospital's score on the dimension during the performance period was at the 70th percentile, the hospital's improvement points on that dimension would be 5, calculated as follows:
[10 * ((70 − 45)/(95 − 45))] − 0.5 = 4.5, which would be rounded to 5.
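The HCAHPS improvement calculation parallels the clinical process of care improvement calculation, with the benchmark fixed at the 95th percentile of baseline performance. The sketch below is illustrative only; the function name and the clamping of the result to the 0-9 range are our assumptions.

    import math

    def round_half_up(x):
        return int(math.floor(x + 0.5))

    def hcahps_improvement_points(performance_pct, baseline_pct, benchmark_pct=95):
        # Improvement points (0-9) for one HCAHPS dimension, with scores expressed as
        # percentiles of baseline period hospital performance.
        if performance_pct <= baseline_pct or baseline_pct >= benchmark_pct:
            return 0  # no improvement, or no improvement range can be defined (assumption)
        raw = 10 * ((performance_pct - baseline_pct) / (benchmark_pct - baseline_pct)) - 0.5
        return max(0, min(9, round_half_up(raw)))  # assumed clamp to the 0-9 range

    print(hcahps_improvement_points(70, 45))  # 5, matching the example above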
c. HCAHPS Performance Scoring Model: Calculation of Consistency Points
The third proposed component of the HCAHPS Hospital VBP scoring algorithm is the consistency score. The consistency score recognizes consistent achievement across dimensions. To ensure at least adequate performance across all HCAHPS dimensions, we are proposing that, for the FY 2013 Hospital VBP program, hospitals earn consistency points ranging from 0-20 based on how many of their dimension scores meet or exceed the achievement threshold. The purpose of the consistency score (referred to as the “minimum performance score” in the 2007 Report to Congress) is to incentivize hospitals to continually improve on all HCAHPS dimensions to the point where their score on each dimension is at or above the achievement threshold. We believe that providing this type of incentive that applies to an entire domain is consistent with promoting wider systems changes within hospitals to improve quality.
We are proposing that a hospital would receive 0 consistency points if its performance on one or more HCAHPS dimensions during the performance period was at least as poor as the worst-performing hospital's performance on that dimension during the baseline period. A hospital would receive the maximum score of 20 consistency points if its performance on all eight HCAHPS dimensions was at or above the achievement threshold (the 50th percentile of hospital performance during the baseline period).
We propose for the FY 2013 Hospital VBP program that a maximum of 20 consistency points would be awarded proportionately based on the single lowest of a hospital's 8 HCAHPS dimension scores during the performance period, compared to the median baseline performance score for that specific HCAHPS dimension. If all 8 of a hospital's dimension scores during the performance period were at or above the 50th percentile achievement threshold from the baseline period, then that hospital would earn all 20 points. (That is, if the lowest of a hospital's eight HCAHPS dimension scores was at or above the 50th percentile of hospital performance on that dimension during the baseline period, then that hospital would earn the maximum of 20 consistency points.) Consistency points would be awarded proportionately according to where the lowest dimension score falls between the 0th and 50th percentiles of hospital performance during the baseline period. Consistency points would be rounded to the nearest whole number (for example, 9.5 consistency points would be rounded to 10 points). We propose to define the lowest percentile as the lowest dimension score among the eight HCAHPS dimensions that would be scored under the FY 2013 Hospital VBP program. The formula for the HCAHPS consistency score is as follows:
(2 * (lowest percentile/5)) − 0.5, rounded to the nearest whole number, with a minimum of zero and a maximum of 20 consistency points
For example:
- If the lowest score a hospital receives on an HCAHPS dimension is at or below the 0th percentile of hospital performance on that dimension during the baseline period, then 0 consistency points would be awarded to that hospital.
- If the lowest score a hospital receives on an HCAHPS dimension is equal to the 10th percentile of hospital performance on that dimension during the baseline period, then 4 (that is, (2 * (10/5)) − 0.5 = 3.5, rounded to 4) consistency points would be awarded to that hospital.
- If the lowest score a hospital receives on an HCAHPS dimension is equal to the 25th percentile of hospital performance on that dimension during the baseline period, then 10 (that is, (2 * (25/5)) − 0.5 = 9.5, rounded to 10) consistency points would be awarded to that hospital.
- If a hospital's score on all eight HCAHPS dimensions were at or above the achievement threshold (50th percentile of hospital performance during the baseline period), then 20 consistency points would be awarded to that hospital.
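The consistency score depends only on the lowest of the eight dimension scores, each expressed as a percentile of baseline period performance. The following illustrative sketch reproduces the bulleted examples above; the function name and the clamping to the 0-20 range are ours.

    import math

    def round_half_up(x):
        return int(math.floor(x + 0.5))

    def hcahps_consistency_points(dimension_percentiles):
        # Consistency points (0-20), driven by the single lowest of the eight HCAHPS
        # dimension scores, each expressed as a percentile of baseline performance.
        lowest = min(dimension_percentiles)
        raw = 2 * (lowest / 5) - 0.5
        return max(0, min(20, round_half_up(raw)))

    # Only the lowest dimension matters; the other seven values here are arbitrary.
    print(hcahps_consistency_points([10, 55, 60, 70, 80, 85, 90, 95]))  # 4
    print(hcahps_consistency_points([25, 55, 60, 70, 80, 85, 90, 95]))  # 10
    print(hcahps_consistency_points([50, 55, 60, 70, 80, 85, 90, 95]))  # 20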
d. Examples To Illustrate HCAHPS Measure Scoring Model
Examples are presented here to illustrate how the proposed Three-Domain Performance Scoring Model would apply in the context of scoring the proposed HCAHPS dimensions. The dimension used for this illustration is doctor communication. Figure 4 shows Hospital B's scoring on the doctor communication dimension. Hospital B's performance during the performance period was at the 96th percentile, which exceeds the benchmark (the 95th percentile). Thus, Hospital B would earn the maximum of 10 points for achievement. Because this is the highest number of achievement points the hospital could attain for this dimension, its improvement from its baseline period score on this measure would not be relevant.
Figure 5 shows that Hospital I's performance on the doctor communication dimension rose from the 42nd percentile during the baseline period to the 64th percentile during the performance period. Because Hospital I's performance during the performance period exceeds the achievement threshold of the 50th percentile, Hospital I's score would be in the achievement range. According to the achievement scale, Hospital I would earn 3 achievement points. However, in this case, the hospital's performance in the performance period has improved from its performance during the baseline period, so Hospital I would be scored based on improvement as well as achievement. Applying the improvement scale, Hospital I's period-to-period improvement from the 42nd to the 64th percentile would earn it 3.65 improvement points which would be rounded to 4 points. Using the greater of the two scores, Hospital I would receive 4 points for this dimension (rounded to the nearest whole number).
In Figure 6, Hospital L's performance in the baseline period was at the 11th percentile, and its performance declined in the performance period to the 6th percentile. Because Hospital L's performance during the performance period is lower than the achievement threshold of the 50th percentile, it would receive 0 points based on achievement. Hospital L would also receive 0 points for improvement because its performance during the performance period is lower than its performance during the baseline period.
e. Calculating the Overall Patient Experience of Care Domain (HCAHPS) Performance Score
The proposed final step under the proposed HCAHPS scoring methodology for the FY 2013 Hospital VBP program is to combine the three proposed component scores into the overall patient experience of care domain (HCAHPS) performance score. We propose to calculate the overall HCAHPS performance score as follows:
1. For each of the eight dimensions, determine the larger of the 0-10 achievement score and the 0-9 improvement score.
2. Sum these eight values to arrive at a 0-80 HCAHPS base score.
3. Calculate the 0-20 HCAHPS consistency score.
4. To arrive at the HCAHPS total earned points, or HCAHPS overall score, sum the HCAHPS base score and the consistency score.
In summary, the overall HCAHPS performance score is calculated as follows:
HCAHPS total earned points = HCAHPS base score + consistency score.
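Putting the three components together, the overall HCAHPS score is the 0-80 base score (the sum, across the eight dimensions, of the higher of each dimension's achievement and improvement points) plus the 0-20 consistency score. The sketch below is illustrative only, and the example inputs are hypothetical.

    def hcahps_overall_score(dimension_scores, consistency_points):
        # Overall patient experience of care (HCAHPS) score, 0-100.
        # dimension_scores: for each of the eight dimensions, the higher of achievement
        # (0-10) and improvement (0-9) points; consistency_points: 0-20.
        base_score = sum(dimension_scores)      # 0-80 HCAHPS base score
        return base_score + consistency_points  # HCAHPS total earned points

    # Hypothetical hospital: eight dimension scores and 14 consistency points.
    print(hcahps_overall_score([10, 7, 3, 5, 6, 8, 4, 9], 14))  # 66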
6. Weighting of Hospital Performance Domains and Calculation of the Hospital VBP Total Performance Score
Section 1886(o)(5)(B)(iii) requires that the methodology developed for assessing the total performance of each hospital must provide for the assignment of weights for categories of measures as the Secretary determines appropriate. As discussed above in section C. of this proposed rule, we have proposed to group the measures for the Hospital VBP program into domains, which we would define as categories of measures by measure type. For purposes of the Hospital VBP program in FY 2013, we propose that two domains will be scored: clinical process of care and patient experience of care. We believe that hospital quality is multifaceted, requiring adherence to evidence-based practices, achieving good clinical outcomes, and having positive and effectual patient experiences. In determining how to appropriately weight quality measure domains, we considered a number of criteria. Specifically, we considered the number of measures that we have proposed to include in each domain and the reliability of individual measure data. We also considered the systematic effects of alternative weighting schemes on hospitals according to their location and characteristics (for example, by region, size, and teaching status). We also considered Departmental quality improvement priorities. We strongly believe that outcome measures are important in assessing the overall quality of care provided by hospitals. While we believe that the addition of an outcome domain will make public valuable and important quality information regarding hospital performance, and bring needed attention to patient outcomes, for reasons previously discussed in section II. C. of this proposed rule, we are not proposing to include outcome measures in the FY 2013 Hospital VBP program. Taking all of these considerations into account, we propose the use of a 70 percent clinical process of care and 30 percent patient experience of care (HCAHPS) weighting scheme for the FY 2013 Hospital VBP program. We are proposing this weighting scheme because the 17 proposed clinical process of care measures comprise all but one of the measures we are proposing to include in the FY 2013 Hospital VBP program. We believe assigning a 30 percent weight to the patient experience of care domain is appropriate because the HCAHPS measure is composed of eight dimensions that address different aspects of patient satisfaction. For the FY 2014 Hospital VBP program, in addition to proposing to use the 30-day mortality claims-based measures currently displayed on Hospital Compare, we propose to adopt the following 8 Hospital Acquired Condition measures and 9 AHRQ Patient Safety Indicator and Inpatient Quality Indicator outcome measures:
Hospital Acquired Condition measures:
- Foreign Object Retained After Surgery
- Air Embolism
- Blood Incompatibility
- Pressure Ulcer Stages III & IV
- Falls and Trauma: (Includes: Fracture, Dislocation, Intracranial Injury, Crushing Injury, Burn, Electric Shock)
- Vascular Catheter-Associated Infections
- Catheter-Associated Urinary Tract Infection (UTI)
- Manifestations of Poor Glycemic Control
AHRQ Patient Safety Indicators (PSIs), Inpatient Quality Indicators (IQIs), and Composite Measures:
- PSI 06—Iatrogenic pneumothorax, adult
- PSI 11—Post Operative Respiratory Failure
- PSI 12—Post Operative PE or DVT
- PSI 14—Postoperative wound dehiscence
- PSI 15—Accidental puncture or laceration
- IQI 11—Abdominal aortic aneurysm (AAA) repair mortality rate (with or without volume)
- IQI 19—Hip fracture mortality rate
- Complication/patient safety for selected indicators (composite)
- Mortality for selected medical conditions (composite)
We believe that these outcome measures provide important information relating to treatment outcomes and patient safety. All of these measures are currently included in the Hospital IQR program for the FY 2013 payment determination (75 FR 50209). We also believe that adding these outcome measures would significantly improve the correlation between patient outcomes and Hospital VBP performance. We will propose the FY 2014 Hospital VBP performance period end date and performance standards for these outcome measures in future rulemaking. We solicit public comment on what weight would be appropriate to assign to the outcome domain in future rulemaking.
We propose to calculate a hospital's total performance score by multiplying its performance on each domain by the proposed weight for that domain (70 percent clinical process of care, 30 percent patient experience of care), and adding those weighted scores together.
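Stated as a formula, the proposed calculation is: Total performance score = (0.70 × clinical process of care domain score) + (0.30 × patient experience of care domain score). For illustration, a hospital with a clinical process of care domain score of 67.5 percent and a patient experience of care (HCAHPS) domain score of 69 percent would receive a total performance score of (0.70 × 0.675) + (0.30 × 0.69) = 0.6795; this corresponds to the example worked through in Tables 5 and 6 below.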
We solicit public comment on the proposed domain weighting approach and calculation of the total performance score, and are particularly interested in receiving comments regarding the utility and appropriateness of alternative methods.
Earlier in this proposed rule, we articulated our principles for value-based purchasing programs. To address these principles, we considered several additional factors when developing our proposed performance scoring methodology for the Hospital VBP program. CMS actively seeks comments and proposals on alternative scoring methodologies that may achieve these principles in better, more efficient, or more straightforward ways. New, innovative ideas are particularly useful to the Agency as we seek to create a payment system fully aligned with the overall health system aims of better health, better health care, and more efficient care through improvement.
Section 1886(o)(5)(B)(iv) states that the Secretary may not set a minimum performance standard in determining the hospital performance score for any hospital. We note that under the proposed Three-Domain Performance Scoring Model, the Secretary does not set the minimum performance standard for any hospital. Rather, the hospital in effect sets its own minimum performance standard based on how well it performed during the baseline period, and any improvement from that performance is sufficient for the hospital to earn improvement points.
7. Alternative Hospital Performance Scoring Models Considered
Since the 2007 Report to Congress, CMS has performed additional research and analyses regarding alternative scoring approaches for hospital value-based purchasing. We primarily focused on the Three-Domain Performance Scoring Model, the Six-Domain Performance Scoring Model, and the Appropriate Care Model (ACM). We are proposing to adopt the Three-Domain Performance Scoring Model as previously described.
The Appropriate Care Model (ACM), also referred to as the “all-or-none” model, is intended to be a more patient-centric method of assessing hospital performance on the clinical process of care measures. The ACM creates sub-domains by topic for the clinical process measures and is distinguished from the other two models in that it requires complete mastery for each topic area (“all-or-none”) in the clinical process of care domain at the patient level.
Under the ACM, the patient encounter, rather than the clinical process of care measure itself, becomes the scored “event,” with a hospital receiving 1 point if it successfully provides to a patient the applicable processes under all of the measures within an applicable topic area, or 0 points if it fails to furnish one or more of the applicable processes. The hospital's condition-specific ACM score is the proportion of patients with the condition who receive the appropriate care as captured by the process measures that fall within the topic area.
Within a condition, different sets of clinical processes may apply to a patient. For example, some AMI patients should receive aspirin at arrival but other AMI patients should not; some AMI patients smoke and should receive smoking cessation counseling, while others do not smoke and do not need to receive such counseling. Regardless of the number of clinical process of care measures within a topic that apply to a patient, each patient encounter to which a specific topic area applies weights equally with respect to the hospital's score for the topic area. Patients requiring many clinical processes within a topic are not weighted more heavily than patients requiring only a few clinical processes. There is no “partial credit” given to the hospital for a patient who is provided some, but not all, applicable clinical processes within a topic.
Under the ACM, CMS would determine what percentage of a hospital's patients within each condition or topic area (for example, AMI, HF, PN, and SCIP) received all of the applicable processes covered by all of the measures that fall under that topic. A hospital's performance on each topic area (that is, the percentage of patients that received all the appropriate processes) would then be scored along achievement and improvement ranges similar to those we have proposed for the Three-Domain Performance Scoring Model. These scores across the topic areas would then be equally weighted and combined to create a score for all of the clinical process measures. The hospitals would then be measured on the outcome and patient experience of care domains, just as in the Three-Domain Performance Scoring Model. The total performance score would be computed as a weighted average across the three domains, calculated by weighting the scores for each of the domains.
For each performance scoring model considered, we commissioned independent researchers at Brandeis University to examine the variation and stability of the clinical process of care domain under different combinations of the number of cases (patients) and the number of measures, and to develop minimum numbers of cases and measures that provide a high level of confidence in the meaningfulness of performance scores across hospitals while providing scores for the largest possible number of hospitals. Based on this research, we concluded that in order to ensure the statistical reliability of a hospital's score under the ACM model, the hospital would need to have at least 25 patients within a condition (or topic area) to be measured on that condition and have cases corresponding to at least two conditions to receive an overall ACM score.
Under the ACM, for each condition measured in the clinical process of care domain, a hospital may earn points for achievement or for improvement. The method for determining earned points per condition in the ACM is analogous to the way points are determined per measure in the proposed Three-Domain Performance Scoring Model. Accordingly, the points a hospital earns for each condition are the higher of its points for achievement (that is, performance above the achievement threshold) or improvement (that is, performance better than the hospital's own performance during the baseline period). The hospital's overall ACM score for the clinical process of care domain is the sum of its condition-specific points equally weighted across all conditions measured for the hospital.
Applied to five conditions (AMI, HF, PN, SCIP, and HAI), a hospital reporting on all five conditions could earn a maximum of 50 points under the ACM, while a hospital reporting only three conditions could earn at most 30 points. The final overall clinical process of care domain score for a hospital under the ACM would be its actual sum of points divided by its maximum possible points (for example, 50 in most cases, but possibly 30, 20, or 10, corresponding to the number of conditions reported).
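For illustration only, the following sketch (not part of the proposed rule and not the agency's implementation) shows how the ACM's "all-or-none" condition-level rate and overall score described above could be computed. The patient-level data, measure identifiers, and per-condition point values are hypothetical.

```python
# Illustrative sketch (not part of the proposed rule): Appropriate Care Model
# ("all-or-none") scoring. Input data below are hypothetical.

def acm_condition_rate(patients):
    """patients: list of dicts, one per patient encounter, each mapping an
    applicable clinical process measure to True (furnished) or False (not furnished).
    Returns the proportion of patients who received *all* applicable processes."""
    passed = sum(1 for p in patients if all(p.values()))
    return passed / len(patients)

def acm_overall_score(condition_points, max_points_per_condition=10):
    """condition_points: dict mapping each measured condition to its earned points
    (the higher of the condition's achievement or improvement points).
    Returns earned points divided by the maximum possible points."""
    max_points = max_points_per_condition * len(condition_points)
    return sum(condition_points.values()) / max_points

# Hypothetical example: three heart failure patient encounters.
hf_patients = [
    {"HF-1": True, "HF-2": True},    # all applicable processes furnished -> counts
    {"HF-1": True, "HF-2": False},   # one missed process -> no partial credit
    {"HF-2": True},                  # only one process applies; furnished -> counts
]
print(acm_condition_rate(hf_patients))  # 2/3 of HF patients received appropriate care

earned = {"HF": 7, "PN": 4, "SCIP": 9}  # hypothetical per-condition earned points
print(acm_overall_score(earned))        # 20/30, or about 0.667
```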
The Six-Domain Performance Scoring Model, like the ACM, would create and separately score individual sub-domains at the topic level for the clinical process measures. In other words, the clinical process of care domain would be further broken down into sub-domains characterized by condition (our earlier analysis of the Six-Domain Performance Scoring Model included the HAI measures under the SCIP topic area, using only the following four topic areas: AMI, HF, PN, and SCIP). We would assign intermediate scores to each hospital for each of the clinical process sub-domains (such as AMI, HF, PN, and SCIP). As under the Three-Domain Performance Scoring Model, hospitals would be scored on each measure in the sub-domain, and individual measures (such as SCIP-Card-2 and AMI-3) would still be weighted equally within a sub-domain. Scores across the topic area sub-domains would then be equally weighted and combined to create an overall clinical process score. The total performance score would be computed as an average across domains, calculated by weighting the scores for each of the three domains. At least two clinical process domains would be needed to calculate a total performance score. Based on the research conducted at Brandeis University discussed above, we concluded that a hospital would need to report at least 1 measure included within a domain (with a minimum of 2 domains) and have 10 opportunities (that is, patients) included in the measure. If an outcome domain were included, a hospital would also need to report on at least one of the available outcome measures.
8. Hospital Performance Scoring Model Comparisons
We assessed each of the models discussed above for purposes of structuring the performance scoring methodology for the Hospital VBP program. Specifically, we considered the following conceptual and empirical criteria:
- Impact on patients: The primary purpose of the Hospital VBP program is to drive improvements in clinical quality, patient-centered care, and efficiency. Thus, consideration of the impact of the various models on quality improvement in patient care is paramount.
- Accuracy of comparisons made between hospitals: The Hospital VBP program should make fair comparisons between hospitals based on total performance scores that are affected predominantly or exclusively by the hospital's performance on the individual measures. However, differences in the TPS between hospitals may also be affected by differences in the scope of services offered, which would determine the mix of measures that comprise the TPS for each hospital. Thus, a critical aspect of developing and implementing the TPS is facilitating equivalent and accurate comparisons between hospitals.
- Rank Correlation Impact: In light of the fact that the value-based incentive payment amount will vary by hospital, based on the hospital's TPS, we must consider how each model will affect how hospitals rank in terms of their performance.
- Extent of variance across hospitals: In addition to accuracy, the second important property of a TPS is that it has sufficient variance to clearly differentiate between hospitals. The logic and purpose of the scoring is to discriminate among hospitals according to relative performance; hence, the TPS should capture meaningful variation and financial incentives should reflect that variation.
- Number of hospitals that receive a score from the Hospital VBP program: The models for calculating the total performance score use different criteria for hospitals' minimum cases per measure and measures per domain. Consequently, the number of hospitals scored will differ depending on the model used. Other things being equal, a greater number of hospitals receiving scores is preferable in our view.
We analyzed how well each of the scoring models discussed above meets these criteria by modeling hospital performance under each model, using data from 2007-2008 as the baseline period and 2008-2009 as the performance period. As discussed above, the primary difference between the Three-Domain Performance Scoring Model and the Six-Domain Performance Scoring Model is that the Six-Domain Performance Scoring Model creates intermediate scores at the topic level for the clinical process measures, so that six domains are scored (AMI, HF, PN, SCIP, outcomes, and patient experience) rather than three domains (clinical process of care, outcomes, and patient experience). The Six-Domain model provides an intermediate, condition-specific score for prevalent and/or high-cost conditions in the Medicare population that could provide a useful summary when a more complete set of measures becomes available for those conditions. However, in light of the current set of measures available for use in the Hospital VBP program, we believe that the intermediate scores by condition would convey a false sense of precision about the quality of care for that condition. For this reason, and because hospital total performance scores that we modeled under the Six-Domain Performance Scoring Model were not substantively different from those we modeled under the Three-Domain Performance Scoring Model, we chose to focus our continued analysis on the Three-Domain Performance Scoring Model and the ACM. We discuss the results of our analysis of the Three-Domain Performance Scoring Model and the ACM below.
The scoring of the clinical process of care and outcome domains in the Three-Domain Performance Scoring Model is based on the Performance Assessment Model presented in the 2007 Report to Congress, but includes and scores the outcome domain as a separate domain. We believe that because each measure is scored independently under the Three-Domain Performance Scoring Model, the model will provide useful information to hospitals on aspects of care that may require improvement. The Three-Domain Performance Scoring Model scores hospitals based on how they performed with respect to each opportunity to provide appropriate care as defined by the measures, in effect weighting hospital scores by service and patient mix. In contrast with the ACM, independent scoring provides opportunities for hospitals to receive credit for each measure for which they meet the performance standard. In addition, hospitals are scored on a curve at the measure level such that they only earn points when their performance on a measure is better than their peers' average performance during the baseline period, or better than their own previous performance, increasing the accuracy of comparisons made between hospitals. This aspect of the Three-Domain Performance Scoring Model differs from the ACM for several reasons: ACM scoring results in higher scores for hospitals that report only on "easier" measures (that is, measures for which performance is high for most hospitals); not every clinical process of care measure for each condition will apply to every hospital; and the ACM does not award points to hospitals that furnish most (but not all) recommended care with respect to a clinical process of care topic.
Furthermore, in the Three-Domain Performance Scoring Model, the scoring of the clinical process of care measures in a single clinical process of care domain is consistent with the current level of precision on the measures. We believe that given the current set of measures available for adoption into the Hospital VBP program at this time, the intermediate scores created at the condition or topic level under the ACM would convey a false sense of precision about the quality of care provided for that condition. There are efforts in the industry to derive sets of measures that capture many aspects of quality for a certain condition. The measures currently in the Hospital IQR program were not developed with that aim; rather, they were developed and implemented as the best single quality measures for various conditions treated in the hospital and, as such, serve better as a proxy for overall quality than as a precise accounting of quality for individual topics. In other words, the measures now available for the Hospital VBP program do not represent all of the processes that constitute best practices for treating the condition in the inpatient setting, but collectively capture an array of clinical processes that are valid indicators representative of the overall quality of care provided in the hospital inpatient setting.
We believe that the Three-Domain Performance Scoring Model and the ACM are similar in several ways. Rank correlations of hospitals' total performance scores based on the two models were extremely high (between 89 percent and 94 percent). With respect to total performance score rank, most hospitals remain in the same quintile regardless of which model is used; only 8 to 18 percent of hospitals changed in rank quintile due to model choice. In addition, the number of hospitals with a sufficient number of cases and measures for inclusion under the ACM criteria (that is, at least 25 patients in 2 conditions) is similar to the number of hospitals qualifying under the criteria that we are proposing below to use for the Three-Domain Performance Scoring Model (that is, at least 10 patients for 4 measures).
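As an illustration only, the following sketch shows one way such a model comparison could be reproduced: computing a rank correlation between the total performance scores produced by two models and the share of hospitals that change performance quintile. The scores are simulated, and the use of a Spearman rank correlation is an assumption for illustration; the rule does not specify the statistical procedure used.

```python
# Illustrative sketch (not part of the proposed rule): comparing two scoring
# models by rank correlation of total performance scores and by how many
# hospitals change performance quintile. All data below are simulated.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
tps_three_domain = rng.uniform(0, 1, size=500)                   # hypothetical TPS, model A
tps_acm = tps_three_domain * 0.9 + rng.normal(0, 0.05, 500)      # hypothetical TPS, model B

rho, _ = spearmanr(tps_three_domain, tps_acm)
print(f"rank correlation: {rho:.2f}")

def quintile(scores):
    # Assign each hospital to a performance quintile (0 = lowest, 4 = highest).
    edges = np.percentile(scores, [20, 40, 60, 80])
    return np.searchsorted(edges, scores)

changed = np.mean(quintile(tps_three_domain) != quintile(tps_acm))
print(f"share of hospitals changing quintile: {changed:.0%}")
```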
The ACM is considered to be "patient focused" rather than "opportunity focused." Since the unit of scoring is the patient encounter, and the hospital earns a clinical process of care domain score of zero for a patient if the hospital fails to provide one or more of the applicable processes covered by the measures in the applicable topic area, we believe that the hospital is likely to become aware of all of the processes the patient requires in order to treat the condition, rather than thinking in terms of individual opportunities. The ACM sets a high bar for quality improvement and sends a strong signal about complete mastery for each individual topic area ("all-or-none") at the patient level. On the other hand, we believe that for complex patients or patients for whom one or more processes are not needed, the ACM model may provide a disincentive to providing quality care. Due to its all-or-nothing scoring approach, the ACM loses patient information that would have some effect on the total performance score under the Three-Domain Performance Scoring Model, under which hospitals would receive credit for all of the measures for which they met the performance standard. Furthermore, as a result of all-or-nothing scoring, the ACM approach will capture whether a patient received appropriate care, but it does not describe the extent to which care was lacking.
With regard to the extent of variation between hospitals, in our analysis, hospital performance scores modeled under the ACM in general tended to be lower than scores modeled under the Three-Domain Performance Scoring Model. These lower scores would, in theory, allow more room for hospitals to improve in future years.
We will continue analyzing alternative performance scoring models, including the ACM, and may consider proposing to implement scoring models other than the Three-Domain Performance Scoring Model in the future. We solicit public comments on the proposed Three-Domain Performance Scoring Model as well as other potential performance scoring models.
9. Example of Applying the Three-Domain Performance Scoring Model to a Hospital and Calculating the Total Performance Score
To illustrate the application of the proposed Three-Domain Performance Scoring Model, we offer the following example:
For the performance period, Hospital E reports and receives raw scores on the measures as set forth in Table 5. (This example uses data from 2007 as the baseline period and 2009 as the performance period.)
Table 5—Examples of Hospital Raw Scores on Hospital VBP Performance Measures
| Domain | Condition | Measure name | Achievement threshold | Benchmark | Hospital baseline score | Hospital performance period score |
| --- | --- | --- | --- | --- | --- | --- |
| Clinical Process of Care | HF-1 | Discharge Instructions | 0.778 | 0.989 | 0.4 | 0.952 |
| | HF-2 | Evaluation of LVS Function | 0.957 | 1.0 | 0.353 | 0.727 |
| | PN-2 | Pneumococcal Vaccination | 0.844 | 0.985 | 0.357 | 0.583 |
| | PN-7 | Initial Antibiotic Received Within 6 Hours of Hospital Arrival | 0.949 | 1.0 | 0.846 | 1.0 |
| Patient Experience of Care | | HCAHPS Base Score † | | | | 60 |
| | | HCAHPS Consistency Score | | | | 9 |
† The HCAHPS base score is calculated by summing the higher of the achievement or improvement score for each of the 8 HCAHPS dimensions.
Table 6 below depicts the individual measure scores and total performance score Hospital E would receive after applying the proposed scoring methodology described above.
Table 6—Example of Hospital VBP Score Calculation
| Domain | Condition | Achievement points | Improvement points | Earned points (higher of achievement or improvement) | Domain score |
| --- | --- | --- | --- | --- | --- |
| Clinical Process of Care | HF-1 | 8 | 9 | 9 | 67.5 |
| | HF-2 | 0 | 5 | 5 | |
| | PN-2 | 0 | 3 | 3 | |
| | PN-7 | 10 | 10 | 10 | |
| Patient Experience of Care (HCAHPS) | HCAHPS Base Score | 60 | 40 | 60 † | 69 |
| | HCAHPS Consistency Score | | | 9 | |
| Total Performance Score | | | | | 0.6795 |
† HCAHPS earned points are calculated by summing the higher of achievement or improvement points across the 8 HCAHPS dimensions.
10. Request for Comments—Proposed FY 2013 Hospital Value-Based Purchasing Performance Score Methodology and Alternatives
As stated in Sections E(1) and E(2) of this proposed rule, we considered both statutorily mandated and additional factors when assessing the proposed FY 2013 Hospital Value-Based Purchasing program performance score methodology and the alternatives outlined in the previous sections. These additional factors include (1) simplicity and transparency of performance score methods to hospitals; (2) alignment of Hospital VBP performance score methodology with other CMS Value-Based Purchasing programs; (3) quantitative characteristics of the measures and hospital-level data; (4) the relative emphasis placed on achievement and improvement in a performance score methodology; (5) elimination of unintended consequences that reward inappropriate hospital behaviors or poor patient outcomes; and (6) use of the most currently available measure data to assess improvement in a performance score methodology.
We solicit comment on the merits and drawbacks of all of these factors as applied to our proposed performance score methodology and the alternatives described in this proposed rule. We are particularly interested in new or improved scoring methodology alternatives that may achieve our objectives in better, more straightforward, or more effective ways.
F. Applicability of the Value-Based Purchasing Program to Hospitals
Section 1886(o)(1)(C) of the Act specifies the applicability of the value-based purchasing program to hospitals. For purposes of the Hospital VBP program, the term "hospital" is defined under section 1886(o)(1)(C)(i) as a "subsection (d) hospital" (as defined in section 1886(d)(1)(B) of the Act). Section 1886(d)(1)(B) of the Act defines a "subsection (d) hospital" as a "hospital located in one of the fifty States or the District of Columbia." The term therefore does not include hospitals located in the territories or hospitals located in Puerto Rico. Section 1886(d)(9)(A) of the Act separately defines a "subsection (d) Puerto Rico hospital" as a hospital that is located in Puerto Rico and that "would be a subsection (d) hospital if it were located in one of the 50 states." Therefore, because 1886(o)(1)(C) does not refer to "subsection (d) Puerto Rico hospitals," the Hospital VBP program would not apply to hospitals located in Puerto Rico. The statutory definition of a subsection (d) hospital under section 1886(d)(1)(B), however, does include inpatient, acute care hospitals located in the State of Maryland. These hospitals are not currently paid under the IPPS in accordance with a special waiver provided by section 1814(b)(3) of the Act. Despite this waiver, the Maryland hospitals continue to meet the definition of a "subsection (d) hospital" because they are hospitals located in one of the 50 states. Therefore, we propose that the Hospital VBP program will apply to acute care hospitals located in the State of Maryland unless the Secretary exercises discretion pursuant to 1886(o)(1)(C)(iv), which states that "the Secretary may exempt such hospitals from the application of this subsection if the State which is paid under such section submits an annual report to the Secretary describing how a similar program in the State for a participating hospital or hospitals achieves or surpasses the measured results in terms of patient health outcomes and cost savings established under this subsection."
The statutory definition of a subsection (d) hospital also does not apply to hospitals and hospital units excluded from the IPPS under section 1886(d)(1)(B) of the Act, such as psychiatric, rehabilitation, long term care, children's, and cancer hospitals. In order to identify hospitals, we propose that, for purposes of this provision, we would adjust payments to hospitals as they are distinguished by provider number in hospital cost reports. We propose that payment adjustments for hospitals be calculated based on the provider number used for cost reporting purposes, which is the CMS Certification Number (CCN) of the main provider (also referred to as OSCAR number). Payments to hospitals are made to each provider of record.
Section 1886(o)(1)(C)(ii) sets forth a number of exclusions to the definition of the term "hospital." First, under section 1886(o)(1)(C)(ii)(I) a hospital is excluded if it is subject to the payment reduction under section 1886(b)(3)(B)(viii)(I) (the Hospital IQR program) for the fiscal year. Therefore, any hospital that is subject to the Hospital IQR payment reduction because it does not meet the requirements for the Hospital IQR program will be excluded from the Hospital VBP program for the fiscal year. We are concerned about the possibility of hospitals deciding to "opt out" of the Hospital VBP program by choosing not to submit data under the Hospital IQR program, thereby avoiding both the base operating DRG payment reduction and the possibility of receiving a value-based incentive payment, although we recognize that these hospitals would still be subject to the Hospital IQR program reduction to their annual payment increase for the fiscal year. We intend to track hospital participation in the Hospital IQR program and welcome public comment on this issue.
With respect to hospitals for which we have measure data from the performance period but no measure data from the baseline period (perhaps because these hospitals were either not open during the baseline period or otherwise did not participate in the Hospital IQR program during that period), we are proposing that these hospitals will still be included in the Hospital VBP program, but that they will be scored based only on achievement. We invite public comments on this approach and welcome input on scoring hospitals without baseline performance data using this and other approaches.
Under section 1886(o)(1)(C)(ii)(II), a hospital is excluded if it has been cited by the Secretary for deficiencies during the performance period that pose immediate jeopardy to the health or safety of patients. We are proposing to interpret this to mean that any hospital that is cited by the Centers for Medicare & Medicaid Services through the Medicare State Survey and Certification process for deficiencies during the proposed performance period (for purposes of the FY 2013 Hospital VBP program, July 1, 2011-March 31, 2012) that pose immediate jeopardy to patients will be excluded from the Hospital VBP program for the fiscal year. We are also proposing to use the definition of the term "immediate jeopardy" that appears in 42 CFR 489.3.
Section 1886(o)(1)(C)(ii)(III) requires the Secretary to exclude for the fiscal year hospitals that do not report a minimum number (as determined by the Secretary) of measures that apply to the hospital for the performance period for the fiscal year.
Section 1886(o)(1)(C)(ii)(IV) requires the Secretary to exclude for the fiscal year hospitals that do not report a minimum number (as determined by the Secretary) of cases for the measures that apply to the hospital for the performance period for the fiscal year.
In determining the minimum number of reported measures and cases under sections 1886(o)(1)(C)(ii)(III) and (IV), the Secretary must conduct an independent analysis of what minimum numbers would be appropriate. To fulfill this requirement, we commissioned Brandeis University to perform an independent analysis that examined technical issues concerning the minimum number of cases per measure and the minimum number of measures per hospital needed to derive reliable performance scores. This analysis examined hospital performance scores using data from 2007-2008 and 2008-2009. The researchers tested different minimum numbers of cases and measures and concluded that the most important factor in setting minimum thresholds for the Hospital VBP program is to determine a combination of thresholds that allows the maximum number of hospitals to be scored reliably. We note that such reliability depends on the combination of the two thresholds. For example, if we allowed the number of cases per measure to be small (for example, 5 cases), we might still have reliable overall scores if there were a sufficiently large number of measures.
The independent analysis indicated that a smaller number of cases would yield less reliable results for any given measure, ultimately affecting results when the measures were combined to create the domain scores. Because the proposed Hospital VBP scoring methodology aggregates information across all of the proposed measures, the analysis considered various thresholds for the minimum number of cases to include in a measure. We recognized that lowering the minimum number of cases required for each measure would allow a greater number of hospitals to participate in the Hospital VBP program. The analysis explored whether a lower threshold for each individual measure might be sufficient to make composite measures (that is, measures based on aggregations of individual measures) more statistically reliable.
Brandeis researchers checked the reliability of the total performance score for hospitals with only 4 measures. One approach was to randomly select 4, 6, 10, or 14 measures and to compare the reliabilities that are determined using these different sets of measures per hospital. The research found that using 4 randomly selected measures per hospital did not greatly reduce between-hospital reliability (particularly in terms of rank ordering) from what would have been determined using 10 or 14 measures. Examining hospitals with at least 10 cases for each measure, the analysis compared the reliability of clinical process measure scores for hospitals according to the number of such measures reported. Whisker plots and reliability scores revealed comparable levels of variation in the process scores for hospitals reporting even a small number of measures as long as the minimum of 10 cases per measure was met. Based on this analysis, we propose to establish the minimum number of cases required for each measure under the proposed Three-Domain Performance Scoring Model at 10, which we believe will allow us to include more hospitals in the Hospital VBP program.
When examining the minimum number of measures necessary to derive reliable performance scores, the independent analysis revealed that the distribution of performance scores varied depending on the number of measures reported per hospital. The whisker plots and reliability scores demonstrated a clear difference in the distribution of scores for hospitals reporting 4 or more measures compared with those reporting fewer than 4 measures.
We believe that setting the minimum number of measures and cases as low as is reasonable is an essential component of implementing the Hospital VBP program and will help to minimize the number of hospitals unable to participate due to not having the minimum number of cases for a measure, or the minimum number of measures. Therefore, as we stated above, we propose to exclude from hospitals' total performance score calculation any measures on which they report fewer than 10 cases. We also propose to exclude from the Hospital VBP program any hospitals to which fewer than 4 of the proposed measures apply.
We are also proposing that, for inclusion in the Hospital VBP program for FY 2013, hospitals must report a minimum of 100 HCAHPS surveys during the performance period. The reliability of HCAHPS scores was determined through statistical analyses conducted by RAND, the statistical consultant for HCAHPS. Based on these analyses, we believe that a reliability rate of 85 percent or higher is desired for HCAHPS to ensure that true hospital performance, rather than random “noise,” is measured. RAND's analysis indicates that HCAHPS data do not achieve an 85 percent reliability level across all eight HCAHPS dimensions with a sample of less than 100 completed surveys.
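For illustration only, the following sketch (not part of the proposed rule) applies the minimum-data proposals described above: at least 10 cases per clinical process measure, at least 4 applicable measures, and at least 100 completed HCAHPS surveys during the performance period. The measure identifiers and case counts in the example are hypothetical inputs.

```python
# Illustrative sketch (not part of the proposed rule): applying the proposed
# FY 2013 minimum-data requirements. Inputs below are hypothetical.
MIN_CASES_PER_MEASURE = 10   # proposed minimum cases per clinical process measure
MIN_MEASURES = 4             # proposed minimum number of applicable measures
MIN_HCAHPS_SURVEYS = 100     # proposed minimum completed HCAHPS surveys

def scoreable_measures(cases_by_measure):
    """Drop measures with fewer than the minimum number of cases."""
    return {m: n for m, n in cases_by_measure.items() if n >= MIN_CASES_PER_MEASURE}

def eligible_for_fy2013(cases_by_measure, hcahps_surveys):
    """Return True if the hospital can receive a total performance score."""
    usable = scoreable_measures(cases_by_measure)
    return len(usable) >= MIN_MEASURES and hcahps_surveys >= MIN_HCAHPS_SURVEYS

# Hypothetical hospital: five measures reported, one with too few cases.
cases = {"AMI-2": 40, "HF-1": 25, "PN-2": 9, "SCIP-Inf-1": 60, "SCIP-VTE-2": 15}
print(scoreable_measures(cases))                       # PN-2 excluded (9 < 10 cases)
print(eligible_for_fy2013(cases, hcahps_surveys=120))  # True: 4 measures and 120 surveys
```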
As proposed in this section and in section II. E. of this proposed rule, hospitals reporting insufficient data to receive a score on either the clinical process of care or HCAHPS domains will not receive a total performance score for the FY 2013 Hospital VBP program.
We solicit public comments on our proposals regarding the minimum numbers of cases and measures necessary for hospitals' inclusion in the Hospital VBP program. We note that hospitals excluded from the Hospital VBP program will be exempt from the base operating DRG payment reduction required under section 1886(o)(7) as well as the possibility for value-based incentive payments.
G. The Exchange Function
Section 1886(o)(6) of the Act governs the calculation of value-based incentive payments under the Hospital VBP program. Specifically, section 1886(o)(6)(A) requires that in the case of a hospital that meets or exceeds the performance standards for the performance period for a fiscal year, the Secretary shall increase the base operating DRG payment amount (as defined in section 1886(o)(7)(D)), as determined after application of a payment adjustment described in section 1886(o)(7)(B)(i), for a hospital for each discharge occurring in the fiscal year by the value-based incentive payment amount. Section 1886(o)(6)(B) defines the value-based incentive payment amount for each discharge in a fiscal year as the product of (1) the base operating DRG payment amount for the discharge for the hospital for such fiscal year, and (2) the value-based incentive payment percentage for the hospital for such fiscal year. Section 1886(o)(6)(C)(i) provides that the Secretary must specify a value-based incentive payment percentage for each hospital for a fiscal year, and section 1886(o)(6)(C)(ii) provides that in specifying the value-based incentive payment percentage, the Secretary must ensure (1) that the percentage is based on the hospital's performance score, and (2) that the total amount of value-based incentive payments to all hospitals in a fiscal year is equal to the total amount available for value-based incentive payments for such fiscal year under section 1886(o)(7)(A), as specified by the Secretary.
Section 1886(o)(7) of the Act describes how the value-based incentive payments are to be funded. Under section 1886(o)(7)(A), the total amount available for value-based incentive payments for all hospitals for a fiscal year must be equal to the total amount of reduced payments for all hospitals under section 1886(o)(7)(B), as estimated by the Secretary. Section 1886(o)(7)(B)(i) requires the Secretary to adjust the base operating DRG payment amount for each hospital for each discharge in a fiscal year by an amount equal to the applicable percent of the base operating DRG payment amount for the discharge for the hospital for such fiscal year, and further requires that the Secretary make these reductions for all hospitals in the fiscal year involved, regardless of whether or not the hospital has been determined to have earned a value-based incentive payment for the fiscal year. With respect to fiscal year 2013, the term “applicable percent” is defined as 1.0 percent, but the amount gradually rises to 2 percent by FY 2017 (section 1886(o)(7)(C)).
The 2007 Report to Congress introduced the exchange function as the means to translate a hospital's total performance score into the percentage of the value-based incentive payment earned by the hospital. We believe that the selection of the exact form and slope of the exchange function is of critical importance to how the incentive payments reward performance and encourage hospitals to improve the quality of care they provide.
As illustrated in Figure 7, we considered four mathematical exchange function options: Straight line (linear); concave curve (cube root function); convex curve (cube function); and S-shape (logistic function).
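For illustration only, the following sketch (not part of the proposed rule) shows the general shape of each of the four exchange-function options, mapping a 0-1 total performance score to a relative payment factor before any budget-neutrality scaling. The logistic curve's midpoint and steepness parameters are assumptions made solely for this illustration.

```python
# Illustrative sketch (not part of the proposed rule): the four exchange-function
# shapes considered. Logistic parameters are assumed for illustration only.
import math

def linear(tps):            # straight line
    return tps

def cube_root(tps):         # concave curve
    return tps ** (1.0 / 3.0)

def cube(tps):              # convex curve
    return tps ** 3

def logistic(tps, midpoint=0.5, steepness=10.0):  # S-shape; parameters assumed
    return 1.0 / (1.0 + math.exp(-steepness * (tps - midpoint)))

for score in (0.2, 0.5, 0.8):
    print(score, round(linear(score), 3), round(cube_root(score), 3),
          round(cube(score), 3), round(logistic(score), 3))
```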
In determining which of these exchange functions would be most appropriate for translating a hospital's TPS into a value-based incentive payment percentage, we carefully considered four aspects of each option.
First, we considered how each option would distribute the value-based incentive payments among hospitals. Under section 1886(o)(7)(A) of the Act, the total amount available for value-based incentive payments for all hospitals for a fiscal year must be equal to the total amount of reduced payments for all hospitals for such fiscal year, as estimated by the Secretary. We interpret this section to mean that the redistribution of a portion of the IPPS payment to all hospitals under the Hospital VBP program must be accomplished in a way that is estimated to be budget neutral, without increasing or decreasing the aggregate overall IPPS payments made to the hospitals. As a result, if we award higher value-based incentive payments to higher performing hospitals, less money is available to make value-based incentive payments to lower performing hospitals. The reverse is also true. If we give higher value-based incentive payments to lower performing hospitals, less money is available to reward higher performing hospitals. The form and slope of each exchange function also affect the level of value-based incentive payments available to hospitals at various performance levels. Under both the cube and logistic functions, lower incentive payments are available to lower performing hospitals and aggressively higher payments are available for higher performing hospitals. These functions therefore distribute more incentive payments to higher performing hospitals. Under the cube root function, payments stay at relatively lower levels for higher performing hospitals; this function distributes more incentive payments to lower performing hospitals. The linear function moves more aggressively to higher levels for higher performing hospitals than the cube root function, but not as aggressively as the logistic and cube functions. It therefore distributes more incentive payments to higher performing hospitals than the cube root function, but not as aggressively as the logistic and cube functions.
Second, we considered the potential differences between the value-based incentive payment amounts for hospitals that do poorly and hospitals that do very well. Because the cube root function distributes lower payment amounts to higher performing hospitals, it creates the narrowest distribution of incentive payments across hospitals. The linear function is next, followed by the logistic function. The cube function, which most aggressively moves to higher payment levels for higher performing hospitals, creates the widest distribution.
Third, we considered the different marginal incentives created by the different exchange function shapes. In the case of the linear shape, the marginal incentive does not vary for higher or lower performing hospitals. The slope of the linear function is constant, so any hospital with a TPS that is 0.1 higher than another hospital would receive the same increase in its value-based incentive payment across the entire TPS range. For the other shapes, the slope of the exchange function creates a higher or lower marginal incentive for higher or lower performing hospitals. Steeper slopes at any given point on the function indicate greater marginal incentives for hospitals to improve scores and obtain higher payments at that point, while flatter slopes indicate smaller marginal incentives. If the slope is steeper at the low end of performance scores than at the high end, as with the cube root function, hospitals at the low end have a higher marginal incentive to improve than hospitals at the high end. If the slope is steeper at the high end, as with the cube function, hospitals have a higher marginal incentive to improve at the high end than they do at the low end.
Fourth, we weighed the relative importance of having the exchange function be as simple and straightforward as possible.
Taking all of these factors into account, we propose to adopt a linear exchange function for the purpose of calculating the percentage of the value-based incentive payment earned by each hospital under the Hospital VBP program. The linear function is the simplest and most straightforward of the mathematical exchange functions discussed above. The linear function provides all hospitals the same marginal incentive to continually improve. The linear function more aggressively rewards higher performing hospitals than the cube root function, but not as aggressively as the logistic and cube functions. We propose to set the function's intercept at zero, meaning that hospitals with total performance scores of zero will not receive any incentive payment. Payment for each hospital with a score above zero will be determined by the slope of the linear exchange function, which will be set to meet the budget neutrality requirement of section 1886(o)(6)(C)(ii)(II) that the total amount of value-based incentive payments equal the estimated amount available under section 1886(o)(7)(A). In other words, we will set the slope of the linear exchange function for FY 2013 so that the estimated aggregate value-based incentive payments for FY 2013 are equal to 1 percent of the estimated aggregate base operating DRG payment amounts for FY 2013. Analogous estimates will be made for subsequent fiscal years.
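For illustration only, the following is a minimal sketch (not part of the proposed rule, and not the actual payment calculation) of how a budget-neutral slope for a zero-intercept linear exchange function could be derived, given each hospital's total performance score and base operating DRG payments. The hospital scores and dollar amounts are hypothetical.

```python
# Illustrative sketch (not part of the proposed rule): setting the slope of a
# linear exchange function (intercept at zero) so that estimated aggregate
# value-based incentive payments equal the estimated amount withheld
# (1.0 percent of aggregate base operating DRG payments for FY 2013).
# Hospital data below are hypothetical.

APPLICABLE_PERCENT = 0.01   # FY 2013 applicable percent

def budget_neutral_slope(hospitals):
    """hospitals: list of (total_performance_score, base_operating_drg_payments)."""
    total_pool = APPLICABLE_PERCENT * sum(base for _, base in hospitals)
    weighted_scores = sum(tps * base for tps, base in hospitals)
    return total_pool / weighted_scores

hospitals = [(0.6795, 25_000_000), (0.45, 40_000_000), (0.82, 15_000_000)]
slope = budget_neutral_slope(hospitals)
for tps, base in hospitals:
    incentive_pct = slope * tps   # value-based incentive payment percentage
    print(f"TPS {tps:.4f}: incentive payment = {incentive_pct:.4%} of base DRG payments")

# Check budget neutrality: total incentives equal the 1 percent withheld.
assert abs(sum(slope * tps * base for tps, base in hospitals)
           - APPLICABLE_PERCENT * sum(base for _, base in hospitals)) < 1e-6
```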
We believe that our proposed linear exchange function ensures that all hospitals have strong incentives to continually improve the quality of care they provide to their patients. We may revisit the issue of the most appropriate exchange function in future rulemaking as we gain more experience under the Hospital VBP program. We solicit public comments on our proposed exchange function and the resulting distribution of value-based incentive payments.
We note that, in order to evaluate the different exchange functions, we needed to estimate the value-based incentive payment amount. As noted previously, section 1886(o)(6)(B) of the Act defines the value-based incentive payment amount as equal to the product of the base operating DRG payment amount for each discharge for the hospital for the fiscal year and the value-based incentive payment percentage specified by the Secretary for the hospital for the fiscal year. Section 1886(o)(7)(D)(i) defines the base operating DRG payment with respect to a hospital for a fiscal year as, unless certain special rules apply, “the payment amount that would otherwise be made under subsection (d) (determined without regard to subsection (q)) for a discharge if [subsection (o)] did not apply; reduced by any portion of such payment amount that is attributable to payments under paragraphs (5)(A), (5)(B), (5)(F) and (12) of subsection (d); and such other payments under subsection (d) determined appropriate by the Secretary.” Therefore, for estimation purposes, to calculate base operating DRG payments, we estimated the total payments using Medicare Part A claims data and subtracted from this number the estimates of payments made as outlier payments (authorized under section 1886(d)(5)(A)), indirect medical education payments (authorized under section 1886(d)(5)(B)), disproportionate share hospital payments (authorized under section 1886(d)(5)(F)), and low-volume hospital adjustment payments (authorized under section 1886(d)(12)). We note that this approximation of base operating DRG payments made for the purpose of estimating the value-based payment amount to evaluate the different exchange functions is not a policy proposal. We will propose a definition of the term “base operating DRG payment amount” under section 1886(o)(7)(D), as well as how we would implement the special rules for certain hospitals described in section 1886(o)(7)(D)(ii), in future rulemaking. We invite public comment to inform our intended future policymaking on this issue.
Furthermore, section 1886(o)(7)(A) states that the total amount available for value-based incentive payments for all hospitals for a fiscal year shall be equal to the total amount of reduced payments for all hospitals for such fiscal year. To calculate the total amount of reduced payments, section 1886(o)(7)(B) states that the base operating DRG payment amount shall be reduced by an applicable percent as defined under section 1886(o)(7)(C). This applicable percent is 1.0 percent for FY 2013, 1.25 percent for FY 2014, 1.5 percent for FY 2015, 1.75 percent for FY 2016, and 2 percent for FY 2017 and subsequent years. To estimate the value-based incentive payment amount for the purposes of evaluating the different exchange functions, we used the FY 2013 applicable percent of 1.0 percent. We multiplied an estimate (described above) of the total aggregate base operating DRG payments for hospitals as defined under 1886(o)(1)(C) by 1.0 percent in order to derive the total amount available for value-based incentive payments that was used in the evaluation of the four exchange functions.
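For illustration only, the following sketch (not part of the proposed rule) walks through the arithmetic of the estimation approach described above; every dollar figure shown is hypothetical and is not an estimate made for this rulemaking.

```python
# Illustrative sketch (not part of the proposed rule): approximating base operating
# DRG payments and the total amount available for value-based incentive payments.
# All dollar figures below are hypothetical.
total_part_a_payments = 110_000_000_000   # estimated from Medicare Part A claims data
outlier_payments      =   5_000_000_000   # section 1886(d)(5)(A)
ime_payments          =   6_500_000_000   # section 1886(d)(5)(B)
dsh_payments          =  10_000_000_000   # section 1886(d)(5)(F)
low_volume_payments   =     500_000_000   # section 1886(d)(12)

estimated_base_operating_drg = (total_part_a_payments - outlier_payments
                                - ime_payments - dsh_payments - low_volume_payments)
fy2013_applicable_percent = 0.01
incentive_pool = fy2013_applicable_percent * estimated_base_operating_drg
print(f"estimated incentive pool: ${incentive_pool:,.0f}")   # $880,000,000
```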
H. Proposed Hospital Notification and Review Procedures
Section 1886(o)(8) requires the Secretary to inform each hospital of the adjustments to payments to the hospital for discharges occurring in a fiscal year as a result of the calculation of the value-based incentive payment amount (section 1886(o)(6)) and the reduction of the base operating diagnosis-related group (DRG) payment amount (section 1886(o)(7)(B)(i)), not later than 60 days prior to the fiscal year involved. We propose to notify hospitals of the 1 percent reduction to their FY 2013 base operating DRG payments for each discharge in the FY 2013 IPPS rule, which will be finalized at least 60 days prior to the beginning of the 2013 fiscal year. We expect to propose to incorporate this reduction into our claims processing system in January 2013, which will allow the 1 percent reduction to be applied to the FY 2013 discharges, including those that have occurred beginning on October 1, 2012. We will address the operational aspects of the reduction as part of the FY 2013 IPPS rule.
Because the proposed performance period would end only six months prior to the beginning of FY 2013, CMS will not know each hospital's exact total performance score or final value-based incentive payment adjustment 60 days prior to the start of the 2013 fiscal year on October 1, 2012. Therefore, we propose to inform each hospital through its QualityNet account at least 60 days prior to October 1, 2012 of the estimated amount of its value-based incentive payment for FY 2013 discharges based on estimated performance scoring and value-based incentive payment amounts, which will be derived from the most recently available data. We also propose that each hospital participating in the Hospital VBP program establish a QualityNet account if it does not already have one for purposes of the Hospital IQR program. We further propose to notify each hospital of the exact amount of its value-based incentive payment adjustment for FY 2013 discharges on November 1, 2012. The value-based incentive payment adjustment would be incorporated into our claims processing system in January 2013, which will allow the value-based incentive payment adjustment to be applied to the FY 2013 discharges, including those that have occurred beginning on October 1, 2012.
Section 1886(o)(10)(A)(i) of the Act requires the Secretary to make information available to the public regarding individual hospital performance in the Hospital VBP program, including: (1) Hospital performance on each measure that applies to the hospital; (2) the performance of the hospital with respect to each condition or procedure; and (3) the total hospital performance score. To meet this requirement, we propose to publish hospital scores with respect to each measure, each hospital's condition-specific score (that is, the performance score with respect to each condition or procedure, for example, AMI, HF, PN, SCIP, HAI), each hospital's domain-specific score, and each hospital's total performance score on the Hospital Compare website. We note that we are not proposing to use a hospital's condition-specific score for purposes of calculating its total performance score under the proposed Three-Domain Performance Scoring Model.
Section 1886(o)(10)(A)(ii) requires the Secretary to ensure that each hospital has the opportunity to review and submit corrections related to the information to be made public with respect to the hospital under section 1886(o)(10)(A)(i) prior to such information being made public. As stated above, we propose to derive the Hospital VBP measures data directly from measures data submitted by each hospital under the Hospital IQR program. We propose that the procedures we adopt for the Hospital IQR program will also be the procedures that hospitals must follow in terms of reviewing and submitting corrections related to the information to be made public under section 1886(o)(10).
With respect to the FY 2013 Hospital VBP program, we propose to make each hospital's Hospital VBP performance measure score, condition-specific score, domain-specific score, and total performance score available on the hospital's QualityNet account on November 1, 2012. We propose to remind each hospital via the hospital's secure QualityNet account of the availability of its performance information under the Hospital VBP program on this date. Pursuant to section 1886(o)(10)(A)(ii), we propose to provide hospitals with 30 calendar days to review and submit corrections related to their performance measure scores, condition-specific scores, domain-specific scores and total performance score.
Section 1886(o)(10)(B) requires the Secretary to periodically post on the Hospital Compare website aggregate information on the Hospital VBP program, including: (1) The number of hospitals receiving value-based incentive payments under the program as well as the range and total amount of such value-based incentive payments; and (2) the number of hospitals receiving less than the maximum value-based incentive payment available for the fiscal year involved and the range and amount of such payments. We propose to post aggregate Hospital VBP information on the Hospital Compare website in accordance with Section 1886(o)(10)(B). We will provide further details on reporting aggregated information in the future.
I. Proposed Reconsideration and Appeal Procedures
Section 1886(o)(11)(A) of the Act requires the Secretary to establish a process by which hospitals may appeal the calculation of a hospital's performance assessment with respect to the performance standards (section 1886(o)(3)(A)) and the hospital performance score (section 1886(o)(5)). Under section 1886(o)(11)(B), there is no administrative or judicial review under section 1869, section 1878, or otherwise of the following: (1) The methodology used to determine the amount of the value-based incentive payment under section 1886(o)(6) and the determination of such amount; (2) the determination of the amount of funding available for the value-based incentive payments under section 1886(o)(7)(A) and payment reduction under section 1886(o)(7)(B)(i); (3) the establishment of the performance standards under section 1886(o)(3) and the performance period under section 1886(o)(4); (4) the measures specified under section 1886(b)(3)(B)(viii) and the measures selected under section 1886(o)(2); (5) the methodology developed under section 1886(o)(5) that is used to calculate hospital performance scores and the calculation of such scores; or (6) the validation methodology specified in section 1886(b)(3)(B)(viii)(XI).
We will propose an appeals process under section 1886(o)(11) in future rulemaking. We invite public comment, in general, on the structure and procedure of an appropriate appeals process. Specifically, CMS seeks comment on the appropriateness of a process that would establish an agency-level appeals process under which CMS personnel having appropriate expertise in the Hospital VBP program would decide the appeal. We seek insight on what qualifications such personnel should hold. Further, we invite comment on how the appeals process should be structured. Finally, we seek public input on the timeframe in which these appeals should be resolved.
J. Proposed FY 2013 Validation Requirements for Hospital Value-Based Purchasing
In the FY 2011 IPPS final rule (75 FR 50227 through 50229), we adopted a validation process for the FY 2013 Hospital IQR program. We propose that this validation process will also apply to the FY 2013 Hospital VBP program. We believe that using this process for both the Hospital IQR program and the Hospital VBP program is beneficial for both hospitals and CMS because no additional burden will be placed on hospitals to separately return requested medical records for the Hospital VBP program. Because the measure data we are using for the Hospital VBP program is the same as the data we collect for the Hospital IQR program, we believe that we can ensure that the Hospital VBP program measure data are accurate through the Hospital IQR program validation process.
In future rulemaking related to the Hospital IQR program, we will consider proposing refinements to our annual Hospital IQR validation sample selection, targeting, and annual validation period for enhanced alignment and use in the Hospital VBP program. We seek to reduce hospital burden and ensure that the information we collect for both the Hospital IQR program and the Hospital VBP program is accurate.
K. Additional Information
1. Monitoring and Evaluation
As part of our ongoing effort to ensure that Medicare beneficiaries receive high-quality inpatient care, CMS plans to monitor and evaluate the new Hospital VBP program. Monitoring will focus on whether, following implementation of the Hospital VBP program, we observe changes in access to and the quality of care furnished to beneficiaries, especially within vulnerable populations. We will also evaluate the effects of the new Hospital VBP program in areas such as:
- Access to care for beneficiaries, including categories or subgroups of beneficiaries.
- Changes in care practices that might adversely impact the quality of care furnished to beneficiaries.
- Patterns of care suggesting particular effects of the Hospital VBP program (such as changes in the percentage of patients receiving appropriate care for conditions covered by the measures, or changes in the rate of hospital-acquired conditions).
- Best practices of high-performing hospitals that might be adopted by other hospitals.
We currently collect data on readmission rates for beneficiaries diagnosed with myocardial infarction, heart failure, and pneumonia. We also collect chart abstracted data on a variety of quality of care indicators related to myocardial infarction, heart failure, pneumonia, and surgical care improvement. These sources and other available data will provide the basis for early examination of trends in care delivery, access, and quality. Assessment of the early experience with the Hospital VBP program will allow us to create an active learning system, building the evidence base essential for guiding the design of future Hospital VBP programs and enabling CMS to address any disruptions in access or quality that may arise. These ongoing monitoring and evaluation efforts will be part of CMS's larger efforts to promote improvements in quality and efficiency, both within CMS and between CMS and hospitals in the Hospital VBP program. We welcome public comments regarding approaches to monitoring and evaluating the Hospital VBP program.
2. Electronic Health Records (EHRs)
a. Background
Starting with the FY 2006 IPPS final rule, we have encouraged hospitals to take steps toward the adoption of EHRs (also referred to in previous rulemaking documents as electronic medical records) that will allow for reporting of clinical quality data from the EHRs directly to a CMS data repository (70 FR 47420 through 47421). We encouraged hospitals that are implementing, upgrading, or developing EHR systems to ensure that the technology obtained, upgraded, or developed conforms to standards adopted by HHS. We suggested that hospitals also take due care and diligence to ensure that the EHR systems accurately capture quality data and that, ideally, such systems provide point of care decision support that promotes optimal levels of clinical performance.
As we have done in the past, we also continue to work with standard-setting organizations and other entities to explore processes through which EHRs could speed the collection of data and minimize the resources necessary for quality reporting.
We note that we have initiated work directed toward enabling EHR submission of quality measures through EHR standards development and adoption. We have sponsored the creation of electronic specifications for quality measures for the hospital inpatient setting, and will also work toward electronically specifying measures selected for the Hospital IQR program and the Hospital VBP program.
b. HITECH Act EHR Provisions
The HITECH Act (Title IV of Division B of the ARRA, together with Title XIII of Division A of the ARRA) authorizes payment incentives under Medicare for the adoption and use of certified EHR technology beginning in FY 2011. Hospitals are eligible for these payment incentives if they meet requirements for meaningful use of certified EHR technology, which include reporting on quality measures using certified EHR technology. With respect to the selection of quality measures for this purpose, under section 1886(n)(3)(A)(ii) of the Act, as added by section 4102 of the HITECH Act, the Secretary shall select measures, including clinical quality measures, that hospitals must provide to CMS in order to be eligible for the EHR incentive payments. With respect to the clinical quality measures, section 1886(n)(3)(B)(i) of the Act requires the Secretary to give preference to those clinical quality measures that have been selected for the Hospital IQR program under section 1886(b)(3)(B)(viii) of the Act or that have been endorsed by the entity with a contract with the Secretary under section 1890(a) of the Act. Any clinical quality measures selected for the HITECH incentive program for eligible hospitals must be proposed for public comment prior to their selection, except in the case of measures previously selected for the Hospital IQR program under section 1886(b)(3)(B)(viii) of the Act.
Thus, the Hospital IQR program and Hospital VBP program have important areas of overlap and synergy with respect to the reporting of quality measures under the HITECH Act using EHRs. We believe the financial incentives under the HITECH Act for the adoption and meaningful use of certified EHR technology by hospitals will encourage the adoption and use of certified EHRs for the reporting of clinical quality measures under the Hospital IQR program, which are subsequently used for the Hospital VBP program.
We note that the provisions in this proposed rule do not implicate or implement any HITECH statutory provisions. Those provisions are the subject of separate rulemaking and public comment.
L. QIO Quality Data Access
The mission of the Quality Improvement Organization (QIO) Program, as authorized under section 1862(g) and Part B of title XI of the Act, is to promote the effectiveness, efficiency, economy, and quality of services delivered to Medicare beneficiaries. We contract with one organization in each state, as well as the District of Columbia, Puerto Rico, and the U.S. Virgin Islands, to serve as that state/jurisdiction's QIO. QIOs are private, usually not-for-profit organizations, which are staffed mostly by doctors and other health care professionals. These professionals are trained to review medical care and help beneficiaries with complaints about the quality of care and to implement improvements in the quality of care available throughout the spectrum of care. Over time, QIOs have been instrumental in advancing national efforts that motivate providers to improve the quality of Medicare services, and in measuring and improving outcomes of quality.
Data collected by QIOs to accomplish their mission represent an important tool for CMS in our efforts to improve quality. QIOs collect survey, administrative, and medical records data in order to monitor and assess provider performance. The confidentiality and disclosure requirements associated with QIO information are set forth in section 1160 of the Act. In particular, this section stipulates that QIOs are not Federal agencies for purposes of the Freedom of Information Act and specifies that “any data or information acquired by [a QIO] in the exercise of its duties and functions shall be held in confidence and shall not be disclosed to any person.” The section then authorizes certain exceptions that allow disclosures, including the authority of the Secretary to prescribe additional exceptions “in such cases and under such circumstances as the Secretary shall by regulations provide * * * .” Implementing regulations governing the QIO confidentiality and disclosure requirements were issued in 1985 (see 50 FR 15347, April 17, 1985). In accordance with section 1881(c)(8) of the Act, section 1160 and its confidentiality and disclosure requirements also apply to End Stage Renal Disease Networks.
A key aspect of these regulations is the significant restriction placed on a QIO's ability to disclose QIO information, in particular information related to a Quality Review Study (QRS). A QRS is defined in § 480.101(b) as “an assessment, conducted by or for a QIO, of a patient care problem for the purpose of improving patient care through peer analysis, intervention, resolution of the problem and follow-up.” QIOs are instrumental in collecting, maintaining, and processing certain data associated with the Hospital Inpatient Quality Reporting Program. Such data is considered to be QRS data. As such, these data are subject to the increased restrictions placed on disclosures of QRS information set forth in § 480.140 of the QIO regulations. Section 480.140 even places stringent restrictions on a QIO's ability to disclose to CMS. While the QIO regulations have gone largely unchanged since 1985, the regulations were recently updated to account for CMS' expanded role in quality reporting. Specifically, § 480.140 was amended to add a new subparagraph (g), which ensures that CMS has access to QRS information collected as part of the Hospital Inpatient Quality Reporting Program, following hospital review of the data. However, CMS's access is restricted to the sole purpose of conducting certain activities related to MA organizations, as described in § 422.153. See 75 FR 19678, 19759 (April 15, 2010). CMS continues to be limited in other areas of quality reporting based on the current regulatory restrictions.
In fact, many of the same regulatory restrictions that impact CMS' ability to properly coordinate quality reporting have also impacted CMS' ability to oversee and plan other QIO program activities and Departmental initiatives. As previously noted, the QIO regulations were originally issued in 1985. Although these regulations have not undergone significant change, there have been significant changes both within and outside the QIO program directly impacting the way the QIOs and CMS conduct business. In 1985, computers were still in their infancy, and QIO review activities were primarily conducted onsite at the provider's and/or practitioner's place of business. Similarly, CMS' oversight responsibilities were conducted onsite at the QIOs' offices. The QIO program regulations were written based on this reality. Additionally, the original restrictions were designed to enhance provider and practitioner participation in the QRS process, and in fact, were considered necessary to obtain the frank and open communication needed to improve the quality of health care.
Since 1985, however, we have seen enormous technological advances, including improvements in the ability to electronically exchange large amounts of data safely and securely through the internet. Moreover, several laws, most notably the Health Insurance Portability and Accountability Act (HIPAA) and the Federal Information Security Management Act (FISMA), have been established to protect sensitive information. In addition, despite the QIOs' continued focus on information obtained directly from providers and practitioners, QIOs also obtain a large amount of CMS claims data electronically to complete their review activities. During this same time period, the QIO program has expanded and now includes more emphasis on quality reporting and additional responsibilities, for example, a broader range of beneficiary appeals of provider discharges. In turn, CMS' responsibilities have also been broadened both in terms of programmatic responsibilities, for example, quality reporting, and its contractor oversight responsibilities. Moreover, there are various initiatives designed to ensure transparency of our programs, as well as the operations of individual providers and practitioners. We have also identified several unintended consequences resulting from these regulatory restrictions, which need to be addressed to ensure better management of the QIOs. This includes improvements related to CMS' oversight of QIO physician reviewers.
In light of the above, we are proposing several changes to the QIO regulations. We are amending the definition of the QIO review system in § 480.101(b) to include CMS. The QIO review system currently consists of the QIO and the organizations and individuals who either assist the QIO or are directly responsible for providing care or for making review determinations with respect to that care. Particularly in the area of quality reporting, there is a need for increased coordination between CMS and the QIOs, which includes exchanges of data so that CMS can better manage and respond to new information.
We are also modifying § 480.130 to clarify the Department's general right to access non-QRS confidential information. We have made it clear that this provision extends to Departmental components, including CMS as well as the Centers for Disease Control and Prevention, including disclosures related to data exchanges associated with the National Health Care Safety Network. Additionally, we are modifying § 480.139(a) to remove limitations on CMS' access to information regarding the QIO's internal deliberations (as defined in § 480.101(b)). The current regulation authorizes CMS' access to information in “deliberations,” but limits that access to onsite access “at the QIO office or at a subcontracted organization.” This limitation is unrealistic in light of today's technologically advanced business environment.
For the same reasons, we have modified § 480.140 to eliminate the onsite restriction on CMS' access to QRS data. In addition to the reasoning we have presented above, we considered this change necessary in order to create a more consistent approach to how and when we could gain access to QRS information. In our recent addition of subparagraph (g) to § 480.140, the “onsite” limitation was removed only in the context of MA organizations. We now see no reason to confine this change to such a narrow purpose. As a general matter, CMS must have access to QRS information not only for quality reporting purposes but also to ensure proper oversight and management of the QIOs. This includes access for the evaluation of specific contractor performance issues and for the long-term planning of the QIO program. In addition, the current state of technology, the use of electronic exchanges of data and information, and the speed at which data must be exchanged to ensure accomplishment of our work, warrant the elimination of the restriction that data can only be accessed “onsite” at the QIO. We also considered the fact that the current “onsite” limitation does not establish realistic limits on the use of data CMS views onsite. While actual copies of materials cannot be removed from an onsite location, it is unlikely that the “onsite” restriction adequately prevents CMS from “taking away” information it has learned while viewing that information. Thus, the change presents a more realistic approach to access in light of today's environment. It will enable CMS to operate more efficiently, and account for the current information exchange methodologies used throughout the world. In fact, we are asking for comments regarding whether the “onsite” restriction should be eliminated entirely from subparagraph (a) of section 480.140. In order to reflect the specific changes we are now proposing in section 480.140, we are making corresponding changes in § 422.153 to ensure consistency between the two provisions.
In general, the changes will not only enable CMS to better monitor its programs and contractors, but will also help to ensure that CMS has access to information in a timely manner to account for any unintended consequences to patient care resulting from its programs. This increased access to QIO information is vital to achieving CMS' goal of developing a performance-based incentive payment program that rewards providers for high-quality care. Access to this data will enhance CMS' efforts to create a Hospital VBP program based on quality of care. The changes will also facilitate CMS' effort to improve coordination with its contractors. Moreover, CMS will be positioned to better leverage opportunities to improve the quality of health care and to oversee its contractor activities with less cost, including costs associated with travel.
In addition to the proposed changes, we are also asking for comments regarding the disclosure of QIO information to researchers. Historically, QIOs have not disclosed confidential QIO information to researchers. However, we recognize the value that research can offer in improving the quality of health care, and researchers frequently contact QIO program representatives to gain access to QIO information. Thus, we are requesting comments on whether researchers should be allowed access to QIO information. This includes access to confidential information associated with quality review studies. Moreover, we are requesting comments on the process that should be used to evaluate these requests, for example, enabling QIOs to independently assess such requests or using the current CMS Privacy Board structure. Insight regarding criteria to be used in evaluating these requests should also be provided.
III. Collection of Information Requirements
This document does not impose information collection and recordkeeping requirements. Consequently, it need not be reviewed by the Office of Management and Budget under the authority of the Paperwork Reduction Act of 1995.
IV. Response to Comments
Because of the large number of public comments we normally receive on Federal Register documents, we are not able to acknowledge or respond to them individually. We will consider all comments we receive by the date and time specified in the DATES section of this preamble, and, when we proceed with a subsequent document, we will respond to the comments in the preamble to that document.
V. Regulatory Impact Statement
A. Statement of Need
The objectives of the Hospital VBP program include transforming how Medicare pays for care and encouraging hospitals to continually improve the quality of care they provide. In accordance with section 1886(o) of the Act, we have proposed to accomplish these goals by providing incentive payments based on hospital performance on quality measures. This proposed rule was developed based on extensive research we conducted on hospital value-based purchasing, some of which formed the basis of the 2007 Report to Congress, as well as extensive stakeholder and public input. The proposed approach reflects the statutory requirements and the intent of Congress to promote increased quality of hospital care for Medicare beneficiaries by aligning a portion of hospital payments with performance.
B. Overall Impact
We have examined the impact of this rule as required by Executive Order 12866 on Regulatory Planning and Review (September 30, 1993), the Regulatory Flexibility Act (RFA) (September 19, 1980, Pub. L. 96-354), section 1102(b) of the Social Security Act, section 202 of the Unfunded Mandates Reform Act of 1995 (March 22, 1995; Pub. L. 104-4), Executive Order 13132 on Federalism (August 4, 1999) and the Congressional Review Act (5 U.S.C. 804(2)).
Executive Order 12866 directs agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects, distributive impacts, and equity). A regulatory impact analysis (RIA) must be prepared for major rules with economically significant effects ($100 million or more in any 1 year). Section 1886(o)(7) of the Act governs the funding for the value-based incentive payments: beginning in fiscal year 2013 and in each succeeding fiscal year, the Secretary must reduce the base operating DRG payment amount for a hospital for each discharge in the fiscal year by an amount equal to the applicable percent of the base operating DRG payment amount for the discharge for the hospital for such fiscal year. We anticipate defining the term “base operating DRG amount” in future rulemaking. For purposes of this proposed rule, we have limited our analysis of the economic impacts to the value-based incentive payments. As required by section 1886(o)(7)(A), total reductions for hospitals under section 1886(o)(7)(B) must be equal to the amount available for value-based incentive payments under section 1886(o)(6), resulting in a net budget-neutral impact. Overall, the distributive impact of this proposed rule is estimated at $850 million for FY 2013. Therefore, this proposed rule is economically significant and thus a major rule under the Congressional Review Act.
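For illustration only, and not as part of the proposed regulatory text, the following sketch shows the budget-neutrality constraint described above: a uniform reduction to base operating DRG payments is pooled and paid back out in full as value-based incentive payments. The hospital names, dollar amounts, scores, and the proportional allocation used here are hypothetical; the actual exchange function that translates total performance scores into incentive payments is proposed elsewhere in this rule.

```python
# Illustrative sketch of the budget-neutrality constraint in section 1886(o)(7):
# total value-based incentive payments must equal total reductions to base
# operating DRG payments. All hospital names, dollar amounts, scores, and the
# proportional allocation below are hypothetical.

APPLICABLE_PERCENT = 0.01  # 1.0 percent withhold proposed for FY 2013

hospitals = {
    "Hospital A": {"base_drg_payments": 50_000_000, "score": 80.0},
    "Hospital B": {"base_drg_payments": 30_000_000, "score": 40.0},
    "Hospital C": {"base_drg_payments": 20_000_000, "score": 60.0},
}

# Step 1: reduce each hospital's base operating DRG payments by the applicable
# percent and pool the reductions.
pool = 0.0
for data in hospitals.values():
    data["reduction"] = APPLICABLE_PERCENT * data["base_drg_payments"]
    pool += data["reduction"]

# Step 2: pay the entire pool back out as value-based incentive payments. The
# allocation here (score times payments) is a stand-in for the rule's actual
# exchange function; any allocation satisfies budget neutrality as long as the
# shares sum to the full pool.
weights = {name: d["score"] * d["base_drg_payments"] for name, d in hospitals.items()}
total_weight = sum(weights.values())
for name, data in hospitals.items():
    data["incentive"] = pool * weights[name] / total_weight
    data["net_change"] = data["incentive"] - data["reduction"]

# Total incentives equal total reductions, so the aggregate impact is $0.
assert abs(sum(d["incentive"] for d in hospitals.values()) - pool) < 1e-6

for name, d in hospitals.items():
    print(f"{name}: reduction ${d['reduction']:,.0f}, "
          f"incentive ${d['incentive']:,.0f}, net ${d['net_change']:,.0f}")
```

Whatever allocation is ultimately adopted, the defining property is that total incentive payments equal total reductions, so the program redistributes payments among hospitals rather than changing aggregate Medicare spending.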
The RFA requires agencies to analyze options for regulatory relief of small businesses. For purposes of the RFA, small entities include small businesses, nonprofit organizations, and small governmental jurisdictions. Most hospitals and most other providers and suppliers are considered to be small entities, either by nonprofit status or by having revenues of $34.5 million or less in any 1 year. Individuals and States are not included in the definition of a small entity.
Guidance issued by the Department of Health and Human Services interpreting the RFA considers effects to be economically significant if they reach a threshold of 3 to 5 percent or more of total revenues or costs. Among the 3,092 hospitals that would be participating in the Hospital VBP program, we estimate that percent increases in payments resulting from this proposed rule will range from 0.0236 percent for the lowest-scoring hospital to 1.817 percent for the highest-scoring hospital. When the reduction in base DRG operating payments to hospitals required under section 1886(o)(7) is taken into account, roughly half of participating hospitals will receive a net increase in payments and half will receive a net decrease in payments. However, we estimate that no participating hospital will receive more than a net 1 percent increase or decrease in payments. This falls well below the threshold for economic significance established by HHS for requiring a more detailed impact assessment under the RFA. Thus, we are not preparing an analysis under the RFA because the Secretary has determined that this proposed rule would not have a significant economic impact on a substantial number of small entities.
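For illustration only, the arithmetic behind the net-impact estimates quoted above can be sketched as follows, treating both the estimated incentive percentage and the 1 percent reduction as applying to the same base operating DRG payment amount (a simplifying assumption for this example):

```python
# Illustrative arithmetic only, using the estimates quoted above. Both the
# estimated incentive percentage and the 1 percent reduction are treated as
# applying to the same base operating DRG payment amount (a simplifying
# assumption for this example).
WITHHOLD_PCT = 1.0

for label, incentive_pct in [("lowest-scoring hospital", 0.0236),
                             ("highest-scoring hospital", 1.817)]:
    net_pct = incentive_pct - WITHHOLD_PCT
    print(f"{label}: {incentive_pct}% incentive - {WITHHOLD_PCT}% withhold "
          f"= {net_pct:+.4f}% net change")

# Result: roughly -0.98% for the lowest-scoring hospital and +0.82% for the
# highest-scoring hospital, both within the +/- 1 percent bound noted above and
# well below the 3 to 5 percent RFA significance threshold.
```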
In addition, section 1102(b) of the Act requires us to prepare a regulatory impact analysis if a rule may have a significant impact on the operations of a substantial number of small rural hospitals. This analysis must conform to the provisions of section 603 of the RFA. For purposes of section 1102(b) of the Act, we define a small rural hospital as a hospital that is located outside of an urban area and has fewer than 100 beds. We are not preparing an analysis under section 1102(b) of the Act because the Secretary has determined that this proposed rule would not have a significant impact on the operations of a substantial number of small rural hospitals.
Section 202 of the Unfunded Mandates Reform Act of 1995 also requires that agencies assess anticipated costs and benefits before issuing any rule whose mandates require spending in any 1 year of $100 million in 1995 dollars, updated annually for inflation. In 2010, that threshold is approximately $135 million. This rule would not mandate any requirements for State, local, or tribal governments, nor would it affect private sector costs.
Executive Order 13132 establishes certain requirements that an agency must meet when it promulgates a proposed rule (and subsequent final rule) that imposes substantial direct requirement costs on State and local governments, preempts State law, or otherwise has Federalism implications. As stated above, this proposed rule would not have a substantial effect on State and local governments.
C. Anticipated Effects
Table 7 displays our analysis of the distribution of possible total performance scores based on 2009 data, providing information on the estimated impact of this proposed rule. Value-based incentive payments for the estimated 3,092 hospitals participating in Hospital VBP are stratified by hospital characteristic, including geographic region, urban/rural designation, capacity (number of beds), and percentage of Medicare utilization. For example, the East South Central row of Table 7 shows the estimated value-based incentive payments for the East South Central region, which includes the states of Alabama, Kentucky, Mississippi, and Tennessee. Column 2 shows that, of the 3,092 participating hospitals, 301 are located in the East South Central region. Column 3 provides the estimated mean value-based incentive payment to those hospitals, which is 1.021 percent. The next columns provide the distribution by percentile; the value-based incentive payment percentages for hospitals in the East South Central region range from 0.550 percent at the 5th percentile to 1.482 percent at the 95th percentile, while the value-based incentive payment at the 50th percentile is 1.023 percent.
Table 7—Two-Domain Impact (Clinical Process and HCAHPS): Estimated Incentive Rates by Hospital Characteristic†

(All values other than N are estimated value-based incentive payments, expressed as a percent of base operating DRG payments.)

| Hospital characteristic | N (total = 3,092) | Mean | 5th | 10th | 25th | 50th | 75th | 90th | 95th |
|---|---|---|---|---|---|---|---|---|---|
| Region: | | | | | | | | | |
| New England | 138 | 1.083 | 0.660 | 0.751 | 0.935 | 1.088 | 1.276 | 1.391 | 1.434 |
| Middle Atlantic | 370 | 0.955 | 0.542 | 0.619 | 0.766 | 0.963 | 1.152 | 1.288 | 1.352 |
| South Atlantic | 518 | 1.041 | 0.551 | 0.661 | 0.822 | 1.039 | 1.255 | 1.420 | 1.499 |
| East North Central | 475 | 1.022 | 0.555 | 0.652 | 0.840 | 1.025 | 1.214 | 1.380 | 1.472 |
| East South Central | 301 | 1.021 | 0.550 | 0.634 | 0.810 | 1.023 | 1.235 | 1.413 | 1.482 |
| West North Central | 248 | 1.083 | 0.638 | 0.721 | 0.866 | 1.075 | 1.283 | 1.470 | 1.567 |
| West South Central | 457 | 1.014 | 0.477 | 0.597 | 0.784 | 0.997 | 1.248 | 1.432 | 1.563 |
| Mountain | 201 | 0.980 | 0.584 | 0.650 | 0.822 | 0.986 | 1.159 | 1.336 | 1.396 |
| Pacific | 384 | 0.935 | 0.434 | 0.551 | 0.755 | 0.951 | 1.126 | 1.290 | 1.383 |
| Urban/Rural: | | | | | | | | | |
| Large Urban | 1,199 | 1.008 | 0.552 | 0.646 | 0.815 | 1.014 | 1.206 | 1.370 | 1.449 |
| Other Urban | 1,010 | 1.016 | 0.551 | 0.646 | 0.817 | 1.015 | 1.209 | 1.379 | 1.484 |
| Rural | 883 | 1.007 | 0.487 | 0.607 | 0.788 | 1.009 | 1.239 | 1.398 | 1.499 |
| Capacity (by # beds): | | | | | | | | | |
| 1 to 99 beds | 1,045 | 1.044 | 0.491 | 0.617 | 0.814 | 1.047 | 1.284 | 1.456 | 1.575 |
| 100 to 199 beds | 939 | 1.002 | 0.500 | 0.598 | 0.815 | 1.015 | 1.201 | 1.360 | 1.452 |
| 200 to 299 beds | 481 | 0.989 | 0.586 | 0.662 | 0.803 | 0.996 | 1.175 | 1.323 | 1.392 |
| 300 to 399 beds | 279 | 0.995 | 0.577 | 0.668 | 0.821 | 1.022 | 1.167 | 1.293 | 1.379 |
| 400 to 499 beds | 151 | 0.985 | 0.575 | 0.700 | 0.837 | 0.982 | 1.135 | 1.307 | 1.414 |
| 500+ beds | 197 | 0.960 | 0.562 | 0.652 | 0.766 | 0.960 | 1.146 | 1.265 | 1.314 |
| Medicare Utilization: | | | | | | | | | |
| 0 to 25% | 237 | 0.990 | 0.542 | 0.639 | 0.798 | 1.012 | 1.164 | 1.352 | 1.451 |
| >25% to 50% | 1,508 | 1.016 | 0.528 | 0.642 | 0.818 | 1.020 | 1.224 | 1.381 | 1.459 |
| >50% to 65% | 1,148 | 1.005 | 0.524 | 0.637 | 0.804 | 1.008 | 1.206 | 1.381 | 1.482 |
| >65% | 196 | 1.02 | 0.52 | 0.60 | 0.80 | 1.02 | 1.28 | 1.42 | 1.53 |

† Note: Because sufficient 2009 data was not available at the time of publication of this proposed rule, the measures SCIP-Card-2 and SCIP-Inf-4 were not included in the calculation of estimated incentive rates. However, we believe that no significant change in estimated incentive rates results from the omission of these measures.

Table 8 below shows the estimated percent distribution, by hospital characteristic, of the 1% reduction ($850 million) in the base operating DRG payment for fiscal year 2013.
Table 8—Average Estimated Percentage Withhold Amount (as required by section 1886(o)(7) of the Social Security Act) by Hospital Characteristic
| Hospital characteristic | N (total = 3,092) | Estimated percent of total withhold amount |
|---|---|---|
| Region: | | |
| New England | 138 | 5.9 |
| Middle Atlantic | 370 | 15.9 |
| South Atlantic | 518 | 19.5 |
| East North Central | 475 | 17.5 |
| East South Central | 301 | 7.8 |
| West North Central | 248 | 7.2 |
| West South Central | 457 | 10.3 |
| Mountain | 201 | 4.8 |
| Pacific | 384 | 11.2 |
| Urban/Rural: | | |
| Large Urban | 1,199 | 49.8 |
| Other Urban | 1,010 | 38.2 |
| Rural | 883 | 11.1 |
| Capacity (by # beds): | | |
| 1 to 99 beds | 1,045 | 8.1 |
| 100 to 199 beds | 939 | 21.2 |
| 200 to 299 beds | 481 | 20.5 |
| 300 to 399 beds | 279 | 16.9 |
| 400 to 499 beds | 151 | 11.0 |
| 500+ beds | 197 | 23.4 |
| Medicare Utilization: | | |
| 0 to 25% | 237 | 3.9 |
| >25% to 50% | 1,508 | 60.0 |
| >50% to 65% | 1,148 | 32.8 |
| >65% | 196 | 3.2 |

We also analyzed the characteristics of hospitals not receiving a Hospital VBP score based on the program requirements, as shown below in Table 9. We estimate that 353 hospitals will not receive a Hospital VBP score in fiscal year 2013. We note that these hospitals will not be impacted by the reductions in base operating DRG payments under section 1886(o)(7). IPPS hospitals not included in this analysis were excluded due to the complete absence of cases applicable to the included measures, or due to the absence of a sufficient number of cases to reliably assess the measures.
As might be expected, a significant portion of hospitals not receiving a Hospital VBP score are small providers because such entities are more likely to lack the minimum number of cases or measures required to participate in the Hospital VBP program. We anticipate conducting future research on methods to include small hospitals in the Hospital VBP program.
Table 9—Projected Number of Hospitals Not Receiving a Hospital VBP Score in FY 2013, by Hospital Characteristic
| Hospital characteristic | Number of hospitals not receiving a Hospital VBP score (N = 353) |
|---|---|
| Region: | |
| New England | 6 |
| Middle Atlantic | 18 |
| South Atlantic | 14 |
| East North Central | 31 |
| East South Central | 26 |
| West North Central | 17 |
| West South Central | 85 |
| Mountain | 25 |
| Pacific | 26 |
| Puerto Rico | 34 |
| Missing Region | 71 |
| Urban/Rural: | |
| Large Urban | 116 |
| Other Urban | 83 |
| Rural | 83 |
| Missing Urban/Rural | 71 |
| Capacity (by # beds): | |
| 1 to 99 beds | 213 |
| 100 to 199 beds | 47 |
| 200 to 299 beds | 11 |
| 300 to 399 beds | 8 |
| 400 to 499 beds | 2 |
| 500+ beds | 0 |
| Missing Capacity | 72 |
| Medicare Utilization: | |
| 0 to 25% | 78 |
| >25% to 50% | 75 |
| >50% to 65% | 43 |
| >65% | 28 |
| Missing Medicare Utilization | 129 |

We note that a number of hospitals were missing hospital characteristic data, including region, urban/rural classification, size, and Medicare utilization. All 353 hospitals included in Table 9, including those with missing hospital characteristic data, lacked sufficient clinical process of care data or HCAHPS data needed to calculate a total performance score.
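For readers interested in how stratified summaries of this kind can be tabulated, the following sketch is illustrative only and is not the methodology used to produce Tables 7 through 9: it groups hypothetical hospital-level estimates by characteristic and computes counts, means, and percentiles. The file name and column names are assumptions made for the example.

```python
# Illustrative sketch only: tabulating stratified distributions like Tables 7-9
# from hypothetical hospital-level estimates. File and column names are assumed.
import pandas as pd

df = pd.read_csv("hospital_vbp_estimates.csv")  # hypothetical input file
# Assumed columns: hospital_id, region, urban_rural, bed_category,
# medicare_utilization_band, estimated_incentive_pct, has_vbp_score (bool)

scored = df[df["has_vbp_score"]]

# Table 7-style summary: count, mean, and selected percentiles of the estimated
# incentive percentage within each stratum.
percentiles = [0.05, 0.10, 0.25, 0.50, 0.75, 0.90, 0.95]
for stratifier in ["region", "urban_rural", "bed_category", "medicare_utilization_band"]:
    grouped = scored.groupby(stratifier)["estimated_incentive_pct"]
    summary = grouped.agg(["count", "mean"]).join(
        grouped.quantile(percentiles).unstack()  # one column per percentile
    )
    print(summary.round(3))

# Table 9-style summary: counts of hospitals without a total performance score.
unscored = df[~df["has_vbp_score"]]
print(unscored.groupby("region").size())
```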
D. Alternatives Considered
The major alternative performance scoring models considered for this proposed rule were the Six-Domain Performance Scoring Model and the Appropriate Care Model, both of which were discussed in section II.E. of this proposed rule. Our analyses of these alternative performance scoring models showed only modest differences in financial reimbursements across the models, by the various hospital characteristics listed above. We believe that these observed transfers are within the limits of expected variation and do not reflect significant differences in financial reimbursements between the performance scoring models considered.
E. Accounting Statement
As required by OMB Circular A-4 (available at http://www.whitehouse.gov/omb/circulars/a004/a-4.pdf), we have prepared an accounting statement showing the classification of the impacts associated with the provisions of this proposed rule.
As required by section 1886(o)(7)(A), total reductions for hospitals under section 1886(o)(7)(B) must be equal to the amount available for value-based incentive payments under section 1886(o)(6), resulting in a net budget-neutral impact. Overall, the distributive impacts of this proposed rule, resulting from the incentive payments and the 1% reduction (withhold) in the base operating DRG payment, are estimated at $850 million for fiscal year 2013 (reflected in 2010 dollars).
Table 10—Accounting Statement: Classification of Estimated Expenditures for FY 2013
| Category | Transfers |
|---|---|
| Annualized Monetized Transfers | $0 (distributive impacts resulting from the incentive payments and the 1% reduction (withhold) in the base operating DRG payment are estimated at $850 million) |
| From Whom To Whom? | Federal Government to Hospitals |

The analysis above, together with the remainder of this preamble, provides a Regulatory Impact Analysis. In accordance with the provisions of Executive Order 12866, this regulation was reviewed by the Office of Management and Budget.
List of Subjects
42 CFR Part 422
- Administrative practice and procedure
- Health facilities
- Health
42 CFR Part 480
- Health care
- Health professions
- Health records
- Peer Review Organizations (PRO)
- Penalties
- Privacy
- Reporting and recordkeeping requirements
For the reasons set forth in the preamble, the Centers for Medicare & Medicaid Services proposes to amend 42 CFR chapter IV as follows:
PART 422—MEDICARE ADVANTAGE PROGRAM
1. The authority citation for part 422 continues to read as follows:
Subpart D—Quality Improvement
2. Section 422.153 is revised to read as follows:
§ 422.153 Use of quality improvement organization review information.
CMS will acquire from quality improvement organizations (QIOs), as defined in part 475 of this chapter, data collected under section 1886(b)(3)(B)(viii) of the Act and subject to the requirements in § 480.140(g). CMS will acquire this information, as needed, and may use it for the following functions:
(a) Enable beneficiaries to compare health coverage options and select among them.
(b) Evaluate plan performance.
(c) Ensure compliance with plan requirements under this part.
(d) Develop payment models.
(e) Other purposes related to MA plans as specified by CMS.
PART 480—ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION
3. The authority citation for part 480 continues to read as follows:
Subpart B—Utilization and Quality Control Quality Improvement Organizations (QIOs)
4. Section 480.101(b) is amended by revising the definition of “QIO review system” to read as follows:
§ 480.101 Scope and definitions.
* * * * *
QIO review system means the QIO and those organizations and individuals who either assist the QIO or are directly responsible for providing medical care or for making determinations with respect to the medical necessity, appropriate level and quality of health care services that may be reimbursed under the Act. The system includes—
(1) The QIO and its officers, members and employees;
(2) QIO subcontractors;
(3) Health care institutions and practitioners whose services are reviewed;
(4) QIO reviewers and supporting staff;
(5) Data support organizations; and
(6) CMS.
* * * * *
5. Section 480.130 is revised to read as follows:
§ 480.130 Disclosure to the Department.
Except as limited by § 480.139(a) and § 480.140 of this subpart, QIOs must disclose to the Department all information requested by the Department in the manner and form requested. The information can include confidential and non-confidential information, and requests can include those made by any component of the Department, such as CMS.
6. Section 480.139 is amended by revising paragraph (a)(1) to read as follows:
§ 480.139 Disclosure of QIO deliberations and decisions.
(a) QIO deliberations. (1) A QIO must not disclose its deliberations except to—
(i) CMS; or
(ii) The Office of the Inspector General, and the General Accounting Office as necessary to carry out statutory responsibilities.
* * * * *
7. Section 480.140 is amended by revising paragraph (a)(1) and paragraph (g) to read as follows:
§ 480.140 Disclosure of quality review study information.
(a) * * *
(1) Representatives of authorized licensure, accreditation or certification agencies as is required by the agencies in carrying out functions which are within the jurisdiction of such agencies under state law; to federal and state agencies responsible for identifying risks to the public health when there is substantial risk to the public health; or to Federal and State fraud and abuse enforcement agencies;
* * * * *(g) A QIO must disclose quality review study information to CMS with identifiers of patients, practitioners or institutions—
(1) For purposes of quality improvement. Activities include, but are not limited to, data validation, measurement, reporting, and evaluation.
(2) As requested by CMS when CMS deems it necessary for purposes of overseeing and planning QIO program activities.
Dated: December 10, 2010.
Donald M. Berwick,
Administrator, Centers for Medicare & Medicaid Services.
Approved: December 16, 2010.
Kathleen Sebelius,
Secretary.
Footnotes
3. Chassin, M.R.; Loeb, J.M.; Schmaltz, S.P. and Wachter, R.M. (2010) “Accountability Measures—Using Measurement to Promote Quality Improvement.” New England Journal of Medicine. Vol 363: 683-688.
4. See OEI-06-09-00090, “Adverse Events in Hospitals: National Incidence Among Medicare Beneficiaries.” Department of Health and Human Services, Office of Inspector General, November 2010. See also 2009 National Healthcare Quality Report, pp. 107-122, “Patient Safety,” Agency for Healthcare Research and Quality.
6. The report may be found at http://www.cms.gov/AcuteInpatientPPS/downloads/HospitalVBPPlanRTCFINALSUBMITTED2007.pdf.
[FR Doc. 2011-454 Filed 1-7-11; 4:15 pm]
BILLING CODE 4120-01-P