[Federal Register Volume 62, Number 246 (Tuesday, December 23, 1997)]
[Rules and Regulations]
[Pages 67174-67213]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 97-32828]
[[Page 67173]]
_______________________________________________________________________
Part III
Department of Health and Human Services
_______________________________________________________________________
Health Care Financing Administration
_______________________________________________________________________
42 CFR Part 483
Medicare and Medicaid; Resident Assessment in Long Term Care
Facilities; Final Rule
[[Page 67174]]
DEPARTMENT OF HEALTH AND HUMAN SERVICES
Health Care Financing Administration
42 CFR Part 483
[HCFA-2180-F]
RIN 0938-AE61
Medicare and Medicaid; Resident Assessment in Long Term Care
Facilities
AGENCY: Health Care Financing Administration (HCFA), HHS.
ACTION: Final rule.
-----------------------------------------------------------------------
SUMMARY: This final rule establishes a resident assessment instrument
for use by long term care facilities participating in the Medicare and
Medicaid programs when conducting a periodic assessment of a resident's
functional capacity. The resident assessment instrument (RAI) consists
of a minimum data set (MDS) of elements, common definitions, and coding
categories needed to perform a comprehensive assessment of a long term
care facility resident. A State may choose to use the Federally
established resident assessment instrument or an alternate instrument
that is designed by the State and approved by us. These regulations
establish guidelines for use of the data set and designation of the
assessment instrument.
The provisions contained in these regulations implement statutory
requirements. The resident assessment instrument is intended to produce
a comprehensive, accurate, standardized, reproducible assessment of
each long term care facility resident's functional capacity.
EFFECTIVE DATE: Except for Secs. 483.20(f) and 483.315(h), these
regulations are effective March 23, 1998. Sections 483.20(f), Facility
computerization requirements, and 483.315(h), State computerization
requirements, are effective June 22, 1998.
FOR FURTHER INFORMATION CONTACT: Cindy Hake, (410) 786-3404.
SUPPLEMENTARY INFORMATION:
I. Background
On December 28, 1992, we published in the Federal Register, at 57
FR 61614, a proposed rule with an opportunity for public comment,
``Resident Assessment in Long Term Care Facilities,'' which established
a resident assessment instrument that all long term care facilities
participating in the Medicare and Medicaid programs must use when
conducting an assessment of a resident's functional capacity. We
proposed that a State may choose to use the Federally established
resident assessment instrument or an alternate instrument that is
designed by the State and approved by us. We proposed that a facility
must enter information from the resident assessment into a computer, in
accordance with HCFA-specified formats. At least monthly, the facility
must transmit electronically the information contained in each resident
assessment to the State.
The resident assessment instrument would consist of a minimum data
set (MDS) of screening and assessment elements, including common
definitions and coding categories for use by a facility in performing a
comprehensive assessment of a long term care facility resident. In
addition to containing identifying information such as name, birthdate,
and occupation, the MDS consists of standardized items that assess, for
example, a resident's communication patterns, cognitive patterns,
physical functioning and structural problems, health conditions, and
medications. The proposed rule established guidelines for use of the
data set, and designated one or more assessment instruments that a
State may require a facility to use.
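The element categories listed above can be pictured as a simple record layout. The sketch below is purely illustrative: the rule prescribes no software representation of the MDS, and every field name here is a paraphrase of the assessment areas named in the preceding paragraph, not an official data element.

```python
# Illustrative only: a hypothetical in-memory layout for one MDS assessment.
# The rule defines the MDS as a set of elements, common definitions, and
# coding categories; it does not mandate this (or any) data structure.
mds_record = {
    "identifying_information": {  # e.g., name, birthdate, occupation
        "name": "Sample Resident",
        "birthdate": "1920-01-01",
        "occupation": "teacher",
    },
    # Standardized, coded assessment items, grouped by the areas named above:
    "communication_patterns": {},
    "cognitive_patterns": {},
    "physical_functioning_and_structural_problems": {},
    "health_conditions": {},
    "medications": {},
}

# Every assessment area named in the preamble is represented:
areas = [k for k in mds_record if k != "identifying_information"]
```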
We proposed to add a new Sec. 483.315, which would require a State
to specify for use in long term care facilities within the State either
the HCFA-designated resident assessment instrument or an alternate
instrument. The State would request and receive approval from us before
implementing or modifying an alternate instrument. The uniform MDS was
included in Sec. 483.315(b). We also provided as attachments to the
regulations the utilization guidelines for the resident assessment
instrument, MDS common definitions, and resident assessment protocols
(RAPs).
II. Analysis of and Responses to Public Comments
We received 146 timely letters in response to our December 28,
1992, proposed regulation. Most were from provider organizations and
nursing home staff. We also heard from consumer organizations,
professional organizations, nursing home residents and their families,
and State and Federal agencies.
Prior to addressing comments on specific regulatory sections, we
will provide a summary of public comments on major topics, and discuss
some of the general issues raised by these regulations (in the order in
which those issues appeared in the preamble to the proposed rule).
Summary of Public Comments
Summary of Public Comments on MDS
During the public comment period, respondents suggested over 70
different additions to the MDS. Many commenters suggested modifying
items to increase clarity. For example, the item ``wheeled self'' was
divided into two items, ``wheeled self on unit'' and ``wheeled self off
unit'' to further differentiate a resident's capabilities. Commenters
also suggested the addition of items that provided information needed
by clinical staff caring for residents. Data suggest that nursing home
residents experience pain on a regular basis, but the MDS items
associated with pain did not differentiate the intensity and location
of pain (chest, joint, other). We expanded MDS items associated with
pain to assist clinicians in determining the nature and scope of pain
for care planning purposes.
Commenters expressed concern that the MDS, as originally designed,
could not be used for determining nursing home payment or monitoring
quality of care at either the resident or the facility level. To
address this concern, we added items to the MDS that are needed to
support a case-mix classification system for long term care facility
payment known as Resource Utilization Groups III, a mechanism for
determining the level of resources necessary to care for an individual
based upon his or her clinical characteristics as measured by the MDS.
This classification system was developed under the auspices of the
HCFA-funded Multistate Nursing Home Case-mix and Quality demonstration,
whose purpose is to develop, implement, and evaluate a case-mix payment
system for SNF services under Medicare. The original four States
participating in the demonstration began using the MDS+ (an alternate
RAI that consists of the original MDS, plus additional assessment items
specified by the State for use in all Medicare and Medicaid-certified
nursing homes in the State), based on the Resource Utilization Groups
III classification system, in their Medicaid programs in 1994, and
several other States have done so since.
Section 4432 of the Balanced Budget Act of 1997 (Public Law 105-
33) amends section 1888 of the Social Security Act (the Act) by adding
a new subsection (e). The Balanced Budget Act requires national
implementation, in Fiscal Year 1998, of a Prospective Payment System
(PPS) for Medicare: a case-mix payment system that is based on MDS
data. The Secretary determines the manner and
[[Page 67175]]
time frames within which resident assessment data are collected at the
State and national levels to develop and implement case-mix payment
rates. The resident assessment data submitted to the State are a
resource upon which the Secretary can draw for development and
implementation of the PPS.
We added other items to the original MDS to ensure that key
indicators of quality of care (known as quality measures) could be
derived from the MDS and monitored longitudinally at the resident and
facility levels. The addition of items needed to support payment and
quality monitoring programs will also strengthen the clinical relevancy
of the MDS by providing important information to facility staff about
the resident's potential for achieving the highest level of
functioning. Examples of such items are nursing care interventions
related to rehabilitation and restorative care for the resident, such
as range of motion, training and skill practice in walking,
transferring, eating, dressing/grooming, and communication.
Commenters were particularly concerned with the ability of the MDS
to assist in assessing the quality of life for nursing home residents.
Revisions we made within the section on mood and behavior, in
particular, have the potential for providing important information
regarding the resident's risk for depression, as well as the presence
of depression. Nursing home residents have a high risk of developing
depression, with clinical experts estimating that at least 60 percent
of current nursing home residents have some level of depression.
However, analysis of MDS records for a large group of residents showed
that the mood and behavior items were checked for only 16 percent of
the residents. We found that nursing homes that have clinical staff
with expertise in this area identify more residents with mood and
behavior problems. Concerned that residents with, or at risk of,
depression may not be identified, we have modified the mood and
behavior items to help facility staff identify objective behaviors
frequently associated with depression. We also added a scale to measure
the frequency with which these symptoms occur. An item indicating the
use of a behavior management program was modified to allow the assessor
to identify specific strategies that were being used with the resident
to deal with mood and behavior symptoms.
Finally, commenters expressed concern that the MDS was not
appropriate to use with some groups of nursing home residents, such as
the non-elderly or short term stay populations. To better understand
the changing nursing home population, we have added an item in Section
P that identifies different populations often served by nursing homes
(for example, pediatric resident, hospice care resident). To address
commenters' concerns, we also added items focusing more on short-term
nursing and therapy needs, and issues important to terminal residents,
such as pain. We also expanded the item on discharge planning to assess
the resident's potential for discharge, including the resident's desire
to return to the community and the presence of a support person who is
positive towards discharge. This item will also be useful in developing
a RAP on discharge planning that was suggested by a number of
commenters.
Summary of Public Comments on Triggers
Commenters believed that the trigger legend was too complex and
needed to be simplified or eliminated. The legend has been
substantially revised, and we have reduced the number of triggers for
particular RAPs. We have also eliminated the categories of automatic
and potential triggers, as these categories had not been well
understood and sometimes led to unnecessary work by nursing home staff.
Summary of Public Comments on the RAP Summary Form
We revised the RAP Summary Form and accompanying instructions to
reduce the confusion regarding their use noted by commenters.
Specifically, the revised form provides a column for indicating whether
the RAP was triggered. It provides more specific instruction and direction
on the type of information that we would expect a facility to document
for each triggered RAP, including rationale to support decision-making
regarding whether to proceed with a care plan for a triggered RAP.
Additionally, because we consider the RAPs part of the utilization
guidelines for the MDS, we designated the RAP Summary form as Section V
of the MDS. This will provide nursing home staff and surveyors with
more complete information on resident care problems and outcomes. This
will also permit surveyors to monitor the completion of the RAPs.
Summary of Public Comments on RAPs
Most of the commenters valued the RAPs as part of the RAI for
improving the quality of care. A number of commenters indicated the
need for the addition of new RAPs. Specifically, we received comments
suggesting the creation of RAPs on discharge planning, pain, terminal
care/imminent death, resident rights, bowel incontinence/constipation,
abnormal lab values, and foot care. A new RAP on discharge planning has
already been developed, and we expect to develop other RAPs during 1997.
There was also concern that many of the current RAPs do not address
the needs of short-stay residents. Work is currently in progress and we
expect to publish revised RAP Guidelines that address the needs of this
population in 1997.
Comments on MDS and RAPS
Comment: Most commenters asserted that the original MDS did not
provide enough information in some areas. These commenters noted that
the areas of nursing diagnosis and medical needs, and certain
information needed for care planning, were lacking. Some commenters
stated that professional nurses are knowledgeable regarding areas that
are not addressed on the MDS and automatically incorporate them into
the assessment and care plan. Another commenter pointed out that the
MDS+ includes additional information that is helpful in care planning.
Response: As discussed elsewhere, we have added a number of items
that nursing home staff have identified as useful in assessing a
resident's functional capability and medical problems. We have also
clarified items that had been confusing for facility staff in the past.
Some of the items added to the MDS were previously on the MDS+. We
believe that the MDS captures information on most of the areas of
concern in assessing nursing home residents. While we agree that there
are additional items that would provide necessary information for
nursing home staffs' use in care planning, it is not possible for us to
design an instrument that covers every potential item that a nursing
home needs to know to provide care to residents. The RAI is not
intended to replace or substitute for a resident's full clinical
record. The facility should document pertinent information in those
clinical records whether or not the RAI requires it. A facility is
responsible for providing care that is necessary to assist a resident
in attaining or maintaining his or her highest practicable well-being,
regardless of whether the care areas are captured on the MDS. A
facility may document additional information regarding the resident's
status wherever it chooses in the resident's clinical records.
Comment: One commenter urged that we move cautiously in adding any
other
[[Page 67176]]
data elements to the MDS, explaining that some States with a non-MDS
based case-mix system are having difficulty merging the MDS and their
reimbursement system. Other commenters disagreed regarding the need to
add items to the MDS at this time. They thought that we should maintain
the status quo until the industry and surveyors have more fully
understood and integrated the current instrument into their way of
doing business. Commenters mentioned that the MDS is a screening tool
that already contains most of the relevant items. One commenter stated
that the original MDS underwent extensive scrutiny and testing during
its development and should be kept as is for at least 10 years in order
to maintain consistency for providers, computer companies, research,
and case-mix reimbursement.
Response: We disagree regarding the need to maintain the MDS for
the next several years in the form in which it was originally issued in
1990 (rather than as revised in 1995 as version 2.0). Many of the changes in version
2.0 of the MDS were made to address areas that had been particularly
troublesome or poorly understood by clinicians responsible for
completing the RAI. Moreover, changes in the MDS have not been frequent
enough to cause significant disruption for facilities. Nearly all
States began to require use of the original RAI in late 1990 or early
1991, and most did not require facilities to use the new RAI until
January 1996 (with some States deferring that requirement to 1997).
This means that the original RAI was in place for nearly 5 years before
facilities were expected to change to the new instrument. Additionally,
it is less burdensome and confusing to incorporate necessary
improvements in the RAI at this time than it will be after
implementation of requirements in this regulation for facility
computerization of MDS information. Overall, the advantages of
implementing version 2.0 of the RAI in 1996 far outweigh those of
maintaining the original assessment system.
If clinically warranted and supported by affected parties, we
anticipate reviewing the MDS every 3 to 5 years to determine whether it
needs to be revised, and sponsoring the development of a new version of
the RAI approximately every 5 years. For all RAI refinement activities,
we will seek the input of interested and affected parties.
Comment: Several other commenters expressed the belief that we
should conduct more RAI training on a national level and institute a
facility support effort, rather than making major changes to the
instrument.
Response: We support the need for more RAI training at all levels
and have numerous activities underway to strengthen the knowledge of
facility staff and surveyors about comprehensive assessment and its
linkage to resident care planning and quality of care. The need for
additional RAI training has been consistently supported by the States
and by the provider, consumer, and professional associations with which
we have worked to develop version 2.0 of the RAI. In 1995, we published a new
edition of the Resident Assessment Instrument User's Manual for version
2.0 of the RAI that contains new information on the use of the RAPs and
linking the RAI to care plans. We have developed ``train the trainer''
materials for use in both provider and surveyor training, and have
begun a multi-year effort to develop educational materials for both
providers and surveyors at both basic and advanced levels. We train all
long term care facility surveyors on the RAI as part of our basic
health surveyor course and have offered specialty courses on advanced
resident assessment issues for surveyors as well as other State staff
on a routine basis. We also offered a full-day program on resident
assessment for all long term care facility surveyors during each of the
HCFA regional conferences held during 1994. We are committed to working
in partnership with providers and States to identify training needs and
develop methods to facilitate the dissemination of consistent
information and improve providers' use of the RAI in order to improve
care outcomes for nursing home residents.
We believe that the industry also shares a responsibility to
promote understanding of the RAI within facilities. Provider and
professional organizations should offer sessions on resident assessment
during their annual meetings or as special continuing education
programs held throughout the course of the year. Our staff have
participated in a number of national meetings and will continue to do
so, as warranted. However, we believe that providers can best learn how
to integrate RAI requirements into their daily practice from other
providers who have implemented successful programs. We encourage the
use of ``peer teaching'' programs in a variety of forms.
Beneficiary organizations have also played an important role in
getting information on the RAI out to their members. The organizations
have educated residents, families and ombudsmen regarding the role of
resident assessment in quality care and how to use the RAI in care
planning and conflict resolution. They also provided invaluable input
in modifying the RAI.
As part of a contract with us, the Research Triangle Institute
evaluated the extent to which facilities had implemented the RAI as
well as the accuracy of the assessments being conducted. The Research
Triangle Institute compared available assessment information for 23
specific assessment items in facilities both before and after the
implementation of the RAI. Their sample consisted of over 260
facilities in 10 States. The Research Triangle Institute's results
showed that:
- The percent of residents with no assessment information
available for particular health status issues decreased on average by
81 percent;
- The percent of residents with accurate information documented
on assessment items increased on average by 24 percent;
- The percent of residents with available information on all 23
items increased by 53 percent.
The Research Triangle Institute's study asserts that facilities are
using the RAI, and that the RAI has resulted in the presence of more
accurate information on which a facility can base its individualized
care plans.
Comment: Commenters addressed the usefulness of the RAPs. Of those
who responded to this request for comment, some said that the RAPs are
useful and provide a structured framework for making sense of the MDS
data through analysis, interpretation, and synthesis, believing that
the RAPs tie the assessment process together. A consumer advocacy
organization believed that the RAPs assist facility staff in learning
causes of problems and identifying potential risks of decline that
require further staff attention. A few said that the RAPs have improved
the quality of care in nursing homes, or could with the appropriate
training and administrative support.
Response: The RAPs are structured decision frameworks which contain
guidelines for additional assessment of relevant resident attributes,
risk factors, clinical history and other factors. They assist with
clinical decision-making and help nursing home staff gather and analyze
necessary information to develop an appropriate and individualized care
plan.
The Guidelines section of each RAP assists staff in determining
whether a problem exists and in identifying relevant causal factors that
affect the resident's condition. The RAPs also offer suggestions
regarding how a facility can eliminate or minimize factors
[[Page 67177]]
contributing to the resident's problem, or how a facility can maximize
a resident's strengths to achieve the highest practicable well-being.
In this way, the RAPs help facility staff to develop an individualized
care plan that meets the needs of the resident.
According to the report of the Research Triangle Institute's study,
directors of nursing indicated the RAP triggers and guidelines were
used routinely in over 90 percent of the facilities participating in
the survey. Three-quarters of the directors of nursing stated that they
believed that use of the RAP triggers had increased their facility's
ability to identify residents' clinical problems, and two-thirds
believed that using the RAPs had increased their facility's ability to
identify residents' potential for rehabilitation improvement.
Among the 180 directors of nursing who thought the RAP triggers had
increased identification of clinical problems, 45 percent were able to
identify, without prompting, specific RAPs for which this increase was
most pronounced. They most frequently cited cognitive loss/dementia (21
percent), ADL/functional rehabilitation potential (17 percent),
delirium (16 percent), and communication (15 percent). Seventy-two
percent of the directors of nursing interviewed stated that they did
not believe it had been at all difficult for staff to provide necessary
care in response to the newly identified clinical problems.
Comment: Some commenters believed that the RAPs are too
prescriptive, and that we are ``legislating a cookbook approach.''
Response: RAPs function as resident-care related assessment tools
rather than as clinical standards. RAPs do not contain prescriptive
mandates to perform particular diagnostic tests or specialized
assessments. Rather, RAPs lead facility staff through a process that
enables them to gain a better understanding of the resident's status in
a particular area.
For each resident, facility staff are required to make decisions
regarding whether each RAP that triggered for that resident identifies
a problem that requires care planning and intervention. Staff are
required to proceed with a care plan only if clinically appropriate. As
part of the RAP review process, facilities are required to document key
information regarding a particular area or condition that includes
objective findings and subjective complaints of the resident.
Irrespective of RAI requirements, this type of information should be
routinely assessed and documented by a facility as a part of good
clinical practice. We do not require that a facility provide
documentation that addresses each issue or question raised in a
particular RAP guideline. We disagree that the RAPs represent a
cookbook approach. The RAPs are tested assessment protocols that lead
facility staff through a focused, logically progressive, clinical
evaluation of the resident, relative to the particular area addressed
by the RAP. The RAPs are not intended to prescribe courses of action
for a facility. Rather, they provide a structured, problem-oriented
framework for organizing MDS information and additional clinically
relevant information that identifies medical problems. Upon completion
of the RAPs, the facility staff will have:
- Identified clinical issues unique to the resident that may
adversely affect his or her highest practicable level of well-being;
- Identified factors that place the resident's highest
practicable functioning at risk;
- Considered whether the identified potential problems could be
prevented or reversed, or risk factors minimized, and evaluated the
extent to which the resident is able to attain a higher level of well-
being and functional independence; and
- Evaluated ongoing care practices for the individual resident.
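As a purely hypothetical sketch, the triggered-RAP review decision described above (staff decide, RAP by RAP, whether a triggered condition warrants a care plan, and document the decision on the Section V RAP Summary) might be modeled as follows. The rule itself mandates no software, and every name in this sketch is invented for illustration.

```python
# Hypothetical sketch only: how facility software might record the RAP
# review flow. The regulation prescribes the process, not any data format.

def review_raps(triggered_raps, clinically_indicated):
    """Record, for each triggered RAP, whether staff proceed to care planning.

    triggered_raps: RAP names flagged by MDS trigger items.
    clinically_indicated: dict mapping RAP name -> staff judgment (bool)
        that a care plan is clinically appropriate.
    """
    summary = []  # rows for a Section V-style RAP Summary
    for rap in triggered_raps:
        proceed = clinically_indicated.get(rap, False)
        summary.append({"rap": rap, "triggered": True, "care_plan": proceed})
    return summary

# Example: two RAPs triggered; staff judge only one to need a care plan.
rows = review_raps(
    ["cognitive loss/dementia", "delirium"],
    {"cognitive loss/dementia": True, "delirium": False},
)
```

Note that, as in the rule, triggering alone never forces a care plan; the clinical judgment input is what determines whether staff proceed.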
Comment: One commenter asked that we not mandate standards for care
planning until there is better understanding of how the assessment
process works. The commenter stated that a great deal of work needs to
be done in setting up appropriate standards for care planning.
Response: Neither the RAPs, nor any other component of the RAI
contains required standards of care or standards regarding the specific
interventions and time frames for evaluation that must be present in
care plans. As noted in the responses above, the RAPs are a structured
framework that lead the facility through more in-depth assessment; they
do not mandate a course of action for care planning. A facility has a
great deal of flexibility in developing a care plan to meet a
resident's individual needs.
Comment: Some who commented thought that the RAPs are too complex
and difficult to use. One expressed the belief that the RAPs are not
the only correct criteria for providing good care. Another pointed out
that it has been a difficult learning process for facilities to
understand that the MDS provides only raw data about a resident.
Commenters recommended that some of the RAP items be included in the
MDS as core assessment items.
Response: We agree that there has been a steep learning curve in
terms of facilities' understanding of the RAPs and their ability to
integrate them into day-to-day clinical practice. Anecdotally, and more
recently as supported by the Research Triangle Institute study,
facilities report that understanding and use of the RAPs have lagged
well behind those of the MDS. Recognizing that the system required a major learning
process, we have tried to address the RAPs in newer versions of our
train-the-trainer courses offered annually for State RAI coordinators.
Initially, our courses and materials focused on use of the MDS, then
use of the RAPs, then integration of the RAI in care planning. Many
States are still in the process of conducting training sessions for
providers on use of the RAPs and care planning.
We also have made revisions to the RAP Summary form and our
instructions regarding use of the RAPs in order to make them easier to
understand and use. We will continue to refine our training products as
well as evaluate facility staffs' ability to use the RAPs. If problems
are identified, we are open to exploring ways to revise the RAP format
or content in order to make the comprehensive assessment process more
meaningful and productive for both facility staff and residents. We
have incorporated some additional RAP triggers into the MDS and
integrated assessment procedures contained in the RAP Guidelines
throughout the instructions contained in the October 1995 edition of
the RAI User's Manual.
Comment: A few commenters suggested that we make the RAPs available
to facilities on request. Commenters asserted that often there is not a
copy at the nurses' station.
Response: We agree that it is important for the RAPs to be
available for staff use. In 1990, we sent information to each nursing
home administrator regarding the RAI, and this information included a
copy of the RAPs. Additionally, in 1990, we provided each State with a
camera-ready copy of the original version of the RAI, and in 1995, we
provided each State with a camera-ready copy of the new RAI, version
2.0. States were then responsible for providing facilities with a copy
of the revised RAI including the RAPs.
We do not believe it is our responsibility to ensure that each
nursing home currently has a copy of the RAPs. Facilities could request
a copy from States, provider organizations or from other sources.
However, we are exploring strategies to improve consistent distribution
of RAI
[[Page 67178]]
information to nursing homes and ensure that clinical staff have access
to the RAI User's Manual. We believe that for the RAPs to be used as
intended, a copy of the RAPs should be available at each nursing
station. States are responsible for communicating with facilities
regarding the State-specified instrument and should, therefore, ensure
that the facilities have the most current RAPs.
Comment: Some commenters wanted more flexibility in using the RAPs.
They thought the RAPs should be adaptable, and, as professionals,
facility staff should be able to pick and choose appropriate
interventions from those suggested in the RAPs. Commenters also
suggested that we make the RAPs optional. One commenter believed that
the final product and process forces health care professionals into a
format that stifles flexibility and interferes with the assessment and
care planning process. Another suggestion was to allow a facility to
use the RAPs as a flexible assistive device in care planning.
Response: We agree that facility staff are capable professionals
and, as such, should be able to use the RAPs as is appropriate for each
individual resident. This has always been our intent regarding their
use. A facility may supplement the RAP assessment.
We believe that negative feelings regarding the utility of the RAPs
are associated with a lack of understanding of their use. As noted
above, our training in the past did not focus on the RAPs. It
has been our experience that facility staff who have been properly
trained on the RAPs and integrated them into their clinical practice
are convinced of their utility and positive effects on resident
outcomes.
We do not believe that use of the RAPs should be optional, as they
reflect necessary components of a comprehensive assessment. The RAPs
represent a standard methodology for assessing and analyzing certain
aspects of resident status. As part of the utilization guidelines for
the RAI, the RAPs ensure consistent identification of medical problems
and description of functional capabilities. They supplement the MDS to
provide a standardized comprehensive assessment as is required by the
Omnibus Budget Reconciliation Act of 1987 (OBRA '87).
Comment: A few commenters suggested that we collaborate with the
Department's Agency for Health Care Policy and Research and the
industry to make the RAPs more germane to current industry practice,
knowledge, and standards. One commenter wanted us to provide actual
assessment tools and decision trees. A State provider association
recommended that the RAI contain fewer RAPs, and furthermore, that we
encourage facilities to develop their own triggers consistent with
their care planning system.
Response: We collaborated extensively with the industry in
developing the original 18 RAPs. The Department's Agency for Health
Care Policy and Research was not yet in existence when we developed the
original RAPs. In revising the RAPs, we will seek the input of
interested and affected parties. Regarding the comment to develop
assessment tools and decision trees, it would be difficult for us to
develop decision trees that cover all possible scenarios. We do not
wish to require such a methodology for completing the RAPs, as it would
limit the flexibility of facilities. Most providers have tended to
request that we develop more RAPs, rather than fewer. We have an
ongoing process for developing new RAPs by clinical experts and
validating the RAPs through testing. Also, we will review the content
of the current RAPs to ensure that they contain information pertinent
to the changing nursing home population. We do not anticipate issuing
changes to the RAPs more frequently than once a year. States may, with
our approval, revise their instruments as frequently as they deem
necessary.
Triggers are risk factors or strengths that are indicative of a
need for additional assessment. They do not automatically flag all
problems worthy of care planning. The original triggers were developed
using an expert consensus process and have been empirically validated.
As such, it is inappropriate to suggest that a facility identify its
own triggers based on its care planning system. A facility may
choose to add triggers, but must use at least the triggers
identified in the State RAI. Facility staff may choose to assess
residents using the RAPs even if the RAPs are not triggered.
Comment: Commenters suggested that we emphasize that the RAP
process is not limited to the completion of the RAP Summary form. It
includes the need to understand why the resident's condition triggered
the RAP. One commenter also recommended that the RAI Training Manual
contain a set of examples concerning how to use the information in the
RAPs as part of the assessment process.
Response: We agree that the RAP process is not merely filling out
the RAP Summary form, but is an important link between gathering
assessment information and developing the appropriate care plan. In
April 1992, we issued guidance to our regional offices and the States
regarding the RAP process and other policy issues. We also shared this
information with provider and consumer organizations. We have revised
the RAI User's Manual to include this guidance and more specific
instructions and examples, including RAP documentation and linkages to
care planning. In October 1995, we distributed to States and
associations ``train the trainer'' materials that included special
course content for RAI surveyors and trainers. This included
instructions on using the RAPs.
Comment: A commenter urged that we structure the RAPs so that they
identify resident problems, complicating conditions and risk factors.
The individual stated that some RAPs are currently in this structure
and that this would make the RAPs easier to use.
Response: We believe that all RAPs presently contain this
information. However, we are open to reviewing the RAPs to ensure that
their format is consistent as a part of our ongoing RAP review and
refinement process that we began in 1995.
Comments on the Development of a Computerized National Data Base of
Assessment Information
Comment: Generally, commenters that supported the proposed
requirement to computerize the MDS included State governments and
national and State provider organizations. One State expressed the
belief that computerization should be optional; they thought that
States should determine when and whether participation is feasible
given the States' prevailing conditions.
Response: We intend to implement a Federal process for assuring and
improving quality in this country's nursing homes which relies on
resident-level MDS assessment data reported by nursing homes
participating in Medicaid and Medicare. Furthermore, our intention is
to improve the Federal long term care survey process by using
information derived from MDS data to identify potential quality
problems within nursing facilities. The goals of this approach are
twofold: to improve care received by beneficiaries by enhancing the
timeliness and effectiveness of facility monitoring; and to better
utilize survey agency resources by targeting potential problem
facilities and by focusing onsite survey activities on specific problem
areas within a facility.
We view the collection of MDS data and its use within a
standardized survey process, as defined under our State
[[Page 67179]]
Operations Manual, as consistent with our current practices. Under
the present survey process, the facility must submit specific
information to the State survey agency, including data on resident
census, facility staffing and ownership status. These facility-specific
data, along with other information gathered by the survey team (for
example, facility deficiency information) are currently maintained both
at State agencies and within a national data base maintained by HCFA.
In addition, survey teams review residents' clinical records and other
resident-specific information. The submission and use of MDS data
within the context of facility regulation is entirely consistent with
existing practices and our obligation to collect the information
necessary to ensure the quality of care provided to residents of
Medicare and Medicaid certified long term care facilities.
Automated data collection is essential to meaningful analysis of
the volume of data collected. The MDS data system would allow us to
expand our existing system for gathering data related to quality, and
provide us with objective and detailed measures of the health status
and care outcomes for residents of a facility. Coupled with facility
characteristic and deficiency history data, we expect the MDS system
will be more reliable and effective in supporting early identification
of potential care problems and directing the survey process towards
these identified problem areas.
In their roles as our agents for conducting regulatory survey and
quality assurance activities, States will be required to process and
analyze MDS data reported by facilities to meet the objectives stated
above. MDS information collected by States will also be used to
construct a national repository of MDS assessments. The national data
base will be used to serve numerous functions: to study and refine the
quality measures used to direct survey activities of State agencies
(for example, to enhance the ability of these indicators to support
survey targeting); to understand the characteristics of the nation's
nursing home residents and the services they receive; to measure the
impact of regulation and assist in the formulation of national health
care policy; and to provide researchers with information needed to
evaluate the outcomes of various types of care and to improve standards
of clinical practice.
Our authority to require computerization of MDS information is
based on our general authority to set health and safety standards for
providers under sections 1819(f)(1) and 1919(f)(1) of the Act. We will
use the computerized data to establish standards, evaluate a facility's
compliance with these standards, and review the standards'
effectiveness and their continued appropriateness. For example,
analysis of MDS assessments within a national repository might indicate
an increase in the number of residents suffering from depression. We
may then develop standards to assist facility staff in detecting and
treating the disease. Such a standard could then be evaluated and its
effectiveness assessed by a process of continually re-analyzing the MDS
data base for changes in the prevalence of this characteristic over
time.
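The kind of re-analysis described above can be illustrated with a brief sketch. This is not an actual HCFA system or MDS item coding; the record layout, field names, and toy data below are hypothetical, intended only to show how a repository of assessments might be re-analyzed for changes in the prevalence of a condition over time:

```python
# Illustrative sketch: re-analyzing an assessment repository for changes in
# the prevalence of a condition (here, depression) over time. The record
# layout and field names are hypothetical, not actual MDS item codes.
from collections import defaultdict

# (assessment_year, resident_id, depression_indicated) -- toy records
assessments = [
    (1996, "R1", True), (1996, "R2", False), (1996, "R3", False),
    (1997, "R1", True), (1997, "R2", True), (1997, "R4", False),
]

def prevalence_by_year(records):
    """Return {year: share of assessed residents with the condition}."""
    flagged = defaultdict(set)
    assessed = defaultdict(set)
    for year, resident, indicated in records:
        assessed[year].add(resident)
        if indicated:
            flagged[year].add(resident)
    return {y: len(flagged[y]) / len(assessed[y]) for y in assessed}

trend = prevalence_by_year(assessments)
# Comparing years in the trend shows whether prevalence is rising or falling.
```

A rising prevalence in such a trend is the sort of signal that could prompt development of a new standard, whose effect would then be assessed by repeating the same analysis in later years.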
Computerization of RAI data is also consistent with our authority
under sections 1819(h)(3) and 1919(h)(3) of the Act to perform, review
and validate facility surveys. As is discussed above, we intend to
revise the survey process to utilize computerized assessment data. The
new process will be an information-based approach, oriented around
quality measures derived from computerized MDS data, as well as other
sources of information. Furthermore, sections 1819(g)(2)(A)(I) and
1919(g)(2)(A)(I) of the Act mandate that we subject facilities to a
``standard'' survey. The availability of computerized assessment data
will improve our ability to make the survey process more standardized
and consistently implemented across the States.
Currently, part of the standard survey includes an assessment of
the status of a sample of residents over time to determine whether the
facility has assisted the residents to attain or maintain their highest
practicable level of well-being. Computerized assessment data will be
instrumental in that it will allow a complete monitoring of
characteristics of ``all'' residents, including changes in their
functional status over time. Furthermore, under the current survey
process, we can only determine changes in resident status and a
facility's relative success in maintaining resident well-being cross-
sectionally during an annual onsite survey. MDS computerization, on the
other hand, provides the ability to monitor resident functional status
and other characteristics through a longitudinal process of continuous
measurement.
These uses of computerized RAI data also provide justification for
requiring computerization under our overall program supervision
responsibilities and general rulemaking authority under section 1102 of
the Act, to the extent that the information will be used for general
monitoring of care and beneficiary needs. This computerized information
will ensure that program standards set forth in sections 1819(b) and
1919(b) of the Act are met, that the program is being properly
administered, and that beneficiaries are being served, as contemplated
generally by the Act. We address elsewhere the further uses of the data
for monitoring the Medicaid and Medicare programs.
In addition to the authority cited above, to the extent that the
RAI data are collected solely for Medicaid purposes, section 1906(a)(6)
of the Act requires State agencies to make reports as required by the
Secretary. As discussed above, the RAI data are essential for the
Secretary's evaluation and monitoring responsibilities under the Act.
We disagree with suggestions to offer States a choice to
participate in the proposed national MDS data base for a number of
reasons. First, the processes being regulated by Federal authority via
State agencies (healthcare delivery and associated standards of care)
do not have varying criteria from one State to the next. In other
words, standards of medical care and health service delivery do not
vary across States; health standards in New York are the same as those
in Alaska. This commonality has reasonably led to the formulation of
one national set of regulations to evaluate provider performance with
respect to these common health standards. It is our belief that this
standard regulatory approach is in the best interest of the nation's
healthcare consumers, both in ensuring consistent delivery of services
across States and in meeting the healthcare industry's reasonable
expectation to operate under a single set of rules and requirements.
Thus, as this standard approach to facility regulation
evolves over time, with its specific objectives for continuous
improvement and refinement, it is appropriate for us to require our
agents (in other words, States) to adopt the standard processes and
mechanisms required to consistently implement these new approaches.
Specifically, allowing States to choose not to adopt a standard
system for MDS information will adversely affect our ability to meet
the objectives for these data. The following goals cannot be met
without consistent implementation of the MDS system and process
standards across all States:
The ability to construct a modern regulatory model that
provides a reliable and objective means of measuring facility
performance. MDS information gathered and maintained by a standard
system in each State provides an information structure capable of
providing this
[[Page 67180]]
alternate approach to measuring quality and creates the foundation for
an information-based regulatory model. The ability to successfully
implement such an approach is directly tied to process standardization
across States.
If States are allowed to choose not to operate this
standard system, then we would not be capable of developing and
implementing a facility targeting system or information-based survey
process consistently across States; thus, at best, an environment would
exist in which facilities in one State would be subject to different
quality monitoring and survey approaches than facilities in a
neighboring State.
Our ability to build a centralized national repository to
support our various objectives with respect to quality monitoring,
policy, program and regulatory development and evaluation, and to
facilitate healthcare research, is dependent on our ability to receive
reliable and timely MDS information from each State. Without a
standardized MDS collection system in each State, the development of
this MDS repository will be severely limited if not entirely impossible
due to the prohibitive costs associated with interacting with varying
system implementations in each State. Furthermore, without full
participation of each State in this program, the general
representativeness and usefulness of the information in the data base
will likely be skewed or biased, depending on which States choose to
participate. This would affect the validity of the information and
could seriously limit its application for health resource planning and
research of value to State and Federal governments, providers and
consumers.
Finally, States will play a critical role in informing
consumers in that States will make aggregate MDS information available.
This information will allow potential residents or their family members
to select a facility that may best suit their needs. Without a standard
approach and system for developing these public information resources,
consumers and advocacy groups will not have reliable, consistent and
comparable information on healthcare providers across States.
Comment: In commenting on what specific uses States would have for
the computerized data, commenters discussed using the data in the
nursing home survey process. One State believed that the data would
assist State survey agencies to focus the survey process and set norms.
A consumer advocacy organization pointed out that, based on the
strengths and weaknesses of a facility, a State could individualize the
composition of the survey team sent to evaluate the facility's
regulatory compliance. In other words, the number and type of surveyors
sent onsite would be based on the types of potential care problems
identified at the facility. For example, if a facility had a high
prevalence of antipsychotic drug use, the survey team would include a
pharmacist. This approach has the dual benefits of maximizing limited
survey agency resources by better targeting them against the most
likely problem areas, and of minimizing the general invasiveness of
the survey process within the facility by focusing the process on key
problem areas.
The MDS data set provides objective and consistent measures of a
number of facility care and outcome parameters. By comparing individual
providers to ``gold standards'' and other peer group-based norms on
each of these parameters, States can identify high and low facility
performance outliers on measures associated with the quality of care
and the quality of life for residents of these facilities. Commenters
also suggested that the data could be used to replace some resident-
level information currently collected during the survey process on a
form called the Resident Census and Conditions of Residents (HCFA-672),
as well as other reporting forms for State and Federal needs, which
would reduce facility burden. The MDS assessment contains detailed
resident characteristics that can be used to eliminate all other forms
and resident-level data collected by facilities to meet State and
Federal requirements. One commenter, however, believed that the
information should not be collected by the survey agency and used for
investigations or enforcement.
Response: As described in prior responses, MDS data will assist
State survey agencies in a plethora of ways to achieve greater
efficiencies in monitoring quality of care and ensuring the highest
levels of quality of care and quality of life for residents of nursing
facilities. These examples include:
Problem identification: the capability to reliably target
areas for investigation of potential resident care problems prior to
and in support of the onsite survey process;
Survey targeting and scheduling: the ability to determine
survey frequency and scope based on specific indicators of potential
care problems;
Tailoring survey team composition to specific problem
potentials in facilities to most efficiently use limited staff and
resources. It is important to note that States currently are not
prohibited from considering nursing home characteristics when
determining survey team composition, provided that the team includes a
registered nurse. The State Operations Manual notes in section 2801
that to the extent practical, the team's composition should reflect the
type of facility surveyed; and
Conducting cost/benefit analysis of care approaches, based
on resident outcome data adjusted for case-mix classification
categories. This is consistent with the Department's medical treatment
effectiveness initiative.
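The peer-comparison approach described earlier, identifying facilities that are high or low performance outliers relative to peer group norms, can be sketched in miniature. The measure (antipsychotic drug use prevalence), the facility values, and the standard-deviation threshold below are all hypothetical illustrations, not actual HCFA criteria:

```python
# Hedged sketch of peer-group outlier identification: flag facilities whose
# rate on a quality measure falls far outside the peer-group norm. The
# measure values and the 1.5-standard-deviation threshold are hypothetical.
import statistics

# facility -> prevalence of antipsychotic drug use (toy data)
peer_group = {
    "Facility A": 0.18, "Facility B": 0.21, "Facility C": 0.19,
    "Facility D": 0.22, "Facility E": 0.45,
}

def outliers(rates, n_sd=1.5):
    """Return facilities more than n_sd standard deviations from the peer mean."""
    mean = statistics.mean(rates.values())
    sd = statistics.stdev(rates.values())
    return sorted(f for f, r in rates.items() if abs(r - mean) > n_sd * sd)

flagged = outliers(peer_group)
```

In this toy peer group, only the facility with markedly elevated prevalence is flagged, which is the sort of signal that might lead a State to include a pharmacist on the survey team sent to that facility.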
Many software programs currently used by nursing homes to enter MDS
information already have the capacity to generate timely resident
census information such as that found on the HCFA form 672. Several
States are also developing systems to facilitate this activity for
providers. The availability of the MDS standard system will provide
significantly more detail on resident characteristics than general
census information, and will make use of definitions that have been
clinically developed and refined to maximize both reliability and
validity. As such, the MDS will create a whole new model for
understanding and communicating about resident characteristics. This
new model will far outperform the limited view of residents that can be
derived from information from current sources, such as the Form 672.
We further believe that automated resident status information has
much more potential to further decrease the amount of paperwork
associated with the survey process. We have recently completed an
evaluation of the survey process and intend to make ongoing refinements
to incorporate new technologies and increase the efficiency of the
survey process.
Comment: One commenter stated that using computerized data to
target surveys would resemble a ``big brother'' environment and not one
conducive to accurate assessments for fear of investigations based upon
minimal data.
Response: We disagree that using assessment data to target the
survey process would resemble a ``big brother'' environment or that
fear of investigations would affect the accuracy of assessments.
Facilities already submit significant resident-level information to
support both survey agency functions and for claims processing under
Medicaid and Medicare. These data have been collected for years without
the adverse effects suggested by a ``big brother'' analogy; instead, to
the extent that facility information has been made public (for example,
release of survey and complaint results and findings), this release has
served to provide valuable information to those interested in promoting
the quality of life in nursing
[[Page 67181]]
facilities. There is no reason to believe that collection and analysis
of MDS information will not similarly be used in the interests of the
general public with respect to their right to know the quality of
healthcare services delivered by Medicaid and Medicare providers.
The MDS simply provides a better, more powerful mechanism than is
currently available to observe and report resident condition or to
monitor facility quality and safeguard the rights of residents of these
facilities. The MDS is a tool for measuring healthcare facility
performance, which also creates a foundation for improving the
effectiveness of regulatory agencies as well as their operational
efficiency.
Having a standard MDS repository available within State and Federal
agencies provides a rich information resource to serve many objectives:
it will provide access to reliable information and standard measures of
resident characteristics for the many groups interested in improving
care and quality of life in nursing homes, including consumer advocates
and researchers; and, the MDS will support many other programs within
States including providing the basis for Medicaid payments as is
currently in effect in a number of States.
Clearly, the availability of MDS information within standardized
Federal systems maintained by States directly benefits the general
public as consumers of healthcare services and generally enhances the
public knowledge of the quality of these services.
Furthermore, we expect that the MDS repository will enable HCFA or
its State agent, or both, to provide facilities with analytic reports
based on aggregated resident characteristics. This is consistent with a
quality improvement model, as it allows facilities to compare
themselves to other homes that are similar in terms of size and
resident demographics. This directly supports facilities as they seek
to develop their own in-house quality assurance programs. Ultimately,
facilities may use the data to analyze their allocation of resources,
demonstrate efficiencies in caring for certain types of residents, and,
in turn, negotiate with managed care organizations for admission of
those types of residents.
We recognize that information contained within the MDS assessment
is sensitive and must be safeguarded, and that protecting the privacy
of residents is essential. In establishing a system of records for
storage of MDS data, both HCFA and the States (as HCFA's contractors in
performing survey functions) must comply with the Privacy Act, which
applies to Federal systems of records containing individually-
identifiable information. While we can make public aggregate summaries
of the data, there are strict Federal guidelines for the release of
individually-identifiable information by Federal agencies to any
individual or organization. We can only release individually-
identifiable information if a disclosure provision exists in the
Privacy Act System of Records that is published in the Federal
Register. We review requests on an individual basis, according to the
provisions of the Privacy Act. Refer to the more detailed discussion
later in this preamble concerning protection of privacy.
In summary, it is clear that the availability of structured
analyses derived from MDS information will empower those working with a
variety of approaches to improve the lives of residents of nursing
homes. Whereas the ``big brother'' term suggests a scenario in which the
interests of the individual are sacrificed to promote the interests of
the State, this is clearly not the case with respect to the objectives
for MDS information. Instead, MDS information will be used to directly
support the interests of individual nursing home residents by
substantially enhancing our understanding of healthcare delivery in
nursing homes and by creating a standard framework for monitoring the
quality of this care.
Comment: A few commenters noted that computerized assessment data
would support a case-mix reimbursement system, and that it would be
helpful to be able to compare facilities with similar case-mix levels.
Response: Our Office of Research and Demonstrations began the
Nursing Home Case-mix and Quality Demonstration in 1989. One goal of
the demonstration is to design, implement and evaluate a nursing home
payment and quality monitoring system for Medicare skilled nursing
facilities based on resident-level information contained in an expanded
set of MDS data. States participating in the demonstration are also
using MDS data to calculate reimbursement under Medicaid. Computerized
information from the demonstration's data base will provide information
on outcomes and processes of care, stratified by case-mix and other
characteristics in the six participating States. This will also provide
a mechanism by which to evaluate the effect of reimbursement on quality
issues.
Several other Medicaid agencies in States not participating in the
demonstration have chosen to independently implement an MDS-based case-
mix system for setting payment rates for facilities and for determining
coverage. Numerous other States are currently studying moving toward a
case-mix payment system based on the MDS. Furthermore, States have
identified a plethora of other functions to be supported by information
contained on the MDS assessment form; these functions include:
utilization review, service placement, and improvement in the States'
ability to monitor and evaluate the cost-effectiveness and quality of
care and services provided under the Medicaid program.
At least two States have already incorporated, or plan to
incorporate, MDS information into their Medicaid management information
system. West Virginia notes that to do so will allow the State to fine-
tune its long term care rate setting and payment methodology. West
Virginia integrated its stand-alone long term care payment process into
the Medicaid management information system. The system captures monthly
data to calculate the resident-specific case-mix index. An electronic
billing system was implemented through the Medicaid management
information system, which calculates the base rate reimbursement for
all Medicaid beneficiaries, as well as the additional payment due based
on the case-mix acuity determined from an expanded set of MDS data. The
MDS reporting system not only enables the Medicaid agency to conduct
utilization review, but also allows the survey agency to use the
reports for quality of care issues.
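A case-mix payment of the general kind described above, a base rate plus an additional payment scaled by a resident-specific case-mix index, can be sketched briefly. The dollar amounts, the component split, and the index values below are hypothetical illustrations, not West Virginia's actual rates or methodology:

```python
# Illustrative case-mix-adjusted per diem: a flat base component plus an
# add-on scaled by a resident-specific case-mix index (CMI). All figures
# here are assumed for illustration, not an actual State methodology.

BASE_RATE = 95.00          # flat per diem component, dollars (assumed)
NURSING_COMPONENT = 60.00  # portion of the rate scaled by acuity (assumed)

def per_diem(cmi: float) -> float:
    """Base rate plus a case-mix add-on proportional to the resident's CMI."""
    return round(BASE_RATE + NURSING_COMPONENT * cmi, 2)

# A resident of average acuity (CMI = 1.0) draws the standard rate;
# a higher-acuity resident draws a proportionally larger payment.
standard = per_diem(1.0)
heavy_care = per_diem(1.4)
```

Because the add-on rises with measured acuity, payment tracks the intensity of care a resident's assessed condition implies, which is the linkage between the MDS data and the billing calculation that the integrated systems described above are said to perform.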
Another State has a legislative mandate to integrate, to the fullest
extent possible, its MDS system, preadmission screening and annual
resident review system, and treatment authorization request system. The
State points out that, because it uses a composite per diem rate, the
State agency has little ability to comprehensively review and adjust
approval or reimbursement systems in order to improve the quality of
care, increase efficiency, or control costs in long term care. We
believe that integration of the MDS and the Medicaid management
information system will support the objectives of its Medicaid program,
including provision of the highest practical level of care and
management of available funds in a fiscally prudent manner to maximize
purchasing power. The State maintains that its system will provide
information to facilities, State and Federal agencies, and to the
public that will improve the quality and cost effectiveness of care
delivered in the State.
Comment: Another use of computerized resident data that
[[Page 67182]]
commenters addressed was to support policy analysis and monitoring of
trends. One State noted that the data could be used to inform and
improve general Medicare and Medicaid policies. Another State gave the
example of using the data as a tracking system for prevalence of
pressure sores, restraints, and drug therapy. A commenter stated that
data could be used by the appropriate quality monitoring personnel in
the State to increase the probability of detecting and analyzing State-
wide health care problems. Another State commenter suggested using the
data at the resident-specific level to determine an individual's needs
for assistance with activities of daily living and other required
services. The commenter also discussed analyzing aggregate information
for residents by facility.
Response: We agree that these data will benefit both the policy and
operational components of States and the Federal Government as well as
provide valuable information to the consumers of long term care
services.
Potential benefits in policy development and evaluation expected
from this information include the following:
Foremost is the added operational efficiency derived from
the MDS' ability to support a multitude of applications and
programmatic objectives. As a single form designed to capture a
comprehensive view of residents and related facility care practices,
when submitted within the context of a standardized data management
system, it greatly reduces the operational costs of data gathering as
compared to current program requirements involving multiple forms and
submissions from the facility. For example, many States receive three
different categories of resident information from facilities, each
requiring separate forms and submission rules: placement determination
forms (for example, preadmission screening and resident review),
payment-oriented clinical information to support case-mix adjustment
(for example, Minnesota's or West Virginia's case-mix assessments), and
survey-oriented forms describing resident characteristics. With the
breadth of data collected on the MDS, the requirements in each of these
examples can easily be met via a single submission of MDS data; thus,
the operational overhead and associated costs for both facility and
State are reduced.
At the national level, policy decision-making,
development, and evaluation are supported through the creation of a
standard means to analyze State differences in the quality of services
and resident care outcomes in the nation's 17,000 certified long term
care facilities.
By deriving both payment and quality functions from a
single instrument, a framework is developed to closely monitor the
relationship between payment and corresponding service delivery, and to
provide an objective basis upon which incentives to promote and reward
outstanding care patterns and outcomes can be built.
With respect to support for survey agency operations, creation of a
standardized MDS repository in State agencies provides the framework
for the development of an information-driven survey process by which
the frequency and scope of facility review are based on objective
measures of a facility's performance in comparison to established
standards. This information-based survey concept and its benefits are
discussed in prior sections of this regulation.
Comment: We also received other suggestions and examples of ways
that States are currently using computerized MDS data. A few States
indicated that they are using or could use the data for resident review
requirements under the preadmission screening and resident review
program (PASRR). Other ideas included:
Relating to research support, MDS information will support
both basic clinical research activities as well as practical
applications such as identifying issues for ``best practice''
conferences.
Using the resident data to identify strengths of each
facility, staffing patterns, common diagnoses, and resident
characteristics (suggested by a professional organization).
Using the data for health planning related to long term
care services, certificate of need decision-support, projecting nursing
home bed need, and determining characteristics and care needs of
current residents.
Identifying industry and surveyor training needs with
respect to changing demographics and industry structural delivery
mechanisms (for example, as service delivery blends across multiple
traditional care settings).
One State commenter expressed the belief that the paperwork burden
in that State would be reduced by having MDS data available for a
variety of purposes.
Response: We agree that potential benefits exist for all of the
above listed uses of automated assessment information. A standardized
system for MDS data collection and analysis that we will be providing
to States will facilitate States' and facilities' ability to make use
of these data by creating an infrastructure for managing, analyzing and
distributing information to meet these varying program objectives.
Comment: A commenter did not think that a facility could determine
staffing patterns from the MDS data set and believed this would negate
its usefulness in determining differential rates of payment.
Response: The commenter is partially correct, in that the RAI does
not explicitly collect information on staffing. However, staffing
standards, staffing mix, and minimum staffing requirements are already
well understood with respect to the intensity of care required for a
given resident and his or her clinical characteristics. There are, in
fact, several commercially available systems that currently use MDS
data and derived resident characteristics information to assist
facility administrators in setting appropriate staffing levels
according to the mix of resident care requirements in their facility.
Furthermore, with respect to State payment and rate setting, States
that currently use an MDS-based case-mix payment approach have adopted
the resource utilization group methodology for the payment
determination. This methodology is based on resource groupings that are
created through time studies of facility staff as they carry out their
daily care tasks. These time study data are then linked to
corresponding resident characteristics data to determine levels of care
resource utilization (staff time, supplies, etc.) for given sets of
care needs.
Thus, in this approach, staff requirements are implicit in the
determination of each distinct care grouping, each of which is then
associated with a specific reimbursement rate. Residents with complex
care characteristics fall into a higher reimbursement group which
directly reflects the additional staff resources required to care for
that resident. In more sophisticated States, these models have been
extended to allow for staffing pay rate differences across various
regions within the State (for example, urban vs. rural staff pay
differentials).
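The case-mix idea described above can be sketched in a few lines: each
resident's characteristics map to a resource group carrying its own rate,
optionally adjusted for regional staff pay differentials. The group names,
rates, classification rules, and wage indices below are illustrative
assumptions, not actual resource utilization group definitions.

```python
# Hypothetical case-mix sketch: complex-care residents fall into
# higher-rate groups; a regional index models urban/rural pay
# differentials. All values are illustrative assumptions.

GROUP_RATES = {                 # assumed daily rates per resource group
    "extensive_services": 210.0,
    "clinically_complex": 140.0,
    "reduced_physical": 95.0,
}
REGION_INDEX = {"urban": 1.10, "rural": 0.95}   # assumed pay adjustments

def classify(resident):
    """Assign a resident to a resource group (illustrative rules only)."""
    if resident.get("iv_medication") or resident.get("ventilator"):
        return "extensive_services"
    if resident.get("diagnosis_count", 0) >= 3:
        return "clinically_complex"
    return "reduced_physical"

def daily_rate(resident, region):
    """Reimbursement rate: group rate times the regional adjustment."""
    return GROUP_RATES[classify(resident)] * REGION_INDEX[region]
```

For example, a ventilator-dependent resident in a rural facility would be
reimbursed at 210.0 x 0.95 = 199.50 per day under these assumed values.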
The current MDS 2.0 assessment form includes calculations for
several of the most common variations of resource utilization group
scoring in the standard specification for MDS data. Therefore,
States that do not currently use case-mix-based reimbursement will
still have an implicit and proven method of measuring the relative care
and staffing requirements of residents
[[Page 67183]]
according to widely accepted norms for such comparisons.
Again, the ability to support this functionality is created by the
deployment of the standardized system for managing State MDS data
repositories, upon which such resource utilization groups-oriented
analyses will be derived.
Comment: We requested public comment on whether to collect a sample
or 100 percent of MDS data. Of those who commented, most believed it
would be preferable to collect 100 percent of facility data. One State
thought that collecting only a sample of data would not produce the
necessary level of detail required for a multipurpose data base system.
The commenter further stated that operational activities generally
focus on specific individuals, which would usually require information
on all residents from all facilities. Another noted that 100 percent
would be advantageous for rate setting and quality assurance,
recognizing that the intended use of the data influences the collection
requirements. The commenter said that an aggregate of 100 percent of
facility data would serve well for the Federal level data set. A third
State believed that having facilities submit data for all residents
would make the State survey agency's sampling procedure in the long
term care survey process more effective, as well as result in a
comprehensive national data base. One State thought that sampled data
would be disadvantageous in that it would provide incomplete or
inaccurate representation and would be influenced by factors such as
population density.
Those opposing collection of 100 percent of the data listed the
associated cost, the size of the data base, and the man hours involved
in collecting and maintaining the data. Proponents of collecting a
sample of facility data noted that current survey protocols determine
compliance with State and Federal requirements based on a sample, and
that the MDS data required for submission should be no different. A
national provider organization said that collecting 100 percent of the
data would not meet the underlying intent of the law pertaining to the
implementation of comprehensive assessments, the resultant care plans,
and improved quality of care. A national provider organization believed
that if 100 percent of the data is collected at the facility level, the
State should send us a stratified sample on a quarterly basis, while if
a sample is gathered at the facility level, the State should send us
the entire sample on a quarterly basis.
Response: There are many drawbacks associated with sampling. An
incomplete representation or smaller number of records would make
estimates of trends more difficult. Problems with resampling would
prevent the development of longitudinal measures. Such problems
include:
The retention of any bias in the initial sample that would
increase over time and would affect the reliability of the data.
The unequal burden on facilities in the sample to correct
errors, respond to inquiries and provide data.
The need to develop complex instructions that would direct
facilities how to replenish the sample when subjects drop out. We would
require other instructions to handle changes of ownership in
facilities, facilities that leave the Medicare and Medicaid programs,
and facilities that go out of business.
In short, it would be difficult and expensive to construct and
maintain a statistically significant sample of residents for whom we
would require a facility to transmit its MDS records to the State.
Furthermore, since the facility must obtain the information
required by the MDS on each resident for clinical care planning, and
given that most facilities today have already automated this process,
the added requirement of submission of data adds comparatively little
overhead and associated costs to this process. Certainly, there is some
fixed cost associated with developing and supporting transmission of a
single resident's record to the State, but the marginal cost of
transmitting all residents' records is negligible. Therefore, there is
no cost saving to the facility to transmit MDS assessments for a sample
versus the entire population of residents.
We agree with the comments that support requiring facilities to
transmit 100 percent of all required MDS assessments. We are requiring
that a facility submit all initial, annual, and quarterly reviews, as
well as partial assessments completed upon discharge, transfer, death,
or reentry to the facility, for all residents, and that a State submit
those assessments to us.
Generally, selection of a statistically representative sample of
MDS assessments adds another complicated, costly and unnecessary layer
to producing useful, valid data that can be used to inform States,
nursing homes, and us about the quality of care and the status of
residents in nursing homes.
One hundred percent of the data is necessary for the following
reasons:
It is necessary for longitudinal tracking of residents
across time and facility admissions. This will allow us to track
special subpopulations of residents such as those with pressure sores
or Alzheimer's disease. It will allow the detection of certain trends,
such as characteristics of new admissions to nursing facilities, and it
will allow the detection of rare but significant events, such as
hospitalizations for pneumonia, fractures or other conditions.
The universe of data is also necessary to link to facility
level data bases, such as ASPEN deficiency data in State agencies and
the Online Survey, Certification and Reporting System, and to link to
Medicare and Medicaid claims files at the national or State level to
determine patterns of utilization and resource use pre- and post-
admission to nursing homes, and to determine resource utilization in
nursing homes.
It allows for targeting individual and aggregate resident
outcomes for use in an information-driven survey process that would be
impossible without a universal data base.
The universe of MDS assessments makes possible the analysis of data
at any level (for example, resident, unit within a facility, facility,
State, regional, national, or for specific resident populations). An
incomplete representation or smaller number of MDS assessments, as well
as issues associated with resampling that were mentioned above, would
limit trend analyses.
Working with the universal population of resident assessments will
eliminate the technical difficulty and expense of selecting and
maintaining a representative sample of the size necessary to support
longitudinal analyses. Creating and implementing a complex
sampling process would be burdensome to facilities and States, and the
burden could fall unequally on selected States or selected facilities.
If facilities were required to perform sampling, there would be
additional cost to upgrade their software and training for this
capability. Additionally, some sampling methodologies would require
complicated survey analyses to adjust for the sampling design, which
would also be expensive.
In conclusion, the marginal additional cost of obtaining the full
universe of assessments will, in fact, be exceeded by the cost and
difficulty of maintaining a representative sample of assessments large
enough to provide the necessary information for all the uses proposed
for the data base.
[[Page 67184]]
Comment: A State suggested submitting 100 percent of data to the
State, which would then submit only a sample to us. The State contended
that it needed the most complete data set possible. The State also
noted that its data base would be manageable and would not warrant
sampling.
Response: We disagree with the concept of sending a sample of data
to us. Our regional offices have many of the same needs as State survey
agencies for 100 percent of resident-level data for certified long term
care facilities within their States as one method to target and conduct
Federal monitoring surveys in nursing homes. Furthermore, we need 100
percent of the data to develop and refine quality measures, which will
be an integral part of the data-driven survey process.
All the factors enumerated in the previous response regarding the
negative aspects of sampled submissions between facilities and States
apply equally to the submission between States and the national data
base: there is no advantage in terms of cost saving by using a sampling
approach as it is no more costly or complex to transmit assessments for
the full population. In fact, managing sampled data sets is actually
more costly; and, the ability to meet the objectives for these data at
the national level in terms of support for policy decision-making,
development and evaluation, as well as for support for research
initiatives, requires access to a complete population-based repository
of assessments.
Comment: Commenters discussed whether a national data base would
provide useful information to States for making comparisons for
management, performance measurement, and research purposes. Of those
who addressed this, all agreed that such information would be valuable.
One State said that it would be helpful for them to be able to compare
their State with others regarding length of stay for residents with
certain diagnoses and for utilization rates of special treatments and
procedures.
Response: As discussed previously, we agree that there are many
useful purposes for information from this proposed national MDS data
base. One example of this submitted by a commenter is that the data
base could provide information for interstate comparisons of resident
lengths of stay according to diagnoses or outcomes.
Fundamentally, the MDS data, represented within the context of a
standardized information system, provides the foundation for organizing
complex clinical and facility information in ways that can be easily
generalized to support numerous current and future objectives at the
facility, State, and Federal levels. It provides a common framework for
communicating about resident clinical characteristics, care outcomes,
and quality, as well as facilities' service delivery and quality. Many
of these specific objectives have been identified throughout this
regulation.
Finally, the RAI has been translated into at least seven languages
and is being used in several European and Asian countries for care
planning to improve clinical care and for research purposes. The
international development of comparable data sets would facilitate
performance of cross-national research studies to examine the effects
of differences in care patterns on long term care resident outcomes.
These studies may provide a great deal of information on the geriatric
long term care population across all countries.
Comment: Of those who addressed how data should flow, the majority
of commenters, including a national provider organization, stated that
data should flow through the States to us. Some expressed the belief
that States should also maintain their own data base. One commenter
recommended that data be transmitted to us by the States on an annual
basis. A few commenters believed that the States should send summary
information to us. One commenter said that initially, facilities will
need a great deal of technical assistance, and it would be easiest for
that to come from the States. A national provider organization wanted
States mandated to devise methods for disseminating computerization
information to facilities and for providing technical assistance. One
State noted, however, that States should not be required to collect and
store information, if there are no expectations about how the data will
be used.
Response: We agree that States should have the responsibility to
provide some level of general and technical assistance to facilities as
it relates to our and States' requirements for encoding and transmission
of MDS data. We understand that States have varying levels of
experience with the use of computerized information systems and data
bases. However, several States have already established an MDS data
base for case-mix, quality assurance or survey and certification
purposes, or both, and have provided necessary training and assistance
to facilities which enabled them to successfully implement automated
systems.
We have established technical and user groups as part of the
systems design process. These groups consist of States, provider and
consumer representatives and experts in systems design. Their expertise
and knowledge will be used to facilitate provider and State automation.
We will also work with States to ensure that personnel have the
necessary technical expertise and training to fulfill State automation
responsibilities. Also, system specifications and other relevant
materials are already available via an internet web-site, initially
established to support MDS software vendors, and otherwise available
from HCFA.
The pilot testing of the MDS standard system and associated
procedures is another step we are currently undertaking to ensure that
all aspects of this standard MDS system are fully understood with
respect to technical operational requirements, State and facility user
support needs, and general issues associated with deployment and system
acceptance. Information from this test phase will directly support our
ability to assist States in successfully installing and operating this
system and ensure that facilities can easily accomplish their
assessment submission requirements.
We fully appreciate the magnitude of support and effort that will
be necessary to ensure that appropriate training is developed and
disseminated to all who will be involved in implementing this data
base, and are in the process of developing additional procedures and
communication strategies to address this need.
Finally, a central requirement for the MDS standard system design
is to ensure that maximum attention is given to understanding and
assessing current technologies employed by facilities and States so
that the MDS system will best integrate and accommodate these existing
systems. We intend that this will both facilitate system acceptance
across all user levels, and minimize support and other implementation
costs. Also, we will emphasize technologies that lend themselves to
ease of use and user-friendliness in the selection process for each
level of the standard systems, but especially as this relates to
systems used by facilities to submit MDS assessments to their State
agency. In addition, one of the implicit benefits of the decision to develop a
standard system for MDS data management is that this provides the
greatest ability to centralize support efforts, and also reduces costs
for multi-state facility chains and software vendors by reducing the
variation of systems with which they will interface, in that they need
only support access to a single standard system across States.
[[Page 67185]]
Comment: A few commenters thought that the data should be sent
directly to us without being sent to the State first. One said that it
would be costly and duplicative if States maintained their own data
base. One State agreed that State data bases would be duplicative and
suggested that States have access to a HCFA data base through the
Online Survey, Certification and Reporting System. The State commenter
noted that this could be difficult for States that have adopted an
alternate Resident Assessment Instrument, since it would be necessary
to remove extraneous data collected by the alternate instrument. A
State put forth the idea of creating a single national entity for the
centralized collection of MDS data. The commenter suggested that States
could then arrange for periodic digital communications with the entity,
believing that this method would be more efficient than each State
having to develop the capacity to receive facility data.
Response: We support having each facility initially submit 100
percent of the MDS data to the State. This would enable States to
maintain a data base for use in Medicare and Medicaid activities that
are primarily State responsibilities: quality assurance, longitudinal
tracking of care outcomes for survey, certification and licensing, and
in some States, case-mix reimbursement classification systems. Several
States are already using computerized MDS information for this purpose,
having decided that the derived benefits outweigh the costs of
establishing and maintaining such a system. Our experience has been
that States realize even more programmatic uses for the data once it is
available to them.
While we could develop a central mechanism for collecting
information from providers, there are significant disadvantages
associated with this approach: (1) It would impose an additional layer
between facilities and States with associated impact on timeliness and
accuracy of information; (2) With so many of the objectives for MDS
data being at the State level, direct submission of information to us
creates an unnatural information flow which will have an impact on the
ability of States to meet these objectives, especially as many of the
objectives, such as the information-based survey process, are so
dependent on timely access to MDS assessment information; (3) With the
many State-specific uses for MDS information, such as case-mix payment,
many of which require specialized elements recorded in the State-unique
S Section of the MDS, we could not possibly centralize support for
these functions or even accommodate all these variations in a central
repository; thus, direct submission to us would defeat the goal of
supporting unique State objectives; and, (4) States are in a much more
appropriate position to support their individual facilities with
respect to the MDS assessment, submission and data validation
processes.
The information provided by a State-maintained MDS data base is not
duplicative of a national data base. States vary with regard to their
demographics, licensing policies, quality assurance and reimbursement
systems. States are a logical level for maintenance of MDS information
since each State performs and must manage its own survey and regulation
processes. Information provided by MDS assessments cannot be obtained
from our Online Survey, Certification and Reporting System. The Online
Survey, Certification and Reporting System itself is not designed to
provide the quantity and specificity of the information in the proposed
MDS data base. Furthermore, a central MDS repository is necessary to
support objectives such as policy and regulation development, but would
not be as readily available for State functions as State-specific data.
Since specific functions (for example, information-based survey
process) are performed from this data base at the State level, it would
be inefficient to require States to support these functions via access
to a central repository.
We disagree that it will be significantly more expensive for States
with alternate instruments to collect MDS data. The design of every
aspect of the standard MDS system, from the record transmission format
to the State data base repository, is intended to support the
customizations required by individual States. Thus, although there will
be some additional costs during the initial system implementation in
States requiring custom formats, the system design makes these costs
insignificant. At this time, there is no State variant of the MDS that
cannot be accommodated within the context of the standard system
architecture.
With respect to transmissions between the State and national
repositories, we are requiring that only the core MDS items on the
HCFA-designated RAI be transmitted to us; the State will maintain the
State-specific elements at the State level.
Comment: A State noted that it currently collects computerized data
from only Medicaid-certified nursing facilities because the State can
reimburse them. The State asked if computer requirements apply to
Medicare-certified facilities, and whether Medicare facilities would
submit directly to us.
Response: The requirement to place the MDS in machine readable
format applies to all Medicare and Medicaid certified nursing homes.
There are no plans to have Medicare-only facilities submit MDS
information directly to us. In the impact statement, we address how
certified facilities will be reimbursed for information systems
equipment and supplies, as well as data encoding and transmission. Long
term care facilities certified to participate in Medicare are required
under section 1819(b)(3) of the Act to use the State-specified RAI. The
State's authority to collect computerized data from Medicare facilities
springs from its role as an agent for us in performing Medicare surveys
under section 1864 of the Act.
Comment: Commenters discussed auditing procedures that would ensure
the accuracy of the data entered into the national data base. Some,
including State commenters, believed that the accuracy of the data
should be verified through the survey and certification process. A
State commenter believed that it would take surveyors approximately 5
minutes to compare a resident's actual records with a computer
printout. One commenter pointed out that if the accuracy is checked
during the survey, a facility will take the assessment seriously and
the assessment would not be viewed as ``paperwork.'' Another supported
using surveyors to audit the match between a resident, his or her MDS
and a computer editing software system.
Response: We agree that auditing the accuracy of MDS data on an
ongoing basis is very important in validating the ability of the data to
support key operational and policy decisions. Indeed, the establishment
of mechanisms to ensure acceptable reliability levels is critical to
our ability to move forward with using MDS data for quality assessment
and improvement activities, as well as other programmatic purposes.
Currently, the survey process includes evaluation of the accuracy of
assessments, as required by sections 1819(g)(2)(A)(II) and
1919(g)(2)(A)(II) of the Act. Surveyors compare information from the most
recently completed RAI with the current status of a sample of residents
found onsite at the time of survey. We may modify and enhance the
methods for accomplishing this task to reflect access to more
longitudinal resident status information.
Several States, particularly those with case-mix reimbursement
systems, have a separate auditing system in which nurse reviewers
conduct an onsite assessment
[[Page 67186]]
using the MDS and compare it to that completed by the facility in order
to verify the accuracy of the facility's assessment. We have recently
completed a study of such methods and will be considering how to most
efficiently assure the quality of MDS data. The methods under
consideration could involve onsite review by surveyors or others, as
designated by us, or offsite data analysis and evaluation, or both. We
are also considering whether auditing would be carried out in
conjunction with the survey process, as well as the timing and
frequency of audits.
Comment: Commenters discussed methods for data verification. A few
commenters stated that we should not require auditing and we should
accept data as submitted. One State noted that any auditing process
will result in cost increases. Another commenter pointed out that the
data should be error free before the facility submits it. A commenter
suggested that we not require auditing unless the MDS data is used for
reimbursement purposes. A few commenters, including a national provider
organization, disagreed with the idea of double entering data as a
means of ensuring data integrity. They stated that it would be too
costly, resulting in an unnecessary expenditure of time, cost, and
effort.
Response: We strongly disagree with the comments that verifying
accuracy of the data is not necessary. Foremost, it is imperative that
the data be accurate and reliable for it to be used in any policy
making, planning or resource utilization capacity. Accurate resident
status information is necessary not only for reimbursement systems but
for health planning at the State and Federal levels. Secondly,
accurate assessments are necessary for quality care at the facility
level, given that care planning should be based on the resident's
assessment.
While data verification may be costly in the short run, we believe
that it is cost efficient in the long run, in that accurate data will
help prevent unnecessary expenditures or poor policy or reimbursement
decisions that might result from erroneous information.
Several States that have computerized MDS data bases have
encountered significant inaccuracies in the data originally received
from facilities. This problem was rectified by establishing a process
for ongoing validation of the accuracy of the data through on-line
electronic systems feedback to facilities, or other systems for
frequent cross checks and communication.
On-line data editing systems can facilitate timely detection and
correction of inaccuracies. Virtually all the States that have
computerized MDS data bases have developed built-in edit checks for
obvious inaccuracies, which disallow entry of conflicting or invalid
data; for example, a resident coded simultaneously as comatose and as
enjoying playing cards.
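The built-in edits described above come in two shapes: range edits, which
catch values outside the allowed coding set, and relational (consistency)
edits, which catch combinations of fields that cannot both be true. The
following is a minimal sketch; the field names and codes are hypothetical,
not actual MDS 2.0 items.

```python
# Minimal sketch of range and relational edit checks. Field names and
# codes are hypothetical assumptions, not actual MDS 2.0 items.

def edit_checks(assessment):
    """Return a list of edit failures for one assessment record."""
    errors = []
    # Range edit: a coded value must fall within the allowed set.
    if assessment.get("comatose") not in (0, 1):
        errors.append("comatose: value out of range")
    # Relational (consistency) edit: a comatose resident cannot also be
    # coded as participating in activities such as playing cards.
    if assessment.get("comatose") == 1 and assessment.get("activity_pursuits"):
        errors.append("comatose resident coded with activity pursuits")
    return errors
```

A record failing either kind of edit would be returned to the facility for
correction rather than entering the State data base.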
We agree that double entering data to ensure validity would be
expensive and we are not requiring it. We emphasize, however, the
important role that validation plays in the establishment of a data
base. To this end, we published standardized range and relational edits
in May 1995 that MDS data will have to pass in order to be accepted at
the State level.
Comment: Some commenters placed responsibility for the accuracy of
the data on the facility. According to a commenter, having the edit
checking process occur at the facility is critical, otherwise the State
system would quickly become overburdened with rejecting records back to
the facility for correction. One recommendation was for a facility to
have a system for visually checking MDS information prior to submitting
the data. Commenters noted that computer software can validate that the
MDS is complete and that responses are within an acceptable range, and
can also generate a condensed MDS with the responses, and staff can
compare this to the MDS to verify accuracy. A State commenter proposed
that we require a facility to maintain an accuracy rate of 95 percent
for its data to be accepted by the State. Commenters suggested that a
facility only transmit updates and changes to the data base once the
original assessment is on file. Another proposal was for us to require
a facility to incorporate surveillance and correction procedures as
part of its quality assurance program.
Response: We concur that a facility has a responsibility to submit
assessment data that is accurate, and there are many ways to accomplish
this. A facility is required by section 1919(b)(3) of the Act to
conduct a comprehensive, accurate, standardized and reproducible
assessment of each resident's functional capacity. We are adding to
Sec. 483.20(g) the facility's responsibility to accurately assess
residents, as well as Sec. 483.20(f)(3), which notes the facility's
responsibility to transmit accurate data to the State. We believe,
however, that the State also has a role in verifying the accuracy of
the data and systematically monitoring and evaluating the quality and
accuracy of the assessment data which will be submitted from
facilities.
States will monitor completeness and accuracy of MDS data
submissions from the facility. A facility will be in compliance unless
an unacceptable percentage of the records completed by the facility
during a target period are either not submitted to the State or not
accepted by the State because of data errors. We will determine
compliance based on a review of missing records for the target period,
allowing sufficient time after the close of the specified period for
relevant records to be submitted from the facility to the State.
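The compliance determination described above reduces to simple arithmetic:
the share of records completed in the target period that were either never
submitted or not accepted by the State. The 10 percent threshold below is
an assumption for illustration only; the rule does not fix a number here.

```python
# Sketch of the missing-records compliance test. The threshold value is
# an illustrative assumption, not one specified by the rule.

THRESHOLD = 0.10   # assumed "unacceptable percentage"

def facility_compliant(completed, accepted):
    """completed: records the facility completed in the target period;
    accepted: those submitted to and accepted by the State."""
    if completed == 0:
        return True
    missing_or_rejected = completed - accepted
    return missing_or_rejected / completed <= THRESHOLD
```

Under these assumed values, a facility with 95 of 100 completed records
accepted (5 percent missing or rejected) would be in compliance, while one
with 80 of 100 accepted (20 percent) would not.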
Our initial plan is to have States accept required records
submitted by the facility, except when specific data errors occur.
Currently, plans are for States to reject records only if:
A submission file has a missing or misplaced header record
or trailer record;
Any record in the file does not have the correct record
length with the last data character being the ``end of record''
delimiter required by the standard data specifications;
The submission file contains an invalid facility ID code
(Fac__Id in the data specifications) in the header record or data
record; or
The total number of records in the submission file does
not correspond to the record count given in the trailer record.
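The four file-level rejection rules above can be sketched as a single
validation pass over a submission file. The record length, the "HDR",
"MDS", and "TRL" prefixes, the field positions, and the "%" end-of-record
delimiter below are illustrative assumptions, not the actual MDS 2.0 data
specifications.

```python
# Sketch of the four State-side file rejection rules. Layout details
# (lengths, prefixes, field positions, delimiter) are assumptions.

RECORD_LENGTH = 80   # assumed fixed record length
EOR = "%"            # assumed end-of-record delimiter

def validate_submission(records, valid_fac_ids):
    """Return file-level rejection reasons; an empty list means accept."""
    errors = []
    # Rule 1: header and trailer records present and in place.
    if not records or not records[0].startswith("HDR"):
        errors.append("missing or misplaced header record")
    if len(records) < 2 or not records[-1].startswith("TRL"):
        errors.append("missing or misplaced trailer record")
        return errors  # cannot check the record count without a trailer
    # Rule 2: each record has the correct length, ending in the delimiter.
    for i, rec in enumerate(records):
        if len(rec) != RECORD_LENGTH or not rec.endswith(EOR):
            errors.append("record %d: bad length or delimiter" % i)
    # Rule 3: facility ID (Fac__Id) valid in header and data records.
    for i, rec in enumerate(records[:-1]):
        if rec[3:13] not in valid_fac_ids:
            errors.append("record %d: invalid facility ID" % i)
    # Rule 4: trailer count must equal the number of data records.
    if int(records[-1][3:9]) != len(records) - 2:
        errors.append("record count does not match trailer record")
    return errors
```

A file failing any of these checks would be rejected in its entirety and
returned to the facility, while record-level range and consistency errors
are handled separately through the error-rate determination.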
We will evaluate this process and make necessary changes based on
experience.
A facility is in compliance unless there is an unacceptable error
rate for the set of records completed by the facility during a
specified period. Determination of compliance is based on a review of
records accepted by the State, allowing sufficient time after the close
of the specified period for relevant records to be submitted from the
facility to the State. The error rate in question is the total number
of fields in error, due to either range or consistency errors as
identified in the MDS 2.0 data specifications in effect, divided by the
total number of required fields across all records for the specified
period. The fields that we require for each type of record (for
example, admission assessment, quarterly assessment, discharge tracking
form, etc.) are detailed in the MDS 2.0 data specifications.
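As a hypothetical numerical illustration of the error-rate calculation described above (the field counts shown are invented for the example, not drawn from the MDS 2.0 data specifications):

```python
# Error rate = total fields in error (range or consistency errors per the
# MDS 2.0 data specifications) / total required fields across all records
# for the specified period. Counts below are hypothetical.

def error_rate(records):
    """Each record is a dict of counts of required fields and fields in error."""
    total_required = sum(r["required_fields"] for r in records)
    total_errors = sum(r["fields_in_error"] for r in records)
    return total_errors / total_required if total_required else 0.0


# e.g. three records of 400 required fields each, six fields in error overall:
records = [
    {"required_fields": 400, "fields_in_error": 2},
    {"required_fields": 400, "fields_in_error": 0},
    {"required_fields": 400, "fields_in_error": 4},
]
rate = error_rate(records)   # 6 / 1200 = 0.005, i.e. one half of one percent
```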
Further, States have a role in training facility personnel in
methods of preventing and correcting data errors. The suggestion by a
commenter that we require a facility to incorporate surveillance and
correction procedures as part of its quality assurance program may be a
viable option.
Comment: Other commenters believed that States should bear primary
responsibility for the accuracy of the RAI data. One State suggested
that the States should provide facilities with report formats that
cross check interrelated data. Another commenter proposed that a State
keep verification requirements for transmitting data separate from
verification of clinical consistency of the data. A commenter pointed
out that it was unclear whether we intended that States notify a
facility of errant data before transmitting the data to us. One
suggestion was for the State to check data for completeness, accuracy
and compliance with processing instructions. Another was that the State
specify a standardized format for transmitting data that would require
compliance with edits. A few commenters thought that the States should
be responsible for the quality of the data transmitted to us.
Response: The responsibility for data accuracy must reside with the
facility, the source of the data, and a facility should ensure that MDS
data pass all standard accuracy edits before transmission to the State.
The State does have a responsibility to monitor accuracy of data
submitted by a facility and aid the facility in achieving accuracy. The
State will perform standard accuracy edits on data files as they are
received and report any errors found to the facility. A State will also
be able to monitor the error rate for a facility over time and produce
an error summary report to share with the facility. The State will also
have the ability to monitor the error rates for any MDS software
vendor. When systematic problems are found for a vendor, the State will
have the opportunity to work with that vendor to correct the problems.
We may also develop procedures for onsite data accuracy visits to the
facility when error rates are high. We will determine the frequency of
such visits during our formal systems design process. MDS data
submitted to the State will be transmitted to us at least monthly. We
will again edit the data for accuracy. Accuracy edits will be performed
at the facility, State, and HCFA levels.
Comment: We received a number of other suggestions to ensure the
accuracy of data. One was to allow the registered nurse assessment
coordinator to validate the data. A few suggested a computer system
that has a basic set of edit checks, like high-low checks, completeness
checks, clinical inconsistencies, and incorrect data checks. A consumer
advocacy organization pointed out that some States currently have
special nurse auditors who validate the match between a resident and
his or her MDS. A State suggested that the reliability of MDS data be
verified by periodic, random, onsite review of individual records
performed by either State program agency staff or by a contracting
organization. Another noted that if validity becomes an issue, we could
consider a regulatory mechanism for appointing independent assessors.
Response: We agree that the computer systems should have basic edit
checks, which ought to be in place both at the facility level and at
the State level. The standard data specifications we have developed
include valid ranges and required formatting for MDS items and
consistency between MDS items. Detailed information concerning these
data specifications is available on our MDS World Wide Web site (at
http://www.hcfa.gov/Medicare/hsqb/mds20/) and is otherwise available
from us and the State survey agency. We anticipate that facilities will
be able to select commercially available software packages that use
these data specifications. We note that the current regulation grants
States the authority to take over the assessment process if a facility
knowingly and willfully certifies false assessment statements. Section
483.20(c)(4) allows the State to require that assessments be conducted
and certified by individuals who are independent of the facility and
who are approved by the State. New York, for example, contracts with
its peer review organization to conduct onsite audits of the Patient
Review Instrument, used to calculate Medicaid reimbursement. Nurses
sample a certain number of resident records. If the records do not pass
standards based on resource utilization group, the facility loses its
``delegated status'' to conduct assessments and must hire an
independent assessor for 1 year.
Comment: Many of the comments we received regarding privacy and
confidentiality demonstrated concern that residents' identities need to
be protected. Some of
the commenters believed that MDS information should be available or
reported in the aggregate format. A few commenters wanted identifying
data available at the State level but not in any public data sets
created. One commenter questioned why we should have access to
assessment information of private pay residents. A national provider
organization stated that the need for information in planning and
quality assurance should not be met at the expense of the resident's
and facility's right to confidentiality. Commenters suggested that we
develop ways to block resident identifiers or develop an alternate
system of identification like numerical coding.
Response: We agree that protecting the privacy of the resident is
essential. In establishing this system of records, both we and the
State (as HCFA's contractor in performing survey functions) must comply
with the Privacy Act (5 U.S.C. 552a), which applies to Federal systems
of records containing individually identifiable information. While
aggregate summaries of the data can be made public, there are strict
Federal guidelines for the release of individually identifiable
information by Federal agencies to any individual or organization. A
release of personally identifiable information can only be made in
limited circumstances described in the Privacy Act. Disclosure may be
made under the Privacy Act for ``routine uses,'' which are compatible
with the purpose for which the information was collected. These routine
uses are described in the Privacy Act System of Records, which is
published in the Federal Register. Requirements associated with routine
uses are also set forth in the System of Records. In most cases, a
``data use agreement'' is required with the recipient being bound, in
turn, by the Privacy Act. Some States have additional laws
strengthening the protection of privacy of the resident.
We would have difficulty assuring the quality of care in facilities
if we only had access to periodic aggregate data. While allowing
evaluation of prevalence rates (percent of residents who have a
particular condition at a given point in time) over time, such data
would largely preclude any quality of care indicators based on
incidence rates (percent of residents who acquire a given condition in
a facility between two points in time). For example, periodic aggregate
data might show the prevalence of decubitus ulcers in the resident
population, but we could not review it to determine the incidence of
such ulcers while residents are in the care of the facility. A high
prevalence of ulcers may indicate that the facility accepts residents
with existing ulcers from the hospital, but a high incidence may
indicate substandard care. If access were limited to aggregate data, it
would also be impossible to evaluate other important outcome measures
potentially indicative of quality of care.
Our quality assurance activities in Medicare and Medicaid certified
facilities are not limited to selected residents (for example, Medicare
or Medicaid residents, or both). Our long term care survey process
directs State
survey agencies to review the care provided to all residents of
certified facilities, regardless of payor source. For example, quality
assurance survey teams review a random sample of residents without
respect to payor. We would often have difficulties evaluating the
quality of care in Medicare and Medicaid certified facilities if access
to data is limited to residents who are Medicare or Medicaid funded.
This is especially true in a facility in which Medicare or Medicaid
residents, or both, are a minority. This requirement is, therefore, in
keeping with the quality protections that are afforded to all residents
in certified long term care facilities. We will not give out
identifying information unless there is a demonstrated need for it; the
routine use permits disclosure only if we determine that the research
cannot be reasonably accomplished unless the record is provided in
individually identifiable form.
Comment: One commenter was unclear why confidentiality is an issue,
since we already have systems in place to guard confidentiality, and
these systems could carry over into the MDS system. A professional
organization recommended developing a software program that could block
identifying information except when needed by designated persons.
Some commenters addressed the question of who should have access to
the data base. Several suggestions were submitted, including:
The State survey and certification agency;
The reimbursement agency (without resident identifiers);
The ombudsman (one commenter suggested without resident
identifiers while another said consistent with current access rights
for resident records);
The submitting facility (with no access to other
facilities); and
The public (aggregate data only).
Commenters proposed that a State have access to facility data in its
own State with resident identifiers and to other States and the
national data without identifiers.
Response: As noted above, under the Privacy Act, when personal
information in the possession of the Federal Government on an
individual is accessed by name, Social Security number or any other
identifying symbol, we must publish a system of records notice. This
notifies the public that we are collecting the information and will be
accessing it in an individually identifiable way.
The notice lists routine uses for the information, including a list
of entities to which we may release information upon request, the uses
for which we may release information, and conditions under which we may
release individually identifiable information. The Privacy Act requires
that the routine uses be consistent with the purpose for which the
information is collected. The Privacy Act does not mandate us to
release the information. The system of records notice will support
research as a routine use, but will require safeguards to ensure the
maximum protection of individually identified information. It requires
that persons or entities requesting the information sign an agreement
to not re-release the data. The system of records notice also permits
release to government agencies for purposes of monitoring nursing home
care. We already have a routine use disclosure provision in place for
handling data requests by those conducting health services or other
appropriate research for most of our systems. We evaluate each request
on an individual basis, including whether it is appropriate to release
any data with identifying information.
Comment: A few commenters recommended that we not release resident-
specific information unless the resident has directly consented. One
State suggested that we and States issue ``designator'' numbers that
would allow resident-specific information to be released. A commenter
suggested that we build fines and penalties into the system for breach
of confidentiality.
Response: As noted above, we will follow all provisions of the
Privacy Act, as well as the Freedom of Information Act in managing the
information from this proposed data base. Our Freedom of Information
Act officer decides whether to release the records if a request is made
at the Federal level. Under the Freedom of Information Act,
individually identified RAI data generally would be exempt from
disclosure as medical (and similar) files, the disclosure of which
would constitute a clearly unwarranted invasion of privacy (5 U.S.C.
552(b)(6)). Under the Privacy Act, individually identified records may
not be disclosed, except for good cause, including routine uses
consistent with the purposes for which the information was collected (5
U.S.C. 552a(b)). (Aggregate data, not individually identifiable, could be
released under either law.)
For records collected under the authority of our RAI requirements,
States are bound by the Privacy Act as our agent. In addition, most
States have their own rules governing protection of privacy for records
maintained at the State level. We expect each State to take the
appropriate steps to ensure that resident-identifiable information is
protected.
We are adding language to Sec. 483.20(f)(5) that prohibits a State
from releasing resident-identifiable information to the public, and
provides that a facility may release resident-identifiable information
to an agent only in accordance with a contract under which the agent
agrees not to use or disclose the information except to the extent the
facility itself is permitted to do so. We note that the Health
Insurance Portability and Accountability Act of 1996 (Public Law 104-
191) provides stiff penalties for persons who wrongfully disclose
individually identifiable health information. Such penalties can
include fines or imprisonment, or both.
RAI data would be part of a resident's clinical record, and as
such, would be protected from improper disclosure by facilities under
current law. Facilities are required by sections 1819(c)(1)(A)(iv) and
1919(c)(1)(A)(iv) of the Act and Sec. 483.75 (l)(3) and (l)(4), to keep
confidential all information contained in the resident's record and to
maintain safeguards against the unauthorized use of a resident's
clinical record information, regardless of the form or storage method
of the records. We recognize that there are circumstances that may
necessitate the release of information from the resident's clinical
record. However, these instances are limited by regulation to
circumstances required by (1) transfer to another health care
institution, (2) law, (3) third party payment contract, or (4) the
resident (Sec. 483.75(l)(4)).
The transmission is limited to (1) using a private dial-up network
based on a direct telephone connection from the facility or (2) mailing
a diskette from the facility. In the case of either telephone
communications or the mail, the information transmitted is secure, with
interception of information being prohibited by Federal and State law,
and strong penalties apply. We and the States both receive large
volumes of unencrypted voice phone calls, unencrypted data
telecommunications (for example, claims data), and unencrypted
mailings, all including resident-specific information.
Section 1902(a)(7) of the Act requires Medicaid agencies to provide
safeguards that restrict the use or disclosure of information
concerning applicants and recipients to purposes directly connected
with the administration of the plan. Moreover, under the agreement
between the Secretary and the State survey agency pertaining to
section 1864 of the Act, the State is required to adopt policies and
procedures to ensure that information contained in its records from the
Secretary or from any provider will be disclosed only as provided in
the Act or regulations.
States may allow other agencies within the State to have access to
MDS data to the extent that it is related to the operation of the
Medicaid programs. All agencies must adhere to the confidentiality
requirements of the Medicare and Medicaid programs relative to this
information. We are also providing in Sec. 483.315(j) that a State may
not release resident-identifiable information to the public. Further,
the State may not release resident-identifiable RAI data to a
contractor without a written agreement under which the contractor
agrees to restrict access to the data.
We believe that adherence to Sec. 483.10(b)(1), Notice of rights
and services, adequately addresses the commenter's suggestion that
residents be notified. We believe that using designator numbers as a
vehicle to permit release of resident-specific information is not
feasible. Because such a system would require that all providers use
the same number for the same resident (in other words, to enable
tracking of residents across different providers), implementation would
be extremely burdensome.
Comment: Commenters addressed other issues pertaining to privacy
and confidentiality. We received a recommendation to contact specific
individuals who could assist in developing the security of a data base.
Another recommendation was to carefully control computer access (in
other words, who can get to the data via computer at the facility and
at the State).
Response: Both at the facility and the State, access to the MDS
records for residents, whether those records are hard copy or
electronic, must be secured and controlled in compliance with our
requirements for safeguarding the confidentiality of clinical records.
The facility must take precautions to ensure that only authorized staff
have access to confidential information. Electronic MDS data should
reside on stand-alone computers in secured physical locations, or
access to those data should incorporate standard user ID and password
techniques.
Comment: A few commenters thought that only the facility in which a
resident resides should be able to make changes in the data entered.
One commenter proposed that the data system require the facility to
``close-out'' a resident's information upon discharge or transfer,
which would prevent the facility from changing that information. It
would also prevent ``a receiving facility'' from entering new data on a
transferred resident until the information base is closed by the
transferring facility.
Response: We agree that no other facility may make changes in the
MDS data. A facility may only change MDS hard copy and electronic data
as allowed by our policy. This policy requires that after the facility
performs the assessment, it is ``sealed,'' and electronic records are
``locked.'' HCFA policy also requires facilities
to complete tracking forms indicating resident discharge from and
reentry to the facility. The facility must complete and submit these
forms and corresponding electronic records to the State within
specified time frames. It would not be appropriate to require one
facility to ``wait for'' a discharge record from another facility
before entering and submitting data for Medicaid payment. This could
result in payment delays for the one facility when another facility is
delinquent on submitting MDS records.
MDS data bases at the State will be used for a variety of purposes,
including quality monitoring, Medicaid and Medicare payment, and policy
analysis. It would prove quite cumbersome and, at times, unworkable for
all data changes to always be made by the facility of residence and
then updated in the State data bases after resubmission from the
facility.
Comment: Commenters discussed the schedule for submission of data
by facilities, for example on an annual or quarterly basis. A national
advocacy organization supported continuous data flow, pointing out that
States need up-to-date information due to the survey cycle and their
need to be able to respond to complaints when they are received. One
commenter said that most facilities in their State routinely transmit
on a weekly basis. Other commenters questioned how current the data to
be maintained would be. The few who responded agreed that the
data do need to be up-to-date and agreed with others who said that
transmission should coincide with the RAI requirements. Others
addressed the need for a transaction log to document transmission.
Some commenters noted that reimbursement agencies may require data
on a more frequent basis for the purpose of rate setting. It was also
mentioned that requiring transmission on a more frequent basis would be
an administrative burden. A few commenters wanted the quarterly reviews
to be transmitted also. A national provider group suggested quarterly
submission, staggered in order to facilitate managing the large volume
of data. For example, at the end of each month, \1/12\ of the
facilities would submit data for the preceding 3 months.
One commenter recommended that States transmit data to us on an
annual basis. There was some support for collecting data on a quarterly
basis. A few commenters believed that the States should send us summary
information. A few States suggested submitting data on the same
schedule as the MDS is completed--that is, upon admission, within 14
days of a significant change, and annually. Others agreed that annual
submission would be adequate. A few proposed that data be transmitted
twice a year. One commenter believed that all States should submit data
on the same date.
Response: In order for MDS information to be timely enough for use
in ongoing quality assurance programs, a facility must submit MDS data
at least monthly to States. This would entail submitting all full MDS
assessments (initial, annual and significant change), and any partial
assessments (quarterly, discharge, and reentry) completed since the
facility last transmitted data to the States.
States will also submit data to us at least monthly. The regional
offices also need timely information in order to perform Federal
monitoring surveys. To a certain extent, the role of the regional
office mirrors that of State survey agencies. Hence, the regional
offices need timely, complete information. Furthermore, this is
necessary to enable us to timely evaluate State trends or regional
problems. For example, linking resident status information with SNF
cost report data could identify potential Medicare utilization problems
in relation to certain outcomes or resident status changes.
Analysis regarding the timeliness of MDS data and frequency of
transmission requirements has shown that the MDS data base must contain
quarterly review information if it is to be used for quality monitoring
purposes by State survey agencies. Much of the work being done to
develop quality measures relies on quarterly assessment data for each
resident. Leading researchers and survey experts agree that the
quarterly review data are needed for the timely and reliable
identification of resident outcomes for this purpose. A case-mix
demonstration payment system that uses MDS data to calculate
appropriate payment rates is under development, discussion, and testing.
Comment: Commenters made suggestions regarding what edits should
be allowed without requiring the facility to produce a new electronic
record or hard copy. One State wanted any change in MDS information to
result in a new hard copy. Another State proposed that we allow a
typographical error to be corrected at any time. A consumer advocacy
organization proposed that if a facility makes changes to a
computerized copy, it should be held to the same standard as written
records. Another commenter believed that MDS software should create an
audit trail of changes made to an assessment that would include the
name of the person making the change, the date, the old value, and the
new value. The commenter suggested that we permit a facility to keep
the most current copy in a hard copy format. A State commenter believed
that the computer program should have the ability to update the
assessment information without changing the original version. Another
State did not want to make changes if the data had been transmitted
after the 21st day after admission. Another State proposed using those
things that meet the criteria for significant change with regard to
edits.
Response: According to current policy, a facility may correct
typographical or factual errors within required time frames. To make
revisions on paper records, a facility enters the correct response,
draws a line through the previous response without obliterating it, and
initials and dates the corrected entry. Computer-based systems must
have a way to indicate and differentiate between the original and
corrected entries on the printout of the corrected form, and to ensure
that the correct information is transmitted to the State. Again, we
note that the assessment must be accurate. A significant correction of
prior assessment is completed at the facility's prerogative, because
the previous assessment was inaccurate or completed incorrectly.
Version 2.0 of the MDS contains an item response that, when checked,
indicates that the assessment is a significant correction of a prior
comprehensive assessment. A number of providers have called to our
attention that the wording of this item precludes its use when the
prior assessment that is being corrected was a Quarterly Review
Assessment. We will add code to the MDS version 2.0 that will provide a
mechanism for this.
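The audit-trail capability one commenter described, recording the person making the change, the date, the old value, and the new value while preserving the original entry, might be sketched minimally as follows. The item code and field names here are hypothetical, not part of the MDS specifications.

```python
import datetime

# Minimal illustrative sketch of an audit trail for MDS corrections: each
# change preserves the original value alongside who changed it and when.
# Item codes and field names are hypothetical.


def correct_item(assessment, audit_log, item, new_value, corrected_by):
    """Apply a correction without losing the original entry."""
    audit_log.append({
        "item": item,
        "old_value": assessment.get(item),
        "new_value": new_value,
        "corrected_by": corrected_by,
        "date": datetime.date.today().isoformat(),
    })
    assessment[item] = new_value


assessment = {"B4_cognitive_skills": "2"}   # hypothetical item code and value
audit_log = []
correct_item(assessment, audit_log, "B4_cognitive_skills", "1", "J. Smith, RN")
# The assessment now holds the corrected value; the audit log retains the old one.
```

This parallels the paper procedure: the corrected entry is recorded, the previous response remains legible, and the change is initialed and dated.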
A significant correction of prior assessment differs from a
significant change in status assessment, in which there has been an
actual change in resident's health status. If there has been a
significant change in health status, the facility cannot merely correct
the affected items on the MDS. The facility must complete a full new
assessment. Any subsequent changes should be noted elsewhere in the
resident's record (for example, in the progress notes). As stated
previously, however, the procedures and policy governing issues of data
storage, retrieval, validation and maintenance in facilities will also
be addressed more fully in a forthcoming HCFA publication, such as a
State Operations Manual.
Comment: Some commenters requested that the requirements we issue
allow electronic signatures. This would avoid duplication by not
requiring that the facility keep on file a hard copy with signatures.
Response: In the development of the system, we will consider
requirements for electronic signatures.
Comment: Commenters addressed how and to what extent we should
standardize electronic formats and how to revise the format to be
consistent with technological changes. One State did not think that a
standardized electronic format is necessary, and proposed that we
request summary reports, findings and group data instead of
individualized data, which would obviate the need for a standard
format. Several others expressed the belief that we should specify a
format. A national provider organization pointed out that a
standardized format would facilitate collecting, merging, and analyzing
national data. Another commenter noted that it would also decrease
software development costs. A State provider organization pointed out
that nothing would be more frustrating and costly than software that is
not well thought out and requires several revisions. The commenter
suggested that we already have experience in formatting because of the
case-mix demonstration project.
A State expressed the belief that it would be easier to maintain a
single format than have to deal with different software languages and
media types. The commenter further said that we or the States should be
responsible for making formatting changes and sharing them with those
affected. Another recommendation was to use Online Survey,
Certification and Reporting System and create a subsystem for MDS data.
By accessing an ``enhancement log,'' the system would be under constant
review and revision.
Response: We concur that many of these suggestions have merit. In
the spring of 1995, we developed and issued a standard record layout
and data dictionary. These were made available to facilities and
software vendors as well as the States. When these regulations go into
effect, the assessment records that facilities transmit to States must
conform to the standard layout. Hence, software vendors have been
strongly encouraged to use the layout and data dictionary when
developing software products for MDS version 2.0. We believe that this
will ensure uniformity in format but still allow facility flexibility
and choice in terms of the software products they use to encode MDS
records.
Comment: A national provider group proposed that we require States
to develop and make available a software package that would transmit
data in the appropriate format. A few commenters expressed the belief
that as long as they meet Federal standards, States and facilities
should be able to develop additional standards.
Response: We have developed, and are in the process of testing, a
national system for MDS data transmission, including commercially
available standard transmission software, that will be made available
to all States. We are mandating that the facility transmit MDS
data to the State according to minimum data validity specifications and
using standard communication and transmission protocols. The State may
choose to impose additional data validity specifications, exceeding our
mandated minimum specifications.
Comment: We received a few suggestions regarding specific
organizations with whom we could consult in developing a standardized
format. One suggestion was to form a technical advisory board that
would consist of Federal and State personnel, providers, hardware and
software vendors, and resident advocacy groups. Another was to contact
a specific standards committee to obtain their input on developing a
format.
Response: We sought technical assistance from those parties as part
of a technical advisory group that we organized as part of the systems
design process. We met with several of the groups mentioned above early
in the design process to get input on a number of systems development
issues. We will continue to seek input throughout the development. We
are committed to working closely with interested and affected parties
in the design process.
Comment: Commenters suggested that the standardized format should
be in either ASCII or EBCDIC, and should include data item description,
data item beginning and ending column, data item length, and whether
the data is right or left justified. One State noted that some States
have already begun to computerize and that the format should be
receptive to those programs,
particularly for States utilizing our RAI. A State commenter believed
that a data dictionary should be provided for each data submission,
which would provide a vehicle for documenting problems with the data
submission.
Response: As previously stated, we have been working closely with
States that are computerized. Several States were instrumental in
developing the data dictionary and record layout. With their expertise,
we constructed a standard layout that still allows flexibility for
States which have added MDS items. Facilities and States must conform
to the standard record layout, which is currently constructed in ASCII.
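A fixed-width ASCII record of the kind commenters describe is parsed against a data dictionary giving each item's name, beginning and ending column, and justification. The dictionary entries and sample record below are hypothetical, for illustration only; the actual columns are defined in the standard record layout.

```python
# Illustrative parse of a fixed-width ASCII record using a small,
# hypothetical data dictionary: (item name, begin column, end column,
# justification), with 1-based inclusive column numbers.

DATA_DICTIONARY = [
    ("FAC_ID",     1,  6, "left"),
    ("RES_ID",     7, 16, "left"),
    ("ASSESS_DT", 17, 24, "left"),
    ("AB1",       25, 26, "right"),
]


def parse_record(record):
    """Slice each item out of the record and strip its padding."""
    items = {}
    for name, begin, end, justify in DATA_DICTIONARY:
        raw = record[begin - 1:end]      # convert 1-based columns to a slice
        items[name] = raw.rstrip() if justify == "left" else raw.lstrip()
    return items


record = "FAC001RES000012319971223 5"
# parse_record(record) -> {"FAC_ID": "FAC001", "RES_ID": "RES0000123",
#                          "ASSESS_DT": "19971223", "AB1": "5"}
```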
Comment: Several commenters wondered how facility noncompliance
with the requirement to transmit the MDS data would be enforced.
Response: As stated earlier in this preamble, facility
noncompliance with the reporting requirement established by this final
rule will be subject to the full range of enforcement remedies set
forth in part 488, subpart F, ``Enforcement of compliance for long-term
care facilities with deficiencies.'' We will treat a facility's failure
to comply with MDS reporting requirements as noncompliance under the
definition in Sec. 488.301. At a minimum, we will require a plan of
correction, and will impose the mandatory denial of payment for new
admissions sanction if the facility has not achieved substantial
compliance within 3 months from the date of the finding of
noncompliance. In such a case, if the facility is still not in
compliance with requirements within 6 months from the date of the
finding, we will terminate its provider agreement. Also, we may impose
one or more other remedies, as determined by us or the State, in
accordance with part 488, subpart F.
Facility failure to meet acceptable standards of performance,
including failure to transmit the MDS data, to improve upon past poor
performance, or to maintain compliance with this reporting
requirement, could be considered by us to be indicative of the
facility's inability or unwillingness to perform the resident
assessment itself. We believe
that this is a reasonable conclusion because if the requirement to
conduct a resident assessment has been satisfied and completed, then
the administrative reporting requirement would simply and logically
follow. Noncompliance that is repeated or recurs intermittently
becomes part of the facility's noncompliance history, which is a
factor when we or the State selects the appropriate enforcement
response. We
will sanction, accordingly, a facility that demonstrates little or no
commitment to continual, rather than cyclical, compliance. A State will
be easily able to ascertain whether a facility is transmitting the
required information timely and in the manner that we prescribe; those
facilities that fail to meet the standard may be subject to the full
range of available remedies, including denial of payment for new
admissions and civil money penalties. We do not expect perfection
relative to compliance with this reporting requirement; we will
incorporate limited tolerance into the compliance assessment process,
whereby good faith efforts made by facilities will be considered. An
additional level of tolerance will exist during early phases of
implementation of the requirement.
Comment: A number of commenters addressed a wide variety of issues
relating to the computerization of MDS information. A State commenter
stressed that we should emphasize the benefits to facility staff and
residents. A consumer advocacy group expressed the belief that we
should address how computerization will affect utilization of the RAPs
and the individualization of the care planning process.
Response: As mentioned in the previous discussion of data uses, we
believe that the automation of this information will be extremely
helpful to facilities. We note that computerization of resident
assessment information does not relieve facilities of their
responsibility to develop, by an interdisciplinary team, a
comprehensive, individualized care plan. While software packages exist
that will automatically print a plan of care based on responses to MDS
items that trigger a RAP, an individual must still exercise
professional clinical judgment in customizing the care plan to suit
each resident's individual needs.
Comment: Commenters proposed that we develop regulations and manual
instructions relating to transmitting data. A State wanted a
telecommunications program to be mandated. Another State expressed the
belief that we should penalize facilities which do not comply with
submission requirements.
Response: Once we develop key specifications for data transmission,
we will issue clarifying policy and give instructions to States and
providers in a State Operations Manual transmittal. We will require
that a facility comply with the policy and regulations covering this
data base in order to participate in the Medicare and Medicaid
programs. As mentioned above, we are requiring that a facility
electronically transmit its data via telecommunications infrastructure
to the State. Penalties for not complying with submission requirements
are addressed with the comments on proposed Sec. 483.20(b)(6),
Automated data processing requirement.
Comment: Some commenters discussed software vendors who have
developed RAI packages. Commenters suggested that we develop a program
to test vendor software for minimal acceptability.
Response: We are developing several aids to promote the accuracy of
RAI software packages developed by commercial vendors. These efforts
include the following documents and data files, being published on our
World Wide Web site (at http://www.hcfa.gov/Medicare/hsqb/mds20/) and
otherwise available from us.
Detailed specifications for data validity (valid ranges
and consistency requirements for MDS items).
Detailed logic and a test data base for RAP determination.
Detailed specifications for the file structure, record
layout, and field formatting for MDS files submitted by facilities.
Detailed logic, a test data base, and a test program for
Resource Utilization Group calculation.
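As an illustration of the kind of edits these specifications describe,
the following Python sketch applies range checks to individual items
and a consistency check across items. The item names, code ranges, and
the consistency rule are invented for the example; they are not the
HCFA-specified edits:

```python
# Hypothetical valid-range table: each MDS item has a set of
# permissible code values.
VALID_RANGES = {
    "ADL_SELF_PERFORMANCE": range(0, 5),   # e.g. codes 0-4
    "BODY_WEIGHT_LBS": range(50, 500),
}

def validity_errors(record: dict) -> list:
    errors = []
    # Range edits: each coded item must fall within its valid range.
    for item, valid in VALID_RANGES.items():
        if record.get(item) not in valid:
            errors.append(f"{item}: value {record.get(item)!r} out of range")
    # Consistency edit (hypothetical rule): a resident coded as
    # comatose cannot also be coded as independent in ADLs.
    if record.get("COMATOSE") == 1 and record.get("ADL_SELF_PERFORMANCE") == 0:
        errors.append("COMATOSE inconsistent with independent ADL coding")
    return errors

print(validity_errors(
    {"ADL_SELF_PERFORMANCE": 0, "BODY_WEIGHT_LBS": 150, "COMATOSE": 1}))
```

Vendors could run checks of this kind against the published test data
base before submitting files to a State system.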
We are also developing a standard State-level MDS processing system
to be distributed to each State. One feature of this system allows RAI
software developers to transmit test files of MDS data to the State and
receive a detailed log of all data validity errors encountered in the
test file.
We will continue to promote processes for assuring the accuracy of
software packages developed by vendors even though the approaches to
this effort will change over time.
Comment: In the preamble to the proposed rule, we encouraged
comment on developing a mechanism for advising us on the need and
method to update the MDS and RAPs. Commenters agreed that we do need a
method to update the RAI. Several suggested that we establish a
clinical advisory panel or commission similar to the project team,
clinical panel and advisory committee that developed the RAI. Other
ideas included an annual update schedule, including any changes in the
MDS as an addendum; sending periodic questionnaires to providers, State
agencies and organizations; and a yearly comment period.
Response: We have always recognized that the RAI will need to
reflect advances in clinical practice and assessment technology. We
will be making periodic revisions to the RAI. In 1994, we awarded a
contract to the
[[Page 67192]]
Hebrew Rehabilitation Center for Aged, under which we will revise the
RAI over a few years. The contractor will convene representatives of
States, provider organizations, professional associations, and consumer
organizations. These groups will advise us regarding the need to add or
refine items or definitions, and regarding areas that are less well
understood and require clarification. As in the past, the revision
process will be one in which we seek input from the many interested and
affected groups.
Comment: We solicited comment on how to coordinate the assessment
process with other assessment protocols such as home health assessments
and the uniform needs assessment instrument (comments on coordinating
with PASRR are discussed with the comments on proposed
Sec. 483.20(b)(5), Coordination). Some who commented merely agreed that
it is necessary to coordinate assessments. Others gave suggestions, for
example, that we issue a stronger directive that a facility provide a
copy of the MDS as part of its post-discharge care, use the RAI in
all long-term care settings, and coordinate with the home and community
based waiver MDS of OBRA '90.
Response: We recognize the need to coordinate an individual's
health care across various health care settings and the importance of
assessments in this process. Currently, we have no statutory authority
to require this coordination except in the case of coordination of the
RAI with preadmission screening programs for individuals who are
mentally ill and mentally retarded. However, there is great interest in
the development of clinical data sets like the MDS for several provider
types, including end-stage renal disease facilities and home health
agencies. Work is well underway to develop screening tools in some of
these areas.
Sec. 483.20(b)(1) Resident Assessment Instrument
Comment: Commenters addressed the proposed requirement that the
assessment process must include direct observation and communication
with the resident, as well as communication with licensed and
nonlicensed direct care staff on all shifts. Most supported the
requirement. A few commenters were concerned with enforcement of the
requirement. Some wanted us to require that the facility communicate
with the resident only when clinically feasible, since the resident may
not have the cognitive skills to verbally communicate.
Response: The resident is a primary source of information when
completing the assessment and may be the only source of information for
many items. In the RAI User's Manual and in the State Operations
Manual, Transmittal No. 272, we have instructed facility staff to talk
with and observe the resident. It is still possible to interact with a
resident, even if he or she is unable to communicate verbally. Staff
can closely observe the resident and respond to many MDS items based on
observation. We acknowledge that evaluating facility compliance may be
difficult but we believe that this requirement is too important to
delete. However, we do not want to require a specific process for
documenting collection of data across shifts. This would burden
facilities and limit their flexibility to implement a process that is
most appropriate for each facility's specific situation and practices.
Comment: A few commenters believed that only the direct care staff
responsible for providing care to the resident on all shifts should be
included in the assessment process. Others wanted us to require that
the facility talk to other people, such as a resident's family members/
guardians, the attending physician, and other licensed personnel.
Response: We did not limit the assessment process to only those
staff members responsible for actually providing hands-on care because
we believe that facility staff who are not the primary care-givers
often have valuable, first-hand information about a resident. For
example, housekeeping staff who routinely talk with residents may be
aware that a resident prefers extra pillows on her bed because it
alleviates her back pain. In the State Operations Manual Transmittal
No. 272 and in the RAI User's Manual we suggest that information
sources for the assessment should include, but are not limited to,
discussion with the resident's attending physician, appropriate
licensed health professionals and family members. Family members are a
valuable source of information regarding the resident, particularly for
cognitively impaired residents, for whom family is often the only
source of information regarding the resident. For example, a resident's
spouse may be the only person who knows what the resident was like
prior to admission to the nursing home, and is able to provide
background information that is necessary for staff to complete the
Customary Routine section of the MDS.
We require that a physician be a part of the interdisciplinary team
that prepares the care plan. We acknowledge that a doctor's schedule
may not allow consistent participation in the assessment process. While
we encourage facilities to discuss the resident's status with the
attending physician to gain and validate information, we are not
requiring it. The statute is silent regarding the participation of
individuals other than health professionals.
Comment: Some commenters wanted us to clarify that communication
with all shifts can be both verbal and written. For example,
information could be exchanged at pre-shift meetings, through progress
notes or other documentation in the clinical record, or by other means.
Response: We agree that information can be exchanged in a number of
ways, and discuss possible mechanisms in the RAI User's Manual. At this
time we do not wish to mandate a communication process; rather, each
facility should determine how to best exchange information about the
resident.
While we did not receive comments regarding the facility assessing
the resident using the RAI specified by the State, we are adding to
Sec. 483.315(c) that the State must obtain our approval of a State-
specified instrument. This is more consistent with sections 1819(e)(5)
and 1919(e)(5) of the Act. Furthermore, we are specifying those domains
or areas that the facility must assess. We listed these domains in the
assessment requirement previously, and inadvertently omitted them at
former paragraph (b)(2); a State suggested that removing the domains
weakened the requirement. Additionally, surveyors use the regulatory
tags for particular domains to cite deficiencies when a facility has
problems only in certain assessment areas. The State is responsible for
obtaining approval from us for its instrument and for specifying its
approved instrument to facilities. Facilities must therefore rely upon
the State's assertion that the instrument is approved by the Secretary.
Proposed Sec. 483.20(b)(2) When Required
Comment: Several commenters addressed our proposed requirement that
a facility complete the comprehensive assessment within 14 days after a
resident's admission. Some commenters agreed with the 14-day time
period, and wanted us to emphasize that the RAI and quarterly review
are a minimum, stressing that all the resident's needs must be
identified and care planned as necessary. A commenter requested
clarification regarding completion ``within 14 days after admission,''
stating that it could be interpreted differently. For example, the
facility could construe the requirement
[[Page 67193]]
to mean ``14 days after admission'' or ``the fourteenth day of
admission.''
Response: Completion of the RAI specified by the State does not
necessarily fulfill a facility's obligation to perform a comprehensive
assessment. As previously stated, Sec. 483.25 requires that a facility
ensure that each resident attains or maintains his or her highest
practicable well-being. A facility is responsible for assessing areas
that are relevant for individual residents, regardless of whether they
are included in the RAI. For example, in completing the MDS, the
assessor simply indicates whether or not a factor is present. If the
MDS indicates the presence of a potential resident problem, need, or
strength, the assessor should then investigate the resident's condition
in more detail. The RAPs may assist in this investigation.
Other problems that are relevant for an individual resident may not
be addressed by the RAI at all. For example, the MDS includes a listing
of those diagnoses that affect the resident's functioning or needs in
the past 7 days. While the MDS may indicate the presence of medical
problems such as unstable diabetes or orthostatic hypotension, there
should be evidence of additional assessment of these factors if
relevant to the development of a care plan for an individual resident.
Another example of resident concerns not addressed by the MDS is sexual
patterns. Some facilities have responded by creating additional
assessment tools which they complete for all residents in addition to
the State RAI. This is not a Federal requirement. Additional assessment
is necessary only for factors that are relevant for an individual
resident. Facility staff have stated that many of the items added to
version 2.0 of the MDS may eliminate the need for supplementation of
items in facility-specific assessments and will hopefully contribute to
a more comprehensive assessment for each resident.
A facility is also responsible for assessing and intervening in
response to acute or emergent problems such as respiratory distress or
fever. While this may seem obvious, surveyors have reported numerous
instances in which this has not occurred.
A facility must complete the initial assessment no later than 14
days after a resident's admission. For example, if a resident is
admitted on July 1, the assessment must be completed by July 15.
Although Federal requirements dictate the completion of RAI assessments
according to certain time frames, standards of good clinical practice
dictate that the assessment process is more fluid, and should be on-
going.
Comment: A few commenters recommended that we require the
assessment within the month that the annual is due and not a narrowly
defined ``every 365 days.''
Response: The statute requires that a facility conduct an
assessment no less often than once every 12 months. The facility should
use the completion date of the last assessment (in other words, the
date the registered nurse coordinator has certified the completion of
the assessment on the RAP Summary form in section V) to calculate when
the annual assessment is due. Current policy is that the next
assessment is due within 365 days. As we are not aware of any problems
regarding this policy, it will remain unchanged.
Comment: Commenters proposed alternatives to the requirement that a
facility complete an initial assessment within 14 days of a resident's
admission. A commenter suggested that a facility complete the MDS
within 14 days of admission, and the RAPs and care plan within 7 days
of completing the MDS (instead of completing the MDS and the RAP
process in 14 days). This would allow adequate time to complete and
document the in-depth assessment. Others believed that 21 days to
complete the assessment and 30 days to complete the care plan is
necessary.
Response: As mentioned above, the statute currently specifies the
time frame for the initial assessment, which does not allow us any
latitude. We have defined the RAI to include the MDS, triggers and
utilization guidelines, including the RAPs. Since the RAPs are part of
the comprehensive assessment, they too must be completed within 14 days
after admission or detection of a significant change. Current care
planning requirements allow 7 days after completion of the RAI for
completion of the plan of care.
Comment: A consumer advocacy group suggested that we require an
assessment similar to the quarterly review upon the resident's return
from a hospitalization, since some change in the resident's condition
had necessitated the hospital visit. Another commenter recommended that
we require an assessment when the use of restraints for an individual
increased over a prescribed threshold.
Response: We agree that it may be beneficial for the facility to
complete another assessment upon return from hospitalization or upon an
increase in restraint usage. An increase in restraint use is an example
of a situation in which a significant change reassessment is probably
necessary. If it becomes necessary to restrain a resident or increase
restraint usage, it is likely that the resident's condition has
deteriorated and there are behaviors of new onset or increased
frequency. In this case, the facility must revise the care plan. If a
resident's condition has significantly changed prior to or after
hospitalization, the facility must complete a comprehensive,
significant change assessment on the resident's return to the facility.
Some facilities have instituted a policy requiring a comprehensive RAI
assessment each time a resident is readmitted after hospitalization. We
prefer, however, to leave our requirement so that it is based on what
is clinically warranted (in other words, whether the resident's
condition meets the definition of a significant change).
Comment: Several commenters, including some State and national
provider organizations, were concerned with the impact that the 14-day
requirement would have on facilities whose residents are typically
short-stay, such as residents in hospital-based SNFs. A few wanted us
to exempt facilities which have an average length of stay less than 30
days from having to complete the assessment. Others wanted all
facilities to have 30 days in which to complete the assessment. Some
commenters suggested that we develop an alternate instrument that
pertains to the specific care needs of short-stay residents. For
example, they maintained that the MDS does not contain enough detail on
the rehabilitative aspects of care, nor does it capture important
information about a post-acute resident's health conditions. Others
proposed that we allow a facility to complete only those MDS items that
are appropriate for short-stay residents and skip the rest. A few
commenters wanted us to convene a clinical advisory panel that would
assist in identifying the clinical characteristics of short-stay
populations and determining which MDS elements are critical for them.
Response: From the comments received, it is evident that there are
a variety of strategies that people believe would be useful in dealing
with the assessment of short-stay residents. We cannot, under the law,
extend the time frame for completion of the RAI. Nor can we currently
exempt any facility certified under the long term care facility
requirements, even though it may provide care exclusively or primarily
for individuals needing a short period of rehabilitation prior to
return to the community. While we are aware that this has long been a
concern voiced by some providers, various
[[Page 67194]]
clinical experts have long believed that the majority of the RAI
gathers useful information for short-stay individuals as well as long-
term residents. In 1992 and 1993, we consulted with several panels of
expert clinicians and health professional, provider, and consumer
groups to identify MDS items that were not pertinent for short-stay
individuals. Although the panels proposed a few items as not being
relevant for short-stay individuals, there was no consensus on
eliminating any of them, and all groups agreed that all individuals in
certified facilities would benefit from the RAI assessment process.
We agree that the original MDS did not contain enough relevant
information pertinent to short-stay populations. We have added some
items to version 2.0 related to special therapies and care needs
(previously included in the MDS+, an alternate RAI used by some States)
that are very relevant for short-stay populations. A national
association representing hospital-based skilled nursing facilities
reported finding these MDS+ items useful in identifying nursing and
therapy needs for short term stay residents and for determining
Medicare coverage and subsequent reimbursement.
We have also added an item to collect information on pain that will
assist facilities in providing more focused care for short-stay
residents. Furthermore, we will clarify and add material to several of
the RAPs specific to short-stay populations as part of our contract
with the Hebrew Rehabilitation Center for Aged to refine the RAI, in an
effort to facilitate a more effective and efficient assessment for
these residents.
Moreover, as this concern has continued to be voiced by providers
and as the number of individuals undergoing a short-term, generally
rehabilitative stay in certified skilled nursing facilities has
continued to increase, we have begun to revisit this issue. We are
currently consulting with providers, consumer groups and professional
associations for the purpose of informing them about our work on
developing a module of assessment items that would be completed as an
alternative to many of the core MDS items. In this way, probably
through the use of the ``skip pattern'' logic in the MDS, facilities
providing care for ``short term stay'' individuals could perform a
standardized, reproducible assessment that is more relevant to the
resident population, while still adhering to the statutory requirement
to perform a comprehensive assessment based on the MDS.
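The ``skip pattern'' concept can be illustrated with a small sketch in
which a gateway determination controls which blocks of items are
completed. The item and section names below are hypothetical, not
actual MDS items:

```python
# Hypothetical skip-pattern logic: a short-stay determination skips
# the long-stay sections and substitutes an alternative module.
def items_to_complete(is_short_stay: bool) -> list:
    core = ["COGNITION", "MOOD", "ADL", "PAIN"]
    long_stay_only = ["CUSTOMARY_ROUTINE", "ACTIVITY_PREFERENCES"]
    short_stay_module = ["REHAB_GOALS", "DISCHARGE_PLAN"]
    # Core items are always assessed; the gateway response selects
    # which remaining block applies.
    return core + (short_stay_module if is_short_stay else long_stay_only)
```

Under such logic the assessment remains standardized and reproducible,
because the gateway response, not staff discretion, determines which
items are completed.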
Comment: Several commenters expressed the belief that the MDS is
not appropriate or does not collect enough information for special care
populations, like pediatrics, individuals with AIDS, individuals with
head injuries, individuals who are terminal and are receiving hospice
care, and properly placed residents who have mental illness or mental
retardation. The concerns were similar to those raised by commenters
who addressed short-stay residents. A State provider organization
asserted that the MDS is
designed for a homogeneous, chronic long term care resident and
suggested that we develop a variety of assessment parameters. Another
State organization stated that 70 percent of the MDS+ elements do not
apply to children. The commenter went on to say that about one-fifth of
those that do apply are demographic in nature. Commenters noted that
facility staff need to know what kinds of behavior usually heralded the
onset of a psychiatric crisis for a resident with mental illness, and
that the MDS does not sufficiently capture behavioral disorders, mood
disturbances, activity potential, and cognitive functioning for
individuals with mental illness or mental retardation. To address these
concerns, commenters recommended that we:
Waive special care populations from the RAI requirement;
Develop additional RAPs to address specific needs;
Develop additional MDS elements in modules for ``special
care'' residents;
Have skip patterns; or
Develop a new instrument.
Response: We acknowledge that the MDS may not be completely
responsive to the needs of special populations in nursing homes today.
We expect to use MDS data to gain a better sense of the clinical
characteristics and care needs of the diverse population of long term
care facilities, and to refine the RAI as it appears warranted over
time. In the meantime, some of the items that were added to the MDS are
more responsive to the needs of these residents. For example, items
that assess the presence, type, intensity, and treatment of pain were
added to version 2.0; this is particularly important for residents in a
hospice program. We have expanded significantly the MDS items
associated with mood and behavior, and also included the use of
programs for treatment of mood and behavior problems. Again, we note
that the statute does not allow us to exempt certain populations.
Comment: A State commenter requested that we exempt terminal/
hospice residents from RAI requirements since the philosophy of hospice
care is vastly different from the rehabilitative approach of the
typical nursing facility. Another State commenter noted that SNF/NF
residents who are residents of a certified hospice will have two
assessments and two care plans because of two sets of requirements; it
is possible that the care plans may be conflicting.
Response: When a resident of a Medicare participating SNF/NF elects
the Medicare hospice benefit, the hospice and the SNF/NF must
coordinate, establish, and agree upon a plan of care for both providers
which reflects the hospice philosophy and is based on an assessment of
the individual's needs and unique living situation in the SNF/NF. This
coordinated plan of care must identify the care and services that the
SNF/NF and hospice will provide in order to be responsive to the unique
needs of the individual and his or her expressed desire for hospice
care. The plan of care must include directives for managing pain and
other distressing symptoms and be revised and updated by the SNF/NF and
hospice, as necessary, to reflect the individual's current status.
Our policy is that when a resident of a SNF/NF elects to receive
Medicare coverage of services under the hospice benefit, both the
Medicare hospice conditions of participation and the SNF/NF
requirements apply. This means that the facility must assess a resident
using the RAI. Some confusion arose among the SNF/NF providers concerning
the completion of RAPs that were not clinically appropriate. We have
issued a clarification memorandum reminding providers that the RAPs are
guidelines for assessment. They are not meant as prescriptive courses
of action. Rather, they are intended as frameworks for assessment that
are clinically indicated depending on the needs of each individual
resident. For example, some of the RAP guidelines may include content
suggestive of an aggressive work-up to determine causal factors that
may not be appropriate for individuals who are terminally ill (for
example, an aggressive work-up to determine the cause of weight loss
would generally not be appropriate or expected for a resident receiving
hospice care). Many of the RAPs, however, such as ``Activities'' or
maintenance of the resident's ``Activities of Daily Living'' should
lead to more aggressive assessment if they are useful in helping
facility staff increase the resident's comfort level and ability to
attain or maintain his or her highest practicable well-being and create
an atmosphere in which the resident will be able to die with dignity.
[[Page 67195]]
It is important to remember that RAP documentation and the plan of care
may also reflect a resident's right to refuse treatment or services.
In summary, we developed the RAPs to assist facilities in planning
appropriate and individualized care for residents. As we revise the RAP
guidelines over the next few years, we intend to incorporate material
specifically related to terminal care to better address the needs of
the hospice residents residing in SNF and NFs.
Comment: Some commenters wanted changes in the proposed definition
of ``readmission.'' One asked for clarification of what a ``temporary''
absence meant, asserting that a 5-month absence could be temporary for
someone who has lived in the facility for 10 years. A State provider
organization thought that ``temporary absence'' should not be defined
only as a hospitalization, but should allow for other absences like
doctor's visits.
Response: We do not consider it to be a temporary absence when a
resident leaves a facility for a doctor's visit; we do not require that
a facility conduct a new assessment merely because of such a visit.
Readmission is defined as a resident returning to a facility from the
hospital or therapeutic leave. We consider an absence to be temporary
when the facility fully expects the resident to return. For example, if
a resident leaves the facility for a few days during a holiday season,
the nursing home would not need to complete a new assessment (unless
there has been a significant change). If the resident is absent for an
extended period of time, however, it may be difficult for the facility
to determine if a significant change has occurred, and the facility may
wish to conduct an assessment. Furthermore, if the resident is absent
for a year or more, the facility must conduct its annual reassessment
upon the resident's return. However, we are not attaching a time frame
to temporary absence. This holds regardless of where the resident went
and how long he or she was absent from the facility. This policy
recognizes that there is variation in bed hold and discharge policies
in the States.
Comment: A few commenters expressed concern with the proposed
provision in Sec. 483.20(b)(2)(i) that would allow a facility to amend
assessment information up to 21 days after admission in some
situations. One commenter thought the entire MDS was amendable. A
national provider organization recommended that we permit a facility to
correct ``technical'' items on the MDS beyond the 21st day because
these items would not alter the triggers or RAP process.
Response: In the past, we had not allowed a facility to correct
non-factual errors once the assessment was completed. Rather, these
non-factual errors were to be noted elsewhere in the resident's
clinical record (for example, progress notes). A facility corrected non-
factual errors on the next assessment (in other words, quarterly,
annual, significant change). A facility needs to complete a new MDS
when the non-factual error would have an impact on the resident's care
plan. In this case, a facility should perform another comprehensive
assessment (in other words the MDS and RAPs) within 14 days of noting
the error. We would note that non-factual errors associated with a
resident's assessment and significant change associated with the
resident are two different concepts; however, both can result in
completing a new comprehensive assessment. As discussed below, we are
deleting the 21-day provision.
Comment: A State provider group disagreed with our proposed
delineation in Sec. 483.20(b)(2)(i)(B) of categories within the MDS
that can be amended, because the commenter did not believe that
facilities and surveyors would be able to consistently differentiate
which items on the MDS could be changed. The commenter proposed
changing the requirement to read ``Further resident observation and
interaction indicates a need to alter the initial assessment.''
Response: The provision to amend certain sections within 21 days
has been confusing for facilities. We are deleting the 21-day
provision. We require that a facility complete the MDS and RAPs within
14 days of a resident's admission, within 14 days of a significant
change in a resident's status, and at least annually. By the fourteenth
day, the registered nurse must sign and date the RAP Summary form to
signify that the assessment is complete within regulatory time frames.
Within 7 days of completing the assessment, the facility must:
- Encode the MDS and RAP Summary in a machine-readable format;
- Run the encoded MDS through edits specified by us. The facility must
correct any information on the encoded MDS that does not pass
HCFA-specified edits.
Within 7 days of completing the assessment, the facility must be
able to transmit the edited MDS and RAP Summary form to the State
according to State or Federal time frames. Therefore, the facility
must:
- ``Lock'' the edited MDS record;
- Certify that the MDS meets HCFA-specified edits; and
- Print the edited MDS and RAP Summary form and place them in the
resident's record. The hard copy of the assessment must match the
assessment that the facility transmits to the State. A facility must,
therefore, correct the hard copy to reflect changes associated with the
edit correction process.
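The ordering of the steps above (correct until the record passes every edit, then lock, certify, print, and transmit) can be sketched as a toy pipeline. Everything here is hypothetical; the function names, the edit-check interface, and the record layout only illustrate the sequencing described in the rule, not any actual HCFA or vendor software.

```python
def run_edits(record, checks):
    # Return the names of the edit checks the record fails (illustrative
    # stand-in for the HCFA-specified edits).
    return [name for name, passes in checks.items() if not passes(record)]

def finalize_assessment(record, checks, correct):
    # Correct the encoded record until it passes every specified edit.
    # Corrections happen BEFORE locking so that the printed hard copy
    # matches the record transmitted to the State.
    while failures := run_edits(record, checks):
        for name in failures:
            record = correct(record, name)
    # Lock and certify the edited record; no further changes are allowed.
    return dict(record, locked=True, certified=True)

# Toy example: a single edit check requiring a birth date.
checks = {"birth_date_present": lambda r: bool(r.get("birth_date"))}
fixed = finalize_assessment(
    {"resident_id": "0001"}, checks,
    correct=lambda r, name: dict(r, birth_date="1920-01-01"),
)
print(fixed["locked"], fixed["certified"])  # True True
```

The point of the sketch is only that locking follows the edit-correction loop, which is why the rule requires the hard copy to reflect edit-correction changes.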
We believe that this change eliminates the confusion for facilities
as to what sections could be changed. It will also decrease the number
of corrections the facility will have to make and subsequently transmit
to the State due to changed assessment information.
In Sec. 483.20 (b)(2)(ii) and (b)(2)(iii), we proposed that a
facility must assess current residents of a nursing facility by October
1, 1991, and residents of a skilled nursing facility by January 1, 1991.
We are deleting paragraphs (b)(2)(ii) and (b)(2)(iii) because these
requirements are no longer necessary. They were necessary when the
proposed regulation was written to make sure that individuals already
residing in long term care facilities were comprehensively assessed
according to the new requirements.
Comment: There were many comments related to the definition of
significant change at proposed Sec. 483.20(b)(2)(iv). Commenters
proposed amendments to the definition, deletions to the definition, and
additions to the definition. These comments follow.
Several commenters were concerned that the definition leaves too
much room for interpretation and were particularly concerned about how
this would be evaluated during the survey process. One commenter
pointed out that the definition for significant change leaves much to
the professional judgment of the surveyor to decide what constitutes a
significant change. A few suggested that we delete ``or should have
determined'' from the criterion for significant change because it
invites surveyor second-guessing of facility multi-disciplinary staff
judgment long after the fact.
Other comments related to the notion of permanency in the
definition. Commenters asserted that the distinction between acute and
chronic changes is often difficult to determine, and that the emphasis
on permanency of the change is too exclusive. Some commenters preferred
the language in the State Operations Manual at Appendix R or in the
original RAI Training Manual. They believed that there is inconsistency
between the proposed regulation and the State Operations Manual and
training manual; for example, in the definition of ``permanent.''
[[Page 67196]]
Commenters wanted us to clarify what permanent means. Another requested
that we delete ``permanent'' and ``apparently permanent'' from the
criterion, and that we add ``is significant (major) or likely to be
permanent.'' The commenter believes that this will be more consistent
with the State Operations Manual Transmittal No. 250, which contains
surveyor guidelines and protocols.
A commenter was concerned about whether the examples of significant
change in the proposed regulation were intended to be all-inclusive,
and believed they should be expanded and clarified. For example, the
commenter believed that the regulation should clarify what a ``sudden
improvement in resident status'' means.
A few commenters, including a national and a State provider
organization, recommended that we change proposed paragraph (b)(2)(iv)
to read ``within 14 calendar days after the facility determines * * *
that there has been a significant decline or improvement in the
resident's physical or mental condition such that in the clinical
judgment of the assessor the change in condition appears to be major or
permanent.'' They believed that this wording would be more consistent
with the original training manual.
A few commenters believed that proposed paragraphs (b)(2)(iv) (A)
and (G) are redundant. One commenter was confused as to which elements
of the MDS the facility reviews in determining if a significant change
has occurred according to the criterion at paragraph (b)(2)(iv)(A). A
consumer advocacy organization wanted paragraph (b)(2)(iv)(A) to read
``Apparent permanent deterioration or improvement in two or more
activities of daily living or apparent deterioration or improvement in
any combination of two or more activities of daily living,
communication or cognitive abilities.''
A few provider organizations wanted the criterion revised to read
``Deterioration in behavior or mood to the point where daily problems
arise or relationships have become problematic.'' This wording would be
more consistent with the original training manual.
A few recommended that we delete ``requires staff intervention'' or
else clearly define the phrase. Commenters suggested that we change the
wording to ``benefits from staff intervention.'' One believed that the
criterion should not be limited to situations requiring staff
interventions because there may be instances in which deterioration is
not perceived by staff as disruptive or detrimental, and staff would,
therefore, not intervene. For example, staff would not intervene, in
the commenter's scenario, in a case in which a resident is depressed
and whose behavioral presentation is passive.
One suggestion was to reword paragraph (b)(2)(iv)(D) to read ``A
marked or sudden deterioration in a resident's health status * * *''.
This would clarify that this criterion does not include the expected
clinical progression of a given diagnosis or condition.
A few commenters suggested that we delete paragraph (b)(2)(iv)(D)
because it is too subjective. One commenter stated that this criterion
would have surveyors citing facilities for everything; for example,
just the fact that the resident is old means that their life may be in
danger of ending.
A commenter suggested deleting ``a factor associated with'' at
paragraph (b)(2)(iv)(E) because it does not add anything to the
definition. Others offered suggestions for clarifying the criterion at
paragraph (b)(2)(iv)(E). A few commenters proposed adding ``* * * that
has not responded to treatment in the last 14 days,'' which would give
the clinician a time frame in which to evaluate the effectiveness of an
intervention. Another commenter proposed adding ``* * * that has not
responded to treatment within clinically accepted time period
standards.''
A national provider group proposed that we delete the criterion at
paragraph (b)(2)(iv)(F) and replace it with ``improved behavior, mood,
or functional health status to the extent that the established plan of
care no longer matches what is needed by the resident.'' The commenter
believed that this would confine the definition of change to a
functional measure and focus the criteria on a positive outcome.
A commenter suggested that we add two criteria to paragraph
(b)(2)(iv): ``(iv)(H) Potentially reversible deterioration in mental
functioning due to suspected delirium. (iv)(I) Deterioration in a
resident's family or social circumstances which places the resident's
psychosocial well being in danger.'' The commenter believed that the
criteria, as published in the proposed rule, do not identify changes
that may be temporary, but which could be noteworthy. Furthermore, the
commenter does not believe that enough attention has been paid to the
psychosocial aspects of change.
One State commented that the definition should not be in the
regulation text, but should remain in interpretive guidelines,
asserting that it will affect the objectivity of the assessors in
determining significant changes since these guidelines will become more
concrete.
Response: These substantial comments warranted an extensive
evaluation of the definition of a significant change assessment. Over
the past several years, we have
been providing clarification regarding the significant change
reassessment requirement in surveyor training and other training that
we have conducted, as well as through verbal and written communication
to States and providers. We believe that it is necessary to include the
definition of significant change in the regulation text. However, the
definition contained in this final regulation is dramatically altered
from that which appeared in our proposed rule, largely in response to
the comments we received and the collective experience of providers and
States since implementing the RAI process in 1990. This changed
definition will remain in the regulation text to reinforce a facility's
responsibility to conduct significant change reassessments.
A key to determining whether a significant change has occurred is
whether the resident's status has changed to the extent that the plan
of care no longer reflects the resident's needs and the facility's plan
to address them.
We are revising the definition of significant change, as follows: A
significant change means a decline or improvement in a resident's
status that will not normally resolve itself without intervention by
staff or by implementing standard disease-related clinical
interventions, that has an impact on more than one area of the
resident's health status, and requires interdisciplinary review or
revision of the care plan, or both. An example of a condition that will
normally resolve itself without intervention by staff is a resident's
5-pound weight loss, which would trigger a significant change
reassessment under the old definition. However, if a resident had the
flu and experienced nausea and diarrhea for a week, a 5-pound weight
loss may be an expected outcome. If the resident did not become
dehydrated and started to regain weight after the symptoms subsided, a
comprehensive assessment would not be required. Generally, if the
condition has not resolved at the end of approximately 2 weeks, staff
should begin a comprehensive assessment.
A significant change reassessment is probably indicated if decline
is consistently noted in two or more areas, or if improvement is
consistently noted in two or more areas:
[[Page 67197]]
Decline
- Any decline in activities of daily living physical functioning in
which a resident is newly coded as 3, 4 or 8 (Extensive assistance,
Total dependency, Activity did not occur);
- Increase in the number of areas where Behavioral Symptoms are coded
as ``not easily altered'' (for example, an increase in the use of code
1 for E4B);
- Resident's decision-making changes from 0 or 1 to 2 or 3;
- Resident's incontinence pattern changes from 0 or 1 to 2, 3 or 4, or
placement of an indwelling catheter;
- Emergence of sad or anxious mood as a problem that is not easily
altered;
- Emergence of an unplanned weight loss problem (5 percent change in 30
days or 10 percent change in 180 days);
- Initiation of a trunk restraint or a chair that prevents rising, when
one was not used before;
- Emergence of a condition or disease in which a facility judges a
resident to be unstable;
- Emergence of a pressure ulcer at Stage II or higher, when no ulcers
were previously present at Stage II or higher; or
- Overall deterioration of resident's condition; resident receives more
support (for example, in activities of daily living or
decision-making).
Improvement
- Any improvement in activities of daily living physical functioning
where a resident is newly coded as 0, 1 or 2, when previously scored as
a 3, 4 or 8;
- Decrease in the number of areas where Behavioral Symptoms or Sad or
Anxious Mood are coded as ``not easily altered;''
- Resident's decision-making changes from 2 or 3 to 0 or 1;
- Resident's incontinence pattern changes from 2, 3 or 4 to 0 or 1; or
- Overall improvement of resident's condition; resident receives fewer
supports.
We may revise this list over time, eliminating or adding items as
well as other situations that meet the significant change definition.
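The two-or-more-areas screen and the unplanned weight loss thresholds above can be expressed as a minimal sketch. The function names and inputs are illustrative assumptions (the rule does not prescribe any computation); only the numeric thresholds (5 percent in 30 days, 10 percent in 180 days, decline or improvement in two or more areas) come from the text.

```python
def unplanned_weight_loss(current_lbs, lbs_30_days_ago, lbs_180_days_ago):
    # Thresholds from the rule text: 5 percent change in 30 days or
    # 10 percent change in 180 days signals an unplanned weight loss problem.
    loss_30d = (lbs_30_days_ago - current_lbs) / lbs_30_days_ago
    loss_180d = (lbs_180_days_ago - current_lbs) / lbs_180_days_ago
    return loss_30d >= 0.05 or loss_180d >= 0.10

def reassessment_indicated(decline_areas, improvement_areas):
    # A significant change reassessment is probably indicated when decline
    # (or improvement) is consistently noted in two or more areas.
    return len(decline_areas) >= 2 or len(improvement_areas) >= 2

# Illustrative use: a 10-pound loss from 190 pounds over 30 days
# exceeds the 5 percent threshold.
print(unplanned_weight_loss(180, 190, 190))                    # True
print(reassessment_indicated(["ADL coding", "incontinence"], []))  # True
```

Such a check is a screen, not a determination; under the definition above, staff must still judge whether the change will resolve without intervention and whether the care plan needs interdisciplinary review or revision.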
In an end-stage disease status, a full reassessment is optional,
depending on a clinical determination of whether the resident would
benefit from it.
We believe that this definition is clearer than the proposed
definition. It also addresses many of the commenters' concerns,
including noting that the change can be for improvement or
deterioration, and eliminates the need to interpret whether a change is
permanent.
A self-limited condition is a condition of limited duration that will
run its course without intervention. Because a self-limited condition
implies a decline in status, we are retaining the phrase ``a sudden or
marked improvement.''
Comment: Commenters requested that we specify that the time limits
for reassessments begin once the assessor makes a clinical
determination that the change in resident status is permanent, major,
or both (in other words, within 14 days). This would prevent an
inconsistent outcome.
Response: In paragraph (b)(2)(iv), we proposed that the facility
must conduct the reassessment within 14 days after the facility
determines that a significant change has occurred. We are retaining
this provision (in Sec. 483.20(b)(2)(ii)).
Comment: Some commenters addressed the overall goal of
reassessments due to significant change. One commenter stated that the
clinical goal should be to identify functional changes and evaluate
their source. Early identification of illness, injury, etc., may allow
intervention to reverse and prevent permanent loss of function. The
commenter cautioned that the evaluations can be expensive and counter-
productive. Others maintained that some changes are the natural result
of the aging process or of disease processes like Alzheimer's disease.
Some believed that these changes can be anticipated and care planned
without conducting a new assessment. A commenter wanted us to add a new
criterion to the definition for potentially reversible deterioration in
mental functioning due to suspected delirium.
Response: We believe the commenter's suggestion for a new criterion
is included under the new definition. The primary role of the RAPs,
which a facility also must complete for a significant change
reassessment, is to help the facility to identify causal or risk
factors that can be eliminated or minimized. Completing the RAP process
helps the facility determine what services the resident needs. It would
be more costly if the facility does not detect a significant change and
the resident is allowed to decline. The resident could develop
complications from the onset of a health problem or require
hospitalization. Furthermore, significant change reassessments will
help the staff to determine if a change is the expected result of a
disease process or could be reversed. Such would be the case in a drug-
induced delirium.
Comment: A few commenters thought that the final regulation should
allow for consultation with a physician (the medical director, for
example) to determine the significance or permanence of a resident's
change. Therefore, they maintained, the facility staff would not have
the responsibility to make the determination and would not be cited for
it.
Response: We encourage consultation with physicians, but it is not
our intent to absolve facilities from their responsibility to monitor
resident status. The statute requires that a registered nurse conduct
or coordinate the assessment. The registered nurse, by virtue of
licensure requirements and State practice acts, has responsibility for
assessing and monitoring an individual's status, and notifying a
physician, as is warranted by changes in the individual's status.
Proposed Sec. 483.20(b)(3), Quarterly Review (Redesignated as
Sec. 483.20(c))
Comment: Several commenters discussed the proposed quarterly review
requirements. Most agreed that a facility should assess a resident at
least quarterly. A few, including a State, wanted us to mandate the use
of a standard form. They believe that this would provide consistency.
Response: OBRA '87 required that a long term care facility examine
each resident no less frequently than once every 3 months and, as
appropriate, revise the resident's care plan. We are accepting the
recommendation to mandate a standard instrument. Not only will this
provide consistency across the nation, but it will facilitate
computerization of the quarterly review assessment items. In keeping
with the Federal requirement for a uniform resident assessment
instrument, the Quarterly Review form is considered part of the RAI.
States may modify this form or use an alternate instrument by
submitting a request to us. However, each State's Quarterly Review form
must include at least those items on the HCFA Quarterly Review form.
Comment: One commenter said that the requirement for a quarterly
assessment should be taken in the spirit of four times a year and not a
rigid every 90 days. This would allow the facility to be flexible so
that a resident's health status or the facility schedule could be taken
into account.
Response: If a resident is experiencing a transient condition or is
out of the facility when his or her quarterly review is due, the
facility can wait until the resident's condition stabilizes or the
resident returns to the facility. The facility should document the
circumstances associated with the delay in conducting the quarterly
review. Regarding timing of the quarterly review, we draw from the
statutory language, which states that the facility must examine each
resident no less
[[Page 67198]]
frequently than once every 3 months. This is also consistent with the
regulations in effect prior to the publication of the proposed
regulation. We would also point out that the calculation of when the
quarterly review is due is based on when the facility completed the
last assessment or quarterly review. For example, if a facility completed
a quarterly review March 1, and completed a significant change
reassessment April 15, the facility must complete the next quarterly
assessment no later than July 15. If there had not been a significant
change, the next quarterly assessment would have been due no later than
June 1.
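The due-date arithmetic in the example above can be sketched as follows. The helper names are illustrative, and treating ``once every 3 months'' as calendar-month arithmetic (with the day clamped to the end of shorter months) is an assumption; the statute states only the 3-month outer limit.

```python
from datetime import date
import calendar

def add_months(d, months):
    # Advance a date by whole calendar months, clamping the day to the
    # last day of the target month (e.g., Mar 31 + 3 months -> Jun 30).
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def next_quarterly_due(last_assessment_or_review):
    # The next quarterly review is due no later than 3 months after the
    # most recently completed assessment or quarterly review.
    return add_months(last_assessment_or_review, 3)

# The preamble's example: quarterly review completed March 1; a
# significant change reassessment completed April 15 resets the clock.
print(next_quarterly_due(date(1997, 3, 1)))   # 1997-06-01
print(next_quarterly_due(date(1997, 4, 15)))  # 1997-07-15
```

The key point the sketch captures is that any completed assessment, not only a quarterly review, restarts the 3-month interval.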
Comment: A few commenters wanted us to provide that the quarterly
review determines if a comprehensive reassessment is necessary.
Furthermore, they stated that the care plan may need to be revised as a
result of the quarterly assessment. One commenter proposed that
quarterly reviews must also review any section of the MDS relevant to
problems triggering or found in the assessment.
Response: The purpose of the quarterly review is to ensure that the
resident assessment data is accurate, and that the facility continues
to have an accurate picture of the resident, in order to monitor and
plan care. If the quarterly review indicates that a significant change
has occurred, the facility needs to conduct a comprehensive
reassessment. This also applies to the comment proposing a requirement
to review other areas in the MDS if the quarterly review finds a
problem. The facility is not limited to only reviewing the required
portions of the MDS that comprise the quarterly review. While we
encourage facilities to review any section that might be relevant to an
individual resident, we are not requiring at this time that a facility
review particular sections. We are providing in Sec. 483.20(d) that the
facility must revise the plan of care when indicated by the assessment.
Comment: Several commenters wanted additional MDS items or sections
required as part of the quarterly review. One commenter thought that
the quarterly review should include the entire MDS, providing
additional longitudinal information for an outcome-based quality
assessment system. Some commenters wanted all or a portion of Section
N, Skin Condition, from the MDS+, added. One commenter noted that skin
condition is a vital part of nursing care and the resident's
psychosocial well-being. Others wanted at least one item added from
former Sections B, Cognitive Patterns, C, Communication/Hearing
Patterns, E, Physical Functioning and Structural Problems, F,
Continence in Last 14 Days, H, Mood and Behavior Patterns, K, Health
Conditions, L, Oral/Nutritional Status, and P, Special Treatments and
Procedures.
Response: We are not requiring that a facility complete the entire
MDS on a quarterly basis, as we thought the additional burden this
would impose was not warranted clinically. Based heavily upon
suggestions submitted by commenters, we have added several items to the
quarterly review, including an item on skin condition. The primary use
of the quarterly assessments is to regularly ensure that the care plan
is responsive to the needs of the resident. A secondary use of the
information collected through quarterly assessments is that of quality
monitoring at the resident and facility level. Some of the items on the
quarterly review form have been identified as quality measures. An
example of a quality indicator is urinary incontinence. The MDS item
H.1 is one of the items that we use to monitor quality of care
associated with urinary incontinence. A mandated quarterly review
assessment will provide for the consistent collection and use of such
data.
We are also requiring that a facility transmit its quarterly review
assessment records to the State. There are several reasons for this.
Analysis of resident-level data over time is necessary to generate
quality measures (in other words, a quality indicator system requires
quarterly assessment data for each resident). As noted in the
discussion on establishment of the national data base, a facility can
identify opportunities to improve its own outcome and care practices
through the quality measures. Quarterly data will also help a facility
in its quality assurance program. Furthermore, if MDS data is to be
used for quality monitoring purposes by surveyors, it must be timely.
This means that we must require facilities to transmit their quarterly
review records, in addition to admission, annual and significant change
assessments, in order to use MDS data in the long term care survey
process. Leading researchers and survey experts in this area believe
that quarterly data is absolutely necessary for the timely and reliable
identification of resident outcomes, both at the facility level and the
resident level.
Proposed Sec. 483.20(b)(4) Use (Redesignated as Sec. 483.20(d))
Comment: A few commenters requested a definition of maintaining
``all resident assessments.'' They were confused as to whether this
meant just the MDS, or also the Identification Face Sheet,
documentation of the quarterly reviews, the RAP Summary sheet, and
information pertaining to the decision to proceed to care planning.
Response: ``All resident assessments'' includes all the documents
mentioned by the commenters--all MDS forms (Sections AA through R, and
V--the RAP Summary Form), the Quarterly Review forms, and Discharge and
Reentry forms. The RAP Summary form indicates which RAPs were
triggered, whether care planning was done for each of the RAP
conditions, and where data from the RAP assessment process is
documented.
We also require that a facility complete a subset of items when a
resident is discharged, which includes identifying information about
the resident, the type of discharge and the destination upon discharge.
As mentioned in the discussion on the national data base, we are also
requiring that the facility transmit this information to the State and
to us. This will allow for the closure of a resident's current stay at
the facility. Furthermore, we are requiring that a facility complete a
subset of MDS items upon a resident's reentry to the facility
(Sec. 483.20(f)). This will allow the facility to ``reopen'' the
resident's record in the facility system as well as at the State and
national levels.
Comment: We requested that the public advise us on what the
requirements should be for facilities to keep a hard copy of the MDS on
a resident's file if the assessment is computerized. A few commenters
urged us to allow flexibility. One pointed out that society is making
large strides toward paperless environments, and a Federal regulation
should not inhibit such progress. Other commenters thought that a hard
copy should remain in the resident's record even if the assessment is
computerized. Commenters recommended that hard copies stay in the
record for 2 years, as the proposed regulation discussed. Another
commenter suggested 1 year. A consumer advocacy group noted that the
hard copy should always be accurate because it is the version most
likely to be used by direct care staff. A State said that a hard copy on the
record is essential because appropriate staff may not always be
available to retrieve data from the computer. A few commenters did not
want us to require that a facility keep a hard copy of the MDS in a
resident's active record. Commenters believed that paper records are
expensive to maintain, and it should be acceptable if a hard copy were
readily
[[Page 67199]]
accessible to staff, residents, and surveyors. A State commenter
thought that a coding system would need to be created to handle old
assessments and reassessments. Commenters submitted other ideas. One
suggestion was to keep the original MDS on the chart and not require a
computerized copy. Another was to allow either the original or a
computerized version. A State suggested printing a hard copy at the
time of survey, and that all electronic assessment records should be
created according to the intervals called for under the MDS. The
commenter believed that the computerized system should be able to save
or change information as needed. Another State said that we should
require no further storage of a hard copy version once the facility
produces and transmits a computer version.
Response: In order for assessments to be used as intended by clinical
staff at all levels, we believe that it is necessary for the facility
to keep a hard copy of all assessments for the past 15 months in the
resident's record. This issue is also discussed in responses to comments on
Sec. 483.20(b)(5). We agree that direct care staff would be most likely
to use a hard copy of the assessment, and believe that it would be
problematic for clinical staff to be expected to retrieve assessments
from the computer, both in terms of their ability and willingness to do
so and also having the necessary equipment available on all clinical
units. Unless all charting is computerized, we believe that a facility
should maintain RAI assessments as a part of the resident's clinical
record. However, if a facility has a ``paperless'' system in which each
resident's clinical record is entirely electronic, the facility does
not need to maintain a hard copy of the MDS. To qualify for this
exception, the facility's MDS system must meet the following minimum
criteria:
- The system must maintain 15 months' worth of assessment data (as
required in Sec. 483.20(d)) and must be able to print all assessments
for that period upon request;
- The facility must have a back-up system to prevent data loss or
damage;
- The information must always be readily available and accessible to
staff and surveyors; and
- The system must comply with requirements for safeguarding the
confidentiality of clinical records.
Furthermore, the facility must maintain evidence that identifies
the Registered Nurse Assessment Coordinator and other staff members
that completed a portion of the assessment.
Comment: Commenters expressed concern with the proposed requirement
to maintain assessments from the previous 2 years on the resident's
clinical record. One stated that the intent of this requirement is
unclear, as it does not appear to serve any purpose for facility staff
in care planning, in that facility staff will be using the most recent
assessment information they have to aid them in the development of the
care plans. According to commenters, maintaining 2 years' worth of
assessment data in the resident's active record would be too bulky and
cumbersome. It could even add to facility costs associated with
purchasing large chart binders and chart racks. One commenter stated
that the full 2-year cycle of a resident would have approximately 42
pages of assessment documentation in the chart. If the resident had two
episodes of ``significant change'' in that time period, this would add
an additional 18 pages. Commenters maintained that a thick record would
be prohibitive and intimidating, adding that quantity does not always
translate into quality.
Several commenters maintained that surveyors look at only the
previous year's assessment information. Also, the MDS does not require
that the assessor look back over more than 180 days, so 1 year's worth
of data would be sufficient. They stated that earlier assessment
information would be easily retrievable from the record if needed.
Commenters asserted that medical information that is more than 12
months old is likely irrelevant and outdated. A commenter believed that
the regulation's intent could be met if historic materials were
retrievable and available to the assessor during the reassessment and
course of care. A commenter suggested that we require that the facility
maintain all full comprehensive resident assessments completed within a
12-month period in a resident record. One commenter wanted the 2-year
requirement to be effective on the date of the final rule.
Response: The original intent of the proposed requirement was to
enable a facility to better monitor a resident's decline and progress
over time. We are not able to determine if requiring that a facility
maintain assessment information for a 2-year period has facilitated the
analysis of this longitudinal data. We believe that the information is
necessary to evaluate the resident's plan of care, but have decreased
the required time period to 15 months of assessment records, since the
survey cycle allows for up to 15 months between surveys. Additionally,
computerizing MDS records will allow a facility to access prior
assessments in a timely and more efficient manner.
Comment: A professional organization did not believe that 2 years
of assessment data was enough to capture a decline in the resident's
status and thought that we should require a facility to maintain 3
years of assessment data. Another suggestion was that we require a
facility to maintain at least two comprehensive assessments in the
record with the appropriate quarterly review and RAP summary forms.
Response: Requiring that a facility maintain assessment data on a
resident's record for 3 years would be too cumbersome for most
facilities; however, a facility can maintain as many years of
assessment information as it likes. It is possible that having this
amount of longitudinal data would be helpful for a facility in tracking
resident progress. However, we are only requiring that a facility keep
15 months of the documentation associated with the RAI in the
resident's active record.
Comment: Commenters requested that we permit a facility to keep
prior assessment data in a ``thinned'' chart or another appropriate
location as opposed to on the active chart. A few commenters did not
feel that we should mandate where the facility keeps documentation.
Commenters suggested that we revise the requirement to provide that the
facility must maintain in active status all resident assessments
completed within the previous 2 years and use the results of the
assessments to develop, review and revise the resident's comprehensive
plan of care.
Response: As stated above, we are revising the regulation to
require that a facility maintain 15 months of assessment records. We
would note, however, that a facility need not store assessment data in
one binder to meet this requirement. A facility may choose to maintain
the data in a separate binder or kardex system, as long as the
information is kept in a centralized location and is accessible to all
professional staff members (including consultants) who need to review
the information to provide care to the residents. It is not acceptable
for the assessment data to be stored where staff cannot easily use it.
Comment: Another suggestion was that we require the facility to make
available the 2 years of data within 1 hour of a request.
Response: We emphasize that the primary purpose of maintaining the
assessment data is so that a facility can monitor resident progress
over time. The information should be readily available at all times.
[[Page 67200]]
Proposed Sec. 483.20(b)(5) Coordination (Redesignated as
Sec. 483.20(e))
Comment: Commenters addressed the proposed requirement that the
facility coordinate the assessment with any State-required preadmission
screening program. Most who addressed this issue agreed that
coordination was needed to prevent duplicative efforts, particularly as
part of the Level II PASRR. Some, including States and provider
organizations, stated that the responsibility for coordination should
be a State function and not the facility's responsibility, noting that
a facility has little or no control over the screenings. One commenter
noted that the facility should not be penalized during a survey because
the State mental health authorities are unable to develop appropriate
plans of care. A commenter requested that we not mandate this coordination
because, in most States, coordination will be extremely difficult to
accomplish. A commenter suggested that we provide, instead, that the
facility coordinate assessments to the maximum extent possible.
Response: We agree that coordinating the MDS with Federal PASRR
requirements, to the extent practicable, will prevent duplicative
efforts and the unnecessary expenditure of resources. The proposed
regulation required that the facility coordinate ``to the maximum
extent practicable'' with the PASRR program and we are retaining this
language as is.
With respect to the responsibilities under the PASRR program, the
State is responsible for conducting the screens, preparing the PASRR
report, and providing or arranging the specialized services that are
needed as a result of conducting the screens. The State is required to
provide a copy of the PASRR report to the facility. This report must
list the specialized services that the individual requires and that are
the responsibility of the State to provide. All other needed services
are the responsibility of the facility to provide. The PASRR report
also lists some nursing facility services the State PASRR evaluator
recommends for the facility to consider including in the plan of care.
We note that the survey agency should not cite a facility when the
State fails to fulfill its responsibility. However, if a facility fails
to fulfill its responsibilities to, for example, prepare fully
developed care plans, then the survey agency may cite it.
We would also like to point out that the requirements relating to
the preadmission screening and annual resident review program were
amended on October 19, 1996 by Public Law 104-315. In summary, the
legislation amended section 1919(e)(7) of the Act by removing the
Federal requirement for the annual resident review. Section
1919(b)(3)(E) of the Act was also amended by the addition of a
requirement that a nursing facility notify the State mental health
authority, mental retardation, or developmental disability authority,
as applicable, promptly after there is a significant change in the
physical or mental condition of a resident who is mentally ill or
mentally retarded. Finally, the legislation amended section
1919(e)(7)(B) of the Act to require that the State mental health or
mental retardation authorities conduct a review and determination after
the nursing facility has informed them that there has been a
significant change in the resident's physical or mental condition. In
developing regulations to implement the new provisions of the law, we
will try to ensure that States and facilities are not subjected to
duplicative requirements or the unnecessary expenditure of resources.
Comment: Commenters were concerned that the condition of a resident
may necessitate a new comprehensive assessment done earlier than
annually, which would be administratively problematic for State mental
health authorities trying to coordinate their reviews.
Response: From the beginning of the PASRR program, a significant
change in the condition of a resident with mental illness or mental
retardation has required a judgment call to be made concerning whether
an annual resident review was necessary. While this requirement may
initially have caused some difficulty in scheduling, these procedures
should already be in place.
Comment: A few commenters submitted suggestions as to specific ways
that the RAI and PASRR could be coordinated. One suggested that we
expand items 11 and 12 in the former Section I, Identification
Information, which pertain to mental health history and conditions
related to mental illness or mental retardation. Another suggested that
we grant psychologists the same status under these regulations to
practice to the full extent of their licensure as has been recognized
under the PASRR regulations. One commenter believed that Level II
screening could serve as part of the cognitive, psychosocial, mood, and
behavior RAPs. A State commenter recommended that the mental health
authority use the MDS for nursing decisions to refer someone into the
community mental health system for further review. Another commenter
proposed that the facility forward a copy of the MDS to the State
mental health authority, and that relevant information from hospital
admissions be incorporated into the MDS.
Response: There are several elements of the MDS that could assist
in determining if the resident has mental illness or mental retardation
and whether nursing home level of care or specialized services, or
both, are necessary. We have changed the language in Section AB,
Demographic Information, of the MDS to be consistent with PASRR language
and definitions regarding mental illness and developmental disabilities.
We will further consider the coordination of the RAI and PASRR in the
development of the regulations to implement the new legislation.
Comment: A commenter suggested that we add paragraph (b)(5)(i),
which would provide that State mental health and mental retardation
authorities may determine for those residents whose mental status and/
or intellectual functioning has remained stable over a 2-year period,
based on annual resident review criteria, as defined under subpart C,
Sec. 483.100 et seq., and on-site evaluation and record review, whether
the data contained in the annual RAI/MDS is sufficient to make a
determination of continued need for NF services and/or specialized
services, or whether further evaluation is required. The commenter
believed that much of the information needed for Level II screening can
be obtained from the RAI, especially for long-standing nursing home
residents with mental illness or mental retardation. The State mental
health authority would still be making the determination of level of
services as required under the PASRR requirements.
Response: We agree, as noted above, that the RAI data may serve as
the basis for State mental health and mental retardation authorities to
evaluate and make determinations about the need for NF care and for
specialized services. However, section 1919(e)(7) of the Act prohibits
a State mental retardation authority and a State from delegating their
responsibilities to a nursing facility or to an entity that has a
direct or indirect affiliation or relationship with a facility.
However, those responsible for conducting the evaluations should use
applicable up-to-date data from the MDS.
Comment: A State commenter suggested including results of the PASRR
reviews on the MDS, for example the dates of the reviews, special
needs, dates of recent
[[Page 67201]]
hospitalizations, and whether the resident needs specialized services.
Response: We encourage facilities to keep the results of a
resident's PASRR with his or her MDS. We are not mandating that a
facility record PASRR information on the MDS. The decision about how
much information to share with a facility is up to the State's
discretion, as is the choice of assessment instrument and the
coordination of the various assessments. We believe that a State should
have the flexibility to determine what a facility must retain.
Comment: A State commenter submitted several MDS elements that help
them identify residents who have mental illness or mental retardation
(including a list of ICD-9 codes recorded in the former Section J that
would indicate a developmental disability). The commenter noted that
RAI software exists that enables them to make this determination. Other
MDS items are useful in deciding if someone is exempt from PASRR
because of terminal illness, dementia, or a severe medical condition.
Response: We concur that several MDS items would be helpful in
identifying residents with mental illness or mental retardation. We
encourage States, in developing or refining their PASRR programs, as
well as individuals performing surveys of facilities and those
conducting preadmission screening under Public Law 104-315, to use this
information to the maximum extent possible. We disagree with the commenter who
suggested that an individual with a terminal illness, dementia or a
severe medical condition is exempt from the screening requirements. We
believe the commenter misconstrued the current requirement at
Sec. 483.130, which permits a State to make advance group
determinations when included in an approved State plan. Categorical
determinations are categories for which the State mental health or
mental retardation authorities may make an advance determination that
nursing home services or specialized services are needed for an
individual with mental illness or mental retardation. These categories
may include cases in which the resident has received convalescent care
after an acute physical illness that required hospitalization and does
not meet the criteria for an exempt hospital discharge. Dementia is not
considered a serious mental illness for the purposes of PASRR.
Therefore, a person with a primary diagnosis of dementia would not be
considered to have mental illness and would not be subject to PASRR
screening (unless he or she is also mentally retarded).
Proposed Sec. 483.20(b)(6) Automated Data Processing Requirement
(Redesignated as Sec. 483.20(f))
Comment: Several commenters believed that the proposed October 1,
1994 date for capability of computerization was unrealistic. A national
provider organization stated that, based on the regulation process and
time frames, it was possible that we would require that the systems be
in place before the final rule was published, and this would be unfair.
Commenters offered alternative dates, which included an implementation
date of October 1, 1995; at least 2 years from the effective date of
the final rule; and postponing implementation until a reimbursement
mechanism is in place. Another suggestion was that we publish a rule
specifically on computerization.
Response: We agree that an implementation date for facility
computerization of October 1, 1994 should be deferred until June 22,
1998.
In redesignated Sec. 483.20(f), we are adding the requirement that
a facility transmit at least monthly to the State all assessments
completed in the previous month. This includes admission assessments,
significant change reassessments, annual reassessments, quarterly
reviews, and information captured upon reentry to the facility,
transfer, discharge and death. We are requiring the latter information
for a number of reasons. States that are already computerized have
noted that this information is required to close out the resident's
record at the State level for the facility from which the resident was
discharged. We are aware that there are some States which, for Medicaid
payment purposes, must know where Medicaid recipients are every 24
hours. Information upon reentry, transfer, discharge and death will
allow State and Federal agencies to analyze long term trends in
resource utilization, particularly in regard to movement across
various types of care providers. Additionally, discharge information
will permit facilities to close out residents' records on their system.
In the State Operations Manual, we will provide facilities with
instructions on which MDS items must be completed to document this
information. Furthermore, as discussed elsewhere, we believe that the
information will provide facilities with invaluable data they can use
in a variety of ways.
Comment: A State commenter asserted that we should develop
penalties for non-compliance regarding the computerization requirement.
The commenter questioned whether the penalties would fall on individual
facilities, States, or both. The State suggested that, as an alternate
to penalties, we could provide monetary incentives for timely and
accurate submission.
Response: The requirements to encode the assessments in a machine-readable
format and transmit the information to the State are like all
other requirements that a facility must meet to participate in the
Medicare and Medicaid programs. We believe that computer-aided data
analysis facilitates a more efficient, comprehensive and sophisticated
review of health data. Manual record reviews, on the other hand, are
labor intensive and more time consuming, and may, therefore, tend to be
more occasional or anecdotal. Additionally, utilization of the quality
measures and other types of quality monitoring, such as observation of
trends and patterns, is enhanced through computer aided data analysis.
Facility noncompliance with requirements established by this final
rule will be subject to the full range of enforcement remedies set
forth in part 488, subpart F, Enforcement of Compliance for Long-Term
Care Facilities with Deficiencies. However, at a minimum, we will
require that a facility complete a plan of correction and we will
impose the mandatory denial of payment for new admissions sanction if
the facility has not achieved substantial compliance within 3 months
from the date of the finding of noncompliance. Further, if the facility
is still not in compliance within 6 months from the date of the
finding, we will terminate its provider agreement. We may impose one or
more other remedies, as determined by us or the State in accordance
with part 488. Additionally, noncompliance that is repeated or that
recurs intermittently becomes part of the facility's noncompliance
history, which is a factor when we or the State selects the appropriate
enforcement response. A facility that demonstrates little or no
commitment to continual, rather than cyclical, compliance will be
sanctioned by us accordingly. We are not offering incentives for timely
and accurate submission at this time, but may consider such a concept
as we revise the survey process.
Proposed Sec. 483.20(c) Accuracy of Assessments (Redesignated as
Sec. 483.20(g))
Proposed paragraph (c) described the requirements regarding who
conducts and coordinates the assessment, certifying its completion and
accuracy,
[[Page 67202]]
and penalties for knowingly and willfully falsifying the assessment. In
this final rule, we are redesignating content of proposed paragraph (c)
related to accuracy of assessments as paragraph (g), coordination, as
paragraph (h), certification, as paragraph (i), and penalties for
falsification, as paragraph (j).
Proposed Sec. 483.20(c)(1) Coordination (Redesignated as
Sec. 483.20(h))
Comment: Commenters requested clarification on the definition of
``health professionals.'' Some, including a State commenter, wanted to
know if nurse aides who are on the State's nurse aide registry could
complete and document portions of the assessment.
Response: A licensed health professional, as defined at
Sec. 483.75(e), includes a physician, physician assistant, nurse
practitioner, physical, speech or occupational therapist, physical
or occupational therapy assistant, registered professional nurse,
licensed practical nurse, or licensed or certified social worker.
Furthermore, the definition of nurse aide, at Sec. 483.75(e),
specifically excludes licensed health professionals.
A facility may assign responsibility for completing the RAI to a
number of qualified staff members. It is the facility's responsibility
to ensure that all participants in the assessment process have the
requisite knowledge to complete an accurate and comprehensive
assessment. In most cases, participants in the assessment process are
licensed health professionals. Some State licensure and practice acts
specifically prohibit nursing assistants, and in some cases licensed
practical nurses, from conducting assessments. While nurse aides
certainly can and should contribute their knowledge of the resident to
the assessment process, nurse aides typically are not trained in
specific assessment skills, some of which require a significant amount
of knowledge.
Comment: A commenter stated that staff who are mandated to
complete certain sections of the assessment, such as gait and movement,
behavior, and aspects of incontinence, do not have the appropriate
skills, clinical experience, or training to understand and assess the
issues involved. The commenter stated that surveyors lack this
expertise and training also.
Response: We are not requiring that specialized professionals
complete any sections of the MDS. As stated in the previous response, a
facility must ensure that staff conducting the assessment have the
requisite knowledge to accurately complete the assessment. We disagree
with the generalization that facility staff and surveyors do not have
the skills and training necessary to accurately assess residents. We
conduct a significant amount of training for surveyors on how to gauge
the accuracy of assessments. Provider groups and facilities also
conduct training in these areas.
Comment: A commenter expressed concern that requiring the
participation of professionals other than registered nurses could place
a burden on a facility that does not employ staff in certain
disciplines. The commenter recommended that we combine the
requirements for coordination and certification to provide that each
assessment must be conducted or coordinated by a health professional,
in cooperation with other health professionals, as desired, and that a
registered nurse must review, sign and certify the completion of the
assessment.
Response: See previous responses. We do not require the
participation of specialized professionals other than registered
nurses. The personnel participating in an assessment are determined by
the needs of the individual resident. For someone who has significant
rehabilitation potential, for example, it would be reasonable for a
physical therapist to conduct part of the assessment. It is acceptable,
though, for a registered nurse to conduct the entire assessment as long
as it is accurate.
Comment: A consumer advocacy organization suggested that we
prohibit the use of assessment nurses hired solely for the purpose of
completing the MDS and who have no relationship to care provided. This
suggestion was based on a reference in the preamble to the proposed
rule (p. 61633) to staff who have clinical knowledge about the
resident, such as staff nurses.
Response: The requirements for care planning state that a
registered nurse with responsibility for the resident be a part of the
interdisciplinary team that prepares the care plan. This implies that
the registered nurse is directly involved in the resident's care and is
fully knowledgeable about the resident. We believe that the assessment
is conducted most accurately and efficiently in conjunction with the
registered nurse who has primary responsibility for the resident's
care. We believe that this is in line with the intent of Congress.
However, it would be beyond our purview to prohibit ``assessment
nurses.'' A facility is required by the statute to complete an accurate
assessment.
An evaluation of the RAI process, conducted by the Research
Triangle Institute in 1993, under contract with us, indicates that it
is rare for a facility to designate a sole staff member to conduct the
entire assessment. Registered nurses, who are often the primary
assessors, get substantial contributions from others in at least some
MDS domains, even in facilities that designate an ``assessment specialist
nurse.'' We cannot necessarily state that a nurse hired solely to
conduct assessments does not have the necessary clinical knowledge.
Additionally, the survey process would detect inaccuracies in the
assessment if an assessor did not have the necessary clinical knowledge
to accurately complete resident assessments.
Proposed Sec. 483.20(c)(2) Certification (Redesignated as
Sec. 483.20(i))
Comment: Commenters suggested that we require that an individual
who completes portions of the assessment date his or her signature.
This would also apply to the assessment coordinator when he or she
signs and certifies the completion of the assessment.
Response: We agree with this suggestion and have changed the form
to reflect this.
Proposed Sec. 483.20(c)(3) Penalty for Falsification (Redesignated as
Sec. 483.20(j))
Comment: Commenters, including a national provider organization,
supported the distinction between clinical disagreement and false
statements. A commenter requested a definition of clinical
disagreement. One commenter expressed concern regarding guidelines for
surveyors and protections to ensure hard copy validity. For example, if
there is oversight in completing a section of the MDS, but the
registered nurse signs to certify completion, we could cite the
facility for falsification. A commenter also suggested that clinical
disagreement on the RAP Summary form does not constitute a material or
false statement.
Response: It is the responsibility of the nurse coordinating the
assessment to make sure that the MDS is complete before he or she
certifies completion. Failure to do so could result in a deficiency,
based upon information gathered by the surveyor.
For purposes of this regulation, clinical disagreement pertains to
coding an item based on observation of the resident over time and on
clinical judgment. If, based on observation, one nurse codes a resident
as needing supervision for locomotion while another nurse codes the
same resident as needing limited assistance based on her observation,
we would consider that
[[Page 67203]]
to be clinical disagreement and not falsification. However, if an
assessor were to complete the assessment without observing the resident
and gathering data, we would consider that to be a material and false
statement. Clinical disagreement applies to the entire RAI, including
the RAP Summary form, and the care planning decision-making process. The
survey process is not intended to usurp clinical decisions from the
facility.
Sec. 483.315 Specification of Resident Assessment Instrument
This section describes requirements for the States in specifying a
resident assessment instrument. It also lists the components an
instrument must contain if a State wishes to specify an instrument
other than the Federally designated RAI.
Our December 28, 1992 proposed rule placed the entire MDS and
instructions for its use in the regulation text. The proposed rule also
required that a facility encode the MDS in a machine-readable format,
in accordance with HCFA-specified formats. We are removing the MDS from
the regulation text. Because the law requires a standard assessment,
the regulation mandates that a State instrument contain, in its exact
form, the contents of our designated instrument, as set forth in the
State Operations Manual. This instrument is comprised of the MDS and
common definitions, the triggers and utilization guidelines (including
resident assessment protocols (RAPs)). We will ordinarily not approve
an instrument that does not contain the HCFA-designated resident
assessment instrument (RAI). The States may add items to the Federal
instrument, but may not change the MDS items, definitions or triggers,
delete any items, or alter the utilization guidelines pertaining to the
RAPs. This is necessary for the standardization and consistency
required by law. We believe that removing the MDS from the regulations
text is advantageous. It will allow us to easily modify the MDS so that
it requires collection of information that is clinically relevant and
meets evaluative needs as clinical practice evolves. By discussing and
negotiating directly with affected parties, we will be able to maintain
a resident assessment process that reflects current
standards of clinical practice while obtaining public comment.
It has always been our intent that we would revise the RAI on an
ongoing basis to reflect changes in clinical practice and advances in
assessment technology. The first revision of the MDS and RAPs, known as
version 2.0, was published in Transmittal No. 272 of the State
Operations Manual in April, 1995, and is contained in the preamble of
this rule. For the purpose of this rule, State and provider
requirements related to the RAI pertain to the most current version of
the RAI that has been published by us (that is, presently dated 10/18/
94H, but subject to future revision). We expect to publish revisions to
the RAI, such as new or revised RAPs, in the State Operations Manual no
more frequently than annually, in order to minimize the burden on
providers of transitioning to a revised RAI.
We believe that the regulatory provisions that we are including in
the final rule adequately describe the fundamental MDS requirements and
that the form and details of the MDS are best set forth in interpretive
issuances. This will permit us to easily modify details such as the
measurement scales for a particular condition, or the symptoms that may
be relevant to that condition, and to respond to advances in clinical
standards.
We relied heavily on public comments received on the proposed rule
in modifying the MDS and RAPs contained in version 2.0 of the RAI. We
also drew on the expertise of a small work group comprised of
representatives of three States that had extensive experience in
working with the industry to successfully implement the RAI
requirements. In this way, we were able to address ``real world''
concerns as well as misinterpretations regarding individual MDS items.
We also received comments on a draft of the revised RAI during a public
meeting with national associations representing nursing home providers,
professional disciplines and consumers on December 10, 1993. Under HCFA
contract, Dr. John Morris of the Hebrew Rehabilitation Center for Aged
led the RAI revision effort from 1993 to 1994 and oversaw field
testing.
Proposed Sec. 483.315(a) State Responsibilities (Redesignated as
Sec. 483.315(c))
Comment: A State commenter noted that 30 days to specify an
instrument after we designate or change its instrument is not enough
time. The commenter stated that the survey agency would need to
coordinate with the State Medicaid agency. Furthermore, any change to
the HCFA-designated RAI would require the State to study the benefits
and costs of modifying the State-specified RAI vs. the revised HCFA-
designated RAI, notifying and training facilities, modifying computer
systems, etc. The commenter suggested 180 days. For the aforementioned
reasons, a commenter recommended that providers have advance notice of
changes to the RAI. Another commenter asked if we would extend the time
without specifying the number of days.
Response: We agree that 30 days may not be enough time for a State
to decide whether to adopt our changes or seek approval for an
alternate instrument. However, we believe that the commenter's
recommendation of 180 days is too long. Therefore, we are changing the
requirement to give States 90 days to decide whether they accept our
changes or wish to specify an alternate.
Comment: Commenters questioned whether the State would be required
to seek approval from us to re-adopt our forms every time we make a
revision to the forms. One commenter asked if a State that has already
specified the HCFA-designated RAI will now have to respecify it.
Commenters suggested that a State that has specified our instrument
should be expected to automatically adopt any revisions without
additional paper work.
Response: Our State Operations Manual Transmittal No. 272 contains
information on a State's responsibilities related to respecification of
its RAI. We require that a State notify us of its intent to use our
revised RAI or alternate instrument and specify the effective date for
its use. A State will continue to respecify its instrument whenever we
change the Federally-designated RAI. This enables us to monitor when a
State decides that it no longer wishes to use our instrument. As the
quarterly review form is now part of the Federally-designated RAI, we
require a State to specify the form to its facilities or to include
an alternative form in the package that it submits to us.
Comment: Commenters suggested revisions to paragraph (a)(2). A
commenter wanted to change ``* * * State must assure implementation''
to read ``must assist with implementation of RAI through training and
technical assistance.'' The commenter stated that training and
technical assistance does not ensure implementation, and proposed that
we add paragraph (a)(2)(I), which would provide that States must assure
implementation of RAI through the survey process. Another suggested
that we require that the State ensure facility implementation by
providing the necessary technical direction and education and training
to facilities at least annually. This would accommodate changes in
facility and surveyor staff, facilitate proficiency and maintenance of
assessment skills.
[[Page 67204]]
Response: We accept an amended version of the first two
suggestions. We are providing in Sec. 483.315(c)(3) that, after
specifying an instrument, the State must also provide periodic
educational programs for facility staff to assist with implementation
of the RAI. This parallels sections 1819(g)(1)(B) and 1919(g)(1)(B) of
the Act. We acknowledge that training does not necessarily mean
implementation. We do not wish to specify intervals at which training
must be conducted. Training should be based on provider needs and
should be targeted to focus on identified facility weaknesses. We do
not wish to take away State discretion in this area. We are also
providing in Sec. 483.315(c)(4) that a State must audit implementation
of the RAI through the survey process. Furthermore, we are reordering
the text to be more sequential in regard to the action the State must
take.
Comment: A commenter stated that the proposed requirement at
Sec. 483.315(a)(3) could have a negative impact on facility assessment
and care planning schedules. The commenter suggested that we permit a
facility to use its current RAI until we approve an alternative.
Another commenter requested that we allow States 180 days to secure
approval for an alternative instead of the proposed 4 months.
Response: It appears that the commenter misunderstood when we would
require a facility to implement a newly specified RAI. A facility does
not have to use a newly specified RAI or State alternate RAI until the
date that the State requires it, which would be well after the State
receives approval from us. Once the State receives our approval for an
alternate instrument, the State must specify the instrument for use in
all Medicare and Medicaid certified long term care facilities. The
State would need a realistic implementation time frame which would not
unreasonably have an impact on facilities. This time frame should
accommodate training and the absorption of change.
With respect to the proposed requirement that States have 4 months
to obtain our approval, we are eliminating the time frame entirely. The
time frame was necessary initially when States were specifying
instruments for the October 1990 implementation of OBRA '87.
Furthermore, our experience working with States that are developing
alternate instruments is that a State may require more than 4 months.
In Sec. 483.315(a)(4), we proposed that, within 30 days of
receiving our approval of an alternate RAI, the State must specify the
RAI for use by all Medicare and Medicaid facilities. We are changing
the requirement to allow States 60 days to specify the instrument to
their long term care facilities (redesignated Sec. 483.315(c)(2)). This
will give the State time to contact each of its certified facilities
as well as reproduce the form for distribution to them. Additionally,
we are deleting the provision that says that HCFA approval of an
alternate RAI continues for 2 years. Our experience shows that many
States make changes to their instrument on a more frequent basis.
Comment: A few commenters questioned whether a State would need to
notify us if it redesigns the RAP Summary sheet.
Response: Since the RAP Summary sheet is part of the State-
specified and HCFA-approved RAI, the State would need to obtain our
approval to alter the sheet. Since we are removing the MDS from the
regulations text, we are making substantial changes to Sec. 483.315,
which addresses the contents of the HCFA-designated RAI. We are adding
to the regulations text the major domains contained on the revised MDS.
This reemphasizes the statutory mandate that alternate instruments
contain at least all the MDS elements. For the same reason, we are also
listing the assessment domains addressed in our RAPs.
Proposed Sec. 483.315(c) Secretarial Approval (Redesignated as
Sec. 483.315(g))
Comment: Commenters suggested that we delete this paragraph.
According to the commenters, if States are allowed to reorder sections of
the MDS, use other RAPs, and so forth, it would be difficult to maintain
consistency in data collection and submission to us. One commenter
suggested that we require a State that wants an alternate instrument to
include a HCFA
section that would incorporate our system.
Response: We agree with the commenters' suggestion to delete most
of the content of proposed paragraph (c). We are replacing it with a
provision that requires the State's alternate instrument to comply with
the standard format, vocabulary and organization requirements set forth
in the State Operations Manual (redesignated paragraph (g)). There are
a number of factors that warrant consistent ordering of data and
assessment items across all States. First, nursing home chains that
operate facilities in a number of States would benefit from some
consistency in the ordering of the MDS items, if only to
facilitate effective use of their training and education resources.
Second, software vendors would also welcome standardization of the
ordering of the MDS items in all States, as many of them market their
software to facilities throughout the country and to nursing home
chains that operate in a number of States. It also would minimize the
effort in revising their software. Third, we could also achieve
consistency in training State surveyors on use of the RAI. Fourth,
educational materials, resources, and education programs for nursing
homes and schools that prepare health care professionals could be
developed more cost-effectively and distributed more widely with some
consistency in how the MDS is ordered. Finally, data submission to us
and States will require standardization in the ordering of the MDS
items. Therefore, to facilitate standardization across States, we are
requiring consistent ordering of MDS sections. We will require that
States desiring to add data and assessment items do so in section S of
the MDS, which has been designated as the section for State-specific
items.
Comment: A few commenters thought that we should convene a clinical
advisory panel to evaluate any alternate RAPs that States submit. They
were concerned that the proposed supporting documentation could merely
be the consensus of the same experts who designed the alternates. This
would not protect the scientific integrity of the assessment system.
Response: We will convene a clinical panel periodically to evaluate
the need to modify the RAI, and to review and evaluate newly developed
RAPs, including those developed both by us and States. The process by
which State-developed RAPs are submitted for our approval is also
described in the State Operations Manual. We intend to have an open,
inclusive revision process.
Comment: Commenters suggested that we require that any alternate
instrument be cross-validated with the MDS on a large sample of
residents. States should submit the data from the cross-validation to
us for comparison of outcomes between States that use the
HCFA-designated RAI and those that do not.
Response: Alternate instruments must contain all MDS items. This
negates the need to cross-validate with the MDS. We have reviewed the
revised items and new items added to the MDS for face validity, and we
tested the individual items in early 1994. We encourage States to field
test and validate the new items, as well as allow review by other
qualified individuals prior to including the additional items on their
instrument and submitting it for approval.
[[Page 67205]]
State Requirement to Establish a Data Base of Resident Assessment
Information
Consistent with the purpose of the proposed rule and, after
considering the comments submitted, we are adding a new paragraph (h)
to Sec. 483.315, which delineates State requirements in establishing a
data base of resident assessment information. In the proposed rule, we
posed questions about the State's role in collecting and maintaining
the RAI data base, and we concluded that specific requirements are
necessary to ensure uniformity. Furthermore, we believe these
requirements are necessary to successfully design and implement a
national data base of resident assessment information. Paragraph (h)
includes provisions for specifying a transmission method for a facility
to send information to the State, specifying edits that the data must
pass, and provisions to transmit the data to us. A State will also be
responsible for resolving incorrect data submitted by a facility. While
the facility will edit the data before transmitting them to the State,
the State, once it has computerized the assessment information, may find
that the transmitted data are not entirely complete or accurate and must
return them to the facility for correction. Additional edits at the
State level will help identify incorrect assessment information.
A State must edit the data it receives from a facility according to
formats we specify, but may add State-specific edits that do not cancel
or interfere with HCFA-specified edits. This will help ensure that the
data we receive is uniform, complete and accurate. Furthermore, we are
requiring that a State generate reports and analyze data, as specified
by us. For example, we could require States to run a profile of each
facility, which would allow the facility to analyze the prevalence of a
certain medical diagnosis among its residents.
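The editing scheme described above can be pictured as a simple pipeline: Federal edits are applied to every record, State-specific edits may add checks but may not cancel the Federal ones, and records that fail any edit are returned to the facility. The sketch below illustrates that structure only; the field names and edit rules are assumptions for illustration, not the actual HCFA-specified edits.

```python
def federal_edits(record):
    """HCFA-specified edits every State must apply (illustrative rules)."""
    errors = []
    if not record.get("resident_id"):
        errors.append("missing resident_id")
    if record.get("assessment_type") not in {"admission", "annual", "quarterly"}:
        errors.append("unknown assessment_type")
    return errors

def state_edits(record):
    """State-specific edits; these may add checks but never override
    or cancel the HCFA-specified edits above."""
    errors = []
    if record.get("section_s") is None:
        errors.append("missing State-specific section S")
    return errors

def process_submission(records):
    """Accept records that pass all edits; return failing records,
    with their error lists, to the submitting facility."""
    accepted, returned = [], []
    for rec in records:
        errors = federal_edits(rec) + state_edits(rec)
        if errors:
            returned.append((rec, errors))
        else:
            accepted.append(rec)
    return accepted, returned
```

Because State edits run in addition to, rather than in place of, the Federal edits, a record can never pass in one State while violating a HCFA-specified edit.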
For a number of reasons, as discussed below, we are requiring each
State to use a complete system that is developed or approved by us. We
will develop a single, open system by which States will manage and
analyze data. We believe that there are a number of advantages to
standardizing both the data analysis and the data management functions
which outweigh potential disadvantages.
Cost
Initial system costs will be substantially reduced by producing a
single system versus funding the development of 50 different systems.
Ongoing maintenance costs will be substantially higher if States
implement their own proprietary MDS systems. The costs associated with
modifying individual State systems to incorporate changes in the MDS or
HCFA specifications, formats or edits would be 50 times those
associated with modification of a standardized system and distribution
of new software or other specifications to each State.
Additional cost savings for data analysis activities will be
realized by us. Given that we envision standardizing the State data
analysis function, system standardization at the data management level
will ensure that the necessary infrastructure to support data analysis
is already in place. If States develop proprietary data management
systems, we would probably have to fund additional system/structural
costs when our proposed data analysis requirement becomes effective.
Data Reliability
It would be difficult to maintain quality controls and ensure
adequate data reliability across 50 State systems. For example, each
time we issue a change in transmission specifications or data fields,
each of 50 States would have to modify their proprietary systems to
accommodate the requirement. Past experience with MDS software vendors,
as well as other Federal systems, demonstrates that there is a great
degree of variation in the ability of vendors or agencies to
consistently implement system changes. This would pose a serious threat
to the long term integrity of the national MDS data repository.
Standardization would ensure that changes are implemented completely,
reliably, and in a timely and coordinated manner across all States.
Programmatic Needs
Our desire to implement an MDS data-driven long term care survey
process based on quality measures cannot be efficiently realized
without standardization at the initial ``data management'' level.
As we redesign our provider survey model as an automated, data-driven
system, each survey agency will have to be able to integrate directly
with the State MDS repository. If each State has
a unique design for this repository, this integration will not be
possible in a cost-effective manner. Each State would have to use HCFA-
developed MDS data format specifications to extract MDS data into the
standardized survey system. Allowing the development of 50 State
proprietary systems would also result in long term inefficiencies in
that each State would be required to rewrite its data extraction
procedures each time we want to make a change to the survey process,
quality measures or in the MDS itself. Even if we had unlimited
resources for State customization, this would have a serious impact on
our ability to introduce changes in a timely and consistent manner.
HCFA Initiatives to Implement Standardized Clinical Data Sets
These changes are an integral part of the Administration's efforts
to achieve broad-based improvements in the quality of care furnished
through Federal programs and in the measurement of that care, while at
the same time, reducing procedural burdens on providers. Quality
assessment and performance improvement rests on the assumption that a
provider's own quality management system is the key to improved
performance. Our objective is to achieve a balanced approach combining
our responsibility to ensure that essential health and quality
standards are achieved and maintained with a provider's responsibility
to monitor and improve its own performance. To achieve this objective,
we are now developing revised requirements for several major health
care provider types. All of these proposals are directed at (1)
improving outcomes of care and satisfaction for patients, (2) reducing
burden on providers while increasing flexibility and expectations for
continuous improvement, and (3) increasing the amount of, and quality
of, information available to everyone on which to base health care
choices and efforts to improve quality. We note that our revised
approach to quality assurance responsibilities is closely linked both
to the Administration's commitment to reinventing health care
regulations and to our own strategic plan. These initiatives have three
common themes. First, they promote a partnership between us and the
rest of the health care community, including the provider industry,
practitioners, health care consumers, and the States. Second, they are
based on the belief that we should retain only those regulations that
represent the most cost-effective, least intrusive, and most flexible
means of meeting our quality of care responsibilities. Finally, they
rely on the principle that making powerful data available to consumers
and providers can produce a strong nonregulatory force to improve
quality of care.
The MDS is the first of several clinical data sets we envision
creating and implementing in various care settings. Standardized
information on clinical
[[Page 67206]]
status and health care outcomes is necessary for more objective and
focused quality monitoring. Consequently, interest in standardized
clinical data sets has skyrocketed, with much activity occurring in
this arena in both the public and private sectors. We view our efforts
with the MDS as a prototype for the next several years, during which we
propose to build and implement clinical data sets across several
provider types. These data sets will feed into quality indicator
systems, which will supplement our traditional survey processes. At
this point, we are beginning work on designing a comprehensive
standardized assessment tool for home health agencies as well as field
testing the uniform needs assessment instrument, which we are
evaluating for use by all providers and view as forming the ``core'' of
all care-setting specific data sets. Additionally, we propose
development of standardized patient process and outcome measures for
the End Stage Renal Disease program and a standardized instrument for
the Intermediate Care Facility for the Mentally Retarded program in
fiscal years 1996-97. In view of these initiatives, it would be much
more economical and efficient to put in place now, within each State,
standardized system designs and structures to support increased
clinical data management and analysis. Otherwise, we will be
responsible for funding and coordinating State efforts to implement
data systems for each provider type as we implement new requirements.
In the system design process, we explored several options, particularly
regarding State systems, and gathered a significant amount of
information about the current status of State systems. For example, we
sent two questionnaires to the States to determine whether they had
developed an MDS system, what the configuration might be, and what sort
of direction and assistance non-computerized States would want from us.
We convened several meetings across the country which were attended by
more than 45 States. At these meetings we presented the concept of
standardization, and the reaction was quite supportive. We are aware
that States that already have systems will have to make significant
adjustments, and we will provide assistance in the process.
III. Provisions of the Final Rule
In summary, in this final rule, we are adopting, without change,
the provisions of the proposed rule with the exception of the
following.
We are adding greater specificity to the proposed
requirement that each facility establish a data base of resident
assessment information and transmit MDS data to the State at least
monthly (Sec. 483.20(f)).
We are adding a new requirement that each State establish
a data base of resident assessment information received from
facilities, using a system to manage and analyze data that is developed
or approved by us, and transmit that information to us at least monthly
(Sec. 483.315(h)).
We are adding a definition of ``significant change'' in a
resident's physical or mental condition to clarify when a facility must
conduct a comprehensive assessment of a resident (Sec. 483.20(b)(2)).
Instead of including the entire content of the MDS, the
utilization guidelines for resident assessment instruments, common
definitions, resident assessment protocols and instructions in the
regulations text or in an appendix to the text, we are providing
descriptions of the RAI, the MDS, and RAPs. We are providing a
description of the assessment areas included in the MDS
(Sec. 483.315(e)), and a description of the domains addressed in the
RAPs (Sec. 483.315(f)), both of which must be included in the RAI
specified by a State (Sec. 483.20(b)(1)).
To address concerns about confidentiality of resident
data, we are providing that a facility and a State may not release
resident-identifiable information to the public, and may not release
the information to an agent or contractor without certain safeguards
(Secs. 483.20(f)(5) and 483.315(j)).
In this final rule, we are not adopting the proposed
technical revisions to part 456 concerning inspection of care reviews
of SNFs and ICFs. We will include these revisions in another document.
IV. Regulatory Impact Statement
A. General
Consistent with the Regulatory Flexibility Act (RFA) (5 U.S.C. 601
through 612), we prepare a regulatory flexibility analysis unless we
certify that a rule will not have a significant economic impact on a
substantial number of small entities. For purposes of the RFA, all
nursing homes are considered to be small entities. Individuals and
States are not included in the definition of a small entity.
In addition, section 1102(b) of the Act requires us to prepare a
regulatory impact analysis if a rule may have a significant impact on
the operations of a substantial number of small rural hospitals. Such
an analysis must conform to the provisions of section 604 of the RFA.
For purposes of section 1102(b) of the Act, we define a small rural
hospital as a hospital that is located outside of a Metropolitan
Statistical Area and has fewer than 50 beds.
B. Affected Entities
We require that all certified nursing homes assess residents using
a standardized data set known as the MDS. Nursing homes have been
collecting this information manually since October 1990. Most States
implemented a second generation assessment instrument, known as MDS
2.0, on January 1, 1996. The use of the MDS as the core of the
comprehensive assessment requirement has improved the quality of
nursing home services by ensuring that the assessment is consistently
based on all information that is necessary to evaluate a resident's
needs. Accurate and comprehensive resident assessments have improved
the accuracy of the care planning process and, ultimately, the care
provided by the nursing home. The myriad benefits associated with the
MDS have been well documented in a study we commissioned to evaluate
the outcomes of using the MDS. One of the more striking changes
documented by the study was an association of the use of the MDS with a
significant reduction in hospitalization among more cognitively
impaired nursing home residents, without a concomitant increase in
mortality. The study also identified major reductions in rates of
decline (especially among various types of residents) in important
areas such as nutritional status, vision, and urinary incontinence.
However, in order to realize the full benefits of the MDS, the
information needs to be computerized and configurable as an analytical
tool. Publication of this rule will allow this goal to be realized.
The automation and transmission of MDS data by nursing homes and
States to us will improve the delivery of quality care in the nation's
nursing homes in several ways. An automated MDS data base will provide
information that will benefit both the policy and operational
components of State and Federal governments, as well as furnish
valuable information to long term care providers. The MDS system will
also establish a means of providing consumers with quality-related
information to make health care decisions.
More specifically, the MDS data base will enable us and the States
to provide nursing homes with aggregated State and national resident
status information and trends. This will allow nursing
[[Page 67207]]
homes to compare themselves to similar homes and is consistent with a
quality improvement model. Furthermore, by establishing their own in-
house quality assurance analyses from these computerized data, nursing
homes will be able to evaluate the effectiveness of treatment
modalities given a certain outcome. This type of information will
assist nursing homes in making better use of their staff and other
resources, and also eliminate the allocation of resources that do not
achieve desired outcomes. In short, the MDS data base will provide
nursing homes with the information to identify and correct their own
problems.
States will have access to timely MDS data that will improve their
ability to focus on-site inspection activities associated with the long
term care survey process. Since we require MDS data for all residents
regardless of payor source in nursing homes, these data elements can be
configured into quality measures. The quality measures flag individual
residents and facilities when there may be a problem with the quality
of care provided. For example, the indicators may identify those
residents who were admitted to a nursing home without pressure sores,
but who developed sores in the nursing home. Similarly, a nursing home
that has a relatively high percentage of residents with pressure sores
may indicate a problem when compared to other facilities. This resource
will significantly improve States' ability to identify areas of
potential quality concerns in an effective and efficient manner, and
facilitate the partnership of States and industry in identifying
opportunities to improve care. At both the Federal and State level,
information from the MDS data base will provide a valid and reliable
tool for evaluating and improving the efficacy and effectiveness of
survey and certification activities.
States have also identified a myriad of other intended uses for MDS
data that include Medicaid payment, utilization review, preadmission
screening and resident review, Medicaid coverage authorization, and
State policy analysis and trending. It is our intention that a
standardized MDS data system will support States' unique needs and
should not necessitate the creation of distinct and duplicative data
bases at the State level.
C. Costs Associated With Automating the MDS
We anticipate that both nursing homes and States will incur some
incremental costs from computerizing and transmitting the MDS. We
estimate total start-up costs of $20.3 million, which represents costs
incurred by nursing homes (we will be supplying the MDS systems
directly to the States). We also estimate total ongoing annual costs of
about $34.7 million, which includes $27 million in costs for nursing
homes and $7.7 million in costs for States. Total costs include
Medicare benefit costs of $9.5 million. Total costs also include an
annual administrative cost of $3.5 million that will be absorbed within
HCFA's program management appropriation. However, the benefits
associated with computerizing the MDS far outweigh the additional costs
of automating the data. The following represents our estimates of the
individual costs associated with this effort.
Nursing Homes
Upon publication of this rule, all nursing homes must computerize
the MDS. Most costs associated with computerizing the MDS will be
related to hardware and software. At the current time, we estimate that
approximately 70 percent of the nation's 17,000 Medicare, Medicaid or
dually certified nursing homes have already computerized the MDS or
have the capability to do so. Another 16 percent of nursing homes
already have some kind of computer system that will require upgrading
to meet the requirements for MDS, and only 14 percent have no computer
system at all. Additionally, some facilities with currently operating
MDS systems may require hardware and software upgrades to support
aspects of the national MDS system (for example, a faster modem or
installation of the Windows operating system).
Under the Balanced Budget Act of 1997, nursing homes will be
reimbursed for Medicare under a prospective payment system for cost
reporting periods beginning on or after July 1, 1998. Prior to July 1,
1998, costs incurred by nursing homes associated with computerizing the
MDS will be paid on a reasonable cost basis. Generally, these costs are
considered capital costs and are subject to the applicable Medicare
rules. Additionally, it is likely that nursing homes will also incur
certain routine services costs which will also be paid on a reasonable
cost basis. These costs are subject to cost limits. In the past, the
routine cost limits have included an add-on to account for the costs
associated with the Omnibus Budget Reconciliation Act of 1987 (OBRA
1987), including the cost of conducting resident assessment. When a
provider incurs cost related to OBRA 1987 that exceed its limit
(including the add-on), we have allowed the fiscal intermediary to make
an adjustment to the costs limits. This policy is described in a notice
published in the Federal Register on October 7, 1992 (57 FR 46177).
The Balanced Budget Act of 1997 also prescribes a public process
for the determination of rates for payment under Medicaid State Plans
for nursing home services in which the proposed rates, the
methodologies underlying the establishment of such rates, and the
justifications for the proposed rates are published, thereby giving
providers, beneficiaries and their representatives, and other concerned
State residents an opportunity for review and comment. States have
flexibility in designing the details of their payment systems for NF
care, and to the extent that NFs incur costs in computerizing the MDS
(such as the acquisition of hardware or software, staff training, or
additional staffing), the State may take these costs into account in
setting its rates.
Hardware: We estimate total hardware costs associated with
automating the MDS to be approximately $2,500 for a typical nursing
home, which includes the computer and communications components capable
of running MDS software and transmitting MDS assessments, and a laser
printer. This estimate is based on the most recent cost data available
for a system that includes an Intel Pentium processor. As noted earlier
in this rule, we expect that only 14 percent of all nursing homes will
need to buy an entirely new system. Seventy percent of all nursing
homes are already using an automated MDS collection tool (although some
may require upgrading in order to transmit the MDS data), and the
remaining 16 percent already have some sort of computer system that
simply requires upgrading.
The aforementioned cost estimate is based on the type of system
that we anticipate many nursing homes will choose to purchase. At a
minimum, a nursing home should have at least a 486 personal computer,
either connected to a network or as a stand-alone, with 8 megabytes of
RAM, at least 100 megabytes of available hard disk space, a 14 inch
color monitor, keyboard, mouse, a 3.5 inch floppy drive, and a laser printer.
To operate the transmission software, this machine must run the Windows
operating system, version 3.1 or higher. All nursing homes will also
need a 28.8 Kbps modem for telecommunication of data, as well as a
common data communications software package to transmit MDS assessments
to the State. This communications package must meet our specifications
related to
[[Page 67208]]
transmission of MDS data and represents current technology.
Ongoing hardware maintenance costs for nursing homes are expected
to average about $100 annually.
Software: Nursing homes desiring to meet only the
requirement for data submission can use a less costly software package
to accomplish the basic encoding and formatting functions. A nursing
home must submit MDS records to the State that conform to a specific
ASCII layout and incorporate them into files with header and trailer
records that conform to required formatting standards. However, we
anticipate that most nursing homes, seeking to gain efficiency in
general operations, will choose more capable programs, some of which
could be used to meet (1) other clinical or operational needs (for
example, care planning, order entry, quality assurance, billing) or (2)
other regulatory requirements for reporting resident information.
The standardized record formatting specifications and additional
policies on MDS automation that we developed should be used by
individual nursing homes, multi-facility chains, and software vendors
to develop products for encoding and transmission of MDS 2.0 data. This
information has been available to the public for about two years
through the Internet, and is located on the HCFA Web site.
There are currently over 100 vendors marketing MDS software
products. While we are not requiring record specifications and
automation policies until this rule is published, we developed them
earlier to provide guidance to the industry and to minimize the need
for a facility to modify and replace systems once this regulation is
published. At this time, we estimate that such software packages will
be available on the market for approximately $1,250 for those nursing
homes that have not yet become MDS automated. We expect that a nursing
home's private sector software vendor will provide primary support to
the facility in terms of MDS encoding and transmission to the State.
State personnel, however, will work with facilities and software
vendors in educating them about this process.
Supplies: Supplies necessary for collection and
transmission of data including diskettes, computer paper, and toner,
will vary according to the size of the nursing home in terms of
residents served and assessments required. Dividing the nursing homes
into groups, supply costs are estimated at the following three levels:
small facilities (with fewer than 145 residents), $175/year; medium
facilities (with 145 to 345 residents), $225/year; and large facilities
(with greater than 345 residents), $275/year.
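The three supply-cost tiers above reduce to a simple schedule keyed to resident count; the function below restates the stated boundaries (the function name is ours, not the rule's).

```python
def annual_supply_cost(residents):
    """Estimated annual supply cost in dollars, by facility size,
    per the tiers stated in the rule."""
    if residents < 145:
        return 175   # small facility
    elif residents <= 345:
        return 225   # medium facility
    else:
        return 275   # large facility
```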
Maintenance: There are costs associated with normal
maintenance of computer equipment, such as the replacement of disk
drives or memory chips. Typically, such maintenance is provided via
extended warranty agreements with the original equipment manufacturer,
system reseller, or a general computer support firm. These maintenance
costs are estimated to average no more than $100 per year.
Training: Nursing home staff will need training on
automating the MDS. Since many nursing homes will choose to have their
staff input MDS data at the time of the resident assessment, we
estimate that a typical nursing home will train two nurses for about 3
hours each. We expect that the vendor supplying the MDS encoding
software will also provide this training, and we estimate that it will
cost an average nursing home about $144, based on an average hourly
rate for nurses of $24.
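The $144 estimate follows directly from the staffing assumptions above (two nurses, 3 hours each, at $24 per hour); a quick arithmetic check:

```python
nurses = 2        # nurses trained per facility
hours_each = 3    # hours of vendor-supplied training per nurse
hourly_rate = 24  # average hourly rate for nurses, in dollars

training_cost = nurses * hours_each * hourly_rate
print(training_cost)  # 144
```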
Other nursing home staff will need training in transmitting the
data to the State and interpreting messages of record errors. We expect
that this training will require about 3 hours of staff time, and will
cost an average nursing home about $66, based on an average hourly rate
of $12 for technical staff. This cost also includes travel expenses and
travel time, since facility staff may need to travel to a centralized
training site within the State (we anticipate that training will be
provided in multiple sites in the State once the system is
implemented). We expect that the State survey agencies will supply this
training.
Data entry: Nursing homes will have flexibility in the
method used to enter data, but the method must comply with our
requirements for safeguarding the confidentiality of clinical records.
Data can be entered directly by a clinical staff member (that is, the
nurse responsible for coordinating or completing the assessment), from
a hard copy of a completed MDS by a clerical staff member, or by a data
entry operator with whom the nursing home may contract to key in the
data. We estimate that data entry staff could require approximately 15
minutes to enter each MDS. Nursing homes must collect and transmit MDS
data for the admission assessment, annual updates, significant changes
in the resident's status, significant correction assessments, quarterly
review assessments (which include a subset of the MDS items), discharge
records, and reentry records. Additionally,
nursing homes must allow time for data validation and preparation of
data for transmission, as well as for correction of returned records
that failed checks at the State data-editing level. We estimate that a
100-bed facility will incur an annual data entry cost of $1,250 (or
$12.50 per resident per year), based on an estimate of five MDSs per
bed (annual plus ``significant changes'') and an hourly rate of $10.
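The $1,250 estimate can be reproduced from the stated assumptions (100 beds, five MDSs per bed per year, 15 minutes of data entry per MDS, at $10 per hour); variable names below are illustrative:

```python
beds = 100
mds_per_bed = 5       # annual plus ``significant change'' assessments
minutes_per_mds = 15  # estimated data entry time per MDS
hourly_rate = 10      # data entry staff rate, in dollars

total_hours = beds * mds_per_bed * minutes_per_mds / 60
annual_cost = total_hours * hourly_rate

print(annual_cost)         # 1250.0 per year
print(annual_cost / beds)  # 12.5 per resident per year
```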
Data Transmission: The State agencies will fund the costs
of transmitting data from the nursing homes to their respective States.
However, nursing home staff must manage the data transmission
function, correct communications problems, and review report logs
transmitted from the State agency. We estimate that it will take an
additional hour of staff time to perform data transmission related
tasks each month. This staff time will cost an average size nursing
home about $144 per year.
States
We expect that overall responsibility for fulfilling requirements
to operate the State MDS system will rest with the survey agency.
However, the State may enter into an agreement with the State Medicaid
agency, another State component, or a private contractor to perform
day-to-day operations of the system. If the State MDS system is
operated by an entity other than the survey agency, the State must
ensure that the survey agency has suitable access to this system to
fully support all MDS-driven functions that the State will require of
the survey agency (for example, quality indicator reporting, survey
targeting). The State is also responsible for reporting MDS data to a
central repository to be established by us.
States will primarily use the MDS data to focus the long term care
survey process and to provide nursing homes and consumers with MDS-
derived information. A State's MDS system includes the following
components: computing hardware that includes data base, communication,
supporting file, and print servers for client workstations; local and
wide-area data networks; and application software for performing all
aspects of MDS-related functions and tasks. As such, the MDS system
will be designed and developed within a broad class of systems known as
Client/Server architecture.
We plan to provide each State with a standardized hardware
environment scaled to meet each State's anticipated processing volumes.
Additionally, a standardized suite of software applications will be
provided to each State to perform all MDS-related
[[Page 67209]]
functions, including receipt and validation of MDS records, posting of
records to the master repository, and analytical applications to be
used to inform and support the long term care survey process. A HCFA
contractor will work closely with each State to customize the ``turn-
key'' MDS system to integrate it into a State's current computer and
network structure. The contractor will visit each State to install and
test equipment, and ensure that the MDS system is fully operational. We
currently plan to phase in State deployment of the system, roughly from
August through December 1997.
We will place this system in each State and it will be operated by
personnel within the designated State agency. We are requiring that the
State systems do the following: receive MDS records from nursing homes;
authenticate and validate the records received from nursing homes;
provide feedback to the nursing homes by indicating acknowledgment of
the transmission of the data and specifying the status of record
validation; store the MDS records in a permanent data base within the
State; create system management reports and logs; generate provider
performance reports including quality indicator reports designed to
support a future data-driven survey process and provider survey
targeting functions; perform other analytical functions, as defined by
us; create ad-hoc reports; and retransmit validated MDS records from
each State agency to a national MDS data repository developed and
maintained by us.
Just as in nursing homes, some States are already using some sort
of an automated MDS collection tool. At least 12 States have already
developed MDS data bases. In nearly all cases, the State Medicaid
agency has been the driving force in getting MDS data to the State
level. System designs and approaches have varied considerably (that is,
while two States have recently moved to modem transmission, other
States still perform data entry at the State level from hard copies
forwarded by nursing homes).
We are providing the MDS system to States primarily for use in the
Survey and Certification program. As such, most Federally reimbursable
costs incurred by the States for automating the MDS will be funded
through that program. However, we anticipate that many States will also
choose to use MDS data in administering their Medicaid programs. When
that is the case, Federal reimbursement is applicable to the extent a
State uses the MDS for administering its Medicaid program. As a result,
it may be appropriate for a State to allocate some MDS costs to its
Medicaid administrative cost claims.
When a State does use MDS in administering its Medicaid programs,
it should apportion Federal costs associated with automating the MDS
and operating the data system between the Medicare and Medicaid Survey
and Certification program, and the Medicaid program (as administrative
costs, when applicable). The State should apportion MDS costs to these
programs based on the State's determination of each program's
utilization of the MDS system. Costs charged to the Medicare and
Medicaid Survey and Certification program will be prorated in terms of
the proportion of SNFs and NFs in the State that participate in the
Medicare and Medicaid programs. Costs for SNFs and NFs are split
equally between the two programs. The Federal financial participation
rate for the Medicaid Survey and Certification Program is 75 percent.
The Federal financial participation rate for costs apportioned as
Medicaid administrative costs is 50 percent. When the State licensure
program benefits from the automation of the MDS, the State should also
share in the MDS automation costs.
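The Federal financial participation rates described above can be sketched as a simplified apportionment calculation. This is an illustrative helper only: the 75 percent and 50 percent rates come from the rule, but the function, its inputs, and the example dollar amounts are hypothetical, and the proration among SNFs and NFs and the equal split between the Medicare and Medicaid programs are omitted for brevity:

```python
def federal_share(survey_cert_cost: float, medicaid_admin_cost: float) -> float:
    """Illustrative Federal share of a State's MDS automation costs.

    Rates from the rule: 75 percent Federal financial participation for
    costs apportioned to the Medicaid Survey and Certification program,
    and 50 percent for costs apportioned as Medicaid administrative costs.
    """
    MEDICAID_SC_FFP = 0.75
    MEDICAID_ADMIN_FFP = 0.50
    return (survey_cert_cost * MEDICAID_SC_FFP
            + medicaid_admin_cost * MEDICAID_ADMIN_FFP)

# Hypothetical example: $100,000 apportioned to Medicaid Survey and
# Certification and $40,000 apportioned as Medicaid administrative costs.
print(federal_share(100_000, 40_000))  # 95000.0
```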
Several States asked if we could reimburse Medicaid administrative
costs associated with the development of MDS at Federal financial
participation rates greater than 50 percent, the rate used in computing
Medicaid reimbursement for general administration of the program.
Specifically, they asked if we will reimburse these costs at the same
rates used to reimburse the costs of designing, developing,
implementing and operating a Medicaid Management Information Systems
(MMIS).
Section 1903(a)(3) of the Act and implementing regulations at
Sec. 433.111 describe the MMIS as a mechanized claims processing and
information retrieval system. Federal financial participation is
available at 90 percent in expenditures for design, development,
installation or enhancement of the system, while 75 percent is
available for costs relating to its operations (namely, processing
claims and producing related management information). The MDS is not a
Medicaid claims processing and information retrieval system. We
reimburse other systems not directly related to performing MMIS
functions, such as the MDS, at the 50 percent level of Federal
financial participation.
Commenters asked whether automated systems to collect and analyze
data for rate setting purposes meet the MMIS definition. Because rate
setting is outside the claims payment and information retrieval
processes required by section 1903(a)(3) of the Act, those costs are
not eligible for enhanced Medicaid reimbursement under the MMIS
definition. However, in those instances when specific data elements
from a separate system like MDS must be transferred to the MMIS in
order to calculate individual provider payments, the cost of modifying
and operating the MMIS to accept and use the data from the outside
source qualifies for enhanced Federal financial participation if the
State follows the regulations and guidance found in Secs. 433.110
through 433.112, 433.116 and in Part 11 of the State Medicaid Manual.
For example, a major function of the MMIS is to produce both
beneficiary and provider profiles for program management and
utilization review purposes. NF resident and provider profiles are
required by Sec. 433.116(g). However, both NF resident and NF provider
profiles historically have been very limited because the data elements
on a nursing facility claim provide few details of services provided. A
State may wish to improve the MMIS profiling capability by importing
MDS data to prepare augmented profiles of nursing facilities and
nursing facility residents. If the State does that, the enhanced Federal
financial participation will be available for the costs of modifying
and operating the MMIS to accept and use the data from MDS if the State
acts in accordance with the regulations in Secs. 433.110 through
433.112, 433.116 and the guidance in Part 11 of the State Medicaid
Manual. Please note that we currently encourage States to modify their
MMIS to accept encounter documents from Medicaid managed care
organizations to extend the MMIS profiling capability to cover both
managed care and fee-for-service providers and patients. Therefore, it
seems appropriate that we would reimburse the cost of modifying MMIS to
accommodate MDS usage also at the enhanced MMIS rates, if the State
meets the conditions in the aforementioned regulations and State
Medicaid Manual.
The following is our estimate of State costs for automating the
MDS:
Hardware: We will hire a contractor to purchase, deliver,
and install the MDS equipment in each State. Since we will be providing
the equipment to the States, the States will not incur any cost for
hardware. This equipment will include both a communications server and
a data base server. The number of nursing homes within each State will
be the driving factor in determining each State's computer needs. We
will scale
[[Page 67210]]
system requirements to meet the data storage and transmission needs of
the individual State.
Software: Since we are developing the software for each
State's MDS system, we will pay the costs associated with this system
and supply the system directly to the States. Software that we will
supply to the States will include communications software and data base
software, as well as customized analytical software to generate
reports. When a State develops its own customized MDS applications, the
costs of developing and maintaining these additional software
applications (and any related hardware components) will not be
Federally funded.
Operational Staff Time: States may plan to reassign
existing staff or hire additional full-time equivalents to manage the
automation project and perform day-to-day operation of the standardized
MDS system. The staff members assigned to MDS automation tasks will
need to have skills in a variety of areas: technical computer, network,
and telecommunication skills; data processing operations; and, user
support and training (including support for both State and facility
users). In hiring or reassigning staff, we encourage States to recruit
generalists who can perform a wide range of the above tasks.
Each State's actual staffing requirements will vary depending on
the State's size (that is, as measured by the number of nursing homes
regulated). To assist in determining staffing requirements within
particular States, we assigned States to one of three categories based
on the number of certified nursing homes in their jurisdiction: less
than 144, 144 to 356, and those greater than 356 facilities. We
estimate that States in the smallest category will require 1.5
full-time equivalents to manage all MDS-related operations, States in
the middle category 3 full-time equivalents, and States in the largest
category 4.5 full-time equivalents. Each estimate includes an MDS
Automation Project Coordinator.
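The staffing categories above can be expressed as a lookup keyed on the number of certified nursing homes in a State's jurisdiction. The function name is hypothetical; the category boundaries and full-time-equivalent estimates are those stated in this rule:

```python
def mds_staffing_ftes(certified_homes: int) -> float:
    """Estimated full-time equivalents for State MDS system operations.

    Categories from the rule: fewer than 144 certified homes -> 1.5 FTEs,
    144 to 356 -> 3 FTEs, more than 356 -> 4.5 FTEs. Each estimate
    includes the MDS Automation Project Coordinator.
    """
    if certified_homes < 144:
        return 1.5
    if certified_homes <= 356:
        return 3.0
    return 4.5

# An average sized State regulating about 300 nursing homes:
print(mds_staffing_ftes(300))  # 3.0
```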
Specifically, an average sized State regulating about 300 nursing
homes will require about three full-time equivalents to fulfill the
following MDS-related tasks: MDS project coordination (oversight of
daily operations); technical operations (systems management,
configuration and troubleshooting); training and support operations
(facility and MDS software vendor startup training); and operations
(functions associated with transmission logging and error tracking and
resolution). We estimate that MDS-related staffing costs for an average
size State will be about $133,000 per year.
Supplies and Maintenance of Equipment: States can expect
about $600 per year in additional costs for products that are consumed,
such as printer toner and paper. The MDS data management and analysis
equipment to be installed within each State is comprised of standard
``off-the-shelf'' hardware and software components that are generally
covered under typical service agreements that the States may already
have in place. We will ask States to extend these agreements to cover
hardware components delivered as part of the MDS project. These costs
will again vary according to the size of the State requirements, but on
average, the typical State will incur about $750 per year in additional
cost for systems maintenance. We will maintain and upgrade centrally
the standardized MDS software components that we develop and distribute
to States.
Training: We plan to centralize training of State
personnel who will be responsible for administrative and technical
aspects of system operations. Additionally, we will provide separate
training on the technical aspects of the system including its
communications, networking, data base and software application
functions, daily operations and on-going systems management.
In order to promote national consistency in MDS system operations
and troubleshooting, we request that each State designate one
individual as the MDS Automation Project Coordinator. This person will
be our key contact within each State for managing MDS system issues. We
are planning to convene at least one national meeting of the MDS
Automation Project Coordinators each year. We will use this forum to
present new information, gather suggestions for system improvements,
exchange ideas on MDS system operations, administration and
troubleshooting issues, and to discuss objectives for future system
development and refinement.
With our technical support and guidance, States will work closely
with the provider community in providing information on specific
requirements related to the submission of MDS assessments to a
repository maintained by the State. The standardization of the State
MDS system extends back to the provider communications function, in
that nursing homes will use a common data communications software
package to transmit MDS assessments to the State. State personnel will
work with the nursing homes and software vendors in educating them
about this process. We expect that the commitment of staff resources to
this task will be most intensive during the first 6 months of this
process. However, States should also expect to allocate some full-time
equivalents to support this process on an ongoing basis.
We anticipate annual travel costs associated with training for an
average size State to be about $2,700 per year.
Data Transmission: States will incur data communication
costs for transmission of MDS assessments from nursing homes. These
costs have two basic elements:
(1) Fixed monthly line fees of approximately $32.50 per line per
month. The number of lines required varies from 4 to 16 according to
the number of nursing homes supported by a State. On average, a State's
fixed line cost will be $3,806 per year.
(2) Line connect and long distance charges of approximately $.27
per minute. We estimate that the typical nursing home will require, on
average, 5 minutes ($1.35) of connection time per month for MDS
submissions. This translates into an average connection cost of $5,376
per year per State.
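The two cost elements above can be sketched as follows. The per-line fee, per-minute charge, and 5-minute monthly connect time come from the rule; the particular line and facility counts below are illustrative inputs (line counts in the rule range from 4 to 16 by State size), so the outputs approximate rather than reproduce the rule's $3,806 and $5,376 averages:

```python
line_fee = 32.50      # fixed monthly fee per line, in dollars
per_minute = 0.27     # line connect and long distance charge, in dollars
minutes_per_home = 5  # estimated monthly connect time per nursing home

monthly_connect_per_home = minutes_per_home * per_minute  # $1.35
annual_connect_per_home = monthly_connect_per_home * 12   # $16.20

# Illustrative State with 8 lines serving 330 nursing homes:
lines = 8
homes = 330
annual_fixed = lines * line_fee * 12
annual_connect = homes * annual_connect_per_home

print(round(monthly_connect_per_home, 2))  # 1.35
print(round(annual_fixed, 2))              # 3120.0
print(round(annual_connect, 2))            # 5346.0
```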
We will fund the cost of the States transmitting their MDS data to
our central repository. Therefore, we do not expect that States will
incur data transmission costs to us.
D. Conclusion
While we acknowledge that nursing homes and States will bear some
incremental costs associated with this proposal, these costs are well
justified when considered within the context of the anticipated
increased quality of care for nursing home residents, as well as the
potential uses of the automated data by the facilities, States, and us.
The foregoing estimates may actually overstate anticipated costs
because they do not take into account cost-savings achieved by
improving nursing homes' management information systems, as well as
potential improvements in residents' overall health status. Nor do they
represent the savings inherent in a more focused, uniform approach by
both the States and us in assessing quality of care in the nation's
nursing homes.
For these reasons, we are not preparing analyses for either the RFA
or section 1102(b) of the Act because we have determined, and we
certify, that this rule will not have a significant economic impact on
a substantial number of small entities or a significant impact on the
operations of a substantial number of small rural hospitals.
In accordance with the provisions of Executive Order 12866, this
regulation
[[Page 67211]]
was reviewed by the Office of Management and Budget.
V. Information Collection Requirements
Sections 4204(b) and 4214(d) of OBRA '87 provide a waiver of Office
of Management and Budget review of information collection requirements
for the purpose of implementing the nursing home reform amendments.
Therefore, the information collection requirements referenced in this
rule are exempt from the Paperwork Reduction Act of 1995.
List of Subjects in 42 CFR Part 483
Grant programs--health, Health facilities, Health professions,
Health records, Medicaid, Medicare, Nursing homes, Nutrition, Reporting
and recordkeeping requirements, Safety.
42 CFR chapter IV is amended as follows:
PART 483--REQUIREMENTS FOR STATES AND LONG TERM CARE FACILITIES
1. The authority citation for part 483 continues to read as
follows:
Authority: Secs. 1102 and 1871 of the Social Security Act (42
U.S.C. 1302 and 1395hh).
2. In Sec. 483.20, paragraphs (d) through (f) are redesignated as
(k) through (m), respectively, paragraphs (b) and (c) are revised and
new paragraphs (d) through (j) are added to read as follows:
Sec. 483.20 Resident assessment.
* * * * *
(b) Comprehensive assessments.
(1) Resident assessment instrument. A facility must make a
comprehensive assessment of a resident's needs, using the resident
assessment instrument (RAI) specified by the State. The assessment must
include at least the following:
(i) Identification and demographic information.
(ii) Customary routine.
(iii) Cognitive patterns.
(iv) Communication.
(v) Vision.
(vi) Mood and behavior patterns.
(vii) Psychosocial well-being.
(viii) Physical functioning and structural problems.
(ix) Continence.
(x) Disease diagnoses and health conditions.
(xi) Dental and nutritional status.
(xii) Skin condition.
(xiii) Activity pursuit.
(xiv) Medications.
(xv) Special treatments and procedures.
(xvi) Discharge potential.
(xvii) Documentation of summary information regarding the
additional assessment performed through the resident assessment
protocols.
(xviii) Documentation of participation in assessment.
The assessment process must include direct observation and
communication with the resident, as well as communication with licensed
and nonlicensed direct care staff members on all shifts.
(2) When required. A facility must conduct a comprehensive
assessment of a resident as follows:
(i) Within 14 calendar days after admission, excluding readmissions
in which there is no significant change in the resident's physical or
mental condition. (For purposes of this section, ``readmission'' means
a return to the facility following a temporary absence for
hospitalization or for therapeutic leave.)
(ii) Within 14 calendar days after the facility determines, or
should have determined, that there has been a significant change in the
resident's physical or mental condition. (For purposes of this section,
a ``significant change'' means a major decline or improvement in the
resident's status that will not normally resolve itself without further
intervention by staff or by implementing standard disease-related
clinical interventions, that has an impact on more than one area of the
resident's health status, and requires interdisciplinary review or
revision of the care plan, or both.)
(iii) Not less often than once every 12 months.
(c) Quarterly review assessment. A facility must assess a resident
using the quarterly review instrument specified by the State and
approved by HCFA not less frequently than once every 3 months.
(d) Use. A facility must maintain all resident assessments
completed within the previous 15 months in the resident's active record
and use the results of the assessments to develop, review, and revise
the resident's comprehensive plan of care.
(e) Coordination. A facility must coordinate assessments with the
preadmission screening and resident review program under Medicaid in
part 483, subpart C to the maximum extent practicable to avoid
duplicative testing and effort.
(f) Automated data processing requirement. (1) Encoding data.
Within 7 days after a facility completes a resident's assessment, a
facility must encode the following information for each resident in the
facility:
(i) Admission assessment.
(ii) Annual assessment updates.
(iii) Significant change in status assessments.
(iv) Quarterly review assessments.
(v) A subset of items upon a resident's transfer, reentry,
discharge, and death.
(vi) Background (face-sheet) information, if there is no admission
assessment.
(2) Transmitting data. Within 7 days after a facility completes a
resident's assessment, a facility must be capable of transmitting to
the State information for each resident contained in the MDS in a
format that conforms to standard record layouts and data dictionaries,
and that passes standardized edits defined by HCFA and the State.
(3) Monthly transmittal requirements. A facility must
electronically transmit, at least monthly, encoded, accurate, complete
MDS data to the State for all assessments conducted during the previous
month, including the following:
(i) Admission assessment.
(ii) Annual assessment.
(iii) Significant change in status assessment.
(iv) Significant correction of prior full assessment.
(v) Significant correction of prior quarterly assessment.
(vi) Quarterly review.
(vii) A subset of items upon a resident's transfer, reentry,
discharge, and death.
(viii) Background (face-sheet) information, for an initial
transmission of MDS data on a resident that does not have an admission
assessment.
(4) Data format. The facility must transmit data in the format
specified by HCFA or, for a State which has an alternate RAI approved
by HCFA, in the format specified by the State and approved by HCFA.
(5) Resident-identifiable information. (i) A facility may not
release information that is resident-identifiable to the public.
(ii) The facility may release information that is resident-
identifiable to an agent only in accordance with a contract under which
the agent agrees not to use or disclose the information except to the
extent the facility itself is permitted to do so.
(g) Accuracy of assessments. The assessment must accurately reflect
the resident's status.
(h) Coordination. A registered nurse must conduct or coordinate
each assessment with the appropriate participation of health
professionals.
(i) Certification. (1) A registered nurse must sign and certify
that the assessment is completed.
(2) Each individual who completes a portion of the assessment must
sign and
[[Page 67212]]
certify the accuracy of that portion of the assessment.
(j) Penalty for falsification. (1) Under Medicare and Medicaid, an
individual who willfully and knowingly--
(i) Certifies a material and false statement in a resident
assessment is subject to a civil money penalty of not more than $1,000
for each assessment; or
(ii) Causes another individual to certify a material and false
statement in a resident assessment is subject to a civil money penalty
of not more than $5,000 for each assessment.
(2) Clinical disagreement does not constitute a material and false
statement.
* * * * *
3. Subpart F consisting of Sec. 483.315 is added to read as
follows:
Subpart F--Requirements That Must be Met by States and State
Agencies, Resident Assessment
Sec. 483.315 Specification of resident assessment instrument.
(a) Statutory basis. Sections 1819(e)(5) and 1919(e)(5) of the Act
require that a State specify the resident assessment instrument (RAI)
to be used by long term care facilities in the State when conducting
initial and periodic assessments of each resident's functional
capacity, in accordance with Sec. 483.20.
(b) State options in specifying an RAI. The RAI that the State
specifies must be one of the following:
(1) The instrument designated by HCFA.
(2) An alternate instrument specified by the State and approved by
HCFA, using the criteria specified in the State Operations Manual
issued by HCFA (HCFA Pub. 7) which is available for purchase through
the National Technical Information Service, 5285 Port Royal Rd.,
Springfield, VA 22151.
(c) State requirements in specifying an RAI.
(1) Within 30 days after HCFA notifies the State of the HCFA-
designated RAI or changes to it, the State must do one of the
following:
(i) Specify the HCFA-designated RAI.
(ii) Notify HCFA of its intent to specify an alternate instrument.
(2) Within 60 days after receiving HCFA approval of an alternate
RAI, the State must specify the RAI for use by all long term care
facilities participating in the Medicare and Medicaid programs.
(3) After specifying an instrument, the State must provide periodic
educational programs for facility staff to assist with implementation
of the RAI.
(4) A State must audit implementation of the RAI through the survey
process.
(5) A State must obtain approval from HCFA before making any
modifications to its RAI.
(6) A State must adopt revisions to the RAI that are specified by
HCFA.
(d) HCFA-designated RAI. The HCFA-designated RAI is published in
the State Operations Manual issued by HCFA (HCFA Pub. 7), as updated
periodically, and consists of the following:
(1) The minimum data set (MDS) and common definitions.
(2) The resident assessment protocols (RAPs) and triggers that are
necessary to accurately assess residents, established by HCFA.
(3) The quarterly review, based on a subset of the MDS specified by
HCFA.
(4) The requirements for use of the RAI that appear at Sec. 483.20.
(e) Minimum data set (MDS). The MDS includes assessment in the
following areas:
(1) Identification and demographic information, which includes
information to identify the resident and facility, the resident's
residential history, education, the reason for the assessment,
guardianship status and information regarding advance directives, and
information regarding mental health history.
(2) Customary routine, which includes the resident's lifestyle
prior to admission to the facility.
(3) Cognitive patterns, which include memory, decision making,
consciousness, behavioral measures of delirium, and stability of
condition.
(4) Communication, which includes scales for measuring hearing and
communication skills, information on how the resident expresses himself
or herself, and stability of communicative ability.
(5) Vision pattern, which includes a scale for measuring vision and
vision problems.
(6) Mood and behavior patterns, which include scales for measuring
behavioral indicators and symptoms, and stability of condition.
(7) Psychosocial well-being, which includes the resident's
interpersonal relationships and adjustment factors.
(8) Physical functioning and structural problems, which contains
scales for measuring activities of daily living, mobility, potential
for improvement, and stability of functioning.
(9) Continence, which includes assessment scales for bowel and
bladder incontinence, continence patterns, interventions, and stability
of continence status.
(10) Disease diagnoses and health conditions, which includes active
medical diagnoses, physical problems, pain assessment, and stability of
condition.
(11) Dental and nutritional status, which includes information on
height and weight, nutritional problems and accommodations, oral care
and problems, and measure of nutritional intake.
(12) Skin condition, which includes current and historical
assessment of skin problems, treatments, and information regarding foot
care.
(13) Activity pursuit, which gathers information on the resident's
activity preferences and the amount of time spent participating in
activities.
(14) Medications, which contains information on the types and
numbers of medications the resident receives.
(15) Special treatments and procedures, which includes measurements
of therapies, assessment of rehabilitation/restorative care, special
programs and interventions, and information on hospital visits and
physician involvement.
(16) Discharge potential, which assesses the possibility of
discharging the resident and discharge status.
(17) Documentation of summary information regarding the additional
assessment performed through the resident assessment protocols.
(18) Documentation of participation in assessment.
(f) Resident assessment protocols (RAPs). At a minimum, the RAPs
address the following domains:
(1) Delirium.
(2) Cognitive loss.
(3) Visual function.
(4) Communication.
(5) ADL functional/rehabilitation potential.
(6) Urinary incontinence and indwelling catheter.
(7) Psychosocial well-being.
(8) Mood state.
(9) Behavioral symptoms.
(10) Activities.
(11) Falls.
(12) Nutritional status.
(13) Feeding tubes.
(14) Dehydration/fluid maintenance.
(15) Dental care.
(16) Pressure ulcers.
(17) Psychotropic drug use.
(18) Physical restraints.
(g) Criteria for HCFA approval of alternate instrument. To receive
HCFA approval, a State's alternate instrument must use the standardized
format, organization, item labels and definitions, and instructions
specified by HCFA in the latest issuance of the State Operations Manual
issued by HCFA (HCFA Pub. 7).
(h) State MDS collection and data base requirements. (1) As part of
facility survey responsibilities, the State must establish and maintain
an MDS Database, and must do the following:
(i) Use a system to collect, store, and analyze data that is
developed or approved by HCFA.
(ii) Obtain HCFA approval before modifying any parts of the HCFA
standard system other than those listed in paragraph (h)(2) of this
section (which may not be modified).
(iii) Specify to a facility the method of transmission of data to
the State, and instruct the facility on this method.
(iv) Upon receipt of data from a facility, edit the data, as
specified by HCFA, and ensure that a facility resolves errors.
(v) At least monthly, transmit to HCFA all edited MDS records
received during that period, according to formats specified by HCFA,
and correct and retransmit rejected data as needed.
(vi) Analyze data and generate reports, as specified by HCFA.
(2) The State may not modify any aspect of the standard system that
pertains to the following:
(i) Standard approvable RAI criteria specified in the State
Operations Manual issued by HCFA (HCFA Pub. 7) (MDS item labels and
definitions, RAPs and utilization guidelines).
(ii) Standardized record formats and validation edits specified in
the State Operations Manual issued by HCFA (HCFA Pub. 7).
(iii) Standard facility encoding and transmission methods specified
in the State Operations Manual issued by HCFA (HCFA Pub. 7).
(i) State identification of agency that collects RAI data. The
State must identify the component agency that collects RAI data, and
ensure that this agency restricts access to the data except for the
following:
(1) Reports that contain no resident-identifiable data.
(2) Transmission of data and reports to HCFA.
(3) Transmission of data and reports to the State agency that
conducts surveys to ensure compliance with Medicare and Medicaid
participation requirements, for purposes related to this function.
(4) Transmission of data and reports to the State Medicaid agency
for purposes directly related to the administration of the State
Medicaid plan.
(5) Transmission of data and reports to other entities only when
authorized as a routine use by HCFA.
(j) Resident-identifiable data. (1) The State may not release
information that is resident-identifiable to the public.
(2) The State may not release RAI data that is resident-identifiable
except in accordance with a written agreement under which the recipient
agrees to be bound by the restrictions described in paragraph (i) of
this section.
(Catalog of Federal Domestic Assistance Program No. 93.778, Medical
Assistance Program; and No. 93.773, Medicare--Hospital Insurance)
Dated: December 3, 1997.
Nancy-Ann Min DeParle,
Administrator, Health Care Financing Administration.
Dated: December 9, 1997.
Donna E. Shalala,
Secretary.
[FR Doc. 97-32828 Filed 12-22-97; 8:45 am]
BILLING CODE 4120-01-P