[Federal Register Volume 59, Number 231 (Friday, December 2, 1994)]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 94-29573]
[Federal Register: December 2, 1994]
_______________________________________________________________________
Part IV
Environmental Protection Agency
_______________________________________________________________________
National Environmental Laboratory Accreditation Conference (NELAC);
Notice
ENVIRONMENTAL PROTECTION AGENCY
[FRL-5115-5]
National Environmental Laboratory Accreditation Conference
(NELAC)
AGENCY: Environmental Protection Agency (EPA).
ACTION: Notice of Conference and Availability of Standards.
-----------------------------------------------------------------------
SUMMARY: The Environmental Protection Agency will hold a conference on
national environmental laboratory accreditation to discuss all aspects
of accreditation for laboratories performing analyses to demonstrate
compliance with EPA regulations. The conference is open to the public.
The constitution and the draft standards are being published to allow
time for review before the conference.
DATES: The conference will be held on February 14-16, 1995. All
meetings will convene at 9:00 am and adjourn at 5:00 pm, except
Thursday, February 16, when the meeting will adjourn at 3:00 pm.
ADDRESSES: The meeting will be held at the Hyatt Regency, 2799
Jefferson Davis Highway, Arlington, VA 22202.
FOR FURTHER INFORMATION CONTACT: Conference arrangements are being
coordinated by TLI Systems. For information on registration, hotel
rates, transportation, and reservations call Dan Dozier of TLI Systems
at 301/718-2270 to receive a brochure. If you have technical questions
regarding the conference program, please contact one of the following
individuals: Jeanne Hankins Mourrain; EPA; Office of Research and
Development; Atmospheric Research and Exposure Assessment Laboratory
(MD-77B); Research Triangle Park, NC 27711; telephone 919/541-1120; fax
919/541-7953, or Gary Bennett; EPA Region IV; Environmental Services
Division; Athens, GA 30605-2720; telephone 706/546-3287; fax 706/546-
3375, or Kenneth Jackson, Marge Prevost, and Matt Caruso; State of New
York; Department of Health; PO Box 509; Albany, NY 12201-0509;
telephone 518/474-8519; fax 518/474-6184.
SUPPLEMENTARY INFORMATION: Laboratories, the regulated community,
laboratory clients and the regulatory agencies have experienced
difficulty under the current system of laboratory accreditation. In
response to complaints received in 1990, EPA has been working in a
cooperative venture with the states and the public to develop a system
which would surmount many of the existing problems.
In 1991, the Committee on National Accreditation of Environmental
Laboratories (CNAEL) was chartered at the request of EPA's Deputy
Administrator. The CNAEL was charged with determining the need for and
advisability of a national environmental laboratory accreditation
program, alternatives to such a program, and the role of EPA in any
program. CNAEL was composed of members from the laboratory and
regulated industry communities, academia, other federal agencies, the
states, public environmental interest groups, and private accrediting
bodies.
CNAEL identified and prioritized numerous issues which were of
concern to each of the affected parties and reached agreement on an
overall problem statement: to achieve data of needed quality in a cost-
effective manner. Fifteen alternative solutions were proposed and
evaluated in relation to the problem statement. Multiple options for
operation of a program were identified and ranked. In addition, the
scope of a program was defined in terms of environmental regulations,
which laboratories should be included, and which activities/tests would
be included. Finally, CNAEL identified the elements of a national
environmental laboratory accreditation program for purposes of clarity.
At the conclusion of its deliberations CNAEL recommended that a
national program for accreditation of environmental laboratories, which
includes the key elements of on-site audits, performance evaluation
testing, and data audits, be implemented by enlisting states and/or
third parties to perform the accrediting function with oversight of the
accrediting bodies by a federal agency.
In order to act upon the recommendations of CNAEL, EPA convened the
State/EPA Focus Group, which is composed of ten states (California,
Colorado, Florida, Maryland, Michigan, New Jersey, New Mexico, New
York, South Carolina, Texas) and eight EPA Offices (Office of Air and
Radiation, Office of Enforcement and Compliance Assurance, Office of
Prevention, Pesticides and Toxic Substances, Office of Research and
Development, Office of Regional Operations and State/Local Relations,
Office of Solid Waste and Emergency Response, Office of Water, Region
4). The Focus Group has developed a set of draft standards based on the
ISO 25 Guidelines, as proposed by CNAEL, and a constitution for
operation of the National Environmental Laboratory Accreditation
Conference (NELAC). This set of standards and the constitution will
form the basis for the discussions at the NELAC as discussed below.
The NELAC is a key step in implementing the solution proposed by
CNAEL and has the potential to influence the entire environmental
laboratory community. States and federal agencies will function as the
decision-making (voting) members of the conference. The proposed
process is to modify and refine the set of draft standards. Additions
and modifications are expected to be made during the NELAC committee
meetings where the comments and suggestions of all sectors of the
laboratory community will be solicited to ensure that the standards are
responsive to the needs of the public, are practical to implement, are
scientifically sound, and are as cost-efficient as possible. Those
unable to participate directly in the NELAC may submit written comments
to Jeanne Mourrain.
Once agreement is reached on the standards, the states may
voluntarily adopt the standards and would have responsibility for
ensuring conformance with the standards. Any laboratory which performs
analyses to demonstrate compliance with federal environmental
regulations in a participating state would be required to be accredited
under the state regulations and would be deemed accredited in all other
participating states. EPA would provide oversight of state
accreditation programs and inspections of state and federal
laboratories. In most cases the states would accredit laboratories only
within their own state. However, laboratories located in states which
do not participate in the national program could seek accreditation
from another state which has adopted the national standards.
The goal of the program is to accredit all laboratories which
perform environmental analyses for compliance with regulations. The
information and status of these laboratories would be readily available
to regulators, clients, and the general public. Reciprocity would be
easily obtained and would be granted on a state-by-state basis.
Reciprocity would eliminate duplicate on-site inspections and
performance evaluation sample testing. Some states may have
supplemental requirements under state law that exceed the national
standards; for example, some states have regulated additional compounds
in drinking water, which might require analysis of additional
performance evaluation samples.
A fully operational national environmental laboratory accreditation
program would enable a laboratory to conduct business in any of the
states or territories, with minimal disruption of operations and lower
costs. International acceptance of national accreditation should be
greatly facilitated. The capacity of laboratories for all types of
environmental analyses would be included in the scope of a national
program, enabling clients of laboratories to readily identify the
laboratories which could perform the needed analyses.
The NELAC will provide the opportunity for the entire laboratory
community to voice its concerns, provide advice based on its
professional experience, and effect positive changes in the current
accreditation process. Among those who are encouraged to attend are
state and federal accrediting and laboratory agencies, private sector
laboratories, the regulated industry, environmental interest groups,
accrediting bodies, academia, and the general public.
Ramona Trovato,
Director, Water Enforcement Division, Office of Enforcement and
Compliance Assurance.
National Environmental Laboratory Accreditation Conference
Draft
Constitution and Bylaws
November 1994.
Prepared by the State/EPA Focus Group
Table of Contents
Constitution
Article I--General
Article II--Objectives
A. Forum
B. Mechanism
C. Consensus
D. Uniformity
E. Cooperation
Article III--Membership
Article IV--Officers
Section 1--Ex Officio Officers
A. Director
B. Executive Secretary
Section 2--Elective Officers
A. Eligibility
B. Nominations and Elections
Article V--Appointive Officials
Section 1--Officials, Specific
A. Appointment
B. Assumption of Office
Article VI--Meetings of the Conference
A. Annual Meeting
B. Interim Meeting
C. Special Meetings
D. Rules of Order
Article VII--Fees and Dues
Article VIII--Amendments to the Constitution
Article IX--Bylaws
Section 1--Supplementation of Constitution
Section 2--Amendments and Repeals of the Bylaws
Section 3--Renumbering
BYLAWS
Article I--Application for Membership
Section 1--Form of Application
Article II--Fees, Membership Records
Section 1--Fees
Section 2--Membership Year
Section 3--Billing
Section 4--Evidence of Membership or Contributorship
Article III--Use of the Insignia
Article IV--Board of Directors
Section 1--Membership
Section 2--Duties
Article V--Duties of the Officers
Section 1--Chair
Section 2--Chair-Elect
Section 3--Past Chair
Section 4--Executive Secretary
Section 5--Treasurer
Section 6--Assistant Treasurer
Section 7--Parliamentarian
Article VI--Committees
Section 1--General
Section 2--Administrative Committees
A. Terms
B. Duties
Section 3--Standing Committees
A. General
B. Duties
Section 4--Special Committees, Task Forces, and Study Groups
Section 5--Subcommittees
Article VII--Voting System
Section 1--House of Representatives
A. Official Designation
B. Composition
C. Method of Designation
Section 2--House of Delegates
A. Designation
B. Requirements
Section 3--Voting Rules
A. Proxy Votes
B. Method
C. Timing
D. Recording
E. Applicability
Section 4--Committee Reports
Section 5--Floor Amendments
A. Amendments
B. Changes
Section 6--Seating
A. Arrangement
B. Supervision
Section 7--Voting
A. Minimum Votes
B. Motion Accepted
C. Motion Rejected
D. Split or Tie Vote
Section 8--Procedures
Section 9--Changes in Organization and Procedure
Figure 1. Seating Arrangement
Constitution
Article I--General
This Association shall be known as ``The National Environmental
Laboratory Accreditation Conference'' (NELAC) and is sponsored by the
United States Environmental Protection Agency (U.S. EPA) as a voluntary
association of State and Federal Officials for the purpose of
establishing standards to consolidate and make more uniform the
laboratory accreditation process.
Article II--Objectives
The objectives of the National Environmental Laboratory
Accreditation Conference are:
A. Forum
To provide a national forum for the discussion of all questions
related to standards for environmental laboratory accreditation by
officials of the Federal Government and regulatory officials of the
States, Commonwealths, Territories and Possessions of the United
States, their political subdivisions, and the District of Columbia.
B. Mechanism
To provide a mechanism to establish policy and coordinate
activities within the Conference on matters of national and
international significance pertaining to environmental laboratory
accreditation standards.
C. Consensus
To develop a consensus on uniform standards, laws, regulations and
specifications for laboratory inspections, procedures, criteria,
personnel qualification, testing, administrative procedures and
enforcement.
D. Uniformity
To encourage and promote uniformity of requirements and methods
among jurisdictions.
E. Cooperation
To foster cooperation among regulatory officers and between them
and the manufacturing, industrial, business, academic, consumer, and
other interests affected by their official activities.
Article III--Membership
Membership consists of two classes:
Active Membership. Active membership is limited to officials
actively engaged in accreditation of environmental laboratories or
environmental program officials who are in the employ of the Government
of the United States, the States, the Commonwealths, the Territories,
or the Possessions of the United States, or the District of Columbia.
Contributors. Contributors comprise representatives of laboratories,
manufacturers, industry, business, consumers, academia, laboratory
associations, industrial associations, laboratory accreditation
associations, and other persons who are interested in the objectives
and activities of the Conference.
Article IV--Officers
Section 1--Ex Officio Officers
A. Director. The Director of the Environmental Protection Agency
National Environmental Laboratory Accreditation Program is the ex
officio Director of the Conference.
B. Executive Secretary. The Director of the Environmental
Protection Agency National Environmental Laboratory Accreditation
Program designates a senior member of the Environmental Protection
Agency who is thoroughly conversant with laboratory accreditation to
serve the National Conference as its Executive Secretary.
Section 2--Elective Officers
The Elective officers of the Conference shall be:
Chair,
Chair-Elect,
Past-Chair,
Treasurer, and six members-at-large to serve on the NELAC Board of
Directors.
The consecutive reelection of a Chair-Elect is prohibited; the
Chair-Elect shall not serve on any committee other than the Board of
Directors. Should the Chair-Elect for any reason be unable or unwilling
to be installed as Chair, his/her successor shall be elected in the
manner prescribed. In this event, the newly elected Chair-Elect shall
be installed as Chair.
A. Eligibility. 1. Any Active Member in good standing shall be
eligible to hold any office provided that the individual meets the
other requirements set forth in the Constitution and Bylaws.
2. The Chair-Elect will be elected at the Annual Meeting one year
prior to the term of service as Conference Chair. After serving one
year as Chair-Elect, the incumbent will succeed to the office of
Conference Chair. Only a state official is eligible for election to
Chair-Elect.
B. Nominations and elections. 1. Nominating committee. The Chair
shall appoint a Nominating Committee consisting of the most recent
active Past Chair as Committee Chair and six (6) active members, to be
geographically representative insofar as possible.
2. Nominations. a. The Nominating Committee shall submit one name
for each elective office and present its recommendation as a slate to
the Conference.
b. Additional nominations for officers may be made from the floor
at the Annual Meeting provided that prior consent of the nominee has
been obtained in writing and presented to the presiding officer at the
time of the nomination.
3. Elections. Officers shall be elected during a designated session
of the Annual Meeting by a formal recorded vote of the members in
attendance and eligible to vote on Conference motions.
4. Terms of office. a. The Chair, Chair-Elect, and Past Chair
shall serve for a term of one year or until their successors are
respectively qualified and elected or appointed.
b. The Treasurer will serve a term of three years.
c. The six Board of Directors members-at-large shall serve for 3-
year terms; two elected each year.
d. All officers shall take office immediately following the close
of the Annual Meeting at which they were elected.
5. Filling vacancies. In case of a vacancy in any of the elective
offices, the Board of Directors shall fill the office by appointment.
Article V--Appointive Officials
Section 1--Officials, Specific
The Conference Chair with the approval of the Board of Directors
will appoint the following officials:
Parliamentarian
Assistant Treasurer
A. Appointment. The Conference Chair shall appoint other officials
to conduct Conference activities. See Bylaws, Article V--Duties of the
Officers and Article VI--Committees.
B. Assumption of office. All appointive officials shall take office
immediately following appointment and will serve through the subsequent
Annual Meeting of the Conference unless otherwise specified by the
Conference Chair, Constitution or Bylaws.
Article VI--Meetings of the Conference
A. Annual Meeting
The Annual Meeting of members shall be held each year. The agenda
for this meeting shall include the election of officers, reports from
the various committees, task forces, study groups, and the Treasurer,
other items pertinent to the Conference, and presentation to the
Membership of pending issues requiring action by vote.
The Annual Meeting may include the presentation of technical
papers, discussions, displays, or other events at the discretion of the
Board of Directors.
B. Interim Meeting
The Interim Meeting of the Board of Directors and those Standing
Committees designated by the Chair shall be held annually,
approximately six months prior to the Annual Meeting to develop the
agenda and committee recommendations for presentation to and action by
the membership at the Annual Meeting. Draft resolutions and standards
regarding environmental laboratory accreditation which have been
published in the Federal Register and commented upon are discussed and
modified as appropriate in the Interim Meeting.
C. Special Meetings
1. The Conference Chair is authorized to order a meeting of the
Board of Directors at any time deemed necessary by the Chair to be in
the best interest of the Conference.
2. Other Committees of the Conference are authorized to hold
meetings at times other than the Annual Meeting or Interim Meeting
provided that:
a. such meeting or meetings have been funded in the Conference
budget approved by the Board of Directors, or
b. such meeting or meetings are approved by the Chair and funding
is available within the approved budget or can be made available.
3. A quorum shall consist of a majority of the eligible voters.
D. Rules of Order
The rules contained in Robert's Rules of Order (Revised) shall
govern the Conference in all cases to which they are applicable, and in
which they are not inconsistent with the Constitution or Bylaws or the
special rules of the Conference.
Article VII--Fees and Dues
The annual Membership fees and the registration fees for the Annual
Meeting are recommended by the Conference Management and Funding
Committee and shall be approved (and may be revised) by a majority vote
of the Board of Directors at any official meeting of that Committee.
Article VIII--Amendments to the Constitution
This Constitution may be amended, added to, or repealed at any
Annual Meeting of the Membership under normal Conference procedures.
Proposed changes must be included in the agenda of the Board of
Directors for the Interim Meeting, published in the Recommendations of
the Board of Directors in its Tentative Report, and discussed at the
general session of the Board of Directors at the Annual Meeting at
which said changes will be voted upon. Amendments to the Constitution
must be approved by a minimum of a two-thirds vote in both the House of
Representatives and the House of Delegates.
Article IX--Bylaws
Section 1--Supplementation of Constitution
This Constitution shall be supplemented by Bylaws which shall
detail the methods of operation of the Conference. Such Bylaws shall
not be inconsistent with the provisions of the Constitution.
Section 2--Amendments and Repeals of the Bylaws
The Bylaws may be amended or repealed in the same manner as
prescribed for the Constitution (See Article VIII).
Section 3--Renumbering
The Executive Secretary is authorized to renumber the Articles and
Sections of the Constitution or Bylaws to accommodate any changes made.
Bylaws
Article I--Application for Membership
Section 1--Form of Application
Each application for membership or contributorship shall be
submitted to the Executive Secretary. The application shall be
accompanied by the Membership or Contributor fee. The successful
applicant's name will be added to the Conference mailing list.
Confirmation of Member or Contributor status will be mailed.
Article II--Fees, Membership Records
Section 1--Fees
The fees for annual Membership, Contributorship, as well as the
registration fee for the Annual Meeting, are established by the
Conference Management and Funding Committee and are subject to approval
and revision by the Board of Directors.
Section 2--Membership Year
Annual membership fees shall be payable by July 1 of each year and
will cover the period July 1 to June 30 of the following year.
Section 3--Billing
The Executive Secretary shall bill each Member and Contributor for
yearly dues 2 months prior to the expiration of the current membership
year.
Section 4--Evidence of Membership or Contributorship
Membership certificates and cards of suitable design, bearing the
insignia of the Conference, shall be issued to the Members. Contributor
certificates of a noticeably differing design shall be issued to the
Contributors. The Executive Secretary shall advise the Treasurer of the
count of new Members and Contributors and will forward the membership
monies for deposit in the Conference Account.
Article III--Use of the Insignia
The insignia of the Conference may be used or displayed only by
members of the Conference unless expressly authorized in writing by the
Conference.
Article IV--Board of Directors
Section 1--Membership
A. The Board of Directors consists of the Director, Executive
Secretary, Chair of the Conference, Chair-Elect, the most recent still
active Past Chair of the Conference, the Treasurer, and the six at-
large members.
B. The Nominating Committee in recommending candidates for the
Board of Directors shall consider regional representation.
C. The term of the Board of Directors runs from the adjournment of
the Annual Meeting at which its members are elected (or appointed)
through the succeeding Annual Meeting of the Conference.
Section 2--Duties
A. The Board of Directors has leadership responsibility for the
conference and is charged with guiding the Conference in its primary
mission of establishing standards for the accreditation of
environmental laboratories.
B. It generates the constitution and bylaws of the Conference,
presents amendments, proposes changes in organizational structure, and
defines roles and responsibilities as appropriate, for approval by the
membership.
C. It establishes administrative procedures and policy on internal
matters and serves as the policy and coordinating body in matters of
national and international significance.
D. It holds accountable, reviews, and approves actions of all
Committees.
E. It utilizes the Standing Committees to resolve technical
criteria issues regarding laboratory accreditation.
F. It acts for the Conference in all routine or emergency
situations.
G. It authorizes interim meetings of Conference Committees as
necessary.
H. It fills any vacancy in any elective office of the Conference
caused by death, resignation or retirement from active official
service.
I. It brings recommendations to the Conference for consideration
and action as appropriate.
Article V--Duties of the Officers
Section 1--Chair
The Conference Chair is the principal presiding officer at the
meetings of the Conference and of the Board of Directors, makes
appointments to the several standing and administrative committees, and
appoints other Conference officials to serve during his or her term of
office. All appointments will be made with the consent of the Board of
Directors.
Section 2--Chair-Elect
The Chair-Elect will:
A. serve as acting Chair of the Conference and the Board of
Directors in the event that the Chair is unable to carry out the duties
of that office;
B. perform other duties assigned by the Conference Chair, including
presiding over sessions of the meetings of the Conference as assigned
by the Conference Chair and assisting the Chair in the discharge of his
or her duties; and
C. serve on the Board of Directors.
Section 3--Past Chair
The most recent still-active Past Chair will serve on the Board of
Directors and as Chair of the Nominating Committee and perform such
duties as may be assigned by the Board of Directors. The Conference
Past Chair may preside over sessions of the meetings of the Conference
as assigned by the Conference Chair and assist the Chair in the
discharge of his or her duties.
Section 4--Executive Secretary
The Executive Secretary acts as the executive officer of the
Conference, the secretary and executive officer of the Board of
Directors, and the non-voting secretary to each standing committee;
keeps the records of the proceedings of the meetings and manages the
conference administration as prescribed in its administrative
procedures.
Section 5--Treasurer
The Treasurer receives and accounts for all monies collected and
pays all Conference bills certified by the Conference Management and
Funding Committee as correct. The Treasurer is an ex officio member of
the Conference Management and Funding Committee.
Section 6--Assistant Treasurer
The Assistant Treasurer shall assist the Treasurer in the discharge
of his or her duties.
Section 7--Parliamentarian
The Parliamentarian shall assist in assuring meetings of the
Conference are conducted according to Robert's Rules of Order and any
special rules adopted by the Conference.
Article VI--Committees
Section 1--General
Each administrative committee (except the Contributors Committee)
will consist of five Active Members appointed by the Chair of the
Conference to serve appropriate terms on a rotating basis or until a
successor is appointed. All committee members will be appointed the
initial year for appropriate staggered terms to allow for subsequent
appointment to full terms.
Except for the Nominating Committee, whose chair will be the
Conference Past-Chair, each committee annually selects one of its
members to serve as its chair, who may succeed himself or herself.
Each standing committee will consist of five members elected from
the Active membership of the Conference to serve 5 years with one
member being elected each year. All committee members shall be elected
during a designated session of the Annual Meeting by a formal recorded
vote of the members in attendance and eligible to vote on Conference
motions.
When necessary, an appointment will be made to any of the standing
committees to fill a vacancy caused by death, resignation or retirement
from active service by a committee member. The appointment is for the
unexpired portion of the member's term.
Section 2--Administrative Committees
A. Terms. 1. Conference Management and Funding Committee. The term
of service will be three years; two members are to be appointed in each
of two years and one in the third year.
2. Nominating Committee. The chair shall be the Conference Past
Chair. Four members shall be appointed annually to serve one year.
3. Membership Committee. The term of service will be two years. Two
members will be appointed one year and three the alternate year.
4. Auditing Committee. The term of service will be three years. Two
members are to be appointed in each of two years and one in the third
year.
5. Liaison Committee. The term of service will be three years. Two
members are to be appointed in each of two years and one in the third
year.
6. Contributors Committee. This committee will consist of five
contributors to serve two years. Three members will be appointed one
year and two in alternate years.
B. Duties. 1. Conference Management and Funding Committee. This
committee prepares the annual budget for approval by the Board of
Directors, sets and collects annual membership fees and conference
registration fees, selects the place and dates of each Annual and
Interim Meeting of the Conference, manages the logistical details of
the Interim and Annual Meetings, certifies to the Treasurer the correctness
of bills submitted to the Conference for payment, and publicizes the
Annual and Interim Meetings. The Treasurer is an ex-officio member of
this committee.
2. Nominating Committee. This committee presents a slate of
nominees for all elective offices at the Annual Meeting. The names of
these nominees shall appear in the report of the Nominating Committee
and be published in the Conference Announcement.
3. Membership Committee. This committee initiates membership
invitations and publicizes the Conference to prospective members. This
committee also provides coordination and participation of Contributors
in all affairs of the Conference.
4. Fiscal Auditing Committee. This committee arranges for annual
audits of the Conference books. This committee reviews audit reports
and presents them to the Board of Directors with recommendations, as
necessary, to resolve discrepant audit findings.
5. Liaison Committee. This committee provides liaison with
international organizations, federal agencies, other groups and
organizations. This committee provides and solicits information and
develops a spirit of cooperation between NELAC and other organizations.
6. Contributors Committee. This committee serves as the focal
point for the Contributors. It solicits information from and provides
feedback to the Contributors and acts as liaison to the Board of
Directors on Contributor matters.
Section 3--Standing Committees
A. General. Standing Committee members serve for five years, one
member being appointed annually.
B. Duties. 1. Program Structure Committee. This committee shall
develop modifications to the scope, structure, and requirements of the
tiers and fields of testing.
2. The Accrediting Authority Committee. This committee provides the
standards used by EPA to approve state authorities.
3. Quality Systems Committee. This committee establishes and keeps
current the key elements of QA/QC, including record keeping and
staffing requirements. The committee also defines uniform standards for
each of the elements of QA/QC.
4. Performance Evaluation Testing Committee. This committee
determines the requirements for the Performance Evaluation Program,
generates the standards for the Performance Evaluation samples,
provides criteria for selection of the provider of the Performance
Evaluation samples and provides and updates the protocol for the use of
the Performance Evaluation Program in the accreditation of
laboratories.
5. On-Site Assessment Committee. This committee determines the
training and experience requirements of the assessors, establishes the
frequency of inspection, generates the procedures for on-site visits
and publishes these standards in a National Environmental Laboratory
Accreditation Manual.
6. Accreditation Process Committee. This committee generates and
develops modifications for the accreditation process of environmental
laboratories, including the requirements for accreditation, procedures
for suspension, revocation and denial of accreditation, relative roles
and responsibilities of laboratories and appeal processes. This
committee considers matters concerning reciprocity of accreditation and
establishes the process for the approval of state/federal accrediting
authorities.
7. Regulatory Committee. This committee provides the Standing
Committees with current information on federal regulations that impact
laboratory testing. This committee annually presents a report for
conference action. Its scope embraces all matters regarding the
development and interpretation of uniform laws and regulations, the
study and analysis of bills for legislative enactment, and the
establishment and maintenance of published guidelines and other
effective means of encouraging uniformity of interpretation and
application of laboratory requirements. The Regulatory Committee shall
also provide uniform language to assist states in adopting the
standards in state statutes.
Section 4--Special Committees, Task Forces, and Study Groups
Special committees, task forces, and study groups may be
established by the Conference Chair as the need arises or as requested
by the Conference. Members will be appointed from the Active Members
for as long as deemed appropriate. Upon completion of its assigned
task, such bodies shall be dissolved by the Chair of the Conference.
Section 5--Subcommittees
Upon request of a committee, the Conference Chair may appoint a
subcommittee(s) to assist the committee in fulfilling its
responsibilities.
Article VII--Voting System
All questions before a meeting of the Conference that are to be
decided by a formal recorded vote of the Active Members are voted upon
in accordance with the following voting structures and procedures.
Section 1--House of Representatives
A. Official Designation. This body of officials shall be known as
the ``House of Representatives.''
B. Composition. 1. Each State is authorized one official to serve
as its representative at the annual meeting of the National
Environmental Laboratory Accreditation Conference. The state
representative shall be
the Director of the State Environmental Laboratory Accreditation
Program or the highest level technically competent scientist
knowledgeable about environmental laboratory analysis and
accreditation, or his/her designee.
2. Each of seven EPA Assistant/Associate Administrators (OSWER,
OAR, ORD, OW, OPPTS, OECA, and OROSLR) or his or her designee may
appoint two members, one from headquarters and one from an EPA region.
3. Each other participating federal agency with responsibilities in
the environmental laboratory field is authorized to appoint one
official to the House of Representatives.
C. Method of Designation. Each representative is designated annually
to the Board of Directors 120 days before the NELAC Annual Meeting.
Accommodation may be made for exceptions to this deadline. An alternate
should be named prior to the Annual Meeting in case the designated
representative cannot attend.
Section 2--House of Delegates
A. Designation. All other State and Federal environmental officials
(those not sitting in the House of Representatives) are grouped as a
body known as the ``House of Delegates''. The number of potential
members is not limited.
B. Requirements. No other special requirements apply.
Section 3--Voting Rules
A. Proxy votes. Proxy votes are not permitted. Since issues and
recommendations in the committees' interim reports are often modified
and amended at the Annual Meeting, the attendance of officials at the
NELAC Annual Meeting and voting sessions is vital.
B. Method. All voting is by show of hands, standing vote, or machine
(electronic). There shall be no voice voting. No abstentions are
permitted. NELAC Annual Meeting and voting sessions are mandatory.
C. Timing. Voting by both Houses is simultaneous.
D. Recording. The voting system is designed to record the votes of
the Representatives whether an electronic system, show of hands,
standing vote or other method capable of being tallied is used.
E. Applicability. These procedures (rules) apply only to the Annual
Meetings of NELAC. However, only active members are permitted to vote
in committee or other meetings.
Section 4--Committee Reports
The following alternatives may be used in voting on committee reports:
A. vote on the entire report;
B. vote on grouped items or sections; or
C. vote on individual items, either
1. at the committee's discretion, or
2. on request by a voting delegate with the support of 10 others.
Section 5--Floor Amendments
A. Amendments. Committee chairs are allowed to offer amendments on
the day of voting to make editorial changes in their final reports.
B. Changes. Substantive changes may be made at the request of House
of Representatives or House of Delegates members, provided that:
1. A majority of the voting delegates of each House must vote
favorably before a proposed amendment can be accepted for debate.
2. A two-thirds favorable vote of each House on the amendment is
required for passage (the requirement for the minimum number of votes
in both Houses also applies).
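The two thresholds in paragraph B can be read as successive numerical tests applied to each House. The short Python sketch below is offered only as an illustration and is not part of the draft Constitution or Bylaws; the function names are invented for the example, and it presumes no abstentions, as provided in Section 3.

# Illustrative sketch only (hypothetical function names); not part of the
# NELAC Constitution or Bylaws.

def accepted_for_debate(yea: int, nay: int) -> bool:
    # Section 5.B.1: a majority of the voting delegates of a House must vote
    # favorably before a proposed amendment is accepted for debate. With no
    # abstentions permitted (Section 3.B), this reduces to yea > nay.
    return yea > nay

def amendment_passes(yea: int, nay: int, minimum_votes: int) -> bool:
    # Section 5.B.2: a two-thirds favorable vote of a House is required for
    # passage, and the Section 7 minimum-vote requirement also applies.
    votes_cast = yea + nay
    return votes_cast >= minimum_votes and yea >= 2 * votes_cast / 3

# Both tests must be met in both Houses, for example:
# accepted_for_debate(rep_yea, rep_nay) and accepted_for_debate(del_yea, del_nay)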
Section 6--Seating
A. Arrangement. The seating arrangement for voting sessions is
shown in Figure 1.
B. Supervision. The members of the Board of Directors will control
placement and movement of delegates. The Executive Secretary will count
votes.
Section 7--Voting
At the conclusion of debate on a motion, there shall be a call for
the vote by a show of hands, standing, electronic count, or other tally
method.
A. Minimum Votes. 1. House of Representatives. A minimum of one-
half the participating agencies must cast their votes in favor of, or
in opposition to, an issue for the vote to be considered official.
2. House of Delegates. A minimum number of votes, equivalent to
one-half the number of participating agencies, must be cast in favor
of, or in opposition to, an issue for the vote to be considered
official.
B. Motion accepted. 1. If the minimum number of members of the
House of Representatives votes Yea; and if
2. A majority of the members of the House of Delegates votes Yea
(the minimum number of Yea votes required). If the minimum number of
votes required to pass or fail an issue is not cast in the House of
Delegates, the issue will be determined by the vote of the House of
Representatives.
C. Motion rejected. 1. If the minimum number of members of the
House of Representatives votes Nay; and if
2. A majority of the members of the House of Delegates votes Nay
(the minimum number of Nay votes required). Should a tie vote occur, or
if the minimum number of votes required to pass or fail an issue is not
cast in the House of Delegates, the issue will be determined by the
vote of the House of Representatives.
D. Split or tie vote. When the two Houses split on an issue or the
minimum number of votes supporting or opposing an issue is not obtained
in the House of Representatives, the issue is returned to the standing
committee for further consideration.
The committee may drop the issue or reconsider it for submission
the following year. The issue cannot be recalled for another vote at
the same Annual Meeting.
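Read together, paragraphs A through D describe a decision procedure for each motion. The Python sketch below is offered only as a reading aid and is not part of the draft standards; the names HouseTally, tally_motion, and VoteOutcome are invented for the illustration, and rounding the one-half minimum up to a whole number of votes is an assumption the text does not address.

# Illustrative sketch only; not part of the NELAC Constitution or Bylaws.
# Names are hypothetical, and rounding the "one-half" minimum up to a whole
# number of votes is an assumption.

import math
from dataclasses import dataclass
from enum import Enum

class VoteOutcome(Enum):
    ACCEPTED = "motion accepted"
    REJECTED = "motion rejected"
    RETURNED = "returned to the standing committee"

@dataclass
class HouseTally:
    yea: int
    nay: int  # abstentions are not permitted (Section 3.B)

def tally_motion(reps: HouseTally, delegates: HouseTally,
                 participating_agencies: int) -> VoteOutcome:
    # Paragraph A: the minimum is one-half the participating agencies.
    minimum = math.ceil(participating_agencies / 2)

    # House of Representatives position: official only if the minimum number
    # of votes is cast in favor of, or in opposition to, the issue.
    if reps.yea >= minimum and reps.yea > reps.nay:
        reps_position = VoteOutcome.ACCEPTED
    elif reps.nay >= minimum and reps.nay > reps.yea:
        reps_position = VoteOutcome.REJECTED
    else:
        # Paragraph D: minimum support or opposition not obtained in the
        # House of Representatives (or a tie); back to committee.
        return VoteOutcome.RETURNED

    # House of Delegates position, subject to the same minimum.
    if delegates.yea >= minimum and delegates.yea > delegates.nay:
        delegates_position = VoteOutcome.ACCEPTED
    elif delegates.nay >= minimum and delegates.nay > delegates.yea:
        delegates_position = VoteOutcome.REJECTED
    else:
        # Paragraphs B.2 and C.2: a tie, or too few votes, in the House of
        # Delegates; the House of Representatives vote determines the issue.
        return reps_position

    # Paragraphs B and C: both Houses agree; paragraph D: the Houses split.
    if reps_position == delegates_position:
        return reps_position
    return VoteOutcome.RETURNED

# Example with hypothetical counts: tally_motion(HouseTally(30, 10),
# HouseTally(45, 40), 50) returns VoteOutcome.ACCEPTED.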
Section 8--Procedures
The Conference officers and committees observe in all procedures
the principles of due process--the protection of the rights and
interests of affected parties; specifically, they (a) give reasonable
advance notice of contemplated committee studies, items to be
considered for committee action, and tentative or definite
recommendations for Conference action, for the information of all
parties at interest, and (b) provide that all interested parties have
an opportunity to be heard by committees and by the Conference.
Section 9--Changes in Organization and Procedure
Proposals for changes in organization or procedure of the
Conference are not acted upon until the Annual Meeting of the
Conference following the Annual Meeting at which such proposals are
made.
Figure 1. Seating Arrangement [Graphic TN02DE94.008 not reproduced]
National Environmental Laboratory Accreditation Conference
Draft
Standards
November 1994.
Prepared by the State/EPA Focus Group
Table of Contents
1.0 Policy and Structure
1.1 Introduction
1.2 Purpose of the Conference
1.3 Structure of the Conference
1.3.1 The Board of Directors
1.3.2 The Environmental Laboratory Advisory Board
1.3.3 The Committees
1.3.3.1 The Standing Committees
1.3.3.2 The Administrative Committees
1.3.4 The Membership
1.3.5 The Generation of Standards
1.3.6 Adoption of Standards
1.4 Roles and Responsibilities of the Federal Government, the
States, and Other Parties
1.4.1 Federal Government (USEPA)
1.4.2 State Governments
1.4.3 Joint Federal and State Roles
1.4.4 Other Parties
1.5 Scope of the Program
1.6 Structure of the Accreditation Requirements
1.6.1 General Requirements
1.6.1.1 Organization and management
1.6.1.2 Quality system, audit and review
1.6.1.3 Personnel
1.6.1.4 Accommodation and environment
1.6.1.5 Equipment and reference materials
1.6.1.6 Measurement traceability and calibration
1.6.1.7 Calibration and test methods
1.6.1.8 Handling of calibration and test items
1.6.1.9 Records
1.6.1.10 Certificates and reports
1.6.1.11 Sub-contracting of calibration or testing
1.6.1.12 Outside support services and supplies
1.6.1.13 Complaints
1.6.2 Specific Requirements Linkage
1.6.3 Discussion
1.7 Funding of the Program
1.7.1 Self supported NELAC
1.7.2 EPA Program Support
1.7.3 Fee Supported State Programs
1.8 Reciprocity
1.8.1 Fair Representation of Accrediting Authorities
1.8.2 Scope and Essential Quality Standards
1.8.3 Fee Structures
Figure 1-1
Figure 1-2
Figure 1-3
2.0 Performance Evaluation Testing Program
2.1 Enrollment in PE Testing Program
2.2 Approval of PE Testing Programs
2.3 Testing of Samples
2.4 Scoring
2.5 Successful Participation
3.0 On-Site Assessment
3.1 Introduction
3.2 On-Site Assessment Personnel
3.2.1 Training
3.2.2 Qualifications
3.2.3 Additional qualifications
3.2.4 Assessor Certification
3.3 Frequency of On-Site Assessments
3.3.1 Frequency
3.3.2 Follow-up evaluations
3.3.3 Changes in laboratory capabilities
3.3.4 Announced and unannounced visits
3.4 Pre-Assessment Procedures
3.4.1 Introduction
3.4.2 Scope of the assessment
3.4.2.1 Laboratory evaluations
3.4.2.2 Records review
3.4.3 Assessment planning
3.4.4 Reviewing NELAP/State information
3.4.5 Providing Advance Notification
3.4.6 Assessment Team Coordination
3.4.7 Gathering assessment documents and equipment
3.4.7.1 Types of documents
3.4.7.2 Assessment equipment
3.4.8 Confidential Business Information Considerations
3.5 Assessment Schedule/Format
3.5.1 Length of evaluation
3.5.2 Opening conference
3.5.3 Records review
3.5.4 Staff interviews
3.5.5 Closing conference
3.5.6 Follow-up procedures
3.6 Criteria for Assessment
3.6.1 Assessor's Manual
3.6.2 Assessor's role
3.6.3 Checklists
3.6.4 Evaluation criteria
3.6.4.1 Facility assessment
3.6.4.2 Organization assessment
3.6.4.3 Personnel assessment
3.6.4.4 Sample handling assessment
3.6.4.5 Equipment assessment
3.6.4.6 Calibration standards assessment
3.6.4.7 Methodology assessment
3.6.4.8 Data audit
3.6.4.9 QA Plan assessment
3.6.4.10 General health and safety procedures
3.6.4.11 Laboratory waste disposal assessment
3.7 Documentation of On-Site Assessment
3.7.1 Checklists
3.7.2 Report Format
3.7.3 Distribution
3.7.4 Report Deadline
3.7.5 Release of Report
3.7.6 Report Storage Time
4.0 Accreditation Process
4.1 Components of Accreditation
4.1.1 Personnel Qualifications
4.1.2 On-site Assessments
4.1.3 Performance Evaluation Samples
4.1.4 Corrective Action Reports
4.1.5 Ethical Standards
4.1.6 Fee Process for National Accreditation
4.1.7 Application Process
4.1.8 Transfer of Ownership/Change of Ownership and/or Location
of Laboratory
4.1.9 ``Certification of Compliance'' Statement
4.2 Period of Accreditation
4.3 Maintaining Accreditation
4.3.1 Performance Evaluation Samples
4.3.2 On-Site Assessments
4.3.3 Other Accreditation Elements
4.3.4 Notification and Reporting Requirements
4.3.5 Record Keeping and Retention
4.3.6 Payment of Fees
4.4 Suspension, Revocation and Denial of Accreditation
4.5 Interim Accreditation
4.5.1 Interim Accreditation
4.5.2 Revocation of Interim Accreditation
4.6 Awarding of Accreditation
4.6.1 The Certificate of Accreditation
4.6.2 Changes in Areas of Accreditation
4.7 Enforcement
4.7.1 Role of Enforcement vs QA/QC
4.7.2 Defining Enforceable Violations
4.7.3 Recommendation
5.0 Quality Systems
5.1 Introduction
5.2 Quality System
5.2.1 Quality Assurance Plan
5.3 General Quality Control Procedures
5.3.1 Chemical Testing
5.3.2 Bioassays
5.3.3 Microbiology
5.3.4 Radiochemistry
5.3.5 Air Testing
5.4 Performance Evaluation Samples
5.5 Environmental Laboratory Staffing Requirements
5.5.1 General requirements for laboratory staff
5.5.2 Laboratory Staff Responsibilities and Credentials
5.5.3 Quality Assurance Officer
5.6 Equipment
5.7 Test Methods and Standard Operating Procedures
5.7.1 Laboratory Method Manual(s) and Standard Operating
Procedures
5.7.2 Method Validation/Initial Demonstration of Method
Performance (Performance-based methods and non-approved methods)
5.7.3 Calibration
5.7.3.1 Documentation and Labeling
5.7.3.2 Initial Calibrations
5.7.3.3 Continuing Calibration Verification
5.8 Physical Facilities
5.8.1 Environment
5.8.2 Work Area
5.9 Sample Acceptance Policy and Sample Receipt
5.9.1 Sample Acceptance Policy
5.9.2 Sample Receipt Protocols
5.9.3 Storage Conditions
5.10 Sample Tracking
5.11 Record Keeping, Data Review and Reporting
5.11.1 Sample Custody Requirements
5.11.1.1 Essential Documentation
5.11.1.2 Record Keeping System and Design
5.11.1.3 Laboratory Report Format and Contents
5.11.1.4 Records Management and Storage
5.11.2 Sample Custody Tracking and Data Documentation for
Laboratory Operations
5.11.2.1 Sample Receipt, Log In and Storage
5.11.2.2 Intralaboratory Distribution of Samples for Analysis
5.11.3 Legal or Evidentiary Custody Procedures
5.11.3.1 Basic Requirements
5.11.3.2 Required Information in Custody Records
5.11.3.3 Controlled Access to Samples
5.11.3.4 Transfer of Samples to Another Party
5.11.3.5 Sample Disposal
5.12 Corrective Action Policy and Procedures
Appendix A
Appendix B
1.0 Policy and Structure
1.1 Introduction
The Committee on National Accreditation of Environmental
Laboratories (CNAEL) in its final report of September 1992 recommended
the establishment of a national environmental laboratory accreditation
program (NELAP). The States function as the primary accrediting
authorities and may contract with a third party as the accrediting body
for purposes of carrying out some parts of the accrediting functions,
e.g. on-site inspections. As accrediting authorities, the states would
maintain the authority to grant accreditation, enforce compliance, etc.
EPA shall oversee and approve the state's compliance with all standards
applicable to an accrediting authority. The recommended key elements
for NELAP include on-site assessments, performance evaluation testing,
and data audits. To achieve the stated goals of the CNAEL report, it is
proposed to establish a National Environmental Laboratory Accreditation
Conference (NELAC), which is modeled after the National Conference on
Weights and Measures. NELAC membership shall be voluntary and shall be
open to environmental laboratory accrediting authorities. The NELAC
shall serve as the organization that shall establish and modify the
accreditation standards. Broad participation in NELAC shall identify
laboratories which are capable of providing reliable, uniform
laboratory data which are acceptable to both Federal and State
environmental programs. National accreditation standards and procedures
shall provide a level playing field where reciprocity among the States
in environmental laboratory accreditation shall be practicable. The
creation of a National Environmental Laboratory Accreditation Program
allows coordination of the current accreditation activities of
different States or other governmental agencies, and reduces the number
of on-site inspections, performance evaluation tests, and related
requirements with which the accredited organizations must comply. It is
intended that NELAP function in a manner that minimizes negative
effects on the current accreditation operations of the States, requires
minimum outlay of State and Federal funds to implement, and is self-
supporting.
1.2 Purpose of the Conference
The National Environmental Laboratory Accreditation Conference
shall be a standards-setting body. NELAC shall, through the process
described, establish consensus uniform standards on which the national
accreditation program shall be based. These uniform standards shall
include, but are not limited to, quality systems, performance
evaluation, audit programs, and other key elements as established by
the standing committees of NELAC. It is NOT the purpose of NELAC to
function as an accrediting body, oversee or approve accrediting bodies,
or administer any of the main elements of the accreditation program.
1.3 Structure of the Conference
The structure of the Conference is shown in Figure 1-1. The Board
of Directors shall assume the overall supervisory, administrative, and
procedural duties. The Standing Committees and Administrative
Committees are overseen by the Board of Directors. The Standing
Committees shall receive input regarding standards and test procedures,
then process this input into resolutions which shall be put before the
Membership at the Annual Conference. These resolutions shall be voted
on by Active Members. The non-voting Contributors shall also have the
opportunity to make presentations and comments on the resolutions
throughout the process and at the Annual Conference. The NELAC may also
take into consideration advice and comment provided to the
Environmental Protection Agency through the Environmental Laboratory
Advisory Board (ELAB) chartered under the Federal Advisory Committee
Act (FACA). The composition and relationships of these bodies are
described below.
1.3.1 The Board of Directors
The Board of Directors consists of the Conference Chair, the Chair-
Elect, the most recent still active Past Chair, the Treasurer, six
members elected at large from the active membership (to serve 3-year
staggered terms), an EPA official to be appointed by the EPA
Administrator as the NELAP Director (see section 1.4.1), and an
Executive Secretary to be named by the Director. The Board of Directors
serves as a policy and coordinating body in matters of national and
international significance. The Board of Directors also makes interim
policy decisions when necessary before the Voting Delegates have an
opportunity to vote on the issues in question.
1.3.2 The Environmental Laboratory Advisory Board
The ELAB consists of nine members: eight nongovernmental
representatives and an EPA representative, who chairs the board. The members may
be selected from a slate of nominees prepared by the Contributors'
Committee. This FACA board advises the EPA on matters affecting the
interests of the contributors and other interested parties.
1.3.3 The Committees
The committees are the Standing Committees and the Administrative
Committees. Both are overseen by the Board of Directors.
1.3.3.1 The Standing Committees
These committees each consist of five members elected from the
Active Membership of the Conference. They serve five years and one new
member is elected each year. The committee elects a chair. The
committees shall generate standards and policies for which they have
responsibility to be presented at the annual Conference for vote. The
committees shall receive input via comments and presentations at the
interim and annual conferences. The committees shall draft resolutions
which shall be published by EPA in the Federal Register. The committees
shall prepare and arrange timely agendas for Interim Meetings and
Annual Conferences.
The Program Structure Committee. This committee shall develop
modifications to the scope, structure, and requirements of the tiers
and fields of testing.
The Accrediting Authority Committee. This committee provides the
standards used by EPA to approve state authorities.
The Quality System Committee. This committee shall establish and
keep current the key elements of a quality system including record
keeping and staffing requirements. The Committee shall also define
uniform standard criteria for each of the elements of the quality
system.
The Performance Evaluation Program Committee. This committee shall
determine the requirements for the Performance Evaluation Program. The
committee shall generate the standards for the Performance Evaluation
Samples, provide criteria for selection of the provider of the
Performance Evaluation Samples, and provide and update the protocol for
the use of Performance Evaluation Program in the accreditation of
laboratories.
The On-Site Assessment Committee. This committee shall establish
the training and experience requirements of the assessors; establish
the frequency of inspections; and generate the procedures for on-site
visits.
The Accreditation Process Committee. This committee shall establish
and develop modifications for the accreditation process including the
requirements for accreditation; procedural requirements for suspension,
revocation and denial of accreditation; relative roles and
responsibilities of laboratories; and appeal processes. The Committee
considers matters concerning reciprocity of accreditation.
The Regulatory Committee. This committee provides the Standing
Committees with current information on Federal regulations that affect
the testing that laboratories perform. The Regulatory Committee annually
presents a report for Conference action. Its scope embraces all matters
regarding the development and interpretation of uniform laws and
regulations; the study and analysis of bills for legislative enactment;
and the establishment and maintenance of published guidelines and other
effective means of encouraging uniformity of interpretation and
application of laboratory accreditation laws and regulations. This
committee shall develop language which shall assist the states in the
preparation and adoption of standardized statutes and regulations.
1.3.3.2 The Administrative Committees
The Administrative Committees, with the exception of the
Contributors Committee, shall consist of members appointed from the
active membership. The functions and the responsibilities of the
Administrative Committees are described below.
The Nominating Committee. The Nominating Committee annually
presents a slate of nominees for all elective offices at the national
annual conference.
The Conference Management and Funding Committee. This committee
sets annual membership fees and conference registration fees, manages
the logistical details of the interim meetings and annual conferences,
prepares an annual budget for the Conference to be submitted for
approval to the Board of Directors, and publicizes the interim meetings
and annual conferences. The Treasurer shall be an ex-officio member of
this committee.
The Membership Committee. This committee initiates membership
invitations and publicizes the Conference to prospective members. The
committee also provides coordination and participation of Contributors
in all affairs of the Conference.
The Fiscal Auditing Committee. This committee is responsible for
the conduct and review of the annual audit of the Conference and shall
report such findings to the Board of Directors. It also audits the
Treasurer's books annually.
The Liaison Committee. This committee shall provide liaison with
other federal agencies such as the Department of Energy and the
Department of Defense. In addition this committee shall provide liaison
with other national and international standard setting bodies such as
the National Institute of Standards and Technology (NIST) and the
International Organization for Standardization (ISO). The function of
this committee is to provide and solicit information and develop a
spirit of cooperation between NELAC and outside organizations.
The Contributors Committee. This committee is composed of five
Contributors. Its function is to serve as a focal point for the
Contributors. The committee shall propose a slate of candidates to the
EPA as potential appointees to the ELAB. It solicits information from
and provides feedback to the Contributors.
1.3.4 The Membership
The Membership consists of two classes--Active Members and
Contributors.
Active Membership. Active membership is limited to State and
Federal Officials. The Active Members may vote and serve on the
Committees. At the annual conference the voting Members are divided
into a House of Representatives and a House of Delegates. The House of
Representatives is composed of one officially designated State
Representative from each State or Territory, two representatives from
each of seven EPA Assistant/Associate Administrators (OSWER, OAR, ORD,
OW, OPPTS, OECA, and OROSLR), and one officially designated Federal
Representative from each other participating federal program. The state
representative should be the director of the state environmental
laboratory accreditation program, or a high-level, technically competent
scientist who is knowledgeable about environmental laboratory analysis
and accreditation programs, or his or her designee. The Federal
Representative is designated by the appropriate person in charge of the
federal program. All other State and Federal Officials are grouped as a
body known as the House of Delegates.
Contributors. The contributors are all other interested parties and
groups. They include, but are not limited to, laboratory personnel,
industry representatives, environmental groups, the general public,
laboratory associations, industry associations, accreditation
associations, and retired active members. The Contributors may not
vote, but may make presentations, offer comments, and provide input at
all stages of the standards- and procedure-making process.
1.3.5 The Generation of Standards
The standards for the accreditation of laboratories begin in the
various committees (see Figure 1-2). Draft standards proposed by the
committees are published in the Federal Register by EPA. After
providing an appropriate time for review, an Interim Meeting is held
and presentations, comments and other input are received. The draft
proposals are processed and either presented at the Annual Conference
or returned to committee for further work. The resolutions presented
at the Annual Conference are then voted upon by the Active Membership. (See
Constitution and Bylaws for voting procedures.) If rejected, they go
back to committee for reassessment or shelving. If approved, they are
presented in the Federal Register in final form by EPA.
1.3.6 Adoption of Standards
Participating States must adopt the standards to maintain status as
a NELAP accreditor. If a State chooses not to participate in all or
part of the accreditation program, laboratories in that State may
obtain certification from a participating State that is approved under
NELAP.
1.4 Roles and Responsibilities of the Federal Government, the States,
and Other Parties
1.4.1 Federal Government (USEPA)
The role of the federal government, as represented by the USEPA
(the Agency), shall be that of oversight and evaluation of the
accrediting authorities and that of administration of NELAP program
elements which require a high degree of standardization between
different accrediting authorities. In addition, the USEPA shall provide
staff support to the Conference as provided for in the Bylaws and
agreed to by the Agency. The EPA shall assist the Conference by
publishing in the Federal Register all proposed and final standards.
The EPA will also evaluate state and federal laboratories to assure
compliance with NELAC standards. The EPA Administrator will appoint a
Director of the National Environmental Laboratory Program. The Director
shall serve as an ex officio member of the Board of Directors. He or
she shall select a senior member of EPA with laboratory accreditation
experience as the Executive Secretary of the Conference (a full time
position). The Director's Office shall establish a program which
evaluates, approves, and reports on the accreditation programs
implemented by the state accrediting authorities. In these reports,
state accreditation programs shall be evaluated against the national
standard as established by the Conference. The EPA shall evaluate,
inspect, and approve state and federal laboratories as complying with
NELAC standards. In addition, the Agency shall establish a five member
board, the Accrediting Authority Review Board, composed of
representatives from the states, EPA, and other federal agencies, to
review the process and procedures used by EPA to approve State and
Federal laboratories and accrediting authorities. It is recommended
that the Agency provide administrative support to a performance
evaluation sample program so as to ensure uniformity of sample
composition and performance evaluation standards.
1.4.2 State Governments
State governments shall be the primary accrediting authority. The
state's Laboratory Accreditation Program will be audited and approved
by the Director's Office. As the accrediting authority, states will
have full responsibility for ensuring conformance with the national
standard established by NELAC. States will be responsible for
accrediting applicant organizations through approving applications,
performing on-site assessments and maintaining performance evaluation
sample programs. States are responsible for ensuring that on-site
inspectors are trained in accordance with NELAP requirements. States
shall submit the names of accredited laboratories, and appropriate
accreditation material, to the EPA for inclusion in the National
Laboratory Database. States may
choose to contract accreditation activities to a third party (non-
government) agency. If contracted to a third party, states remain the
accrediting authority and retain responsibility for ensuring compliance
with the standards established by NELAC.
1.4.3 Joint Federal and State Roles
The NELAC (Conference) shall be the joint responsibility of the
Federal Government (Agency) and the state accrediting authorities. As
provided in the following section on structure of the Conference and
the Conference Bylaws, state accrediting authorities and the Agency
share responsibilities of governance, analysis and establishment of
policy, and analysis and establishment of technical standards as they
apply to the NELAP.
1.4.4 Other Parties
All other interested parties including, but not limited to, the
laboratory industry, clients of the laboratory industry, environmental
or other public interest groups, and the general public, shall function
as contributors to the Conference. In this role, these other parties
shall bring technical and policy issues to the attention of the
Conference, its managing Board, or its subcommittees. It is anticipated
that these issues shall be brought to the Conference in the form of
reports, presentations, discussion material, or other forms of
documentation for presentation at the annual Conference, committee, or
subcommittee meetings.
1.5 Scope of the Program
The scope of the National Environmental Laboratory Accreditation
Program shall encompass the necessary scientific testing to serve all
U.S. Environmental Protection Agency (EPA) monitoring, compliance or
other functions mandated by statutes and pursuant regulations. Some of
the statutes are the Federal Insecticide, Fungicide and Rodenticide Act
(FIFRA); the Safe Drinking Water Act (SDWA); the Resource Conservation
and Recovery Act (RCRA); the Comprehensive Environmental Response,
Compensation, and Liability Act (CERCLA); the Federal Water Pollution
Control Act (Clean Water Act); the Clean Air Act (CAA); and the Toxic
Substances Control Act (TSCA). The program shall also
include provisions to permit special requirements or fields of testing
promulgated by any of the States and/or Territories.
However, the program shall not be implemented or administered in a
way which limits the ability of local, state or federal agencies to
investigate and prosecute enforcement cases. Specifically, when engaged
in the collection and analysis of forensic evidence to support
litigation those agencies may use any procedure that is appropriate
given the nature of the investigation, subject only to the bounds of
sound scientific practice. This program shall not apply to those
government laboratories engaged solely in the analysis of forensic
evidence.
1.6 Structure of the Accreditation Requirements
The structure of the NELAP shall be based on the field of testing
(see Figure 1-3). It shall consist of a set of general requirements
that all applicants must satisfy. Applicants for a particular field of
testing must also meet the necessary number of additional levels of
specific requirements or functions that are linked to the general
requirements. The number and the degree of difficulty of the required
additional levels shall depend on the complexity of the test procedures
in question.
It is proposed that the different fields of laboratory testing be
structured into groupings based on parameter, group of parameters, or
method. In addition, a category of supplemental accreditation will be
designated. A ``supplemental'' accreditation means accreditation of a
laboratory which has met additional method or parameter requirements
imposed by a state accrediting authority.
``Supplemental'' accreditation shall be needed only for those few
methods and/or parameters which are unique to a particular state. These
supplemental requirements shall be limited in number and scope.
1.6.1 General Requirements
The general requirements are applicable to all applicants
regardless of their size, volume of business, or field of testing. The
organizational structure or procedures used by applicant organizations
to meet these general requirements may differ as a function of an
organization's size or scope of testing. The general requirements shall
include all the elements outlined in General Requirements for the
Competence of Calibration and Testing Laboratories, ISO/IEC Guide 25:
1990 (E).
General requirements shall include Health and Safety, and Waste
Management Programs. Applicant organizations shall be required to be in
compliance with all applicable federal, state, and local rules and
regulations covering environmental and occupational health and safety.
Responsibility for the evaluation of compliance with these rules and
regulations shall remain with the appropriate regulatory body.
The relevant elements listed in the document are as follows:
1.6.1.1 Organization and Management
The organization shall be legally identifiable; the organization
shall have managerial staff with the authority and resources needed to
discharge their duties; this includes technical management with overall
responsibility for the technical operations, and quality management
with responsibility for the quality system and its implementation.
1.6.1.2 Quality System, Audit and Review
The organization shall establish and maintain a quality system
appropriate to the type, range and volume of calibration and testing
activities it undertakes; the quality manual, and related quality
documentation, shall state the organization's policies and operational
procedures; the organization shall arrange for audits of its activities
at appropriate intervals to verify that its operations continue to
comply with the requirements of the quality system.
1.6.1.3 Personnel
The organization shall have sufficient personnel, having the
necessary education, training, technical knowledge and experience for
their assigned functions.
1.6.1.4 Accommodation and Environment
Organization facilities shall have suitable space, energy sources,
lighting, heating and ventilation for proper performance of
calibrations or tests.
1.6.1.5 Equipment and Reference Materials
The organization shall be furnished with all items of equipment
(including reference materials) required for the correct performance of
calibrations and tests.
1.6.1.6 Measurement Traceability and Calibration
Standards used for calibration must be traceable.
1.6.1.7 Calibration and Test Methods
The organization must document instructions on the use and
operation of all relevant equipment.
1.6.1.8 Handling of Calibration and Test Items
The organization must document a system used to identify the items
to be calibrated or tested.
1.6.1.9 Records
The organization shall maintain a record system to suit its
particular circumstances and comply with any applicable regulations.
1.6.1.10 Certificates and Reports
The organization certifies and reports the calibration and/or test
results.
1.6.1.11 Sub-Contracting of Calibration or Testing
The organization shall sub-contract work only to organizations that
are accredited by a NELAC accrediting authority. Subcontractors must be
clearly identified.
1.6.1.12 Outside Support Services and Supplies
The organization must use only those outside support services and
supplies that are of adequate quality.
1.6.1.13 Complaints
The organization shall have documented policy and procedures for
the resolution of complaints received from clients or other parties
about the organization's activities with records maintained of all
complaints and of the actions taken by the organization; where a
complaint, or any other circumstance, raises doubt concerning compliance
with the organization's procedures or other requirements, or otherwise
concerning the quality of the organization's calibrations or tests, the
areas involved shall be promptly audited in accordance with
pre-established procedures.
1.6.2 Specific Requirements Linkage
Additional tiers of requirements can be linked to the general
requirements. To illustrate the tiered approach, a schematic
representing the accreditation scope and structure by field of testing
is given in Figure 1-3. It indicates that all NELAP applicants must
meet the basic requirements. Additional specific tiers of requirements
are linked to the basic requirements for a particular test or activity.
An organization seeking accreditation in hazardous waste organic
testing must meet all the requirements listed in basic requirements,
general laboratory, organic, and hazardous waste. The specific and
detailed requirements under this scheme have not been developed at this
time. The appropriate and necessary requirements of the various tiers
and fields of testing will be developed by the Program Structure
Committee.
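The tiered linkage described in this section can be illustrated with a
small sketch. The following is a minimal illustration, assuming a
hypothetical mapping from a field of testing to its linked tiers; the
tier names are taken from the hazardous waste organic testing example
above, while the requirement entries, the mapping itself, and the
function name requirements_for are placeholders, not NELAP content.

# Illustrative sketch only: tier names follow the hazardous waste
# organic testing example in the text; requirement entries and the
# field-to-tier mapping are hypothetical placeholders.
TIER_REQUIREMENTS = {
    "basic": ["general (ISO/IEC Guide 25) requirements"],
    "general laboratory": ["general laboratory requirements"],
    "organic": ["organic testing requirements"],
    "hazardous waste": ["hazardous waste requirements"],
}

FIELD_TIERS = {
    "hazardous waste organic testing":
        ["basic", "general laboratory", "organic", "hazardous waste"],
}

def requirements_for(field_of_testing):
    """Return the combined requirements for a field as the union of its linked tiers."""
    combined = []
    for tier in FIELD_TIERS[field_of_testing]:
        combined.extend(TIER_REQUIREMENTS[tier])
    return combined

# An applicant for hazardous waste organic testing must meet all four tiers:
print(requirements_for("hazardous waste organic testing"))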
1.6.3 Discussion
The field of testing structure proposed for the national
environmental laboratory accreditation program provides flexibility.
This allows for the incorporation of new methods or new instrumentation
without applicants repeatedly demonstrating basic requirements they have
previously satisfied, so redundant qualification assessments are
avoided. Avoiding redundant reviews and assessments shall significantly
expedite the processing of
applications which cover different fields of testing. Such a scheme
provides a structure whereby appropriate and specific accreditation
requirements can be established to meet the prevailing needs of
environmental laws and regulations. Regulators are thus provided with
environmental sample testing results generated by laboratories
according to specified or equivalent methods and quality assurance
protocols.
Additionally, the adoption of parameter, method-specific, and
supplemental classifications allows for the design of accreditation to
suit the needs of individual laboratories and states. This flexibility
shall promote reciprocity among all the participating States. The field
of testing approach proposed shall also allow for the future
incorporation of performance based methods (PBM) by substituting an
approved PBM for the specified analytical methods. Any supplemental
requirements essential to meet state needs would be added at the
parameter or method specific level.
1.7 Funding of the Program
Funding shall be needed to cover the costs arising from at least
three areas: the administration and functions of NELAC; expenses
incurred by EPA through its oversight and related administrative
duties; and expenses incurred by the States because of accreditation
functions including on-site visits, performance evaluation samples,
processing applications, and other duties. Funding mechanisms for each
of these cost areas are proposed below:
1.7.1 Self Supported NELAC
The NELAC should be self-sustaining financially insofar as
possible. Expenses for the Interim Meetings and Annual Conferences
should be financed by registration fees and annual dues for Members and
Contributors. These dues and registration fees should be set by the
Conference Management and Funding Committee. Other expenses of
committee members shall be paid by their organizations.
1.7.2 EPA Program Support
The EPA should provide support for the National Environmental
Laboratory Accreditation Program. This program includes oversight and
evaluation of accreditation authorities, evaluation and approval of
state and federal laboratories, administrative support for the
Conference, and publications in the Federal Register.
1.7.3 Fee Supported State Programs
All costs of state accreditation programs may be covered through
the collection of application fees from the applicant organizations.
Such fees would cover the cost of application and processing,
performance evaluation, site assessments, staff training, Conference
membership and participation, and other appropriate activities, whether
such activities were carried out directly by the state accrediting
authority or by contract to a third party. It is recommended that a
dual fee structure be implemented by the state authorities. A full fee
should be charged applicants for which the state is the primary
accreditor. A reduced fee should be charged applicants for which the
state is the secondary accreditor. This fee structure is based on the
principle that fees shall cover the actual cost of an accreditation.
The primary accrediting authority shall incur the full cost of
accreditation. The secondary accrediting authority, having accepted the
accreditation of another authority through reciprocity, shall only
incur the cost of registration of the accredited organization. Costs
incurred by a secondary accrediting authority related to supplemental
requirements, as described in section 1.8.2, should be reflected in
supplemental fees.
1.8 Reciprocity
All member accrediting authorities shall grant reciprocity to all
other member accrediting authorities which have met the national
standard. This principle of reciprocity is an element of the national
accreditation standard, to which all member accrediting authorities are
held.
Reciprocity among the environmental laboratory accrediting
authorities is essential to the success of a national program. The
principal accrediting authorities shall be the states. The states or
federal agencies which act as accrediting authorities must accept
accreditation from other accrediting authorities in order for a
national uniform program to succeed. Three policy issues are presented
which are key to acceptance of the reciprocity principle by accrediting
authorities.
1.8.1 Fair Representation of Accrediting Authorities
The accrediting authorities must have a fair and representative
voice in the National Environmental Laboratory Accreditation
Conference. NELAC shall establish the basic scope, structure, and
standards of the national program. Acceptance of the national program,
in lieu of state programs, shall be significantly enhanced by fair and
meaningful participation of state accrediting authorities in the
establishment of the national program.
1.8.2 Scope and Essential Quality Standards
The national program (the national consensus standard) adopted by
NELAC shall have a scope and essential quality standards which meet or
exceed the requirements of the existing state accrediting authorities.
NELAC must consider the range of scope and quality systems requirements
of the state accrediting authorities in the adoption of a national
program. A national program which falls significantly short of the
existing state program requirements shall either not be accepted by
state authorities, or shall require such extensive state supplementary
requirements as to make the national program irrelevant. It is
recognized that certain state authorities shall have special
requirements which arise from a unique statutory, economic, or
ecological situation. Reciprocity shall be possible if state mandated
supplementary requirements are limited in number and complementary to
the national program.
1.8.3 Fee Structures
NELAC shall adopt a policy which recommends that all accrediting
authorities institute a fee structure which reflects the cost of
operation of the accreditation program. NELAC requires that
laboratories apply for accreditation in the state of their primary
operation.
BILLING CODE 6560-50-P
[Graphics TN02DE94.009, TN02DE94.010, and TN02DE94.011 (the figures referenced in section 1, including Figures 1-2 and 1-3) are not reproduced in this text.]
BILLING CODE 6560-50-C
2.0 Performance Evaluation Testing Program
2.1 Enrollment in PE Testing Program
Each laboratory must enroll in a performance evaluation (PE)
testing program that meets the criteria detailed by the National
Environmental Laboratory Accreditation Program (NELAP). The laboratory
must participate in an approved program or programs for each field of
testing for which it seeks accreditation. Participation shall mean the
analysis and reporting of all test samples. Laboratories shall
participate in PE testing for all fields of testing at a frequency
determined by the NELAC standards.
The laboratory must notify the accreditation agency of the NELAP-
approved program or programs in which it chooses to participate to meet
PE testing requirements. For those tests performed by the laboratory
for which PE testing is not currently available, the laboratory must
establish and maintain the accuracy and reliability of its testing
procedures by a system of internal quality management.
For each field of testing for which the laboratory seeks
accreditation, it must participate in the designated, NELAP-approved
PE testing program for at least twelve months before designating a
different program. The laboratory must notify the primary accreditor
before any change in designation.
Laboratories shall bear the cost of any subscription to a PE
testing program required by NELAP.
Each participant must authorize the PE testing program to release
to the primary accreditor all data required to determine the
laboratory's compliance with the criteria. The primary accreditor shall
make individual performance results available to all requesters.
2.2 Approval of PE Testing Programs
In order for a PE testing program to receive approval, the program
must be offered by a Federal or State agency, or entity acting as a
designated agent for the Federal or State agency. A Federal or State
program seeking approval or renewal for its PE program for the next
calendar year must submit an application to the NELAP director
providing the required information by July 1 of the current year. The
program must provide technical assistance to resolve problems that the
participants experience, such as anomalies during analysis of the
samples, lost samples, or receipt of broken sample containers. In
addition, the PE testing program must:
(a) Assure the quality of test samples, appropriately evaluate and
score the PE test results, and identify performance problems in a
timely manner;
(b) Demonstrate to the primary accreditor (or NELAP) that it has:
(1) The technical ability required to:
i. Either prepare samples or evaluate samples purchased from
manufacturers, who prepare the samples in conformance with the
appropriate good manufacturing practices; and
ii. Distribute samples with at least two levels of analytes.
Rigorous quality control must assure that samples mimic actual
environmental samples when possible and that samples are homogeneous
and remain stable over the period of testing. Stability shall be
verified by routine testing on stored samples, within the time frame
for analysis by PE test participants. Samples shall be maintained by
the PE testing program to retest laboratories with unsatisfactory
performance, or which have significant changes in accreditation status;
(2) A scientifically defensible process for determining the correct
result for each challenge offered by the program;
(3) A program of sufficient challenge, with a frequency of no less
than two times per year, to establish that a laboratory has met
performance requirements;
(4) The resources needed to distribute, analyze and interpret
individual laboratory performance. The PE program will provide:
i. Individual results to the laboratories,
ii. Statewide and nationwide reports to regulatory agencies on
individual laboratory performance on PE test events,
iii. Cumulative reports and scores for each laboratory, and
iv. Reports of specific laboratory failures using grading criteria
acceptable to NELAP, which must be provided on a timely basis.
(5) Provisions, on each PE report form used by the laboratory to
record PE results, for an attestation statement that PE test samples
were tested in the same manner as routine samples, with a signature block to
be completed by the individual performing the test as well as by the
laboratory management;
(6) A mechanism for notifying participants of the PE shipping
schedule and for participants to notify the PE testing program within
three days of the expected date of receipt of the shipment that samples
have not arrived or are unacceptable for testing. The program must have
provisions for replacement of samples that are lost in transit or are
received in a condition that is unacceptable for testing; and
(7) A process to resolve technical, administrative, and scientific
problems about program operations;
(c) Provide and maintain the following documentation as described:
(1) Reports of PE test results and all scores for each laboratory's
performance (an electronic or a hard copy, or both) must be provided to
the primary accreditor, NELAP, and the participating laboratory in the
format required by NELAP within 60 days after the date by which the
laboratory must report PE test results to the PE testing program;
(2) Records of each laboratory's performance must be maintained for
a period of five years or such time as may be necessary for any legal
proceedings; and
(3) An annual report must be provided to the primary accreditor and
NELAP with, if needed, an interim report, which identifies any
previously unrecognized sources of variability in kits, instruments,
methods, or PE samples, which may adversely affect the ability of the
primary accreditor or NELAP to evaluate laboratory performance.
If a PE testing program is determined by NELAP to fail to meet any
criteria for acceptance as an approved performance evaluation testing
program, NELAP will notify the PE testing program and the primary
accreditor. The PE program must notify all laboratories enrolled in
its PE program of the nonapproval and the reasons for nonapproval
within 30 days of the notification.
2.3 Testing of Samples
The laboratory must examine or test, as applicable, the PE samples
it receives from the PE testing program in the same manner as it tests
environmental samples, and return the results by the deadline stated in
the sample package. The analyst testing or examining the samples and
the laboratory management must attest to the routine integration of the
samples into the workload using the laboratory's routine methods. The
laboratory must test samples the same number of times that it routinely
tests environmental samples.
Laboratories that perform tests on PE samples must comply with the
following restrictions and limitations on communications and sample
transfer:
(a) Laboratories must not engage in any interlaboratory
communications pertaining to the results of PE sample(s) until after
the date by which the laboratory must report the results to the PE
program for the PE test event in which the samples were sent;
(b) Laboratories with multiple testing sites or separate locations
must not participate in any communications or discussions across sites/
locations concerning PE sample results until after the date by which
the laboratory must report the PE test results to the program; and
(c) The laboratory must not send PE samples or portions of samples
to another laboratory for any analysis for which it seeks
accreditation.
Any laboratory that the primary accreditor or NELAP determines
intentionally referred its PE samples to another laboratory for
analysis and submitted the other laboratory's results as its own will
have its accreditation revoked for a minimum period of one year. Any
laboratory that receives PE samples from another laboratory for testing
must notify the accreditation program of the receipt of those samples.
Laboratories not doing so may have their accreditation suspended for a
period not to exceed one year. This policy is not intended to prevent
interlaboratory testing designed as part of a methods development or
evaluation study, and applies only to PE samples.
The laboratory shall initiate chain of custody procedures upon
receipt of all PE samples. The laboratory must maintain a copy of all
records, including analytical worksheets, for a minimum of five years.
This record must include a copy of the PE program report forms used by
the laboratory to record PE results, and an attestation statement
signed by the analyst and the laboratory management stating that PE
samples were tested in the same manner as routine samples.
2.4 Scoring
Option I: Pre-established pass/fail ranges set by calculating 95%
confidence intervals determined from previous studies.
Option II: Statistical evaluation of data from all participants in
the current study, with calculation of 95% and 99% confidence intervals
to set the limits for marginal and unsatisfactory performance.
Option III: Pre-established pass/fail intervals as established in
40 CFR 136, appendix B.
Option IV: The following scoring protocol applies to: All chemical
analytes; bacteriology samples that require quantitation (total and
fecal coliform in non-potable water); fibers in air determined by phase
contrast microscopy; asbestos in friable solid material by polarized
light microscopy; and asbestos in air and potable water by transmission
electron microscopy.
The true values may be established through robust statistical
analysis of the results reported by all laboratories, in order to
reject gross outliers and establish a mean result and standard
deviation, or through results obtained by a panel of 12 reference
laboratories (this is done for asbestos in friable material). A
laboratory's result on a given sample is then assessed as:
Good if it is within the 95% confidence interval about the mean, or
reported as ``less than'' the method detection limit if the sample is a
blank;
Marginal if it is outside the 95% confidence interval, but within
the 99% confidence interval about the mean, or reported as ``less
than'' twice the method detection limit; or
Unsatisfactory if it is any other result.
For each test, a laboratory receives 2 PE samples for each
certified analyte. On two consecutive tests, a laboratory must obtain a
passing score of at least 75% on the 4 samples analyzed, calculated by
applying the following formula.
[The scoring formula (graphic TN02DE94.012) is not reproduced in this text.]
Hence, the laboratory must obtain at least two good results plus
two marginal results, or three good results plus one unsatisfactory
result, over two consecutive tests.
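To make the arithmetic concrete, the following is a minimal sketch of
one plausible reading of this scoring scheme. Because the formula itself
is not reproduced here, the weights assumed below (good = 1.0,
marginal = 0.5, unsatisfactory = 0.0) are inferred from the two worked
examples above and are an assumption, not the official NELAC formula;
the classification step also ignores the special rule for blank samples,
and the names classify and score are illustrative.

# Illustrative sketch only: the weights are inferred from the worked
# examples in the text, since the official formula graphic is not
# reproduced; the blank-sample rule is omitted for simplicity.
def classify(result, mean, ci95_halfwidth, ci99_halfwidth):
    """Classify a single PE result against the study mean.

    ci95_halfwidth and ci99_halfwidth are the half-widths of the 95%
    and 99% confidence intervals about the mean (a hypothetical
    parameterization of the intervals described in the text)."""
    if abs(result - mean) <= ci95_halfwidth:
        return "good"
    if abs(result - mean) <= ci99_halfwidth:
        return "marginal"
    return "unsatisfactory"

def score(classifications):
    """Percent score over the samples from two consecutive tests."""
    weights = {"good": 1.0, "marginal": 0.5, "unsatisfactory": 0.0}
    return 100.0 * sum(weights[c] for c in classifications) / len(classifications)

# The two worked examples in the text both come out to exactly 75%:
assert score(["good", "good", "marginal", "marginal"]) == 75.0
assert score(["good", "good", "good", "unsatisfactory"]) == 75.0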
In response to the accreditation program guidelines, certain
chemistry analytes are scored by taking fixed intervals about the known
target value, where good performance is defined as a result within
those fixed target intervals, and unsatisfactory performance is any
other result.
For the potable water total coliforms, where qualitative analysis
is required (i.e., presence/absence), a laboratory is required to
maintain an average passing score of 90% on two consecutive tests.
Laboratories being tested for the determination of radon in air are
required to submit 5 sampling devices to the PE testing program. Four
of these are exposed to a known concentration in a standard atmosphere
exposure chamber, and the remaining device is left unexposed as a
``blank''. The devices are then returned to the laboratories for
analysis, and they are required to report results within 25% of the
target value on at least 4 of the 5 devices.
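As a brief illustration, the following sketch checks the radon
acceptance criterion described above (results within 25% of the target
value on at least 4 of the 5 devices). How the unexposed blank is scored
and the name radon_event_passes are hypothetical simplifications.

# Illustrative sketch only: a minimal check of the 4-of-5-devices
# criterion; treatment of the unexposed blank is an assumption.
def radon_event_passes(reported, targets, tolerance=0.25):
    """Return True if at least 4 of the 5 devices are within tolerance.

    reported and targets are parallel lists, one entry per device."""
    within = sum(1 for r, t in zip(reported, targets)
                 if abs(r - t) <= tolerance * abs(t))
    return within >= 4

# Example with arbitrary units: four exposed devices near a target of
# 100 and a blank reported as zero would pass.
print(radon_event_passes([96, 104, 110, 80, 0], [100, 100, 100, 100, 0]))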
2.5 Successful Participation
Each laboratory must successfully participate in a PE testing
program approved by NELAP for each field of testing in which the
laboratory is accredited. If a laboratory's accreditation is suspended
or revoked because it fails to participate in PE testing for one or
more fields of testing, or voluntarily withdraws its accreditation for
the failed field of testing, the laboratory must then demonstrate
satisfactory performance on two consecutive PE test events, one of
which may be onsite, before the primary accreditor will consider it for
reinstatement.
Laboratories shall agree to test additional samples at the option
of the primary accreditor for the following situations:
(a) A major change in ownership or supervision of the laboratory;
(b) Complaints by users or employees;
(c) Unsatisfactory performance on most recent PE test event; or
(d) Request by the laboratory to be reinstated in a field of
testing.
Failure to participate in a PE test event shall result in an
automatic rating of unsuccessful performance and a score of zero for
the PE test event. Consideration may be given to those
laboratories failing to participate in a PE test event only if:
(a) Routine testing was suspended during the time frame allotted
for testing and reporting PE test results; and
(b) The laboratory notifies the primary accreditor and the PE
testing program within the time frame for submitting PE test results of
the suspension of routine testing and the circumstances associated with
failure to perform tests on PE samples.
Failure to return PE test results to the PE program within the time
frame specified by the program is unsuccessful performance and results
in a score of zero for the PE test event. The PE testing program will
specify the conditions and procedures for late submissions, e.g., lost
or broken samples. For those late submission categories, the
participant will be allowed to test the samples on an alternate
schedule.
For any unsatisfactory PE test event for reasons other than a
failure to participate, the laboratory must undertake appropriate
training and employ the technical assistance necessary to correct
problems associated with a PE test failure.
Remedial action must be taken and documented, and the documentation
must be maintained by the laboratory for five years from the date of
participation in the PE test event. Failure to achieve an overall PE
test event passing score for two consecutive PE test events or two out
of three consecutive PE test events is unsuccessful performance.
3.0 On-Site Assessment
3.1 Introduction
The on-site assessment is an integral part of a laboratory accreditation
program and will be one of the primary means of determining a
laboratory's capabilities and qualifications. During the on-site
assessment, the assessment team will collect information and make
observations which will be used to evaluate the laboratory's
conformance with established accreditation criteria. It is essential
that the on-site assessment be conducted in a uniform, consistent
manner throughout the nation to facilitate reciprocity among States,
and for the laboratory community to accept the accreditation process.
This section contains proposals and recommendations for conducting on-
site assessments.
3.2 On-Site Assessment Personnel
3.2.1 Training
The National Environmental Laboratory Accreditation Conference
(NELAC) will specify the minimum level of education and training for
assessors, including refresher/update training. The NELAC will also
develop criteria for training requirements. The assessor training
course will be developed and implemented by EPA, NIST, or a non-Federal
entity with oversight by EPA. A state may develop and implement its
own assessor training program, subject to EPA oversight, if the state
program can meet the NELAC standards.
3.2.2 Qualifications
A laboratory assessor may work for a Federal, State, or a third
party accrediting body. An assessor, including each member of an
inspection team, must be an experienced professional and hold at least
a B.S. degree, or equivalent education and experience, in the specific
discipline being evaluated. Each assessor must also have satisfactorily
completed a laboratory accreditation training course and a health and
safety training course, and take periodic update/refresher training, as
specified by NELAC. Each new candidate assessor must undergo on-the-job
training during one or more inspections until judged proficient.
3.2.3 Additional Qualifications
In addition, the assessors must:
(a) Be familiar with the relevant legal regulations, accreditation
procedures, and accreditation requirements;
(b) Have a thorough knowledge of the relevant assessment methods
and assessment documents;
(c) Be technically conversant with the specific tests or types of
tests for which the accreditation is sought and, where relevant, with
the associated sampling procedures;
(d) Be able to communicate effectively, both orally and in writing;
and
(e) Be free of any commercial interest that might cause the
assessor to act in other than an impartial or nondiscriminatory manner.
3.2.4 Assessor Certification
Before an assessor can conduct on-site inspections, the individual
must be certified to do so, in writing, by either the NELAP or the State in
which the individual will assess laboratories. For each laboratory
inspection performed by a state-designated third party assessor (i.e.
non-EPA, non-State), the assessor must sign a statement before the
inspection, certifying that no conflict of interest exists.
3.3 Frequency of On-Site Assessments
3.3.1 Frequency
Accreditors should perform a routine on-site assessment at least
annually. Assessments may be more frequent at laboratories where a
problem exists, including complaints about laboratory quality,
questions of fraud, or recurring failure on performance evaluation
samples.
3.3.2 Follow-Up Evaluations
In addition to routine evaluations, assessors may need to conduct
one-time follow-up evaluations at laboratories where a significant
deficiency was identified by the previous evaluation. These evaluations
may be limited to determining whether a laboratory has corrected its
deficiency(ies), or determining the merit of a formal appeal from the
laboratory. When deficiencies may result in downgrading of
accreditation status, follow-up evaluations should occur as soon as
possible but no later than 60 days after the original evaluation.
3.3.3 Changes in Laboratory Capabilities
The accrediting authority may also deem necessary a limited one-
time evaluation when a major change occurs in laboratory personnel,
equipment, or location that might impair analytical/biological
capability and quality. A major change in
personnel is defined as the loss or replacement of the laboratory
management staff, or loss of a trained and experienced individual who
performs a particular test for which accreditation has been granted.
3.3.4 Announced and Unannounced Visits
The accrediting authority is not required to provide advance notice
of an assessment. However, the policy is to provide such notification,
based on the circumstances of the particular assessment and laboratory.
Since these highly technical assessments may involve sensitive
information and because there is a need to ensure that appropriate
personnel and records are available for assessment, the testing
laboratory usually is notified in advance of a planned assessment. The
accrediting authority, at its discretion, may conduct unannounced
evaluations for cause (e.g., questions of fraud, tips, complaints, or
problems with performance evaluation samples) or as part of a routine
practice.
3.4 Pre-Assessment Procedures
3.4.1 Introduction
A good assessment begins with planning, which should commence well
before the assessment team visits the laboratory.
Planning is the means by which the lead assessor identifies all the
required activities to be completed during the assessment process.
These activities include obtaining records before the assessment,
conducting the assessment, writing reports and following up.
Pre-assessment activities include: deciding the scope of the
assessment (Section 3.4.2); assessment planning (Section 3.4.3);
reviewing NELAP/State information (Section 3.4.4); providing advance
notification of the assessment to the laboratory (Section 3.4.5);
coordinating the assessment team (Section 3.4.6); and gathering
assessment documents and equipment (Section 3.4.7). Section 3.4.8
discusses Confidential Business Information issues.
3.4.2 Scope of the Assessment
The first step in the assessment planning process is deciding what
type of assessment will be conducted. The assessments usually include a
laboratory evaluation and a records review.
3.4.2.1 Laboratory Evaluations
A laboratory assessment obtains a ``snapshot in time'' at a testing
laboratory by evaluating what activities are being conducted when the
assessment takes place. During a laboratory evaluation, the assessment
team may identify a number of samples or a recently completed or on-
going project and evaluate to what extent the tests are being conducted
according to NELAP or client requirements.
3.4.2.2 Records Review
The purpose of a records review is to determine whether the testing
laboratory has maintained data and other information necessary to
support reports previously issued. During a records review, team
members will conduct an overall audit of data, and will compare data
with submitted reports to determine whether the data were generated or
collected following the proper procedures in the NELAP/State, EPA, or
client requirements.
3.4.3 Assessment Planning
Planning includes conducting a thorough review, prior to the
assessment, of NELAP and/or State records pertaining to the laboratory
to be inspected. This will save time because familiarity with the
operation, history, and compliance status of the laboratory increases
the efficiency and focus of an on-site visit. Planning also promotes a
better relationship with the laboratory community because the lead
assessor will be better able to answer questions concerning the
application of NELAP/State requirements to a particular laboratory. It
also enhances the laboratory's confidence in the lead assessor and aids
in establishing good relationships with laboratory representatives.
Another important benefit of planning is to enhance the lead
assessor's ability to identify and document potential problems and plan
to collect necessary information to assist the accrediting authority in
its subsequent decisions concerning the laboratory. Planning an
assessment will result in an efficient and productive assessment
overall.
3.4.4 Reviewing NELAP/State Information
The lead assessor's responsibilities start with receipt of the
Assessment Assignment. For a records review, copies of all appropriate
documents related to the laboratory will be forwarded by the
accrediting authority to the lead assessor or directly to a team
member, if appropriate, ideally at least six weeks prior to the start
of the assessment. The lead assessor should request any other
information that will be useful in preparing for the assessment. Such
information may include:
(a) Copies of previous assessment reports and PE sample results;
(b) General laboratory information such as laboratory submitted
self-assessment forms, SOPs and Quality Assurance plan;
(c) Correspondence with laboratory personnel;
(d) Discussion with appropriate NELAP/State staff;
(e) Available documents from recipients of reports from the
laboratory; and
(f) Relevant program documents such as NELAP/State guidelines or
SOPs.
3.4.5 Providing Advance Notification
No fewer than two weeks prior to an announced assessment, the
accrediting body will contact the responsible management official at
the laboratory to schedule the assessment. The initial telephone
notification will be confirmed by a notification letter. A copy of the
notification letter also will be given to the lead assessor. An
assessment assignment that gives the name and telephone number of the
laboratory contact person and of each assessment team member, as well
as other available information necessary to the planning and conduct of
the assessment will also be provided to the lead assessor.
Once the laboratory has been notified by the accrediting authority
that an assessment will be conducted, the primary responsibility for
the conduct of the assessment passes to the lead assessor. Any further
communications with the laboratory personnel should be made by the lead
assessor. The lead assessor should keep his/her supervisory personnel
informed of the status of the assessment, and should consult with them
on any substantive problems that may arise or changes that may be
required.
There are several items to be addressed in the advance
notification. The lead assessor should make note of when and to whom
advance notification was provided. Written advance notification should
do the following:
(a) Introduce the lead assessor and team members to the laboratory;
(b) Schedule the assessment, including establishing time of
arrival;
(c) Obtain verbal agreement for entry;
(d) Confirm the appropriate address for the assessment, including
identifying the location of necessary records, as specified in the
assessment plan;
(e) Ensure that laboratory personnel are available to accompany
assessors during the assessment;
(f) Encourage the laboratory to transfer all records to the
assessment site before the assessment;
(g) Obtain directions to the laboratory; and
(h) Allow discussion of problems, concerns, or questions about the
assessment or any other issues.
Especially when the laboratory has not previously been assessed by
the accrediting authority, the lead assessor should be certain that
laboratory personnel are aware of what an assessment involves, what
data and records should be made available and what personnel should be
present. If the laboratory representative does not cooperate, the lead
assessor's supervisor and the accrediting authority management should
be consulted for instructions on how to proceed.
3.4.6 Assessment Team Coordination
When the identity of the assessment team is known, the lead
assessor should contact each person and begin planning the conduct of
the assessment. As early as possible the lead assessor should:
(a) Coordinate travel plans, including the hotel and transportation
arrangements;
(b) Notify each team member of the dates of the assessment and pre-
assessment team meeting;
(c) Ensure that each team member has been briefed on specific
procedures for the assessment;
(d) Define the time allotted for the assessment. The lead assessor
should be careful not to underestimate the time needed to conduct the
assessment; and
(e) Confirm that the individuals who will be conducting the
records review are familiar with the records to be reviewed. Each
member of the assessment team should be aware of his or her
responsibilities during the assessment.
The lead assessor should also arrange to provide copies of
applicable NELAP/State standard operating procedures (SOPs) to team
members who do not already possess these documents. In addition, the
lead assessor may need to assure that the assessment team is aware of
proper procedures for receipt and handling of confidential business
information (CBI). The lead assessor should determine the level of
experience of each team member in conducting laboratory evaluations or
records reviews under NELAP/State requirements. The lead assessor may
need to guide less experienced team members, both prior to and during
the assessment as well as with report preparation. The lead assessor
should assemble the team just prior to the assessment to attend to last
minute details.
3.4.7 Gathering Assessment Documents and Equipment
Besides preparing the assessment plan and reviewing accrediting
body records and laboratory submissions prior to conducting the
assessment, the lead assessor should gather and prepare the necessary
documents and equipment to be used during the assessment. No single
list of documents and equipment can be appropriate for all assessments.
The lead assessor's experience in the field and information obtained
during pre-assessment planning should assist in preparing lists
tailored to specific assessment sites and needs. Specific needs will be
determined by the requirements of the assessment, the availability of
equipment, conditions at the laboratory, NELAP/State policies, and
whether advance notification of an assessment is given.
3.4.7.1 Types of Documents
Documents necessary for the assessment should be prepared before
the assessment, whenever possible. The lead assessor should obtain
copies of the required assessment forms. Several spare copies of each
form should always be carried. Assessments may require:
--Notice of assessment;
--Assessment confidentiality notice;
--Conflict of interest form;
--Assessor credentials;
--Assessment assignment;
--Assessment notification letter;
--Attendance sheet, opening and closing conference; and
--Assessment appraisal form.
In addition, the lead assessor should be certain to take the
following documents and materials on an assessment:
(a) Copies of NELAP/State requirements. Lead assessors should have
copies of the applicable NELAP/State requirements available upon
request. Having such data available can help improve the relationship
between NELAP/State and the laboratory community, which can foster
better laboratory compliance;
(b) NELAP/State checklists for evaluations;
(c) NELAP/State outreach materials. Lead assessors should provide
current, relevant educational, and/or guidance information to
laboratory officials upon request or as deemed appropriate by the lead
assessor; and
(d) Administrative information. Travel authorizations and telephone
numbers of travel and procurement personnel who may need to be
contacted should be taken by the lead assessor when on travel.
3.4.7.2 Assessment Equipment
The types of equipment that a lead assessor takes to an assessment
site will vary from assessment to assessment, depending upon the nature
and extent of the assessment and the type of testing laboratory to be
inspected. Therefore, prior to each assessment, the lead assessor
should check the equipment to make sure that it is in good working
condition. Since each assessment is unique, no single list of equipment
or forms can be devised that will fit every assessment situation.
3.4.8 Confidential Business Information Considerations
NELAP/State SOPs protect Confidential Business Information (CBI)
from disclosure. CBI includes trade secrets (including process,
formulation, or production data) and certain financial information, the
uncontrolled disclosure of which could cause damage to a laboratory's
competitive position. In general, disclosure of CBI is prohibited,
except in certain limited situations.
The lead assessor should keep in mind that information obtained
from a laboratory during an assessment can, for the most part, be
disclosed in response to a request from the public, or other requesting
party, under Federal or State Freedom of Information requirements.
However, if the data has been properly claimed as CBI, it may not
generally be disclosed under these requirements.
A lead assessor must present notice to laboratory representatives
of their right to claim data at the laboratory as CBI, and such claims
are frequently made. Because the lead assessor is very likely to
require access to CBI before (i.e., while preparing for an assessment),
during, and after an assessment, the lead assessor must be
knowledgeable of NELAP/State procedures governing access to, handling
of, and disclosure of CBI. The lead assessor and others who may use the
information must have CBI access authorization, since only authorized
individuals may have access to CBI. A CBI-cleared lead assessor may
obtain access to CBI documents from the accrediting authority by
requesting access to the information from the appropriate official.
Whether or not it is anticipated that CBI documents will be
collected during an assessment, the lead assessor must provide a NELAP/
State assessment confidentiality notice to the responsible laboratory
official at the beginning of the assessment. This notice informs
laboratory officials of their right to claim part of the assessment
data as CBI. The lead assessor should be familiar with the procedures
for asserting a CBI claim, and the criteria that the claimed
information must meet.
The lead assessor must take custody of all CBI documents before
leaving the laboratory, and must maintain them in custody, using all
proper procedures and safeguards, until they can be received by the
accrediting authority.
3.5 Assessment Schedule/Format
3.5.1 Length of Evaluation
The length of an on-site assessment will depend upon a number of
factors, such as the number of tests evaluated, the number of assessors
available, the size of the laboratory, the number of problems
encountered during the assessment, and the cooperativeness of the
laboratory staff. The accrediting body should assign an adequate number
of assessors to complete the evaluation within a reasonable period of
time. Assessors must strike a balance between thoroughness and
practicality, assuring that the assessment covers all aspects of the
laboratory operation.
3.5.2 Opening Conference
Arrival at the facility should occur during normal working hours.
The facility representative should be located as soon as the assessment
team arrives on the premises. A laboratory's refusal to admit the
assessment team for an evaluation may result in an automatic failure or
loss of accreditation on the part of the laboratory, unless there are
extenuating circumstances that are accepted by the accreditation body.
The team leader should notify the accrediting body as soon as possible
after refusal of entry.
When the appropriate official has been located, the team leader
should introduce the team and should present credentials. Many
companies require that the assessment team sign a visitor's sheet that
records the name, time, reason for visit, organization, etc., and team
members should do so. However, assessment team members should not sign
any ``visitor's release'' or ``waiver'' that would relieve the company
of responsibility for injury or that would limit the rights of the
accrediting body to use the data obtained. If
such a waiver or release is presented, the team leader should politely
explain that they cannot sign and request a blank sign-in sheet. The
assessment team leader should brief the appropriate responsible
official(s) of the facility to introduce team members, explain areas to
be evaluated and verify application information.
The assessment team leader should request relevant documents for
review that were not part of the application materials, such as
standard operating procedures, chain-of-custody forms, report forms,
etc.
The assessment appraisal form should be presented to the
appropriate laboratory official with a request that the form be
completed and returned to the accrediting authority after the
assessment. This form will allow feedback from the laboratory on the
manner in which the assessment was conducted.
3.5.3 Records Review
The records requested during the opening conference will be
reviewed by assessment team members for accuracy, completeness and
proper methodology for each area to be evaluated.
Trade secrets and confidential business information are protected
from public disclosure. The type of information that may be considered
confidential business information is defined in Title 40, Code of
Federal Regulations, Part 2. All financial and trade information should
be kept confidential, if so requested by the laboratory. All other
information for all aspects of application, assessment and
accreditation of laboratories is considered public information. If the
laboratory requests that information other than that noted above be
treated as confidential, the information should be treated as
confidential until a ruling can be made by the accreditation body.
The team leader must mark all confidential information received and
handle it as required by appropriate laws and regulations.
3.5.4 Staff Interviews
The assessment team will evaluate a test by having the individual
that normally conducts the specific procedure walk through the
procedure, including a step-by-step description of exactly what is done
and what equipment and supplies are employed. The assessor will note
and record the procedure on the standardized checklists for that
particular test and application. Any deficiencies shall also be noted
and discussed with the individual.
The assessment team members shall have the authority to conduct
interviews with any/all staff and, if necessary, conduct private
interviews. Calculations, data transfers, calibration procedures,
quality control/assurance practices and adherence to SOPs shall be
assessed for each test.
During the evaluation, sufficient information may become available
to indicate that a particular person has violated an environmental law
or regulation, such as knowingly making a false statement on a report.
This information should be carefully documented, since it may be used
in a legal action. When the possibility of additional legal
investigation exists, the assessor should not discuss the legal
implications of the suspected violation with the individual or any
laboratory representative. However, the assessor should continue to
gather the information necessary to complete the accreditation
assessment.
3.5.5 Closing Conference
The assessment team should meet with representatives following the
evaluation of the laboratory for an informal debriefing and discussion
of findings.
In the event the laboratory disagrees with the findings of the
assessor(s), and the team leader adheres to the original findings, the
area(s) protested shall be documented by the team leader and included
in the report to the accreditation body for consideration. The
accrediting authority will make the final determination.
The assessment team should provide the accreditation body with an
assessment report encompassing all relevant information concerning the
ability of the applicant laboratory to comply with the accreditation
requirements. If data is available from performance evaluation testing,
this should be included in the final report.
3.5.6 Follow-Up Procedures
The accrediting authority will issue the assessment report, which
outlines any areas of deficiency, to the applicant laboratory. The
applicant laboratory should then submit a plan of corrective action, if
necessary, and provide any missing documentation required within 45
days from the date of report receipt.
After reviewing documentation and corrective actions, the
accrediting authority will make the decision to pass, fail or provide
interim accreditation for a laboratory. If the deficiencies listed are
substantial or numerous, an additional assessment (possibly
unannounced) may be conducted before a final decision for accreditation
can be made.
3.6 Criteria for Assessment
3.6.1 Assessor's Manual
The NELAC will develop a manual(s) for on-site assessors to assure
that on-site assessments are performed in a uniform, consistent manner.
The manual(s) will be provided when assessors take the NELAC required
training (section 3.2.1) and will serve as guidance for on-site
assessment personnel.
The manual(s) provided to on-site assessors should include
instructions for evaluating the following items:
(a) Size, appearance, adequacy of the laboratory facility;
(b) Organization and management of the laboratory;
(c) Qualifications and experience of laboratory personnel;
(d) Receipt, tracking and handling of samples;
(e) Quantity, condition, performance of laboratory instrumentation
and equipment;
(f) Preparation and traceability of calibration standards;
(g) Analytical and biological methodology (including the
laboratory's standard operating procedures as well as confirmation of
individuals' adherence to SOPs, and the individual's proficiency with
the methodology);
(h) Data reduction procedures, including an examination of raw data
and confirmation that final reported results can be traced to the raw
data/original observations;
(i) Quality assurance/quality control procedures, including
adherence to the laboratory's quality assurance plan and adequacy of
the plan;
(j) General health and safety procedures as they relate to good
scientific practices;
(k) Laboratory waste disposal procedures;
(l) Environmental and toxicological test methods and SOPs; and
(m) Care, use, and maintenance of test organisms.
3.6.2 Assessors Role
When performing an on-site laboratory evaluation, the assessor must
appraise each of the areas listed in section 3.6.1. The on-site
assessor should use a variety of tools in the evaluation process. The
experience of the assessor, his/her observations, interviews with
laboratory staff, and examination of SOPs, raw data, and the
laboratory's documentation will all play an important role in the
assessment. The role of the on-site assessor is a critical one in the
entire laboratory accreditation process. The accreditation of a
particular laboratory will depend to a large extent on the assessor's
recommendation. While much of the on-site assessment will depend upon
the assessor's judgment, the recommendation not to accredit a
laboratory must be based on factual information, not on opinions or
suppositions. Therefore it is crucial that the on-site assessor have a
clear understanding of the laboratory's procedures and policies, and
that the assessor document any deficiencies. Also the assessor should
discuss any deficiencies with the laboratory's management in order to
allow them to provide additional information which might affect the
assessor's recommendations.
3.6.3 Checklists
Standardized checklists for the on-site assessment must be used.
The use of checklists does not eliminate the need for additional
observations and staff interviews; the checklist is merely another tool
in the assessor's inventory that assists in conducting a thorough and
efficient evaluation. A checklist must not be used as a substitute for
assessor training and experience.
Note: It is anticipated that standardized checklists will be
developed or adopted by NELAC's On-Site Assessment Committee for the
assessor's review of analytical and biological methodology.
3.6.4 Evaluation Criteria
The following considerations should be taken into account by on-
site assessors when evaluating the areas listed in section 3.6.1:
3.6.4.1 Facility Assessment
The assessor(s) should tour the laboratory facility with the
laboratory management representative. Usually the tour will occur
during the initial phase of the on-site visit, perhaps after the
opening conference. During the tour, the assessor should visually
inspect the facility with respect to general housekeeping, cleanliness,
lighting, bench space and continuous temperature monitoring (if
required). The assessor should note whether the appropriate laboratory
services (e.g., vacuum system, compressed air, gases, etc.) are
available. It may be necessary to have the laboratory representative
demonstrate that certain pieces of equipment are working properly, for
example, a fume hood may be turned on to assure that it does indeed
exhaust air from the laboratory. This type of demonstration is not
intended to certify that the hood meets design specifications or safety
requirements, but merely that it is operational. During the tour, the
assessor(s) should determine if sample storage areas are sufficient and
whether there are problems with laboratory operations which would
affect data quality. For example, an extraction operation located in
the same room where volatile organic analyses are performed could
contribute contamination to the volatile organic analyses.
Any problems or deficiencies with the laboratory facility should be
brought to the attention of the laboratory management at the time of
the tour and reinforced at the closing conference. If discrepancies are
noted between statements made by the laboratory representative and
visual observations, it may be necessary to interview other laboratory
personnel to obtain an explanation of the situation. As with all areas
of the on-site assessment, the experience and training of the on-site
assessor are critical to the success of the facilities evaluation.
3.6.4.2 Organization Assessment
The assessor should review laboratory QA plans, SOPs,
organizational charts and/or other documentation to determine the
laboratory's operational structure. If a documented organizational plan
exists, the assessor should ascertain during subsequent interviews with
laboratory personnel if the laboratory operation follows the documented
plan. The assessor should interview laboratory management to determine
the roles of management and how laboratory policy is created. The
absence of a documented organizational structure, clearly defined
functional responsibilities, and lines of communication, should be
considered a deficiency.
3.6.4.3 Personnel Assessment
The assessor should review the laboratory's written qualification
requirements for each position, and the qualifications of those persons
currently holding the positions. Key personnel, e.g., laboratory
management staff, quality assurance coordinator, section managers,
chief analysts, etc., should be interviewed to verify their
qualifications for their positions. These interviews may be conducted
concurrently with interviews on analytical and biological procedures,
quality control requirements, etc., in order to expedite the process.
The assessor should be cautious when making judgments on personnel
qualifications, and must be aware that experience may be an acceptable
substitute for formal education. When in doubt concerning personnel
qualifications, the assessor should conduct an in-depth interview with
the individual to determine his/her expertise in a given area.
Note: Section 5, Quality Systems, contains details on personnel
qualifications.
3.6.4.4 Sample Handling Assessment
The assessor should review the laboratory's SOP for sample receipt
to assure that all appropriate elements (e.g., proper sample
containers, preservatives, chain of custody, sample storage, sample
rejection policy, etc.) are included. Any omissions should be brought
to the attention of the laboratory management and appropriate
laboratory staff person. Absence of a written sample receipt SOP should
be considered a serious deficiency. The assessor should inspect the
sample storage areas to ensure that the facilities are adequate and
secured. Cold storage facilities should be checked for maintenance of
proper temperatures, proper monitoring devices (thermometers, etc.) and
appropriate documentation. Sample receipt personnel should be
interviewed to determine their adherence to the SOP. Sample receipt
documentation and chain-of-custody records should be reviewed to
determine if documentation is adequate. Failure to follow SOPs may be
considered a serious deficiency, depending on the degree of deviation.
Failure to keep sample receipt and chain-of-custody documentation
should be considered a serious deficiency.
3.6.4.5 Equipment Assessment
The assessor should determine if the laboratory has all equipment
and instrumentation necessary to perform the analyses for which
accreditation is requested. This determination should be performed by
visual inspection of the laboratory. The assessor should determine if
the equipment is in reasonable working condition. An actual
demonstration of equipment performance is not necessary in all
circumstances, but should be required if the assessor has doubts about
the condition of certain pieces of equipment. The absence of a required
piece of equipment or instrument for a particular test should be
considered a serious deficiency. The assessor should determine if the
laboratory has written records of equipment repairs, maintenance,
testing and calibration.
3.6.4.6 Calibration Standards Assessment
The assessor shall ascertain whether the laboratory has the
necessary stock calibration standards and should spot check calibration
standards to see if they are within expiration dates. The assessor
should determine if stock standards are properly stored, e.g., volatile
organic standards are stored in sealed vials in a freezer. The assessor
should examine the laboratory's records for stock standards and the
preparation of working standards to determine if the records are
complete.
3.6.4.7 Methodology Assessment
The assessor should determine whether the laboratory has standard
operating procedures for all test methods used by the laboratory. The
standard operating procedures should be reviewed to determine if they
adequately address all aspects of the analytical and biological
procedures, e.g., sample preparation, calibration standard preparation,
instrument calibration, etc. The analysts should be interviewed to
verify that they have access to and are following the standard
operating procedures for all methods. The lack of analytical and
biological standard operating procedures or significant deviations from
the standard operating procedures should be considered as serious
deficiencies.
While the ideal on-site assessment would consist, in part, of
observing each individual perform his/her assigned work, time
considerations will not permit this approach in a laboratory which
conducts a wide variety of analytical or biological procedures.
Consequently, the on-site assessor will need to rely more heavily on
interviews with laboratory personnel, observations, and review of
records to determine proficiency with, and knowledge of, the analytical
or biological methodology. The assessor's experience and training will
play a key role in this process.
The assessor should be familiar with the performance of a test, so
that the appropriate technical questions may be asked of the
laboratory's analysts. The assessor should pose questions to the
laboratory's staff in such a way as to not lead the individual into the
correct response. The individual's responses should be cross-checked
with the laboratory's documentation. During interviews with the
individuals, it may be unclear as to how the analytical and biological
procedures are being performed. If this occurs, then the assessor
should ask the individual to demonstrate the procedure.
3.6.4.8 Data Audit
The assessor should perform a data audit on an appropriate number
of sample sets which contain all the tests for which the laboratory is
seeking accreditation. It may be necessary to audit multiple sample
sets in order to cover all tests. The assessor should verify that the
required sample receipt documentation and chain-of-custody records are
on file and that they contain all necessary information. The assessor
should obtain final data reports for the sample set being audited. The
assessor should verify that the final reports contain the following
information:
--Sample receipt date;
--Sample analysis date;
--Sample identification;
--Method used for analysis;
--Quantitation units, e.g., mg/L, mg/Kg, g/m3, etc.;
--If sample is a solid, whether results are calculated on a wet weight
or dry weight basis, and if on a dry weight basis, the percent moisture
or percent solids;
--The sample result (if the result is none detected, the method
detection limit should also be reported); and
--Method of statistical determination of test result, if applicable.
The assessor should assure that all information needed to verify
the final result is on file, including reasons for invalidating testing
results if this has occurred. The information may include sample
preparation data, instrument output (chromatograms, mass spectra, strip
charts), instrument calibration records, and records of dilutions. Once
the information is located, the assessor should recreate the
calculation in order to verify the final reported result. The absence
of the required information needed to verify the final result should be
considered a serious deficiency. If the assessor is unable to recreate
a calculation, the problem should be discussed with laboratory
personnel in an attempt to resolve the issue. If any calculations/final
results are determined to be incorrect, the assessor should examine
approximately ten percent of the data for the test in question over a
selected time period to see if a systematic error has occurred.
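Note: The completeness and recalculation checks described above lend
themselves to a scripted audit. The following Python sketch is
illustrative only; the report field names, the simple response and
calibration-factor calculation, and the comparison tolerance are
assumptions and will differ with each laboratory's raw-data formats and
methods.

import random

# Hypothetical field names; actual final-report formats vary by laboratory.
REQUIRED_FIELDS = ["sample_receipt_date", "sample_analysis_date",
                   "sample_id", "method", "units", "result"]

def missing_fields(report):
    """Return any required final-report fields that are absent or empty."""
    return [f for f in REQUIRED_FIELDS if not report.get(f)]

def recalculated_result(raw):
    """Recreate a reported concentration from raw data.  Assumes a simple
    external-standard calculation; real methods may add internal standards,
    dry-weight corrections, etc."""
    return raw["response"] / raw["calibration_factor"] * raw["dilution_factor"]

def audit_sample_set(reports, raw_data, tolerance=0.01):
    """Flag reports with missing fields or results that cannot be reproduced
    from the raw data within a relative tolerance."""
    findings = []
    for rpt in reports:
        gaps = missing_fields(rpt)
        if gaps:
            findings.append((rpt.get("sample_id"), "missing fields: %s" % gaps))
            continue
        recalc = recalculated_result(raw_data[rpt["sample_id"]])
        if abs(recalc - rpt["result"]) > tolerance * abs(rpt["result"]):
            findings.append((rpt["sample_id"],
                             "reported %s, recalculated %.4g" % (rpt["result"], recalc)))
    return findings

def ten_percent_subset(all_reports):
    """Select roughly ten percent of the data for expanded review when an
    incorrect result suggests a possible systematic error."""
    k = max(1, round(0.10 * len(all_reports)))
    return random.sample(all_reports, k)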
In addition to auditing results from routine sample analyses,
assessors must also audit results of performance evaluation (PE)
samples analyzed by the laboratory for the NELAP. Assessors should
verify that the sample(s) were analyzed using the criteria set forth by
NELAP. The data generated during the analysis of PE samples should be
examined and compared with final results reported to the NELAP.
3.6.4.9 QA Plan Assessment
The assessor should examine the laboratory's written QA Plan to
determine if it conforms to the Quality Systems requirements in Section
5.0. The assessor should examine the laboratory's raw data to ascertain
if the required QC checks have been documented. If QC criteria were
exceeded, the assessor must determine if corrective action was
initiated. Laboratory personnel should be interviewed to determine if
they understand and follow the requirements of the QA Plan. Laboratory
management should be interviewed to determine their commitment to the
QA program. The absence of a QA Plan, or an incomplete QA Plan, should
be considered a major deficiency. The lack of appropriate corrective
action or documentation of corrective action should be considered a
serious deficiency.
3.6.4.10 General Health and Safety Procedures
The responsibility for promulgating and enforcing occupational
safety and health standards rests with the U.S. Department of Labor.\1\
While it is not within the scope of the assessment team to evaluate all
health and safety regulations, any obviously unsafe condition(s) should
be described to the appropriate laboratory official, and reported to
the appropriate state or federal agency. The accreditation on-site
assessment is not intended to certify that the laboratory is in
compliance with all applicable health and safety regulations.
---------------------------------------------------------------------------
\1\Handbook for Analytical Quality Control in Water and
Wastewater Laboratories, EPA-600/4-79-019, March 1979.
---------------------------------------------------------------------------
3.6.4.11 Laboratory Waste Disposal Assessment
The assessor(s) should ask if adequate facilities are available for
the collection, storage and/or treatment (if applicable) of all
laboratory wastes. The waste disposal system(s) should be operated in
such a manner to protect the air, water, and land by minimizing and
controlling all releases from fume hoods and bench operations.
Compliance is also required with any wastewater discharge permits and
regulations. It is the laboratory's responsibility to comply with all
federal, state, and local regulations governing waste management,
particularly the hazardous waste regulations. The accreditation on-site
assessment is not intended to certify that the laboratory is in
compliance with all applicable waste disposal regulations.
3.7 Documentation of On-Site Assessment
3.7.1 Checklists
The checklists used by the assessors during the assessment should
become a part of the permanent file kept by the NELAP/State on each
laboratory.
3.7.2 Report Format
Evaluation reports should be generated in a narrative format,
allowing for differences in style and technique between accrediting
authorities. At a minimum, deficiencies must be addressed; however,
documentation of positive aspects should also be included. Documentation of
existing conditions at the laboratory should be included in each report
to serve as a baseline for future contacts with the facility.
3.7.3 Distribution
The accrediting authority should be recognized as having the
responsibility for the content of the evaluation reports. The team
leader should compile, edit and submit the final report to the
accrediting authority. The team leader must assure that the results
within the final report conform to established criteria for the
evaluated parameters.
3.7.4 Report Deadline
No longer than thirty (30) days should elapse from the last day of
an on-site evaluation until the report is submitted to the accrediting
authority for review and final decision.
3.7.5 Release of Report
On-site evaluation reports should be released by the accrediting
authority only. The reports will be released to the management of the
affected laboratory and to those persons nominated by the laboratory to
receive a copy of the report. The assessment report shall not be
released until the assessment and all other appropriate action has been
completed. In accordance with the Freedom of Information requirements,
any documentation adjudged to be proprietary, financial and/or trade
information will be considered exempt from release to the public.
3.7.6 Report Storage Time
At a minimum, copies of all evaluation reports must be retained by
the evaluators and the accrediting authority for a period of five
years, or longer if required by regulation.
4.0 Accreditation Process
4.1 Components of Accreditation
These criteria must be fulfilled for accreditation. The components
and criteria are herein described.
4.1.1 Personnel Qualifications
This component ensures that the managerial and supervisory
personnel in the environmental laboratory meet a minimum set of
qualifications that address the elements of education, training and
experience. It should be recognized that some of these elements are
interchangeable, i.e., a greater amount of training and/or
experience may substitute for a lesser degree of formal education. Refer
to Quality Systems for a detailed review of supervisors and managers,
and the criteria to be maintained by the supervisors and managers for
awarding accreditation.
4.1.2 On-Site Assessments
On-site assessments and evaluations may be of two types: announced
and unannounced. The assessment ensures that the environmental
laboratory is capable of performing analyses to the level, precision
and accuracy required by the specific method or performance based
method. Announced assessments test these methods and evaluate the
results against the criteria under the best circumstances in a
controlled environment. The unannounced assessment measures the
abilities of the laboratory to meet these standards for methods on an
average day under normal working conditions and in a normal working
environment. Each type of assessment has limitations and advantages,
but the information obtained from both will provide a higher degree of
confidence in the ability of the laboratory to attain a required level
of competence in the quality of data produced for regulatory and
compliance purposes. Refer to on-site assessment for additional
information regarding frequency, procedures, criteria, scheduling and
documentation of on-site assessments.
Announced Assessments--The elements present in and criteria for
announced assessments for national accreditation are:
(a) The assessment must be performed a minimum of one time per year
and be conducted on-site; i.e., the site at which the actual analyses
take place;
(b) The assessment may consist of any or all of the categories for
which the laboratory wants to obtain accreditation;
(c) The inspector must have access to all information and data
requested both for analyses completed and laboratory personnel;
(d) The results of the assessment and the Performance Evaluation
sample analyses indicating satisfactory or unsatisfactory performance
will be sent to the National Database on environmental laboratories;
and
(e) At least two performance evaluation (PE) samples, twice per
year, for each method or field of testing, must be successfully
analyzed according to the standards established for quality assurance/
quality control, precision and accuracy. Analysis of PE samples during
the on-site assessment is not necessarily required. Marginal performance
on any previous PE samples can be grounds for requiring that a
subsequent PE sample analysis be performed under the observation of an
inspector.
Unannounced Assessments--The elements and criteria for the
unannounced assessments for the purpose of the national accreditation
program are:
(a) The inspector may not be denied immediate access to the
laboratory facility;
(b) Elements (a) through (d) under announced assessments are also
applicable to unannounced assessments;
(c) Performance evaluation samples may be distributed and analyses
run in the categories and for the methods that are determined by and
prescribed by the inspector; and
(d) All performance evaluation samples and other analyses required
by the inspector are to be done as directed by the inspector. These
include parameters such as: specified equipment, analysts and times,
but are not limited to these factors.
Factors Examined in Announced and Unannounced Laboratory
Assessments.--Refer to On-site Assessments for assessment criteria
required to be satisfied for accreditation. It should be noted that the
inspector is not limited to these factors in reaching an evaluation and
conclusion. Other factors may be considered and documented as
appropriate.
Laboratories will be furnished with an inspection report
documenting any deficiencies found in the factors listed above or any
others considered by the inspector. It shall also include whether a
specific method passed or failed based on the Performance Evaluation
sample. All such reports are public record and any or all of the
information contained therein may be put into the National Database.
Proprietary data will be excepted from all public records.
The laboratory will have no more than 45 days from the date of
receipt of the report to correct deficiencies noted in the inspection
report. At that time, if no remedial action has been taken to correct
the noted deficiencies, accreditation for categories or specific
methods within those categories will be immediately revoked.
4.1.3 Performance Evaluation Samples
A critical component of laboratory assessments is the analysis of
the Performance Evaluation Samples. Refer to Performance Evaluation
Testing, specifically Testing of Samples, for additional information
regarding separate treatment of Performance Evaluation samples and
discussion of issues of availability, purity and distribution.
Performance Evaluation samples would be used and evaluated in the
accreditation process in the following manner:
(a) All laboratories seeking National Accreditation must receive,
examine and analyze initial performance evaluation sample(s) for each
category (e.g., drinking water, hazardous waste, etc.) in which they
are requesting accreditation. The analysis must be completed and the
results reported to the performance evaluation testing organization or
the Inspector within 45 days of the receipt of the sample.
(b) Each laboratory seeking national accreditation shall also be
required to perform analyses on at least two performance evaluation
samples, two concentrations, two times per year in each category for
which they have applied for accreditation or for which the laboratory
is currently accredited.
(c) The laboratory will be informed of the results of the
performance evaluation sample analysis within 60 days of receipt by the
state agency or authorized third party contractor. The results of all
of the performance evaluation sample tests indicating satisfactory or
unsatisfactory compliance will be public record and will be recorded on
the national database.
(d) The results of the performance evaluation sample analysis will
be considered, along with other information obtained from announced
and/or unannounced assessments in determining whether accreditation
should be granted, denied or modified for a category, or whether the
laboratory should lose accreditation for a category or method within a
category.
4.1.4 Corrective Action Reports
The purpose of the corrective action report is to have a written
record of response to deficiencies that are noted in the laboratory
assessment procedure.
(a) After being notified of deficiencies from the laboratory
inspection, the laboratory has 45 days from the date of receipt of the
deficiency report to submit a corrective action report.
(b) The state authority or authorized third party contractor will
respond to the action noted in the corrective action report within 30
days of receiving it. The report must address each of the deficiencies
noted on the deficiency report.
(c) A laboratory can lose accreditation in a category or a method
within a category by any or all of the following items:
i. Failing to respond to corrective action two times;
ii. Failing to submit a corrective action report;
iii. Failing to address each item noted as a deficiency in the
corrective action report;
iv. Failing the same performance evaluation sample analysis two
consecutive times for the same analyte; or
v. Failing to achieve an overall testing event passing score for
two consecutive testing events or two out of three consecutive testing
events.
(d) All information included and documented in a deficiency report
and the corrective action report are considered to be public
information. Other states participating in the National Environmental
Laboratory Accreditation Program would have access to this information
through a national database. At a minimum, the database would include
the following information:
i. Name and location of laboratory;
ii. Number and dates of assessments performed and whether they were
announced or unannounced;
iii. Performance evaluation samples and analyses done, the date
completed and the status (in process, passed, or failed);
iv. Categories and methods for which the laboratory is currently
accredited and date of accreditation; and/or
v. Categories and method for which the laboratory has lost
accreditation and the date of loss of accreditation.
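Note: The database contents listed above, together with the loss-of-
accreditation criteria in item (c), can be pictured as a simple record
and rule check. The Python sketch below is illustrative only; the field
names, the representation of performance evaluation results, and the
passing-score logic are assumptions rather than a specification of the
national database.

from dataclasses import dataclass, field
from typing import List

@dataclass
class PEResult:
    """One performance evaluation analysis (hypothetical structure)."""
    analyte: str
    date_completed: str
    passed: bool          # result for this analyte
    event_passed: bool    # overall testing event passing score

@dataclass
class LabRecord:
    """Minimal sketch of a national-database entry for one laboratory."""
    name: str
    location: str
    assessments: List[str] = field(default_factory=list)    # dates, announced/unannounced
    pe_results: List[PEResult] = field(default_factory=list)
    accredited: List[str] = field(default_factory=list)     # categories/methods, date accredited
    lost: List[str] = field(default_factory=list)           # categories/methods, date lost

def fails_pe_criteria(results: List[PEResult]) -> bool:
    """Apply the PE-based loss criteria of 4.1.4(c) iv and v: the same analyte
    failed twice consecutively, or the overall testing event score failed two
    consecutive times or two out of three consecutive times."""
    by_analyte = {}
    for r in results:                       # results assumed to be in chronological order
        by_analyte.setdefault(r.analyte, []).append(r.passed)
    same_analyte_twice = any(
        not a and not b
        for seq in by_analyte.values() for a, b in zip(seq, seq[1:]))
    events = [r.event_passed for r in results]
    consecutive_events = any(not a and not b for a, b in zip(events, events[1:]))
    two_of_three = any(events[i:i + 3].count(False) >= 2
                       for i in range(len(events) - 2))
    return same_analyte_twice or consecutive_events or two_of_three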
4.1.5 Ethical Standards
Elements in a national program that ensure consistency and promote
the use of quality assurance/quality control procedures to generate
quality data for regulatory purposes are:
(a) NELAC requires that each laboratory seeking
national accreditation have a named Quality Assurance Officer. NELAC
strongly recommends that the Quality Assurance Officer be a person
other than any supervisor of laboratory analysts, who reports directly
to the laboratory management and not to the laboratory supervisor in
matters related to quality assurance and quality control of analyses,
methods relating to these analyses, and instrumentation.
(b) NELAC will consider that responsibility for falsification of
data, records or instrument parameters will rest upon the Quality
Assurance Officer (named in 4.1.5(a) above), the laboratory management
and the company.
(c) The National Environmental Laboratory Accreditation Program
shall establish a ``Laboratory Fraud Hotline'' telephone number.
Alleged cases of data, record or analytical fraud reported via this
hotline will be referred to the relevant state authority for
investigation. The fact that a federal or state agency has taken regulatory,
legal, or contractual action against a laboratory will be made
available on the national database.
4.1.6 Fee Process for National Accreditation
Refer to Policy and Structure, specifically funding of the program
1.7.3, regarding the funding of state accreditation programs, including
a fee structure covering the actual cost of an accreditation.
(a) The cost incurred in the application process for national
environmental laboratory accreditation will be called an accreditation
fee.
(b) Where required, accreditation fees will be paid to the state(s)
which grants accreditation to the laboratory. These fees must be paid
in accordance with existing state regulations, levels and practices.
(c) Failure to remit the accreditation fee within the time limit as
established by the individual state authority will be grounds for
immediate loss of accreditation in that state. The loss of
accreditation will immediately be entered in the national database.
4.1.7 Application Process
The National Environmental Laboratory Accreditation Program
encompasses a standardized set of elements in each application for
accreditation that will be reported to and recorded in the national
database. The application package includes any specific state
regulatory requirements that are essential for accreditation within an
individual state.
The application form for national environmental laboratory
accreditation shall include:
(a) Legal name of laboratory
(b) Laboratory mailing address
(c) Name of owner
(d) Location (full address) of laboratory
(e) Name and phone number of laboratory contact person
(f) Name and phone number of Quality Assurance Officer
(g) Name and phone number of laboratory management representative
(h) Laboratory hours of operation
(i) States for which the laboratory is requesting accreditation
(j) Categories for which the laboratory is requesting accreditation
(k) Description of laboratory type
--Commercial
--Federal
--Hospital or health care
--State
--University
--Public water system
--Public wastewater system
--Industrial (an industry with discharge permits)
--Other (Describe)____________
(l) Certification of compliance by laboratory management (vide
infra: 4.1.9)
4.1.8 Transfer of Ownership/Change of Ownership and/or Location of
Laboratory
Accreditation may be transferred when the legal status or ownership
of an accredited laboratory changes without affecting its staff,
equipment, and organization. The accrediting agency may charge a
transfer fee and shall conduct an on-site assessment to verify the effects
of such changes on laboratory performance.
The following conditions apply to the change in ownership and/or
the change in location of a laboratory that has national accreditation.
(a) Any change in ownership and/or location of an accredited
laboratory must be reported in writing to the primary state(s) and the
National Environmental Laboratory Accreditation Program within twenty
business days of such a change taking effect.
(b) Such a change in ownership and/or location will not necessarily
require reaccreditation or reapplication in any or all of the
categories in which the laboratory is currently accredited.
(c) Change in ownership and/or location may require a mandatory on-
site assessment with the elements of the assessment being determined by
the inspector.
(d) Any change in ownership must assure historical traceability of
the laboratory accreditation number(s).
(e) For a change in ownership, one of the following conditions must
be in effect:
i. The previous (transferring) owner must agree in writing, before
the transfer of ownership takes place, to be responsible for any
analyses, data and reports generated up to the time of legal transfer
of ownership; or
ii. The buyer (transferee) must agree in writing to be responsible
for any analyses, data and reports generated before the legal transfer
of ownership occurs.
4.1.9 ``Certification of Compliance'' Statement
The following ``Certification of Compliance'' statement must
accompany the application for laboratory accreditation. It must be
signed and dated by both the laboratory management and the quality
assurance officer for that laboratory.
Certification By Applicant
The applicant understands and acknowledges that the laboratory is
required to be continually in compliance with the National
Environmental Laboratory Accreditation Program's rules and regulations
concerning laboratory accreditation and standards and will be subject
to the penalty provisions provided therein.
The applicant understands and acknowledges that accreditation is
specifically subject to unannounced assessments.
Authorized representatives of any state in which the laboratory is
accredited may make an announced or unannounced inspection, search, or
examination of an accredited or interim approved laboratory whenever
the state, at its discretion, considers such an inspection, search or
examination necessary to determine the extent of the laboratory's
compliance with the conditions of its accreditation and these
regulations. Any refusal to allow entry to the state's representatives
shall constitute a violation of a condition of accreditation and
grounds for revocation of accreditation or loss of accreditation.
The applicant hereby certifies that all analyses performed are done
in accordance with applicable U.S. Environmental Protection Agency
Guidelines.
I hereby certify that I am authorized to sign this application on
behalf of the applicant/owner and that there are no misrepresentations
in my answer to the questions on this application.
----------------------------------------------------------------------
Signature of Quality Assurance Officer
----------------------------------------------------------------------
Name of Quality Assurance Officer
----------------------------------------------------------------------
Print Name of Applicant Laboratory (Legal Name)
----------------------------------------------------------------------
Date
----------------------------------------------------------------------
Signature of Laboratory Management Representative
----------------------------------------------------------------------
Name of Laboratory Management Representative
4.2 Period of Accreditation
For a laboratory in good standing, the period for accreditation
within categories for methods or analytes will be reevaluated yearly
and will be considered to be ongoing once a laboratory has been
accredited for that category, method, or analyte. The loss of
accreditation for categories, methods or analytes will occur upon not
fulfilling any of the conditions outlined below in the sections on
maintaining accreditation and supervision, revocation and loss of
accreditation. Additionally, failure to pay the required fees as
determined by the participating states within the stipulated deadlines
or by the stipulated dates will result in loss of accreditation. This
information will be entered into the National Database.
There is a separate process for accreditation for new categories,
methods and analytes (vide supra: Application Process, 4.1.7).
Each year the National Environmental Laboratory Accreditation
Program will provide each laboratory with a current directory listing
the categories, methods, and analytes for which it is accredited.
Additionally, new categories, methods, and analytes
will appear on the actual certificate that is reissued as these items
are added and/or deleted during the year. All new categories will be
included in updates to the database.
4.3 Maintaining Accreditation
Accreditation remains in effect until revoked by the accrediting
authority, until discontinued by the accredited laboratory, or until
expiration of accreditation date. To maintain accreditation, the
accredited laboratory shall complete or comply with elements 4.3.1
through 4.3.6. Failure to complete or comply with these elements may be cause
for downgrading or revoking accreditation.
4.3.1 Performance Evaluation Samples
Performance evaluation samples appropriate for the accredited
methodology shall be acquired twice per year from a source acceptable
to the National Environmental Laboratory Accreditation Program,
successfully analyzed, and reported to the accrediting body within
required deadlines. In the event of unsatisfactory performance and
required reanalysis, repeat analysis shall also be completed and
reported within established deadlines. Poor performance on a
performance evaluation sample or failure to submit results within
required deadlines may be cause for downgrading accreditation.
4.3.2 On-Site Assessments
Announced on-site assessments shall be performed by the accrediting
agency at a minimum frequency of one assessment every year. Unannounced
on-site assessments or follow-up on-site assessments may be conducted
more frequently, for cause, at the option of the accrediting agency.
Situations which might trigger an unannounced on-site assessment or
follow-up on-site assessment include review of a previously deficient
on-site assessment, poor performance on a performance evaluation
sample, change in other accreditation elements, or other information
concerning the capabilities or practices of the accredited laboratory.
On-site assessments, regardless of frequency, shall be successfully
completed to maintain accreditation. Deficiencies identified during the
on-site assessment shall be corrected within deadlines established in
these guidelines or according to deadlines in an approved corrective
action plan. Failure to pass an on-site assessment or to correct
deficiencies according to the provisions of an approved corrective
action plan may be cause for downgrading accreditation.
4.3.3 Other Accreditation Elements
The accredited laboratory shall maintain other key accreditation
elements which originally served as the basis for accreditation
including the facility, organization and management, qualifications of
key personnel, sample handling procedures, calibration standards,
analytical methods, data reduction procedures, and laboratory quality
assurance plan. Failure to maintain, revise, or replace any of these
key components may be cause for downgrading accreditation status.
4.3.4 Notification and Reporting Requirements
The accredited laboratory shall notify the accrediting body of any
changes in key accreditation criteria including but not necessarily
limited to the laboratory ownership, location, key personnel, and major
instrumentation. The accredited lab shall also comply with any other
reporting requirements identified in these guidelines.
4.3.5 Record Keeping and Retention
All lab records associated with accreditation parameters, including
raw data associated with each analysis, changes in method standard
operating procedures, or the laboratory quality assurance plan, shall
be maintained for a minimum of five years unless otherwise designated
for a longer period in another regulation. In the case of data used in
litigation, the laboratory is required to store such records for a
longer period upon written notification from the accrediting agency.
4.3.6 Payment of Fees
The accredited lab shall pay all fees associated with maintaining
accreditation to the accrediting body within established deadlines.
4.4 Suspension, Revocation and Denial of Accreditation
Reasons to deny an initial application or reapplication shall
include:
(a) Failure of laboratory staff to meet the personnel
qualifications as required by NELAC. These qualifications include
education, training and experience requirements.
(b) Failure to successfully perform performance evaluation test as
required by NELAC.
(c) Failure to attest that analyses are performed using approved
methodologies and/or in accordance with NELAC requirements.
A laboratory shall have two opportunities to correct the areas of
deficiency which resulted in the denial of an application. If the
laboratory is not successful in remedying said deficiencies, it must
wait six months before again applying for accreditation.
Revocation--shall mean the total withdrawal of a laboratory's
accreditation by the accrediting authority. The laboratory cannot
reapply for accreditation for six months, by which time the reason/cause
of the revocation must be corrected.
Reasons for revocation shall include:
(a) Failure to participate or unsatisfactory performance in the
performance evaluation testing program as required by the program.
(b) Submitting performance evaluation sample results generated by
another laboratory.
(c) Misrepresentation of any material fact pertinent to receiving
initial approval.
(d) Denial of entry for laboratory inspection.
(e) Conviction of charges of the falsification of any report of or
relating to a laboratory analysis.
(f) Failure to pay accreditation fees.
No laboratory's accreditation will be revoked or a renewal denied
without the opportunity to request a hearing.
Suspension shall mean the temporary removal of a laboratory's
accreditation for a defined period of time. The purpose of suspension
is to allow a laboratory time to correct deficiencies or areas of non-
compliance with program requirements as defined by regulation. A
suspended laboratory would not have to reapply for accreditation if the
cause/causes for suspension are corrected within six months. A
laboratory's accreditation may be suspended in total or in part. It may
retain those areas of accreditation where it continues to meet the
standards and requirements of the program.
Reasons for suspension shall include:
(a) Failure to successfully perform performance evaluation tests
pursuant to the requirements of the program;
(b) Failure to submit and implement corrective action related to
deficiencies found during laboratory inspections;
(c) Loss of personnel with the required educational, training and
experience qualifications; or
(d) Failure to pay accreditation application fees.
4.5 Interim Accreditation
4.5.1 Interim Accreditation
If a laboratory completes all of the requirements for accreditation
except the on-site assessment, because the accrediting authority is
unable to schedule the assessment, an interim accreditation shall be
issued and will remain in effect until the assessment requirements have
been completed. Interim accreditation will allow a laboratory to
perform analyses and report results of samples with the same status as
a fully accredited laboratory until an on-site assessment has been
completed. Accreditation will still be granted when performance
evaluation samples are not available.
4.5.2 Revocation of Interim Accreditation
Revocation of interim accreditation may be initiated for due cause
as described in 4.4 by order of the accrediting agency, without right
to a hearing.
4.6 Awarding of Accreditation
When a participating laboratory has met the requirements specified
for receiving accreditation, the laboratory will receive a single
certificate awarded on behalf of the state accrediting authority. The
certificate will provide the following information: the name of the
laboratory, address of the laboratory, the specifications of the
accreditation action (for example, the laboratory may be accredited for
analysis of water or for use of a specific analytical methodology,
etc.), the states in which the laboratory may operate. Even though a
parent laboratory is accredited, the subfacilities (laboratories
operating under the same parent organization, analytical procedures,
and quality assurance system) are also required to become accredited.
The subfacilities accredited will be listed on the certificate of the
parent laboratory.
4.6.1 The Certificate of Accreditation
The certificate of accreditation will briefly define the rules of
obtaining and maintaining accreditation. Finally, the certificate will
be signed by a member of the accrediting authority.
To address the concern that an individual state may revoke a
laboratory's accreditation for work in that state, the certificate will
explain that continued accredited status depends on successful ongoing
participation in the program. The certificate will urge a customer to
verify the laboratory's current accreditation standing within a
particular state. The certificate must be returned to the accrediting
agency upon loss of accreditation.
4.6.2 Changes in Areas of Accreditation
If an accredited laboratory increases its areas of accreditation, a
new certificate will be awarded which details the spectrum of
accreditations the laboratory has achieved.
4.7 Enforcement
The development of an enforcement component of the National
Environmental Laboratory Accreditation Program (NELAP) should be based
on explicit values, or principles, with which all participants concur.
The proposed basic principles are:
(a) The program should be fair to all participants;
(b) The rules should be well publicized;
(c) The program needs of the participating agencies must be upheld;
and
(d) The due process rights of participating laboratories must be
protected.
The major components of the program shall include:
(a) All enforcement actions are taken independently by EPA or state
agencies and communicated to all other NELAP participating agencies.
(b) NELAP enforcement is limited to suspension (short-to-long-term)
from NELAP only. Any other civil/criminal actions are taken by
participating agencies.
(c) An effective information-sharing database used by all
participating agencies is essential to ensure informed decision-making
based on lab performance.
4.7.1 Role of Enforcement vs QA/QC
Most agencies have historically conducted laboratory QA/QC programs
designed to help laboratories identify and correct technical problems
affecting their performance. This is basically a technical assistance
function by government. Enforcement, on the other hand, is an oversight
process of taking informative (``warning/information gathering
letters'') or punitive actions to ensure the public's desired
objectives (``reliable data'') are achieved. QA/QC and enforcement are
different functions and need to be kept separate.
4.7.2 Defining Enforceable Violations
The NELAP will need to specify what actions by laboratories will
result in enforcement action. Furthermore, enforcement actions should
be developed in increasing severity to allow laboratory correction with
minimal enforcement effort. This could be done with tiers of
enforcement actions, e.g. warning letter, suspension investigation
order, suspension order, and suspension hearing.
Enforceable violations will also need to be established to provide
the basis for the enforcement program. Categories of enforceable
violations could include:
(a) Data falsification--intentional, by lab management, by
employees, etc.;
(b) False advertising--misinforming clients regarding their
accreditation and capabilities; and
(c) Continuing technical problems--lack of technical staff, failure
to follow required SOP's, lack of equipment, etc.
4.7.3 Recommendation
Given resource constraints, strong interest in encouraging state
support, and the greater potential for implementation in the mid-term
(2 to 5 years), a variation of the decentralized option is recommended.
This approach will still require a federal-state laboratory integrated
effort to ensure the objectives, structure, and issues are defined in
the necessary detail.
5.0 Quality Systems
5.1 Introduction
Quality Systems include all quality assurance (QA) policies and
quality control (QC) procedures, which shall be delineated in a QA Plan
to help ensure and document the quality of the analytical data. These
shall include QA policies, which will establish essential QC procedures
applicable to environmental laboratories regardless of size and
complexity. The laboratory shall meet any additional or more stringent
requirements as specified by the analytical methods, specific programs
or Agencies.
All items identified in this discussion shall be available for on-
site inspection or data audit.
5.2 Quality System
5.2.1 Quality Assurance Plan
All laboratories shall prepare and have available for review a
written description of the laboratory's quality assurance activities,
i.e., a QA plan. The QA plan must be an independent document that may
incorporate by reference already available standard operating
procedures (SOPs) or other material, e.g., methods, guidance documents,
etc., that are approved by the laboratory management. Analysts in the
laboratory should either have copies of the document or easy access to
the document. The items listed below constitute essential requirements
of a Quality System. All laboratories should be encouraged to add any
additional items thought to improve the analytical data. The following
items shall be included:
--General QC procedures
--Performance evaluation samples
--Staff
--Equipment
--Test methods & standard operating procedures (SOPs)
--Physical facilities
--Sample acceptance policy & sample receipt
--Sample tracking
--Record keeping, data review and reporting
--Corrective action policy and procedures
--Definition of terms
--Bibliography
5.3 General Quality Control Procedures
The following are the essential requirements and routines to
calculate and assess analytical precision, accuracy, and method
detection limits. All records and related quality control procedures
shall be documented and maintained.
The required essential quality control shall be as specified in the
analytical methods or as listed below, whichever is more stringent.
5.3.1 Chemical Testing
(a) Method Reagent Blanks--A minimum of 1 per batch of 20 or less
samples per matrix type per sample extraction or preparation.
(b) Matrix Spikes (MS), Matrix Spike Duplicates (MSDs), and Sample
Duplicates (SD).
i. Matrix spikes: required frequency as per the method reagent
blank, except for analytes for which standards are not available (BOD,
TSS, O&G, and pH, etc.).
ii. Matrix spike duplicates or sample duplicates shall be analyzed
at the same frequency as the original matrix spike (MS).
(c) Laboratory Fortified Blanks--(QC Check Samples).
It is suggested that these be analyzed at the same frequency as the
matrix spikes; they are mandatory if the matrix spikes are not within
quality control acceptance limits.
(d) Surrogates--Surrogate compounds must be added to all samples,
standards, and blanks whenever possible for all organic chromatography
methods. Limits must be used to determine acceptable surrogate
recoveries on a daily basis.
(e) Quality Control Validation Studies or Initial Demonstration of
Analytical Capability--QC Validation Studies shall be performed on a
one-time basis (initially and with a significant change, e.g., new
analyst, instrument or technique).
(f) Methods Used to Assess Precision and Accuracy--The laboratories
shall calculate and track precision and accuracy of test measurements
and the associated acceptance ranges using the data from the duplicate,
MS, blank and surrogate measurements. The resulting acceptance ranges
(and/or quality control charts) shall be used to assess data acceptance
and shall be readily accessible in an identifiable file to all
personnel involved with the data review/data acceptance process.
(g) Method Detection Limits--Method detection limits shall be
determined by an approved protocol or by a method specified by the
accrediting authority. The detection limit is to be determined for the
compounds of interest in each method in laboratory pure water and the
matrix of interest. The procedure used must be documented.
(h) Qualitative Identifications--Qualitative quality control refers
to the identification of a specific compound. Identification of all
analytes must be accomplished with a verified standard of the analyte.
When analyzing a new matrix, a new analyte or where other reasons
for doubt exist, a confirmatory analysis shall be performed. Such
analysis shall be a technique with a different scientific principle and
may include:
--Second column confirmation
--Alternate wavelengths
--Derivatization
--Mass spectral interpretation
--Alternate detectors
--Additional cleanup procedures
(i) Reagent Quality, Water Quality and Checks
i. Reagents--In methods where the purity of reagents is not
specified, analytical reagent grade shall be used. Reagents of lesser
purity than that specified by the method shall not be used. The labels
on the container should be checked and the contents examined to verify
that the purity of the reagents meets the needs of the particular
method.
ii. Water--Where the method does not specify the type of water
(e.g., distilled, deionized, etc.), the water used shall be free
from all constituents that may potentially interfere with the sample
preparation or analytical test. The quality of water sources shall be
monitored and documented.
(j) Glassware Cleaning--In the analysis of samples containing
components in the parts per billion range, the preparation of
scrupulously clean glassware is mandatory. Particular care must be
taken with glassware such as Soxhlet extractors, Kuderna-Danish
evaporative concentrators, sampling-train components, or any other
glassware coming in contact with an extract that will be evaporated to
a lesser volume.
Any cleaning and storage procedures that are not specified by the
method shall be documented in laboratory records and SOPs.
(k) Internal Audits--The laboratory shall have a system in place
for conducting internal audits of the methods, data, and staff employed
at the lab. The audits shall be conducted at least twice annually and
the results shall be documented.
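Note: Items (f) and (g) above require laboratories to calculate
precision, accuracy and method detection limits but do not prescribe
formulas. The Python sketch below uses conventions common in
environmental laboratories: relative percent difference for duplicate
precision, percent recovery for matrix spikes, and the Student's-t
procedure of 40 CFR Part 136, Appendix B, for method detection limits.
The formulas actually required by a given method or accrediting
authority govern.

import statistics

# Student's t at the 99% level, indexed by number of replicates n
# (degrees of freedom = n - 1); seven replicates is the common choice.
T_99 = {7: 3.143, 8: 2.998, 9: 2.896, 10: 2.821}

def relative_percent_difference(d1, d2):
    """Precision of duplicate results: RPD = |d1 - d2| / mean(d1, d2) * 100."""
    return abs(d1 - d2) / ((d1 + d2) / 2.0) * 100.0

def percent_recovery(spiked_result, unspiked_result, amount_spiked):
    """Accuracy of a matrix spike: percent recovery of the analyte added."""
    return (spiked_result - unspiked_result) / amount_spiked * 100.0

def method_detection_limit(replicate_results):
    """MDL = t(n-1, 0.99) x standard deviation of n replicate low-level
    spikes, after the procedure of 40 CFR Part 136, Appendix B."""
    return T_99[len(replicate_results)] * statistics.stdev(replicate_results)

# Example with seven hypothetical replicate results (same units as the method):
# method_detection_limit([0.52, 0.48, 0.55, 0.47, 0.50, 0.53, 0.49])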
5.3.2 Bioassays
(a) Dilution Water Control--Every toxicity test or range-finding
test shall include a dilution water control treatment consisting of the
same dilution water, conditions, procedures, types and number of
organisms as used in the effluent treatments, except that none of the
effluent being tested shall be added to the dilution water.
Whenever artificial sea salts are used in the salinity adjustment
of either the dilution water sample or effluent sample, an additional
control treatment shall be included. This additional control treatment
shall consist of replicate chambers containing only artificial
saltwater made with the same artificial sea salts used to adjust the
samples. The artificial saltwater shall be made to the same
standardized salinity and pH as the other test treatments.
(b) Distribution of Test Organisms--Test organisms must be randomly
distributed to the test chambers either by:
i. Adding to each chamber no more than 20% of the total number to
be assigned to each chamber, then repeating the process until each test
chamber contains the total number of test organisms desired; or
ii. Randomly assigning one test organism to each test chamber, then
randomly assigning a second test organism to each test chamber, etc.,
continuing the random assignments until the total number of test
organisms desired has been distributed to each test chamber.
(c) Dissolved Oxygen Requirement--The DO in the test chambers shall
be maintained at greater than 40% of saturation but less than 100% when
testing chronic toxicity for all species except Ceriodaphnia, for which DO must
be adjusted only prior to test initiation or sample renewal. Acute
tests shall assure that a minimum level of 4.0 mg/L DO is maintained.
(d) Duplicate Requirements--When the purpose of a definitive acute
toxicity test is to determine compliance with an LC50, or EC50 permit
limitation, the test shall consist of one or more control treatments
and a series of at least five effluent concentrations, in duplicate.
i. If the toxicity of the effluent to the test organism is not
known, then the concentration of effluent in each treatment, except for
the highest concentration and the control(s), shall be at least 50% of
the next higher concentration. The concentrations selected shall be
evenly spaced on either a logarithmic or geometric scale (an
illustrative series follows item iii below).
ii. Definitive test concentration series must, at a minimum, be
conducted in duplicate. Additional replicate series may be necessary in
order to achieve required test precision. Only true replicates, with no
water connections between test chambers, shall be used.
iii. A minimum of twenty test organisms shall be exposed to each
effluent concentration and each control treatment; this means, when
conducting the test in duplicate, at least ten organisms per test
chamber. The number of organisms used in each effluent concentration
shall be equal to the number used in other effluent concentrations and
to the number used in the control. Organism loading limits shall be
observed.
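As an illustration of item i above, a geometric series with a 0.5
dilution factor (100%, 50%, 25%, 12.5% and 6.25% effluent by volume)
satisfies the requirement that each concentration be at least 50% of
the next higher one. A minimal Python sketch with assumed names:
def effluent_series(highest_percent, dilution_factor, n_concentrations):
    """Return effluent concentrations (percent by volume) spaced evenly
    on a geometric scale, highest concentration first."""
    return [round(highest_percent * dilution_factor ** i, 3)
            for i in range(n_concentrations)]

# Example: effluent_series(100.0, 0.5, 5) -> [100.0, 50.0, 25.0, 12.5, 6.25]
series = effluent_series(100.0, 0.5, 5)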
(e) No Measurable Acute Toxicity--When the purpose of a ``no
measurable acute toxicity'' (N.M.A.T.) test is to determine compliance
with an N.M.A.T. permit limitation, the effluent must be known to
generally have an LC50 of greater than or equal to 100%, and the
toxicity test design must comply with the following:
i. The test series shall consist of one or more control treatments,
a 100% effluent-by-volume concentration and a 50% effluent-by-volume
concentration. The test shall be conducted with at least four
replicates, and at least ten organisms per chamber. Additional
replicate series may be necessary in order to achieve required test
precision. Only true replicates, with no water connections between test
chambers, shall be used.
ii. Forty or more test organisms shall be exposed to each control
treatment and each effluent treatment.
(f) Range Finding Toxicity Test--If required by the accrediting
agency and in the event historical aquatic toxicological data are not
available on an effluent, the lab shall conduct a range finding
toxicity test to ascertain the range of effluent concentrations for
subsequent definitive tests. Range finding toxicity tests shall at a
minimum consist of one or more control treatments, and treatments of
100% effluent-by-volume, 50% effluent-by-volume, and 12.5% effluent-by-
volume. A single test series is adequate, although duplicates may be
used. Five or more test organisms shall be exposed to each control
treatment and each effluent treatment.
(g) Species Identification
i. For species identification, the laboratory shall maintain or
have access to a type specimen collection.
ii. The laboratory must, at a specified frequency, use taxonomic
experts to corroborate species identification. In-house or outside
experts are acceptable for taxonomic identification of test species.
(h) Criteria for Test Types--All definitive acute toxicity tests
and N.M.A.T. definitive acute toxicity tests must be conducted as either
static non-renewal, static-renewal, or flow-through tests. Range-
finding toxicity tests (if required) must be conducted as either static
or flow-through.
(i) Reference Toxicants--Reference toxicants shall be used as
specified by method.
5.3.3 Microbiology
(a) Blanks (Sterility Checks)
i. Membrane Filter (MF) Analysis Blank--A membrane filter sterile
control test of rinse water, media and supplies shall be inoculated
with at least 10 milliliters of sterile phosphate buffered dilution
water (dilution blank control). These shall be performed at the
beginning and end of all processed samples and after every tenth
sample.
ii. Multiple Tube Fermentation (MTF) Analysis Blank--An MTF blank
shall be performed with each MTF sample. A single tube of LTB broth
media shall be inoculated with 10 milliliters of sterile phosphate
buffered dilution water (dilution blank control).
(b) Laboratory Pure (Reagent) Water Requirements
i. Laboratory pure water shall be analyzed annually using the
Suitability Test for bactericidal properties of distilled water.
ii. Laboratory pure water shall be analyzed monthly for pH,
chlorine residual, standard plate count, and conductivity.
iii. The laboratory pure water must be analyzed annually for trace
metals.
(c) MPN Analysis--The MPN test for all water samples shall be
completed on 10% of positive confirmed samples, except that Gram
staining need not be performed for drinking water samples. If no
positive tubes result from the tested drinking water samples, the
complete MPN test, but not Gram staining, must be performed on a
quarterly basis on at least one positive water source.
(d) MF Analysis--At least 5% of all positive environmental samples
analyzed by membrane filter, and at least 10 of the sheen colonies from
drinking water samples, shall be verified per method requirements.
(e) Duplicates--At least 5% of the positive samples shall be
duplicated. In laboratories with more than one analyst, each analyst
shall make parallel analyses on at least one positive sample per month.
(f) Positive and Negative Controls--Positive and negative control
cultures shall be analyzed for the microorganisms under test for each
lot of media used with each analytical procedure.
5.3.4 Radiochemistry
(a) Instrument Blanks--Instrument blanks are blanks at the
background levels for any of the nuclide emissions of interest.
Instrument blanks consist of a clean planchet, ampule or sealed
canister that is placed in the instrument to duplicate sample counting
geometry. The purpose of the instrument blank is to verify instrument
operation and ensure that no contamination has occurred in the counting
chamber. Instrument blanks are used for calculation of lower limits of
detection. The frequency of instrument blank analysis depends on the
type of instrument. Essential frequencies for analysis of instrument
blanks on typical instruments are:
------------------------------------------------------------------------
Instrument Frequency
------------------------------------------------------------------------
Gamma spectrometers...................................... Monthly.
Low background proportional counters..................... Daily.
Low level liquid scintillation counters.................. Daily.
Scintillation counters................................... Weekly.
Alpha spectrometers...................................... Weekly.
Radon flask counters..................................... Monthly.
------------------------------------------------------------------------
(b) Method Blanks--The required frequency for method blanks shall
be at least once each batch or one out of every 20 samples, whichever
is greater. These specifications are applicable to all radiochemistry
techniques except for gamma spectroscopy where no chemical separation
or other chemical manipulation is performed.
(c) Laboratory Control Samples (LCS)--At least one LCS shall be
included with each batch or one out of every 20 analytical samples,
whichever is greater.
(d) Matrix Spikes--Matrix spikes shall be included with each sample
batch where chemical manipulations and separations are performed. The
frequency for measurement of matrix spikes shall be at least one per
batch or one out of every 20 samples, whichever is greater.
The following criteria are recommended for spiking:
i. Samples should be spiked at random within each batch. There
should be adequate samples available for duplicate analysis, if
necessary.
ii. Spikes should be prepared in a manner to minimize alteration of
the original matrix (i.e., minimize dilution of the sample during the
spiking).
iii. Spikes should be prepared at a level that is at least two
times the concentration of the analyte of interest.
(e) Laboratory Duplicates--Sample analysis shall be duplicated on a
randomly selected sample (not field blanks) within every batch or one
per 20 samples, whichever is greater.
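A minimal Python sketch (names are assumptions, for illustration only)
of the ``one per batch or one out of every 20 samples, whichever is
greater'' frequency that items (b) through (e) apply to method blanks,
laboratory control samples, matrix spikes and duplicates:
import math

def required_qc_analyses(samples_in_batch, qc_interval=20):
    """At least one QC analysis per batch, or one per 20 samples,
    whichever is greater."""
    return max(1, math.ceil(samples_in_batch / qc_interval))

# Examples: a 12-sample batch requires 1; a 45-sample batch requires 3.
assert required_qc_analyses(12) == 1
assert required_qc_analyses(45) == 3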
5.3.5 Air Testing--To be added as document undergoes review.
5.4 Performance Evaluation Samples
Each laboratory shall participate in a performance evaluation
program as outlined in Chapter 2.0.
5.5 Environmental Laboratory Staffing Requirements
5.5.1 General Requirements for Laboratory Staff
The testing laboratory shall have sufficient supervisory and other
personnel, having the necessary education, training, technical
knowledge and experience for their assigned functions.
Job descriptions shall be available for all positions.
The laboratory shall have available a clear description of its
lines of responsibility, and responsibilities shall be apportioned
such that adequate supervision is ensured. An organizational chart is
recommended.
5.5.2 Laboratory Staff Responsibilities and Credentials
Laboratory management shall be responsible for:
(a) All analytical and operational activities of the laboratory,
including those of any auxiliary or mobile laboratory facilities;
(b) Supervision of all personnel employed by the laboratory,
including those assigned to work in any auxiliary or mobile laboratory
facilities, and those persons designated as principal analysts;
(c) Assuring that all sample acceptance criteria (Section 5.9) are
met and that samples are logged into the sample tracking system and
properly labeled and stored; and
(d) The production and quality of all data reported by the
laboratory, including any auxiliary or mobile laboratory facilities.
Each analyst and other members of the staff shall be responsible
for complying with all QA requirements. The person filling each
laboratory position must have a combination of experience and education
adequate to demonstrate a specific knowledge of the particular function
and a general knowledge of laboratory operations, analytical methods,
quality assurance/quality control procedures and records management.
5.5.3 Quality Assurance Officer
A quality assurance officer shall:
(a) Serve as the focal point for QA/QC and be responsible for
analytical data review (sign off on data is required);
(b) Have functions independent from laboratory management;
(c) Be able to objectively evaluate data and perform assessments
without outside (e.g., managerial) influence;
(d) Have formal training and experience in QA/QC procedures and be
knowledgeable in the quality system as defined under NELAP;
(e) Have a general knowledge of the analytical methods for which
data review is performed; and
(f) Conduct internal audits on the entire operation twice annually.
5.6 Equipment
A laboratory must have access to all equipment specified by the
analytical procedures for which accreditation is sought. All
maintenance activities, both routine and nonroutine, shall be
documented. The following records shall be maintained for each piece of
equipment:
--Name of item;
--Manufacturer's name, type identification and serial number;
--Date received and placed in service;
--Current physical location;
--Maintenance log; and
--Calibration information, if appropriate.
5.7 Test Methods and Standard Operating Procedures
When the use of approved methods for a specific sample matrix is
required, only those methods shall be used. In addition, where
performance-based methods or non-legally mandated methods are
permitted, the relevant start-up and ongoing validation procedures, and
calibrations as specified in 5.7.2 must be followed and documented.
The criteria listed in 5.7 must be met for all methods and SOPs.
5.7.1 Laboratory Method Manual(s) and Standard Operating Procedures
Each certified laboratory shall have and maintain an in-house
methods manual(s) and SOPs. The methods manual(s) and any associated
reference works (if required) shall be available to the bench analyst.
For each analyte certified, a method or methods to be used by the
laboratory shall be described in the methods manual (an illustrative
sketch of such an entry follows this list). The method description
shall include:
--Analyte name and qualifier (the qualifier is a word, phrase or
number that better identifies the method; e.g., ``Iron, Total'', or
``Chloride, Automated Ferricyanide'', or ``Our Lab. Method SOP No.
101'');
--Applicable matrix or matrices;
--Method detection limit;
--Scope and application;
--Summary of the method;
--Definitions;
--Interferences;
--Safety;
--Equipment and supplies;
--Reagents and standards;
--Sample collection, preservation, shipment and storage;
--Quality control;
--Calibration and standardization;
--Procedure;
--Data analysis and calculations;
--Method performance;
--Pollution prevention;
--Waste management;
--References; and
--Any tables, diagrams, flowcharts and validation data.
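The following is an illustrative Python sketch of how a laboratory
might capture these required elements as a structured methods-manual
record; the field names are assumptions, not requirements:
from dataclasses import dataclass, field

@dataclass
class MethodDescription:
    """One methods-manual entry; one instance per certified analyte/method."""
    analyte_name: str                  # e.g. "Iron, Total"
    qualifier: str                     # word, phrase or number identifying the method
    matrices: list                     # applicable matrix or matrices
    method_detection_limit: str        # value with units
    scope_and_application: str
    summary_of_method: str
    definitions: str
    interferences: str
    safety: str
    equipment_and_supplies: str
    reagents_and_standards: str
    collection_preservation_shipment_storage: str
    quality_control: str
    calibration_and_standardization: str
    procedure: str
    data_analysis_and_calculations: str
    method_performance: str
    pollution_prevention: str
    waste_management: str
    references: list = field(default_factory=list)
    attachments: list = field(default_factory=list)  # tables, diagrams, flowcharts, validation data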
5.7.2 Method Validation/Initial Demonstration of Method Performance
(Performance-Based Methods and Non-Approved Methods)
Prior to acceptance and institution of any method, satisfactory
initial demonstration of method performance, in conformance with the
relevant EPA guidelines, is required. In the absence of method-
specified requirements, this demonstration shall follow the outlined
protocols of Paragraph 8.1.1 and Section 8.2 in the methods published
in 40 CFR Part 136, Appendix A. Thereafter, continuing demonstration of
method performance, in conformance with the relevant EPA guidelines, is
required. In both cases, the appropriate standard Performance Based
Method System (PBMS) checklist (see Appendix B) must be completed,
submitted to the accrediting organization, and a copy must be retained
in the laboratory. All associated supporting data necessary to
reproduce the analytical results summarized in the checklists must be
retained by the laboratory. Initial demonstration of method performance
must be completed each time there is a change in equipment, personnel
or procedure.
5.7.3 Calibration
5.7.3.1 Documentation and Labeling
The laboratory shall retain records (e.g., manufacturer's statement
of purity), of the origin, purity and traceability of all standards and
reagents (including balance weights and thermometers). These records
shall include the date of receipt, the date of opening and an
expiration date.
Detailed records shall be maintained on reagent and standard
preparation. These records shall indicate traceability to purchase
stocks or neat compounds, and must include the date of preparation and
preparer's initials.
Where calibrations do not include the generation of a standard
curve (e.g., thermometers, balances, titrations, etc.), records shall
indicate the calibration date and type (e.g. balance weight,
thermometer serial number, primary standard concentration, etc.) of
calibration standard that was used.
All prepared reagents and standards shall be clearly identified
with preparation date, concentration(s) and preparer's initials.
All standard curves shall be dated and labeled with method, analyte
and standard concentrations and instrument responses.
The axes of the calibration curve should be labeled. For electronic
data processing systems that automatically compute the calibration
curve, the equation for the curve and the correlation coefficient must
be recorded. The equation for the line and the correlation coefficient
shall also be recorded when the calibration curve is prepared manually.
A criterion for an acceptable correlation coefficient shall be
established.
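A minimal Python sketch of recording a linear calibration curve
equation and its correlation coefficient from standard concentrations
and instrument responses; the example values and the acceptance
criterion shown are assumptions, not requirements of this standard:
def linear_calibration(concentrations, responses):
    """Least-squares fit of response = slope * concentration + intercept;
    returns the curve equation terms and the correlation coefficient r."""
    n = len(concentrations)
    mean_x = sum(concentrations) / n
    mean_y = sum(responses) / n
    sxx = sum((x - mean_x) ** 2 for x in concentrations)
    syy = sum((y - mean_y) ** 2 for y in responses)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(concentrations, responses))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

# Example: a blank plus three standards; record the equation and r with the curve.
slope, intercept, r = linear_calibration([0.0, 1.0, 5.0, 10.0], [0.02, 0.98, 5.10, 9.95])
acceptable = r >= 0.995        # laboratory-established criterion (assumed value)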
5.7.3.2 Initial Calibrations
All initial calibrations shall be verified with standards of high
quality obtained from a second or different source. These verification
standards shall be analyzed with each initial calibration or quarterly,
whichever is more frequent.
Standard curves shall be prepared as specified in the method.
The lowest standard should approach the method detection limit.
If a method does not provide guidance in the preparation of a
standard curve, the following guidelines shall be followed: For all
methods, use a blank and at least three (3) standards that lie within
the linear portion of the curve. Additional standards are required for
non-linear calibration curves. In all cases, the sample results must be
closely bracketed by calibration standards.
A new calibration curve shall be run if two successive runs of a
continuing calibration check are outside acceptable limits.
5.7.3.3 Continuing Calibration Verification
When an initial calibration curve is not run on the day of
analysis, the integrity of the initial calibration curve shall be
verified on each day of use (or 24 hour period) by initially analyzing
a blank and a standard at a concentration equal to or near the lowest
calibration standard (the lowest calibration standard shall be in the
range of 4 to 8 times the calculated method detection limit).
Additional standards shall be analyzed after the initial
calibration curve or the integrity of the initial calibration curve
(see previous paragraph) has been accepted.
(a) These standards shall be analyzed at a frequency of 5% or every
8 hours, whichever is more frequent, and may be standards used in the
original calibration curve or standards from another source.
(b) The concentration of these standards shall be determined by the
anticipated or known concentration of the samples. To the extent
possible, the samples in each interval (i.e. every 20 samples or every
8 hours) should be bracketed with standard concentrations closely
representing the lower and upper range of reported sample
concentrations. If this is not possible, the standard calibration
checks should vary in concentration throughout the range of the data
being acquired.
When not specified by the analytical method, these calibration
verification standards shall be within 15% of the true value.
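A minimal Python sketch of checking a continuing calibration
verification standard against its true value; the 15% default reflects
the paragraph above, and the example concentrations are assumptions:
def ccv_acceptable(measured, true_value, tolerance_percent=15.0):
    """True if the verification standard recovers within +/- tolerance_percent
    of its true value (15% when the method does not specify a limit)."""
    percent_difference = abs(measured - true_value) / true_value * 100.0
    return percent_difference <= tolerance_percent

# Example: a 5.0 mg/L standard measured at 5.5 mg/L passes (10%);
# measured at 5.8 mg/L it fails (16%).
assert ccv_acceptable(5.5, 5.0) is True
assert ccv_acceptable(5.8, 5.0) is False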
5.8 Physical Facilities
5.8.1 Environment
The laboratory facilities shall be maintained to permit the
production of analytical data of needed quality. In addition to
adequate housekeeping that must be performed to assure that
contamination is unlikely, the following elements shall be controlled:
--Temperature;
--Humidity;
--Electrical power;
--Vibration;
--Electromagnetic fields;
--Dust;
--Direct sunlight;
--Ventilation (exhaust hoods, air exchangers, etc.); and
--Lighting.
5.8.2 Work Area
Adequate work spaces to ensure an unencumbered work area must be
available. These include:
--Controlled access to the laboratory;
--Separation of incompatible analyses;
--Sample receipt area;
--Sample storage area;
--Chemical and waste storage area(s); and
--Data handling and storage area(s).
5.9 Sample Acceptance Policy and Sample Receipt
Regardless of the laboratory's level of control over sampling
activities, the following are essential to ensure sample integrity and
valid data.
5.9.1 Sample Acceptance Policy
The laboratory shall have a written sample acceptance policy that
clearly outlines the circumstances under which samples will be
accepted. Data from any samples which do not meet the following
criteria must be flagged in an unambiguous manner clearly defining the
nature and substance of the variation. This document should be
circulated to sample collecting personnel with other sampling
instructions and shall include the following areas of concern:
(a) Submittal of field quality control samples as required by the
accrediting agency. The samples may include trip blanks, field blanks,
equipment blanks, duplicates or other field-submitted quality control
measures;
(b) Proper, full, and complete documentation, which shall include
sample identification, the location, date and time of collection,
collector's name, preservative added, sample type and any special
remarks concerning the sample;
(c) Proper sample labeling to include unique identification and a
labeling system for the samples with requirements concerning the
durability of the labels (water resistant) and the use of indelible
ink;
(d) Evidence of proper preservation and use of appropriate sample
containers. The type of sample containers and preservatives are as
specified by the individual programs, a Performance Based Method System
or NELAP;
(e) Adherence to specified holding times. The maximum allowable
holding times prior to analysis are as specified by individual
programs, a Performance Based Method System or NELAP; and
(f) Adequate sample volume. Sufficient sample volume must be
available to perform the necessary analysis.
5.9.2 Sample Receipt Protocols
Samples shall be checked upon receipt for thermal preservation (if
applicable) and all other aforementioned items. Chemical preservation
(e.g., appropriate pH) shall be checked upon receipt or prior to sample
preparation/analyses. The results of such checks shall be recorded.
Data from any samples which do not meet the criteria must be flagged in
an unambiguous manner clearly defining the nature and substance of the
variation.
If applicable, a complete chain of custody record (Section 5.11.3)
shall be maintained.
5.9.3 Storage Conditions
The samples shall be properly preserved and stored in approved
containers specified by the individual EPA or state programs, the
Performance Based Method System or NELAP. Samples shall be stored in a
secure area.
5.10 Sample Tracking
The laboratory shall design a system to unequivocally identify all
samples, subsamples and subsequent extracts and/or digestates so that
each aliquot is uniquely identified.
The laboratory shall assign a unique identification (ID) code to
each sample container received in the laboratory. Multiple aliquots of
a sample that have been received for different analytical tests (e.g.,
nutrients, metals, VOCs, etc.) must each be assigned a different ID
code.
The use of container shape, size or other physical characteristic
(e.g., amber glass, purple top, etc.) is not an acceptable means of
identifying the sample.
This laboratory code shall maintain an unequivocal link with the
unique field ID assigned to each container.
The laboratory ID number shall be placed on the sample container as
a durable label.
The laboratory ID number shall be entered into the laboratory
records (see 5.11.2) and shall be the link that associates the sample
with related laboratory activities (i.e., sample preparation,
calibration, etc.).
In cases where the sample collector and analyst are the same
individual or the laboratory preassigns numbers to sample containers,
the laboratory ID number may be the same as the field ID number.
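The following is an illustrative Python sketch of a sample tracking
record that links the unique laboratory ID to the field ID and to
related laboratory activities; all field names and example values are
assumptions:
from dataclasses import dataclass, field

@dataclass
class SampleContainerRecord:
    """One record per container; the laboratory ID is the unique key."""
    laboratory_id: str       # unique ID assigned on receipt; also on a durable label
    field_id: str            # unequivocal link to the field ID of the container
    analytical_test: str     # e.g. nutrients, metals, VOCs
    date_received: str
    linked_activities: list = field(default_factory=list)  # preparation, calibration, etc.

# Two aliquots of the same field sample, received for different tests,
# each receive their own laboratory ID code.
aliquots = [
    SampleContainerRecord("LAB-94-0001", "FIELD-07A", "metals", "1994-12-01"),
    SampleContainerRecord("LAB-94-0002", "FIELD-07A", "VOCs", "1994-12-01"),
]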
5.11 Record Keeping, Data Review and Reporting
The laboratory shall implement protocols that will produce
unequivocal, accurate records which document all laboratory activities
associated with sample receipt, preparation, analysis, review and
reporting.
There are two levels of record keeping: (1) Sample custody or
tracking and (2) legal or evidentiary chain of custody. All essential
requirements for sample custody are outlined in Sections 5.11.1.1, and
5.11.1.2. The basic requirements for legal chain of custody (if
required or implemented) are specified in Section 5.11.3.
5.11.1 Sample Custody Requirements
5.11.1.1 Essential Documentation
(a) Sample Handling--Sample custody shall document all procedures
and activities to which a sample is subjected. These activities shall
include, but are not limited to:
--Sample preservation including appropriate sample container and
compliance with holding time;
--Sample identification, receipt, acceptance or rejection and log-
in;
--Sample storage and tracking (includes shipping receipts,
transmittal forms, and internal routing and assignment records);
--Sample preparation (includes cleanup and separation protocols, ID
#s, volumes, weights, instrument printouts, meter readings,
calculations, reagents, etc.);
--Sample analysis;
--Standard and reagent origin, receipt, preparation, and use;
--Equipment receipt, use, specification, operating conditions and
preventative maintenance;
--Calibration criteria, frequency and acceptance criteria;
--Data and statistical calculations, review, confirmation,
interpretation, assessment and reporting conventions;
--Method performance criteria including expected quality control
requirements;
--Quality control protocols and assessment;
--Electronic data security, software documentation and verification,
software and hardware audits, backups, and records of any changes to
automated data entries;
--All automated sample handling systems;
--Records storage and retention; and
--Sample disposal including the date of sample or subsample disposal
and name of the responsible person.
(b) Laboratory Support Activities--In addition to documenting all
the above-mentioned activities, the following shall be retained:
--All original raw data, whether hard copy or electronic, for
calibrations, samples and quality control measures, including
analysts' work sheets and data output records (chromatograms, strip
charts, and other instrument response readout records);
--Copies of final reports;
--Archived standard operating procedures;
--Correspondence relating to laboratory activities for a specific
project;
--All corrective action reports, audits and audit responses;
--Performance evaluation results and raw data; and
--Data review and cross checking.
(c) Analytical Records--The essential information to be recorded on
all raw data associated with analysis (e.g., strip charts, tabular
printouts, computer data files, analytical notebooks, run logs, etc.)
shall include:
--Laboratory sample ID number;
--Date of analysis;
--Instrumentation identification and instrument operating
conditions/parameters (or reference to such data);
--Analysis type;
--All calculations (automated and manual); and
--Analyst's or operator's initials/signature.
5.11.1.2 Record Keeping System and Design
Each organization shall design and maintain a record keeping system
that is succinct, self-explanatory and efficient and allows historical
reconstruction of all laboratory activities that produced the resultant
sample analytical data. The history of the sample must be readily
understood through the documentation. This shall include
interlaboratory transfers of samples and/or extracts.
All information relating to the laboratory facilities, equipment,
analytical methods, and related laboratory activities (e.g., sample
receipt, sample preparation, data review, etc.) shall be documented.
All documentation shall be maintained to reflect current operating
protocols.
The organization should establish essential personnel
qualifications and shall maintain records on personnel training.
Organizations shall maintain standard operating procedures (SOPs)
that accurately reflect all phases of current laboratory activities
including assessing data integrity.
(a) These documents may be specific sample preparation or
analytical references, (e.g., analytical method numbers), equipment
manuals (provided by the manufacturer), or internally written
documents.
(b) The SOPs shall also include a list of analytical methods that
are used by the laboratory. This list shall be indexed according to
NELAC accreditation categories (e.g., drinking water, solid waste,
etc.).
(c) In cases where minor modifications to accepted methods have
been made (e.g., change in type of column, change in operating
conditions, etc.), or where the referenced method is ambiguous or
provides insufficient detail (e.g., reagent purity, reagent
concentration, etc.), these changes or clarifications shall be
documented as an appendix to the referenced method.
Copies of the above-mentioned SOPs shall be readily accessible in
the workplace.
The record keeping system shall facilitate the retrieval of all
working files and archived records for inspection and verification
purposes.
All documentation entries shall be signed or initialed by
responsible staff. The reason for the signature or initials shall be
clearly indicated in the records (e.g., sampled by, prepared by,
reviewed by, etc.).
Entries into all records shall be legibly written in indelible ink.
Entries in records shall not be obliterated by erasures or
markings. All corrections to record-keeping errors shall be made by one
line marked through the error. The individual making the correction
shall sign (or initial) and date the correction. These criteria also
shall apply to electronically maintained records.
5.11.1.3 Laboratory Report Format and Contents
The laboratory shall report results accurately, clearly,
unambiguously and objectively and in a manner that is understandable to
the recipient. The basic information to be included in the report
includes the following:
(a) Report title (e.g., ``Certificate of Results'', ``Laboratory
Results'', etc.) with the name, address and phone number of the
laboratory (or laboratories, see subcontracted laboratories below);
(b) Name and address of client and/or project;
(c) Description and identification of sample (including client ID
number);
(d) Date of sample receipt, sample collection and sample analysis;
(e) Time of sample preparation and/or analysis if the required
holding time for either activity is less than or equal to 48 hours;
(f) Test method or unambiguous description of any non-standard
method;
(g) Test results with any failures or deviations from methods or
quality control criteria identified (i.e., data qualifiers);
(h) Signature and title of individual(s) accepting responsibility
for the content of the report and date of issue; and
(i) Clear identification of any results that were performed by a
subcontracted laboratory.
If appropriate, the laboratory shall certify that the test results
meet all requirements of NELAP or provide reasons and/or justification
if they do not.
Once issued, the laboratory report shall remain unchanged. Any
correction, addition and/or deletion to the original report shall be
supported by supplementary documentation, shall clearly identify its
purpose, and shall contain all reporting requirements specified above.
5.11.1.4 Records Management and Storage
(a) All records of an organization that are pertinent to a
specified project shall be retained for a minimum of five years unless
otherwise designated for a longer period of time in another regulation.
The records specified in 5.11.1.1 and 5.11.1.2 above shall be retained.
(b) Records that are stored or generated by computers or personal
computers (PCs) shall have hard copy and write-protected backup copies.
(c) When a procedure or document (e.g., initial calibration
records, SOPs, etc.) becomes obsolete or is replaced, the records shall
clearly indicate the time period (or sample sets, if applicable) during
which the procedure or document was in force.
(d) All access to archived information shall be documented.
(e) If an organization goes out of business or changes ownership
before the time period for records retention has expired, all
documentation shall be transferred in whole to the archives of the
sponsor (client) of the work or to the new owner as described in
Section 4.1.8.
5.11.2 Sample Custody Tracking and Data Documentation for Laboratory
Operations
5.11.2.1 Sample Receipt, Log In and Storage
All records pertinent to sample receipt, log in and storage shall
be maintained. In addition, the laboratory shall:
(a) Retain all correspondence and/or official conversations
concerning the final disposition of rejected samples;
(b) Fully document any decision to proceed with the analysis of
compromised samples:
--The condition of these samples shall be noted in all documentation
associated with the sample.
--The analysis data shall be appropriately qualified as
``estimated'' on all internal documentation and on the final report.
(c) Utilize a permanent, chronological log to document receipt of
all sample containers. The following information must be recorded in
the laboratory sequential log:
--Date of laboratory receipt of sample;
--Sample collection date;
--Unique laboratory ID code (see 5.10 above);
--Field ID code supplied by sample submitter;
--Requested analyses, including approved method number, if
applicable;
--Signature or initials of data logger;
--Comments resulting from inspection for sample acceptance or
rejection; and
--Sampling kit code (if applicable).
(d) All documentation that is transmitted to the laboratory by the
sample transmitter shall be retained (e.g., memos, transmittal forms,
etc.).
5.11.2.2 Intralaboratory Distribution of Samples for Analysis
(a) The laboratory shall utilize a proactive procedure to ensure
that all samples and subsamples are analyzed within the maximum
allowable holding times.
(b) All distribution of samples and subsamples for preparation and
analysis shall be documented as to task assignment and analysis date
deadline.
5.11.3 Legal or Evidentiary Custody Procedures
The use of legal chain of custody (COC) protocols is strongly
recommended and may be required by some state or federal programs. In
addition to the records listed in 5.11.1.1 and 5.11.1.2, the following
protocols shall be incorporated if legal COC is implemented by the
organization.
5.11.3.1 Basic Requirements
The chain of custody records shall establish an intact, contiguous
record of the physical possession, storage and disposal of sample
containers, collected samples, sample aliquots, and sample extracts or
digestates. For ease of discussion, the above-mentioned items shall be
referred to as samples:
(a) The COC records shall account for all time periods associated
with the samples.
(b) The COC records shall include signatures of all individuals who
were involved with physically handling the samples.
(c) In order to simplify record-keeping, the number of people who
physically handle the sample should be minimized.
(d) The COC records are not limited to a single form or document.
However, organizations should attempt to limit the number of documents
that would be required to establish COC.
(e) Legal chain of custody shall begin at the point established by
the federal or state oversight program. This may begin at the point
that cleaned sample containers are provided by the laboratory or the
time sample collection occurs.
(f) The COC forms shall remain with the samples during transport or
shipment.
5.11.3.2 Required Information in Custody Records
In addition to the information specified in 5.11.1.1 and 5.11.1.2,
tracking records shall include, by direct entry or linkage to other
records:
(a) Time of day and calendar date of each transfer or handling
procedure;
(b) Signatures of all personnel who physically handle the
sample(s);
(c) All information necessary to produce unequivocal, accurate
records that document the laboratory activities associated with sample
receipt, preparation, analysis and reporting; and
(d) Common carrier documents.
5.11.3.3 Controlled Access to Samples
Access to all legal samples and subsamples shall be controlled and
documented.
5.11.3.4 Transfer of Samples to Another Party
Transfer of samples, subsamples, digestates or extracts to another
party is subject to all of the requirements for legal chain of
custody.
5.11.3.5 Sample Disposal
(a) If the sample is part of litigation, disposal of the physical
sample shall occur only with the concurrence of the affected legal
authority, sample data user and/or submitter of the sample.
(b) All conditions of disposal and all correspondence between all
parties concerning the final disposition of the physical sample shall
be recorded and retained.
(c) Records shall indicate the date of disposal, the nature of
disposal (e.g., sample depleted, sample disposed of in a hazardous
waste facility, sample returned to client, etc.), and the name of the
individual who performed the task.
5.12 Corrective Action Policy and Procedures
The laboratory shall develop contingencies for unacceptable quality
control results. These policies shall be specified in written SOPs and
shall include the following:
(a) Identification of such problems, and the anticipated and/or
recommended corrective actions to correct and/or eliminate future
occurrences;
(b) Requirement for written records that document the problem, the
corrective measures, and the final outcome; and
(c) An established policy requiring that a laboratory does not
accept samples on a routine basis without the capability of meeting the
maximum holding times.
Appendix A
Definitions
Accreditation: The process by which an agency or organization
evaluates and recognizes a program of study or an institution as
meeting certain predetermined qualifications or standards, thereby
accrediting the laboratory. In the context of the National
Environmental Laboratory Accreditation Program (NELAP), this process
is a voluntary one.
Accreditation Authority Review Board: A five-member group
appointed by EPA from the states, EPA, and other federal agencies to
review the process and procedures used by EPA to approve state and
federal laboratories and accreditation authorities.
Accrediting Authority: The agency having responsibility and
accountability for environmental laboratory accreditation and who
grants accreditation. For the purposes of NELAC, this is EPA, other
federal agencies, or the state.
Accrediting Body: The organization that actually executes the
accreditation process, i.e., receives and reviews accreditation
applications, reviews QA documents, reviews performance evaluation
testing results, surveys the site, etc., whether EPA, the state, or
contracted private party.
Accuracy: The degree of agreement between an observed value and
an accepted reference value. Accuracy includes a combination of
random error (precision) and systematic error (bias) components
which are due to sampling and analytical operations; a data quality
indicator. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Administrative Committee: A committee of the National
Environmental Laboratory Accreditation Conference involved with the
internal business affairs of the conference. Currently, these are
the Conference Management and Funding, Nominating, Membership,
Auditing, Liaison, and Contributor Committees.
Applicant: Any environmental laboratory seeking accreditation.
Assessment: The physical process of inspecting, testing and
documenting results from a laboratory for purposes of accreditation.
Assessment Team: An individual or group of individuals who
perform the on-site assessment of a laboratory.
Board of Directors: The guiding body of NELAC composed of the
Director, Executive Secretary, Chair, Chair-elect, Past Chair,
Treasurer, and six at-large members.
Calibration Standard: A solution prepared from the primary
dilution standard solution or stock standard solutions and the
internal standards and surrogate analytes. The Calibration solutions
are used to calibrate the instrument response with respect to
analyte concentration. (Glossary of Quality Assurance Terms, QAMS,
8/31/92).
CNAEL: The Committee on National Accreditation of Environmental
Laboratories chartered by EPA in 1991 to assess the need,
feasibility, and practicability of a national environmental
laboratory accreditation program. Dissolved after its report to EPA
in September 1992.
Compromised Samples: Those samples which were improperly
sampled, are accompanied by insufficient documentation (chain of
custody and other sample records and/or labels), were improperly
preserved or placed in improper containers, or for which the holding
time has been exceeded. Under normal conditions compromised samples
are not analyzed. If emergency situations require analysis, the
results must be appropriately qualified.
Contracted Organization: A private accrediting body meeting the
standards for accreditation of environmental laboratories and
employed by an accrediting authority to perform certain accrediting
functions, e.g. on-site audits.
Contributors: Any person or group having an interest in
environmental laboratory accreditation other than a state or federal
official involved in environmental laboratory affairs, who may
participate in the deliberations of the conference by presenting
papers, debating issues, etc. but without vote or formal membership
on a committee.
Deficiency Report: A report generated by the Inspector who is a
state employee or authorized agent of the state in response to
deficiencies noted in the course of a laboratory assessment,
inspection or performance evaluation sample analysis result.
Denial: The refusal to grant approval to all or part of a
laboratory's initial or subsequent application for certification by
the National Environmental Laboratory Accreditation Program.
Environmental Laboratory Advisory Board: The name of the Federal
Advisory Committee Act body chartered by EPA and composed of special
interest groups or persons to interact with the Board of Directors.
Equipment Blank (Sample Equipment Blank): A clean sample (e.g.,
distilled water) that is collected in a sample container with the
sample-collection device and returned to the laboratory as a sample.
Sampling equipment blanks are used to check the cleanliness of
sampling devices. (Glossary of Quality Assurance Terms, QAMS, 8/31/
92).
Failure: Failing one or more of the criteria examined in
announced and unannounced laboratory assessments, which include:
competence of staff, qualifications of staff and supervisors, working
conditions, equipment, supplies, supervision, methods used, quality
assurance/quality control procedures, recordkeeping, and compliance
with good laboratory practices.
Field Blank: A clean sample (e.g., distilled water), carried to
the sampling site, exposed to sampling conditions (e.g., bottle caps
removed, preservatives added) and returned to the laboratory and
treated as an environmental sample. Field blanks are used to check
for analytical artifacts and/or background introduced by sampling
and analytical procedures. (Glossary of Quality Assurance Terms,
QAMS, 8/31/92).
Holding Times (Maximum Allowable Holding Times): The maximum
times that samples may be held prior to analysis and still be
considered valid. (40 CFR Part 136).
Initial Demonstration of Analytical Capability: Procedure to
establish the ability to generate acceptable accuracy and precision
which is included in many of the EPA's analytical methods. In
general the procedure includes the addition of a specified
concentration of each analyte (using a QC check sample) to each of
four separate aliquots of laboratory pure water. These are carried
through the entire analytical procedure and the percentage recovery
and the standard deviation are determined and compared to specified
limits. (40 CFR Part 136).
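The following is an illustrative Python sketch of the recovery and
standard deviation calculation described above; the spiking level, the
results and any acceptance limits are assumptions taken from the
method being demonstrated:
import statistics

def initial_demonstration(measured_results, spiked_concentration):
    """Percent recovery for each of four spiked aliquots of laboratory pure
    water, with the mean recovery and standard deviation that are compared
    to the method-specified limits."""
    recoveries = [m / spiked_concentration * 100.0 for m in measured_results]
    return recoveries, statistics.mean(recoveries), statistics.stdev(recoveries)

# Example: four aliquots spiked at 20 ug/L.
recoveries, mean_recovery, std_dev = initial_demonstration([19.2, 20.5, 18.8, 21.1], 20.0)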
Inspection Report: The written results listing specific
deficiencies and levels of performance that result from a laboratory
assessment. This is a public record document prepared by the
inspector.
Inspector: The authorized representative of the appropriate
department within a state who directly conducts the laboratory
assessment of inspection. This representative may be a third party
contractor to the state who inspects and acts under the authority of
the state. All actions and requests made by such a third party are
made under the regulatory authority of the state.
Instrument Blank: A clean sample (e.g., distilled water)
processed through the instrumental steps of the measurement process;
used to determine instrument contamination. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Laboratory: A facility engaged in the collection or analysis and
reporting of environmental samples, whether fixed or mobile.
Laboratory Control Sample (quality control sample): An
uncontaminated sample matrix spiked with known amounts of analytes
from a source independent of the calibration standards. It is
generally used to establish intra-laboratory or analyst specific
precision and bias or to assess the performance of all or a portion
of the measurement system. (Glossary of Quality Assurance Terms,
QAMS, 8/31/92).
Legal Chain of Custody (COC): An unbroken trail of
accountability that ensures the physical security of samples, data
and records. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Local: An individual state.
Manager: The individual designated as being responsible for the
overall operation, all personnel, and the physical plant of the
environmental laboratory. A supervisor may report to the manager. In
some cases, the supervisor and the manager may be the same
individual.
Matrix Spike (spiked sample, fortified sample): Prepared by
adding a known mass of target analyte to a specified amount of
matrix sample for which an independent estimate of target analyte
concentration is available. Matrix spikes are used, for example, to
determine the effect of the matrix on a method's recovery
efficiency. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Matrix Spike Duplicate (spiked sample/fortified sample
duplicate): A second replicate matrix spike is prepared and analyzed
to obtain a measure of the precision of the recovery for each
analyte. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Member (or active member): A state or federal official engaged
in setting regulatory standards or accreditation of environmental
laboratories, eligible for committee assignment and having voting
privileges in the NELAC.
Method Blank: A clean sample processed simultaneously with and
under the same conditions as samples containing an analyte of
interest through all steps of the analytical procedures. (Glossary
of Quality Assurance Terms, QAMS, 8/31/92).
Method Detection Limit (Analytical Detection Limit): The minimum
concentration of a substance (an analyte) that can be measured and
reported with 99% confidence that the analyte concentration is
greater than zero and is determined from analysis of a sample in a
given matrix containing the analyte. (40 CFR Part 136 Appendix B).
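As an illustration only, 40 CFR Part 136, Appendix B derives the MDL
from replicate analyses of a low-level spiked sample as the Student's
t value (n-1 degrees of freedom, 99% confidence) multiplied by the
standard deviation of the replicates; the results below are
hypothetical:
import statistics

replicate_results = [1.9, 2.3, 2.1, 1.8, 2.4, 2.0, 2.2]   # seven replicates, ug/L
t_99 = 3.143        # Student's t for 6 degrees of freedom at the 99% level
mdl = t_99 * statistics.stdev(replicate_results)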
National Database: A database run by the Federal Government or
its authorized agent that has public information readily available
to the states participating in the NELAP program. It would include
information regarding the current accreditation and accreditation
process and status on a laboratory by laboratory basis.
NELAC: National Environmental Laboratory Accreditation
Conference. A voluntary organization of state and federal
environmental officials and interest groups whose primary purpose is
to establish mutually acceptable standards for accrediting
environmental laboratories. A subset of NELAP.
NELAP: The overall National Environmental Laboratory
Accreditation Program of which NELAC is a part.
On-site: The laboratory facility, whether fixed or mobile, in
the context of actually visiting the facility for evaluation or
review of its program.
Reagent Blank (method reagent blank): A sample consisting of
reagent(s), without the target analyte or sample matrix, introduced
into the analytical procedure at the appropriate point and carried
through all subsequent steps to determine the contribution of the
reagents and of the involved analytical steps. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
PBM: Performance Based Methods.
Participating Member: A state or federal agency identified by
EPA as having met all the standards for an accrediting authority to
accredit environmental laboratories.
Performance Evaluation Program: The aggregate of providing
rigorously controlled and standardized environmental samples to a
laboratory for analysis, reporting of results, statistical
evaluation of the results in comparison to peer laboratories and the
collective demographics and results summary of all participating
laboratories.
Performance Evaluation Sample (PE): A sample, the composition of
which is unknown to the analyst and is provided to test whether the
analyst/laboratory can produce analytical results within specified
performance limits. (Glossary of Quality Assurance Terms, QAMS, 8/
31/92).
Precision: The degree to which a set of observations or
measurements of the same property, usually obtained under similar
conditions, conform to themselves; a data quality indicator.
Precision is usually expressed as standard deviation, variance or
range, in either absolute or relative terms. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Preservation: Refrigeration and/or reagents added at the time of
sample collection to maintain the chemical and/or biological
integrity of the sample.
Pure Reagent Water: Water in which an interferant is not
observed at the MDL of the parameters of interest. (40 CFR Part 136)
Quality Assurance Plan: A written description of the
laboratory's quality assurance activities.
Quality Assurance: An integrated system of activities involving
planning, quality control, quality assessment, reporting and quality
improvement to ensure that a product or service meets defined
standards of quality with a stated level of confidence. (Glossary of
Quality Assurance Terms, QAMS, 8/31/92).
Quality Control: The overall system of technical activities
whose purpose is to measure and control the quality of a product or
service so that it meets the needs of users. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Quality Control Sample: An uncontaminated sample matrix spiked
with known amounts of analytes from a source independent from the
calibration standards. It is generally used to establish intra-
laboratory or analyst specific precision and bias or to assess the
performance of all or a portion of the measurement system. (Glossary
of Quality Assurance Terms, QAMS, 8/31/92).
Quality Control Validation Studies: The formal study of a
sampling and/or analytical method, conducted with replicate,
representative matrix samples, following a specific study protocol
and utilizing a specific written method, by a minimum of seven
laboratories, for the purpose of estimating inter-laboratory
precision, bias and analytical interferences. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Sample Container: The specific requirements for sample
containers are intended to assure representative samples and sample
integrity, e.g., septa vials, glass or plastic.
Sample Duplicate: Two samples taken from and representative of
the same population and carried through all steps of the sampling
and analytical procedures in an identical manner. Duplicate samples
are used to assess variance of the total method including sampling
and analysis. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Standard Operating Procedures (SOPs): A written document which
details the method of an operation, analysis or action whose
techniques and procedures are thoroughly prescribed and which is
accepted as the method for performing certain routine or repetitive
tasks. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Standing Committee: A committee of NELAC involved with
establishing the technical standards for accreditation of
environmental laboratories. Currently, these are the Quality
Systems, Performance Evaluation Testing, On-site Assessment,
Accreditation Process, Regulatory, Accrediting Authority, and
Program Structure Committees.
Supervisor: The individual designated as being responsible for a
particular area or category of scientific analysis. This
responsibility includes direct day-to-day supervision of technical
employees, supply and instrument adequacy and upkeep, quality
assurance/quality control duties and ascertaining that technical
employees have the required balance of education, training and
experience to perform the required analyses.
Surrogate: A substance with properties that mimic the analyte of
interest. It is unlikely to be found in environmental samples and is
added to them for quality control purposes. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Technical Employee: The designated individual who performs the
``hands-on'' analytical methods and associated techniques and who is
the one responsible for applying required Good Laboratory Practice
notices and other pertinent Quality Controls to meet the required
level of quality.
Trip Blank: A clean sample of matrix that is carried to the
sampling site and transported to the laboratory for analysis without
having been exposed to sampling procedures. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Appendix B
Bibliography
References for Water, Sediments, Soils, Sludges, Hazardous Wastes and
Biological Analyses
These methods or methods specified by the accreditation
authority shall be used when analyzing samples.
Drinking Water
(1) 40 CFR Part 141, National Primary Drinking Water
Regulations, July 1, 1992, Subpart C and Subpart I.
(2) ``Methods for the Determination of Organic Compounds in
Drinking Water,'' EPA 600/4-88-039, December 1988.
(3) ``Methods for Chemical Analysis of Water and Wastes,'' EPA
600/4-79-020, revised March 1983.
(4) ``Manual for Certification of Laboratories Analyzing
Drinking Water, Criteria and Standards Quality Assurance'' EPA 570/
9-90-008, April 1990 and the first update (Change I) EPA 570/9-90-
008a, October 1991.
(5) 40 CFR Part 136, Guidelines Establishing Test Procedures for
the Analysis of Pollutants Under the Clean Water Act, July 1, 1991,
Appendix A.
(6) Standard Methods for the Examination of Water and
Wastewater, APHA-AWWA-WPCF, 18th Edition, 1992.
(7) ``Guidance on the Evaluation of Safe Drinking Water Act
Compliance Monitoring Results from Performance Based Methods'',
September 30, 1994, Second draft.
Surface Water, Groundwater, and Wastewater Municipal/Industrial
Effluents
(1) 40 CFR Part 136, Guidelines Establishing Test Procedures for
the Analysis of Pollutants Under the Clean Water Act, Tables IA, IB,
IC, ID and IE, as published in the Federal Register, Vol. 65, No.
165, pp. 50758-50770, October 8, 1991.
(2) Methods for Chemical Analysis of Water and Wastes, EPA 600/
4-79-020, revised March 1983.
(3) Test Methods for Evaluating Solid Waste, Physical/Chemical
Methods, (SW-846), Third edition, 1986, as amended by Updates 1 and
IIA, August 31, 1993.
(4) 40 CFR Part 261, Identification and Listing of Hazardous
Waste, July, 1991, Appendix III (Chemical Analysis Test Methods)
(5) Standard Methods for the Examination of Water and
Wastewater, APHA-AWWA-WPCF, 17th Edition, 1989.
Notes:
(1) Laboratories analyzing samples in support of NPDES Permits
are limited to methods specified in Reference 1 above or those
specifically approved for use by EPA.
Soils and Sediments, Municipal and Industrial Sludges (Residuals) and
Solid and Hazardous Wastes
(1) ``Test Methods for Evaluating Solid Waste, Physical/Chemical
Methods'', Third Edition (EPA SW-846), 1986, as amended by Final
Updates I and II, November 1990 and 1991.
(2) ``Procedures for Handling and Chemical Analysis of Sediments
and Water Samples'' EPA/Corps of Engineers, EPA/CE-81-1, 1981.
(3) ``USEPA Contract Laboratory Program Statement of Work for
Inorganic Analysis''\*\, ILMO 2.1 (September 1991).
(4) ``USEPA Contract Laboratory Program Statement of Work
for Organic Analysis''\*\, ILMO 2.0 (July 1990) and ILMO 2.1 (September
1991).
---------------------------------------------------------------------------
\*\Methods from these references shall be used by laboratories
participating in the EPA Contract Laboratory Program to perform
analyses for Superfund (CERCLA) site investigations.
---------------------------------------------------------------------------
(5) ``POTW Sludge Sampling and Analysis Guidance Document''
USEPA Permits Division, August 1989.
Air
To be added as document goes through review.
Biological
Microbiological. (1) Drinking Water Analyses--40 CFR Part 141,
Subpart C (Monitoring and Analytical Requirements, section 141.21)
July 1, 1991.
(2) Water and Wastewater Analyses--40 CFR Part 136, Table IA as
published in the Federal Register, Vol. 65, No. 165, pp. 50758-
50770, October 8, 1991.
(3) ``Microbiological Methods for Monitoring the Environment''
EPA-600/8-78-017, 1978.
(4) Standard Methods for the Examination of Water and
Wastewater, APHA-AWWA-WPCF, 17th Edition, 1989.
Bioassay. (1) ``Methods for Measuring the Acute Toxicity of
Effluents and Receiving Waters to Freshwater and Marine Organisms
(Fourth Edition)'' EPA 600/4-90-027, September, 1991.
(2) ``Short-Term Methods for Estimating the Chronic Toxicity of
Effluents and Receiving Waters to Freshwater Organisms (Third
Edition)'' EPA 600/4-91-002, 1991.
(3) ``Short-Term Methods for Estimating the Chronic Toxicity of
Effluents and Receiving Waters to Marine and Estuarine Organisms
(Second Edition)'' EPA 600/4-91/003, 1991.
Macrobenthic identification and enumeration. (1)
``Macroinvertebrate Field and Laboratory Methods for Evaluating the
Biological Integrity of Surface Waters'', ORD, Washington, D.C.,
November, 1990.
(2) Standard Methods for the Examination of Water and
Wastewater, Part 10500, 17th Edition, APHA, 1989.
Radiochemistry
(1) 40 CFR Part 141.25, ``Analytical Methods for
Radioactivity'', July 1, 1992 edition.
(2) Analytical Methods for Radiochemistry Analyses, EPA 600/4-
80-032 and EPA 600/5-84-006.
[FR Doc. 94-29573 Filed 12-1-94; 8:45 am]
BILLING CODE 6560-50-P