06-8478. Revisions to Ambient Air Monitoring Regulations  


    AGENCY:

    Environmental Protection Agency (EPA).

    ACTION:

    Final rule.

    SUMMARY:

    The EPA is issuing final amendments to the ambient air monitoring requirements for criteria pollutants. The purpose of the amendments is to enhance ambient air quality monitoring to better serve current and future air quality management and research needs. The final amendments establish limited ambient air monitoring requirements for thoracic coarse particles in the size range of PM10−2.5 to support continued research into these particles' distribution, sources, and health effects. The ambient air monitoring amendments also require each State to operate one to three monitoring stations that take an integrated, multipollutant approach to ambient air monitoring. In addition, the final amendments modify the general monitoring network design requirements for minimum numbers of ambient air monitors to focus on populated areas with air quality problems and to reduce significantly the requirements for criteria pollutant monitors that have measured ambient air concentrations well below the applicable National Ambient Air Quality Standards. These amendments also revise certain provisions regarding monitoring network descriptions and periodic assessments, quality assurance, and data certifications. A number of the amendments relate specifically to PM2.5, revising the requirements for reference and equivalent method determinations (including specifications and test procedures) for fine particle monitors.

    DATES:

    This final rule is effective on December 18, 2006.

    ADDRESSES:

    The EPA has established a docket for this action under Docket ID No. EPA-HQ-OAR-2004-0018. All documents in the docket are listed in the http://www.regulations.gov index. Although listed in the index, some information is not publicly available, e.g., confidential business information or other information whose disclosure is restricted by statute. Certain other material, such as copyrighted material, will be publicly available only in hard copy. Publicly available docket materials are available either electronically in http://www.regulations.gov or in hard copy at the Revisions to the Ambient Air Monitoring Regulations Docket, EPA/DC, EPA West, Room B102, 1301 Constitution Ave., NW., Washington, DC. The Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays. The telephone number for the Public Reading Room is (202) 566-1744, and the telephone number for the Air Docket is (202) 566-1742.

    Note:

The EPA Docket Center suffered damage due to flooding during the last week of June 2006. The Docket Center is continuing to operate. However, during the cleanup, there will be temporary changes to Docket Center telephone numbers, addresses, and hours of operation for people who wish to visit the Public Reading Room to view documents. Consult EPA's Federal Register notice at 71 FR 38147 (July 5, 2006) or the EPA Web site at http://www.epa.gov/epahome/dockets.htm for current information on docket status, locations, and telephone numbers.


    FOR FURTHER INFORMATION CONTACT:

    For general questions concerning the final amendments, please contact Mr. Lewis Weinstock, U.S. EPA, Office of Air Quality Planning and Standards, Air Quality Assessment Division, Ambient Air Monitoring Group (C304-06), Research Triangle Park, North Carolina 27711; telephone number: (919) 541-3661; fax number: (919) 541-1903; e-mail address: weinstock.lewis@epa.gov. For technical questions, please contact Mr. Tim Hanley, U.S. EPA, Office of Air Quality Planning and Standards, Air Quality Assessment Division, Ambient Air Monitoring Group (C304-06), Research Triangle Park, North Carolina 27711; telephone number: (919) 541-4417; fax number: (919) 541-1903; e-mail address: hanley.tim@epa.gov.


    SUPPLEMENTARY INFORMATION:

    I. General Information

    A. Does this action apply to me?

    Categories and entities potentially regulated by this action include:

Category | NAICS code 1 | Examples of regulated entities
Industry | 334513, 541380 | Manufacturer, supplier, distributor, or vendor of ambient air monitoring instruments; analytical laboratories or other monitoring organizations that elect to submit an application for a reference or equivalent method determination under 40 CFR part 53.
Federal government | 924110 | Federal agencies (that conduct ambient air monitoring similar to that conducted by States under 40 CFR part 58 and that wish EPA to use their monitoring data in the same manner as State data) or that elect to submit an application for a reference or equivalent method determination under 40 CFR part 53.
State/territorial/local/tribal government | 924110 | State, territorial, and local air quality management programs that are responsible for ambient air monitoring under 40 CFR part 58, that elect to submit an application for a reference or equivalent method determination under 40 CFR part 53, or that seek approval of a regional method under 40 CFR part 58, appendix C. The proposal also may affect Tribes that conduct ambient air monitoring similar to that conducted by States and that wish EPA to use their monitoring data in the same manner as State monitoring data.
1 North American Industry Classification System.

This table is not intended to be exhaustive, but rather provides a guide for readers regarding entities likely to be regulated by this action. This table lists the types of entities that EPA is now aware could potentially be regulated by this action. Other types of entities not listed in the table could also be regulated. To determine whether your facility or Federal, State, local, or territorial agency is regulated by this action, you should carefully examine the requirements for reference or equivalent method determinations in 40 CFR part 53, subpart A (General Provisions) and the applicability criteria in 40 CFR 51.1 of EPA's requirements for State implementation plans. If you have questions regarding the applicability of this action to a particular entity, consult the person listed in the preceding FOR FURTHER INFORMATION CONTACT section.

    B. Where can I obtain a copy of this action?

In addition to being available in the docket, an electronic copy of this final action will also be available on the Worldwide Web (WWW) through the Technology Transfer Network (TTN). Following the Administrator's signature, a copy of the final amendments will be placed on the TTN's policy and guidance page for newly proposed or promulgated rules at http://www.epa.gov/ttn/oarpg. The TTN provides information and technology exchange in various areas of air pollution control.

    C. Public Comments on Proposed Amendments

    EPA received approximately 20,000 public comments on the proposed amendments to the ambient air monitoring regulations during the 90-day comment period. These comments were submitted to the rulemaking docket and also during public hearings held in Chicago, Illinois; Philadelphia, Pennsylvania; and San Francisco, California (71 FR 8228, February 16, 2006). Public comments on the proposed amendments were submitted by States, local governments, Tribes, and related associations; energy, mining, ranching, and agricultural interests and related associations; vendors, laboratories, and technical consultants; health, environmental, and public interest organizations; and private citizens. The EPA has carefully considered these comments in developing the final amendments. Summaries of these comments and EPA's detailed responses are contained in the Response to Comments document included in the docket.

    D. Judicial Review

    Under section 307(b)(1) of the Clean Air Act (CAA), judicial review of the final amendments is available only by filing a petition for review in the U.S. Court of Appeals for the District of Columbia Circuit by December 18, 2006. Under section 307(d)(7)(B) of the CAA, only an objection to the final amendments that was raised with reasonable specificity during the period for public comment can be raised during judicial review. Moreover, under section 307(b)(2) of the CAA, the requirements established by the final amendments may not be challenged separately in any civil or criminal proceedings brought by EPA to enforce these requirements.

    E. Peer Review

    The EPA sought expert scientific review of the proposed methods, technologies, and approach for ambient air monitoring by the Clean Air Scientific Advisory Committee (CASAC). The CASAC is a Federal advisory committee established to review scientific and technical information and make recommendations to the EPA Administrator on issues related to the air quality criteria and corresponding NAAQS. CASAC formed a National Ambient Air Monitoring Strategy (NAAMS) Subcommittee in 2003 to provide advice for a strategy for the national ambient air monitoring programs. This subcommittee, which operated over a 1-year period, and a new subcommittee on Ambient Air Monitoring and Methods (AAMM), formed in 2004, provided the input for CASAC on its consultations, advisories, and peer-reviewed recommendations to the EPA Administrator.

In July 2003, the CASAC NAAMS Subcommittee held a public meeting to review EPA's draft National Ambient Air Monitoring Strategy document (dated September 6, 2002), which contained technical information underlying planned changes to the ambient air monitoring networks. The EPA continued to consult with the CASAC AAMM Subcommittee throughout the development of the proposed amendments. Public meetings were held in July 2004, December 2004, and September 2005 to discuss the CASAC review of nearly 20 documents concerning methods and technology for measurement of particulate matter (PM); data quality objectives for PM monitoring networks and related performance-based standards for approval of equivalent continuous PM monitors; configuration of ambient air monitoring stations; [1] and other technical aspects of the proposed amendments. These documents, along with CASAC review comments and other information, are available at http://www.epa.gov/ttn/amtic/casacinf.html.

    F. How is this document organized?

    The information presented in this preamble is organized as follows:

    I. General Information

    A. Does this action apply to me?

    B. Where can I obtain a copy of this action?

    C. Public Comments on Proposed Amendments

    D. Judicial Review

    E. Peer Review

    F. How is this document organized?

    II. Authority

    III. Overview

    A. Summary of Concurrent Final Action on Revisions to the National Ambient Air Quality Standards for Particulate Matter

    B. Summary of Changes to Ambient Air Monitoring Regulations

    C. Significant Dates for States, Local Governments, Tribes, and Other Stakeholders

    D. Implementation of the Revised Monitoring Requirements

    E. Federal Funding for Ambient Air Monitoring

    IV. Discussion of Regulatory Revisions and Major Comments on Proposed Amendments to 40 CFR Part 53

    A. Overview of Part 53 Regulatory Requirements

    B. Requirements for Candidate Reference Methods for PM10−2.5

C. Requirements for Candidate Equivalent Methods for PM2.5 and PM10−2.5

    D. Other Changes

    V. Discussion of Regulatory Revisions and Major Comments on Proposed Amendments to 40 CFR Part 58

    A. Overview of Part 58 Regulatory Requirements

    B. General Monitoring Requirements

    1. Definitions and Terminology

    2. Annual Monitoring Network Plan and Periodic Network Assessment

    3. Operating Schedules

    4. Monitoring Network Completion for PM10−2.5 and NCore Sites

    5. System Modifications

    6. Annual Air Monitoring Data Certification

    7. Data Submittal

    8. Special Purpose Monitors

    9. Special Considerations for Data Comparisons to the National Ambient Air Quality Standards

    C. Appendix A—Quality Assurance Requirements for State and Local Air Monitoring Stations and Prevention of Significant Deterioration Air Monitoring

    1. General Quality Assurance Requirements

    2. Specific Requirements for PM10−2.5, PM2.5, PM10, and Total Suspended Particulates

    3. Particulate Matter Performance Evaluation Program and National Performance Audit Programs

    4. Revisions to Precision and Bias Statistics

    5. Other Program Updates

    D. Appendix C—Ambient Air Quality Monitoring Methodology

    1. Applicability of Federal Reference Methods and Federal Equivalent Methods

    2. Approved Regional Methods for PM2.5

    E. Appendix D—Network Design Criteria for Ambient Air Quality Monitoring

1. Requirements for Operation of Multipollutant NCore Stations

    2. Requirements for Operation of PM10−2.5 Stations

    3. Requirements for Operation of PM2.5 Stations

    4. Requirements for Operation of PM10 Stations

    5. Requirements for Operation of Carbon Monoxide, Sulfur Dioxide, Nitrogen Dioxide, and Lead Monitoring Sites

    6. Requirements for Operation of Ozone Stations

    7. Requirements for Operation of Photochemical Assessment Monitoring Stations

    F. Appendix E—Probe and Monitoring Path Siting Criteria for Ambient Air Monitoring

    1. Vertical Placement of PM10−2.5 Samplers

    2. Ozone Monitor Setback Requirement from Roads

    G. Sample Retention Requirements

    H. Deletion of Appendices B and F

    VI. Statutory and Executive Order Reviews

    A. Executive Order 12866: Regulatory Planning and Review

    B. Paperwork Reduction Act

    C. Regulatory Flexibility Act

    D. Unfunded Mandates Reform Act

    E. Executive Order 13132: Federalism

    F. Executive Order 13175: Consultation and Coordination With Indian Tribal Governments

    G. Executive Order 13045: Protection of Children From Environmental Health and Safety Risks

    H. Executive Order 12898: Federal Actions to Address Environmental Justice in Minority Populations and Low-Income Populations

    I. Executive Order 13211: Actions That Significantly Affect Energy Supply, Distribution, or Use

J. National Technology Transfer and Advancement Act

    K. Congressional Review Act

    II. Authority

The EPA rules for ambient air monitoring are authorized under sections 103, 110, 301(a), and 319 of the Clean Air Act (CAA). Section 110(a)(2)(B) of the CAA requires that each State implementation plan (SIP) provide for the establishment and operation of devices, methods, systems, and procedures needed to monitor, compile, and analyze data on ambient air quality and for the reporting of air quality data to EPA. Section 103 authorizes, among other things, research and investigations relating to the causes, effects, extent, prevention, and control of air pollution. Section 301(a) of the CAA authorizes EPA to develop regulations needed to carry out EPA's mission and establishes rulemaking requirements. Uniform criteria to be followed when measuring air quality and provisions for daily air pollution index reporting are required by CAA section 319.

    III. Overview

    A. Summary of Concurrent Final Action on Revisions to the National Ambient Air Quality Standards for Particulate Matter

    Elsewhere in this Federal Register, EPA is finalizing revisions to the National Ambient Air Quality Standards (NAAQS) for particulate matter (PM). These revisions were proposed on January 17, 2006 (71 FR 2620). For a detailed explanation of these revisions, see that preamble elsewhere in this Federal Register.

    The EPA is finalizing the PM2.5 NAAQS revisions as proposed. With regard to the primary standards for fine particles (generally referring to particles less than or equal to 2.5 micrometers (μm) in diameter, PM2.5), EPA is revising the level of the 24-hour PM2.5 standard to 35 micrograms per cubic meter (μg/m3), providing increased protection against health effects associated with short-term exposure (including premature mortality and increased hospital admissions and emergency room visits). The EPA is retaining the level of the annual PM2.5 standard at 15 μg/m3, continuing protection against health effects associated with long-term exposure (including premature mortality and development of chronic respiratory disease). The EPA is also finalizing the proposed revisions in the conditions under which spatial averaging of the annual primary PM2.5 NAAQS is permitted, and placing these conditions in appendix N of 40 CFR part 50 rather than in appendix D of 40 CFR part 58.

    With regard to secondary PM standards, EPA is revising the current 24-hour PM2.5 secondary standard by making it identical to the revised 24-hour PM2.5 primary standard, retaining the annual PM2.5 and 24-hour PM10 secondary standards, and revoking the annual PM10 secondary standard. This suite of secondary PM standards is intended to provide protection against PM-related public welfare effects, including visibility impairment, effects on vegetation and ecosystems, and materials damage and soiling.

    The EPA is finalizing the proposed Federal reference method (FRM) for PM2.5. This action in essence codifies certain desirable features that have already been in widespread use as elements of approved equivalent methods or national user modifications.

The EPA is not finalizing the proposed NAAQS for PM10−2.5, for reasons explained in the accompanying preamble to the revisions to the NAAQS. As a result, EPA is not finalizing a number of related provisions (notably those which would have prescribed which monitors could have been used for comparison with that proposed NAAQS) proposed as amendments to 40 CFR part 58. The EPA is, however, finalizing the proposed FRM for PM10−2.5 (see appendix O to 40 CFR part 50). This FRM is based on paired filter-based samplers for PM2.5 and PM10, and it will serve as the standard of reference for measurements of PM10−2.5 concentrations in ambient air. This should provide a basis for approving Federal Equivalent Methods (FEMs) and promote the gathering of scientific data to support future reviews of the PM NAAQS. Because it is a filter-based system, this method can itself be used to provide speciated data. The reference measurement from the PM10−2.5 FRM is also important in the development of alternative PM10−2.5 speciation samplers such as dichotomous samplers. The EPA will issue guidance as soon as possible to ensure a consistent national approach for speciated coarse particle monitors.

In conjunction with the above NAAQS revisions and FRM provisions, and as described below, EPA is finalizing as part of this final monitoring rule certain provisions that support collection of additional high-quality data on ambient concentrations of PM10−2.5. These data should be useful in improving the understanding of PM10−2.5 air quality and in conducting future reviews of the PM NAAQS.

    As explained in the preamble to the NAAQS revisions, EPA is revoking the annual NAAQS for particles generally less than or equal to 10 μm in diameter (PM10). However, EPA is retaining the 24-hour PM10 NAAQS as a standard for short-term exposure to thoracic coarse particles, rather than revoking that standard in all but 15 areas as proposed. This change from the NAAQS revision proposal necessitates that the final monitoring rule restore certain PM10 monitoring provisions that were proposed for removal.

    B. Summary of Changes to Ambient Air Monitoring Regulations

This rule, in most respects, finalizes the proposals put forth in the January 17, 2006, notice of proposed rulemaking (71 FR 2710). This final rule will facilitate monitoring program changes envisioned in the draft National Ambient Air Monitoring Strategy which was fully described in the proposal. These final changes, which apply to the monitoring program for all of the criteria pollutants, will reduce the required scale of monitoring for pollutants for which most areas have reached attainment. The changes are intended to better focus monitoring resources on current air quality challenges. The changes will also allow States and local monitoring agencies more flexibility to design their monitoring programs to reflect local conditions.

    In amendments to 40 CFR part 53 (Reference and Equivalent Methods), this final rule incorporates the proposed criteria for approval of Federal equivalent methods (FEM) for PM2.5, with some modifications to the method testing requirements and approval criteria in response to persuasive public comments. The modifications will require a more robust set of testing conditions and closer performance matching of candidate FEMs to FRMs. The EPA is also finalizing the rule with some strengthening revisions to the proposed criteria for approved regional methods (ARMs) for PM2.5. The new criteria for PM2.5 FEMs and ARMs will facilitate the commercialization and EPA approval of continuous PM2.5 mass monitors, allowing them to be substituted for many of the currently operating filter-based FRMs, which will support additional monitoring objectives and reduce annual monitoring costs.

    In other amendments to 40 CFR part 53, EPA is adopting FEM approval criteria for PM10−2.5, with some revisions from the proposal that will provide for approval and use of methods that can meet multiple monitoring objectives. The new FEM performance criteria for PM10−2.5 will facilitate approval of filter-based methods for direct sampling of PM10−2.5 concentrations that can be chemically speciated using post-sampling laboratory analysis. The FEM criteria are also expected to encourage commercialization of highly time-resolved continuous methods. The EPA is hopeful that the PM2.5 and PM10−2.5 FEM criteria together will result in the approval and commercialization of methods that provide equivalent measurements of PM2.5, PM10, and PM10−2.5 from a single instrument.

In amendments to 40 CFR part 58 (Ambient Air Quality Surveillance), this final rule, as proposed, requires States to establish and operate a network of NCore multipollutant monitoring stations. The EPA intends the NCore network to consist of approximately 75 stations, of which the rule requires between 62 and 71. These stations must be operational by 2011. Most States, as well as the District of Columbia, Puerto Rico, and the Virgin Islands, will be required to operate a single station. California, Florida, Illinois, Michigan, New York, North Carolina, Ohio, Pennsylvania, and Texas will be required to operate two or three NCore stations. For these States, the choice between two and three stations will be part of the development and approval of the NCore monitoring plan that is due by July 1, 2009. The EPA also plans to negotiate with a number of States, local agencies, and/or Tribes to operate additional NCore stations on a voluntary basis, bringing the total number of stations to about 75. By approving some required stations in rural areas and by negotiating for additional voluntary sites in rural areas, EPA expects that about 55 NCore sites will be in urbanized areas and about 20 in rural areas. The rural sites are intended to be located away from any large local emission sources, so that they represent ambient concentrations over an extensive area. The NCore stations must perform the types of pollutant measurements that were proposed, with three exceptions: PM10−2.5 measurements may be made on a 1-in-3 day schedule rather than the proposed daily schedule; NOy [2] measurements may be waived by the EPA Administrator based on certain criteria; and, as explained later in this section, PM10−2.5 chemical speciation will be required in addition to PM10−2.5 mass concentration measurements.

    The EPA estimated that the proposed rule would have required States to operate about 225 PM10−2.5 monitors based on the population and estimated PM10−2.5 concentrations of metropolitan statistical areas (MSAs) with populations of 100,000 or more. In addition, PM10−2.5 monitors were proposed to be required at NCore stations; some monitors likely would have satisfied both of these requirements. Because EPA is not adopting a NAAQS for PM10−2.5, the final monitoring rule does not include the proposed requirement for the broad network of PM10−2.5 monitoring stations in MSAs over 100,000 population. However, the final monitoring rule does require PM10−2.5 monitors at the required NCore multipollutant monitoring stations. The data gathered from these stations should be useful in improving understanding of PM10−2.5 air quality and in conducting future reviews of the PM NAAQS. The EPA anticipates that due to natural variations among the cities and rural areas where the NCore stations will be sited, the NCore PM10−2.5 monitors will represent a range of concentrations and nearby emission source types, and that many but not all will be in well populated locations.

    The EPA is not adopting the proposed population-based and population density-based siting requirements for PM10−2.5 monitors, or any part of the proposed five-part suitability test for PM10−2.5 monitoring sites, which as proposed would have controlled whether PM10−2.5 data from a monitoring site could be compared to the proposed PM10−2.5 NAAQS. These proposed requirements were tied to the establishment of a PM10−2.5 NAAQS with a qualified PM10−2.5 indicator based on a determination of whether ambient mixes of coarse particles are or are not dominated by coarse particle emissions from enumerated types of sources. Since EPA is not adopting this part of the proposal, these issues are now moot. In the absence of a PM10−2.5 NAAQS, our goal nevertheless will be to locate PM10−2.5 monitors in a manner that satisfies an objective of the proposed rule, which was to focus most monitoring resources on population centers.

This final rule contains a requirement for PM10−2.5 speciation to be conducted at NCore multipollutant monitoring stations. The EPA had proposed a requirement for PM10−2.5 speciation in 25 areas, with the areas selected based on having an MSA population over 500,000 and an estimated design value greater than 80 percent of the proposed PM10−2.5 NAAQS. This would have concentrated the PM10−2.5 speciation monitoring in areas that have high populations and high exposures to PM10−2.5. Since EPA is requiring PM10−2.5 monitoring at NCore primarily for scientific purposes, it is more appropriate to have monitoring in a variety of urban and rural locations to increase the diversity of areas for which chemical species data will be available for use in scientific studies. The EPA had already proposed to require chemical speciation for PM2.5 at NCore stations. The collocation of both PM10−2.5 and PM2.5 speciation monitoring at NCore stations is consistent with the multipollutant objectives of the NCore network and will support further research in understanding the chemical composition and sources of PM10, PM10−2.5, and PM2.5 at a variety of urban and rural locations. The EPA will work with States to ensure that PM10−2.5 speciation monitors employ the latest speciation technology to advance the science, so that future regulation can provide more targeted protection against the effects of only those coarse particles and related source emissions that prove to be of concern to public health.

Because the 24-hour PM10 NAAQS is being retained in all parts of the country, this final rule retains the existing minimum monitoring network design requirements for PM10. These longstanding requirements are based on the population of an MSA and its historical PM10 air quality. For any given combination of these two parameters, a range of required monitors is prescribed, with the required number to be determined as part of the annual monitoring plan. The EPA estimates that once States and Regional Administrators have considered how current population data and recent PM10 air quality affect the required number of PM10 monitors in each area, between 200 and 500 FRM/FEM monitors will be required, compared to about 1,200 in operation now. While States may of course choose to continue to operate monitors in excess of the minimum requirements, EPA notes that many PM10 monitors have been recording concentrations well below the PM10 NAAQS and are candidates for discontinuation at a State's initiative. States may choose to retain PM10 monitors that are recording concentrations below the PM10 NAAQS level to support monitoring objectives other than attainment/nonattainment determinations, such as baseline monitoring for prevention of significant deterioration permitting or public information.
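As a purely hypothetical sketch of how a lookup of this kind works, the Python fragment below maps an MSA population and a qualitative air quality band to a prescribed range of required monitors. Every breakpoint and count in it is invented for illustration only; the actual values are in the network design table in appendix D to 40 CFR part 58.

```python
# Hypothetical illustration of the PM10 network-design mechanism:
# a (population, historical air quality) pair selects a prescribed
# range of required monitors. All numbers below are invented; the
# actual table is in 40 CFR part 58, appendix D.
def required_pm10_monitors(population: int, quality: str) -> tuple[int, int]:
    """Return an inclusive (low, high) range of required monitors.

    quality: 'high' (concentrations near or above the NAAQS),
             'medium', or 'low' (well below the NAAQS).
    """
    bands = [
        (1_000_000, {"high": (6, 10), "medium": (4, 8), "low": (2, 4)}),
        (500_000, {"high": (4, 8), "medium": (2, 4), "low": (1, 2)}),
        (100_000, {"high": (2, 4), "medium": (1, 2), "low": (0, 1)}),
    ]
    for threshold, table in bands:
        if population >= threshold:
            return table[quality]
    return (0, 0)

print(required_pm10_monitors(750_000, "medium"))  # (2, 4)
```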

This final rule changes the requirements for the minimum number of monitors in PM2.5 and ozone (O3) monitoring networks. In response to comments, the final rule requires more O3 and PM2.5 monitoring in more polluted areas, and more monitors in combined statistical areas (CSAs), than was proposed. While this final rule, like the pre-existing monitoring rule, requires fewer O3 and PM2.5 monitors than are now operating, EPA does not intend to encourage net reductions in the number of O3 and PM2.5 monitoring sites in the U.S. as a whole. The surplus in the existing networks relative to minimum requirements gives States more flexibility to choose where to apply monitoring resources for O3 and PM2.5. For PM2.5, this final rule requires daily sampling at monitors that have recently been recording the highest concentrations in their area and concentrations very near the 24-hour NAAQS, to avoid the bias in attainment/nonattainment designations that can occur with less frequent sampling. Pursuant to this provision, EPA estimates that about 50 sites now sampling less frequently will be required to change to daily sampling.
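The sampling-frequency concern can be illustrated with a small simulation. The sketch below compares a 98th percentile statistic (the form of the 24-hour PM2.5 standard) computed from a full daily record against the same statistic computed from each possible 1-in-3 day subsample. The lognormal concentration model and all values are hypothetical, and this is not the design-value procedure of 40 CFR part 50, appendix N.

```python
# Illustrative only: how 1-in-3 day sampling adds uncertainty to a
# 98th-percentile statistic relative to daily sampling. The lognormal
# model and its parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3 years of daily 24-hour PM2.5 averages (ug/m3).
daily = rng.lognormal(mean=np.log(25.0), sigma=0.45, size=3 * 365)

full_record = np.percentile(daily, 98)
print(f"daily sampling:    {full_record:.1f} ug/m3")

# Each of the three possible 1-in-3 day schedules sees a different
# subset of peak days, so its percentile estimate can differ.
for start in range(3):
    subsample = daily[start::3]
    print(f"1-in-3, offset {start}: {np.percentile(subsample, 98):.1f} ug/m3")
```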

As proposed, minimum monitoring requirements for carbon monoxide (CO), sulfur dioxide (SO2), and nitrogen dioxide (NO2) are eliminated in this final rule. Minimum requirements for lead (Pb) monitoring stations and Photochemical Assessment Monitoring Stations (PAMS) are reduced to the levels that were proposed. For all five criteria pollutants, however, existing monitoring sites (except those already designated as special purpose monitors) cannot be discontinued without EPA Administrator (for PAMS or NCore stations) or Regional Administrator (for all other types of monitoring) approval. Regional Administrator approval is also required for discontinuation of O3, PM2.5, and PM10 sites even if they are in excess of minimum network design requirements. While the rule requires EPA approval, such approvals should be facilitated where appropriate by rule provisions that clearly establish certain criteria under which discontinuation will be approved. These criteria are the same as those proposed, with four minor changes explained in detail in section V.B.5, System Modifications. These criteria are not exclusive, and monitors not meeting any of the listed criteria may still be approved for discontinuation on a case-by-case basis if discontinuation does not compromise data collection needed for implementation of a NAAQS. Specific monitoring for these pollutants may currently be required in individual SIPs; this monitoring rule does not affect any SIP requirements for such specific monitoring.

Appendix A to this final rule includes most of the proposed revisions to the quality system for ambient air monitoring. In particular, the proposed requirement for States to ensure a program of adequate and independent audits of their monitoring stations is included in this final rule. One way, but not the only way, a State can satisfy this requirement is to agree that EPA will conduct these audits using funds that otherwise would have been awarded to the State as part of its annual air quality management grant. A small number of changes to the proposed quality system requirements reflect public comments on details of the proposed revisions. Also, because the objective of PM10−2.5 monitoring is to better understand PM10−2.5 air quality and to support health effects studies, rather than to provide data for use in nonattainment designations, and because there consequently will be a much smaller network of required PM10−2.5 monitors than proposed, the quality system for PM10−2.5 in this final rule aims to quantify data quality at the national level of aggregation rather than, as had been proposed, at the level of individual monitoring organizations. Another change from the proposal is that a provision has been added allowing the EPA Regional Administrator to waive the usual quality system requirements for special purpose monitors when those requirements are logistically infeasible due to unusual site conditions and are not essential to the monitoring objectives.

The EPA is finalizing the proposed provisions regarding when data from special purpose monitors (SPMs) can be compared to a NAAQS, with minor clarifications. In summary, the final rule provides that if an ozone or PM2.5 SPM operates for only two years or less, EPA will not use data from that monitor to make attainment/nonattainment determinations. This limitation is inherent in the form of these NAAQS, which require three years of data for a determination to be made. For the other NAAQS pollutants, as a policy matter, EPA will not use only two years of data from a SPM to voluntarily redesignate an area to nonattainment. This limitation is possible because, as established in section 107(d)(1) of the Act, the only time EPA is obligated to redesignate areas as attainment or nonattainment is after it promulgates or revises a NAAQS. Under an existing standard, voluntary redesignations are at the Administrator's discretion: EPA has no legal obligation to redesignate an area even if a monitor should register a violation of that standard (see CAA section 107(d)(3)). In particular, in the case of PM10, EPA stated in section VII.B of the preamble to the NAAQS rule (printed in today's Federal Register) that because EPA is retaining the current 24-hour PM10 standards, new nonattainment designations for PM10 will not be required under the provisions of the Clean Air Act. The same is true for CO, NO2, SO2, and Pb. However, all valid data from a SPM will be considered in determining if a previously designated nonattainment area has subsequently attained the NAAQS. See also section V.B.8 below.

This final rule advances to May 1 the annual date by which monitoring organizations must certify that their submitted data are accurate to the best of their knowledge. However, this requirement will take effect one year later than proposed, in 2010 for data collected in 2009.

This final rule retains the current requirement for an annual monitoring plan and finalizes most of the new substantive and procedural requirements that were proposed for these plans. One change is that some required new elements proposed for the annual plan have instead been shifted to the 5-year network assessment, to reduce the annual plan preparation burden and to allow these elements to be prepared more carefully. The first 5-year network assessment has been postponed by one year, to July 1, 2010.

    The proposed requirements regarding probe heights for PM10−2.5 monitors, increased O3 monitor distance from roadways (for newly established O3 stations), data elements to be reported, and PM filter retention are included in this final rule.

    This final rule also removes and reserves the pre-existing appendix B, Quality Assurance Requirements for Prevention of Significant Deterioration (PSD) Air Monitoring, and appendix F, Annual SLAMS Air Quality Information, of 40 CFR part 58 because they are no longer needed.

    C. Significant Dates for States, Local Governments, Tribes, and Other Stakeholders

Only State governments, and those local governments that have been assigned responsibility for ambient air monitoring by their States, are subject to the mandatory requirements of 40 CFR part 58.[3] The following summary of applicable requirements is presented in chronological order, as an aid for States in planning their activities to comply with the rule. States are required to comply with pre-existing requirements in 40 CFR part 58 until the compliance date for each new requirement is reached.

    The following provisions in 40 CFR part 53 and part 58 are effective on December 18, 2006:

• The criteria and process for EPA Administrator approval of FRMs, FEMs, and ARMs or, where applicable, Regional Administrator approval of ARMs. Manufacturers of continuous PM2.5 and PM10−2.5 instruments may apply for designation of their instruments as FRMs or FEMs starting today. The EPA is eager to receive such applications as soon as manufacturers can collect and analyze the necessary supporting data. State, local, and Tribal monitoring agencies may seek approval of their PM2.5 continuous monitors as ARMs beginning today, either independently or in cooperation with instrument manufacturers.
• The revised quality system requirements, except that full quality assurance practices, if not waived, are not required until January 1, 2009, for SPMs that use FRM, FEM, or ARM monitors.
    • The new minimum requirements (or absence of minimum requirements) for the number of monitors for specific NAAQS pollutants and for PAMS stations, if the new minimum allows a State to discontinue a previously required monitor. See below for the compliance date of the new minimum requirements in situations in which the final requirement is greater than the currently operating network.
• The criteria for EPA Regional Administrator approval for removal of monitors that are in excess of the minimum required, if a State seeks such removal.
    • The criteria for use of data from SPMs in determinations of attainment/nonattainment.
    • The elimination of the requirement for reporting of certain PM2.5 monitor operating parameters.
    • The revised requirement for separation between roadways and O3 monitors, for new O3 monitors whose placement has not already been approved as of December 18, 2006.
    • The new specification for probe heights for PM10−2.5 monitors.

The new requirement to archive all PM10c and PM10−2.5 filters for 1 year begins with filters collected on or after January 1, 2007. However, EPA expects few if any monitoring agencies to be collecting PM10c or PM10−2.5 filters this early, so most will be affected later.[4]

    The requirement to submit mass data on blank PM2.5 filters begins on January 1, 2007.

    The required date to begin daily PM2.5 sampling at certain PM2.5 monitoring sites is January 1, 2007. The EPA believes this will affect about 50 PM2.5 monitoring sites. The EPA will notify the affected States directly.

This final rule does not change the schedule for reporting ambient air quality data to the Administrator, via the Air Quality System (AQS). However, the rule now explicitly requires that associated quality assurance data be submitted along with ambient concentration data. The first submission affected will be the one due on June 30, 2007, for data collected in January through March of 2007.

    As presently is the case, States must submit an annual network plan by July 1 of each year. The next plan is due July 1, 2007.

    States whose PM2.5, PM10, or O3 networks do not meet the revised requirements of this final rule regarding the number of monitors in a given MSA or CSA are required to submit a plan for adding the necessary additional monitors by July 1, 2007 and to begin operating the new monitors by January 1, 2008. The EPA believes that this will only affect O3 and PM2.5 monitoring in fewer than ten locations each. The EPA will notify these States directly.

    A plan for the implementation of the required NCore multipollutant monitoring stations, including site selection, is due by July 1, 2009. States must implement the required NCore multipollutant stations by January 1, 2011, including PM10−2.5 monitoring.

    States will be required to submit earlier certification letters regarding the completeness and accuracy of the ambient concentration and quality assurance data they have submitted to the Air Quality System (AQS) operated by EPA, starting May 1, 2010 for data collected during 2009. Until then, States are required to submit these letters by July 1 of each year.

    Network assessments are required from States every 5 years starting July 1, 2010.

Under the Tribal Authority Rule (TAR) (40 CFR part 49), which implements section 301(d) of the CAA, Tribes may elect to be treated in the same manner as a State in implementing sections of the CAA. However, EPA determined in the TAR that it was inappropriate to treat Tribes in a manner similar to a State with regard to specific plan submittal and implementation deadlines for NAAQS-related requirements, including, but not limited to, such deadlines in CAA sections 110(a)(1), 172(a)(2), 182, 187, and 191. See 40 CFR 49.4(a). For example, an Indian Tribe may choose, but is not required, to submit implementation plans for NAAQS-related requirements, nor is any Tribe required to monitor ambient air. If a Tribe elects to do an implementation plan, the plan can contain program elements to address specific air quality problems in a partial program. The EPA will work with each Tribe to develop an appropriate schedule for making any monitoring system changes that meet that Tribe's needs.

Indian Tribes have the same rights and responsibilities as States under the CAA to implement elements of air quality programs as they deem necessary. Tribes can choose to engage in ambient air monitoring activities. In many cases, Indian Tribes will be required by EPA regions to institute quality assurance programs that comply with 40 CFR part 58 appendix A, to utilize FRM, FEM, or ARM monitors when comparing their data to the NAAQS, and to ensure that the data collected are representative of their respective airsheds. For FRM, FEM, or ARM monitors used for NAAQS attainment or nonattainment determinations, the quality assurance requirements of 40 CFR part 58 must be followed and would be viewed by EPA as an indivisible element of a regulatory air quality monitoring program.

    D. Implementation of the Revised Monitoring Requirements

    After promulgation, EPA will assist States in implementing the amended requirements using several mechanisms. The EPA will work with each State to develop approvable monitoring plans for its new NCore multipollutant monitoring stations, including PM10−2.5 monitoring. For example, EPA will negotiate the selection of required new monitoring sites (or new capabilities at existing sites) and their schedules for start up as well as plans to discontinue sites that are no longer needed. The EPA will negotiate with each State its annual grant for air quality management activities, including ambient monitoring work. Once States have established a new monitoring infrastructure to meet the new requirements, EPA will review State monitoring activities, submitted data, and plans for further changes on an annual basis.

    The EPA's support for and participation in enhancing the national ambient air monitoring system to serve current and future air quality management and research needs will extend beyond ensuring that States meet the minimum requirements of this final monitoring rule. The EPA will work with each State or local air monitoring agency to determine what affordable monitoring activities above minimum requirements would best meet the diverse needs of the individual air quality management program as well as the needs of other data users. The EPA may also work with the States, and possibly with some Tribes, to establish and operate PM10−2.5 speciation sites in addition to those required by this final rule. The EPA also plans to work with the States, and possibly with some Tribes, to establish and operate sites that will measure only PM10−2.5 concentrations in rural and less urbanized locations, in addition to the PM10−2.5 monitors required at NCore sites.

An important element of implementing the new requirements will be EPA's role in encouraging the development and application of FEMs, including both a sampler or samplers that can provide a direct measurement of PM10−2.5 and collect filters for chemical speciation, and continuous methods that measure both PM2.5 and PM10−2.5. The EPA has determined that continuous monitoring of PM2.5 has many advantages over the filter-based FRM. This final rule makes it more practical for manufacturers and users of continuous PM2.5 instruments to obtain designation for them as FEMs or ARMs. To ensure objectivity and a sound scientific basis for decisions, EPA's Office of Research and Development will review applications for FEM and ARM designations based on the criteria in this final rule and will recommend approval or disapproval to the Administrator. For agencies seeking use of an ARM already approved in another monitoring network, the applicable Regional Office will conduct a review, most often as part of the EPA approval of an annual monitoring plan, based on the criteria in this final monitoring rule.

    The EPA will also provide technical guidance documents and training opportunities for State, local, and Tribal monitoring staff to help them select, operate, and use the data from new types of monitoring equipment. The EPA has already distributed a technical assistance document on the precursor gas monitors [5] that will be part of the NCore multipollutant sites and EPA has conducted multiple training workshops on these monitors. Additional guidance will be developed and provided on some other types of monitors with which many State monitoring staff are currently unfamiliar, and on network design, site selection, discontinuation of sites, quality assurance, network assessment, and other topics. While Tribes are not subject to the monitoring requirements of this final rule, these technical resources will also be available to them directly from EPA and via grantees, such as the Institute for Tribal Environmental Professionals and the Tribal Air Monitoring Support Center.

    The EPA will also continue to support the National Park Service's operation of the IMPROVE monitoring network, which provides important data for implementing both regional haze and PM2.5 attainment programs.[6] The number of sites in the IMPROVE program may vary, depending on EPA's enacted budget and the data needs of the regional haze and PM2.5 attainment programs.

    The EPA will also continue to operate the Clean Air Status and Trends Network (CASTNET), which monitors for O3, PM, and chemical components of PM in rural areas across the nation.[7] EPA is in the process of revising CASTNET to upgrade its monitoring capabilities to allow it to provide even more useful data to multiple data users. The EPA expects that about 20 CASTNET sites will have new capabilities similar to some of the capabilities required at NCore multipollutant sites.

This final rule includes a requirement that States must ensure a program of adequate and independent audits of their monitoring stations. One way, but not the only way, a State can satisfy this requirement is to agree that EPA will conduct these audits using funds that otherwise would have been awarded to the State as part of its annual air quality management grant. In anticipation of the possible inclusion of this requirement in this final rule, EPA has been working with monitoring organizations to determine which of these organizations prefer this approach. The EPA expects that, for 2007, nearly all monitoring organizations will request that EPA conduct these audits. For those that choose another acceptable approach, EPA will conduct limited cross-checks of equipment, calibration standards, auditor preparation, and audit procedures to ensure that their audit programs are adequate.

The EPA recognizes that characterizing and managing some air quality problems requires ambient concentration and deposition data that cannot be provided by the types of monitoring required by the monitoring activities addressed in today's final rule. These problems include near-roadway exposures to emissions from motor vehicles and mercury deposition. The EPA is actively researching these issues and developing concepts for monitoring programs to address them, but these issues are outside the scope of this final rule.

The Air Quality System (AQS) is the data system EPA uses to receive ambient air monitoring data from State, local, Tribal, and other types of monitoring organizations and to make those data available to all interested users. AQS is based on a particular data structure and uses particular data input formats, including data elements and defined values for categorical data. The existing AQS data structure and input formats are for the most part consistent with the changes made in this final rule to pre-existing terminology and requirements, but some changes will be needed in AQS to re-establish full consistency with the requirements in the monitoring rule. The changes to AQS will likely, in turn, require some modifications to data preparation tools and practices at monitoring agencies. The EPA will prepare and implement a plan for making these changes, and will advise AQS users of the ramifications while doing so. Generally, the compliance deadlines in the rule are such that monitoring agencies are not required to immediately comply with any changes in rule provisions that would affect data transfer formats and procedures. Monitoring agencies should, for the present, continue to follow pre-existing AQS formats and procedures until notified.

    E. Federal Funding for Ambient Air Monitoring

EPA has historically funded part of the cost to State, local, and Tribal governments of installing and operating monitors to meet Federal monitoring requirements. Sections 105 and 103 of the CAA allow EPA to provide grant funding for programs for preventing and controlling air pollution and for some research and development efforts, respectively. Eligible entities must apply for section 103 grants. Eligible entities must provide nonfederal matching funds for section 105 grants. The EPA's enacted budget specifies overall how much State and Tribal Air Grant (STAG) funding is available for these grants.

    In recent years, EPA has received special authority through appropriations acts to use section 103 grant funding for establishing and operating PM2.5-related monitoring stations. Funding for other types of monitoring has been included in the grants awarded under section 105. Grants to Tribes for air quality management work, including ambient monitoring, have been awarded under section 103 with the overall amount for these funds established by the enacted budget.

During the public comment period for this rulemaking, EPA received a large number of comments addressing funding issues. Most of these comments expressed opposition to the Administration's proposed EPA budget for fiscal year 2007, which included a proposal to provide PM2.5 monitoring support through section 105 grant funding, as is done for all other criteria pollutants. (As of today, the Congress has not enacted a 2007 budget for EPA.) Commenters stated that if funding for monitoring were reduced as proposed, State and local agencies would have less flexibility than desired in designing and operating their monitoring programs, and that the proposed requirements for new PM10−2.5 and NCore networks and for adequate and independent audits of monitoring stations would be burdensome. Some commenters requested that the proposed new requirements not be included in this final rule for this reason.

    The EPA understands these concerns. However, the CAA requirements from which this final rule derives [8] are not contingent on EPA providing funding to States to assist in meeting those requirements. Accordingly, the comments regarding funding are not directly relevant to the content of this final rule. Nevertheless, EPA recognizes that resources always have been and will remain a practical consideration for establishing and operating monitoring programs. The EPA will continue to work with States in this regard, in particular as EPA determines how to allocate enacted funding among States and among types of monitoring so as to achieve the best possible environmental outcomes. Several provisions of this final rule reduce minimum requirements, which will provide flexibility for States to reduce some of their pre-existing costs.

    Several commenters stated that EPA should not use STAG funds for the improvement or operation of Federal monitoring networks such as CASTNET. The EPA does not intend to use STAG funds from fiscal year 2007 or beyond in this way.

    IV. Discussion of Regulatory Revisions and Major Comments on Proposed Amendments to 40 CFR Part 53

    A. Overview of Part 53 Regulatory Requirements

    Various appendices to 40 CFR part 50 define certain ambient air monitoring methods as Federal reference methods which may be used to determine attainment of the National Ambient Air Quality Standards (NAAQS), and which form the benchmark for determining equivalency of other methods which may also be used to determine attainment. Under 40 CFR part 53, EPA designates specific commercial instruments or other versions of methods as Federal reference methods (FRMs). To be so designated, a particular FRM must be shown, according to the procedures and requirements of part 53, to meet all specifications of both the applicable appendix of part 50 as well as applicable specifications and requirements of part 53.

To foster development of improved alternative air monitoring methods, EPA also designates as Federal equivalent methods (FEMs) those alternative methods that are shown to have measurement performance comparable to the corresponding FRM. Part 53 contains explicit performance tests, performance standards, and other requirements for designation of both FRMs and FEMs for each of the criteria pollutants. In addition, States' air surveillance monitoring networks are required, under 40 CFR part 58, appendix C, to use only EPA-designated FRMs, FEMs, or ARMs at State and local air monitoring stations (SLAMS) sites. A list of all methods that EPA has designated as either FRMs or FEMs for all criteria pollutants is available at http://www.epa.gov/ttn/amtic/criteria.html.

Elsewhere in today's Federal Register, EPA is promulgating a new Federal reference method for measurement of mass concentrations of thoracic coarse particles (PM10−2.5) in the atmosphere, to be codified as appendix O to 40 CFR part 50. Although, as explained earlier, EPA is not at this time adopting any NAAQS for PM10−2.5, EPA believes an FRM for PM10−2.5 is still highly desirable to aid in a variety of needed research studies.[9] This new FRM is defined as the standard of reference for measurement of PM10−2.5 concentrations in ambient air. It will be an acceptable and readily available PM10−2.5 measurement method for new NCore multipollutant monitoring sites to be located at approximately 75 urban and rural locations. Availability of an approved FRM for PM10−2.5 will also help provide consistency among PM10−2.5 measurements used in future studies of the adverse health effects associated with exposure to thoracic coarse particles. Lastly, the PM10−2.5 reference method will provide the basis for development of speciation samplers capable of providing an improved understanding of the compositions of different ambient mixes of thoracic coarse particles, so that this composition can be related both to health effects and to particle sources. Associated with this new reference method, EPA is also establishing related amendments to 40 CFR part 53 to extend the designation provisions for FRMs and FEMs to methods for PM10−2.5. These amendments set forth explicit tests, performance standards, and other requirements for designation of specific commercial samplers, sampler configurations, or analyzers as either FRMs or FEMs for PM10−2.5, as appropriate.

As noted in section VI.A of the preamble to the NAAQS revisions published elsewhere in this Federal Register, EPA recognizes that the FRM, while providing a good standard of performance for comparison to other methods, is not itself optimal for routine use in PM10−2.5 monitoring networks. Alternative methods are needed that provide a more direct measurement of ambient PM10−2.5 concentrations. Methods are also needed that collect physically separated PM10−2.5 samples for analysis of chemical species. Also, automated, continuous-type methods provide many operational advantages: they ease monitoring burdens, reduce on-site service requirements, and eliminate off-site sample filter support services, and they provide measurement resolution of 1 hour or less and near real-time reporting of monitoring data. Therefore, EPA is interested in encouraging the development of alternative monitoring methods for PM10−2.5 by focusing on the explicit test and qualification requirements necessary for designation of such types of methods as FEMs for PM10−2.5. In fact, EPA anticipates that alternative FEMs will eventually provide most of the PM10−2.5 monitoring data obtained in the States' monitoring networks.

    Further, EPA recognizes that the potential benefits of automated/continuous monitoring methods apply as well to FEMs for PM2.5. Accordingly, as proposed, EPA is also establishing new requirements in part 53 for designation of continuous FEMs for PM2.5. See 71 FR 2721. The PM2.5 and PM10−2.5 FEM provisions parallel each other in many respects, so including both now is appropriate and conforming.

    The new requirements for approval of automated/continuous FEMs can accommodate a wide range of potential PM10−2.5 or PM2.5 continuous measurement technologies. Ambient air testing of a candidate technology at diverse monitoring sites is required in order to demonstrate that the level of comparability to collocated Federal reference method measurements is adequate to meet established data quality objectives (DQOs).

    This final rule also modifies somewhat certain existing requirements for designation of alternative, non-continuous methods for PM2.5. As explained in section IV.B of this preamble, the modified requirements will be fully consistent with the more advanced new requirements for both continuous and non-continuous FEMs for PM10−2.5.

    B. Requirements for Candidate Reference Methods for PM10−2.5

    No comments were received related specifically to the PM10−2.5 FRM designation requirements. These provisions are adopted as proposed. Because of the nearly complete similarity between the specifications for the new PM10−2.5 reference method and for the existing PM2.5 reference method, the designation requirements for PM10−2.5 reference methods are essentially the same as those for PM2.5 reference methods. As set forth in the new appendix O to 40 CFR part 50, the PM10−2.5 reference method specifies a pair of samplers consisting of a conventional PM2.5 sampler and a special PM10 sampler. The PM2.5 sampler must meet all requirements for a PM2.5 reference method in 40 CFR part 50, appendix L, as well as additional requirements in part 53. However, the PM10 sampler required by the method is not a conventional PM10 sampler as described in 40 CFR part 50, appendix J; rather, it is a sampler specified to be identical to the PM2.5 sampler of the pair, except that the PM2.5 particle size separator is removed. This special PM10 sampler is identified as a “PM10c” sampler to differentiate it from conventional PM10 samplers that meet the less exacting requirements of 40 CFR part 50, appendix J. In view of the similarity of the PM10−2.5 FRM requirements to those of the PM2.5 FRM, the new requirements will allow a PM10−2.5 sampler pair consisting of samplers that have already been shown to meet the PM2.5 FRM requirements (except for the PM2.5 particle size separator in the case of the PM10c sampler) to be designated as a PM10−2.5 reference method without further testing.

    C. Requirements for Candidate Equivalent Methods for PM2.5 and PM10−2.5

    As pointed out in the preamble to the proposed rule (71 FR 2721), EPA believes very strongly that provisions to allow designation of Federal equivalent methods provide an important incentive to encourage the commercial development of innovative new and advantageous alternative methods for monitoring air pollutants. However, it is also important to show conclusively that any new candidate method will produce measurements comparable to those of the FRM and will have performance characteristics that are adequate to meet DQOs. At the same time, the testing that is necessary to show comparable and adequate performance must not be so burdensome that it undermines incentives for new method development.

    Because of the complex nature of particulate matter, testing the performance of PM monitoring methods is correspondingly complex. For PM2.5 methods, EPA defined three classes of candidate FEMs (Classes I, II, and III) based on the extent to which the method differs from the FRM, so that the nature and extent of the necessary performance and comparability testing can be matched more closely to the nature of the candidate method. See 40 CFR 53.3(a)(2)−(4). In this final rule, as proposed, EPA is extending these same class definitions and tiered testing requirements to apply to PM10−2.5 candidate FEMs as well.

    Class I methods are limited to minor deviations from the FRM; Class II covers integrated-sample, filter-based, gravimetric methods deviating more significantly from the FRM; and Class III methods (originally) included all other methods not categorized as Class I or II. The three classes are described in more detail in the proposal preamble (71 FR 2721). As proposed, the definition of Class III FEMs is narrowed to include only continuous or semi-continuous analyzer methods having 1-hour or less measurement resolution, which are the Class III methods that by far hold the most potential for monitoring applications and FEM designation. The EPA has thus avoided the restrictions and complexity that would be necessary to accommodate the wide variety of other types of non-Class I or II methods that are unlikely to be economically and commercially practical. Also, the continuous operation of such Class III methods gives rise to a statistical advantage that permits more tolerant limits of adequate comparability, relative to a method that is not operated continuously, while achieving a similar limit of uncertainty in the monitoring data.
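
    To make that statistical advantage concrete, consider how the uncertainty of an annual statistic scales with the number of valid samples: an analyzer operating every day contributes roughly three times as many values per year as a 1-in-3 day sampler and six times as many as a 1-in-6 day sampler, so the standard error of the annual mean shrinks by the square root of those factors. The short Python sketch below illustrates this scaling under an assumed, hypothetical 10 percent per-measurement coefficient of variation; neither the figure nor the calculation appears in the rule itself.

```python
# Illustration of the sample-size advantage of continuous (Class III)
# methods. The 10 percent per-measurement CV is a hypothetical value,
# not a figure from this rule.
import math

per_measurement_cv = 0.10
for label, n in (("1-in-6 day", 61), ("1-in-3 day", 122), ("everyday", 365)):
    annual_mean_cv = per_measurement_cv / math.sqrt(n)
    print(f"{label}: ~{n} samples/yr, annual-mean CV ~ {annual_mean_cv:.4f}")
```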

    Class III continuous methods appear to offer many potential benefits for use in routine field monitoring networks. These automated analyzers eliminate most, if not all, of the pre- and post-weighing of sample filters, require less frequent on-site service, may be less costly to operate, and offer near real-time, electronic reporting of hourly (or less) mass concentration measurements (similar to data reporting that is common for gaseous pollutant monitors). The EPA is accordingly adopting the proposed Class III FEM provisions for PM10−2.5 and PM2.5 in today's rule, with some changes in response to comments.

    Continuous methods, by nature, tend to have somewhat different performance characteristics from those of the corresponding filter-based FRMs, so the comparability and performance testing requirements must be adequately comprehensive and discriminating without being excessively burdensome. The Class III FEM requirements being promulgated today are based predominantly on demonstrating an adequate degree of comparability between candidate method measurements and concurrent, collocated Federal reference method measurements under a representative variety of site conditions. Many issues and much technical input were carefully considered during the development of the requirements, including peer review by the Ambient Air Monitoring and Methods Subcommittee of the Clean Air Scientific Advisory Committee. The salient Class III FEM requirements were summarized in the proposal preamble (71 FR 2722-2724). Not unexpectedly, a considerable number of comments were received in connection with the specifics of the proposed Class II and Class III requirements. The more significant of these comments are addressed below, after a summary of the proposal regarding requirements for Class II and Class III methods. Remaining comments are addressed in the Response to Comments document.

    Class II candidate FEMs, although not offering the operational advantages of continuous Class III methods, are nevertheless important as well. Class II methods encompass the dichotomous and virtual impactor types of methods that can provide a more direct, gravimetric, filter-based measurement of PM10−2.5 than is available with the FRM. These methods are also most likely to fulfill the substantial need for collecting PM10−2.5 samples that are physically separated from other particle sizes, or nearly so, for chemical species analysis. New requirements for Class II FEMs for PM10−2.5 are being established in this final rule, and some of the previously established requirements for Class II FEMs for PM2.5 are being changed somewhat to make them more consistent with the corresponding new requirements for PM10−2.5 Class II FEMs and to incorporate some minor technical improvements.

    The proposed Class II FEM requirements, as outlined in the proposal preamble (71 FR 2721-2725), were based on daily sampling; therefore, Class II equivalent methods used for determining compliance with the PM2.5 NAAQS would generally have been restricted to daily sampling. However, in response to concerns about method performance in relatively clean areas, EPA has strengthened the additive bias (intercept) requirement. With this tighter performance criterion, and considering that Class II methods are filter-based samplers, a minimum one-in-three day sample frequency will be appropriate to meet the network data quality objectives. Class II methods are also expected to be used for collecting samples for chemical species analysis, which does not require daily operation. The test sites specified for Class II and Class III testing for both PM2.5 and PM10−2.5 are similar in character, so concurrent testing of PM2.5 and PM10−2.5 methods of both classes can be carried out, substantially reducing the testing burden for candidate FEMs that measure both PM2.5 and PM10−2.5 or for testing multiple candidate methods simultaneously.

    Of particular note to instrument manufacturers, this final rule allows applicants for Class II candidate FEMs for both PM10−2.5 and PM2.5 the option of substituting the more extensive Class III comparability field tests in subpart C for some or all of the arduous laboratory wind tunnel, loading, and volatility tests of subpart F to which a Class II candidate FEM sampler may otherwise be subject. Such a substitution of test results may be particularly important when the special facilities necessary for the wind tunnel or other tests are not available. Concurrent testing of multiple methods under the Class III requirements may also help to reduce overall testing costs.

    In regard to the proposed testing requirements for Class III (continuous) FEMs for PM2.5 and PM10−2.5, EPA specifically solicited comments related to the adequacy of the number and location of the test sites required for the field tests to determine comparability of a candidate method to the respective FRM. See 71 FR 2722. By definition, a designated FEM is generally qualified for use at any monitoring site in the U.S. (with the possible exception of some areas with extreme conditions), so the test requirements for comparability need to represent a wide variety of possible site conditions. The EPA proposed that candidate methods be tested within three general geographical areas: (1) The Los Angeles area in winter and summer seasons, (2) eastern U.S. in winter and summer, and (3) western U.S. in winter only (for a total of five 30-day test campaigns). Each proposed test site area was selected for representing particular and diverse typical site conditions.

    In response to several comments addressing this issue, a fourth test site—in the U.S. Midwest, with tests required in the winter season only—has been added to the requirements to further increase the geographical diversity. However, the requirement for a winter test campaign in the eastern U.S. has been withdrawn while the requirement for a summer test campaign in the eastern U.S. has been retained, so the total number of required test campaigns (five) is unchanged. Comparability testing of a candidate method is costly, rendering it impractical to test a candidate method under all possible combinations of site and seasonal conditions that might be encountered in national PM monitoring networks. The EPA considers the specified complement of five test campaigns in the four specified geographical areas and two seasons to be reasonable to conduct and adequately representative of the diversity of site and seasonal PM monitoring conditions across the U.S. As noted above, the two test site areas specified for testing candidate Class II FEMs are compatible with the test sites for candidate Class III methods, which will significantly reduce testing costs by allowing Class II and III candidate methods to be tested simultaneously at the same test site. Also, the test sites have been relabeled to make it easier to reference the east and west sites.
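
    For reference, the finalized complement of comparability test campaigns described above can be summarized as a simple list, as in the Python sketch below. The area labels are illustrative only; the binding specifications appear in 40 CFR part 53, subpart C.

```python
# Finalized Class III comparability test campaigns, as described above.
# Area labels are illustrative; the binding list is in part 53, subpart C.
TEST_CAMPAIGNS = [
    ("Los Angeles area", "winter"),  # only site with both seasons pooled
    ("Los Angeles area", "summer"),
    ("Eastern U.S.", "summer"),      # proposed winter campaign withdrawn
    ("Western U.S.", "winter"),
    ("Midwest U.S.", "winter"),      # added in response to comments
]
assert len(TEST_CAMPAIGNS) == 5      # total unchanged from the proposal
```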

    Some commenters expressed concern that the Class III comparability test standards might be inadequate because a candidate method that had an unacceptable seasonal bias (such as has been noted for some continuous methods) could be found acceptable, because pooling test data from the summer and winter seasons would allow the biases to compensate for each other. The EPA finds that the associated minimum correlation requirement of the regression test should adequately guard against that situation. Further, in the revised test requirements, summer and winter tests at the same site, where the data are pooled, are required at only one of the four required test sites.

    Another issue concerning the proposed testing requirements for Class III (continuous), as well as Class II, candidate equivalent methods for PM2.5 and PM10−2.5 involved the specific acceptance criteria for the regression analysis statistics—particularly the additive bias (intercept) parameter—of the comparison between collocated measurements obtained with the candidate and FRM methods. As proposed, the upper and lower limits for the regression intercept were specified as functions of the corresponding slope, with the acceptable combinations of slope and intercept represented by the area inside a trapezoid or a hexagon shape plotted on a slope-intercept coordinate system (Figures C-2 and C-3 in proposed revised subpart C of part 53 at 71 FR 2768-2769). These acceptance limits were based on statistical considerations related to the uncertainty allowable in making correct NAAQS attainment decisions for PM2.5 (or similar comparisons of PM10−2.5 concentrations to non-regulatory benchmarks). Several commenters were concerned that the range of acceptable intercepts proposed for Class II and III FEMs, although appropriate for DQOs related to attainment (or similar) decisions, may allow excessive measurement bias for FEMs used for other PM monitoring applications—especially those applications that require measurements of concentrations well below the level of the NAAQS.

    In response to these comments and in deference to potential use of FEMs for a variety of applications, EPA has somewhat tightened the range of allowable intercepts for those candidate FEMs. For Class III FEMs, new fixed limits of ±2.0 μg/m3 for PM2.5 methods and ±7.0 μg/m3 for PM10−2.5 methods have been added. For Class II FEMs for PM10−2.5, the fixed intercept limit has been reduced from ±7.0 to ±3.5 μg/m3. (The intercept requirements proposed for candidate Class II PM2.5 methods were re-examined and found to be appropriate as proposed.) The more restrictive intercept limits will reduce the maximum allowable measurement bias and are represented by smaller hexagonal acceptance areas, as specified in 40 CFR part 53, subpart C revised Table C-4 and as illustrated in revised Figures C-2 and C-3 of this final rule.
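
    A simplified way to picture the finalized limits is as a screening check on the regression slope and intercept from the comparability tests, as sketched below in Python. The sketch applies only the fixed Class III intercept limits adopted here, together with an assumed, illustrative slope window of 0.90 to 1.10; the binding acceptance regions are the slope-dependent hexagons of revised Table C-4 and Figures C-2 and C-3, which this sketch does not reproduce.

```python
# Simplified screening check against the fixed Class III intercept limits
# adopted in this final rule. The slope window of 0.90-1.10 is an
# illustrative assumption; the binding slope-dependent (hexagonal)
# acceptance regions are in 40 CFR part 53, subpart C, Table C-4.
CLASS_III_INTERCEPT_LIMIT_UG_M3 = {"PM2.5": 2.0, "PM10-2.5": 7.0}
ASSUMED_SLOPE_WINDOW = (0.90, 1.10)  # hypothetical, for illustration

def passes_screen(pollutant: str, slope: float, intercept: float) -> bool:
    """True if (slope, intercept) falls inside the simplified box."""
    lo, hi = ASSUMED_SLOPE_WINDOW
    return lo <= slope <= hi and \
        abs(intercept) <= CLASS_III_INTERCEPT_LIMIT_UG_M3[pollutant]

print(passes_screen("PM2.5", 1.03, 1.2))     # True under these assumptions
print(passes_screen("PM10-2.5", 1.15, 3.0))  # False: slope outside window
```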

    Nevertheless, EPA wishes to point out that, because of the design of the equivalent method comparability tests (which require no low-level test concentrations) and the nature of the regression analysis, a seemingly high positive or negative intercept resulting from the regression analysis of the test data is not necessarily indicative of the actual measurement errors or bias of the candidate method relative to the FRM at low or very low concentrations. This situation may be particularly true when the concentration coefficient of variation (CCV) for the FEM test data (see 40 CFR 53.35(h)) is relatively low, resulting in greater uncertainty in the predicted additive bias (and in the multiplicative bias (slope) as well).
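
    The role of the CCV can be illustrated with a short calculation: when the FRM test concentrations cluster tightly (low CCV), the regression has little leverage, and the fitted intercept and slope are correspondingly uncertain. The Python sketch below computes the ordinary least-squares slope and intercept of candidate measurements against collocated FRM measurements; the CCV shown is assumed, for illustration, to be the standard deviation of the FRM concentrations divided by their mean, while the binding definition is in 40 CFR 53.35(h), and all data values are hypothetical.

```python
# Regression statistics from hypothetical collocated 24-hour averages.
# The CCV formula here (stdev/mean of the FRM values) is an assumption
# for illustration; the binding definition is in 40 CFR 53.35(h).
from statistics import mean, stdev

def regression_and_ccv(frm, candidate):
    """OLS slope/intercept of candidate on FRM, plus an assumed CCV."""
    x_bar, y_bar = mean(frm), mean(candidate)
    sxx = sum((x - x_bar) ** 2 for x in frm)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(frm, candidate))
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    ccv = stdev(frm) / x_bar  # low CCV -> less stable intercept estimate
    return slope, intercept, ccv

frm = [8.0, 12.5, 15.2, 22.1, 30.4, 41.7]   # hypothetical ug/m3
cand = [8.6, 12.1, 16.0, 21.5, 31.2, 40.9]  # hypothetical ug/m3
print(regression_and_ccv(frm, cand))
```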

    Class III FEMs will generally provide 1-hour concentration measurements (in addition to the required 24-hour measurements), and EPA asked for comments on whether the FEM provisions should include any specific requirements for 1-hour precision, and if so, whether a specific standard of performance should be specified and how it should affect FEM designation. See 71 FR 2723. Of the few comments received on this issue, most agreed with EPA that 1-hour precision is an important descriptor associated with a Class III candidate method and that 1-hour FEM test data should be submitted in a Class III FEM application so that the short-term precision can be determined, but no specific standard should be set for the precision parameter in connection with the FEM designation qualifications. A few commenters suggested that a precision performance parameter based on a running average of a few (e.g., 3 to 5) hours should be established and regulated; however, to preserve flexibility, EPA believes that precision estimates are better included in method-specific quality assurance guidance (to be used by instrument operators as they believe appropriate) rather than as a formal part of the FEM provisions. Therefore, no changes were made to the proposed requirement that FEM applicants submit the 1-hour FEM test data, and there is no designation requirement based on 1-hour precision or any other particular 1-hour based performance statistic.
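
    One conventional way an applicant might summarize the required 1-hour test data, sketched below in Python, is the collocated-precision estimator: the root mean square of paired hourly differences, divided by the square root of two, expressed relative to the mean concentration. Both the statistic and the data are illustrative assumptions; as stated above, this final rule sets no 1-hour precision standard.

```python
# Illustrative 1-hour precision estimate from two collocated Class III
# analyzers. The estimator (RMS of paired differences / sqrt(2), as a
# fraction of the mean) is a common convention, not a rule requirement.
from statistics import mean

def collocated_cv(a, b):
    """Approximate CV of a single 1-hour measurement from paired data."""
    diffs = [x - y for x, y in zip(a, b)]
    rms = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
    sigma = rms / 2 ** 0.5  # per-instrument standard deviation
    grand_mean = mean((x + y) / 2 for x, y in zip(a, b))
    return sigma / grand_mean

hours_a = [10.2, 11.8, 9.6, 14.1, 12.3]   # hypothetical ug/m3
hours_b = [10.8, 11.1, 10.2, 13.5, 12.9]  # hypothetical ug/m3
print(collocated_cv(hours_a, hours_b))
```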

    The EPA also asked for comments on the adequacy and appropriateness of the proposed test requirements for Class II FEMs. See 71 FR 2724. Some commenters suggested that the proposed Class II tests were inadequate because there was more variation in the PM at different sites than could be represented in the tests—particularly in regard to chemical composition—and suggested that continued FEM designation should be conditioned on a mandatory periodic reassessment of local-agency comparisons to FRM measurements. The EPA recognizes that data produced by all FEMs operated in monitoring networks under 40 CFR part 58 should meet the DQOs of 40 CFR part 58, appendix A, section 2.3.1 on a continuing basis. The operational requirements of appendix A will help ensure this. Moreover, EPA can invoke designation cancellation procedures under 40 CFR 53.11 (Cancellation of reference or equivalent method designation) if EPA observes that DQOs are not being maintained for a particular designated Class II equivalent method (or for any FEM or FRM). However, EPA believes that designation cancellation should be initiated by EPA when necessary, rather than having designations conditioned on specific periodic reassessments as commenters suggested. Other commenters suggested that the test sites be approved by both EPA and the STAPPA/ALAPCO Monitoring Committee, but EPA believes that would be cumbersome and unnecessary.

    D. Other Changes

    EPA proposed several other relatively minor changes to various provisions of subparts A, C, E, and F of part 53. See 71 FR 2724-2725. Organizational changes in subpart C consolidate the provisions for various types of methods, making them easier to understand. Other changes clarify or simplify some existing provisions for PM10 and PM2.5 Class I and II FEM testing and implement minor technical improvements to test protocols, with little, if any, impact on the nature or efficacy of the tests. Minor changes are made to subparts A, E, and F to incorporate the new PM10−2.5 provisions and some new definitions, make a few administrative adjustments, and incorporate a few minor technical changes. These changes are described more completely in the proposal preamble (71 FR 2724), and they are being adopted as proposed, as no comments were received pertinent to these minor changes.

    After considering all comments carefully, EPA determined that no further changes should be made to the proposed new or revised FRM and FEM requirements. The EPA is thus adopting the proposed new or revised requirements and provisions for Federal reference and Federal equivalent methods for PM2.5 and PM10−2.5, modified to incorporate the changes described above.

    V. Discussion of Regulatory Revisions and Major Comments on Proposed Amendments to 40 CFR Part 58

    A. Overview of Part 58 Regulatory Requirements

    Part 58 of 40 CFR, Ambient Air Quality Surveillance, contains requirements for ambient air monitoring programs operated by States (or designated local agencies). As proposed, the structure of part 58 remains much the same as the 1997 version. Proposed subparts A through G, containing 40 CFR 58.1 through 58.61, provide definitions of terms; require the operation of certain numbers and types of monitors by certain dates; require the use of certain monitoring methods, quality system practices, and sampling schedules and frequencies; require annual plans describing a State's monitoring network and planned changes to it; provide criteria for EPA approval of planned changes; require data submission and certification that submitted data are accurate to the best of the knowledge of the responsible State official; address special rules regarding special purpose monitors; provide rules for comparing monitoring data to applicable National Ambient Air Quality Standards (NAAQS); require reporting of the Air Quality Index (AQI) to the public in some areas; and provide for monitoring directly by EPA if a State fails to operate required monitors. As proposed, part 58 also includes appendices A, C, D, E, and G, which are referenced by various numbered sections in subparts A through G. These appendices contain many detailed requirements, as well as considerable explanatory or background material and non-binding advice. Appendix A addresses quality system requirements, appendix C addresses monitoring methods and equipment, appendix D mostly addresses the number of required monitors and their placement within a metropolitan or other area, appendix E addresses the details of monitoring station layout, and appendix G addresses AQI reporting. (Subpart B of the 1997 version was proposed to be removed. Subpart F was already reserved in the 1997 version. No amendments were proposed to the part 58 requirements for reporting of the AQI and the associated appendix G.)

    To aid in understanding the provisions of the final part 58 and their relationship to the 1997 and proposed provisions, the following discussion for the most part follows the order of the final part 58, addressing each affected numbered section and then the appendices.

    B. General Monitoring Requirements

    1. Definitions and Terminology

    The EPA proposed to discontinue the use of the term “National air monitoring stations (NAMS)”. See 71 FR 2720. Previously, this term designated Federal reference method (FRM) and Federal equivalent method (FEM) monitors that were operated to meet set requirements for the number (and, for some pollutants, the type of location) of monitors and that required EPA Administrator approval for changes. It was distinguished from “State and local air monitoring stations (SLAMS),” which referred to additional FRM and FEM monitors for which there was generally no minimum number, for which siting was more at the State's discretion, and for which changes were approved by the Regional Administrator.

    The EPA proposed a new definition for “National Core (NCore)” stations.

    The definition of “State or local air monitoring stations (SLAMS)” was proposed to be modified to include NCore, Photochemical Air Monitoring Systems (PAMS), and all other State or locally operated stations (such as PM2.5 speciation stations) that have not been designated as a special purpose monitor or monitoring station (SPM). This change was proposed for convenience in referencing these types of monitors together because some provisions in the rule apply to all of them but not to SPMs. See 71 FR 2720. Previously, “SLAMS” referred only to FRM and FEM monitors.

    The term “Approved regional methods” (ARMs), proposed at 71 FR 2720, is added to refer to alternative PM2.5 methods that have been approved by EPA for use specifically within a State, local, or Tribal air monitoring network for purposes of comparison to the NAAQS and to meet other monitoring objectives, but which may not have been approved as FEMs for nationwide use.

    The EPA proposed to adopt a new term, “Primary quality assurance organization,” to clarify the working definition of the term “Reporting organization” currently utilized in section 3.0.3 of 40 CFR part 58, appendix A, Quality Assurance Requirements, and to avoid confusion with the different way “reporting organization” has come to be used in a related but distinct context (final uploading of data to the Air Quality System). See 71 FR 2778.

    The EPA also proposed additional definitions to be consistent with terminology used in 40 CFR part 50, appendix O, the FRM for PM10−2.5. See 71 FR 2777. Modifications to the definitions of key geographical terms were proposed, as needed, to reflect changes in U.S. Census Bureau usage since the last revision to monitoring regulations.

    The EPA received some questions seeking clarification of the new term “Primary quality assurance organization,” which are addressed in the Response to Comments document available in the docket. No other adverse comments were received on these proposed definitions, and this final rule includes all of them.

    2. Annual Monitoring Network Plan and Periodic Network Assessment

    The EPA proposed to consolidate current requirements for the SLAMS air quality surveillance plan and NAMS network description into elements of the annual monitoring network plan described in 40 CFR 58.10 of the proposed rule. See 71 FR 2725. The annual monitoring network plan would provide a statement of purpose for each monitor in a monitoring agency network and provide evidence that siting and operation of each monitor meet the requirements of appendices A, C, D, and E of part 58, as applicable. The EPA also proposed the addition of some required elements to the annual monitoring network plan and proposed to add a new requirement for a periodic network assessment.

    The EPA received comments on a number of specific elements within the annual monitoring network plan and with regard to the network assessment requirement. The comments that were the basis for modifications to the proposed rule are discussed briefly here. Detailed responses to all comments are provided in the Response to Comments document available in the docket.

    Comments were received on the proposed requirement for a 30-day public inspection period before State submittal of a draft annual monitoring network plan to the Regional Administrator as well as on the proposed requirement for Regional Administrator approval of annual monitoring network plans seeking SLAMS network modifications including new monitoring sites. Some commenters requested clarification regarding what methods would be considered acceptable for making documents available for public inspection. Commenters also expressed concern that the 120 days proposed for Regional Administrator review and approval/disapproval would result in unnecessary delays.

    The EPA notes the general support in the comments for the public inspection requirement. Commenters also supported the flexibility in the proposed rule, which would allow monitoring agencies to design and implement appropriate ways of allowing this inspection. The EPA supports the use of monitoring agency Web sites for such postings, along with other means of providing public notice, including hard-copy posting in libraries and public offices. Although the public inspection requirement does not specifically require States to obtain and respond to comments, such a process is encouraged, along with the subsequent transmission of any received comments to the appropriate EPA Regional Office for review. Therefore, EPA has modified this final rule from the proposal to specify that where the State has provided for a public comment process and provided any comments received to EPA, and the posted plan has not been substantially altered as a result of the public comments, the requirement for the Regional Administrator to obtain public comment by a separate process can be waived. The 120 days allowed for Regional Administrator review of an annual plan is a feature of the current monitoring rule and has been kept in this final rule.

    The EPA received many comments on the proposed requirement for the annual monitoring network plan to contain cost information. See 71 FR 2780. Commenters were concerned that no details were provided regarding what information would be required and how the information would be used. The accounting difficulty in calculating such cost information was also noted along with concerns regarding the administrative burden of preparing and documenting the cost estimates.

    The EPA has considered the proposed requirement for cost information in the annual monitoring network plan and agrees that considerable effort would be needed to develop guidance to standardize the development of financial information and for States to collect and summarize the information for submittal. Without such standardization, cost information would be difficult to interpret. In view of these comments, EPA has deleted this element from the list of required information to be contained in the annual monitoring network plan.

    The EPA proposed a new requirement that the annual monitoring network plan consider the ability of existing and proposed sites to support air quality characterization for areas with relatively high populations of susceptible individuals (e.g., children with asthma), and, for any sites that are being proposed for discontinuance, the effect on data users other than the agency itself, such as nearby States and Tribes or health effects studies. See 71 FR 2780. Several commenters noted that this requirement would be challenging to implement and involves knowledge of public health that may not be readily available to monitoring organizations. In addition, it was noted that, absent the availability of a centralized information clearinghouse, it would be difficult for States to be aware of all possible users of data for health studies or other types of research.

    This new element of the annual monitoring network plan highlights the importance that EPA places on the consideration of sensitive populations when evaluating the relative value and representativeness of monitoring sites, particularly for areas where one or more NAAQS may be approached or exceeded.[10] The EPA acknowledges the potential challenge in obtaining information about the distribution of susceptible individuals in specific geographic areas around existing and proposed sites, and has purposely defined the requirement as a “consideration” to provide significant latitude for monitoring organizations to determine the complexity and depth of their response. In recognition of the potential complexity of preparing assessments of susceptible populations on a sub-county sized spatial scale as represented by typical monitoring sites, in this final rule EPA has moved this requirement to become a required element of the 5-year network assessment rather than the annual monitoring network plan.

    With regard to the proposed provision requiring States to consider the effect on data users of proposed actions to discontinue sites, EPA notes that States are already required to make their annual network monitoring plans available for public inspection and that process provides the basic framework for disseminating information about anticipated site discontinuations. The EPA recognizes that there are many potential users of air quality information and that States cannot be aware of all such users. However, to the extent that information about site shutdowns can be disseminated more widely, there are benefits to be gained by protecting key monitors that (for example) support ongoing health studies or that are the basis for long-term trend analyses, or otherwise provide information that is used by stakeholders other than the operating agency. As such, EPA has retained this provision in this final rule. The EPA will work with States and health organizations to explore options for tracking the status of key air quality sites.

    The EPA received many comments in response to the proposed requirement for a network assessment to be completed every 5 years and submitted with the required annual monitoring network plan. Commenters acknowledged the overall value of a more complete evaluation of monitoring programs but expressed concern about the resource burden of meeting the requirement.

    Network assessments are a key tool to help ensure that the right parameters are being measured in the right locations, and that monitoring resources are used in the most effective and efficient manner to meet the needs of multiple stakeholders. Network assessments can help identify new data needs and associated technologies, find opportunities for consolidation of individual sites into multi-pollutant sites, and identify geographic areas where network coverage should be increased or decreased based on changes in population and/or emissions. The EPA has already issued draft guidance to describe the possible techniques that States can use in developing their assessments, and has purposely limited the required elements to provide flexibility in the amount of resources that would be required. After consideration of the comments, EPA has retained the network assessment requirement in this final rule. In light of the concerns raised about the resource requirements needed to complete network assessments, the deadline for the first required assessment under this final rule has been delayed an additional year to July 1, 2010.

    The EPA is not adopting the proposed requirement for a separate plan establishing a network of PM10−2.5 stations as an addendum to the annual monitoring network plan (see 71 FR 2740, 2779) since the only required PM10−2.5 monitoring will take place as part of the NCore multi-pollutant stations, already covered by the proposed plan due July 1, 2009. The EPA has added clarifying language to this final rule requiring Administrator approval for the NCore plan due July 1, 2009 and subsequent annual monitoring network plan elements proposing modifications, consistent with the requirement for Administrator approval of NCore stations in section 3(a) of appendix D.

    The proposed plan element supporting PM10−2.5 suitability tests for NAAQS comparisons likewise is not being adopted since EPA is not finalizing the proposed PM10−2.5 NAAQS.

    The proposed prescriptive wording with reference to public hearings in the context of reviews of changes to violating PM2.5 monitors and/or community monitoring zones (71 FR 2780) has been modified to specify that draft plans containing such proposed changes to PM2.5 networks must be made available for public inspection and comment by States prior to submission to the EPA Regional Administrator but that States can design the process for achieving such goals.

    3. Operating Schedules

    The EPA proposed that manual PM2.5 monitors at SLAMS be required to operate on a 1-in-3 day sampling frequency, except under certain conditions and when approved by the Regional Administrator. See 71 FR 2780. As discussed in section II.E.1 of the preamble to the final revisions to the PM NAAQS, published elsewhere in this Federal Register, commenters pointed out a potential bias in the method used to calculate the 98th percentile form of the 24-hour PM2.5 NAAQS. As explained there, to avoid this potential bias, EPA is requiring daily sampling at design value sites that are within 5 percent of the 24-hour NAAQS for PM2.5.
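
    The resulting operational test is straightforward, as the Python sketch below shows: compare a site's 24-hour design value to 95 percent of the NAAQS level, and require daily sampling if it falls at or above that threshold. The 35 μg/m3 level reflects the 24-hour PM2.5 NAAQS revision published elsewhere in today's Federal Register; the design value itself must be computed under the appendix N procedures, which this sketch does not reproduce.

```python
# Daily-sampling trigger for manual PM2.5 monitors, as described above.
# 35 ug/m3 is the revised 24-hour PM2.5 NAAQS published elsewhere in
# today's Federal Register; design values come from appendix N.
NAAQS_24HR_PM25 = 35.0  # ug/m3

def requires_daily_sampling(design_value_ug_m3: float) -> bool:
    """True if the site's design value is within 5 percent of the NAAQS."""
    return design_value_ug_m3 >= 0.95 * NAAQS_24HR_PM25

print(requires_daily_sampling(33.8))  # True: within 5 percent of 35
print(requires_daily_sampling(28.0))  # False: 1-in-3 day sampling suffices
```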

    The EPA proposed that manual PM10−2.5 samplers at SLAMS stations must operate on a daily schedule, without a requirement for any collocated continuously operated FEM PM10−2.5 samplers. See 71 FR 2780. Numerous commenters noted that a 1-in-3 day sampling frequency was acceptable for PM2.5 sites and said that the same sampling frequency for PM10−2.5 would produce sufficient data for comparison to the proposed 24-hour PM10−2.5 NAAQS averaged over 3 years. Commenters also noted the lack of currently available continuous FEM PM10−2.5 instruments and the burdensome resource requirements associated with daily sampling requirements using the proposed filter-based FRM.

    The proposed requirement for daily PM10−2.5 sampling was based on a data quality objective system analysis that identified such a frequency as a key factor in reducing statistical uncertainty at concentrations near the level of the proposed 24-hour PM10−2.5 NAAQS. Since EPA is not finalizing a PM10−2.5 NAAQS but instead is requiring a more limited set of PM10−2.5 monitors at NCore sites to support objectives other than NAAQS compliance, additional flexibility in sampling frequency requirements is appropriate. Although daily sampling of PM10−2.5 at NCore sites remains desirable, and will become a more practical goal with the advent of continuous FEM monitors in several years, EPA has reduced the PM10−2.5 sampling frequency requirement in this final rule to 1-in-3 days.

    The EPA proposed reducing the sample frequency requirement for PM10 manual methods. That reduction was possible because EPA had proposed daily sampling of PM10−2.5 to support protection from thoracic coarse particles. As published elsewhere in today's Federal Register, EPA is retaining the 24-hour PM10 standard and not finalizing a PM10−2.5 standard, and EPA is finalizing only a limited network of PM10−2.5 monitors at multi-pollutant NCore stations for scientific purposes. Therefore, since the existing requirement for PM10 sample frequency is daily sampling at the site with the expected maximum concentration in each area, and previous assessments of the 24-hour standard demonstrate that maximizing sample frequency minimizes decision errors, EPA is retaining the existing daily sample frequency requirement for the site with the expected maximum concentration in each area. This existing requirement also allows other sites in the same area to operate on a 1-in-6 day sample frequency. Sample frequency relief is possible for expected maximum concentration sites with concentrations well below the 24-hour PM10 NAAQS and in seasons exempted by the Regional Administrator.

    4. Monitoring Network Completion for PM10−2.5 and NCore Sites

    The proposed requirement for specified numbers of PM10−2.5 sites to be physically established no later than January 1, 2009 is not included in this final rule. However, by January 1, 2011, States must implement the less extensive monitoring for PM10−2.5, including speciation sampling, as part of the generally-applicable requirement to operate NCore multipollutant monitoring stations by that date. A plan for the implementation of the required NCore multipollutant monitoring stations, including site selection, is due July 1, 2009.

    Little comment was received on the requirement for the NCore multipollutant sites to be physically established no later than January 1, 2011, and that requirement remains unchanged in this final rule as EPA continues to believe that this is practical and desirable.

    5. System Modifications

    In part, EPA started this rulemaking based on the recognition by EPA and leaders of State and local monitoring agencies that State/local monitoring networks should be modified to reduce some types of monitoring activity in some areas and to begin new types of monitoring. The EPA proposed rule changes to revise the minimum required number of monitors for ozone (O3), PM2.5, lead (Pb), and PAMS pollutants and to eliminate altogether the minimum number of required monitors for carbon monoxide (CO), sulfur dioxide (SO2) and nitrogen dioxide (NO2) in order to utilize scarce resources more productively by allowing for reductions in the number of monitoring sites where appropriate. See 71 FR 2729.

    The EPA stated in the proposal that the remaining requirements for the minimum number of monitors for Pb, PM2.5, and O3 were intended to be necessary but not always sufficient to meet the requirements in section 110(a)(2)(B) of the Clean Air Act (CAA) that State implementation plans (SIPs) provide for operation of appropriate systems to monitor, compile, and analyze data on ambient air quality. Similarly, although EPA believes that one-size-fits-all rules for the number of CO, SO2, and NO2 monitors are no longer appropriate in light of the rarity of NAAQS violations for those pollutants, EPA believes that some monitoring should be continued in many areas for these pollutants. Accordingly, EPA proposed to continue to require States to propose changes in their monitoring networks, including discontinuation of monitors, and obtain EPA approval before making changes, even when the remaining minimum requirements, if any, for number of monitors would still be met after the changes. The EPA approval would be given by the Regional Administrator, usually through approval of the annual monitoring network plan, except for changes involving NCore sites, PAMS sites, and PM2.5 speciation trends sites which would require Administrator approval.

    While local situations need to be considered individually, EPA proposed six criteria for approval of requests to discontinue monitors. See 71 FR 2749. To summarize, the six criteria addressed: (1) Any monitor which could be shown to have a low probability of future violations; (2) a CO, PM10, SO2, or NO2 monitor that has been reading consistently lower than another monitor in the same area; (3) any highest reading monitor that has not indicated any NAAQS violation in the previous 5 years and for which the approved SIP provides for an alternative to continued monitoring; (4) any monitor which cannot be compared to a NAAQS because of siting considerations; (5) any monitor designed only to measure transport from upwind areas if another transport monitor were replacing it; and (6) any monitor for which logistical problems make continued operation at the current site impossible. Situations not addressed by these criteria would be considered on a case-by-case basis.

    The EPA received a number of comments on the proposed removal of the minimum monitoring requirements for some of the criteria pollutants, on the revision of the minimum numbers of monitors for other criteria pollutants, on the six proposed criteria for discontinuing monitors, and on the issue of discontinuing monitors more generally, mostly from State and local monitoring agency officials. The provisions of this final rule on minimum numbers of monitors for O3, PM2.5, PM10, and Pb are discussed in section V.E of this preamble. Comments on the other parts of the proposal are addressed here. A few commenters specifically endorsed all or part of these proposals, or at least the intention to facilitate reductions in unnecessary or duplicative monitoring activities. Most commenters expressed concern over the proposals.

    A number of commenters appear to have interpreted the proposals as indicators of network reductions EPA intended to require monitoring agencies to make, and expressed opposition to such reductions. The EPA clarifies here that EPA believes that proposals for network modifications should generally be initiated by the monitoring agency; EPA does not intend to compel any agency to remove any monitor. The proposals related to network modifications, and the provisions in this final rule, govern only EPA's consideration of changes which monitoring agencies seek to adopt. The EPA recognizes that funding constraints may require agencies to discontinue monitors that they otherwise would operate, but this reinforces the need for EPA review and the usefulness of having criteria for discontinuance to govern that review.

    A few commenters suggested that EPA include in the rule or provide via guidance specific formulas or calculation procedures regarding the estimation of the probability of a future NAAQS exceedance, which is the basis of the first of the six proposed adjudicative criteria. The EPA intends to provide guidance on this matter in the future, but we believe that binding formulas or procedures in rule form would preclude development of better general procedures and the sort of case-specific analysis of unique factors that is likely to be appropriate in some situations.

    A number of commenters stated that the six proposed criteria were overly focused on whether a monitor is providing data for use in making comparisons to the NAAQS for purposes of attainment/nonattainment findings, and that decisions to remove or retain a monitor should also recognize the utility of the monitor in satisfying other required monitoring objectives. Section 1 of the proposed appendix D of 40 CFR part 58 stated that air monitoring networks must be designed to meet three monitoring objectives: (1) Providing air pollution data to the public; (2) supporting compliance with ambient air quality standards and emission strategy development; and (3) supporting air pollution research studies. Some commenters pointed out that EPA has articulated in the draft National Ambient Air Monitoring Strategy [11] seven objectives for the NCore multipollutant monitoring stations (overlapping in part with the three objectives in section 1 of appendix D) and stated that single-pollutant stations should be considered to be part of an overall network to meet these objectives. The EPA agrees that these two sets of overlapping objectives are important and that monitors should not be discontinued without regard to whether these objectives will continue to be met, but EPA believes the proposed criteria, along with other provisions regarding approval of annual monitoring network plans and periodic network assessments, protect the required monitoring objectives. The paragraphs below address two objectives that were most often mentioned by commenters.

    Several commenters stated that ambient monitoring can serve as a continuing check on the compliance of a specific source, or sources in the aggregate, with applicable emissions limits. Given that factors such as wind direction, dispersion conditions, and atmospheric reactivity can greatly influence the relationship between emissions and ambient concentrations, EPA believes that situations are infrequent in which ambient monitoring is a critical, or the most important, element of source compliance monitoring. Other EPA rules address requirements for direct emissions and compliance monitoring for many types of sources. Ambient monitoring agencies will have the option of continuing to operate ambient monitors they find useful for this objective.

    Some commenters stated that the ability to track trends in air quality and assess whether those trends are consistent with trends expected from the emission control program in general or from specific control measures (i.e., accountability) could be impaired if too many existing monitors are removed. The EPA believes that tracking trends is most important for O3, PM2.5, and PM10 because these are the NAAQS with more than a few remaining nonattainment areas. For these pollutants the revised requirements in this final rule for minimum number of monitors, the new requirement for NCore multipollutant monitoring stations, and the interest of monitoring agencies in continuing these types of monitoring as indicated by the comments themselves will, in EPA's opinion, result in networks that are appropriately robust for tracking trends and assessing causal factors. The EPA believes that the availability of multiple collocated and time resolved measurements at NCore sites will be a major advantage in this work.

    The Response to Comments document available in the docket explains in more detail how the other objectives mentioned by commenters are consistent with the six proposed criteria.

    Accordingly, this final rule mirrors the proposals, with the following four exceptions:

    (1) In the first criterion, which as proposed would have allowed the removal of a monitor for any criteria pollutant if it has shown attainment over the last five years and has less than a 10 percent probability of exceeding 80 percent of the NAAQS over the next three years and if it is not specifically required by the attainment plan or maintenance plan, this final rule also conditions the removal of the last remaining SLAMS monitor in a nonattainment or maintenance area on the attainment plan or maintenance plan not having any contingency measure triggered by air quality concentrations. If a plan does have such a trigger, a plan revision to remove that trigger would have to be adopted by the State and approved by EPA. The EPA will address the requirements for such a revision at a future date.
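
    One way a monitoring agency might estimate the probability figure in this first criterion, pending EPA's forthcoming guidance, is a normal approximation fitted to recent design values, as sketched below in Python. The method and all values are illustrative assumptions only; as noted in the discussion of comments above, EPA intends to issue guidance rather than bind a particular formula in rule text.

```python
# Illustrative estimate of the probability of exceeding 80 percent of
# the NAAQS, using a normal approximation to historical design values.
# The approach and data are assumptions; EPA guidance may prescribe a
# different procedure.
from statistics import NormalDist, mean, stdev

def exceedance_probability(design_values, naaqs_level):
    dist = NormalDist(mean(design_values), stdev(design_values))
    return 1.0 - dist.cdf(0.8 * naaqs_level)

history = [21.0, 19.5, 22.3, 20.1, 18.9]  # hypothetical design values
p = exceedance_probability(history, 35.0)
print(p, p < 0.10)  # monitor may be eligible for removal if True
```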

    (2) While the preamble described a sixth criterion for approval of State proposals to discontinue a monitor, having to do with logistical problems at a current site, the proposed rule text inadvertently omitted this criterion. This final rule includes it.

    (3) The second and third criteria have been slightly revised to make them applicable also to the lower reading monitor of a pair that are in the same attainment area and county, and not just to the lower reading monitor of a pair that are in the same nonattainment area or maintenance area. A commenter pointed out the need for this revision to achieve the obvious intention of the proposal.

    (4) The third proposed criterion, worded to apply only to “the highest reading monitor * * * in a county,” required that a described monitor could be removed only if the approved SIP provided for a specific, reproducible approach to representing the air quality of the affected county in the absence of actual monitoring data. While EPA intended the highest reading monitor to be addressed in this third criterion, EPA did not intend to preclude the possibility that a lower reading monitor ineligible for removal under the first two criteria could be addressed also. This final rule revises the criterion to encompass any monitor not eligible for removal under the first two criteria where applicable.

    6. Annual Air Monitoring Data Certification

    The EPA proposed a shorter timeframe for States to submit the annual letter certifying ambient concentration and quality assurance data to the Administrator. See 71 FR 2749. Under current requirements, States have until July 1 to certify data from January 1 to December 31 of the previous year. For data collected in 2006, for example, the annual certification letter is due no later than July 1, 2007. Under the proposed requirement, the schedule for certification would be moved up 60 days, with the data certification letter required under the accelerated deadline to be due by May 1, 2009, for data collected in 2008. The EPA proposed this change to provide opportunity for an earlier start and completion for nationwide designation actions, to provide States and the public with earlier design values in time for most ozone seasons, and to support other data uses that could benefit from earlier data certification.

    In response, some commenters expressed reservations about the accelerated schedule as it applies to all submitted data, while others supported the proposal for continuous instruments that collect and report hourly data but not for data requiring lab analysis for samples collected in the field. These commenters were concerned about the feasibility and cost of meeting an accelerated schedule. The EPA notes that some States have recently provided certifications for filter-based data ahead not only of the July 1 deadline, but also of the proposed May 1 deadline, when such certifications were deemed advantageous by the States for data uses such as PM2.5 nonattainment designations. This suggests that all States could be capable of certifying data by the proposed May 1 deadline, if not earlier, if they invest in needed improvements in information technology or efficiencies in administrative procedures. Therefore, this final rule includes the proposed May 1 deadline. In recognition of the time necessary for States to adjust to the accelerated certification requirement, the implementation date has been delayed 1 year, until May 1, 2010, for data collected in 2009.

    One commenter questioned the types of annual summary reports that would be required to be submitted with the data certification letter, finding the proposed requirements of 40 CFR 58.15(b) unclear. The EPA notes that different reports were mentioned in the proposal to clarify the difference between SLAMS and SPM monitors (only FRM, FEM, and ARM SPM monitors are required to be certified) and to ensure that annual summary reports are provided for both types of monitors. Providing one annual summary report for certification of both SLAMS and SPM data is appropriate. An additional report providing a summary of precision and accuracy data is necessary to demonstrate that applicable monitors meet appendix A criteria.

    7. Data Submittal

    The EPA proposed to reduce the data reporting requirements associated with PM2.5 FRMs to ease the data management burden for monitoring agencies. See 71 FR 2748. The following Air Quality System (AQS) reporting requirements were proposed for elimination: Maximum and minimum ambient temperature, maximum and minimum ambient pressure, flow rate coefficient of variation, total sample volume, and elapsed sample time. AQS reporting requirements were retained for average ambient temperature and average ambient pressure, and any applicable sampler flags.

    The EPA also proposed a requirement for the submission of data on PM2.5 field blank mass in addition to PM2.5 filter-based measurements. See 71 FR 2749. Field blanks are filters that are handled in the field as much as possible like actual sample filters, except that ambient air is not pumped through them; they help quantify contamination and sampling artifacts. This requirement applies only to field blanks that States are already taking into the field and weighing through their laboratory procedures.
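
    As an illustration of how field blank data quantify such artifacts, the Python sketch below converts mean blank mass into an equivalent concentration using a nominal 24 m3 sample volume (the approximate volume of a 24-hour FRM sample at 16.67 L/min). The helper and all values are hypothetical; the rule requires only that the blank mass data be reported.

```python
# Express mean field-blank mass as an equivalent PM2.5 concentration.
# The 24 m3 nominal volume (16.67 L/min for 24 hours) and all values
# are illustrative; the rule only requires reporting blank mass data.
def blank_artifact_ug_m3(blank_masses_ug, nominal_volume_m3=24.0):
    return sum(blank_masses_ug) / len(blank_masses_ug) / nominal_volume_m3

print(blank_artifact_ug_m3([12.0, 8.5, 10.2]))  # ~0.43 ug/m3
```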

    Commenters supported the proposed changes to data submittal requirements and they are being finalized without modification. The requirement for reporting of field blank mass data begins with filters collected on or after January 1, 2007.

    8. Special Purpose Monitors

    The January 17, 2006 proposal included a background explanation of the historical distinctions between regular air monitors and special purpose monitors (SPMs) with respect to monitoring objectives, siting actions, quality assurance, and use of data. See 71 FR 2745. The EPA proposed a revision of the definition of SPM, to the effect that any SPM must be in excess of the required minimum number of monitors and that designation of a monitor as an SPM be made by the State. The EPA also proposed that States would continue to be able to choose to start and stop SPMs at will, without needing EPA approval and that States be required to submit all data from SPMs to the AQS operated by EPA. In addition, EPA proposed that States follow 40 CFR part 58 appendix A quality assurance requirements for any SPM that utilizes a FRM, FEM, or ARM instrument and which is sited consistently with the requirements of appendix E (which does not apply to SPMs on a mandatory basis). The existing rule provides that States follow these requirements only if the data from the SPM are intended by the State for use in attainment/nonattainment determinations.

    The EPA also proposed that data from the first 2 years of operation of an SPM (even if using a FRM, FEM, or ARM instrument and meeting appendix A and E requirements) would not be used by EPA in attainment/nonattainment findings for PM2.5 or O3 if the monitor stopped operating by the end of those 2 years. See 71 FR 2745. For CO, SO2, NO2, Pb, and the 24-hour PM10 NAAQS, EPA proposed that data from the first 2 years of operation of an SPM would not be used by EPA for nonattainment redesignations but that such data would be considered when determining whether a nonattainment area had attained the NAAQS. The reasons for this distinction by pollutant had to do with differences in the form of the respective NAAQS and whether the EPA action in question is mandatory or discretionary. These reasons were explained in detail in the preamble to the proposal. Finally, EPA proposed that currently operating monitors not already designated as SPMs could not be designated as SPMs after January 1, 2007.

    The EPA received many comments on these issues, mostly from State and local air monitoring officials but also from two industry groups. No commenter objected to the flexibility States have to start and stop SPMs. That flexibility is retained in this final rule.

    Some commenters pointed out an ambiguity in the proposed requirement that data from SPMs be submitted to AQS. The EPA intended, but did not clearly state in the proposal, that this requirement apply only to SPMs that are FRMs, FEMs, or ARMs and that are operated consistently with the requirements of 40 CFR 58.11 (network technical requirements), 40 CFR 58.12 (operating schedule), and part 58, appendix A (quality assurance requirements). These are the SPMs whose data will be of most interest to EPA and the public because, except for possible inconsistencies with the siting requirements of appendix E to part 58, these are the types of data that can be compared to the respective NAAQS. This final rule provides this clarification.

    One commenter suggested that the specific reference to the AQS data system be made more general, to provide for the development and use of other suitable data submission systems in the future. This comment is relevant to all monitoring data, not just data from SPMs. This final rule retains references to AQS. If AQS is replaced or supplemented with approved alternatives in the future, terminology can be updated at that time.

    One State official supported the proposal that SPMs be subject to the regular quality assurance requirements of appendix A if the SPM is a FRM, FEM, or ARM. All other commenters on this issue contended that States should be allowed more flexibility. Most of these commenters agreed that regular quality assurance practices are generally desirable, but stated that practical difficulties can arise at a specific SPM site, such that requiring regular quality assurance practices would effectively mean that the SPM could not legally be operated at all and the useful data it could have provided would be lost to users.

    After considering these comments, EPA continues to believe that regular quality assurance practices are practical and of reasonable cost in nearly all situations, as shown by successful adherence to these practices at thousands of regular monitoring stations. They are appropriate in most cases and should be the presumptive requirement. As proposed, this final rule provides for a transition period by delaying this requirement until January 1, 2009. However, EPA recognizes that unusual situations may exist in which exceptions should be allowed. For example, a State, perhaps with EPA encouragement, might operate an automated O3 monitor year-round but have difficulty getting personnel and equipment to the site regularly in winter due to road conditions. This final rule allows the Regional Administrator to approve other appropriate quality assurance practices if the requirements of 40 CFR part 58 appendix A would be physically and/or financially impractical due to physical conditions at the monitoring site and the quality assurance practices are not essential to achieving the intended data objectives. This approval can be given separately, or as part of the approval of the annual monitoring plan. Approval of alternative quality assurance practices for all or part of the year does not qualify data from the affected SPM for comparison to the relevant NAAQS.

    Most of the comments received on the SPM proposals addressed the application of SPM data to attainment/nonattainment findings and designations. One citizen supported the proposal. About 20 commenters argued for a general, indefinitely long prohibition on the use of data from SPMs for nonattainment findings and designations, for States to have a way of blocking EPA from using particular SPM data indefinitely, or for States to be able to negotiate in advance with EPA for particular SPM data to not be used. Those commenters who explained their position generally stated that the risk of a nonattainment finding would discourage voluntary special purpose monitoring that could benefit air quality management.

    In the proposal preamble (71 FR 2745, January 17, 2006), EPA stated that it understood, and to some degree sympathized with, the thrust of very similar input EPA had received during the development of the proposed rule, but that EPA believed that under the CAA it may not legally ignore technically valid data from FRM and FEM (and, by implication and logical extension, ARM) monitors when making attainment or nonattainment determinations. The comments have not provided EPA with any reason to change this view of our legal obligation. There are only two situations where EPA would not have to consider such data. One situation is when the data would be insufficient for making a finding because they are of insufficient duration given the averaging period or form of the relevant NAAQS. This was the basis for the proposal concerning PM2.5 and O3, for which the form of the NAAQS requires 3 years of data.

    The other situation is when EPA has the discretion to simply not make a finding or to take an action, for example by taking no action to redesignate an area to nonattainment even though a SPM indicates a new violation of a NAAQS subsequent to the area's initial designation as attainment. This was the basis for the proposal concerning the CO, SO2, NO2, Pb, and PM10 NAAQS. Unlike the PM2.5 and O3 NAAQS, the NAAQS for these pollutants have forms that allow a nonattainment finding based on only 1 or 2 years of data, either because the NAAQS is explicitly based on only one year of data or because a single year of data may include so many exceedances that it is certain that the average number of expected exceedances over three years will be greater than one. However, for these other NAAQS, EPA does not have a mandatory duty to make nonattainment redesignations until such time as the NAAQS are revised. In the absence of either a NAAQS revision or a State request for redesignation, the Administrator has discretion in determining whether to redesignate an area based on data from a SPM which has operated for two years or less. The EPA does regard air quality violations seriously, and does expect States to take actions to restore air quality to healthy levels in any areas that are experiencing violations. However, EPA recognizes that there are other ways to address such violations besides redesignating an area as nonattainment. For example, EPA can work directly with a State and nearby industries to take appropriate actions to reduce emissions that are contributing to the violation. The EPA has worked in this way with States in the past. In the case of PM10, EPA stated in section VII.B of the preamble to the NAAQS rule (printed in today's Federal Register) that because EPA is retaining the current 24-hour PM10 standards, new nonattainment designations for PM10 will not be required under the provisions of the Clean Air Act.

    With respect to the second situation, applicable to the CO, SO2, NO2, Pb, and 24-hour PM10 NAAQS, EPA believes it could have extended the proposed 2-year exclusion from use of SPM data in making nonattainment findings to a longer period. However, such a provision could exclude more data than appropriate and could prevent consideration of violations in making nonattainment decisions even when a SPM monitor has shown violations over 3 or more years. The EPA believes that in some and perhaps many situations like this, it would be good policy to avoid a nonattainment designation and to find other less prescriptive approaches to reducing risk to public health. EPA also believes, however, that it could be appropriate to base a nonattainment designation on such data in some other cases, where a nonattainment designation is the appropriate way to deal with a long-term nonattainment problem. Since under the final rule EPA still has the discretion not to make nonattainment redesignations based on 3 or more years of data if EPA so chooses, EPA concludes that the appropriate approach is not to extend the exclusion universally but rather to rely on the Administrator's discretion to redesignate areas only in appropriate cases.

    This final rule follows the proposed approach for use of data from SPMs. The EPA would like to emphasize, however, that States and other parties will have practical ways of obtaining useful information using SPMs without risk of a nonattainment redesignation. In many situations, the potential problem to be investigated, or the place under investigation, is such that a FRM, FEM, or ARM instrument meeting the siting requirements of 40 CFR part 58, appendix E is not the only suitable measurement system, and may not even be a preferred way to measure. For example, there are many commercially available PM2.5 monitors that lack FRM, FEM, or ARM status but would nevertheless be suitable for an initial study of PM2.5 concentrations in an unmonitored area of interest. In some other cases, 2 years may be sufficient to achieve the study objectives. Finally, under the 1997 rule (see statement at 71 FR 2719 and section 2.8.1.2.3 of appendix D to part 58 of the 1997 rule),[12] a SPM that is not population-oriented may not be used in comparisons to the PM2.5 NAAQS; this may be the situation in some studies focusing on near-source impacts as well as in some studies of transport of air pollution from rural upwind areas. If the Regional Administrator has approved alternative quality assurance practices in place of the requirements of appendix A, the data from the affected SPM are not eligible for comparison to the relevant NAAQS.

    In reviewing comments about SPMs, EPA noticed that the proposed rule text for 40 CFR 58.11(d) implied that all SPMs using FRM, FEM, or ARM methods must meet appendix E siting requirements. This was not our intention, as the study objective for a SPM may require it to be located inconsistently with appendix E requirements. The implied restriction in 40 CFR 58.11(d) as proposed conflicted with an explicit statement to the contrary in 40 CFR 58.20(b) as proposed. Removing this implication is certainly in keeping with the sense of most SPM-related comments, which supported flexibility for States to operate SPMs as they choose. The promulgated version of 40 CFR 58.11(d) is drafted so as to remove this implied restriction. Data from a SPM not sited consistently with appendix E are not eligible for comparison to the respective NAAQS, unless the State has requested and EPA has approved a waiver of these criteria.

    In the course of considering all the public comments on SPMs, EPA realized that the proposed restriction on designating pre-existing SLAMS monitors as SPMs after January 1, 2007 would have the effect of preventing a State from switching a monitor to SPM status even if EPA had approved the outright removal of that monitor under other provisions. This could be counter-productive. This final rule provides that if EPA has approved the discontinuation of a SLAMS monitor, the State may choose to retain the monitor and redesignate it to be a SPM. Such a monitor could be removed later without further EPA approval.

    9. Special Considerations for Data Comparisons to the National Ambient Air Quality Standards

    By way of background, the preamble to the proposed monitoring rule provided an explanation of when and how monitoring data are considered comparable to the respective NAAQS under existing rules and EPA policies. See 71 FR 2719-20. The EPA also proposed to relocate one of the provisions mentioned in the discussion, proposing to move pre-existing PM2.5 rule language currently found in section 2.8.1.2.3 of appendix D to 40 CFR 58.30 of subpart D without substantive change. This relocation would provide a more prominent rule location for monitoring requirements detailing the comparability of ambient data to the PM2.5 NAAQS. See 71 FR 2782. One commenter objected, not to this proposed rearrangement of rule language, but rather to the underlying existing (1997) requirement that PM2.5 sites must be population-oriented to be comparable to the PM2.5 NAAQS. This commenter stated that EPA had failed to justify any benchmark for defining an area as population-oriented. Another commenter challenged whether EPA had provided an adequate public health basis for this provision.

    The EPA considers these comments to be outside the scope of the proposal. EPA noted in the preamble to the monitoring proposal that some existing regulatory language was being reprinted without change and that such reprinting was done solely for the readers' convenience to aid in viewing the proposal in a single context (71 FR 2712). EPA also stated that all of the background description of existing regulatory provisions—including the provision the commenters challenged—was presented not to reexamine any of the background provisions but rather “to facilitate informed public comment” on certain aspects of the proposal other than these background provisions. These other provisions were “requirements for the proposed PM10−2.5 NAAQS”, “provisions for special purpose monitors”, provisions “related to the required spacing between ozone monitors and roadways”, and “certain quality assurance requirements” (71 FR at 2719). EPA thus did not seek comment on, reconsider, or otherwise reopen the pre-existing provision regarding population-oriented PM2.5 monitors (or any of the other provisions recited in the background section). The EPA notes, however, that the pre-existing rule and this final rule do provide the same definition of population-oriented, in 40 CFR 58.1, Definitions, which, while not quantified in terms of population affected, has served to guide PM2.5 monitor placement and the interpretation of monitoring data since 1997.

    The most controversial portion of this part of the proposal dealt with issues pertaining to the proposed NAAQS for PM10−2.5. The EPA proposed a new five-part suitability test for the comparison of PM10−2.5 data to the proposed qualified PM10−2.5 indicator. This test included an urbanized area population criterion, a block group population density criterion, a requirement for sites to be population-oriented, an exclusion for source-influenced microscale sites, and a site-specific assessment to ensure that data were dominated by certain sources of concern. See 71 FR 2736-2738. The EPA received extensive comment on the proposed PM10−2.5 qualified indicator and on the proposed PM10−2.5 NAAQS five-part site-suitability test. These issues are now moot since EPA is not adopting a NAAQS using a PM10−2.5 indicator. See also section III.C of the preamble to the final rule adopting revisions to the PM NAAQS, which explains why EPA did not adopt the proposed qualified indicator for thoracic coarse particles and why the proposed monitoring suitability criteria proved to be inappropriate.

    C. Appendix A—Quality Assurance Requirements for State and Local Air Monitoring Stations and Prevention of Significant Deterioration Air Monitoring

    A quality system provides a framework for planning, implementing, and assessing work performed by an organization and for carrying out required quality assurance (QA) and quality control (QC) activities. The proposed amendments to 40 CFR part 58, appendix A were intended to provide the requirements necessary to develop quality systems for monitoring SO2, NO2, O3, CO, PM2.5, PM10, and PM10−2.5 at SLAMS stations (including NCore stations), at PAMS and Prevention of Significant Deterioration (PSD) networks, and at SPM stations using FRM, FEM, or ARM monitors. The proposed revisions addressed responsibilities for implementing the quality system for EPA and monitoring organizations. They also addressed adherence to EPA's QA policy, DQOs, and the minimum QC requirements and performance evaluations needed to assess the data quality indicators of precision, bias, detectability, and completeness. In addition, the proposed amendments described the required frequency of the QC requirements and performance evaluations, the data to be collected, and the statistical calculations for estimates of the data quality indicators at various levels of aggregation. The revised statistical calculations would be used to determine attainment of the DQOs. The proposed amendments also addressed required auditing programs to help determine and ensure data quality comparability across individual monitoring programs.

    The EPA received some comments expressing concerns about the funding of the quality system. Funding issues are addressed in section III.E of this preamble. Substantive and procedural issues are addressed here.

    1. General Quality Assurance Requirements

    The EPA proposed to revise or include a number of general QA provisions that would serve to consolidate information and to ensure conformance to the QA requirements specified in EPA Order 5360.1 A2.

    The EPA proposed to consolidate the QA requirements for SLAMS and PSD stations from two separate appendices, 40 CFR part 58, appendices A and B, into one single appendix A because both programs have similar QA requirements. See 71 FR 2725. The EPA received only endorsements on the proposed consolidation and therefore this final rule consolidates these appendices.

    The EPA proposed to revise part 58, appendix A to conform to the current EPA quality assurance policies in EPA Order 5360.1 A2, which requires agencies that accept Federal grant funding for their air monitoring programs to have a QA program with certain elements, including quality management plans (QMPs), quality assurance project plans (QAPPs), and the identification of a QA management function. EPA received three sets of comments endorsing the revision and one comment expressing concern about the identification of the QA manager function. See 71 FR 2725. The proposed regulation would not have required that monitoring organizations identify a QA manager, but would have required that they provide for a QA management function, which provides independent oversight of the ambient air monitoring quality system. The EPA feels that the proposed language captures the essence of the requirements in EPA Order 5360.1 A2 while accommodating the diverse nature of the ambient air monitoring community, which is made up of large and small (local and Tribal) organizations. Consistent with the majority of positive feedback, and the need for conformance to the EPA Order, this final rule matches the proposed rule on this point.

    The EPA proposed to revise the QA program by emphasizing the DQO process. See 71 FR 2725. A DQO is a qualitative and quantitative statement that defines the appropriate quality of data needed for a particular decision—for example, the data quality necessary for EPA or a monitoring organization to make data comparisons against the NAAQS. The DQOs help to establish the requirements for the data quality indicators of precision, bias, completeness, and detectability and the rationale for the acceptance criteria for these indicators. The EPA received a number of endorsements on this approach and did not receive negative comments. This final rule matches the proposed rule.

    2. Specific Requirements for PM10−2.5, PM2.5, PM10 and Total Suspended Particulates

    The EPA proposed to revise some of the PM2.5 and PM10 QA requirements in an attempt to provide consistency in implementation and assessment. Since PM10−2.5 monitoring was proposed to be required, EPA included similar QA requirements for this monitoring. These requirements included the implementation of flow rate audits conducted by the monitoring organization, collocated monitoring, and performance evaluations.

    The EPA proposed to make all the requirements for flow rate verifications and audits consistent among the PM10−2.5, PM2.5, and PM10 methods. See 71 FR 2728. This requirement would have increased the audit frequency for PM10 monitoring and decreased the audit frequency for PM2.5 monitoring. Most commenters endorsed the proposed approach, but a few voiced concerns regarding the increased frequency for high-volume samplers for PM10 and total suspended particulates (TSP), which operate somewhat differently and are not as easy to audit. The EPA reviewed the comments and revised the flow rate verification requirement from monthly to quarterly for the high-volume manual instruments sampling for PM10 and TSP only.

    The EPA proposed to revise the sampling frequency for the implementation of the PM2.5 Performance Evaluation Program (PEP). See 71 FR 2726. This proposed approach, based on historical PM2.5 precision and bias data, identified the minimum number of performance evaluations required for all primary quality assurance organizations to provide an adequate assessment of bias, rather than the current requirement that a uniform 25 percent of monitors in a primary quality assurance organization be evaluated each year. The revision would establish a suitable sampling frequency of five valid audits a year for organizations with five or fewer monitoring sites and eight valid audits a year for organizations with more than five monitoring sites. The majority of commenters approved of the reduced PEP frequency. A few commenters suggested that some primary quality assurance organizations do not need to be audited and said PEP audits should focus only on those producing inferior results. The EPA disagrees with this comment and believes that because the PEP program needs to provide a periodic estimate of bias for each primary quality assurance organization, the program must be implemented at each primary quality assurance organization.
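    (As a purely illustrative rendering of this audit-count rule, a minimal sketch in Python follows; the function name and structure are ours, not rule text.)

        def required_pep_audits(pm25_sites_in_pqao: int) -> int:
            # Minimum valid PM2.5 PEP audits per year for a primary quality
            # assurance organization (PQAO): five audits for organizations
            # with five or fewer sites, eight for more than five sites.
            return 5 if pm25_sites_in_pqao <= 5 else 8

        # Examples: a 3-site organization needs 5 audits; a 12-site one needs 8.
        assert required_pep_audits(3) == 5
        assert required_pep_audits(12) == 8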

    There was also a comment suggesting further reductions to the auditing frequency or requiring the same number of audits over a longer period of time. The proposed audit cycle is based on 3 years because that is how many years of data are collected for comparison to the PM2.5 NAAQS; the audit frequency was therefore based on the number of audit values needed to give EPA confidence in its bias estimates at the primary quality assurance organization level over a 3-year period. Therefore, this final rule matches the proposed rule.

    The EPA proposed to reduce the lower ends of the concentration limits for which collocated data can be used to provide precision estimates. See 71 FR 2727. The lower limits would be reduced from 6 micrograms per cubic meter (μg/m3) to 3 μg/m3 for PM2.5 and PM10c (low-volume samplers) and from 20 μg/m3 to 15 μg/m3 for PM10 (high-volume samplers). Statistical evaluation of 3 years of PM2.5 and PM10 data revealed comparable estimates of precision using data from both of these reduced concentration ranges, and also revealed that the addition of the data at these lower ranges will increase the level of confidence in the precision estimates. The majority of commenters endorsed the approach, but a few were concerned that including lower concentrations might, given the statistics used to estimate precision, yield estimates showing greater imprecision. EPA's evaluation, which included the data from these lower concentrations, did not show any major increase in imprecision compared with omitting those data.[13] Since EPA has proposed the use of target upper confidence limits for statistical assessments, and an upper confidence limit is influenced by sample size, lowering the concentration cutoffs tends to tighten (lower) the confidence limits because more data points are available in the sample, which offsets any greater variability that might be associated with lower concentrations. Therefore this final rule matches the proposed rule.
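    (To illustrate why additional data pairs tighten the assessment, one common chi-square-based form of a one-sided upper confidence limit on a coefficient of variation is shown below; this is an illustration only, and the governing estimators are those specified in appendix A.)

        \[ \mathrm{CV}_{\mathrm{ub}} \;=\; \widehat{\mathrm{CV}} \cdot \sqrt{\frac{n-1}{\chi^{2}_{\alpha,\,n-1}}} \]

    Here n is the number of valid collocated data pairs and \(\chi^{2}_{\alpha,\,n-1}\) is the lower \(\alpha\) quantile of the chi-square distribution with n−1 degrees of freedom. The square-root multiplier shrinks toward 1 as n grows, so admitting more low-concentration pairs tightens the limit and can offset the added variability of those pairs.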

    Based upon the decision that there is no need to implement a PM10−2.5 monitoring program broad enough to systematically determine attainment/nonattainment with a PM10−2.5 NAAQS, EPA has modified the proposed PM10−2.5 collocation precision requirement and the Performance Evaluation Program (PEP) requirements in this final rule. See 71 FR 2726. The proposed quality system for PM10−2.5 was developed for NAAQS comparison purposes and would have provided reliable precision and bias estimates at the primary quality assurance organization level of aggregation. However, EPA is not adopting a NAAQS using a PM10−2.5 indicator at this time, so EPA is now requiring a network of PM10−2.5 monitors only at NCore stations. The goal of these monitors will be to improve our understanding of PM10−2.5, support health studies for future reviews of the NAAQS, and promote improvements in the monitoring technology. States may choose to operate additional PM10−2.5 monitors. With this in mind, the quality system need not be focused on data quality assessments at the primary quality assurance organization level of aggregation but rather can and should be focused on understanding and controlling the data quality of each of the methods used to collect PM10−2.5. Also, since it is now anticipated that a primary quality assurance organization would have very few PM10−2.5 sites, the proposal, if adopted without change, would have required almost every NCore site to have a collocated second PM10−2.5 monitor. Nor would the proposal have provided for assessment of FEM precision even if FEMs are approved and deployed in place of some or most FRMs, since, as proposed, the first collocation of an FEM in a primary quality assurance organization would always be with a FRM. To avoid these undesirable outcomes, this final rule requires fewer collocated samplers than the proposal would have. Under this final rule, EPA will ensure that collocated sampling for estimating precision is implemented at 15 percent of FRMs (all FRMs aggregated) and 15 percent of the FEMs of each method designation. The number of collocated sites would thus be based on the size of the final PM10−2.5 network. In order to provide a distribution of collocation across the United States, EPA will require, at a minimum, one collocated site in each EPA Region. The Regional Administrator shall select the sites for collocation. The site selection process will also consider selecting States with more than one PM10−2.5 site to have one or two of the required collocations and will aim for an appropriate distribution among rural and urban sites.
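    (A rough sketch of this counting logic follows, in Python. It is illustrative only: the function, its inputs, the placeholder method designation, and the assumption that the 15 percent figures round up are ours, not rule text.)

        import math

        def pm10_25_collocation_count(num_frms, fems_by_designation, num_regions):
            # 15 percent of all FRMs (aggregated) plus 15 percent of the FEMs
            # of each method designation, assuming each figure rounds up, with
            # a floor of one collocated site per EPA Region with monitors.
            count = math.ceil(0.15 * num_frms)
            for designation, n in fems_by_designation.items():
                count += math.ceil(0.15 * n)
            return max(count, num_regions)

        # Example: 75 FRMs, 10 FEMs of one (placeholder) designation, and
        # sites in all 10 EPA Regions -> 12 + 2 = 14 collocated samplers.
        print(pm10_25_collocation_count(75, {"EQPM-placeholder": 10}, 10))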

    For the PEP, this final rule departs from the proposal by requiring only one PEP audit at one PM10−2.5 site in each primary quality assurance organization each year. The proposed rule would have required five or eight PEP audits for PM10−2.5 in each organization. See 71 FR 2787, 2788. Since the PEP is already being run for the PM2.5 network, and it is expected that the PM10−2.5 FRMs will utilize the same FRMs as the PM2.5 samplers, the PEP audit for the PM10−2.5 site can count towards the required number of PEP audits for PM2.5 sites. It will be necessary to place a PM10c PEP sampler at the NCore site also, but this incremental requirement will not be a significant additional resource burden. When and if FEMs are implemented at some PM10−2.5 sites, the PEP audit will be an additional audit at those particular sites and will require additional resources for auditing.

    The incremental cost of placing and operating PM10−2.5 samplers for purposes of tracking precision will also be minor in most cases. Many of the primary quality assurance organizations that will implement the PM10−2.5 monitor at NCore sites are required to implement PM2.5 and PM10 networks. Some or most of the initial PM10−2.5 deployments will be with manual FRM instruments, similar to the instruments used in the PM2.5 networks and to some of the instruments used in the PM10 networks. The EPA will allow collocated PM10−2.5 monitors to be included in the primary quality assurance organization's count for required PM2.5 and PM10 collocation. In most cases, the primary quality assurance organization's collocation requirements for FRMs will not increase overall, since it is not anticipated that any one primary quality assurance organization will have many additional PM10−2.5 sites that are not already both PM2.5 and PM10 sites. The only restriction to this aggregated collocation count will be for monitoring organizations that are operating high-volume PM10 samplers. Since the PM10c monitor in a PM10−2.5 FRM will be a low-volume sampler, PM10 high-volume and PM10 low-volume samplers cannot be aggregated together in the collocation count, and at least one collocated monitor must be identified for each type within each primary quality assurance organization. Therefore, it is expected that the 15 percent collocation requirement for PM10−2.5 FRMs will not actually increase the overall collocation burden at the majority of the primary quality assurance organizations beyond what they would have been required to implement for their PM10 and PM2.5 networks.

    For any FEMs that might be used at PM10−2.5 sites, EPA will require 15 percent collocation of each method designation, or at least two collocations within each method designation. The EPA will require two collocations so that one FEM instrument can be collocated with another of the same method designation to provide an estimate of within-method precision, and a second can be collocated with a FRM to provide an estimate of bias. These collocations would not necessarily need to be at separate monitoring sites.

    3. Particulate Matter Performance Evaluation Program and National Performance Audit Programs

    The EPA proposed to revise the current regulatory requirements dealing with responsibilities for independent assessments of monitoring system performance. See 71 FR 2726. These evaluations are the subject of sections 2.4 and 3.5.3.1 of the existing (1997) appendix A to 40 CFR part 58. Section 2.4 applied to all NAAQS pollutants, and section 3.5.3.1 applied only to PM2.5.

    The EPA proposed to revise the text of 40 CFR part 58, appendix A to cover PM10−2.5 and also to clarify that it is the responsibility of each monitoring organization to make arrangements for, and to provide any necessary funding for, the conduct of adequate independent performance evaluations of all its FRM or FEM criteria pollutant monitors. The proposed language also clearly indicates that it is the monitoring organization's choice whether to obtain its independent performance evaluations through EPA's National Performance Audit Program (NPAP) and PM2.5 PEP programs, or from some other independent organization. An independent organization could be another unit of the same agency that is sufficiently separated in terms of organizational reporting and which can provide for independent filter weighing and performance evaluation auditing. The proposed approach would ensure that adequate and independent audits are performed and would provide flexibility in the implementation approach.

    Monitoring organizations that choose to comply with the revised provisions of appendix A to 40 CFR part 58 regarding performance evaluations by relying on EPA audits, for PM2.5, PM10−2.5, and/or other NAAQS pollutants, would be required to agree that EPA hold back part of the grant funds they would otherwise receive directly. These funds would be used by EPA to hire contractors to perform the audits and to purchase expendable supplies. To ensure national consistency and effective audits, EPA included provisions to ensure certification of data comparability for audit services not provided by EPA and for traceability of gases and other audit standards to national standards maintained by the National Institute for Standards and Technology.

    The EPA received a broad range of comments on this proposed revision. The EPA received a few comments in support of these programs and one commenter felt that the PEP audits should be increased. In general, the comments expressing concern with the proposed language did not suggest that these programs were not necessary but were concerned about some technical aspects of the programs or with funding implications. Funding issues are addressed in section III.E of this preamble.

    The EPA received a number of comments expressing concerns that allowing the monitoring agencies to implement the audit programs themselves or through third parties would increase the variability in the performance evaluation data. Since one of the major goals in the historically centralized and federally implemented PEP and NPAP programs has been the evaluation of data comparability, EPA is also concerned about any additional variability and its effect on data comparability. It has been EPA's practice, with regard to any State which already performs these audits, to perform side-by-side comparisons of EPA's equipment and procedures and the State's procedures to ensure both are producing results of acceptable quality. The EPA has successfully performed these comparisons with the California Air Resources Board's audit system. These comparisons will be expanded to include any additional States which choose to perform audits themselves or through third parties, rather than ask EPA to do so. During the comment period, EPA asked the monitoring organizations whether, assuming finalization of the proposed rule changes, they would continue to use the federally implemented program or perform the audits themselves. For 2007, only three monitoring organizations (besides the one already implementing NPAP) opted to implement the NPAP, and three monitoring organizations (besides the two already implementing PEP) opted to implement the PEP. The EPA believes it has the capability to ensure that these State-implemented programs will produce data of a quality comparable to the federally implemented program.

    The EPA also received comments stating concerns about the stringency of the definition of “adequate and independent.” Adequacy refers to the number of audits administered at any primary quality assurance organization and the technical procedures used in the audits. This final rule does not impose adequacy requirements beyond what EPA currently implements for the federally implemented program. The EPA evaluates data quality at the aggregation called “reporting organization” (which was changed to “primary quality assurance organization” in the proposal). The EPA feels that it needs to collect enough data to be able to judge data quality within each primary quality assurance organization over the same period that it uses the data for comparison to the NAAQS (3 years).

    In the case of the PEP for PM2.5, today's action requires five audits per year for organizations with five or fewer sites and eight audits for those organizations with more than five sites, the same as proposed. The number of audits aggregated over three years provides a reasonable estimate of bias at a primary quality assurance organization within an acceptable level of confidence. For the NPAP program addressing NAAQS for CO, SO2, Pb, and NO2, the goal is to perform audits on about 20 percent of the sites each year; a uniform rotation at that rate would reach every site in about 5 years, but since there may be a number of high priority sites within a primary quality assurance organization that should be audited more often, it is anticipated that NPAP might audit each site within a primary quality assurance organization over about 7 to 8 years. This 20 percent goal is the current EPA practice, but was not proposed to be required by rule and, therefore, does not appear in this final rule.

    There were a few comments suggesting that some primary quality assurance organizations do not need to be audited and that EPA mandatory audits for CO, SO2, Pb, and NO2 should only focus on those organizations producing inferior results. The EPA continues to believe that it is important to develop an estimate of bias for each primary quality assurance organization. To do this, the audit program must be implemented at each primary quality assurance organization. The NPAP performs audits using a through-the-probe approach, which is generally not how the primary quality assurance organizations perform their own audits. By auditing some stations within a primary quality assurance organization each year using the through-the-probe approach, the NPAP can identify problems which the organization may not be aware of on its own. Also, EPA continues to believe that it is necessary to provide an adequate assessment of data comparability of all primary quality assurance organizations every year.

    There were also comments concerning the requirement to use independent filter weighing laboratories for the implementation of the PEP. When EPA first implemented the PEP program, EPA established two independent laboratories to weigh filters for the PEP audits. Due to program efficiencies, EPA is now using one filter weighing laboratory. If primary quality assurance organizations implement the PEP themselves, they should not be able to utilize the same laboratory in which they weigh their routine sampler filters since any bias or contamination that might occur at the routine lab will also be “passed on” to the PEP filter. Because the PEP provides an estimate of bias (systematic error), it is necessary to avoid having a systematic bias occurring in the routine filter weighing lab affect both the PEP filters and the routine filters. Primary quality assurance organizations interested in implementing the PEP themselves have the option to make arrangements with other State labs, contractor labs, or utilize the PEP national lab.

    The EPA believes that both the NPAP and PEP programs serve as an integral part of the overall ambient air monitoring program quality system and provide EPA and the public with independent and objective assessments of data quality and data comparability. Both programs provide the only quantitative independent assessments of data quality at a national level. Therefore, the proposed language was not changed and this final rule matches the proposed rule.

    4. Revisions to Precision and Bias Statistics

    The EPA proposed to change the statistics for assessment of precision and bias for criteria pollutants. See 71 FR 2727. Two important data quality indicators that are needed to assess the achievement of DQOs are bias and precision. Statistics in the current requirements of 40 CFR part 58, appendix A (with the exception of PM2.5) combine precision and bias together into a probability limit at the primary quality assurance organization level of aggregation. Since the standard EPA DQO process uses separate estimates of precision and bias, EPA examined separate assessment methods that were statistically reasonable and simple.

    For SO2, NO2, CO, and O3, EPA proposed to estimate precision and bias using confidence intervals at the site level of data aggregation rather than at the primary quality assurance organization level. Estimates at the site level can be accomplished for the automated methods for SO2, NO2, CO, and O3 because sufficient QC information is collected at the site level to perform adequate assessments.

    The precision and bias statistics for PM measurements (PM10, PM10−2.5, and PM2.5) are generated at the primary quality assurance organization level because, unlike the gaseous pollutants, cost constraints mean that only a percentage of sites have precision and bias checks performed in any year, and only a few times per year. As with the gaseous pollutants, the statistics would use the confidence limit approach. Using a consistent set of statistics simplifies the procedures.

    The EPA also proposed to change the precision and bias statistics for Pb to provide a framework for developing and assessing a DQO. See 71 FR 2727. The QC checks for Pb come in three forms: Flow rate audits, Pb audit strips, and collocation. The EPA proposed to combine information from the flow rate audits and the Pb audit strips to provide an estimate of bias. Precision estimates would still be made using collocated sampling but the estimates would be based on the upper 95 percent confidence limit of the coefficient of variation, similar to the method described for the automated instruments for SO2, NO2, CO, and O3.
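    (The following is a numerical sketch of the confidence-limit approach described above, written in Python. It is illustrative only: the helper names are ours, and the governing estimators in appendix A differ in detail, e.g., in how the paired differences are scaled and in the quantile used.)

        import math
        from statistics import mean, stdev
        from scipy.stats import chi2

        def percent_differences(primary, collocated):
            # Relative percent difference of each collocated pair, computed
            # against the average of the two measurements.
            return [100.0 * (y - x) / ((x + y) / 2.0)
                    for x, y in zip(primary, collocated)]

        def cv_upper_limit(d, alpha=0.05):
            # Upper (1 - alpha) confidence limit on the coefficient of
            # variation of the percent differences, via the lower
            # chi-square quantile.
            n = len(d)
            return stdev(d) * math.sqrt((n - 1) / chi2.ppf(alpha, n - 1))

        # Hypothetical collocated PM2.5 pairs (ug/m3):
        x = [8.1, 12.4, 6.9, 15.2, 9.8]
        y = [8.4, 12.1, 7.3, 14.6, 10.1]
        d = percent_differences(x, y)
        print(round(mean(d), 2))            # rough bias indicator
        print(round(cv_upper_limit(d), 2))  # rough precision indicator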

    The EPA received only positive comments on the proposed statistics, along with some typographical corrections. This final rule matches the proposed rule.

    5. Other Program Updates

    The EPA proposed several QA program changes to update the existing requirements in 40 CFR part 58 to reflect current program needs and terminology.

    The EPA proposed to remove SO2 and NO2 manual audit checks. A review of all SLAMS/NAMS/PAMS sites by monitor type revealed that no monitoring organizations are using manual SO2 or NO2 methods, nor are any monitoring organizations expected to use these older technologies. The EPA received only comments endorsing the removal of the manual audit checks. Therefore, this final rule matches the proposed rule.

    The EPA proposed to change the concentration ranges for QC checks and annual audit concentrations. The one-point QC check concentration ranges for the gaseous pollutants SO2, NO2, O3, and CO were expanded to include lower concentrations. Lower audit ranges were added to the concentration ranges for the annual audits. Adding or expanding the required range to lower concentrations was appropriate due to the lower measured concentrations at many monitoring sites as well as the potential for NCore stations to monitor areas where concentrations are at trace ranges. In addition, EPA proposed that the selection of QC check gas concentration must reflect the routine concentrations normally measured at sites within the monitoring network in order to appropriately estimate the precision and bias at these routine concentration ranges. The majority of the comments EPA received on this proposal were positive, but EPA received comments asking for more guidance on how a monitoring organization should choose the appropriate audit ranges. The EPA would like to provide as much flexibility as possible for monitoring organizations to use their local knowledge of their monitoring sites to choose their audit concentration ranges. Accordingly, in this final rule, section 3.2.2.1 of appendix A to part 58 establishes a non-binding goal that the primary quality assurance organization select the three audit concentration ranges which bracket 80 percent of the routine monitoring concentrations at the site. In general, then, with some minor modification to address comments, this final rule matches the proposed rule.
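    (One way a monitoring organization might operationalize the 80 percent bracketing goal is sketched below in Python; the central-percentile choice and the helper are hypothetical illustrations, not rule text or EPA guidance.)

        from statistics import quantiles

        def bracketing_interval(routine_concs, coverage=0.80):
            # Central interval containing `coverage` of routine concentrations
            # (the 10th-90th percentiles for 80 percent coverage); the three
            # chosen audit concentration ranges should bracket this interval.
            pts = quantiles(routine_concs, n=100)   # 1st..99th percentiles
            lo = int((1.0 - coverage) / 2 * 100)    # -> 10
            hi = 100 - lo                           # -> 90
            return pts[lo - 1], pts[hi - 1]

        # Hypothetical routine hourly O3 concentrations (ppm):
        data = [0.018, 0.022, 0.035, 0.041, 0.052,
                0.048, 0.030, 0.060, 0.025, 0.044]
        print(bracketing_interval(data))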

    The EPA proposed to revise the PM10 collocation requirement. See 71 FR 2726. Fifteen percent of all PM2.5 sites are required to maintain collocated samplers. For PM10, the collocation requirements in the existing (1997) regulation were three alternative values based on the number of routine monitors within a primary quality assurance organization. For consistency, the proposed amendments would have changed the PM10 collocation requirement to match the PM2.5 requirement. The EPA did not receive any comments on this proposed change. Therefore, this final rule matches the proposed rule.

    The EPA proposed to revise the requirements for PM2.5 flow rate audits. See 71 FR 2728. Based on an evaluation of flow rate data and discussions within the QA Strategy Workgroup,[14] EPA proposed to reduce the frequency of flow rate audits from quarterly to semiannually and to remove the alternative method which allows for obtaining the precision check from the analyzer's internal flow meter without the use of an external flow rate transfer standard. Most monitoring organizations participating in the QA Strategy Workgroup considered auditing with an external transfer standard to be the preferred method and believed that the quarterly audit data demonstrated the instruments were sufficiently stable to reduce the audit frequency. The EPA did not receive any comments on this proposal; therefore, this final rule matches the proposed rule.

    D. Appendix C—Ambient Air Quality Monitoring Methodology

    1. Applicability of Federal Reference Methods and Federal Equivalent Methods

    The EPA proposed that monitoring methods used in the multipollutant NCore, SLAMS, and PAMS networks be FRMs, FEMs, ARMs, or, where appropriate, other methods designed to meet the DQOs of the network being deployed. See 71 FR 2731. Specifics on the monitoring methods proposed for use at each type of site are described below.

    The EPA proposed that NCore multipollutant stations must use FRMs or FEMs for criteria pollutants when the expected concentration of the pollutants was at or near the level of the NAAQS. For criteria pollutant measurements of CO and SO2, where the level of the pollutant is well below the NAAQS, EPA observed that it may be more appropriate to operate higher sensitivity monitors than typical FRM or FEM instruments. See 71 FR 2728. In these cases, higher sensitivity methods were expected to support additional monitoring objectives that conventional FRMs or FEMs cannot. In some cases, higher-sensitivity gas monitors have also been approved as FEMs and can serve both NAAQS and other monitoring objectives. Options for high-sensitivity measurements of CO, SO2, and total reactive nitrogen (NOy) are described in the report, “Technical Assistance Document for Precursor Gas Measurements in the NCore Multipollutant Monitoring Network.” Comments regarding monitoring methods used at NCore stations are addressed in section V.E.1 of this preamble.

    The EPA proposed that SLAMS use FRMs or FEMs for criteria pollutants. See 71 FR 2728. The EPA also proposed that these sites have the additional option of using ARMs for PM2.5. Approved regional methods are described in section V.D.2 of this preamble.

    Photochemical assessment monitoring stations (PAMS) were proposed to be required to use FRM or FEM monitors for O3, with most expected to use the O3 ultraviolet photometry FEM and the nitric oxide (NO) and NO2 chemiluminescence FRM for criteria pollutant measurements. See 71 FR 2728. Methods for volatile organic compounds (VOC), including carbonyls, for additional gaseous nitrogen measurements such as NOy, and for meteorological parameters are routinely operated at PAMS. Because these measurements are not of criteria pollutants, the methods were not subject to the requirements for reference or equivalent methods. However, these methods were described in detail in the report, “Technical Assistance Document (TAD) for Sampling and Analysis of Ozone Precursors.” [15]

    The EPA proposed that SPM sites have no restrictions on the type of method to be utilized. While FRMs and FEMs can be employed at SPM sites, other methods, including continuous, high-sensitivity, and passive methods, may also be utilized. Because the SPM provision was designed to encourage monitoring, agencies could design SPM sites with methods to meet monitoring objectives that may not be achievable with FRMs or FEMs. Additional information on SPMs is included in section V.E.8 of this preamble.

    The EPA received several comments on its proposed approach for ambient air monitoring methodology. Some of these comments expressed concern that requiring only designated reference or equivalent methods takes away flexibility and the drive for improvements to air quality instrumentation. The EPA agrees that some flexibility is desirable for agencies to use innovative methods that can support other objectives beyond NAAQS decision making. However, CAA section 319 requires “* * * an air quality monitoring system throughout the U.S. which utilizes uniform air quality monitoring criteria and methodology * * *”. The EPA recognizes that there may be occasions when a unique method is better suited to meet a specific monitoring objective that is different from NAAQS decision making. In these cases, EPA will allow for these innovative methods, so long as the monitoring agency is not attempting to use them to meet minimum requirements for the number of monitors for a given criteria pollutant. For example, a low cost method might be applied as a SPM to provide short term data for validation of an air quality model.

    2. Approved Regional Methods for PM2.5

    The EPA proposed amendments that expanded the allowed use of alternative PM2.5 measurement methods through ARMs. See 71 FR 2729. The EPA also proposed to extend the existing provisions for approval of a nondesignated PM2.5 method as a substitute for a FRM or FEM at a specific individual site to a network of sites. This approval would be extended on a network basis to allow for flexibility in operating a hybrid network of PM2.5 FRM and continuous monitors. The size of the network in which the ARM could be approved would be based on the location of test sites operated during the testing of the candidate ARM. The proposed amendments would have required that test sites be located in urban and rural locations that characterize a wide range of aerosols expected across the network. A hybrid network of monitors was envisioned to address monitoring objectives beyond just determining compliance with NAAQS. The hybrid network was expected to lead to a reduced number of existing FRM samplers and an increase in continuous ARM samplers that would all be approved for direct comparison with the applicable forms of the PM2.5 NAAQS.

    Many comments were received on EPA's proposal regarding ARMs for PM2.5. Several commenters suggested requiring on-going collocation with an FRM. Commenters also raised concerns about ensuring data quality, especially in light of the lower level of the 24-hour PM2.5 NAAQS and therefore the perceived need to ensure that the statistical criteria are met in each season. One commenter was so concerned about the data quality issues that the commenter recommended dropping the ARM provision. Other commenters voiced strong support for the ARM provision, but also recommended that EPA allow for less collocation with FRMs than the 30 percent that was proposed. Several commenters recommended that EPA allow non-linear data adjustment factors as are used for AIRNow and mapping purposes.

    In reviewing comments on the provision for ARMs, EPA agrees that data quality issues need to be appropriately addressed. Since ARMs will be used for several monitoring objectives, including NAAQS attainment/nonattainment determinations, they must meet the Class III FEM performance criteria set out in part 53. However, as proposed, these performance criteria left open the possibility that, in cleaner environments where concentrations approach background levels of PM2.5, approved methods might have levels of bias unacceptable for other monitoring objectives. Therefore, the Class III equivalency criteria, which are the same criteria used for PM2.5 ARMs, have been strengthened to address concerns about additive bias in cleaner environments. The EPA performed an extensive investigation into developing equivalency criteria for PM2.5 continuous methods. One of the conclusions from that process was that continuous methods, by virtue of being able to provide a sample every day, generate data with more certainty for decision making than methods used with lower sample frequencies (e.g., a 1-in-3 day sample schedule), all other factors being equal. Although biases can be seasonal, correlation combined with the other performance criteria will guard against high biases in one season cancelling out low biases in another. Together, the performance criteria and the daily sample schedule will ensure that data quality objectives are met when making NAAQS decisions with data from ARMs.

    With respect to requiring on-going collocation with FRMs at 30 percent of the sites with continuous PM2.5 monitors, EPA has considered how this would affect agencies with many continuous monitors and finds it unnecessary to require such a large absolute number of collocated sites, although the number of collocated FRMs under a 30 percent requirement makes sense for smaller networks. Therefore, this final rule requires monitoring agencies to have 30 percent collocation (rounded up) only of the ARMs they count towards the applicable minimum number of required FRM/FEM/ARM sites, rather than 30 percent of their full networks of ARMs. For example, an agency counting seven ARMs toward its minimum would need three collocated FRMs (30 percent of seven, rounded up), regardless of how many additional ARMs it operates.

    For the issue of non-linear data transformations, this final rule specifically allows data transformations when using an ARM, including non-linear ones, so long as the transformations are described in both the ARM application and the monitoring agency's quality assurance project plan (or an addendum to the QAPP), the transformations are applied prospectively, and the ARM application provides details on how often or under what circumstances they will be recalculated, based on what data, and by which analytical method.
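    (To make the idea of a prospective, documented transformation concrete, a minimal sketch follows in Python. It is hypothetical: the power-law functional form, the fitting window, and all names are ours; an actual transformation must be the one described in the ARM application and QAPP.)

        import numpy as np

        def fit_power_law(frm, arm):
            # Fit a non-linear transformation FRM ~ a * ARM**b on a fixed
            # historical window of collocated data (log-log least squares).
            b, log_a = np.polyfit(np.log(arm), np.log(frm), 1)
            return np.exp(log_a), b

        def apply_transformation(arm_value, a, b):
            # Apply the frozen coefficients prospectively to new ARM data.
            return a * arm_value ** b

        # Hypothetical collocated training data (ug/m3):
        frm = np.array([7.8, 11.9, 6.5, 14.8, 9.6])
        arm = np.array([8.6, 13.1, 7.4, 16.0, 10.4])
        a, b = fit_power_law(frm, arm)
        print(round(apply_transformation(12.0, a, b), 2))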

    Since participation in seeking approval of ARMs is voluntary, and approval of an ARM applies only in the territory of the agency seeking approval, no monitoring agency having concerns will be required to utilize the ARM provisions. However, for many agencies this approach will offer an opportunity to improve their monitoring network's utility by using methods that can serve multiple objectives while having lower costs. Therefore, EPA is finalizing the ARM provisions as proposed, with three exceptions: the additive bias requirement has been strengthened, the collocation requirement has been changed, and the use of data transformations, including non-linear ones, has been clarified.

    Today's final action thus allows State, local, and Tribal monitoring agencies to independently, or in cooperation with instrument manufacturers, seek approval of ARMs where PM2.5 continuous monitor data quality is sufficiently comparable to FRMs for integration into the agency's PM2.5 network used in NAAQS attainment findings. The performance criteria for approval of candidate ARMs are the same criteria for precision, correlation, and additive and multiplicative bias that have been finalized for approval of continuous PM2.5 Class III equivalent methods, described in section IV.C of this preamble. These performance criteria are to be demonstrated by monitoring agencies independently or in cooperation with instrument manufacturers under actual operational conditions using one to two FRM and one to two candidate monitors each. This is a departure from the very tightly-controlled approach used for national equivalency demonstration in which three FRM and three candidate monitors are operated. The ARM will be validated periodically in recognition of changing aerosol composition and instrument performance. These validations will be performed on at least two levels: (1) Through yearly assessments of data quality provided for as part of the on-going quality assurance (QA) requirements in 40 CFR part 58, appendix A, and (2) through network assessments conducted at least every 5 years as described in section V.B.2 of this preamble.

    The testing criteria EPA will use for approval of PM2.5 continuous methods as ARMs are intended to be robust but not overly burdensome. The two main features of testing that differ from FEMs are the duration and locations of testing. The duration is expected to be 1 year to provide an understanding of the quality of the data on a seasonal basis. The locations for testing are expected to be a subset of sites in a network where the State desires the PM2.5 continuous monitor to be approved as an ARM. Testing will be carried out in multiple locations, including up to two Core Based Statistical Areas/Combined Statistical Areas (CBSAs/CSAs) and one rural area or small city for a new method. For methods that have already been approved by EPA in other networks, one CBSA/CSA and one rural area or small city are required to be tested.

    To ensure that approvals of new methods are made consistently on a national basis, the procedures for approval of methods are similar to the requirements specified in 40 CFR part 53, i.e., the EPA Administrator (or delegated official) will approve the application. However, to optimize flexibility in the approval process, all other monitoring agencies seeking approval of an ARM that is already approved in another agency's monitoring network can seek approval through their EPA Regional Administrator. This approach will provide a streamlined approval process, as well as an incentive for consistency in selection and operation of PM2.5 continuous monitors across various monitoring agency networks.

    The QA requirements for approval of continuous PM2.5 ARMs at a network of sites are the same as for FEMs in 40 CFR part 58, appendix A, except that 30 percent (rounded up) of the required sites that utilize a PM2.5 ARM must be collocated with an FRM operated on a sample frequency of at least a 1-in-6 day schedule. The higher collocation requirement supports the main goal of the particulate matter continuous monitoring implementation plan, which was to have an optimized FRM and PM2.5 continuous monitoring network that can serve several monitoring objectives. This collocation requirement is necessary to retain a minimum number of FRMs for continued validation of the ARM, for direct comparison to the NAAQS, and for long-term trends that are consistent with the historical data set archived in the AQS. The collocated monitors are to be placed at the highest-concentration sites, starting with one site in the largest-population MSA in the network, then the second site in the next-highest-population MSA, and so forth.
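
    As a small arithmetic illustration of the collocation provision, the sketch below computes the number of ARM sites that must carry a collocated FRM under the 30-percent-rounded-up rule described above; the function name is an assumption for illustration.

```python
import math

def collocated_frm_sites(required_arm_sites: int) -> int:
    """30 percent of the required ARM sites, rounded up, must be
    collocated with an FRM (per the provision described above)."""
    return math.ceil(0.30 * required_arm_sites)

# For networks of 1, 4, 7, and 10 required ARM sites:
print([collocated_frm_sites(n) for n in (1, 4, 7, 10)])  # -> [1, 2, 3, 3]
```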

    Finally, EPA reiterates that ARMs may be used to measure compliance with the PM2.5 NAAQS. See section 50.13(b) and (c) (as published elsewhere in today's Federal Register) (annual and 24-hour primary and secondary standards are met when designated concentrations “as determined in accordance with Appendix N” are met), and Part 50 Appendix N section 1.a (for purposes of section 50.13, PM2.5 can be measured by FRM, FEM, “or by an Approved Regional Method (ARM) designated in accordance with part 58 of this chapter”).

    E. Appendix D—Network Design Criteria for Ambient Air Quality Monitoring

    1. Requirements for Operation of Multipollutant NCore Stations

    The EPA proposed requirements for NCore stations, applicable to States individually, that would, in the aggregate, result in the deployment of a new network of multipollutant monitoring stations in approximately 60 mostly urban areas. See 71 FR 2730. Under the proposal, most States would have been required to operate one urban station; however, rural stations could be substituted in States with limited dense urban exposures. Such substitution would not change the goal of having about 20 rural NCore sites. California, Florida, Illinois, Michigan, New York, North Carolina, Ohio, Pennsylvania, and Texas would have been required to operate one or two additional NCore stations to account for their unique situations. These stations, combined with about 20 multipollutant rural stations that were not proposed to be required of specific States, would form the new NCore multipollutant network. The rural NCore stations would be negotiated using grant authority as part of an overall network design that is expected to leverage existing rural networks such as IMPROVE, CASTNET, and, in some cases, State-operated rural sites.[16]

    These NCore multipollutant stations are intended to track long-term trends for accountability of emissions control programs and health assessments that contribute to ongoing reviews of the NAAQS; support development of emissions control strategies through air quality model evaluation and other observational methods; support scientific studies ranging across technological, health, and atmospheric process disciplines; and support ecosystem assessments. Of course, these stations together with the more numerous PM2.5, PM10, O3, and other NAAQS pollutant sites would also provide data for use in attainment and nonattainment designations and for public reporting and forecasting of the AQI.

    The EPA proposed that these NCore multipollutant stations be required to measure O3; CO, SO2, and total reactive nitrogen (NOy) (using high-sensitivity methods, where appropriate); PM2.5 (with both an FRM and a continuous monitor); PM2.5 chemical speciation; PM10−2.5 (with a continuous FEM); and meteorological parameters including temperature, wind speed, wind direction, and relative humidity. See 71 FR 2730. High-sensitivity measurements are necessary for CO, SO2, and NOy to adequately measure these pollutants in most air sheds for data purposes beyond NAAQS attainment determinations. For the other criteria pollutants, EPA proposed the use of conventional ambient air monitoring methods.

    At least one NCore station was proposed to be required in each State, unless a State determines through the network design process that its obligation can reasonably be met by a site in a second State, and the second State has committed to establishing and operating that site. Any State could propose modifications to these requirements for approval by the Administrator. While the proposed amendments did not specify the cities in which States would have to place their NCore multipollutant monitoring stations, EPA anticipated that the overall result would be a network with a diversity of locations to support the purposes listed earlier. For example, there would be sites with different levels and compositions of PM2.5 and PM10−2.5, allowing air quality models to be evaluated under a range of conditions.

    The EPA received several comments on the proposed requirements for operating the NCore multipollutant monitoring stations. Some commenters recommended requiring additional NCore monitoring stations for better spatial coverage and to capture gradients, including specifically requiring additional rural sites. Regarding methods, a few commenters recommended not requiring the total reactive nitrogen (NOy) measurement, since in some (but not all) cases it differs little from the existing NO2 measurement by chemiluminescence, which uses the same measurement principle as NOy.

    In reviewing the comments, EPA notes that more NCore sites can be deployed than are required by regulation. For example, in its proposal EPA stated that it would develop a design for the rural sites (not specifically required of any individual State) that leveraged existing rural networks such as IMPROVE, CASTNET, and, in some cases, State-operated rural sites. In some areas it may be appropriate to have enough NCore multipollutant sites to assess gradients; in other areas, however, having enough sites to develop gradients with all the parameters required of an NCore station may not be needed and would therefore present an unnecessary burden to the States. Therefore, EPA is finalizing the NCore network design requirements as proposed.

    For required methods, EPA agrees that in areas where the existing NOX method provides data comparable to the NOy method, monitoring agencies should be allowed to operate NOX monitors instead of making the more challenging NOy measurement. However, EPA notes that much of the reason NOy and NOX readings are so close may be a positive bias in current typical NOX (NO + NO2) instruments, which may over-report NO2. Since further development of the NOX method is underway, monitoring agencies that seek waivers from the NOy requirement are encouraged to utilize high-sensitivity versions of the chemiluminescence method so that they are capable of switching from high-sensitivity NOX to high-sensitivity NOy in performing gaseous nitrogen measurements. The EPA is therefore finalizing the required measurements at NCore multipollutant sites as proposed; however, EPA will allow waivers of the NOy requirement in areas where measured NOX is expected to provide virtually the same data as NOy. This is largely expected to be in urban environments, until such time as the NO2 method (and hence the NOX method) is sufficiently improved that having separate measurements of NOy and NOX provides more useful information than the existing technology. See also section V.E.7.

    The NCore stations are to be deployed at sites representing as large an area of relatively uniform land use and ambient air concentrations as possible (i.e., out of the area of influence of specific local sources, unless exposure to the local source(s) is typical of exposures across the urban area). Neighborhood-scale sites may be appropriate for NCore multipollutant monitoring stations where the site is expected to be similar to many other neighborhood-scale locations throughout the area. In some instances, State and local agencies may have a long-term record of several measurements at an existing location that deviates from this siting scheme. The State or local agency may propose utilizing such a site as the NCore multipollutant monitoring station to take advantage of that record. The EPA will approve these sites, considering both existing and expected new users of the data. The NCore multipollutant stations should be collocated, when appropriate, with other multipollutant air monitoring stations, including PAMS, National Air Toxics Trends Station sites, and PM2.5 chemical Speciation Trends Network sites. Collocation will allow use of the same monitoring platform and equipment to meet the objectives of multiple programs where possible and advantageous. Of the approximately 60 required NCore stations, up to 35 existing State-operated multi-monitor stations are already operating, or preparing to operate, the high-sensitivity monitors for CO, SO2, and NOy that are part of the NCore requirement.

    Although EPA is retaining the 24-hour PM10 NAAQS for requisite protection against short-term exposure to thoracic coarse particles and is not promulgating a PM10−2.5 NAAQS, the NCore stations are also required to deploy a PM10−2.5 FRM or FEM to build a dataset for scientific research purposes, including supporting health studies and future reviews of the PM NAAQS. Separate PM10 monitoring will not be required at NCore stations. For many PM10−2.5 methods, including the FRM, PM10 data will be readily available as part of the calculated PM10−2.5 measurement. Even if a PM10−2.5 method that does not report PM10 is approved as an FEM and deployed to one or more NCore sites, PM10 will still be available by virtue of the independent measurements of PM2.5 and PM10−2.5, which can appropriately be summed. Therefore, EPA is not making measurement of PM10 a requirement of the NCore network. Also, since the NCore network of PM10−2.5 FRM/FEM monitors is not being used for attainment/nonattainment determinations, agencies may operate filter methods on a schedule as infrequent as 1-in-3 day sampling.
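
    Because PM2.5 and PM10−2.5 are measured independently at these sites, PM10 remains recoverable by simple addition. A one-function sketch of the arithmetic (the function name is illustrative):

```python
def reconstruct_pm10(pm25_ugm3: float, pm10_25_ugm3: float) -> float:
    """PM10 recovered as the sum of independent PM2.5 and PM10-2.5
    measurements, as described above."""
    return pm25_ugm3 + pm10_25_ugm3

print(reconstruct_pm10(12.4, 18.1))  # 30.5 ug/m3
```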

    This final rule contains a requirement for PM10−2.5 speciation to be conducted at NCore multipollutant monitoring stations. The EPA had proposed a requirement for PM10−2.5 speciation in 25 areas, selected based on having an MSA population over 500,000 and an estimated design value greater than 80 percent of the proposed PM10−2.5 NAAQS. This would have concentrated the PM10−2.5 speciation monitoring in areas with high populations and high exposures to PM10−2.5. Since EPA is requiring PM10−2.5 monitoring at NCore primarily for scientific purposes, it is more appropriate to have monitoring in a variety of urban and rural locations so as to increase the diversity of areas with available chemical species data for use in scientific studies. The EPA had already proposed chemical speciation for PM2.5 at NCore stations. The collocation of PM10−2.5 and PM2.5 speciation monitoring at NCore stations is consistent with the multipollutant objectives of the NCore network and will support further research into the chemical composition and sources of PM10, PM10−2.5, and PM2.5 at a variety of urban and rural locations.

    Once these multipollutant NCore stations are established, it is EPA's intention that they operate for many years in their respective locations. Therefore, State and local agencies are encouraged to ensure long-term accessibility to the sites proposed for NCore monitoring stations. Relocating these stations will require EPA approval, which will be based on the data needs of the host State and other clients of the information.

    The EPA may negotiate with some States, and possibly with some Tribes, for the establishment and operation of additional rural NCore multipollutant monitoring stations to complement the stations required by today's action.

    The EPA is in the process of upgrading the CASTNET monitoring capabilities to allow stations to provide even more useful data to multiple users. The EPA expects that about 20 CASTNET sites, operated at EPA expense, will have new capabilities equivalent to some of those envisioned for NCore multipollutant sites. After consultations with State air quality planners and other data users, EPA may adjust the goal of having 20 rural State-operated NCore stations if some of these CASTNET stations can achieve the same data objectives; this would preserve State/local funding resources for other types of monitoring. In any case, the CASTNET stations will contribute to a more robust rural network with multipollutant capabilities.

    2. Requirements for Operation of PM10−2.5 Stations

    For PM10−2.5, EPA proposed a new minimum network requirement based on metropolitan statistical area (MSA) population and estimated PM10−2.5 design value. See 71 FR 2732-2736. Under that proposal, only those MSAs containing an urbanized area of at least 100,000 persons would have been required to have one or more monitors. The minimum network design requirements would not have included separate requirements for multiple urbanized areas of 100,000 persons or more within a single MSA. Where more than one MSA was part of a CSA, each MSA was treated separately and subject to individual requirements.

    The EPA proposed that the actual or estimated PM10−2.5 design value (the 3-year average of annual 98th percentile 24-hour concentrations) of an MSA, where one could be calculated, be used as a second factor, increasing the minimum number of monitors in MSAs with higher estimated ambient coarse particle levels and reducing requirements in MSAs with lower estimated concentrations. The EPA developed an initial database of estimated PM10−2.5 design values by analyzing concentrations from existing collocated or nearly collocated PM10 and PM2.5 monitors in each MSA and identifying which pairs met the proposed siting criteria specifying when a monitor would be suitable for comparison to the proposed PM10−2.5 NAAQS. Monitoring agencies were given the option of proposing other procedures for calculating estimated PM10−2.5 design values as a substitute for EPA-calculated values.
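
    The estimation just described (differencing collocated PM10 and PM2.5 observations, then forming a 3-year average of annual 98th percentiles) can be sketched as follows. This is a hedged illustration: the function name is an assumption, and the exact percentile convention in 40 CFR part 50, appendix N differs slightly from numpy's default interpolation.

```python
import numpy as np

def estimated_pm10_25_design_value(daily_pairs_by_year):
    """daily_pairs_by_year: three lists of (pm10, pm25) paired 24-hour
    values, one list per year. Returns the 3-year average of annual
    98th percentile PM10-2.5 concentrations (illustrative calculation)."""
    annual_p98 = []
    for pairs in daily_pairs_by_year:
        diffs = [pm10 - pm25 for pm10, pm25 in pairs]
        annual_p98.append(np.percentile(diffs, 98))
    return float(np.mean(annual_p98))
```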

    The EPA's proposal would have required as many as five PM10−2.5 monitors in MSAs with total population of more than 5 million and actual or estimated design values greater than 80 percent of the proposed PM10−2.5 NAAQS, and no monitors in MSAs under 1 million people with actual or estimated design values less than 50 percent of that proposed NAAQS. The EPA estimated that the minimum required PM10−2.5 network would comprise approximately 250 monitors, based on these proposed requirements and the most recent estimates of PM10−2.5 design values available at the time of proposal. A review of urbanized area population counts and estimated design values completed after proposal reduced the estimated size of the required PM10−2.5 network to approximately 225 monitors (not counting PM10−2.5 monitors at NCore stations), through the elimination of some MSAs where the population of the urbanized area was found to be fewer than 100,000 persons or where updated estimated design values decreased enough for monitoring requirements to drop into an adjoining design value category with lower requirements.

    As noted earlier, in addition to the minimum monitoring requirements, EPA proposed a five-part test that would be used to determine whether potential PM10−2.5 monitoring sites were suitable for comparison to the proposed NAAQS. All five parts of the site-suitability test were required to be met for data from required monitors or non-required monitors to be compared to the proposed PM10−2.5 NAAQS.

    The EPA received extensive comments on all aspects of the PM10−2.5 network design proposal including the minimum monitoring requirements, five-part suitability test for PM10−2.5 NAAQS comparability, and monitor placement criteria. As summarized in section III.C.2 of the preamble for the NAAQS revisions published elsewhere in this Federal Register, EPA is not adopting a proposed PM10−2.5 NAAQS but instead will be retaining the current 24-hour PM10 standard. Therefore, the elements of the PM10−2.5 monitoring network design that were proposed to implement an ambient network for the primary purpose of determining NAAQS compliance are no longer required and are not included in this final rule.

    As described elsewhere in this notice, EPA is requiring PM10−2.5 mass concentration and speciation monitoring as part of the NCore network of multipollutant sites. These sites are intended to track long-term trends for accountability of emissions control programs and health assessments that contribute to ongoing reviews of the NAAQS; support development of emissions control strategies through air quality model evaluation and other observational methods; support scientific studies ranging across technological, health, and atmospheric process disciplines; and support ecosystem assessments.

    3. Requirements for Operation of PM2.5 Stations

    The PM2.5 network includes over 1,200 FRM samplers at approximately 900 sites that are operated to determine compliance with the NAAQS; track trends, development, and accountability of emission control programs; and provide data for health and ecosystem assessments that contribute to periodic reviews of the NAAQS. More than 500 continuous PM2.5 monitors are operated to support public reporting and forecasting of the AQI.

    The EPA proposed to modify the network minimum requirements for PM2.5 monitoring so that multiple urban monitors in the same MSA or CSA are not required if they are redundant or are measuring concentrations well below the NAAQS. See 71 FR 2741. EPA proposed to base minimum monitoring requirements on PM2.5 concentrations as represented by the design value of the area, and on the census population of the CSA, or in cases where there is no CSA, the MSA. Overall, this was expected to result in a lower number of required sites (to satisfy minimum network design requirements); however, EPA recommended that States continue to operate a high percentage of the existing sites now utilizing FRM, but with FEM and ARM continuous methods replacing the FRM monitors at many of the sites.[17] Id.

    The EPA proposed to require that all sites counted by a State towards meeting the minimum requirement for the number of PM2.5 sites have an FRM, FEM, or ARM monitor. The EPA also proposed that at least one-half of all the required PM2.5 sites be required to operate PM2.5 continuous monitors of some type even if not an FEM or ARM.

    As noted, EPA proposed to use design value and population as inputs in deciding the minimum required number of PM2.5 monitoring sites in each CSA/MSA. The EPA proposed these inputs so that monitoring resources would be prioritized based on the number of people who may be exposed to a problem and the level of exposure of that population. Metropolitan areas with smaller populations would not be required to perform as much monitoring as larger areas. If ambient air concentrations as indicated by historical monitoring were low enough, these smaller-population areas would not have been required to continue any PM2.5 monitoring.

    The proposed amendments also would have required fewer sites where design values are well above (rather than near) the level of the NAAQS, to allow more flexibility in the use of monitoring resources in areas where States and EPA are already confident of the severity and extent of the PM2.5 problem and may be in greater need of other types of data to address it.

    We proposed to retain the current siting criteria for PM2.5, which emphasize population-oriented sites at neighborhood scale and larger. See 71 FR 2741. In the proposal, EPA stated that these current design criteria appeared to remain appropriate for implementation of the proposed primary PM2.5 NAAQS. See 71 FR 2742. The proposal stated that the existing minimum requirements effectively ensure that monitors are placed in locations that appropriately reflect the community-oriented, area-wide concentration levels used in the epidemiological studies that support the proposed (and now final) lowering of the 24-hour NAAQS.

    The EPA further proposed that background and transport sites remain a required part of each State's network to support characterization of regional transport and regional scale episodes of PM2.5. To meet these requirements, IMPROVE samplers could be used even though they would not be eligible for comparison to the PM2.5 NAAQS; these samplers are currently used in visibility monitoring programs in Class I areas and national parks. Sites in other States which are located at places that make them appropriate as background and transport sites could also fulfill these minimum siting requirements.

    The preamble to the proposal also pointed out that in most MSAs, the PM2.5 monitor recording the maximum annual PM2.5 concentrations is the same as the monitor recording the maximum 24-hour PM2.5 concentrations, suggesting that these common high-reading monitors will generally determine attainment/nonattainment for both the annual and 24-hour PM2.5 NAAQS. 71 FR 2742. The preamble further noted that where this is the case, supplemental monitors, such as continuous PM2.5 monitors and PM2.5 speciation monitors, should already be well located to help in understanding the causes of the high PM2.5 concentrations. In a relatively small number of cases, certain microscale PM2.5 monitors that have not been eligible for comparison to the annual PM2.5 NAAQS and that have been complying with the pre-existing 24-hour PM2.5 NAAQS of 65 μg/m3, and that therefore have had no impact on attainment status, may become more influential to attainment status under the more stringent level of the then-proposed, now adopted 24-hour PM2.5 standard. In these cases, EPA noted that States may choose to move accompanying speciation and continuous monitors to the new site of particular interest to obtain a better characterization of PM2.5 at that location.

    The EPA received a number of comments regarding the PM2.5 network design. Several commenters expressed concern regarding the provision allowing fewer required sites when monitored PM2.5 concentrations are significantly above the PM2.5 NAAQS. Commenters stated that allowing fewer sites would be inadequate to demonstrate actual ambient air conditions. One commenter stated that the provision had merit for long-term NAAQS, such as the annual average, but not for short-term standards. The commenter pointed out that long-term standards, where concentrations are averaged over a multiple-year period, tend to produce relatively uniform results even over a large geographical area, whereas daily observations will be more variable at a given site and from site to site. Other commenters, while appreciating the flexibility to redirect resources to speciation sampling in areas with design values significantly above the NAAQS, noted that there would still be a need for both speciation and FRM data; in these cases, while the flexibility may be available, in practice it would be difficult to shut down a monitor in an area that is significantly above the NAAQS.

    The EPA also received comments on using the CSA as the definition of a metropolitan area in which to apply the minimum required PM2.5 monitoring network criteria. Commenters expressed concern that the CSA was too large an area over which to apply minimum monitoring requirements and that doing so might result in the loss of essential monitors needed to characterize the extent of nonattainment areas. In addition, EPA received comments on the proposed requirement for one-half of the required PM2.5 sites, rounded up, to operate PM2.5 continuous monitors. Commenters expressed concern that requiring PM2.5 continuous monitors, none of which at present meet FEM or ARM performance criteria, might reduce the impetus for equipment manufacturers to further develop versions of these technologies that would meet those criteria. Some commenters noted that although PM2.5 continuous monitors serve multiple monitoring objectives, which underscores the need for their operation, collocation with FRMs should not be required at all sites, since that would place an unnecessary burden on the States.

    The EPA also received several comments regarding the location of required PM2.5 monitoring sites, questioning EPA's proposal to keep the siting requirements for PM2.5 monitors the same despite the revision of the 24-hour NAAQS to a level at which, commenters asserted, violations may occur in many middle-scale or microscale locations not presently experiencing violations of the current 24-hour NAAQS. The gist of the comments was that more monitors should be deployed in middle-scale and/or microscale locations to find such violations. One commenter recommended that EPA specifically require a monitoring organization to have at least one microscale site in any area that is in nonattainment, or marginally so, for the 24-hour NAAQS.

    In response to concerns about requiring fewer PM2.5 monitoring sites when monitored PM2.5 concentrations are significantly above the NAAQS, EPA is not adopting that provision and will instead provide two tiers of minimum monitoring requirements depending on design value. As proposed, agencies with areas significantly below the PM2.5 NAAQS (less than or equal to 85 percent of the annual and 24-hour PM2.5 NAAQS) will have a lower minimum monitoring requirement. Areas above 85 percent of the NAAQS (i.e., within 15 percent of it or above it) will be required to operate more PM2.5 monitoring sites, that is, to deploy a greater minimum number of monitors relative to those at or below 85 percent of the NAAQS.
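
    The two-tier structure can be expressed as a small selector. In the sketch below, only the 85-percent split comes from the text above; the population cutoffs and site counts are hypothetical placeholders, not the values in the appendix D tables. The same tier logic is applied to O3 in section V.E.6.

```python
def minimum_pm25_sites(msa_population: int, dv_fraction_of_naaqs: float) -> int:
    """Two-tier minimum PM2.5 site count for an MSA. Only the 85 percent
    design-value split reflects the rule text above; the population
    cutoffs and site counts below are HYPOTHETICAL placeholders."""
    near_or_above = dv_fraction_of_naaqs > 0.85  # within 15% of, or above, the NAAQS
    placeholder_tiers = [
        (1_000_000, 3, 2),  # (min population, sites if near/above, sites if below)
        (500_000, 2, 1),
        (50_000, 1, 0),
    ]
    for min_pop, high_count, low_count in placeholder_tiers:
        if msa_population >= min_pop:
            return high_count if near_or_above else low_count
    return 0
```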

    To address the comments concerning the most appropriate Census Bureau definition in which to apply the PM2.5 minimum monitoring requirements, EPA compared the current network to the number of monitors that would be required using either CSA or MSA as the unit for applying monitoring requirements. The results demonstrated that using the MSA ensures a few more required sites in areas where multiple MSAs make up a large CSA with a high population and a large geographical area, without requiring new sites of less obvious priority in MSAs with smaller geographic coverage and population. The overall goal of reducing redundant required sites in large metropolitan areas can be met using the MSA as the unit for monitoring requirements, and doing so will also give each high-design-value MSA within the same CSA its own minimum monitoring requirements, addressing spatial gradients in large areas. EPA is therefore adopting the MSA as the geographic unit for applying the minimum PM2.5 monitoring requirements. In a CSA, each MSA must meet the MSA requirements separately.

    In considering the comments on requiring one-half of the required PM2.5 sites to have continuous monitors, EPA notes that the existing network of monitors is providing invaluable data for reporting and forecasting of the AQI and in support of emergency situations such as wildfires and natural disasters (e.g., Hurricane Katrina). Ensuring a minimum network of these monitors is essential to informing the public and policy makers on the quality of the air during air pollution episodes. The technology utilized in the network continues to evolve as agencies adopt the most suitable methods for their own networks. The EPA believes that as agencies continue to purchase optimal equipment for their networks, and as instrument manufacturers now have the opportunity to receive FEM or ARM approval for their methods, manufacturers will continue to develop better continuous instruments. The EPA is therefore adopting, as proposed, the requirement for one-half of the required PM2.5 sites to have continuous monitors. However, to address the concern about whether required continuous monitors need to be collocated with a matching second continuous monitor, this final rule states that only one of the required PM2.5 continuous monitors in each MSA needs such a collocated match. This will allow a minimal level of performance characterization of the continuous monitors in each area where they are operated. Additional PM2.5 continuous monitors, when required, can either be collocated with FRMs or set up at non-collocated sites to provide better spatial coverage of the MSA.

    With regard to concerns expressed in comments about monitor siting in light of the revised 24-hour PM2.5 NAAQS, EPA agrees that the proposed change in the level of the primary 24-hour PM2.5 NAAQS from 65 μg/m3 to 35 μg/m3 raised the issue of whether any commensurate changes would be needed in these requirements. The EPA has considered the original requirements for PM2.5 network design promulgated in 1997 and their rationale, how the PM2.5 network is currently configured, what if any changes need to be made to this network to make it consistent with the intended level of protection of the lower 24-hour PM2.5 NAAQS in combination with the annual PM2.5 NAAQS, and whether these or any changes should be required by a general rule or developed on a case-by-case basis.

    In specifying monitor siting criteria for the original PM2.5 monitoring network in 1997, EPA noted that the annual standard had been set based on epidemiology studies in which monitors generally were representative of community-average exposures. The EPA stated its expectations that the annual standard would generally be the controlling standard in designating nonattainment areas and that controlling emissions to reduce annual averages would lower both annual and 24-hour PM2.5 concentrations across each annual NAAQS nonattainment area. Accordingly, the PM2.5 network design provisions in that final rule (62 FR 38833, July 18, 1997) and EPA's subsequent negotiations with State/local monitoring agencies over monitoring plans were largely but not solely directed at obtaining air quality data reflecting community-wide exposures by placing monitors in neighborhood and larger scales of representation.

    Section 2.8 of appendix D of 40 CFR part 58 as promulgated in 1997 had only a few definite requirements regarding the siting of PM2.5 monitors. Section 2.8.1.3 specified how many “core” monitors representing community-wide air quality were required, based on MSA population. For areas with populations of 500,000 or more, section 2.8.1.3.1(a) required at least one core monitoring station in a “population-oriented” area of expected maximum concentration and (unless waived under section 2.8.1.3.4) at least one core station in an area of poor air quality. Areas with populations between 200,000 and 500,000 were required to operate at least one core monitor. Section 2.8.1.3.4 strongly encouraged any State with an MSA with only one required monitor (due to having a population under 500,000 or due to a waiver) to site that monitor so it represented community-oriented concentrations in areas of high average PM2.5 concentrations. Section 2.8.1.3.7 required core monitoring sites to represent neighborhood or larger spatial scales. States could, at their own initiative, place additional monitors anywhere, but monitors in relatively unique microscale, localized hot-spot, or unique middle-scale locations could not be compared to the annual NAAQS, and any monitoring site had to be population-oriented to be compared to either NAAQS. Part 58, App. D, section 2.8.1.2.3.

    In practice, the majority of PM2.5 monitors are deployed at neighborhood scale and larger, meaning that they are located far enough from large emission sources that they represent the fairly uniform air quality across an area with dimensions of at least a few kilometers and thus can be considered community-oriented. The existing PM2.5 monitoring network continues to mostly be made up of these population-oriented, community-oriented, neighborhood scale monitoring sites. The EPA is presently aware of fewer than ten PM2.5 monitors that are sited in relatively unique population-oriented microscale areas, localized hot spots, or unique population-oriented middle-scale areas. Such sites may have higher concentrations than neighborhood scale sites on at least some days because they may be close to and downwind of large emission sources, but the number of people exposed to such concentrations is not large relative to the surrounding communities.

    The EPA believes the PM2.5 networks that were deployed were, and the networks now operating are, consistent with the intended level of protection of the annual PM2.5 NAAQS. Consistency or inconsistency with regard to the 24-hour PM2.5 NAAQS has not been of practical significance until now, due to the near absence of violations of that standard. In the January 17, 2006, proposal notice, EPA said that it believed the 1997 rule's design criteria remained appropriate for implementation of the proposed primary PM2.5 NAAQS, including the lower 24-hour NAAQS, because these requirements effectively ensured that monitors are placed in locations that appropriately reflect the community-oriented, area-wide concentration levels used in the epidemiological studies that support the proposed lowering of the 24-hour PM2.5 NAAQS. 71 FR 2742. The EPA continues to believe this, noting that the monitors used in the epidemiology studies underlying the 24-hour PM2.5 NAAQS were sited similarly to the majority of monitors in the existing State/local networks.

    No comments directly contradicted this assessment. While an implication of the final monitoring rule provisions regarding siting of PM2.5 monitors is that States may choose not to monitor microenvironment or middle scale locations where some people are exposed to 24-hour concentrations above the level of the 24-hour NAAQS, such a result remains consistent with the community-oriented area-wide level of protection on which the 24-hour PM2.5 NAAQS is premised. Thus, EPA believes it is not appropriate to specifically require any number of monitors to be placed in microenvironment or hot spot locations as one commenter suggested.

    On the other hand, States and EPA may agree, as part of the annual monitoring plan submission by the State and approval by the Regional Administrator, that in specific cases the placement of new or relocated monitors in microenvironment or middle-scale locations is warranted and consistent with the intended level of protection of the 24-hour PM2.5 NAAQS. States may also propose, and EPA would be inclined to approve, the placement of PM2.5 monitors in populated areas too small to be subject to the requirements regarding minimum numbers of monitors, if there is reason to believe PM2.5 concentrations are of concern. Of particular interest may be smaller cities and towns that presently lack any PM2.5 monitor but that experience emission patterns (such as use of wood stoves) and/or weather conditions (such as inversions) that can create high short-term concentrations of PM2.5. States also remain free to place SPMs at any location, without need for EPA review or approval.[18]

    The proposed rule text for 40 CFR 58, appendix D inadvertently failed to include rule text on PM2.5 monitoring network design criteria, found in existing appendix D section 2.8.1.2.3, setting forth the requirements that: (1) The required monitors are sited to represent community-wide air quality, (2) at least one monitoring site is placed in a “population-oriented” area of expected maximum concentration, and (3) at least one station is placed in an area of poor air quality. Therefore, this final rule restores these pre-existing requirements to appendix D. This final rule sets out these criteria (in substantively identical but slightly redrafted form) in appendix D section 4.7.1(b).

    Also, as noted in the proposal and again above, some monitors that have not measured high concentrations relative to the 1997 24-hour NAAQS may become more influential to attainment status under the just adopted, more stringent 24-hour NAAQS. In these cases, EPA encourages States to consider adding or moving speciation and continuous monitors to the newly influential site to get a better characterization of PM2.5 concentrations and their causes at that location.

    Finally, this final rule clarifies that IMPROVE monitors operated by an organization other than the State may be counted as satisfying the State's obligation to operate background and transport monitoring sites for PM2.5.

    4. Requirements for Operation of PM10 Stations

    PM10 monitors currently are deployed throughout the country at about 1,200 sites, with most metropolitan areas already operating more PM10 monitors than are required by current monitoring requirements.

    In the January 17, 2006, proposal notice, EPA proposed changes to the PM10 requirements in coordination with new minimum requirements for a PM10−2.5 monitoring network in support of the proposed 24-hour PM10−2.5 NAAQS which would have eventually replaced the PM10 NAAQS entirely. See 71 FR 2742. As already explained, EPA is not finalizing the proposed NAAQS for PM10−2.5 and instead is retaining the 24-hour PM10 NAAQS for all parts of the U.S. This change has necessitated a different approach for PM10 minimum monitoring requirements from the one proposed.

    Rather than revoking the PM10 monitoring requirements as proposed, EPA believes that a robust nationwide monitoring network is required to provide compliance data for the 24-hour PM10 NAAQS and to support other objectives, including the assessment of long-term trends, evaluations of the effectiveness of State and local coarse particle control programs, and health effects research. The EPA has therefore considered whether the existing National Air Monitoring Station criteria in Table 4 of appendix D of 40 CFR part 58, last revised in 1997, are still appropriate for these purposes. EPA believes these criteria are appropriate because they have an urban focus (being based on MSAs); allow local considerations to be a factor in determining the actual required number of stations; require more stations in larger MSAs and in MSAs with more evidence of poor PM10 air quality, while still requiring some stations even in clean MSAs of a certain size; and in the aggregate will result in a required number of PM10 monitors similar to the required numbers of O3 and PM2.5 monitors. With regard to the comparison to the required numbers of O3 and PM2.5 monitors, EPA has considered two directionally opposite factors. PM10 is less spatially uniform than O3 or PM2.5, suggesting the need for relatively more intensive monitoring in areas with PM10 problems, but PM10 concentrations in most areas are below the PM10 NAAQS (unlike O3 and PM2.5), suggesting that fewer monitors should be required overall for PM10. This final rule therefore retains the current PM10 minimum network requirements, except that they will no longer be called “NAMS” requirements.

    The current PM10 minimum monitoring requirements in section 3.7.7 of part 58, appendix D are based on MSA population and three ranges of ambient PM10 concentrations relative to the PM10 NAAQS. For MSAs in the lowest category of ambient PM10 concentrations, those for which ambient PM10 data show concentrations less than 80 percent of the NAAQS, at least one monitor is required if the population of the MSA is 500,000 or greater. For MSAs in the highest category, those for which ambient PM10 data show concentrations exceeding the NAAQS by 20 percent or more, at least one monitor is required if the population of the MSA is 100,000 or greater. These requirements list ranges of required monitors, with the actual number of monitors to be determined by EPA and the States.

    Based on PM10 ambient data for 2003-2005 and current census population statistics, a minimum of between 200 and 500 PM10 FRM/FEM monitors will be required across all affected MSAs. Over 800 PM10 monitors are in fact currently deployed in these MSAs. About 400 other PM10 monitors currently operate outside the boundary of any MSA. As stated in section III.B of this preamble, EPA believes a reduction in the size of the existing monitoring networks for certain pollutants, including PM10, for which the large majority of monitors record no NAAQS violations, is an appropriate way to free up resources for higher priority monitoring objectives. These higher priority objectives could include meeting the new requirements in this final rule, such as the NCore multipollutant measurements, as well as objectives defined by the local air quality management program. The EPA notes that many PM10 monitors have been recording concentrations well below the 24-hour PM10 NAAQS and thus are candidates for discontinuation at a State's initiative. States may also choose to continue to operate monitors in excess of the minimum requirements. To the extent that States and Tribes are considering reducing the total number of PM10 monitors deployed, EPA believes, consistent with the basis for retaining the 24-hour PM10 standard, that priority should be given to maintaining monitors sited in urban and industrial [19] areas. States may of course choose to retain PM10 monitors that are recording concentrations below the PM10 NAAQS level to support monitoring objectives other than attainment/nonattainment determinations, such as baseline monitoring for prevention of significant deterioration permitting or public information. The EPA expects to work with States to assess their PM10 networks and help determine which monitors are delivering valuable data and which present disinvestment opportunities. States may not, however, reduce their PM10 networks below the minimum requirements for monitoring within MSAs given in 40 CFR part 58, appendix D.

    In addition, if States and Tribes are considering deploying new PM10 monitors, EPA recommends, again consistent with the basis for retaining the 24-hour PM10 standard, that those monitors be placed in areas where there are urban and/or industrial sources of thoracic coarse particles. Furthermore, consistent with the monitors used in studies that informed our decision on the level of the standard (see section III.D of the final rule on the PM NAAQS published elsewhere in today's Federal Register), EPA recommends that any new PM10 monitors be placed in locations that are reflective of community exposures at middle and neighborhood scales of representation, and not in source-oriented hotspots that are not population oriented.

    The final rule omits two passages in section 4.6 (Particulate Matter (PM10) Design Criteria) of 40 CFR part 58, appendix D that were included in the proposed rule to provide context. The omitted passages are 4.6(b)(4) (Urban scale) and 4.6(b)(5) (Regional scale). As explained below, these two passages are not consistent with EPA's intention to preserve the substance of the 1997 monitoring rule regarding scales of representativeness while restructuring appendix D to eliminate SLAMS-versus-NAMS distinctions and to make clearer which requirements (and explanatory background and guidance) apply to each individual pollutant. In appendix D of the 1997 monitoring rule, section 2.8 (Particulate Matter Design Criteria for SLAMS) addressed both PM2.5 and PM10, in some sentences referring explicitly to PM2.5, PM10, or both, and in some sentences referring only in general to particulate matter. In this final rule, section 4.6 (Particulate Matter (PM10) Design Criteria) addresses this subject matter for PM10, while section 4.7 (Fine Particulate Matter (PM2.5) Design Criteria) does so for PM2.5. In the proposed rule, for the purpose of providing context, EPA included paragraphs on microscale, middle scale, neighborhood scale, urban scale, and regional monitoring scales in both sections 4.6 and 4.7. Upon closer consideration, however, EPA has determined that omitting the paragraphs on urban scale and regional scale from section 4.6 is appropriate for PM10, in terms of clarifying and preserving the effective substance of the 1997 rule for PM10. The bases for this conclusion include the following: (1) The paragraphs concerning these scales of representation in the 1997 appendix D (sections 2.8.0.7 and 2.8.0.8) mention PM2.5 specifically but not PM10, (2) the paragraph which precedes the five paragraphs on the five scales (2.8.0.2) states that middle and neighborhood scales are the most important scales for PM10, (3) section 2.8 in the 1997 rule was titled as applying to SLAMS in particular, but no SLAMS monitors were specifically required at any spatial scale or scales, (4) under section 3.7 (Particulate Matter Design Criteria for NAMS) specific numbers of PM10 monitors were required, but without specification as to spatial scale, and (5) Table 6 of appendix D in the 1997 rule indicates that only the micro, middle, and neighborhood scales are “required for NAMS.” The EPA notes that in the final rule, the same numbers of PM10 monitors are required as in the 1997 rule, but they are not referred to as NAMS monitors. The EPA further notes that urban scale and regional scale are of little, if any, relevance to PM10 monitoring because of the short transport distances for PM10, especially when it is emitted near ground level. In contrast, because PM2.5 is largely a secondary pollutant, large spatial scales are relevant, because monitors in such locations will reflect regional emissions trends and transport patterns.

    5. Requirements for Operation of Carbon Monoxide, Sulfur Dioxide, Nitrogen Dioxide, and Lead Monitoring Stations

    Criteria pollutant monitoring networks for the measurement of CO, SO2, NO2, and Pb are primarily operated to determine compliance with the NAAQS and to track trends and accountability of emission control programs as part of a SIP. Because these criteria pollutant concentrations are typically well below the NAAQS, there is limited use for public reporting to the AQI.

    The EPA proposed to revoke all minimum requirements for CO, SO2, and NO2 monitoring networks and to reduce the requirements for Pb. See 71 FR 2742. The proposal allowed for reductions in ambient air monitoring for CO, SO2, NO2, and Pb, particularly where measured levels are well below the applicable NAAQS and air quality problems are not expected, except in cases with ongoing regulatory requirements for monitoring, such as SIP or permit provisions. The EPA stated it would work with States on a voluntary basis to make sure that at least some monitors for these pollutants remain in place in each EPA Region. Measurements of CO, SO2, and NOy were also proposed as required measurements at NCore sites. There may be little regulatory purpose for keeping many other sites showing low concentrations, other than specific State, local, or Tribal commitments to do so. However, in limited cases, some of these monitors may be part of a long-term record utilized in a health effects study. Under 40 CFR 58.11 of this final rule, States must consider the effect of monitoring site closures on data users other than the State itself, such as health effects studies. The EPA expects State and local agencies to seek input on which monitors are being used for health effects studies so they can give this consideration. See also section IV.E.8 of this preamble.

    6. Requirements for Operation of Ozone Stations

    Ozone (O3) monitors currently are deployed throughout the country at about 1,200 sites, with most metropolitan areas already operating more O3 monitors than would be required by today's action. The EPA does not anticipate or recommend significant changes to the size of this network because O3 remains a pollutant with measured levels near or above the NAAQS in many areas throughout the country. However, this final rule should help to better prioritize monitoring resources depending on the population and levels of O3 in an area.

    For O3, EPA proposed changing the minimum network requirement from at least two sites in “any urbanized area having a population of more than 200,000” to an approach that considers the level of exposure to O3, as indicated by the design value, and the census population of a metropolitan area. See 71 FR 2742. The proposal stated that a CSA, or an MSA where there is no CSA, with a population of 10 million or more and a design value near the O3 NAAQS would be required to operate at least four sites. Smaller CSAs and MSAs, with populations as low as 350,000, would be required to operate as few as one site. An even smaller area would have no required monitor, provided its design values (for example, from a previously required monitor or an SPM) were sufficiently low. Taking the same approach used in the proposed minimum requirements for PM2.5 sites, EPA proposed that high-population areas with measured ambient concentrations significantly above the NAAQS be allowed to operate one fewer site than areas with measured ambient concentrations near the NAAQS, to allow flexibility of monitoring resources in those areas.

    The EPA received a number of comments on the proposed minimum network requirements for O3. As with the comments received on PM2.5, many commenters had concerns with requiring only one site when an area is significantly above the NAAQS and with defining the minimum monitoring requirements by CSA rather than by a smaller unit of a metropolitan area. For instance, several commenters noted that by applying the minimum monitoring requirements by CSA, agencies might not be required to deploy enough monitors to capture the within-MSA gradients needed to adequately characterize O3 across a metropolitan area.

    In response to concerns about allowing one fewer O3 monitoring site when a high-population area is significantly above the NAAQS, EPA is not adopting this provision. This final rule instead provides two values for the minimum required number of monitors according to design value. Agencies with areas significantly below the O3 NAAQS (less than or equal to 85 percent of the O3 NAAQS) have the lower minimum monitoring requirement. Areas within 15 percent of the NAAQS, or above it, will be required to operate more O3 monitoring sites.

    To address the comments concerning the most appropriate Census Bureau-defined area in which to apply the O3 minimum monitoring requirements, EPA compared the current network with the networks that would result from using either CSA or MSA as the basis for applying the minimum network requirements. The results demonstrate that using the MSA ensures a few more sites in the small number of large CSAs with high populations and large geographical areas, without unnecessarily requiring new sites in the many areas with smaller geographic coverage and population. Since using the MSA does not impose a significant new burden on the States and makes it more likely that within-MSA gradients of O3 will be characterized in high-concentration areas, EPA is adopting the MSA as the appropriate unit of a metropolitan area for applying the minimum O3 monitoring requirements. All other monitoring requirements for O3 are adopted as proposed.

    7. Requirements for Operation of Photochemical Assessment Monitoring Stations

    Section 182(c)(1) of the CAA required EPA to promulgate rules for enhanced monitoring of O3, oxides of nitrogen (NOX), and VOC in ozone nonattainment areas classified as serious, severe, or extreme. On February 12, 1993, EPA promulgated requirements for State and local monitoring agencies to establish PAMS as part of their SIP monitoring networks in ozone nonattainment areas classified as serious, severe, or extreme. During 2001, EPA formed a workgroup of EPA, State, and local monitoring experts to evaluate the existing PAMS network. The PAMS workgroup recommended that the existing PAMS requirements be streamlined to allow for more individualized PAMS networks suited to the specific data needs of each PAMS area.

    The EPA proposed changes to the minimum PAMS monitoring requirements in 40 CFR part 58 to implement the recommendations of the PAMS workgroup. See 71 FR 2743. Specifically, EPA proposed the following changes: The number of required PAMS sites would be reduced; only one Type 2 site would be required per area regardless of population, and Type 4 sites would not be required; only one Type 1 or one Type 3 site would be required per area; the requirements for speciated VOC measurements would be reduced, with speciated VOC measurements required only at Type 2 sites and at one other site (either Type 1 or Type 3) per PAMS area; carbonyl sampling would be required only in areas classified as serious or above for the 8-hour O3 standard; conventional NO2/NOX monitors would be required only at Type 2 sites; high-sensitivity NOy monitors would be required at one site (either Type 1 or Type 3) per PAMS area; and high-sensitivity CO monitors would be required at Type 2 sites.

    The EPA received comments on the proposed amended PAMS requirements. Overall, commenters supported the reduction in minimum PAMS requirements, which will allow for more individualized PAMS networks and alternative enhanced O3 monitoring initiatives. However, some commenters were concerned with the proposed requirement for NOy monitoring at one Type 1 or one Type 3 site. Several commenters stated that the PAMS NOy requirement is not likely to be beneficial, arguing that NOy data in urban areas are likely to be indistinguishable from NOX data, that commercial NOy instrumentation is not yet fully developed, that NOy monitors are difficult to site properly, and that few States have the modeling capability to employ NOy data.

    The EPA disagrees with the commenters' statements that PAMS NOy measurements will not be beneficial. Compared to NOX measurements, NOy measurements provide a more complete accounting of the available reactive nitrogen species involved in the photochemical reactions that lead to O3 formation. One of the primary uses of NOy data is O3 modeling, but it is not the only one. Long-term measurements of NOy provide the best indicator of the effectiveness of NOX controls at reducing the reactive nitrogen compounds involved in O3 formation. In addition, a relatively simple analysis of the O3-to-NOy or VOC-to-NOy ratio can be performed to identify whether an area is “NOX limited” or “VOC limited,” which indicates whether additional NOX controls would be more beneficial than additional VOC controls.

    Ideally, the NOX method would measure only NO and NO2, whereas NOy measurements include NO, NO2, and other important reactive nitrogen species (referred to here as NOz), including acid gases [nitric acid (HNO3) and nitrous acid (HONO)], organic nitrates [peroxyacetyl nitrate (PAN), methyl peroxyacetyl nitrate (MPAN), and peroxypropionyl nitrate (PPN)], and particulate nitrates. However, recent studies have shown that existing NOX monitors also measure (and misreport as NO2) some NOz species. The NOy method was developed as an extension of the NOX method to accurately measure all reactive nitrogen compounds. Nonetheless, EPA will allow waivers of the NOy requirement (via an alternative plan provided for under paragraph 5.3 of appendix D to part 58) in areas where measured NOX is expected to provide virtually the same data as NOy. This is largely expected to be in areas with fresh oxides of nitrogen emissions, until such time as the NO2 method (and hence the NOX method) is sufficiently improved that having separate measurements of NOy and NOX provides more useful information than the existing technology. The EPA has evaluated a number of commercially available NOy monitors and has found them accurate and reliable. As with many methods, EPA continues to evaluate improvements, but at this time EPA believes that the current method (and commercially available instrumentation) provides data of sufficient quality to meet the PAMS program objectives.
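
    A hedged sketch of the two simple calculations mentioned above: deriving NOz from collocated NOy and NOX readings, and screening the O3-to-NOy ratio. The cutoff value is a hypothetical placeholder; the photochemical-indicator literature gives transitional ranges rather than a single regulatory threshold.

```python
def noz_ppb(noy_ppb: float, nox_ppb: float) -> float:
    """NOz: reactive nitrogen other than NO and NO2 (NOz = NOy - NOx)."""
    return noy_ppb - nox_ppb

def ozone_formation_regime(o3_ppb: float, noy_ppb: float,
                           cutoff: float = 7.0) -> str:
    """Screen an O3-to-NOy ratio: high ratios suggest a NOx-limited area,
    low ratios a VOC-limited one. The cutoff of 7.0 is a HYPOTHETICAL
    placeholder, not a value from this rule."""
    return "NOx-limited" if o3_ppb / noy_ppb > cutoff else "VOC-limited"

print(noz_ppb(noy_ppb=32.0, nox_ppb=25.0))                # 7.0 ppb of NOz
print(ozone_formation_regime(o3_ppb=90.0, noy_ppb=10.0))  # NOx-limited
```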

    While proper siting of an NOy monitor (installing a 10-meter tower and meeting proper fetch characteristics) may be difficult in some urban settings, EPA believes that NOy monitors can be adequately sited in most PAMS areas. Nonetheless, if siting an NOy monitor is not practicable in a given PAMS area, a State may request an alternative plan, as allowed for under paragraph 5.3 of appendix D to part 58, to allow monitoring of NOX instead of NOy.

    After review and consideration of the comments received, EPA has decided to finalize the revisions to the PAMS requirements as proposed.

    F. Appendix E—Probe and Monitoring Path Siting Criteria for Ambient Air Monitoring

    The proposed revisions to this appendix consisted of minor organizational changes and two technical changes to the siting criteria affecting PM10−2.5 and O3 monitoring sites. See 71 FR 2748.

    1. Vertical Placement of PM10−2.5 Samplers

    Specific probe siting criteria were required to support the proposed PM10−2.5 network. The EPA proposed vertical probe placement requirements that limited microscale PM10−2.5 sites to an allowable height range of 2 to 7 meters and neighborhood and large scale PM10−2.5 sites to a range of 2 to 15 meters. These ranges were identical to the existing requirements for PM10. The range for middle-scale PM10−2.5 sites was limited to 2 to 7 meters, which represented a change from PM10, for which 2 to 15 meters was the allowed vertical placement range for middle-scale sites.

    Several commenters supported the proposed PM10−2.5 middle-scale vertical requirement as being consistent with the expectation that coarse particle concentrations nearest the breathing zone would be important to measure in the assessment of exposure risk, and that monitoring sites with more elevated inlets would be more likely to miss localized concentrations where the public is exposed. By contrast, other commenters raised concerns that the requirement would result in the measurement of localized (microscale) near-ground conditions not representative of a middle-scale sized area. Commenters also noted the importance of keeping identical inlet requirements for PM10−2.5 and PM2.5 to maximize the benefits of having collocated measurements at the same site.

    Based on review of the comments, EPA is retaining the 2 to 7 meter vertical requirement for middle-scale PM10−2.5 sites. This requirement is consistent with current requirements for microscale PM monitors but would require modifications for existing PM2.5 and PM10 monitors located between 8 and 15 meters above ground that were intended for middle-scale PM10−2.5 measurement. The EPA does not expect this requirement to have a major impact on monitoring networks since this final rule requires PM10−2.5 monitoring only at NCore sites, and these sites will typically represent neighborhood or larger scales. This final rule retains the existing rule language that has the option for the Regional Administrator to grant a waiver of siting criteria, providing flexibility for States to document situations where useful data could still be produced by monitors not meeting applicable requirements.

    2. Ozone Monitor Setback Requirement From Roads

    The EPA proposed an increase to the minimum permitted distance between roadways and the inlet probes of neighborhood and urban scale ozone and oxides of nitrogen sites to reduce the scavenging effects of motor vehicle-related nitric oxide emissions. See 71 FR 2748.

    Many commenters believed that the scavenging effect of oxides of nitrogen on O3 levels in urban, populated areas was an area-wide phenomenon that would not be changed by moving a site a few meters farther from the nearest roadway. Commenters also questioned whether the value of the proposed change justified the resources required to relocate sites not meeting the increased road setback requirements. Some commenters supported applying the increased roadway setback requirement to new sites as long as existing ozone sites were “grandfathered.”

    The EPA acknowledges the logistical difficulty and expense of moving existing sites to meet the increased setback requirement. To balance the goal of minimizing the interference of roadway emissions with O3 and oxides of nitrogen monitoring data against the burden on affected monitoring organizations, EPA has modified the increased roadway setback requirement to apply only to newly established sites.

    G. Sample Retention Requirements

    During the regulatory development process, various governmental agencies and health scientists indicated that archiving particulate matter filters from FRM and FEM samplers would be useful for later chemical speciation analyses, mass analyses, or other analyses.

    Current sample retention requirements apply specifically to PM2.5 filters and require a minimum storage period of 1 year. The EPA proposed that the retention requirements be expanded to require archival of PM2.5, PM10−2.5, and PM10c (low volume) filters for a period of 1 year after collection. See 71 FR 2749.

    Commenters were supportive of the proposed requirement. Some commenters stated that the required filter retention period should be longer than 1 year, with suggested storage periods ranging from 3 to 7 years. States provided examples of how filters archived for longer than 1 year were subsequently analyzed to provide data useful in the support of health studies, SIP work, or analysis of exceptional events. Several commenters, while supportive of the rationale for filter archival, preferred that the requirement not be included in the regulation and instead be left for voluntary monitoring agency compliance. One commenter suggested that the requirement be clarified to explicitly include retention of blank filters in addition to exposed filters.

    The EPA notes the support for the proposed sample retention requirement and did not change that requirement in this final rule. As stated in this final rule, States have the discretion to retain their samples for longer than one year. The EPA supports such procedures, recognizing that States will have different logistical constraints that control the maximum length of time for which filters can be stored. The EPA has clarified that the requirement applies to all such filters referenced in 40 CFR 58.16(f), including exposed filters and blanks.

    The EPA acknowledges the concern among some commenters that States retain the right to determine the best use of archived filters. These commenters stated that national considerations for filter analysis should be a secondary priority relative to State needs. The EPA respects this concern and expects to negotiate with States on the scope of any request for archived filters intended for potentially destructive analyses, so that the request is compatible with other State uses for the same type of filters.

    The EPA did not propose a specific effective date for this requirement in the monitoring rule and no commenters expressed implementation concerns. Accordingly, this final rule includes an effective date of January 1, 2007 for the sample retention requirement.

    In the proposal, rule requirements regarding sample retention were located in section 4.9 of appendix D, a section devoted to network design criteria. The EPA believes that sample retention requirements are more logically located in subpart B of part 58, which contains provisions on data submittal. Accordingly, the title of 40 CFR 58.16 (“Data submittal”) has been renamed “Data submittal and archiving requirements” and corresponding rule requirements on sample retention have been moved to 40 CFR 58.16(f) of this final rule.

    H. Deletion of Appendices B and F

    This final rule removes and reserves appendix B to 40 CFR part 58, Quality Assurance Requirements for Prevention of Significant Deterioration (PSD) Air Monitoring, and appendix F to 40 CFR part 58, Annual SLAMS Air Quality Information, because both are obsolete.

    The preamble to the proposed rule explicitly proposed to remove appendix B because the quality assurance requirements for PSD monitoring were proposed to be moved to appendix A, which this final rule does. See 71 FR 2725. (The amendatory language at the end of the January 17, 2006 proposal notice inadvertently did not list this change.) No adverse comments were received on this change.

    The January 17, 2006 notice did not explicitly address the preservation or removal of appendix F, but its effective removal was inherent in the proposed rule because no section of the proposed part 58 would continue to refer to appendix F. Similarly, the final part 58 does not refer to appendix F. Appendix F previously was referenced by 40 CFR 58.26 in subpart C, Annual state air monitoring report, now deleted. Appendix F specified the required content, which was extensive, of the annual report of summarized monitoring data; such an extensive annual report is no longer required under this final rule. The new section, 40 CFR 58.16, Data submittal, instead requires submission of individual data values. Summary information on monitoring data is still required by 40 CFR 58.15, Annual air monitoring data certification, for the sole purpose of making clear which data are within the scope of the required certification letter. This final rule does not specify the exact content of the summary information required by 40 CFR 58.15, in order to provide more flexibility and to accommodate possible evolution of the standardized AQS reports, which are the most convenient way for monitoring organizations to provide this information.

    VI. Statutory and Executive Order Reviews

    A. Executive Order 12866: Regulatory Planning and Review

    Under Executive Order 12866 (58 FR 51735, October 4, 1993), this action is a “significant regulatory action” because it may raise novel legal or policy issues arising out of legal mandates, the President's priorities, or the principles set forth in the Executive Order. Accordingly, EPA submitted this action to the Office of Management and Budget (OMB) for review under Executive Order 12866, and any changes made in response to OMB recommendations have been documented in the docket for this action.

    B. Paperwork Reduction Act

    The information collection requirements in this rule have been submitted for approval to the Office of Management and Budget (OMB) under the Paperwork Reduction Act, 44 U.S.C. 3501 et seq., and assigned OMB control number 2060-0084. The information collection requirements are not enforceable until OMB approves them.

    The monitoring, recordkeeping, and reporting requirements in 40 CFR parts 53 and 58 are specifically authorized by sections 110, 301(a), and 319 of the Clean Air Act (CAA). All information submitted to EPA pursuant to the monitoring, recordkeeping, and reporting requirements for which a claim of confidentiality is made is safeguarded according to Agency policies in 40 CFR part 2, subpart B.

    The information collected under 40 CFR part 53 (e.g., test results, monitoring records, instruction manual, and other associated information) is needed to determine whether a candidate method intended for use in determining attainment of the National Ambient Air Quality Standards (NAAQS) in 40 CFR part 50 will meet the design, performance, and/or comparability requirements for designation as a Federal reference method (FRM) or Federal equivalent method (FEM). The final amendments add requirements for PM10−2.5 FEM and FRM determinations, Class II equivalent methods for PM10−2.5 and Class III equivalent methods for PM2.5 and PM10−2.5; reduce certain monitoring and data collection requirements; and streamline EPA administrative requirements.

    The incremental annual reporting and recordkeeping burden for this collection of information under 40 CFR part 53 (averaged over the first 3 years of this ICR) for one additional respondent per year is estimated to increase by a total of 2,774 labor hours per year with an increase in costs of $32,000/year. The capital/startup costs for test equipment and qualifying tests are estimated at $3,832 with operation and maintenance costs of $27,772.

    The information collected and reported under 40 CFR part 58 is needed to determine compliance with the NAAQS, to characterize air quality and associated health and ecosystems impacts, to develop emission control strategies, and to measure progress for the air pollution program. The amendments revise the technical requirements for certain types of sites, add provisions for monitoring of PM10−2.5, and reduce certain monitoring requirements for criteria pollutants. Monitoring agencies are required to submit annual monitoring network plans, conduct network assessments every 5 years, perform quality assurance activities, and, in certain instances, establish NCore sites by January 1, 2011.

    The annual average reporting burden for the collection under 40 CFR part 58 (averaged over the first 3 years of this ICR) for 168 respondents is estimated to decrease by a total of 48,546 labor hours per year, with a decrease in costs of $6,151,494. State, local, and Tribal entities are eligible for State assistance grants provided by the Federal government under the CAA, which can be used for monitors and related activities.

    Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information.

    An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB control numbers for EPA's regulations in 40 CFR parts 53 and 58 are listed in 40 CFR part 9. When these ICRs are approved by OMB, EPA will publish a technical amendment to 40 CFR part 9 in the Federal Register to display the OMB control numbers for the approved information collection requirements contained in this final rule.

    C. Regulatory Flexibility Act

    The EPA has determined that it is not necessary to prepare a regulatory flexibility analysis in connection with these final rule amendments.

    For the purposes of assessing the impacts of the final amendments on small entities, small entity is defined as: (1) A small business as defined by the Small Business Administration's regulations at 13 CFR 121.201; (2) a government jurisdiction that is a government of a city, county, town, school district or special district with a population of less than 50,000; and (3) a small organization that is any not-for-profit enterprise which is independently owned and operated and that is not dominant in its field.

    After considering the economic impacts of these final rule amendments on small entities, EPA has concluded that this action will not have a significant economic impact on a substantial number of small entities. The final requirements in 40 CFR part 53 for an FEM application concern voluntary actions on the part of equipment manufacturers seeking EPA approval for their candidate sampling methods. The applications are evaluated, according to the requirements in 40 CFR part 53 and the test data submitted by the manufacturers to EPA, to ensure that the candidate equivalent methods meet the same technical standards as the FRM. The final amendments to 40 CFR part 58 will reduce annual ambient air monitoring costs for State and local agencies by approximately $6.2 million and 48,546 labor hours from present levels. State and Tribal assistance grant funding provided by the Federal government can be used to defray the costs of new or upgraded monitors for the NCore networks.

    D. Unfunded Mandates Reform Act

    Title II of the Unfunded Mandates Reform Act of 1995 (UMRA), Public Law 104-4, establishes requirements for Federal agencies to assess the effects of their regulatory actions on State, local, and Tribal governments and the private sector. Under section 202 of the UMRA, EPA generally must prepare a written statement, including a cost-benefit analysis, for proposed and final rules with “Federal mandates” that may result in expenditures to State, local, and Tribal governments, in the aggregate, or to the private sector, of $100 million or more in any one year. Before promulgating an EPA rule for which a written statement is needed, section 205 of the UMRA generally requires EPA to identify and consider a reasonable number of regulatory alternatives and adopt the least costly, most cost-effective or least burdensome alternative that achieves the objectives of the rule. The provisions of section 205 do not apply when they are inconsistent with applicable law. Moreover, section 205 allows EPA to adopt an alternative other than the least costly, most cost-effective or least burdensome alternative if the Administrator publishes with this final rule an explanation why that alternative was not adopted. Before EPA establishes any regulatory requirements that may significantly or uniquely affect small governments, including Tribal governments, it must have developed under section 203 of the UMRA a small government agency plan. The plan must provide for notifying potentially affected small governments, enabling officials of affected small governments to have meaningful and timely input in the development of EPA regulatory proposals with significant Federal intergovernmental mandates, and informing, educating, and advising small governments on compliance with the regulatory requirements.

    The EPA has determined that this final rule does not contain a Federal mandate that may result in expenditures of $100 million or more for State, local, and Tribal governments, in the aggregate, or the private sector in any one year. The final amendments to 40 CFR part 58 will reduce annual ambient air monitoring costs for State and local agencies by approximately $6.2 million and 48,546 labor hours from present levels. Thus, these final amendments are not subject to the requirements of sections 202 and 205 of the UMRA.

    The EPA has determined that this final rule contains no regulatory requirements that might significantly or uniquely affect small governments. Small governments that may be affected by the final amendments are already meeting similar requirements under the existing rules, and the final amendments will substantially reduce the costs of the existing rules. Therefore, this final rule is not subject to the requirements of section 203 of the UMRA.

    E. Executive Order 13132: Federalism

    Executive Order 13132 (64 FR 43255, August 10, 1999), requires EPA to develop an accountable process to ensure “meaningful and timely input by State and local officials in the development of regulatory policies that have federalism implications.” “Policies that have federalism implications” is defined in the Executive Order to include regulations that have “substantial direct effects on the States, on the relationship between the national government and the States, or on the distribution of power and responsibilities among the various levels of government.”

    This final rule does not have federalism implications because it will not have substantial direct effects on the States, on the relationship between the national government and the States, or on the distribution of power and responsibilities among the various levels of government, as specified in Executive Order 13132. Thus, Executive Order 13132 does not apply to this final rule.

    Although section 6 of the Executive Order does not apply to this final rule, EPA did consult with representatives of State and local governments early in the process of developing this proposed rule. In 2001, EPA organized a National Monitoring Steering Committee (NMSC) to provide oversight and guidance in reviewing the existing air pollution monitoring program and in developing a comprehensive national ambient air monitoring strategy. The NMSC membership includes representatives from EPA, State and local agencies, State and Territorial Air Pollution Program Administrators/Association of Local Air Pollution Control Officials (STAPPA/ALAPCO), and Tribal governments to reflect the partnership between EPA and governmental agencies that collect and use ambient air data. The NMSC formed workgroups to address quality assurance, technology, and regulatory review of the draft national ambient air monitoring strategy (NAAMS). These workgroups met several times by phone and at least once in a face-to-face workshop to develop specific recommendations for improving the ambient air monitoring program. A record of the Steering Committee members, workgroup members, and workshops is available on the Web at: http://www.epa.gov/ttn/amtic/monitor.html. The EPA also met with State, local, and Tribal government representatives to discuss their comments on the proposed amendments and suggestions for resolving issues.

    F. Executive Order 13175: Consultation and Coordination With Indian Tribal Governments

    Executive Order 13175, entitled “Consultation and Coordination with Indian Tribal Governments” (65 FR 67249, November 9, 2000), requires EPA to develop an accountable process to ensure “meaningful and timely input by tribal officials in the development of regulatory policies that have tribal implications.” This final rule does not have tribal implications, as specified in Executive Order 13175. The final amendments will not directly apply to Tribal governments. However, a Tribal government may elect to conduct ambient air monitoring and report the data to AQS. Since it is possible that Tribal governments may choose to establish and operate NCore sites as part of the national monitoring program, EPA consulted with Tribal officials early in the process of developing the proposed rule to permit them to have meaningful and timely input into its development and after proposal to discuss their comments and concerns. As discussed in section VI.E of this preamble, Tribal agencies were represented on both the NMSC and the workgroups that developed the NAAMS document and proposed monitoring requirements. Tribal monitoring programs were represented on both the Quality Assurance and Technology workgroups. Participation was also open to Tribal monitoring programs on the regulatory review workgroup.

    G. Executive Order 13045: Protection of Children From Environmental Health and Safety Risks

    Executive Order 13045 (62 FR 19885, April 23, 1997) applies to any rule that: (1) Is determined to be “economically significant” as defined under Executive Order 12866, and (2) concerns an environmental health or safety risk that EPA has reason to believe may have a disproportionate effect on children. If the regulatory action meets both criteria, EPA must evaluate the environmental health or safety effects of the planned rule on children, and explain why the planned regulation is preferable to other potentially effective and reasonably feasible alternatives considered by EPA.

    The EPA interprets Executive Order 13045 as applying only to those regulatory actions that are based on health or safety risks, such that the analysis required under section 5-501 of the Order has the potential to influence the regulation. This final rule is not subject to Executive Order 13045 because, while it is based on the need for monitoring data to characterize risk, this final monitoring rule itself does not establish an environmental standard intended to mitigate health or safety risks.

    H. Executive Order 12898: Federal Actions To Address Environmental Justice in Minority Populations and Low-Income Populations

    Executive Order 12898 (58 FR 7629, February 11, 1994) requires that each Federal agency make achieving environmental justice part of its mission by identifying and addressing, as appropriate, disproportionately high and adverse human health or environmental effects of its programs, policies, and activities on minority populations and low-income populations. These requirements have been addressed to the extent practicable in the Regulatory Impact Analysis (RIA) for the final revisions to the NAAQS for particulate matter.

    I. Executive Order 13211: Actions That Significantly Affect Energy Supply, Distribution, or Use

    This final rule is not a “significant energy action” as defined in Executive Order 13211, “Actions Concerning Regulations That Significantly Affect Energy Supply, Distribution or Use” (66 FR 28355, May 22, 2001) because it is not likely to have a significant adverse effect on the supply, distribution, or use of energy. No significant change in the use of energy is expected because the total number of monitors for ambient air quality measurements will not increase above present levels. Further, EPA has concluded that this final rule is not likely to have any adverse energy effects.

    J. National Technology Transfer and Advancement Act

    Section 12(d) of the National Technology Transfer and Advancement Act of 1995 (NTTAA), Public Law 104-113 (15 U.S.C. 272 note), directs EPA to use voluntary consensus standards in its regulatory activities unless to do so would be inconsistent with applicable law or otherwise impractical. Voluntary consensus standards are technical standards (e.g., materials specifications, test methods, sampling procedures, and business practices) that are developed or adopted by voluntary consensus standards bodies. The NTTAA directs EPA to provide Congress, through OMB, explanations when EPA decides not to use available and applicable voluntary consensus standards.

    The final amendments involve environmental monitoring and measurement. Ambient air concentrations of PM2.5 are currently measured by the Federal reference method in 40 CFR part 50, appendix L (Reference Method for the Determination of Fine Particulate Matter as PM2.5 in the Atmosphere) or by an FRM or FEM that meets the requirements in 40 CFR part 53. Ambient air concentrations of PM10−2.5 will be measured by the final FRM in 40 CFR part 50, appendix O (Reference Method for the Determination of Coarse Particulate Matter as PM10−2.5 in the Atmosphere), published elsewhere in this Federal Register, or by an FRM or FEM that meets the requirements in 40 CFR part 53. As discussed in section IV.B of this preamble, the final FRM for PM10−2.5 is similar to the existing methods for PM2.5 and PM10.

    Procedures are included in this final rule that allow for approval of an FEM for PM10−2.5 that is similar to the final FRM. Any method that meets the performance criteria for a candidate equivalent method may be approved for use as an FRM or FEM.

    This approach is consistent with EPA's Performance-Based Measurement System (PBMS). The PBMS approach is intended to be more flexible and cost effective for the regulated community; it is also intended to encourage innovation in analytical technology and improved data quality. The EPA is not precluding the use of any method, whether it constitutes a voluntary consensus standard or not, as long as it meets the specified performance criteria.

    K. Congressional Review Act

    The Congressional Review Act, 5 U.S.C. 801 et seq., as added by the Small Business Regulatory Enforcement Fairness Act of 1996, generally provides that before a rule may take effect, the agency promulgating the rule must submit a rule report, which includes a copy of the rule, to each House of Congress and to the Comptroller General of the United States. The EPA will submit a report containing the final amendments and other required information to the U.S. Senate, the U.S. House of Representatives, and the Comptroller General of the United States prior to publication of the final amendments in the Federal Register. A major rule cannot take effect until 60 days after it is published in the Federal Register. This action is not a “major rule” as defined by 5 U.S.C. 804(2). This final rule will not have an annual effect on the economy of $100 million or more, will not result in a major increase in costs or prices for State or local agencies, and will not affect competition with foreign-based enterprises in domestic and export markets. The final amendments will be effective on December 18, 2006, 60 days after publication in the Federal Register, consistent with the effective date of the revised NAAQS for PM published elsewhere in this Federal Register.

    Start List of Subjects

    List of Subjects in 40 CFR Parts 53 and 58

    • Environmental protection
    • Administrative practice and procedure
    • Air pollution control
    • Intergovernmental relations
    • Reporting and recordkeeping requirements
    End List of Subjects Start Signature

    Dated: September 27, 2006.

    Stephen L. Johnson,

    Administrator.

    End Signature Start Amendment Part

    For the reasons set out in the preamble, title 40, chapter I, parts 53 and 58 of the Code of Federal Regulations are amended as follows:

    End Amendment Part Start Part

    PART 53—[AMENDED]

    End Part Start Amendment Part

    1. The authority citation for part 53 continues to read as follows:

    End Amendment Part Start Authority

    Authority: Section 301(a) of the Clean Air Act (42 U.S.C. sec. 1857g(a)), as amended by sec. 15(c)(2) of Pub. L. 91-604, 84 Stat. 1713, unless otherwise noted.

    End Authority

    Subpart A—[Amended]

    Start Amendment Part

    2. Sections 53.1 through 53.5 are revised to read as follows:

    End Amendment Part
    § 53.1 Definitions.

    Terms used but not defined in this part shall have the meaning given them by the Act.

    Act means the Clean Air Act (42 U.S.C. 1857-1857l), as amended.

    Additive and multiplicative bias means the linear regression intercept and slope of a linear plot fitted to corresponding candidate and reference method mean measurement data pairs.
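
    As a worked form of this definition (the symbols are editorial, not regulatory text): if candidate-method mean measurements y_i are fitted against corresponding reference-method means x_i by linear regression,

    y_i = A + B\,x_i + \varepsilon_i ,

    then the intercept A is the additive bias and the slope B is the multiplicative bias; an unbiased candidate method would yield A \approx 0 and B \approx 1.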

    Administrator means the Administrator of the Environmental Protection Agency (EPA) or his or her authorized representative.

    Agency means the Environmental Protection Agency.

    Applicant means a person or entity who submits an application for a Federal reference method or Federal equivalent method determination under § 53.4, or a person or entity who assumes the rights and obligations of an applicant under § 53.7. Applicant may include a manufacturer, distributor, supplier, or vendor.

    Automated method or analyzer means a method for measuring concentrations of an ambient air pollutant in which sample collection (if necessary), analysis, and measurement are performed automatically by an instrument.

    Candidate method means a method for measuring the concentration of an air pollutant in the ambient air for which an application for a Federal reference method determination or a Federal equivalent method determination is submitted in accordance with § 53.4, or a method tested at the initiative of the Administrator in accordance with § 53.7.

    Class I equivalent method means an equivalent method for PM2.5 or PM10−2.5 which is based on a sampler that is very similar to the sampler specified for reference methods in appendix L or appendix O (as applicable) of part 50 of this chapter, with only minor deviations or modifications, as determined by EPA.

    Class II equivalent method means an equivalent method for PM2.5 or PM10−2.5 that utilizes a PM2.5 sampler or PM10−2.5 sampler in which integrated PM2.5 samples or PM10−2.5 samples are obtained from the atmosphere by filtration and subjected to a subsequent filter conditioning process followed by a gravimetric mass determination, but which is not a Class I equivalent method because of substantial deviations from the design specifications of the sampler specified for reference methods in appendix L or appendix O (as applicable) of part 50 of this chapter, as determined by EPA.

    Class III equivalent method means an equivalent method for PM2.5 or PM10−2.5 that is an analyzer capable of providing PM2.5 or PM10−2.5 ambient air measurements representative of one-hour or less integrated PM2.5 or PM10−2.5 concentrations as well as 24-hour measurements determined as, or equivalent to, the mean of 24 one-hour consecutive measurements.
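
    In equation form (an editorial illustration consistent with this definition), a Class III 24-hour value is determined as, or equivalent to,

    \bar{C}_{\mathrm{24\,hr}} = \frac{1}{24} \sum_{i=1}^{24} C_{\mathrm{1\,hr},\,i} ,

    the mean of 24 consecutive one-hour measurements.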

    CO means carbon monoxide.

    Collocated means two or more air samplers, analyzers, or other instruments that are operated simultaneously while located side by side, separated by a distance that is large enough to preclude the air sampled by any of the devices from being affected by any of the other devices, but small enough so that all devices obtain identical or uniform ambient air samples that are equally representative of the general area in which the group of devices is located.

    Federal equivalent method (FEM) means a method for measuring the concentration of an air pollutant in the ambient air that has been designated as an equivalent method in accordance with this part; it does not include a method for which an equivalent method designation has been canceled in accordance with § 53.11 or § 53.16.

    Federal reference method (FRM) means a method of sampling and analyzing the ambient air for an air pollutant that is specified as a reference method in an appendix to part 50 of this chapter, or a method that has been designated as a reference method in accordance with this part; it does not include a method for which a reference method designation has been canceled in accordance with § 53.11 or § 53.16.

    ISO 9001-registered facility means a manufacturing facility that is either:

    (1) An International Organization for Standardization (ISO) 9001-registered manufacturing facility, registered to the ISO 9001 standard (by the Registrar Accreditation Board (RAB) of the American Society for Quality Control (ASQC) in the United States), with registration maintained continuously; or

    (2) A facility that can be demonstrated, on the basis of information submitted to the EPA, to be operated according to an EPA-approved and periodically audited quality system which meets, to the extent appropriate, the same general requirements as an ISO 9001-registered facility for the design and manufacture of designated Federal reference method and Federal equivalent method samplers and monitors.

    ISO-certified auditor means an auditor who is either certified by the Registrar Accreditation Board (in the United States) as being qualified to audit quality systems using the requirements of recognized standards such as ISO 9001, or who, based on information submitted to the EPA, meets the same general requirements as provided for ISO-certified auditors.

    Manual method means a method for measuring concentrations of an ambient air pollutant in which sample collection, analysis, or measurement, or some combination thereof, is performed manually. A method for PM10 or PM2.5 which utilizes a sampler that requires manual preparation, loading, and weighing of filter samples is considered a manual method even though the sampler may be capable of automatically collecting a series of sequential samples.

    NO means nitric oxide.

    NO2 means nitrogen dioxide.

    NOX means oxides of nitrogen and is defined as the sum of the concentrations of NO2 and NO.

    O3 means ozone.

    Operated simultaneously means that two or more collocated samplers or analyzers are operated concurrently with no significant difference in the start time, stop time, and duration of the sampling or measurement period.

    Pb means lead.

    PM means PM10, PM10C, PM2.5, PM10−2.5, or particulate matter of unspecified size range.

    PM2.5 means particulate matter with an aerodynamic diameter less than or equal to a nominal 2.5 micrometers as measured by a reference method based on appendix L of part 50 of this chapter and designated in accordance with this part, by an equivalent method designated in accordance with this part, or by an approved regional method designated in accordance with appendix C of part 58 of this chapter.

    PM10 means particulate matter with an aerodynamic diameter less than or equal to a nominal 10 micrometers as measured by a reference method based on appendix J of part 50 of this chapter and designated in accordance with this part or by an equivalent method designated in accordance with this part.

    PM10C means particulate matter with an aerodynamic diameter less than or equal to a nominal 10 micrometers as measured by a reference method based on appendix O of part 50 of this chapter and designated in accordance with this part or by an equivalent method designated in accordance with this part.

    PM10−2.5 means particulate matter with an aerodynamic diameter less than or equal to a nominal 10 micrometers and greater than a nominal 2.5 micrometers as measured by a reference method based on appendix O to part 50 of this chapter and designated in accordance with this part or by an equivalent method designated in accordance with this part.

    PM2.5 sampler means a device, associated with a manual method for measuring PM2.5, designed to collect PM2.5 from an ambient air sample, but lacking the ability to automatically analyze or measure the collected sample to determine the mass concentrations of PM2.5 in the sampled air.

    PM10 sampler means a device, associated with a manual method for measuring PM10, designed to collect PM10 from an ambient air sample, but lacking the ability to automatically analyze or measure the collected sample to determine the mass concentrations of PM10 in the sampled air.

    PM10C sampler means a PM10 sampler that meets the special requirements for a PM10C sampler that is part of a PM10−2.5 reference method sampler, as specified in appendix O to part 50 of this chapter, or a PM10 sampler that is part of a PM10−2.5 sampler that has been designated as an equivalent method for PM10−2.5.

    PM10−2.5 sampler means a sampler, or a collocated pair of samplers, associated with a manual method for measuring PM10−2.5 and designed to collect either PM10−2.5 directly or PM10C and PM2.5 separately and simultaneously from concurrent ambient air samples, but lacking the ability to automatically analyze or measure the collected sample(s) to determine the mass concentrations of PM10−2.5 in the sampled air.

    Sequential samples for PM samplers means two or more PM samples for sequential (but not necessarily contiguous) time periods that are collected automatically by the same sampler without the need for intervening operator service.

    SO2 means sulfur dioxide.

    Test analyzer means an analyzer subjected to testing as part of a candidate method in accordance with subparts B, C, D, E, or F of this part, as applicable.

    Test sampler means a PM10 sampler, PM2.5 sampler, or PM10−2.5 sampler subjected to testing as part of a candidate method in accordance with subparts C, D, E, or F of this part.

    Ultimate purchaser means the first person or entity who purchases a Federal reference method or a Federal equivalent method for purposes other than resale.

    § 53.2 General requirements for a reference method determination.

    The following general requirements for a Federal reference method (FRM) determination are summarized in table A-1 of this subpart.

    (a) Manual methods—(1) Sulfur dioxide (SO2) and lead. Appendices A and G of part 50 of this chapter specify unique manual FRMs for measuring SO2 and lead, respectively. Except as provided in § 53.16, other manual methods for SO2 and lead will not be considered for FRM determinations under this part.

    (2) PM10. An FRM for measuring PM10 must be a manual method that meets all requirements specified in appendix J of part 50 of this chapter and must include a PM10 sampler that has been shown in accordance with this part to meet all requirements specified in this subpart A and subpart D of this part.

    (3) PM2.5. An FRM for measuring PM2.5 must be a manual method that meets all requirements specified in appendix L of part 50 of this chapter and must include a PM2.5 sampler that has been shown in accordance with this part to meet the applicable requirements specified in this subpart A and subpart E of this part. Further, FRM samplers must be manufactured in an ISO 9001-registered facility, as defined in § 53.1 and as set forth in § 53.51.

    (4) PM10−2.5. An FRM for measuring PM10−2.5 must be a manual method that meets all requirements specified in appendix O of part 50 of this chapter and must include PM10C and PM2.5 samplers that have been shown in accordance with this part to meet the applicable requirements specified in this subpart A and subpart E of this part. Further, PM10−2.5 FRM samplers must be manufactured in an ISO 9001-registered facility, as defined in § 53.1 and as set forth in § 53.51.

    (b) Automated methods. An automated FRM for measuring CO, O3, or NO2 must utilize the measurement principle and calibration procedure specified in the appropriate appendix to part 50 of this chapter and must have been shown in accordance with this part to meet the requirements specified in this subpart A and subpart B of this part.

    § 53.3 General requirements for an equivalent method determination.

    (a) Manual methods. A manual Federal equivalent method (FEM) must have been shown in accordance with this part to satisfy the applicable requirements specified in this subpart A and subpart C of this part. In addition, a PM sampler associated with a manual method for PM10, PM2.5, or PM10−2.5 must have been shown in accordance with this part to satisfy the following additional requirements, as applicable:

    (1) PM10. A PM10 sampler associated with a manual method for PM10 must satisfy the requirements of subpart D of this part.

    (2) PM2.5 Class I. A PM2.5 Class I FEM sampler must also satisfy all requirements of subpart E of this part, which shall include appropriate demonstration that each and every deviation or modification from the FRM sampler specifications does not significantly alter the performance of the sampler.

    (3) PM2.5 Class II. (i) A PM2.5 Class II FEM sampler must also satisfy the applicable requirements of subparts E and F of this part or the alternative requirements in paragraph (a)(3)(ii) of this section.

    (ii) In lieu of the applicable requirements specified for Class II PM2.5 methods in subparts C and F of this part, a Class II PM2.5 FEM sampler may alternatively meet the applicable requirements in paragraphs (b)(3)(i) through (iii) of this section and the testing, performance, and comparability requirements specified for Class III equivalent methods for PM2.5 in subpart C of this part.

    (4) PM10−2.5 Class I. A PM10−2.5 Class I FEM sampler must also satisfy the applicable requirements of subpart E of this part (there are no additional requirements specifically for Class I PM10−2.5 methods in subpart C of this part).

    (5) PM10−2.5 Class II. (i) A PM10−2.5 Class II FEM sampler must also satisfy the applicable requirements of subpart C of this part and also the applicable requirements and provisions of paragraphs (b)(3)(i) through (iii) of this section, or the alternative requirements in paragraph (a)(5)(ii) of this section.

    (ii) In lieu of the applicable requirements specified for Class II PM10−2.5 methods in subpart C of this part and in paragraph (b)(3)(iii) of this section, a Class II PM10−2.5 FEM sampler may alternatively meet the applicable requirements in paragraphs (b)(3)(i) and (ii) of this section and the testing, performance, and comparability requirements specified for Class III FEMs for PM10−2.5 in subpart C of this part.

    (6) ISO 9001. All designated FEMs for PM2.5 or PM10−2.5 must be manufactured in an ISO 9001-registered facility, as defined in § 53.1 and as set forth in § 53.51.

    (b) Automated methods. All types of automated FEMs must have been shown in accordance with this part to satisfy the applicable requirements specified in this subpart A and subpart C of this part. In addition, an automated FEM must have been shown in accordance with this part to satisfy the following additional requirements, as applicable:

    (1) An automated FEM for pollutants other than PM must be shown in accordance with this part to satisfy the applicable requirements specified in subpart B of this part.

    (2) An automated FEM for PM10 must be shown in accordance with this part to satisfy the applicable requirements of subpart D of this part.

    (3) A Class III automated FEM for PM2.5 or PM10−2.5 must be shown in accordance with this part to satisfy the requirements in paragraphs (b)(3)(i) through (iii) of this section, as applicable.

    (i) All pertinent requirements of 40 CFR part 50, appendix L, including sampling height, range of operational conditions, ambient temperature and pressure sensors, outdoor enclosure, electrical power supply, control devices and operator interfaces, data output port, operation/instruction manual, data output and reporting requirements, and any other requirements that would be reasonably applicable to the method, unless adequate (as determined by the Administrator) rationale can be provided to support the contention that a particular requirement does not or should not be applicable to the particular candidate method.

    (ii) All pertinent tests and requirements of subpart E of this part, such as instrument manufacturing quality control; final assembly and inspection; manufacturer's audit checklists; leak checks; flow rate accuracy, measurement accuracy, and flow rate cut-off; operation following power interruptions; effect of variations in power line voltage, ambient temperature and ambient pressure; and aerosol transport; unless adequate (as determined by the Administrator) rationale can be provided to support the contention that a particular test or requirement does not or should not be applicable to the particular candidate method.

    (iii) Candidate methods shall be tested for and meet any performance requirements, such as inlet aspiration, particle size separation or selection characteristics, change in particle separation or selection characteristics due to loading or other operational conditions, or effects of surface exposure and particle volatility, determined by the Administrator to be necessary based on the nature, design, and specifics of the candidate method and the extent to which it deviates from the design and performance characteristics of the reference method. These performance requirements and the specific test(s) for them will be determined by the Administrator for each specific candidate method or type of candidate method and may be similar to or based on corresponding tests and requirements set forth in subpart F of this part or may be special requirements and tests tailored by the Administrator to the specific nature, design, and operational characteristics of the candidate method. For example, a candidate method with an inlet design deviating substantially from the design of the reference method inlet would likely be subject to an inlet aspiration test similar to that set forth in § 53.63. Similarly, a candidate method having an inertial fractionation system substantially different from that of the reference method would likely be subject to a static fractionation test and a loading test similar to those set forth in §§ 53.64 and 53.65, respectively. A candidate method with more extensive or profound deviations from the design and function of the reference method may be subject to other tests, to full wind-tunnel tests similar to those described in § 53.62, or to special tests adapted or developed individually to accommodate the specific type of measurement or operation of the candidate method.

    (4) All designated FEMs for PM2.5 or PM10−2.5 must be manufactured in an ISO 9001-registered facility, as defined in § 53.1 and as set forth in § 53.51.

    § 53.4 Applications for reference or equivalent method determinations.

    (a) Applications for FRM or FEM determinations shall be submitted in duplicate to: Director, National Exposure Research Laboratory, Reference and Equivalent Method Program (MD-D205-03), U.S. Environmental Protection Agency, Research Triangle Park, North Carolina 27711 (Commercial delivery address: 4930 Old Page Road, Durham, North Carolina 27703).

    (b) Each application shall be signed by an authorized representative of the applicant, shall be marked in accordance with § 53.15 (if applicable), and shall contain the following:

    (1) A clear identification of the candidate method, which will distinguish it from all other methods such that the method may be referred to unambiguously. This identification must consist of a unique series of descriptors such as title, identification number, analyte, measurement principle, manufacturer, brand, model, etc., as necessary to distinguish the method from all other methods or method variations, both within and outside the applicant's organization.

    (2) A detailed description of the candidate method, including but not limited to the following: The measurement principle, manufacturer, name, model number and other forms of identification, a list of the significant components, schematic diagrams, design drawings, and a detailed description of the apparatus and measurement procedures. Drawings and descriptions pertaining to candidate methods or samplers for PM2.5 or PM10−2.5 must meet all applicable requirements in reference 1 of appendix A of this subpart, using appropriate graphical, nomenclature, and mathematical conventions such as those specified in references 3 and 4 of appendix A of this subpart.

    (3) A copy of a comprehensive operation or instruction manual providing a complete and detailed description of the operational, maintenance, and calibration procedures prescribed for field use of the candidate method and all instruments utilized as part of that method (under § 53.9(a)).

    (i) As a minimum this manual shall include:

    (A) Description of the method and associated instruments.

    (B) Explanation of all indicators, information displays, and controls.

    (C) Complete setup and installation instructions, including any additional materials or supplies required.

    (D) Details of all initial or startup checks or acceptance tests and any auxiliary equipment required.

    (E) Complete operational instructions.

    (F) Calibration procedures and descriptions of required calibration equipment and standards.

    (G) Instructions for verification of correct or proper operation.

    (H) Trouble-shooting guidance and suggested corrective actions for abnormal operation.

    (I) Required or recommended routine, periodic, and preventative maintenance and maintenance schedules.

    (J) Any calculations required to derive final concentration measurements.

    (K) Appropriate references to any applicable appendix of part 50 of this chapter; reference 6 of appendix A of this subpart; and any other pertinent guidelines.

    (ii) The manual shall also include adequate warning of potential safety hazards that may result from normal use and/or malfunction of the method and a description of necessary safety precautions. (See § 53.9(b).) However, the previous requirement shall not be interpreted to constitute or imply any warranty of safety of the method by EPA. For samplers and automated methods, the manual shall include a clear description of all procedures pertaining to installation, operation, preventive maintenance, and troubleshooting and shall also include parts identification diagrams. The manual may be used to satisfy the requirements of paragraphs (b)(1) and (2) of this section to the extent that it includes information necessary to meet those requirements.

    (4) A statement that the candidate method has been tested in accordance with the procedures described in subparts B, C, D, E, and/or F of this part, as applicable.

    (5) Descriptions of test facilities and test configurations, test data, records, calculations, and test results as specified in subparts B, C, D, E, and/or F of this part, as applicable. Data must be sufficiently detailed to meet appropriate principles described in part B, sections 3.3.1 (paragraph 1) and 3.5.1 and part C, section 4.6 of reference 2 of appendix A of this subpart; and in paragraphs 1 through 3 of section 4.8 (Records) of reference 5 of appendix A of this subpart. Salient requirements from these references include the following:

    (i) The applicant shall maintain and include records of all relevant measuring equipment, including the make, type, and serial number or other identification, and most recent calibration with identification of the measurement standard or standards used and their National Institute of Standards and Technology (NIST) traceability. These records shall demonstrate the measurement capability of each item of measuring equipment used for the application and include a description and justification (if needed) of the measurement setup or configuration in which it was used for the tests. The calibration results shall be recorded and identified in sufficient detail so that the traceability of all measurements can be determined and any measurement could be reproduced under conditions close to the original conditions, if necessary, to resolve any anomalies.

    (ii) Test data shall be collected according to the standards of good practice and by qualified personnel. Test anomalies or irregularities shall be documented and explained or justified. The impact and significance of the deviation on test results and conclusions shall be determined. Data collected shall correspond directly to the specified test requirement and be labeled and identified clearly so that results can be verified and evaluated against the test requirement. Calculations or data manipulations must be explained in detail so that they can be verified.

    (6) A statement that the method, analyzer, or sampler tested in accordance with this part is representative of the candidate method described in the application.

    (c) For candidate automated methods and candidate manual methods for PM10, PM2.5, and PM10−2.5 the application shall also contain the following:

    (1) A detailed description of the quality system that will be utilized, if the candidate method is designated as a reference or equivalent method, to ensure that all analyzers or samplers offered for sale under that designation will have essentially the same performance characteristics as the analyzer(s) or samplers tested in accordance with this part. In addition, the quality system requirements for candidate methods for PM2.5 and PM10−2.5 must be described in sufficient detail, based on the elements described in section 4 of reference 1 (Quality System Requirements) of appendix A of this subpart. Further clarification is provided in the following sections of reference 2 of appendix A of this subpart: part A (Management Systems), sections 2.2 (Quality System and Description), 2.3 (Personnel Qualification and Training), 2.4 (Procurement of Items and Services), 2.5 (Documents and Records), and 2.7 (Planning); part B (Collection and Evaluation of Environmental Data), sections 3.1 (Planning and Scoping), 3.2 (Design of Data Collection Operations), and 3.5 (Assessment and Verification of Data Usability); and part C (Operation of Environmental Technology), sections 4.1 (Planning), 4.2 (Design of Systems), and 4.4 (Operation of Systems).

    (2) A description of the durability characteristics of such analyzers or samplers (see § 53.9(c)). For methods for PM2.5 and PM10−2.5, the warranty program must ensure that the required specifications (see Table A-1 to this subpart) will be met throughout the warranty period and that the applicant accepts responsibility and liability for ensuring this conformance or for resolving any nonconformities, including all necessary components of the system, regardless of the original manufacturer. The warranty program must be described in sufficient detail to meet appropriate provisions of the ANSI/ASQC and ISO 9001 standards (references 1 and 2 in appendix A of this subpart) for controlling conformance and resolving nonconformance, particularly sections 4.12, 4.13, and 4.14 of reference 1 in appendix A of this subpart.

    (i) Section 4.12 in reference 1 of appendix A of this subpart requires the manufacturer to establish and maintain a system of procedures for identifying and maintaining the identification of inspection and test status throughout all phases of manufacturing to ensure that only instruments that have passed the required inspections and tests are released for sale.

    (ii) Section 4.13 in reference 1 of appendix A of this subpart requires documented procedures for control of nonconforming product, including review and acceptable alternatives for disposition; section 4.14 in reference 1 of appendix A of this subpart requires documented procedures for implementing corrective (4.14.2) and preventive (4.14.3) action to eliminate the causes of actual or potential nonconformities. In particular, section 4.14.3 requires that information such as service reports and customer complaints be used to detect, analyze, and eliminate potential causes of nonconformities.

    (d) For candidate reference or equivalent methods for PM2.5 and Class II or Class III equivalent methods for PM10−2.5, the applicant, if requested by EPA, shall provide to EPA for test purposes one sampler or analyzer that is representative of the sampler or analyzer associated with the candidate method. The sampler or analyzer shall be shipped FOB destination to Director, National Exposure Research Laboratory, Reference and Equivalent Method Program (MD-D205-03), U.S. Environmental Protection Agency, 4930 Old Page Road, Durham, North Carolina 27703, scheduled to arrive concurrently with or within 30 days of the arrival of the other application materials. This analyzer or sampler may be subjected to various tests that EPA determines to be necessary or appropriate under § 53.5(f), and such tests may include special tests not described in this part. If the instrument submitted under this paragraph malfunctions, becomes inoperative, or fails to perform as represented in the application before the necessary EPA testing is completed, the applicant shall be afforded an opportunity to repair or replace the device at no cost to EPA. Upon completion of EPA testing, the analyzer or sampler submitted under this paragraph shall be repacked by EPA for return shipment to the applicant, using the same packing materials used for shipping the instrument to EPA unless alternative packing is provided by the applicant. Arrangements for, and the cost of, return shipment shall be the responsibility of the applicant. The EPA does not warrant or assume any liability for the condition of the analyzer or sampler upon return to the applicant.

    § 53.5 Processing of applications.

    Within 120 calendar days after receiving an application for a FRM or FEM determination, the Administrator will take one or more of the following actions:

    (a) Send notice to the applicant, in accordance with § 53.8, that the candidate method has been determined to be a reference or equivalent method.

    (b) Send notice to the applicant that the application has been rejected, including a statement of reasons for rejection.

    (c) Send notice to the applicant that additional information must be submitted before a determination can be made and specify the additional information that is needed (in such cases, the 120-day period shall commence upon receipt of the additional information).

    (d) Send notice to the applicant that additional test data must be submitted and specify what tests are necessary and how the tests shall be interpreted (in such cases, the 120-day period shall commence upon receipt of the additional test data).

    (e) Send notice to the applicant that the application has been found to be substantially deficient or incomplete and cannot be processed until additional information is submitted to complete the application and specify the general areas of substantial deficiency.

    (f) Send notice to the applicant that additional tests will be conducted by the Administrator, specifying the nature of and reasons for the additional tests and the estimated time required (in such cases, the 120-day period shall commence 1 calendar day after the additional tests have been completed).

    3. Sections 53.8 and 53.9 are revised to read as follows:

    § 53.8 Designation of reference and equivalent methods.

    (a) A candidate method determined by the Administrator to satisfy the applicable requirements of this part shall be designated as a FRM or FEM (as applicable) by and upon publication of a notice of the designation in the Federal Register.

    (b) Upon designation, a notice indicating that the method has been designated as a FRM or FEM shall be sent to the applicant.

    (c) The Administrator will maintain a current list of methods designated as FRM or FEM in accordance with this part and will send a copy of the list to any person or group upon request. A copy of the list will be available for inspection or copying at EPA Regional Offices and may be available via the Internet or other sources.

    § 53.9 Conditions of designation.

    Designation of a candidate method as a FRM or FEM shall be conditioned on the applicant's compliance with the following requirements. Failure to comply with any of the requirements shall constitute grounds for cancellation of the designation in accordance with § 53.11.

    (a) Any method offered for sale as a FRM or FEM shall be accompanied by a copy of the manual referred to in § 53.4(b)(3) when delivered to any ultimate purchaser, and an electronic copy of the manual suitable for incorporating into user-specific standard operating procedure documents shall be readily available to any users.

    (b) Any method offered for sale as a FRM or FEM shall generate no unreasonable hazard to operators or to the environment during normal use or when malfunctioning.

    (c) Any analyzer, PM10 sampler, PM2.5 sampler, or PM10−2.5 sampler offered for sale as part of a FRM or FEM shall function within the limits of the performance specifications referred to in § 53.20(a), § 53.30(a), § 53.50, or § 53.60, as applicable, for at least 1 year after delivery and acceptance when maintained and operated in accordance with the manual referred to in § 53.4(b)(3).

    (d) Any analyzer, PM10 sampler, PM2.5 sampler, or PM10−2.5 sampler offered for sale as a FRM or FEM shall bear a prominent, permanently affixed label or sticker indicating that the analyzer or sampler has been designated by EPA as a FRM or FEM (as applicable) in accordance with this part and displaying any designated method identification number that may be assigned by EPA.

    (e) If an analyzer is offered for sale as a FRM or FEM and has one or more selectable ranges, the label or sticker required by paragraph (d) of this section shall be placed in close proximity to the range selector and shall indicate clearly which range or ranges have been designated as parts of the FRM or FEM.

    (f) An applicant who offers analyzers, PM10 samplers, PM2.5 samplers, or PM10−2.5 samplers for sale as FRM or FEMs shall maintain an accurate and current list of the names and mailing addresses of all ultimate purchasers of such analyzers or samplers. For a period of 7 years after publication of the FRM or FEM designation applicable to such an analyzer or sampler, the applicant shall notify all ultimate purchasers of the analyzer or sampler within 30 days if the designation has been canceled in accordance with § 53.11 or § 53.16 or if adjustment of the analyzer or sampler is necessary under § 53.11(b).

    (g) If an applicant modifies an analyzer, PM10 sampler, PM2.5 sampler, or PM10−2.5 sampler that has been designated as a FRM or FEM, the applicant shall not sell the modified analyzer or sampler as a reference or equivalent method nor attach a label or sticker to the modified analyzer or sampler under paragraph (d) or (e) of this section until the applicant has received notice under § 53.14(c) that the existing designation or a new designation will apply to the modified analyzer or sampler or has applied for and received notice under § 53.8(b) of a new FRM or FEM determination for the modified analyzer or sampler.

    (h) An applicant who has offered PM2.5 or PM10−2.5 samplers or analyzers for sale as part of a FRM or FEM may continue to do so only so long as the facility in which the samplers or analyzers are manufactured continues to be an ISO 9001-registered facility, as set forth in subpart E of this part. In the event that the ISO 9001 registration for the facility is withdrawn, suspended, or otherwise becomes inapplicable, either permanently or for some specified time interval, such that the facility is no longer an ISO 9001-registered facility, the applicant shall notify EPA within 30 days of the date the facility becomes other than an ISO 9001-registered facility, and upon such notification, EPA shall issue a preliminary finding and notification of possible cancellation of the FRM or FEM designation under § 53.11.

    (i) An applicant who has offered PM2.5 or PM10−2.5 samplers or analyzers for sale as part of a FRM or FEM may continue to do so only so long as updates of the Product Manufacturing Checklist set forth in subpart E of this part are submitted annually. In the event that an annual Checklist update is not received by EPA within 12 months of the date of the last such submitted Checklist or Checklist update, EPA shall notify the applicant within 30 days that the Checklist update has not been received and shall, within 30 days from the issuance of such notification, issue a preliminary finding and notification of possible cancellation of the reference or equivalent method designation under § 53.11.

    4. Table A-1 to subpart A of part 53 is revised to read as follows:

    Table A-1 to Subpart A of Part 53.—Summary of Applicable Requirements for Reference and Equivalent Methods for Air Monitoring of Criteria Pollutants.

    Pollutant | Reference or equivalent | Manual or automated | Applicable part 50 appendix | Applicable subparts of part 53 (A, B, C, D, E, F)
    SO2 | Reference | Manual | A |
    SO2 | Equivalent | Manual | |
    SO2 | Equivalent | Automated | |
    CO | Reference | Automated | C |
    CO | Equivalent | Manual | |
    CO | Equivalent | Automated | |
    O3 | Reference | Automated | D |
    O3 | Equivalent | Manual | |
    O3 | Equivalent | Automated | |
    NO2 | Reference | Automated | F |
    NO2 | Equivalent | Manual | |
    NO2 | Equivalent | Automated | |
    Pb | Reference | Manual | G |
    Pb | Equivalent | Manual | |
    PM10 | Reference | Manual | J |
    PM10 | Equivalent | Manual | |
    PM10 | Equivalent | Automated | |
    PM2.5 | Reference | Manual | L |
    PM2.5 | Equivalent Class I | Manual | L |
    PM2.5 | Equivalent Class II | Manual | L | 121,2
    PM2.5 | Equivalent Class III | Automated | L | 111
    PM10−2.5 | Reference | Manual | O | 2
    PM10−2.5 | Equivalent Class I | Manual | O | 2
    PM10−2.5 | Equivalent Class II | Manual | O | 2211, 2
    PM10−2.5 | Equivalent Class III | Automated | L1, O1, 2 | 11
    1 Some requirements may apply, based on the nature of each particular candidate method, as determined by the Administrator.
    2 Alternative Class III requirements may be substituted.

    5. Paragraphs (1), (2), and (6) of appendix A to subpart A of part 53 are revised to read as follows:


    Appendix A to Subpart A of Part 53—References

    (1) American National Standard Quality Systems—Model for Quality Assurance in Design, Development, Production, Installation, and Servicing, ANSI/ISO/ASQC Q9001-1994. Available from American Society for Quality, P.O. Box 3005, Milwaukee, WI 53202 (http://qualitypress.asq.org).

    (2) American National Standard Quality Systems for Environmental Data and Technology Programs—Requirements with guidance for use, ANSI/ASQC E4-2004. Available from American Society for Quality, P.O. Box 3005, Milwaukee, WI 53202 (http://qualitypress.asq.org).

    * * * * *

    (6) Quality Assurance Guidance Document 2.12. Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent Methods. U.S. EPA, National Exposure Research Laboratory, Research Triangle Park, NC, November 1998 or later edition. Currently available at http://www.epa.gov/ttn/amtic/pmqainf.html.


    6. Subpart C is revised to read as follows:

    Subpart C—Procedures for Determining Comparability Between Candidate Methods and Reference Methods
    53.30
    General provisions.
    53.31
    [Reserved]
    53.32
    Test procedures for methods for SO2, CO, O3, and NO2.
    53.33
    Test procedure for methods for Pb.
    53.34
    Test procedures for methods for PM10 and Class I methods for PM2.5.
    53.35
    Test procedures for Class II and Class III methods for PM2.5 and PM10−2.5.

    Tables to Subpart C of Part 53

    Table C-1 to Subpart C of Part 53—Test Concentration Ranges, Number of Measurements Required, and Maximum Discrepancy Specification

    Table C-2 to Subpart C of Part 53—Sequence of Test Measurements

    Table C-3 to Subpart C of Part 53—Test Specifications for Pb Methods

    Table C-4 to Subpart C of Part 53—Test Specifications for PM10, PM2.5, and PM10−2.5 Candidate Equivalent Methods

    Table C-5 to Subpart C of Part 53—Summary of Comparability Field Testing Campaign Site and Seasonal Requirements for Class II and III FEMs for PM10−2.5 and PM2.5

    Figures to Subpart C of Part 53

    Figure C-1 to Subpart C of Part 53—Suggested Format for Reporting Test Results for Methods for SO2, CO, O3, NO2

    Figure C-2 to Subpart C of Part 53—Illustration of the Slope and Intercept Limits for Class II and Class III PM2.5 Candidate Equivalent Methods

    Figure C-3 to Subpart C of Part 53—Illustration of the Slope and Intercept Limits for Class II and Class III PM10−2.5 Candidate Equivalent Methods

    Figure C-4 to Subpart C of Part 53—Illustration of the Minimum Limits for Correlation Coefficient for PM2.5 and PM10−2.5 Class II and III Methods

    Appendix to Subpart C of Part 53

    Appendix A to Subpart C of Part 53—References

    Subpart C—Procedures for Determining Comparability Between Candidate Methods and Reference Methods

    § 53.30 General provisions.

    (a) Determination of comparability. The test procedures prescribed in this subpart shall be used to determine if a candidate method is comparable to a reference method when both methods measure pollutant concentrations in ambient air. At the discretion of the Administrator, minor deviations from the testing requirements and acceptance requirements set forth in this subpart may be determined to be acceptable in light of any documented extenuating circumstances.

    (b) Selection of test sites. (1) Each test site shall be in an area which can be shown to have at least moderate concentrations of various pollutants. Each site shall be clearly identified and shall be justified as an appropriate test site with suitable supporting evidence such as a description of the surrounding area, characterization of the sources and pollutants typical in the area, maps, population density data, vehicular traffic data, emission inventories, pollutant measurements from previous years, concurrent pollutant measurements, meteorological data, and other information useful in supporting the suitability of the site for the comparison test or tests.

    (2) If approval of one or more proposed test sites is desired prior to conducting the tests, a written request for approval of the test site or sites must be submitted to the address given in § 53.4. The request should include information identifying the type of candidate method and one or more specific proposed test sites along with a justification for each proposed specific site as described in paragraph (b)(1) of this section. The EPA will evaluate each proposed site and approve the site, disapprove the site, or request more information about the site. Any such pre-test approval of a test site by the EPA shall indicate only that the site meets the applicable test site requirements for the candidate method type; it shall not indicate, suggest, or imply that test data obtained at the site will necessarily meet any of the applicable data acceptance requirements. The Administrator may exercise discretion in selecting a different site (or sites) for any additional tests the Administrator decides to conduct.

    (c) Test atmosphere. Ambient air sampled at an appropriate test site or sites shall be used for these tests. Simultaneous concentration measurements shall be made in each of the concentration ranges specified in tables C-1, C-3, or C-4 of this subpart, as appropriate.

    (d) Sampling or sample collection. All test concentration measurements or samples shall be taken in such a way that both the candidate method and the reference method obtain air samples that are alike or as nearly identical as practical.

    (e) Operation. Set-up and start-up of the test analyzer(s), test sampler(s), and reference method analyzers or samplers shall be in strict accordance with the applicable operation manual(s).

    (f) Calibration. The reference method shall be calibrated according to the appropriate appendix to part 50 of this chapter (if it is a manual method) or according to the applicable operation manual(s) (if it is an automated method). A candidate method (or portion thereof) shall be calibrated according to the applicable operation manual(s), if such calibration is a part of the method.

    (g) Submission of test data and other information. All recorder charts, calibration data, records, test results, procedural descriptions and details, and other documentation obtained from (or pertinent to) these tests shall be identified, dated, signed by the analyst performing the test, and submitted. For candidate methods for PM2.5 and PM10−2.5, all submitted information must meet the requirements of the ANSI/ASQC E4 Standard, section 6 (reference 1 of appendix A of this subpart).

    § 53.31 [Reserved]
    § 53.32 Test procedures for methods for SO2, CO, O3, and NO2.

    (a) Comparability. Comparability is shown for SO2, CO, O3, and NO2 methods when the differences between:

    (1) Measurements made by a candidate manual method or by a test analyzer representative of a candidate automated method, and

    (2) Measurements made simultaneously by a reference method are less than or equal to the values for maximum discrepancy specified in table C-1 of this subpart.

    (b) Test measurements. All test measurements are to be made at the same test site. If necessary, the concentration of pollutant in the sampled ambient air may be augmented with artificially generated pollutant to facilitate measurements in the specified ranges, as described under paragraph (f)(4) of this section.

    (c) Requirements for measurements or samples. All test measurements made or test samples collected by means of a sample manifold as specified in paragraph (f)(4) of this section shall be at a room temperature between 20° and 30° C, and at a line voltage between 105 and 125 volts. All methods shall be calibrated as specified in § 53.30(f) prior to initiation of the tests.

    (d) Set-up and start-up. (1) Set-up and start-up of the test analyzer, test sampler(s), and reference method shall be in strict accordance with the applicable operation manual(s). If the test analyzer does not have an integral strip chart or digital data recorder, connect the analyzer output to a suitable strip chart or digital data recorder. This recorder shall have a chart width of at least 25 centimeters, a response time of 1 second or less, a deadband of not more than 0.25 percent of full scale, and capability of either reading measurements at least 5 percent below zero or offsetting the zero by at least 5 percent. Digital data shall be recorded at appropriate time intervals such that trend plots similar to a strip chart recording may be constructed with a similar or suitable level of detail.

    (2) Other data acquisition components may be used along with the chart recorder during the conduct of these tests. Use of the chart recorder is intended only to facilitate visual evaluation of data submitted.

    (3) Allow adequate warmup or stabilization time as indicated in the applicable operation manual(s) before beginning the tests.

    (e) Range. (1) Except as provided in paragraph (e)(2) of this section, each method shall be operated in the range specified for the reference method in the appropriate appendix to part 50 of this chapter (for manual reference methods), or specified in table B-1 of subpart B of this part (for automated reference methods).

    (2) For a candidate method having more than one selectable range, one range must be that specified in table B-1 of subpart B of this part, and a test analyzer representative of the method must pass the tests required by this subpart while operated on that range. The tests may be repeated for a broader range (i.e., one extending to higher concentrations) than the one specified in table B-1 of subpart B of this part, provided that the range does not extend to concentrations more than two times the upper range limit specified in table B-1 of subpart B of this part and that the test analyzer has passed the tests required by subpart B of this part (if applicable) for the broader range. If the tests required by this subpart are conducted or passed only for the range specified in table B-1 of subpart B of this part, any equivalent method determination with respect to the method will be limited to that range. If the tests are passed for both the specified range and a broader range (or ranges), any such determination will include the broader range(s) as well as the specified range. Appropriate test data shall be submitted for each range sought to be included in such a determination.

    (f) Operation of automated methods. (1) Once the test analyzer has been set up and calibrated and tests started, manual adjustment or normal periodic maintenance, as specified in the manual referred to in § 53.4(b)(3), is permitted only every 3 days. Automatic adjustments which the test analyzer performs by itself are permitted at any time. The submitted records shall show clearly when manual adjustments were made and describe the operations performed.

    (2) All test measurements shall be made with the same test analyzer; use of multiple test analyzers is not permitted. The test analyzer shall be operated continuously during the entire series of test measurements.

    (3) If a test analyzer should malfunction during any of these tests, the entire set of measurements shall be repeated, and a detailed explanation of the malfunction, remedial action taken, and whether recalibration was necessary (along with all pertinent records and charts) shall be submitted.

    (4) Ambient air shall be sampled from a common intake and distribution manifold designed to deliver homogeneous air samples to both methods. Precautions shall be taken in the design and construction of this manifold to minimize the removal of particulate matter and trace gases, and to ensure that identical samples reach the two methods. If necessary, the concentration of pollutant in the sampled ambient air may be augmented with artificially generated pollutant. However, at all times the air sample measured by the candidate and reference methods under test shall consist of not less than 80 percent ambient air by volume. Schematic drawings, physical illustrations, descriptions, and complete details of the manifold system and the augmentation system (if used) shall be submitted.

    (g) Tests. (1) Conduct the first set of simultaneous measurements with the candidate and reference methods:

    (i) Table C-1 of this subpart specifies the type (1- or 24-hour) and number of measurements to be made in each of the three test concentration ranges.

    (ii) The pollutant concentration must fall within the specified range as measured by the reference method.

    (iii) The measurements shall be made in the sequence specified in table C-2 of this subpart, except for the 1-hour SO2 measurements, which are all in the high range.

    (2) For each pair of measurements, determine the difference (discrepancy) between the candidate method measurement and reference method measurement. A discrepancy which exceeds the discrepancy specified in table C-1 of this subpart constitutes a failure. Figure C-1 of this subpart contains a suggested format for reporting the test results.

    (3) The results of the first set of measurements shall be interpreted as follows:

    (i) Zero failures: The candidate method passes the test for comparability.

    (ii) Three or more failures: The candidate method fails the test for comparability.

    (iii) One or two failures: Conduct a second set of simultaneous measurements as specified in table C-1 of this subpart. The results of the combined total of first-set and second-set measurements shall be interpreted as follows:

    (A) One or two failures: The candidate method passes the test for comparability.

    (B) Three or more failures: The candidate method fails the test for comparability.

    (iv) For SO2, the 1-hour and 24-hour measurements shall be interpreted separately, and the candidate method must pass the tests for both 1- and 24-hour measurements to pass the test for comparability.
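    To make the first-set/second-set decision rule of paragraphs (g)(2) and (g)(3) of this section concrete, the following illustrative sketch (Python; not part of the regulatory text) counts discrepancies against a table C-1 limit. The function names and example data are hypothetical; the 0.02 ppm limit shown is the table C-1 low-range ozone value.

        # Illustrative sketch of the pass/fail decision rule; not part of the rule.
        # Each pair is (candidate, reference, table C-1 maximum discrepancy).

        def count_failures(pairs):
            """Count pairs whose |candidate - reference| exceeds the table C-1 limit."""
            return sum(1 for cand, ref, limit in pairs if abs(cand - ref) > limit)

        def comparability(first_set, second_set=None):
            """Paragraph (g)(3): 0 failures pass, 3 or more fail, and 1 or 2
            failures require a second set, judged on the combined failure count."""
            failures = count_failures(first_set)
            if failures == 0:
                return "pass"
            if failures >= 3:
                return "fail"
            if second_set is None:
                return "second set of measurements required"
            total = failures + count_failures(second_set)
            return "pass" if total <= 2 else "fail"

        # Hypothetical ozone low-range pairs with the 0.02 ppm discrepancy limit.
        first = [(0.071, 0.080, 0.02), (0.095, 0.082, 0.02), (0.063, 0.060, 0.02)]
        print(comparability(first))  # -> "pass"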

    (4) A 1-hour measurement consists of the integral of the instantaneous concentration over a 60-minute continuous period divided by the time period. Integration of the instantaneous concentration may be performed by any appropriate means such as chemical, electronic, mechanical, visual judgment, or by calculating the mean of not less than 12 equally-spaced instantaneous readings. Appropriate allowances or corrections shall be made in cases where significant errors could occur due to characteristic lag time or rise/fall time differences between the candidate and reference methods. Details of the means of integration and any corrections shall be submitted.

    (5) A 24-hour measurement consists of the integral of the instantaneous concentration over a 24-hour continuous period divided by the time period. This integration may be performed by any appropriate means such as chemical, electronic, mechanical, or by calculating the mean of twenty-four (24) sequential 1-hour measurements.
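    As a simple illustration of the averaging allowed by paragraphs (g)(4) and (g)(5) of this section, the sketch below (hypothetical; not part of the rule) computes a 1-hour measurement as the mean of at least 12 equally spaced instantaneous readings and a 24-hour measurement as the mean of 24 sequential 1-hour values.

        # Sketch only: mean-based integration per paragraphs (g)(4) and (g)(5).

        def one_hour_measurement(readings):
            """Mean of at least 12 equally spaced instantaneous readings (ppm)."""
            if len(readings) < 12:
                raise ValueError("at least 12 equally spaced readings are required")
            return sum(readings) / len(readings)

        def twenty_four_hour_measurement(hourly_values):
            """Mean of 24 sequential 1-hour measurements (ppm)."""
            if len(hourly_values) != 24:
                raise ValueError("exactly 24 sequential 1-hour values are required")
            return sum(hourly_values) / 24

        readings = [0.040 + 0.001 * k for k in range(12)]  # 12 hypothetical readings
        print(one_hour_measurement(readings))  # mean of the 12 values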

    (6) For O3 and CO, no more than six 1-hour measurements shall be made per day. For SO2, no more than four 1-hour measurements or one 24-hour measurement shall be made per day. One-hour measurements may be made concurrently with 24-hour measurements if appropriate.

    (7) For applicable methods, control or calibration checks may be performed once per day without adjusting the test analyzer or method. These checks may be used as a basis for a linear interpolation-type correction to be applied to the measurements to correct for drift. If such a correction is used, it shall be applied to all measurements made with the method, and the correction procedure shall become a part of the method.
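    A linear interpolation-type drift correction of the kind contemplated by paragraph (g)(7) of this section might look like the following sketch. The correction model is an assumption for illustration only; an actual method would define its own correction procedure as part of the method.

        # Sketch: interpolate the offset observed at two once-daily calibration
        # checks to the time of a measurement, then subtract it. Illustrative only.

        def drift_correction(t, t0, offset0, t1, offset1):
            """Linearly interpolate the offsets seen at check times t0 and t1
            (hours) to measurement time t; returns the correction to subtract."""
            frac = (t - t0) / (t1 - t0)
            return offset0 + frac * (offset1 - offset0)

        # Example: checks 24 h apart read +0.002 and +0.006 ppm high; a reading
        # taken 6 h after the first check is corrected by subtracting 0.003 ppm.
        corr = drift_correction(t=6.0, t0=0.0, offset0=0.002, t1=24.0, offset1=0.006)
        raw = 0.085
        print(raw - corr)  # -> 0.082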

    § 53.33 Test procedure for methods for Pb.

    (a) Comparability. Comparability is shown for Pb methods when the differences between:

    (1) Measurements made by a candidate method, and

    (2) Measurements made by the reference method on simultaneously collected Pb samples (or the same sample, if applicable), are less than or equal to the value specified in table C-3 of this subpart.

    (b) Test measurements. Test measurements may be made at any number of test sites. Augmentation of pollutant concentrations is not permitted, hence an appropriate test site or sites must be selected to provide Pb concentrations in the specified range.

    (c) Collocated samplers. The ambient air intake points of all the candidate and reference method collocated samplers shall be positioned at the same height above the ground level, and between 2 meters (1 meter for samplers with flow rates less than 200 liters per minute (L/min)) and 4 meters apart. The samplers shall be oriented in a manner that will minimize spatial and wind directional effects on sample collection.

    (d) Sample collection. Collect simultaneous 24-hour samples (filters) of Pb at the test site or sites with both the reference and candidate methods until at least 10 filter pairs have been obtained. A candidate method which employs a sampler and sample collection procedure that are identical to the sampler and sample collection procedure specified in the reference method, but uses a different analytical procedure, may be tested by analyzing common samples. The common samples shall be collected according to the sample collection procedure specified by the reference method and each shall be divided for respective analysis in accordance with the analytical procedures of the candidate method and the reference method.

    (e) Audit samples. Three audit samples must be obtained from the address given in § 53.4(a). The audit samples are 3/4 × 8-inch glass fiber strips containing known amounts of Pb at the following nominal levels: 100 micrograms per strip (μg/strip); 300 μg/strip; 750 μg/strip. The true amount of Pb, in total μg/strip, will be provided with each audit sample.

    (f) Filter analysis. (1) For both the reference method samples and the audit samples, analyze each filter extract three times in accordance with the reference method analytical procedure. The analysis of replicates should not be performed sequentially, i.e., a single sample should not be analyzed three times in sequence. Calculate the indicated Pb concentrations for the reference method samples in micrograms per cubic meter (μg/m3) for each analysis of each filter. Calculate the indicated total Pb amount for the audit samples in μg/strip for each analysis of each strip. Label these test results as R1A, R1B, R1C, R2A, R2B, * * *, Q1A, Q1B, Q1C, * * *, where R denotes results from the reference method samples; Q denotes results from the audit samples; 1, 2, 3 indicate the filter number, and A, B, C indicate the first, second, and third analysis of each filter, respectively.

    (2) For the candidate method samples, analyze each sample filter or filter extract three times and calculate, in accordance with the candidate method, the indicated Pb concentration in μg/m3 for each analysis of each filter. Label these test results as C1A, C1B, C1C, C2A, * * *, where C denotes results from the candidate method. For candidate methods which provide a direct measurement of Pb concentrations without a separable procedure, C1A=C1B=C1C, C2A=C2B=C2C, etc.

    (g) Average Pb concentration. For the reference method, calculate the average Pb concentration for each filter by averaging the concentrations calculated from the three analyses using equation 1 of this section:

    R̄i = (RiA + RiB + RiC) / 3    (Equation 1)

    Where, i is the filter number.

    (h) Accuracy. (1)(i) For the audit samples, calculate the average Pb concentration for each strip by averaging the concentrations calculated from the three analyses using equation 2 of this section:

    Q̄i = (QiA + QiB + QiC) / 3    (Equation 2)

    Where, i is the audit sample number.

    (ii) Calculate the percent difference (Dq) between the indicated Pb concentration for each audit sample and the true Pb concentration (Tq) using equation 3 of this section:

    Dqi = 100 × (Q̄i − Tqi) / Tqi    (Equation 3)

    (2) If any difference value (Dqi) exceeds ±5 percent, the accuracy of the reference method analytical procedure is out-of-control. Corrective action must be taken to determine the source of the error(s) (e.g., calibration standard discrepancies, extraction problems, etc.) and the reference method and audit sample determinations must be repeated according to paragraph (f) of this section, or the entire test procedure (starting with paragraph (d) of this section) must be repeated.

    (i) Acceptable filter pairs. Disregard all filter pairs for which the Pb concentration, as determined in paragraph (g) of this section by the average of the three reference method determinations, falls outside the range of 0.5 to 4.0 μg/m3. All remaining filter pairs must be subjected to the tests for precision and comparability in paragraphs (j) and (k) of this section. At least five filter pairs must be within the 0.5 to 4.0 μg/m3 range for the tests to be valid.

    (j) Test for precision. (1) Calculate the precision (P) of the analysis (in percent) for each filter and for each method, as the maximum minus the minimum divided by the average of the three concentration values, using equation 4 or equation 5 of this section:

    PRi = 100 × [max(RiA, RiB, RiC) − min(RiA, RiB, RiC)] / R̄i    (Equation 4)

    or

    PCi = 100 × [max(CiA, CiB, CiC) − min(CiA, CiB, CiC)] / C̄i    (Equation 5)

    where, i indicates the filter number, and C̄i is the average of the three candidate method analyses for filter i.

    (2) If any reference method precision value (PRi) exceeds 15 percent, the precision of the reference method analytical procedure is out-of-control. Corrective action must be taken to determine the source(s) of imprecision, and the reference method determinations must be repeated according to paragraph (f) of this section, or the entire test procedure (starting with paragraph (d) of this section) must be repeated.

    (3) If any candidate method precision value (PCi) exceeds 15 percent, the candidate method fails the precision test.

    (4) The candidate method passes this test if no precision value (i.e., no PRi and no PCi) exceeds 15 percent.

    (k) Test for comparability. (1) For each filter or analytical sample pair, calculate all nine possible percent differences (D) between the reference and candidate methods, using all nine possible combinations of the three determinations (A, B, and C) for each method using equation 6 of this section:

    Din = 100 × (Cij − Rik) / Rik    (Equation 6)

    where, i is the filter number, and n numbers from 1 to 9 for the nine possible difference combinations for the three determinations for each method (j = A, B, C, candidate; k = A, B, C, reference).

    (2) If none of the percent differences (D) exceeds ±20 percent, the candidate method passes the test for comparability.

    (3) If one or more of the percent differences (D) exceed ±20 percent, the candidate method fails the test for comparability.

    (4) The candidate method must pass both the precision test (paragraph (j) of this section) and the comparability test (paragraph (k) of this section) to qualify for designation as an equivalent method.
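    Taken together, the precision test of paragraph (j) and the comparability test of paragraph (k) of this section reduce to a few lines of arithmetic. The following sketch (Python; hypothetical data, not part of the rule) applies equations 4 through 6 to triplicate analyses:

        # Sketch of the Sec. 53.33 Pb tests; the filter data are hypothetical.

        def precision_percent(triplet):
            """(max - min) / mean x 100 for the three analyses of one filter."""
            mean = sum(triplet) / 3.0
            return (max(triplet) - min(triplet)) / mean * 100.0

        def pairwise_differences(cand, ref):
            """All nine percent differences D between candidate and reference."""
            return [(c - r) / r * 100.0 for c in cand for r in ref]

        filters = [  # (reference analyses, candidate analyses), ug/m3
            ([1.02, 1.05, 1.00], [1.10, 1.08, 1.12]),
            ([2.40, 2.45, 2.38], [2.30, 2.35, 2.28]),
        ]

        for ref, cand in filters:
            assert precision_percent(ref) <= 15, "reference precision out of control"
            assert precision_percent(cand) <= 15, "candidate fails precision test"
            diffs = pairwise_differences(cand, ref)
            assert all(abs(d) <= 20 for d in diffs), "comparability test failed"
        print("candidate passes both Pb tests for these hypothetical filters")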

    § 53.34 Test procedures for methods for PM10 and Class I methods for PM2.5.

    (a) Comparability. Comparability is shown for PM10 methods and for Class I methods for PM2.5 when the relationship between:

    (1) Measurements made by a candidate method, and

    (2) Measurements made by a corresponding reference method on simultaneously collected samples (or the same sample, if applicable) at each of one or more test sites (as required) is such that the linear regression parameters (slope, intercept, and correlation coefficient) describing the relationship meet the requirements specified in table C-4 of this subpart.

    (b) Methods for PM10. Test measurements must be made, or derived from particulate samples collected, at not less than two test sites, each of which must be located in a geographical area characterized by ambient particulate matter that is significantly different in nature and composition from that at the other test site(s). Augmentation of pollutant concentrations is not permitted, hence appropriate test sites must be selected to provide the minimum number of test PM10 concentrations in the ranges specified in table C-4 of this subpart. The tests at the two sites may be conducted in different calendar seasons, if appropriate, to provide PM10 concentrations in the specified ranges.

    (c) PM10 methods employing the same sampling procedure as the reference method but a different analytical method. Candidate methods for PM10 which employ a sampler and sample collection procedure that are identical to the sampler and sample collection procedure specified in the reference method, but use a different analytical procedure, may be tested by analyzing common samples. The common samples shall be collected according to the sample collection procedure specified by the reference method and shall be analyzed in accordance with the analytical procedures of both the candidate method and the reference method.

    (d) Methods for PM2.5. Augmentation of pollutant concentrations is not permitted, hence appropriate test sites must be selected to provide the minimum number of test measurement sets to meet the requirements for PM2.5 concentrations in the ranges specified in table C-4 of this subpart. Only one test site is required, and the site need only meet the PM2.5 ambient concentration levels required by table C-4 of this subpart and the requirements of § 53.30(b) of this subpart. A total of 10 valid measurement sets is required.

    (e) Collocated measurements. (1) Set up three reference method samplers collocated with three candidate method samplers or analyzers at each of the number of test sites specified in table C-4 of this subpart.

    (2) The ambient air intake points of all the candidate and reference method collocated samplers or analyzers shall be positioned at the same height above the ground level, and between 2 meters (1 meter for samplers or analyzers with flow rates less than 200 L/min) and 4 meters apart. The samplers shall be oriented in a manner that will minimize spatial and wind directional effects on sample collection.

    (3) At each site, obtain as many sets of simultaneous PM10 or PM2.5 measurements as necessary (see table C-4 of this subpart), each set consisting of three reference method and three candidate method measurements, all obtained simultaneously.

    (4) Candidate PM10 method measurements shall be nominal 24-hour (±1 hour) integrated measurements or shall be averaged to obtain the mean concentration for a nominal 24-hour period. PM2.5 measurements may be either nominal 24- or 48-hour integrated measurements. All collocated measurements in a measurement set must cover the same nominal 24- or 48-hour time period.

    (5) For samplers, retrieve the samples promptly after sample collection and analyze each sample according to the reference method or candidate method, as appropriate, and determine the PM10 or PM2.5 concentration in μg/m3. If the conditions of paragraph (c) of this section apply, collect sample sets only with the three reference method samplers. Guidance for quality assurance procedures for PM2.5 methods is found in “Quality Assurance Document 2.12” (reference (2) in appendix A to this subpart).

    (f) Sequential samplers. For sequential samplers, the sampler shall be configured for the maximum number of sequential samples and shall be set for automatic sequential collection of all samples, such that the test samples are distributed as equally as practicable among all available sequential channels, utilizing the full available sequential capability.

    (g) Calculation of reference method averages and precisions. (1) For each of the measurement sets, calculate the average PM10 or PM2.5 concentration obtained with the reference method samplers, using equation 7 of this section:

    R̄j = (R1,j + R2,j + R3,j) / 3    (Equation 7)

    Where:

    Ri,j = The concentration measurement of reference method sampler i for measurement set j;

    i = The sampler number; and

    j = The measurement set number.

    (2) For each of the measurement sets, calculate the precision of the reference method PM10 or PM2.5 measurements as the standard deviation, PRj, using equation 8 of this section:

    PRj = √[ Σi (Ri,j − R̄j)² / 2 ]    (Equation 8)

    (3) For each measurement set, also calculate the precision of the reference method PM10 or PM2.5 measurements as the relative standard deviation, RPRj, in percent, using equation 9 of this section:

    RPRj = 100 × PRj / R̄j    (Equation 9)

    (h) Acceptability of measurement sets. Each measurement set is acceptable and valid only if the three reference method measurements and the three candidate method measurements are obtained and are valid, R̄j falls within the acceptable concentration range specified in table C-4 of this subpart, and either PRj or RPRj is within the corresponding limit for reference method precision specified in table C-4 of this subpart. For each site, table C-4 of this subpart specifies the minimum number of measurement sets required having R̄j above and below specified concentrations for 24- or 48-hour samples. Additional measurement sets shall be obtained, as necessary, to provide the minimum number of acceptable measurement sets for each category and the minimum total number of acceptable measurement sets for each test site. If more than the minimum number of measurement sets are collected that meet the acceptability criteria, all such measurement sets shall be used to demonstrate comparability.
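    The acceptability screen for a single measurement set can be expressed compactly. The sketch below (Python; illustrative only) uses the PM2.5 entries from table C-4 of this subpart, i.e., a 3 to 200 μg/m3 concentration range and reference method precision of 2 μg/m3 or 5 percent, with equations 8 and 9 supplying the precision statistics:

        # Sketch of the acceptability screen for one measurement set; the
        # threshold defaults are the PM2.5 values from table C-4.
        import statistics

        def set_acceptable(ref, cand, conc_lo=3.0, conc_hi=200.0,
                           abs_limit=2.0, rel_limit=5.0):
            """True if three valid measurements per method were obtained, the
            reference mean is in range, and reference precision is in limits."""
            if len(ref) != 3 or len(cand) != 3:
                return False
            mean_r = statistics.mean(ref)
            if not (conc_lo <= mean_r <= conc_hi):
                return False
            p_r = statistics.stdev(ref)      # equation 8 (sample std deviation)
            rp_r = 100.0 * p_r / mean_r      # equation 9 (relative std deviation)
            return p_r <= abs_limit or rp_r <= rel_limit

        print(set_acceptable([12.1, 12.6, 11.9], [12.4, 12.2, 12.8]))  # -> True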

    (i) Candidate method average concentration measurement. For each of the acceptable measurement sets, calculate the average PM10 or PM2.5 concentration measurements obtained with the candidate method samplers, using equation 10 of this section:

    C̄j = (C1,j + C2,j + C3,j) / 3    (Equation 10)

    Where:

    Ci,j = The concentration measurements from the candidate methods;

    i = The measurement number in the set; and

    j = The measurement set number.

    (j) Test for comparability. (1) For each site, plot all of the average PM10 or PM2.5 measurements obtained with the candidate method (C̄j) against the corresponding average PM10 or PM2.5 measurements obtained with the reference method (R̄j). For each site, calculate and record the linear regression slope and intercept, and the correlation coefficient.

    (2) To pass the test for comparability, the slope, intercept, and correlation coefficient calculated under paragraph (j)(1) of this section must be within the limits specified in table C-4 of this subpart for all test sites.
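    For a single test site, the comparability determination of this paragraph (j) amounts to an ordinary least-squares regression and a check against table C-4 of this subpart. A minimal sketch follows (Python; the measurement-set averages are hypothetical, and the limits shown are the PM10 entries of table C-4: slope 1 ± 0.10, intercept 0 ± 5, correlation ≥ 0.97):

        # Sketch: regress candidate set averages on reference set averages and
        # compare the results to table C-4 limits. Illustrative only.
        import statistics

        def regression(r_means, c_means):
            """Least-squares slope and intercept of c on r, plus Pearson r."""
            rbar, cbar = statistics.mean(r_means), statistics.mean(c_means)
            sxy = sum((r - rbar) * (c - cbar) for r, c in zip(r_means, c_means))
            sxx = sum((r - rbar) ** 2 for r in r_means)
            syy = sum((c - cbar) ** 2 for c in c_means)
            slope = sxy / sxx
            return slope, cbar - slope * rbar, sxy / (sxx * syy) ** 0.5

        r_means = [22.0, 48.0, 65.0, 90.0, 120.0, 33.0, 75.0, 150.0, 55.0, 101.0]
        c_means = [23.1, 47.0, 66.4, 88.5, 118.0, 34.2, 76.9, 147.0, 56.1, 99.8]
        slope, intercept, corr = regression(r_means, c_means)
        passes = abs(slope - 1) <= 0.10 and abs(intercept) <= 5 and corr >= 0.97
        print(round(slope, 3), round(intercept, 2), round(corr, 4), passes)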

    § 53.35 Test procedures for Class II and Class III methods for PM2.5 and PM10−2.5.

    (a) Overview. Class II and Class III candidate equivalent methods shall be tested for comparability of PM2.5 or PM10−2.5 measurements to corresponding collocated PM2.5 or PM10−2.5 reference method measurements at each of multiple field sites, as required. Comparability is shown for the candidate method when simultaneous collocated measurements made by the candidate and reference methods meet the comparability requirements specified in this section and in table C-4 of this subpart at each of the required test sites.

    (b) Test sites and seasons. A summary of the test site and seasonal testing requirements is presented in table C-5 of this subpart.

    (1) Test sites. Comparability testing is required at each of the applicable U.S. test sites required by this paragraph (b). Each test site must also meet the general test site requirements specified in § 53.30(b).

    (i) PM2.5 Class II and Class III candidate methods. Test sites should be chosen to provide representative chemical and meteorological characteristics with respect to nitrates, sulfates, organic compounds, and various levels of temperature, humidity, wind, and elevation. For Class III methods, one test site shall be selected in each of the following four general locations (A, B, C, and D). For Class II methods, two test sites, one western site (A or B) and one midwestern or eastern site (C or D), shall be selected from these locations.

    (A) Test site A shall be in the Los Angeles basin or California Central Valley area in a location that is characterized by relatively high PM2.5, nitrates, and semi-volatile organic pollutants.

    (B) Test site B shall be in a western city such as Denver, Salt Lake City, or Albuquerque in an area characterized by cold weather, higher elevation, winds, and dust.

    (C) Test site C shall be in a midwestern city characterized by substantial temperature variation, high nitrates, and wintertime conditions.

    (D) Test site D shall be in a northeastern or mid-Atlantic city that is seasonally characterized by high sulfate concentrations and high relative humidity.

    (ii) PM10−2.5 Class II and Class III candidate methods. Test sites shall be chosen to provide modest to high levels of PM10−2.5 representative of locations in proximity to urban sources of PM10−2.5 such as high-density traffic on paved roads, industrial sources, and construction activities. For Class III methods, one test site shall be selected in each of the four following general locations (A, B, C, and D), and at least one of the test sites shall have characteristic wintertime temperatures of 0° C or lower. For Class II methods, two test sites, one western site (A or B) and one midwestern or eastern site (C or D), shall be selected from these locations.

    (A) Test site A shall be in the Los Angeles basin or the California Central Valley area in a location that is characterized by relatively high PM2.5, nitrates, and semi-volatile organic pollutants.

    (B) Test site B shall be in a western city characterized by a high ratio of PM10−2.5 to PM2.5, with exposure to windblown dust, such as Las Vegas or Phoenix.

    (C) Test site C shall be in a midwestern city characterized by substantial temperature variation, high nitrates, and wintertime conditions.

    (D) Test site D shall be in a large city east of the Mississippi River, having characteristically high sulfate concentrations and high humidity levels.

    (2) Test seasons. (i) For PM2.5 and PM10−2.5 Class III candidate methods, test campaigns are required in both summer and winter seasons at test site A, in the winter season only at test sites B and C, and in the summer season only at test site D. (A total of five test campaigns is required.) The summer season shall be defined as the typically warmest three or four months of the year at the site; the winter season shall be defined as the typically coolest three or four months of the year at the site.

    (ii) For Class II PM2.5 and PM10−2.5 candidate methods, one test campaign is required at test site A or B and a second test campaign at test site C or D (total of two test campaigns).

    (3) Test concentrations. The test sites should be selected to provide ambient concentrations within the concentration limits specified in table C-4 of this subpart, and also to provide a wide range of test concentrations. A narrow range of test concentrations may result in a low concentration coefficient of variation statistic for the test measurements, making the test for correlation coefficient more difficult to pass (see paragraph (h) of this section, test for comparison correlation).

    (4) Pre-approval of test sites. The EPA recommends that the applicant seek EPA approval of each proposed test site prior to conducting test measurements at the site. To do so, the applicant should submit a request for approval as described in § 53.30(b)(2).

    (c) Collocated measurements. (1) For each test campaign, three reference method samplers and three candidate method samplers or analyzers shall be installed and operated concurrently at each test site within each required season (if applicable), as specified in paragraph (b) of this section. All reference method samplers shall be of single-filter design (not multi-filter, sequential sample design). Each candidate method shall be set up and operated in accordance with its associated manual referred to in § 53.4(b)(3) and in accordance with applicable guidance in “Quality Assurance Document 2.12” (reference (2) in appendix A to this subpart). All samplers or analyzers shall be placed so that they sample or measure air representative of the surrounding area (within one kilometer) and are not unduly affected by adjacent buildings, air handling equipment, industrial operations, traffic, or other local influences. The ambient air inlet points of all samplers and analyzers shall be positioned at the same height above the ground level and between 2 meters (1 meter for instruments having sample inlet flow rates less than 200 L/min) and 4 meters apart.

    (2) A minimum of 23 valid and acceptable measurement sets of PM2.5 or PM10−2.5 24-hour (nominal) concurrent concentration measurements shall be obtained during each test campaign at each test site. To be considered acceptable for the test, each measurement set shall consist of at least two valid reference method measurements and at least two valid candidate method measurements, and the PM2.5 or PM10−2.5 measured concentration, as determined by the average of the reference method measurements, must fall within the acceptable concentration range specified in table C-4 of this subpart. Each measurement set shall include all valid measurements obtained. For each measurement set containing fewer than three reference method measurements or fewer than three candidate method measurements, an explanation and appropriate justification shall be provided to account for the missing measurement or measurements.

    (3) More than 23 valid measurement sets may be obtained during a particular test campaign to provide a more advantageous range of concentrations, more representative conditions, additional higher or lower measurements, or to otherwise improve the comparison of the methods. All valid data sets obtained during each test campaign shall be submitted and shall be included in the analysis of the data.

    (4) The integrated-sample reference method measurements shall be of at least 22 hours and not more than 25 hours duration. Each reference method sample shall be retrieved promptly after sample collection and analyzed according to the reference method to determine the PM2.5 or PM10−2.5 measured concentration in μg/m3. Guidance and quality assurance procedures applicable to PM2.5 or PM10−2.5 reference methods are found in “Quality Assurance Document 2.12” (reference (2) in appendix A to this subpart).

    (5) Candidate method measurements shall be timed or processed and averaged as appropriate to determine an equivalent mean concentration representative of the same time period as that of the concurrent integrated-sample reference method measurements, such that all measurements in a measurement set shall be representative of the same time period. In addition, hourly average concentration measurements shall be obtained from each of the Class III candidate method analyzers for each valid measurement set and submitted as part of the application records.

    (6) In the following tests, all measurement sets obtained at a particular test site, from both seasonal campaigns if applicable, shall be combined and included in the test data analysis for the site. Data obtained at different test sites shall be analyzed separately. All measurements should be reported as normally obtained, and no measurement values should be rounded or truncated prior to data analysis. In particular, no negative measurement value, if otherwise apparently valid, should be modified, adjusted, replaced, or eliminated merely because its value is negative. Calculated mean concentrations or calculated intermediate quantities should retain at least one order-of-magnitude greater resolution than the input values. All measurement data and calculations shall be recorded and submitted in accordance with § 53.30(g), including hourly test measurements obtained from Class III candidate methods.

    (d) Calculation of mean concentrations—(1) Reference method outlier test. For each of the measurement sets for each test site, check each reference method measurement to see if it might be an anomalous value (outlier) as follows, where Ri,j is the measurement of reference method sampler i on test day j. In the event that one of the reference method measurements is missing or invalid due to a specific, positively-identified physical cause (e.g., sampler malfunction, operator error, accidental damage to the filter, etc.; see paragraph (c)(2) of this section), then substitute zero for the missing measurement, for the purposes of this outlier test only.

    (i) Calculate the quantities 2 × R1,j/(R1,j + R2,j) and 2 × R1,j/(R1,j + R3,j). If both quantities fall outside of the interval, (0.93, 1.07), then R1,j is an outlier.

    (ii) Calculate the quantities 2 × R2,j/(R2,j + R1,j) and 2 × R2,j/(R2,j + R3,j). If both quantities fall outside of the interval, (0.93, 1.07), then R2,j is an outlier.

    (iii) Calculate the quantities 2 × R3,j/(R3,j + R1,j) and 2 × R3,j/(R3,j + R2,j). If both quantities fall outside of the interval, (0.93, 1.07), then R3,j is an outlier.

    (iv) If this test indicates that one of the reference method measurements in the measurement set is an outlier, the outlier measurement shall be eliminated from the measurement set, and the other two measurements considered valid. If the test indicates that more than one reference method measurement in the measurement set is an outlier, the entire measurement set (both reference and candidate method measurements) shall be excluded from further data analysis for the tests of this section.
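    The outlier screen of this paragraph (d)(1) is symmetric in the three measurements and easy to apply programmatically; a minimal sketch (Python; not part of the regulatory text):

        # Sketch of the paragraph (d)(1) outlier screen. Per the rule, a missing
        # or invalid measurement is replaced by zero for this test only, and if
        # more than one measurement is flagged the whole set is excluded.

        def outliers(r):
            """Return the indices of outliers among three reference measurements."""
            flagged = []
            for i in range(3):
                others = [r[k] for k in range(3) if k != i]
                ratios = [2.0 * r[i] / (r[i] + o) for o in others]
                # r[i] is an outlier only if BOTH ratios fall outside (0.93, 1.07).
                if all(not (0.93 < q < 1.07) for q in ratios):
                    flagged.append(i)
            return flagged

        print(outliers([14.8, 15.1, 17.9]))  # -> [2]; the third value is dropped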

    (2) For each of the measurement sets for each test site, calculate the mean concentration for the reference method measurements, using equation 11 of this section:

    R̄j = (1/n) × Σi Ri,j, summed over the n valid measurements    (Equation 11)

    Where:

    R̄j = The mean concentration measured by the reference method for the measurement set;

    Ri,j = The measurement of reference method sampler i on test day j; and

    n = The number of valid reference method measurements in the measurement set (normally 3).

    (3) Any measurement set for which R̄j does not fall in the acceptable concentration range specified in table C-4 of this subpart is not valid, and the entire measurement set (both reference and candidate method measurements) must be eliminated from further data analysis.

    (4) For each of the valid measurement sets at each test site, calculate the mean concentration for the candidate method measurements, using equation 12 of this section. (The outlier test in paragraph (d)(1) of this section shall not be applied to the candidate method measurements.)

    C̄j = (1/m) × Σi Ci,j, summed over the m valid measurements    (Equation 12)

    Where:

    C̄j = The mean concentration measured by the candidate method for the measurement set;

    Ci,j = The measurement of the candidate method sampler or analyzer i on test day j; and

    m = The number of valid candidate method measurements in the measurement set (normally 3).

    (e) Test for reference method precision. (1) For each of the measurement sets for each site, calculate an estimate for the relative precision of the reference method measurements, RPj, using equation 13 of this section:

    RPj = (100 / R̄j) × √[ Σi (Ri,j − R̄j)² / (n − 1) ]    (Equation 13)

    (2) For each site, calculate an estimate of reference method relative precision for the site, RP, using the root mean square calculation of equation 14 of this section:

    RP = √[ Σj RPj² / J ]    (Equation 14)

    Where, J is the total number of valid measurement sets for the site.

    (3) Verify that the estimate for reference method relative precision for the site, RP, is not greater than the value specified for reference method precision in table C-4 of this subpart. A reference method relative precision greater than the value specified in table C-4 of this subpart indicates that quality control for the reference method is inadequate, and corrective measures must be implemented before proceeding with the test.

    (f) Test for candidate method precision. (1) For each of the measurement sets, for each site, calculate an estimate for the relative precision of the candidate method measurements, CPj, using equation 15 of this section:

    CPj = (100 / C̄j) × √[ Σi (Ci,j − C̄j)² / (m − 1) ]    (Equation 15)

    (2) For each site, calculate an estimate of candidate method relative precision for the site, CP, using the root mean square calculation of equation 16 of this section:

    CP = √[ Σj CPj² / J ]    (Equation 16)

    Where, J is the total number of valid measurement sets for the site.

    (3) To pass the test for precision, the mean candidate method relative precision at each site must not be greater than the value for candidate method precision specified in table C-4 of this subpart.
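    Paragraphs (e) and (f) of this section use the same two-step computation: a per-set relative precision (equations 13 and 15) combined across sets by root mean square (equations 14 and 16). The sketch below (Python) assumes the per-set statistic is the coefficient of variation in percent, consistent with the reconstructed equations above:

        # Sketch of the site-level precision statistics; illustrative only.
        import statistics

        def relative_precision(values):
            """100 x (sample standard deviation / mean) for one measurement set."""
            return 100.0 * statistics.stdev(values) / statistics.mean(values)

        def site_precision(sets):
            """Root mean square of the per-set relative precisions."""
            rps = [relative_precision(s) for s in sets]
            return (sum(rp ** 2 for rp in rps) / len(rps)) ** 0.5

        ref_sets = [[14.8, 15.1, 15.3], [22.0, 21.4, 21.9], [9.8, 10.1, 9.9]]
        print(round(site_precision(ref_sets), 2))  # compare to table C-4 limit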

    (g) Test for additive and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean concentration measured by the reference method, R̄, using equation 17 of this section:

    R̄ = (1/J) × Σj R̄j    (Equation 17)

    (2) For each test site, calculate the mean concentration measured by the candidate method, C̄, using equation 18 of this section:

    C̄ = (1/J) × Σj C̄j    (Equation 18)

    (3) For each test site, calculate the linear regression slope and intercept of the mean candidate method measurements (C̄j) against the mean reference method measurements (R̄j), using equations 19 and 20 of this section, respectively:

    slope = Σj (R̄j − R̄)(C̄j − C̄) / Σj (R̄j − R̄)²    (Equation 19)

    intercept = C̄ − slope × R̄    (Equation 20)

    (4) To pass this test, at each test site:

    (i) The slope (calculated to at least 2 decimal places) must be in the interval specified for regression slope in table C-4 of this subpart; and

    (ii) The intercept (calculated to at least 2 decimal places) must be in the interval specified for regression intercept in table C-4 of this subpart.

    (iii) The slope and intercept limits are illustrated in figures C-2 and C-3 of this subpart.

    (h) Tests for comparison correlation. (1) For each test site, calculate the (Pearson) correlation coefficient, r (not the coefficient of determination, r²), using equation 21 of this section:

    r = Σj (R̄j − R̄)(C̄j − C̄) / √[ Σj (R̄j − R̄)² × Σj (C̄j − C̄)² ]    (Equation 21)

    (2) For each test site, calculate the concentration coefficient of variation, CCV, using equation 22 of this section:

    CCV = √[ Σj (R̄j − R̄)² / (J − 1) ] / R̄    (Equation 22)

    (3) To pass the test, the correlation coefficient, r, for each test site must not be less than the values, for various values of CCV, specified for correlation in table C-4 of this subpart. These limits are illustrated in figure C-4 of this subpart.
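A sketch combining equations 21 and 22 with the CCV-dependent correlation limits of table C-4 (names and data are illustrative):

```python
import math

def pearson_r(xs, ys):
    # Equation 21: Pearson correlation coefficient r (not r squared)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def ccv(ref_means):
    # Equation 22: standard deviation of the reference set means / overall mean
    J = len(ref_means)
    mean = sum(ref_means) / J
    sd = math.sqrt(sum((x - mean) ** 2 for x in ref_means) / (J - 1))
    return sd / mean

def minimum_r(ccv_value):
    # Table C-4 correlation limits as a function of CCV
    if ccv_value <= 0.4:
        return 0.93
    if ccv_value <= 0.5:
        return 0.85 + 0.2 * ccv_value
    return 0.95

ref = [10.1, 25.3, 8.7, 31.2]
cand = [10.0, 25.9, 8.4, 31.0]
print(pearson_r(ref, cand) >= minimum_r(ccv(ref)))  # True if the site passes
```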

    Tables to Subpart C of Part 53

Table C-1 to Subpart C of Part 53.—Test Concentration Ranges, Number of Measurements Required, and Maximum Discrepancy Specification

| Pollutant | Concentration range, ppm | 1-hr, first set | 1-hr, second set | 24-hr, first set | 24-hr, second set | Maximum discrepancy specification, ppm |
| Ozone | Low: 0.06 to 0.10 | 5 | 6 | | | 0.02 |
| | Med: 0.15 to 0.25 | 5 | 6 | | | 0.03 |
| | High: 0.35 to 0.45 | 4 | 6 | | | 0.04 |
| | Total | 14 | 18 | | | |
| Carbon monoxide | Low: 7 to 11 | 5 | 6 | | | 1.5 |
| | Med: 20 to 30 | 5 | 6 | | | 2.0 |
| | Total | 14 | 18 | | | |
| Sulfur dioxide | Low: 0.02 to 0.05 | 3 | 3 | | | 0.02 |
| | Med: 0.10 to 0.15 | 2 | 3 | | | 0.03 |
| | Total | 7 | 8 | 7 | 8 | |
| Nitrogen dioxide | Low: 0.02 to 0.08 | 3 | 3 | | | 0.02 |
| | Med: 0.10 to 0.20 | 2 | 3 | | | 0.03 |
| | Total | 7 | 8 | | | |

Table C-2 to Subpart C of Part 53.—Sequence of Test Measurements

| Measurement | First set | Second set |
| 1 | Low | Medium |
| 2 | High | High |
| 3 | Medium | Low |
| 4 | High | High |
| 5 | Low | Medium |
| 6 | Medium | Low |
| 7 | Low | Medium |
| 8 | Medium | Low |
| 9 | High | High |
| 10 | Medium | Low |
| 11 | High | Medium |
| 12 | Low | High |
| 13 | Medium | Medium |
| 14 | Low | High |
| 15 | | Low |
| 16 | | Medium |
| 17 | | Low |
| 18 | | High |

Table C-3 to Subpart C of Part 53.—Test Specifications for Pb Methods

| Specification | Value |
| Concentration range, μg/m³ | 0.5 to 4.0 |
| Minimum number of 24-hr measurements | 5 |
| Maximum analytical precision, percent | 15 |
| Maximum analytical accuracy, percent | ±5 |
| Maximum difference, percent of reference method | ±20 |

Table C-4 to Subpart C of Part 53.—Test Specifications for PM10, PM2.5, and PM10−2.5 Candidate Equivalent Methods

| Specification | PM10 | PM2.5 Class I | PM2.5 Class II | PM2.5 Class III | PM10−2.5 Class II | PM10−2.5 Class III |
| Acceptable concentration range (Rj), μg/m³ | 15–300 | 3–200 | 3–200 | 3–200 | 3–200 | 3–200 |
| Minimum number of test sites | 2 | 1 | 2 | 4 | 2 | 4 |
| Minimum number of candidate method samplers or analyzers per site | 3 | 3 | 3 ¹ | 3 ¹ | 3 ¹ | 3 ¹ |
| Number of reference method samplers per site | 3 | 3 | 3 ¹ | 3 ¹ | 3 ¹ | 3 ¹ |
| Minimum number of acceptable sample sets per site for PM10 methods: Rj < 60 μg/m³ | 3 | | | | | |
| Rj > 60 μg/m³ | 3 | | | | | |
| Total | 10 | | | | | |
| Minimum number of acceptable sample sets per site for PM2.5 and PM10−2.5 candidate equivalent methods: Rj < 30 μg/m³ for 24-hr or Rj < 20 μg/m³ for 48-hr samples | | 3 | | | | |
| Rj > 30 μg/m³ for 24-hr or Rj > 20 μg/m³ for 48-hr samples | | 3 | | | | |
| Each season | | 10 | 23 | 23 | 23 | 23 |
| Total, each site | | 10 | 23 | 23 (46 for two-season sites) | 23 | 23 (46 for two-season sites) |
| Precision of replicate reference method measurements, RPj (RP for Class II or III PM2.5 or PM10−2.5), maximum | 5 μg/m³ or 7% | 2 μg/m³ or 5% | 10% ² | 10% ² | 10% ² | 10% ² |
| Precision of PM2.5 or PM10−2.5 candidate method, CP, each site | | | 10% ² | 15% ² | 15% ² | 15% ² |
| Slope of regression relationship | 1 ± 0.10 | 1 ± 0.05 | 1 ± 0.10 | 1 ± 0.10 | 1 ± 0.10 | 1 ± 0.12 |
| Intercept of regression relationship, μg/m³ | 0 ± 5 | 0 ± 1 | Between 13.55 − (15.05 × slope), but not less than −1.5, and 16.56 − (15.05 × slope), but not more than +1.5 | Between 15.05 − (17.32 × slope), but not less than −2.0, and 15.05 − (13.20 × slope), but not more than +2.0 | Between 62.05 − (70.5 × slope), but not less than −3.5, and 78.95 − (70.5 × slope), but not more than +3.5 | Between 70.50 − (82.93 × slope), but not less than −7.0, and 70.50 − (61.16 × slope), but not more than +7.0 |
| Correlation of reference method and candidate method measurements | ≥ 0.97 | ≥ 0.97 | CCV-dependent; see below | CCV-dependent; see below | CCV-dependent; see below | CCV-dependent; see below |

For the Class II and Class III columns, the correlation limit depends on CCV: r ≥ 0.93 for CCV ≤ 0.4; r ≥ 0.85 + (0.2 × CCV) for 0.4 ≤ CCV ≤ 0.5; and r ≥ 0.95 for CCV ≥ 0.5.

¹ Some missing daily measurement values may be permitted; see test procedure.
² Calculated as the root mean square over all measurement sets.
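To make the slope-dependent intercept windows concrete, the sketch below evaluates the Class II PM2.5 column at a hypothetical regression slope of exactly 1.00; at that slope the bounds collapse to roughly ±1.5 μg/m³:

```python
slope = 1.00                                  # hypothetical regression slope
lower = max(13.55 - 15.05 * slope, -1.5)      # 13.55 - 15.05 = -1.50
upper = min(16.56 - 15.05 * slope, 1.5)       # 16.56 - 15.05 = 1.51, capped at +1.50
print(lower, upper)                           # intercept must fall in [-1.50, +1.50]
```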

    Table C-5 to Subpart C of Part 53—Summary of Comparability Field Testing Campaign Site and Seasonal Requirements for Class II and III FEMs for PM10−2.5 and PM2.5

| Candidate method | Requirement | Test site A | Test site B | Test site C | Test site D |
| PM2.5 | Test site location area | Los Angeles basin or California Central Valley | Western city such as Denver, Salt Lake City, or Albuquerque | Midwestern city | Northeastern or mid-Atlantic city |
| | Test site characteristics | Relatively high PM2.5, nitrates, and semi-volatile organic pollutants | Cold weather, higher elevation, winds, and dust | Substantial temperature variation, high nitrates, wintertime conditions | High sulfate and high relative humidity |
| | Class III field test campaigns (total: 5) | Winter and summer | Winter only | Winter only | Summer only |
| | Class II field test campaigns (total: 2) | Site A or B, any season | | Site C or D, any season | |
| PM10−2.5 | Test site location area | Los Angeles basin or California Central Valley | Western city such as Las Vegas or Phoenix | Midwestern city | Large city east of the Mississippi River |
| | Test site characteristics | Relatively high PM2.5, nitrates, and semi-volatile organic pollutants | High PM10−2.5 to PM2.5 ratio, windblown dust | Substantial temperature variation, high nitrates, wintertime conditions | High sulfate and high relative humidity |
| | Class III field test campaigns (total: 5) | Winter and summer | Winter only | Winter only | Summer only |
| | Class II field test campaigns (total: 2) | Site A or B, any season | | Site C or D, any season | |

    Figures to Subpart C of Part 53

    Figure C-1 to Subpart C of Part 53—Suggested Format for Reporting Test Results for Methods for SO2, CO, O3, NO2

    Candidate Method

    Reference Method

    Applicant

    ☐ First Set ☐ Second Set ☐ Type

    ☐ 1 Hour ☐ 24 Hour

| Concentration range | Measurement | Date | Time | Candidate concentration, ppm | Reference concentration, ppm | Difference | Table C-1 spec. | Pass or fail |
| Low: ____ ppm to ____ ppm | 1–6 | | | | | | | |
| Medium: ____ ppm to ____ ppm | 1–6 | | | | | | | |
| High: ____ ppm to ____ ppm | 1–8 | | | | | | | |

Total failures: ____
Figures C-2 and C-3 to Subpart C of Part 53—Illustrations of the Slope and Intercept Limits Specified in Table C-4 [graphics not reproduced]

Figure C-4 to Subpart C of Part 53—Illustration of the Correlation Limits Specified in Table C-4 [graphics not reproduced]

    Appendix to Subpart C of Part 53

    Appendix A to Subpart C of Part 53—References

    (1) American National Standard Quality Systems for Environmental Data and Technology Programs—Requirements with guidance for use, ANSI/ASQC E4-2004. Available from American Society for Quality, P.O. Box 3005, Milwaukee, WI 53202 (http://qualitypress.asq.org).

(2) Quality Assurance Guidance Document 2.12. Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent Methods. U.S. EPA, National Exposure Research Laboratory, Research Triangle Park, NC, November 1998 or later edition. Currently available at http://www.epa.gov/ttn/amtic/pmqainf.html.

    Subpart E—Procedures for Testing Physical (Design) and Performance Characteristics of Reference Methods and Class I and Class II Equivalent Methods for PM2.5 or PM10−2.5

    Start Amendment Part

    7. The heading for subpart E is revised as set out above.

    End Amendment Part Start Amendment Part

    8. Section 53.50 is revised to read as follows:

    End Amendment Part
§ 53.50 General provisions.

    (a) A candidate method for PM2.5 or PM10−2.5 described in an application for a FRM or FEM determination submitted under § 53.4 shall be determined by the EPA to be a FRM or a Class I, II, or III FEM on the basis of the definitions for such methods given in § 53.1. This subpart sets forth the specific tests that must be carried out and the test results, evidence, documentation, and other materials that must be provided to EPA to demonstrate that a PM2.5 or PM10−2.5 sampler associated with a candidate reference method or Class I or Class II equivalent method meets all design and performance specifications set forth in appendix L or O, respectively, of part 50 of this chapter as well as additional requirements specified in this subpart E. Some or all of these tests may also be applicable to a candidate Class III equivalent method or analyzer, as may be determined under § 53.3(b)(3).

(b) PM2.5 methods—(1) Reference method. A sampler associated with a candidate reference method for PM2.5 shall be subject to the provisions, specifications, and test procedures prescribed in §§ 53.51 through 53.58.

    (2) Class I method. A sampler associated with a candidate Class I equivalent method for PM2.5 shall be subject to the provisions, specifications, and test procedures prescribed in all sections of this subpart.

    (3) Class II method. A sampler associated with a candidate Class II equivalent method for PM2.5 shall be subject to the provisions, specifications, and test procedures prescribed in all applicable sections of this subpart, as specified in subpart F of this part or as specified in § 53.3(a)(3).

(c) PM10−2.5 methods—(1) Reference method. A sampler associated with a reference method for PM10−2.5, as specified in appendix O to part 50 of this chapter, shall be subject to the requirements in this paragraph (c)(1).

    (i) The PM2.5 sampler of the PM10−2.5 sampler pair shall be verified to be either currently designated under this part 53 as a FRM for PM2.5, or shown to meet all requirements for designation as a FRM for PM2.5, in accordance with this part 53.

(ii) The PM10c sampler of the PM10−2.5 sampler pair shall be verified to be of like manufacturer, design, configuration, and fabrication to the PM2.5 sampler of the PM10−2.5 sampler pair, except for replacement of the particle size separator specified in section 7.3.4 of appendix L to part 50 of this chapter with the downtube extension as specified in Figure O-1 of appendix O to part 50 of this chapter.

    (iii) For samplers that meet the provisions of paragraphs (c)(1)(i) and (ii) of this section, the candidate PM10−2.5 reference method may be determined to be a FRM without further testing.

    (2) Class I method. A sampler associated with a Class I candidate equivalent method for PM10−2.5 shall meet the requirements in this paragraph (c)(2).

    (i) The PM2.5 sampler of the PM10−2.5 sampler pair shall be verified to be either currently designated under this part 53 as a FRM or Class I FEM for PM2.5, or shown to meet all requirements for designation as a FRM or Class I FEM for PM2.5, in accordance with this part 53.

(ii) The PM10c sampler of the PM10−2.5 sampler pair shall be verified to be of similar design to the PM2.5 sampler of the pair and to meet all requirements for designation as a FRM or Class I FEM for PM2.5, in accordance with this part 53, except for replacement of the particle size separator specified in section 7.3.4 of appendix L to part 50 of this chapter with the downtube extension as specified in Figure O-1 of appendix O to part 50 of this chapter.

    (iii) For samplers that meet the provisions of paragraphs (c)(2)(i) and (ii) of this section, the candidate PM10−2.5 method may be determined to be a Class I FEM without further testing.

    (3) Class II method. A sampler associated with a Class II candidate equivalent method for PM10−2.5 shall be subject to the applicable requirements of this subpart E, as described in § 53.3(a)(5).

    (d) The provisions of § 53.51 pertain to test results and documentation required to demonstrate compliance of a candidate method sampler with the design specifications set forth in 40 CFR part 50, appendix L or O, as applicable. The test procedures prescribed in §§ 53.52 through 53.59 pertain to performance tests required to demonstrate compliance of a candidate method sampler with the performance specifications set forth in 40 CFR part 50, appendix L or O, as applicable, as well as additional requirements specified in this subpart E. These latter test procedures shall be used to test the performance of candidate samplers against the performance specifications and requirements specified in each procedure and summarized in table E-1 of this subpart.

    (e) Test procedures prescribed in § 53.59 do not apply to candidate reference method samplers. These procedures apply primarily to candidate Class I or Class II equivalent method samplers for PM2.5 or PM10−2.5 that have a sample air flow path configuration upstream of the sample filter that is modified from that specified for the FRM sampler, as set forth in 40 CFR part 50, appendix L, Figures L-1 to L-29 or 40 CFR part 50 appendix O, Figure O-1, if applicable, such as might be necessary to provide for sequential sample capability. The additional tests determine the adequacy of aerosol transport through any altered components or supplemental devices that are used in a candidate sampler upstream of the filter. In addition to the other test procedures in this subpart, these test procedures shall be used to further test the performance of such an equivalent method sampler against the performance specifications given in the procedure and summarized in table E-1 of this subpart.

    (f) A 10-day operational field test of measurement precision is required under § 53.58 for both FRM and Class I FEM samplers for PM2.5. This test requires collocated operation of three candidate method samplers at a field test site. For candidate FEM samplers, this test may be combined and carried out concurrently with the test for comparability to the FRM specified under § 53.34, which requires collocated operation of three FRM samplers and three candidate FEM samplers.

(g) All tests and collection of test data shall be performed in accordance with the requirements of reference 1, section 4.10.5 (ISO 9001), and reference 2, part B (section 6) and part C (section 7), in appendix A of this subpart. All test data and other documentation obtained specifically from or pertinent to these tests shall be identified, dated, signed by the analyst performing the test, and submitted to EPA in accordance with subpart A of this part.

    Start Amendment Part

    9. Section 53.51 is revised to read as follows:

    End Amendment Part
§ 53.51 Demonstration of compliance with design specifications and manufacturing and test requirements.

    (a) Overview. (1) Paragraphs (a) through (f) of this section specify certain documentation that must be submitted and tests that are required to demonstrate that samplers associated with a designated FRM or FEM for PM2.5 or PM10−2.5 are properly manufactured to meet all applicable design and performance specifications and have been properly tested according to all applicable test requirements for such designation. Documentation is required to show that instruments and components of a PM2.5 or PM10−2.5 sampler are manufactured in an ISO 9001-registered facility under a quality system that meets ISO-9001 requirements for manufacturing quality control and testing.

    (2) In addition, specific tests are required by paragraph (d) of this section to verify that critical features of FRM samplers—the particle size separator and the surface finish of surfaces specified to be anodized—meet the specifications of 40 CFR part 50, appendix L or appendix O, as applicable. A checklist is required to provide certification by an ISO-certified auditor that all performance and other required tests have been properly and appropriately conducted, based on a reasonable and appropriate sample of the actual operations or their documented records. Following designation of the method, another checklist is required initially to provide an ISO-certified auditor's certification that the sampler manufacturing process is being implemented under an adequate and appropriate quality system.

    (3) For the purposes of this section, the definitions of ISO 9001-registered facility and ISO-certified auditor are found in § 53.1. An exception to the reliance by EPA on ISO-certified auditors is the requirement for the submission of the operation or instruction manual associated with the candidate method to EPA as part of the application. This manual is required under § 53.4(b)(3). The EPA has determined that acceptable technical judgment for review of this manual may not be assured by ISO-certified auditors, and approval of this manual will therefore be performed by EPA.

    (b) ISO registration of manufacturing facility. The applicant must submit documentation verifying that the samplers identified and sold as part of a designated PM2.5 or PM10−2.5 FRM or FEM will be manufactured in an ISO 9001-registered facility and that the manufacturing facility is maintained in compliance with all applicable ISO 9001 requirements (reference 1 in appendix A of this subpart). The documentation shall indicate the date of the original ISO 9001 registration for the facility and shall include a copy of the most recent certification of continued ISO 9001 facility registration. If the manufacturer does not wish to initiate or complete ISO 9001 registration for the manufacturing facility, documentation must be included in the application to EPA describing an alternative method to demonstrate that the facility meets the same general requirements as required for registration to ISO-9001. In this case, the applicant must provide documentation in the application to demonstrate, by required ISO-certified auditor's inspections, that a quality system is in place which is adequate to document and monitor that the sampler system components and final assembled samplers all conform to the design, performance and other requirements specified in this part and in 40 CFR part 50, appendix L.

(c) Sampler manufacturing quality control. The manufacturer must ensure that all components used in the manufacture of PM2.5 or PM10−2.5 samplers to be sold as part of a FRM or FEM and that are specified by design in 40 CFR part 50, appendix L or O (as applicable), are fabricated or manufactured exactly as specified. If the manufacturer's quality records show that its quality control (QC) and quality assurance (QA) system of standard process control inspections (of a set number and frequency of testing that is less than 100 percent) complies with the applicable QA provisions of section 4 of reference 4 in appendix A of this subpart and prevents nonconformances, 100 percent testing shall not be required until that conclusion is disproved by customer return or other independent manufacturer or customer test records. If problems are uncovered, inspection to verify conformance to the drawings, specifications, and tolerances shall be performed. Refer also to paragraph (e) of this section, which sets out final assembly and inspection requirements.

    (d) Specific tests and supporting documentation required to verify conformance to critical component specifications— (1) Verification of PM2.5(WINS) impactor jet diameter. For samplers utilizing the WINS impactor particle size separator specified in paragraphs 7.3.4.1, 7.3.4.2, and 7.3.4.3 of appendix L to part 50 of this chapter, the diameter of the jet of each impactor manufactured for a PM2.5 or PM10−2.5 sampler under the impactor design specifications set forth in 40 CFR part 50, appendix L, shall be verified against the tolerance specified on the drawing, using standard, NIST-traceable ZZ go/no go plug gages. This test shall be a final check of the jet diameter following all fabrication operations, and a record shall be kept of this final check. The manufacturer shall submit evidence that this procedure is incorporated into the manufacturing procedure, that the test is or will be routinely implemented, and that an appropriate procedure is in place for the disposition of units that fail this tolerance test.

(2) VSCC separator. For samplers utilizing the BGI VSCC™ Very Sharp Cut Cyclone particle size separator specified in paragraph 7.3.4.4 of appendix L to part 50 of this chapter, the VSCC manufacturer shall identify the critical dimensions and manufacturing tolerances for the device, develop appropriate test procedures to verify that the critical dimensions and tolerances are maintained during the manufacturing process, and carry out those procedures on each VSCC manufactured to verify conformance of the manufactured products. The manufacturer shall also maintain records of these tests and their results and submit evidence that this procedure is incorporated into the manufacturing procedure, that the test is or will be routinely implemented, and that an appropriate procedure is in place for the disposition of units that fail this tolerance test.

    (3) Verification of surface finish. The anodization process used to treat surfaces specified to be anodized shall be verified by testing treated specimen surfaces for weight and corrosion resistance to ensure that the coating obtained conforms to the coating specification. The specimen surfaces shall be finished in accordance with military standard specification 8625F, Type II, Class I (reference 4 in appendix A of this subpart) in the same way the sampler surfaces are finished, and tested, prior to sealing, as specified in section 4.5.2 of reference 4 in appendix A of this subpart.

    (e) Final assembly and inspection requirements. Each sampler shall be tested after manufacture and before delivery to the final user. Each manufacturer shall document its post-manufacturing test procedures. As a minimum, each test shall consist of the following: Tests of the overall integrity of the sampler, including leak tests; calibration or verification of the calibration of the flow measurement device, barometric pressure sensor, and temperature sensors; and operation of the sampler with a filter in place over a period of at least 48 hours. The results of each test shall be suitably documented and shall be subject to review by an ISO-certified auditor.

    (f) Manufacturer's audit checklists. Manufacturers shall require an ISO-certified auditor to sign and date a statement indicating that the auditor is aware of the appropriate manufacturing specifications contained in 40 CFR part 50, appendix L or O (as applicable), and the test or verification requirements in this subpart. Manufacturers shall also require an ISO-certified auditor to complete the checklists, shown in figures E-1 and E-2 of this subpart, which describe the manufacturer's ability to meet the requirements of the standard for both designation testing and product manufacture.

    (1) Designation testing checklist. The completed statement and checklist as shown in figure E-1 of this subpart shall be submitted with the application for FRM or FEM determination.

    (2) Product manufacturing checklist. Manufacturers shall require an ISO-certified auditor to complete a Product Manufacturing Checklist (figure E-2 of this subpart), which evaluates the manufacturer on its ability to meet the requirements of the standard in maintaining quality control in the production of FRM or FEM devices. The completed checklist shall be submitted with the application for FRM or FEM determination.

    Start Amendment Part

    10. Section 53.52 is amended by revising paragraph (e)(1) to read as follows:

    End Amendment Part
§ 53.52 Leak check test.
    * * * * *

    (e) Test setup. (1) The test sampler shall be set up for testing as described in the sampler's operation or instruction manual referred to in § 53.4(b)(3). The sampler shall be installed upright and set up in its normal configuration for collecting PM samples, except that the sample air inlet shall be removed and the flow rate measurement adaptor shall be installed on the sampler's downtube.

    * * * * *
    Start Amendment Part

    11. Section 53.53 is amended by revising paragraph (e)(1) to read as follows:

    End Amendment Part
§ 53.53 Test for flow rate accuracy, regulation, measurement accuracy, and cut-off.
    * * * * *

    (e) Test setup. (1) Setup of the sampler shall be as required in this paragraph (e) and otherwise as described in the sampler's operation or instruction manual referred to in § 53.4(b)(3). The sampler shall be installed upright and set up in its normal configuration for collecting PM samples. A sample filter and (or) the device for creating an additional 55 mm Hg pressure drop shall be installed for the duration of these tests. The sampler's ambient temperature, ambient pressure, and flow rate measurement systems shall all be calibrated per the sampler's operation or instruction manual within 7 days prior to this test.

    * * * * *
    Start Amendment Part

    12. Section 53.54 is amended by revising paragraph (d)(1) to read as follows:

    End Amendment Part
§ 53.54 Test for proper sampler operation following power interruptions.
    * * * * *

    (d) Test setup. (1) Setup of the sampler shall be performed as required in this paragraph (d) and otherwise as described in the sampler's operation or instruction manual referred to in § 53.4(b)(3). The sampler shall be installed upright and set up in its normal configuration for collecting PM samples. A sample filter and (or) the device for creating an additional 55 mm Hg pressure drop shall be installed for the duration of these tests. The sampler's ambient temperature, ambient pressure, and flow measurement systems shall all be calibrated per the sampler's operating manual within 7 days prior to this test.

    * * * * *
    Start Amendment Part

13. Section 53.55 is amended by:

    End Amendment Part Start Amendment Part

    a. Revising paragraphs (a)(1) introductory text and (a)(2).

    End Amendment Part Start Amendment Part

    b. Revising paragraph (e)(1).

    End Amendment Part Start Amendment Part

    c. Revising paragraph (g)(5)(i) to read as follows.

    End Amendment Part
§ 53.55 Test for effect of variations in power line voltage and ambient temperature.

(a) Overview. (1) This test procedure is a combined procedure to test various performance parameters under variations in power line voltage and ambient temperature. Tests shall be conducted in a temperature-controlled environment over four 6-hour time periods during which reference temperature and flow rate measurements shall be made at intervals not to exceed 5 minutes. Specific parameters to be evaluated at line voltages of 105 and 125 volts and temperatures of −20 °C and +40 °C are as follows:

    * * * * *

    (2) The performance parameters tested under this procedure, the corresponding minimum performance specifications, and the applicable test conditions are summarized in table E-1 of this subpart. Each performance parameter tested, as described or determined in the test procedure, must meet or exceed the associated performance specification given. The candidate sampler must meet all specifications for the associated PM2.5 or PM10-2.5 method (as applicable) to pass this test procedure.

    * * * * *

    (e) * * * (1) Setup of the sampler shall be performed as required in this paragraph (e) and otherwise as described in the sampler's operation or instruction manual referred to in § 53.4(b)(3). The sampler shall be installed upright and set up in the temperature-controlled chamber in its normal configuration for collecting PM samples. A sample filter and (or) the device for creating an additional 55 mm Hg pressure drop shall be installed for the duration of these tests. The sampler's ambient temperature, ambient pressure, and flow measurement systems shall all be calibrated per the sampler's operating manual within 7 days prior to this test.

    * * * * *

    (g) * * *

(5) * * * (i) Calculate the absolute value of the difference between the mean ambient air temperature indicated by the test sampler and the mean ambient (chamber) air temperature measured with the ambient air temperature recorder as:

ΔT = | Tind,ave − Tref,ave |

Where:

Tind,ave = The mean ambient air temperature indicated by the test sampler, °C; and

Tref,ave = The mean ambient air temperature measured by the reference temperature instrument, °C.
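Expressed as a check against the 2 °C specification of table E-1 (function name and values are illustrative, not from the rule):

```python
def temperature_accuracy_ok(t_ind_ave, t_ref_ave, limit_c=2.0):
    # Pass if |mean indicated - mean reference| does not exceed the 2 degC spec
    return abs(t_ind_ave - t_ref_ave) <= limit_c

print(temperature_accuracy_ok(23.4, 22.1))  # True: 1.3 degC difference
```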

    * * * * *
    Start Amendment Part

    14. Section 53.56 is amended by revising paragraphs (a)(2) and (e)(1) to read as follows:

    End Amendment Part
§ 53.56 Test for effect of variations in ambient pressure.

    (a) * * *

    (2) The performance parameters tested under this procedure, the corresponding minimum performance specifications, and the applicable test conditions are summarized in table E-1 of this subpart. Each performance parameter tested, as described or determined in the test procedure, must meet or exceed the associated performance specification given. The candidate sampler must meet all specifications for the associated PM2.5 or PM10−2.5 method (as applicable) to pass this test procedure.

    * * * * *

    (e) * * * (1) Setup of the sampler shall be performed as required in this paragraph (e) and otherwise as described in the sampler's operation or instruction manual referred to in § 53.4(b)(3). The sampler shall be installed upright and set up in the pressure-controlled chamber in its normal configuration for collecting PM samples. A sample filter and (or) the device for creating an additional 55 mm Hg pressure drop shall be installed for the duration of these tests. The sampler's ambient temperature, ambient pressure, and flow measurement systems shall all be calibrated per the sampler's operating manual within 7 days prior to this test.

    * * * * *
    Start Amendment Part

    15. Section 53.57 is amended by revising paragraphs (a), (b), and (e)(1) to read as follows:

    End Amendment Part
§ 53.57 Test for filter temperature control during sampling and post-sampling periods.

    (a) Overview. This test is intended to measure the candidate sampler's ability to prevent excessive overheating of the PM sample collection filter (or filters) under conditions of elevated solar insolation. The test evaluates radiative effects on filter temperature during a 4-hour period of active sampling as well as during a subsequent 4-hour non-sampling time period prior to filter retrieval. Tests shall be conducted in an environmental chamber which provides the proper radiant wavelengths and energies to adequately simulate the sun's radiant effects under clear conditions at sea level. For additional guidance on conducting solar radiative tests under controlled conditions, consult military standard specification 810-E (reference 6 in appendix A of this subpart). The performance parameters tested under this procedure, the corresponding minimum performance specifications, and the applicable test conditions are summarized in table E-1 of this subpart. Each performance parameter tested, as described or determined in the test procedure, must meet or exceed the associated performance specification to successfully pass this test.

    (b) Technical definition. Filter temperature control during sampling is the ability of a sampler to maintain the temperature of the particulate matter sample filter within the specified deviation (5 °C) from ambient temperature during any active sampling period. Post-sampling temperature control is the ability of a sampler to maintain the temperature of the particulate matter sample filter within the specified deviation from ambient temperature during the period from the end of active sample collection by the sampler until the filter is retrieved from the sampler for laboratory analysis.

    * * * * *

    (e) * * * (1) Setup of the sampler shall be performed as required in this paragraph (e) and otherwise as described in the sampler's operation or instruction manual referred to in § 53.4(b)(3). The sampler shall be installed upright and set up in the solar radiation environmental chamber in its normal configuration for collecting PM samples (with the inlet installed). The sampler's ambient and filter temperature measurement systems shall be calibrated per the sampler's operating manual within 7 days prior to this test. A sample filter shall be installed for the duration of this test. For sequential samplers, a sample filter shall also be installed in each available sequential channel or station intended for collection of a sequential sample (or at least five additional filters for magazine-type sequential samplers) as directed by the sampler's operation or instruction manual.

    * * * * *
    Start Amendment Part

    16. Section 53.58 is revised to read as follows:

    End Amendment Part
§ 53.58 Operational field precision and blank test.

(a) Overview. This test is intended to determine the operational precision of the candidate sampler during a minimum of 10 days of field operation, using three collocated test samplers. Measurements of PM are made at a test site with all of the samplers and then compared to determine replicate precision. Candidate sequential samplers are also subject to a test for possible deposition of particulate matter on inactive filters during a period of storage in the sampler. This procedure is applicable to both reference and equivalent methods. In the case of equivalent methods, this test may be combined and conducted concurrently with the comparability test for equivalent methods (described in subpart C of this part), using three reference method samplers collocated with three candidate equivalent method samplers and meeting the applicable site and other requirements of subpart C of this part.

    (b) Technical definition. (1) Field precision is defined as the standard deviation or relative standard deviation of a set of PM measurements obtained concurrently with three or more collocated samplers in actual ambient air field operation.

    (2) Storage deposition is defined as the mass of material inadvertently deposited on a sample filter that is stored in a sequential sampler either prior to or subsequent to the active sample collection period.

    (c) Test site. Any outdoor test site having PM2.5 (or PM10−2.5, as applicable) concentrations that are reasonably uniform over the test area and that meet the minimum level requirement of paragraph (g)(2) of this section is acceptable for this test.

    (d) Required facilities and equipment. (1) An appropriate test site and suitable electrical power to accommodate three test samplers are required.

    (2) Teflon sample filters, as specified in section 6 of 40 CFR part 50, appendix L, conditioned and preweighed as required by section 8 of 40 CFR part 50, appendix L, as needed for the test samples.

(e) Test setup. (1) Three identical test samplers shall be installed at the test site in their normal configuration for collecting PM samples in accordance with the instructions in the associated manual referred to in § 53.4(b)(3) and also in accordance with applicable supplemental guidance provided in reference 3 in appendix A of this subpart. The test samplers' inlet openings shall be located at the same height above ground and between 2 and 4 meters apart horizontally (a minimum separation of 1 meter is permitted for samplers with flow rates less than 200 L/min). The samplers shall be arranged or oriented in a manner that will minimize the spatial and wind directional effects of one sampler on the sample collection of any other sampler.

    (2) Each test sampler shall be successfully leak checked, calibrated, and set up for normal operation in accordance with the instruction manual and with any applicable supplemental guidance provided in reference 3 in appendix A of this subpart.

    (f) Test procedure. (1) Install a conditioned, preweighed filter in each test sampler and otherwise prepare each sampler for normal sample collection. Set identical sample collection start and stop times for each sampler. For sequential samplers, install a conditioned, preweighed specified filter in each available channel or station intended for automatic sequential sample filter collection (or at least five additional filters for magazine-type sequential samplers), as directed by the sampler's operation or instruction manual. Since the inactive sequential channels are used for the storage deposition part of the test, they may not be used to collect the active PM test samples.

    (2) Collect either a nominal 24-hour or 48-hour atmospheric PM sample simultaneously with each of the three test samplers.

    (3) Following sample collection, retrieve the collected sample from each sampler. For sequential samplers, retrieve the additional stored (blank, unsampled) filters after at least 5 days (120 hours) storage in the sampler if the active samples are 24-hour samples, or after at least 10 days (240 hours) if the active samples are 48-hour samples.

    (4) Determine the measured PM mass concentration for each sample in accordance with the applicable procedures prescribed for the candidate method in appendix L or appendix O, as applicable, of part 50 of this chapter, and in accordance with the associated manual referred to in § 53.4(b)(3) and supplemental guidance in reference 2 in appendix A of this subpart. For sequential samplers, also similarly determine the storage deposition as the net weight gain of each blank, unsampled filter after the 5-day (or 10-day) period of storage in the sampler.

    (5) Repeat this procedure to obtain a total of 10 sets of any combination of (nominal) 24-hour or 48-hour PM measurements over 10 test periods. For sequential samplers, repeat the 5-day (or 10-day) storage test of additional blank filters once for a total of two sets of blank filters.

(g) Calculations. (1) Record the PM concentration for each test sampler for each test period as Ci,j, where i is the sampler number (i = 1, 2, 3) and j is the test period (j = 1, 2, …, 10).

(2)(i) For each test period, calculate and record the average of the three measured PM concentrations as Cave,j, where j is the test period, using equation 26 of this section:

Cave,j = (1/3) × Σ(i=1…3) Ci,j    (Equation 26)

(ii) If Cave,j < 3 μg/m³ for any test period, data from that test period are unacceptable, and an additional sample collection set must be obtained to replace the unacceptable data.

(3)(i) Calculate and record the precision for each of the 10 test periods, as the standard deviation, using equation 27 of this section:

Pj = √[ Σ(i=1…3) (Ci,j − Cave,j)² / 2 ]    (Equation 27)

(ii) For each of the 10 test periods, also calculate and record the precision as the relative standard deviation, in percent, using equation 28 of this section:

RPj = (Pj / Cave,j) × 100    (Equation 28)

    (h) Test results. (1) The candidate method passes the precision test if either Pj or RPj is less than or equal to the corresponding specification in table E-1 of this subpart for all 10 test periods.

    (2) The candidate sequential sampler passes the blank filter storage deposition test if the average net storage deposition weight gain of each set of blank filters (total of the net weight gain of each blank filter divided by the number of filters in the set) from each test sampler (six sets in all) is less than 50 μg.
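A sketch tying equations 26 through 28 to the acceptance logic of paragraphs (g)(2)(ii) and (h)(1); the names and data are illustrative, and the pass criteria follow table E-1 (Pj ≤ 2 μg/m³ or RPj ≤ 5%):

```python
import math

def period_statistics(concs):
    # Equations 26-28 for one test period's three collocated measurements
    n = len(concs)  # normally 3
    c_ave = sum(concs) / n                                         # equation 26
    p = math.sqrt(sum((c - c_ave) ** 2 for c in concs) / (n - 1))  # equation 27
    rp = 100.0 * p / c_ave                                         # equation 28
    return c_ave, p, rp

def passes_precision_test(periods):
    for concs in periods:
        c_ave, p, rp = period_statistics(concs)
        if c_ave < 3.0:             # paragraph (g)(2)(ii): replace this period
            return None
        if p > 2.0 and rp > 5.0:    # neither criterion met: fail
            return False
    return True

print(passes_precision_test([[12.1, 12.4, 11.9], [8.2, 8.0, 8.5]]))  # True
```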

    Start Amendment Part

    17. Section 53.59 is amended by revising paragraphs (a) and (b)(5) to read as follows:

    End Amendment Part
§ 53.59 Aerosol transport test for Class I equivalent method samplers.

(a) Overview. This test is intended to verify adequate aerosol transport through any modified or flow-splitting components that may be used in a Class I candidate equivalent method sampler, such as may be necessary to achieve sequential sampling capability. This test is applicable to all Class I candidate samplers in which the aerosol flow path (the flow path through which sample air passes upstream of the sample collection filter) differs significantly from that specified for reference method samplers in 40 CFR part 50, appendix L or appendix O, as applicable. The test requirements and performance specifications for this test are summarized in table E-1 of this subpart.

    (b) * * *

(5) An added component is any physical part of the sampler which is different in some way from that specified for a reference method sampler in 40 CFR part 50, appendix L or appendix O, as applicable, such as a device or means to allow or cause the aerosol to be routed to one of several channels.

    * * * * *
    Start Amendment Part

    18. Table E-1 to subpart E is revised to read as follows:

    End Amendment Part

    Table E-1 to Subpart E of Part 53.—Summary of Test Requirements for Reference and Class I Equivalent Methods for PM2.5 and PM10-2.5

| Subpart E procedure | Performance test | Performance specification | Test conditions | Part 50, appendix L reference |
| § 53.52 Sample leak check test | Sampler leak check facility | External leakage: 80 mL/min, max; internal leakage: 80 mL/min, max | Controlled leak flow rate of 80 mL/min | Sec. 7.4.6. |
| § 53.53 Base flow rate test | Sample flow rate: 1. Mean; 2. Regulation; 3. Meas. accuracy; 4. CV accuracy; 5. Cut-off | 1. 16.67 ± 5% L/min; 2. 2%, max; 3. 2%, max; 4. 0.3%, max; 5. Flow rate cut-off if flow rate deviates more than 10% from design flow rate for more than 60 ± 30 seconds | (a) 6-hour normal operational test plus flow rate cut-off test; (b) Normal conditions; (c) Additional 55 mm Hg pressure drop to simulate loaded filter; (d) Variable flow restriction used for cut-off test | Sec. 7.4.1, Sec. 7.4.2, Sec. 7.4.3, Sec. 7.4.4, Sec. 7.4.5. |
| § 53.54 Power interruption test | Sample flow rate: 1. Mean; 2. Regulation; 3. Meas. accuracy; 4. CV accuracy; 5. Occurrence time of power interruptions; 6. Elapsed sample time; 7. Sample volume | 1. 16.67 ± 5% L/min; 2. 2%, max; 3. 2%, max; 4. 0.3%, max; 5. ± 2 min if > 60 seconds; 6. ± 20 seconds; 7. ± 2%, max | (a) 6-hour normal operational test; (b) Nominal conditions; (c) Additional 55 mm Hg pressure drop to simulate loaded filter; (d) 6 power interruptions of various durations | Sec. 7.4.1, Sec. 7.4.2, Sec. 7.4.3, Sec. 7.4.5, Sec. 7.4.12, Sec. 7.4.13, Sec. 7.4.15.4, Sec. 7.4.15.5. |
| § 53.55 Temperature and line voltage test | Sample flow rate: 1. Mean; 2. Regulation; 3. Meas. accuracy; 4. CV accuracy; 5. Temperature meas. accuracy; 6. Proper operation | 1. 16.67 ± 5% L/min; 2. 2%, max; 3. 2%, max; 4. 0.3%, max; 5. 2 °C | (a) 6-hour normal operational test; (b) Normal conditions; (c) Additional 55 mm Hg pressure drop to simulate loaded filter; (d) Ambient temperature at −20 and +40 °C; (e) Line voltage: 105 Vac to 125 Vac | Sec. 7.4.1, Sec. 7.4.2, Sec. 7.4.3, Sec. 7.4.5, Sec. 7.4.8, Sec. 7.4.15.1. |
| § 53.56 Barometric pressure effect test | Sample flow rate: 1. Mean; 2. Regulation; 3. Meas. accuracy; 4. CV accuracy; 5. Pressure meas. accuracy; 6. Proper operation | 1. 16.67 ± 5% L/min; 2. 2%, max; 3. 2%, max; 4. 0.3%, max; 5. 10 mm Hg | (a) 6-hour normal operational test; (b) Normal conditions; (c) Additional 55 mm Hg pressure drop to simulate loaded filter; (d) Barometric pressure at 600 and 800 mm Hg | Sec. 7.4.1, Sec. 7.4.2, Sec. 7.4.3, Sec. 7.4.5, Sec. 7.4.9. |
| § 53.57 Filter temperature control test | 1. Filter temp. meas. accuracy; 2. Ambient temp. meas. accuracy; 3. Filter temp. control accuracy, sampling and non-sampling | 1. 2 °C; 2. 2 °C; 3. Not more than 5 °C above ambient temp. for more than 30 min | (a) 4-hour simulated solar radiation, sampling; (b) 4-hour simulated solar radiation, non-sampling; (c) Solar flux of 1000 ± 50 W/m² | Sec. 7.4.8, Sec. 7.4.10, Sec. 7.4.11. |
| § 53.58 Field precision test | 1. Measurement precision; 2. Storage deposition test for sequential samplers | 1. Pj < 2 μg/m³ or RPj < 5%; 2. 50 μg max. average weight gain per blank filter | (a) 3 collocated samplers at 1 site for at least 10 days; (b) PM2.5 conc. > 3 μg/m³; (c) 24- or 48-hour samples; (d) 5- or 10-day storage period for inactive stored filters | Sec. 5.1, Sec. 7.3.5, Sec. 8, Sec. 9, Sec. 10. |

The Following Requirement Is Applicable to Class I Candidate Equivalent Methods Only

| § 53.59 Aerosol transport test | Aerosol transport | 97%, min. for all channels | Determine aerosol transport through any new or modified components with respect to the reference method sampler before the filter for each channel | |
    Start Amendment Part

    19. References (1), (2), (3), and (5) in appendix A to subpart E of part 53 are revised to read as follows:

    End Amendment Part

    Appendix A to Subpart E of Part 53—References

    (1) American National Standard Quality Systems—Model for Quality Assurance in Design, Development, Production, Installation, and Servicing, ANSI/ISO/ASQC Q9001-1994. Available from American Society for Quality, P.O. Box 3005, Milwaukee, WI 53202 (http://qualitypress.asq.org).

    (2) American National Standard Quality Systems for Environmental Data and Technology Programs—Requirements with guidance for use, ANSI/ASQC E4-2004. Available from American Society for Quality, P.O. Box 3005, Milwaukee, WI 53202 (http://qualitypress.asq.org).

(3) Quality Assurance Guidance Document 2.12. Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent Methods. U.S. EPA, National Exposure Research Laboratory, Research Triangle Park, NC, November 1998 or later edition. Currently available at http://www.epa.gov/ttn/amtic/pmqainf.html.

    * * * * *

    (5) Quality Assurance Handbook for Air Pollution Measurement Systems, Volume IV: Meteorological Measurements. Revised March, 1995. EPA-600/R-94-038d. Available from National Technical Information Service, Springfield, VA 22161, (800-553-6847, http://www.ntis.gov). NTIS number PB95-199782INZ.

    * * * * *

    Subpart F—[Amended]

    Start Amendment Part

    20. Section 53.60 is amended by:

    End Amendment Part Start Amendment Part

    a. Revising paragraph (b);

    End Amendment Part Start Amendment Part

    b. Revising paragraph (c);

    End Amendment Part Start Amendment Part

    c. Revising paragraph (d) introductory text; and

    End Amendment Part Start Amendment Part

    d. Revising paragraph (f)(4) to read as follows:

    End Amendment Part
§ 53.60 General provisions.
    * * * * *

    (b) A candidate method described in an application for a FRM or FEM determination submitted under § 53.4 shall be determined by the EPA to be a Class II candidate equivalent method on the basis of the definition of a Class II FEM in § 53.1.

    (c) Any sampler associated with a Class II candidate equivalent method (Class II sampler) must meet all applicable requirements for FRM samplers or Class I FEM samplers specified in subpart E of this part, as appropriate. Except as provided in § 53.3(a)(3), a Class II PM2.5 sampler must meet the additional requirements as specified in paragraph (d) of this section.

    (d) Except as provided in paragraphs (d)(1), (2), and (3) of this section, all Class II samplers are subject to the additional tests and performance requirements specified in § 53.62 (full wind tunnel test), § 53.65 (loading test), and § 53.66 (volatility test). Alternative tests and performance requirements, as described in paragraphs (d)(1), (2), and (3) of this section, are optionally available for certain Class II samplers which meet the requirements for reference method or Class I equivalent method samplers given in 40 CFR part 50, appendix L, and in subpart E of this part, except for specific deviations of the inlet, fractionator, or filter.

    * * * * *

    (f) * * *

(4) Loading test. The loading test is conducted to ensure that the performance of a candidate sampler is not significantly affected by the amount of particulate deposited on its interior surfaces between periodic cleanings. The candidate sampler is artificially loaded by sampling a test environment containing aerosolized, standard test dust. The duration of the loading phase depends on both the time between cleanings, as specified by the candidate method, and the aerosol mass concentration in the test environment. After loading, the candidate's performance must then be evaluated by § 53.62 (full wind tunnel evaluation), § 53.63 (wind tunnel inlet aspiration test), or § 53.64 (static fractionator test). If the results of the appropriate test meet the criteria presented in table F-1 of this subpart, then the candidate sampler passes the loading test, under the condition that it be cleaned at least as often as the cleaning frequency that is proposed by the candidate method and has been demonstrated to be acceptable by this test.

    * * * * *
    Start Amendment Part

    21. The section heading of § 53.61 is revised to read as follows:

    End Amendment Part
§ 53.61 Test conditions.
    * * * * *
    Start Amendment Part

    22. Section 53.66 is amended by revising paragraph (e)(2)(iii) to read as follows:

    End Amendment Part
§ 53.66 Test procedure: Volatility test.
    * * * * *

    (e) * * *

    (2) * * *

    (iii) Operate the candidate and the reference samplers such that they simultaneously sample the test aerosol for 2 hours for a candidate sampler operating at 16.7 L/min or higher, or proportionately longer for a candidate sampler operating at a lower flow rate.
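The proportional lengthening of the sampling period can be written out as follows (a sketch; the helper name is ours, not the regulation's):

```python
def volatility_sampling_hours(flow_l_per_min, base_hours=2.0, base_flow=16.7):
    # 2 hours at 16.7 L/min or higher; proportionately longer below that
    if flow_l_per_min >= base_flow:
        return base_hours
    return base_hours * base_flow / flow_l_per_min

print(round(volatility_sampling_hours(5.0), 1))  # about 6.7 hours at 5 L/min
```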

    * * * * *
    Start Amendment Part

    23. Table F-1 to subpart F is revised to read as follows: Start Printed Page 61296

    End Amendment Part

    Table F-1 to Subpart F of Part 53.—Performance Specifications for PM2.5 Class II Equivalent Samplers

| Performance test | Specifications | Acceptance criteria |
| § 53.62 Full Wind Tunnel Evaluation | Solid VOAG-produced aerosol at 2 km/hr and 24 km/hr | Dp50 = 2.5 μm ± 0.2 μm; numerical analysis results: 95% ≤ Rc ≤ 105% |
| § 53.63 Wind Tunnel Inlet Aspiration Test | Liquid VOAG-produced aerosol at 2 km/hr and 24 km/hr | Relative aspiration: 95% ≤ A ≤ 105% |
| § 53.64 Static Fractionator Test | Evaluation of the fractionator under static conditions | Dp50 = 2.5 μm ± 0.2 μm; numerical analysis results: 95% ≤ Rc ≤ 105% |
| § 53.65 Loading Test | Loading of the clean candidate under laboratory conditions | Acceptance criteria as specified in the post-loading evaluation test (§ 53.62, § 53.63, or § 53.64) |
| § 53.66 Volatility Test | Polydisperse liquid aerosol produced by air nebulization of A.C.S. reagent grade glycerol, 99.5% minimum purity | Regression parameters: slope = 1 ± 0.1; intercept = 0 ± 0.15 mg; r ≥ 0.97 |
    Start Amendment Part

    24. In Figure E-1 to subpart F, the figure number “E-1” is revised to read “F-1.”

    End Amendment Part Start Part

    PART 58—[AMENDED]

    End Part Start Amendment Part

    25. The authority citation for part 58 is revised to read as follows:

    End Amendment Part Start Authority

    Authority: 42 U.S.C. 7403, 7410, 7601(a), 7611, and 7619.

    End Authority Start Amendment Part

    26. Subpart A is revised to read as follows:

    End Amendment Part
Subpart A—General Provisions

Sec.

58.1 Definitions.

58.2 Purpose.

58.3 Applicability.

    Subpart A—General Provisions

§ 58.1 Definitions.

    As used in this part, all terms not defined herein have the meaning given them in the Act.

Act means the Clean Air Act, as amended (42 U.S.C. 7401 et seq.).

    Additive and multiplicative bias means the linear regression intercept and slope of a linear plot fitted to corresponding candidate and reference method mean measurement data pairs.

    Administrator means the Administrator of the Environmental Protection Agency (EPA) or his or her authorized representative.

    Air Quality System (AQS) means EPA's computerized system for storing and reporting of information relating to ambient air quality data.

    Approved regional method (ARM) means a continuous PM2.5 method that has been approved specifically within a State or local air monitoring network for purposes of comparison to the NAAQS and to meet other monitoring objectives.

    AQCR means air quality control region.

    CO means carbon monoxide.

Combined statistical area (CSA) is defined by the U.S. Office of Management and Budget as a geographical area consisting of two or more adjacent Core Based Statistical Areas (CBSA) with an employment interchange of at least 15 percent. Combination is automatic if the employment interchange is 25 percent or more and is determined by local opinion if the interchange is at least 15 but less than 25 percent (http://www.census.gov/population/estimates/metro-city/List6.txt).

    Community monitoring zone (CMZ) means an optional averaging area with established, well defined boundaries, such as county or census block, within an MPA that has relatively uniform concentrations of annual PM2.5 as defined by appendix N of part 50 of this chapter. Two or more community-oriented SLAMS monitors within a CMZ that meet certain requirements as set forth in appendix N of part 50 of this chapter may be averaged for making comparisons to the annual PM2.5 NAAQS.

Core-based statistical area (CBSA) is defined by the U.S. Office of Management and Budget as a statistical geographic entity consisting of the county or counties associated with at least one urbanized area/urban cluster of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration. Metropolitan Statistical Areas (MSAs) and micropolitan statistical areas are the two categories of CBSA (metropolitan areas have populations greater than 50,000, and micropolitan areas have populations between 10,000 and 50,000). In the case of very large cities where two or more CBSAs are combined, these larger areas are referred to as combined statistical areas (CSAs) (http://www.census.gov/population/estimates/metro-city/List1.txt).

    Corrected concentration pertains to the result of an accuracy or precision assessment test of an open path analyzer in which a high-concentration test or audit standard gas contained in a short test cell is inserted into the optical measurement beam of the instrument. When the pollutant concentration measured by the analyzer in such a test includes both the pollutant concentration in the test cell and the concentration in the atmosphere, the atmospheric pollutant concentration must be subtracted from the test measurement to obtain the corrected concentration test result. The corrected concentration is equal to the measured concentration minus the average of the atmospheric pollutant concentrations measured (without the test cell) immediately before and immediately after the test.
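In code form, under the assumption of illustrative names and ppm units:

```python
def corrected_concentration(measured, ambient_before, ambient_after):
    # Measured value minus the average of the ambient concentrations
    # observed immediately before and immediately after the test
    return measured - (ambient_before + ambient_after) / 2.0

print(corrected_concentration(0.450, 0.032, 0.028))  # 0.420 ppm
```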

    Design value means the calculated concentration according to the applicable appendix of part 50 of this chapter for the highest site in an attainment or nonattainment area.

    EDO means environmental data operations.

    Effective concentration pertains to testing an open path analyzer with a high-concentration calibration or audit standard gas contained in a short test cell inserted into the optical measurement beam of the instrument. Effective concentration is the equivalent ambient-level concentration that would produce the same spectral absorbance over the actual atmospheric monitoring path length as produced by the high-concentration gas in the short test cell. Quantitatively, effective concentration is equal to the actual concentration of the gas standard in the test cell multiplied by the ratio of the path length of the test cell to the actual atmospheric monitoring path length.
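For example, a 0.1 m test cell on a 100 m monitoring path (illustrative names and values):

```python
def effective_concentration(cell_conc, cell_path_m, monitoring_path_m):
    # Test-cell concentration scaled by the ratio of the test cell path
    # length to the atmospheric monitoring path length
    return cell_conc * cell_path_m / monitoring_path_m

print(effective_concentration(500.0, 0.1, 100.0))  # 0.5 ppm equivalent
```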

    Federal equivalent method (FEM) means a method for measuring the concentration of an air pollutant in the ambient air that has been designated as an equivalent method in accordance with part 53 of this chapter; it does not include a method for which an equivalent method designation has been canceled in accordance with § 53.11 or § 53.16 of this chapter.

    Federal reference method (FRM) means a method of sampling and Start Printed Page 61297analyzing the ambient air for an air pollutant that is specified as a reference method in an appendix to part 50 of this chapter, or a method that has been designated as a reference method in accordance with this part; it does not include a method for which a reference method designation has been canceled in accordance with § 53.11 or § 53.16 of this chapter.

    HNO3 means nitric acid.

    Local agency means any local government agency, other than the State agency, which is charged by a State with the responsibility for carrying out a portion of the plan.

    Meteorological measurements means measurements of wind speed, wind direction, barometric pressure, temperature, relative humidity, solar radiation, ultraviolet radiation, and/or precipitation.

    Metropolitan Statistical Area (MSA) means a CBSA associated with at least one urbanized area of 50,000 population or greater. The central county plus adjacent counties with a high degree of integration comprise the area.

    Monitor means an instrument, sampler, analyzer, or other device that measures or assists in the measurement of atmospheric air pollutants and which is acceptable for use in ambient air surveillance under the applicable provisions of appendix C to this part.

    Monitoring agency means a State or local agency responsible for meeting the requirements of this part.

    Monitoring organization means a State, local, or other monitoring organization responsible for operating a monitoring site for which the quality assurance regulations apply.

    Monitoring path for an open path analyzer means the actual path in space between two geographical locations over which the pollutant concentration is measured and averaged.

    Monitoring path length of an open path analyzer means the length of the monitoring path in the atmosphere over which the average pollutant concentration measurement (path-averaged concentration) is determined. See also, optical measurement path length.

    Monitoring planning area (MPA) means a contiguous geographic area with established, well defined boundaries, such as a CBSA, county or State, having a common area that is used for planning monitoring locations for PM2.5. An MPA may cross State boundaries, such as the Philadelphia PA-NJ MSA, and be further subdivided into community monitoring zones. MPAs are generally oriented toward CBSAs or CSAs with populations greater than 200,000, but for convenience, those portions of a State that are not associated with CBSAs can be considered as a single MPA.

    NATTS means the national air toxics trends stations. This network provides ambient hazardous air pollutant data.

    NCore means the National Core multipollutant monitoring stations. Monitors at these sites are required to measure particles (PM2.5, speciated PM2.5, PM10-2.5), O3, SO2, CO, nitrogen oxides (NO/NO2/NOy), Pb, and basic meteorology.

    Network means all stations of a given type or types.

    NH3 means ammonia.

    NO2 means nitrogen dioxide. NO means nitrogen oxide. NOX means oxides of nitrogen and is defined as the sum of the concentrations of NO2 and NO.

    NOy means the sum of all reactive nitrogen oxides, including NO, NO2, and other nitrogen oxides referred to as NOZ.
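    In compact notation (a restatement of the two definitions above, not additional regulatory text):

    \[ \mathrm{NO_X} = \mathrm{NO} + \mathrm{NO_2}, \qquad \mathrm{NO_y} = \mathrm{NO} + \mathrm{NO_2} + \mathrm{NO_Z} \]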

    O3 means ozone.

    Open path analyzer means an automated analytical method that measures the average atmospheric pollutant concentration in situ along one or more monitoring paths having a monitoring path length of 5 meters or more and that has been designated as a reference or equivalent method under the provisions of part 53 of this chapter.

    Optical measurement path length means the actual length of the optical beam over which measurement of the pollutant is determined. The path-integrated pollutant concentration measured by the analyzer is divided by the optical measurement path length to determine the path-averaged concentration (a short illustrative sketch follows paragraph (3) of this definition). Generally, the optical measurement path length is:

    (1) Equal to the monitoring path length for a (bistatic) system having a transmitter and a receiver at opposite ends of the monitoring path;

    (2) Equal to twice the monitoring path length for a (monostatic) system having a transmitter and receiver at one end of the monitoring path and a mirror or retroreflector at the other end; or

    (3) Equal to some multiple of the monitoring path length for more complex systems having multiple passes of the measurement beam through the monitoring path.
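    The relationships in paragraphs (1) through (3), together with the path-averaging step described above, can be summarized in a short sketch (illustrative only; the function names and geometry labels below are ours, not the rule's):

    def optical_measurement_path_length(monitoring_path_m, geometry="bistatic", passes=1):
        """Optical measurement path length in meters for common open path geometries."""
        multipliers = {
            "bistatic": 1,        # transmitter and receiver at opposite ends: one pass
            "monostatic": 2,      # mirror or retroreflector returns the beam: two passes
            "multipass": passes,  # more complex systems: multiple traversals of the path
        }
        return multipliers[geometry] * monitoring_path_m

    def path_averaged_concentration(path_integrated_ppm_m, optical_path_m):
        # The path-integrated concentration measured by the analyzer, divided by
        # the optical measurement path length, gives the path-averaged concentration.
        return path_integrated_ppm_m / optical_path_m

    # Example: a monostatic system over a 250 m monitoring path has a 500 m optical
    # path; a path-integrated reading of 40 ppm-m averages to 0.08 ppm.
    path_averaged_concentration(40.0, optical_measurement_path_length(250, "monostatic"))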

    PAMS means photochemical assessment monitoring stations.

    Pb means lead.

    Plan means an implementation plan approved or promulgated pursuant to section 110 of the Act.

    PM means PM10, PM10C, PM2.5, PM10−2.5, or particulate matter of unspecified size range.

    PM2.5 means particulate matter with an aerodynamic diameter less than or equal to a nominal 2.5 micrometers as measured by a reference method based on appendix L of part 50 of this chapter and designated in accordance with part 53 of this chapter, by an equivalent method designated in accordance with part 53 of this chapter, or by an approved regional method designated in accordance with appendix C to this part.

    PM10 means particulate matter with an aerodynamic diameter less than or equal to a nominal 10 micrometers as measured by a reference method based on appendix J of part 50 of this chapter and designated in accordance with part 53 of this chapter or by an equivalent method designated in accordance with part 53 of this chapter.

    PM10C means particulate matter with an aerodynamic diameter less than or equal to a nominal 10 micrometers as measured by a reference method based on appendix O of part 50 of this chapter and designated in accordance with part 53 of this chapter or by an equivalent method designated in accordance with part 53 of this chapter.

    PM10−2.5 means particulate matter with an aerodynamic diameter less than or equal to a nominal 10 micrometers and greater than a nominal 2.5 micrometers as measured by a reference method based on appendix O to part 50 of this chapter and designated in accordance with part 53 of this chapter or by an equivalent method designated in accordance with part 53 of this chapter.

    Point analyzer means an automated analytical method that measures pollutant concentration in an ambient air sample extracted from the atmosphere at a specific inlet probe point and that has been designated as a reference or equivalent method in accordance with part 53 of this chapter.

    Population-oriented monitoring (or sites) means residential areas, commercial areas, recreational areas, industrial areas where workers from more than one company are located, and other areas where a substantial number of people may spend a significant fraction of their day.

    Primary quality assurance organization means a monitoring organization or other organization that is responsible for a set of stations that monitor the same pollutant and for which data quality assessments can be pooled. Each criteria pollutant sampler/monitor at a monitoring station in the SLAMS and SPM networks must be associated with one, and only one, primary quality assurance organization.

    Probe means the actual inlet where an air sample is extracted from the atmosphere for delivery to a sampler or point analyzer for pollutant analysis.

    PSD station means any station operated for the purpose of establishing the effect on air quality of the emissions from a proposed source for purposes of prevention of significant deterioration as required by § 51.24(n) of this chapter.

    Regional Administrator means the Administrator of one of the ten EPA Regional Offices or his or her authorized representative.

    Reporting organization means an entity, such as a State, local, or Tribal monitoring agency, that collects and reports air quality data to EPA.

    Site means a geographic location. One or more stations may be at the same site.

    SLAMS means State or local air monitoring stations. The SLAMS make up the ambient air quality monitoring sites that are primarily needed for NAAQS comparisons, but may serve other data purposes. SLAMS exclude special purpose monitor (SPM) stations and include NCore, PAMS, and all other State or locally operated stations that have not been designated as SPM stations.

    SO2 means sulfur dioxide.

    Special purpose monitor (SPM) station means a monitor included in an agency's monitoring network that the agency has designated as a special purpose monitor station in its monitoring network plan and in the Air Quality System, and which the agency does not count when showing compliance with the minimum requirements of this subpart for the number and siting of monitors of various types.

    State agency means the air pollution control agency primarily responsible for development and implementation of a plan under the Act.

    State speciation site means a supplemental PM2.5 speciation station that is not part of the speciation trends network.

    Station means a single monitor, or a group of monitors with a shared objective, located at a particular site.

    STN station means a PM2.5 speciation station designated to be part of the speciation trends network. This network provides chemically speciated data on fine particulate matter.

    Traceable means that a local standard has been compared and certified, either directly or via not more than one intermediate standard, to a National Institute of Standards and Technology (NIST)-certified primary standard such as a NIST-traceable Reference Material (NTRM) or a NIST-certified Gas Manufacturer's Internal Standard (GMIS).

    TSP (total suspended particulates) means particulate matter as measured by the method described in appendix B of part 50 of this chapter.

    Urbanized area means an area with a residential population of at least 50,000 people that generally includes core census block groups or blocks with a population density of at least 1,000 people per square mile and surrounding census blocks with an overall density of at least 500 people per square mile. The Census Bureau notes that, under certain conditions, less densely settled territory may be part of each Urbanized Area.

    VOC means volatile organic compounds.

    Purpose.

    (a) This part contains requirements for measuring ambient air quality and for reporting ambient air quality data and related information. The monitoring criteria pertain to the following areas:

    (1) Quality assurance procedures for monitor operation and data handling.

    (2) Methodology used in monitoring stations.

    (3) Operating schedule.

    (4) Siting parameters for instruments or instrument probes.

    (5) Minimum ambient air quality monitoring network requirements used to provide support to the State implementation plans (SIP), national air quality assessments, and policy decisions. These minimums are described as part of the network design requirements, including minimum numbers and placement of monitors of each type.

    (6) Air quality data reporting, and requirements for the daily reporting of an index of ambient air quality.

    (b) The requirements pertaining to provisions for an air quality surveillance system in the SIP are contained in this part.

    (c) This part also acts to establish a national ambient air quality monitoring network for the purpose of providing timely air quality data upon which to base national assessments and policy decisions.

    Applicability.

    This part applies to:

    (a) State air pollution control agencies.

    (b) Any local air pollution control agency to which the State has delegated authority to operate a portion of the State's SLAMS network.

    (c) Owners or operators of proposed sources.


    27. Subpart B is revised to read as follows:

    Subpart B—Monitoring Network
    58.10 Annual monitoring network plan and periodic network assessment.
    58.11 Network technical requirements.
    58.12 Operating schedules.
    58.13 Monitoring network completion.
    58.14 System modification.
    58.15 Annual air monitoring data certification.
    58.16 Data submittal and archiving requirements.

    Subpart B—Monitoring Network

    § 58.10 Annual monitoring network plan and periodic network assessment.

    (a)(1) Beginning July 1, 2007, the State, or where applicable local, agency shall adopt and submit to the Regional Administrator an annual monitoring network plan which shall provide for the establishment and maintenance of an air quality surveillance system that consists of a network of SLAMS monitoring stations including FRM, FEM, and ARM monitors that are part of SLAMS, NCore stations, STN stations, State speciation stations, SPM stations, and/or, in serious, severe, and extreme ozone nonattainment areas, PAMS stations. The plan shall include a statement of purposes for each monitor and evidence that siting and operation of each monitor meets the requirements of appendices A, C, D, and E of this part, where applicable. The annual monitoring network plan must be made available for public inspection for at least 30 days prior to submission to EPA.

    (2) Any annual monitoring network plan that proposes SLAMS network modifications including new monitoring sites is subject to the approval of the EPA Regional Administrator, who shall provide opportunity for public comment and shall approve or disapprove the plan and schedule within 120 days. If the State or local agency has already provided a public comment opportunity on its plan and has made no changes subsequent to that comment opportunity, the Regional Administrator is not required to provide a separate opportunity for comment.

    (3) The plan for establishing required NCore multipollutant stations shall be submitted to the Administrator not later than July 1, 2009. The plan shall provide for all required stations to be operational by January 1, 2011.

    (b) The annual monitoring network plan must contain the following information for each existing and proposed site:

    (1) The AQS site identification number.

    (2) The location, including street address and geographical coordinates.

    (3) The sampling and analysis method(s) for each measured parameter.

    (4) The operating schedules for each monitor.

    (5) Any proposals to remove or move a monitoring station within a period of 18 months following plan submittal.

    (6) The monitoring objective and spatial scale of representativeness for each monitor as defined in appendix D to this part.

    (7) The identification of any sites that are suitable and sites that are not suitable for comparison against the annual PM2.5 NAAQS as described in § 58.30.

    (8) The MSA, CBSA, CSA or other area represented by the monitor.

    (c) The annual monitoring network plan must document how States and local agencies provide for the review of changes to a PM2.5 monitoring network that impact the location of a violating PM2.5 monitor or the creation/change to a community monitoring zone, including a description of the proposed use of spatial averaging for purposes of making comparisons to the annual PM2.5 NAAQS as set forth in appendix N to part 50 of this chapter. The affected State or local agency must document the process for obtaining public comment and include any comments received through the public notification process within their submitted plan.

    (d) The State, or where applicable local, agency shall perform and submit to the EPA Regional Administrator an assessment of the air quality surveillance system every 5 years to determine, at a minimum, if the network meets the monitoring objectives defined in appendix D to this part, whether new sites are needed, whether existing sites are no longer needed and can be terminated, and whether new technologies are appropriate for incorporation into the ambient air monitoring network. The network assessment must consider the ability of existing and proposed sites to support air quality characterization for areas with relatively high populations of susceptible individuals (e.g., children with asthma), and, for any sites that are being proposed for discontinuance, the effect on data users other than the agency itself, such as nearby States and Tribes or health effects studies. For PM2.5, the assessment also must identify needed changes to population-oriented sites. The State, or where applicable local, agency must submit a copy of this 5-year assessment, along with a revised annual network plan, to the Regional Administrator. The first assessment is due July 1, 2010.

    (e) All proposed additions and discontinuations of SLAMS monitors in annual monitoring network plans and periodic network assessments are subject to approval according to § 58.14.

    § 58.11 Network technical requirements.

    (a)(1) State and local governments shall follow the applicable quality assurance criteria contained in appendix A to this part when operating the SLAMS networks.

    (2) Beginning January 1, 2009, State and local governments shall follow the quality assurance criteria contained in appendix A to this part that apply to SPM sites when operating any SPM site which uses a FRM, FEM, or ARM and meets the requirements of appendix E to this part, unless the Regional Administrator approves an alternative to the requirements of appendix A with respect to such SPM sites because meeting those requirements would be physically and/or financially impractical due to physical conditions at the monitoring site and the requirements are not essential to achieving the intended data objectives of the SPM site. Alternatives to the requirements of appendix A may be approved for an SPM site as part of the approval of the annual monitoring plan, or separately.

    (3) The owner or operator of an existing or a proposed source shall follow the quality assurance criteria in appendix A to this part that apply to PSD monitoring when operating a PSD site.

    (b) State and local governments must follow the criteria in appendix C to this part to determine acceptable monitoring methods or instruments for use in SLAMS networks. Appendix C criteria are optional at SPM stations.

    (c) State and local governments must follow the network design criteria contained in appendix D to this part in designing and maintaining the SLAMS stations. The final network design and all changes in design are subject to approval of the Regional Administrator. NCore, STN, and PAMS network design and changes are also subject to approval of the Administrator. Changes in SPM stations do not require approvals, but a change in the designation of a monitoring site from SLAMS to SPM requires approval of the Regional Administrator.

    (d) State and local governments must follow the criteria contained in appendix E to this part for siting monitor inlets, paths or probes at SLAMS stations. Appendix E adherence is optional for SPM stations.

    § 58.12 Operating schedules.

    State and local governments shall collect ambient air quality data at any SLAMS station on the following operational schedules:

    (a) For continuous analyzers, consecutive hourly averages must be collected except during:

    (1) Periods of routine maintenance,

    (2) Periods of instrument calibration, or

    (3) Periods or monitoring seasons exempted by the Regional Administrator.

    (b) For Pb manual methods, at least one 24-hour sample must be collected every 6 days except during periods or seasons exempted by the Regional Administrator.

    (c) For PAMS VOC samplers, samples must be collected as specified in section 5 of appendix D to this part. Area-specific PAMS operating schedules must be included as part of the PAMS network description and must be approved by the Regional Administrator.

    (d) For manual PM2.5 samplers:

    (1) Manual PM2.5 samplers at SLAMS stations other than NCore stations must operate on at least a 1-in-3 day schedule at sites without a collocated continuously operating PM2.5 monitor. For SLAMS PM2.5 sites with both manual and continuous PM2.5 monitors operating, the monitoring agency may request approval from the EPA Regional Administrator for a reduction to 1-in-6 day PM2.5 sampling or for seasonal sampling. The EPA Regional Administrator may grant sampling frequency reductions after considering factors including, but not limited to, historical PM2.5 data quality assessments, the location of current PM2.5 design value sites, and regulatory data needs. Sites with design values within plus or minus 10 percent of the NAAQS, and sites where the 24-hour values have exceeded the NAAQS over a period of 3 years, are required to maintain at least a 1-in-3 day sampling frequency. Sites with a design value within plus or minus 5 percent of the daily PM2.5 NAAQS must have an FRM or FEM operating on a daily schedule. (A simplified sketch of this frequency logic follows paragraph (d)(3).)

    (2) Manual PM2.5 samplers at NCore stations and required regional background and regional transport sites must operate on at least a 1-in-3 day sampling frequency.

    (3) Manual PM2.5 speciation samplers at STN stations must operate on a 1-in-3 day sampling frequency.
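    The following sketch, referenced in paragraph (d)(1), condenses that paragraph's frequency logic. The thresholds come from the rule text, but the function, its inputs, and its returned labels are hypothetical, and a single design value is used for simplicity:

    def minimum_pm25_schedule(design_value, naaqs, exceeded_24hr_naaqs_3yr):
        ratio = design_value / naaqs
        if abs(ratio - 1.0) <= 0.05:
            # Within plus or minus 5 percent of the daily NAAQS: an FRM or FEM
            # must operate on a daily schedule.
            return "daily"
        if abs(ratio - 1.0) <= 0.10 or exceeded_24hr_naaqs_3yr:
            # Within plus or minus 10 percent of the NAAQS, or 3 years of
            # 24-hour exceedances: keep at least 1-in-3 day sampling.
            return "1-in-3 day (no reduction)"
        # Otherwise the 1-in-3 day baseline applies; a reduction to 1-in-6 day
        # or seasonal sampling may be requested where a collocated continuous
        # PM2.5 monitor operates.
        return "1-in-3 day (reduction may be requested)"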

    (e) For PM10 samplers, a 24-hour sample must be taken from midnight to midnight (local time) to ensure national consistency. The minimum monitoring schedule for the site in the area of expected maximum concentration shall be based on the relative level of that monitoring site's concentration with respect to the 24-hour standard, as illustrated in Figure 1. If the operating agency demonstrates by monitoring data that during certain periods of the year conditions preclude violation of the PM10 24-hour standard, the increased sampling frequency for those periods or seasons may be exempted by the Regional Administrator and permitted to revert to once in 6 days. The minimum sampling schedule for all other sites in the area remains once every 6 days. No less frequently than as part of each 5-year network assessment, the most recent year of data must be considered to estimate the air quality status at the site near the area of maximum concentration. Statistical models, such as analysis of concentration frequency distributions as described in “Guideline for the Interpretation of Ozone Air Quality Standards,” EPA-450/4-79-003, U.S. Environmental Protection Agency, Research Triangle Park, NC, January 1979, should be used. Adjustments to the monitoring schedule must be made on the basis of the 5-year network assessment. The site having the highest concentration in the most current year must be given first consideration when selecting the site for the more frequent sampling schedule. Other factors, such as a major change in sources of PM10 emissions or in sampling site characteristics, could influence the location of the expected maximum concentration site. Also, the use of the most recent 3 years of data might, in some cases, be justified to provide a more representative database from which to estimate current air quality status and to provide stability to the network. This multiyear consideration reduces the possibility of an anomalous year biasing a site selected for accelerated sampling. If the maximum concentration site based on the most current year is not selected for the more frequent operating schedule, documentation of the justification for selecting an alternative site must be submitted to the Regional Office for approval during the 5-year network assessment process. Minimum data completeness criteria, number of years of data, and sampling frequency for judging attainment of the NAAQS are discussed in appendix K of part 50 of this chapter.

    (f) For manual PM10-2.5 samplers:

    (1) Manual PM10-2.5 samplers at NCore stations must operate on at least a 1-in-3 day schedule at sites without a collocated continuously operating federal equivalent PM10-2.5 method that has been designated in accordance with part 53 of this chapter.

    (2) Manual PM10-2.5 speciation samplers at NCore stations must operate on at least a 1-in-3 day sampling frequency.

    § 58.13 Monitoring network completion.

    (a) The network of NCore multipollutant sites must be physically established no later than January 1, 2011, and at that time, operating under all of the requirements of this part, including the requirements of appendices A, C, D, E, and G to this part.

    (b) Where existing networks are not in conformance with required numbers of monitors specified in this part, additional required monitors must be operated by January 1, 2008.

    § 58.14 System modification.

    (a) The State, or where appropriate local, agency shall develop and implement a plan and schedule to modify the ambient air quality monitoring network that complies with the findings of the network assessments required every 5 years by § 58.10(e). The State or local agency shall consult with the EPA Regional Administrator during the development of the schedule to modify the monitoring program, and shall make the plan and schedule available to the public for 30 days prior to submission to the EPA Regional Administrator. The final plan and schedule with respect to the SLAMS network are subject to the approval of the EPA Regional Administrator. Plans containing modifications to NCore stations or PAMS stations shall be submitted to the Administrator. The Regional Administrator shall provide opportunity for public comment and shall approve or disapprove submitted plans and schedules within 120 days.

    (b) Nothing in this section shall preclude the State, or where appropriate local, agency from making modifications to the SLAMS network for reasons other than those resulting from the periodic network assessments. These modifications must be reviewed and approved by the Regional Administrator. Each monitoring network may make or be required to make changes between the 5-year assessment periods, including, for example, site relocations or the addition of PAMS networks in bumped-up ozone nonattainment areas. These modifications must address changes invoked by a new census and changes due to changing air quality levels. The State, or where appropriate local, agency shall provide written communication describing the network changes to the Regional Administrator for review and approval as these changes are identified.

    (c) State, or where appropriate, local agency requests for discontinuation of a SLAMS monitoring station, subject to the review of the Regional Administrator, will be approved if any of the following criteria are met and if the requirements of appendix D to this part, if any, continue to be met. Other requests for discontinuation may also be approved on a case-by-case basis if discontinuance does not compromise data collection needed for implementation of a NAAQS and if the requirements of appendix D to this part, if any, continue to be met.

    (1) Any PM2.5, O3, CO, PM10, SO2, Pb, or NO2 SLAMS monitor that has shown attainment during the previous five years, that has a probability of less than 10 percent of exceeding 80 percent of the applicable NAAQS during the next three years based on the levels, trends, and variability observed in the past, and that is not specifically required by an attainment plan or maintenance plan. (A hypothetical illustration of this probability screen follows paragraph (c)(6).) In a nonattainment or maintenance area, if the most recent attainment or maintenance plan adopted by the State and approved by EPA contains a contingency measure to be triggered by an air quality concentration and the monitor to be discontinued is the only SLAMS monitor operating in the nonattainment or maintenance area, the monitor may not be discontinued.

    (2) Any SLAMS monitor for CO, PM10, SO2, or NO2 which has consistently measured lower concentrations than another monitor for the same pollutant in the same county (or portion of a county within a distinct attainment area, nonattainment area, or maintenance area, as applicable) during the previous five years, and which is not specifically required by an attainment plan or maintenance plan, if control measures scheduled to be implemented or discontinued during the next five years would apply to the areas around both monitors and have similar effects on measured concentrations, such that the retained monitor would remain the higher reading of the two monitors being compared.

    (3) For any pollutant, any SLAMS monitor in a county (or portion of a county within a distinct attainment, nonattainment, or maintenance area, as applicable) provided the monitor has not measured violations of the applicable NAAQS in the previous five years, and the approved SIP provides for a specific, reproducible approach to representing the air quality of the affected county in the absence of actual monitoring data.

    (4) A PM2.5 SLAMS monitor which EPA has determined cannot be compared to the relevant NAAQS because of the siting of the monitor, in accordance with § 58.30.

    (5) A SLAMS monitor that is designed to measure concentrations upwind of an urban area for purposes of characterizing transport into the area and that has not recorded violations of the relevant NAAQS in the previous five years, if discontinuation of the monitor is tied to start-up of another station also characterizing transport.

    (6) A SLAMS monitor not eligible for removal under any of the criteria in paragraphs (c)(1) through (c)(5) of this section may be moved to a nearby location with the same scale of representation if logistical problems beyond the State's control make it impossible to continue operation at its current site.
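    The probability screen in paragraph (c)(1) can be illustrated with a hypothetical sketch. The rule does not prescribe a statistical model; a normal approximation over past annual values (at least two are needed) is assumed here purely for illustration:

    import statistics
    from math import erf, sqrt

    def prob_exceeds_80pct_of_naaqs(annual_values, naaqs):
        """Estimated probability that a future annual value exceeds 80 percent
        of the NAAQS, based on the level and variability of past annual values."""
        mean = statistics.mean(annual_values)
        sd = statistics.stdev(annual_values)
        z = (0.8 * naaqs - mean) / sd
        # P(value > 0.8 * NAAQS) under the assumed normal model.
        return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))

    # A monitor could qualify under (c)(1) only if this estimate is below 0.10
    # and the paragraph's other conditions are also met.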

    § 58.15 Annual air monitoring data certification.

    (a) The State, or where appropriate local, agency shall submit to the EPA Regional Administrator an annual air monitoring data certification letter to certify data collected at all SLAMS and at all FRM, FEM, and ARM SPM stations that meet criteria in appendix A to this part from January 1 to December 31 of the previous year. The senior air pollution control officer in each agency, or his or her designee, shall certify that the previous year of ambient concentration and quality assurance data are completely submitted to AQS and that the ambient concentration data are accurate to the best of her or his knowledge, taking into consideration the quality assurance findings.

    (1) Through 2009, the annual data certification letter is due by July 1 of each year.

    (2) Beginning in 2010, the annual data certification letter is due by May 1 of each year.

    (b) Along with each certification letter, the State shall submit to the Administrator (through the appropriate Regional Office) an annual summary report of all the ambient air quality data collected at all SLAMS and at SPM stations using FRM, FEM, or ARMs. The annual report(s) shall be submitted for data collected from January 1 to December 31 of the previous year. The annual summary report(s) must contain all information and data required by the State's approved plan and must be submitted on the same schedule as the certification letter, unless an approved alternative date is included in the plan. The annual summary serves as the record of the specific data that are the object of the certification letter.

    (c) Along with each certification letter, the State shall submit to the Administrator (through the appropriate Regional Office) a summary of the precision and accuracy data for all ambient air quality data collected at all SLAMS and at SPM stations using FRM, FEM, or ARMs. The summary of precision and accuracy shall be submitted for data collected from January 1 to December 31 of the previous year. The summary of precision and accuracy must be submitted on the same schedule as the certification letter, unless an approved alternative date is included in the plan.

    § 58.16 Data submittal and archiving requirements.

    (a) The State, or where appropriate, local agency, shall report to the Administrator, via AQS, all ambient air quality data and associated quality assurance data for SO2; CO; O3; NO2; NO; NOy; NOX; Pb; PM10 mass concentration; PM2.5 mass concentration; for filter-based PM2.5 FRM/FEM, the field blank mass, sampler-generated average daily temperature, and sampler-generated average daily pressure; chemically speciated PM2.5 mass concentration data; PM10-2.5 mass concentration; chemically speciated PM10-2.5 mass concentration data; meteorological data from NCore and PAMS sites; and metadata records and information specified by the AQS Data Coding Manual (http://www.epa.gov/ttn/airs/airsaqs/manuals/manuals.htm). Such air quality data and information must be submitted directly to the AQS via electronic transmission on the quarterly schedule described in paragraph (b) of this section.

    (b) The specific quarterly reporting periods are January 1-March 31, April 1-June 30, July 1-September 30, and October 1-December 31. The data and information reported for each reporting period must contain all data and information gathered during the reporting period, and be received in the AQS within 90 days after the end of the quarterly reporting period. For example, the data for the reporting period January 1-March 31 are due on or before June 30 of that year.

    (c) Air quality data submitted for each reporting period must be edited, validated, and entered into the AQS (within the time limits specified in paragraph (b) of this section) pursuant to appropriate AQS procedures. The procedures for editing and validating data are described in the AQS Data Coding Manual and in each monitoring agency's quality assurance project plan.

    (d) The State shall report VOC and, if collected, carbonyl, NH3, and HNO3 data from PAMS sites to AQS within 6 months following the end of each quarterly reporting period listed in paragraph (b) of this section.

    (e) The State shall also submit any portion or all of the SLAMS and SPM data to the appropriate Regional Administrator upon request.

    (f) The State, or where applicable, local agency shall archive all PM2.5, PM10, and PM10−2.5 filters from manual low-volume samplers (samplers having flow rates less than 200 liters/minute) from all SLAMS sites for a minimum period of 1 year after collection. These filters shall be made available during the course of that year for supplemental analyses at the request of EPA or to provide information to State and local agencies on particulate matter composition. Other Federal agencies may request access to filters for purposes of supporting air quality management or community health—such as biological assay—through the applicable EPA Regional Administrator. The filters shall be archived according to procedures approved by the Administrator. The EPA recommends that particulate matter filters be archived for longer periods, especially for key sites in making NAAQS related decisions or for supporting health-related air pollution studies.


    28. Subpart C is revised to read as follows:


    Subpart C—Special Purpose Monitors

    Special purpose monitors (SPM).

    (a) An SPM is defined as any monitor included in an agency's monitoring network that the agency has designated as a special purpose monitor in its annual monitoring network plan and in AQS, and which the agency does not count when showing compliance with the minimum requirements of this subpart for the number and siting of monitors of various types. Any SPM operated by an air monitoring agency must be included in the periodic assessments and annual monitoring network plan required by § 58.10. The plan shall include a statement of purposes for each SPM and evidence that operation of each monitor meets the requirements of appendix A or an approved alternative as provided by § 58.11(a)(2), where applicable. The monitoring agency may designate a monitor as an SPM after January 1, 2007 only if it is a new monitor, i.e., a monitor that is not included in the currently applicable monitoring plan, or, for a monitor included in the monitoring plan prior to January 1, 2007, if the Regional Administrator has approved the discontinuation of the monitor as a SLAMS site.

    (b) Any SPM data collected by an air monitoring agency using a Federal reference method (FRM), Federal equivalent method (FEM), or approved regional method (ARM) must meet the requirements of § 58.11, § 58.12, and appendix A to this part or an approved alternative to appendix A to this part. Compliance with appendix E to this part is optional but encouraged, except when the monitoring agency's data objectives are inconsistent with those requirements. Data collected at an SPM using a FRM, FEM, or ARM meeting the requirements of appendix A must be submitted to AQS according to the requirements of § 58.16. Data collected by other SPMs may be submitted. The monitoring agency must also submit to AQS an indication of whether each SPM monitor reporting data to AQS meets the requirements of appendices A and E to this part.

    (c) All data from an SPM using an FRM, FEM, or ARM which has operated for more than 24 months is eligible for comparison to the relevant NAAQS, subject to the conditions of § 58.30, unless the air monitoring agency demonstrates that the data came from a particular period during which the requirements of appendix A or an approved alternative, appendix C, or appendix E were not met in practice.

    (d) If an SPM using an FRM, FEM, or ARM is discontinued within 24 months of start-up, the Administrator will not base a NAAQS violation determination for the PM2.5 or ozone NAAQS solely on data from the SPM.

    (e) If an SPM using an FRM, FEM, or ARM is discontinued within 24 months of start-up, the Administrator will not designate an area as nonattainment for the CO, SO2, NO2, Pb, or 24-hour PM10 NAAQS solely on the basis of data from the SPM. Such data are eligible for use in determinations of whether a nonattainment area has attained one of these NAAQS.

    (f) Prior approval from EPA is not required for discontinuance of an SPM.


    29. Subpart D is revised to read as follows:


    Subpart D—Comparability of Ambient Data to NAAQS

    § 58.30 Special considerations for data comparisons to the NAAQS.

    (a) Comparability of PM2.5 data. (1) There are two forms of the PM2.5 NAAQS described in part 50 of this chapter. The PM2.5 monitoring site characteristics (see appendix D to this part, section 4.7.1) affect how the resulting PM2.5 data can be compared to the annual PM2.5 NAAQS form. PM2.5 data that are representative not of areawide air quality but rather of relatively unique population-oriented microscale, localized hot spot, or unique population-oriented middle-scale impact sites are eligible for comparison only to the 24-hour PM2.5 NAAQS. For example, if the PM2.5 monitoring site is adjacent to a unique dominating local PM2.5 source, or can be shown to have average 24-hour concentrations representative of a smaller than neighborhood spatial scale, then data from a monitor at the site would be eligible for comparison only to the 24-hour PM2.5 NAAQS.

    (2) There are cases where certain population-oriented microscale or middle scale PM2.5 monitoring sites are determined by the Regional Administrator to collectively identify a larger region of localized high ambient PM2.5 concentrations. In those cases, data from these population-oriented sites would be eligible for comparison to the annual PM2.5 NAAQS.

    (b) [Reserved]

    Subpart E—[Removed and Reserved]


    30. Subpart E of part 58 is removed and reserved.


    Subpart F—[Amended]


    31. Section 58.50 is revised to read as follows:

    § 58.50 Index reporting.

    (a) The State, or where applicable, local agency shall report to the general public on a daily basis, through prominent notice, an air quality index that complies with the requirements of appendix G to this part.

    (b) Reporting is required for each individual MSA with a population exceeding 350,000.

    (c) The population of an MSA for purposes of index reporting is the most recent decennial U.S. census population.


    Subpart G—[Amended]


    32. Sections 58.60 and 58.61 are revised to read as follows:

    § 58.60 Federal monitoring.

    The Administrator may locate and operate an ambient air monitoring site if the State or local agency fails to locate, or schedule to be located, during the initial network design process, or as a result of the 5-year network assessments required in § 58.10, a SLAMS station at a site which is necessary in the judgment of the Regional Administrator to meet the objectives defined in appendix D to this part.

    § 58.61 Monitoring other pollutants.

    The Administrator may promulgate criteria similar to those referenced in subpart B of this part for monitoring a pollutant for which an NAAQS does not exist. Such action would be taken whenever the Administrator determines that a nationwide monitoring program is necessary to monitor such a pollutant.


    33. Appendix A to part 58 is revised to read as follows:


    Appendix A to Part 58—Quality Assurance Requirements for SLAMS, SPMs and PSD Air Monitoring

    1. General Information

    2. Quality System Requirements

    3. Measurement Quality Check Requirements

    4. Calculations for Data Quality Assessments

    5. Reporting Requirements

    6. References

    1. General Information

    This appendix specifies the minimum quality system requirements applicable to SLAMS air monitoring data and PSD data for the pollutants SO2, NO2, O3, CO, PM2.5, PM10 and PM10−2.5 submitted to EPA. This appendix also applies to all SPM stations using FRM, FEM, or ARM methods which also meet the requirements of appendix E of this part. Monitoring organizations are encouraged, based on their quality objectives, to develop and maintain quality systems more extensive than the required minimums. The permit-granting authority for PSD may impose more frequent or more stringent requirements. Additional guidance for the requirements reflected in this appendix can be found in the “Quality Assurance Handbook for Air Pollution Measurement Systems,” volume II, part 1 (see reference 10 of this appendix) and, at a national level, in references 1, 2, and 3 of this appendix.

    1.1 Similarities and Differences Between SLAMS and PSD Monitoring. In most cases, the quality assurance requirements for SLAMS, SPMs if applicable, and PSD are the same. Affected SPMs are subject to all the SLAMS requirements, even where not specifically stated in each section. Table A-1 of this appendix summarizes the major similarities and differences of the requirements for SLAMS and PSD. Both programs require:

    (a) The development, documentation, and implementation of an approved quality system;

    (b) The assessment of data quality;

    (c) The use of reference, equivalent, or approved methods. The requirements of this appendix do not apply to a SPM that does not use a FRM, FEM, or ARM;

    (d) The use of calibration standards traceable to NIST or other primary standard;

    (e) Performance evaluations and systems audits.

    1.1.1 The monitoring and quality assurance responsibilities for SLAMS are with the State or local agency, hereafter called the monitoring organization, whereas for PSD they are with the owner/operator seeking the permit. The monitoring duration for SLAMS is indefinite, whereas for PSD the duration is usually 12 months. Whereas the reporting period for precision and accuracy data is on an annual or calendar quarter basis for SLAMS, it is on a continuing sampler quarter basis for PSD, since the monitoring may not commence at the beginning of a calendar quarter.

    1.1.2 The annual performance evaluations (described in section 3.2.2 of this appendix) for PSD must be conducted by personnel different from those who perform routine span checks and calibrations, whereas for SLAMS, it is the preferred but not the required condition. For PSD, the evaluation rate is 100 percent of the sites per reporting quarter whereas for SLAMS it is 25 percent of the sites or instruments quarterly. Monitoring for sulfur dioxide (SO2) and nitrogen dioxide (NO2) for PSD must be done with automated analyzers—the manual bubbler methods are not permitted.

    1.1.3 The requirements for precision assessment for the automated methods are the same for both SLAMS and PSD. However, for manual methods, only one collocated site is required for PSD.

    1.1.4 The precision, accuracy and bias data for PSD are reported separately for each sampler (site), whereas for SLAMS, the report may be by sampler (site), by primary quality assurance organization, or nationally, depending on the pollutant. SLAMS data are required to be reported to the AQS; PSD data are required to be reported to the permit-granting authority. Unless directly specified otherwise in a particular section, the requirements in this appendix, with the exception of the differences discussed in this section and in Table A-1 of this appendix, are expected to be followed by both SLAMS and PSD networks.

    1.2 Measurement Uncertainty. Measurement uncertainty is a term used to describe deviations from a true concentration or estimate that are related to the measurement process and not to spatial or temporal population attributes of the air being measured. Monitoring organizations must develop quality assurance project plans (QAPP) which describe how the organization intends to control measurement uncertainty to an appropriate level in order to achieve the objectives for which the data are collected. The process by which one determines the quality of data needed to meet the monitoring objective is sometimes referred to as the Data Quality Objectives Process. Data quality indicators associated with measurement uncertainty include:

    (a) Precision. A measurement of mutual agreement among individual measurements of the same property usually under prescribed similar conditions, expressed generally in terms of the standard deviation.

    (b) Bias. The systematic or persistent distortion of a measurement process which causes errors in one direction.

    (c) Accuracy. The degree of agreement between an observed value and an accepted reference value. Accuracy includes a combination of random error (imprecision) and systematic error (bias) components which are due to sampling and analytical operations.

    (d) Completeness. A measure of the amount of valid data obtained from a measurement system compared to the amount that was expected to be obtained under correct, normal conditions.

    (e) Detectability. The low critical range value of a characteristic that a method-specific procedure can reliably discern.

    1.3 Measurement Quality Checks. The SLAMS measurement quality checks described in sections 3.2 and 3.3 of this appendix shall be reported to AQS and are included in the data required for certification. The PSD network is required to implement the measurement quality checks and submit this information quarterly along with assessment information to the permit-granting authority.

    1.4 Assessments and Reports. Periodic assessments and documentation of data quality are required to be reported to EPA or to the permit granting authority (PSD). To provide national uniformity in this assessment and reporting of data quality for all networks, specific assessment and reporting procedures are prescribed in detail in sections 3, 4, and 5 of this appendix. On the other hand, the selection and extent of the quality assurance and quality control activities used by a monitoring organization depend on a number of local factors such as field and laboratory conditions, the objectives for monitoring, the level of data quality needed, the expertise of assigned personnel, the cost of control procedures, pollutant concentration levels, etc. Therefore, quality system requirements in section 2 of this appendix are specified in general terms to allow each monitoring organization to develop a quality system that is most efficient and effective for its own circumstances while achieving the data quality objectives required for the SLAMS sites.

    2. Quality System Requirements

    A quality system is the means by which an organization manages the quality of the monitoring information it produces in a systematic, organized manner. It provides a framework for planning, implementing, assessing and reporting work performed by an organization and for carrying out required quality assurance and quality control activities.

    2.1 Quality Management Plans and Quality Assurance Project Plans. All monitoring organizations must develop a quality system that is described and approved in quality management plans (QMP) and quality assurance project plans (QAPP) to ensure that the monitoring results:

    (a) Meet a well-defined need, use, or purpose;

    (b) Provide data of adequate quality for the intended monitoring objectives;

    (c) Satisfy stakeholder expectations;

    (d) Comply with applicable standards specifications;

    (e) Comply with statutory (and other) requirements of society; and

    (f) Reflect consideration of cost and economics.

    2.1.1 The QMP describes the quality system in terms of the organizational structure, functional responsibilities of management and staff, lines of authority, and required interfaces for those planning, implementing, assessing and reporting activities involving environmental data operations (EDO). The QMP must be suitably documented in accordance with EPA requirements (reference 2 of this appendix) and approved by the appropriate Regional Administrator, or his or her representative. The quality system will be reviewed during the systems audits described in section 2.5 of this appendix. Organizations that implement long-term monitoring programs with EPA funds should have a separate QMP document. Smaller organizations or organizations that do infrequent work with EPA funds may combine the QMP with the QAPP based on negotiations with the funding agency. Additional guidance on this process can be found in reference 10 of this appendix. Approval of the recipient's QMP by the appropriate Regional Administrator, or his or her representative, may allow delegation of the authority to review and approve the QAPP to the recipient, based on the adequacy of the quality assurance procedures described and documented in the QMP. The QAPP will be reviewed by EPA during systems audits or under circumstances related to data quality.

    2.1.2 The QAPP is a formal document describing, in sufficient detail, the quality system that must be implemented to ensure that the results of work performed will satisfy the stated objectives. The quality assurance policy of the EPA requires every environmental data operation (EDO) to have a written and approved QAPP prior to the start of the EDO. It is the responsibility of the monitoring organization to adhere to this policy. The QAPP must be suitably documented in accordance with EPA requirements (reference 3 of this appendix).

    2.1.3 The monitoring organization's quality system must have adequate resources both in personnel and funding to plan, implement, assess and report on the achievement of the requirements of this appendix and its approved QAPP.

    2.2 Independence of Quality Assurance. The monitoring organization must provide for a quality assurance management function: that aspect of the overall management system of the organization that determines and implements the quality policy defined in a monitoring organization's QMP. Quality management includes strategic planning, allocation of resources, and other systematic planning activities (e.g., planning, implementing, assessing, and reporting) pertaining to the quality system. The quality assurance management function must have sufficient technical expertise and management authority to conduct independent oversight and assure the implementation of the organization's quality system relative to the ambient air quality monitoring program, and should be organizationally independent of environmental data generation activities.

    2.3 Data Quality Performance Requirements.

    2.3.1 Data Quality Objectives. Data quality objectives (DQO) or the results of other systematic planning processes are statements that define the appropriate type of data to collect and specify the tolerable levels of potential decision errors that will be used as a basis for establishing the quality and quantity of data needed to support the objectives of the SLAMS stations. DQO will be developed by EPA to support the primary SLAMS objectives for each criteria pollutant. As they are developed they will be added to the regulation. DQO or the results of other systematic planning processes for PSD or other monitoring will be the responsibility of the monitoring organizations. The quality of the conclusions made from data interpretation can be affected by population uncertainty (spatial or temporal uncertainty) and measurement uncertainty (uncertainty associated with collecting, analyzing, reducing and reporting concentration data). This appendix focuses on assessing and controlling measurement uncertainty.

    2.3.1.1 Measurement Uncertainty for Automated and Manual PM2.5 Methods. The goal for acceptable measurement uncertainty is defined as 10 percent coefficient of variation (CV) for total precision and plus or minus 10 percent for total bias.

    2.3.1.2 Measurement Uncertainty for Automated Ozone Methods. The goal for acceptable measurement uncertainty is defined for precision as an upper 90 percent confidence limit for the coefficient of variation (CV) of 7 percent and for bias as an upper 95 percent confidence limit for the absolute bias of 7 percent.

    2.3.1.3 Measurement Uncertainty for PM10-2.5 Methods. The goal for acceptable measurement uncertainty is defined for precision as an upper 90 percent confidence limit for the coefficient of variation (CV) of 15 percent and for bias as an upper 95 percent confidence limit for the absolute bias of 15 percent.
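    As one hedged illustration of how a goal stated as an upper 90 percent confidence limit for the CV can be checked from collocated measurements, the sketch below uses relative percent differences and a chi-squared scaling in the spirit of the assessment statistics detailed in section 4 of this appendix; the function itself is illustrative, not the regulatory equation:

    import math
    from scipy.stats import chi2

    def cv_upper_limit(primary, collocated, alpha=0.10):
        """90 percent upper confidence limit for the coefficient of variation
        of paired collocated measurements (illustrative sketch only)."""
        # Relative percent difference of each collocated pair.
        d = [200.0 * (y - x) / (y + x) for x, y in zip(primary, collocated)]
        n = len(d)
        sum_d = sum(d)
        sum_d2 = sum(di * di for di in d)
        cv = math.sqrt((n * sum_d2 - sum_d ** 2) / (2.0 * n * (n - 1)))
        # Scale by the chi-squared factor to obtain the upper confidence bound.
        return cv * math.sqrt((n - 1) / chi2.ppf(alpha, n - 1))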

    2.4 National Performance Evaluation Programs. Monitoring plans or the QAPP shall provide for the implementation of a program of independent and adequate audits of all monitors providing data for SLAMS and PSD, including the provision of adequate resources for such audit programs. A monitoring plan (or QAPP) which provides for monitoring organization participation in EPA's National Performance Audit Program (NPAP) and the PM Performance Evaluation Program (PEP), and which indicates the consent of the monitoring organization for EPA to apply an appropriate portion of the grant funds, which EPA would otherwise award to the monitoring organization for monitoring activities, will be deemed by EPA to meet this requirement. For clarification and to participate, monitoring organizations should contact either the appropriate EPA Regional Quality Assurance (QA) Coordinator at the appropriate EPA Regional Office location, or the NPAP Coordinator, Emissions Monitoring and Analysis Division (D205-02), U.S. Environmental Protection Agency, Research Triangle Park, NC 27711.

    2.5 Technical Systems Audit Program. Technical systems audits of each ambient air monitoring organization shall be conducted at least every 3 years by the appropriate EPA Regional Office and reported to the AQS. Systems audit programs are described in reference 10 of this appendix. For further instructions, monitoring organizations should contact the appropriate EPA Regional QA Coordinator.

    2.6 Gaseous and Flow Rate Audit Standards.

    2.6.1 Gaseous pollutant concentration standards (permeation devices or cylinders of compressed gas) used to obtain test concentrations for carbon monoxide (CO), sulfur dioxide (SO2), nitrogen oxide (NO), and nitrogen dioxide (NO2) must be traceable to either a National Institute of Standards and Technology (NIST) Traceable Reference Material (NTRM) or a NIST-certified Gas Manufacturer's Internal Standard (GMIS), certified in accordance with one of the procedures given in reference 4 of this appendix. Vendors advertising certification with the procedures provided in reference 4 of this appendix and distributing gases as “EPA Protocol Gas” must participate in the EPA Protocol Gas Verification Program or not use “EPA” in any form of advertising.

    2.6.2 Test concentrations for ozone (O3) must be obtained in accordance with the ultraviolet photometric calibration procedure specified in appendix D to part 50 of this chapter, or by means of a certified O3 transfer standard. Consult references 7 and 8 of this appendix for guidance on primary and transfer standards for O3.

    2.6.3 Flow rate measurements must be made by a flow measuring instrument that is traceable to an authoritative volume or other applicable standard. Guidance for certifying some types of flowmeters is provided in reference 10 of this appendix.

    2.7 Primary Requirements and Guidance. Requirements and guidance documents for developing the quality system are contained in references 1 through 10 of this appendix, which also contain many suggested procedures, checks, and control specifications. Reference 10 of this appendix describes specific guidance for the development of a quality system for SLAMS. Many specific quality control checks and specifications for methods are included in the respective reference methods described in part 50 of this chapter or in the respective equivalent method descriptions available from EPA (reference 6 of this appendix). Similarly, quality control procedures related to specifically designated reference and equivalent method analyzers are contained in the respective operation or instruction manuals associated with those analyzers.

    3. Measurement Quality Check Requirements

    This section provides the requirements for primary quality assurance organizations (PQAOs) to perform the measurement quality checks that can be used to assess data quality. With the exception of the flow rate verifications (sections 3.2.3 and 3.3.2 of this appendix), data from these checks are required to be submitted to the AQS within the same time frame as routine ambient concentration data. Section 3.2 of this appendix describes checks of automated or continuous instruments, while section 3.3 describes checks associated with manual sampling instruments. Other quality control samples are identified in the various references described earlier and can be used to control certain aspects of the measurement system.

    3.1 Primary Quality Assurance Organization. A primary quality assurance organization is defined as a monitoring organization or a coordinated aggregation of such organizations that is responsible for a set of stations that monitors the same pollutant and for which data quality assessments can logically be pooled. Each criteria pollutant sampler/monitor at a monitoring station in the SLAMS network must be associated with one, and only one, primary quality assurance organization.

    3.1.1 Each primary quality assurance organization shall be defined such that measurement uncertainty among all stations in the organization can be expected to be reasonably homogeneous, as a result of common factors. Common factors that should be considered by monitoring organizations in defining primary quality assurance organizations include:

    (a) Operation by a common team of field operators according to a common set of procedures;

    (b) Use of a common QAPP or standard operating procedures;

    (c) Common calibration facilities and standards;

    (d) Oversight by a common quality assurance organization; and

    (e) Support by a common management, laboratory or headquarters.

    3.1.2 Primary quality assurance organizations are not necessarily related to the organization reporting data to the AQS. Monitoring organizations having difficulty in defining the primary quality assurance organizations or in assigning specific sites to primary quality assurance organizations should consult with the appropriate EPA Regional Office. All definitions of primary quality assurance organizations shall be subject to final approval by the appropriate EPA Regional Office during scheduled network reviews or systems audits.

    3.1.3 Data quality assessment results shall be reported as specified in section 5 of this appendix.

    3.2 Measurement Quality Checks of Automated Methods. Table A-2 of this appendix provides a summary of the types and frequency of the measurement quality checks that will be described in this section.

    3.2.1 One-Point Quality Control Check for SO2, NO2, O3, and CO. A one-point quality control (QC) check must be performed at least once every 2 weeks on each automated analyzer used to measure SO2, NO2, O3 and CO. The frequency of QC checks may be reduced based upon review, assessment and approval of the EPA Regional Administrator. However, with the advent of automated calibration systems more frequent checking is encouraged. See Reference 10 of this appendix for guidance on the review procedure. The QC check is made by challenging the analyzer with a QC check gas of known concentration (effective concentration for open path analyzers) between 0.01 and 0.10 parts per million (ppm) for SO2, NO2, and O3, and between 1 and 10 ppm for CO analyzers. The ranges allow for appropriate check gas selection for SLAMS sites that may be sampling for different objectives, i.e., trace gas monitoring vs. comparison to National Ambient Air Quality Standards (NAAQS). The QC check gas concentration selected should be related to the routine concentrations normally measured at sites within the monitoring network in order to appropriately reflect the precision and bias at these routine concentration ranges. To check the precision and bias of SLAMS analyzers operating at ranges either above or below the levels identified, use check gases of appropriate concentrations as approved by the appropriate EPA Regional Administrator or their designee. The standards from which check concentrations are obtained must meet the specifications of section 2.6 of this appendix.

    3.2.1.1 Except for certain CO analyzers described below, point analyzers must operate in their normal sampling mode during the QC check, and the test atmosphere must pass through all filters, scrubbers, conditioners and other components used during normal ambient sampling and as much of the ambient air inlet system as is practicable. If permitted by the associated operation or instruction manual, a CO point analyzer may be temporarily modified during the QC check to reduce vent or purge flows, or the test atmosphere may enter the analyzer at a point other than the normal sample inlet, provided that the analyzer's response is not likely to be altered by these deviations from the normal operational mode. If a QC check is made in conjunction with a zero or span adjustment, it must be made prior to such zero or span adjustments.

    3.2.1.2 Open path analyzers are tested by inserting a test cell containing a QC check gas concentration into the optical measurement beam of the instrument. If possible, the normally used transmitter, receiver, and as appropriate, reflecting devices should be used during the test and the normal monitoring configuration of the instrument should be altered as little as possible to accommodate the test cell for the test. However, if permitted by the associated operation or instruction manual, an alternate local light source or an alternate optical path that does not include the normal atmospheric monitoring path may be used. The actual concentration of the QC check gas in the test cell must be selected to produce an effective concentration in the range specified earlier in this section. Generally, the QC test concentration measurement will be the sum of the atmospheric pollutant concentration and the QC test concentration. If so, the result must be corrected to remove the atmospheric concentration contribution. The corrected concentration is obtained by subtracting the average of the atmospheric concentrations measured by the open path instrument under test immediately before and immediately after the QC test from the QC check gas concentration measurement. If the difference between these before and after measurements is greater than 20 percent of the effective concentration of the test gas, discard the test result and repeat the test. If possible, open path analyzers should be tested during periods when the atmospheric pollutant concentrations are relatively low and steady.

    3.2.1.3 Report the audit concentration (effective concentration for open path analyzers) of the QC gas and the corresponding measured concentration (corrected concentration, if applicable, for open path analyzers) indicated by the analyzer. The percent differences between these concentrations are used to assess the precision and bias of the monitoring data as described in sections 4.1.2 (precision) and 4.1.3 (bias) of this appendix.

    3.2.2 Annual performance evaluation for SO2, NO2, O3, or CO. Each calendar quarter (during which analyzers are operated), evaluate at least 25 percent of the SLAMS analyzers that monitor for SO2, NO2, O3, or CO such that each analyzer is evaluated at least once per year. If there are fewer than four analyzers for a pollutant within a primary quality assurance organization, it is suggested that one or more analyzers be randomly selected so that at least one analyzer for that pollutant is evaluated each calendar quarter. The evaluation should be conducted by a trained, experienced technician other than the routine site operator.

    3.2.2.1 (a) The evaluation is made by challenging the analyzer with an audit gas standard of known concentration (effective concentration for open path analyzers) from at least three consecutive audit levels. The audit levels selected should represent or bracket 80 percent of the ambient concentrations measured by the analyzer being evaluated:

    Concentration range, ppm

    Audit level | O3        | SO2          | NO2          | CO
    1           | 0.02-0.05 | 0.0003-0.005 | 0.0002-0.002 | 0.08-0.10
    2           | 0.06-0.10 | 0.006-0.01   | 0.003-0.005  | 0.50-1.00
    3           | 0.11-0.20 | 0.02-0.10    | 0.006-0.10   | 1.50-4.00
    4           | 0.21-0.30 | 0.11-0.40    | 0.11-0.30    | 5-15
    5           | 0.31-0.90 | 0.41-0.90    | 0.31-0.60    | 20-50

    (b) An additional fourth level is encouraged for monitors that have the potential to exceed the concentration ranges covered by the three levels initially selected.
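    For illustration, the level selection might be sketched in Python as below. The level bounds are transcribed from the table above; the selection rule (maximize the number of ambient observations covered by three consecutive levels) is a heuristic reading of "represent or bracket 80 percent of ambient concentrations," not a requirement of the rule, and the function name is hypothetical.

```python
# Audit level bounds (ppm), transcribed from the table in section 3.2.2.1.
LEVELS = {
    "O3":  [(0.02, 0.05), (0.06, 0.10), (0.11, 0.20), (0.21, 0.30), (0.31, 0.90)],
    "SO2": [(0.0003, 0.005), (0.006, 0.01), (0.02, 0.10), (0.11, 0.40), (0.41, 0.90)],
    "NO2": [(0.0002, 0.002), (0.003, 0.005), (0.006, 0.10), (0.11, 0.30), (0.31, 0.60)],
    "CO":  [(0.08, 0.10), (0.50, 1.00), (1.50, 4.00), (5.0, 15.0), (20.0, 50.0)],
}

def pick_audit_levels(pollutant, ambient_ppm):
    """Heuristic: choose three consecutive audit levels whose combined span
    covers the most ambient observations at the site."""
    bounds = LEVELS[pollutant]

    def covered(start):
        # Span from the low bound of the first level in the window to the
        # high bound of the third level in the window.
        lo, hi = bounds[start][0], bounds[start + 2][1]
        return sum(lo <= c <= hi for c in ambient_ppm)

    best = max(range(len(bounds) - 2), key=covered)
    return [best + 1, best + 2, best + 3]   # 1-based audit level numbers

# Example: an O3 site with values mostly between 0.03 and 0.09 ppm.
print(pick_audit_levels("O3", [0.03, 0.045, 0.06, 0.075, 0.09]))  # [1, 2, 3]
```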

    3.2.2.2 (a) NO2 audit gas for chemiluminescence-type NO2 analyzers must also contain at least 0.08 ppm NO. NO concentrations substantially higher than 0.08 ppm, as may occur when using some gas phase titration (GPT) techniques, may lead to evaluation errors in chemiluminescence analyzers due to inevitable minor NO-NOX channel imbalance. Such errors may be atypical of routine monitoring errors to the extent that such NO concentrations exceed typical ambient NO concentrations at the site. These errors may be minimized by modifying the GPT technique to lower the NO concentrations remaining in the NO2 audit gas to levels closer to typical ambient NO concentrations at the site.

    (b) To evaluate SLAMS analyzers operating on ranges higher than 0 to 1.0 ppm for SO2, NO2, and O3 or 0 to 50 ppm for CO, use audit gases of appropriately higher concentration as approved by the appropriate EPA Regional Administrator or the Administrator's designee.

    3.2.2.3 The standards from which audit gas test concentrations are obtained must meet the specifications of section 2.6 of this appendix. The gas standards and equipment used for evaluations must not be the same as the standards and equipment used for calibration or calibration span adjustments. For SLAMS sites, the auditor should not be the operator or analyst who conducts the routine monitoring, calibration, and analysis. For PSD sites the auditor must not be the operator or analyst who conducts the routine monitoring, calibration, and analysis.

    3.2.2.4 For point analyzers, the evaluation shall be carried out by allowing the analyzer to analyze the audit gas test atmosphere in its normal sampling mode such that the test atmosphere passes through all filters, scrubbers, conditioners, and other sample inlet components used during normal ambient sampling and as much of the ambient air inlet system as is practicable. The exception provided in section 3.2.1 of this appendix for certain CO analyzers does not apply for evaluations.

    3.2.2.5 Open path analyzers are evaluated by inserting a test cell containing the various audit gas concentrations into the optical measurement beam of the instrument. If possible, the normally used transmitter, receiver, and, as appropriate, reflecting devices should be used during the evaluation, and the normal monitoring configuration of the instrument should be modified as little as possible to accommodate the test cell for the evaluation. However, if permitted by the associated operation or instruction manual, an alternate local light source or an alternate optical path that does not include the normal atmospheric monitoring path may be used. The actual concentrations of the audit gas in the test cell must be selected to produce effective concentrations in the evaluation level ranges specified in this section of this appendix. Generally, each evaluation concentration measurement result will be the sum of the atmospheric pollutant concentration and the evaluation test concentration. If so, the result must be corrected to remove the atmospheric concentration contribution. The corrected concentration is obtained by subtracting the average of the atmospheric concentrations measured by the open path instrument under test immediately before and immediately after the evaluation test (or preferably before and after each evaluation concentration level) from the evaluation concentration measurement. If the difference between the before and after measurements is greater than 20 percent of the effective concentration of the test gas standard, discard the test result for that concentration level and repeat the test for that level. If possible, open path analyzers should be evaluated during periods when the atmospheric pollutant concentrations are relatively low and steady. Also, if the open path instrument is not installed in a permanent manner, the monitoring path length must be reverified to within plus or minus 3 percent to validate the evaluation, since the monitoring path length is critical to the determination of the effective concentration.

    3.2.2.6  Report both the evaluation concentrations (effective concentrations for open path analyzers) of the audit gases and the corresponding measured concentration (corrected concentrations, if applicable, for open path analyzers) indicated or produced by the analyzer being tested. The percent differences between these concentrations are used to assess the quality of the monitoring data as described in section 4.1.4 of this appendix.

    3.2.3 Flow Rate Verification for Particulate Matter. A one-point flow rate verification check must be performed at least once every month on each automated analyzer used to measure PM10, PM10−2.5 and PM2.5. The verification is made by checking the operational flow rate of the analyzer. If the verification is made in conjunction with a flow rate adjustment, it must be made prior to such flow rate adjustment. Randomization of the flow rate verification with respect to time of day, day of week, and routine service and adjustments is encouraged where possible. For the standard procedure, use a flow rate transfer standard certified in accordance with section 2.6 of this appendix to check the analyzer's normal flow rate. Care should be used in selecting and using the flow rate measurement device such that it does not alter the normal operating flow rate of the analyzer. Report the flow rate of the transfer standard and the corresponding flow rate measured (indicated) by the analyzer. The percent differences between the audit and measured flow rates are used to assess the bias of the monitoring data as described in section 4.2.2 of this appendix (using flow rates in lieu of concentrations).

    3.2.4  Semi-Annual Flow Rate Audit for Particulate Matter. Every 6 months, audit the flow rate of the PM10, PM10−2.5 and PM2.5 particulate analyzers. Where possible, EPA strongly encourages more frequent auditing. The audit should (preferably) be conducted by a trained experienced technician other than the routine site operator. The audit is made by measuring the analyzer's normal operating flow rate using a flow rate transfer standard certified in accordance with section 2.6 of this appendix. The flow rate standard used for auditing must not be the same flow rate standard used to calibrate the analyzer. However, both the calibration standard and the audit standard may be referenced to the same primary flow rate or volume standard. Great care must be used in auditing the flow rate to be certain that the flow measurement device does not alter the normal operating flow rate of the analyzer. Report the audit flow rate of the transfer standard and the corresponding flow rate measured (indicated) by the analyzer. The percent differences between these flow rates are used to validate the one-point flow rate verification checks used to estimate bias as described in section 4.2.3 of this appendix.

    3.2.5 Collocated Sampling Procedures for PM2.5. For each pair of collocated monitors, designate one sampler as the primary monitor whose concentrations will be used to report air quality for the site, and designate the other as the audit monitor.

    3.2.5.1 Each EPA designated Federal reference method (FRM) or Federal equivalent method (FEM) within a primary quality assurance organization must:

    (a) Have 15 percent of the monitors collocated (values of 0.5 and greater round up); and

    (b) Have at least 1 collocated monitor (if the total number of monitors is less than 3). The first collocated monitor must be a designated FRM monitor.

    3.2.5.2 In addition, monitors selected for collocation must also meet the following requirements:

    (a) A primary monitor designated as an EPA FRM shall be collocated with an audit monitor having the same EPA FRM method designation.

    (b) For each primary monitor model designated as an EPA FEM used by the PQAO, 50 percent of the monitors designated for collocation shall be collocated with an audit monitor having the same method designation and 50 percent of the monitors shall be collocated with an FRM audit monitor. If the primary quality assurance organization only has one FEM monitor it shall be collocated with an FRM audit monitor. If there are an odd number of collocated monitors required, the additional monitor shall be an FRM audit monitor. An example of this procedure is found in Table A-3 of this appendix.
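    The collocation arithmetic of sections 3.2.5.1 and 3.2.5.2 can be sketched in Python as follows; the function name and dictionary layout are hypothetical, and the example reproduces Table A-3 of this appendix (54 monitors: 20 FRM, 20 FEM A, 2 FEM C, 12 FEM D).

```python
import math

def pm25_collocation_plan(counts_by_method):
    """Sketch of the section 3.2.5 collocation counts for one PQAO.

    counts_by_method maps a method designation (e.g. "FRM", "FEM (A)") to
    the number of primary monitors of that designation. Returns, per
    designation, the number collocated and the FRM / same-designation split
    of the audit monitors.
    """
    plan = {}
    for method, n in counts_by_method.items():
        # 15 percent, with fractional values of 0.5 and greater rounding up.
        n_colloc = math.floor(0.15 * n + 0.5)
        # At least one collocated monitor when fewer than 3 monitors exist
        # (section 3.2.5.1(b), read literally).
        if n < 3:
            n_colloc = max(n_colloc, 1)
        if method == "FRM":
            # FRM primaries get audit monitors of the same FRM designation.
            plan[method] = {"collocated": n_colloc, "frm_audits": n_colloc,
                            "same_designation_audits": 0}
        else:
            # FEMs: 50/50 split; an odd remainder goes to the FRM audit.
            same = n_colloc // 2
            plan[method] = {"collocated": n_colloc,
                            "frm_audits": n_colloc - same,
                            "same_designation_audits": same}
    return plan

print(pm25_collocation_plan({"FRM": 20, "FEM (A)": 20, "FEM (C)": 2, "FEM (D)": 12}))
```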

    3.2.5.3 The collocated monitors should be deployed according to the following protocol:

    (a) 80 percent of the collocated audit monitors should be deployed at sites with annual average or daily concentrations estimated to be within ±20 percent of the applicable NAAQS and the remainder at what the monitoring organizations designate as high value sites;

    (b) If an organization has no sites with annual average or daily concentrations within ±20 percent of the annual NAAQS (or the 24-hour NAAQS if that is the standard affecting the area), 60 percent of the collocated audit monitors should be deployed at sites whose annual mean concentrations (or 24-hour concentrations, as applicable) are among the highest 25 percent of all sites in the network.

    3.2.5.4 In determining the number of collocated sites required for PM2.5, monitoring networks for visibility assessments should not be treated independently from networks for particulate matter, as the separate networks may share one or more common samplers. However, for Class I visibility areas, EPA will accept visibility aerosol mass measurement instead of a PM2.5 measurement if the latter measurement is unavailable. Any PM2.5 monitoring site which does not have a monitor which is an EPA FRM, FEM or ARM is not required to be included in the number of sites which are used to determine the number of collocated monitors.

    3.2.5.5  For each PSD monitoring network, one site must be collocated. A site with the predicted highest 24-hour pollutant concentration must be selected.

    3.2.5.6  The two collocated monitors must be within 4 meters of each other and at least 2 meters apart for flow rates greater than 200 liters/min or at least 1 meter apart for samplers having flow rates less than 200 liters/min to preclude airflow interference. Calibration, sampling, and analysis must be the same for both collocated samplers and the same as for all other samplers in the network.

    3.2.5.7  Sample the collocated audit monitor for SLAMS sites on a 12-day schedule; sample PSD sites on a 6-day schedule or every third day for PSD daily monitors. If a primary quality assurance organization has only one collocated monitor, higher sampling frequencies than the 12-day schedule may be needed in order to produce about 25 valid sample pairs a year. Report the measurements from both primary and collocated audit monitors at each collocated sampling site. The calculations for evaluating precision between the two collocated monitors are described in section 4.3.1 of this appendix.

    3.2.6 Collocated Sampling Procedures for PM10−2.5. For the PM10−2.5 network, all automated methods must be designated as Federal equivalent methods (FEMs). For each pair of collocated monitors, designate one sampler as the primary monitor whose concentrations will be used to report air quality for the site, and designate the other as the audit monitor.

    3.2.6.1 The EPA shall ensure that each EPA designated FEM within the national PM10−2.5 monitoring network must:

    (a) Have 15 percent of the monitors collocated (values of 0.5 and greater round up); and

    (b) Have at least 2 collocated monitors (if the total number of monitors is less than 10). The first collocated monitor must be a designated FRM monitor and the second must be a monitor of the same method designation. Both collocated FRM and FEM monitors can be located at the same site.

    3.2.6.2 The Regional Administrator for the EPA Regions where the FEMs are implemented will select the sites for collocated monitoring. The site selection process shall consider giving priority to sites at primary quality assurance organizations or States with more than one PM10−2.5 site, sites considered important from a regional perspective, and sites needed for an appropriate distribution among rural and urban NCore sites. Depending on the speed at which the PM10−2.5 network is deployed, the first sites implementing FEMs shall be required to perform collocation until there is a larger distribution of FEM monitors implemented in the network.

    3.2.6.3 The two collocated monitors must be within 4 meters of each other and at least 2 meters apart for flow rates greater than 200 liters/min or at least 1 meter apart for samplers having flow rates less than 200 liters/min to preclude airflow interference. Calibration, sampling, and analysis must be the same for both collocated samplers and the same as for all other samplers in the network.

    3.2.6.4 Sample the collocated audit monitor for SLAMS sites on a 12-day schedule. Report the measurements from both primary and collocated audit monitors at each collocated sampling site. The calculations for evaluating precision between the two collocated monitors are described in section 4.3.1 of this appendix.

    3.2.7 PM2.5 Performance Evaluation Program (PEP) Procedures. (a) The PEP is an independent assessment used to estimate total measurement system bias. These evaluations will be performed under the PM Performance Evaluation Program (section 2.4 of this appendix) or a comparable program. Performance evaluations will be performed annually on the SLAMS monitors within each primary quality assurance organization. For primary quality assurance organizations with five or fewer monitoring sites, five valid performance evaluation audits must be collected and reported each year. For primary quality assurance organizations with more than five monitoring sites, eight valid performance evaluation audits must be collected and reported each year. A valid performance evaluation audit means that both the primary monitor and PEP audit concentrations are valid and above 3 μg/m3. Additionally, each year, every designated FRM or FEM within a primary quality assurance organization must:

    (1) Have each method designation evaluated each year; and

    (2) Have all FRM or FEM samplers subject to a PEP audit at least once every six years, which equates to approximately 15 percent of the monitoring sites being audited each year.

    (b) Additional information concerning the Performance Evaluation Program is contained in reference 10 of this appendix. The calculations for evaluating bias between the primary monitor and the performance evaluation monitor for PM2.5 are described in section 4.3.2 of this appendix.
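    A minimal sketch of the section 3.2.7 audit counts; the function names are illustrative only.

```python
import math

def pep_audits_required(n_sites):
    """Minimum valid PEP audits a PQAO must report per year: five for PQAOs
    with five or fewer PM2.5 sites, eight for PQAOs with more than five."""
    return 5 if n_sites <= 5 else 8

def samplers_audited_per_year(n_samplers):
    """Every FRM/FEM sampler must see a PEP audit at least once every six
    years, i.e. roughly one-sixth (about 15 percent) audited each year."""
    return math.ceil(n_samplers / 6)

print(pep_audits_required(4), pep_audits_required(12))  # 5 8
print(samplers_audited_per_year(20))                    # 4
```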

    3.2.8 PM10−2.5 Performance Evaluation Program. For the PM10−2.5 network, all automated methods will be designated as Federal equivalent methods (FEMs). One performance evaluation audit, as described in section 3.2.7 of this appendix, must be performed at one PM10−2.5 site in each primary quality assurance organization each year. The calculations for evaluating bias between the primary monitor(s) and the performance evaluation monitors for PM10−2.5 are described in section 4.1.3 of this appendix.

    3.3 Measurement Quality Checks of Manual Methods. Table A-2 of this appendix provides a summary of the types and frequency of the measurement quality checks that will be described in this section.

    3.3.1 Collocated Sampling Procedures for PM10. For each network of manual PM10 methods, select 15 percent (or at least one) of the monitoring sites within the primary quality assurance organization for collocated sampling. For purposes of precision assessment, networks for measuring total suspended particulate (TSP) and PM10 shall be considered separately from one another. However, PM10 samplers used in the PM10−2.5 network may be counted along with the PM10 samplers in the PM10 network as long as the PM10 samplers in both networks are of the same method designation. PM10 and TSP sites having annual mean particulate matter concentrations among the highest 25 percent of the annual mean concentrations for all the sites in the network must be selected or, if such sites are impractical, alternative sites approved by the EPA Regional Administrator may be selected.

    3.3.1.1 In determining the number of collocated sites required for PM10, monitoring networks for lead (Pb) should be treated independently from networks for particulate matter (PM), even though the separate networks may share one or more common samplers. However, a single pair of samplers collocated at a common-sampler monitoring site that meets the requirements for both a collocated Pb site and a collocated PM site may serve as a collocated site for both networks.

    3.3.1.2 The two collocated monitors must be within 4 meters of each other and at least 2 meters apart for flow rates greater than 200 liters/min or at least 1 meter apart for samplers having flow rates less than 200 liters/min to preclude airflow interference. Calibration, sampling, analysis and verification/validation procedures must be the same for both collocated samplers and the same as for all other samplers in the network.

    3.3.1.3 For each pair of collocated samplers, designate one sampler as the primary sampler whose samples will be used to report air quality for the site, and designate the other as the audit sampler. Sample SLAMS sites on a 12-day schedule; sample PSD sites on a 6-day schedule or every third day for PSD daily samplers. If a primary quality assurance organization has only one collocated monitor, higher sampling frequencies than the 12-day schedule may be needed in order to produce approximately 25 valid sample pairs a year. Report the measurements from both samplers at each collocated sampling site. The calculations for evaluating precision between the two collocated samplers are described in section 4.2.1 of this appendix.

    3.3.2 Flow Rate Verification for Particulate Matter. Follow the same procedure as described in section 3.2.3 of this appendix for PM2.5, PM10 (low-volume instruments), and PM10−2.5. High-volume PM10 and TSP instruments may also follow the procedure in section 3.2.3 of this appendix, but the verifications are required to be conducted quarterly. The percent differences between the audit and measured flow rates are used to assess the bias of the monitoring data as described in section 4.2.2 of this appendix.

    3.3.3 Semi-Annual Flow Rate Audit for Particulate Matter. Follow the same procedure as described in section 3.2.4 of this appendix for PM2.5, PM10, PM10−2.5 and TSP instruments. The percent differences between these flow rates are used to validate the one-point flow rate verification checks used to estimate bias as described in section 4.2.3 of this appendix. Great care must be used in auditing high-volume particulate matter samplers having flow regulators because the introduction of resistance plates in the audit flow standard device can cause abnormal flow patterns at the point of flow sensing. For this reason, the flow audit standard should be used with a normal filter in place and without resistance plates in auditing flow-regulated high-volume samplers, or other steps should be taken to assure that flow patterns are not perturbed at the point of flow sensing.

    3.3.4 Pb Methods.

    3.3.4.1 Annual Flow Rate. For the Pb Reference Method (40 CFR part 50, appendix G), the flow rates of the high-volume Pb samplers shall be verified and audited using the same procedures described in sections 3.3.2 and 3.3.3 of this appendix.

    3.3.4.2 Pb Strips. Each calendar quarter or sampling quarter (PSD), audit the Pb Reference Method analytical procedure using glass fiber filter strips containing a known quantity of Pb. These audit sample strips are prepared by depositing a Pb solution on unexposed glass fiber filter strips of dimensions 1.9 centimeters (cm) by 20.3 cm (3/4 inch by 8 inch) and allowing them to dry thoroughly. The audit samples must be prepared using batches of reagents different from those used to calibrate the Pb analytical equipment being audited. Prepare audit samples in the following concentration ranges:

    Range | Pb concentration, μg/strip | Equivalent ambient Pb concentration, μg/m3 ¹
    1     | 100-300                    | 0.5-1.5
    2     | 400-1,000                  | 3.0-5.0

    ¹ Equivalent ambient Pb concentration in μg/m3 is based on sampling at 1.7 m3/min for 24 hours on a 20.3 cm × 25.4 cm (8 inch × 10 inch) glass fiber filter.

    (a) Audit samples must be extracted using the same extraction procedure used for exposed filters.

    (b) Analyze three audit samples in each of the two ranges each quarter samples are analyzed. The audit sample analyses shall be distributed as much as possible over the entire calendar quarter.

    (c) Report the audit concentrations (in μg Pb/strip) and the corresponding measured concentrations (in μg Pb/strip) using AQS unit code 077. The relative percent differences between the concentrations are used to calculate analytical accuracy as described in section 4.4.2 of this appendix.

    (d) The audits of an equivalent Pb method are conducted and assessed in the same manner as for the reference method. The flow auditing device and Pb analysis audit samples must be compatible with the specific requirements of the equivalent method.

    3.3.5 Collocated Sampling Procedures for PM2.5. Follow the same procedure as described in section 3.2.5 of this appendix. PM2.5 samplers used in the PM10−2.5 network may be counted along with the PM2.5 samplers in the PM2.5 network as long as the PM2.5 samplers in both networks are of the same method designation.

    3.3.6 Collocated Sampling Procedures for PM10-2.5. All designated FRMs within the PM10-2.5 monitoring network must have 15 percent of the monitors collocated (values of 0.5 and greater round up) at the PM10-2.5 sites. All FRM method designations can be aggregated.

    3.3.6.1 The EPA shall ensure that each designated FEM within the PM10-2.5 monitoring network must:

    (a) Have 15 percent of the monitors collocated (values of 0.5 and greater round up); and

    (b) Have at least 2 collocated monitors (if the total number of monitors is less than 10). The first collocated monitor must be a designated FRM monitor and the second must be a monitor of the same method designation. Both collocated FRM and FEM monitors can be located at the same site.

    3.3.6.2 The Regional Administrator for the EPA Region where the FRM or FEMs are implemented will select the sites for collocated monitoring. The collocation site selection process shall consider sites at primary quality assurance organizations or States with more than one PM10-2.5 site; primary quality assurance organizations already monitoring for PM10 and PM2.5 using FRMs or FEMs; and an appropriate distribution among rural and urban NCore sites. Monitoring organizations implementing PM10 samplers and PM2.5 FRM samplers of the same method designation as the PM10-2.5 FRM can include the PM10-2.5 monitors in their respective PM10 and PM2.5 count. Follow the same procedures as described in sections 3.2.6.2 and 3.2.6.3 of this appendix.

    3.3.7 PM2.5 Performance Evaluation Program (PEP) Procedures. Follow the same procedure as described in section 3.2.7 of this appendix.

    3.3.8 PM10-2.5 Performance Evaluation Program (PEP) Procedures. One performance evaluation audit, as described in section 3.2.7 of this appendix must be performed at one PM10-2.5 site in each primary quality assurance organization each year. Monitoring organizations implementing PM2.5 FRM samplers of the same method designation in both the PM2.5 and the PM10-2.5 networks can include the PM10-2.5 performance evaluation audit in their respective PM2.5 performance evaluation count as long as the performance evaluation is conducted at the PM10-2.5 site. The calculations for evaluating bias between the primary monitor(s) and the performance evaluation monitors for PM10-2.5 are described in section 4.1.3 of this appendix.

    4. Calculations for Data Quality Assessment

    (a) Calculations of measurement uncertainty are carried out by EPA according to the following procedures. Primary quality assurance organizations should report the data for all appropriate measurement quality checks as specified in this appendix even though they may elect to perform some or all of the calculations in this section on their own.

    (b) The EPA will provide annual assessments of data quality aggregated by site and primary quality assurance organization for SO2, NO2, O3 and CO and by primary quality assurance organization for PM10, PM2.5, PM10-2.5 and Pb.

    (c) At low concentrations, agreement between the measurements of collocated samplers, expressed as relative percent difference or percent difference, may be relatively poor. For this reason, collocated measurement pairs are selected for use in the precision and bias calculations only when both measurements are equal to or above the following limits:

    (1) TSP: 20 μg/m3.

    (2) Pb: 0.15 μg/m3.

    (3) PM10 (Hi-Vol): 15 μg/m3.

    (4) PM10 (Lo-Vol): 3 μg/m3.

    (5) PM10-2.5 and PM2.5: 3 μg/m3.

    4.1 Statistics for the Assessment of QC Checks for SO2, NO2, O3 and CO.

    4.1.1 Percent Difference. All measurement quality checks start with a comparison of an audit concentration or value (flow rate) to the concentration/value measured by the analyzer, and use percent difference as the comparison statistic as described in equation 1 of this section. For each single point check, calculate the percent difference, di, as follows:

$$d_i = \frac{\mathrm{meas} - \mathrm{audit}}{\mathrm{audit}} \times 100 \qquad (1)$$

    where meas is the concentration indicated by the monitoring organization's instrument and audit is the audit concentration of the standard used in the QC check being measured.
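    For illustration, equation 1 reduces to a one-line helper; the function name percent_difference is hypothetical.

```python
def percent_difference(meas, audit):
    """Equation 1: percent difference d_i between a measured value and an
    audit standard (concentrations, or flow rates for flow checks)."""
    return (meas - audit) / audit * 100.0

# Example: an O3 analyzer reading 0.087 ppm against a 0.090 ppm check gas.
print(round(percent_difference(0.087, 0.090), 2))  # -3.33
```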

    4.1.2 Precision Estimate. The precision estimate is used to assess the one-point QC checks for SO2, NO2, O3, or CO described in section 3.2.1 of this appendix. The precision estimator is the coefficient of variation upper bound and is calculated using equation 2 of this section:

$$CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,\,n-1}}} \qquad (2)$$

    where $\chi^2_{0.1,n-1}$ is the 10th percentile of a chi-squared distribution with n−1 degrees of freedom.
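    A minimal sketch of the coefficient-of-variation upper bound as reconstructed in equation 2, assuming scipy is available for the chi-squared quantile; the function name is illustrative.

```python
import math
from scipy.stats import chi2

def cv_upper_bound(d):
    """Equation 2: 90 percent upper confidence bound on the coefficient of
    variation of the one-point QC check percent differences d (needs n >= 2)."""
    n = len(d)
    sum_d = sum(d)
    sum_d2 = sum(x * x for x in d)
    cv = math.sqrt((n * sum_d2 - sum_d ** 2) / (n * (n - 1)))
    # Inflate using the 10th percentile of chi-squared with n-1 df.
    return cv * math.sqrt((n - 1) / chi2.ppf(0.10, n - 1))
```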

    4.1.3 Bias Estimate. The bias estimate is calculated using the one-point QC checks for SO2, NO2, O3, or CO described in section 3.2.1 of this appendix and the performance evaluation program for PM10−2.5 described in sections 3.2.8 and 3.3.8 of this appendix. The bias estimator is an upper bound on the mean absolute value of the percent differences as described in equation 3 of this section:

$$|\mathrm{bias}| = AB + t_{0.95,\,n-1} \cdot \frac{AS}{\sqrt{n}} \qquad (3)$$

    where n is the number of single point checks being aggregated; t0.95,n−1 is the 95th quantile of a t-distribution with n−1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4 of this section:

$$AB = \frac{1}{n}\sum_{i=1}^{n} |d_i| \qquad (4)$$

    and the quantity AS is the standard deviation of the absolute values of the di's and is calculated using equation 5 of this section:

$$AS = \sqrt{\frac{n\sum_{i=1}^{n} |d_i|^2 - \left(\sum_{i=1}^{n} |d_i|\right)^2}{n(n-1)}} \qquad (5)$$

    4.1.3.1 Assigning a sign (positive/negative) to the bias estimate. Since the bias statistic as calculated in equation 3 of this appendix uses absolute values, it does not have a tendency (negative or positive bias) associated with it. A sign will be designated by rank ordering the percent differences of the QC check samples from a given site for a particular assessment interval.

    4.1.3.2 Calculate the 25th and 75th percentiles of the percent differences for each site. The absolute bias upper bound should be flagged as positive if both percentiles are positive and negative if both percentiles are negative. The absolute bias upper bound would not be flagged if the 25th and 75th percentiles are of different signs.
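    The bias bound of equations 3 through 5, together with the sign flag of sections 4.1.3.1 and 4.1.3.2, might look as follows in Python. The rule does not fix a percentile algorithm, so the standard-library default is assumed, and the function name is illustrative.

```python
import math
import statistics
from scipy.stats import t

def bias_upper_bound(d):
    """Equations 3-5 plus the section 4.1.3.1-4.1.3.2 sign flag.

    d -- signed percent differences from the one-point QC checks (n >= 2).
    Returns (sign, bound) where sign is '+', '-', or '' (unflagged).
    """
    n = len(d)
    abs_d = [abs(x) for x in d]
    ab = sum(abs_d) / n              # equation 4: mean of the |d_i|
    as_ = statistics.stdev(abs_d)    # equation 5: std dev of the |d_i|
    bound = ab + t.ppf(0.95, n - 1) * as_ / math.sqrt(n)   # equation 3
    # Sign flag from the 25th and 75th percentiles of the signed d_i.
    q25, _, q75 = statistics.quantiles(d, n=4)
    sign = "+" if q25 > 0 and q75 > 0 else "-" if q25 < 0 and q75 < 0 else ""
    return sign, bound
```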

    4.1.4 Validation of Bias Using the one-point QC Checks. The annual performance evaluations for SO2, NO2, O3, or CO described in section 3.2.2 of this appendix are used to verify the results obtained from the one-point QC checks and to validate those results across a range of concentration levels. To quantify this annually at the site level and at the 3-year primary quality assurance organization level, probability limits will be calculated from the one-point QC checks using equations 6 and 7 of this appendix:

$$\text{Upper Probability Limit} = m + 1.96\,S \qquad (6)$$

$$\text{Lower Probability Limit} = m - 1.96\,S \qquad (7)$$

    where m is the mean (equation 8 of this appendix):

$$m = \frac{1}{k}\sum_{i=1}^{k} d_i \qquad (8)$$

    where k is the total number of one-point QC checks for the interval being evaluated, and S is the standard deviation of the percent differences (equation 9 of this appendix) as follows:

$$S = \sqrt{\frac{k\sum_{i=1}^{k} d_i^2 - \left(\sum_{i=1}^{k} d_i\right)^2}{k(k-1)}} \qquad (9)$$

    4.1.5 Percent Difference. Percent differences for the performance evaluations, calculated using equation 1 of this appendix can be compared to the probability intervals for the respective site or at the primary quality assurance organization level. Ninety-five percent of the individual percent differences (all audit concentration levels) for the performance evaluations should be captured within the probability intervals for the primary quality assurance organization.
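    A sketch of the sections 4.1.4 and 4.1.5 validation step, using the probability limits of equations 6 and 7 as reconstructed above; function names are illustrative.

```python
import statistics

def probability_limits(qc_diffs):
    """Equations 6-9: 95 percent probability limits from the one-point QC
    check percent differences for a site or PQAO (needs k >= 2)."""
    m = statistics.mean(qc_diffs)       # equation 8
    s = statistics.stdev(qc_diffs)      # equation 9
    return m - 1.96 * s, m + 1.96 * s   # equations 7 and 6

def fraction_within_limits(pe_diffs, limits):
    """Section 4.1.5: share of performance-evaluation percent differences
    captured inside the limits; about 0.95 is expected."""
    lo, hi = limits
    return sum(lo <= d <= hi for d in pe_diffs) / len(pe_diffs)
```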

    4.2 Statistics for the Assessment of PM10.

    4.2.1 Precision Estimate from Collocated Samplers. Precision is estimated via duplicate measurements from collocated samplers of the same type. It is recommended that the precision be aggregated at the primary quality assurance organization level quarterly, annually, and at the 3-year level. A data pair is considered valid only if both concentrations are greater than the minimum values specified in section 4(c) of this appendix. For each collocated data pair, calculate the relative percent difference, di, using equation 10 of this appendix:

$$d_i = \frac{X_i - Y_i}{(X_i + Y_i)/2} \times 100 \qquad (10)$$

    where Xi is the concentration from the primary sampler and Yi is the concentration from the audit sampler. The coefficient of variation upper bound is calculated using equation 11 of this appendix:

$$CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{2n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,\,n-1}}} \qquad (11)$$

    where n is the number of valid data pairs being aggregated, and $\chi^2_{0.1,n-1}$ is the 10th percentile of a chi-squared distribution with n−1 degrees of freedom. The factor of 2 in the denominator adjusts for the fact that each di is calculated from two values with error.
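    The collocated-precision statistic of equations 10 and 11 could be sketched as below, with the section 4(c) reporting floor applied (3 μg/m3 shown for PM2.5); scipy is assumed for the chi-squared quantile.

```python
import math
from scipy.stats import chi2

def collocated_cv_upper_bound(pairs, floor=3.0):
    """Equations 10-11: precision from collocated (primary X, audit Y)
    concentration pairs; pairs below the section 4(c) floor are dropped."""
    d = [(x - y) / ((x + y) / 2.0) * 100.0          # equation 10
         for x, y in pairs if x >= floor and y >= floor]
    n = len(d)                                       # needs n >= 2
    sum_d = sum(d)
    sum_d2 = sum(v * v for v in d)
    # Equation 11: the 2 in the denominator reflects that each d_i comes
    # from two error-bearing measurements.
    cv = math.sqrt((n * sum_d2 - sum_d ** 2) / (2 * n * (n - 1)))
    return cv * math.sqrt((n - 1) / chi2.ppf(0.10, n - 1))
```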

    4.2.2 Bias Estimate Using One-Point Flow Rate Verifications. For each one-point flow rate verification described in sections 3.2.3 and 3.3.2 of this appendix, calculate the percent difference in volume using equation 1 of this appendix, where meas is the value indicated by the sampler's volume measurement and audit is the actual volume indicated by the auditing flow meter. The absolute volume bias upper bound is then calculated using equation 3 of this appendix, where n is the number of flow rate audits being aggregated; t0.95,n−1 is the 95th quantile of a t-distribution with n−1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4 of this appendix; and the quantity AS in equation 3 of this appendix is the standard deviation of the absolute values of the di's and is calculated using equation 5 of this appendix.

    4.2.3 Assessment of Semi-Annual Flow Rate Audits. The flow rate audits described in sections 3.2.4 and 3.3.3 of this appendix are used to assess the results obtained from the one-point flow rate verifications and to provide an estimate of flow rate acceptability. For each flow rate audit, calculate the percent difference in volume using equation 1 of this appendix, where meas is the value indicated by the sampler's volume measurement and audit is the actual volume indicated by the auditing flow meter. To quantify this annually and at the 3-year primary quality assurance organization level, probability limits are calculated from the percent differences using equations 6 and 7 of this appendix, where m is the mean described in equation 8 of this appendix, k is the total number of one-point flow rate verifications for the year, and S is the standard deviation of the percent differences as described in equation 9 of this appendix.

    4.2.4 Percent Difference. Percent differences for the semi-annual flow rate audits, calculated using equation 1 of this appendix, can be compared to the probability intervals for the one-point flow rate verifications for the respective primary quality assurance organization. Ninety-five percent of the individual percent differences for the audits should be captured within the probability intervals for the primary quality assurance organization.

    4.3  Statistics for the Assessment of PM2.5 and PM10-2.5.

    4.3.1 Precision Estimate. Precision for collocated PM2.5 and PM10−2.5 instruments may be estimated both when the primary and collocated instruments have the same method designation and when the method designations differ. Follow the procedure described in section 4.2.1 of this appendix. In addition, an estimate of bias may be desirable when the primary monitor is an FEM and the collocated monitor is an FRM; follow the procedure described in section 4.1.3 of this appendix to provide an estimate of bias using the collocated data.

    4.3.2 Bias Estimate. Follow the procedure described in section 4.1.3 of this appendix for the bias estimate of PM10−2.5. The PM2.5 bias estimate is calculated using the paired routine and PEP monitor data described in section 3.2.7 of this appendix. Calculate the percent difference, di, using equation 1 of this appendix, where meas is the measured concentration from the agency's primary monitor and audit is the concentration from the PEP monitor. A data pair is considered valid only if both concentrations are greater than the minimum values specified in section 4(c) of this appendix. Estimates of bias are presented for various levels of aggregation, sometimes aggregating over time, sometimes over samplers, and sometimes over both time and samplers. These various levels of aggregation are achieved using the same basic statistic.

    4.3.2.1 This statistic averages the individual biases described in equation 1 of this appendix to the desired level of aggregation using equation 12 of this appendix:

$$\bar{D}_j = \frac{1}{n_j}\sum_{i=1}^{n_j} d_i \qquad (12)$$

    where nj is the number of pairs and d1, d2, …, dnj are the biases for each of the pairs to be averaged.

    4.3.2.2 Confidence intervals can be constructed for these average bias estimates in equation 12 of this appendix using equations 13 and 14 of this appendix:

$$\text{Upper bound} = \bar{D}_j + t_{0.95,\,df}\cdot s \qquad (13)$$

$$\text{Lower bound} = \bar{D}_j - t_{0.95,\,df}\cdot s \qquad (14)$$

    where t0.95,df is the 95th quantile of a t-distribution with degrees of freedom df = nj − 1, and s is an estimate of the variability of the average bias calculated using equation 15 of this appendix:

$$s = \sqrt{\frac{\sum_{i=1}^{n_j}\left(d_i - \bar{D}_j\right)^2}{n_j\,(n_j-1)}} \qquad (15)$$
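    A sketch of the aggregation in equations 12 through 15, taking s (equation 15) as the standard error of the mean, an assumption consistent with the "variability of the average bias" wording; the function name is illustrative.

```python
import math
import statistics
from scipy.stats import t

def mean_bias_with_ci(d):
    """Equations 12-15: average bias over n_j percent differences d with
    its confidence interval. Returns (lower, mean, upper); needs n_j >= 2."""
    nj = len(d)
    mean_bias = sum(d) / nj                      # equation 12
    s = statistics.stdev(d) / math.sqrt(nj)      # equation 15 (assumed form)
    half = t.ppf(0.95, nj - 1) * s               # equations 13 and 14
    return mean_bias - half, mean_bias, mean_bias + half
```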

    4.4 Statistics for the Assessment of Pb.

    4.4.1 Precision Estimate. Follow the same procedures as described for PM10 in section 4.2.1 of this appendix using the data from the collocated instruments. The data pair would only be considered valid if both concentrations are greater than the minimum values specified in section 4(c) of this appendix.

    4.4.2 Bias Estimate. In order to estimate bias, the information from the flow rate audits and the Pb strip audits needs to be combined as described below. To be consistent with the formulas for the gases, the recommended procedures are to work with relative errors of the lead measurements. The relative error in the concentration is related to the relative error in the volume and the relative error in the mass measurements using equation 16 of this appendix:

$$e_C = \frac{e_M - e_V}{1 + e_V} \qquad (16)$$

    where eC, eM, and eV are the relative errors in the concentration (C = M/V), the mass measurement, and the volume measurement, respectively. As with the gases, an upper bound for the absolute bias is desired. Using equation 16 above, the absolute value of the relative (concentration) error is bounded by equation 17 of this appendix:

$$|e_C| \le \frac{|e_M| + |e_V|}{1 - |e_V|} \qquad (17)$$

    The quality indicator data collected are then used to bound each part of equation 17 separately.

    4.4.2.1 Flow rate calculations. For each flow rate audit, calculate the percent difference in volume by equation 1 of this appendix where meas is the value indicated by the sampler's volume measurement and audit is the actual volume indicated by the auditing flow meter. The absolute volume bias upper bound is then calculated using equation 3 of this appendix where n is the number of flow rate audits being aggregated; t0.95,n-1 is the 95th quantile of a t-distribution with n-1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4, and the quantity AS in equation 3 of this appendix is the standard deviation of the absolute values of the di's and is calculated using equation 5 of this appendix.

    4.4.2.2 Lead strip calculations. Similarly for each lead strip audit, calculate the percent difference in mass by equation 1 where meas is the value indicated by the mass measurement and audit is the actual lead mass on the audit strip. The absolute mass bias upper bound is then calculated using equation 3 of this appendix where n is the number of lead strip audits being aggregated; t0.95,n-1 is the 95th quantile of a t-distribution with n-1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4 of this appendix and the quantity AS in equation 3 of this appendix is the standard deviation of the absolute values of the di's and is calculated using equation 5 of this appendix.

    4.4.2.3 Final bias calculation. Finally, the absolute bias upper bound is given by combining the absolute bias estimates of the flow rate and Pb strips using equation 18 of this appendix:

$$|\mathrm{bias}| = 100 \cdot \frac{|\mathrm{bias}_M| + |\mathrm{bias}_V|}{100 - |\mathrm{bias}_V|} \qquad (18)$$

    where |biasM| and |biasV| are the mass (Pb strip) and volume (flow rate) absolute bias upper bounds expressed as percentages; the numerator and denominator have been multiplied by 100 since everything is expressed as a percentage.
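    Equation 18 combines the two bounds with simple arithmetic; the worked example below uses assumed audit results, and the function name is illustrative.

```python
def pb_bias_upper_bound(mass_bias_pct, volume_bias_pct):
    """Equation 18: overall Pb concentration bias bound (percent) from the
    Pb-strip (mass) and flow-rate (volume) absolute bias upper bounds."""
    return 100.0 * (mass_bias_pct + volume_bias_pct) / (100.0 - volume_bias_pct)

# Example: a 4 percent mass bound combined with a 3 percent volume bound.
print(round(pb_bias_upper_bound(4.0, 3.0), 2))  # 7.22
```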

    4.5 Time Period for Audits. The statistics in this section assume that the mass and flow rate audits represent the same time period. Since the two types of audits are not performed at the same time, the audits must be grouped by common time periods. Consequently, the absolute bias estimates should be made at the annual and 3-year levels. The flow rate audits are site-specific, so the absolute bias upper bound estimate can be computed and treated as a site-level statistic.

    5. Reporting Requirements

    5.1 SLAMS Reporting Requirements. For each pollutant, prepare a list of all monitoring sites and their AQS site identification codes in each primary quality assurance organization and submit the list to the appropriate EPA Regional Office, with a copy to AQS. Whenever there is a change in this list of monitoring sites in a primary quality assurance organization, report this change to the EPA Regional Office and to AQS.

    5.1.1 Quarterly Reports. For each quarter, each primary quality assurance organization shall report to AQS directly (or via the appropriate EPA Regional Office for organizations that are not direct users of AQS) the results of all valid measurement quality checks it has carried out during the quarter. The quarterly reports must be submitted consistent with the data reporting requirements specified for air quality data as set forth in § 58.16. The EPA strongly encourages early submission of the quality assurance data in order to assist the monitoring organizations in controlling and evaluating the quality of the ambient air data.

    5.1.2 Annual Reports.

    5.1.2.1 When the monitoring organization has certified relevant data for the calendar year, EPA will calculate and report the measurement uncertainty for the entire calendar year.

    5.2 PSD Reporting Requirements. At the end of each sampling quarter, the organization must report the appropriate statistical assessments in section 4 of this appendix for the pollutants measured. All data used to calculate reported estimates of precision and bias, including span checks, collocated sampler results, and audit results, must be made available to the permit granting authority upon request.

    6.0 References

    (1) American National Standard—Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4-2004. February 2004. Available from American Society for Quality Control, 611 East Wisconsin Avenue, Milwaukee, WI 53202.

    (2) EPA Requirements for Quality Management Plans. EPA QA/R-2. EPA/240/B-01/002. March 2001. Office of Environmental Information, Washington, DC 20460. http://www.epa.gov/quality/qs-docs/r2-final.pdf.

    (3) EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations. EPA QA/R-5. EPA/240/B-01/003. March 2001. Office of Environmental Information, Washington, DC 20460. http://www.epa.gov/quality/qs-docs/r5-final.pdf.

    (4) EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards. EPA-600/R-97/121. September 1997. Available from U.S. Environmental Protection Agency, ORD Publications Office, Center for Environmental Research Information (CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 45268.

    (5) Guidance for the Data Quality Objectives Process. EPA QA/G-4. EPA/240/B-06/001. February 2006. Office of Environmental Information, Washington, DC 20460. http://www.epa.gov/quality/qs-docs/g4-final.pdf.

    (6) List of Designated Reference and Equivalent Methods. Available from U.S. Environmental Protection Agency, National Exposure Research Laboratory, Human Exposure and Atmospheric Sciences Division, MD-D205-03, Research Triangle Park, NC 27711. http://www.epa.gov/ttn/amtic/criteria.html.

    (7) McElroy, F.F. Transfer Standards for the Calibration of Ambient Air Monitoring Analyzers for Ozone. EPA-600/4-79-056. U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. September 1979. http://www.epa.gov/ttn/amtic/cpreldoc.html.

    (8) Paur, R.J., and F.F. McElroy. Technical Assistance Document for the Calibration of Ambient Ozone Monitors. EPA-600/4-79-057. U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. September 1979. http://www.epa.gov/ttn/amtic/cpreldoc.html.

    (9) Quality Assurance Handbook for Air Pollution Measurement Systems, Volume 1—A Field Guide to Environmental Quality Assurance. EPA-600/R-94/038a. April 1994. Available from U.S. Environmental Protection Agency, ORD Publications Office, Center for Environmental Research Information (CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 45268. http://www.epa.gov/ttn/amtic/qabook.html.

    (10) Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II, Part 1—Ambient Air Quality Monitoring Program Quality System Development. EPA-454/R-98-004. http://www.epa.gov/ttn/amtic/qabook.html.

    Table A-1 of Appendix A to Part 58. Differences and Similarities Between SLAMS and PSD Requirements

    Requirements (applicable to both SLAMS and PSD):
    1. The development, documentation, and implementation of an approved quality system.
    2. The assessment of data quality.
    3. The use of reference, equivalent, or approved methods.
    4. The use of calibration standards traceable to NIST or other primary standard.
    5. The participation in EPA performance evaluations and the permission for EPA to conduct system audits.

    Topic | SLAMS | PSD
    Monitoring and QA Responsibility | State/local agency via the "primary quality assurance organization" | Source owner/operator.
    Monitoring Duration | Indefinitely | Usually up to 12 months.
    Annual Performance Evaluation (PE) | Standards and equipment different from those used for spanning, calibration, and verifications; different personnel preferred | Personnel, standards, and equipment different from those used for spanning, calibration, and verifications.
    PE audit rate:
      Automated | 100% per year | 100% per quarter.
      Manual | Varies depending on pollutant (see Table A-2 of this appendix) | 100% per quarter.
    Precision Assessment:
      Automated | One-point QC check biweekly, but data quality dependent | One-point QC check biweekly.
      Manual | Varies depending on pollutant (see Table A-2 of this appendix) | One site: 1 every 6 days, or every third day for daily monitoring (TSP and Pb).
    Reporting:
      Automated | By site (EPA performs calculations annually) | By site (source owner/operator performs calculations each sampling quarter).
      Manual | By reporting organization (EPA performs calculations annually) | By site (source owner/operator performs calculations each sampling quarter).

    Table A-2 of Appendix A to Part 58. Minimum Data Assessment Requirements for SLAMS Sites

    Method | Assessment method | Coverage | Minimum frequency | Parameters reported

    Automated Methods

    1-Point QC for SO2, NO2, O3, CO | Response check at concentration 0.01-0.1 ppm SO2, NO2, O3, and 1-10 ppm CO | Each analyzer | Once per 2 weeks | Audit concentration¹ and measured concentration².
    Annual performance evaluation for SO2, NO2, O3, CO | See section 3.2.2 of this appendix | Each analyzer | Once per year | Audit concentration¹ and measured concentration² for each level.
    Flow rate verification PM10, PM2.5, PM10−2.5 | Check of sampler flow rate | Each sampler | Once every month | Audit flow rate and measured flow rate indicated by the sampler.
    Semi-annual flow rate audit PM10, PM2.5, PM10−2.5 | Check of sampler flow rate using independent standard | Each sampler | Once every 6 months | Audit flow rate and measured flow rate indicated by the sampler.
    Collocated sampling PM2.5, PM10−2.5 | Collocated samplers | 15% | Every 12 days | Primary sampler concentration and duplicate sampler concentration.
    Performance evaluation program PM2.5, PM10−2.5 | Collocated samplers | 1. 5 valid audits for primary QA orgs with ≤ 5 sites. 2. 8 valid audits for primary QA orgs with > 5 sites. 3. All samplers in 6 years | Over all 4 quarters | Primary sampler concentration and performance evaluation sampler concentration.

    Manual Methods

    Collocated sampling PM10, TSP, PM10−2.5, PM2.5 | Collocated samplers | 15% | Every 12 days; PSD: every 6 days | Primary sampler concentration and duplicate sampler concentration.
    Flow rate verification PM10 (low-volume), PM10−2.5, PM2.5 | Check of sampler flow rate | Each sampler | Once every month | Audit flow rate and measured flow rate indicated by the sampler.
    Flow rate verification PM10 (high-volume), TSP | Check of sampler flow rate | Each sampler | Once every quarter | Audit flow rate and measured flow rate indicated by the sampler.
    Semi-annual flow rate audit PM10, TSP, PM10−2.5, PM2.5 | Check of sampler flow rate using independent standard | Each sampler, all locations | Once every 6 months | Audit flow rate and measured flow rate indicated by the sampler.
    Lead | 1. Check of sample flow rate as for TSP. 2. Check of analytical system with Pb audit strips | 1. Each sampler. 2. Analytical system | 1. Include with TSP. 2. Each quarter | 1. Same as for TSP. 2. Actual concentration.
    Performance evaluation program PM2.5, PM10−2.5 | Collocated samplers | 1. 5 valid audits for primary QA orgs with ≤ 5 sites. 2. 8 valid audits for primary QA orgs with > 5 sites. 3. All samplers in 6 years | Over all 4 quarters | Primary sampler concentration and performance evaluation sampler concentration.

    ¹ Effective concentration for open path analyzers.
    ² Corrected concentration, if applicable, for open path analyzers.

    Table A-3 of Appendix A to Part 58. Summary of PM2.5 Number and Type of Collocation (15% Collocation Requirement) Needed as an Example of a Primary Quality Assurance Organization That Has 54 Monitors and Procured FRMs and Three Other Equivalent Method Types

    Primary sampler method designation | Total no. of monitors | Total no. collocated | No. of collocated FRM | No. of collocated monitors of same method designation as primary
    FRM     | 20 | 3 | 3 | n/a
    FEM (A) | 20 | 3 | 2 | 1
    FEM (C) | 2  | 1 | 1 | 0
    FEM (D) | 12 | 2 | 1 | 1

    Appendix B—[Removed and Reserved]

    34. Appendix B to part 58 is removed and reserved.

    35. Appendix C to part 58 is revised to read as follows:

    Appendix C to Part 58—Ambient Air Quality Monitoring Methodology

    1.0 Purpose

    2.0 SLAMS Ambient Air Monitoring Stations

    3.0 NCore Ambient Air Monitoring Stations

    4.0 Photochemical Assessment Monitoring Stations (PAMS)

    5.0 Particulate Matter Episode Monitoring

    6.0 References

    1.0 Purpose

    This appendix specifies the criteria pollutant monitoring methods (manual methods or automated analyzers) which must be used in SLAMS and in NCore stations, which are a subset of SLAMS.

    2.0 SLAMS Ambient Air Monitoring Network

    2.1 Except as otherwise provided in this appendix, a criteria pollutant monitoring method used for making NAAQS decisions at a SLAMS site must be a reference or equivalent method as defined in § 50.1 of this chapter.

    2.2 Reserved

    2.3 Any manual method or analyzer purchased prior to cancellation of its reference or equivalent method designation under § 53.11 or § 53.16 of this chapter may be used at a SLAMS site following cancellation for a reasonable period of time to be determined by the Administrator.

    2.4 Approval of Non-designated Continuous PM2.5 Methods as Approved Regional Methods (ARMs) Operated Within a Network of Sites. A method for PM2.5 that has not been designated as an FRM or FEM as defined in § 50.1 of this chapter may be approved as an ARM for purposes of section 2.1 of this appendix at a particular site or network of sites under the following stipulations.

    2.4.1 The candidate ARM must be demonstrated to meet the requirements for PM2.5 Class III equivalent methods as defined in subpart C of part 53 of this chapter. Specifically, the requirements for precision, correlation, and additive and multiplicative bias apply. For purposes of this section 2.4, the following requirements shall apply:

    2.4.1.1 The candidate ARM shall be tested at the site(s) at which it is intended to be used. For a network of sites operated by one reporting agency or primary quality assurance organization, the testing shall occur at a subset of sites that includes one site in each MSA/CSA, up to the two highest-population MSA/CSA, and at least one rural area or Micropolitan Statistical Area site. If the candidate ARM for a network is already approved for purposes of this section in another agency's network, subsequent testing shall minimally occur at one site in a MSA/CSA and one rural area or Micropolitan Statistical Area. There shall be no requirement for tests at any other sites.

    2.4.1.2 For purposes of this section, a full year of testing may begin and end in any season, so long as all seasons are covered.

    2.4.1.3 No PM10 samplers shall be required for the test, as determination of the PM2.5/PM10 ratio at the test site shall not be required.

    2.4.1.4 The test specification for PM2.5 Class III equivalent method precision defined in subpart C of part 53 of this chapter applies; however, there is no specific requirement that collocated continuous monitors be operated for purposes of generating a statistic for coefficient of variation (CV). To provide an estimate of precision that meets the requirement identified in subpart C of part 53 of this chapter, agencies may cite peer-reviewed published data or data in AQS demonstrating that the candidate ARM, as operated, will produce data that meet the specification for precision of Class III PM2.5 methods.

    2.4.1.5 A minimum of 90 valid sample pairs per site for the year with no less than 20 valid sample pairs per season must be generated for use in demonstrating that additive bias, multiplicative bias and correlation meet the comparability requirements specified in subpart C of part 53 of this chapter. A valid sample pair may be generated with as little as one valid FRM and one valid candidate ARM measurement per day.

    2.4.1.6 For purposes of determining bias, FRM data with concentrations less than 3 micrograms per cubic meter (μg/m3) may be excluded. Excluding such data does not count against the sample completeness requirements specified in this section.
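
    A minimal sketch, assuming a simple in-memory data layout, of the completeness and exclusion screens in sections 2.4.1.5 and 2.4.1.6; the function name and structure are illustrative and not part of the rule.

        def screen_sample_pairs(pairs_by_season):
            """pairs_by_season maps a season name to a list of
            (frm_value, arm_value) pairs in micrograms per cubic meter."""
            total = sum(len(p) for p in pairs_by_season.values())
            # 90 valid pairs per site per year, at least 20 per season.
            complete = (total >= 90 and
                        all(len(p) >= 20 for p in pairs_by_season.values()))
            # FRM values below 3 ug/m3 may be excluded from the bias test
            # without counting against completeness (section 2.4.1.6).
            bias_pairs = [(frm, arm)
                          for pairs in pairs_by_season.values()
                          for frm, arm in pairs
                          if frm >= 3.0]
            return complete, bias_pairs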

    2.4.1.7 Data transformations may be used to demonstrate that the comparability requirements specified in subpart C of part 53 of this chapter are met. A data transformation may be linear or non-linear, but it must be applied in the same way to all sites used in the testing.

    2.4.2 The monitoring agency wishing to use an ARM must develop and implement appropriate quality assurance procedures for the method. Additionally, the following procedures are required for the method:

    2.4.2.1 The ARM must be consistently operated throughout the network. Exceptions to a consistent operation must be approved according to section 2.8 of this appendix;

    2.4.2.2 The ARM must be operated on an hourly sampling frequency capable of providing data suitable for aggregation into daily 24-hour average measurements;

    2.4.2.3 The ARM must use an inlet and separation device, as needed, that are already approved in either the reference method identified in appendix L to part 50 of this chapter or under part 53 of this chapter as approved for use on a PM2.5 reference or equivalent method. The only exceptions to this requirement are those methods that by their inherent measurement principle may not need an inlet or separation device that segregates the aerosol; and

    2.4.2.4 The ARM must be capable of providing for flow audits, unless by its inherent measurement principle, measured flow is not required. These flow audits are to be performed on the frequency identified in appendix A to this part.

    2.4.2.5 If data transformations are used, they must be described in the monitoring agency's Quality Assurance Project Plan (QAPP) (or an addendum to the QAPP). The QAPP shall describe how often (e.g., quarterly, yearly) and under what provisions the data transformation will be updated. For example, not meeting the data quality objectives for a site over a season or year may be cause for recalculating a data transformation, but by itself would not be cause for invalidating the data. Data transformations must be applied prospectively, i.e., in real-time or near real-time, to the data output from the PM2.5 continuous method. See reference 7 of this appendix.
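
    The prospective transformation described in sections 2.4.1.7 and 2.4.2.5, together with the hourly-to-daily aggregation in section 2.4.2.2, might look like the following sketch. The slope and intercept shown are placeholders that an agency would derive from its own FRM comparison data; the function names are illustrative only.

        def transform_hourly(raw_values, slope=0.92, intercept=0.8):
            """Apply a previously derived linear correction prospectively to
            hourly PM2.5 output. The coefficients here are placeholders,
            not regulatory values."""
            return [slope * v + intercept for v in raw_values]

        def daily_average(hourly_values):
            # 24 hourly values aggregated into the 24-hour average of
            # section 2.4.2.2 (completeness handling omitted here).
            return sum(hourly_values) / len(hourly_values)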

    2.4.3 The monitoring agency wishing to use the method must develop and implement appropriate procedures for assessing and reporting the precision and accuracy of the method comparable to the procedures set forth in appendix A of this part for designated reference and equivalent methods.

    2.4.4 Assessments of data quality shall follow the same frequencies and calculations as required under section 3 of appendix A to this part with the following exceptions:

    2.4.4.1 Collocation of ARM with FRM/FEM samplers must be maintained at a minimum of 30 percent of the required SLAMS sites with a minimum of 1 per network;

    2.4.4.2 All collocated FRM/FEM samplers must maintain a sample frequency of at least 1 in 6 sample days;

    2.4.4.3 Collocated FRM/FEM samplers shall be located at the design value site, with the required FRM/FEM samplers deployed among the largest MSA/CSA in the network, until all required FRM/FEM are deployed; and

    2.4.4.4 Data from collocated FRM/FEM are to be substituted for any calendar quarter that an ARM method has incomplete data.

    2.4.4.5 Collocation with an ARM under this part for purposes of determining the coefficient of variation of the method shall be conducted at a minimum of 7.5 percent of the sites with a minimum of 1 per network. This is consistent with the requirements in appendix A to this part for one-half of the required collocation of FRM/FEM (15 percent) to be collocated with the same method.

    2.4.4.6 Assessments of bias with an independent audit of the total measurement system shall be conducted with the same frequency as an FEM as identified in appendix A to this part.
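
    The collocation minimums in sections 2.4.4.1 and 2.4.4.5 reduce to simple arithmetic, sketched below for illustration; rounding up at fractional counts is an assumption, since the rule states only the percentages and the floor of one per network.

        import math

        def arm_collocation_minimums(required_slams_sites):
            # 30 percent of required SLAMS sites collocated with FRM/FEM
            # (section 2.4.4.1) and 7.5 percent collocated with a second ARM
            # for the CV estimate (section 2.4.4.5), each with a floor of one.
            frm_fem = max(1, math.ceil(0.30 * required_slams_sites))
            arm_cv = max(1, math.ceil(0.075 * required_slams_sites))
            return frm_fem, arm_cv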

    2.4.5 Requests for approval of a candidate ARM that is not already approved in another agency's network under this section must meet the general submittal requirements of section 2.7 of this appendix. Requests for approval under this section when an ARM is already approved in another agency's network are to be submitted to the EPA Regional Administrator. Requests for approval under section 2.4 of this appendix must include the following:

    2.4.5.1 A clear and unique description of the site(s) at which the candidate ARM will be used and tested, and a description of the nature or character of the site and the particulate matter that is expected to occur there.

    2.4.5.2 A detailed description of the method and the nature of the sampler or analyzer upon which it is based.

    2.4.5.3 A brief statement of the reason or rationale for requesting the approval.

    2.4.5.4 A detailed description of the quality assurance procedures that have been developed and that will be implemented for the method.

    2.4.5.5 A detailed description of the procedures for assessing the precision and accuracy of the method that will be implemented for reporting to AQS.

    2.4.5.6 Test results from the comparability tests as required in sections 2.4.1 through 2.4.1.4 of this appendix.

    2.4.5.7 Such further supplemental information as may be necessary or helpful to support the required statements and test results.

    2.4.6 Within 120 days after receiving a request for approval of the use of an ARM at a particular site or network of sites under section 2.4 of this appendix, the Administrator will approve or disapprove the method by letter to the person or agency requesting such approval. When appropriate for methods that are already approved in another SLAMS network, the EPA Regional Administrator has approval/disapproval authority. In either instance, additional information may be requested to assist with the decision.

    2.5 [Reserved]

    2.6 Use of Methods With Higher, Nonconforming Ranges in Certain Geographical Areas.

    2.6.1 [Reserved]

    2.6.2 An analyzer may be used (indefinitely) on a range which extends to concentrations higher than two times the upper limit specified in table B-1 of part 53 of this chapter if:

    2.6.2.1 The analyzer has more than one selectable range and has been designated as a reference or equivalent method on at least one of its ranges, or has been approved for use under section 2.5 (which applies to analyzers purchased before February 18, 1975);

    2.6.2.2 The pollutant intended to be measured with the analyzer is likely to occur in concentrations more than two times the upper range limit specified in table B-1 of part 53 of this chapter in the geographical area in which use of the analyzer is proposed; and

    2.6.2.3 The Administrator determines that the resolution of the range or ranges for which approval is sought is adequate for its intended use. For purposes of this section (2.6), “resolution” means the ability of the analyzer to detect small changes in concentration.

    2.6.3 Requests for approval under section 2.6.2 of this appendix must meet the submittal requirements of section 2.7. Except as provided in section 2.7.3 of this appendix, each request must contain the information specified in section 2.7.2 in addition to the following:

    2.6.3.1 The range or ranges proposed to be used;

    2.6.3.2 Test data, records, calculations, and test results as specified in section 2.7.2.2 of this appendix for each range proposed to be used;

    2.6.3.3 An identification and description of the geographical area in which use of the analyzer is proposed;

    2.6.3.4 Data or other information demonstrating that the pollutant intended to be measured with the analyzer is likely to occur in concentrations more than two times the upper range limit specified in table B-1 of part 53 of this chapter in the geographical area in which use of the analyzer is proposed; and

    2.6.3.5 Test data or other information demonstrating the resolution of each proposed range that is broader than that permitted by section 2.5 of this appendix.

    2.6.4 Any person who has obtained approval of a request under this section (2.6.2) shall assure that the analyzer for which approval was obtained is used only in the geographical area identified in the request and only while operated in the range or ranges specified in the request.

    2.7 Requests for Approval; Withdrawal of Approval.

    2.7.1 Requests for approval under sections 2.4, 2.6.2, or 2.8 of this appendix must be submitted to: Director, National Exposure Research Laboratory (MD-D205-03), U.S. Environmental Protection Agency, Research Triangle Park, North Carolina 27711. For ARM that are already approved in another agency's network, subsequent requests for approval under section 2.4 are to be submitted to the applicable EPA Regional Administrator.

    2.7.2 Except as provided in section 2.7.3 of this appendix, each request must contain:

    2.7.2.1 A statement identifying the analyzer (e.g., by serial number) and the method of which the analyzer is representative (e.g., by manufacturer and model number); and

    2.7.2.2 Test data, records, calculations, and test results for the analyzer (or the method of which the analyzer is representative) as specified in subpart B, subpart C, or both (as applicable) of part 53 of this chapter.

    2.7.3 A request may concern more than one analyzer or geographical area and may incorporate by reference any data or other information known to EPA from one or more of the following:

    2.7.3.1 An application for a reference or equivalent method determination submitted to EPA for the method of which the analyzer is representative, or testing conducted by the applicant or by EPA in connection with such an application;

    2.7.3.2 Testing of the method of which the analyzer is representative at the initiative of the Administrator under § 53.7 of this chapter; or

    2.7.3.3 A previous or concurrent request for approval submitted to EPA under this section (2.7).

    2.7.4 To the extent that such incorporation by reference provides data or information required by this section (2.7) or by sections 2.4, 2.5, or 2.6 of this appendix, independent data or duplicative information need not be submitted.

    2.7.5 After receiving a request under this section (2.7), the Administrator may request such additional testing or information or conduct such tests as may be necessary in his judgment for a decision on the request.

    2.7.6 If the Administrator determines, on the basis of any available information, that any of the determinations or statements on which approval of a request under this section was based are invalid or no longer valid, or that the requirements of section 2.4, 2.5, or 2.6, as applicable, have not been met, he/she may withdraw the approval after affording the person who obtained the approval an opportunity to submit information and arguments opposing such action.

    2.8 Modifications of Methods by Users.

    2.8.1 Except as otherwise provided in this section, no reference method, equivalent method, or ARM may be used in a SLAMS network if it has been modified, without prior approval by the Administrator, in a manner that could significantly alter the performance characteristics of the method. For purposes of this section, “alternative method” means an analyzer, the use of which has been approved under section 2.4, 2.5, or 2.6 of this appendix or some combination thereof.

    2.8.2 Requests for approval under this section (2.8) must meet the submittal requirements of sections 2.7.1 and 2.7.2.1 of this appendix.

    2.8.3 Each request submitted under this section (2.8) must include:

    2.8.3.1 A description, in such detail as may be appropriate, of the desired modification;

    2.8.3.2 A brief statement of the purpose(s) of the modification, including any reasons for considering it necessary or advantageous;

    2.8.3.3 A brief statement of belief concerning the extent to which the modification will or may affect the performance characteristics of the method; and

    2.8.3.4 Such further information as may be necessary to explain and support the statements required by sections 2.8.3.2 and 2.8.3.3.

    2.8.4 The Administrator will approve or disapprove the modification by letter to the person or agency requesting such approval within 75 days after receiving a request for approval under this section and any further information that the applicant may be asked to provide.

    2.8.5 A temporary modification that could alter the performance characteristics of a reference, equivalent, or ARM may be made without prior approval under this section if the method is not functioning or is malfunctioning, provided that parts necessary for repair in accordance with the applicable operation manual cannot be obtained within 45 days. Unless such temporary modification is later approved under section 2.8.4 of this appendix, the temporarily modified method shall be repaired in accordance with the applicable operation manual as quickly as practicable but in no event later than 4 months after the temporary modification was made, unless an extension of time is granted by the Administrator. Unless and until the temporary modification is approved, air quality data obtained with the method as temporarily modified must be clearly identified as such when submitted in accordance with § 58.16 and must be accompanied by a report containing the information specified in section 2.8.3 of this appendix. A request that the Administrator approve a temporary modification may be submitted in accordance with sections 2.8.1 through 2.8.4 of this appendix. In such cases the request will be considered as if a request for prior approval had been made.

    2.9 Use of IMPROVE Samplers at a SLAMS Site. “IMPROVE” samplers may be used in SLAMS for monitoring of regional background and regional transport concentrations of fine particulate matter. The IMPROVE samplers were developed for use in the Interagency Monitoring of Protected Visual Environments (IMPROVE) network to characterize all of the major components and many trace constituents of the particulate matter that impair visibility in Federal Class I Areas. Descriptions of the IMPROVE samplers and the data they collect are available in references 4, 5, and 6 of this appendix.

    3.0 NCore Ambient Air Monitoring Stations

    3.1 Methods employed in NCore multipollutant sites used to measure SO2, CO, NO2, O3, PM2.5, or PM10−2.5 must be reference or equivalent methods as defined in § 50.1 of this chapter, or an ARM as defined in section 2.4 of this appendix, for any monitors intended for comparison with applicable NAAQS.

    3.2 If alternative SO2, CO, NO2, O3, PM2.5, or PM10−2.5 monitoring methodologies are proposed for monitors not intended for NAAQS comparison, such techniques must be detailed in the network description required by § 58.10 and subsequently approved by the Administrator. Examples of locations that are not intended to be compared to the NAAQS may be rural background and transport sites or areas where the concentration of the pollutant is so low that it would be more useful to operate a higher sensitivity method that is not an FRM or FEM.

    4.0 Photochemical Assessment Monitoring Stations (PAMS)

    4.1 Methods used for O3 monitoring at PAMS must be automated reference or equivalent methods as defined in § 50.1 of this chapter.

    4.2 Methods used for NO, NO2 and NOX monitoring at PAMS should be automated reference or equivalent methods as defined for NO2 in § 50.1 of this chapter. If alternative NO, NO2 or NOX monitoring methodologies are proposed, such techniques must be detailed in the network description required by § 58.10 and subsequently approved by the Administrator.

    4.3 Methods for meteorological measurements and speciated VOC monitoring are included in the guidance provided in references 2 and 3 of this appendix. If alternative VOC monitoring methodology (including the use of new or innovative technologies), which is not included in the guidance, is proposed, it must be detailed in the network description required by § 58.10 and subsequently approved by the Administrator.

    5.0 Particulate Matter Episode Monitoring

    5.1 For short-term measurements of PM10 during air pollution episodes (see § 51.152 of this chapter) the measurement method must be:

    5.1.1 Either the “Staggered PM10” method or the “PM10 Sampling Over Short Sampling Times” method, both of which are based on the reference method for PM10 and are described in reference 1; or

    5.1.2 Any other method for measuring PM10:

    5.1.2.1 Which has a measurement range or ranges appropriate to accurately measure air pollution episode concentrations of PM10,

    5.1.2.2 Which has a sample period appropriate for short-term PM10 measurements, and

    5.1.2.3 For which a quantitative relationship to a reference or equivalent method for PM10 has been established at the use site. Procedures for establishing a quantitative site-specific relationship are contained in reference 1.

    5.2 PM10 methods other than the reference method are not covered under the quality assessment requirements of appendix A to this part. Therefore, States must develop and implement their own quality assessment procedures for those methods allowed under this section 5. These quality assessment procedures should be similar or analogous to those described in section 3 of appendix A to this part for the PM10 reference method.

    6.0 References

    1. Pelton, D. J. Guideline for Particulate Episode Monitoring Methods, GEOMET Technologies, Inc., Rockville, MD. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Contract No. 68-02-3584. EPA 450/4-83-005. February 1983.

    2. Technical Assistance Document For Sampling and Analysis of Ozone Precursors. Atmospheric Research and Exposure Assessment Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. EPA 600/8-91-215. October 1991.

    3. Quality Assurance Handbook for Air Pollution Measurement Systems: Volume IV. Meteorological Measurements. Atmospheric Research and Exposure Assessment Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. EPA 600/4-90-0003. August 1989.

    4. Eldred, R.A., Cahill, T.A., Wilkenson, L.K., et al., Measurements of fine particles and their chemical components in the IMPROVE/NPS networks, in Transactions of the International Specialty Conference on Visibility and Fine Particles, Air and Waste Management Association: Pittsburgh, PA, 1990; pp. 187-196.

    5. Sisler, J.F., Huffman, D., and Latimer, D.A.; Spatial and temporal patterns and the chemical composition of the haze in the United States: An analysis of data from the IMPROVE network, 1988-1991, ISSN No. 0737-5253-26, National Park Service, Ft. Collins, CO, 1993.

    6. Eldred, R.A., Cahill, T.A., Pitchford, M., and Malm, W.C.; IMPROVE—a new remote area particulate monitoring system for visibility studies, Proceedings of the 81st Annual Meeting of the Air Pollution Control Association, Dallas, Paper 88-54.3, 1988.

    7. Data Quality Objectives (DQOs) for Relating Federal Reference Method (FRM) and Continuous PM2.5 Measurements to Report an Air Quality Index (AQI). Office of Air Quality Planning and Standards, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. EPA 454/B-02-2002. November 2002.

    36. Appendix D to part 58 is revised to read as follows:

    Appendix D to Part 58—Network Design Criteria for Ambient Air Quality Monitoring

    1. Monitoring Objectives and Spatial Scales

    2. General Monitoring Requirements

    3. Design Criteria for NCore Sites

    4. Pollutant-Specific Design Criteria for SLAMS Sites

    5. Design Criteria for Photochemical Assessment Monitoring Stations (PAMS)

    6. References

    1. Monitoring Objectives and Spatial Scales

    The purpose of this appendix is to describe monitoring objectives and general criteria to be applied in establishing the required SLAMS ambient air quality monitoring stations and for choosing general locations for additional monitoring sites. This appendix also describes specific requirements for the number and location of FRM, FEM, and ARM sites for specific pollutants, NCore multipollutant sites, PM10 mass sites, PM2.5 mass sites, chemically-speciated PM2.5 sites, and O3 precursor measurements sites (PAMS). These criteria will be used by EPA in evaluating the adequacy of the air pollutant monitoring networks.

    1.1 Monitoring Objectives. The ambient air monitoring networks must be designed to meet three basic monitoring objectives. These basic objectives are listed below; the order of the list does not imply priority. Each objective is important and must be considered individually.

    (a) Provide air pollution data to the general public in a timely manner. Data can be presented to the public in a number of ways, including through air quality maps, newspapers, and Internet sites, and as part of weather forecasts and public advisories.

    (b) Support compliance with ambient air quality standards and emissions strategy development. Data from FRM, FEM, and ARM monitors for NAAQS pollutants will be used for comparing an area's air pollution levels against the NAAQS. Data from monitors of various types can be used in the development of attainment and maintenance plans. SLAMS, and especially NCore station data, will be used to evaluate the regional air quality models used in developing emission strategies, and to track trends in air pollution abatement control measures' impact on improving air quality. In monitoring locations near major air pollution sources, source-oriented monitoring data can provide insight into how well industrial sources are controlling their pollutant emissions.

    (c) Support for air pollution research studies. Air pollution data from the NCore network can be used to supplement data collected by researchers working on health effects assessments and atmospheric processes, or for monitoring methods development work.

    1.1.1 In order to support the air quality management work indicated in the three basic air monitoring objectives, a network must be designed with a variety of types of monitoring sites. Monitoring sites must be capable of informing managers about many things, including peak air pollution levels, typical levels in populated areas, air pollution transported into and out of a city or region, and air pollution levels near specific sources. Six general site types serve these needs:

    (a) Sites located to determine the highest concentrations expected to occur in the area covered by the network.

    (b) Sites located to measure typical concentrations in areas of high population density.

    (c) Sites located to determine the impact of significant sources or source categories on air quality.

    (d) Sites located to determine general background concentration levels.

    (e) Sites located to determine the extent of regional pollutant transport among populated areas; and in support of secondary standards.

    (f) Sites located to measure air pollution impacts on visibility, vegetation damage, or other welfare-based impacts.

    1.1.2 This appendix contains criteria for the basic air monitoring requirements. The total number of monitoring sites that will serve the variety of data needs will be substantially higher than these minimum requirements provide. The optimum size of a particular network involves trade-offs among data needs and available resources. This regulation intends to provide for national air monitoring needs, and to lend support for the flexibility necessary to meet data collection needs of area air quality managers. The EPA, State, and local agencies will periodically collaborate on network design issues through the network assessment process outlined in § 58.10.

    1.1.3 This appendix focuses on the relationship between monitoring objectives, site types, and the geographic location of monitoring sites. Included are a rationale and set of general criteria for identifying candidate site locations in terms of physical characteristics which most closely match a specific monitoring objective. The criteria for more specifically locating the monitoring site, including spacing from roadways and vertical and horizontal probe and path placement, are described in appendix E to this part.

    1.2 Spatial Scales. (a) To clarify the nature of the link between general monitoring objectives, site types, and the physical location of a particular monitor, the concept of spatial scale of representativeness is defined. The goal in locating monitors is to correctly match the spatial scale represented by the sample of monitored air with the spatial scale most appropriate for the monitoring site type, air pollutant to be measured, and the monitoring objective.

    (b) Thus, spatial scale of representativeness is described in terms of the physical dimensions of the air parcel nearest to a monitoring site throughout which actual pollutant concentrations are reasonably similar. The scales of representativeness of most interest for the monitoring site types described above are as follows:

    (1) Microscale—Defines the concentrations in air volumes associated with area dimensions ranging from several meters up to about 100 meters.

    (2) Middle scale—Defines the concentration typical of areas up to several city blocks in size with dimensions ranging from about 100 meters to 0.5 kilometer.

    (3) Neighborhood scale—Defines concentrations within some extended area of the city that has relatively uniform land use with dimensions in the 0.5 to 4.0 kilometers range. The neighborhood and urban scales listed below have the potential to overlap in applications that concern secondarily formed or homogeneously distributed air pollutants.

    (4) Urban scale—Defines concentrations within an area of city-like dimensions, on the order of 4 to 50 kilometers. Within a city, the geographic placement of sources may result in there being no single site that can be said to represent air quality on an urban scale.

    (5) Regional scale—Defines usually a rural area of reasonably homogeneous geography without large sources, and extends from tens to hundreds of kilometers. Start Printed Page 61317

    (6) National and global scales—These measurement scales represent concentrations characterizing the nation and the globe as a whole.
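
    As a rough illustration, the dimensional cutoffs in the definitions above can be expressed as a simple classifier; the boundary handling here is a choice made for illustration, not part of the definitions.

        def spatial_scale(dimension_km):
            # Cutoffs restate section 1.2(b) of this appendix.
            if dimension_km <= 0.1:
                return "microscale"
            if dimension_km <= 0.5:
                return "middle scale"
            if dimension_km <= 4.0:
                return "neighborhood scale"
            if dimension_km <= 50.0:
                return "urban scale"
            return "regional scale or larger"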

    (c) Proper siting of a monitor requires specification of the monitoring objective, the types of sites necessary to meet the objective, and then the desired spatial scale of representativeness. For example, consider the case where the objective is to determine NAAQS compliance by understanding the maximum ozone concentrations for an area. Such areas would most likely be located downwind of a metropolitan area, quite likely in a suburban residential area where children and other susceptible individuals are likely to be outdoors. Sites located in these areas are most likely to represent an urban scale of measurement. In this example, physical location was determined by considering ozone precursor emission patterns, public activity, and meteorological characteristics affecting ozone formation and dispersion. Thus, spatial scale of representativeness was not used in the selection process but was a result of site location.

    (d) In some cases, the physical location of a site is determined from joint consideration of both the basic monitoring objective and the type of monitoring site desired, or required by this appendix. For example, to determine PM2.5 concentrations which are typical over a geographic area having relatively high PM2.5 concentrations, a neighborhood scale site is more appropriate. Such a site would likely be located in a residential or commercial area having a high overall PM2.5 emission density but not in the immediate vicinity of any single dominant source. Note that in this example, the desired scale of representativeness was an important factor in determining the physical location of the monitoring site.

    (e) In either case, classification of the monitor by its type and spatial scale of representativeness is necessary and will aid in interpretation of the monitoring data for a particular monitoring objective (e.g., public reporting, NAAQS compliance, or research support).

    (f) Table D-1 of this appendix illustrates the relationship between the various site types that can be used to support the three basic monitoring objectives, and the scales of representativeness that are generally most appropriate for that type of site.

    Table D-1 of Appendix D to Part 58. Relationship Between Site Types and Scales of Representativeness

    Site type | Appropriate siting scales
    1. Highest concentration | Micro, middle, neighborhood (sometimes urban or regional for secondarily formed pollutants).
    2. Population oriented | Neighborhood, urban.
    3. Source impact | Micro, middle, neighborhood.
    4. General/background & regional transport | Urban, regional.
    5. Welfare-related impacts | Urban, regional.

    2. General Monitoring Requirements

    (a) The National ambient air monitoring system includes several types of monitoring stations, each targeting a key data collection need and each varying in technical sophistication.

    (b) Research grade sites are platforms for scientific studies, whether concerned with health or welfare impacts, measurement methods development, or other atmospheric studies. These sites may be collaborative efforts between regulatory agencies and researchers, with specific scientific objectives for each. Data from these sites might be collected with both traditional and experimental techniques, and data collection might involve specific laboratory analyses not common in routine measurement programs. The research grade sites are not required by regulation; however, they are included here due to their important role in supporting the air quality management program.

    (c) The NCore multipollutant sites are sites that measure multiple pollutants in order to provide support to integrated air quality management data needs. NCore sites include both neighborhood and urban scale measurements in general, in a selection of metropolitan areas and a limited number of more rural locations. Continuous monitoring methods are to be used at the NCore sites when available for a pollutant to be measured, as it is important to have data collected over common time periods for integrated analyses. NCore multipollutant sites are intended to be long-term sites useful for a variety of applications including air quality trends analyses, model evaluation, and tracking metropolitan area statistics. As such, the NCore sites should be placed away from direct emission sources that could substantially impact the ability to detect area-wide concentrations. The Administrator must approve the NCore sites.

    (d) Monitoring sites designated as SLAMS sites, but not as NCore sites, are intended to address specific air quality management interests, and as such, are frequently single-pollutant measurement sites. The EPA Regional Administrator must approve the SLAMS sites.

    (e) This appendix uses the statistical-based definitions for metropolitan areas provided by the Office of Management and Budget and the Census Bureau. These areas are referred to as metropolitan statistical areas (MSA), micropolitan statistical areas, core-based statistical areas (CBSA), and combined statistical areas (CSA). A CBSA associated with at least one urbanized area of 50,000 population or greater is termed a Metropolitan Statistical Area (MSA). A CBSA associated with at least one urbanized cluster of at least 10,000 population or greater is termed a Micropolitan Statistical Area. CSA consist of two or more adjacent CBSA. In this appendix, the term MSA is used to refer to a Metropolitan Statistical Area. By definition, both MSA and CSA have a high degree of integration; however, many such areas cross State or other political boundaries. MSA and CSA may also cross more than one air shed. The EPA recognizes that State or local agencies must consider MSA/CSA boundaries and their own political boundaries and geographical characteristics in designing their air monitoring networks. The EPA recognizes that there may be situations where the EPA Regional Administrator and the affected State or local agencies may need to augment or to divide the overall MSA/CSA monitoring responsibilities and requirements among these various agencies to achieve an effective network design. Full monitoring requirements apply separately to each affected State or local agency in the absence of an agreement between the affected agencies and the EPA Regional Administrator.

    3. Design Criteria for NCore Sites

    (a) Each State (i.e., the fifty States, District of Columbia, Puerto Rico, and the Virgin Islands) is required to operate at least one NCore site. States may delegate this requirement to a local agency. States with many MSAs often have multiple air sheds with unique characteristics and, in many cases, elevated air pollution. These States include, at a minimum, California, Florida, Illinois, Michigan, New York, North Carolina, Ohio, Pennsylvania, and Texas. These States are required to identify one to two additional NCore sites in order to account for their unique situations. These additional sites shall be located to avoid proximity to large emission sources. Any State or local agency can propose additional candidate NCore sites or modifications to these requirements for approval by the Administrator. The NCore locations should be leveraged with other multipollutant air monitoring sites including PAMS sites, National Air Toxics Trends Stations (NATTS) sites, CASTNET sites, and STN sites. Site leveraging includes using the same monitoring platform and equipment to meet the objectives of the variety of programs where possible and advantageous.

    (b) The NCore sites must measure, at a minimum, PM2.5 particle mass using continuous and integrated/filter-based samplers, speciated PM2.5, PM10-2.5 particle mass, speciated PM10-2.5, O3, SO2, CO, NO/NOy, wind speed, wind direction, relative humidity, and ambient temperature.

    (1) Although the measurement of NOy is required in support of a number of monitoring objectives, available commercial instruments may indicate little difference in their measurement of NOy compared to the conventional measurement of NOX, particularly in areas with relatively fresh sources of nitrogen emissions. Therefore, in areas with negligible expected difference between NOy and NOX measured concentrations, the Administrator may allow for waivers that permit NOX monitoring to be substituted for the required NOy monitoring at applicable NCore sites.

    (2) EPA recognizes that, in some cases, the physical location of the NCore site may not be suitable for representative meteorological measurements due to the site's physical surroundings. It is also possible that nearby meteorological measurements may be able to fulfill this data need. In these cases, the requirement for meteorological monitoring can be waived by the Administrator.

    (c) In addition to the continuous measurements listed above, 10 of the NCore locations must also measure lead (Pb) either at the same sites or elsewhere within the MSA/CSA boundary. These ten Pb sites are included within the NCore networks because they are intended to be long-term in operation and not impacted directly by a single Pb source. These Pb monitors must be located in the most populated MSA/CSA in each of the 10 EPA Regions. Alternatively, it is also acceptable to use the Pb concentration data provided at urban air toxics sites. In approving any substitutions, the Administrator must consider whether these alternative sites are suitable for collecting long-term lead trends data for the broader area.

    (d) Siting criteria are provided for urban and rural locations. Sites with significant historical records that do not meet siting criteria may be approved as NCore by the Administrator. Sites with the suite of NCore measurements that are explicitly designed for other monitoring objectives are exempt from these siting criteria (e.g., a near-roadway site).

    (1) Urban NCore stations are to be generally located at urban or neighborhood scale to provide representative concentrations of exposure expected throughout the metropolitan area; however, a middle-scale site may be acceptable in cases where the site can represent many such locations throughout a metropolitan area.

    (2) Rural NCore stations are to be located to the maximum extent practicable at a regional or larger scale away from any large local emission source, so that they represent ambient concentrations over an extensive area.

    4. Pollutant-Specific Design Criteria for SLAMS Sites

    4.1 Ozone (O3) Design Criteria. (a) State, and where appropriate, local agencies must operate O3 sites for various locations depending upon area size (in terms of population and geographic characteristics) and typical peak concentrations (expressed as percentages below or near the O3 NAAQS). Specific SLAMS O3 site minimum requirements are included in Table D-2 of this appendix. The NCore sites are expected to complement the O3 data collection that takes place at single-pollutant SLAMS sites, and both types of sites can be used to meet the network minimum requirements. The total number of O3 sites needed to support the basic monitoring objectives of public data reporting, air quality mapping, compliance, and understanding O3-related atmospheric processes will include more sites than the minimum numbers required in Table D-2 of this appendix. The EPA Regional Administrator and the responsible State or local air monitoring agency must work together to design and/or maintain the most appropriate O3 network to service the variety of data needs in an area.

    Table D-2 of Appendix D to Part 58.— SLAMS Minimum O3 Monitoring Requirements

    MSA population 1, 2 | Most recent 3-year design value concentrations ≥85% of any O3 NAAQS 3 | Most recent 3-year design value concentrations <85% of any O3 NAAQS 3, 4
    >10 million | 4 | 2
    4-10 million | 3 | 1
    350,000-<4 million | 2 | 1
    50,000-<350,000 5 | 1 | 0
    1 Minimum monitoring requirements apply to the Metropolitan statistical area (MSA).
    2 Population based on latest available census figures.
    3 The ozone (O3) National Ambient Air Quality Standards (NAAQS) levels and forms are defined in 40 CFR part 50.
    4 These minimum monitoring requirements apply in the absence of a design value.
    5 Metropolitan statistical areas (MSA) must contain an urbanized area of 50,000 or more population.
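
    For illustration, Table D-2 reduces to the following lookup. The function and the treatment of boundary populations are assumptions layered on the table, and footnote 4's no-design-value case is mapped to the <85 percent column.

        def min_o3_monitors(msa_population, design_value_ratio=None):
            """design_value_ratio: most recent 3-year design value divided by
            the O3 NAAQS level, or None when no design value exists (treated
            as the <85 percent column per footnote 4)."""
            high = (design_value_ratio is not None
                    and design_value_ratio >= 0.85)
            if msa_population > 10_000_000:
                return 4 if high else 2
            if msa_population >= 4_000_000:
                return 3 if high else 1
            if msa_population >= 350_000:
                return 2 if high else 1
            if msa_population >= 50_000:
                return 1 if high else 0
            return 0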

    (b) Within an O3 network, at least one O3 site for each MSA, or CSA if multiple MSAs are involved, must be designed to record the maximum concentration for that particular metropolitan area. More than one maximum concentration site may be necessary in some areas. Table D-2 of this appendix does not account for the full breadth of additional factors that would be considered in designing a complete O3 monitoring program for an area. Some of these additional factors include geographic size, population density, complexity of terrain and meteorology, adjacent O3 monitoring programs, air pollution transport from neighboring areas, and measured air quality in comparison to all forms of the O3 NAAQS (i.e., 8-hour and 1-hour forms). Networks must be designed to account for all of these area characteristics. Network designs must be re-examined in periodic network assessments. Deviations from the above O3 requirements are allowed if approved by the EPA Regional Administrator.

    (c) The appropriate spatial scales for O3 sites are neighborhood, urban, and regional. Since O3 requires appreciable formation time, the mixing of reactants and products occurs over large volumes of air, and this reduces the importance of monitoring small scale spatial variability.

    (1) Neighborhood scale—Measurements in this category represent conditions throughout some reasonably homogeneous urban sub-region, with dimensions of a few kilometers. Homogeneity refers to pollutant concentrations. Neighborhood scale data will provide valuable information for developing, testing, and revising concepts and models that describe urban/regional concentration patterns. These data will be useful to the understanding and definition of processes that take periods of hours to occur and hence involve considerable mixing and transport. Under stagnation conditions, a site located in the neighborhood scale may also experience peak concentration levels within a metropolitan area.

    (2) Urban scale—Measurement in this scale will be used to estimate concentrations over large portions of an urban area with dimensions of several kilometers to 50 or more kilometers. Such measurements will be used for determining trends, and designing area-wide control strategies. The urban scale sites would also be used to measure high concentrations downwind of the area having the highest precursor emissions.

    (3) Regional scale—This scale of measurement will be used to typify concentrations over large portions of a metropolitan area and even larger areas with dimensions of as much as hundreds of kilometers. Such measurements will be useful for assessing the O3 that is transported to and from a metropolitan area, as well as background concentrations. In some situations, particularly when considering very large metropolitan areas with complex source mixtures, regional scale sites can be the maximum concentration location.

    (d) EPA's technical guidance documents on O3 monitoring network design should be used to evaluate the adequacy of each existing O3 monitor, to relocate an existing site, or to locate any new O3 sites.

    (e) For locating a neighborhood scale site to measure typical city concentrations, a reasonably homogeneous geographical area near the center of the region should be selected which is also removed from the influence of major NOX sources. For an urban scale site to measure the high concentration areas, the emission inventories should be used to define the extent of the area of important nonmethane hydrocarbons and NOX emissions. The meteorological conditions that occur during periods of maximum photochemical activity should be determined. These periods can be identified by examining the meteorological conditions that occur on the highest O3 air quality days. Trajectory analyses, an evaluation of wind and emission patterns on high O3 days, can also be useful in evaluating an O3 monitoring network. In areas without any previous O3 air quality measurements, meteorological and O3 precursor emissions information would be useful.

    (f) Once the meteorological and air quality data are reviewed, the prospective maximum concentration monitor site should be selected in a direction from the city that is most likely to observe the highest O3 concentrations, more specifically, downwind during periods of photochemical activity. In many cases, these maximum concentration O3 sites will be located 10 to 30 miles or more downwind from the urban area where maximum O3 precursor emissions originate. The downwind direction and appropriate distance should be determined from historical meteorological data collected on days which show the potential for producing high O3 levels. Monitoring agencies are to consult with their EPA Regional Office when considering siting a maximum O3 concentration site.

    (g) In locating a neighborhood scale site which is to measure high concentrations, the same procedures used for the urban scale are followed except that the site should be located closer to the areas bordering on the center city or slightly further downwind in an area of high density population.

    (h) For regional scale background monitoring sites, a meteorological analysis similar to that for the maximum concentration sites may inform decisions on site location. Regional scale sites may be located to provide data on O3 transport between cities, to serve as background sites, or for other data collection purposes. Both area characteristics, such as meteorology, and the data collection objectives, such as transport, must be jointly considered for a regional scale site to be useful.

    (i) Since O3 levels decrease significantly in the colder parts of the year in many areas, O3 is required to be monitored at SLAMS monitoring sites only during the “ozone season” as designated in the AQS files on a State-by-State basis and described below in Table D-3 of this appendix. Deviations from the O3 monitoring season must be approved by the EPA Regional Administrator, documented within the annual monitoring network plan, and updated in AQS. Information on how to analyze O3 data to support a change to the O3 season in support of the 8-hour standard for a specific State can be found in reference 8 to this appendix.

    Table D-3 to Appendix D of Part 58. Ozone Monitoring Season by State

    State | Begin month | End month
    Alabama | March | October
    Alaska | April | October
    Arizona | January | December
    Arkansas | March | November
    California | January | December
    Colorado | March | September
    Connecticut | April | September
    Delaware | April | October
    District of Columbia | April | October
    Florida | March | October
    Georgia | March | October
    Hawaii | January | December
    Idaho | May | September
    Illinois | April | October
    Indiana | April | September
    Iowa | April | October
    Kansas | April | October
    Kentucky | March | October
    Louisiana AQCR 019, 022 | March | October
    Louisiana AQCR 106 | January | December
    Maine | April | September
    Maryland | April | October
    Massachusetts | April | September
    Michigan | April | September
    Minnesota | April | October
    Mississippi | March | October
    Missouri | April | October
    Montana | June | September
    Nebraska | April | October
    Nevada | January | December
    New Hampshire | April | September
    New Jersey | April | October
    New Mexico | January | December
    New York | April | October
    North Carolina | April | October
    North Dakota | May | September
    Ohio | April | October
    Oklahoma | March | November
    Oregon | May | September
    Pennsylvania | April | October
    Puerto Rico | January | December
    Rhode Island | April | September
    South Carolina | April | October
    South Dakota | June | September
    Tennessee | March | October
    Texas AQCR 106, 153, 213, 214, 216 | January | December
    Texas AQCR 022, 210, 211, 212, 215, 217, 218 | March | October
    Utah | May | September
    Vermont | April | September
    Virginia | April | October
    Washington | May | September
    West Virginia | April | October
    Wisconsin | April 15 | October 15
    Wyoming | April | October
    American Samoa | January | December
    Guam | January | December
    Virgin Islands | January | December
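
    A few rows of Table D-3 can be restated as a simple lookup, sketched below for illustration only; note that Wisconsin's mid-month bounds (April 15-October 15) do not fit the whole-month simplification used here.

        from datetime import date

        # Month numbers restate a handful of Table D-3 rows above.
        OZONE_SEASON = {
            "Alabama": (3, 10),   # March-October
            "Arizona": (1, 12),   # year-round
            "Colorado": (3, 9),   # March-September
            "New York": (4, 10),  # April-October
        }

        def in_ozone_season(state, day):
            begin, end = OZONE_SEASON[state]
            return begin <= day.month <= end

        print(in_ozone_season("Colorado", date(2006, 7, 4)))  # True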

    4.2 Carbon Monoxide (CO) Design Criteria. (a) There are no minimum requirements for the number of CO monitoring sites. Continued operation of existing SLAMS CO sites using FRM or FEM is required until discontinuation is approved by the EPA Regional Administrator. Where SLAMS CO monitoring is ongoing, at least one site must be a maximum concentration site for that area under investigation.

    (b) Microscale and middle scale measurements are useful site classifications for SLAMS sites since most people have the potential for exposure on these scales. Carbon monoxide maxima occur primarily in areas near major roadways and intersections with high traffic density and often poor atmospheric ventilation.

    (1) Microscale—This scale applies when air quality measurements are to be used to represent distributions within street canyons, over sidewalks, and near major roadways. In the case of carbon monoxide, microscale measurements in one location can often be considered representative of other similar locations in a city.

    (2) Middle scale—Middle scale measurements are intended to represent areas with dimensions from 100 meters to 0.5 kilometer. In certain cases, middle scale measurements may apply to areas that have a total length of several kilometers, such as “line” emission source areas. Such emission source areas include air quality along a commercially developed street or shopping plaza, freeway corridors, parking lots, and feeder streets.

    (c) After the spatial scale and type of site has been determined to meet the monitoring objective for each location, the technical guidance in reference 2 of this appendix should be used to evaluate the adequacy of each existing CO site and must be used to relocate an existing site or to locate any new sites.

    4.3 Nitrogen Dioxide (NO2) Design Criteria. (a) There are no minimum requirements for the number of NO2 monitoring sites. Continued operation of existing SLAMS NO2 sites using FRM or FEM is required until discontinuation is approved by the EPA Regional Administrator. Where SLAMS NO2 monitoring is ongoing, at least one NO2 site in the area must be located to measure the maximum concentration of NO2.

    (b) NO/NOy measurements are included within the NCore multipollutant site requirements and the PAMS program. These NO/NOy measurements will produce conservative estimates for NO2 that can be used to track continued compliance with the NO2 NAAQS. NO/NOy monitors are used at these sites because it is important to collect data on total reactive nitrogen species for understanding O3 photochemistry.

    4.4 Sulfur Dioxide (SO2) Design Criteria. (a) There are no minimum requirements for the number of SO2 monitoring sites. Continued operation of existing SLAMS SO2 sites using FRM or FEM is required until discontinuation is approved by the EPA Regional Administrator. Where SLAMS SO2 monitoring is ongoing, at least one of the SLAMS SO2 sites must be a maximum concentration site for that specific area.

    (b) The appropriate spatial scales for SO2 SLAMS monitoring are the microscale, middle, and possibly neighborhood scales. The multi-pollutant NCore sites can provide for metropolitan area trends analyses and general control strategy progress tracking. Other SLAMS sites are expected to provide data that are useful in specific compliance actions, for maintenance plan agreements, or for measuring near specific stationary sources of SO2.

    (1) Micro and middle scale—Some data uses associated with microscale and middle scale measurements for SO2 include assessing the effects of control strategies to reduce concentrations (especially for the 3-hour and 24-hour averaging times) and monitoring air pollution episodes.

    (2) Neighborhood scale—This scale applies where there is a need to collect air quality data as part of an ongoing SO2 stationary source impact investigation. Typical locations might include suburban areas adjacent to SO2 stationary sources, or sites for determining background concentrations as part of studies of population responses to SO2 exposure.

    (c) Technical guidance in reference 1 of this appendix should be used to evaluate the adequacy of each existing SO2 site, to relocate an existing site, or to locate new sites.

    4.5 Lead (Pb) Design Criteria. (a) State, and where appropriate, local agencies are required to conduct Pb monitoring for all areas where Pb levels have been shown or are expected to be of concern over the most recent 2 years. As a minimum, there must be two SLAMS sites in any area where Pb concentrations currently exceed or have exceeded the Pb NAAQS in the most recent 2 years, and at least one of these two required sites must be a maximum concentration site. Where the Pb air quality violations are widespread or the emissions density, topography, or population locations are complex and varied, the EPA Regional Administrator may require more than two Pb ambient air monitoring sites.

    (b) The most important spatial scales to effectively characterize the emissions from point sources are the micro, middle, and neighborhood scales.

    (1) Microscale—This scale would typify areas in close proximity to lead point sources. Emissions from point sources such as primary and secondary lead smelters and primary copper smelters may, under fumigation conditions, likewise result in high ground level concentrations at the microscale. In the latter case, the microscale would represent an area impacted by the plume with dimensions extending up to approximately 100 meters. Data collected at microscale sites provide information for evaluating and developing “hot-spot” control measures.

    (2) Middle scale—This scale generally represents Pb air quality levels in areas up to several city blocks in size with dimensions on the order of approximately 100 meters to 500 meters. The middle scale may, for example, include schools and playgrounds in center city areas which are close to major Pb point sources. Pb monitors in such areas are desirable because of the higher sensitivity of children to exposures of elevated Pb concentrations (reference 3 of this appendix). Emissions from point sources frequently impact areas at which single sites may be located to measure concentrations representing middle spatial scales.

    (3) Neighborhood scale—The neighborhood scale would characterize air quality conditions throughout some relatively uniform land use areas with dimensions in the 0.5 to 4.0 kilometer range. Sites of this scale would provide monitoring data in areas representing conditions where children live and play. Monitoring in such areas is important since this segment of the population is more susceptible to the effects of Pb. Where a neighborhood site is located away from immediate Pb sources, the site may be very useful in representing typical air quality values for a larger residential area, and therefore suitable for population exposure and trends analyses.

    (c) Technical guidance is found in references 4 and 5 of this appendix. These documents provide additional guidance on locating sites to meet specific urban area monitoring objectives and should be used in locating new sites or evaluating the adequacy of existing sites.

    4.6 Particulate Matter (PM10) Design Criteria. (a) State, and where applicable local, agencies must operate the minimum number of required PM10 SLAMS sites listed in Table D-4 of this appendix.

    Table D-4 of Appendix D to Part 58. PM10 Minimum Monitoring Requirements (Number of Stations per MSA) 1

    Population category      High concentration 2    Medium concentration 3    Low concentration 4, 5
    >1,000,000               6-10                    4-8                       2-4
    500,000-1,000,000        4-8                     2-4                       1-2
    250,000-500,000          3-4                     1-2                       0-1
    100,000-250,000          1-2                     0-1                       0
    1 Selection of urban areas and actual numbers of stations per area within the ranges shown in this table will be jointly determined by EPA and the State Agency.
    2 High concentration areas are those for which ambient PM10 data show ambient concentrations exceeding the PM10 NAAQS by 20 percent or more.
    3 Medium concentration areas are those for which ambient PM10 data show ambient concentrations exceeding 80 percent of the PM10 NAAQS.
    4 Low concentration areas are those for which ambient PM10 data show ambient concentrations less than 80 percent of the PM10 NAAQS.
    5 These minimum monitoring requirements apply in the absence of a design value.
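
    Table D-4 lends itself to a simple lookup. The following minimal Python sketch encodes the table's ranges for network planning purposes only; the dictionary layout, function name, and the convention of assigning a boundary population to the higher bracket are illustrative assumptions, and actual station counts within these ranges are determined jointly by EPA and the State agency (footnote 1).

```python
# Table D-4 station ranges, keyed by population bracket and concentration
# category. Bracket boundaries overlap in the table; assigning a boundary
# population to the higher bracket is an assumed convention.
PM10_MIN_STATIONS = {
    ">1,000,000":        {"high": (6, 10), "medium": (4, 8), "low": (2, 4)},
    "500,000-1,000,000": {"high": (4, 8),  "medium": (2, 4), "low": (1, 2)},
    "250,000-500,000":   {"high": (3, 4),  "medium": (1, 2), "low": (0, 1)},
    "100,000-250,000":   {"high": (1, 2),  "medium": (0, 1), "low": (0, 0)},
}

def pm10_station_range(population: int, category: str) -> tuple[int, int]:
    """Return the (minimum, maximum) Table D-4 station range for an MSA."""
    if population > 1_000_000:
        bracket = ">1,000,000"
    elif population >= 500_000:
        bracket = "500,000-1,000,000"
    elif population >= 250_000:
        bracket = "250,000-500,000"
    elif population >= 100_000:
        bracket = "100,000-250,000"
    else:
        raise ValueError("Table D-4 does not cover MSAs under 100,000")
    return PM10_MIN_STATIONS[bracket][category]

print(pm10_station_range(750_000, "medium"))  # (2, 4)
```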

    (b) Although microscale monitoring may be appropriate in some circumstances, the most important spatial scales for effectively characterizing PM10 emissions from both mobile and stationary sources are the middle and neighborhood scales.

    (1) Microscale—This scale would typify areas such as downtown street canyons, traffic corridors, and fence line stationary source monitoring locations where the general public could be exposed to maximum PM10 concentrations. Microscale particulate matter sites should be located near inhabited buildings or locations where the general public can be expected to be exposed to the concentration measured. Emissions from stationary sources such as primary and secondary smelters, power plants, and other large industrial processes may, under certain plume conditions, likewise result in high ground level concentrations at the microscale. In the latter case, the microscale would represent an area impacted by the plume with dimensions extending up to approximately 100 meters. Data collected at microscale sites provide information for evaluating and developing hot spot control measures.

    (2) Middle scale—Much of the short-term public exposure to coarse fraction particles (PM10) is on this scale and on the neighborhood scale. People moving through downtown areas or living near major roadways or stationary sources may encounter particulate pollution that would be adequately characterized by measurements of this spatial scale. Middle scale PM10 measurements can be appropriate for the evaluation of possible short-term exposure public health effects. In many situations, monitoring sites that are representative of micro-scale or middle-scale impacts are not unique and are representative of many similar situations. This can occur along traffic corridors or other locations in a residential district. In this case, one location is representative of a number of small scale sites and is appropriate for evaluation of long-term or chronic effects. This scale also includes the characteristic concentrations for other areas with dimensions of a few hundred meters, such as the parking lot and feeder streets associated with shopping centers, stadia, and office buildings. In the case of PM10, unpaved or seldom swept parking lots associated with these sources could be an important source in addition to the vehicular emissions themselves.

    (3) Neighborhood scale—Measurements in this category represent conditions throughout some reasonably homogeneous urban sub-region with dimensions of a few kilometers and of generally more regular shape than the middle scale. Homogeneity refers to the particulate matter concentrations, as well as the land use and land surface characteristics. In some cases, a location carefully chosen to provide neighborhood scale data would represent not only the immediate neighborhood but also neighborhoods of the same type in other parts of the city. Neighborhood scale PM10 sites provide information about trends and compliance with standards because they often represent conditions in areas where people commonly live and work for extended periods. Neighborhood scale data could provide valuable information for developing, testing, and revising models that describe the larger-scale concentration patterns, especially those models relying on spatially smoothed emission fields for inputs. The neighborhood scale measurements could also be used for neighborhood comparisons within or between cities.

    4.7 Fine Particulate Matter (PM2.5) Design Criteria.

    4.7.1 General Requirements. (a) State, and where applicable local, agencies must operate the minimum number of required PM2.5 SLAMS sites listed in Table D-5 of this appendix. The NCore sites are expected to complement the PM2.5 data collection that takes place at non-NCore SLAMS sites, and both types of sites can be used to meet the minimum PM2.5 network requirements. Deviations from these PM2.5 monitoring requirements must be approved by the EPA Regional Administrator.

    Table D-5 of Appendix D to Part 58. PM2.5 Minimum Monitoring Requirements

    MSA population 1, 2      Most recent 3-year design value      Most recent 3-year design value
                             ≥85% of any PM2.5 NAAQS 3            <85% of any PM2.5 NAAQS 3, 4
    >1,000,000               3                                    2
    500,000-1,000,000        2                                    1
    50,000-<500,000 5        1                                    0
    1 Minimum monitoring requirements apply to the Metropolitan statistical area (MSA).
    2 Population based on latest available census figures.
    3 The PM2.5 National Ambient Air Quality Standards (NAAQS) levels and forms are defined in 40 CFR part 50.
    4 These minimum monitoring requirements apply in the absence of a design value.
    5 Metropolitan statistical areas (MSA) must contain an urbanized area of 50,000 or more population.

    (b) Specific Design Criteria for PM2.5. The required monitoring stations or sites must be sited to represent community-wide air quality. These sites can include sites collocated at PAMS. These monitoring stations will typically be at the neighborhood or urban scale; however, in certain instances where population-oriented micro- or middle-scale PM2.5 monitoring is determined by the Regional Administrator to represent many such locations throughout a metropolitan area, these smaller scales can be considered to represent community-wide air quality.

    (1) At least one monitoring station is to be sited in a population-oriented area of expected maximum concentration.

    (2) For areas with more than one required SLAMS, a monitoring station is to be sited in an area of poor air quality.

    (3) Additional technical guidance for siting PM2.5 monitors is provided in references 6 and 7 of this appendix.

    (c) For PM2.5, the most important spatial scale for effectively characterizing the emissions of particulate matter from both mobile and stationary sources is the neighborhood scale. For purposes of establishing monitoring sites to represent large homogeneous areas other than the above scales of representativeness, and to characterize regional transport, urban or regional scale sites would also be needed. Most PM2.5 monitoring in urban areas should be representative of a neighborhood scale.

    (1) Microscale—This scale would typify areas such as downtown street canyons and traffic corridors where the general public would be exposed to maximum concentrations from mobile sources. In some circumstances, the microscale is appropriate for particulate sites; community-oriented SLAMS sites measured at the microscale level should, however, be limited to urban sites that are representative of long-term human exposure and of many such microenvironments in the area. In general, microscale particulate matter sites should be located near inhabited buildings or locations where the general public can be expected to be exposed to the concentration measured. Emissions from stationary sources such as primary and secondary smelters, power plants, and other large industrial processes may, under certain plume conditions, likewise result in high ground level concentrations at the microscale. In the latter case, the microscale would represent an area impacted by the plume with dimensions extending up to approximately 100 meters. Data collected at microscale sites provide information for evaluating and developing hot spot control measures. Unless these sites are indicative of population-oriented monitoring, they may be more appropriately classified as SPM.

    (2) Middle scale—People moving through downtown areas, or living near major roadways, encounter particle concentrations that would be adequately characterized by this spatial scale. Thus, measurements of this type would be appropriate for the evaluation of possible short-term exposure public health effects of particulate matter pollution. In many situations, monitoring sites that are representative of microscale or middle-scale impacts are not unique and are representative of many similar situations. This can occur along traffic corridors or other locations in a residential district. In this case, one location is representative of a number of small scale sites and is appropriate for evaluation of long-term or chronic effects. This scale also includes the characteristic concentrations for other areas with dimensions of a few hundred meters such as the parking lot and feeder streets associated with shopping centers, stadia, and office buildings.

    (3) Neighborhood scale—Measurements in this category would represent conditions throughout some reasonably homogeneous urban sub-region with dimensions of a few kilometers and of generally more regular shape than the middle scale. Homogeneity refers to the particulate matter concentrations, as well as the land use and land surface characteristics. Much of the PM2.5 exposures are expected to be associated with this scale of measurement. In some cases, a location carefully chosen to provide neighborhood scale data would represent the immediate neighborhood as well as neighborhoods of the same type in other parts of the city. PM2.5 sites of this kind provide good information about trends and compliance with standards because they often represent conditions in areas where people commonly live and work for periods comparable to those specified in the NAAQS. In general, most PM2.5 monitoring in urban areas should have this scale.

    (4) Urban scale—This class of measurement would be used to characterize the particulate matter concentration over an entire metropolitan or rural area ranging in size from 4 to 50 kilometers. Such measurements would be useful for assessing trends in area-wide air quality, and hence, the effectiveness of large scale air pollution control strategies. Community-oriented PM2.5 sites may have this scale.

    (5) Regional scale—These measurements would characterize conditions over areas with dimensions of as much as hundreds of kilometers. As noted earlier, using representative conditions for an area implies some degree of homogeneity in that area. For this reason, regional scale measurements would be most applicable to sparsely populated areas. Data characteristics of this scale would provide information about larger scale processes of particulate matter emissions, losses and transport. PM2.5 transport contributes to elevated particulate concentrations and may affect multiple urban and State entities with large populations such as in the eastern United States. Development of effective pollution control strategies requires an understanding at regional geographical scales of the emission sources and atmospheric processes that are responsible for elevated PM2.5 levels and may also be associated with elevated O3 and regional haze.

    4.7.2 Requirement for Continuous PM2.5 Monitoring. State, or where appropriate, local agencies must operate continuous fine particulate analyzers equal in number to at least one-half (rounded up) of the minimum required sites listed in Table D-5 of this appendix. At least one required FRM/FEM monitor in each MSA must be collocated. State and local air monitoring agencies must use methodologies and quality assurance/quality control (QA/QC) procedures approved by the EPA Regional Administrator for these sites.
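
    The “one-half (rounded up)” requirement maps directly onto a ceiling division. A short Python sketch follows, taking the Table D-5 counts as input; the function name is illustrative.

```python
import math

# Continuous PM2.5 analyzers must number at least one-half (rounded up)
# of the minimum required SLAMS sites from Table D-5 (section 4.7.2).
def min_continuous_pm25(required_sites: int) -> int:
    return math.ceil(required_sites / 2)

# Example: an MSA over 1,000,000 with a design value >= 85% of a PM2.5
# NAAQS requires 3 sites under Table D-5, hence at least 2 continuous
# analyzers; a 1-site MSA needs 1.
print(min_continuous_pm25(3))  # 2
print(min_continuous_pm25(1))  # 1
```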

    4.7.3 Requirement for PM2.5 Background and Transport Sites. Each State shall install and operate at least one PM2.5 site to monitor for regional background and at least one PM2.5 site to monitor regional transport. These monitoring sites may be at community-oriented sites and this requirement may be satisfied by a corresponding monitor in an area having similar air quality in another State. State and local air monitoring agencies must use methodologies and QA/QC procedures approved by the EPA Regional Administrator for these sites. Methods used at these sites may include non-federal reference method samplers such as IMPROVE or continuous PM2.5 monitors.

    4.7.4 PM2.5 Chemical Speciation Site Requirements. Each State shall continue to conduct chemical speciation monitoring and analyses at sites designated to be part of the PM2.5 Speciation Trends Network (STN). The selection and modification of these STN sites must be approved by the Administrator. The PM2.5 chemical speciation urban trends sites shall include analysis for elements, selected anions and cations, and carbon. Samples must be collected using the monitoring methods and the sampling schedules approved by the Administrator. Chemical speciation is encouraged at additional sites where the chemically resolved data would be useful in developing State implementation plans and supporting atmospheric or health effects related studies.

    4.7.5 Special Network Considerations Required When Using PM2.5 Spatial Averaging Approaches. (a) The PM2.5 NAAQS, specified in 40 CFR part 50, provides State and local air monitoring agencies with an option for spatially averaging PM2.5 air quality data. More specifically, two or more community-oriented (i.e., sites in populated areas) PM2.5 monitors may be averaged for comparison with the annual PM2.5 NAAQS. This averaging approach is directly related to epidemiological studies used as the basis for the PM2.5 annual NAAQS. Spatial averaging does not apply to comparisons with the daily PM2.5 NAAQS.

    (b) State and local agencies must carefully consider their approach to PM2.5 network design when they intend to spatially average the data for compliance purposes. These State and local air monitoring agencies must define the area over which they intend to average PM2.5 air quality concentrations. This area is defined as a Community Monitoring Zone (CMZ), which characterizes an area of relatively similar annual average air quality. State and local agencies can define a CMZ in a number of ways, including as part or all of a metropolitan area. These CMZs must be defined within a State or local agency's network description, as required in § 58.10 of this part, and approved by the EPA Regional Administrator. When more than one CMZ is described within an agency's network design plan, the CMZs must not overlap in their geographical coverage. The criteria that must be used for evaluating the acceptability of spatial averaging are defined in appendix N to 40 CFR part 50.
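
    To make the mechanics concrete, here is a minimal Python sketch of spatial averaging across a hypothetical CMZ. The site names and concentrations are invented; the annual NAAQS level shown (15.0 µg/m3) reflects 40 CFR part 50 as of this rule; and the simple mean used here omits the quarterly weighting, multi-year averaging, and acceptability tests of appendix N to 40 CFR part 50.

```python
# Hypothetical CMZ with three community-oriented monitors; values are
# illustrative annual mean PM2.5 concentrations in ug/m3.
cmz_annual_means = {"site_A": 14.2, "site_B": 15.6, "site_C": 14.9}

ANNUAL_PM25_NAAQS = 15.0  # ug/m3; level and form defined in 40 CFR part 50

# Simple spatial average across the CMZ (appendix N's full procedure,
# including its homogeneity criteria, is not reproduced here).
spatial_average = sum(cmz_annual_means.values()) / len(cmz_annual_means)
print(f"CMZ spatial average: {spatial_average:.2f} ug/m3")

# Spatial averaging applies only to the annual NAAQS; each monitor is
# compared individually with the daily (24-hour) NAAQS.
print("meets annual NAAQS" if spatial_average <= ANNUAL_PM25_NAAQS
      else "exceeds annual NAAQS")
```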

    4.8 Coarse Particulate Matter (PM10−2.5) Design Criteria.

    4.8.1 General Monitoring Requirements. (a) The only required monitors for PM10−2.5 are those required at NCore Stations.

    (b) Although microscale monitoring may be appropriate in some circumstances, middle and neighborhood scale measurements are the most important station classifications for PM10−2.5 to assess the variation in coarse particle concentrations that would be expected across populated areas that are in proximity to large emissions sources.

    (1) Microscale—This scale would typify relatively small areas immediately adjacent to: Industrial sources; locations experiencing ongoing construction, redevelopment, and soil disturbance; and heavily traveled roadways. Data collected at microscale stations would characterize exposure over areas of limited spatial extent and population exposure, and may provide information useful for evaluating and developing source-oriented control measures.

    (2) Middle scale—People living or working near major roadways or industrial districts encounter particle concentrations that would be adequately characterized by this spatial scale. Thus, measurements of this type would be appropriate for the evaluation of public health effects of coarse particle exposure. Monitors located in populated areas that are nearly adjacent to large industrial point sources of coarse particles provide suitable locations for assessing maximum population exposure levels and identifying areas of potentially poor air quality. Similarly, monitors located in populated areas that border dense networks of heavily traveled roads are appropriate for assessing the impacts of resuspended road dust. This scale also includes the characteristic concentrations for other areas with dimensions of a few hundred meters, such as school grounds and parks that are nearly adjacent to major roadways and industrial point sources, locations exhibiting mixed residential and commercial development, and downtown areas featuring office buildings, shopping centers, and stadiums.

    (3) Neighborhood scale—Measurements in this category would represent conditions throughout some reasonably homogeneous urban sub-region with dimensions of a few kilometers and of generally more regular shape than the middle scale. Homogeneity refers to the particulate matter concentrations, as well as the land use and land surface characteristics. This category includes suburban neighborhoods dominated by residences that are somewhat distant from major roadways and industrial districts but still impacted by urban sources, and areas of diverse land use where residences are interspersed with commercial and industrial neighborhoods. In some cases, a location carefully chosen to provide neighborhood scale data would represent the immediate neighborhood as well as neighborhoods of the same type in other parts of the city. The comparison of data from middle scale and neighborhood scale sites would provide valuable information for determining the variation of PM10-2.5 levels across urban areas and assessing the spatial extent of elevated concentrations caused by major industrial point sources and heavily traveled roadways. Neighborhood scale sites would provide concentration data that are relevant to informing a large segment of the population of their exposure levels on a given day.

    4.8.2 PM10-2.5 Chemical Speciation Site Requirements. PM10-2.5 chemical speciation monitoring and analyses are required at NCore sites. The selection and modification of these sites must be approved by the Administrator. Samples must be collected using the monitoring methods and the sampling schedules approved by the Administrator.

    5. Network Design for Photochemical Assessment Monitoring Stations (PAMS)

    The PAMS program provides more comprehensive data on O3 air pollution in areas classified as serious, severe, or extreme nonattainment for O3 than would otherwise be achieved through the NCore and SLAMS sites. More specifically, the PAMS program includes measurements for O3, oxides of nitrogen, VOC, and meteorology.

    5.1 PAMS Monitoring Objectives. PAMS design criteria are site specific. Concurrent measurements of O3, oxides of nitrogen, speciated VOC, CO, and meteorology are obtained at PAMS sites. Design criteria for the PAMS network are based on locations relative to O3 precursor source areas and the predominant wind directions associated with high O3 events. Specific monitoring objectives are associated with each location. The overall design should enable characterization of precursor emission sources within the area, transport of O3 and its precursors, and the photochemical processes related to O3 nonattainment. Specific objectives that must be addressed include assessing ambient trends in O3, oxides of nitrogen, and VOC species, and determining the spatial and diurnal variability of O3, oxides of nitrogen, and VOC species. The specific monitoring objectives associated with each of these sites may result in four distinct site types. Detailed guidance for locating these sites may be found in reference 9 of this appendix.

    (a) Type 1 sites are established to characterize upwind background and transported O3 and its precursor concentrations entering the area and will identify those areas which are subjected to transport.

    (b) Type 2 sites are established to monitor the magnitude and type of precursor emissions in the area where maximum precursor emissions are expected to impact and are suited for the monitoring of urban air toxic pollutants.

    (c) Type 3 sites are intended to monitor maximum O3 concentrations occurring downwind from the area of maximum precursor emissions.

    (d) Type 4 sites are established to characterize the downwind transported O3 and its precursor concentrations exiting the area and will identify those areas which are potentially contributing to overwhelming transport in other areas.

    5.2 Monitoring Period. PAMS precursor monitoring must be conducted annually throughout the months of June, July, and August (as a minimum), when peak O3 values are expected in each area. Alternative precursor monitoring periods may be submitted for approval to the Administrator as part of the annual monitoring network plan required by § 58.10.

    5.3 Minimum Monitoring Network Requirements. A Type 2 site is required for each area. Overall, only two sites are required for each area, provided that all required chemical measurements are made. For example, if a design includes two Type 2 sites, then a third site (Type 1 or Type 3) will be necessary to capture the NOy measurement. The minimum required number and type of monitoring sites and the sampling requirements are listed in Table D-6 of this appendix. Alternative plans may be put in place in lieu of these requirements if approved by the Administrator.

    Table D-6 of Appendix D to Part 58. Minimum Required PAMS Monitoring Locations and Frequencies

    Measurement; where required; sampling frequency (all daily except for upper air meteorology) 1:

    Speciated VOC 2. Where required: two sites per area, one of which must be a Type 2 site. Sampling frequency: during the PAMS monitoring period, (1) hourly auto GC, or (2) eight 3-hour canisters, or (3) one morning and one afternoon canister with a 3-hour or less averaging time plus continuous total non-methane hydrocarbon measurement.

    Carbonyl sampling. Where required: Type 2 site in areas classified as serious or above for the 8-hour ozone standard. Sampling frequency: 3-hour samples every day during the PAMS monitoring period.

    NOX. Where required: all Type 2 sites. Sampling frequency: hourly during the ozone monitoring season. 3

    NOy. Where required: one site per area, at the Type 3 or Type 1 site. Sampling frequency: hourly during the ozone monitoring season.

    CO (ppb level). Where required: one site per area, at a Type 2 site. Sampling frequency: hourly during the ozone monitoring season.

    Ozone. Where required: all sites. Sampling frequency: hourly during the ozone monitoring season.

    Surface meteorology. Where required: all sites. Sampling frequency: hourly during the ozone monitoring season.

    Upper air meteorology. Where required: one representative location within the PAMS area. Sampling frequency: must be approved as part of the annual monitoring network plan required in 40 CFR 58.10.
    1 Daily or with an approved alternative plan.
    2 Speciated VOC is defined in the “Technical Assistance Document for Sampling and Analysis of Ozone Precursors”, EPA/600-R-98/161, September 1998.
    3 Approved ozone monitoring season as stipulated in Table D-3 of this appendix.
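
    A minimal Python sketch of the site-count logic in section 5.3, treating site types as the integers 1 through 4; the function name and the reduction of Table D-6 to a NOy-host check are illustrative simplifications.

```python
# Section 5.3 sketch: a Type 2 site is required, and only two sites are
# needed per area provided all required chemical measurements (including
# NOy, which Table D-6 places at a Type 1 or Type 3 site) are captured.
def pams_network_ok(site_types: list[int]) -> bool:
    """Check a proposed PAMS site list against the section 5.3 minimums."""
    has_type2 = 2 in site_types
    has_noy_host = any(t in (1, 3) for t in site_types)
    return len(site_types) >= 2 and has_type2 and has_noy_host

print(pams_network_ok([2, 3]))  # True: Type 2 plus a Type 3 NOy host
print(pams_network_ok([2, 2]))  # False: a third (Type 1 or 3) site is needed
```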

    5.4 Transition Period. A transition period is allowed for phasing in the operation of newly required PAMS programs (due generally to reclassification of an area into serious, severe, or extreme nonattainment for ozone). Following the date of redesignation or reclassification of any existing O3 nonattainment area to serious, severe, or extreme, or the designation of a new area and classification to serious, severe, or extreme O3 nonattainment, a State is allowed 1 year to develop plans for its PAMS implementation strategy. Subsequently, a minimum of one Type 2 site must be operating by the first month of the following approved PAMS season. Operation of the remaining site(s) must, at a minimum, be phased in at the rate of one site per year during subsequent years as outlined in the approved PAMS network description provided by the State.

    6. References

    1. Ball, R.J. and G.E. Anderson. Optimum Site Exposure Criteria for SO2 Monitoring. The Center for the Environment and Man, Inc., Hartford, CT. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-450/3-77-013. April 1977.

    2. Ludwig, F.F., J.H.S. Kealoha, and E. Shelar. Selecting Sites for Carbon Monoxide Monitoring. Stanford Research Institute, Menlo Park, CA. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-450/3-75-077, September 1975.

    3. Air Quality Criteria for Lead. Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC. EPA Publication No. 600/8-89-049F. August 1990. (NTIS document numbers PB87-142378 and PB91-138420.)

    4. Optimum Site Exposure Criteria for Lead Monitoring. PEDCo Environmental, Inc. Cincinnati, OH. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Contract No. 68-02-3013. May 1981.

    5. Guidance for Conducting Ambient Air Monitoring for Lead Around Point Sources. Office of Air Quality Planning and Standards, U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA-454/R-92-009. May 1997.

    6. Koch, R.C. and H.E. Rector. Optimum Network Design and Site Exposure Criteria for Particulate Matter. GEOMET Technologies, Inc., Rockville, MD. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Contract No. 68-02-3584. EPA 450/4-87-009. May 1987.

    7. Watson et al. Guidance for Network Design and Optimum Site Exposure for PM2.5 and PM10. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA-454/R-99-022, December 1997.

    8. Guideline for Selecting and Modifying the Ozone Monitoring Season Based on an 8-Hour Ozone Standard. Prepared for U.S. Environmental Protection Agency, RTP, NC. EPA-454/R-98-001, June 1998.

    9. Photochemical Assessment Monitoring Stations Implementation Manual. Office of Air Quality Planning and Standards, U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA-454/B-93-051. March 1994.

    37. Appendix E to part 58 is revised to read as follows:

    Appendix E to Part 58—Probe and Monitoring Path Siting Criteria for Ambient Air Quality Monitoring

    1. Introduction.

    2. Horizontal and Vertical Placement.

    3. Spacing From Minor Sources.

    4. Spacing From Obstructions.

    5. Spacing From Trees.

    6. Spacing From Roadways.

    7. Cumulative Interferences on a Monitoring Path.

    8. Maximum Monitoring Path Length.

    9. Probe Material and Pollutant Sample Residence Time.

    10. Waiver Provisions.

    11. Summary.

    12. References.

    1. Introduction

    (a) This appendix contains specific location criteria applicable to SLAMS, NCore, and PAMS ambient air quality monitoring probes, inlets, and optical paths after the general location has been selected based on the monitoring objectives and spatial scale of representation discussed in appendix D to this part. Adherence to these siting criteria is necessary to ensure the uniform collection of compatible and comparable air quality data.

    (b) The probe and monitoring path siting criteria discussed in this appendix must be followed to the maximum extent possible. It is recognized that there may be situations where some deviation from the siting criteria may be necessary. In any such case, the reasons must be thoroughly documented in a written request for a waiver that describes how and why the proposed siting deviates from the criteria. This documentation should help to avoid later questions about the validity of the resulting monitoring data. Conditions under which the EPA would consider an application for waiver from these siting criteria are discussed in section 10 of this appendix.

    (c) The pollutant-specific probe and monitoring path siting criteria generally apply to all spatial scales except where noted otherwise. Specific siting criteria that are phrased with a “must” are defined as requirements and exceptions must be approved through the waiver provisions. However, siting criteria that are phrased with a “should” are defined as goals to meet for consistency but are not requirements.

    2. Horizontal and Vertical Placement

    The probe or at least 80 percent of the monitoring path must be located between 2 and 15 meters above ground level for all ozone, sulfur dioxide, and nitrogen dioxide monitoring sites, and for neighborhood scale Pb, PM10, PM10-2.5, PM2.5, and carbon monoxide sites. Middle scale PM10-2.5 sites and microscale Pb, PM10, PM10-2.5, and PM2.5 sites are required to have sampler inlets between 2 and 7 meters above ground level. The inlet probes for microscale carbon monoxide monitors being used to measure concentrations near roadways must be 3 ± 1/2 meters above ground level. The probe or at least 90 percent of the monitoring path must be at least 1 meter, vertically or horizontally, away from any supporting structure, walls, parapets, penthouses, etc., and away from dusty or dirty areas. If the probe or a significant portion of the monitoring path is located near the side of a building, it should be located on the windward side of the building relative to the prevailing wind direction during the season of highest concentration potential for the pollutant being measured.
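
    The height rules in this section reduce to a few numeric ranges. A hedged Python sketch follows; it covers only the cases spelled out above, its names and case split are illustrative, and it is no substitute for the full siting review.

```python
def inlet_height_ok(pollutant: str, scale: str, height_m: float) -> bool:
    # Near-roadway microscale CO probes: 3 +/- 1/2 meters above ground.
    if pollutant == "CO" and scale == "micro":
        return 2.5 <= height_m <= 3.5
    # Microscale Pb/PM sites and middle scale PM10-2.5 sites: 2-7 meters.
    if scale == "micro" or (scale == "middle" and pollutant == "PM10-2.5"):
        return 2 <= height_m <= 7
    # General rule for the remaining listed cases: 2-15 meters.
    return 2 <= height_m <= 15

print(inlet_height_ok("PM2.5", "micro", 5))       # True
print(inlet_height_ok("O3", "neighborhood", 18))  # False
```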

    3. Spacing From Minor Sources

    (a) It is important to understand the monitoring objective for a particular location in order to interpret this particular requirement. Local minor sources of a primary pollutant, such as SO2, lead, or particles, can cause high concentrations of that particular pollutant at a monitoring site. If the objective for that monitoring site is to investigate these local primary pollutant emissions, then the site is likely to be properly located nearby. This type of monitoring site would in all likelihood be a microscale type of monitoring site. If a monitoring site is to be used to determine air quality over a much larger area, such as a neighborhood or city, a monitoring agency should avoid placing a monitor probe, path, or inlet near local, minor sources. The plume from the local minor sources should not be allowed to inappropriately impact the air quality data collected at a site. Particulate matter sites should not be located in an unpaved area unless there is vegetative ground cover year round, so that the impact of windblown dust will be kept to a minimum.

    (b) Similarly, local sources of nitric oxide (NO) and ozone-reactive hydrocarbons can have a scavenging effect causing unrepresentatively low concentrations of O3 in the vicinity of probes and monitoring paths for O3. To minimize these potential interferences, the probe or at least 90 percent of the monitoring path must be away from furnace or incineration flues or other minor sources of SO2 or NO. The separation distance should take into account the heights of the flues, type of waste or fuel burned, and the sulfur content of the fuel.

    4. Spacing From Obstructions

    (a) Buildings and other obstacles may scavenge SO2, O3, or NO2, and can act to restrict airflow for any pollutant. To avoid this interference, the probe, inlet, or at least 90 percent of the monitoring path must have unrestricted airflow and be located away from obstacles. The distance from the obstacle to the probe, inlet, or monitoring path must be at least twice the height that the obstacle protrudes above the probe, inlet, or monitoring path. An exception to this requirement can be made for measurements taken in street canyons or at source-oriented sites where buildings and other structures are unavoidable.
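
    The 2:1 rule in paragraph (a) is simple arithmetic; a minimal Python sketch (the function name is illustrative):

```python
def min_obstacle_distance_m(obstacle_height_m: float,
                            probe_height_m: float) -> float:
    """Minimum probe-to-obstacle distance: twice the height the obstacle
    protrudes above the probe (section 4(a) of this appendix)."""
    protrusion = max(0.0, obstacle_height_m - probe_height_m)
    return 2.0 * protrusion

# A 12 m building near a 4 m probe protrudes 8 m, so the probe must be
# at least 16 m from the building.
print(min_obstacle_distance_m(12, 4))  # 16.0
```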

    (b) Generally, a probe or monitoring path located near or along a vertical wall is undesirable because air moving along the wall may be subject to possible removal mechanisms. A probe, inlet, or monitoring path must have unrestricted airflow in an arc of at least 180 degrees. This arc must include the predominant wind direction for the season of greatest pollutant concentration potential. For particle sampling, a minimum of 2 meters of separation from walls, parapets, and structures is required for rooftop site placement.

    (c) Special consideration must be given to the use of open path analyzers due to their inherent potential sensitivity to certain types of interferences, or optical obstructions. A monitoring path must be clear of all trees, brush, buildings, plumes, dust, or other optical obstructions, including potential obstructions that may move due to wind, human activity, growth of vegetation, etc. Temporary optical obstructions, such as rain, particles, fog, or snow, should be considered when siting an open path analyzer. Any of these temporary obstructions that are of sufficient density to obscure the light beam will affect the ability of the open path analyzer to continuously measure pollutant concentrations. Transient but significant obscuration, especially of longer measurement paths, could occur as a result of certain meteorological conditions (e.g., heavy fog, rain, snow) and/or aerosol levels that are of a sufficient density to prevent the open path analyzer's light transmission. If certain compensating measures are not otherwise implemented at the onset of monitoring (e.g., shorter path lengths, higher light source intensity), data recovery during periods of greatest primary pollutant potential could be compromised. For instance, if heavy fog or high particulate levels are coincident with periods of projected NAAQS-threatening pollutant potential, the representativeness of the resulting data record in reflecting maximum pollutant concentrations may be substantially impaired despite the fact that the site may otherwise exhibit an acceptable, even exceedingly high, overall valid data capture rate.

    5. Spacing From Trees

    (a) Trees can provide surfaces for SO2, O3, or NO2 adsorption or reactions, and surfaces for particle deposition. Trees can also act as obstructions in cases where they are located between the air pollutant sources or source areas and the monitoring site, and where the trees are of a sufficient height and leaf canopy density to interfere with the normal airflow around the probe, inlet, or monitoring path. To reduce this possible interference/obstruction, the probe, inlet, or at least 90 percent of the monitoring path must be at least 10 meters from the drip line of trees.

    (b) The scavenging effect of trees is greater for O3 than for other criteria pollutants. Monitoring agencies must consider the impact of trees on ozone monitoring sites and take steps to avoid this problem.

    (c) For microscale sites of any air pollutant, no trees or shrubs should be located between the probe and the source under investigation, such as a roadway or a stationary source.

    6. Spacing From Roadways

    6.1 Spacing for Ozone and Oxides of Nitrogen Probes and Monitoring Paths. In siting an O3 analyzer, it is important to minimize destructive interferences from sources of NO, since NO readily reacts with O3. In siting NO2 analyzers for neighborhood and urban scale monitoring, it is important to minimize interferences from automotive sources. Table E-1 of this appendix provides the required minimum separation distances between a roadway and a probe or, where applicable, at least 90 percent of a monitoring path, for various ranges of daily roadway traffic. A sampling site having a point analyzer probe located closer to a roadway than allowed by the Table E-1 requirements should be classified as middle scale rather than neighborhood or urban scale, since the measurements from such a site would more closely represent the middle scale. If an open path analyzer is used at a site, the monitoring path(s) must not cross over a roadway with an average daily traffic count of 10,000 vehicles per day or more. For situations where a monitoring path crosses a roadway with fewer than 10,000 vehicles per day, one must consider the entire segment of the monitoring path in the area of potential atmospheric interference from automobile emissions. Therefore, this calculation must include the length of the monitoring path over the roadway plus any segments of the monitoring path that lie in the area between the roadway and the minimum separation distance, as determined from Table E-1 of this appendix. The sum of these distances must not be greater than 10 percent of the total monitoring path length.

    Table E-1 to Appendix E of Part 58. Minimum Separation Distance Between Roadways and Probes or Monitoring Paths for Monitoring Neighborhood and Urban Scale Ozone (O3) and Oxides of Nitrogen (NO, NO2, NOX, NOy)

    Roadway average daily        Minimum distance 1    Minimum distance 1, 2
    traffic, vehicles per day    (meters)              (meters)
    ≤1,000                       10                    10
    10,000                       10                    20
    15,000                       20                    30
    20,000                       30                    40
    40,000                       50                    60
    70,000                       100                   100
    ≥110,000                     250                   250
    1 Distance from the edge of the nearest traffic lane. The distance for intermediate traffic counts should be interpolated from the table values based on the actual traffic count.
    2 Applicable for ozone monitors whose placement has not already been approved as of December 18, 2006.
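
    Footnote 1 to Table E-1 calls for interpolating between rows for intermediate traffic counts, and section 6.1 adds a 10 percent limit on the portion of an open path affected by a roadway. A minimal Python sketch of both follows, using the second column of Table E-1 (monitors not already approved as of December 18, 2006); the function names are illustrative.

```python
# Table E-1, second minimum-distance column (meters), keyed to roadway
# average daily traffic (ADT); values outside the table are clamped.
ADT      = [1_000, 10_000, 15_000, 20_000, 40_000, 70_000, 110_000]
MIN_DIST = [10,    20,     30,     40,     60,     100,    250]

def min_separation_m(adt: float) -> float:
    """Linearly interpolate the Table E-1 minimum separation distance."""
    if adt <= ADT[0]:
        return MIN_DIST[0]
    if adt >= ADT[-1]:
        return MIN_DIST[-1]
    for i in range(1, len(ADT)):
        if adt <= ADT[i]:
            frac = (adt - ADT[i - 1]) / (ADT[i] - ADT[i - 1])
            return MIN_DIST[i - 1] + frac * (MIN_DIST[i] - MIN_DIST[i - 1])

def open_path_ok(total_path_m: float, over_road_m: float,
                 within_buffer_m: float, adt: float) -> bool:
    """Section 6.1 check for open path analyzers: no crossing of roads
    with ADT >= 10,000, and the affected segments must not exceed 10
    percent of the total monitoring path length."""
    if over_road_m > 0 and adt >= 10_000:
        return False
    return (over_road_m + within_buffer_m) <= 0.10 * total_path_m

print(min_separation_m(30_000))            # 50.0 m, between the 20k and 40k rows
print(open_path_ok(1_000, 20, 60, 8_000))  # True: 80 m affected <= 100 m limit
```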

    6.2 Spacing for Carbon Monoxide Probes and Monitoring Paths. (a) Street canyon and traffic corridor sites (microscale) are intended to provide a measurement of the influence of the immediate source on the pollution exposure of the population. In order to provide some reasonable consistency and comparability in the air quality data from microscale sites, a minimum distance of 2 meters and a maximum distance of 10 meters from the edge of the nearest traffic lane must be maintained for these CO monitoring inlet probes. This requirement gives consistency to the data while still allowing flexibility in finding suitable locations.

    (b) Street canyon/corridor (microscale) inlet probes must be located at least 10 meters from an intersection and preferably at a midblock location. Midblock locations are preferable to intersection locations because intersections represent a much smaller portion of downtown space than do the streets between them. Pedestrian exposure is probably also greater in street canyon/corridors than at intersections.

    (c) In determining the minimum separation between a neighborhood scale monitoring site and a specific roadway, the presumption is made that measurements should not be substantially influenced by any one roadway. Computations were made to determine the separation distance, and Table E-2 of this appendix provides the required minimum separation distance between roadways and a probe or 90 percent of a monitoring path. Probes or monitoring paths located closer to roadways than this criterion allows should be classified as middle scale rather than neighborhood scale, since the measurements from such a site would more closely represent the middle scale.

    Table E-2 to Appendix E of Part 58. Minimum Separation Distance Between Roadways and Probes or Monitoring Paths for Monitoring Neighborhood Scale Carbon Monoxide

    Roadway average daily        Minimum distance 1
    traffic, vehicles per day    (meters)
    ≤10,000                      10
    15,000                       25
    20,000                       45
    30,000                       80
    40,000                       115
    50,000                       135
    ≥60,000                      150
    1 Distance from the edge of the nearest traffic lane. The distance for intermediate traffic counts should be interpolated from the table values based on the actual traffic count.

    6.3 Spacing for Particulate Matter (PM2.5, PM10, Pb) Inlets. (a) Since emissions associated with the operation of motor vehicles contribute to urban area particulate matter ambient levels, spacing from roadway criteria are necessary for ensuring national consistency in PM sampler siting.

    (b) The intent is to locate localized hot-spot sites in areas of highest concentrations, whether from mobile or multiple stationary sources. If the area is primarily affected by mobile sources and the maximum concentration area(s) is judged to be a traffic corridor or street canyon location, then the monitors should be located near roadways with the highest traffic volume and at separation distances most likely to produce the highest concentrations. For the microscale traffic corridor site, the location must be between 5 and 15 meters from the major roadway. For the microscale street canyon site, the location must be between 2 and 10 meters from the roadway. For the middle scale site, a range of acceptable distances from the roadway is shown in Figure E-1 of this appendix. This figure also includes separation distances between a roadway and neighborhood or larger scale sites by default. Any site, 2 to 15 meters high, and farther back than the middle scale requirements will generally be of neighborhood, urban, or regional scale. For example, according to Figure E-1 of this appendix, if a PM sampler is primarily influenced by roadway emissions and that sampler is set back 10 meters from a 30,000 ADT (average daily traffic) road, the site should be classified as microscale if the sampler height is between 2 and 7 meters. If the sampler height is between 7 and 15 meters, the site should be classified as middle scale. If the sampler is 20 meters from the same road, it will be classified as middle scale; if 40 meters, neighborhood scale; and if 110 meters, urban scale.
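
    The worked example above can be expressed as a small classifier. The cut points below are taken only from that 30,000 ADT example; the full acceptance regions come from Figure E-1 itself, which is not reproduced here, so the boundaries between the named setbacks are hypothetical.

```python
# Illustrative classifier for a PM sampler near a 30,000 ADT road, using
# only the setbacks and heights given in the worked example; real
# classifications must come from Figure E-1 of this appendix.
def pm_scale_near_30000_adt(setback_m: float, height_m: float) -> str:
    if setback_m <= 10:
        return "microscale" if 2 <= height_m <= 7 else "middle"
    if setback_m <= 20:
        return "middle"        # 20 m example
    if setback_m <= 40:
        return "neighborhood"  # 40 m example (boundary assumed)
    return "urban"             # 110 m example (boundary assumed)

print(pm_scale_near_30000_adt(10, 5))   # microscale
print(pm_scale_near_30000_adt(10, 12))  # middle
print(pm_scale_near_30000_adt(110, 5))  # urban
```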


    7. Cumulative Interferences on a Monitoring Path

    (This paragraph applies only to open path analyzers.) The cumulative length or portion of a monitoring path that is affected by minor sources, trees, or roadways must not exceed 10 percent of the total monitoring path length.

    8. Maximum Monitoring Path Length

    (This paragraph applies only to open path analyzers.) The monitoring path length must not exceed 1 kilometer for analyzers in neighborhood, urban, or regional scale. For middle scale monitoring sites, the monitoring path length must not exceed 300 meters. In areas subject to frequent periods of dust, fog, rain, or snow, consideration should be given to a shortened monitoring path length to minimize loss of monitoring data due to these temporary optical obstructions. For certain ambient air monitoring scenarios using open path analyzers, shorter path lengths may be needed in order to ensure that the monitoring site meets the objectives and spatial scales defined in appendix D to this part. The Regional Administrator may require shorter path lengths, as needed on an individual basis, to ensure that the SLAMS sites meet the appendix D requirements. Likewise, the Administrator may specify the maximum path length used at NCore monitoring sites.

    9. Probe Material and Pollutant Sample Residence Time

    (a) For the reactive gases SO2, NO2, and O3, special probe material must be used for point analyzers. Studies (references 20 through 24 of this appendix) have been conducted to determine the suitability of materials such as polypropylene, polyethylene, polyvinyl chloride, Tygon®, aluminum, brass, stainless steel, copper, Pyrex® glass, and Teflon® for use as intake sampling lines. Of these materials, only Pyrex® glass and Teflon® have been found to be acceptable for use as intake sampling lines for all the reactive gaseous pollutants. Furthermore, the EPA (reference 25 of this appendix) has specified borosilicate glass or FEP Teflon® as the only acceptable probe materials for delivering test atmospheres in the determination of reference or equivalent methods. Therefore, borosilicate glass, FEP Teflon®, or their equivalent must be the only material in the sampling train (from inlet probe to the back of the analyzer) that is in contact with the ambient air sample, for existing and new SLAMS.

    (b) For volatile organic compound (VOC) monitoring at PAMS, FEP Teflon® is unacceptable as the probe material because of VOC adsorption and desorption reactions on the FEP Teflon®. Borosilicate glass, stainless steel, or their equivalent are the acceptable probe materials for VOC and carbonyl sampling. Care must be taken to ensure that the sample residence time is kept to 20 seconds or less.

    (c) No matter how nonreactive the sampling probe material is initially, after a period of use reactive particulate matter is deposited on the probe walls. Therefore, the time it takes the gas to transfer from the probe inlet to the sampling device is also critical. Ozone in the presence of nitric oxide (NO) will show significant losses, even in the most inert probe material, when the residence time exceeds 20 seconds (reference 26 of this appendix). Other studies (references 27 and 28 of this appendix) indicate that a residence time of 10 seconds or less is easily achievable. Therefore, sampling probes for reactive gas monitors at NCore sites must have a sample residence time of less than 20 seconds.
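
    As a rough aid to applying the 20-second ceiling, the following Python sketch computes residence time as internal probe volume divided by sample flow rate; the tube dimensions and flow rate shown are hypothetical.

```python
import math

# Residence time = internal probe volume / sample flow rate; the
# 20-second limit comes from section 9(c) of this appendix.
def residence_time_s(length_m: float, inner_diameter_mm: float,
                     flow_lpm: float) -> float:
    radius_m = (inner_diameter_mm / 1000) / 2
    volume_l = math.pi * radius_m**2 * length_m * 1000  # m3 -> liters
    return volume_l / (flow_lpm / 60)                   # flow in liters/second

# Hypothetical 5 m probe with 6 mm inner diameter at 1.0 L/min.
t = residence_time_s(length_m=5, inner_diameter_mm=6, flow_lpm=1.0)
print(f"{t:.1f} s", "OK" if t < 20 else "too long")  # ~8.5 s, OK
```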

    10. Waiver Provisions

    Most sampling probes or monitors can be located so that they meet the requirements of this appendix. New sites, with rare exceptions, can be located within the limits of this appendix. However, some existing sites may not meet these requirements and still produce useful data for some purposes. The EPA will consider a written request from the State agency to waive one or more siting criteria for some monitoring sites, provided that the State can adequately demonstrate the need (purpose) for monitoring or establishing a monitoring site at that location.

    10.1 For establishing a new site, a waiver may be granted only if both of the following criteria are met:

    10.1.1 The site can be demonstrated to be as representative of the monitoring area as it would be if the siting criteria were being met.

    10.1.2 The monitor or probe cannot reasonably be located so as to meet the siting criteria because of physical constraints (e.g., inability to locate the required type of site the necessary distance from roadways or obstructions).

    10.2 However, for an existing site, a waiver may be granted if either of the criteria in sections 10.1.1 and 10.1.2 of this appendix is met.

    10.3 Cost benefits, historical trends, and other factors may be used to add support to the criteria in sections 10.1.1 and 10.1.2 of this appendix; however, they will not, in themselves, be acceptable reasons for granting a waiver. Written requests for waivers must be submitted to the Regional Administrator.

    11. Summary

    Table E-4 of this appendix presents a summary of the general requirements for probe and monitoring path siting criteria with respect to distances and heights. It is apparent from Table E-4 that different elevation distances above the ground are shown for the various pollutants. The discussion in this appendix for each of the pollutants describes reasons for elevating the monitor, probe, or monitoring path. The differences in the specified ranges of heights are based on the vertical concentration gradients. For CO, the vertical gradients are very large at the microscale, so a small range of heights is used. The upper limit of 15 meters is specified for consistency between pollutants and to allow the use of a single manifold or monitoring path for monitoring more than one pollutant.

    Table E-4 of Appendix E to Part 58. Summary of Probe and Monitoring Path Siting Criteria

    For each pollutant, the entries below give: the applicable scales (with maximum monitoring path length in parentheses); the height from ground to the probe, inlet, or 80% of the monitoring path 1; the horizontal and vertical distance from supporting structures 2 to the probe, inlet, or 90% of the monitoring path 1; the distance from trees to the probe, inlet, or 90% of the monitoring path 1; and the distance from roadways to the probe, inlet, or monitoring path 1.

    SO2 3, 4, 5, 6. Scales: middle (300 m); neighborhood, urban, and regional (1 km). Height: 2-15 m. Distance from structures: >1 m. Distance from trees: >10 m. Distance from roadways: N/A.

    CO 4, 5, 7. Scales: micro; middle (300 m); neighborhood (1 km). Height: 3 ± 1/2 m (micro); 2-15 m (middle and neighborhood). Distance from structures: >1 m. Distance from trees: >10 m. Distance from roadways: 2-10 m (micro); see Table E-2 of this appendix for middle and neighborhood scales.

    NO2, O3 3, 4, 5. Scales: middle (300 m); neighborhood, urban, and regional (1 km). Height: 2-15 m. Distance from structures: >1 m. Distance from trees: >10 m. Distance from roadways: see Table E-1 of this appendix for all scales.

    Ozone precursors (for PAMS) 3, 4, 5. Scales: neighborhood and urban (1 km). Height: 2-15 m. Distance from structures: >1 m. Distance from trees: >10 m. Distance from roadways: see Table E-1 of this appendix for all scales.

    PM, Pb 3, 4, 5, 6, 8. Scales: micro; middle; neighborhood; urban; and regional. Height: 2-7 m (micro); 2-7 m (middle PM10-2.5); 2-15 m (all other scales). Distance from structures: >2 m (all scales, horizontal distance only). Distance from trees: >10 m (all scales). Distance from roadways: 2-10 m (micro); see Figure E-1 of this appendix for all other scales.
    N/A—Not applicable.
    1 Monitoring path for open path analyzers is applicable only to middle or neighborhood scale CO monitoring and all applicable scales for monitoring SO2, O3, O3 precursors, and NO2.
    2 When probe is located on a rooftop, this separation distance is in reference to walls, parapets, or penthouses located on roof.
    3 Should be >20 meters from the dripline of tree(s) and must be >10 meters from the dripline when the tree(s) act as an obstruction.
    4 Distance from sampler, probe, or 90% of monitoring path to obstacle, such as a building, must be at least twice the height the obstacle protrudes above the sampler, probe, or monitoring path. Sites not meeting this criterion may be classified as middle scale (see text).
    5 Must have unrestricted airflow 270 degrees around the probe or sampler; 180 degrees if the probe is on the side of a building.
    6 The probe, sampler, or monitoring path should be away from minor sources, such as furnace or incineration flues. The separation distance is dependent on the height of the minor source's emission point (such as a flue), the type of fuel or waste burned, and the quality of the fuel (sulfur, ash, or lead content). This criterion is designed to avoid undue influences from minor sources.
    7 For microscale CO monitoring sites, the probe must be >10 meters from a street intersection and preferably at a midblock location.
    8 Collocated monitors must be within 4 meters of each other and at least 2 meters apart for flow rates greater than 200 liters/min or at least 1 meter apart for samplers having flow rates less than 200 liters/min to preclude airflow interference.

    12. References

    1. Bryan, R.J., R.J. Gordon, and H. Menck. Comparison of High Volume Air Filter Samples at Varying Distances from Los Angeles Freeway. University of Southern California, School of Medicine, Los Angeles, CA. (Presented at 66th Annual Meeting of Air Pollution Control Association. Chicago, IL. June 24-28, 1973. APCA 73-158.)

    2. Teer, E.H. Atmospheric Lead Concentration Above an Urban Street. Master of Science Thesis, Washington University, St. Louis, MO. January 1971.

    3. Bradway, R.M., F.A. Record, and W.E. Belanger. Monitoring and Modeling of Resuspended Roadway Dust Near Urban Arterials. GCA Technology Division, Bedford, MA. (Presented at 1978 Annual Meeting of Transportation Research Board, Washington, DC. January 1978.)

    4. Pace, T.G., W.P. Freas, and E.M. Afify. Quantification of Relationship Between Monitor Height and Measured Particulate Levels in Seven U.S. Urban Areas. U.S. Environmental Protection Agency, Research Triangle Park, NC. (Presented at 70th Annual Meeting of Air Pollution Control Association, Toronto, Canada. June 20-24, 1977. APCA 77-13.4.)

    5. Harrison, P.R. Considerations for Siting Air Quality Monitors in Urban Areas. City of Chicago, Department of Environmental Control, Chicago, IL. (Presented at 66th Annual Meeting of Air Pollution Control Association, Chicago, IL. June 24-28, 1973. APCA 73-161.)

    6. Study of Suspended Particulate Measurements at Varying Heights Above Ground. Texas State Department of Health, Air Control Section, Austin, TX. 1970. p.7.

    7. Rodes, C.E. and G.F. Evans. Summary of LACS Integrated Pollutant Data. In: Los Angeles Catalyst Study Symposium. U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-600/4-77-034. June 1977.

    8. Lynn, D.A. et al. National Assessment of the Urban Particulate Problem: Volume 1, National Assessment. GCA Technology Division, Bedford, MA. U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-450/3-75-024. June 1976.

    9. Pace, T.G. Impact of Vehicle-Related Particulates on TSP Concentrations and Rationale for Siting Hi-Vols in the Vicinity of Roadways. OAQPS, U.S. Environmental Protection Agency, Research Triangle Park, NC. April 1978.

    10. Ludwig, F.L., J.H. Kealoha, and E. Shelar. Selecting Sites for Monitoring Total Suspended Particulates. Stanford Research Institute, Menlo Park, CA. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-450/3-77-018. June 1977, revised December 1977.

    11. Ball, R.J. and G.E. Anderson. Optimum Site Exposure Criteria for SO2 Monitoring. The Center for the Environment and Man, Inc., Hartford, CT. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-450/3-77-013. April 1977.

    12. Ludwig, F.L. and J.H.S. Kealoha. Selecting Sites for Carbon Monoxide Monitoring. Stanford Research Institute, Menlo Park, CA. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-450/3-75-077. September 1975.

    13. Ludwig, F.L. and E. Shelar. Site Selection for the Monitoring of Start Printed Page 61328Photochemical Air Pollutants. Stanford Research Institute, Menlo Park, CA. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Publication No. EPA-450/3-78-013. April 1978.

    14. Lead Analysis for Kansas City and Cincinnati, PEDCo Environmental, Inc., Cincinnati, OH. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Contract No. 66-02-2515, June 1977.

    15. Barltrap, D. and C.D. Strelow. Westway Nursery Testing Project. Report to the Greater London Council. August 1976.

    16. Daines, R. H., H. Moto, and D. M. Chilko. Atmospheric Lead: Its Relationship to Traffic Volume and Proximity to Highways. Environ. Sci. and Technol., 4:318, 1970.

    17. Johnson, D. E., et al. Epidemiologic Study of the Effects of Automobile Traffic on Blood Lead Levels, Southwest Research Institute, Houston, TX. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA-600/1-78-055, August 1978.

    18. Air Quality Criteria for Lead. Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC EPA-600/8-83-028 aF-dF, 1986, and supplements EPA-600/8-89/049F, August 1990. (NTIS document numbers PB87-142378 and PB91-138420.)

    19. Lyman, D. R. The Atmospheric Diffusion of Carbon Monoxide and Lead from an Expressway, Ph.D. Dissertation, University of Cincinnati, Cincinnati, OH. 1972.

    20. Wechter, S.G. Preparation of Stable Pollutant Gas Standards Using Treated Aluminum Cylinders. ASTM STP. 598:40-54, 1976.

    21. Wohlers, H.C., H. Newstein and D. Daunis. Carbon Monoxide and Sulfur Dioxide Adsorption On and Desorption From Glass, Plastic and Metal Tubings. J. Air Poll. Con. Assoc. 17:753, 1976.

    22. Elfers, L.A. Field Operating Guide for Automated Air Monitoring Equipment. U.S. NTIS. p. 202, 249, 1971.

    23. Hughes, E.E. Development of Standard Reference Material for Air Quality Measurement. ISA Transactions, 14:281-291, 1975.

    24. Altshuller, A.D. and A.G. Wartburg. The Interaction of Ozone with Plastic and Metallic Materials in a Dynamic Flow System. Intern. Jour. Air and Water Poll., 4:70-78, 1961.

    25. Code of Federal Regulations. Title 40 part 53.22, July 1976.

    26. Butcher, S.S. and R.E. Ruff. Effect of Inlet Residence Time on Analysis of Atmospheric Nitrogen Oxides and Ozone, Anal. Chem., 43:1890, 1971.

    27. Slowik, A.A. and E.B. Sansone. Diffusion Losses of Sulfur Dioxide in Sampling Manifolds. J. Air. Poll. Con. Assoc., 24:245, 1974.

    28. Yamada, V.M. and R.J. Charlson. Proper Sizing of the Sampling Inlet Line for a Continuous Air Monitoring Station. Environ. Sci. and Technol., 3:483, 1969.

    29. Koch, R.C. and H.E. Rector. Optimum Network Design and Site Exposure Criteria for Particulate Matter, GEOMET Technologies, Inc., Rockville, MD. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Contract No. 68-02-3584. EPA 450/4-87-009. May 1987.

    30. Burton, R.M. and J.C. Suggs. Philadelphia Roadway Study. Environmental Monitoring Systems Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA-600/4-84-070. September 1984.

    31. Technical Assistance Document For Sampling and Analysis of Ozone Precursors. Atmospheric Research and Exposure Assessment Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. EPA-600/8-91-215. October 1991.

    32. Quality Assurance Handbook for Air Pollution Measurement Systems: Volume IV. Meteorological Measurements. Atmospheric Research and Exposure Assessment Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. EPA-600/4-90-003. August 1989.

    33. On-Site Meteorological Program Guidance for Regulatory Modeling Applications. Office of Air Quality Planning and Standards, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. EPA-450/4-87-013. June 1987.

    Appendix F—[Removed and Reserved]

    38. Appendix F to part 58 is removed and reserved.

    End Supplemental Information

    Footnotes

    1.  “Station” and “site” are used somewhat interchangeably in this notice of final rulemaking. When there is a difference (which will be apparent from context), “site” generally refers to the location of a monitor, while “station” refers to a suite of measurements at a particular site.

    2.  NOy refers to a broad class of nitrogen-containing reactive compounds in ambient air, explained in more detail in sections V.E.1 and V.E.7 of this preamble.

    3.  Throughout this preamble, “States” is meant to also refer to local governments that have been assigned responsibility for ambient air monitoring within their respective jurisdictions by their States. This preamble also uses “monitoring organization” to refer to States, local agencies, and/or Tribes conducting monitoring under or guided by the provisions of 40 CFR part 58. This final rule applies the same requirements to the District of Columbia, Puerto Rico, and the Virgin Islands as apply to the 50 States. Other U.S. territories are not subject to this final rule.

    4.  As explained in section IV.B of this preamble, the term “PM10c” refers to a PM10 Federal reference method (FRM) that is designated as a PM10c FRM under the final NAAQS rule appearing elsewhere in today's Federal Register. In essence, it would be a PM2.5 FRM with the inertial fractionator (the component that separates out particles larger than 2.5 microns) removed, so that all PM10 is collected. Unlike other PM10 instruments, a PM10c instrument must control flow to a specified flow rate of 16.67 liters/minute at local conditions of temperature and pressure. A PM10−2.5 FRM consists of a PM2.5 FRM and a PM10c FRM of the same model; the PM10−2.5 concentration is determined by subtracting the PM2.5 measurement from the collocated PM10c measurement. See also 71 FR 2720.
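
    As an illustration only (this sketch is not part of the regulatory text), the difference method implied by this footnote can be written as a one-line calculation; the concentration values below are hypothetical examples, not measurements:

    % Hypothetical 24-hour average concentrations, in micrograms per cubic meter.
    % The coarse-particle concentration is obtained by subtracting the PM2.5 FRM
    % measurement from the collocated PM10c FRM measurement.
    \[
      C_{\mathrm{PM}_{10-2.5}} = C_{\mathrm{PM}_{10c}} - C_{\mathrm{PM}_{2.5}}
        = 45\,\mu\mathrm{g/m^3} - 18\,\mu\mathrm{g/m^3} = 27\,\mu\mathrm{g/m^3}
    \]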

    5.  Technical Assistance Document (TAD) for Precursor Gas Measurements in the NCore Multipollutant Monitoring Network. Version 4. U.S. Environmental Protection Agency. EPA-454/R-05-003. September 2005. Available at: http://www.epa.gov/ttn/amtic/pretecdoc.html.

    6.  Additional information on the EPA/National Park Service IMPROVE (Interagency Monitoring of Protected Visual Environments) Visibility Program is available at: http://www.epa.gov/ttn/amtic/visdata.html.

    7.  Additional information on CASTNET is available at: http://www.epa.gov/castnet/.

    8.  Section 103(c)(2) of the Clean Air Act [42 U.S.C.A. 7403(c)] provides that the Administrator shall conduct a program for sampling air pollution that includes the establishment of a national network to monitor air quality and to ensure the comparability of air quality data collected in different States. Section 110(a)(2)(B) [42 U.S.C.A. 7410(a)] provides that each State implementation plan shall provide for establishment and operation of appropriate devices, methods, systems, and procedures necessary to monitor, compile, and analyze data on ambient air quality and, upon request, make such data available to the Administrator. Section 182(c)(1) [42 U.S.C.A. 7511a(c)(1)] states that the Administrator will promulgate rules for enhanced monitoring for ozone, oxides of nitrogen, and volatile organic compounds in serious ozone areas.

    9.  Henderson, R. Clean Air Scientific Advisory Committee (CASAC) Review of the EPA Staff Recommendations Concerning a Potential Thoracic Coarse PM Standard in the Review of the National Ambient Air Quality Standards for Particulate Matter: Policy Assessment of Scientific and Technical Information (Final PM OAQPS Staff Paper, EPA-452/R-05-005). September 15, 2005. http://www.epa.gov/sab/panels/casacpmpanel.html.

    Henderson, R. Letter from Dr. Rogene Henderson, Chair, Clean Air Scientific Advisory Committee, to the Honorable Stephen L. Johnson, Administrator, U.S. EPA. Clean Air Scientific Advisory Committee Recommendations Concerning the Proposed National Ambient Air Quality Standards for Particulate Matter. March 21, 2006. http://www.epa.gov/sab/pdf/casac-ltr-06-002.pdf.

    10.  See S. Rep. No. 91-1196. 91st Cong. 2d Sess. 10 (1970) (NAAQS is to be set to protect sensitive, at-risk population groups).

    11.  “Draft National Ambient Air Monitoring Strategy,” December 2005.

    12.  EPA is recodifying this provision in section 58.30 of the final monitoring rule, but is not reconsidering or otherwise reevaluating it.

    13.  “Proposal to Change PM2.5 and PM10 Collocation Sampling Frequency Requirements,” Mike Papp and Louise Camalier; November 2005. http://www.epa.gov/ttn/amtic/pmgainf.html.

    14.  The QA Strategy Workgroup consists of EPA, State, and local staff responsible for monitoring quality assurance activities who meet informally to exchange information on current monitoring issues.

    15.  Technical Assistance Document (TAD) for Sampling and Analysis of Ozone Precursors. U.S. Environmental Protection Agency. Human Exposure and Atmospheric Sciences Division. EPA/600-R-98/161. September 1998. Available at: http://www.epa.gov/ttn/amtic/pams.html.

    16.  To clarify, under the proposed rule and this final rule, 41 States, the District of Columbia, the Virgin Islands, and Puerto Rico will be required to operate one NCore site each. The other nine States will be required to operate two or three sites, for a national total of 62 to 71 required sites. Some of these required sites might be waived by EPA. The EPA anticipates, but the rule does not require, that some of these sites will be rural. Counting non-required sites, the goal is a total of about 75 sites, about 20 of which will be rural.
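
    As a check on the totals cited in this footnote, here is a minimal worked calculation using only the counts given above:

    % 41 States + DC + the Virgin Islands + Puerto Rico = 44 jurisdictions at one site each.
    % The remaining nine States operate two or three sites each.
    \[
      44 + (9 \times 2) = 62 \qquad \text{and} \qquad 44 + (9 \times 3) = 71
    \]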

    17.  As explained earlier, an approved regional method (ARM) is a PM2.5 method that has been approved specifically within a State, local, or Tribal air monitoring network for purposes of comparison to the National Ambient Air Quality Standards and to meet other monitoring objectives. See section V.D.2 of this preamble.

    18.  The possible additional monitoring discussed in the text above could be compared solely to the 24-hour PM2.5 NAAQS. As mentioned earlier, the 1997 rules provide that monitors that are sited in relatively unique population-oriented microscale areas, localized hot spots, or unique population-oriented middle-scale areas, may not be compared to the annual PM2.5 NAAQS.

    19.  As used in the Staff Paper, the term “mining sources” is intended to include all activities that encompass extraction and/or mechanical handling of natural geologic crustal materials. In the context of this rulemaking, neither mining nor agricultural sources are included in the more general category of “industrial sources.”

    [FR Doc. 06-8478 Filed 10-16-06; 8:45 am]

    BILLING CODE 6560-50-P
