96-31437. Proposed Requirements for Designation of Reference and Equivalent Methods for PM2.5 and Ambient Air Quality Surveillance for Particulate Matter

    [Federal Register Volume 61, Number 241 (Friday, December 13, 1996)]
    [Proposed Rules]
    [Pages 65780-65872]
    From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
    [FR Doc No: 96-31437]
    
    
    
    [[Page 65779]]
    
    _______________________________________________________________________
    
    Part VI
    
    
    
    
    
    Environmental Protection Agency
    
    
    
    
    
    _______________________________________________________________________
    
    
    
    40 CFR Parts 53 and 58
    
    
    
    Proposed Requirements for Designation of Reference and Equivalent 
    Methods for PM2.5 and Ambient Air Quality Surveillance for 
    Particulate Matter; Proposed Rule
    
    Federal Register / Vol. 61, No. 241 / Friday, December 13, 1996 / 
    Proposed Rules
    
    [[Page 65780]]
    
    
    
    ENVIRONMENTAL PROTECTION AGENCY
    
    40 CFR Parts 53 and 58
    
    RIN 2060-AH09
    [AD-FRL-5659-2]
    
    
    Proposed Requirements for Designation of Reference and Equivalent 
    Methods for PM2.5 and Ambient Air Quality Surveillance for 
    Particulate Matter
    
    AGENCY: Environmental Protection Agency (EPA).
    
    ACTION: Proposed rule.
    
    -----------------------------------------------------------------------
    
    SUMMARY: The EPA proposes to revise 40 CFR part 58 to establish ambient 
    air quality monitoring requirements for PM2.5 (particles with an 
    aerodynamic diameter less than or equal to a nominal 2.5 micrometers) 
    as measured by a new reference method being proposed in Appendix L to 
    40 CFR part 50 or by an equivalent method designated in accordance with 
    requirements being proposed in 40 CFR part 53. In addition, this 
    document also proposes certain revisions to existing ambient air 
    quality monitoring requirements for PM10 (particles with an 
    aerodynamic diameter less than or equal to a nominal 10 micrometers). 
    The changes proposed in this document address, among other things, 
    network design and siting, quality assurance and quality control, and 
    monitoring methodology. The document also indicates EPA's intent to 
    explore opportunities to coordinate and integrate the existing 
    visibility monitoring requirements with the ambient air quality 
    monitoring requirements for particulate matter being proposed today, to 
    better accommodate a regional haze program, to reduce burdens, and to 
    achieve multiple monitoring objectives.
    
    DATES: Comments must be submitted on or before February 18, 1997.
    
    ADDRESSES: Comments should be submitted (in duplicate, if possible) to: 
    Air Docket (LE-131), U.S. Environmental Protection Agency, Attn. Docket 
    No. A-96-51, 401 M Street, SW, Washington, DC 20460. The docket may be 
    inspected between 8:00 a.m. and 5:30 p.m. on weekdays. A reasonable fee 
    may be charged for copying.
        Public hearing: The EPA will announce in a separate Federal 
    Register document the date, time, and address of the public hearing on 
    this proposed decision.
    FOR FURTHER INFORMATION CONTACT: Mr. Neil Frank (MD-14), Monitoring and 
    Quality Assurance Group, Emissions, Monitoring, and Analysis Division, 
    U.S. Environmental Protection Agency, Research Triangle Park, North 
    Carolina 27711, telephone (919) 541-5560.
    
    SUPPLEMENTARY INFORMATION:
    
    Table of Contents
    
    I. Authority
    II. Introduction
    III. Discussion of Proposed Revisions to Part 53
        A. Designation of Reference Methods for PM2.5
        B. Designation of Equivalent Methods for PM2.5
        C. Quality Assurance
        D. Other Changes
    IV. Discussion of Proposed Revisions to Part 58
        A. Section 58.1  Definitions
        B. Section 58.13  Operating schedule
        C. Section 58.14  Special purpose monitors
        D. Section 58.15  PM2.5 NAAQS eligible monitors
        E. Section 58.20  Air quality surveillance: plan content
        F. Section 58.23  Monitoring network completion
        G. Section 58.25  System modification
        H. Section 58.26  Annual SLAMS summary report
        I. Section 58.30  NAMS network establishment
        J. Section 58.31  NAMS network description
        K. Section 58.34  NAMS network completion
        L. Section 58.35  NAMS data submittal
        M. Appendix A--Quality Assurance Requirements for State and 
    Local Air Monitoring Stations (SLAMS)
        N. Appendix B--Quality Assurance Requirements for Prevention of 
    Significant Deterioration (PSD) Air Monitoring
        O. Appendix C--Ambient Air Quality Monitoring Methodology
        P. Appendix D--Network Design for State and Local Air Monitoring 
    Stations (SLAMS), National Air Monitoring Stations (NAMS), and 
    Photochemical Assessment Monitoring Stations (PAMS)
        Q. Appendix E--Probe and Monitoring Path Siting Criteria for 
    Ambient Air Quality Monitoring
        R. Cost Estimates for New PM Networks
        S. Reference
    V. Administrative Requirements
        A. Regulatory Impact Analysis
        B. Reporting and Recordkeeping Requirements
        C. Impact on Small Entities
        D. Unfunded Mandates Reform Act of 1995
    
    I. Authority
    
        Sections 110, 301(a), and 319 of the Clean Air Act, as amended (42 
    U.S.C. 7410, 7601(a), 7619).
    
    II. Introduction
    
    A. Proposed Revision to the Particulate Matter NAAQS
    
        Elsewhere in today's Federal Register, EPA announced proposed 
    revisions to the national ambient air quality standards for particulate 
    matter. In that notice, EPA proposes to amend the current suite of 
    PM10 standards by adding new PM2.5 standards and by revising 
    the form of the current 24-hour PM10 standard. Specifically, the 
    EPA proposes to add two new primary PM2.5 standards set at 15 
    µg/m3, annual mean, and 50 µg/m3, 24-hour 
    average. The proposed new annual PM2.5 standard would be met when 
    the 3-year average of the annual arithmetic mean PM2.5 
    concentrations, spatially averaged across an area, is less than or 
    equal to 15 µg/m3. The proposed new 24-hour PM2.5 
    standard would be met when the 3-year average of the 98th percentile of 
    24-hour PM2.5 concentrations at each monitor within an area is 
    less than or equal to 50 µg/m3.
        The EPA also proposes to retain the current annual PM10 
    standard at the level of 50 µg/m3, which would be met when 
    the 3-year average of the annual arithmetic mean PM10 concentrations 
    at each monitor within an area is less than or equal to 50 µg/m3. 
    Further, EPA proposes to retain the current 24-hour PM10 
    standard at the level of 150 µg/m3, but to revise the form 
    such that the standard would be met when the 3-year average of the 98th 
    percentile of the monitored concentrations at the highest monitor in an 
    area is less than or equal to 150 µg/m3.
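        For illustration only, the following sketch (written in Python; it 
    is not part of the proposed regulatory text, and the function and 
    variable names are hypothetical) shows one way the attainment tests 
    described above might be computed from three complete years of 24-hour 
    measurements. The completeness criteria, rounding conventions, and 
    exact percentile computation would be those specified in 40 CFR part 
    50 and are simplified here.

        # Simplified sketch of the proposed PM2.5 attainment tests; the
        # official data-handling rules would be specified in 40 CFR part 50.
        # data: {year: {monitor_id: [24-hour PM2.5 values, ug/m3]}}
        from statistics import mean

        def annual_pm25_standard_met(data, level=15.0):
            # Spatially average the annual means across the area's
            # population-oriented monitors, then average over 3 years.
            spatial_means = [mean(mean(vals) for vals in monitors.values())
                             for monitors in data.values()]
            return mean(spatial_means) <= level

        def daily_pm25_standard_met(data, level=50.0):
            # 3-year average of each monitor's annual 98th percentile;
            # every monitor in the area must be at or below the level.
            monitor_ids = {m for year in data.values() for m in year}
            for m in monitor_ids:
                p98s = []
                for year in data.values():
                    vals = sorted(year[m])
                    # simple nearest-rank 98th percentile (illustrative only)
                    p98s.append(vals[min(int(0.98 * len(vals)), len(vals) - 1)])
                if mean(p98s) > level:
                    return False
            return True

        The retained PM10 standards would be evaluated analogously, except 
    that the annual test would apply to each monitor individually (without 
    spatial averaging) and the 24-hour test would apply to the highest 
    monitor in the area.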
        In the part 50 notice, EPA also proposed to revise the current 
    secondary standards by making them identical to the suite of proposed 
    primary standards. The suite of PM2.5 and PM10 standards, in 
    conjunction with the establishment of a regional haze program under 
    section 169A of the Clean Air Act (Act), are intended to protect 
    against PM-related welfare effects including soiling and materials 
    damage and visibility impairment.
        As discussed in the part 50 notice, the proposed new PM2.5 
    standards are intended to protect against exposures to fine particulate 
    pollution, while the new PM10 standards are intended to protect 
    against coarse fraction particles as measured by PM10.
        For PM2.5, the annual standard is intended to protect against 
    both long- and short-term exposures to fine particle pollution. Under 
    this approach, the PM2.5 24-hour standard would serve as a ``back 
    stop'' to provide additional protection against days with high 
    PM2.5 concentrations, localized ``hot spots,'' and risks arising 
    from seasonal emissions that would not be well controlled by a national 
    annual standard.
        In specifying that the calculation of the annual arithmetic mean 
    for an area (for purposes of comparison to the level of the PM2.5 
    annual standard) should be
    
    [[Page 65781]]
    
    accomplished by averaging the annual arithmetic means derived from 
    multiple, population-oriented monitoring sites, EPA took into account 
    several factors. As discussed in the part 50 notice, many of the 
    community-based epidemiologic studies examined in this review used 
    spatial averages, when multiple monitoring sites were available, to 
    characterize area-wide PM exposure levels and associated public health 
    risk. Even in those studies that used only one monitoring location, the 
    selected site was chosen to represent community-wide exposures, not the 
    highest value likely to be experienced within the community. Because 
    the annual PM2.5 standard would be intended to reduce aggregate 
    population risk from both long- and short-term exposures by lowering 
    the broad distribution of PM concentrations across the community, an 
    annual standard based on spatially averaged concentrations from several 
    population-oriented monitoring sites would better reflect areawide PM 
    exposure levels and associated health risks than would a standard based 
    on concentrations from a single monitor with the highest measured 
    values in the area. The spatial average approach is not appropriate for 
    PM10 because the spatial distribution of coarse particles is 
    different and tends to be more localized in its behavior.
        Finally, under the policy approach presented in the part 50 notice, 
    the 24-hour PM2.5 standard would be intended to supplement a 
    spatially-averaged annual PM2.5 standard by providing protection 
    against peak 24-hour concentrations arising from situations that would 
    not be well-controlled by an annual standard. Accordingly, the 24-hour 
    PM2.5 standard would be based on the single population-oriented 
    monitoring site within a monitoring planning area with the highest 
    measured values.
        In EPA's judgment, an annual PM2.5 standard expressed as a 
    spatial average, established in conjunction with a 24-hour standard 
    based on the monitoring site with the highest measured values, would 
    provide the most appropriate target for reducing area-wide population 
    exposure to fine particle pollution and would be most consistent with 
    the underlying epidemiologic data base. On the other hand, EPA is 
    mindful that adoption of spatial averaging for a PM2.5 standard 
    would add a degree of complexity to the monitoring siting requirements 
    and to the specification of those areas across which spatial averaging 
    should be permitted. This approach may also require larger monitoring 
    networks in some areas. By proposing a spatial averaging approach, the 
    part 50 notice recognizes that some monitoring planning areas may have 
    to be subdivided into smaller subareas to reflect gradients in particle 
    levels (e.g., upwind suburban sites, central city sites, downwind 
    sites) as well as topographical barriers or other factors that may 
    result in a monitoring planning area having several distinct air 
    quality regimes.
        Recognizing the complexities that spatial averaging may introduce 
    into risk management programs and that unforeseen issues may arise from 
    public comment on the requirements presented in this notice, the part 
    50 notice also requests comment on the alternative of basing the 
    PM2.5 annual standard on the population-oriented monitoring site 
    within the monitoring planning area with the highest 3-year average 
    annual mean. The part 50 notice indicates, based on comments received, 
    that EPA may choose either of these two approaches for specifying the 
    form of the annual PM2.5 standard at the time of promulgation of 
    any revisions to the PM standards.
        In the part 50 notice, EPA also solicits comments on alternative 
    levels of both annual and 24-hour PM2.5 primary standards and on 
    revoking the current 24-hour primary PM10 standard.
    
    B. Air Quality Monitoring Requirements
    
        Section 110(a)(2)(C) of the Act requires ambient air quality 
    monitoring for purposes of the State implementation plans (SIP's) and 
    for reporting data quality to EPA. Uniform criteria to be followed when 
    measuring air quality and provisions for daily air pollution index 
    reporting are required by section 319 of the Act. To satisfy these 
    requirements, on May 10, 1979 (44 FR 27558), EPA established 40 CFR 
    part 58 which provided detailed requirements for air quality 
    monitoring, data reporting, and surveillance for all of the pollutants 
    for which national ambient air quality standards have been established 
    (criteria pollutants). Provisions were promulgated subsequently for 
    particulate matter (PM10) on July 1, 1987 (52 FR 24740).
        The intent of the air quality surveillance requirement being 
    proposed today is to establish a revised particulate matter monitoring 
    network that would produce air quality data for the purpose of 
    comparison to the proposed primary and secondary PM NAAQS and to 
    facilitate implementation of a possible new regional haze program. In 
    developing a new particulate matter monitoring network and associated 
    requirements, consideration has been given to the indicators, forms, 
    and levels of the proposed primary and secondary PM NAAQS. As a result, 
    nationwide monitoring would be performed for two indicators of PM: 
    PM2.5 and PM10. To be reflective of the basis for and the 
    specification of the forms of the proposed new annual and 24-hour 
    primary and secondary PM2.5 NAAQS, new monitoring network design 
    and siting requirements are being proposed. For purposes of comparison 
    to the proposed PM2.5 annual standard, such sites would be 
    population-oriented and be representative of community-wide exposure 
    levels. The siting criteria for monitors to be used for comparison to 
    the proposed 24-hour PM2.5 standard would also be population-
    oriented but reflective of the highest measured values within the 
    community. To ensure PM data of the highest possible quality, new 
    requirements for quality assurance and designation of new PM2.5 
    reference or equivalent samplers are also described.
        With respect to NAAQS comparisons and visibility protection in more 
    rural areas, the new network design and siting requirements would 
    encourage the placement of PM2.5 monitors outside population 
    centers with two purposes in mind: (1) To provide air quality data 
    necessary to facilitate implementation of the proposed NAAQS, and (2) 
    to augment the existing visibility fine particle monitoring 
    network. The coordination of these two monitoring objectives would 
    facilitate implementation of a regional haze program and lead to an 
    integrated monitoring program for fine particles.
        The network design and siting requirements for the annual and 24-
    hour PM10 NAAQS would continue to emphasize identification of 
    locations at maximum concentrations. The PM10 network itself, 
    however, would be revised because the proposed PM2.5 standards 
    would likely be the controlling standards in most situations.
        The new network for PM10 would be derived from the existing 
    network of State and Local Air Monitoring Stations (SLAMS), National 
    Air Monitoring Stations (NAMS), and other monitors generically 
    classified as Special Purpose Monitors (SPM's) which include industrial 
    and special study monitors. Population-oriented NAMS will generally be 
    maintained, and other key sampling locations in existing nonattainment 
    areas and in areas whose concentrations are near the levels of the 
    proposed PM10 NAAQS will be continued. Currently approved 
    reference or equivalent PM10 samplers could continue to be 
    utilized. The revised network would ensure that analysis of national 
    trends in PM10 can
    
    [[Page 65782]]
    
    be continued, that air surveillance in areas with established PM 
    emission control programs can be maintained, and that the PM10 
    NAAQS will not be jeopardized by additional growth in PM10 
    emissions. PM10 sites should be collocated with new PM2.5 
    sites at key population-oriented monitoring stations so that the fine 
    and coarse contributions to PM10 can be better defined, providing 
    a better understanding of exposure, emission 
    controls, and atmospheric processes. PM10 sites not needed for 
    trends or for monitoring in areas with relatively high PM10 
    concentrations would likely be discontinued in a longer-term PM10 
    network. The sampling frequency at all PM10 sites would be reduced 
    to a minimum of once in 6 days, which would be sufficient to make 
    comparisons with proposed PM10 standards. The combination of fewer 
    PM10 sites and the reduction in required sampling frequency would 
    save significant resources that could be redirected to PM2.5 
    monitoring.
        The new network for PM2.5 would consist of a ``core'' network 
    of population-oriented SLAMS monitors, ``core'' regional background and 
    regional transport SLAMS, a NAMS subset for long-term monitoring, other 
    SLAMS monitors, and a supplementary network of SPM's. The core 
    population-oriented sites would be reflective of community-wide 
    exposure, would be required in all of the largest metropolitan areas, 
    and must sample every day. Frequent measurements are important to 
    understand episodic behavior of PM2.5, and to establish effective 
    emission control strategies to assure protection of the NAAQS. Many of 
    the new PM2.5 sites are expected to be located at existing 
    PM10 sites, and would be collocated with some PAMS sites.
        Consistency with the proposed new PM2.5 NAAQS necessitates the 
    adoption of new concepts for identification and establishment of 
    monitoring stations for the PM2.5 ambient air monitoring network 
    as well as use of the data in relation to the proposed PM2.5 
    NAAQS. These concepts include: (1) The addition of specially coded 
    sites whose data would be used to compare to the levels of the annual 
    and 24-hour PM2.5 NAAQS, and (2) the inclusion of monitoring 
    planning areas and spatial averaging zones (SAZs) to correspond to the 
    population-oriented, spatial averaging approach. These concepts and 
    associated requirements are discussed in section 58.15 and sections 
    2.8.1-2.8.3 of Appendix D below.
        Although the major emphasis of the new PM networks is compliance 
    monitoring in support of the NAAQS, the network is also intended to 
    assist in reporting of data to the general public, especially during 
    air pollution episodes and to assist in the SIP planning process. To 
    these ends, additional monitoring and analysis requirements are 
    proposed concerning the location of nephelometers (or other continuous 
    particulate matter measuring devices) at some core monitoring sites and 
    the archiving of filters for possible subsequent analysis for subsets 
    of the PM2.5 SLAMS sites. Moreover, collection of meteorological 
    data at core SLAMS sites (including background and regional transport 
    sites) is suggested. The additional requirements should help to 
    further characterize the composition and trends in PM2.5 and 
    better understand the sources and processes leading to elevated 
    PM2.5 concentrations. Because these proposed revisions do not 
    specifically require the chemical analysis of collected PM2.5 or 
    PM10 filters, the Administrator would welcome comments on this 
    issue. In particular, comments are solicited on the need for 
    alternative PM2.5 monitoring methodologies and additional 
    monitoring requirements which might accompany the part 51 
    implementation rules to identify the causes of detected PM2.5 
    NAAQS violations and to assist in the development of PM2.5 
    emission control strategies.
        While the proposed siting criteria and network designs are 
    appropriate for both the proposed revisions to the primary and 
    secondary NAAQS as a whole, additional consideration must be given to 
    air quality surveillance in more rural/remote areas to characterize 
    fine particle levels in order to protect against broader regional scale 
    visibility impairment. To achieve the appropriate level of air quality 
    surveillance in such areas, EPA believes it is important to coordinate 
    and integrate the background and transport monitoring sites specified 
    in this notice with the existing Interagency Monitoring of Protected 
    Visual Environments (IMPROVE) monitors that are in place in a number of 
    locations around the country to characterize fine particle levels and 
    visibility in mandatory Federal Class I areas (e.g., certain national 
    parks and wilderness areas). The need for coordination and integration 
    of visibility-oriented monitoring sites will increase when EPA proposes 
    rules under section 169A of the Act to supplement the secondary NAAQS 
    in addressing regional haze. More detailed guidance on monitoring and 
    assessment requirements will be provided when those rules are proposed. 
    This will include details on topics such as monitor placement, 
    monitoring methodology, duration of sampling and frequency of sampling. 
    It is anticipated, however, that the existing IMPROVE network, together 
    with sites established under this proposal, would be an integral part 
    of the network for determining reasonable progress under a regional 
    haze program.
        In the meantime, EPA recommends that States, in conjunction with 
    EPA and Federal land managers, explore opportunities for expanding and 
    managing PM2.5 and visibility monitoring networks in the most 
    efficient and effective ways to meet the collective goals of these 
    programs. To facilitate this, EPA has proposed changes in Appendix C 
    below, to allow use of existing or new IMPROVE monitoring sites to meet 
    the requirements for a transport and/or background site for the 
    proposed PM2.5 standards. States should consider the feasibility 
    of siting new transport/background and/or visibility monitoring 
    locations at or near mandatory Federal Class I areas currently without 
    an IMPROVE site so that such sites could provide data to characterize 
    both fine particle levels and visibility in or near Class I areas. It 
    is EPA's intent that monitoring conducted for purposes of the PM 
    primary and secondary NAAQS (including background and transport sites), 
    and for visibility protection be undertaken as one coordinated national 
    PM monitoring program, rather than as a number of independent networks.
        It is recognized by EPA, as well as by many outside groups, 
    including the Clean Air Act Advisory Committee's Subcommittee on Ozone, 
    Particulate Matter, and Regional Haze Implementation Programs and the 
    National Research Council in its 1993 report ``Protecting Visibility in 
    National Parks and Wilderness Areas,'' that chemical speciation of PM 
    data would permit development of more effective control strategies to 
    better target those sources of emissions that are causing or 
    contributing to elevated levels of PM2.5 and PM10. Speciation 
    of PM2.5 data can also be used to develop reliable estimates of 
    seasonal and annual average visibility conditions.
        Because of the costs associated with conducting filter analysis on 
    a routine basis, this proposal only requires filters to be archived so 
    they are available for analysis on an as needed basis. The EPA requests 
    comment, however, on the extent to which chemical speciation should be 
    conducted. This would include: (1) Whether specific monitoring sites 
    should be designated for such analyses; (2) the criteria to be
    
    [[Page 65783]]
    
    used to select sites for speciated sampling and analysis; (3) the 
    extent and frequency to which speciation should be required by EPA for 
    at least some monitoring stations; and (4) the need for monitoring 
    methodologies not described in this proposal which may be needed to 
    facilitate compositional analysis. The EPA recognizes that there is a 
    need for speciation and other specialized monitoring efforts which are 
    not specifically required by this proposed rule. Accordingly, EPA will 
    give these PM monitoring efforts high priority in its section 105 
    grants program. The Administrator solicits comment on the appropriate 
    portion of the nation's monitoring resources which should be dedicated 
    to speciation and collection of special study data relative to the 
    siting and collection of mass measurements for purposes of comparisons 
    to the NAAQS and visibility assessments at permanent and temporary 
    monitoring stations. The estimated cost for the new PM monitoring 
    program is discussed further in Section IV.R.
        Finally, in anticipation of a new regional haze program and 
    associated additional monitoring requirements, EPA also requests 
    comment on ways that the future PM and IMPROVE networks can be 
    coordinated to conserve resources and serve the goals of both the PM 
    and regional haze implementation program.
        This proposed rulemaking is taken in conjunction with the proposed 
    revisions to the PM NAAQS published elsewhere in today's Federal 
    Register and pertains to changes in the ambient air monitoring 
    requirements for particulate matter contained primarily in 40 CFR part 
    58. A new Federal Reference Method for PM2.5, and changes to the 
    definition of PM10 measurements are proposed in a new Appendix L 
    and revisions to Appendix J respectively in 40 CR part 50. The 
    effective date of these proposed monitoring regulations would be 6 
    months after the actual promulgation date. The EPA is soliciting 
    comment on all aspects of all of the proposed rules.
    
    III. Discussion of Proposed Revisions to Part 53
    
    A. Designation of Reference Methods for PM2.5
    
        The specifications for reference methods for PM2.5 are 
    described in Appendix L to part 50, proposed elsewhere in this issue of 
    the Federal Register. The performance-based specifications for the 
    operational aspects of a reference method sampler allow various sampler 
    manufacturers to design and fabricate different samplers that would 
    meet the specifications. Accordingly, multiple PM2.5 reference 
    methods are expected to become available from several manufacturers, as 
    is the case for reference methods for PM10 and most gaseous 
    criteria pollutants. Similarly, each reference method for PM2.5, 
    based on a particular sampler, would be formally designated as such by 
    the EPA under new provisions added to part 53.
        These new provisions, primarily contained in a new subpart E, would 
    require that the applicant submit information and documentation to 
    demonstrate that a candidate reference method sampler meets the design 
    specifications set forth in Appendix L of part 50. The provisions would 
    also require that the applicant carry out specific tests to demonstrate 
    that the sampler meets all performance specifications. The nature of 
    these tests and the requirement that they be carried out by the 
    applicant rather than the EPA is consistent with the current 
    requirements in part 53 for designating reference methods for other 
    criteria pollutants.
        Since the critical inlet and particle size separation components of 
    the sampler are specified by design, no wind tunnel or aerodynamic 
    performance tests of these components would be required. But 
    documentation would be required to demonstrate that samplers to be sold 
    as reference methods would be manufactured under an effective quality 
    control system, such as required in an International Organization for 
    Standardization (ISO) 1 9001-certified facility, or a quality 
    control system otherwise certified to meet similar requirements. 
    Specific tests would be required to verify that the critical PM2.5 
    impactor jet diameter is within the design specifications, and that 
    the surface finish of surfaces required to be anodized meets the surface 
    finish specifications. Also, a checklist certifying that reference 
    method samplers are or will be manufactured under an acceptable quality 
    assurance system would have to be completed by an ISO-certified or 
    equivalent auditor and submitted initially and annually.
    ---------------------------------------------------------------------------
    
        \1\ The ISO certification ensures compliance with international 
    manufacturing standards from the design and engineering 
    specifications. An ISO certification, or its equivalent, for the 
    manufacturing of the reference samplers is consistent with the National 
    Technology Transfer and Advancement Act, Section 12(d), 15 U.S.C. 
    Section 272 (1996).
    ---------------------------------------------------------------------------
    
        The performance tests for reference method samplers would focus on 
    testing of the sampler's operational performance parameters, the 
    accuracy of its measurement systems, its field precision, and various 
    other sampler control functions. A comprehensive test procedure is 
    proposed for testing a representative candidate sampler for correct 
    flow rate, flow rate regulation, flow rate measurement accuracy, 
    ambient air temperature and barometric pressure measurement accuracy, 
    filter temperature control and measurement accuracy, and sampling time 
    accuracy. This test procedure would require a temperature-controlled 
    environmental test chamber, a technique to simulate reduced barometric 
    pressure, and facilities to generate simulated solar radiation. Other 
    specific tests are proposed to test the sampler's post-sampling filter 
    temperature control, leak check procedure, flow rate cut off function, 
    and field operational precision. It should be noted that work to test 
    the feasibility of these proposed test procedures has not been 
    completed at this time; therefore, some technical changes to the 
    proposed test procedures may be necessary following the results of that 
    work.
    
    B. Designation of Equivalent Methods for PM2.5
    
        In keeping with the EPA's largely performance-based approach for 
    specification of measurement methods for environmental pollutants, 
    provision is also proposed for designating equivalent methods for 
    PM2.5. These provisions are contained in proposed additions to 
    subparts A and C and proposed new subparts E and F of part 53. To 
    minimize the number and extent of performance tests to which candidate 
    equivalent methods would be subjected, three classes of equivalent 
    methods are proposed to be defined.
        The first class (Class I) would include PM2.5 methods based on 
    samplers that are very similar to a reference method sampler as 
    specified in appendix L to part 50. Class I would primarily include 
    methods based on samplers whose primary difference from reference 
    method samplers is one or more modifications necessary to provide 
    capability for collection of several sequential samples automatically 
    without intermediate operator service. Samplers capable of collecting 
    multiple sequential samples are important because the sampling 
    schedules proposed in Sec. 58.13 of part 58 call for daily sampling for 
    certain SLAMS. With such a requirement, there is an expected need for 
    samplers that will permit the collection of the required daily samples 
    without the need for an operator to visit the site on a daily basis or 
    for installing multiple samplers at the site. (Since the samplers would 
    need to sample from midnight to midnight, a minimum of two single day 
    samplers would be
    
    [[Page 65784]]
    
    required for full daily sampling; however, as a practical matter, 
    additional single day samplers would generally be needed at a daily 
    monitoring site to cover weekends, holidays, and personnel and 
    scheduling logistics.) A sampler capable of automatically collecting 
    five sequential samples would permit twice-weekly servicing of a 
    monitoring site (assuming sample filters can be retrieved and reloaded 
    on the inactive channels without affecting the actively sampling 
    channel).
        Since the design of sequential samplers is not specified 
    explicitly, sampler manufacturers would be able to design and develop 
    their own techniques to provide for this capability. Where the 
    sequential sample technique consists of relatively minor or simple 
    modifications of the reference method sampler, the sampler would be 
    classified as a Class I candidate equivalent method. (Sequential 
    samplers would also be possible as Class II or III equivalent methods.)
        Class I equivalent method sequential samplers would have to be 
    tested to make sure that the modifications required to provide for 
    sequential sampling do not significantly compromise sampler 
    performance. However, because of their similarity to the reference 
    method sampler, the only additional test requirement for most Class I 
    candidate equivalent methods--in addition to the tests and performance 
    requirements applicable to reference method samplers--would be a test 
    for possible loss of PM in any new or modified components in the 
    sampler inlet upstream of the sample filter. This additional test for 
    Class I samplers is set forth in the proposed new Subpart E, along with 
    the tests for reference method samplers.
        Class II equivalent methods would include all other PM2.5 
    methods that are based on a 24-hour integrated filter sample which is 
    subjected to subsequent moisture equilibration and gravimetric mass 
    analysis, but with an associated sampler having substantial deviations 
    from the design or performance specifications for reference method 
    samplers. These samplers may have a different inlet, a different 
    particle size separator, a different volumetric flow rate, a different 
    filter or filter face velocity, or other significant differences. More 
    extensive performance testing would be required for designation of 
    Class II candidate equivalent methods, with various tests required 
    depending on the nature and extent of the differences between the 
    candidate sampler and specified reference method samplers. These tests 
    include a full wind tunnel evaluation, a wind tunnel inlet aspiration 
    test, a static fractionator test, a fractionator loading test, and a 
    volatility test. The tests and their specific applicability to various 
    types of candidate Class II equivalent method samplers are set forth in 
    proposed new subpart F.
        Finally, Class III equivalent methods would include any candidate 
    PM2.5 methods that could not qualify as Class I or Class II. This 
    class would include any filter-based integrated sampling method having 
    other than a 24-hour PM2.5 sample collection interval followed by 
    moisture equilibration and gravimetric mass analysis. More importantly, Class 
    III would also include filter-based continuous or semi-continuous 
    methods, such as beta attenuation instruments, harmonic oscillating 
    element instruments, and other complete in situ monitor types, as well 
    as non-filter-based methods such as nephelometry or other optical 
    instruments.
        The testing requirements for designation of Class III candidate 
    methods would be the most stringent, since quantitative comparability 
    to the reference method would have to be shown under various potential 
    particle size distributions and aerosol compositions. However, because 
    of the variety of measurement principles and types of methods possible 
    for Class III candidate equivalent methods, the test requirements would 
    have to be individually selected or specifically designed or adapted 
    for each such type of method. Therefore, the EPA believes that it is 
    not practical to attempt to develop and explicitly describe the test 
    procedures and performance requirements for all of these potential 
    Class III methods a priori. Rather, it is proposed that the test 
    procedures and performance requirements applicable to specific Class 
    III candidate methods would be determined by the EPA on a case-by-case 
    basis upon request, in connection with each proposed or anticipated 
    application for a Class III equivalent method determination. In this 
    regard, the EPA is interested in receiving comments pertinent to the 
    nature and extent of tests that would be appropriate and effectual in 
    determining the performance of various types of Class III candidate 
    equivalent methods relative to the performance of reference methods for 
    PM2.5.
        All classes of candidate equivalent methods would have to be field-
    tested to determine their comparability to measurements obtained with 
    collocated reference methods. For Classes I and II, these collocated 
    field test requirements are specified explicitly in Subpart C, which is 
    proposed to be revised to include the specific requirements for 
    PM2.5 candidate equivalent methods. The proposed requirements for 
    PM2.5 methods are generally patterned after the existing 
    requirements for PM10 candidate methods.
        However, because of the need for greater measurement precision for 
    PM2.5, the comparability specifications, summarized in Table C-4, 
    are somewhat more stringent than those previously established for 
    PM10. Also, for Class II candidate equivalent methods--where two 
    different test sites are required--more definitive specifications are 
    proposed for the test sites in terms of the PM2.5 to PM10 
    measurement ratio for the test samples. This is necessary because 
    experience with PM10 measurements has indicated that PM 
    measurements made with dissimilar samplers are often considerably 
    affected by differences in the ``character'' of the PM at different 
    monitoring sites, as represented by differences in particle size 
    distribution, composition, density, humidity, and other factors. For 
    purposes of the comparability test, the character of the PM at each 
    test site is represented by the measured PM2.5 to PM10 ratio, 
    which must be greater than 0.75 at one site and less than 0.40 at the 
    other site. (More definitive tests of PM character at the test site are 
    deemed too difficult or costly to carry out for purposes of the 
    comparability test.) Ensuring comparability to reference method 
    measurements at sites having profoundly different character of PM is 
    critically important for Class II (and Class III) candidate equivalent 
    methods. Note, however, that the PM2.5 to PM10 ratio 
    requirement does not apply to testing of Class I candidate methods, 
    where only one test site is required.
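        As a rough illustration of this test-site characterization (not 
    the subpart C procedure itself; the averaging rules and names below are 
    hypothetical), the measured PM2.5 to PM10 ratio at a candidate 
    test site could be summarized and screened as follows:

        # Illustrative screening of Class II test sites; the actual data
        # requirements would be those of the proposed subpart C.
        def site_pm_ratio(pm25_values, pm10_values):
            # Mean measured PM2.5/PM10 ratio from paired 24-hour samples.
            ratios = [f / c for f, c in zip(pm25_values, pm10_values) if c > 0]
            return sum(ratios) / len(ratios)

        def test_sites_qualify(ratio_site_a, ratio_site_b):
            # One site must show a ratio above 0.75 (fine-dominated PM) and
            # the other a ratio below 0.40 (coarse-dominated PM).
            hi = max(ratio_site_a, ratio_site_b)
            lo = min(ratio_site_a, ratio_site_b)
            return hi > 0.75 and lo < 0.40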
    
    C. Quality Assurance
    
        Accurate measurement of ambient particulate matter concentrations 
    is severely hampered by the impracticality of providing PM 
    concentration standards for field (or even laboratory) testing of 
    ambient samplers or monitors. Therefore, it is necessary to rely on a 
    specific, well-defined reference method, uniformity of reference method 
    devices and procedures, and continual assessment of bias and operating 
    precision. For the purposes of this regulation, PM2.5 
    concentration measurements would be referenced to measurements made 
    with a reference method sampler in accordance with the reference method 
    as specified in Appendix L of part 50 of this chapter. Monitoring for 
    PM2.5 requires greater attention to achieving data of high 
    quality, with minimal imprecision and
    
    [[Page 65785]]
    
    relative error. These higher quality monitoring data are essential to 
    reduce the chance that PM2.5 measurements would result in 
    unjustified health risk to the population, when measurements 
    underestimate true concentrations, or in unnecessary control 
    requirements, when measurements overestimate the true concentrations.
        To meet a data quality objective of ±15% precision for 
    ambient PM2.5 attainment measurements, enhanced quality assurance 
    would be required in all areas relating to sampler performance 
    including sampler manufacturing and sampler operation. This is 
    especially important because a reference method sampler is proposed to 
    be used to audit other field monitors, as described later.
        Designated reference and equivalent method samplers and monitors 
    would be required to be manufactured in a manufacturing facility that 
    is either (1) an ISO 9001-registered manufacturing facility, with 
    registration maintained continuously, or (2) a facility that can be 
    demonstrated, on the basis of information submitted to the EPA, to be 
    operated according to an EPA-approved and periodically audited quality 
    system which meets, to the extent appropriate, the same general 
    requirements as for an ISO-registered facility. (This requirement is 
    referred to in this document as an ISO-registered facility, regardless 
    of the procedure taken for EPA approval.)
        In addition to the ISO registration (or equivalent) requirement, a 
    quality assurance manufacturing checklist would have to be submitted 
    annually attesting that the appropriate quality assurance procedures 
    are routinely implemented in the manufacturing of samplers sold as 
    reference or equivalent method samplers. This checklist would have to 
    be signed by an ISO-certified auditor or by an auditor who, based on 
    information submitted to the EPA, meets the same general requirements 
    as provided for ISO-certified auditors. (Similarly, an auditor approved 
    by EPA through either mechanism is referred to in this document as an 
    ISO-certified auditor.) This requirement allows for the demonstration 
    of consistency in production and sustained uniformity in design and 
    operation. Further, all testing related to an application for a 
    reference or equivalent method determination under part 53 would have 
    to be carried out in accordance with ISO 9001 and ANSI/ASQC E4 
    standards.
        It is believed that these requirements are necessary to ensure that 
    all samplers or analyzers sold as reference or equivalent methods are 
    manufactured to the high standard required to achieve the needed data 
    quality. These procedures are in keeping with the developing 
    international standards for manufacturing in this and other industries. 
    However, comments on the appropriateness and impact of these proposed 
    requirements are solicited. While these requirements are currently 
    proposed to apply only to the manufacture of PM2.5 monitors, 
    extending these requirements to the manufacture of PM10 monitors 
    and possibly other types of SLAMS monitors will likely be considered at 
    a later time.
        A new operational requirement would also have to be met by each 
    PM2.5 sampler or monitor to retain its designation as a reference 
    or equivalent method. Each user agency operating a SLAMS site would be 
    required to obtain at least 6 collocated measurements (audits) per year 
    with a reference method ``audit'' sampler for each routinely operating 
    PM2.5 monitor. The data obtained from these collocated audits 
    would be used to determine a national network integrated operating 
    precision and relative accuracy performance indicator for each 
    designated method. A PM2.5 monitoring method that fails to meet 
    the specified limits for this performance indicator would be subject to 
    possible cancellation of its reference or equivalent method designation 
    under the provisions of Sec. 53.11. For more information on this 
    provision, see section 6 of proposed revisions to Appendix A of part 58 
    and its associated preamble, set forth elsewhere in this Federal 
    Register.
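        For illustration, the collocated audit data might be summarized 
    along the following lines (a sketch only; the actual precision and 
    accuracy statistics would be those set forth in the proposed revisions 
    to Appendix A of part 58, and the names below are hypothetical):

        # Illustrative summary of collocated reference-method audits.
        from math import sqrt

        def audit_summary(audit_pairs):
            # audit_pairs: list of (audit_sampler_ug_m3, site_monitor_ug_m3)
            # from the 6 or more collocated audits obtained each year;
            # assumes at least two valid pairs.
            d = [100.0 * (m - a) / a for a, m in audit_pairs if a > 0]
            n = len(d)
            bias = sum(d) / n  # mean percent difference (relative accuracy)
            cv = sqrt(sum((x - bias) ** 2 for x in d) / (n - 1))  # precision
            return bias, cv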
    
    D. Other Changes
    
        A number of other relatively minor technical changes are proposed 
    to subpart A, some of which affect designation of reference or 
    equivalent methods for other criteria pollutants as well as for 
    PM2.5. These changes include new definitions and clarifications of 
    existing definitions in Sec. 53.1; clarifications of the reference and 
    equivalent method designation requirements for methods for all 
    pollutants, including the new classes of equivalent methods for 
    PM2.5 and a new table summarizing all the designation 
    requirements; and updating of the name of the EPA laboratory to which 
    applications are to be sent. Additional changes include proposed 
    clarifications of the content of information required in submitted 
    applications regarding the candidate method test data, manufacturing 
    quality assurance system, and product warranty, and the content 
    required in the operation or instruction manual associated with a 
    candidate method sampler or analyzer.
        Also, because of the increasing complexity of anticipated candidate 
    methods for all criteria pollutants, an increase in the EPA's time 
    limit for processing applications for reference and equivalent methods, 
    from 75 to 120 days, is proposed. Finally, it is proposed (under 
    Sec. 53.4) that applicants for a PM2.5 reference or equivalent 
    method determination be required to provide a sampler or analyzer that 
    is representative of the one associated with the candidate method for 
    inspection and possible testing by the EPA in connection with 
    processing of the application.
    
    IV. Discussion of Proposed Revisions to Part 58
    
    A. Section 58.1--Definitions
    
        The revisions proposed today would revise the definition of the 
    term traceable and add definitions of the terms Consolidated 
    Metropolitan Statistical Area (CMSA), core SLAMS, equivalent methods, 
    Metropolitan Statistical Area (MSA), monitoring planning area (MPA), 
    monitoring plan, PM2.5, Primary Metropolitan Statistical Area 
    (PMSA), population-oriented, reference method, spatial averaging zone (SAZ), SPM fine 
    monitors, and Annual State Monitoring Report.
    
    B. Section 58.13--Operating Schedule
    
        1. PM10 Sampling. The current operating schedule for PM10 
    is based primarily on an analysis of the ratio of measured PM10 
    concentrations to the controlling PM10 standard. Depending upon 
    the ratio, the sampling frequency is either every day, every other day, 
    or every sixth day. The proposed operating schedule would reduce the 
    sampling frequency at all PM10 sites to once every sixth day.
        The Administrator has proposed a new 24-hr PM10 standard based 
    on the 98th percentile which offers a more stable statistical form. She 
    has also solicited comment on the need to retain any 24-hour PM10 
    standard. Unlike the current 24-hr PM10 standard, the proposed 
    standard, if adopted, would not place emphasis on the most extreme 24-
    hr concentrations, especially in areas influenced by fugitive dust. 
    Furthermore, more emphasis for control requirements is anticipated to 
    be placed on annual average concentrations and fewer nonattainment 
    areas (i.e. violation areas) are expected to be based on peak daily 
    concentrations. Consequently, 1 in 6 day sampling should be sufficient 
    to support the new PM10 NAAQS, and only a less dense monitoring network 
    would be needed. Comments are solicited on the appropriate 
    sampling schedules
    
    [[Page 65786]]
    
    for PM10 sites if the 24-hour NAAQS for PM10 is retained.
        2. PM2.5 Sampling. Core PM2.5 SLAMS (including NAMS and 
    Core SLAMS collocated at PAMS sites) would be required to sample every 
    day, unless EPA approves an exception for established seasons 
    of low PM pollution, during which a minimum of once-in-6-days 
    sampling would be permitted. Non-core SLAMS sites would generally be 
    required to sample a minimum of once every sixth day, although episodic 
    or seasonal sampling could also be possible (e.g., in areas where 
    significant violations of the 24-hour NAAQS are expected or at sites 
    heavily influenced by regional transport or episodic conditions). 
    Special purpose monitors, however, may sample on any sampling schedule.
        There is currently very little PM2.5 measurement data. New 
    networks must be established as expeditiously as possible to help 
    characterize the nature and extent of PM2.5 ambient air quality 
    nationwide. Daily sampling for PM2.5 is especially important 
    during the first few years of the new PM2.5 monitoring program to 
    allow for the collection of complete sets of data in order to help with 
    identifying temporal patterns and to understand the episodic behavior 
    of fine particles.
        Although daily sampling with manual methods is labor intensive due 
    to site visits and filter equilibration and weighing, semi-automatic 
    sequential samplers are anticipated to be approvable as Class I 
    equivalent samplers (under the provisions of part 53), which will 
    simplify the data collection process. The EPA solicits comments on the 
    need to extend the start date for a requirement to perform everyday 
    sampling until the time when Class I equivalent samplers have been 
    approved by the Agency.
        In addition, alternative PM2.5 operating schedules which 
    combine intermittent sampling with the use of acceptable continuous 
    fine particle samplers are approvable at some core sites. This 
    alternative is intended to give the States additional flexibility in 
    designing their PM2.5 monitoring networks and to permit data from 
    continuous instruments to be telemetered. This would facilitate public 
    reporting of fine particle concentrations, allow air pollution alerts 
    to be issued and episodic controls to be implemented (as currently done 
    in woodburning areas for PM10). Furthermore, this would permit 
    monitoring agencies to take advantage of new and improved monitoring 
    technologies that should become available during the first few years 
    following the promulgation. As proposed, applicability of the 
    alternative depends on population size of the monitoring area and 
    PM2.5 air quality status.
        After the initial 3 years of PM2.5 data collection (and after 
    characterization of PM2.5 levels, determination of violation areas, 
    and development of State Implementation Plans), reductions in the 
    frequency of PM2.5 sampling may be appropriate. The EPA welcomes 
    comments on the need for continued long-term monitoring with reference 
    or equivalent samplers on an every day schedule at some or all 
    monitoring stations and on the appropriateness of the criteria for 
    allowing alternative schedules.
    
    C. Section 58.14--Special Purpose Monitors
    
        Special purpose monitoring is needed to help identify potential 
    problems, to help define boundaries of problem areas, to better define 
    temporal (e.g., diurnal) patterns, to determine the spatial scale of 
    high concentration areas, and to help characterize the chemical 
    composition of PM (using alternative samplers and supplemental 
    analyzers), especially on high concentration days or during special 
    studies. Special purpose monitors are an important part of the overall 
    PM monitoring program, and sufficient EPA and State resources must be 
    allocated for their use.
        Today's revisions propose that special purpose PM2.5 and 
    PM10 monitors may sample with any measurement method on any 
    sampling schedule. However, the data from SPM's would not be used for 
    attainment/nonattainment designations if the monitoring method is not a 
    reference or equivalent method or does not meet the requirements of 
    Section 2.4 of Appendix C of Part 58. Moreover, in order to encourage 
    the deployment of SPM's, today's revisions propose that nonattainment 
    designations will not be based on data produced at an SPM site with any 
    monitoring method for a period of 3 years following the promulgation 
    date of the NAAQS.
        The rationale for this concept is based on the need to 
    encourage building a monitoring infrastructure from ``ground zero.'' 
    Such an infrastructure is needed because of the complexity of the 
    PM2.5 problem and the relative paucity of PM2.5 data to 
    determine where problem areas lie, and the lack of information about 
    sources and formation of aerosols in particular areas. The requirements 
    for the NAMS, minimum core SLAMS, and minimum additional SLAMS sites, 
    described in this notice, are designed to provide much of the 
    information needed merely to define the location of problem areas.
        There is a need, however, to look beyond this minimal network to 
    create an ``optimal'' network that would gather air quality data over a 
    wider geographic area. The optimal network would consist of SLAMS 
    monitors in addition to the required minimums and also SPM's. There are 
    several reasons for a moratorium on regulatory use of data from these 
    SPM's during the first 3 years following promulgation of the NAAQS:
        (1) SPM data have historically supplemented the SLAMS network to 
    provide the States with a flexible monitoring program. Although the SPM 
    monitoring does not have to use reference or equivalent monitors, the 
    States tend to use these monitors for data collection. And although SPM 
    data are not required to be submitted to EPA, the States tend to enter 
    all such data into the AIRS data base. Because of the paucity of 
    PM2.5 data, we want to encourage both the collection--with 
    reference or equivalent monitors--and the reporting of as much new 
    PM2.5 data as possible. This includes SPM data.
        (2) There is a general reluctance among State and local governments 
    and businesses to monitor ambient air quality beyond those minimum 
    requirements contained in regulations promulgated by the Environmental 
    Protection Agency (EPA) in the Code of Federal Regulations at Part 58. 
    The reluctance is based in part on the fact that areas have 
    historically been designated to nonattainment where monitoring shows 
    violations of the NAAQS and then classified according to the 
    seriousness of the air pollution problem. Currently, such a 
    nonattainment designation and classification automatically trigger the 
    State implementation attainment planning and demonstration 
    requirements, potential stationary and mobile source emission controls, 
    nonattainment new source review for sources wanting to locate or expand 
    in the new nonattainment area, and possibly additional requirements 
    relating to nonattainment of the NAAQS. Thus, to many affected parties, 
    the current regulatory system results in a disincentive for detecting 
    violations.
        (3) The EPA is evaluating a concept involving the identification of 
    areas that have measured or modeled violations and subsequent 
    identification of other areas whose emissions contribute to those 
    violations. The new required PM2.5 monitoring network, however, 
    may be insufficient to determine all such violation areas and 
    contributing areas, and therefore additional monitors may be desirable. 
    Ambient air
    
    [[Page 65787]]
    
    monitoring will play an important and expanded role in defining 
    violating and contributing areas; with a moratorium on the regulatory 
    use of SPM data, States and businesses would have an additional 
    incentive to monitor for data to more accurately determine the extent 
    of these areas.
        (4) During the initial stages of development of a new PM2.5 
    network, there is a greater need for experimental sampling--to move 
    samplers around, to sample for short periods of time, and to utilize 
    different methods. Incomplete data sets may not be fully representative 
    of local air quality. For these and other similar reasons, there is a 
    need for a pilot network that would not be subjected to the same rules 
    as the full SLAMS network.
        (5) Finally, collecting data at a number of sites beyond either the 
    minimum or optimal number proposed in these regulations would support 
    modeling studies to better define pollution problems, help identify 
    potential pollution problems for enhanced air management programs, 
    support the design and implementation of episodic control plans that 
    encourage quick-response voluntary emission reduction measures to lower 
    pollution and thereby possibly avoid nonattainment designations or 
    ``bump-ups'', and help measure progress toward attainment by relating 
    air quality to population.
        The system of SPM's would at first not be part of the full required 
    or even the ``optimal'' network. To provide the best kind of 
    information, EPA believes that properly sited Federal Reference or 
    Equivalent Methods should be used for these SPM efforts in order to 
    collect technically credible data. The EPA also believes that data from 
    those efforts should be reported to AIRS so that they are generally 
    available to the 
    public at large and to those who need them for understanding the nature 
    of the problem and for developing solutions and control strategies.
        In proposing a 3-year moratorium on the regulatory use of SPM data, 
    EPA is trying to establish an incentive for States to engage in this 
    additional SPM monitoring using properly sited Federal Reference or 
    Equivalent Monitors. The data from these SPM's would supplement the 
    data collected by SLAMS sites. Although the SPM data would be exempt 
    from regulatory use during the 3-year moratorium, they would 
    nevertheless be evaluated by the State during its annual SLAMS network 
    review. Any NAAQS violations resulting from PM SPM's should be 
    reported to EPA; such high concentrations should be evaluated by the 
    State in the design of its overall SLAMS network and considered by EPA 
    in its review and approval of the State's monitoring plan. Therefore, 
    during the first 3 years, the SPM data would still play an important 
    role in the regulatory process. After the proposed 3-year exemption 
    period, SPM locations should be considered as potential SLAMS in the 
    State's development, and EPA's subsequent review, of its monitoring 
    network plan, if the sites record high concentrations which 
    indicate potential violations of the PM NAAQS (for either PM10 or 
    PM2.5) and have been operating for at least 6 months.
        The EPA could have taken a different approach to this problem and 
    not proposed a moratorium on the regulatory use of data from the SPM 
    sites. States would still be able to deploy SPM monitors in ways to 
    avoid legal consequences if an exceedance of the NAAQS were found. For 
    instance, any State may use non-reference or non-equivalent methods, 
    which do not meet EPA specifications. Any State could site monitors so 
    that they do not meet EPA siting criteria. Such monitoring would avoid 
    the above-described legal entanglements associated with any NAAQS 
    exceedances, because the data collected would not, under current EPA 
    regulations, be valid for use in comparison to the NAAQS. Moreover, any 
    State could simply not submit the SPM data to EPA.
        The approach described in the above paragraph, however, does have 
    major disadvantages. For instance, an approach that uses unacceptable 
    monitors or siting would result in data that, even if close to being 
    representative of the area or of what a properly sited acceptable 
    monitor would measure, would still be clouded with questions regarding 
    their accuracy or precision, which would limit their value in the kinds 
    of analyses mentioned above. In the case of data simply not submitted 
    to EPA, the data would not be available either to other States working 
    on development of a solution to the PM-fine problem or, more important, 
    to the public at large, who would then be unaware of any problems 
    detected by the monitor.
        In light of these concerns, EPA's proposal is an attempt to take a 
    more straightforward approach, which will encourage collection of 
    additional data that are technically credible and publicly available 
    and thereby address the Act's mandate for EPA to take the lead in this 
    matter, as found in section 103(c).
    
    D. Section 58.15--PM2.5 NAAQS Eligible Monitors
    
        This new section is proposed to define the PM2.5 monitors 
    eligible for use in determining compliance with the PM2.5 annual 
    and 24-hour NAAQS. The EPA proposes that States identify on EPA's AIRS 
    monitoring site file, all PM2.5 sites eligible for both annual 
    NAAQS comparisons and 24-hour comparisons and those only eligible for 
    24-hour (daily) comparisons. The former sites are intended to be 
    population oriented spatial averaging sites and the latter are intended 
    to represent population-oriented ``hot spot'' locations. The reasons 
    for the different types of monitors are discussed in the preamble to 40 
    CFR part 50.
    
    E. Section 58.20--Air Quality Surveillance: Plan Content
    
        The revisions proposed today would require States to submit a PM 
    monitoring plan to the Regional Administrator within 6 months of the 
    effective date of promulgation. The monitoring plan would describe the PM 
    monitoring strategy based on the use of SLAMS (including NAMS and PAMS) 
    and SPM's for PM10 and PM2.5; describe the phase-in of 
    PM2.5 monitors and changes in the existing PM10 monitoring 
    program; describe monitoring objectives and scales of 
    representativeness to facilitate subsequent interpretation of data; 
    define sampling schedules; denote sites intended for comparison to the 
    PM NAAQS; and define the monitoring planning areas (MPA's) and spatial 
    averaging zones (SAZ's) within the State. It should also reference the 
    revised quality 
    assurance plan which is required by Appendix A to Part 58. In regard to 
    the use of air quality data for making comparisons to the NAAQS and 
    other SIP related purposes, the monitoring plan shall also describe the 
    SPM's whose data the State intends to use for SIP purposes. The 
    monitoring plan must also provide for an annual review for termination, 
    relocations, or establishment of new SLAMS or core SLAMS.
    
    F. Section 58.23--Monitoring Network Completion
    
        Under the revisions proposed today, the PM networks would be 
    expected to be completed within 3 years of the effective date of 
    promulgation. While new PM2.5 networks are developed, existing 
    PM10 networks should be considered for reductions consistent with 
    the goals stated in the background section earlier. For PM2.5, a 
    3-year phase-in would be used. The proposed schedule for deployment of 
    new required PM2.5 monitors is described
    
    [[Page 65788]]
    
    here. During the first year, a minimum of one monitoring planning area 
    per State would be required to have core PM2.5 SLAMS. This area 
    would be selected by the State according to the likelihood of observing 
    high PM2.5 concentrations and according to the size of the 
    affected population. In addition, one PM2.5 site would be 
    collocated at one site in each of the PAMS areas. During the second 
    year, all other core population-oriented PM2.5 SLAMS, and all core 
    background and transport sites, must be fully operational. During the 
    third year, any additional required PM2.5 (non-core) SLAMS must be 
    fully deployed and all NAMS sites must be selected from core SLAMS and 
    proposed to EPA for approval.
    
    G. Section 58.25--System Modification
    
        No changes to the regulatory language are proposed to Sec. 58.25; 
    however, under the revisions proposed today, the annual system 
    modifications review must include changes to PM2.5 site 
    designations (e.g., NAAQS comparison sites), the number or boundaries 
    of monitoring planning areas and/or SAZ's.
    
    H. Section 58.26--Annual State Monitoring Report
    
        Under the current regulations, States are required to submit an 
    annual SLAMS data summary report. Under today's proposed revisions, 
    this report shall be expanded to include additional information. First, 
    the new State Monitoring report shall describe the proposed changes to 
    the State's Monitoring Plan, as defined in Sec. 58.20. It shall include 
    a new brief narrative report to describe the findings of the annual 
    SLAMS network review, reflecting within-year and proposed changes to 
    the State air quality surveillance system, and to provide information 
    on PM SPM's and other PM sites described in the monitoring plan 
    regardless of whether data from the stations are submitted to EPA 
    (including number of monitoring stations; general locations; monitoring 
    objective; scale of measurement; and appropriate concentration 
    statistics to characterize PM air quality such as number of 
    measurements, averaging time, and maximum, minimum, and average 
    concentration). The latter is needed for EPA to ensure that a proper 
    mix of permanent and temporary monitoring locations is used, that 
    populated areas throughout the nation are monitored, and to provide 
    needed flexibility in the State monitoring program. The content of this 
    brief report shall be in accordance with EPA guidance, and will be 
    available at the time of promulgation of the final Part 58 rule.
        Next, States would be required to describe the proposed changes to 
    existing PM networks. Proposed changes to the existing networks may 
    include modifications to the number, size, or boundaries of Monitoring 
    Planning Areas or SAZ's, number and location of PM SLAMS; number or 
    location of core PM2.5 SLAMS; alternative sampling frequencies 
    proposed for PM2.5 SLAMS (including core PM2.5 SLAMS and 
    PM2.5 NAMS); core PM2.5 SLAMS to be designated PM2.5 
    NAMS; and PM SLAMS to be designated PM NAMS. SLAMS with NAAQS 
    violations should be considered to become new or replacement core 
    sites, and SPM's with NAAQS violations could become part of the SLAMS 
    network. The proposed changes should be developed in close consultation 
    with the appropriate EPA Regional Office and submitted to the 
    appropriate Regional Office for approval. The portion of the plan 
    pertaining to NAMS would be submitted to the Administrator (through the 
    appropriate Regional Office).
        Finally, as a continuation of current regulations, the States shall 
    be required to submit the Annual SLAMS summary report and to certify to 
    the Administrator that the SLAMS data submitted are accurate and in 
    conformance with applicable Part 58 requirements. Under the revisions 
    proposed today, States would also be required to submit annual 
    summaries of SPM data to the Regional Administrator for sites included 
    in their Monitoring Plan and to certify that such data are similarly 
    accurate and likewise in conformance with applicable Part 58 
    requirements or other requirements approved by the Regional 
    Administrator, if these data are intended to be used for SIP purposes.
        During the first 3 years following promulgation, the monitoring 
    plan and any modifications of it must be submitted to EPA by July 1 
    (starting in the year following promulgation) or by an alternate annual 
    date to be negotiated between the State and Regional Administrator, 
    with review and approval/disapproval by the Regional Administrator 
    within 45 days. After the initial 3-year period or once a SAZ has been 
    determined to be violating any PM2.5 NAAQS, then changes to a 
    monitoring planning area will require public review and notification to 
    ensure that the appropriate monitoring locations and site types are 
    included. Specific comments on, or suggestions for, alternate 
    procedures that would allow public review and comment on changes in 
    MPA's, SAZ's, or other elements of a monitoring plan developed by a 
    State or local air pollution control agency, without being unduly time 
    consuming or burdensome, are especially welcome.
    
    I. Section 58.30--NAMS Network Establishment
    
        The revision proposed today would designate 6 months after the 
    effective date of promulgation as the date by which the NAMS network 
    portion (to be derived from core PM2.5 SLAMS) of each State's 
    SLAMS network must be fully described and documented in a submittal to 
    the Administrator (through the appropriate EPA Regional Office). At 
    this time, a State's NAMS PM10 network must be reaffirmed if no 
    changes are made to the existing network and if changed must also be 
    fully described and documented in a submittal to the Administrator 
    (through the appropriate EPA Regional Office).
    
    J. Section 58.31--NAMS Network Description
    
        Today's proposed revision would require that the NAMS network 
    description also include for PM2.5 the monitoring planning area, 
    SAZ, and the site code designation to identify which site will be used 
    to determine violation of the appropriate NAAQS (annual or 24-hour).
    
    K. Section 58.34--NAMS Network Completion
    
        The revision proposed today would designate 3 years after the 
    effective date of promulgation as the date by which the State must have 
    all PM2.5 NAMS in operation, and 1 year after the effective date 
    of promulgation as the date by which the State must have made all 
    changes to the existing PM10 NAMS.
    
    L. Section 58.35--NAMS Data Submittal
    
        This section defines the data submittal requirements for NAMS and 
    SLAMS. Consistent with current requirements, only the total mass 
    derived from PM10 and PM2.5 SLAMS would be required to be 
    submitted to EPA. However, EPA encourages reporting all data from 
    monitors proposed in the State monitoring plan. These optional data 
    would include data from SPM's and compositional data from all monitors.
    
    M. Appendix A--Quality Assurance Requirements for SLAMS
    
        Meeting the more stringent data quality objectives for ambient 
    PM2.5 monitoring will require considerably enhanced quality 
    assurance in the areas of sampler operation, filter handling, data 
    quality assessment, and other
    
    [[Page 65789]]
    
    operator-related aspects of the PM2.5 measurement process.
        Most operational quality control aspects are specified in Appendix 
    A in general terms. For PM2.5, however, explicit, more stringent, 
    requirements are proposed for sample filter treatment--including the 
    moisture equilibration protocol, weighing procedures, temperature 
    limits for collected samples, and time limits for prompt analysis of 
    samples. These requirements, which are specified in the reference 
    method set forth in proposed new Appendix L to part 50, will help to 
    control measurement precision. Additional or supplemental detailed 
    quality assurance procedures and guidance for all operator-related 
    aspects of the PM2.5 monitoring process will be developed and 
    published as a new Section 2.12 of the EPA's, Quality Assurance 
    Handbook for Air Pollution Measurement Systems series to assist 
    monitoring personnel in maintaining high standards of data quality.
        Procedures for continually assessing the operational quality of the 
    SLAMS monitoring data are specified explicitly in Appendix A of part 
    58. Perhaps the most significant new data quality assessment 
    requirement proposed for PM2.5 monitoring is the requirement that 
    each routinely operating PM2.5 ``compliance'' monitor must be 
    ``audited'' at least 6 times per year. A compliance monitor is a 
    monitor at a site which is included in the PM monitoring plan and whose 
    data is intended for comparison to the NAAQS as described in Appendix 
    D. This is the first time a requirement has been proposed to assess the 
    relative accuracy of the mass concentration measured by a SLAMS PM 
    monitor.
        Each of these 6 ``audits'' would be performed by the monitoring 
    agency and would consist of concurrent operation of a collocated 
    reference method audit sampler along with the routinely operated 
    compliance sampler or monitor. The data from these collocated audits 
    would be pooled by the EPA to assess the performance of PM2.5 
    monitoring methods on a national basis and for each reporting 
    organization. These data would also be used in a screening test of the 
    performance of individual monitors at each monitoring location. Six has 
    been determined to be the minimum number of audit data points needed to 
    yield a reasonable assessment of individual monitor operational 
    performance on an annual basis. This number is analogous to the data 
    requirements for the precision and accuracy assessments for PM10, 
    PM2.5 and other pollutants described in Section 5.
        The integrated operating precision and relative accuracy, evaluated 
    annually, would have to meet a limit of 15 percent. A 
    monitoring method that fails this requirement nationally would be 
    placed in a probationary status pending resolution of the inadequate 
    performance or possible cancellation of its reference or equivalent 
    method designation under the provisions of Sec. 53.11 of part 53 of 
    this chapter. While this action would not result in immediate 
    cancellation of the designation, it would require the method applicant 
    (e.g., the manufacturer) to correct the method performance problems or 
    to submit alternative evidence or arguments (possibly in collaboration 
    with other affected entities) that the method's designation should not 
    be canceled.
        Reporting organizations whose monitoring data fail to meet this 
    requirement (or are significantly worse than the national norm) would 
    be notified that their quality assurance plans or procedures need 
    improvement. Similarly, monitoring data from individual sites that fail 
    the screening test would require remedial action or replacement of the 
    monitoring method. Note, however, that failure of either of these tests 
    or the national test would not automatically cause the associated 
    monitoring data to be invalid.
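        For illustration only, the following sketch (in Python; it is not 
    part of the proposed regulatory text) outlines how a year of collocated 
    audit data might be screened for a single monitor. The minimum of 6 
    audits per year and the 15 percent limit are taken from the discussion 
    above; the simple mean-absolute-percent-difference indicator and the 
    names used below are assumptions made for illustration, since the 
    governing statistics would be those specified in Appendix A.

from statistics import mean

MIN_AUDITS_PER_YEAR = 6       # minimum collocated audits proposed per year
PERFORMANCE_LIMIT_PCT = 15.0  # proposed operating precision/accuracy limit

def audit_percent_difference(routine_ug_m3, reference_ug_m3):
    """Percent difference of the routine monitor relative to the collocated
    reference method audit sampler."""
    return 100.0 * (routine_ug_m3 - reference_ug_m3) / reference_ug_m3

def screen_monitor(audit_pairs):
    """Rough annual screen for one monitor; audit_pairs holds
    (routine, reference) mass concentrations in micrograms per cubic meter."""
    if len(audit_pairs) < MIN_AUDITS_PER_YEAR:
        return "insufficient audit data"
    diffs = [audit_percent_difference(r, ref) for r, ref in audit_pairs]
    indicator = mean(abs(d) for d in diffs)   # stand-in performance indicator
    return "pass" if indicator <= PERFORMANCE_LIMIT_PCT else "needs remedial action"

# Example: six collocated audits during the year
print(screen_monitor([(15.2, 15.0), (9.8, 10.1), (22.5, 21.9),
                      (7.9, 8.2), (30.4, 29.7), (12.1, 12.4)]))
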
        Comments are solicited on these method operating performance audits 
    and particularly on the potential use of the audit data by EPA to: (1) 
    Determine a national network operating precision and accuracy 
    performance indicator for each type of designated method, (2) determine 
    the operational performance of methods used by reporting organizations 
    relative to the national norm, and (3) consider cancellation of the 
    reference or equivalent method designation of methods failing to meet 
    the 15 percent operational performance specification.
        Other data assessment requirements proposed in Appendix A for 
    PM2.5 monitoring networks are patterned after the current 
    requirements for PM10 networks and are intended to supplement the 
    audit procedure. PM2.5 network monitors would be subject to 
    precision and accuracy assessments for both manual and automated 
    methods, using procedures similar or identical to the current 
    procedures required for PM10 monitoring networks. Results of these 
    field tests performed by the monitoring agencies (along with the 
    results of the field audits) would be sent to the EPA, which then would 
    carry out the specified calculations. These calculated statistics would 
    become part of the annual assessment of the quality of the monitoring 
    data.
        For automated methods, the additional assessment of the precision 
    would consist of a one-point precision check performed at least once 
    every 2 weeks on each automated analyzer used to measure PM2.5. 
    This precision check would be made by checking the operational flow 
    rate of the analyzer. A standard precision flow rate check procedure 
    similar to that currently used for PM10 network assessments is 
    proposed. Also proposed is an alternative procedure where, under 
    certain specific conditions, it would be permissible to obtain the 
    precision check flow rate data from the analyzer's internal flow meter 
    without the use of an external flow rate transfer standard. (This 
    alternative procedure would also be made applicable to PM10 
    methods.)
        The additional accuracy assessment procedure proposed for 
    PM2.5 automated methods is also similar to that used for PM10 
    networks, although each PM2.5 analyzer would have to be audited 
    quarterly rather than annually, as is the current requirement for 
    PM10 analyzers. The assessment would be performed on the 
    analyzer's operational flow rate using a flow rate transfer standard, 
    with the accuracy calculated from the percent difference between the 
    actual flow rate and the corresponding flow rate indicated by the 
    analyzer.
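        Both the biweekly one-point precision check and the quarterly flow 
    rate audit reduce to a percent difference between the flow rate 
    indicated by the analyzer or sampler and the flow rate measured with a 
    transfer standard. The following sketch is illustrative only; the 
    function and variable names are hypothetical, and the reporting 
    conventions would be those of Appendix A.

def flow_rate_percent_difference(indicated_lpm, actual_lpm):
    """Percent difference between the flow rate indicated by the analyzer or
    sampler and the actual flow rate measured with a flow rate transfer
    standard (illustrative sketch only)."""
    return 100.0 * (indicated_lpm - actual_lpm) / actual_lpm

# Example: the analyzer indicates 16.90 L/min; the transfer standard reads 16.67 L/min
print(round(flow_rate_percent_difference(16.90, 16.67), 2))   # about 1.38 percent
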
        For manual methods, an additional precision assessment would be 
    calculated from the data collected from collocated samplers, as is 
    currently required for manual PM10 methods. The number of 
    collocated samplers within each PM2.5 network is proposed to be 
    based upon the total number of samplers within the reporting 
    organization's network. For 1 to 10 total sites, 1 site would be 
    selected for collocation; for 11 to 20 total sites, 2 sites would be 
    selected for collocation; and if a reporting organization has over 20 
    total sites, then 3 sites would be selected for collocation. As for 
    PM10, one sampler of the collocated pair would be designated as 
    the primary sampler whose samples would be used to report air quality 
    for the site, and the other would be designated as the duplicate 
    sampler. The percent differences in measured concentration between the 
    two collocated samplers would be used to calculate this additional 
    network precision.
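        The proposed collocation rule and the collocated precision 
    calculation can be summarized in a short sketch (illustrative only; the 
    names are hypothetical, the division by the average of the pair is an 
    assumption, and the controlling formulas are those specified in 
    Appendix A):

def required_collocated_sites(total_sites):
    """Number of collocated PM2.5 sites proposed per reporting organization."""
    if total_sites < 1:
        return 0
    if total_sites <= 10:
        return 1
    if total_sites <= 20:
        return 2
    return 3

def collocated_percent_difference(primary_ug_m3, duplicate_ug_m3):
    """Percent difference between the primary and duplicate samplers,
    computed here against the average of the pair."""
    average = (primary_ug_m3 + duplicate_ug_m3) / 2.0
    return 100.0 * (primary_ug_m3 - duplicate_ug_m3) / average

print(required_collocated_sites(14))                          # 2 collocated sites
print(round(collocated_percent_difference(18.3, 17.6), 2))    # about 3.9 percent
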
        The accuracy of the flow rate system of manual methods for 
    PM2.5 would be determined, as for automated methods, by auditing 
    each sampler each calendar quarter. Using a flow rate transfer 
    standard, each sampler would be audited at its normal operating flow
    
    [[Page 65790]]
    
    rate. The percent differences between these flow rates would be used to 
    calculate an additional indicator of accuracy.
        Although the new quality assurance requirements for PM2.5 
    would result in an increase in the quality of the PM monitoring data, 
    the additional QA/QC checks would entail additional cost to the 
    monitoring agency. Some of the new QA/QC assessment requirements may 
    somewhat overlap the similar information provided by other checks, such 
    as the periodic flow rate checks and the use of collocated samplers in 
    monitoring networks. Consequently, the EPA solicits comments on the 
    need to maintain all of these QA requirements and also on the adequacy 
    of the proposed QA data assessments to ensure the defined quality for 
    PM2.5 measurements.
        Table A-1, which summarizes the minimum data quality assessment 
    requirements, would be updated to include the new requirements for 
    PM2.5 methods, and other minor, mostly editorial changes are 
    proposed to Appendix A to update and clarify the language and specific 
    requirements.
        A change to section 2.5 of Appendix A is also being proposed to 
    provide for technical system audits to be performed by EPA at least 
    every three years rather than every year. This change to a less 
    frequent system audit schedule recognizes the fact that for many well 
    established agencies, an extensive system audit and rigorous inspection 
    may not be necessary every year. The determination of the extent and 
    frequency of system audits at an even lower frequency than the proposed 
    three year interval is being left to the discretion of the appropriate 
    Regional Office, based on an evaluation of the Agency's data quality 
    measures. This change would afford both the EPA and the air monitoring 
    agencies flexibility to manage their air monitoring resources to better 
    address the most critical data quality issues.

    N. Appendix C--Monitoring Methodology

        Section 2.2 of Appendix C is proposed to be amended to allow the 
    use of PM10 monitors as surrogates for PM2.5 monitors for 
    purposes of demonstrating compliance with the NAAQS. However, following 
    the measurement of a PM10 concentration higher than the 24-hour 
    PM2.5 standard or an annual average concentration higher than the 
    annual average PM2.5 standard, the PM10 monitor would have to 
    be replaced with a PM2.5 monitor. In addition, for NAMS that are 
    converted to PM2.5 monitoring from PM10 monitoring, the 
    PM10 monitoring must continue concurrently with the PM2.5 
    monitoring for 1 year following the beginning of the PM2.5 
    monitoring.
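        The replacement trigger described above can be stated compactly as 
    follows (an illustrative sketch only; the names are hypothetical, and 
    the levels of the 24-hour and annual PM2.5 standards are those that 
    would be established in 40 CFR part 50):

def surrogate_pm10_must_be_replaced(pm10_24hr_max, pm10_annual_mean,
                                    pm25_24hr_std, pm25_annual_std):
    """Return True when a PM10 monitor serving as a surrogate for PM2.5 would
    have to be replaced with a PM2.5 monitor, i.e., when the measured PM10
    concentration exceeds the corresponding PM2.5 standard level
    (all values in micrograms per cubic meter; illustrative only)."""
    return (pm10_24hr_max > pm25_24hr_std or
            pm10_annual_mean > pm25_annual_std)

# Hypothetical example inputs only; the actual standard levels are those proposed in part 50.
print(surrogate_pm10_must_be_replaced(pm10_24hr_max=70.0, pm10_annual_mean=22.0,
                                      pm25_24hr_std=65.0, pm25_annual_std=15.0))
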
        Appendix C would also be amended to add a new section 2.4 
    containing provisions that would allow the use at a SLAMS of a 
    PM2.5 method that had not been designated as a reference or 
    equivalent method under part 53. Such a method would be allowed to be 
    used at a particular SLAMS site to make comparisons to the NAAQS if it 
    met the basic requirements of the test for comparability to a reference 
    method sampler for PM2.5, as specified in Subpart C of part 53 of 
    this chapter, in each of the four seasons of the year at the site at 
    which it is intended to be used. A method that meets this test would 
    then be further subjected to the operating precision and accuracy 
    requirements specified in section 6 of Appendix A of this part, at 
    twice the normal evaluation interval (6 audits in 6 months instead of 6 
    audits in 12 months). A method that meets these requirements would not 
    become an equivalent method, but the method could be used at that 
    particular SLAMS site for any regulatory purpose. The method would be 
    assigned a special method code, and monitoring data obtained with the 
    method would be accepted into AIRS as if they had been obtained with a 
    reference or equivalent method. This provision could thus allow the use 
    of non-conventional PM2.5 methods, such as optical or open path 
    measurement methods, which would be difficult to test under the 
    equivalent method test procedures proposed for part 53.
        In addition, Appendix C would also be amended to add two new 
    sections. A proposed new section 2.5 would clarify that correlated 
    acceptable continuous (CAC) methods for PM2.5 approved for use in 
    a SLAMS under proposed new provisions in Sec. 58.13(f) would not become 
    de facto equivalent methods. This applies to methods that have not been 
    designated equivalent and do not satisfy the requirement of Section 2.4 
    described above. The new section would further clarify that the 
    monitoring data obtained with CAC methods would be restricted to use 
    for the purposes of Sec. 58.13(f) and would not be used for making 
    comparisons to the NAAQS. Proposed new section 2.9 would define so-
    called ``IMPROVE'' samplers for fine particulate matter and clarify 
    that IMPROVE samplers, although not designated as equivalent methods, 
    could be used in SLAMS for monitoring regional background 
    concentrations of fine particulate matter.
        Finally, minor changes are proposed to section 2.7.1 to update the 
    address to which requests for approval for the use of methods under the 
    various provisions of Appendix C should be sent, and section 5 to add 
    additional references.
    
    O. Appendix D
    
        The revisions to Appendix D proposed today would revise Sections 1, 
    2, 2.8, 3, 3.7, and 5 to incorporate changes made necessary by the 
    proposed new PM2.5 NAAQS. Section 1 is revised to add criteria for 
    core PM2.5 stations. Two additional SLAMS monitoring objectives 
    are added: the first is to determine the extent of regional pollutant 
    transport among populated areas, which may originate from distant 
    pollutant sources; the second is in support of secondary NAAQS, to 
    determine the welfare-related impacts in more rural and remote areas 
    (such as visibility impairment and effects on vegetation). Section 2 is 
    revised to include information that would be useful in designing 
    regulatory networks. Section 2.8 and 3.7 are revised to apply to 
    PM2.5 as well as PM10. Section 2.8.1 is added to discuss 
    monitoring planning areas and SAZ's. Section 2.8.2 is added to address 
    the PM2.5 monitoring sites and other requirements to be discussed 
    in the State PM monitoring plan. Finally, section 2.8.3 is added to 
    describe the selection of monitoring locations and SAZ's within the 
    monitoring planning area. A series of diagrams is used to illustrate 
    the basic principles.
        The PM2.5 NAMS shall be selected from the core PM2.5 
    SLAMS. This network will focus on population-oriented surveillance and 
    is intended to provide a national trends network to study the impact of 
    PM2.5 emission sources including regional transport. A new Table 
    5, which lists the goals for the number of PM2.5 NAMS by EPA 
    Region, is added to Section 3.7. Table 5 in Section 5 is redesignated 
    as Table 6 and revised to include PM2.5.
        In Section 2.8.1, in particular, MPA's and SAZ's are introduced to 
    conform to the population-oriented, spatial averaging approach taken in 
    the proposed new PM2.5 NAAQS under 40 CFR Part 50. This approach 
    is more directly related to the epidemiological studies used as the 
    basis for the proposed revisions to the particulate matter NAAQS. This 
    proposal recognizes that the use of MPA's and SAZ's introduces greater 
    complexity into the network design process and the assessment of 
    violations of the NAAQS. Thus, the Administrator would specifically 
    welcome comments on the
    
    [[Page 65791]]
    
    network design approach described in Section 2.8.1 through Section 
    2.8.3.
        Previous requirements for number of monitors in this appendix have 
    been related to urbanized area populations. The boundaries of these 
    urbanized areas do not follow political or geographical 
    boundaries. Hence, it is difficult at times to determine the component 
    populations, emissions, or location of monitoring sites. A new concept 
    is being introduced with this proposal to change from urbanized area 
    population to MSA/PMSA populations for PM10 and PM2.5. This 
    will make it easier to track monitors for the above reasons, and to 
    more accurately relate measured concentrations to population exposures.
    1. NAAQS Comparison Sites and New Site Codes
        Through its monitoring plan, which is reviewed and approved by the 
    Regional Administrator, a State would select the population-oriented 
    \2\ sites eligible for NAAQS comparisons which are included in each 
    monitoring planning area and its SAZ's. Comparisons with the annual 
    primary PM2.5 NAAQS would be based on population oriented SLAMS 
    sites as well as other sites representative of area-wide concentrations 
    in SAZ's. Comparisons to the 24-hour primary PM2.5 NAAQS would be 
    based on these sites as well as all other sites which are population-
    oriented. To encourage PM2.5 monitoring initially, for the first 3 
    years after effective date of promulgation a moratorium is proposed on 
    using data from all eligible SPM's to determine violations of the 
    NAAQS. After this time, any operating SPM site which records a 
    violation of the NAAQS would become eligible for NAAQS comparisons, 
    should be included in the State monitoring plan, and should be 
    considered during the State's review and development of their 
    monitoring network.
    ---------------------------------------------------------------------------
    
        \2\ As currently used in Part 58, population-oriented monitoring 
    or sites applies to residential areas, commercial areas, 
    recreational areas, industrial areas where workers from more than 
    one company are located, and other areas where a substantial number 
    of people may spend a significant fraction of their day.
    ---------------------------------------------------------------------------
    
        Figure 1 in Appendix D shows a conceptual Venn diagram that 
    illustrates which PM2.5 sites in a MPA would be eligible for 
    comparison to the 24-hour and annual PM2.5 NAAQS. To be eligible 
    for NAAQS comparisons, sites must meet all three of the following 
    requirements: (1) Are NAMS/SLAMS or other population oriented sites, 
    (2) are included in the monitoring plan, and (3) meet the requirements 
    of 58.13 and Appendices A, C, and E. Sites that meet the additional 
    requirement of generally representing areawide concentrations in the 
    SAZ are also eligible for comparison to the annual PM2.5 NAAQS 
    using the spatial averaging procedure specified in Part 50 Appendix K. 
    Such sites are designated ``B''. All core monitoring sites and NAMS 
    sites, which are a subset of the core sites, are B sites as are many 
    other SLAMS and some non-SLAMS sites. Other population-oriented sites 
    which are more representative of localized hot spots are only eligible 
    for comparison on a site-by-site basis to the 24-hour PM2.5 NAAQS 
    and are designated ``D''. These may include population-oriented 
    industrial monitors which meet the applicable Part 58 requirements and 
    are also included in the PM monitoring plan. The figure shows that all 
    PM2.5 SLAMS sites are designated ``B'' or ``D''. Sites not 
    designated as ``B'' or ``D'' sites would be designated as ``O'' sites. 
    These codes would become new pollutant specific codes on the AIRS 
    monitoring site file. In addition, core SLAMS PM2.5 sites will 
    receive a new AIRS site type code. These data reporting changes will be 
    described more fully in future AIRS guidance.
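        A minimal sketch of the eligibility logic illustrated by Figure 1 
    follows (the field and function names are hypothetical; the controlling 
    criteria are those of Sec. 58.13 and Appendices A, C, D, and E):

from dataclasses import dataclass

@dataclass
class Site:
    population_oriented: bool       # NAMS/SLAMS or other population-oriented site
    in_monitoring_plan: bool        # included in the State PM monitoring plan
    meets_part58_reqs: bool         # meets Sec. 58.13 and Appendices A, C, and E
    areawide_representative: bool   # generally represents areawide SAZ concentrations

def site_code(site):
    """Assign the proposed NAAQS-comparison code: 'B' (annual and 24-hour),
    'D' (24-hour only), or 'O' (not eligible). Sketch of the Figure 1 logic."""
    eligible = (site.population_oriented and site.in_monitoring_plan
                and site.meets_part58_reqs)
    if not eligible:
        return "O"
    return "B" if site.areawide_representative else "D"

print(site_code(Site(True, True, True, True)))    # B: eligible for both NAAQS
print(site_code(Site(True, True, True, False)))   # D: 24-hour comparisons only
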
        A network design issue that relates to the spatial averaging form 
    of the annual standard is the selection of the first (and/or only) site 
    in a prospective SAZ. Because the intent of the spatial average form of 
    the PM2.5 NAAQS is to estimate community, area-wide air pollution, 
    the emphasis for the first selected SLAMS sites (including core SLAMS) 
    would be on ``typical population exposure.''
    2. Monitoring Planning Areas and SAZ's
        In order to acquire population-oriented, spatially averaged 
    monitoring data that correspond more closely to the data that are the 
    basis for the proposed PM2.5 NAAQS, the concepts of monitoring 
    planning areas and SAZ's are used in Section 2.8.1. As part of its 
    monitoring plan, a State will propose monitoring planning areas and 
    also propose non-overlapping SAZ's for each monitoring planning area. 
    The number of monitoring planning areas is determined by the State. 
    This may be one area to cover a small State like Rhode Island or be as 
    many as 25 to correspond to existing air pollution control districts in 
    a State like California. Information to be considered includes 
    topography, PM emissions, number and type of significant PM sources as 
    well as population density and distribution. Monitoring planning areas 
    are required to include all metropolitan statistical areas (MSA's) and 
    Primary Metropolitan Statistical areas (PMSA's) with population greater 
    than 500,000, and generally recommended to include MSA's/PMSA's with 
    population greater than 250,000 and high pollution (defined as 
    producing measurements greater than or equal to 0.8 times the level of 
    the PM2.5 NAAQS) as well as other areas determined to be likely to 
    have high concentration of PM2.5. In addition, optional MPA's may 
    include other designated parts of a State. An MPA should not include 
    different areas separated by topographical barriers. Each MPA can have 
    one or more SAZ's representing the area. The SAZ's define the area within 
    which all eligible monitoring data (from ``B'' sites) will be averaged 
    for comparisons with the annual PM2.5 NAAQS. The MPA's and SAZ's 
    would be reviewed and approved annually by the Regional Administrator. 
    Until the monitoring plan is approved, EPA intends to have the SLAMS 
    and sites eligible for NAAQS comparisons default to the SLAMS 
    previously approved. Sites which have discontinued monitoring would 
    continue to be used for comparisons to the NAAQS until their monitoring 
    type status changes.
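        The MPA inclusion thresholds described above can be summarized as 
    follows (an illustrative sketch only; the names are hypothetical, and 
    ``high pollution'' is taken, as in the text, to mean measurements at or 
    above 0.8 times the level of the PM2.5 NAAQS):

def mpa_status(msa_population, peak_to_naaqs_ratio):
    """Classify an MSA/PMSA for monitoring planning area purposes, following
    the population and concentration thresholds described above.

    peak_to_naaqs_ratio -- measured or estimated concentration divided by
    the level of the proposed PM2.5 NAAQS."""
    if msa_population > 500_000:
        return "MPA required"
    if msa_population > 250_000 and peak_to_naaqs_ratio >= 0.8:
        return "MPA generally recommended"
    return "MPA optional"

print(mpa_status(1_200_000, 0.6))    # MPA required
print(mpa_status(300_000, 0.85))     # MPA generally recommended
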
        Multiple zones within an MPA are most appropriate for large 
    metropolitan areas, large geographical monitoring regions and areas in 
    which concentrated source regions are in low population portions of an 
    MPA. All MPA's and SAZ's must be defined on the basis of some existing 
    delineated mapping data such as county boundaries, zip codes, census 
    blocks or groups of census blocks. This will assist in the proper 
    characterization of the spatial representativeness of air monitoring 
    sites and facilitate better presentations of air monitoring data on 
    national, regional, and local maps.
        Any area may become a SAZ based on considerations of population 
    density, pollution concentration gradients, and/or the physical size of 
    the area. Generally, a SAZ should 
    characterize an area of relatively homogeneous air quality (i.e., the 
    annual average concentration of the individual monitoring locations 
    within the area should be within 20 percent of the spatial 
    average) and be affected by the same major source categories of 
    particulate matter. In MSA's, the SAZ's must completely cover the 
    entire MPA. In other MPA's, the SAZ's might not completely cover the 
    entire MPA. For example, small networks consisting of say one or two 
    monitoring sites may not adequately characterize the air quality in a 
    large geographic area or in large areas of relatively low population or 
    pollution density. In another situation,
    
    [[Page 65792]]
    
    population centers and pollution regions represented by monitoring 
    sites may be geographically disjoint. In these cases, the spatial 
    representativeness of the monitoring site should be considered in 
    defining the SAZ boundaries. Until more monitoring sites are 
    established, the monitored air quality in areas outside of SAZ's is not 
    known. Although ideally all areas of a State should be included in a 
    SAZ, monitoring density may be insufficient to completely characterize 
    a specific MPA and more monitors would be needed. Nonetheless, in some 
    circumstances a SAZ can be represented by a single monitoring location 
    and this may be sufficient to properly characterize an MPA. The SAZ's 
    should generally include a minimum population of 250,000 and not more 
    than 2 million. Deviations from these criteria should be based on the 
    area's physical size and population density.
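        The homogeneity rule of thumb described above (individual annual 
    means within 20 percent of the spatial average) can be checked with a 
    short sketch; the names are hypothetical and the calculation is 
    illustrative only:

from statistics import mean

def saz_homogeneity(annual_means_ug_m3, tolerance_pct=20.0):
    """Return the SAZ spatial average and any site annual means that deviate
    from it by more than the stated tolerance (illustrative only)."""
    spatial_avg = mean(annual_means_ug_m3)
    outliers = [x for x in annual_means_ug_m3
                if abs(x - spatial_avg) > (tolerance_pct / 100.0) * spatial_avg]
    return spatial_avg, outliers

avg, outliers = saz_homogeneity([14.2, 15.8, 13.9, 19.5])
print(round(avg, 2), outliers)   # 15.85 [19.5]: one site deviates by more than 20 percent
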
        The Administrator recognizes that the designation of SAZ's within 
    Monitoring Planning Areas introduces a certain degree of complexity 
    into the monitoring network planning and data usage process. Comments 
    are therefore solicited on the use of a simpler approach to satisfy the 
    requirements for spatial averaging which are proposed in Part 50. In 
    particular, comments are solicited on an approach wherein there is only 
    one SAZ in each MPA, with the same boundaries as the MPA.
    3. Core Monitoring Sites
        To provide a minimal PM2.5 network in all high population 
    areas for protection of the annual and 24-hour PM NAAQS, each required 
    monitoring planning area must have at least two core monitors. The new 
    core monitoring locations would be an important part of the basic PM-
    fine SLAMS regulatory network. These sites are intended primarily to 
    reflect community-wide air pollution, which suggests monitoring 
    locations in residential areas or other areas where people spend a 
    substantial part of the day. In addition to the population-oriented 
    monitoring 
    sites, core monitors would also be established for background and 
    transport monitoring. States should work cooperatively in establishing 
    their State networks in order to maximize the value of monitoring data 
    to best understand the regional behavior of PM2.5.
        To permit interface with measurements of ozone precursors which are 
    also contributors to PM2.5, an additional core monitor collocated 
    at a PAMS site is required in those MSA's where both PAMS and 
    PM2.5 monitoring are required. The core monitor to be collocated 
    at a PAMS site is considered part of the MPA PM2.5 SLAMS network 
    and is not considered as a part of the PAMS network as described in 
    Section 4 of Appendix D.
        The new core population-oriented PM-fine network is conceptually 
    similar to the existing NAMS for other pollutants, but would allow for 
    some year-to-year changes in site location to ensure that typical 
    areas of high pollution and high population are always monitored. 
    Core sites will be the key sampling locations designated for initial 
    monitoring, and a subset would be selected for longer-term monitoring. 
    The latter would become the NAMS.
        The core sites will also produce the most complete data in the PM-
    fine network. Daily sampling would be required, except during low 
    pollution seasons or other periods as exempted by EPA. As such, a 
    subset of these sites should be considered as candidate locations for 
    adding state-of-the-art research monitoring devices whose data might 
    need to be considered in future reviews of the PM NAAQS. This will 
    ensure continuity and comparability of past, present and future PM data 
    bases.
        Finally, because the core sites would produce the most data, many 
    would be the most likely locations for determining violations of a 
    short-term NAAQS. The core locations would become critical for judging 
    future attainment in an area that has been determined to violate the 
    NAAQS, again putting emphasis on areas with the largest population 
    impact. Complete data at background and transport core sites will also 
    provide the needed data base to better understand the source-receptor 
    relationships and assist the implementation program.
        Each SAZ in a required MPA must have at least one core monitor; the 
    SAZ's in optional MPA's should have at least one core monitor; and it 
    is also suggested that SAZ's should have at least one core site for 
    every four SLAMS. Exemptions are allowed for required core stations in 
    MSA's with population greater than 500,000, if measured or modeled 
    concentrations of PM2.5 are less than 80 percent of the NAAQS for 
    PM2.5. Specific comments on the required and suggested number of 
    core monitoring locations are requested.
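        A rough sketch of the core-monitor minimums and the exemption 
    threshold described above follows (the names are hypothetical, the 
    one-core-site-per-four-SLAMS figure is treated as a suggestion, and the 
    governing requirements are those proposed for Appendix D):

import math

def minimum_core_monitors(num_sazs, num_slams, msa_population, peak_to_naaqs_ratio):
    """Rough minimum number of core PM2.5 monitors for a required MPA: at
    least two per required MPA, at least one per SAZ, a suggested one per
    four SLAMS, with an exemption when concentrations in an MSA over 500,000
    are below 80 percent of the NAAQS (illustrative only)."""
    if msa_population > 500_000 and peak_to_naaqs_ratio < 0.8:
        return 0   # exemption from required core stations
    suggested = math.ceil(num_slams / 4)
    return max(2, num_sazs, suggested)

print(minimum_core_monitors(num_sazs=3, num_slams=10,
                            msa_population=900_000, peak_to_naaqs_ratio=1.1))   # 3
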
    4. Examples of MPA's, SAZ's and NAAQS Eligible Monitors
        Some examples may better illustrate how the concepts of monitoring 
    planning areas and SAZ's may be realized in practice. The San Joaquin 
    Valley air basin in California could be an MPA. If emission sources are 
    distributed throughout this region, then the entire MPA could also be 
    the SAZ. For large counties, such as California's San Bernardino 
    County, which have non-uniform emission sources and population density, 
    there could be at least two SAZ's, such as an eastern SAZ and a western 
    SAZ which is part of the South Coast Air Basin. For an MSA, such as the 
    Philadelphia MSA, or MSA/MPA which crosses State boundaries, separate 
    SAZ's are suggested for each State portion, with substantial population 
    (e.g. greater than 250,000). For the Philadelphia PA-NJ MSA, this could 
    mean at least separate zones for the Philadelphia, PA and NJ portions. 
    In this manner, each State would be responsible for the networks in its 
    SAZ portion of the MPA. (Each of these SAZ's must have at least one 
    core monitor for a total of two for the MPA). Furthermore, for MSA's 
    and large geographic areas with concentrated source regions or 
    industrial areas, such as Philadelphia, separate SAZ's are suggested 
    for the residential/city center and the industrial area to better 
    characterize the gradients in PM2.5 concentrations. Downtown 
    street canyons may be appropriate SAZ's if they also include 
    residential areas, such as is the case in mid-town Manhattan, NY or if 
    they include commercial areas which have higher PM2.5 
    concentrations within the MPA and where significant numbers of people 
    work during the day. Comments are solicited on criteria for defining 
    SAZ's.
        A series of figures is presented to illustrate the concept of MPA's 
    and SAZ's. A hypothetical MPA representing an Eastern urban area is 
    given in Figure 2 of appendix D and illustrates how monitors can be 
    located in relation to population and areas of poor air quality. Figure 3 
    in Appendix D shows the same MPA as Figure 2, but includes three SAZ's: 
    an industrial zone, a downtown central business district, and 
    residential areas. Figure 4 in Appendix D shows the same MPA 
    illustrated in Figures 2 and 3. However, sites are denoted by whether 
    they are eligible for comparison with the 24-hour PM2.5 NAAQS or 
    both the 24-hour and the annual PM2.5 NAAQS. Figure 5 in Appendix 
    D shows potential SAZ's in a hypothetical Western State. Figure 6 in 
    Appendix D illustrates State coverage by SAZ's both within and outside 
    MPA's. More detailed guidance for network design for PM2.5 using 
    the concepts of core monitoring stations, MPA's, and SAZ's will be 
    available shortly in an EPA guidance document which is in preparation.
    
    [[Page 65793]]
    
    5. Substitute PM Samplers
        Appendix C (Section 2.2) to Part 58 describes conditions under 
    which TSP samplers may be used as substitutes for PM10 samplers 
    and when such TSP samplers must be replaced with PM10 samplers. 
    The proposed rule would include similar language allowing PM10 
    samplers to be used as substitutes for PM2.5 samplers and would 
    clarify that only the appropriate TSP or PM10 
    sites are required to be converted to PM10 and PM2.5, 
    respectively. This provision is intended to be used when PM 
    concentrations are low and substitute samplers can be used to satisfy 
    the minimum number of PM samplers needed for an adequate PM network. 
    This may be most appropriate when sufficient resources to purchase new 
    PM samplers may not exist and existing samplers can be temporarily used 
    to serve a new PM network.
    6. NAMS Network Design
        In Section 3.7, the PM10 design criteria for NAMS, namely 
    monitoring objectives, spatial representativeness, the category ``a'' 
    maximum concentration site, number of sites, etc., remain in effect. 
    In addition, the traditional concept of NAMS as long-term monitoring 
    stations to assess trends and to support national assessments and 
    decisions is reiterated. However, concerning PM2.5 network design, 
    a more flexible approach is proposed. First, the PM2.5 NAMS will 
    be concentrated in metropolitan areas in keeping with the risk 
    management approach of the proposed new PM2.5 NAAQS. Next, a 
    numeric range of prospective PM2.5 NAMS by EPA Region is 
    identified. These ranges are based on consideration of a number of factors set 
    by Regions to provide maximum flexibility for State and local agencies, 
    but should represent the range of conditions occurring in the Regions 
    taking into consideration such factors as the total number and types of 
    sources, ambient characteristics of particulate matter, regional 
    transport, geographic area, and affected population. The goals for 
    Regions vary from a low of 10 to 15 for Regions VII, VIII, and X to a 
    high of 35 to 50 for Regions IV and V, while the total ranges from 205 
    to 295, with an expected national target of 250. In particular, comments 
    are requested about the general approach of goals by Region and the 
    numbers estimated.
    
    P. Appendix E
    
        Today's revision to Appendix E consists of relatively minor changes 
    to Section 8 which currently provides the sampler siting criteria for 
    PM10. The modifications basically expand the siting requirements 
    to include PM2.5 as well as PM10 by selectively replacing the 
    term PM10 with PM which would be defined as applying to PM10 
    and PM2.5. This will permit existing PM10 sites to continue 
    to be used and, when appropriate, to serve as platforms for new 
    PM2.5 sampling.
    
    Q. Appendix F
    
        A new section has been added for the annual summary statistics for 
    PM2.5 in Appendix F. It should be noted that the current 
    procedures for reporting and certifying the air quality data may be 
    changed later, since the AIRS system is undergoing reengineering.
    
    R. Cost Estimates for New PM Networks
    
        The costs associated with the start-up of a PM2.5 network and 
    the phase-down of the existing PM10 sampling network depend on the 
    3-year phase-in of the new proposed requirements and the number of PM 
    monitors that the Administrator believes are necessary in a mature 
    network.
    
                                              Table 1. PM-2.5 Network Costs                                         
                                                 [Thousands of dollars]                                             
    ----------------------------------------------------------------------------------------------------------------
                                     Number of   Number of    Capital    Sampling     Filter     Special     Total  
                  Year                 sites    samplers \1\    cost       & QA    analysis \2\   studies      cost  
    ----------------------------------------------------------------------------------------------------------------
    1997...........................          0           0      $4,095  .........  ...........  .........     $4,095
    1998...........................        216         318       7,908     $4,382      $1,558      $2,600     16,478
    1999...........................        714       1,004       6,850     11,514         926       1,300     20,590
    2000...........................      1,200       1,490   .........     17,833         926       1,300    20,059 
    ----------------------------------------------------------------------------------------------------------------
    \1\ The PM-2.5 Network includes 160 collocated monitors for QA purposes, and 130 collocated monitors to avoid   
      weekend site visits.                                                                                          
    \2\ Three different types of filter analyses are anticipated (exceedances analyses, screening analyses, and     
      detailed analyses).                                                                                           
    
    
                    Table 2.--Cost for PM2.5 Filter Analyses                
    ------------------------------------------------------------------------
                                                                   Estimated
                       Type of filter analysis                      cost per
                                                                     sample 
    ------------------------------------------------------------------------
    Exceedance Analysis:                                                    
        High PM2.5 concentration events are optically analyzed             
         for particle size and composition utilizing electron              
         microscopy..............................................       $200
    Screening Analysis:                                                    
        X-Ray Fluorescence (XRF) for elemental composition                 
         (crustal material, sulfur, and heavy metals)............         50
        Thermo-optical analysis for elemental/organic/total                
         carbon..................................................         50
    Detailed Analysis:                                                     
        Inductively Coupled Argon Plasma (ICAP) Analysis for               
         elemental composition...................................        100
        Analysis for speciated organic composition...............        400
        Analysis for sulfate, aerosol acidity....................        100
    ------------------------------------------------------------------------
    
        Table 3 presents the change in PM10 network costs. The costs 
    are shown for a current network of 1,650 sites and the phase down to a 
    future projected network of 600 sites. PM10 costs have been 
    calculated for continued operation on a 1-in-6 day schedule, and 
    for the relocation or discontinuance of monitoring sites. Table 4 shows 
    the cost of PM monitoring according to sampling frequency and the type 
    of PM monitor. Details of this information can be found in the 
    ``Information Collection Request'' for these proposed requirements.
    
    [[Page 65794]]
    
    
    
                                              Table 3.--PM-10 Network Costs                                          
                                                 [Thousands of dollars]                                             
    ----------------------------------------------------------------------------------------------------------------
                                                                                    Capital                         
                                                           Number of   Number of    cost to   Operation &    Total  
                             Year                            sites    samplers\1\    remove   maintenance     cost  
                                                                                     sites        cost              
    ----------------------------------------------------------------------------------------------------------------
    1997.................................................      1,650       1,810   .........     $15,474     $15,473
    1998.................................................      1,374       1,544        $110      12,181      12,291
    1999.................................................        972       1,132         174       8,914       9,088
    2000.................................................        600         760         161       5,966       6,127
    ----------------------------------------------------------------------------------------------------------------
    \1\ The PM10 network includes 160 collocated monitors for QA purposes.                                          
    
    
                   Table 4.--Costs for Particulate Monitoring               
    ------------------------------------------------------------------------
        PM monitor and sampling        One-time capital   Annual operation &
               frequency                     cost          maintenance cost 
    ------------------------------------------------------------------------
    PM-10 1-in-6 day sampling        $14,500............  $8,700.           
     schedule.                                                              
    PM-2.5 1-in-6 day sampling       $9,600 to $16,900..  $11,200.          
     schedule.                                                              
    PM-2.5 every day sampling......  $14,600 to $21,900.  $18,900.          
    Nephelometer (continuous)......  $20,100 to $26,300.  $16,700 to        
                                                           $17,500.         
    ------------------------------------------------------------------------
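
    Table 4 can also be read as a first-year cost per monitor by adding 
the one-time capital cost to one year of operation and maintenance. The 
sketch below (illustrative only) carries the low and high ends of each 
range in the table through that sum.

    # Illustrative sketch: first-year cost per monitor type from Table 4,
    # i.e., one-time capital cost plus one year of O&M (dollars). Ranges
    # from the table are carried through as (low, high) pairs.
    table_4 = {
        # monitor: ((capital low, capital high), (O&M low, O&M high))
        "PM-10, 1-in-6 day":         ((14_500, 14_500), (8_700, 8_700)),
        "PM-2.5, 1-in-6 day":        ((9_600, 16_900), (11_200, 11_200)),
        "PM-2.5, every day":         ((14_600, 21_900), (18_900, 18_900)),
        "Nephelometer (continuous)": ((20_100, 26_300), (16_700, 17_500)),
    }
    for monitor, ((cap_lo, cap_hi), (om_lo, om_hi)) in table_4.items():
        print(f"{monitor}: first-year cost ${cap_lo + om_lo:,} to ${cap_hi + om_hi:,}")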
    
    S. Reference
    
        1. Information Collection Request, 40 CFR 58 Ambient Air Quality 
    Surveillance, OMB #2060-0084, EPA ICR #0940.14, U.S. Environmental 
    Protection Agency, Office of Air Quality Planning and Standards, 
    Research Triangle Park, NC 27711 (October 23, 1996).
    
    V. Administrative Requirements
    
    A. Regulatory Impact Analysis
    
        Under Executive Order 12866 (58 FR 51735, October 4, 1993), the 
    Agency must determine whether the regulatory action is ``significant'' 
    and therefore subject to Office of Management and Budget (OMB) review 
    and to the requirements of the Executive Order. The Order defines 
    ``significant regulatory action'' as one that is likely to result in a 
    rule that may:
        (1) Have an annual effect on the economy of $100 million or more or 
    adversely affect in a material way the economy, a sector of the 
    economy, productivity, competition, jobs, the environment, public 
health or safety, or State, local, or tribal governments or communities;
        (2) Create a serious inconsistency or otherwise interfere with an 
    action taken or planned by another Agency;
        (3) Materially alter the budgetary impact of entitlements, grants, 
user fees, or loan programs or the rights and obligations of recipients 
    thereof; or
        (4) Raise novel legal or policy issues arising out of legal 
    mandates, the President's priorities, or the principles set forth in 
    the Executive Order.
    It has been determined that this rule is not a ``significant 
regulatory action'' under the terms of Executive Order 12866 and is 
therefore not subject to formal OMB review. However, this rule is being 
reviewed by OMB under Reporting and Recordkeeping Requirements (see 
    below).
    
    B. Paperwork Reduction Act
    
        The information collection requirements contained in this proposed 
    rule have been submitted for approval to OMB under the Paperwork 
    Reduction Act, 44 U.S.C. 3501 et seq. An Information Collection Request 
    document has been prepared by the EPA (ICR No. 0940.14) and a copy may 
    be obtained from Sandy Farmer, Information Policy Branch, EPA, 401 M 
    Street SW, Mail Code 2137, Washington, DC 20460; or by calling (202) 
    260-2740.
    1. Need and Use of the Collection
        The main use for the collection of the data is to support the PM 
    NAAQS revisions. The various parameters reported as part of this ICR 
    are necessary to ensure that the information and data collected by 
    State and local agencies to assess the nation's air quality are 
    defensible, of known quality, and meet the EPA's data quality goals of 
    completeness, precision, and accuracy.
    The need and authority for this information collection are contained 
    in Section 110(a)(2)(C) of the Act, which requires ambient air quality 
    monitoring for purposes of the SIP and reporting of the data to EPA, 
    and Section 319, which requires the reporting of a daily air pollution 
    index. The legal authority for this requirement is the Ambient Air 
    Quality Surveillance Regulations, 40 CFR 58.20, 58.21, 58.25, 58.26, 
    58.28, 58.30, 58.31, 58.35, and 58.36.
        The EPA's Office of Air Quality Planning and Standards uses ambient 
    air monitoring data for a wide variety of purposes, including making 
    NAAQS attainment/nonattainment decisions; determining the effectiveness 
    of air pollution control programs; evaluating the effects of air 
    pollution levels on public health; tracking the progress of SIP's; 
    providing dispersion modeling support; developing responsible, cost-
    effective control strategies; reconciling emission inventories; and 
    developing air quality trends. The collection of PM2.5 data is 
    necessary to support the PM2.5 NAAQS, and the information 
    collected will have practical utility as a data analysis tool.
        The State and local agencies with responsibility for reporting 
    ambient air quality data and information as requested by these proposed 
    regulations will submit these data electronically to the U.S. EPA's 
    Aerometric Information Retrieval System, Air Quality Subsystem (AIRS-
    AQS). Quality assurance/quality control records and monitoring network 
    documentation are also maintained by each State/local agency, in AIRS-
    AQS electronic format where possible.
    2. Reporting and Recordkeeping Burden
        The total annual collection and reporting burden associated with 
    this proposal is estimated to be 490,526 hours. Of this total, 484,545 
hours are estimated to be for data reporting, or an average of 3,727 
    hours for the estimated 130 respondents. The remainder of 5,981 hours 
    for recordkeeping burden averages 46 hours for the estimated 130 
respondents. The capital and O&M costs associated with this proposal are 
    estimated to be $19,714,453. These estimates include time for reviewing 
    instructions, searching existing data sources, gathering and 
    maintaining the data needed, and completing and reviewing the 
    collection of information.
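
    The per-respondent averages quoted above follow directly from the 
burden totals. A minimal sketch of that arithmetic (illustrative only):

    # Illustrative sketch: reproducing the per-respondent averages from the
    # burden totals quoted above.
    total_burden_hours = 490_526
    reporting_hours = 484_545
    recordkeeping_hours = total_burden_hours - reporting_hours   # 5,981 hours
    respondents = 130

    print(f"Average reporting hours per respondent: {reporting_hours / respondents:,.0f}")          # about 3,727
    print(f"Average recordkeeping hours per respondent: {recordkeeping_hours / respondents:,.0f}")  # about 46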
    The NAMS and SLAMS air quality data, as well as the associated 
precision and accuracy data, are submitted to EPA according to the 
reporting schedule defined in 40 CFR part 58. 
    This regulation currently requires that State and local air quality
    
    [[Page 65795]]
    
    management agencies report their data within 90 days after the end of 
    the quarter during which the data were collected. The annual SLAMS 
    report is submitted by July 1 of each year for data collected from 
    January 1 through December 31 of the previous year in accordance with 
    40 CFR 58.26. This certification also implies that all SPM data to be 
    used for regulatory purposes by the affected State or local air quality 
    management agency have been submitted by July 1.
    3. Burden
        Burden means the total time, effort, or financial resources 
    expended by persons to generate, maintain, retain, or disclose or 
    provide information to or for a Federal agency. This includes the time 
    needed to review instructions; develop, acquire, install, and utilize 
    technology and systems for the purpose of collecting, validating, and 
    verifying information, processing and maintaining information, and 
    disclosing and providing information; adjust the existing ways to 
    comply with any previously applicable instructions and requirements; 
    train personnel to be able to respond to a collection of information; 
    search data sources; complete and review the collection of information; 
    and transmit or otherwise disclose the information.
    An agency may not conduct or sponsor, and a person is not required 
to respond to, a collection of information unless it displays a 
    currently valid OMB control number. The OMB control numbers for EPA's 
    regulations are listed in 40 CFR Part 9 and 48 CFR Chapter 15.
        Comments are requested on the Agency's need for this information, 
    the accuracy of the provided burden estimates, and any suggested 
    methods for minimizing respondent burden, including through the use of 
    automated collection techniques. Send comments on the ICR to the 
    Director, OPPE Regulatory Information Division; U.S. Environmental 
    Protection Agency (2137); 401 M St., SW.; Washington, DC 20460; and to 
    the Office of Information and Regulatory Affairs, Office of Management 
    and Budget, 725 17th St., NW., Washington, DC 20503, marked 
    ``Attention: Desk Officer for EPA.'' Include the ICR number in any 
    correspondence. Since OMB is required to make a decision concerning the 
    ICR between 30 and 60 days after December 13, 1996, a comment to OMB is 
    best assured of having its full effect if OMB receives it by January 
    13, 1997. The final rule will respond to any OMB or public comments on 
    the information collection requirements contained in this proposal.
    
    C. Impact on Small Entities
    
        Pursuant to section 605(b) of the Regulatory Flexibility Act, 5 
    U.S.C. 605(b), the Administrator certifies that this rule will not have 
    a significant economic impact on a substantial number of small 
    entities. This rulemaking package does not impose any additional 
requirements on small entities because it applies to governments whose 
jurisdictions have populations of more than 200,000. Under the 
Regulatory Flexibility Act, governments are small entities only if 
their jurisdictions have populations of less than 50,000. In addition, 
this rule 
    imposes no enforceable duties on small businesses.
    
    D. Unfunded Mandates Reform Act of 1995
    
        Under sections 202, 203 and 205 of the Unfunded Mandates Reform Act 
    of 1995 (``Unfunded Mandates Act''), signed into law on March 22, 1995, 
    the EPA must undertake various actions in association with proposed or 
    final rules that include a Federal mandate that may result in estimated 
    costs of $100 million or more to the private sector, or to State or 
    local governments in the aggregate.
        The EPA has determined that this rule does not contain a Federal 
    mandate that may result in expenditures of $100 million or more for 
State and local governments, in the aggregate, or the private sector 
    in any one year. Our economic analysis indicates that the total 
    implementation cost will be approximately $88,728,000 in 1996 dollars 
    for the 3 years to phase in the network, or an average of $29,576,000 
    for the 3-year implementation. The table below shows how this 3-year 
    average was derived for the various cost elements of monitoring. While 
    this table represents the 3-year period 1998-2000, the total cost for 
PM2.5 monitoring includes the initial capital costs anticipated in 
    1997. In addition, this rule imposes no enforceable duties on small 
    businesses.
    
                          Cost Based on 3-Year Average                      
                             [Thousands of dollars]                         
    ------------------------------------------------------------------------
              Cost/Element                  PM10      PM2.5       Total 
    ------------------------------------------------------------------------
    Network design.........................         $0       $571       $571
    Site installation......................        311      5,013      5,324
    Sampling & analysis....................      2,647      6,758      9,405
    Maintenance............................      1,233      1,928      3,161
    Data management........................      1,245      1,574      2,819
    Quality assurance......................      1,745      3,373      5,118
    Supervision............................      1,988      1,189      3,177
                                            --------------------------------
        Summary \1\........................      9,169     20,407    29,576 
    ------------------------------------------------------------------------
    \1\ Totals are rounded.                                                 
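
    The table can be reconciled with the totals quoted in the text 
($88,728,000 over the 3-year implementation, or an average of about 
$29,576,000 per year) by summing the cost elements; small differences 
reflect the rounding noted in the table. A minimal sketch, with figures 
in thousands of 1996 dollars:

    # Illustrative sketch: summing the cost elements of the table above
    # (thousands of 1996 dollars) and relating them to the totals quoted
    # in the text. Table totals are rounded.
    cost_elements = {
        # element: (PM10, PM2.5)
        "Network design":      (0, 571),
        "Site installation":   (311, 5_013),
        "Sampling & analysis": (2_647, 6_758),
        "Maintenance":         (1_233, 1_928),
        "Data management":     (1_245, 1_574),
        "Quality assurance":   (1_745, 3_373),
        "Supervision":         (1_988, 1_189),
    }
    pm10_total = sum(v[0] for v in cost_elements.values())    # 9,169
    pm25_total = sum(v[1] for v in cost_elements.values())    # 20,406 (table shows 20,407 after rounding)
    annual_average = pm10_total + pm25_total                  # about 29,575
    print(f"Annual average: about ${annual_average:,}K; 3-year total: about ${3 * annual_average:,}K")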
    
    List of Subjects
    
    40 CFR Part 53
    
        Environmental protection, Administrative practice and procedure, 
    Air pollution control, Reporting and recordkeeping requirements.
    
    40 CFR Part 58
    
        Air pollution control, Intergovernmental relations, Reporting and 
    recordkeeping requirements.
    
        Dated: November 27, 1996.
    Carol M. Browner,
    Administrator.
    
        For the reasons set forth in the preamble, title 40, chapter I, 
    part 53 and part 58 of the Code of Federal Regulations are proposed to 
    be amended as follows:
    
    [[Page 65796]]
    
    PART 53--[AMENDED]
    
        1. The authority citation for part 53 continues to read as follows:
    
        Authority: Sec. 301(a) of the Clean Air Act (42 U.S.C. Sec. 
    1857g(a)) as amended by sec. 15(c)(2) of Pub. L. 91-604, 84 Stat. 
    1713, unless otherwise noted.
    
        2. Subpart A is revised to read as follows:
    
    Subpart A--General Provisions
    
    Sec.
    53.1 Definitions.
    53.2 General requirements for a reference method determination.
    53.3 General requirements for an equivalent method determination.
    53.4 Applications for reference or equivalent method determinations.
    53.5 Processing of applications.
    53.6 Right to witness conduct of tests.
    53.7 Testing of methods at the initiative of the Administrator.
    53.8 Designation of reference and equivalent methods.
    53.9 Conditions of designation.
    53.10 Appeal from rejection of application.
    53.11 Cancellation of reference or equivalent method designation.
    53.12 Request for hearing on cancellation.
    53.13 Hearings.
    53.14 Modification of a reference or equivalent method.
    53.15 Trade secrets and confidential or privileged information.
    53.16 Supersession of reference methods.
    
    Tables to Subpart A of Part 53
    
    Table A-1--Summary of Applicable Requirements for Reference & 
    Equivalent Methods for Air Monitoring of Criteria Pollutants
    
    Appendix A to Subpart A of Part 53--References
    
    Subpart A--General Provisions
    
    
    Sec. 53.1  Definitions.
    
        (a) Terms used but not defined in this part shall have the meaning 
    given them by the Act.
        (b) Act means the Clean Air Act (42 U.S.C. 1857-1857l), as amended.
        (c) Agency means the Environmental Protection Agency.
        (d) Administrator means the Administrator of the Environmental 
    Protection Agency or the Administrator's authorized representative.
        (e) Reference method means a method of sampling and analyzing the 
    ambient air for an air pollutant that is specified as a reference 
    method in an appendix to part 50 of this chapter, or a method that has 
    been designated as a reference method in accordance with this part; it 
    does not include a method for which a reference method designation has 
    been canceled in accordance with Sec. 53.11 or Sec. 53.16.
        (f) Equivalent method means a method of sampling and analyzing the 
    ambient air for an air pollutant that has been designated as an 
    equivalent method in accordance with this part; it does not include a 
    method for which an equivalent method designation has been canceled in 
    accordance with Sec. 53.11 or Sec. 53.16.
        (g) Candidate method means a method of sampling and analyzing the 
    ambient air for an air pollutant for which an application for a 
    reference method determination or an equivalent method determination is 
    submitted in accordance with Sec. 53.4, or a method tested at the 
    initiative of the Administrator in accordance with Sec. 53.7.
        (h) Manual method means a method for measuring concentrations of an 
    ambient air pollutant in which sample collection, analysis, or 
    measurement, or some combination thereof, is performed manually. A 
    method for PM10 or PM2.5 which utilizes a sampler that 
    requires manual preparation, loading, and weighing of filter samples is 
    considered a manual method even though the sampler may be capable of 
    automatically collecting a series of sequential samples.
        (i) Automated method or analyzer means a method for measuring 
    concentrations of an ambient air pollutant in which sample collection 
    (if necessary), analysis, and measurement are performed automatically 
    by an instrument.
        (j) Test analyzer means an analyzer subjected to testing as part of 
    a candidate method in accordance with subparts B, C, D, E, or F of this 
    part, as applicable.
        (k) Applicant means a person or entity who submits an application 
    for a reference or equivalent method determination under Sec. 53.4, or 
    a person or entity who assumes the rights and obligations of an 
    applicant under Sec. 53.7. Applicant may include a manufacturer, 
distributor, supplier, or vendor.
        (l) Ultimate purchaser means the first person who purchases a 
    reference method or an equivalent method for purposes other than 
    resale.
        (m) PM10 sampler or PM2.5 sampler means a device, 
    associated with a manual method for measuring PM10 or PM2.5 
    (respectively), designed to collect PM10 or PM2.5 
    (respectively) from an ambient air sample, but lacking the ability to 
    automatically analyze or measure the collected sample to determine the 
    mass concentration of PM10 or PM2.5 in the sampled air.
        (n) Test sampler means a PM10 sampler or a PM2.5 sampler 
    subjected to testing as part of a candidate method in accordance with 
    subparts C, D, E or F of this part.
    (o) Collocated describes two or more air samplers, analyzers, or 
other instruments that sample the ambient air and are operated 
simultaneously while located side by side, separated by a distance that 
    is large enough to preclude the air sampled by any of the devices from 
    being affected by any of the other devices, but small enough so that 
    all devices obtain identical or uniform ambient air samples that are 
    equally representative of the general area in which the group of 
    devices is located.
        (p) Sequential samples for particulate matter samplers means two or 
    more particulate matter samples for sequential (but not necessarily 
    contiguous) time periods that are collected automatically by the same 
    sampler without the need for intervening operator service.
        (q) Class I equivalent method means an equivalent method for 
    PM2.5 which is based on a sampler that is very similar to the 
    sampler specified for reference methods in Appendix L of part 50 of 
    this chapter, with only minor deviations or modifications, as 
    determined by the EPA. A common example of a Class I PM2.5 sampler 
    is a reference method sampler that has been modified to provide 
    automatic collection of sequential samples, as defined in paragraph (p) 
    of this section.
        (r) Class II equivalent method means an equivalent method for 
    PM2.5 that utilizes a PM2.5 sampler in which an integrated 
    PM2.5 sample is obtained from the atmosphere by filtration and 
    subjected to a subsequent filter equilibration process followed by a 
    gravimetric mass determination, but which is not a Class I equivalent 
    method because of substantial deviations from the design specifications 
    of the sampler specified for reference methods in Appendix L of part 50 
    of this chapter, as determined by the EPA.
        (s) Class III equivalent method means an equivalent method for 
    PM2.5 that has been determined by the EPA not to be a Class I or 
    Class II equivalent method. This fourth type of PM2.5 method 
    includes alternative equivalent method samplers and continuous 
    analyzers, based on designs and measurement principles different from 
    those specified for reference methods (e.g., a means for estimating 
    aerosol mass concentration other than by conventional integrated 
    filtration followed by equilibration and gravimetric analysis). These 
    samplers (or monitors) are those deemed to be substantially different 
    from reference method samplers and may use components and methods other 
    than
    
    [[Page 65797]]
    
    those specified for reference method samplers. Class III candidate 
    samplers or analyzers require full equivalency testing and must meet 
all requirements specified in subpart F of this part.
        (t) An ISO-registered facility shall be defined as a manufacturing 
    facility that is either:
        (1) An International Organization for Standardization (ISO) 9001-
    registered manufacturing facility, with registration maintained 
    continuously; or
        (2) A facility that can be demonstrated, on the basis of 
    information submitted to the EPA, to be operated according to an EPA-
    approved and periodically audited quality system which meets, to the 
    extent appropriate, the same general requirements as an ISO registered 
    facility for the design and manufacture of designated reference and 
    equivalent method samplers and monitors.
    (u) An ISO-certified auditor shall be defined as either an auditor 
certified by an ISO-accredited registrar or an auditor who, based on 
    information submitted to the EPA, meets the same general requirements 
    as provided for ISO-certified auditors.
    
    
    Sec. 53.2  General requirements for a reference method determination.
    
        The following general requirements for a reference method 
    determination are summarized in Table A-1 of this subpart.
        (a) Manual methods. (1) For measuring SO2 and lead, Appendices 
    A and G of part 50 of this chapter specify unique manual reference 
    methods for those pollutants. Except as provided in Sec. 53.16, other 
    manual methods for SO2 and lead will not be considered for 
    reference method determinations under this part.
        (2) A reference method for measuring PM10 must be a manual 
    method that meets all requirements specified in Appendix J of part 50 
    of this chapter and must include a PM10 sampler that has been 
    shown in accordance with this part to meet all requirements specified 
    in subpart D of this part.
        (3) A reference method for measuring PM2.5 must be a manual 
    method that meets the requirements specified in Appendix L of part 50 
    of this chapter and must include a PM2.5 sampler that has been 
    shown in accordance with this part to meet the applicable requirements 
    specified in subpart E of this part. Further, reference method samplers 
    must be manufactured in an ISO 9001-registered facility as defined in 
Sec. 53.1(t) and as set forth in Sec. 53.51 (subpart E of this part), 
and the Product Manufacturing Checklist set forth in subpart E 
    of this part must be completed by an ISO 9001-certified auditor, as 
    defined in Sec. 53.1(u), and submitted to the EPA annually to retain a 
    PM2.5 reference method designation. In addition, all designated 
    reference methods for PM2.5 must meet requirements for network 
    operating performance determined annually as set forth in section 6 of 
    Appendix A of part 58 of this chapter.
    (b) Automated methods. An automated reference method for 
    measuring CO, O3, and NO2 must utilize the measurement 
    principle and calibration procedure specified in the appropriate 
    appendix to part 50 of this chapter and must have been shown in 
    accordance with this part to meet the requirements specified in subpart 
    B of this part.
    
    
    Sec. 53.3  General requirements for an equivalent method determination.
    
        (a) Manual methods. A manual equivalent method must have been shown 
    in accordance with this part to satisfy the applicable requirements 
    specified in subpart C of this part. In addition, PM10 or 
    PM2.5 samplers associated with manual equivalent methods for 
    PM10 or PM2.5 must have been shown in accordance with this 
    part to satisfy the following additional requirements:
        (1) A PM10 sampler associated with a manual method for 
    PM10 must satisfy the requirements of subpart D of this part.
        (2) A PM2.5 Class I equivalent method sampler must satisfy all 
    requirements of subparts C and E of this part, which include 
    appropriate demonstration that each and every deviation or modification 
    from the reference method sampler specifications does not significantly 
    alter the performance of the sampler.
        (3) A PM2.5 Class II equivalent method sampler must satisfy 
the requirements of subparts C, E, and F of this part.
        (4) Requirements for PM2.5 Class III equivalent method 
samplers are not provided in this part because of the wide range of 
non-filter-based measurement technologies that could be applied and the 
    likelihood that these requirements will have to be specifically adapted 
    for each such type of technology. Specific requirements will be 
    developed as needed.
        (5) All designated equivalent methods for PM2.5 must be 
    manufactured in an ISO 9001-registered facility, as defined in 
    Sec. 53.1(t) and as set forth in Sec. 53.51 (subpart E) of this part, 
and the Product Manufacturing Checklist set forth in subpart E of this 
    part must be completed by an ISO 9001-certified auditor, as defined in 
    Sec. 53.1(u), and submitted to the EPA annually to retain a PM2.5 
    equivalent method designation.
        (6) All designated equivalent methods for PM2.5 must also meet 
    annual requirements for network operating performance determined as set 
    forth in section 6 of Appendix A of part 58 of this chapter.
        (b) Automated methods. (1) Automated equivalent methods for 
    pollutants other than PM2.5 or PM10 must have been shown in 
    accordance with this part to satisfy the requirements specified in 
    subparts B and C of this part.
        (2) Automated equivalent methods for PM10 must have been shown 
    in accordance with this part to satisfy the requirements of subparts C 
    and D of this part.
        (3) Requirements for PM2.5 Class III automated equivalent 
    methods for PM2.5 are not provided in this part because of the 
    wide range of non-filter-based measurement technologies that could be 
    applied and the likelihood that these requirements will have to be 
    specifically adapted for each such type of technology. Specific 
    requirements will be developed as needed.
        (4) All designated equivalent methods for PM2.5 must be 
manufactured in an ISO 9001-registered facility, as set forth in 
subpart E of this part, and the Product Manufacturing Checklist set 
forth in subpart E of this part must be completed by an ISO 9001-
    certified auditor and submitted to the EPA annually to retain a 
    PM2.5 equivalent method designation.
        (5) All designated equivalent methods for PM2.5 must also meet 
    annual requirements for network operating performance determined as set 
    forth in section 6 of Appendix A of part 58 of this chapter.
    
    
    Sec. 53.4  Applications for reference or equivalent method 
    determinations.
    
        (a) Applications for reference or equivalent method determinations 
    shall be submitted in duplicate to: Director, National Exposure 
    Research Laboratory, Department E (MD-77B), U.S. Environmental 
    Protection Agency, Research Triangle Park, North Carolina 27711.
        (b) Each application shall be signed by an authorized 
    representative of the applicant, shall be marked in accordance with 
    Sec. 53.15 (if applicable), and shall contain the following:
        (1) A clear identification of the candidate method, which will 
    distinguish it from all other methods such that the method may be 
    referred to unambiguously. This identification must consist of a unique 
    series of descriptors such as title, identification
    
    [[Page 65798]]
    
    number, analyte, measurement principle, manufacturer, brand, model, 
    etc., as necessary to distinguish the method from all other methods or 
    method variations, both within and outside the applicant's 
    organization.
        (2) A detailed description of the candidate method, including but 
    not limited to the following: The measurement principle, manufacturer, 
    name, model number and other forms of identification, a list of the 
    significant components, schematic diagrams, design drawings, and a 
    detailed description of the apparatus and measurement procedures. 
    Drawings and descriptions pertaining to candidate methods or samplers 
    for PM2.5 must meet all applicable requirements in Reference 1 of 
    Appendix A to this subpart, using appropriate graphical, nomenclature, 
    and mathematical conventions such as those specified in References 3 
    and 4 of Appendix A to this subpart.
        (3) A copy of a comprehensive operation or instruction manual 
    providing a complete and detailed description of the operational and 
    calibration procedures prescribed for field use of the candidate method 
and all instruments utilized as part of that method (see Sec. 53.9(a)).
        (i) As a minimum this manual shall include:
        (A) Description of the method and associated instruments;
        (B) Explanation of all indicators, information displays, and 
    controls;
        (C) Complete setup and installation instructions, including any 
    additional materials or supplies required;
        (D) Details of all initial or startup checks or acceptance tests 
    and any auxiliary equipment required;
        (E) Complete operational instructions;
        (F) Calibration procedures and required calibration equipment and 
    standards;
        (G) Instructions for verification of correct or proper operation;
        (H) Trouble-shooting guidance and suggested corrective actions for 
    abnormal operation;
        (I) Required or recommended routine, periodic, and preventative 
maintenance and maintenance schedules;
        (J) Any calculations required to derive final concentration 
    measurements; and
        (K) Appropriate references to 40 CFR part 50, Appendix L, Reference 
    6, and any other pertinent guidelines.
        (ii) The manual shall also include adequate warning of potential 
    safety hazards that may result from normal use and/or malfunction of 
    the method and a description of necessary safety precautions. [See 
    Sec. 53.9(b)] However, the previous requirement shall not be 
    interpreted to constitute or imply any warranty of safety of the method 
    by the EPA. For samplers and automated methods, the manual shall 
    include a clear description of all procedures pertaining to 
    installation, operation, preventative maintenance, and troubleshooting 
    and shall also include parts identification diagrams. The manual may be 
    used to satisfy the requirements of paragraphs (b) (1) and (2) of this 
    section to the extent that it includes information necessary to meet 
    those requirements.
        (4) A statement that the candidate method has been tested in 
    accordance with the procedures described in subparts B, C, D, E, and/or 
    F of this part, as applicable.
        (5) Test data, records, calculations, and test results as specified 
    in subparts B, C, D, E, and/or F of this part, as applicable. Data must 
    be sufficiently detailed to meet appropriate principles described in 
paragraphs 4 through 6 of Reference 2, Part B, sections 3.3.1 
    (paragraph 1) and 3.5.1 (paragraphs 2 and 3) and in paragraphs 1 
    through 3 of Reference 5 (section 4.8, Records) of appendix A of this 
    subpart. Salient requirements from these references include the 
    following:
        (i) The applicant shall maintain and include records of all 
    relevant measuring equipment, including the make, type, and serial 
    number or other identification, and most recent calibration with 
    identification of the measurement standard or standards used and their 
    NIST traceability. These records shall demonstrate the measurement 
    capability of each item of measuring equipment used for the application 
    and include a description and justification (if needed) of the 
    measurement setup or configuration in which it was used for the tests. 
    The calibration results shall be recorded and identified in sufficient 
    detail so that the traceability of all measurements can be determined 
    and any measurement could be reproduced under conditions close to the 
    original conditions, if necessary, to resolve any anomalies.
        (ii) Test data shall be collected according to the standards of 
    good practice and by qualified personnel. Test anomalies or 
    irregularities shall be documented and explained or justified. The 
    impact and significance of the deviation on test results and 
    conclusions shall be determined. Data collected shall correspond 
    directly to the specified test requirement and be labeled and 
    identified clearly so that results can be verified and evaluated 
    against the test requirement. Calculations or data manipulations must 
    be explained in detail so that they can be verified.
        (6) A statement that the method, analyzer, or sampler tested in 
    accordance with this part is representative of the candidate method 
    described in the application.
        (c) For candidate automated methods and candidate manual methods 
    for PM10 and PM2.5, the application shall also contain the 
    following:
        (1) A detailed description of the quality system that will be 
    utilized, if the candidate method is designated as a reference or 
    equivalent method, to ensure that all analyzers or samplers offered for 
    sale under that designation will have essentially the same performance 
    characteristics as the analyzer(s) or samplers tested in accordance 
    with this part. In addition, the quality system requirements for 
    candidate methods for PM2.5 must be described in sufficient 
    detail, based on the elements described in section 4 of Reference 1 
    (Quality System Requirements) of appendix A of this subpart. Further 
    clarification is provided in the following sections of Reference 2: 
    Part A (Management Systems), sections 2.2 (Quality System and 
    Description), 2.3 (Personnel Qualification and Training), 2.4 
    (Procurement of Items and Services), 2.5 (Documents and Records), and 
    2.7 (Planning); Part B (Collection and Evaluation of Environmental 
    Data), sections 3.1 (Planning and Scoping), 3.2 (Design of Data 
    Collection Operations), and 3.5 (Assessment and Verification of Data 
    Usability); and Part C (Operation of Environmental Technology), 
    sections 4.1 (Planning), 4.2 (Design of Systems), and 4.4 (Operation of 
Systems) of appendix A of this subpart.
        (2) A description of the durability characteristics of such 
    analyzers or samplers [see Sec. 53.9(c)]. For methods for PM2.5, 
    the warranty program must ensure that the required specifications (see 
    Table A-1 of this subpart) will be met throughout the warranty period 
    and that the applicant accepts responsibility and liability for 
    ensuring this conformance, or resolving any nonconformities, including 
    all necessary components of the system, regardless of the original 
    manufacturer. The warranty program must be described in sufficient 
    detail to meet appropriate provisions of the ANSI/ASQC and ISO 9001 
    standards (References 1 and 2 in appendix A of this subpart) for 
    controlling conformance and resolving nonconformance, particularly 
    sections 4.12, 4.13, and 4.14 of Reference 1 in appendix A of this 
    subpart.
    
    [[Page 65799]]
    
        (i) Section 4.12 in appendix A of this subpart requires the 
    manufacturer to establish and maintain a system of procedures for 
    identifying and maintaining the identification of inspection and test 
    status throughout all phases of manufacturing to ensure that only 
    instruments that have passed the required inspections and tests are 
    released for sale.
        (ii) Section 4.13 in appendix A of this subpart requires documented 
    procedures for control of nonconforming product, including review and 
    acceptable alternatives for disposition; section 4.14 requires 
    documented procedures for implementing corrective (4.14.2) and 
    preventive (4.14.3) action to eliminate the causes of actual or 
potential nonconformities. In particular, section 4.14.3 requires that 
information such as service reports and customer complaints be used to 
identify and eliminate potential causes of nonconformities.
        (d) For candidate reference or equivalent methods for PM2.5, 
    the applicant shall provide to EPA for test purposes one sampler or 
    analyzer that is representative of the sampler or analyzer associated 
    with the candidate method. The sampler or analyzer shall be shipped FOB 
destination to Department E (MD-77B), U.S. EPA, 79 T.W. Alexander 
    Drive, Research Triangle Park, NC 27709, scheduled to arrive concurrent 
    with or within 30 days of the arrival of the other application 
    materials. This analyzer or sampler may be subjected to various tests 
    that the EPA determines to be necessary or appropriate under 
    Sec. 53.5(e), and such tests may include special tests not otherwise 
    described in this part. If the instrument submitted under this 
    paragraph malfunctions, becomes inoperative, or fails to perform as 
    represented in the application before the necessary EPA testing is 
    completed, the applicant shall be afforded an opportunity to repair or 
    replace the device at no cost to the EPA. Upon completion of the EPA 
    testing, the analyzer or sampler submitted under this paragraph shall 
    be repacked by the EPA for return shipment to the applicant, using the 
    same packing materials used for shipping the instrument to the EPA 
    unless alternative packing is provided by the applicant. Arrangements 
    for, and the cost of, return shipment shall be the responsibility of 
    the applicant. The EPA does not warrant or assume any liability for the 
    condition of the analyzer or sampler upon return to the applicant.
    
    
    Sec. 53.5  Processing of applications.
    
        After receiving an application for a reference or equivalent method 
    determination, the Administrator will publish notice of the application 
    in the Federal Register and, within 120 calendar days after receipt of 
    the application, take one or more of the following actions:
        (a) Send notice to the applicant, in accordance with Sec. 53.8, 
    that the candidate method has been determined to be a reference or 
    equivalent method;
        (b) Send notice to the applicant that the application has been 
    rejected, including a statement of reasons for rejection;
        (c) Send notice to the applicant that additional information must 
    be submitted before a determination can be made and specify the 
    additional information that is needed (in such cases, the 120-day 
    period shall commence upon receipt of the additional information);
        (d) Send notice to the applicant that additional test data must be 
    submitted and specify what tests are necessary and how they shall be 
    interpreted (in such cases, the 120-day period shall commence upon 
    receipt of the additional test data);
        (e) Send notice to the applicant that the application has been 
    found to be substantially deficient or incomplete and cannot be 
    processed until additional information is submitted to complete the 
    application and specify the general areas of substantial deficiency; or
        (f) Send notice to the applicant that additional tests will be 
    conducted by the Administrator, specifying the nature of and reasons 
    for the additional tests and the estimated time required (in such 
    cases, the 120-day period shall commence one calendar day after the 
    additional tests have been completed).
    
    
    Sec. 53.6  Right to witness conduct of tests.
    
        (a) Submission of an application for a reference or equivalent 
    method determination shall constitute consent for the Administrator or 
    the Administrator's authorized representative, upon presentation of 
    appropriate credentials, to witness or observe any tests required by 
    this part in connection with the application or in connection with any 
    modification or intended modification of the method by the applicant.
        (b) The applicant shall have the right to witness or observe any 
    test conducted by the Administrator in connection with the application 
    or in connection with any modification or intended modification of the 
    method by the applicant.
        (c) Any tests by either party that are to be witnessed or observed 
    by the other party shall be conducted at a time and place mutually 
    agreeable to both parties.
    
    
    Sec. 53.7  Testing of methods at the initiative of the Administrator.
    
        (a) In the absence of an application for a reference or equivalent 
    method determination, the Administrator may conduct the tests required 
    by this part for such a determination, may compile such other 
    information as may be necessary in the judgment of the Administrator to 
    make such a determination, and on the basis of the tests and 
    information may determine that a method satisfies applicable 
    requirements of this part.
        (b) In the absence of an application requesting the Administrator 
    to consider revising an appendix to part 50 of this chapter in 
    accordance with Sec. 53.16, the Administrator may conduct such tests 
    and compile such information as may be necessary in the Administrator's 
    judgment to make a determination under Sec. 53.16(d) and on the basis 
    of the tests and information make such a determination.
        (c) If a method tested in accordance with this section is 
    designated as a reference or equivalent method in accordance with 
    Sec. 53.8 or is specified or designated as a reference method in 
    accordance with Sec. 53.16, any person or entity who offers the method 
    for sale as a reference or equivalent method thereafter shall assume 
    the rights and obligations of an applicant for purposes of this part, 
    with the exception of those pertaining to submission and processing of 
    applications.
    
    
    Sec. 53.8  Designation of reference and equivalent methods.
    
        (a) A candidate method determined by the Administrator to satisfy 
    the applicable requirements of this part shall be designated as a 
    reference method or equivalent method (as applicable), and a notice of 
    the designation shall be submitted for publication in the Federal 
    Register not later than 15 days after the determination is made.
        (b) A notice indicating that the method has been determined to be a 
    reference method or an equivalent method shall be sent to the 
    applicant. This notice shall constitute proof of the determination 
    until a notice of designation is published in accordance with paragraph 
    (a) of this section.
        (c) The Administrator will maintain a current list of methods 
    designated as reference or equivalent methods in accordance with this 
    part and will send a copy of the list to any person or group
    
    [[Page 65800]]
    
    upon request. A copy of the list will be available for inspection or 
    copying at EPA Regional Offices.
    
    
    Sec. 53.9  Conditions of designation.
    
        Designation of a candidate method as a reference method or 
    equivalent method shall be conditioned on the applicant's compliance 
    with the following requirements. Failure to comply with any of the 
    requirements shall constitute a ground for cancellation of the 
    designation in accordance with Sec. 53.11.
        (a) Any method offered for sale as a reference or equivalent method 
    shall be accompanied by a copy of the manual referred to in 
    Sec. 53.4(b)(3) when delivered to any ultimate purchaser.
        (b) Any method offered for sale as a reference or equivalent method 
    shall generate no unreasonable hazard to operators or to the 
    environment during normal use or when malfunctioning.
        (c) Any analyzer, PM10 sampler, or PM2.5 sampler offered 
    for sale as a reference or equivalent method shall function within the 
    limits of the performance specifications referred to in Sec. 53.20(a), 
    Sec. 53.40(a), Sec. 53.50(a), or Sec. 53.60(a), as applicable, for at 
    least 1 year after delivery and acceptance when maintained and operated 
    in accordance with the manual referred to in Sec. 53.4(b)(3).
        (d) Any analyzer, PM10 sampler or PM2.5 sampler offered 
    for sale as a reference or equivalent method shall bear a prominent, 
    permanently affixed label or sticker indicating that the analyzer or 
    sampler has been designated by EPA as a reference method or as an 
    equivalent method (as applicable) in accordance with this part and 
    displaying any designated method identification number that may be 
    assigned by the EPA.
        (e) If an analyzer is offered for sale as a reference or equivalent 
    method and has one or more selectable ranges, the label or sticker 
    required by paragraph (d) of this section shall be placed in close 
    proximity to the range selector and shall indicate clearly which range 
    or ranges have been designated as parts of the reference or equivalent 
    method.
        (f) An applicant who offers analyzers, PM10 samplers, or 
    PM2.5 samplers for sale as reference or equivalent methods shall 
    maintain an accurate and current list of the names and mailing 
    addresses of all ultimate purchasers of such analyzers or samplers. For 
    a period of 7 years after publication of the reference or equivalent 
    method designation applicable to such an analyzer or sampler, the 
    applicant shall notify all ultimate purchasers of the analyzer or 
    PM2.5 or PM10 sampler within 30 days if the designation has 
    been canceled in accordance with Sec. 53.11 or Sec. 53.16 or if 
    adjustment of the analyzer or sampler is necessary under Sec. 53.11(b).
        (g) If an applicant modifies an analyzer, PM10 sampler, or 
    PM2.5 sampler that has been designated as a reference or 
    equivalent method, the applicant shall not sell the modified analyzer 
    or sampler as a reference or equivalent method nor attach a label or 
    sticker to the modified analyzer or sampler under paragraph (d) or (e) 
    of this section until the applicant has received notice under 
    Sec. 53.14(c) that the existing designation or a new designation will 
    apply to the modified analyzer, PM10 sampler, or PM2.5 
    sampler or has applied for and received notice under Sec. 53.8(b) of a 
    new reference or equivalent method determination for the modified 
    analyzer or sampler.
        (h) An applicant who has offered PM2.5 samplers or analyzers 
    for sale as part of a reference or equivalent method may continue to do 
    so only so long as the reference or equivalent method meets the annual 
    requirements for network operating performance determined as set forth 
    in section 6 of Appendix A of part 58 of this chapter. In the event 
    that the annual network operating performance does not meet those 
    requirements, the EPA shall, within 90 days after the end of the 
    calendar year, notify the applicant of the unacceptable network 
    performance assessment and issue a preliminary finding and notification 
    of possible cancellation of the reference or equivalent method 
designation under Sec. 53.11. (Network performance is generally assessed 
    for each calendar year, although when the number of samples for a 
    specific method is not great enough to determine precision with 
    adequate confidence, more than 1 calendar year of data may be 
    combined.)
        (i) An applicant who has offered PM2.5 samplers or analyzers 
    for sale as part of a reference or equivalent method may continue to do 
    so only so long as the facility in which the samplers or analyzers are 
    manufactured continues to be an ISO-registered facility, as set forth 
    in subpart E of this part. In the event that the ISO registration for 
    the facility is withdrawn, suspended, or otherwise becomes 
    inapplicable, either permanently or for some specified time interval, 
    such that the facility is no longer an ISO-registered facility, the 
    applicant shall notify EPA within 30 days of the date the facility 
    becomes other than an ISO-registered facility, and upon such 
    notification, the EPA shall issue a preliminary finding and 
    notification of possible cancellation of the reference or equivalent 
    method designation under Sec. 53.11.
        (j) An applicant who has offered PM2.5 samplers or analyzers 
    for sale as part of a reference or equivalent method may continue to do 
    so only so long as updates of the Product Manufacturing Checklist set 
    forth in subpart E of this part are submitted annually. In the event 
    that an annual Checklist update is not received by the EPA within 12 
    months of the date of the last such submitted Checklist or Checklist 
    update, the EPA shall notify the applicant within 30 days that the 
    Checklist update has not been received and shall, within 30 days from 
    the issuance of such notification, issue a preliminary finding and 
    notification of possible cancellation of the reference or equivalent 
    method designation under Sec. 53.11.
    
    
    Sec. 53.10  Appeal from rejection of application.
    
        Any applicant whose application for a reference or equivalent 
    method determination has been rejected may appeal the Administrator's 
    decision by taking one or more of the following actions:
        (a) The applicant may submit new or additional information in 
    support of the application.
        (b) The applicant may request that the Administrator reconsider the 
    data and information already submitted.
        (c) The applicant may request that any test conducted by the 
    Administrator that was a material factor in the decision to reject the 
    application be repeated.
    
    
    Sec. 53.11  Cancellation of reference or equivalent method designation.
    
        (a) Preliminary finding. If the Administrator makes a preliminary 
    finding on the basis of any available information that a representative 
    sample of a method designated as a reference or equivalent method and 
    offered for sale as such does not fully satisfy the requirements of 
    this part or that there is any violation of the requirements set forth 
    in Sec. 53.9, the Administrator may initiate proceedings to cancel the 
    designation in accordance with the following procedures.
        (b) Notification and opportunity to demonstrate or achieve 
    compliance.
        (1) After making a preliminary finding in accordance with paragraph 
    (a) of this section, the Administrator will send notice of the 
    preliminary finding to the applicant, together with a statement of the 
    facts and reasons on which the preliminary finding is based, and will 
    publish notice of the preliminary finding in the Federal Register.
        (2) The applicant will be afforded an opportunity to demonstrate or 
    to
    
    [[Page 65801]]
    
    achieve compliance with the requirements of this part within 60 days 
    after publication of notice in accordance with paragraph (b)(1) of this 
    section or within such further period as the Administrator may allow, 
    by demonstrating to the satisfaction of the Administrator that the 
    method in question satisfies the requirements of this part, by 
    commencing a program to make any adjustments that are necessary to 
    bring the method into compliance, or by taking such action as may be 
    necessary to cure any violation of the requirements of Sec. 53.9. If 
    adjustments are necessary to bring the method into compliance, all such 
    adjustments shall be made within a reasonable time as determined by the 
    Administrator. If the applicant demonstrates or achieves compliance in 
    accordance with this paragraph (b)(2), the Administrator will publish 
    notice of such demonstration or achievement in the Federal Register.
        (c) Request for hearing. Within 60 days after publication of a 
    notice in accordance with paragraph (b)(1) of this section, the 
    applicant or any interested person may request a hearing as provided in 
    Sec. 53.12.
        (d) Notice of cancellation. If, at the end of the period referred 
    to in paragraph (b)(2) of this section, the Administrator determines 
    that the reference or equivalent method designation should be canceled, 
    a notice of cancellation will be published in the Federal Register and 
    the designation will be deleted from the list maintained under 
    Sec. 53.8(c). If a hearing has been requested and granted in accordance 
    with Sec. 53.12, action under this paragraph (d) will be taken only 
    after completion of proceedings (including any administrative review) 
    conducted in accordance with Sec. 53.13 and only if the decision of the 
    Administrator reached in such proceedings is that the designation in 
    question should be canceled.
    
    
    Sec. 53.12  Request for hearing on cancellation.
    
        Within 60 days after publication of a notice in accordance with 
    Sec. 53.11(b)(1), the applicant or any interested person may request a 
    hearing on the Administrator's action. If, after reviewing the request 
    and supporting data, the Administrator finds that the request raises a 
    substantial issue of fact, a hearing will be granted in accordance with 
    Sec. 53.13 with respect to such issue. The request shall be in writing, 
    signed by an authorized representative of the applicant or interested 
    person, and shall include a statement specifying:
        (a) Any objections to the Administrator's action; and
        (b) Data or other information in support of such objections.
    
    
    Sec. 53.13  Hearings.
    
        (a)(1) After granting a request for a hearing under Sec. 53.12, the 
    Administrator will designate a presiding officer for the hearing.
        (2) If a time and place for the hearing have not been fixed by the 
    Administrator, the hearing will be held as soon as practicable at a 
    time and place fixed by the presiding officer, except that the hearing 
    shall in no case be held sooner than 30 days after publication of a 
    notice of hearing in the Federal Register.
        (3) For purposes of the hearing, the parties shall include the 
    Environmental Protection Agency, the applicant or interested person(s) 
    who requested the hearing, and any person permitted to intervene in 
    accordance with paragraph (c) of this section.
        (4) The Deputy General Counsel or the Deputy General Counsel's 
    representative will represent the Environmental Protection Agency in 
    any hearing under this section.
        (5) Each party other than the Environmental Protection Agency may 
    be represented by counsel or by any other duly authorized 
    representative.
        (b)(1) Upon appointment, the presiding officer will establish a 
    hearing file. The file shall contain copies of the notices issued by 
    the Administrator pursuant to Sec. 53.11(b)(1), together with any 
    accompanying material, the request for a hearing and supporting data 
    submitted therewith, the notice of hearing published in accordance with 
    paragraph (a)(2) of this section, and correspondence and other material 
    data relevant to the hearing.
        (2) The hearing file shall be available for inspection by the 
    parties or their representatives at the office of the presiding 
    officer, except to the extent that it contains information identified 
    in accordance with Sec. 53.15.
        (c) The presiding officer may permit any interested person to 
    intervene in the hearing upon such a showing of interest as the 
    presiding officer may require; provided that permission to intervene 
    may be denied in the interest of expediting the hearing where it 
    appears that the interests of the person seeking to intervene will be 
    adequately represented by another party (or by other parties), 
    including the Environmental Protection Agency.
        (d)(1) The presiding officer, upon the request of any party or at 
    the officer's discretion, may arrange for a prehearing conference at a 
    time and place specified by the officer to consider the following:
        (i) Simplification of the issues.
        (ii) Stipulations, admissions of fact, and the introduction of 
    documents.
        (iii) Limitation of the number of expert witnesses.
        (iv) Possibility of agreement on disposing of all or any of the 
    issues in dispute.
        (v) Such other matters as may aid in the disposition of the 
    hearing, including such additional tests as may be agreed upon by the 
    parties.
        (2) The results of the conference shall be reduced to writing by 
    the presiding officer and made part of the record.
        (e)(1) Hearings shall be conducted by the presiding officer in an 
    informal but orderly and expeditious manner. The parties may offer oral 
    or written evidence, subject to exclusion by the presiding officer of 
    irrelevant, immaterial, or repetitious evidence.
        (2) Witnesses shall be placed under oath.
        (3) Any witness may be examined or cross-examined by the presiding 
    officer, the parties, or their representatives. The presiding officer 
    may, at his discretion, limit cross-examination to relevant and 
    material issues.
        (4) Hearings shall be reported verbatim. Copies of transcripts of 
    proceedings may be purchased from the reporter.
        (5) All written statements, charts, tabulations, and data offered 
    in evidence at the hearing shall, upon a showing satisfactory to the 
    presiding officer of their authenticity, relevancy, and materiality, be 
    received in evidence and shall constitute part of the record.
        (6) Oral argument shall be permitted. The presiding officer may 
    limit oral presentations to relevant and material issues and designate 
    the amount of time allowed for oral argument.
        (f)(1) The presiding officer shall make an initial decision which 
    shall include written findings and conclusions and the reasons therefor 
    on all the material issues of fact, law, or discretion presented on the 
    record. The findings, conclusions, and written decision shall be 
    provided to the parties and made part of the record. The initial 
    decision shall become the decision of the Administrator without further 
    proceedings unless there is an appeal to, or review on motion of, the 
    Administrator within 30 calendar days after the initial decision is 
    filed.
        (2) On appeal from or review of the initial decision, the 
    Administrator will have all the powers consistent with making the 
    initial decision, including the discretion to require or allow briefs, 
    oral argument, the taking of additional evidence or the remanding to 
    the presiding officer for additional proceedings. The decision by the
    
    [[Page 65802]]
    
    Administrator will include written findings and conclusions and the 
    reasons or basis therefor on all the material issues of fact, law, or 
    discretion presented on the appeal or considered in the review.
    
    
    Sec. 53.14  Modification of a reference or equivalent method.
    
        (a) An applicant who offers a method for sale as a reference or 
    equivalent method shall report to the Administrator prior to 
    implementation any intended modification of the method, including but 
    not limited to modifications of design or construction or of 
    operational and maintenance procedures specified in the operation 
    manual [see Sec. 53.9(g)]. The report shall be signed by an authorized 
    representative of the applicant, marked in accordance with Sec. 53.15 
    (if applicable), and addressed as specified in Sec. 53.4(a).
        (b) A report submitted under paragraph (a) of this section shall 
    include:
        (1) A description, in such detail as may be appropriate, of the 
    intended modification;
        (2) A brief statement of the applicant's belief that the 
    modification will, will not, or may affect the performance 
    characteristics of the method;
        (3) A brief statement of the probable effect if the applicant 
    believes the modification will or may affect the performance 
    characteristics of the method; and
        (4) Such further information, including test data, as may be 
    necessary to explain and support any statement required by paragraphs 
    (b)(2) and (b)(3) of this section.
        (c) Within 30 calendar days after receiving a report under 
    paragraph (a) of this section, the Administrator will take one or more 
    of the following actions:
        (1) Notify the applicant that the designation will continue to 
    apply to the method if the modification is implemented.
        (2) Send notice to the applicant that a new designation will apply 
    to the method (as modified) if the modification is implemented, submit 
    notice of the determination for publication in the Federal Register, 
    and revise or supplement the list referred to in Sec. 53.8(c) to 
    reflect the determination.
        (3) Send notice to the applicant that the designation will not 
    apply to the method (as modified) if the modification is implemented 
    and submit notice of the determination for publication in the Federal 
    Register;
        (4) Send notice to the applicant that additional information must 
    be submitted before a determination can be made and specify the 
    additional information that is needed (in such cases, the 30-day period 
    shall commence upon receipt of the additional information);
        (5) Send notice to the applicant that additional tests are 
    necessary and specify what tests are necessary and how they shall be 
    interpreted (in such cases, the 30-day period shall commence upon 
    receipt of the additional test data); or
        (6) Send notice to the applicant that additional tests will be 
    conducted by the Administrator and specify the reasons for and the 
    nature of the additional tests (in such cases, the 30-day period shall 
    commence one calendar day after the additional tests are completed).
        (d) An applicant who has received a notice under paragraph (c)(3) 
    of this section may appeal the Administrator's action as follows:
        (1) The applicant may submit new or additional information 
    pertinent to the intended modification.
        (2) The applicant may request the Administrator to reconsider data 
    and information already submitted.
        (3) The applicant may request that the Administrator repeat any 
    test conducted that was a material factor in the Administrator's 
    determination. A representative of the applicant may be present during 
    the performance of any such retest.
    
    
    Sec. 53.15  Trade secrets and confidential or privileged information.
    
        Any information submitted under this part that is claimed to be a 
    trade secret or confidential or privileged information shall be marked 
    or otherwise clearly identified as such in the submittal. Information 
    so identified will be treated in accordance with part 2 of this chapter 
    (concerning public information).
    
    
    Sec. 53.16  Supersession of reference methods.
    
        (a) This section prescribes procedures and criteria applicable to 
    requests that the Administrator specify a new reference method, or a 
    new measurement principle and calibration procedure on which reference 
    methods shall be based, by revision of the appropriate appendix to 
    part 50 of this chapter. Such action will ordinarily be taken only if the
    Administrator determines that a candidate method or a variation thereof 
    is substantially superior to the existing reference method(s).
        (b) In exercising discretion under this section, the Administrator 
    will consider:
        (1) The benefits, in terms of the requirements and purposes of the 
    Act, that would result from specifying a new reference method or a new 
    measurement principle and calibration procedure;
        (2) The potential economic consequences of such action for State 
    and local control agencies; and
        (3) Any disruption of State and local air quality monitoring 
    programs that might result from such action.
        (c) An applicant who wishes the Administrator to consider revising 
    an appendix to part 50 of this chapter on the ground that the 
    applicant's candidate method is substantially superior to the existing 
    reference method(s) shall submit an application for a reference or 
    equivalent method determination in accordance with Sec. 53.4 and shall 
    indicate therein that such consideration is desired. The application 
    shall include, in addition to the information required by Sec. 53.4, 
    data and any other information supporting the applicant's claim that 
    the candidate method is substantially superior to the existing 
    reference method(s).
        (d) After receiving an application under paragraph (c) of this 
    section, the Administrator will publish notice of its receipt in the 
    Federal Register and, within 120 calendar days after receipt of the 
    application, take one of the following actions:
        (1) Determine that it is appropriate to propose a revision of the 
    appendix to part 50 of this chapter in question and send notice of the 
    determination to the applicant;
        (2) Determine that it is inappropriate to propose a revision of the 
    appendix to part 50 of this chapter in question, determine whether the 
    candidate method is a reference or equivalent method, and send notice 
    of the determinations, including a statement of reasons for the 
    determination not to propose a revision, to the applicant;
        (3) Send notice to the applicant that additional information must 
    be submitted before a determination can be made and specify the 
    additional information that is needed (in such cases, the 120-day 
    period shall commence upon receipt of the additional information);
        (4) Send notice to the applicant that additional tests are 
    necessary, specifying what tests are necessary and how they shall be 
    interpreted (in such cases, the 120-day period shall commence upon 
    receipt of the additional test data); or
        (5) Send notice to the applicant that additional tests will be 
    conducted by the Administrator, specifying the nature of and reasons 
    for the additional tests and the estimated time required (in such 
    cases, the 120-day period shall
    
    [[Page 65803]]
    
    commence one calendar day after the additional tests have been 
    completed).
        (e)(1)(i) After making a determination under paragraph (d)(1) of 
    this section, the Administrator will publish a notice of proposed 
    rulemaking in the Federal Register. The notice will indicate that the 
    Administrator proposes:
        (A) To revise the appendix to part 50 of this chapter in question;
        (B) Where the appendix specifies a measurement principle and 
    calibration procedure, to cancel reference method designations based on 
    the appendix; and
        (C) To cancel equivalent method designations based on the existing 
    reference method(s).
        (ii) The notice will include the terms or substance of the proposed 
    revision, will indicate what period(s) of time the Administrator 
    proposes to allow for replacement of existing methods under section 2.3 
    of Appendix C to part 58 of this chapter, and will solicit public 
    comments on the proposal with particular reference to the 
    considerations set forth in paragraphs (a) and (b) of this section.
        (2) If, after consideration of comments received, the Administrator 
    determines that the appendix to part 50 in question should be revised, 
    the Administrator will by publication in the Federal Register 
    promulgate the proposed revision, with such modifications as may be 
    appropriate in view of comments received; where the appendix to part 50 
    (prior to revision) specifies a measurement principle and calibration 
    procedure, cancel reference method designations based on the appendix; 
    cancel equivalent method designations based on the existing reference 
    method(s); and specify the period(s) that will be allowed for 
    replacement of existing methods under section 2.3 of Appendix C to part 
    58 of this chapter, with such modifications from the proposed period(s) 
    as may be appropriate in view of comments received. Canceled 
    designations will be deleted from the list maintained under 
    Sec. 53.8(c). The requirements and procedures for cancellation set 
    forth in Sec. 53.11 shall be inapplicable to cancellation of reference 
    or equivalent method designations under this section.
        (3) If the appendix to part 50 of this chapter in question is 
    revised to specify a new measurement principle and calibration 
    procedure on which the applicant's candidate method is based, the 
    Administrator will take appropriate action under Sec. 53.5 to determine 
    whether the candidate method is a reference method.
        (4) Upon taking action under paragraph (e)(2) of this section, the 
    Administrator will send notice of the action to all applicants for 
    whose methods reference and equivalent method designations are canceled 
    by such action.
        (f) An applicant who has received notice of a determination under 
    paragraph (d)(2) of this section may appeal the determination by taking 
    one or more of the following actions:
        (1) The applicant may submit new or additional information in 
    support of the application.
        (2) The applicant may request that the Administrator reconsider the 
    data and information already submitted.
        (3) The applicant may request that any test conducted by the 
    Administrator that was a material factor in making the determination be 
    repeated.
    Tables to Subpart A of Part 53
    
            Table A-1 to Subpart A--Summary of Applicable Requirements for Reference and Equivalent Methods for Air Monitoring of Criteria Pollutants       
    --------------------------------------------------------------------------------------------------------------------------------------------------------
                                                                                                                      Applicable subparts of part 53        
                  Pollutant                  Ref. or equivalent      Manual or automated    Applicable part  -----------------------------------------------
                                                                                              50 appendix        A       B       C       D       E       F  
    --------------------------------------------------------------------------------------------------------------------------------------------------------
    SO2.................................  Reference..............  Manual................  A                  ......  ......  ......  ......  ......  ......
                                          Equivalent.............  Manual................  .................                 >                         
                                                                   Automated.............  .................         >       >                         
    CO..................................  Reference..............  Automated.............  C                         >                                 
                                          Equivalent.............  Manual................  .................                 >                         
                                                                   Automated.............  .................         >       >                         
    O3..................................  Reference..............  Automated.............  D                         >                                 
                                          Equivalent.............  Manual................  .................                 >                         
                                                                   Automated.............  .................         >       >                         
    NO2.................................  Reference..............  Automated.............  F                         >                                 
                                          Equivalent.............  Manual................  .................                 >                         
                                                                   Automated.............  .................         >       >                         
    Pb..................................  Reference..............  Manual................  G                  ......  ......  ......  ......  ......  ......
                                          Equivalent.............  Manual................  .................                 >                         
    PM10................................  Reference..............  Manual................  J                                         >                 
                                          Equivalent.............  Manual................  .................                 >       >                 
                                                                   Automated.............  .................                 >       >                 
    PM2.5...............................  Reference..............  Manual................  L                                                 >         
                                          Equivalent Class I.....  Manual................  L                                 >               >         
                                          Equivalent Class II....  Manual................  L                                 >               >       > 
                                          Equivalent Class III...  Manual or Automated...  .................             > \1\           > \1\   > \1\ 
    --------------------------------------------------------------------------------------------------------------------------------------------------------
    \1\ Because of the wide variety of potential devices, the specific requirements applicable to a Class III candidate equivalent method for PM2.5
      are not specified explicitly in this part but, instead, shall be determined on a case-by-case basis for each such candidate method.
    
    
    [[Page 65804]]
    
    Appendix A to Subpart A of Part 53--References
    
    1. American National Standard--Quality Systems--Model for Quality 
    Assurance in Design, Development, Production, Installation, and 
    Servicing, ANSI/ISO/ASQC Q9001-1994. Available from American Society 
    for Quality Control, 611 East Wisconsin Avenue, Milwaukee, WI 53202.
    2. American National Standard--Specifications and Guidelines for 
    Quality Systems for Environmental Data Collection and Environmental 
    Technology Programs, ANSI/ASQC E4-1994. Available from American 
    Society for Quality Control, 611 East Wisconsin Avenue, Milwaukee, 
    WI 53202.
    3. Dimensioning and Tolerancing, ASME Y14.5M-1994. Available from 
    the American Society of Mechanical Engineers, 345 East 47th Street, 
    New York, NY 10017.
    4. Mathematical Definition of Dimensioning and Tolerancing 
    Principles, ASME Y14.5.1M-1994. Available from the American Society 
    of Mechanical Engineers, 345 East 47th Street, New York, NY 10017.
    5. ISO 10012-1:1992(E), Quality assurance requirements for measuring 
    equipment--Part 1: Metrological confirmation system for measuring 
    equipment. Available from American Society for Quality 
    Control, 611 East Wisconsin Avenue, Milwaukee, WI 53202.
    6. Quality Assurance Handbook for Air Pollution Measurement Systems, 
    Volume II, Ambient Air Specific Methods (Interim Edition), Section 
    2.12. EPA/600/R-94/038b, April 1994. Available from CERI, ORD 
    Publications, U.S. Environmental Protection Agency, 26 West Martin 
    Luther King Drive, Cincinnati, Ohio 45268. [Note: Section 2.12 of 
    Volume II is currently under development and will not be available 
    from the CERI address until it is published as an addition to EPA/
    600/R-94/038b. Prepublication draft copies of Section 2.12 will be 
    available from Department E (MD-77B), U. S. EPA, Research Triangle 
    Park, NC 27711 or from the contact identified at the beginning of 
    this proposed rule.]
    
    3. Subpart C is revised to read as follows:
    
    Subpart C--Procedures for Determining Comparability Between Candidate 
    Methods and Reference Methods
    Sec.
    53.30  General provisions.
    53.31  Test conditions.
    53.32  Test procedures for methods for SO2, CO, O3, and 
    NO2.
    53.33  Test procedure for methods for lead.
    53.34  Test procedure for methods for PM10 and PM2.5.
    
    Tables to Subpart C of Part 53
    
    Table C-1--Test Concentration Ranges, Number of Measurements 
    Required, and Maximum Discrepancy Specification
    Table C-2--Sequence of Test Measurements
    Table C-3--Test Specifications for Lead Methods
    Table C-4--Specifications for PM10 and PM2.5 Methods
    
    Figures to Subpart C
    
    Figure C-1--Suggested Format for Reporting Test Results
    
    Appendix A to Subpart C of Part 53--References
    
    Subpart C--Procedures for Determining Comparability Between 
    Candidate Methods and Reference Methods
    
    
    Sec. 53.30  General provisions.
    
        (a) Determination of comparability. The test procedures prescribed 
    in this Subpart shall be used to determine if a candidate method is 
    comparable to a reference method when both methods measure pollutant 
    concentrations in ambient air.
        (1) Comparability is shown for SO2, CO, O3, and NO2 
    methods when the differences between:
        (i) Measurements made by a candidate manual method or by a test 
    analyzer representative of a candidate automated method; and
        (ii) Measurements made simultaneously by a reference method, are 
    less than or equal to the values specified in the last column of Table
    C-1 of this subpart.
        (2) Comparability is shown for lead methods when the differences 
    between:
        (i) Measurements made by a candidate method, and
        (ii) Measurements made by the reference method on simultaneously 
    collected lead samples (or the same sample, if applicable), are less 
    than or equal to the value specified in Table
    C-3 of this subpart.
        (3) Comparability is shown for PM10 and PM2.5 methods 
    when the relationship between:
        (i) Measurements made by a candidate method; and
        (ii) Measurements made by a reference method on simultaneously 
    collected samples (or the same sample, if applicable) at each of two 
    test sites, is such that the linear regression parameters (slope, 
    intercept, and correlation coefficient) describing the relationship 
    meet the values specified in Table C-4 of this subpart.
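    [Note: The comparability criterion in paragraph (a)(3) of this section 
    reduces to an ordinary least-squares calculation on the paired site 
    averages. The following sketch is illustrative only and is not part of 
    the proposed regulatory text; the data values and function name are 
    hypothetical, and the governing acceptance limits are those in Table 
    C-4.]

```python
# Illustrative sketch (not part of the proposed rule): computing the linear
# regression parameters named in Sec. 53.30(a)(3).  Inputs are hypothetical
# paired site averages in micrograms per cubic meter.
import numpy as np

def regression_parameters(reference_avgs, candidate_avgs):
    """Return (slope, intercept, correlation) for candidate vs. reference."""
    x = np.asarray(reference_avgs, dtype=float)   # reference method averages (Rj)
    y = np.asarray(candidate_avgs, dtype=float)   # candidate method averages (Cj)
    slope, intercept = np.polyfit(x, y, 1)        # least-squares straight line
    correlation = np.corrcoef(x, y)[0, 1]         # Pearson correlation coefficient
    return slope, intercept, correlation

# Hypothetical data for one test site:
r = [12.0, 18.5, 25.1, 33.7, 41.2, 55.0, 67.4, 80.3, 95.8, 110.2]
c = [11.6, 18.9, 24.7, 34.1, 40.8, 54.2, 68.0, 79.5, 96.4, 109.1]
print(regression_parameters(r, c))
```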
        (b) Selection of test sites. (1) All methods. Each test site shall 
    be in a predominately urban area which can be shown to have at least 
    moderate concentrations of various pollutants. The site shall be 
    clearly identified and shall be justified as an appropriate test site 
    with suitable supporting evidence such as maps, population density 
    data, vehicular traffic data, emission inventories, pollutant 
    measurements from previous years, concurrent pollutant measurements, 
    and meteorological data. If approval of a proposed test site or sites 
    is desired, a written request for approval must be submitted before the 
    tests are conducted and must include the required supporting and 
    justification information. The 
    Administrator may exercise discretion in selecting a different site (or 
    sites) for any additional tests the Administrator decides to conduct.
        (2) Methods for SO2, CO, O3, and NO2. All test 
    measurements are to be made at the same test site. If necessary, the 
    concentration of pollutant in the sampled ambient air may be augmented 
    with artificially generated pollutant to facilitate measurements in the 
    specified ranges. [See paragraph (d)(2) of this section.]
        (3) Methods for lead. Test measurements may be made at any number 
    of test sites. Augmentation of pollutant concentrations is not 
    permitted, hence an appropriate test site or sites must be selected to 
    provide lead concentrations in the specified range.
        (4) Methods for PM10. Test measurements must be made, or 
    derived from particulate samples collected, at not less than two test 
    sites, each of which must be located in a geographical area 
    characterized by ambient particulate matter that is significantly 
    different in nature and composition from that at the other test 
    site(s). Augmentation of pollutant concentrations is not permitted, 
    hence appropriate test sites must be selected to provide PM10 
    concentrations in the specified range. The tests at the two sites may 
    be conducted in different calendar seasons, if appropriate, to provide 
    PM10 concentrations in the specified ranges.
        (5) Methods for PM2.5. Augmentation of pollutant 
    concentrations is not permitted, hence appropriate test sites must be 
    selected to provide PM2.5 concentrations and PM2.5/PM10 
    ratios (if applicable) in the specified ranges.
        (i) Where only one test site is required, as specified in Table C-4 
    of this subpart, the site need only meet the PM2.5 ambient 
    concentration levels required by Sec. 53.34(c)(3).
        (ii) Where two sites are required, as specified in Table C-4 of 
    this subpart, each site must be selected to provide the ambient 
    concentration levels required by Sec. 53.34(c)(3). In addition, one 
    site
    
    [[Page 65805]]
    
    must be selected such that all acceptable test sample sets, as defined 
    in Sec. 53.34(c)(3), have a PM2.5/PM10 ratio of more than 
    0.75; the other site must be selected such that all acceptable test 
    sample sets, as defined in Sec. 53.34(c)(3), have a PM2.5/
    PM10 ratio of less than 0.40. At least two reference method 
    PM10 samplers shall be collocated with the candidate and reference 
    method PM2.5 samplers and operated simultaneously with the other 
    samplers at each test site to measure concurrent ambient concentrations 
    of PM10 to determine the PM2.5/PM10 ratio for each 
    sample set. The PM2.5/PM10 ratio for each sample set shall be 
    the average of the PM2.5 concentration, as determined in 
    Sec. 53.34(c)(1), divided by the average PM10 concentration, as 
    measured by the PM10 samplers. The tests at the two sites may be 
    conducted in different calendar seasons, if appropriate, to provide 
    PM2.5 concentrations and PM2.5/PM10 ratios in the 
    specified ranges.
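    [Note: The sample-set ratio described in paragraph (b)(5)(ii) of this 
    section is simply the average PM2.5 concentration for the set divided by 
    the average of the collocated PM10 reference measurements. The following 
    sketch is illustrative only and is not part of the proposed regulatory 
    text; all data values and names are hypothetical.]

```python
# Illustrative sketch: PM2.5/PM10 ratio for one sample set, as described in
# Sec. 53.30(b)(5)(ii).  All values and names are hypothetical.

def sample_set_ratio(pm25_measurements, pm10_measurements):
    """Average PM2.5 for the set divided by the average collocated PM10."""
    avg_pm25 = sum(pm25_measurements) / len(pm25_measurements)
    avg_pm10 = sum(pm10_measurements) / len(pm10_measurements)
    return avg_pm25 / avg_pm10

# One hypothetical sample set: three PM2.5 reference samplers, two PM10 samplers.
ratio = sample_set_ratio([22.4, 23.1, 22.8], [55.0, 57.2])
print(round(ratio, 3), ratio > 0.75, ratio < 0.40)  # ratio and site-class checks
```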
        (c) Test atmosphere. Ambient air sampled at an appropriate test 
    site or sites shall be used for these tests. Simultaneous concentration 
    measurements shall be made in each of the concentration ranges 
    specified in Table C-1, C-3, or C-4 of this subpart, as appropriate.
        (d) Sample collection.
        (1) All methods. All test concentration measurements or samples 
    shall be taken in such a way that both the candidate method and the 
    reference method receive air samples that are homogeneous or as nearly 
    identical as practical.
        (2) Methods for SO2, CO, O3, and NO2. Ambient air 
    shall be sampled from a common intake and distribution manifold 
    designed to deliver homogeneous air samples to both methods. Precautions 
    shall be taken in the design and construction of this manifold to 
    minimize the removal of particulates and trace gases, and to insure 
    that identical samples reach the two methods. If necessary, the 
    concentration of pollutant in the sampled ambient air may be augmented 
    with artificially generated pollutant. However, at all times the air 
    sample measured by the candidate and reference methods under test shall 
    consist of not less than 80 percent ambient air by volume. Schematic 
    drawings, physical illustrations, descriptions, and complete details of 
    the manifold system and the augmentation system (if used) shall be 
    submitted.
        (3) Methods for lead, PM10 and PM2.5. The ambient air 
    intake points of all the candidate and reference method collocated 
    samplers for lead, PM10 or PM2.5 shall be positioned at the 
    same height above the ground level, and between 2 and 5 meters apart. 
    The samplers shall be oriented in a manner that will minimize spatial 
    and wind directional effects on sample collection.
        (4) PM10 methods employing the same sampling procedure as the 
    reference method but a different analytical method. Candidate methods 
    for PM10 which employ a sampler and sample collection procedure 
    that are identical to the sampler and sample collection procedure 
    specified in the reference method, but use a different analytical 
    procedure, may be tested by analyzing common samples. The common 
    samples shall be collected according to the sample collection procedure 
    specified by the reference method and shall be analyzed in accordance 
    with the analytical procedures of both the candidate method and the 
    reference method.
        (e) Submission of test data and other information. All recorder 
    charts, calibration data, records, test results, procedural 
    descriptions and details, and other documentation obtained from (or 
    pertinent to) these tests shall be identified, dated, signed by the 
    analyst performing the test, and submitted. For candidate methods for 
    PM2.5, all submitted information must meet the requirements of 
    ANSI/ASQC E4, section 3.3.1, paragraphs 1 and 2 (Reference 1 of 
    Appendix A of this Subpart).
    
    
    Sec. 53.31  Test conditions.
    
        (a) All methods. All test measurements made or test samples 
    collected by means of a sample manifold as specified in 
    Sec. 53.30(d)(2) shall be at a room temperature between 20 deg. and 
    30 deg.C, and at a line voltage between 105 and 125 volts. All methods 
    shall be calibrated as specified in paragraph (c) of this section prior 
    to initiation of the tests.
        (b) Samplers and automated methods. (1) Setup and start-up of the 
    test analyzer, test sampler(s), and reference method (if applicable) 
    shall be in strict accordance with the applicable operation manual(s). 
    If the test analyzer does not have an integral strip chart or digital 
    data recorder, connect the analyzer output to a suitable strip chart or 
    digital data recorder. This recorder shall have a chart width of at 
    least 25 centimeters, a response time of 1 second or less, a deadband 
    of not more than 0.25 percent of full scale, and capability of either 
    reading measurements at least 5 percent below zero or offsetting the 
    zero by at least 5 percent. Digital data shall be recorded at 
    appropriate time intervals such that trend plots similar to a strip 
    chart recording may be constructed with a similar or suitable level of 
    detail.
        (2) Other data acquisition components may be used along with the 
    chart recorder during the conduct of these tests. Use of the chart 
    recorder is intended only to facilitate visual evaluation of data 
    submitted.
        (3) Allow adequate warmup or stabilization time as indicated in the 
    applicable operation manual(s) before beginning the tests.
        (c) Calibration. The reference method shall be calibrated according 
    to the appropriate appendix to part 50 of this chapter (if it is a 
    manual method) or according to the applicable operation manual(s) (if 
    it is an automated method). A candidate manual method (or portion 
    thereof) shall be calibrated, according to the applicable operation 
    manual(s), if such calibration is a part of the method.
        (d) Range. Except as provided in paragraph (d)(2) of this section, 
    each method shall be operated in the range specified for the reference 
    method in the appropriate appendix to part 50 of this chapter (for 
    manual reference methods), or specified in Table B-1 of subpart B of 
    this part (for automated reference methods).
        (e) Operation of automated methods. (1) Once the test analyzer has 
    been set up and calibrated and tests started, manual adjustment or 
    normal periodic maintenance is permitted only every 3 days. Automatic 
    adjustments which the test analyzer performs by itself are permitted at 
    any time. At 3-day intervals only adjustments and periodic maintenance 
    as specified in the manual referred to in Sec. 53.4(b)(3) are 
    permitted. The submitted records shall show clearly when manual 
    adjustments were made and describe the operations performed.
        (2) All test measurements shall be made with the same test 
    analyzer; use of multiple test analyzers is not permitted. The test 
    analyzer shall be operated continuously during the entire series of 
    test measurements.
        (3) If a test analyzer should malfunction during any of these 
    tests, the entire set of measurements shall be repeated, and a detailed 
    explanation of the malfunction, remedial action taken, and whether 
    recalibration was necessary (along with all pertinent records and 
    charts) shall be submitted.
    
    
    Sec. 53.32  Test procedures for methods for SO2, CO, O3, and 
    NO2.
    
        (a) Conduct the first set of simultaneous measurements with the 
    candidate and reference methods:
        (1) Table C-1 of this subpart specifies the type (1- or 24-hour) 
    and number of
    
    [[Page 65806]]
    
    measurements to be made in each of the three test concentration ranges.
        (2) The pollutant concentration must fall within the specified 
    range as measured by the reference method.
        (3) The measurements shall be made in the sequence specified in 
    Table C-2 of this subpart, except for the 1-hour SO2 measurements, 
    which are all in the high range.
        (b) For each pair of measurements, determine the difference 
    (discrepancy) between the candidate method measurement and reference 
    method measurement. A discrepancy which exceeds the discrepancy 
    specified in Table C-1 of this subpart constitutes a failure. (See 
    Figure C-1 of this subpart for a suggested format for reporting the 
    test results.)
        (c) The results of the first set of measurements shall be 
    interpreted as follows:
        (1) Zero (0) failures. The candidate method passes the test for 
    comparability.
        (2) Three (3) or more failures. The candidate method fails the test 
    for comparability.
        (3) One (1) or two (2) failures. Conduct a second set of 
    simultaneous measurements as specified in Table C-1 of this subpart. 
    The results of the combined total of first-set and second-set 
    measurements shall be interpreted as follows:
        (i) One (1) or two (2) failures. The candidate method passes the 
    test for comparability.
        (ii) Three (3) or more failures. The candidate method fails the 
    test for comparability.
        (4) For sulfur dioxide, the 1-hour and 24-hour measurements shall 
    be interpreted separately, and the candidate method must pass the tests 
    for both 1- and 24-hour measurements to pass the test for 
    comparability.
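    [Note: The following sketch is illustrative only and is not part of the 
    proposed regulatory text. It restates the pass/fail logic of paragraphs 
    (b) and (c) of this section; the discrepancy limits come from Table C-1, 
    and all names and values used here are hypothetical.]

```python
# Illustrative sketch of the interpretation rules in Sec. 53.32(b)-(c).

def count_failures(candidate, reference, max_discrepancy):
    """Count measurement pairs whose discrepancy exceeds the Table C-1 limit."""
    return sum(1 for c, r in zip(candidate, reference)
               if abs(c - r) > max_discrepancy)

def comparability_result(first_set_failures, second_set_failures=None):
    """Apply the interpretation rules of Sec. 53.32(c)."""
    if first_set_failures == 0:
        return "pass"
    if first_set_failures >= 3:
        return "fail"
    # One or two failures: a second set of measurements is required.
    if second_set_failures is None:
        return "second set required"
    combined = first_set_failures + second_set_failures
    return "pass" if combined <= 2 else "fail"

print(comparability_result(1, 1))   # combined total of two failures -> "pass"
```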
        (d) A 1-hour measurement consists of the integral of the 
    instantaneous concentration over a 60-minute continuous period divided 
    by the time period. Integration of the instantaneous concentration may 
    be performed by any appropriate means such as chemical, electronic, 
    mechanical, visual judgment, or by calculating the mean of not less 
    than 12 equally spaced instantaneous readings. Appropriate allowances 
    or corrections shall be made in cases where significant errors could 
    occur due to characteristic lag time or rise/fall time differences 
    between the candidate and reference methods. Details of the means of 
    integration and any corrections shall be submitted.
        (e) A 24-hour measurement consists of the integral of the 
    instantaneous concentration over a 24-hour continuous period divided by 
    the time period. This integration may be performed by any appropriate 
    means such as chemical, electronic, mechanical, or by calculating the 
    mean of twenty-four (24) sequential 1-hour measurements.
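    [Note: The following sketch is illustrative only and is not part of the 
    proposed regulatory text. It shows 1-hour and 24-hour measurements 
    computed as simple means, one permissible form of the integration 
    described in paragraphs (d) and (e) of this section; data and names are 
    hypothetical.]

```python
# Illustrative sketch of the integrated measurements in Sec. 53.32(d)-(e).

def one_hour_measurement(instantaneous_readings):
    """Mean of equally spaced instantaneous readings over a 60-minute period."""
    if len(instantaneous_readings) < 12:
        raise ValueError("at least 12 equally spaced readings are required")
    return sum(instantaneous_readings) / len(instantaneous_readings)

def twenty_four_hour_measurement(hourly_values):
    """Mean of twenty-four sequential 1-hour measurements."""
    if len(hourly_values) != 24:
        raise ValueError("exactly 24 sequential 1-hour values are required")
    return sum(hourly_values) / len(hourly_values)

hourly = [one_hour_measurement([0.031 + 0.001 * k for k in range(12)])
          for _ in range(24)]
print(round(twenty_four_hour_measurement(hourly), 4))
```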
        (f) For oxidant and carbon monoxide, no more than six (6) 1-hour 
    measurements shall be made per day. For sulfur dioxide, no more than 
    four (4) 1-hour measurements or one (1) 24-hour measurement shall be 
    made per day. One-hour measurements may be made concurrently with 24-
    hour measurements if appropriate.
        (g) For applicable methods, control or calibration checks may be 
    performed once per day without adjusting the test analyzer or method. 
    These checks may be used as a basis for a linear interpolation-type 
    correction to be applied to the measurements to correct for drift. If 
    such a correction is used, it shall be applied to all measurements made 
    with the method, and the correction procedure shall become a part of 
    the method.
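    [Note: The drift correction permitted in paragraph (g) of this section 
    amounts to a linear interpolation between successive daily calibration or 
    control checks. The following sketch is illustrative only and is not part 
    of the proposed regulatory text; the check values, times, and function 
    name are hypothetical.]

```python
# Illustrative sketch of a linear interpolation-type drift correction between
# two daily calibration checks, as permitted by Sec. 53.32(g).

def drift_corrected(value, t, t0, offset0, t1, offset1):
    """Subtract the check offset interpolated linearly between times t0 and t1.

    t, t0, and t1 are times in hours; offset0 and offset1 are the measured
    calibration-check offsets (indicated minus true) at t0 and t1.
    """
    fraction = (t - t0) / (t1 - t0)
    interpolated_offset = offset0 + fraction * (offset1 - offset0)
    return value - interpolated_offset

# A measurement taken 6 hours into a 24-hour interval between checks:
print(drift_corrected(0.123, t=6.0, t0=0.0, offset0=0.002, t1=24.0, offset1=0.006))
```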
    
    
    Sec. 53.33  Test procedure for methods for lead.
    
        (a) Sample collection. Collect simultaneous 24-hour samples 
    (filters) of lead at the test site or sites with both the reference and 
    candidate methods until at least 10 filter pairs have been obtained. If 
    the conditions of Sec. 53.30(d)(4) apply, collect at least 10 common 
    samples (filters) in accordance with Sec. 53.30(d)(4) and divide each 
    to form the filter pairs.
        (b) Audit samples. Three audit samples must be obtained from the 
    Quality Assurance Branch (MD-77B), Air Measurements Research Division, 
    National Exposure Research Laboratory, U.S. Environmental Protection 
    Agency, Research Triangle Park, NC 27711. The audit samples are 3/4 x 
    8-inch glass fiber strips containing known amounts of lead at the 
    following nominal levels: 100 µg/strip; 300 µg/strip; 750 µg/strip. The 
    true amount of lead, in total µg/strip, will be provided with each 
    audit sample.
        (c) Filter analysis.
        (1) For both the reference method and the audit samples, analyze 
    each filter extract 3 times in accordance with the reference method 
    analytical procedure. The analysis of replicates should not be 
    performed sequentially (i.e., a single sample should not be analyzed 
    three times in sequence). Calculate the indicated lead concentrations 
    for the reference method samples in µg/m3 for each analysis of each 
    filter. Calculate the indicated total lead amount for the audit samples 
    in µg/strip for each analysis of each strip. Label these test results 
    as R1A, R1B, R1C, R2A, R2B, * * *, Q1A, Q1B, Q1C, * * *, where R 
    denotes results from the reference method samples; Q denotes results 
    from the audit samples; 1, 2, 3 indicate the filter number; and A, B, C 
    indicate the first, second, and third analysis of each filter, 
    respectively.
        (2) For the candidate method samples, analyze each sample filter or 
    filter extract three times and calculate, in accordance with the 
    candidate method, the indicated lead concentration in µg/m3 for each 
    analysis of each filter. Label these test results as C1A, C1B, C1C, 
    * * *, where C denotes results from the candidate method. (For 
    candidate methods which provide a direct measurement of lead 
    concentrations without a separable procedure, C1A=C1B=C1C, 
    C2A=C2B=C2C, etc.)
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.051
    
    
        (d) For the reference method, calculate the average lead 
    concentration for each filter by averaging the concentrations 
    calculated from the three analyses: where i is the filter number.
        (e) Disregard all filter pairs for which the lead concentration, as 
    determined in paragraph (d) of this section from the average of the 
    three reference method determinations, falls outside the range of 0.5 
    to 4.0 µg/m3. All remaining filter pairs must be subjected to both of 
    the following tests for precision and comparability. At least five 
    filter pairs must be within the 0.5 to 4.0 µg/m3 range for the tests to 
    be valid.
        (f) Test for precision. (1) Calculate the precision (P) of the 
    analysis (in percent) for each filter and for each method, as the 
    maximum minus the minimum divided by the average of the three 
    concentration values, as follows:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.052
    
    
    or
    [GRAPHIC] [TIFF OMITTED] TP13DE96.053
    
    
    where i indicates the filter number.
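    [Note: In symbols, the precision defined verbally in paragraph (f)(1) of 
    this section may be written as shown below. This rendering is provided 
    for clarity only; the equations shown as graphics in the official 
    document govern.]

```latex
% Reconstruction of the verbal definition in Sec. 53.33(f)(1).
P_{Ri} = \frac{\max(R_{iA},R_{iB},R_{iC}) - \min(R_{iA},R_{iB},R_{iC})}
              {\bar{R}_{i}} \times 100
\qquad
P_{Ci} = \frac{\max(C_{iA},C_{iB},C_{iC}) - \min(C_{iA},C_{iB},C_{iC})}
              {\bar{C}_{i}} \times 100
```

    where the barred quantities are the averages of the three analyses of 
    filter i for the reference and candidate methods, respectively.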
        (2) If any reference method precision value (PRi) exceeds 15 
    percent, the precision of the reference method analytical procedure is 
    out of control. Corrective action must be taken to determine the 
    source(s) of imprecision and the reference method determinations must 
    be repeated
    
    [[Page 65807]]
    
    according to paragraph (c) of this section, or the entire test 
    procedure (starting with paragraph (a) of this section) must be 
    repeated.
        (3) If any candidate method precision value (PCi) exceeds 15 
    percent, the candidate method fails the precision test.
        (4) The candidate method passes this test if all precision values 
    (i.e., all PRi's and all PCi's) are less than 15 percent.
        (g) Test for accuracy.
        (1) (i) For the audit samples calculate the average lead 
    concentration for each strip by averaging the concentrations calculated 
    from the three analyses:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.054
    
    
    where i is the audit sample number.
        (ii) Calculate the percent difference (Dq) between the 
    indicated lead concentration for each audit sample and the true lead 
    concentration (Tq) as follows:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.055
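    [Note: In symbols, the quantities defined verbally in paragraphs 
    (g)(1)(i) and (g)(1)(ii) of this section may be written as shown below, 
    with q indexing the audit samples. This rendering is provided for 
    clarity only; the equations shown as graphics in the official document 
    govern.]

```latex
% Reconstruction of the verbal definitions in Sec. 53.33(g)(1)(i)-(ii).
\bar{Q}_{q} = \frac{Q_{qA} + Q_{qB} + Q_{qC}}{3}
\qquad
D_{q} = \frac{\bar{Q}_{q} - T_{q}}{T_{q}} \times 100
```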
    
    
        (2) If any difference value (Dqi) exceeds ±5 
    percent, the accuracy of the reference method analytical procedure is 
    out of control. Corrective action must be taken to determine the source 
    of the error(s) (e.g., calibration standard discrepancies, extraction 
    problems, etc.) and the reference method and audit sample 
    determinations must be repeated according to paragraph (c) of this 
    section or the entire test procedure (starting with paragraph (a) of 
    this section) must be repeated.
        (h) Test for comparability.
        (1) For each filter pair, calculate all nine possible percent 
    differences (D) between the reference and candidate methods, using all 
    nine possible combinations of the three determinations (A, B, and C) 
    for each method, as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.056
    
    
    where i is the filter number, and n runs from 1 to 9 over the nine 
    possible combinations of the three determinations for each method 
    (j = A, B, C, candidate; k = A, B, C, reference).
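    [Note: One plausible symbolic reading of paragraph (h)(1) of this 
    section, with each percent difference referenced to the reference method 
    determination, is shown below. The exact form is an assumption; the 
    equation shown as a graphic in the official document governs.]

```latex
% Assumed form of the nine percent differences in Sec. 53.33(h)(1).
D_{in} = \frac{C_{ij} - R_{ik}}{R_{ik}} \times 100 ,
\qquad j, k \in \{A, B, C\}, \quad n = 1, \ldots, 9
```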
        (2) If none of the percent differences (D) exceeds ±20 
    percent, the candidate method passes the test for comparability.
        (3) If one or more of the percent differences (D) exceeds ±20 
    percent, the candidate method fails the test for comparability.
        (i) The candidate method must pass both the precision test and the 
    comparability test to qualify for designation as an equivalent method.
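    [Note: The following sketch is illustrative only and is not part of the 
    proposed regulatory text. It restates the overall decision logic of 
    paragraphs (f) through (i) of this section, using the 15, 5, and 20 
    percent limits stated in the text; all data structures and names are 
    hypothetical.]

```python
# Illustrative sketch of the decision logic in Sec. 53.33(f)-(i).

def lead_test_outcome(ref_precisions, cand_precisions, audit_differences,
                      percent_differences):
    """Return a short status string for the candidate lead method."""
    if any(abs(d) > 5 for d in audit_differences):
        return "reference accuracy out of control - repeat determinations"
    if any(p > 15 for p in ref_precisions):
        return "reference precision out of control - repeat determinations"
    if any(p > 15 for p in cand_precisions):
        return "fail (candidate precision)"
    if any(abs(d) > 20 for d in percent_differences):
        return "fail (comparability)"
    return "pass"

print(lead_test_outcome([3.1, 4.4], [2.8, 5.0], [1.2, -0.8, 2.0],
                        [4.5, -3.2, 7.8, -1.1]))
```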
    
    
    Sec. 53.34  Test procedure for methods for PM10 and PM2.5.
    
        (a) Collocated measurements. Set up three reference method samplers 
    collocated with three candidate method samplers or analyzers at each of 
    the number of test sites specified in Table C-4 of this subpart. At 
    each site, obtain as many sets of simultaneous PM10 or PM2.5 
    measurements as necessary (see Sec. 53.34(c)(3)), each set consisting of 
    three reference method and three candidate method measurements, all 
    obtained simultaneously. For PM2.5 Class II candidate methods, at 
    least two collocated PM10 reference method samplers are also 
    required to obtain PM2.5/PM10 ratios for each sample set. 
    Candidate PM10 method measurements shall be 24-hour integrated 
    measurements; PM2.5 measurements may be either 24- or 48-hour 
    integrated measurements. All collocated measurements in a sample set 
    must cover the same 24- or 48-hour time period. For samplers, retrieve 
    the samples promptly after sample collection and analyze each sample 
    according to the reference method or candidate method, as appropriate, 
    and determine the PM10 or PM2.5 concentration in µg/m3. If the 
    conditions of Sec. 53.30(d)(4) apply, collect sample 
    sets only with the three reference method samplers. Guidance for 
    quality assurance procedures for PM2.5 methods is found in section 
    2.12 of the Quality Assurance Handbook.
        (b) Sequential samplers. A sequential sampler shall be configured 
    for the maximum number of sequential samples and shall be set for 
    automatic collection of all samples sequentially such that the 
    test samples are collected equally, to the extent possible, among all 
    available sequential channels or utilizing the full available 
    sequential capability. At least 2 valid samples, one each above and 
    below the applicable concentration limit specified in paragraph (c)(3) 
    of this section, shall be obtained from each sequential channel in the 
    maximum-channel configuration of the sampler.
        (c) Test for comparability. (1) For each of the measurement sets, 
    calculate the average PM10 or PM2.5 concentration obtained 
    with the reference method samplers:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.057
    
    
    where R denotes results from the reference method, i is the sampler 
    number, and j is the set.
        (2)(i) For each of the measurement sets, calculate the precision of 
    the reference method PM10 or PM2.5 measurements as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.058
    
    
    if Rj is below:
    
    80 µg/m3 for PM10 methods;
    40 µg/m3 for 24-hour PM2.5 at single test sites for Class I candidate 
    methods;
    40 µg/m3 for 24-hour PM2.5 at sites having PM2.5/PM10 ratios >0.75;
    30 µg/m3 for 48-hour PM2.5 at single test sites for Class I candidate 
    methods;
    30 µg/m3 for 48-hour PM2.5 at sites having PM2.5/PM10 ratios >0.75;
    30 µg/m3 for 24-hour PM2.5 at sites having PM2.5/PM10 ratios <0.40; and
    20 µg/m3 for 48-hour PM2.5 at sites having PM2.5/PM10 ratios <0.40.
    
        (ii) Otherwise, calculate the precision of the reference method 
    PM10 or PM2.5 measurements as:
    
    
    [[Page 65808]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.059
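    [Note: The forms of Pj and RPj are given by the equations shown as 
    graphics in the official document and are not restated in the text 
    above. As a working assumption only, a conventional choice would be the 
    sample standard deviation of the three collocated reference measurements 
    and the corresponding percent coefficient of variation:]

```latex
% Assumed forms only; the graphic equations in the official document govern.
P_{j} = \sqrt{\frac{\sum_{i=1}^{3} R_{ij}^{2}
              - \tfrac{1}{3}\left(\sum_{i=1}^{3} R_{ij}\right)^{2}}{2}}
\qquad
RP_{j} = \frac{P_{j}}{\bar{R}_{j}} \times 100
```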
    
    
    
        (3) If Rj falls outside the acceptable concentration range 
    specified in Table C-4 of this subpart for any set, or if Pj or 
    RPj, as applicable, exceeds the value specified in Table C-4 of 
    this subpart for any set, that set of measurements shall be discarded. 
    For each site, Table C-4 of this subpart specifies the minimum number 
    of sample sets required for various conditions, and Sec. 53.30(b)(5) 
    specifies the PM2.5/PM10 ratio requirements applicable to 
    Class II candidate equivalent methods. Additional measurement sets 
    shall be collected and analyzed, as necessary, to provide a minimum of 
    10 acceptable measurement sets for each test site. If more than 10 
    measurement sets are collected that meet the above criteria, all such 
    measurement sets shall be used to demonstrate comparability.
        (4) For each of the acceptable measurement sets, calculate the 
    average PM10 or PM2.5 concentration obtained with the 
    candidate method samplers:
    [GRAPHIC] [TIFF OMITTED] TP13DE96.060
    
    
    where C denotes results from the candidate method, i is the sampler 
    number, and j is the set.
        (5) For each site, plot the average PM10 or PM2.5 
    measurements obtained with the candidate method (Cj) against the 
    corresponding average PM10 or PM2.5 measurements obtained 
    with the reference method (Rj). For each site, calculate and 
    record the linear regression slope and intercept, and the correlation 
    coefficient.
        (6) If the linear regression parameters calculated above meet the 
    values specified in Table C-4 of this subpart for all test sites, the 
    candidate method passes the test for comparability.
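    [Note: The following sketch is illustrative only and is not part of the 
    proposed regulatory text. It strings together the per-site steps of 
    paragraphs (c)(1) and (c)(3) through (c)(6) of this section; the 
    precision screening of paragraph (c)(2) is omitted, and the 
    concentration range and the slope, intercept, and correlation limits 
    shown are placeholders for the values in Table C-4. All names and 
    values are hypothetical.]

```python
# Illustrative per-site sketch of the comparability test in Sec. 53.34(c).
import numpy as np

def site_passes(reference_sets, candidate_sets, conc_range=(10.0, 200.0),
                slope_limits=(0.90, 1.10), intercept_limit=5.0, min_corr=0.97):
    """Screen sample sets, regress candidate on reference averages, check limits.

    The numeric defaults are placeholders, not the Table C-4 specifications.
    Precision screening of the reference measurements (Pj/RPj) is omitted.
    """
    r_avgs, c_avgs = [], []
    for ref_triplet, cand_triplet in zip(reference_sets, candidate_sets):
        r_bar = sum(ref_triplet) / 3.0            # average of 3 reference samplers
        if not (conc_range[0] <= r_bar <= conc_range[1]):
            continue                              # set discarded per (c)(3)
        r_avgs.append(r_bar)
        c_avgs.append(sum(cand_triplet) / 3.0)    # average of 3 candidate samplers
    if len(r_avgs) < 10:
        return False                              # fewer than 10 acceptable sets
    slope, intercept = np.polyfit(r_avgs, c_avgs, 1)
    corr = np.corrcoef(r_avgs, c_avgs)[0, 1]
    return (slope_limits[0] <= slope <= slope_limits[1]
            and abs(intercept) <= intercept_limit
            and corr >= min_corr)
```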
    
    Tables to Subpart C of Part 53
    
                          Table C-1.--Test Concentration Ranges, Number of Measurements Required, and Maximum Discrepancy Specification                     
    --------------------------------------------------------------------------------------------------------------------------------------------------------
                                                                                                Simultaneous measurements required               Maximum    
                                                                                       ----------------------------------------------------    discrepancy  
                   Pollutant                   Concentration range parts per million              1-hr                      24-hr            specification, 
                                                                                       ----------------------------------------------------     parts per   
                                                                                         First set    Second set   First set    Second set       million    
    --------------------------------------------------------------------------------------------------------------------------------------------------------
    Oxidants..............................  Low 0.06 to 0.10..........................            5            6  ...........  ...........              0.02
                                            Med 0.15 to 0.25..........................            5            6  ...........  ...........               .03
                                            High 0.35 to 0.45.........................            4            6  ...........  ...........               .04
                                                                                       ---------------------------------------------------------------------
                                               Total..................................           14           18  ...........  ...........  ................
                                                                                       =====================================================================
    Carbon monoxide.......................  Low 7 to 11...............................            5            6  ...........  ...........               1.5
                                            Med 20 to 30..............................            5            6  ...........  ...........               2.0
                                            High 35 to 45.............................            4            6  ...........  ...........               3.0
                                                                                       ---------------------------------------------------------------------
                                               Total..................................           14           18  ...........  ...........  ................
                                                                                       =====================================================================
    Sulfur dioxide........................  Low 0.02 to 0.05..........................  ...........  ...........            3            3              0.02
                                            Med 0.10 to 0.15..........................  ...........  ...........            2            3               .03
                                            High 0.30 to 0.50.........................            7            8            2            2               .04
                                                                                       ---------------------------------------------------------------------
                                               Total..................................            7            8            7            8  ................
                                                                                       =====================================================================
    Nitrogen dioxide......................  Low 0.02 to 0.08..........................  ...........  ...........            3            3              0.02
                                            Med 0.10 to 0.20..........................  ...........  ...........            2            3               .03
                                            High 0.25 to 0.35.........................  ...........  ...........            2            2               .03
                                                                                       ---------------------------------------------------------------------
                                               Total..................................  ...........  ...........            7            8  ................
    --------------------------------------------------------------------------------------------------------------------------------------------------------
    
    
    [[Page 65809]]
    
    
                    Table C-2.--Sequence of Test Measurements               
    ------------------------------------------------------------------------
                                               Concentration range          
              Measurement           ----------------------------------------
                                          First set           Second set    
    ------------------------------------------------------------------------
    1..............................  Low...............  Medium.            
    2..............................  High..............  High.              
    3..............................  Medium............  Low.               
    4..............................  High..............  High.              
    5..............................  Low...............  Medium.            
    6..............................  Medium............  Low.               
    7..............................  Low...............  Medium.            
    8..............................  Medium............  Low.               
    9..............................  High..............  High.              
    10.............................  Medium............  Low.               
    11.............................  High..............  Medium.            
    12.............................  Low...............  High.              
    13.............................  Medium............  Medium.            
    14.............................  Low...............  High.              
    15.............................  ..................  Low.               
    16.............................  ..................  Medium.            
    17.............................  ..................  Low.               
    18.............................  ..................  High.              
    ------------------------------------------------------------------------
    
    
            Table C-3.--Test Specifications for Lead Methods            
------------------------------------------------------------------------
                                                                        
------------------------------------------------------------------------
Concentration range, µg/m\3\...................................   0.5-4.0
Minimum number of 24-hr measurements..........................         5
Maximum analytical precision, percent.........................         5
Maximum analytical accuracy, percent..........................        10
------------------------------------------------------------------------


          Table C-4.--Test Specifications for PM10 and PM2.5 Methods
    ----------------------------------------------------------------------------------------------------------------
                                                                                        PM2.5                       
             Specification                      PM10           -----------------------------------------------------
                                                                         Class I                    Class II        
    ----------------------------------------------------------------------------------------------------------------
Acceptable concentration range   30-300...................  10-200...................  10-200
 (Rj), µg/m3.
Minimum number of test sites...  2........................  1........................  2
Number of candidate method       3........................  3........................  3
 samplers per site.
Number of reference method       3........................  3........................  3
 samplers per site.
Minimum number of acceptable
 sample sets per site for PM10:
    Rj < 80 µg/m3..............  3........................  .........................  .........................
    Rj > 80 µg/m3..............  3........................  .........................  .........................
        Total..................  10.......................  .........................  .........................
Minimum number of acceptable
 sample sets per site for
 PM2.5:
    Single test site for Class
     I candidate equivalent
     methods:
        Rj < 40 µg/m3 for 24-hr  .........................  3a.......................  .........................
         or Rj < 30 µg/m3 for
         48-hr samples.
        Rj > 40 µg/m3 for 24-hr  .........................  3a.......................  .........................
         or Rj > 30 µg/m3 for
         48-hr samples.
    Sites at which the PM2.5/
     PM10 ratio must be > 0.75:
        Rj < 40 µg/m3 for 24-hr  .........................  .........................  3a
         or Rj < 30 µg/m3 for
         48-hr samples.
        Rj > 40 µg/m3 for 24-hr  .........................  .........................  3a
         or Rj > 30 µg/m3 for
         48-hr samples.
    Sites at which the PM2.5/
     PM10 ratio must be < 0.40:
        Rj < 30 µg/m3 for 24-hr  .........................  .........................  3a
         or Rj < 20 µg/m3 for
         48-hr samples.
        Rj > 30 µg/m3 for 24-hr  .........................  .........................  3a
         or Rj > 20 µg/m3 for
         48-hr samples.
Total, each site...............  .........................  10a......................  10a
Precision of replicate           5 µg/m3 or 7%............  2 µg/m3 or 5%............  2 µg/m3 or 5%
 reference method measurements,
 Pj or RPj.
Slope of regression              1 ± 0.1..................  1 ± 0.05.................  1 ± 0.05
 relationship.
Intercept of regression          0 ± 5....................  0 ± 1....................  0 ± 1
 relationship, µg/m3.
Correlation of reference method  0.97.....................  0.97.....................  0.97
 and candidate method
 measurements.
----------------------------------------------------------------------------------------------------------------
    a For sequential samplers, at least 2 samples, one above and one below the applicable concentration limit shall 
      be obtained from each sequential channel in the maximum sequential configuration of the sampler. Therefore,   
      the number of samples in each category, and possibly the total number of samples, will be dependent on the    
      number of sequential channels available.                                                                      
    
    
    BILLING CODE 6560-50-P
    
    [[Page 65810]]
    
    FIGURES TO SUBPART C OF PART 53
    
    Figure C-1.--Suggested Format for Reporting Test Results
    [GRAPHIC] [TIFF OMITTED] TP13DE96.061
    
    
    
    [[Page 65811]]
    
    
    BILLING CODE 6560-50-C
    
    Appendix A to Subpart C of Part 53--References
    
        1. American National Standard--Specifications and Guidelines for 
    Quality Systems for Environmental Data Collection and Environmental 
    Technology Programs, ANSI/ASQC E4-1994. Available from American Society 
    for Quality Control, 611 East Wisconsin Avenue, Milwaukee, WI 53202.
        4. Subpart E is added to read as follows:
    Subpart E--Procedures for Testing Physical (Design) and Performance 
    Characteristics of Reference Methods and Class I Equivalent Methods for 
PM2.5
    Sec.
    53.50  General provisions.
    53.51  Requirements to show compliance with design specifications.
    53.52  Comprehensive procedure to test sampler performance under 
    various environmental conditions (environmental chamber tests).
    53.53  Post-sampling filter temperature control test.
    53.54  Leak check test.
    53.55  Flow rate cut-off test.
    53.56  Operational field precision test.
    53.57  Aerosol transport test for Class I sequential samplers.
    
    Tables to Subpart E of Part 53
    
    Table E-1--Test conditions for Sec. 53.52 comprehensive 24-hour 
    tests
    Table E-2--Summary of test requirements for reference and Class I 
equivalent methods for PM2.5
    
    Figures to Subpart E of Part 53
    
    Figure E-1--Designation Check List
    Figure E-2--Product Manufacturing Check List
    Figure E-3--Suggested test configuration for simulating reduced 
    barometric pressure for comprehensive test procedure (Sec. 53.52)
    
    Appendix to Subpart E of Part 53--References
    
    Subpart E--Procedures for Testing Physical (Design) and Performance 
    Characteristics of Reference Methods and Class I Equivalent Methods 
for PM2.5
    
    
    Sec. 53.50  General provisions.
    
        (a) This subpart sets forth the specific tests that must be carried 
    out and the test results, evidence, documentation, and other materials 
    that must be provided to EPA to demonstrate that a PM2.5 sampler 
    associated with a candidate reference method or Class I equivalent 
    method meets all design and performance specifications set forth in 
    Appendix L of part 50 of this chapter as well as additional 
    requirements specified in this subpart E. Some of these tests may also 
    be applicable to portions of a Class II or III equivalent method 
    sampler, as determined under subpart F of this part.
        (b) Samplers associated with candidate reference methods for 
    PM2.5 shall be subject to the provisions, specifications, and test 
    procedures prescribed in Secs. 53.51 through 53.56. Samplers associated 
with candidate Class I equivalent methods for PM2.5 shall be 
    subject to the provisions, specifications, and test procedures 
    prescribed in all sections of this Subpart. Samplers associated with 
candidate Class II or Class III equivalent methods for PM2.5 shall 
    be subject to the provisions, specifications, and test procedures 
    prescribed in all applicable sections of this Subpart, as specified in 
    subpart F of this part.
        (c) Section 53.51 pertains to test results and documentation 
    required to demonstrate compliance of a candidate method sampler with 
    the design specifications set forth in Appendix L of part 50 of this 
    chapter. Test procedures prescribed in Secs. 53.52 through 53.56 
    pertain to performance tests required to demonstrate compliance of a 
    candidate method sampler with the performance specifications set forth 
    in Appendix L of part 50 of this chapter, as well as additional 
    requirements specified in this subpart E. These latter test procedures 
    shall be used to test the performance of candidate samplers against the 
    performance specifications and requirements specified in each procedure 
and summarized in Table E-2 of this subpart.
    
    [[Page 65812]]
    
        (d) Test procedures prescribed in Sec. 53.57 do not apply to 
    candidate reference method samplers. These procedures apply primarily 
    to candidate class I equivalent method samplers for PM2.5 that 
    have a sample air flow path configuration upstream of the sample filter 
    that is modified from that specified for the reference method sampler--
as set forth in Drawings L-18 and L-24 of Appendix L to part 50 of this 
chapter--to provide for sequential sample capability. The additional 
    tests determine the adequacy of aerosol transport through any altered 
    components or supplemental devices that are used in a candidate sampler 
    upstream of the filter to achieve the sequential sample capability. 
    These tests may also apply, with appropriate adaptation, if necessary, 
    to candidate samplers having minor deviations from the specified 
    reference method sampler for purposes other than sequential operation. 
    In addition to the other test procedures in this subpart, these test 
    procedures shall be used to further test the performance of such 
    equivalent method samplers against the performance specifications given 
    in Table E-2 of this subpart.
        (e) Tests of a candidate sampler for sample flow rate capacity and 
    regulation, flow rate control, flow rate measurement accuracy, ambient 
    temperature and pressure measurement accuracy, filter temperature 
    control during sampling, and correct determination of elapsed sample 
    time, average volumetric flow rate, and flow rate variation are all 
    combined into a comprehensive test procedure (Sec. 53.52) that is 
    carried out over four 24-hour test periods under multiple test 
    conditions. Other performance parameters are tested individually with 
    specific test procedures (Secs. 53.53--53.57).
        (f) A 10-day field test of measurement precision is required for 
    both reference and equivalent method samplers. This test requires 
    collocated operation of 3 candidate method samplers at a field test 
    site. For candidate equivalent method samplers, this test may be 
    combined and carried out concurrently with the test for comparability 
    to the reference method specified under Sec. 53.34, which requires 
    collocated operation of three reference method samplers and three 
    candidate equivalent method samplers.
        (g) All tests and collection of test data shall be in accordance 
    with the requirements of Reference 1, section 4.10.5 (ISO 9001) and 
    Reference 2, Part B, section 3.3.1, paragraphs 1 and 2 and Part C, 
    section 4.6 (ANSI/ASQC E4) in appendix A of this subpart. All test data 
    and other documentation obtained specifically from or pertinent to 
    these tests shall be identified, dated, signed by the analyst 
    performing the test, and submitted to EPA in accordance with subpart A 
    of this part.
    
    
    Sec. 53.51  Requirements to show compliance with design specifications.
    
    For the purposes of this document, the definitions of ISO-registered 
facility and ISO-certified auditor are found in Sec. 53.1(t) and (u). 
An exception to EPA's reliance on ISO affiliate audits is the 
requirement that the operation or instruction manual associated with 
the candidate method be submitted to EPA prior to designation. This 
manual is required under Sec. 53.4(b)(3). The EPA has determined that 
ISO affiliates may not be able to provide the technical judgment needed 
to review this manual, and approval of this manual will therefore be 
accomplished by the EPA.
    (a) Overview. (1) In the absence of performance standards for some 
features of the FRM sampler system, and lacking the resources to 
directly review and verify that manufacturers produce samplers 
according to the design specifications in 40 CFR part 50, Appendix L, 
EPA considers it necessary to require manufacturers to meet two kinds 
of requirements to ensure their compliance with those design 
specifications.
        (2) The subsequent paragraphs of this section specify certain 
    documentation that must be submitted and tests that are required to 
    demonstrate that instruments associated with a designated reference or 
    equivalent method for PM2.5 are properly manufactured to meet all 
    applicable design specifications and have been properly tested 
    according to all applicable test requirements for such designation. 
    Documentation is required to show that instruments and components are 
    manufactured or assembled in an ISO-9001-registered (or equivalent) 
    facility under a quality system that meets ISO-9001
    
    [[Page 65813]]
    
    requirements for manufacturing quality control and testing.
        (3) In addition, specific tests are required to verify that two 
critical features of reference method samplers--impactor jet diameter 
and the surface finish of surfaces specified to be anodized--meet the 
    specifications of 40 CFR part 50, Appendix L. A checklist is required 
    to provide certification by an ISO-certified auditor that all 
    performance and other required tests have been properly and 
    appropriately conducted. Following designation of the method, another 
    checklist is required, initially and annually, to provide an ISO-
    qualified (or equivalent) auditor's certification that an adequate and 
    appropriate quality system is being implemented in the instrument 
    manufacturing process.
        (b) ISO Registration of manufacturing facility. (1) The applicant 
    must submit documentation verifying that the samplers associated with 
    the candidate method will be manufactured in an ISO 9001-registered 
    facility (as defined in Sec. 53.1(u)) and that the manufacturing 
    facility is maintained in compliance with all applicable ISO 9001 
    requirements (Reference 1 in appendix A of this subpart). The 
    documentation shall indicate the date of the original ISO 9001 
    registration for the facility and shall include a copy of the most 
    recent certification of continued ISO 9001 facility registration. If 
    the manufacturer does not wish to initiate or complete ISO 9001 
    registration for the manufacturing facility, documentation must be 
    included in the application to EPA describing an alternative method to 
    demonstrate that the facility meets the same general requirements as 
    required for ISO registration. In this case, the applicant must provide 
    documentation in the application to demonstrate, by required ISO-
    certified auditor's inspections, that a quality system is in place 
    which is adequate to document and monitor that the sampler system 
    components all conform to the design, performance and other 
    requirements specified in Appendix L of part 50 of this chapter.
        (2) Phase-in period. For a period of 1 year following the effective 
    date of this subpart, a candidate reference or equivalent method for 
    PM2.5 that utilizes a sampler manufactured in a facility that is 
    not ISO 9001-registered or otherwise approved by the EPA under 
    paragraph (b)(1) of this section may be conditionally designated as a 
    reference or equivalent method under this part. Such conditional 
    designation will be considered on the basis of evidence submitted in 
    association with the candidate method application showing that 
    appropriate efforts are currently underway to seek ISO 9001 
    registration or alternative approval of the facility's quality system 
    under paragraph (b)(1) of this section within the next 12 months. Such 
    conditional designation shall expire 1 year after the date of the 
    Federal Register notice of the conditional designation unless 
    documentation verifying successful ISO 9001 registration for the 
    facility or other EPA-acceptable quality system review and approval 
process of the production facility that will manufacture the samplers is 
    submitted at least 30 days prior to the expiration date.
        (c) Sampler Manufacturing Quality Control. The manufacturer must 
    ensure that all components used in the manufacture of PM2.5 
    samplers to be sold as reference or equivalent methods and that are 
    specified by design in Appendix L of part 50 of this chapter are 
    fabricated or manufactured exactly as specified. If the manufacturer's 
    QC records show that its QC and QA system of standard process control 
    inspections (of a set number and frequency of testing that is less than 
    100%) complies with the applicable QA provisions of section 4 of 
    Reference 4 in Appendix A of this subpart and prevents nonconformances, 
    100% testing shall not be required until that conclusion is disproved 
    by customer return or other independent manufacturer or customer test 
    records. If problems are uncovered, inspection to verify conformance to 
    the drawings, specifications, and tolerances shall be performed. See 
    also paragraph (e) of this section (final assembly and inspection 
    requirements).
        (d) Specific tests and supporting documentation required to verify 
    conformance to critical component specifications. (1) Verification of 
    PM2.5 impactor jet diameter.  The diameter of the jet of each 
    impactor manufactured for a PM2.5 sampler under the impactor 
    design specifications set forth in Appendix L of part 50 of this 
    chapter shall be verified against the tolerance specified on the 
    drawing, using standard, NIST-traceable plug gages. This test shall be 
    a final check of the jet diameter following all fabrication operations, 
    and a record shall be kept of this final check. Submit evidence that 
    this procedure is incorporated in the ISO 9001-certified manufacturing 
    procedure, that the test is or will be routinely implemented, and that 
    an appropriate procedure is in place for the disposition of units that 
    fail this tolerance test.
        (2) Verification of surface finish. The anodization process used to 
    treat surfaces specified to be anodized shall be verified by testing 
    treated specimen surfaces for weight and corrosion resistance to ensure 
    that the coating obtained conforms to the coating specification. The 
    specimen surfaces shall be finished in accordance with military 
    standard specification 8625F, Type II, Class I (Reference 4) in the 
    same way the sampler surfaces are finished, and tested, prior to 
    sealing, as specified in Section 4.5.2 of Reference 4 in Appendix A of 
    this subpart.
        (e) Final assembly and inspection requirements. Each sampler shall 
    be tested after manufacture and before delivery to the final user. Each 
    manufacturer shall document its post-manufacturing test procedures. As 
    a minimum, each test shall consist of the following: Tests of the 
    overall integrity of the sampler, including leak tests; calibration or 
    verification of the calibration of the flow measurement device, 
    barometric pressure sensors, and temperature sensors; and operation of 
    the sampler with a filter in place over a period of at least 48 hours. 
    The results of each test shall be suitably documented and shall be 
    subject to review by an ISO 9001 auditor.
        (f) Manufacturer's audit checklists. Manufacturers shall require 
    ISO 9001 auditors to sign and date a statement indicating that the 
    auditor is aware of the appropriate manufacturing specifications 
    contained in Appendix L of part 50 of this chapter and the test or 
    verification requirements in this subpart. Manufacturers shall also 
    require ISO 9001 auditors to complete the checklists, shown in Figures 
    E-1 and E-2 of this subpart, which describe the manufacturer's ability 
    to meet the requirements of the standard for both designation testing 
    and product
    
    [[Page 65814]]
    
    manufacture. Refer to Reference 5 for additional guidance on the scope 
    and detail required for the checklist evaluations.
        (1) Designation testing checklist. The completed statement and 
    checklist as shown in Figure E-1 of this subpart shall be submitted 
    with the application for reference or equivalent method determination.
        (2) Product manufacturing checklist. Manufacturers shall require 
    ISO 9001 auditors to complete the attached Production Checklist, which 
    evaluates the manufacturer on its ability to meet the requirements of 
    the standard in maintaining quality control in the production of 
    reference or equivalent devices. The completed statement and checklist 
    shall be submitted with the application for reference or equivalent 
    method determination. As set forth in subpart A of this part, this 
    checklist must be completed and submitted annually to retain a 
    reference or equivalent method designation for a PM2.5 method.
        (3) If the conditions of paragraph (b)(2) of this section apply, a 
    candidate reference or equivalent method for PM2.5 may be 
    conditionally designated as a reference or equivalent method under this 
    part 53 without the submission of the checklists described in 
    paragraphs (f) (1) and (2) of this section. Such conditional 
    designation shall expire 1 year after the date of the Federal Register 
    notice of the conditional designation unless the checklists are 
    submitted at least 30 days prior to the expiration date.
    
    
    Sec. 53.52  Comprehensive procedure to test sampler performance under 
    various environmental conditions (environmental chamber tests).
    
        (a) Overview. This test procedure is a combined procedure to test 
    the following performance parameters:
        (1) Sample flow rate, flow rate regulation, and flow rate 
    measurement accuracy;
        (2) Ambient air temperature and barometric pressure measurement 
    accuracy;
        (3) Filter temperature control during sampling; and
        (4) Elapsed sampling time accuracy.
        The performance parameters tested under this procedure, the 
    corresponding minimum performance specifications, and the applicable 
    test conditions are summarized in Table E-2 of this subpart. Each 
    performance parameter tested, as described or determined in the test 
    procedure, must meet or exceed the performance specification given in 
    Table E-2 of this subpart. The candidate sampler must meet all 
    specifications for the associated PM2.5 method to be considered 
    for designation as a reference or equivalent method.
        (b) Technical definition. Sample flow rate means the quantitative 
    volumetric flow rate of the air stream caused by the sampler to enter 
    the sampler inlet and pass through the sample filter, measured in 
    actual volume units at the temperature and pressure of the air as it 
    enters the inlet.
        (c) Required test equipment.
        (1) Environmental chamber or other temperature-controlled 
    environment or environments, capable of obtaining and maintaining the 
various temperatures between -20  deg.C and +40  deg.C as required for 
the test with an accuracy of ±2  deg.C. The test 
    environment(s) must be capable of maintaining temperature within the 
    specified limits continuously with the additional heat load of the 
    operating test sampler in the environment. [Henceforth, where the test 
    procedures specify a test or environmental ``chamber,'' an alternative 
    temperature-controlled environmental area or areas may be substituted, 
    provided the required test temperatures and all other test requirements 
    are met. See paragraph (f)(1) of this section]
        (2) Variable voltage ac power transformer, range 100 to 130 Vac, 
    with sufficient VA capacity to operate the test sampler continuously 
    under the test conditions.
        (3) Flow rate meter, suitable for measuring the actual volumetric 
    sampler flow rate at the sampler downtube in either an open system or 
    in a closed system operating below atmospheric pressure, range 10 to 25 
actual L/min, ±2 percent certified accuracy, NIST-traceable, over a 
    temperature range of -30  deg.C to +50  deg.C and pressure range of 600 
    to 800 mm Hg, with continuous (analog) recording capability or digital 
    recording at intervals of not more than 5 minutes. Mass flow meter type 
    recommended; however, note that temperature and pressure corrections 
    are generally required to convert measured mass flow rate to actual 
    volumetric flow rate.
        (4) Ambient air temperature recorder, range -30 deg.C to +50 deg.C, 
    certified accurate to within 0.5  deg.C with a radiation error of 0.2 
    deg.C or less under a solar radiation intensity of 1000 watts/m2, 
    as described in Reference 6 in appendix A of this subpart.
        (5) Barometric pressure meter, range 600 to 800 mm Hg, certified 
accurate to ±2 mm Hg.
        (6) Miniature temperature sensor, capable of being installed in the 
    sampler without introducing air leakage and capable of measuring the 
    sample air temperature within 1 cm of the center of the filter, 
    downstream of the filter, certified accurate to within 0.5  deg.C, NIST 
    traceable, with continuous (analog) recording capability or digital 
    recording at intervals of not more than 5 minutes.
        (7) Means for creating or simulating the effect of a reduced 
    barometric pressure on the test sampler during sampler operation, 
    capable of simulating barometric pressures ranging from 730 to 600 mm 
    Hg. A suggested, closed-system technique for a hypothetical sampler is 
    illustrated in Figure E-3 of this subpart, but the configuration shown 
    may have to be modified or adapted to accommodate the specific design 
    of the actual candidate method sampler. The sampler-specific technique 
    or apparatus proposed by the applicant for simulating barometric 
    pressure for purposes of this test may be submitted for pre-approval of 
    concept prior to conducting the test. Alternatively, a hypobarometric 
    chamber or other test environment with capability of maintaining 
    barometric pressures ranging from local actual barometric pressure to 
    600 mm Hg, as well as the temperature capability specified in paragraph 
    (c)(1) of this section, shall be used.
        (8) Means, such as a solar-spectrum lamp or lamps, for generating 
    or simulating thermal radiation in approximate spectral content and 
    intensity equivalent to solar insolation of 1000 watts/m2 (1.43 
    langleys/min) inside the environmental chamber.
    (9) AC rms voltmeter, accurate to ±0.5 volts.
        (10) Means for creating an additional pressure drop of 55 mm Hg in 
    the sampler to simulate a heavily loaded filter, such as an orifice or 
    flow restrictive plate installed in the filter holder or a valve or 
    other flow restrictor temporarily installed in the flow path near the 
    filter.
        (11) Time measurement system, accurate to within 10 seconds per 
    day.
        (12) Radiometer, to measure the intensity of the simulated solar 
radiation in the test environment, range 0 to 1500 watts/m2.
        (d) Calibration of test measurement instruments. Submit 
    documentation showing evidence of recent calibration, calibration 
    accuracy, and NIST-traceability (if required) of all measurement 
    instruments used in the tests. The accuracy of flow meters shall be 
    verified at the highest and lowest pressures and temperatures used in 
    the tests and shall be checked at zero and one or more non-zero flow 
    rates within 7 days of test use. Where an instrument's measurements are 
    to be recorded with an analog recording
    
    [[Page 65815]]
    
    device, the accuracy of the entire instrument-recorder system shall be 
    calibrated or verified.
        (e) Test setup. (1) The test sampler shall be set up for testing in 
    the temperature-controlled chamber. Setup of the sampler shall be 
    performed as described in the sampler's operation or instruction manual 
    referred to in Sec. 53.4(b)(3). The sampler shall be installed upright 
    and set up in its normal configuration for collecting PM2.5 
    samples, except that the sample air inlet shall be removed to permit 
    measurement of the sampler flow rate.
        (2) The certified flow rate meter shall be connected to the test 
    sampler so as to accurately measure the sampler flow rate at the 
    entrance to the sampler (i.e., the flow rate that would enter the 
    sampler inlet if the inlet had not been removed).
        (3) The sampler shall be provided with ac line power from the 
    variable voltage ac power transformer, which shall be initially set to 
    a nominal voltage of 115 volts ac (rms).
        (4) The miniature temperature sensor shall be installed in the test 
    sampler such that it accurately measures the air temperature 1 cm from 
    the center of the filter on the downstream side of the filter. The 
    sensor shall be installed in a way such that no external or internal 
    leakage is created by the sensor installation.
        (5) If a closed-system means for simulating reduced barometric 
    pressure in the sampler, as suggested in paragraph (c)(7) of this 
    section, is to be used in lieu of a hypobarometric chamber, the 
    necessary apparatus shall be installed on the test sampler as 
    appropriate, in such a way that the certified flow rate meter will 
    still accurately measure the sampler flow rate. Also, the barometric 
    pressure meter shall be installed to accurately measure the simulated 
    or actual reduced barometric pressure to which the sampler is subjected 
    during the test.
        (6) The solar radiant energy source shall be installed in the test 
    chamber such that the entire test sampler is irradiated in a manner 
    similar to the way it would be irradiated by solar radiation if it were 
    located outdoors in an open area on a sunny day, with the radiation 
    arriving at an angle of between 30 and 45 degrees from vertical and 
    such that the intensity of the radiation received by all sampler 
surfaces that receive direct radiation is not less than 1000 watts/
m2, measured in a plane perpendicular to the incident radiation. 
    The incident radiation shall be oriented with respect to the sampler 
    such that the area of the sampler's ambient temperature sensor (or 
    temperature shield) receives direct radiation as it would or could 
    during normal outdoor installation. Also, the sensor must not be 
    shielded from the radiation by a sampler part in a way that would not 
    occur at other normal insolation angles or directions.
        (7) The ambient air temperature recorder shall be installed in the 
    test chamber such that it will accurately measure the temperature of 
    the air in the chamber without being unduly affected by the chamber's 
    air temperature control system or by the radiant energy from the solar 
    radiation source that may be present inside the test chamber.
        (f) Procedure. (1) The test sampler shall be tested during 
    operation over four (4) 24-hour sample collection periods (Test numbers 
    1--4) under the conditions specified in Table E-1 of this subpart. The 
    test chamber temperature shall be held at the specified initial 
    temperature for the first 8 hours of each test period, during which 
    various performance parameters are measured. During hours 9 through 21 
    of each test period, the chamber temperature is transitioned from the 
    initial to the final specified temperature; the temperature profile is 
    unspecified during this period, provided that the final specified 
    temperature is achieved before the start of hour 22 of each test 
    period. The specified final temperature shall be maintained during 
    hours 22 through 24 of each test period.
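    As an illustration only (not part of the proposed regulatory text), 
the following minimal Python sketch shows one way a recorded chamber-
temperature log could be screened against the schedule described above. 
The log format, the ±2  deg.C tolerance carried over from paragraph 
(c)(1) of this section, and the initial and final set points are 
assumptions made for this example; the conditions actually required are 
those specified in Table E-1 of this subpart.

# Minimal sketch; assumes a log of (elapsed_hours, temp_degC) pairs
# spanning the 24-hour test period.
def check_chamber_profile(log, t_initial, t_final, tol=2.0):
    for hours, temp in log:
        if hours <= 8.0:
            # Hours 1-8: the specified initial temperature must be held.
            if abs(temp - t_initial) > tol:
                return False
        elif hours >= 21.0:
            # Hours 22-24: the specified final temperature must be held.
            if abs(temp - t_final) > tol:
                return False
        # Hours 9-21: transition; the profile is unspecified by the rule.
    return True

# Example: a 24-hour test transitioning from -20 deg.C to +40 deg.C.
log = [(0.5, -20.3), (8.0, -19.5), (15.0, 10.2), (22.5, 39.8), (24.0, 40.1)]
print(check_chamber_profile(log, t_initial=-20.0, t_final=40.0))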
        (2) Prepare the test sampler for normal sample collection operation 
    as directed in the sampler's operation or instructional manual. If the 
    sampler has multiple (sequential) sample capability, this capability 
    may be used for the four 24-hour tests, if desired. Convenient start 
and stop times for a 24 ± 0.1 hour test period shall be set in 
    the test sampler to effect automatic sampler operation for each test 
    period. Test periods are not required to start at midnight; each test 
    period may start at any time of day.
        (3) Carry out a leak test of the sampler as described in the 
    sampler's operation manual. The leak test must be properly passed 
    before other tests are carried out.
        (4) At the beginning of each test period, the solar insolation 
    source, as described in paragraph (c)(8) of this section, shall be off, 
    and the sampler shall be subject to barometric pressure of not less 
    than 730 mm Hg.
        (5) During each 24-hour test period, continuously record the test 
    chamber air temperature, the filter temperature, and the sampler flow 
    rate, as measured by the test equipment [paragraph (c) of this 
    section], either via a continuous analog recording or digital recording 
    at intervals of not more than 5 minutes. Note and record the actual 
    start and stop times for the sample period. The sampler power line 
    voltage shall be measured and recorded during hours 1 and 24 of the 
    test period and following completion of the specific performance 
    parameter tests during the initial 8-hour portion of the test period.
        (6) The following tests shall be carried out at some time during 
    hours 1-8 of each 24-hour test period. The time at which the test data 
    for each test are obtained (either time of day or elapsed time since 
    the start of the 24-hour test period, whichever system is used to 
    record flow rate and chamber temperature, to the closest 1 minute) 
    shall be recorded along with the test data. If analog recording is 
    used, the time of each test shall be identified or annotated directly 
    on the strip chart record.
        (i) Determine and record the sampler flow rate, in actual 
    volumetric units, indicated by the sampler, and the corresponding flow 
    rate measured by the flow rate test meter specified in paragraph (c)(3) 
    of this section.
        (ii) Determine and record the ambient (chamber) temperature 
    indicated by the sampler and the corresponding ambient (chamber) 
    temperature measured by the ambient temperature recorder specified in 
    paragraph (c)(4) of this section.
        (iii) Determine and record the ambient (chamber) barometric 
    pressure indicated by the sampler and the corresponding ambient 
    (chamber) barometric pressure measured by the barometric pressure meter 
    specified in paragraph (c)(5) of this section.
        (iv) Activate the solar radiation source; after at least 2 hours 
    (120 minutes) of sampler operation following the start of simulated 
    insolation exposure, repeat tests in paragraphs (f)(6) (i) and (ii) of 
    this section under continuation of the insolation exposure.
        (v) Activate the solar radiation source; after at least 2 hours 
    (120 minutes) of sampler operation following the start of simulated 
    solar insolation exposure, subject the sampler to a barometric pressure 
    (actual or simulated) of 600 mm Hg (absolute) while 
    continuing the insolation exposure. After at least 1 hour (60 minutes) 
    of sampler operation at this barometric pressure, repeat tests in 
    paragraphs (f)(6) (i), (ii), and (iii) of this section under 
    continuation of the reduced barometric pressure and insolation 
    exposure.
        (vi) Activate the solar radiation source; after at least 2 hours 
    (120 minutes) of sampler operation following the start of insolation 
    exposure, subject the sampler to a barometric pressure (actual or 
    simulated) of 600 mm Hg
    
    [[Page 65816]]
    
    while continuing the insolation exposure. After at least 1 hour (60 
    minutes) of sampler operation at this barometric pressure, provide an 
    additional filter pressure drop of 55 mm Hg, as specified in paragraph 
(c)(10) of this section and repeat tests in paragraphs (f)(6)(i) and 
    (iii) of this section under continuation of the reduced barometric 
    pressure, increased pressure drop, and insolation exposure. One or more 
    of the power interruptions required in paragraph (f) (6)(vii) of this 
    section may be used, if appropriate, to make necessary adjustments to 
    the sampler to effect the additional filter pressure drop.
        (vii) Interrupt the ac line electrical power to the sampler for 
    periods of 20 seconds, 40 seconds, 2 minutes, 7 minutes, and 20 
    minutes, with not less than 5 minutes of electrical power, at the 
    voltage specified for the test, between each power interruption. Record 
    the hour and minute of each power interruption.
        (7) After completing the special tests under paragraph (f)(6) of 
    this section, the remainder of the 24-hour test period may be completed 
    with the test sampler subjected to any barometric pressure within the 
    range specified in Table
    E-2 of this subpart, with or without the additional filter pressure 
    drop, and with the solar radiation either off or on.
        (g) Test Results. All requirements in this procedure must be passed 
    in full for each of the four 24-hour tests; no provision is made for 
    additional trials to compensate for failed tests. For each of the four 
    24-hour test periods, validate the test conditions and determine the 
    test results as follows:
        (1) Chamber temperature control. Examine the continuous record of 
    the chamber temperature obtained in test procedure paragraph (f)(5) of 
    this section and verify that the temperature met the requirements 
    specified in Table E-1 of this subpart at all times during the test. If 
    not, the entire 24-hour test is not valid and must be repeated.
        (2) Power line voltage. Verify that each of the three power line 
    voltage measurements obtained in test procedure in paragraph (f)(5) of 
    this section met the line voltage requirements specified in Table E-1 
    of this subpart. If not, the entire 24-hour test is not valid and must 
    be repeated.
        (3) Sample flow rate. (i) From the continuous record of the test 
    sampler flow rate obtained from the flow rate meter in test procedure 
paragraph (f)(5) of this section, determine the instantaneous or 
average sampler flow rate at intervals of 
    not more than 5 minutes for the entire 24-hour sample period. Calculate 
    the percent difference between the sampler interval flow rate, in 
    actual liters per minute (L/min), and 16.67 L/min, for each interval in 
    test procedures in paragraphs (f)(6)(i), (6)(iv), (6)(v), and (6)(vi) 
    of this section, as follows:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.062
    
    
    Where Fi is the measured sampler flow rate for interval I, in 
    actual L/min.
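    The graphic itself is omitted from this text rendering. Based on the 
definitions above, the interval percent difference presumably takes the 
following form (a reconstruction offered as a reading aid, in LaTeX 
notation, not the official equation):

    \%D_i = \frac{F_i - 16.67}{16.67} \times 100\%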
        (ii) All calculated sampler flow rate percent differences must meet 
    the sample flow rate specification listed in Table E-2 of this subpart.
        (4) Sample flow rate regulation. (i) Using the sampler interval 
    flow rates obtained in paragraph (g)(3) of this section, calculate the 
    average sampler flow rate in actual liters per minute for the 24-hour 
    period, excluding periods of electrical power interruption, as,
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.063
    
    
    where
    Fave = average sampler flow rate over the 24-hour test period,
    Fi = sampler flow rate for interval I
    n = number of flow intervals over the 24-hour period, excluding 
    intervals of no flow rate during power interruptions.
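    The omitted graphic presumably corresponds to the ordinary arithmetic 
mean of the interval flow rates (a reconstruction from the definitions 
above, not the official equation):

    F_{ave} = \frac{1}{n} \sum_{i=1}^{n} F_i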
        (ii) For each interval over the 24-hour period, calculate the 
    difference between the interval sampler flow rate and the average 
    sampler flow rate. The difference between the interval sampler flow 
    rate and the average sampler flow rate must meet the flow rate 
    regulation specification listed in Table E-2 of this subpart for all 
    intervals during the 24-hour test period, excluding periods of 
    electrical power interruption.
        (5) Sample flow rate coefficient of variation. (i) Using the 
    sampler interval flow rates determined in paragraph (g)(3) of this 
    section, calculate the sampler flow rate coefficient of variation, 
    CVflow as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.064
    
    
    Where
    
    CVflow = coefficient of variation of sampler flow rate, and 
    Fave, Fi, I, and n are as defined previously.
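    The omitted graphic presumably corresponds to the conventional sample 
coefficient of variation expressed as a percentage of the mean flow 
rate (a reconstruction consistent with the definitions above, not the 
official equation):

    CV_{flow} = \frac{100\%}{F_{ave}} \sqrt{\frac{\sum_{i=1}^{n} (F_i - F_{ave})^2}{n - 1}}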
    
        (ii) The CVflow calculated must meet the sampler flow rate 
    coefficient of variation specification listed in Table E-2 of this 
    subpart for each test. Also the coefficient of variation reported by 
    the sampler at the end of the sample period must agree with CVflow 
    calculated here within 0.5%.
        (6) Flow rate measurement accuracy. (i)(A) Calculate the percent 
    difference between the sampler flow rate, in actual liters per minute 
    (L/min), indicated by the sampler, and the sampler flow rate measured 
    with the flow rate test meter [paragraph (c)(3) of this section] in 
    test procedures in paragraphs (f) (6)(i), (6)(iv), (6)(v), and (6)(vi) 
    of this section, for each set of measurements as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.065
    
    
    Where
    
    Fsi = sampler flow rate indicated by the sampler, in actual L/
    min., for measurement set I.
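    Based on the definitions above, the omitted graphic presumably 
expresses the difference as a percentage of the test meter reading (a 
reconstruction, not the official equation):

    \%D_i = \frac{F_{si} - F_i}{F_i} \times 100\%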
    
        (B) All calculated sampler flow rate percent differences must meet 
    the flow
    
    [[Page 65817]]
    
    rate measurement accuracy specification listed in Table E-2 of this 
    subpart.
        (ii)(A) Obtain the value for the average sampler volumetric flow 
    rate reported by the sampler at the end of the sample period and 
    calculate the percent difference between the reported average sampler 
flow rate and the average flow rate determined in paragraph (g)(4) of 
    this section as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.066
    
    
    Where
    
    Fs,ave = average sampler flow rate reported by the sampler.
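    The omitted graphic presumably takes the analogous form for the 
24-hour averages (a reconstruction, not the official equation):

    \%D = \frac{F_{s,ave} - F_{ave}}{F_{ave}} \times 100\%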
    
        (B) This calculated percent difference must also meet the flow rate 
    measurement accuracy specification listed in Table E-2 of this subpart.
        (7) Ambient temperature measurement accuracy. (i) Calculate the 
    difference between the ambient air temperature indicated by the sampler 
    and the ambient (chamber) air temperature measured with the ambient air 
    temperature recorder, paragraph (c)(4) of this section, in test 
    procedures paragraphs (f) (6)(ii), (6)(iv), and (6)(v) of this section, 
    as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.067
    
    
    Where
    
    Ts = ambient air temperature indicated by the sampler,  deg.C; and
    Tm = ambient air temperature measured by the test temperature 
    instrument,  deg.C.
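    The omitted graphic presumably reduces to a simple difference of the 
two readings (a reconstruction from the definitions above):

    \Delta T = T_s - T_m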
    
        (ii) All calculated temperature differences must meet the ambient 
    air temperature measurement accuracy specification listed in Table E-2 
    of this subpart.
        (8) Ambient barometric pressure measurement accuracy. (i) Calculate 
    the difference between the ambient barometric pressure indicated by the 
    sampler and the ambient barometric pressure measured with the ambient 
    barometric pressure meter, paragraph (c)(5) of this section, in test 
    procedures in paragraphs (f)(6)(iii), (6)(v), and (6)(vi) of this 
    section, as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.068
    
    
    Where
    
    Ps=ambient barometric pressure indicated by the sampler, mm Hg; 
    and
    Pm=ambient barometric pressure measured by the test barometric 
    pressure meter, mm Hg.
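    As with the temperature check above, the omitted graphic presumably 
reduces to a simple difference of the two readings (a reconstruction 
from the definitions above):

    \Delta P = P_s - P_m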
    
        (ii) All calculated differences for barometric pressure must meet 
    the ambient barometric pressure measurement accuracy specification 
    listed in Table E-2 of this subpart.
        (9)(i) Filter temperature control (sampling). From the continuous 
    record of the test sampler filter temperature obtained from the filter 
    temperature sensor, paragraphs (c)(6) and (e)(4) of this section, in 
    test procedure in paragraph (f)(5) of this section, determine the 
    measured instantaneous or average filter temperature at intervals of 
    not more than 5 minutes for the entire 24-hour sample period. From the 
    continuous record of the ambient air temperature obtained from the 
    ambient (chamber) air temperature recorder, paragraph (c)(4) of this 
    section, in test procedure paragraph (f)(5) of this section, determine 
    the measured instantaneous or average ambient (chamber) air temperature 
    at intervals of not more than 5 minutes for the entire 24-hour sample 
    period. For each interval over the 24-hour period (excluding intervals 
    during power interruptions), calculate the difference, in  deg.C, 
    between the measured interval filter temperature and the measured 
    interval ambient temperature for the corresponding interval, as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.069
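    The graphic is omitted from this text rendering. For each interval i, 
the quantity presumably computed is the difference between the interval 
filter temperature and the corresponding interval ambient temperature; 
the symbols T_{f,i} and T_{a,i} below are introduced here only for 
illustration and do not appear in the official text:

    \Delta T_i = T_{f,i} - T_{a,i}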
    
    
        (ii) The difference between the interval filter temperature and the 
    interval average ambient temperature for all intervals must meet the 
    filter temperature control specification listed in Table E-2 of this 
    subpart, excluding periods of electrical power interruption.
        (10) Elapsed sample time accuracy. Calculate the sample time for 
    the 24-hour sample period as the difference between the sample end time 
    and the sample start time, as recorded in paragraph (f)(5) of this 
    section, less the total time duration of all power interruptions. The 
difference between the actual sample time calculated and the sample 
    time reported by the sampler at the end of the sample period must meet 
    the elapsed sample time accuracy specification listed in Table E-2 of 
    this subpart.
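    As a reading aid only (not part of the proposed regulatory text), the 
calculation described above can be written, with t_{start}, t_{stop}, 
and t_{int,k} denoting the recorded start time, stop time, and duration 
of the k-th power interruption (symbols introduced here for 
illustration):

    t_{sample} = (t_{stop} - t_{start}) - \sum_{k} t_{int,k}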
        (11) Record of power interruptions. Verify that the sampler 
    provides a visual display of the correct year, month, day-of-month, 
hour, and minute, within ±2 minutes, of the start of each 
    power interruption of more than 60 seconds.
    
    
    Sec. 53.53  Post-sampling filter temperature control test.
    
        (a) Overview. This procedure provides for testing the temperature 
    control of the sample filter during the post-sampling (non-sampling) 
    mode following sample collection. The test conditions and performance 
    specifications are summarized in Table E-2 of this subpart. This 
    performance parameter, when tested or determined as described in this 
    test procedure, must meet or exceed the performance specification given 
    in Table E-2 of this subpart for the associated PM2.5 method to be 
    considered for designation as a reference or equivalent method.
        (b) Technical Definition. Post-sampling temperature control is the 
    ability of a sampler to maintain the temperature of the particulate 
    matter sample filter within the specified deviation from ambient 
    temperature during the period between the end of active sample 
    collection of the PM2.5 sample by the sampler until the filter is 
    retrieved from the sampler for laboratory analysis.
        (c) Required test equipment. (1) Environmental chamber or other
    
    [[Page 65818]]
    
    temperature-controlled environment or environments, capable of 
obtaining and maintaining the various temperatures between -20  deg.C 
and +40  deg.C as required for the test with an accuracy of 
±2  deg.C. The test environment(s) must be capable of 
    maintaining temperature within the specified limits continuously with 
    the additional heat load of the operating test sampler in the 
    environment. [Henceforth, where the test procedures specify a test or 
    environmental ``chamber,'' an alternative temperature-controlled 
    environmental area or areas may be substituted, provided the required 
    test temperatures and all other test requirements are met. See 
    Sec. 53.52(f)(1)].
        (2) Variable voltage ac power transformer, range 100 to 130 Vac, 
    with sufficient VA capacity to operate the sampler continuously under 
    test conditions.
        (3) Ambient air temperature recorder, range -30 deg.C to +50 deg.C, 
    certified accurate to within 0.5  deg.C with a radiation error of 0.2 
    deg.C or less under a solar radiation intensity of 1000 watts/m2, 
    as described in Reference 6 in Appendix A of this subpart.
        (4) Miniature temperature sensor, capable of being installed in the 
    sampler without introducing air leakage and capable of measuring the 
    sample air temperature within 1 cm of the center of the filter, 
    downstream of the filter, certified accurate to within 0.5 deg.C, NIST 
    traceable, with continuous (analog) recording capability or digital 
    recording at intervals of not more than 5 minutes.
        (5) Means, such as a solar-spectrum lamp or lamps, for generating 
    or simulating thermal radiation in approximate spectral content and 
    intensity equivalent to solar insolation of 1000 watts/m2, inside 
    the environmental chamber.
    (6) AC rms voltmeter, accurate to ±0.5 volts.
    (7) Time measurement system, accurate to ±10 seconds per day.
        (d) Calibration of test measurement instruments. Submit 
    documentation showing evidence of recent calibration, calibration 
    accuracy, and NIST-traceability (if required) of all measurement 
    instruments used for the tests. Where an instrument's measurements are 
    to be recorded with an analog recording device, the accuracy of the 
    entire instrument-recorder system shall be calibrated or verified.
        (e) Test Setup. (1) The test sampler shall be set up for testing in 
    the temperature-controlled chamber. Setup of the sampler shall be 
    performed as described in the sampler's operation or instruction manual 
    referred to in Sec. 53.4 (b)(3). The sampler shall be installed upright 
    and set up in its normal configuration for collecting PM2.5 
    samples with a filter installed, except that the sample air inlet may 
    be removed, if desired.
        (2) The sampler shall be provided ac line power from the variable 
    voltage ac power transformer, which shall be set to provide power to 
the sampler at a voltage of 105 ± 1 volts ac (rms) during 
    this test.
        (3) The miniature temperature sensor shall be installed in the test 
    sampler such that it accurately measures the temperature of the air 1 
    cm from the center of the filter on the downstream side of the filter.
        (4) The solar radiant energy source shall be installed in the test 
    chamber such that the entire test sampler is irradiated in a manner 
    similar to the way it would be irradiated by solar radiation if it were 
    located outdoors in an open area on a sunny day, with the radiation 
    arriving at an angle of between 30 and 45 degrees from vertical and 
    such that the intensity of the radiation received by all sampler 
    surfaces that receive direct radiation is not less than 1000 watts/
    m2 (measured in a plane perpendicular to the incident radiation). 
    The incident radiation shall be oriented with respect to the sampler 
    such that the area of the sampler's ambient temperature sensor (or 
    temperature sensor shield) receives direct radiation as it would or 
    could during normal outdoor installation. Also, the sensor must not be 
    shielded from the radiation by a sampler part in a way that would not 
    occur at other normal insolation angles or directions.
        (5) The ambient air temperature recorder shall be installed in the 
    test chamber such that it will accurately measure the temperature of 
    the air in the chamber without being unduly affected by the chamber's 
    air temperature control system or by the radiant energy from the solar 
    radiation that may be present inside the test chamber.
        (f) Procedure. (1) The test sampler shall be tested during 
    operation in the post-sample collection operational mode (operation of 
    the sampler during the period from the end of active sample collection 
    of the PM2.5 sample by the sampler until the filter is retrieved 
    from the sampler for laboratory analysis) over seven (7) hours, 
    following one of the 24-hour tests described in Sec. 53.52. The test 
    chamber temperature shall be initially set to -20  deg.C, 
    raised to 40  deg.C, held at 40  deg.C for one 
    hour, then reduced to -20  deg.C during the test.
        (2) Prepare the sampler for the test by allowing the sampler to 
    operate for a normal 24-hour sample collection period, as directed in 
    the sampler's operation or instruction manual. If the sampler has 
    multiple (sequential) sample capability, any of the sequential channels 
    may be used for the test; however, if the sampler has multiple filter 
    holders, each filter holder must be tested for temperature control. 
Convenient start and stop times for a 24 ± 0.1 hour sample 
collection period shall be set in the sampler to effect automatic 
sampler operation for each test period. The active sample collection 
period may start at any time of day and is not required to start at 
midnight. One or more of the test periods associated with the test 
procedure set forth in Sec. 53.52 may be used for this test 
    preparation.
        (3) At the beginning of the 7-hour test period, the solar 
    insolation source, as described in paragraphs (c)(4) and (e)(4) of this 
    section, shall be on, the ambient (chamber) temperature shall be set to 
    -20  deg.C, and the sampler power line voltage shall be set 
to 105 ± 1 volts ac (rms).
        (4) During the 7-hour test period, continuously record the test 
    chamber air temperature and the filter temperature, as measured by the 
    test equipment in paragraph (c) of this section, either via a 
    continuous analog recording or digital recording at intervals of not 
    more than 5 minutes. Note and record the actual start and stop times 
    for the sample period. The sampler power line voltage shall be measured 
    during hours 1 and 7 of the test and at any other time during the test 
    period when there is a possibility that the voltage may have changed.
        (5) During the first 3 hours of the test, the chamber air 
    temperature shall be increased such that the chamber air temperature is 
    40  deg.C 3 hours after the beginning of the test. The 
    chamber air temperature shall be maintained at 40  deg.C for 
    one hour (until 4 hours after the beginning of the test), then 
    decreased over the next 3 hours of the test such that the chamber air 
    temperature is -20  deg.C at the end of the test (7 hours 
after the beginning of the test). The chamber air temperature profile 
    during the first and last three hours of the test is unspecified, 
    provided the initial, central hour, and final temperatures are as 
    specified in paragraph (f)(1) of this section.
        (g) Test Results--(1) Filter temperature control (post-sampling). 
    From the continuous record of the test sampler filter temperature 
obtained from the filter temperature sensor, paragraphs (c)(4) and 
(e)(3) of this section, determine the measured instantaneous or average 
    filter temperature at intervals of not more than 5 minutes for the 
    entire 7-hour test
    
    [[Page 65819]]
    
    period. From the continuous record of the ambient air temperature 
    obtained from the ambient (chamber) air temperature recorder, 
paragraphs (c)(3) and (e)(5) of this section, determine the measured 
    instantaneous or average ambient (chamber) air temperature at the same 
    intervals used for filter temperature for the entire 7-hour sample 
    period. For each interval over the 7-hour period, calculate the 
    difference, in  deg.C, between the measured interval filter temperature 
    and the measured interval ambient temperature for the corresponding 
    interval, as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.070
    
    
        (2) The difference between the interval filter temperature and the 
    interval average ambient temperature for each and all intervals must 
    meet the filter temperature control specification listed in Table E-2 
    of this subpart, excluding periods of electrical power interruption, if 
    any.
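
    The interval comparison described in paragraphs (g)(1) and (g)(2) of this 
section can be summarized as in the following illustrative sketch (Python). 
The allowable temperature difference is left as a parameter because its value 
is set by Table E-2 of this subpart, and the interval temperatures are assumed 
to have already been reduced to matched values at intervals of not more than 
5 minutes, with any power-interruption periods excluded.

# Illustrative sketch only: evaluate the paragraph (g) interval test for
# filter temperature control. Inputs are matched interval temperatures in
# deg. C; the limit must be taken from Table E-2 of this subpart.
def evaluate_filter_temperature_control(filter_temps, ambient_temps,
                                        max_allowed_diff_degC):
    diffs = [tf - ta for tf, ta in zip(filter_temps, ambient_temps)]
    return all(d <= max_allowed_diff_degC for d in diffs), max(diffs)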
    
    
    Sec. 53.54  Leak check test.
    
        (a) Overview. Under section 7.4.6 of Appendix L of part 50 of this 
    chapter, the sampler is required to include a facility--including 
    components, instruments, operator controls, a written procedure, and 
    other capabilities as necessary--to allow the operator to carry out a 
    leak test of the sampler at a field monitoring site without additional 
    equipment. This procedure is intended to test the adequacy and 
    effectiveness of the sampler's leak check facility. Because of the 
    variety of potential sampler configurations and leak check procedures 
    possible, some adaptation of this procedure may be necessary to 
    accommodate the specific sampler under test.
        (b) Technical definitions. (1) External leakage includes the total 
    flow rate of external ambient air which enters the sampler other than 
    through the sampler inlet and which passes through any one or more of 
    the impactor, filter, or flow rate measurement components.
        (2) Internal leakage is the total sample air flow rate that passes 
    through the filter holder assembly without passing through the sample 
    filter.
        (c) Required test equipment.
        (1) Flow rate measurement device, range 70 to 130 mL/min, 2 percent 
    certified accuracy, NIST-traceable.
        (2) Flow control device, capable of providing a controlled, 
    simulated leak flow rate of 100 mL/min.
        (3) Flow rate measurement adaptor (Drawing L-27, Appendix L of part 
    50 of this chapter) or equivalent adaptor to facilitate measurement of 
    sampler flow rate.
        (4) A disk, such as a sample filter that is heavily loaded or a 
    flow-impervious membrane containing one or more pinholes, which can be 
    installed into the sampler's filter cassette (either with or without a 
    normal sample filter) and which blocks the normal flow rate through the 
    filter cassette but which, instead, provides a simulated leak flow rate 
    through the disk of not more than 100 mL/min under the conditions 
    specified for the leak check in the sampler's leak check procedure.
        (d) Calibration of test measurement instruments. Submit 
    documentation showing evidence of recent calibration, calibration 
    accuracy, and NIST-traceability (if required) of all measurement 
    instruments used in the tests. The accuracy of flow meters shall be 
    verified at the highest and lowest pressures and temperatures used in 
    the tests and shall be checked at zero and one or more non-zero flow 
    rates within 7 days of test use.
        (e) Test setup. (1) The test sampler shall be set up for testing as 
    described in the sampler's operation or instruction manual referred to 
    in Sec. 53.4(b)(3). The sampler shall be installed upright and set up 
    in its normal configuration for collecting PM2.5 samples, except 
    that the sample air inlet shall be removed and a device such as a flow 
    rate measurement adaptor shall be installed on the sampler's downtube.
        (2) The flow rate control device shall be set up to provide a 
    constant, controlled flow rate of 100 mL/min into the sampler downtube 
    under the conditions specified for the leak check in the sampler's leak 
    check procedure.
        (3) The flow rate measurement device shall be set up to measure the 
    controlled flow rate of 100 mL/min into the sampler downtube under the 
    conditions specified for the leak check in the sampler's leak check 
    procedure.
        (f) Procedure. (1) Install a sample filter in the test sampler and 
    ensure that the sampler has no internal or external leaks.
        (2) Carry out both the external and internal leak check procedure 
    as described in the sampler's operation/instruction manual and verify 
    that both leak checks indicate no significant leaks in the test 
    sampler.
        (3) Arrange the flow control device, flow rate measurement device, 
    and other apparatus as necessary to provide a simulated leak flow rate 
    of 100 mL/min into the test sampler through the downtube during the 
    specified external leak check procedure. Carry out the external leak 
    check procedure as described in the sampler's operation/instruction 
    manual but with the simulated leak of 100 mL/min.
        (4) Install the disk that simulates a filter-bypass leak in the 
    filter cassette and carry out the internal leak check procedure as 
    described in the sampler's operation/instruction manual.
        (g) Test results. The requirements for successful passage of this 
    test are:
        (1) That the leak check procedure indicates no significant external 
    or internal leaks in the test sampler when no simulated leaks are 
    introduced.
        (2) That the external leak check procedure properly identifies the 
    simulated external leak of 100 mL/min.
        (3) That the internal leak check procedure properly identifies the 
    simulated internal leak of 100 mL/min.
    
    
    Sec. 53.55  Flow rate cut-off test.
    
        (a) Overview. This test is intended to verify that the sampler 
    carries out the required automatic sample flow rate cut-off function 
    properly.
    (b) Technical definition. The flow rate cut-off function requires 
    the sampler to automatically stop sample flow and terminate the current 
    sample collection if the sample flow rate becomes less than the minimum 
    flow rate specified in Table E-2 of this subpart (10 percent below the 
    nominal sample flow rate) for more than 60 seconds during a sample 
    collection period.
        (c) Required test equipment. (1) Flow rate meter, suitable for 
    measuring the sampler flow rate at the sampler inlet in a closed system 
    below atmospheric pressure, range 10 to 25 actual L/min, 2 percent 
    certified accuracy, NIST-traceable, with continuous (analog) recording 
    capability or digital recording at intervals of not more than 5 
    seconds. Mass flow meter type recommended; however, note that 
    temperature and pressure corrections are generally required to convert 
    measured mass flow rate to actual volumetric flow rate.
        (2) Valve or other means to restrict or reduce the sample flow 
    rate.
        (d) Calibration of test measurement instruments. Submit 
    documentation showing evidence of recent calibration, calibration 
    accuracy, and NIST-traceability of the flow rate meter used for this 
    test. The accuracy of the flow meter shall be verified at the highest
    
    [[Page 65820]]
    
    and lowest pressures used in the tests and shall be checked at zero and 
    one or more non-zero flow rates within 7 days of test use. Where an 
    instrument's measurements are to be recorded with an analog recording 
    device, the accuracy of the entire instrument-recorder system shall be 
    calibrated or verified.
        (e) Test setup. (1) The test sampler shall be set up for testing at 
    any temperature and barometric pressure within the specified ranges. 
    Setup of the sampler shall be performed as described in the sampler's 
    operation or instruction manual referred to in Sec. 53.4(b)(3). The 
    sampler shall be installed upright and set up in its normal 
    configuration for collecting PM2.5 samples, except that the sample 
    air inlet shall be removed to permit measurement of the sampler flow 
    rate by the certified flow rate meter.
        (2) The flow rate meter shall be connected so as to measure the 
    sampler flow rate at the entrance to the sampler (i.e. the flow rate 
    that would enter the sampler inlet if the inlet had not been removed).
        (3) The valve or means for reducing sampler flow rate shall be 
    installed such that the sampler flow rate can be manually restricted 
    during the test.
        (f) Procedure. (1) Prepare the sampler for normal sample collection 
    operation as directed in the sampler's operation or instruction manual. 
Set the sampler to automatically start a normal 24-hour sample 
collection period at a convenient time.
        (2) Continuously record the sampler flow rate and the time during 
    the sample period, with at least 5-minute resolution during the normal 
    operation of the sampler and at least 5-second resolution during the 
    time period when the sampler flow rate is manually reduced.
        (3) After at least 1 hour of normal sampler operation at a sample 
    flow rate within the specified flow rate range specified in Table E-2 
    of this subpart, manually restrict the sampler flow rate such that the 
    sampler flow rate is decreased slowly over several minutes to a flow 
    rate less than the flow rate cut off value specified in Table E-2 of 
    this subpart. Maintain this flow rate for at least 2.0 minutes or until 
    the sampler stops the sample flow automatically.
        (g) Test Results. (1) Inspect the continuous record of the sampler 
    flow rate and determine the time at which the sampler flow rate 
    decreases to a value less than the cut-off value specified in Table E-2 
    of this subpart. To pass this test, the sampler must automatically stop 
    the sampler flow at least 30 seconds but not more than 50 seconds after 
    the time at which the sampler flow rate was determined to have 
    decreased to a value less than the value specified in Table E-2 of this 
    subpart.
        (2) Verify that the elapsed sample time and average flow rate 
    reported by the sampler for this test sample period are accurate within 
    2 percent. The sampler must provide the same information to the 
    operator as is required following a normal sample collection period, 
    and the information reported in this test must accurately reflect the 
    substantially shortened sample collection period caused by the 
    automatic sample flow cut off.
        (3) Verify that the sampler's required ``Flow-out-of-spec'' and the 
    ``Incorrect sample period'' flag indicators are set at the end of the 
    test.
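
    The timing requirement of paragraph (g)(1) of this section can be checked 
directly against the continuous flow rate record, as in the following 
illustrative sketch (Python). The cut-off value and the 30- to 50-second 
window are supplied as parameters taken from Table E-2 of this subpart and 
paragraph (g)(1); the record is assumed to be a list of (elapsed seconds, 
flow rate) pairs, and a recorded flow of zero is taken to indicate that the 
sampler has stopped the flow.

# Illustrative sketch only: find when the recorded flow first drops below the
# cut-off value and when the sampler stopped the flow, then test the delay
# against the window given in paragraph (g)(1).
def check_flow_cutoff(record, cutoff_lpm, min_delay_s=30, max_delay_s=50):
    t_below = next(t for t, q in record if q < cutoff_lpm)
    t_stop = next((t for t, q in record if t > t_below and q <= 0.0), None)
    if t_stop is None:
        return False, None            # flow was never stopped automatically
    delay = t_stop - t_below
    return min_delay_s <= delay <= max_delay_s, delay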
    
    
    Sec. 53.56  Operational field precision test.
    
        (a) Overview. This test is intended to determine the operational 
    precision of the candidate sampler during a minimum of 10 days of field 
    operation, using three collocated test samplers. Measurements of 
    PM2.5 are made with all of the samplers and then compared to 
    determine replicate precision. This procedure is applicable to both 
    reference and equivalent methods. In the case of equivalent methods, 
    this test may be combined and conducted concurrently with the 
    comparability test for equivalent methods (subpart C of this part), 
    using three reference method samplers collocated with three candidate 
    equivalent method samplers and meeting the applicable site and other 
    requirements of subpart C of this part.
        (b) Technical definition. Field precision means the standard 
    deviation or relative standard deviation of a set of measurements 
    obtained concurrently with three or more collocated samplers in actual 
    ambient air field operation.
        (c) Test site. Any outdoor test site having PM2.5 
    concentrations that are reasonably uniform over the test area and that 
    meet the minimum level requirement of Sec. 53.56(g) is acceptable for 
    this test.
        (d) Required facilities and equipment. An appropriate test site and 
    suitable electrical power to accommodate three test samplers.
        (e) Test setup. (1) Three identical test samplers shall be 
    installed at the test site in their normal configuration for collecting 
    PM2.5 samples in accordance with the instructions in the 
    associated manual referred to in Sec. 53.4(b)(3) and in accordance with 
    applicable supplemental guidance provided in Reference 3 in Appendix A 
    of this subpart. The test sampler inlet openings shall be located at 
    the same height above ground and between 2 and 4 meters apart 
horizontally. The samplers shall be arranged or oriented in a manner 
that minimizes the spatial and wind-directional effects of any one 
sampler on the sample collection of the other samplers.
        (2) Each test sampler shall be leak checked, calibrated, and set up 
    for normal operation in accordance with the instruction manual and with 
    any applicable supplemental guidance provided in Reference 3 in 
Appendix A of this subpart.
        (f) Test procedure. (1) Install a specified filter in each sampler 
    and otherwise prepare each sampler for normal sample collection. Set 
    identical sample collection start and stop times for each sampler.
        (2) Collect either a 24-hour or a 48-hour atmospheric PM2.5 
    sample simultaneously with each of the three test samplers.
        (3) Determine the measured PM2.5 mass concentration for each 
    sample in accordance with the procedures prescribed for the candidate 
    method in the associated manual referred to in Sec. 53.4(b)(3) and in 
    accordance with supplemental guidance in Reference 3 in Appendix A of 
    this subpart.
        (4) Repeat this procedure to obtain a total of 10 sets of 24-hour 
    or 48-hour PM2.5 measurements over 10 test periods.
        (g) Calculations. (1) Record the PM2.5 concentration for each 
test sampler for each test day as Ci,j, where i is the sampler 
number (i=1, 2, 3) and j is the test day (j=1, 2, . . . 10).
        (2) For each test day, calculate and record the average of the 
    three measured PM2.5 concentrations as Cj where j is the test 
    day:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.071
    
    
    If Cj is less than 10 µg/m3 for any test day, data from that test 
    day are unacceptable and an additional sample collection set must be 
    performed to replace the unacceptable data.
        (3) Calculate and record the precision for each of the 10 test days 
    as:
    
    
    [[Page 65821]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.072
    
    
    
    if Cj is below 40 µg/m3 for 24-hour measurements or 
below 30 µg/m3 for 48-hour measurements; or
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.073
    
    
    if Cj is above 40 µg/m3 for 24-hour measurements 
or above 30 µg/m3 for 48-hour measurements.
        (h) Test results. The candidate method passes the precision test if 
    all 10 Pj or RPj values meet the specifications in Table E-2 
    of this subpart.
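
    Because the precision equations appear only in the omitted graphics, the 
following illustrative sketch (Python) assumes the conventional forms: Pj is 
the sample standard deviation of the three collocated concentrations and RPj 
is that standard deviation expressed as a percentage of the day's mean, with 
the 40 or 30 µg/m3 value of paragraph (g)(3) selecting between the two.

# Illustrative sketch only: daily field precision under the assumed
# conventional forms of the omitted equations. Concentrations are in ug/m3.
import statistics

def daily_precision(concs, hours):
    mean = statistics.mean(concs)
    if mean < 10:
        raise ValueError("mean below 10 ug/m3; repeat this sample set")
    p = statistics.stdev(concs)               # sample standard deviation (n - 1)
    threshold = 40 if hours == 24 else 30     # 24-hour vs. 48-hour samples
    if mean <= threshold:
        return "P", p                         # absolute precision, ug/m3
    return "RP", 100.0 * p / mean             # relative precision, percent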
    
    
Sec. 53.57  Aerosol transport test for Class I sequential samplers.
    
        (a) Overview. This test is intended to verify adequate aerosol 
    transport through any air flow splitting components that may be used in 
    a Class I candidate equivalent method sampler to achieve sequential 
    sampling capability. This test is applicable to all Class I candidate 
    samplers in which the aerosol flow path (the flow of air upstream of 
    filtration) differs from that specified for reference method samplers 
    as set forth in Drawings L-18 and L-24 of Appendix L to part 50 of this 
    chapter. This test does not apply to candidate Class I equivalent 
    method samplers in which each channel consists of a separate inlet, 
    impactor, and filter holder of the exact same internal geometry as 
    specified for the reference method sampler. The test requirements and 
    performance specifications for this test are summarized in Table E-1 of 
    this subpart.
        (b) Technical Definitions. (1) Aerosol transport is the percentage 
    of the laboratory challenge aerosol which penetrates to the active 
    sample filter of the candidate Class I sampler.
        (2) The active sample filter is the exclusive filter through which 
    air is flowing during performance of this test.
        (3) A no-flow filter is a sample filter through which no air is 
    flowing during performance of this test.
    (4) A channel is a flow path that the aerosol may take, only one 
    of which may be active at a time.
        (5) An added component is any physical part of the sampler which is 
    different from that specified for the reference method sampler and 
    which allows or causes the aerosol to be routed to a different channel.
        (c) Required facilities and test equipment. (1) Aerosol generation 
    system, as specified in Sec. 53.64(c)(1).
        (2) Aerosol delivery system, as specified in Sec. 53.64(c)(2).
        (3) Particle size verification equipment, as specified in 
    Sec. 53.64(c)(3).
        (4) Fluorometer, as specified in Sec. 53.64(c)(4).
        (5) Candidate sampler, with the inlet and impactor or impactors 
    removed, and with all internal surfaces of added components electroless 
nickel coated as specified in Sec. 53.64(d)(5).
        (d) Calibration of test measurement instruments. Submit 
    documentation showing evidence of recent calibration, calibration 
    accuracy, and NIST-traceability (if required) of all measurement 
    instruments used for the tests. Where an instrument's measurements are 
    to be recorded with an analog recording device, the accuracy of the 
    entire instrument-recorder system shall be calibrated or verified.
        (e) Test setup. (1) The candidate sampler, with its inlet and 
    impactor(s) removed, shall be installed in the particle delivery system 
    so that the test aerosol is introduced at the top of the downtube that 
    connects to the exit adaptor of the inlet. If the candidate sampler has 
    a separate impactor for each channel, then for this test the filter 
    holder assemblies must be connected to the physical location on the 
    sampler where the impactors would normally connect.
        (2) Filters that are appropriate for use with fluorometric methods 
    (e.g., glass fiber) shall be used for particle collection for these 
    tests.
        (f) Procedure. (1) All surfaces of the added component(s) which 
    come in contact with the aerosol flow shall be thoroughly washed with 
    0.01 N NaOH and then dried.
    (2) Generate an aerosol composed of oleic acid with a uranine 
fluorometric tag, with an aerodynamic diameter of 4 ± 0.25 µm, using a 
vibrating orifice aerosol generator according to the procedures specified 
in Sec. 53.61(g). Check for the presence of satellites and adjust the 
generator to minimize their production. Calculate the aerodynamic 
particle size using the operating parameters of the vibrating orifice 
aerosol generator and record it. The calculated aerodynamic diameter must 
be within ±0.25 µm of 4 µm.
        (3) Verify the particle size according to procedures specified in 
    Sec. 53.62(d)(4)(i).
        (4) Collect particles on filters for a time period such that the 
    relative error of the measured fluorometric concentration in the active 
    filter is less than 5 percent.
        (5) Determine the quantity of material collected on the active 
    filter using a calibrated fluorometer. Record the mass of fluorometric 
material for the active filter as Mactive(i), where i = active 
channel number.
        (6) Determine the quantity of material collected on the no-flow 
    filter(s) using a calibrated fluorometer. Record the mass of 
fluorometric material on each no-flow filter as Mno-flow(i,j), where 
i = active channel number and j = no-flow filter number.
        (7) Wash the surfaces of the added component(s) which contact the 
aerosol flow with 0.01 N NaOH and determine the quantity of material 
    collected using a calibrated fluorometer. Record the mass of 
fluorometric material collected in the wash as Mwash(i), where i = 
active channel number.
    
    [[Page 65822]]
    
        (8) Calculate and record the aerosol transport as:
    
        [GRAPHIC] [TIFF OMITTED] TP13DE96.074
        
    
    where i = active channel number and j = no-flow filter number.
    
        (9) Repeat paragraphs (f) (1) through (6) of this section for each 
    channel, making each channel in turn the exclusive active channel.
        (g) Evaluation of test results. The candidate Class I sampler 
    passes the aerosol transport test if the specification in Table E-1 of 
    this subpart is met for each channel.
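
    The transport equation itself appears only in the omitted graphic. The 
following illustrative sketch (Python) assumes that the transport for a 
channel is the mass on the active filter expressed as a percentage of all 
fluorometric material recovered for that channel (active filter, no-flow 
filters, and the wash of the added components); any other form given in the 
graphic governs.

# Illustrative sketch only: aerosol transport for one active channel under
# the assumed form of the omitted equation.
def aerosol_transport(m_active, m_no_flow, m_wash):
    total = m_active + sum(m_no_flow) + m_wash
    return 100.0 * m_active / total           # percent transport, channel i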
    
    Tables to Subpart E of Part 53
    
     Table E-1--Test Conditions for Sec.  53.52 Comprehensive 24-Hour Tests 
    ------------------------------------------------------------------------
                                                   Initial         Final    
                                    Power Line   temperature    temperature,
         24-hour test number         voltage     Deg C, Hours  Deg. C, Hours
                                                     1-8           22-24    
    ------------------------------------------------------------------------
1............................   105 -2   15.0 1   0.0      -2.0
2............................   125 40    -1      -2.0      .0
3............................   125 40   15.0 1   .0       -2.0
4............................   105 -2    -1      -2.0      0.0
    ------------------------------------------------------------------------
    
    
    BILLING CODE 6560-50-P
    
    [[Page 65823]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.075
    
    
    
    [[Page 65824]]
    
    Figures to Subpart E
    [GRAPHIC] [TIFF OMITTED] TP13DE96.076
    
    
    [[Page 65825]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.077
    
    
    
    [[Page 65826]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.078
    
    
    
    [[Page 65827]]
    
    Appendix A to Subpart E of Part 53--References
    
        1. ``Quality systems--Model for quality assurance in design, 
    development, production, installation and servicing,'' ISO9001. July 
    1994. Available from American Society for Quality Control, 611 East 
    Wisconsin Avenue, Milwaukee, WI 53202.
        2. ``American National Standard--Specifications and Guidelines 
    for Quality Systems for Environmental Data Collection and 
    Environmental Technology Programs.'' ANSI/ASQC E4-1994. January 
    1995. Available from American Society for Quality Control, 611 East 
    Wisconsin Avenue, Milwaukee, WI 53202.
        3. Quality Assurance Handbook for Air Pollution Measurement 
    Systems, Volume II, Ambient Air Specific Methods (Interim Edition), 
    section 2.12. EPA/600/R-94/038b, April 1994. Available from CERI, 
    ORD Publications, U.S. Environmental Protection Agency, 26 West 
    Martin Luther King Drive, Cincinnati, Ohio 45268. [Section 2.12 is 
    currently under development and will not be available from the 
    previous address until it is published as an addition to EPA/600/R-
    94/038b. Prepublication draft copies of section 2.12 will be 
    available from Department E (MD-77B), U.S. EPA, Research Triangle 
    Park, NC 27711 or from the contact identified at the beginning of 
    this proposed rule].
        4. Military standard specification (mil. spec.) 8625F, Type II, 
    Class 1 as listed in Department of Defense Index of Specifications 
    and Standards (DODISS), available from DODSSP-Customer Service, 
    Standardization Documents Order Desk, 700 Robbins Avenue, Building 
4D, Philadelphia, PA 19111-5094.
        5. ``Guidance for the Use and Application of Designation Testing 
    and Sampler Manufacturing Checklists, as Required under 40 CFR 
    53.51'' U.S. EPA Publication No. [To be prepared.]
        6. Quality Assurance Handbook for Air Pollution Measurement 
    Systems, Volume IV: Meteorological Measurements. Revised March, 
    1995. EPA-600/R-94-038d. Available from U.S. EPA, ORD Publications 
    Office, Center for Environmental Research Information (CERI), 26 
    West Martin Luther King Drive, Cincinnati, Ohio 45268-1072 (513-569-
    7562).
    
        5. Subpart F is added to read as follows:
    
    Subpart F--Procedures for Testing Performance Characteristics of Class 
    II Equivalent Methods for PM2.5
    
    Sec.
    53.60 General provisions.
    53.61 Test conditions for PM2.5 reference method equivalency.
    53.62 Test procedures: Full wind tunnel test.
    53.63 Test procedures: Wind tunnel inlet aspiration test.
    53.64 Test procedures: Static fractionator test.
    53.65 Test procedures: Loading test.
    53.66 Test procedures: Volatility test.
    
    Tables to Subpart F of Part 53
    
    Table F-1  Performance Specifications for PM2.5 Class II 
    Equivalent Samplers
    Table F-2  Particle Size and Wind Speeds for Full Wind Tunnel 
Evaluation, Wind Tunnel Inlet Aspiration Test, and Static Chamber 
    Test
    Table F-3  Critical Parameters of Idealized Ambient Particle Size 
    Distributions
    Table F-4  Estimated Mass Concentration of PM2.5 for Idealized 
    Coarse Aerosol Size Distribution
    Table F-5  Estimated Mass Concentration Measurement of PM2.5 
    for Idealized ``Typical'' Coarse Aerosol Size Distribution
    Table F-6  Estimated Mass Concentration Measurement of PM2.5 
    for Idealized Fine Aerosol Size Distribution
    
    Figures to Subpart F of Part 53
    
Figure F-1  Flowchart for Determining Requirements for Class II 
Equivalent Samplers
    Figure F-2  Designation Testing Checklist
    
    Appendix A to Subpart F of Part 53--References
    
    Subpart F--Procedures for Testing Performance Characteristics of 
    Class II Equivalent Methods for PM2.5
    
    
    Sec. 53.60  General provisions.
    
        (a) This subpart sets forth the specific requirements that a 
    PM2.5 sampler associated with a candidate Class II equivalent 
    method must meet to be designated as an equivalent method for 
    PM2.5. This subpart also sets forth the explicit test procedures 
    that must be carried out and the test results, evidence, documentation, 
    and other materials that must be provided to EPA to demonstrate that a 
    sampler meets all specified requirements for designation as an 
    equivalent method.
    (b) A candidate method described in a reference or equivalent 
method application submitted under Sec. 53.4 shall be 
    determined by the EPA to be a Class II candidate equivalent method on 
    the basis of the definition of a Class II equivalent method given in 
    Sec. 53.1.
        (c) Any sampler associated with a Class II candidate equivalent 
    method (Class II sampler) must meet all requirements for reference 
    method samplers or Class I equivalent method samplers specified in 
    subpart E of this part, as appropriate. In addition, a Class II sampler 
    must meet the additional requirements as specified in Sec. 53.60(d) of 
    this part.
        (d) Except as provided in paragraph (d) (1), (2) and (3) of this 
    section, all Class II samplers are subject to the additional tests and 
    performance requirements specified in Sec. 53.62 (full wind tunnel 
    test), Sec. 53.65 (loading test), and Sec. 53.66 (volatility test). 
    Alternative tests and performance requirements, as described in 
    paragraphs (d) (1), (2), and (3) of this section, are optionally 
    available for certain Class II samplers which meet the requirements for 
    reference method or Class I samplers given in Appendix L of part 50 of 
    this chapter and in Subpart E of this part, except for specific 
    deviations of the inlet, fractionator, or filter. These requirements 
    and the exceptions in paragraphs (d) (1), (2), and (3) of this section 
    are summarized in the flowchart given in Figure F-1 of this subpart.
        (1) Inlet deviation. A sampler which has been determined to be a 
Class II sampler (rather than a reference method or Class I sampler) 
    solely because the design or construction of its inlet deviates from 
    the design or construction of the inlet specified in Appendix L for 
    reference method samplers shall not be subject to the requirements of 
    Sec. 53.62 (full wind tunnel test), provided that it meets all 
    requirements of Sec. 53.63 (inlet aspiration test), Sec. 53.65 (loading 
    test), and Sec. 53.66 (volatility test).
        (2) Fractionator deviation. A sampler which has been determined to 
    be a Class II sampler solely because the design or construction of its 
    particle size fractionator deviates significantly from the design or 
    construction of the particle size fractionator specified in 40 CFR part 
    50, Appendix L for reference method samplers shall not be subject to 
    the requirements of Sec. 53.62 (full wind tunnel test), provided that 
    it meets all requirements of Sec. 53.64 (static fractionator test), 
    Sec. 53.65 (loading test), and Sec. 53.66 (volatility test).
        (3) Filter size deviation. A sampler which has been determined to 
    be a Class II sampler solely because the size of its sample collection 
    filter deviates from the sampler filter size specified in Appendix L 
    for reference method samplers shall not be subject to the requirements 
    of Sec. 53.62 (full wind tunnel test) nor Sec. 53.65 (loading test), 
    provided it meets all requirements of Sec. 53.66 (volatility test).
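
    The relationships among the introductory text of paragraph (d) of this 
section and paragraphs (d)(1) through (d)(3), which are also summarized in 
the flowchart in Figure F-1 of this subpart, reduce to the simple decision 
rule restated in the following illustrative sketch (Python); the deviation 
labels are informal shorthand rather than regulatory terms.

# Illustrative sketch only: which Subpart F tests apply to a Class II sampler,
# restating Sec. 53.60(d). "deviation" is the sole reason the candidate is a
# Class II sampler rather than a reference method or Class I sampler.
def required_class_ii_tests(deviation):
    if deviation == "inlet":                  # paragraph (d)(1)
        return ["53.63 inlet aspiration", "53.65 loading", "53.66 volatility"]
    if deviation == "fractionator":           # paragraph (d)(2)
        return ["53.64 static fractionator", "53.65 loading", "53.66 volatility"]
    if deviation == "filter size":            # paragraph (d)(3)
        return ["53.66 volatility"]
    return ["53.62 full wind tunnel", "53.65 loading", "53.66 volatility"]
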
        (e) The test specifications and acceptance criteria for each test 
    are summarized in Table F-1 of this subpart. The candidate sampler must 
    demonstrate performance that meets the acceptance criteria for each 
    applicable test to be designated as an equivalent method.
        (f) Overview of various test procedures for Class II samplers. (1) 
    Full wind tunnel test. This test procedure is designed to ensure that 
    the candidate sampler's aspiration of an ambient aerosol and 
    penetration of the sub 2.5-micron fraction to its sample filter will be 
    comparable to that of a reference method sampler. The test conditions 
    are
    
    [[Page 65828]]
    
    summarized in Table F-2 of this subpart (under the heading, ``Full Wind 
Tunnel Test''), and the candidate sampler must meet the acceptance 
    criteria specified in Table F-1 of this subpart.
        (2) Wind tunnel inlet test. The wind tunnel inlet aspiration test 
    challenges the candidate sampler with a monodisperse aerosol that is 
    specified in Table F-2 of this subpart (under the heading, ``Inlet 
    Aspiration Test'). The aerosol is introduced into a wind tunnel 
    environment, and the aspiration of the candidate sampler is compared 
    with that of the reference method sampler at wind speeds of 2 km/hr and 
    24 km/hr. The acceptance criteria presented in Table F-1 of this 
subpart are based on the relative aspiration between the candidate 
sampler and the federal reference method sampler.
        (3) Static 2.5-micron fractionator test. The static 2.5-micron 
    fractionator test determines the effectiveness of the candidate 
    fractionator under static conditions for aerosols of the size and type 
    specified in Table F-2 of this subpart (under the heading, ``Static 
    Fractionator Test'). The candidate sampler must meet the acceptance 
    criteria presented in Table F-1 of this subpart.
        (4) Loading test. (i) The loading test is used to ensure that the 
    performance of a candidate sampler is not significantly affected by the 
    amount of material deposited on its interior surfaces between periodic 
    cleaning. This test is divided into two distinct experiments:
        (A) A mandatory demonstration of no significant performance shift 
    over a 24-hour time period; and
        (B) An optional demonstration of no significant performance shift 
    over an extended time period for approval of a cleaning interval 
    greater than 24 hours.
    (ii) In the initial evaluation, the candidate sampler is operated 
in a test environment equivalent to sampling a 150 µg/m3 
coarse mode aerosol over a 24-hour time period. The candidate's 
performance must then be evaluated by Sec. 53.62 (full wind tunnel 
evaluation), except that if the only modification is to the fractionator, 
the performance may optionally be evaluated by 
Sec. 53.64 (static fractionator test). If the results of the 
appropriate test meet the criteria presented in Table F-1 of this 
subpart, then the candidate sampler passes the loading test under the 
condition that it be cleaned after each 24-hour use.
        (iii) An extended loading test may be performed to gain approval of 
    a longer time period between periodic cleaning of the fractionator. In 
    this extended loading test, the candidate sampler is loaded with a mass 
equivalent to operating the unit in an environment of 150 µg/m3 
coarse mode aerosol over the time period proposed by the 
    manufacturer between cleaning. Reevaluation of the expected mass 
    collected is performed via the wind tunnel test or the static 2.5-
    micron fractionator test, depending upon which test was used for the 
    initial evaluation. If the results meet the criteria presented in Table 
    F-1 of this subpart, then the candidate sampler passes the loading test 
    under the condition that it be cleaned at least as often as the 
    proposed cleaning frequency.
        (5) Volatility test. The volatility test challenges the candidate 
    sampler with a polydisperse, semi-volatile liquid aerosol. This aerosol 
    is simultaneously sampled by the candidate method sampler and a 
    reference method sampler for a specified time period. Clean air is then 
    passed through the samplers for an additional time period. The filters 
    are then reweighed to determine residual mass of the collected aerosol. 
    The candidate sampler passes the volatility test if the candidate 
    method meets the specifications presented in Table F-1 of this subpart.
        (g) Test data. All test data and other documentation obtained from 
    or pertinent to these tests shall be identified, dated, signed by the 
    analyst performing the test, and submitted to EPA as part of the 
    equivalent method application. Schematic drawings of each particle 
    delivery system and other information showing complete procedural 
    details of the test atmosphere generation, verification, and delivery 
    techniques for each test performed shall be submitted to EPA. All 
    pertinent calculations shall be clearly presented. In addition, 
    manufacturers are required to complete and submit the designation 
testing checklist presented in Figure F-2 of this subpart as part of the 
    application.
    
    
    Sec. 53.61  Test conditions.
    
        (a) Sampler surface preparation. Internal surfaces of the candidate 
    sampler shall be cleaned and dried prior to performing any Class II 
    sampler test in this Subpart. The internal collection surfaces of the 
    sampler shall then be prepared in strict accordance with the operating 
    instructions specified in the sampler's operating manual referred to in 
    Sec. 53.4(b)(3).
        (b) Sampler setup. Set up and start up of all test samplers shall 
    be in strict accordance with the operating instructions specified in 
    the manual referred to in Sec. 53.4(b)(3), unless otherwise specified 
    within this subpart.
        (c) Sampler adjustments. Once the test sampler or samplers have 
    been set up and the performance tests started, manual adjustment shall 
    be permitted only between test points for all applicable tests. Manual 
    adjustments and any periodic maintenance shall be limited to only those 
    procedures prescribed in the manual referred to in Sec. 53.4(b)(3). The 
    submitted records shall clearly indicate when any manual adjustment or 
    periodic maintenance was made and shall describe the operations 
    performed.
        (d) Sampler malfunctions. If a test sampler malfunctions during any 
of the applicable tests, that test run shall be repeated. A detailed 
    explanation of all malfunctions and the remedial actions taken shall be 
    submitted as part of the equivalent method application.
        (e) Particle concentration measurements. All measurements of 
    particle concentration must be made such that the relative error in 
    measurement is less than 5.0 percent. Relative error is defined as (s x 
    100 percent)/(X), where s is the sample standard deviation of the 
    particle concentration detector, X is the measured concentration, and 
    the units of s and X are identical.
        (f) Operation of test measurement equipment. All test measurement 
equipment shall be set up, calibrated, and maintained according to the 
    manufacturer's instructions by qualified personnel only. All 
    appropriate calibration information and manuals for this equipment 
    shall be kept on file.
        (g) Aerosol generation parameters. This section prescribes 
    conventions regarding aerosol generation techniques. Size-selective 
    performance tests outlined in Secs. 53.62, 53.63, 53.64, and 53.65 
    specify the use of the vibrating orifice aerosol generator (VOAG) for 
    the production of test aerosols. The volatility test in Sec. 53.66 
    specifies the use of a nebulized polydisperse aerosol.
        (1) Particle aerodynamic diameter. The VOAG produces near-
    monodisperse droplets through the controlled breakup of a liquid jet. 
    When the liquid solution consists of a non-volatile solute dissolved in 
    a volatile solvent, the droplets dry to form particles of near-
    monodisperse size.
        (i) The physical diameter of a generated spherical particle can be 
    calculated from the operating parameters of the VOAG as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.079
    
    
    where:
    
Dp=particle physical diameter, µm
    Q=liquid volumetric flow rate, m3/sec
    
    [[Page 65829]]
    
    Cvol=volume concentration (particle volume produced per drop 
    volume), dimensionless
    f=frequency of applied vibrational signal, sec-1.
        (ii) A given particle's aerodynamic behavior is a function of its 
    physical particle size, particle shape, and density. Aerodynamic 
diameter is defined as the diameter of a unit density 
(ρ0=1 g/cm3) sphere having the same settling velocity 
    as the particle under consideration. For converting a spherical 
    particle of known density to aerodynamic diameter, the governing 
    relationship is:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.080
    
    
    where
    
Dae=particle aerodynamic diameter, µm
ρp=particle density, g/cm3
ρ0=aerodynamic particle density=1 g/cm3
    CDp=Cunningham's slip correction factor for physical particle 
    diameter, dimensionless
    CDae=Cunningham's slip correction factor for aerodynamic particle 
    diameter, dimensionless.
    
    (iii) At room temperature and standard pressure, the Cunningham 
slip correction factor is solely a function of particle diameter:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.081
    
    
    or
    [GRAPHIC] [TIFF OMITTED] TP13DE96.082
    
    
        (iv) Since the slip correction factor is itself a function of 
    particle diameter, the aerodynamic diameter cannot be solved directly 
    but can be determined by iteration.
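
    Because the governing equations appear only in the omitted graphics, the 
following illustrative sketch (Python) uses the standard textbook forms 
assumed to correspond to them: Dp = (6 Q Cvol/(pi f))^(1/3) for the vibrating 
orifice generator, Dae = Dp [rho_p C(Dp)/(rho_0 C(Dae))]^(1/2) for the 
aerodynamic diameter, and a conventional slip correction at room temperature 
and standard pressure. The units and constants shown are assumptions, not 
values taken from the omitted equations.

# Illustrative sketch only: VOAG physical diameter and the iterative
# conversion to aerodynamic diameter described in paragraph (g)(1)(iv).
import math

MFP_UM = 0.066  # assumed mean free path of air at room conditions, micrometers

def slip_correction(d_um):
    # Conventional Cunningham slip correction for a diameter in micrometers.
    return 1.0 + (2.0 * MFP_UM / d_um) * (
        1.257 + 0.4 * math.exp(-0.55 * d_um / MFP_UM))

def physical_diameter_um(q_cm3_per_s, c_vol, f_hz):
    # Dp = (6 Q Cvol / (pi f))**(1/3); Q assumed in cm3/s, result in micrometers.
    return 1.0e4 * (6.0 * q_cm3_per_s * c_vol / (math.pi * f_hz)) ** (1.0 / 3.0)

def aerodynamic_diameter_um(dp_um, rho_p, rho_0=1.0, tol=1.0e-6):
    # Iterate Dae = Dp * sqrt(rho_p * C(Dp) / (rho_0 * C(Dae))) to convergence.
    dae = dp_um * math.sqrt(rho_p / rho_0)
    while True:
        new = dp_um * math.sqrt(
            rho_p * slip_correction(dp_um) / (rho_0 * slip_correction(dae)))
        if abs(new - dae) < tol:
            return new
        dae = new
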
        (2) Solid particle generation. As specified in Table F-2 of this 
    subpart, all solid particle tests in this subpart shall be conducted 
    using particles composed of ammonium fluorescein. For use in the VOAG, 
    liquid solutions of known volumetric concentration can be prepared by 
    diluting fluorescein powder (C20H12O5, FW=332.31, CAS 
    2321-07-5) with aqueous ammonia. Guidelines for preparation of 
    fluorescein solutions of the desired volume concentration (Cvol) 
    are presented by Vanderpool and Rubow (1988) (Reference 2 in Appendix A 
    of this subpart). For purposes of converting particle physical diameter 
    to aerodynamic diameter, an ammonium fluorescein density of 1.35 g/
    cm3 shall be used. Mass deposits of ammonium fluorescein shall be 
    extracted and analyzed using solutions of 0.01 N ammonium hydroxide.
        (3) Liquid particle generation. (i) Oleic acid particles. (A) Tests 
    prescribed in Sec. 53.63 for inlet aspiration require the use of liquid 
particles composed of oleic acid tagged with uranine to enable 
    subsequent fluorometric quantitation of collected aerosol mass 
    deposits. Oleic acid (C18H34O2, FW=282.47, CAS 112-80-1) 
    has a density of 0.8935 g/cm3. Because the viscosity of oleic acid 
    is relatively high, significant errors can occur when dispensing oleic 
    acid using volumetric pipettes. For this reason, it is recommended that 
    oleic acid solutions be prepared by quantifying dispensed oleic acid 
    gravimetrically. The volume of oleic acid dispensed can then be 
    calculated simply by dividing the dispensed mass by the oleic acid 
    density.
        (B) Oleic acid solutions tagged with uranine shall be prepared as 
    follows. A known mass of oleic acid shall first be diluted using 
    absolute ethanol. The desired mass of the uranine tag should then be 
    diluted in a separate container using absolute ethanol. Uranine 
    (C20H10O5Na2, FW=376.3, CAS 518-47-8) is the 
    disodium salt of fluorescein and has a density of 1.53 g/cm3. In 
    preparing uranine tagged oleic acid particles, the uranine content 
    shall not exceed 20 percent on a mass basis. Once both oleic acid and 
    uranine solutions are properly prepared, they can then be combined and 
    diluted to final volume using absolute ethanol.
        (C) Calculation of the physical diameter of the particles produced 
    by the VOAG requires knowledge of the liquid solution's volume 
    concentration (Cvol). Because uranine is essentially insoluble in 
    oleic acid, the total particle volume is the sum of the oleic acid 
    volume and the uranine volume. The volume concentration of the liquid 
    solution shall be calculated as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.083
    
    
    where:
    
    Vu=uranine volume, ml
    Voleic=oleic acid volume, ml
    Vsol=total solution volume, ml
    Mu=uranine mass, g
ρu=uranine density, g/cm3
Moleic=oleic acid mass, g
ρoleic=oleic acid density, g/cm3
    
        (D) For purposes of converting the particles' physical diameter to 
    aerodynamic diameter, the density of the generated particles shall be 
    calculated as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.084
    
    
    
    [[Page 65830]]
    
    
        (E) Mass deposits of oleic acid shall be extracted and analyzed 
    using solutions of 0.01 N sodium hydroxide.
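
    The volume concentration and particle density calculations of paragraphs 
(g)(3)(i)(C) and (g)(3)(i)(D) of this section are illustrated in the 
following sketch (Python). Because the equations appear only in the omitted 
graphics, the sketch assumes the simple volume-additive forms, Cvol = 
(Voleic + Vu)/Vsol and particle density = total dispensed mass divided by 
total particle volume, using the densities given in paragraph (g)(3)(i).

# Illustrative sketch only: Cvol and composite particle density for a
# uranine-tagged oleic acid solution, under the assumed forms noted above.
RHO_OLEIC = 0.8935    # oleic acid density, g/cm3 (paragraph (g)(3)(i)(A))
RHO_URANINE = 1.53    # uranine density, g/cm3 (paragraph (g)(3)(i)(B))

def oleic_uranine_solution(m_oleic_g, m_uranine_g, v_solution_ml):
    v_oleic = m_oleic_g / RHO_OLEIC           # dispensed oleic acid volume, ml
    v_uranine = m_uranine_g / RHO_URANINE     # dispensed uranine volume, ml
    c_vol = (v_oleic + v_uranine) / v_solution_ml
    rho_particle = (m_oleic_g + m_uranine_g) / (v_oleic + v_uranine)  # g/cm3
    return c_vol, rho_particle
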
        (ii) Glycerol. Tests prescribed in Sec. 53.66 for conducting 
    volatility tests shall be conducted using ACS reagent grade glycerol 
    (C3H8O3, FW=92.09, CAS 56-81-5) with a minimum purity of 
    99.5 percent.
    
    
    Sec. 53.62  Test Procedure: Full wind tunnel test.
    
        (a) Overview. The full wind tunnel test evaluates the effectiveness 
    of the candidate sampler at 2 km/hr and 24 km/hr for aerosols of the 
    size and type specified in Table F-2 of this subpart (under the 
    heading, ``Full Wind Tunnel Test''). For each wind speed, a smooth 
    curve is fit to the effectiveness data and corrected for the presence 
    of multiplets in the wind tunnel calibration aerosol. The cutpoint 
diameter (Dp50) at each wind speed is then determined from the 
    corrected effectiveness curves. The two resultant penetration curves 
    are then numerically integrated with three idealized ambient particle 
    size distributions to provide an estimate of measured mass 
    concentration. Critical parameters for these idealized distributions 
    are presented in Table F-3 of this subpart.
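
    One conventional way to carry out the numerical integration described in 
this paragraph is shown in the following illustrative sketch (Python), which 
treats each idealized ambient distribution as a lognormal mass distribution 
and integrates the corrected penetration curve against it over aerodynamic 
diameter. The distribution parameters must be taken from Table F-3 of this 
subpart; none are reproduced here, and the integration scheme is an 
assumption rather than a prescribed method.

# Illustrative sketch only: estimated mass concentration from a penetration
# curve and an idealized lognormal mass distribution (parameters from Table F-3).
import math

def lognormal_mass_frequency(d_um, mmad_um, gsd):
    # dM/dlnD for a lognormal mass distribution normalized to unit total mass.
    z = (math.log(d_um) - math.log(mmad_um)) / math.log(gsd)
    return math.exp(-0.5 * z * z) / (math.log(gsd) * math.sqrt(2.0 * math.pi))

def estimated_mass_concentration(penetration, mmad_um, gsd, total_mass_ugm3,
                                 d_min_um=0.01, d_max_um=50.0, steps=2000):
    # penetration: function of aerodynamic diameter (micrometers) returning 0..1.
    ln_lo, ln_hi = math.log(d_min_um), math.log(d_max_um)
    dlnd = (ln_hi - ln_lo) / steps
    total = 0.0
    for k in range(steps):
        d = math.exp(ln_lo + (k + 0.5) * dlnd)    # midpoint in log space
        total += penetration(d) * lognormal_mass_frequency(d, mmad_um, gsd) * dlnd
    return total_mass_ugm3 * total
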
        (b) Technical definitions. Effectiveness is the ratio (expressed as 
    a percentage) of the mass concentration of particles of a specific size 
    reaching the sampler filter or filters to the mass concentration of 
    particles of the same size approaching the sampler.
        (c) Facilities and equipment required. (1) Wind tunnel. The 
    particle delivery system shall consist of a blower system and a wind 
    tunnel having a test section of sufficiently large cross-sectional area 
    such that the test sampler, or portion thereof, as installed in the 
    test section for testing, blocks no more than 15 percent of the test 
    section area. The wind tunnel blower system must be capable of 
maintaining uniform wind speeds of 2 km/hr and 24 km/hr.
        (2) Aerosol generation system. A vibrating orifice aerosol 
    generator shall be used to produce monodisperse solid particles of 
    ammonium fluorescein with equivalent aerodynamic diameters as specified 
    in Table F-2 of this subpart. The geometric standard deviation for each 
    particle size and type generated shall not exceed 1.1 (for primary 
    particles) and the proportion of multiplets (doublets and triplets) in 
all test particle atmospheres shall not exceed 10 percent. The 
    aerodynamic particle diameter, as established by the operating 
    parameters of the vibrating orifice aerosol generator, shall be within 
    the tolerance specified in Table F-2 of this subpart.
        (3) Particle size verification equipment. The size of the test 
    particles shall be verified during this test by use of a suitable 
    instrument (e.g., scanning electron microscope, optical particle 
    counter, time-of-flight apparatus). The instrument must be capable of 
measuring solid and liquid test particles with a size resolution of 0.1 
µm or less. The accuracy of the particle size verification 
technique shall be 0.15 µm or better.
        (4) Wind speed measurement. The wind speed in the wind tunnel shall 
    be determined during the tests using an appropriate technique capable 
    of a precision of 5 percent or better (e.g., hot-wire anemometry). For 
    the wind speeds specified in Table F-2 of this subpart, the wind speed 
    and turbulence intensity (longitudinal component and macro scale) shall 
    be measured at a minimum of 12 test points in a cross-sectional area of 
    the test section of the wind tunnel. The mean wind speed in the test 
    section must be within 10 percent of the value specified in 
    Table F-2 of this subpart, and the variation at any test point in the 
    test section may not exceed 10 percent of the measured mean.
        (5) Aerosol rake. The cross-sectional uniformity of the particle 
    concentration in the sampling zone of the test section shall be 
    established during the tests using an array of isokinetic samplers, 
    referred to as a rake. Not less than five evenly spaced isokinetic 
    samplers shall be used to determine the particle concentration spatial 
    uniformity in the sampling zone. The sampling zone shall be a 
    rectangular area having a horizontal dimension not less than 1.2 times 
    the width of the test sampler at its inlet opening and a vertical 
    dimension not less than 25 centimeters.
        (6) Total aerosol isokinetic sampler. A single isokinetic sampler 
    may be used in place of the array of isokinetic samplers for the 
    determination of particle mass concentration used in the calculation of 
    sampling effectiveness of the test sampler in Sec. 53.62(e)(5). In this 
    case, the array of isokinetic samplers must be used to demonstrate 
    particle concentration uniformity prior to the replicate measurements 
    of sampling effectiveness.
        (7) Fluorometer. A series of calibration standards shall be 
    prepared to encompass the minimum and maximum concentrations measured 
    during size-selective tests. Prior to each calibration and measurement, 
    the fluorometer shall be zeroed using an aliquot of the same solvent 
    used for extracting aerosol mass deposits.
        (8) Sampler flow rate measurements. All flow rate measurements used 
    to calculate the test atmosphere concentrations and the test results 
    must be accurate to within 2 percent, referenced to a NIST-
    traceable primary standard. Any necessary flow rate measurement 
    corrections shall be clearly documented. All flow rate measurements 
    shall be performed and reported in actual volumetric units.
        (d) Test procedures. (1) Establish and verify wind speed.
        (i) Establish a wind speed specified in Table F-2 of this subpart.
        (ii) Measure the wind speed and turbulence intensity (longitudinal 
    component and macro scale) at a minimum of 12 test points in a cross-
    sectional area of the test section of the wind tunnel using a device as 
    described in Sec. 53.62(c)(4).
        (iii) Verify that the mean wind speed in the test section of the 
    wind tunnel during the tests is within 10 percent of the value 
    specified in Table F-2 of this subpart. The wind speed measured at any 
    test point in the test section shall not differ by more than 10 percent 
    from the mean wind speed in the test section.
        (2) Generate aerosol. Generate particles of a size and type 
    specified in Table F-2 of this subpart using a vibrating orifice 
    aerosol generator. Check for the presence of satellites and adjust the 
    generator as necessary. Calculate the physical particle size using the 
    operating parameters of the vibrating orifice aerosol generator and 
    record. Determine the particle's aerodynamic diameter from the 
    calculated physical diameter and the known density of the generated 
    particle. The calculated aerodynamic diameter must be within the 
    tolerance specified in Table F-2 of this subpart.
        (3) Introduce particles into the wind tunnel. Introduce the 
    generated particles into the wind tunnel and allow the particle 
    concentration to stabilize.
        (4) Verify the quality of the test aerosol. (i) Extract a 
    representative sample of the aerosol from the sampling test zone and 
    measure the size distribution of the collected particles using an 
    appropriate sizing technique. If the measurement instrumentation does 
    not provide a direct measure of aerodynamic diameter, calculate the 
    geometric mean aerodynamic diameter using the known density of the 
    particle type in conjunction with the measured mean physical diameter. 
    The determined mean aerodynamic diameter of the test aerosol must be 
within 0.15 µm of the aerodynamic diameter calculated from the 
    operating parameters of the vibrating orifice aerosol generator. The 
    geometric
    
    [[Page 65831]]
    
    standard deviation of the primary particles must not exceed 1.1.
        (ii) Determine the population of multiplets in the collected 
    sample. The multiplet population of the particle test atmosphere must 
    not exceed 10 percent of the total particle population.
        (5) Aerosol uniformity and concentration measurement. (i) Install 
    an array of five or more evenly spaced isokinetic samplers in the 
    sampling zone [Sec. 53.62(c)(5)]. Collect particles on appropriate 
    filters over a time period such that the relative error of the measured 
    particle concentration is less than 5.0 percent.
        (ii) Determine the quantity of material collected with each 
    isokinetic sampler in the array using a calibrated fluorometer. 
    Calculate and record the mass concentration for each isokinetic sampler 
    as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.085
    
    
    Where
    
    i=replicate number
    j=isokinetic sampler number
    Miso=mass of material collected with the isokinetic sampler
    Q=isokinetic sampler volumetric flow rate
    t=sampling time.
    
        (iii) Calculate and record the mean mass concentration as:
        [GRAPHIC] [TIFF OMITTED] TP13DE96.086
        
    
    Where
    
i=replicate number
    j=isokinetic sampler number
    n=total number of isokinetic samplers.
    
        (iv) Precision calculation. (A) Calculate the coefficient of 
    variation of the mass concentration measurements as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.087
    
    
    Where
    
    i=replicate number
    j=isokinetic sampler number
    n=total number of isokinetic samplers.
    
        (B) If the value of CViso(I) for any replicate exceeds 10 percent, 
    the particle concentration uniformity is unacceptable and step 5 must 
    be repeated. If adjustment of the vibrating orifice aerosol generator 
    or changes in the particle delivery system are necessary to achieve 
    uniformity, steps 2 through 5 must be repeated. When an acceptable 
    aerosol spatial uniformity is achieved, remove the array of isokinetic 
    samplers from the wind tunnel.
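    The omitted equations in steps (5)(ii) through (5)(iv) presumably 
    express the per-sampler mass concentration (collected mass divided by 
    the product of flow rate and sampling time), the arithmetic mean across 
    the array, and the coefficient of variation. A minimal sketch with 
    hypothetical masses and an assumed flow rate and sampling time:

import statistics

# Hypothetical fluorometric masses (ug) from five isokinetic samplers,
# one replicate; flow rate and sampling time are assumed values.
masses_ug = [118.0, 121.5, 116.2, 123.0, 119.8]
flow_m3_hr = 1.0
sample_time_hr = 2.0

# Per-sampler concentration: collected mass / (flow rate * sampling time).
conc = [m / (flow_m3_hr * sample_time_hr) for m in masses_ug]

mean_conc = statistics.mean(conc)
cv_iso = statistics.stdev(conc) / mean_conc * 100.0   # coefficient of variation, percent

print(f"mean = {mean_conc:.1f} ug/m3, CV = {cv_iso:.1f}%")
print("uniformity acceptable (CV <= 10%):", cv_iso <= 10.0)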
        (6) Alternative measure of wind tunnel total concentration. If a 
    single isokinetic sampler is used to determine the mean aerosol 
    concentration in the wind tunnel, install the sampler in the wind 
    tunnel with the sampler nozzle centered in the sampling zone 
    [Sec. 53.62(c)(6)].
        (i) Collect particles on an appropriate filter over a time period 
    such that the relative error of the measured concentration is less than 
    5.0 percent.
        (ii) Determine the quantity of material collected with the 
    isokinetic sampler using a calibrated fluorometer.
        (iii) Calculate and record the mass concentration as Ciso(i), as 
    in Sec. 53.62(d)(5)(ii).
        (iv) Remove the isokinetic sampler from the wind tunnel.
        (7) Measure the aerosol with the candidate sampler. (i) Install the 
    test sampler (or portion thereof) in the wind tunnel with the sampler 
    inlet opening centered in the sampling zone. To meet the maximum 
    blockage limit of Sec. 53.62(c)(1) or for convenience, part of the test 
    sampler may be positioned external to the wind tunnel provided that 
    neither the geometry of the sampler nor the length of any connecting 
    tube or pipe is altered. Collect particles for a time period such that 
    the relative error of the measured concentration is less than 5.0 
    percent.
        (ii) Remove the test sampler from the wind tunnel.
        (iii) Determine the quantity of material collected with the test 
    sampler using a calibrated fluorometer. Calculate and record the mass 
    concentration for each replicate as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.088
    
    
    Where
    
    i=replicate number
    Mcand=mass of material collected with the candidate sampler
    Q=candidate sampler volumetric flow rate
    t=sampling time.
    
        (iv) (A) Calculate and record the sampling effectiveness of the 
    candidate sampler as:
    [GRAPHIC] [TIFF OMITTED] TP13DE96.089
    
    
    Where:
    
     i = replicate number.
    
        (B) If a single isokinetic sampler is used for the determination of 
    particle mass concentration, replace Ciso(I) with Ciso.
        (8) Obtain a minimum of three replicate measures of sampling 
    effectiveness and calculate the mean sampling effectiveness. (i) Repeat 
    steps in paragraphs (d) (5) through (7) of this section, as 
    appropriate, to obtain a
    
    [[Page 65832]]
    
    minimum of three valid replicate measurements of sampling 
    effectiveness.
        (ii) Calculate and record the average sampling effectiveness of the 
    test sampler for the particle size and type as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.090
    
    
    Where:
    
    i = replicate number
    n = number of replicates.
    
        (iii) Sampling effectiveness precision. (A) Calculate and record 
    the coefficient of variation for the replicate sampling effectiveness 
    measurements of the test sampler as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.091
    
    
    Where:
    
    i = replicate number
    n = number of replicates.
    
        (B) If the value of CVE exceeds 10 percent, the test run 
    (steps in paragraphs (d)(2) through (8) of this section) must be 
    repeated until an acceptable value is obtained.
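    Steps (7)(iv) and (8)(ii) through (8)(iii) can be sketched as follows, 
    assuming sampling effectiveness is the candidate concentration divided 
    by the mean wind tunnel (isokinetic) concentration for the same 
    replicate, expressed as a percentage; the replicate values are 
    hypothetical:

import statistics

# Hypothetical replicate concentrations (ug/m3): candidate sampler vs.
# mean isokinetic (wind tunnel) concentration for the same replicate.
c_cand = [52.0, 50.4, 53.1]
c_iso_mean = [60.1, 59.2, 60.8]

# Sampling effectiveness per replicate, as a percentage.
effectiveness = [100.0 * c / ci for c, ci in zip(c_cand, c_iso_mean)]

mean_e = statistics.mean(effectiveness)
cv_e = statistics.stdev(effectiveness) / mean_e * 100.0

print(f"E per replicate: {[round(e, 1) for e in effectiveness]}")
print(f"mean effectiveness = {mean_e:.1f}%, CV_E = {cv_e:.1f}%")
print("run acceptable (CV_E <= 10%):", cv_e <= 10.0)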
        (9) Repeat for each particle size and type for the selected wind 
    speed. Repeat steps in paragraphs (d)(2) through (8) of this section 
    until the sampling effectiveness has been measured for all particle 
    sizes and types specified in Table F-2 of this subpart.
        (10) Repeat for each wind speed. Repeat steps in paragraphs (d)(1) 
    through (9) of this section until tests have been successfully conducted 
    for both wind speeds of 2 km/hr and 24 km/hr.
        (e) Calculations. (1) Graphical treatment of effectiveness data. 
    For each wind speed given in Table F-2 of this subpart, plot the 
    particle sampling effectiveness of the test sampler as a function of 
    aerodynamic particle diameter (Dae) on semi-logarithmic graph 
    paper where the aerodynamic particle diameter is the particle size 
    established by the parameters of the VOAG in conjunction with the known 
    particle density. Construct a best-fit, smooth curve through the data 
    by extrapolating the sampling effectiveness curve through 100 percent 
    at an aerodynamic particle size of 0.5 µm and 0 percent at an 
    aerodynamic particle size of 10 µm. Correction for the presence 
    of multiplets shall be performed using the techniques presented by 
    Marple et al. (1987).
        (2) Cutpoint determination. For each wind speed determine the 
    sampler Dp50 cutpoint defined as the aerodynamic particle size 
    corresponding to 50 percent effectiveness from the multiplet corrected 
    smooth curve.
        (3) Expected mass concentration calculation. For each wind speed, 
    calculate the estimated mass concentration measurement for the test 
    sampler under each particle size distribution (Tables F-4, F-5, and F-6 
    of this subpart) and compare it to the mass concentration predicted for 
    the reference sampler, as follows:
        (i) Determine the value of corrected effectiveness using the best-
    fit curve at each of the particle sizes specified in the first column 
    of Table F-4 of this subpart. Record each corrected effectiveness value 
    as a decimal between 0 and 1 in column 2 of Table F-4 of this subpart.
        (ii) Calculate the interval estimated mass concentration 
    measurement by multiplying the values of corrected effectiveness in 
    column 2 by the interval mass concentration values in column 3 and 
    enter the products in column 4 of Table F-4 of this subpart.
        (iii) Calculate the estimated mass concentration measurement by 
    summing the values in column 4 and entering the total as the estimated 
    mass concentration measurement for the test sampler at the bottom of 
    column 4 of Table F-4 of this subpart.
        (iv) Calculate the estimated mass concentration ratio between the 
    candidate method and the reference method as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.092
    
    
    Where:
    
    Ccand(est)=estimated mass concentration measurement for the test 
    sampler, g/m3; and
    Cref(est)=estimated mass concentration measurement for the 
    reference sampler, g/m3 (calculated for the reference 
    sampler and specified at the bottom of column 7 of Table F-4 of this 
    subpart).
    
        (v) Repeat steps in paragraphs (e) (1) through (3) of this section 
    for Tables F-5 and F-6 of this subpart.
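    The numerical integration in paragraphs (e)(3)(i) through (e)(3)(iv) 
    can be sketched as follows; the corrected effectiveness values, 
    interval mass concentrations, and reference-sampler total below are 
    made-up placeholders rather than actual Table F-4 entries:

# Column 2: corrected effectiveness (decimal); column 3: interval mass
# concentration (ug/m3).  Both lists are hypothetical placeholders.
corrected_effectiveness = [1.00, 0.98, 0.90, 0.50, 0.12, 0.02]
interval_mass_conc      = [4.0,  6.5,  8.0,  5.5,  3.0,  1.0]

# Column 4: interval estimated mass concentration = column 2 x column 3.
col4 = [e * c for e, c in zip(corrected_effectiveness, interval_mass_conc)]

c_cand_est = sum(col4)   # estimated candidate concentration (bottom of column 4)
c_ref_est = 25.6         # reference-sampler value (bottom of column 7, assumed)

r_c = c_cand_est / c_ref_est * 100.0
print(f"C_cand(est) = {c_cand_est:.2f} ug/m3, Rc = {r_c:.1f}%")
# Table F-1 requires 95% <= Rc <= 105% for each idealized distribution.
print("passes:", 95.0 <= r_c <= 105.0)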
        (f) Evaluation of test results. The candidate method passes the 
    wind tunnel effectiveness test if the Rc value for each wind speed 
    meets the specification in Table F-1 of this subpart for each of the 
    three particle size distributions.
    
    
    Sec. 53.63  Test Procedure: Wind tunnel inlet aspiration test.
    
        (a) Overview. This test applies to a candidate sampler which 
    differs from the reference method sampler only with respect to the 
    design of the inlet. The purpose of this test is to compare the 
    aspiration of a Class II candidate sampler to that of the reference 
    method sampler's inlet. This wind tunnel test uses a 3.5-micron liquid 
    aerosol in conjunction with wind speeds of 2 km/hr and 24 km/hr. The 
    test atmosphere concentration is alternately measured with the 
    candidate sampler and a reference method device, both of which are 
    operated without the 2.5-micron fractionation device installed. The 
    test conditions are summarized in Table F-2 of this subpart (under the 
    heading of wind tunnel inlet aspiration test). The candidate sampler 
    must meet or exceed the acceptance criteria given in Table F-1 of this 
    subpart.
        (b) Technical definition. Relative aspiration is the ratio 
    (expressed as a percentage) of the aerosol mass concentration measured 
    by the candidate sampler to that measured by a reference method 
    sampler.
        (c) Facilities and equipment required. The facilities and equipment 
    are identical to those required for the full wind tunnel test 
    [Sec. 53.62(c)].
        (d) Test procedure. (1) Establish the wind tunnel test atmosphere. 
    Follow the procedures in Sec. 53.62(d)(1) through Sec. 53.62(d)(4) to 
    establish a test atmosphere for one of the two wind speeds specified in 
    Table F-2 of this subpart.
        (2) Measure the aerosol concentration with the reference sampler. 
    (i) Install the reference sampler (or portion thereof) in the wind 
    tunnel with the sampler inlet opening centered in the sampling zone. To 
    meet the maximum blockage limit of Sec. 53.62(c)(1) or for convenience, 
    part of the test sampler may be positioned external to the wind tunnel 
    provided
    
    [[Page 65833]]
    
    that neither the geometry of the sampler nor the length of any 
    connecting tube or pipe is altered. Collect particles for a time period 
    such that the relative error of the measured concentration [as defined 
    in Sec. 53.61(5)] is less than 5.0 percent.
        (ii) Determine the quantity of material collected with the 
    reference method sampler using a calibrated fluorometer. Calculate and 
    record the mass concentration as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.093
    
    
    Where:
    
    i=replicate number
    Mref=mass of material collected with the reference method sampler
    Q=reference method sampler volumetric flowrate
    t=sampling time.
    
        (iii) Remove the reference method sampler from the tunnel.
        (3) Measure the aerosol concentration with the candidate sampler. 
    (i) Install the candidate sampler (or portion thereof) in the wind 
    tunnel with the sampler inlet centered in the sampling zone. To meet 
    the maximum blockage limit of Sec. 53.62(c)(1) or for convenience, part 
    of the test sampler may be positioned external to the wind tunnel 
    provided that neither the geometry of the sampler nor the length of any 
    connecting tube or pipe is altered. Collect particles for a time period 
    such that the relative error of the measured concentration is less than 
    5.0 percent.
        (ii) Determine the quantity of material collected with the 
    candidate sampler using a calibrated fluorometer. Calculate and record 
    the mass concentration as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.094
    
    
    Where:
    i=replicate number
    Mcand=mass of material collected with the candidate sampler
    Q=candidate sampler volumetric flow rate
    t=sampling time.
    
        (iii) Remove the candidate sampler from the wind tunnel.
        (4) Repeat steps in paragraphs (d) (2) and (3) of this section. 
    Alternately measure the tunnel concentration with the reference sampler 
    and the candidate sampler until four reference sampler and five 
    candidate sampler measurements of the wind tunnel concentration are 
    obtained.
        (e) Calculations. (1) Aspiration ratio. Calculate aspiration ratio 
    for each candidate sampler run as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.095
    
    
    where
    
    i=replicate number.
    
        (2) Precision of aspiration ratio. Calculate the precision of 
    aspiration ratio measurements as the coefficient of variation of the 
    aspiration ratios:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.096
    
    
    where:
    
    i=replicate number
    n=total number of measurements of aspiration ratio.
        (f) Evaluation of test results. The candidate method passes the 
    inlet aspiration test if all values of A and CVA meet the 
    acceptance criteria specified in Table F-1 of this subpart.
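    A sketch of the aspiration ratio and precision calculations follows. 
    It assumes the aspiration ratio is the candidate concentration divided 
    by the reference concentration (expressed as a percentage) and, because 
    the measurements alternate, pairs each candidate run with the mean of 
    the bracketing reference runs; both the pairing scheme and the numbers 
    are illustrative assumptions:

import statistics

# Hypothetical alternating wind tunnel measurements (ug/m3):
# five candidate runs interleaved with four reference runs.
c_ref = [61.0, 60.2, 59.8, 60.6]
c_cand = [58.9, 59.5, 58.2, 59.0, 58.6]

# Assumed pairing: each candidate run is compared with the mean of the
# reference runs that bracket it (end runs use the nearest reference run).
ref_for_cand = [
    c_ref[0],
    statistics.mean(c_ref[0:2]),
    statistics.mean(c_ref[1:3]),
    statistics.mean(c_ref[2:4]),
    c_ref[3],
]

aspiration = [100.0 * c / r for c, r in zip(c_cand, ref_for_cand)]
mean_a = statistics.mean(aspiration)
cv_a = statistics.stdev(aspiration) / mean_a * 100.0

print(f"A per run: {[round(a, 1) for a in aspiration]}")
print(f"mean A = {mean_a:.1f}%, CV_A = {cv_a:.1f}%")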
    
    
    Sec. 53.64  Test Procedure: Static fractionator test.
    
        (a) Overview. This test applies only to those candidate methods in 
    which the sole deviation from the reference method is in the design of 
    the 2.5-micron fractionation device. The purpose of this test is to 
    ensure that the fractionation characteristics of the candidate 
    fractionator are acceptably similar to that of the reference method 
    sampler. It is recognized that various methodologies exist for 
    quantifying fractionator effectiveness. The following commonly-employed 
    techniques are provided for purposes of guidance. Other methodologies 
    for determining sampler effectiveness may be used contingent upon prior 
    approval by the Agency.
        (1) Wash-off method. Effectiveness is determined by measuring the 
    aerosol mass deposited in the candidate sampler's afterfilter versus 
    the aerosol mass deposited in the fractionator. The material deposited 
    in the fractionator is recovered by washing its internal surfaces. For 
    these wash-off tests, a fluorometer must be used to quantitate the 
    aerosol concentration. Note that if this technique is chosen, the 
    candidate must be reloaded with coarse aerosol prior to each test point 
    when reevaluating the curve as specified in the loading test.
        (2) Static chamber method. Effectiveness is determined by measuring 
    the aerosol mass concentration sampled by the candidate sampler's 
    afterfilter versus that which exists in a static chamber. A calibrated 
    fluorometer must be used to quantify the collected aerosol deposits. 
    The aerosol concentration is calculated as the measured aerosol mass 
    divided by the sampled air volume.
        (3) Divided flow method. Effectiveness is determined by comparing 
    the aerosol concentration upstream of the candidate sampler's 
    fractionator versus that concentration which exists downstream of the 
    candidate fractionator. These tests may utilize either fluorometry or a 
    real-time aerosol measuring device to determine the aerosol 
    concentration.
        (b) Technical definition. Effectiveness under static conditions is 
    the ratio (expressed as a percentage) of the mass concentration of 
    particles of a given size reaching the sampler filter to the mass 
    concentration of particles of the same size approaching the sampler.
        (c) Facilities and equipment required.
    
    [[Page 65834]]
    
        (1) Aerosol generation. Methods for generating aerosols shall be 
    identical to those prescribed in Sec. 53.62(c)(2).
        (2) Particle delivery system. The acceptable apparatus for delivering 
    the generated aerosols to the candidate fractionator depends on 
    the effectiveness measurement methodology and is defined as follows:
        (i) Wash-off test apparatus. The aerosol may be delivered to the 
    candidate fractionator through direct piping (with or without an in-
    line mixing chamber). Particle size and quality validation shall be 
    conducted at the point where the fractionator attaches.
        (ii) Static chamber test apparatus. The aerosol shall be introduced 
    into a chamber and sufficiently mixed such that the aerosol 
    concentration within the chamber is spatially uniform. The chamber must 
    be of sufficient size to house at least four total filter samplers, as 
    well as the inlet of the candidate size discriminator. Particle size 
    validation and quality validation shall be conducted on representative 
    aerosol samples extracted from the chamber.
        (iii) Divided flow test apparatus. The apparatus shall allow the 
    aerosol concentration to be measured upstream and downstream of the 
    fractionator. The particles shall be delivered to the divided flow 
    apparatus via a symmetrical flow path.
        (3) Particle concentration measurement.
        (i) Fluorometry. Fluorometers used for quantifying extracted 
    aerosol mass deposits shall be set up, maintained, and calibrated 
    according to the manufacturer's instructions. A series of calibration 
    standards shall be prepared to encompass the minimum and maximum 
    concentrations measured during size-selective tests. Prior to each 
    calibration and measurement, the fluorometer shall be zeroed using an 
    aliquot of the same solvent used for extracting aerosol mass deposits.
        (ii) Number concentration measurement. A number counting device may 
    be used in conjunction with the divided flow test apparatus as 
    described above. This device must have a resolution and accuracy such 
    that primary particles may be distinguished from multiplets for all 
    test aerosols. The measurement of number concentration shall be 
    accomplished by integrating the primary particle peak.
        (d) Setup. (1) Remove the inlet from the candidate fractionator. 
    All test procedures shall be conducted with the inlet removed from the 
    candidate sampler.
        (2) Surface treatment of the fractionator. Rinsing aluminum 
    surfaces with alkaline solutions has been found to adversely affect 
    subsequent fluorometric quantitation of aerosol mass deposits. If wash-
    off tests are to be used for quantifying aerosol penetration, internal 
    surfaces of the fractionator must first be plated with electroless 
    nickel. This plating shall conform to MIL-C-26074, Grade B, Class 4 
    (Reference 4 in appendix A of Subpart E).
        (e) Test Procedure: Wash off method. (1) Clean and dry internal 
    surfaces. Thoroughly clean and dry all internal surfaces of the 
    candidate particle size fractionator. The internal surfaces of the 
    fractionator shall then be prepared in strict accordance with the 
    operating instructions specified in the sampler's operating manual. 
    Note: The procedures in this paragraph must be omitted if this test is 
    being used to evaluate the fractionator after being loaded as specified 
    in Sec. 53.65.
        (2) Generate aerosol. Follow the procedures for aerosol generation 
    prescribed in Sec. 53.62(d)(2).
        (3) Verify the quality of the test aerosol. Follow the procedures 
    for verification of test aerosol size and quality prescribed in 
    Sec. 53.62(d)(4).
        (4) Determine effectiveness for the particle size and type being 
    produced. (i) Collect particles downstream of the fractionator on an 
    appropriate filter over a time period such that the relative error of 
    the measurement is less than 5.0 percent.
        (ii) Determine the quantity of material collected on the 
    afterfilter of the candidate method using a calibrated fluorometer. 
    Calculate and record the aerosol mass concentration for the sampler 
    filter as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.097
    
    
    where:
    i=replicate number
    Mcand=mass of material collected with the candidate sampler
    Q=candidate sampler volumetric flowrate
    t=sampling time.
    
        (iii) Wash all interior surfaces upstream of the filter and 
    determine the quantity of material collected using a calibrated 
    fluorometer. Calculate and record the fluorometric mass concentration 
    of the sampler wash as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.098
    
    
    where:
    i=replicate number
    Mwash=mass of material washed from the interior surfaces of the 
    fractionator
    Q=candidate sampler volumetric flowrate
    t=sampling time.
        (iv) Calculate and record the sampling effectiveness of the test 
    sampler for this particle size as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.099
    
    
    where i=replicate number.
    
        (v) Repeat steps in paragraphs (e)(4)(i) through (iv) of this 
    section, as appropriate, to obtain a minimum of three replicate 
    measurements of sampling effectiveness.
        (vi) Calculate and record the average sampling effectiveness of the 
    test sampler as:
    [GRAPHIC] [TIFF OMITTED] TP13DE96.100
    
    
    where:
    i=replicate number
    n=number of replicates.
        (vii) (A) Calculate and record the coefficient of variation for the 
    replicate sampling effectiveness measurements of the test sampler as:
    
    
    [[Page 65835]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.101
    
    
    
    where:
    i=replicate number
    n=total number of measurements.
        (B) If the value of CVE exceeds 10 percent, then steps in 
    paragraphs (e) (2) through (4) of this section must be repeated. Note 
    that the sampler must be loaded according to the test procedures in 
    Sec. 53.65 prior to retesting each point if this test is being used as 
    a post-evaluation to satisfy the requirements of Sec. 53.65.
        (5) Repeat steps in paragraphs (e) (1) through (4) of this section 
    for each particle size and type specified in Table F-2 of this subpart.
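    A sketch of the wash-off effectiveness calculation follows, assuming 
    the omitted equation defines effectiveness as the afterfilter 
    concentration divided by the sum of the afterfilter and fractionator-
    wash concentrations (that is, all material approaching the fractionator 
    is recovered either on the filter or in the wash); the replicate values 
    are hypothetical:

import statistics

# Hypothetical wash-off replicates: afterfilter and fractionator-wash
# concentrations (ug/m3) for the same particle size.
c_cand = [30.2, 29.5, 31.0]   # material reaching the afterfilter
c_wash = [28.8, 30.1, 28.0]   # material recovered by washing the fractionator

# Assumed definition: the concentration approaching the fractionator is the
# sum of what penetrates to the filter and what deposits in the fractionator.
effectiveness = [100.0 * f / (f + w) for f, w in zip(c_cand, c_wash)]

mean_e = statistics.mean(effectiveness)
cv_e = statistics.stdev(effectiveness) / mean_e * 100.0
print(f"mean effectiveness = {mean_e:.1f}%, CV_E = {cv_e:.1f}%")
print("repeat the run if CV_E exceeds 10%:", cv_e > 10.0)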
        (f) Test procedure: Static chamber method.
        (1) Generate aerosol. Follow the procedures for aerosol generation 
    prescribed in Sec. 53.62(d)(2).
        (2) Verify the quality of the test aerosol. Follow the procedures 
    for verification of test aerosol size and quality prescribed in 
    Sec. 53.62(d)(4).
        (3) Introduction of particles into chamber. Introduce the particles 
    into the static chamber and allow the particle concentration to 
    stabilize.
        (4) Install and operate the candidate sampler and at least four 
    total filters. (i) Install the fractionator and an array of four or 
    more equally spaced filter samplers such that the filters surround and 
    are in the same plane as the inlet of the fractionator.
        (ii) Collect particles on an appropriate filter for a time period 
    such that the relative error of the measured concentration is less than 
    5.0 percent.
        (5) Calculate the aerosol spatial uniformity in the chamber. (i) 
    Determine the quantity of material collected with each total filter 
    sampler in the array using a calibrated fluorometer. Calculate and 
    record the mass concentration for each total filter sampler as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.102
    
    
    where:
    i=replicate number
    j=total filter sampler number
    Mtotal=mass of material collected with the total filter sampler
    Q=total filter sampler volumetric flowrate
    t=sample time.
    
        (ii) Calculate and record the mean mass concentration as:
    
        [GRAPHIC] [TIFF OMITTED] TP13DE96.103
        
    
    where:
    n=total number of samplers
    i=replicate number
    j=filter sampler number.
    
        (iii) (A) Calculate and record the coefficient of variation of the 
    total mass concentration as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.104
    
    
    where:
    i=replicate number
    j=total filter sampler number
    n=number of total filter samplers.
    
        (B) If the value of CVtotal exceeds 10 percent, then the 
    particle concentration uniformity is unacceptable, alterations to the 
    static chamber test apparatus must be made, and steps in paragraphs (f) 
    (1) through (5) of this section must be repeated.
        (6) Calculate the effectiveness of the candidate sampler. (i) 
    Determine the quantity of material collected on the candidate sampler's 
    afterfilter using a calibrated fluorometer. Calculate and record the 
    mass concentration for the candidate sampler as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.105
    
    
    where:
    i=replicate number
    Mcand=mass of material collected with the candidate sampler
    Q=candidate sampler volumetric flowrate
    t=sample time.
    
        (ii) Calculate and record the sampling effectiveness of the 
    candidate sampler as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.106
    
    
    where i=replicate number.
    
        (iii) Repeat steps in paragraphs (f)(4) through (6) of this section, 
    as appropriate, to obtain a minimum of three replicate measurements of 
    sampling effectiveness.
        (iv) Calculate and record the average sampling effectiveness of the 
    test sampler as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.107
    
    
    where i=replicate number.
    
        (v)(A) Calculate and record the coefficient of variation for the 
    replicate sampling effectiveness measurements of the test sampler as:
    
    
    [[Page 65836]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.108
    
    
    
    where:
    i = replicate number
    n = number of measurements of effectiveness.
    
        (B) If the value of CVE exceeds 10 percent, then the test run 
    (steps in paragraphs (f) (2) through (6) of this section) must be 
    repeated.
        (7) Repeat steps in paragraphs (f) (1) through (6) of this section 
    for each particle size and type specified in Table F-2 of this subpart.
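    A sketch of the static chamber calculations in steps (5) and (6) 
    follows, assuming effectiveness is the candidate afterfilter 
    concentration divided by the mean total-filter concentration in the 
    chamber, consistent with the technical definition in paragraph (b) of 
    this section; the concentrations are hypothetical:

import statistics

# One hypothetical replicate: four total-filter concentrations surrounding
# the candidate fractionator, plus the candidate afterfilter concentration
# (all in ug/m3).
c_total = [44.8, 46.1, 45.3, 45.9]
c_cand = 41.7

mean_total = statistics.mean(c_total)
cv_total = statistics.stdev(c_total) / mean_total * 100.0   # chamber uniformity

effectiveness = 100.0 * c_cand / mean_total
print(f"chamber CV = {cv_total:.1f}% (must not exceed 10%)")
print(f"sampling effectiveness = {effectiveness:.1f}%")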
        (g) Test procedure: Divided flow method.--(1) Generate calibration 
    aerosol. Follow the procedures for aerosol generation prescribed in 
    Sec. 53.62(d)(2).
        (2) Verify the quality of the calibration aerosol. Follow the 
    procedures for verification of calibration aerosol size and quality 
    prescribed in Sec. 53.62(d)(4).
        (3) Introduce the calibration aerosol into the static chamber and 
    allow the particle concentration to stabilize.
        (4) Validate that transport is equal for the divided flow option.
        (i) With fluorometry (this applies only if fluorometry is used for 
    detection of particles):
        (A) Install a total filter on each leg of the divided flow 
    apparatus.
        (B) Collect particles simultaneously through both legs at 16.7 aLpm 
    onto an appropriate filter for a time period such that the relative 
    error of the measured concentration is less than 5.0 percent.
        (C) Determine the quantity of material collected on each filter 
    using a calibrated fluorometer. Calculate and record the mass 
    concentration measured in each leg as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.109
    
    
    where:
    i = replicate number
    M = mass of material collected with the total filter
    Q = candidate sampler volumetric flowrate.
    
        (D) Repeat steps in paragraphs (g)(4)(i) (A) through (C) of this 
    section until a minimum of three replicate measurements are 
    performed.
        (ii) With a number counting device such as an aerosol detector:
        (A) Remove all flow obstructions from the flow paths of the two 
    legs.
        (B) Quantify the aerosol concentration of the primary particles in 
    each leg of the apparatus.
        (C) Repeat steps in paragraphs (g)(4)(ii) (A) and (B) of this 
    section until a minimum of three replicate measurements are 
    performed.
        (iii) (A) Calculate the mean concentration and coefficient of 
    variation as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.110
    
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.111
    
    
    where:
    i = replicate number
    n = number of replicates.
    
        (B) If the coefficient of variation is not less than 10 percent, 
    then adjustments may be made in the setup, and this step must be 
    repeated.
        (5) Determine the sampling effectiveness of the test sampler with 
    the inlet removed by one of the following procedures. (i) With 
    fluorometry as a detector:
        (A) Install the particle size fractionator. Install a filter 
    downstream of one leg and a total filter on the bypass leg of the flow 
    dividing apparatus.
        (B) Collect particles simultaneously through both legs at 16.7 aLpm 
    onto appropriate filters for a time period such that the relative error 
    of the measured concentration is less than 5.0 percent.
        (C) Determine the quantity of material collected on each filter 
    using a calibrated fluorometer. Calculate and record the mass 
    concentration measured by the total filter and that measured after 
    penetrating through the candidate fractionator as follows:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.112
    
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.113
    
    
    where i= replicate number.
        (ii) With a number counting device as a detector:
        (A) Install the particle size fractionator into one of the legs of 
    the divided flow apparatus.
        (B) Quantify and record the aerosol number concentration of the 
    primary particles passing through the fractionator as Ccand(I).
        (C) Divert the flow from the leg containing the candidate 
    fractionator to the bypass leg. Allow sufficient time for the aerosol 
    concentration to stabilize.
        (D) Quantify and record the aerosol number concentration of the 
    primary particles passing through the bypass leg as Ctotal(I).
        (iii) Calculate and record sampling effectiveness of the candidate 
    sampler as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.114
    
    
    where i = replicate number.
    
        (6) Repeat step in paragraph (g)(5) of this section, as 
    appropriate, to obtain a minimum of three replicate measurements of 
    sampling effectiveness.
        (7) Calculate the mean and CV for replicate measurements.
        (i) Calculate and record the mean sampling effectiveness of the 
    candidate sampler as:
    
    
    [[Page 65837]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.115
    
    
    
    Where i=replicate number.
    
        (ii)(A) Calculate and record the coefficient of variation for the 
    replicate sampling effectiveness measurements of the candidate sampler 
    as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.116
    
    
    Where:
    i=replicate number
    n=number of replicates.
    
        (B) If the coefficient of variation is not less than 10 percent, 
    then the test run must be repeated (steps in paragraphs (g) (1) through 
    (7) of this section).
        (8) Repeat steps in paragraphs (g) (1) through (7) of this section 
    for each particle size and type specified in Table F-2 of this subpart.
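    A sketch of the divided flow calculations follows, assuming the 
    transport validation pools the replicate leg concentrations into a 
    single coefficient of variation and that effectiveness is the 
    concentration penetrating the fractionator divided by the bypass 
    (total) leg concentration; all values are hypothetical:

import statistics

# Hypothetical transport-validation runs: concentrations measured
# simultaneously in the two unobstructed legs (same units for both legs).
leg_a = [102.0, 99.5, 101.2]
leg_b = [100.8, 100.1, 99.0]

all_runs = leg_a + leg_b
cv_split = statistics.stdev(all_runs) / statistics.mean(all_runs) * 100.0
print(f"flow-split CV = {cv_split:.1f}% (adjust the setup if not below 10%)")

# Hypothetical effectiveness replicates with the fractionator installed:
# concentration penetrating the fractionator vs. the bypass (total) leg.
c_cand = [48.0, 47.1, 48.9]
c_total = [100.5, 99.8, 101.0]
effectiveness = [100.0 * c / t for c, t in zip(c_cand, c_total)]
mean_e = statistics.mean(effectiveness)
cv_e = statistics.stdev(effectiveness) / mean_e * 100.0
print(f"mean effectiveness = {mean_e:.1f}%, CV_E = {cv_e:.1f}%")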
        (h) Calculations. (1) Treatment of multiplets. For all measurements 
    made by fluorometric analysis, data shall be corrected for the presence 
    of multiplets as described in Sec. 53.62(e)(1). Data collected using a 
    real-time device with sufficient resolution to discriminate primary 
    particles from multiplets will not require multiplet correction.
        (2) Cutpoint determination. Determine the sampler Dp50 cutpoint, 
    defined as the aerodynamic particle size corresponding to 50 percent 
    effectiveness, from the multiplet-corrected smooth curve.
        (3) Graphical analysis and numerical integration with ambient 
    distributions. Follow the steps outlined in Sec. 53.62(e)(1) through 
    Sec. 53.62(e)(3) to calculate the estimated concentration measurement 
    ratio between the candidate sampler and a reference method sampler.
        (i) Test evaluation. The candidate method passes the static 
    fractionator test if the values of Rc and Dp50 for each 
    distribution meet the specifications in Table F-1 of this subpart.
    
    
    Sec. 53.65  Test Procedure: Loading Test
    
        (a) Overview. (1) The loading tests are designed to quantify any 
    appreciable changes in a candidate method's performance as a function 
    of coarse aerosol collection. This test is divided into two phases:
        (i) A mandatory demonstration that the candidate method is capable 
    of single-day sampling with periodic maintenance after each 24 hours of 
    operation; and
        (ii) An optional demonstration that the candidate is capable of 
    multi-day sampling with the periodic maintenance schedule as defined by 
    the manufacturer.
        (2) In the first phase, the candidate sampler is first exposed to a 
    laboratory-generated aerosol equivalent to sampling a nominal 
    concentration of 150 µg/m\3\ over a 24-hour time period. 
    Following this initial loading, the candidate sampler's effectiveness 
    as a function of particle aerodynamic diameter must then be evaluated 
    by performing the test in Sec. 53.62 (full wind tunnel test). A 
    sampler which fits the category of fractionator deviation in 
    Sec. 53.60(e)(2) may opt to perform the test in Sec. 53.64 (static 
    fractionator test) in lieu of the full wind tunnel test. The candidate 
    sampler is approved for single day sampling with maintenance after each 
    24 hours of operation if the criteria in Table F-1 of this subpart are 
    met for the 24-hour loading test.
        (3) In the test for extended periodic maintenance, the candidate 
    sampler is exposed to a mass of coarse aerosol equivalent to sampling a 
    mass concentration of 150 µg/m\3\ over the time period that the 
    manufacturer has specified between periodic cleaning. The candidate 
    sampler's effectiveness as a function of particle aerodynamic diameter 
    must then be evaluated by performing the test in Sec. 53.62 (full wind 
    tunnel test). A sampler which fits the category of fractionator 
    deviation in Sec. 53.60(e)(2) may opt to perform the test in Sec. 53.64 
    (static fractionator test) in lieu of the full wind tunnel test. If the 
    criteria presented in Table F-1 of this subpart are met for this test, 
    the candidate sampler is approved for multi-day sampling with the 
    periodic maintenance schedule as specified by the manufacturer. For 
    example, if the candidate sampler passes the reevaluation tests 
    following loading with an aerosol mass equivalent to sampling a 150 
    µg/m\3\ aerosol continuously for 7 days, then the sampler is 
    approved for 7 day field operation before cleaning is required.
        (b) Technical Definitions. (1) Effectiveness after loading. 
    Effectiveness after loading is the ratio (expressed as a percentage) of 
    the mass concentration of particles of a given size reaching the 
    sampler filter to the mass concentration of particles of the same size 
    approaching the sampler.
        (2) Effectiveness after extended loading. Effectiveness after 
    extended loading is the ratio (expressed as a percentage) of the mass 
    concentration of particles of a given size reaching the sampler filter 
    to the mass concentration of particles of the same size approaching the 
    sampler.
        (c) Facilities and equipment required. (1) Particle delivery 
    system. The particle delivery system shall consist of a static chamber 
    or a low velocity wind tunnel having a sufficiently large cross-
    sectional area such that the test sampler, or portion thereof, may be 
    installed in the test section. At a minimum, the system must have a 
    sufficiently large cross section to house the candidate sampler inlet 
    as well as a collocated isokinetic nozzle for measuring total aerosol 
    concentration. The mean velocity in the test section of the static 
    chamber or wind tunnel shall not exceed 2 km/hr.
        (2) Aerosol generation equipment. For purposes of these tests, the 
    test aerosol shall be produced from commercially available, bulk 
    Arizona road dust. To provide direct interlaboratory comparability of 
    sampler loading characteristics, the bulk dust is specified as 0-10 
    µm ATD available from Powder Technology Incorporated 
    (Burnsville, MN). To efficiently deagglomerate the bulk test dust, 
    either a fluidized bed aerosol generator, Wright dust feeder, or sonic 
    nozzle shall be used for the aerosol generation. Other dust generators 
    may be used contingent upon prior approval by the Agency.
        (3) Isokinetic sampler. Mean aerosol concentration within the 
    static chamber or wind tunnel shall be established using a single 
    isokinetic sampler containing a preweighed high-efficiency total 
    filter.
        (d) Test Procedure: 24 hour loading test. (1) Clean the candidate 
    sampler. Internal surfaces of the candidate sampler shall be thoroughly 
    cleaned and dried prior to performing these tests. The internal 
    fractionator surfaces shall then be prepared in strict accordance
    
    [[Page 65838]]
    
    with the operating instructions in the sampler's operating manual 
    referred to in Sec. 53.4(b)(3). Install the candidate sampler's inlet 
    and the isokinetic sampler within the test chamber or wind tunnel.
        (2) Generate a dust cloud. Generate a dust cloud composed of 
    Arizona test dust and introduce the dust cloud into the chamber. Allow 
    sufficient time for the particle concentration to become steady within 
    the chamber.
        (3) Sample aerosol with a total filter and the candidate sampler. 
    Sample the aerosol for a sufficient time to produce an equivalent time 
    weighted concentration (TWC) of 3600 µg hr/m\3\. For example, 
    this TWC level may be achieved by sampling a 150 µg/m\3\ mean 
    concentration for 24 hours. Alternatively, a 900 µg/m\3\ 
    concentration may be sampled for a 4-hour time period to produce an 
    equivalent TWC value. Following shutdown of the system, record the 
    sampling time and all aerosol generation parameters.
        (4) Determine the time-weighted concentration. (i) Weigh the 
    isokinetic sampler's total filter on a gravimetric balance such that 
    the relative error is less than 5.0 percent. Subtract the filter's 
    initial mass from the final mass to determine the collected aerosol 
    mass.
        (ii)(A) Calculate and record the TWC as:
    
        [GRAPHIC] [TIFF OMITTED] TP13DE96.117
        
    
    where:
M=collected aerosol mass, µg
    Q=candidate volumetric flowrate, m\3\/hr
    t=sampling time, hr.
    
        (B) If the value of TWC deviates from 3600 µg hr/m\3\ by more 
    than ±15 percent, then the loaded mass is unacceptable and steps 
    in paragraphs (d) (1) through (3) of this section must be repeated.
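    The TWC calculation amounts to the mean sampled concentration 
    multiplied by the sampling time (equivalently, the collected mass 
    divided by the flow rate); a minimal sketch with hypothetical values:

# Hypothetical 24-hour loading run; the flow rate is an assumed value.
collected_mass_ug = 3550.0      # mass on the isokinetic sampler's total filter
flow_m3_hr = 1.0
sample_time_hr = 24.0

mean_conc = collected_mass_ug / (flow_m3_hr * sample_time_hr)   # ug/m3
twc = mean_conc * sample_time_hr                                # ug*hr/m3 (equals M/Q)

target = 3600.0
acceptable = abs(twc - target) <= 0.15 * target
print(f"TWC = {twc:.0f} ug*hr/m3, within +/-15% of 3600: {acceptable}")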
        (5) Determine the candidate's performance after loading. The 
    candidate sampler's effectiveness as a function of particle aerodynamic 
    diameter must then be evaluated by performing the test in 
    Sec. 53.62 (full wind tunnel test). A sampler which fits the category 
    of fractionator deviation in Sec. 53.60(e)(2) may opt to perform the 
    test in Sec. 53.64 (static fractionator test) in lieu of the full wind 
    tunnel test.
        (e) Test Procedure: Extended loading test. (1) Calculate the target 
    loading mass. Calculate and record the time weighted concentration of 
    Arizona road dust which is equivalent to exposing the sampler in an 
    environment of 150 g/m\3\ over the time specified by the 
    vendor as:
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.118
    
    
    where t = the number of hours specified by the manufacturer prior to 
    periodic cleaning.
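    The target value is simply 150 µg/m\3\ multiplied by the 
    manufacturer's specified cleaning interval in hours; for example:

def target_twc(hours_between_cleaning):
    # 150 ug/m3 sustained over the specified cleaning interval, in ug*hr/m3.
    return 150.0 * hours_between_cleaning

print(target_twc(168))   # a 7-day (168-hour) interval gives 25200 ug*hr/m3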
    
        (2) Clean the candidate sampler. Internal surfaces of the candidate 
    sampler shall be cleaned and dried prior to performing these loading 
    tests. The internal fractionator surfaces shall then be prepared in 
    strict accordance with the operating instructions specified in the 
    sampler's operating manual referred to in Sec. 53.4(b)(3). Install the 
    candidate sampler's inlet and the isokinetic sampler within the test 
    chamber or wind tunnel.
        (3) Generate a dust cloud. Generate a dust cloud composed of 
    Arizona test dust and introduce the dust cloud into the chamber. Allow 
    sufficient time for the particle concentration to become steady within 
    the chamber.
        (4) Sample aerosol with a total filter and the candidate sampler. 
    Sample the aerosol for a time sufficient to produce an equivalent TWC 
    within ±15 percent of the target TWC. Following 
    shutdown of the system, record the sampling time and all aerosol 
    generation parameters.
        (5) Determine the time weighted concentration. Weigh the isokinetic 
    sampler's total filter on a gravimetric balance such that the relative 
    measurement error is less than 5.0 percent. Subtract the filter's 
    initial mass from the final mass to determine the collected aerosol 
    mass.
        (i) (A) Calculate and record the TWC as:
    
        [GRAPHIC] [TIFF OMITTED] TP13DE96.119
        
    
        (B) If the value of TWC deviates from the target TWC by more than 
    ±15 percent, then the loaded mass is unacceptable and steps in 
    paragraphs (e) (1) through (4) of this section must be repeated.
        (6) Determine the candidate's effectiveness after extended loading. 
    The candidate sampler's effectiveness as a function of particle 
    aerodynamic diameter must then be evaluated by performing the test in 
    Sec. 53.62 (full wind tunnel test). A sampler which fits the category 
    of fractionator deviation in Sec. 53.60(e)(2) may opt to perform the 
    test in Sec. 53.64 (static fractionator test) in lieu of the full wind 
    tunnel test.
        (f) Test results. (1) 24-hour test results. If the Rc values 
    determined in the effectiveness evaluation pass the criteria 
    established in Table F-1 of this subpart for the 24-hour loading test, 
    then the candidate passes this test with the stipulation that the 
    sampling train be cleaned after each 24 hours of operation.
        (2) Extended test results. If the Rc values determined in the 
    effectiveness evaluation pass the criteria established in Table F-1 of 
    this subpart for the extended loading test, then the candidate sampler 
    passes this test with the stipulation that the sampling train be 
    cleaned at least as often as the frequency tested.
    
    
    Sec. 53.66  Test Procedure: Volatility test.
    
        (a) Overview. This test procedure is designed to ensure that the 
    candidate sampler's volatility losses when sampling semi-volatile 
    ambient aerosol will be comparable to that of a federal reference 
    method sampler. The candidate sampler must meet or exceed the 
    acceptance criteria in Table F-1 of this subpart.
        (b) Technical definition. Residual mass (RM) is defined as the 
    difference between the final filter weight following the blow-off phase 
    and the initial filter weight preceding the loading phase.
        (c) Facilities and equipment required. (1) Chambers and test 
    atmosphere. This test requires two chambers, one inside the other. The 
    internal chamber is used to produce a well-mixed test atmosphere from 
    which the sampling is performed. The air velocity in the chamber shall 
    be 2.0 km/hr ± 10 percent, perpendicular to the sampling 
    inlet. The test section shall be sufficiently large such that the 
    inlet, or portion installed thereof, shall block no more than 15 percent 
    of the chamber cross section in the test area. At least one reference 
    and one candidate sampler must be tested simultaneously. Such a 
    configuration is designated as a case. Each case needs to be repeated 
    three times for each of the different blow-off phases (1, 2, 3, 4 hours 
    in duration). The external chamber is used to condition, handle and 
    weigh filters. The temperature in both chambers shall be maintained at 
    22 ± 0.5 deg. C. The relative humidity (RH) in both chambers 
    shall be maintained at 40 percent ± 3 percent.
    
    [[Page 65839]]
    
        (2) Aerosol generation system. A pressure nebulizer shall be used 
    to produce a polydisperse aerosol at a mass median diameter of less 
    than 2.5 µm. The polydisperse aerosol shall be generated from 
    A.C.S. reagent grade glycerol of 99.5 percent minimum purity. To 
    provide direct interlaboratory comparability of sampler volatility 
    characteristics, the required nebulizer is Part # 5207, manufactured by 
    Seamless, a division of Professional Medical Products, Inc (Greenwood, 
    SC). The concentration of the aerosol inside the internal chamber shall 
    not exceed 2 mg/m3, nor any lower concentration that would overload the 
    filters (such overloading can be observed as ``wetted areas''). The 
    concentration inside the chamber shall be at least 1 mg/m3 to 
    obtain significant filter loading.
        (3) Air velocity verification. The chamber air velocity must be 
    measured using an appropriate technique capable of 5 percent precision 
    or better.
        (d) Test procedures. (1) This procedure shall be used to test the 
    performance of candidate equivalent methods of type I and type II in 
    which suspended particulate matter is collected on a filter. Two 
    candidate samplers and two reference method samplers must be tested. 
    One reference method sampler and one candidate sampler must be 
    simultaneously subjected to the entire test procedure to ensure that 
    both samplers are exposed to the identical aerosol. This can be 
    achieved by using a manifold which allows connection of two samplers 
    outside the internal chamber.
        (2) This method consists of three consecutive phases. In the first 
    phase, designated as A, the temperature and relative humidity inside 
    and outside the internal chamber must be maintained at the levels 
    prescribed in paragraph (c)(1) of this section, and the aerosol 
    concentration and size distribution inside the internal chamber must be 
    stabilized at the level prescribed in paragraph (c)(2) of this section. 
    The samplers' 
    filters are conditioned dynamically by drawing aerosol-free air. Such 
    air can be produced by filtering air from the external chamber through 
    the absolute (HEPA) filter. The duration of filter conditioning shall 
    be sufficient to obtain complete filter equilibration. In the second 
    phase, designated as B, both samplers shall draw aerosol-laden air at a 
    constant flow rate for 30 minutes. In the third phase designated as C, 
    samplers draw aerosol-free and aerosol compound vapor free air, to 
    produce partial volatilization of the collected aerosol, over single 
    time periods of 1, 2, 3, and 4 hours. In each test, phase C is preceded 
    by phase A and phase B using a new set of filters. Phase C shall be 
    conducted immediately after completion of the phase B. The setup used 
    in phase A can be used to produce air needed in phase C.
        (e) Filter handling. Careful handling of the filter during 
    sampling, conditioning, and weighing is necessary to avoid errors due 
    to damaged filters or loss of collected particles from the filters. All 
    filters must be weighed immediately after phase A and phase C.
        (f) Temperature, humidity, and static charge considerations.--(1) 
    Temperature and humidity. The effects of temperature and humidity can 
    be minimized by equilibrating the test filters at conditions inside the 
    external chamber. Total dynamic conditioning can be established by 
    sequential filter weighing every 30 minutes following repetitive 
    dynamic conditioning. The filters are considered sufficiently 
    conditioned if the sequential weights are repeatable to within 
    ±3 µg. The temperature and relative humidity to which the filter 
    is exposed during the entire procedure must not vary by more than 
    ±0.5 deg. C and ±3 percent RH, respectively.
        (2) Static charge. The following procedure is suggested for 
    minimizing charge effects. Place six or more Polonium static control 
    devices (PSCD) inside the microbalance weighing chamber, (MWC). Two of 
    them must be placed horizontally on the floor of the MWC and the 
    remainder placed vertically on the back wall of the MWC. Taping two 
    PSCD's together or using double-sided tape will help to keep them from 
    falling. Place the filter that is to be weighed on the horizontal PSCDs 
    with the aerosol-coated surface facing up. Close the MWC and wait 1 
    minute. Open 
    the MWC and place the filter on the balance dish. Wait 1 minute. If the 
    charges have been neutralized the weight will stabilize within 30-60 
    seconds. Repeat the procedure of neutralizing charges and weighing as 
    prescribed above several times (typically 2-4 times) until consecutive 
    weights differ by no more than 3 micrograms. Record the last 
    measured weight and use this value for all subsequent calculations.
        (g) Artifacts. Additional negative or positive artifacts in 
    collected mass during the first sampling period may occur. Such 
    artifacts shall be minimized by producing and preserving the chemical 
    composition of the air inside the internal chamber to provide 
    thermodynamic and physicochemical states of equilibrium for the 
    particles.
        (h) Calculations. Filters shall be weighed before the aerosol 
    loading phase and immediately after the blow-off phase. The initial 
    (pre-loading) weight is subtracted from the final (post-blow-off) 
    weight to calculate the residual mass (RM), consistent with the 
    definition in paragraph (b) of this section. The residual mass for the 
    tested candidate sampler is multiplied by the volumetric sampling flow 
    ratio, i.e., FRM flow rate/candidate flow rate, to produce a corrected 
    residual mass (CRM).
        (i) Test for comparability. Comparability of the candidate method 
    shall be established by calculating regression parameters for the 
    regression of the CRMs obtained using candidate devices on RMs obtained 
    using FRM devices. If the linear regression parameters [slope, 
    intercept and correlation] meet the following values: slope = 1 ± 0.1, 
    intercept = 0 ± 0.15, correlation r ≥ 0.97, the candidate method 
    passes this test for comparability.
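    A sketch of the comparability calculation follows: an ordinary least-
    squares regression of the candidate CRM values on the reference RM 
    values, with the slope, intercept, and correlation checked against the 
    stated limits; the paired residual masses are hypothetical:

import math

# Hypothetical paired residual masses (ug) from the volatility test runs:
# RM from the reference method samplers, CRM from the candidate samplers
# (already corrected by the FRM/candidate flow-rate ratio).
rm  = [300.0, 260.0, 220.0, 180.0, 150.0, 120.0, 95.0, 70.0]
crm = [294.1, 255.0, 215.7, 176.5, 147.2, 117.7, 93.1, 68.7]

n = len(rm)
mean_x, mean_y = sum(rm) / n, sum(crm) / n
sxx = sum((x - mean_x) ** 2 for x in rm)
syy = sum((y - mean_y) ** 2 for y in crm)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(rm, crm))

slope = sxy / sxx
intercept = mean_y - slope * mean_x
r = sxy / math.sqrt(sxx * syy)

passes = (abs(slope - 1.0) <= 0.1) and (abs(intercept) <= 0.15) and (r >= 0.97)
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, r = {r:.4f}, passes: {passes}")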
    
    Tables to Subpart F of Part 53
    
      Table F-1.--Performance Specifications for PM2.5 Class II Equivalent  
                                    Samplers                                
    ------------------------------------------------------------------------
          Performance test           Specifications      Acceptance criteria
    ------------------------------------------------------------------------
    Full Wind Tunnel Evaluation   VOAG-produced solid   Dp50 = 2.5 µm ± 0.2 
     Sec.  53.62.                  particles; wind       µm; Numerical      
                                   speeds of 2 km/hr     Analysis Results:  
                                   and 24 km/hr.         95% ≤ Rc ≤ 105% for
                                                         distributions      
                                                         presented in Tables
                                                         F-4, F-5, and F-6. 
    Wind Tunnel Inlet Aspiration  3.5 µm liquid VOAG-   Relative Aspiration:
     Test Sec.  53.63.             produced particle     95% ≤ mean A ≤     
                                   size in conjunction   105%; CV ≤ 10%.    
                                   with wind speeds of                      
                                   2 km/hr and 24 km/                       
                                   hr.                                      
    Static Fractionator Test      Evaluation of the     Dp50 = 2.5 µm ± 0.2 
     Sec.  53.64.                  fractionator under    µm; Numerical      
                                   static conditions.    Analysis Results:  
                                   See Table F-2 for     95% ≤ Rc ≤ 105% for
                                   specified particle    distributions      
                                   sizes and types.      presented in Tables
                                                         F-4, F-5, and F-6. 
    
    [[Page 65840]]
    
                                                                            
    Loading Test Sec.  53.65....  Loading of the clean  24 hour test and    
                                   candidate under       Extended test: Dp50
                                   laboratory            = 2.5 µm ± 0.2 µm; 
                                   conditions: 24 hour   Numerical Analysis 
                                   test, extended test.  Results: 95% ≤ Rc ≤
                                                         105% for           
                                                         distributions      
                                                         presented in Tables
                                                         F-4, F-5, and F-6. 
    Volatility Test Sec.  53.66.  Polydisperse liquid   Regression          
                                   aerosol produced by   Parameters: Slope =
                                   air nebulization of   1 ± 0.1, Intercept 
                                   A.C.S. reagent grade  = 0 ± 0.15,        
                                   glycerol of 99.5%     correlation r ≥    
                                   minimum purity.       0.97.              
    ------------------------------------------------------------------------
    
    
     Table F-2.--Particle Sizes and Wind Speeds for Full Wind Tunnel Evaluation, Wind Tunnel Inlet Aspiration Test, 
                                                  and Static Chamber Test                                            
    ----------------------------------------------------------------------------------------------------------------
                                             Full wind tunnel test    Inlet aspiration test       Static            
     Primary particle mean size \a\ (µm)    -----------------------  -----------------------   fractionator  Volatility
                                              2 km/hr     24 km/hr     2 km/hr     24 km/hr        test         test  
    ----------------------------------------------------------------------------------------------------------------
    1.5 ± 0.25............................       S            S                                     S                 
    2.0 ± 0.25............................       S            S                                     S                 
    2.5 ± 0.25............................       S            S                                     S                 
    2.8 ± 0.25............................       S            S                                     S                 
    3.5 ± 0.25............................       S            S            L            L           S                 
    4.0 ± 0.5.............................       S            S                                     S                 
    Polydisperse Glycerol Aerosol.........                                                                        L   
    ----------------------------------------------------------------------------------------------------------------
    \a\ Aerodynamic diameter.                                                                                        
    S = solid particles.   L = liquid particles.                                                                     
    
    
                                        Table F-3.--Critical Parameters of Idealized Ambient Particle Size Distributions                                    
    --------------------------------------------------------------------------------------------------------------------------------------------------------
                                                               Fine particle mode              Coarse particle mode                           FRM sampler     
                                                      -------------------------------------------------------------------      PM2.5/      expected mass     
                  Idealized distribution                 MMD     Geo. std.     Conc.        MMD     Geo. std.     Conc.          PM10       conc. (µg/m3)     
                                                         (µm)       dev.      (µg/m3)      (µm)        dev.      (µg/m3)                                      
    --------------------------------------------------------------------------------------------------------------------------------------------------------
    Coarse...........................................    0.50         2         12.0         10          2         88.0          0.27           13.814        
    ``Typical''......................................    0.50         2         33.3         10          2         66.7          0.55           34.284        
    Fine.............................................    0.85         2         85.0         15          2         15.0          0.94           78.539        
    --------------------------------------------------------------------------------------------------------------------------------------------------------
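
        By way of illustration only, the entries in Table F-3 can be approximated by integrating each lognormal mode up to the sampler cut point. The sketch below assumes ideal (step-function) cuts at 2.5 µm and 10 µm rather than the actual FRM penetration characteristics, so it reproduces the tabulated PM2.5/PM10 ratios and expected mass concentrations only approximately; the function names and the ideal-cut assumption are illustrative and are not part of the proposed test requirements.

        import math

        def lognormal_mass_fraction_below(d_cut_um, mmd_um, geo_std_dev):
            """Fraction of a lognormal mass distribution (given mass median
            diameter and geometric standard deviation) lying below d_cut_um."""
            z = math.log(d_cut_um / mmd_um) / math.log(geo_std_dev)
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        def ideal_pm_ratio(fine_mmd, fine_gsd, fine_conc,
                           coarse_mmd, coarse_gsd, coarse_conc):
            """Approximate PM2.5, PM10, and their ratio for a two-mode
            distribution, assuming ideal sharp cuts at 2.5 um and 10 um."""
            pm25 = (fine_conc * lognormal_mass_fraction_below(2.5, fine_mmd, fine_gsd)
                    + coarse_conc * lognormal_mass_fraction_below(2.5, coarse_mmd, coarse_gsd))
            pm10 = (fine_conc * lognormal_mass_fraction_below(10.0, fine_mmd, fine_gsd)
                    + coarse_conc * lognormal_mass_fraction_below(10.0, coarse_mmd, coarse_gsd))
            return pm25, pm10, pm25 / pm10

        # ``Typical'' distribution of Table F-3: fine mode MMD 0.50 um and coarse mode
        # MMD 10 um, both with geometric standard deviation 2, at 33.3 and 66.7 ug/m3.
        print(ideal_pm_ratio(0.50, 2, 33.3, 10, 2, 66.7))   # ratio near 0.5

        The small differences from the tabulated values (for example, 0.55 for the ``Typical'' ratio) arise because the reference method sampler's penetration curve is not an ideal step.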
    
    
    BILLING CODE 6560-50-P
    
    [[Page 65841]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.120
    
    
    
    [[Page 65842]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.121
    
    
    
    [[Page 65843]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.122
    
    
    
    [[Page 65844]]
    
    Figures to Subpart F of Part 53
    [GRAPHIC] [TIFF OMITTED] TP13DE96.123
    
    
    [[Page 65845]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.124
    
    
    
    BILLING CODE 6560-50-C
    
    [[Page 65846]]
    
    Appendix A to Subpart F of Part 53--References
    
    1. Marple, V.A., K.L. Rubow, W. Turner, and J.D. Spangler, ``Low Flow 
    Rate Sharp Cut Impactors for Indoor Air Sampling: Design and 
    Calibration'', JAPCA, 37:1303-1307 (1987).
    2. Vanderpool, R.W. and K.L. Rubow, ``Generation of Large, Solid 
    Calibration Aerosols'', J. of Aer. Sci. and Tech., 9:65-69 (1988).
    
    PART 58--[AMENDED]
    
        1. The authority citation for part 58 continues to read as follows:
    
        Authority: 42 U.S.C. 7410, 7601(a), 7613, and 7619.
    
        2. Section 58.1 is amended by revising paragraph (s) and adding 
    paragraphs (jj) through (vv) to read as follows:
    
    
    Sec. 58.1  Definitions.
    
    * * * * *
        (s) Traceable means that a local standard has been compared and 
    certified, either directly or via not more than one intermediate 
    standard, to a National Institute of Standards and Technology (NIST)-
    certified primary standard such as a NIST-Traceable Reference Material 
    (NTRM) or a NIST-certified Gas Manufacturer's Internal Standard (GMIS).
    * * * * *
        (jj) Consolidated Metropolitan Statistical Area means the most 
    recent area as designated by the U.S. Office of Management and Budget 
    and population figures from the Bureau of the Census. The Department of 
    Commerce provides ``that within metropolitan complexes of 1 million or 
    more population, separate component areas are defined if specific 
    criteria are met. Such areas are designated primary metropolitan statistical areas (PMSAs), and any area containing PMSAs is designated a consolidated metropolitan statistical area (CMSA).''
        (kk) Core PM2.5 SLAMS means SLAMS sites which are the basic 
    component sites of the PM2.5 SLAMS regulatory network. Population-
    oriented core sites are intended to reflect community-wide exposure to 
    air pollution.
        (ll) Equivalent method means a method of sampling and analyzing the 
    ambient air for an air pollutant that has been designated as an 
    equivalent method in accordance with this part; it does not include a 
    method for which an equivalent method designation has been canceled in 
    accordance with 40 CFR 53.11 or 53.16.
        (mm) Metropolitan Statistical Area (MSA) means the most recent area 
    as designated by the U.S. Office of Management and Budget and 
    population figures from the U.S. Bureau of the Census. The Department 
    of Commerce defines a metropolitan area as ``one of a large population 
    nucleus, together with adjacent communities which have a high degree of 
    economic and social integration with that nucleus.''
        (nn) Monitoring Planning Area (MPA) means a contiguous geographic 
    area with established, well defined boundaries, such as a metropolitan 
    statistical area, county or State, having a common area that is used 
    for planning monitoring locations for PM2.5. MPAs may cross State 
    boundaries, such as the Philadelphia PA-NJ MSA, and be further 
    subdivided into spatial averaging zones. MPAs are generally oriented 
    toward areas with populations greater than 250,000, but for 
    convenience, those portions of a State that are not part of MSAs can be 
    considered as a single MPA. MPAs must be defined, where applicable, in 
    a State monitoring plan.
        (oo) Particulate Matter Monitoring Plan means a detailed plan, 
    prepared by control agencies and submitted to EPA for approval, that 
    describes their PM2.5 and PM10 air quality surveillance 
    network.
        (pp) PM2.5 means particulate matter with an aerodynamic 
    diameter less than or equal to a nominal 2.5 micrometers as measured by 
    a reference method based on appendix L of part 50 of this chapter and 
    designated in accordance with part 53 of this chapter or by an 
    equivalent method designated in accordance with part 53 of this 
    chapter.
        (qq) Population oriented monitoring or sites applies to residential 
    areas, commercial areas, recreational areas, industrial areas where 
    workers from more than one company are located, and other areas where a 
    substantial number of people may spend a significant fraction of their 
    day.
        (rr) Primary Metropolitan Statistical Area (PMSA) is a separate 
    component of a consolidated metropolitan statistical area. For the 
    purposes of this regulation, PMSA is used interchangeably with MSA.
        (ss) Reference method means a method of sampling and analyzing the 
    ambient air for an air pollutant that is specified as a reference 
    method in an appendix to part 50 of this chapter, or a method that has 
    been designated as a reference method in accordance with this part; it 
    does not include a method for which a reference method designation has 
    been canceled in accordance with 40 CFR 53.11 or 53.16.
        (tt) Spatial averaging zone (SAZ) means an area with established, 
    well defined boundaries, such as a county or census block, within a MPA 
    that has relatively uniform concentrations of PM2.5. Monitors 
    within a SAZ that meet certain requirements as set forth in Appendix D 
    of this part are used to compare with the primary annual PM2.5 
    NAAQS using a spatial averaging procedure specified in Appendix K of 40 
    CFR Part 50. A SAZ may have one or more monitors. An MPA must have at 
    least one SAZ and may have several SAZs.
        (uu) SPM monitors is a generic term used for all monitors other 
    than SLAMS, NAMS, PAMS, and PSD monitors included in an agency's 
    monitoring plan or for monitors used in special studies whose data are
    officially reported to EPA.
        (vv) Annual State Air Monitoring Report (ASAMR) is an annual 
    report, prepared by control agencies and submitted to EPA for approval, 
    that consists of an annual data summary report for all pollutants and a 
    detailed report describing any proposed changes to their air quality 
    surveillance network.
        3. Section 58.13 is amended by revising paragraph (d) and adding 
    new paragraphs (e) and (f) as follows:
    
    
    Sec. 58.13  Operating schedule.
    
    * * * * *
        (d) For PM10 samplers--a 24-hour sample must be taken a 
    minimum of every sixth day.
        (e) For PM2.5 samplers, everyday sampling is required for all core SLAMS, including NAMS and PAMS core stations, except during certain seasons or as otherwise exempted by the Regional Administrator in accordance with EPA guidance. For other SLAMS, a minimum sampling frequency of 1 in 6 days is allowed and suggested. Alternative sampling frequencies are also allowed for SLAMS sites which are principally intended for comparisons to the 24-hour NAAQS. Such modifications must be approved by the EPA Administrator in accordance with EPA guidance.
        (f) Alternatives to everyday sampling. (1) PM2.5 core SLAMS 
    sites located in monitoring planning areas (as described in section 2.8 
    of Appendix D of this part) are required to sample every day with a
    reference or equivalent method operating in accordance with 40 CFR part 
    53 and Section 2 of Appendix C to this part. However, in accordance 
    with the monitoring priority as defined in paragraph (f)(2) of this 
    section, established by the control agency and approved by EPA, a core 
    SLAMS monitor may operate with a reference or equivalent method on a 1 
    in 3 day schedule and produce data that may be compared to the NAAQS, 
    provided that
    
    [[Page 65847]]
    
    it is collocated with an acceptable continuous fine particle PM 
    analyzer that is correlated with the reference or equivalent method. If 
    the alternative sampling schedule is selected by the control agency and 
    approved by EPA, the alternative schedule shall be implemented on 
    January 1 of the year in which everyday sampling is required. The 
    selection of correlated acceptable continuous PM analyzers and 
    procedures for correlation with the intermittent reference or 
    equivalent method shall be in accordance with procedures to be 
    established and included in EPA guidance. Unless the continuous fine 
    particle analyzer satisfies the requirements of Section 2 of Appendix C 
    to 40 CFR Part 58, however, the data derived from the correlated 
    acceptable continuous monitor are not eligible for direct comparisons 
    to the NAAQS in accordance with Part 50.
        (2) A Metropolitan Statistical Area (or primary metropolitan 
    statistical area) with greater than 1 million population and high 
    concentrations of PM2.5 (greater than or equal to 80 percent of 
    the NAAQS) shall be a Priority 1 PM monitoring area. Other monitoring 
    planning areas may be designated as Priority 2 PM monitoring areas.
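
        As an illustrative reading of paragraph (f)(2), and not part of the proposed requirements, the Priority 1/Priority 2 distinction reduces to a simple classification from the MSA or PMSA population and the PM2.5 concentration relative to the NAAQS; the function name, inputs, and the example NAAQS level used below are assumptions made for the sketch.

        def pm_monitoring_priority(msa_population, pm25_conc, pm25_naaqs_level):
            """Return 1 for a Priority 1 PM monitoring area (MSA/PMSA population
            greater than 1 million and PM2.5 at or above 80 percent of the
            NAAQS); otherwise return 2."""
            if msa_population > 1_000_000 and pm25_conc >= 0.80 * pm25_naaqs_level:
                return 1
            return 2

        # Example: a 2.4 million population MSA measuring 13.5 ug/m3 against an
        # assumed annual NAAQS level of 15 ug/m3 (90 percent of the standard).
        print(pm_monitoring_priority(2_400_000, 13.5, 15.0))   # -> 1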
        (3) Core SLAMS having a correlated acceptable continuous analyzer 
    collocated with a reference or equivalent method in a Priority 1 PM 
    monitoring area may operate on the 1 in 3 sampling frequency only after 
    reference or equivalent data are collected for at least two complete 
    years and the area is determined to be attainment with the PM2.5 
    NAAQS in accordance with Appendix K to 40 CFR Part 50. See Figure 
    below. After this time and for as long as the area is in attainment 
    with the PM2.5 NAAQS, the correlated acceptable continuous option 
    may be used in conjunction with 1 in 3 day intermittent sampling. Other 
    core SLAMS may utilize correlated acceptable continuous monitors in 
    conjunction with intermittent sampling on a 1 in 3 schedule for the 
    first year of required PM2.5 sampling.
        (4) After one complete year of PM2.5 sampling, if a violation 
    of the NAAQS is determined (in accordance with Appendix K to 40 CFR 
    part 50), then everyday sampling with reference or equivalent method 
    would be required subsequently. Otherwise, the core SLAMS in this area 
    may continue to sample a minimum of 1 in 3 days using a reference or 
    equivalent method together with the correlated acceptable continuous 
    monitor. Background and transport PM2.5 core SLAMS in States with 
    population-oriented core monitors may sample with the correlated acceptable continuous alternative in accordance with the highest priority
    PM2.5 core SLAMS for the State. In States without population-
    oriented core monitors or where operation of population-oriented core 
    monitors has been exempted by the Regional Administrator, the 
    background and transport PM2.5 core SLAMS may also sample a 
    minimum of 1 in 3 days. Background PM2.5 sites which are downwind 
    of areas without anthropogenic sources of PM2.5 (e.g., the
    Pacific Ocean) may also sample 1 in 3 days.
        (5) In all monitoring situations with a correlated acceptable continuous alternative, FRM samplers or filter-based equivalent
    analyzers should preferably accompany the correlated acceptable 
    continuous monitor.
        4. Section 58.14 is revised as follows:
    
    
    Sec. 58.14  Special purpose monitors.
    
        (a) Except as specified in paragraph (b) of this section, any 
    ambient air quality monitoring station other than a SLAMS or PSD 
    station from which the State intends to use the data as part of a 
    demonstration of attainment or nonattainment or in computing a design 
    value for control purposes of the National Ambient Air Quality 
    Standards (NAAQS) must meet the requirements for SLAMS as described in 
    Sec. 58.22 and, after January 1, 1983, must also meet the requirements 
    for SLAMS described in Sec. 58.13 and Appendices A and E of this part.
        (b) Determinations of PM2.5 NAAQS violations shall not be made based on data produced at an SPM site during the first 3 years following the effective date of the final rule. However, NAAQS violations indicated by SPM data shall be reported to EPA in the State's annual monitoring plan and be considered by the State in the design of its overall SLAMS network, and such SPMs should be considered for conversion to permanent SLAMS during the annual network review in accordance with Sec. 58.25.
        (c) Any ambient air quality monitoring station other than a SLAMS 
    or PSD station from which the State intends to use the data for SIP-
    related functions other than as described in paragraph (a) of this 
    section is not necessarily required to comply with the requirements for 
    a SLAMS station under paragraph (a) of this section but must be 
    operated in accordance with a monitoring schedule, methodology, quality 
    assurance procedures, and probe or instrument-siting specifications 
    approved by the Regional Administrator.
        5. A new Sec. 58.15 is added to read as follows:
    
    
    Sec. 58.15  Designation of monitoring sites eligible for comparison to 
    the PM2.5 NAAQS.
    
        (a) SLAMS and SPM monitors that will be used to make comparisons 
    with the 24-hour and annual NAAQS for PM2.5 shall be identified in 
    the State's monitoring plan, subject to annual review and approval by 
    the Regional Administrator, and designated as code ``B'' in EPA's AIRS 
    monitoring site file.
        (b) SLAMS and SPM monitors that will be used to make comparisons 
    only with the 24-hour NAAQS for PM2.5 shall be identified in the 
    State's monitoring plan, subject to annual review and approval by the
    Regional Administrator, and designated as code ``D'' in EPA's AIRS 
    monitoring site file.
        (c) All other PM2.5 sites would be designated as code ``O'' 
    sites in EPA's AIRS monitoring site file.
        6. Section 58.20 is amended by revising paragraphs (d), (e) 
    introductory text, and (e)(5); by redesignating paragraph (f) as (g); 
    and adding a new paragraph (f) to read as follows:
    
    
    Sec. 58.20  Air quality surveillance: Plan content.
    
    * * * * *
        (d) Provide for the review of the air quality surveillance system 
    on an annual basis to determine if the system meets the monitoring 
    objectives defined in Sec. 2.8 of appendix D to this part as well as 
    the minimum requirements for networks of SLAMS stations for PM2.5 
    described in Sec. 2.8.2 of appendix D of this part. Such review must 
    identify needed modifications to the network such as termination or 
    relocation of unnecessary stations or establishment of new stations 
    which are necessary. For PM2.5, the review must identify needed 
    changes to core stations, monitoring planning areas, spatial averaging 
    zones, or monitoring sites which are eligible for comparison to the 
    NAAQS.
        (e) Provide for having a SLAMS network description, including 
    monitoring planning areas and spatial averaging zones for PM2.5, 
    available for public inspection and submission to the Administrator 
    upon request. The network description must be available at the time of 
    plan revision submittal except for PM10 and PM2.5, which must 
    be available by 6 months after the effective date of promulgation and 
    must contain the following information for each SLAMS:
    * * * * *
        (5) The monitoring objective, spatial scale of representativeness, 
    and for PM2.5, the monitoring planning area, spatial averaging 
    zone, and the site code designation to identify which site will be used 
    to determine violations of the appropriate PM NAAQS (annual or 24-
    
    [[Page 65848]]
    
    hour), as defined in appendix D to this part.
        (f) Provide for having a list of all PM2.5 monitoring locations, including SLAMS, NAMS and SPMs, which are included in the State's monitoring plan and are intended for comparison to the NAAQS, available for public inspection.
    * * * * *
        7. Section 58.23 is amended by revising the introductory text and 
    adding a new paragraph (c) to read as follows:
    
    
    Sec. 58.23  Monitoring network completion.
    
        By January 1, 1983, with the exception of PM10 samplers, which shall be within 6 months of the date of publication of the final rule, and with the exception of PM2.5 samplers, which shall be as described in paragraph (c) of this section.
    * * * * *
        (c) Each PM2.5 station in the SLAMS network must be in 
    operation in accordance with the minimum requirements of appendix D of 
    this part, be sited in accordance with the criteria in appendix E to 
    this part, and be located as described on the station's AIRS site 
    identification form, according to the following schedule:
        (1) Within 1 year of the effective date of promulgation, the 
    required core PM2.5 SLAMS for at least one MPA must be in 
    operation;
        (2) Within 2 years of promulgation, all other required core population-oriented sites and core background and transport sites must
    be in operation; and
        (3) Within 3 years of promulgation, a continuous PM monitor in 
    areas with greater than 1 million population, all NAMS sites and all 
    additional required PM2.5 SLAMS must be in operation.
        8-9. In Sec. 58.26, revise the section heading and paragraph (b) introductory text, and add paragraphs (d) and (e) to read as follows:
    
    
    Sec. 58.26  Annual State Air Monitoring Report.
    
    * * * * *
        (b) The SLAMS annual data summary report must contain:
    * * * * *
        (d) For PM--
        (1) The State shall submit a summary to the appropriate Regional Office (for SLAMS) or Administrator (through the Regional Office) (for NAMS) which details proposed changes to the PM Monitoring Plan, in accordance with the annual network review requirements of Sec. 58.25. This summary shall discuss the existing PM networks, including
    modifications to the number, size or boundaries of monitoring planning 
    areas and spatial averaging zones; number and location of PM SLAMS; 
    number or location of core PM2.5 SLAMS; alternative sampling 
    frequencies proposed for PM2.5 SLAMS (including core PM2.5 
    SLAMS and PM2.5 NAMS), core PM2.5 SLAMS to be designated 
    PM2.5 NAMS; and PM SLAMS to be designated PM NAMS.
        (2) The State shall submit an annual summary to the appropriate Regional Office of all the ambient air quality monitoring PM data from all special purpose monitors which are described in the State's
    monitoring plan and are intended for SIP purposes. These include those 
    population oriented SPMs which are eligible for comparison to the PM 
    NAAQS. The State shall certify the data in accordance with paragraph 
    (c) of this section.
        (e) The Annual State Air Monitoring Report shall be submitted to 
    the Regional Administrator by July 1 or by an alternative annual date to
    be negotiated between the State and Regional Administrator. The Region 
    shall provide review and approval/disapproval within 45 days. After the 
    first 3 years following effective promulgation of the PM2.5 NAAQS 
    or once a SAZ has been determined to violate the NAAQS, then changes to 
    an MPA shall require public review and notification.
    
    
    Sec. 58.30  NAMS network establishment.
    
        10. In Sec. 58.30, paragraph (a) introductory text is revised to 
    read as follows:
        (a) By January 1, 1980, with the exception of PM10 samplers, 
    which shall be by 6 months after the effective date of the final rule, 
    and PM2.5, which shall be by 3 years after the effective date of 
    promulgation, the State shall:
    * * * * *
        11. In Sec. 58.31, paragraph (f) is revised to read as follows:
    
    
    Sec. 58.31  NAMS network description.
    
    * * * * *
        (f) The monitoring objective, spatial scale of representativeness, 
    and for PM2.5, the monitoring planning area, spatial averaging 
    zone, and the site code designation to identify which site will be used 
    to determine violations of the appropriate NAAQS (annual or 24-hour), 
    as defined in appendix D to this part.
    * * * * *
        12. In Sec. 58.34, the introductory text is revised to read as 
    follows:
    
    
    Sec. 58.34  NAMS network completion.
    
        By January 1, 1981, with the exception of PM10 samplers, which 
    shall be by 6 months after the effective date of final rule, and 
    PM2.5, which shall be by 3 years after the effective date of final 
    rule:
    * * * * *
        13. In Sec. 58.35, the first sentence of paragraph (b) is revised 
    to read as follows:
    
    
    Sec. 58.35  NAMS data submittal.
    
    * * * * *
        (b) The State shall report to the Administrator all ambient air 
    quality data for SO2, CO, O3, NO2, Pb, PM10, and 
    PM2.5, and information specified by the AIRS Users Guide (Volume 
    II, Air Quality Data Coding, and Volume III, Air Quality Data Storage) 
    to be coded into the AIRS-AQS format.
    * * * * *
        14. Revise Appendix A of part 58 to read as follows:
    
    Appendix A to Part 58--Quality Assurance Requirements for State and 
    Local Air Monitoring Stations (SLAMS)
    
    1. General Information.
    
        1.1 This appendix specifies the minimum quality assurance/
    quality control requirements applicable to SLAMS air monitoring data 
    submitted to EPA. State and local agencies are encouraged to develop 
    and maintain quality assurance programs more extensive than the 
    required minimum.
        1.2 To assure the quality of data from air monitoring 
    measurements, two distinct and important interrelated functions must 
    be performed. One function is the control of the measurement process 
    through broad quality assurance activities, such as establishing 
    policies and procedures, assigning roles and responsibilities, 
    conducting oversight and reviews, and implementing corrective 
    actions. The other function is the control of the measurement 
    process through the implementation of specific quality control 
    procedures, such as calibrations, checks, replicates, routine self-
    assessments, etc. In general, the greater the control of a given 
    monitoring system, the better will be the resulting quality of the 
    monitoring data. The results of quality assurance reviews and 
    assessments indicate whether the control efforts are adequate or 
    need to be improved.
        1.3  Documentation of all quality assurance and quality control 
    efforts implemented during the data collection, analysis, and 
    reporting phases is important to data users, who can then consider 
    the impact of these control efforts on the data quality (see 
    Reference 1 of this appendix). Both qualitative and quantitative 
    assessments of the effectiveness of these control efforts should 
    identify those areas most likely to impact the data quality and to 
    what extent.
        1.4  Periodic assessments of SLAMS data quality are required to 
    be reported to EPA. To provide national uniformity in this 
    assessment and reporting of data quality for all SLAMS networks, 
    specific assessment and reporting procedures are prescribed in 
    detail in sections 3, 4, and 5 of this appendix. On the other hand, 
    the selection and extent of the quality assurance and quality 
    control
    
    [[Page 65849]]
    
    activities used by a monitoring agency depend on a number of local 
    factors such as the field and laboratory conditions, the objectives 
    of the monitoring, the level of the data quality needed, the 
    expertise of assigned personnel, the cost of control procedures, 
    pollutant concentration levels, etc. Therefore, the quality system 
    requirements, in section 2 of this appendix, are specified in 
    general terms to allow each State to develop a quality assurance 
    program that is most efficient and effective for its own 
    circumstances.
    
    2. Quality System Requirements
    
        2.1  Each State and local agency must develop and implement a 
    quality assurance program consisting of policies, procedures, 
    specifications, standards, and documentation necessary to:
        (1) Provide data of adequate quality to meet monitoring 
    objectives, and
        (2) Minimize loss of air quality data due to malfunctions or 
    out-of-control conditions. This quality assurance program must be 
    described in detail, suitably documented, and approved by the 
    appropriate Regional Administrator, or the Administrator's designee. 
    The Quality Assurance Program will be reviewed during the systems 
    audits described in section 2.5 of this appendix.
        2.2  Primary guidance for developing the quality assurance 
    program is contained in References 2-7 of this appendix, which also 
    contain many suggested procedures, checks, and control 
    specifications. Reference 7 of this appendix describes specific 
    guidance for the development of a Quality Assurance Program for 
    SLAMS. Many specific quality control checks and specifications for 
    manual methods are included in the respective reference methods 
    described in part 50 of this chapter or in the respective manual 
    equivalent method descriptions available from EPA (see Reference 8 
    of this appendix). Similarly, quality control procedures related to 
    specifically designated reference and equivalent method analyzers 
    are contained in the respective operation or instruction manuals 
    associated with those analyzers. Quality assurance guidance for 
    meteorological systems at PAMS is contained in Reference 9. Quality 
    assurance procedures for VOC, NOx (including NO and NO2), 
    O3, and carbonyl measurements at PAMS must be consistent with 
    EPA guidance. Quality assurance and control programs must follow the 
    requirements established by ANSI E-4 (Reference 2 of this appendix) 
    and must undergo systems audits demonstrating attainment of the 
    requirements. This guidance, and any other pertinent information 
    from appropriate sources, should be used by the agencies in 
    developing their quality assurance programs. As a minimum, each 
    quality assurance program must include operational procedures for 
    each of the following activities:
        (1) Selection of methods, analyzers, or samplers;
        (2) Training;
        (3) Installation of equipment;
        (4) Selection and control of calibration standards;
        (5) Calibration;
        (6) Zero/span checks and adjustments of automated analyzers;
        (7) Control checks and their frequency;
        (8) Control limits for zero, span and other control checks, and 
    respective corrective actions when such limits are surpassed;
        (9) Calibration and zero/span checks for multiple range 
    analyzers (see section 2.6 of Appendix C of this part);
        (10) Preventive and remedial maintenance;
        (11) Quality control procedures for air pollution episode 
    monitoring;
        (12) Recording and validating data;
        (13) Data quality assessment (precision and accuracy);
        (14) Documentation of quality assurance and quality control 
    information; and
        (15) Control of pertinent documents and records in print and 
    electronic forms.
        2.3  Pollutant Concentration and Flow Rate Standards.
        2.3.1  Gaseous pollutant concentration standards (permeation 
    devices or cylinders of compressed gas) used to obtain test 
    concentrations for CO, SO2, NO, and NO2 must be traceable 
    to either a National Institute of Standards and Technology (NIST) 
    NIST-Traceable Reference Material (NTRM) or a NIST-certified Gas 
    Manufacturer's Internal Standard (GMIS), certified in accordance 
    with one of the procedures given in Reference 10.
        2.3.2  Test concentrations for O3 must be obtained in 
    accordance with the UV photometric calibration procedure specified 
    in appendix D of part 50 of this chapter, or by means of a certified 
    ozone transfer standard. Consult References 11 and 12 for guidance 
    on primary and transfer standards for O3.
        2.3.3  Flow rate measurements must be made by a flow measuring 
    instrument that is traceable to an authoritative volume or other 
    applicable standard. Guidance for certifying some types of 
    flowmeters is provided in Reference 7.
        2.4  National Performance Audit Program. Agencies operating 
    SLAMS are required to participate in EPA's National Performance 
    Audit Program. These audits are described in sections 2.0.10 and 
    2.0.11 of Reference 7. For further instructions, agencies should 
    contact either the appropriate EPA Regional Quality Assurance 
    Coordinator or the National Performance Audit Program Coordinator, 
    Quality Assurance Branch (MD-77B), National Exposure Research 
    Laboratory, U.S. Environmental Protection Agency, Research Triangle 
    Park, NC 27711.
        2.5  Systems Audit Programs. Systems audits of the ambient air 
    monitoring programs of agencies operating SLAMS shall be conducted 
    at least every three years by the appropriate EPA Regional Office. 
    Quality assurance and control programs must follow the requirements 
    established by ANSI E-4 (Reference 2 of this appendix) and described 
    in Reference 7. For further instructions, agencies should contact 
    either the appropriate EPA Regional Quality Assurance Coordinator or 
    the Systems Audit Quality Assurance Coordinator, Office of Air 
    Quality Planning and Standards, Emissions Monitoring and Analysis 
    Division (MD-14), U.S. Environmental Protection Agency, Research 
    Triangle Park, NC 27711.
    
    3. Data Quality Assessment Requirements.
    
        3.0.1  All ambient monitoring methods or analyzers used in SLAMS 
    shall be tested periodically, as described in this section, to 
    quantitatively assess the quality of the SLAMS data being routinely 
    produced. Measurement accuracy and precision are estimated for both 
    automated and manual methods. The individual results of these tests 
    for each method or analyzer shall be reported to EPA as specified in 
    section 4. EPA will then calculate quarterly integrated estimates of 
    precision and accuracy applicable to the SLAMS data as described in 
    section 5 of this appendix. Data assessment results should be 
    reported to EPA only for methods and analyzers approved for use in 
    SLAMS monitoring under appendix C of this part.
        3.0.2  The integrated estimates of the data quality will be 
    calculated on the basis of ``reporting organizations'' and may also 
    be calculated for each region and for the entire nation. These 
    estimates will primarily pool all methods for each pollutant, but 
    estimates may also be made for specific instrument types identified 
    by EPA method code, which is uniquely related to each reference and 
    equivalent method designated by the EPA under part 53 of this 
    chapter. A ``reporting organization'' is defined as a State, 
    subordinate organization within a State, or other organization that 
    is responsible for a set of stations that monitors the same 
    pollutant and for which precision or accuracy assessments can be 
    pooled. States must define one or more reporting organizations for 
    each pollutant such that each monitoring station in the State SLAMS 
    network is included in one, and only one, reporting organization.
        3.0.3  Each reporting organization shall be defined such that 
    precision or accuracy among all stations in the organization can be 
    expected to be reasonably homogeneous, as a result of common 
    factors. Common factors that should be considered by States in 
    defining reporting organizations include:
        (1) Operation by a common team of field operators;
        (2) Common calibration facilities; and
        (3) Support by a common laboratory or headquarters. Where there 
    is uncertainty in defining the reporting organizations or in 
    assigning specific sites to reporting organizations, States shall 
    consult with the appropriate EPA Regional Office for guidance. All 
    definitions of reporting organizations shall be subject to final 
    approval by the appropriate EPA Regional Office.
        3.0.4  Assessment results shall be reported as specified in 
    section 4 of this Appendix. Concentration and flow rate standards 
    must be as specified in sections 2.3 or 3.4 of this Appendix. In 
    addition, working standards and equipment used for accuracy audits 
    must not be the same standards and equipment used for routine 
    calibrations. Additional information and guidance in the technical 
    aspects of conducting these tests may be found in Reference 7 or in 
    the operation or instruction manual associated with the analyzer or 
    sampler. Concentration measurements reported from analyzers or 
    analytical systems (indicated concentrations) should be based on 
    stable readings and must
    
    [[Page 65850]]
    
    be derived by means of the same calibration curve and data 
    processing system used to obtain the routine air monitoring data 
    (see Reference 1 and Reference 7 of this Appendix). Table A-1 of 
    this Appendix provides a summary of the minimum data quality 
    assessment requirements, which are described in more detail in the 
    following sections.
        3.1  Precision of Automated Methods.
        3.1.1  Methods for SO2, NO2, O3 and CO. A one-
    point precision check must be performed at least once every two 
    weeks on each automated analyzer used to measure SO2, NO2, 
    O3 and CO. The precision check is made by challenging the 
    analyzer with a precision check gas of known concentration 
    (effective concentration for open path analyzers) between 0.08 and 
    0.10 ppm for SO2, NO2, and O3 analyzers, and between 
    8 and 10 ppm for CO analyzers. To check the precision of SLAMS 
    analyzers operating on ranges higher than 0 to 1.0 ppm SO2, 
    NO2, and O3, or 0 to 100 ppm for CO, use precision check 
    gases of appropriately higher concentration as approved by the 
    appropriate Regional Administrator or the Regional Administrator's 
    designee. However, the results of precision checks at concentration 
    levels other than those specified above need not be reported to EPA. 
    The standards from which precision check test concentrations are 
    obtained must meet the specifications of section 2.3 of this 
    Appendix.
        3.1.1.1  Except for certain CO analyzers described below, point 
    analyzers must operate in their normal sampling mode during the 
    precision check, and the test atmosphere must pass through all 
    filters, scrubbers, conditioners and other components used during 
    normal ambient sampling and as much of the ambient air inlet system 
    as is practicable. If permitted by the associated operation or 
    instruction manual, a CO point analyzer may be temporarily modified 
    during the precision check to reduce vent or purge flows, or the 
    test atmosphere may enter the analyzer at a point other than the 
    normal sample inlet, provided that the analyzer's response is not 
    likely to be altered by these deviations from the normal operational 
    mode. If a precision check is made in conjunction with a zero or 
    span adjustment, it must be made prior to such zero or span 
    adjustments. Randomization of the precision check with respect to 
    time of day, day of week, and routine service and adjustments is 
    encouraged where possible.
        3.1.1.2  Open path analyzers are tested by inserting a test cell 
    containing a precision check gas concentration into the optical 
    measurement beam of the instrument. If possible, the normally used 
    transmitter, receiver, and as appropriate, reflecting devices should 
    be used during the test, and the normal monitoring configuration of 
    the instrument should be altered as little as possible to 
    accommodate the test cell for the test. However, if permitted by the 
    associated operation or instruction manual, an alternate local light 
    source or an alternate optical path that does not include the normal 
    atmospheric monitoring path may be used. The actual concentration of 
    the precision check gas in the test cell must be selected to produce 
    an ``effective concentration'' in the range specified above. 
    Generally, the precision test concentration measurement will be the 
    sum of the atmospheric pollutant concentration and the precision 
    test concentration. If so, the result must be corrected to remove 
    the atmospheric concentration contribution. The ``corrected 
    concentration'' is obtained by subtracting the average of the 
    atmospheric concentrations measured by the open path instrument 
    under test immediately before and immediately after the precision 
    check test from the precision test concentration measurement. If the 
    difference between these before and after measurements is greater 
    than 20 percent of the effective concentration of the test gas, 
    discard the test result and repeat the test. If possible, open path 
    analyzers should be tested during periods when the atmospheric 
    pollutant concentrations are relatively low and steady.
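
        By way of illustration only, the correction and screening steps described above can be summarized as follows; the names are illustrative, and the 20 percent screening criterion is the one stated in this section.

        def corrected_open_path_concentration(measured_during_check,
                                              ambient_before, ambient_after,
                                              effective_check_conc):
            """Subtract the average ambient contribution from an open path
            precision check measurement; return None when the before/after
            ambient drift exceeds 20 percent of the effective test gas
            concentration, in which case the test is discarded and repeated."""
            if abs(ambient_before - ambient_after) > 0.20 * effective_check_conc:
                return None
            return measured_during_check - (ambient_before + ambient_after) / 2.0

        # Example: 0.145 ppm indicated during the check, 0.050/0.054 ppm ambient
        # before/after, 0.090 ppm effective check concentration -> 0.093 ppm.
        print(corrected_open_path_concentration(0.145, 0.050, 0.054, 0.090))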
        3.1.1.3  Report the actual concentration (effective 
    concentration for open path analyzers) of the precision check gas 
    and the corresponding concentration measurement (corrected 
    concentration, if applicable, for open path analyzers) indicated by 
    the analyzer. The percent differences between these concentrations 
    are used to assess the precision of the monitoring data as described 
    in section 5.1.
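
        For reference only, the percent difference reported here is conventionally computed relative to the known (actual or effective) concentration; the precision statistics themselves are those defined in section 5.1, and the helper below is only a sketch of the underlying calculation.

        def percent_difference(known_conc, indicated_conc):
            """Percent difference of the indicated value from the known
            precision check concentration (computed relative to the known
            value, one common convention)."""
            return 100.0 * (indicated_conc - known_conc) / known_conc

        # Example: 0.090 ppm check gas, 0.093 ppm indicated -> about +3.3 percent.
        print(round(percent_difference(0.090, 0.093), 1))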
        3.1.2  Methods for particulate matter. A one-point precision 
    check must be performed at least once every two weeks on each 
    automated analyzer used to measure PM10 and PM2.5. The 
    precision check is made by checking the operational flow rate of the 
    analyzer. If a precision flow rate check is made in conjunction with 
    a flow rate adjustment, it must be made prior to such flow rate 
    adjustment. Randomization of the precision check with respect to 
    time of day, day of week, and routine service and adjustments is 
    encouraged where possible.
        3.1.2.1  Standard procedure: Use a flow rate transfer standard 
    certified in accordance with section 2.3.3 to check the analyzer's 
    normal flow rate. Care should be used in selecting and using the 
    flow rate measurement device such that it does not alter the normal 
    operating flow rate of the analyzer. Report the actual analyzer flow 
    rate measured by the transfer standard and the corresponding flow 
    rate measured, indicated, or assumed by the analyzer.
        3.1.2.2  Alternative procedure:
        3.1.2.2.1  It is permissible to obtain the precision check flow 
    rate data from the analyzer's internal flow meter without the use of 
    an external flow rate transfer standard, provided that--
        3.1.2.2.1.1  the flow meter is audited with an external flow 
    rate transfer standard at least every 6 months;
        3.1.2.2.1.2  records of at least the 3 most recent flow audits 
    of the instrument's internal flow meter over at least several weeks 
    confirm that the flow meter is stable, verifiable and accurate to 
    ±4%; and
        3.1.2.2.1.3  the instrument and flow meter give no indication of 
    improper operation.
        3.1.2.2.2  With suitable communication capability, the precision 
    check may thus be carried out remotely. For this procedure, report 
    the set-point flow rate as the ``actual flow rate'' along with the 
    flow rate measured or indicated by the analyzer flow meter.
        3.1.2.2.3  For either procedure, the percent differences between the actual and indicated flow rates are used to assess the precision of the monitoring data as described in section 5.1 of this Appendix A (using flow rates in lieu of concentrations).
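
        As an illustrative sketch only, the eligibility conditions of section 3.1.2.2.1 for relying on the analyzer's internal flow meter can be checked mechanically; the field names and the simple six-month recency test below are assumptions made for the sketch, not requirements stated in this appendix.

        from datetime import date, timedelta

        def internal_flow_meter_ok(audit_dates, audit_percent_errors, today,
                                   no_malfunction_indicated=True):
            """Rough eligibility check for the alternative precision check
            procedure: the internal flow meter must have been audited with an
            external transfer standard within roughly the last 6 months, the 3
            most recent audits must agree to within plus or minus 4 percent, and
            the instrument must show no sign of improper operation."""
            if len(audit_dates) < 3 or not no_malfunction_indicated:
                return False
            recent = sorted(zip(audit_dates, audit_percent_errors))[-3:]
            if today - recent[-1][0] > timedelta(days=183):
                return False
            return all(abs(err) <= 4.0 for _, err in recent)

        print(internal_flow_meter_ok(
            [date(1998, 1, 10), date(1998, 6, 20), date(1998, 11, 5)],
            [1.2, -2.8, 3.1],
            today=date(1998, 12, 1)))   # -> True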
        3.2  Accuracy of Automated Methods.
        3.2.1  Methods for SO2, NO2, O3, or CO.
        3.2.1.1  Each calendar quarter (during which analyzers are 
    operated), audit at least 25 percent of the SLAMS analyzers that 
    monitor for SO2, NO2, O3, or CO such that each 
    analyzer is audited at least once per year. If there are fewer than 
    four analyzers for a pollutant within a reporting organization, 
    randomly reaudit one or more analyzers so that at least one analyzer 
    for that pollutant is audited each calendar quarter. Where possible, 
    EPA strongly encourages more frequent auditing, up to an audit 
    frequency of once per quarter for each SLAMS analyzer.
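
        As an illustrative reading of paragraph 3.2.1.1 only, the 25 percent per quarter requirement implies the following minimum quarterly audit count; the function is a sketch, not a prescribed procedure.

        import math

        def analyzers_to_audit_this_quarter(n_analyzers):
            """Minimum number of SLAMS analyzers for a pollutant to audit in a
            calendar quarter: at least 25 percent of the analyzers (so that each
            is audited at least once per year), and at least one whenever any
            analyzers are operated."""
            if n_analyzers == 0:
                return 0
            return max(1, math.ceil(0.25 * n_analyzers))

        for n in (1, 3, 4, 10):
            print(n, analyzers_to_audit_this_quarter(n))   # 1->1, 3->1, 4->1, 10->3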
        3.2.1.2  The audit is made by challenging the analyzer with at 
    least one audit gas of known concentration (effective concentration 
    for open path analyzers) from each of the following ranges 
    applicable to the analyzer being audited:
    
    ------------------------------------------------------------------------
                                               Concentration range, ppm     
                  Audit level            -----------------------------------
                                            SO2, O3        NO2         CO   
    ------------------------------------------------------------------------
    1...................................    0.03-0.08    0.03-0.08      3-8 
    2...................................    0.15-0.20    0.15-0.20     15-20
    3...................................    0.35-0.45    0.35-0.45     35-45
    4...................................    0.80-0.90  ...........     80-90
    ------------------------------------------------------------------------
    
    
    [[Page 65851]]
    
        NO2 audit gas for chemiluminescence-type NO2 analyzers 
    must also contain at least 0.08 ppm NO.
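
        For convenience only, the table above can be expressed as a small lookup that screens whether a selected audit gas falls within the intended range for its level; the dictionary layout is an illustrative choice, not a prescribed data format.

        # Audit concentration ranges (ppm) by audit level, transcribed from the
        # table above; NO2 has no level 4 range.
        AUDIT_RANGES_PPM = {
            "SO2_O3": {1: (0.03, 0.08), 2: (0.15, 0.20), 3: (0.35, 0.45), 4: (0.80, 0.90)},
            "NO2":    {1: (0.03, 0.08), 2: (0.15, 0.20), 3: (0.35, 0.45)},
            "CO":     {1: (3, 8), 2: (15, 20), 3: (35, 45), 4: (80, 90)},
        }

        def audit_gas_in_range(pollutant_group, level, concentration_ppm):
            """True if the audit gas concentration falls within the tabulated
            range for the given audit level; False for undefined levels."""
            lo, hi = AUDIT_RANGES_PPM[pollutant_group].get(level, (None, None))
            return lo is not None and lo <= concentration_ppm <= hi

        print(audit_gas_in_range("CO", 2, 18))      # True
        print(audit_gas_in_range("NO2", 4, 0.85))   # False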
        3.2.1.3  NO concentrations substantially higher than 0.08 ppm, 
    as may occur when using some gas phase titration (GPT) techniques, 
    may lead to audit errors in chemiluminescence analyzers due to 
    inevitable minor NO-NOX channel imbalance. Such errors may be 
    atypical of routine monitoring errors to the extent that such NO 
    concentrations exceed typical ambient NO concentrations at the site. 
    These errors may be minimized by modifying the GPT technique to 
    lower the NO concentrations remaining in the NO2 audit gas to 
    levels closer to typical ambient NO concentrations at the site.
        3.2.1.4  To audit SLAMS analyzers operating on ranges higher 
    than 0 to 1.0 ppm for SO2, NO2, and O3 or 0 to 100 
    ppm for CO, use audit gases of appropriately higher concentration as 
    approved by the appropriate Regional Administrator or the 
    Administrator's designee. The results of audits at concentration
    levels other than those shown in the above table need not be 
    reported to EPA.
        3.2.1.5  The standards from which audit gas test concentrations 
    are obtained must meet the specifications of section 2.3. The gas 
    standards and equipment used for auditing must not be the same as 
    the standards and equipment used for calibration or calibration span 
    adjustments. The auditor should not be the operator or analyst who 
    conducts the routine monitoring, calibration, and analysis.
        3.2.1.6  For point analyzers, the audit shall be carried out by 
    allowing the analyzer to analyze the audit test atmosphere in its 
    normal sampling mode such that the test atmosphere passes through 
    all filters, scrubbers, conditioners, and other sample inlet 
    components used during normal ambient sampling and as much of the 
    ambient air inlet system as is practicable. The exception provided 
    in section 3.1 for certain CO analyzers does not apply for audits.
        3.2.1.7  Open path analyzers are audited by inserting a test 
    cell containing the various audit gas concentrations into the 
    optical measurement beam of the instrument. If possible, the 
    normally used transmitter, receiver, and, as appropriate, reflecting 
    devices should be used during the audit, and the normal monitoring 
    configuration of the instrument should be modified as little as 
    possible to accommodate the test cell for the audit. However, if 
    permitted by the associated operation or instruction manual, an 
    alternate local light source or an alternate optical path that does 
    not include the normal atmospheric monitoring path may be used. The 
    actual concentrations of the audit gas in the test cell must be 
    selected to produce ``effective concentrations'' in the ranges 
    specified in this section 3.2. Generally, each audit concentration 
    measurement result will be the sum of the atmospheric pollutant 
    concentration and the audit test concentration. If so, the result 
    must be corrected to remove the atmospheric concentration 
    contribution. The ``corrected concentration'' is obtained by 
    subtracting the average of the atmospheric concentrations measured 
    by the open path instrument under test immediately before and 
    immediately after the audit test (or preferably before and after 
    each audit concentration level) from the audit concentration 
    measurement. If the difference between the before and after 
    measurements is greater than 20 percent of the effective 
    concentration of the test gas standard, discard the test result for 
    that concentration level and repeat the test for that level. If 
    possible, open path analyzers should be audited during periods when 
    the atmospheric pollutant concentrations are relatively low and 
    steady. Also, the monitoring path length must be reverified to 
    within ±3 percent to validate the audit, since the
    monitoring path length is critical to the determination of the 
    effective concentration.
        3.2.1.8  Report both the actual concentrations (effective 
    concentrations for open path analyzers) of the audit gases and the 
    corresponding concentration measurements (corrected concentrations, 
    if applicable, for open path analyzers) indicated or produced by the 
    analyzer being tested. The percent differences between these 
    concentrations are used to assess the accuracy of the monitoring 
    data as described in section 5.2.
        3.2.2  Methods for particulate matter.
        3.2.2.1  Each calendar quarter, audit the flow rate of each 
    SLAMS PM2.5 analyzer and at least 25 percent of the SLAMS 
    PM10 analyzers such that each PM10 analyzer is audited at 
    least once per year. If there are fewer than four PM10 analyzers 
    within a reporting organization, randomly re-audit one or more 
    analyzers so that at least one analyzer is audited each calendar 
    quarter. Where possible, EPA strongly encourages more frequent 
    auditing, up to an audit frequency of once per quarter for each 
    SLAMS analyzer.
        3.2.2.2  The audit is made by measuring the analyzer's normal 
    operating flow rate, using a flow rate transfer standard certified 
    in accordance with section 2.3.3. The flow rate standard used for 
    auditing must not be the same flow rate standard used to calibrate 
    the analyzer. However, both the calibration standard and the audit 
    standard may be referenced to the same primary flow rate or volume 
    standard. Great care must be used in auditing the flow rate to be 
    certain that the flow measurement device does not alter the normal 
    operating flow rate of the analyzer. Report the audit (actual) flow 
    rate and the corresponding flow rate indicated or assumed by the 
    sampler. The percent differences between these flow rates are used 
    to calculate accuracy as described in section 5.4.1.
        3.3  Precision of Manual Methods.
        3.3.1  For each network of manual methods other than for 
    PM2.5, select one or more monitoring sites within the reporting 
    organization for duplicate, collocated sampling as follows: for 1 to 
    5 sites, select 1 site; for 6 to 20 sites, select 2 sites; and for 
    over 20 sites, select 3 sites. For each network of manual methods 
    for PM2.5, select one or more monitoring sites within the 
    reporting organization for duplicate, collocated sampling as 
    follows: for 1 to 10 sites, select 1 site; for 11 to 20 sites, 
    select 2 sites; and for over 20 sites, select 3 sites. Where 
    possible, additional collocated sampling is encouraged. For purposes 
    of precision assessment, networks for measuring TSP, PM10, and 
    PM2.5 shall be considered separately from one another. Sites 
    having annual mean particulate matter concentrations among the 
    highest 25 percent of the annual mean concentrations for all the 
    sites in the network must be selected or, if such sites are 
    impractical, alternative sites approved by the Regional 
    Administrator may be selected.
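
        As an illustrative restatement only, the collocation counts in this paragraph reduce to the following rule; the function name is illustrative.

        def required_collocated_sites(n_network_sites, pollutant):
            """Minimum number of duplicate, collocated sampling sites per
            reporting organization: for PM2.5 networks, 1 site for 1-10 sites,
            2 for 11-20, and 3 for more than 20; for other manual methods, 1 for
            1-5 sites, 2 for 6-20, and 3 for more than 20."""
            if n_network_sites <= 0:
                return 0
            thresholds = [(10, 1), (20, 2)] if pollutant == "PM2.5" else [(5, 1), (20, 2)]
            for upper, count in thresholds:
                if n_network_sites <= upper:
                    return count
            return 3

        print(required_collocated_sites(8, "PM2.5"))   # -> 1
        print(required_collocated_sites(8, "PM10"))    # -> 2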
        3.3.2  In determining the number of collocated sites required 
    for PM10, monitoring networks for lead should be treated 
    independently from networks for particulate matter, even though the 
    separate networks may share one or more common samplers. However, a 
    single pair of samplers collocated at a common-sampler monitoring 
    site that meets the requirements for both a collocated lead site and 
    a collocated particulate matter site may serve as a collocated site 
    for both networks.
        3.3.3  In determining the number of collocated sites required 
    for PM2.5, monitoring networks for visibility should not be 
    treated independently from networks for particulate matter, as the 
    separate networks may share one or more common samplers. However, 
    for class I visibility areas, EPA will accept visibility aerosol 
    mass measurement in lieu of a PM2.5 measurement if the latter 
    measurement is unavailable.
        3.3.4  The two collocated samplers must be within 4 meters of 
    each other, and particulate matter samplers must be at least 2 
    meters apart to preclude airflow interference. Calibration, 
    sampling, and analysis must be the same for both collocated samplers 
    and the same as for all other samplers in the network.
        3.3.5  For each pair of collocated samplers, designate one 
    sampler as the primary sampler whose samples will be used to report 
    air quality for the site, and designate the other as the duplicate 
    sampler. The paired samplers must each have the same designation 
    number. Each duplicate sampler must be operated concurrently with 
    its associated routine sampler at least once per week. The operation 
    schedule should be selected so that the sampling days are 
    distributed evenly over the year and over the seven days of the 
    week. The every-6-day schedule used by many monitoring agencies is 
    recommended. Report the measurements from both samplers at each 
    collocated sampling site, including measurements falling below the 
    limits specified in 5.3.1. The percent differences in measured 
    concentration (µg/m\3\) between the two collocated samplers
    are used to calculate precision as described in section 5.3.
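
        By way of example only, the collocated comparison reported here leads to a simple percent difference; the statistic actually prescribed for the precision assessment is the one defined in section 5.3, and the convention shown below (dividing by the average of the pair) is an assumption made for this sketch.

        def collocated_percent_difference(primary_ug_m3, duplicate_ug_m3):
            """Illustrative percent difference between a primary sampler and its
            collocated duplicate, computed relative to the average of the pair."""
            mean = (primary_ug_m3 + duplicate_ug_m3) / 2.0
            return 100.0 * (duplicate_ug_m3 - primary_ug_m3) / mean

        # Example pair: 14.8 and 15.6 ug/m3 -> about +5.3 percent.
        print(round(collocated_percent_difference(14.8, 15.6), 1))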
        3.4  Accuracy of Manual Methods. The accuracy of manual sampling 
    methods is assessed by auditing a portion of the measurement 
    process. For particulate matter methods, the flow rate during sample 
    collection is audited. For SO2 and NO2 methods, the 
    analytical measurement is audited. For Pb methods, the flow rate and 
    analytical measurement are audited.
        3.4.1  Methods for PM2.5 and PM10.
        3.4.1.1  Each calendar quarter, audit the flow rate of each 
    PM2.5 sampler and audit at least 25 percent of the PM10 
    samplers such
    
    [[Page 65852]]
    
    that each PM10 sampler is audited at least once per year. If 
    there are fewer than four PM10 samplers within a reporting 
    organization, randomly reaudit one or more samplers so that one 
    sampler is audited each calendar quarter. Audit each sampler at its 
    normal operating flow rate, using a flow rate transfer standard 
    certified in accordance with section 2.3.3. The flow rate standard 
    used for auditing must not be the same flow rate standard used to 
    calibrate the sampler. However, both the calibration standard and 
    the audit standard may be referenced to the same primary flow rate 
    standard. The flow audit should be scheduled so as to avoid 
    interference with a scheduled sampling period. Report the audit 
    (actual) flow rate and the corresponding flow rate indicated by the 
    sampler's normally used flow indicator. The percent differences 
    between these flow rates are used to calculate accuracy as described 
    in section 5.4.1.
        3.4.1.2  Great care must be used in auditing high-volume 
    particulate matter samplers having flow regulators because the 
    introduction of resistance plates in the audit flow standard device 
    can cause abnormal flow patterns at the point of flow sensing. For 
    this reason, the flow audit standard should be used with a normal 
    filter in place and without resistance plates in auditing flow-
    regulated high-volume samplers, or other steps should be taken to 
    assure that flow patterns are not perturbed at the point of flow 
    sensing.
        3.4.2  SO2 Methods.
        3.4.2.1  Prepare audit solutions from a working sulfite-
    tetrachloromercurate (TCM) solution as described in section 10.2 of 
    the SO2 Reference Method (appendix A of part 50 of this 
    chapter). These audit samples must be prepared independently from 
    the standardized sulfite solutions used in the routine calibration 
    procedure. Sulfite-TCM audit samples must be stored between 0 and 5 
    deg.C and expire 30 days after preparation.
        3.4.2.2  Prepare audit samples in each of the concentration ranges of 0.2-0.3, 0.5-0.6, and 0.8-0.9 µg SO2/ml. Analyze an audit sample in each of the three ranges at least once each day that samples are analyzed and at least twice per calendar quarter. Report the audit concentrations (in µg SO2/ml) and the corresponding indicated concentrations (in µg SO2/ml). The percent differences between these concentrations are used to calculate accuracy as described in section 5.4.2.
        3.4.3  NO2 Methods. Prepare audit solutions from a working 
    sodium nitrite solution as described in the appropriate equivalent 
    method (see Reference 8). These audit samples must be prepared 
    independently from the standardized nitrite solutions used in the 
    routine calibration procedure. Sodium nitrite audit samples expire 
    3 months after preparation. Prepare audit samples in each of the concentration ranges of 0.2-0.3, 0.5-0.6, and 0.8-0.9 µg NO2/ml. Analyze an audit sample in each of the three ranges at least once each day that samples are analyzed and at least twice per calendar quarter. Report the audit concentrations (in µg NO2/ml) and the corresponding indicated concentrations (in µg NO2/ml). The percent differences between these
    concentrations are used to calculate accuracy as described in 
    section 5.4.2.
        3.4.4  Pb Methods.
        3.4.4.1  For the Pb Reference Method (appendix G of part 50 of 
    this chapter), the flow rates of the high-volume Pb samplers shall 
    be audited as part of the TSP network using the same procedures 
    described in Section 3.4.1. For agencies operating both TSP and Pb 
    networks, 25 percent of the total number of high-volume samplers are 
    to be audited each quarter.
        3.4.4.2  Each calendar quarter, audit the Pb Reference Method 
    analytical procedure using glass fiber filter strips containing a 
    known quantity of Pb. These audit sample strips are prepared by 
    depositing a Pb solution on unexposed glass fiber filter strips of 
    dimensions 1.9 cm by 20.3 cm (\3/4\ inch by 8 inch) and allowing 
    them to dry thoroughly. The audit samples must be prepared using 
    batches of reagents different from those used to calibrate the Pb 
    analytical equipment being audited. Prepare audit samples in the 
    following concentration ranges:
    
    ------------------------------------------------------------------------
                                              Pb             Equivalent
                   Range                concentration,     ambient Pb concen-
                                           µg/strip        tration, µg/m3 \1\
    ------------------------------------------------------------------------
    1...................................       100-300             0.5-1.5
    2...................................      600-1000             3.0-5.0
    ------------------------------------------------------------------------
    \1\ Equivalent ambient Pb concentration in µg/m3 is based on sampling at
      1.7 m3/min for 24 hours on a 20.3 cm x 25.4 cm (8 inch x 10 inch)
      glass fiber filter.
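
        The equivalence between the audit strip loadings and the ambient 
    Pb concentrations shown in the table can be illustrated with a short 
    calculation. The following sketch (illustrative only, not part of the 
    regulatory text) assumes that each 1.9 cm by 20.3 cm strip represents 
    one-twelfth of the exposed filter area; that assumption, together 
    with the sampling conditions in footnote 1, approximately reproduces 
    the tabulated equivalent concentrations.

    # Sketch: convert a Pb audit strip loading (µg Pb/strip) to an
    # approximate equivalent ambient concentration (µg Pb/m3).
    # The 12 strips-per-filter figure is an assumption inferred from the
    # strip dimensions; it is not stated in the table.
    FLOW_M3_PER_MIN = 1.7            # hi-vol flow rate from footnote 1
    MINUTES_SAMPLED = 24 * 60        # 24-hour sample
    STRIPS_PER_FILTER = 12           # assumed

    def equivalent_ambient_pb(ug_per_strip):
        total_ug = ug_per_strip * STRIPS_PER_FILTER        # Pb on whole filter
        air_volume_m3 = FLOW_M3_PER_MIN * MINUTES_SAMPLED  # about 2448 m3
        return total_ug / air_volume_m3                    # µg/m3

    for loading in (100, 300, 600, 1000):
        print(loading, round(equivalent_ambient_pb(loading), 2))
    # Prints about 0.49, 1.47, 2.94, and 4.90 µg/m3, consistent with the
    # 0.5-1.5 and 3.0-5.0 ranges in the table.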
    
        3.4.4.3  Audit samples must be extracted using the same 
    extraction procedure used for exposed filters.
        3.4.4.4  Analyze three audit samples in each of the two ranges 
    each quarter samples are analyzed. The audit sample analyses shall 
    be distributed as much as possible over the entire calendar quarter. 
    Report the audit concentrations (in µg Pb/strip) and the 
    corresponding measured concentrations (in µg Pb/strip) using 
    unit code 77. The percent differences between the concentrations are 
    used to calculate analytical accuracy as described in section 5.4.2.
        3.4.4.5  The accuracy of an equivalent Pb method is assessed in 
    the same manner as for the reference method. The flow auditing 
    device and Pb analysis audit samples must be compatible with the 
    specific requirements of the equivalent method.
    
    4. Reporting Requirements
    
        For each pollutant, prepare a list of all monitoring sites and 
    their AIRS site identification codes in each reporting organization 
    and submit the list to the appropriate EPA Regional Office, with a 
    copy to AIRS-AQS. Whenever there is a change in this list of 
    monitoring sites in a reporting organization, report this change to 
    the Regional Office and to AIRS-AQS.
        4.1  Quarterly Reports. For each quarter, each reporting 
    organization shall report to AIRS-AQS directly (or via the 
    appropriate EPA Regional Office for organizations not direct users 
    of AIRS) the results of all valid precision and accuracy tests it 
    has carried out during the quarter. The quarterly reports of 
    precision and accuracy data must be submitted consistent with the 
    data reporting requirements specified for air quality data as set 
    forth in Sec. 58.35(c). Each organization shall report all 
    collocated measurements including those falling below the levels 
    specified in section 5.3.1. Report results from invalid tests, from 
    tests carried out during a time period for which ambient data 
    immediately prior or subsequent to the tests were invalidated for 
    appropriate reasons, and from tests of methods or analyzers not 
    approved for use in SLAMS monitoring networks under Appendix C of 
    this part. Such data should be flagged so that they are not 
    used for quantitative assessment of precision and accuracy.
        4.2  Annual Reports.
        4.2.1  When precision and accuracy estimates for a reporting 
    organization have been calculated for all four quarters of the 
    calendar year, EPA will calculate the properly weighted probability 
    limits for precision and accuracy for the entire calendar year. 
    These limits will then be associated with the data submitted in the 
    annual SLAMS report required by Sec. 58.26.
        4.2.2  Each reporting organization shall submit, along with its 
    annual SLAMS report, a listing by pollutant of all monitoring sites 
    in the reporting organization.
    
    5. Calculations for Data Quality Assessment
    
        Calculations of estimates of integrated precision and accuracy 
    are carried out by EPA according to the following procedures. 
    Reporting organizations should report the results of individual 
    precision and accuracy tests as specified in sections 3 and 4 of 
    this appendix even though they may elect to perform some or all of 
    the calculations in this section on their own.
        5.1  Precision of Automated Methods. Estimates of the precision 
    of automated methods are calculated from the results of biweekly 
    precision checks as specified in section 3.1. At the end of each 
    calendar quarter, an integrated precision probability interval for 
    all SLAMS analyzers in the organization is calculated for each 
    pollutant.
        5.1.1  Single Analyzer Precision.
        5.1.1.1  The percent difference (di) for each precision 
    check is calculated using equation 1, where Yi is the 
    concentration indicated by the analyzer for the i-th precision check 
    and Xi is the known concentration for the i-th precision check.
    
    di = [(Yi - Xi) / Xi] x 100    (1)
    
        5.1.1.2  For each analyzer, the quarterly average (dj) is 
    calculated with equation 2, and the standard deviation (Sj) 
    with equation 3, where n is the number of precision checks on the 
    instrument made during the calendar quarter. For example, n should 
    be 6 or 7 if precision checks are made biweekly during a quarter.
    
    dj = (1/n) SUM(i=1 to n) di    (2)
    
    [[Page 65853]]
    
    Sj = sqrt[ SUM(i=1 to n) (di - dj)^2 / (n - 1) ]    (3)
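
        As an illustration of equations 1 through 3, the following sketch 
    (illustrative only, not part of the regulatory text) computes the 
    quarterly percent differences, their average, and their standard 
    deviation for one analyzer from hypothetical precision-check data.

    # Sketch of the single-analyzer precision statistics of section 5.1.1,
    # using hypothetical biweekly precision-check data (values in ppm).
    from math import sqrt

    known = [0.090, 0.091, 0.089, 0.090, 0.092, 0.090]      # Xi, known conc.
    indicated = [0.093, 0.089, 0.091, 0.094, 0.090, 0.092]  # Yi, indicated conc.

    # Equation 1: percent difference for each precision check
    d = [(y - x) / x * 100 for x, y in zip(known, indicated)]

    n = len(d)
    d_avg = sum(d) / n                                       # Equation 2
    s = sqrt(sum((di - d_avg) ** 2 for di in d) / (n - 1))   # Equation 3

    print(f"dj = {d_avg:.2f}%, Sj = {s:.2f}%")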
    
    
    
        5.1.2  Precision for Reporting Organization.
        5.1.2.1  For each pollutant, the average of averages (D) and the 
    pooled standard deviation (Sa) are calculated for all analyzers 
    audited for the pollutant during the quarter, using either equations 
    4 and 5 or 4a and 5a, where k is the number of analyzers audited 
    within the reporting organization for a single pollutant.
    D = (1/k) SUM(j=1 to k) dj    (4)
    
    Sa = sqrt[ (1/k) SUM(j=1 to k) Sj^2 ]    (5)
    
    D = SUM(j=1 to k) (nj x dj) / SUM(j=1 to k) nj    (4a)
    
    Sa = sqrt[ SUM(j=1 to k) (nj - 1) Sj^2 / SUM(j=1 to k) (nj - 1) ]    (5a)
    
        5.1.2.2  Equations 4 and 5 are used when the same number of 
    precision checks are made for each analyzer. Equations 4a and 5a are 
    used to obtain a weighted average and a weighted standard deviation 
    when different numbers of precision checks are made for the 
    analyzers.
        5.1.2.3  For each pollutant, the 95 Percent Probability Limits 
    for the precision of a reporting organization are calculated using 
    equations 6 and 7.
    
    Upper 95 Percent Probability
    
    Limit=D+1.96 Sa    (6)
    
    Lower 95 Percent Probability
    
    Limit=D-1.96 Sa    (7)
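
        The reporting-organization calculation can be illustrated with a 
    short sketch (illustrative only, not part of the regulatory text). It 
    uses the weighted forms, equations 4a and 5a, so that analyzers with 
    different numbers of precision checks are handled; the quarterly 
    analyzer results are hypothetical.

    # Sketch of section 5.1.2: pool single-analyzer precision results
    # (dj, Sj, nj) across the k analyzers of a reporting organization and
    # form the 95 percent probability limits of equations 6 and 7.
    from math import sqrt

    # (dj, Sj, nj) for each analyzer -- hypothetical quarterly results
    analyzers = [(1.8, 2.1, 6), (-0.7, 1.6, 7), (2.4, 2.9, 6)]

    sum_n = sum(n for _, _, n in analyzers)
    # Equation 4a: weighted average of the analyzer averages
    D = sum(n * dj for dj, _, n in analyzers) / sum_n
    # Equation 5a: pooled (weighted) standard deviation
    Sa = sqrt(sum((n - 1) * sj ** 2 for _, sj, n in analyzers)
              / sum(n - 1 for _, _, n in analyzers))

    upper = D + 1.96 * Sa    # Equation 6
    lower = D - 1.96 * Sa    # Equation 7
    print(f"D = {D:.2f}%, Sa = {Sa:.2f}%, limits ({lower:.2f}%, {upper:.2f}%)")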
        5.2  Accuracy of Automated Methods. Estimates of the accuracy of 
    automated methods are calculated from the results of independent 
    audits as described in section 3.2. At the end of each calendar 
    quarter, an integrated accuracy probability interval for all SLAMS 
    analyzers audited in the reporting organization is calculated for 
    each pollutant. Separate probability limits are calculated for each 
    audit concentration level in section 3.2.
        5.2.1  Single Analyzer Accuracy. The percentage difference 
    (di) for each audit concentration is calculated using equation 
    1, where Yi is the analyzer's indicated concentration 
    measurement from the i-th audit check and Xi is the actual 
    concentration of the audit gas used for the i-th audit check.
        5.2.2  Accuracy for Reporting Organization.
        5.2.2.1  For each audit concentration level of a particular 
    pollutant, the average (D) of the individual percentage differences 
    (di) for all n analyzers audited during the quarter is 
    calculated using equation 8.
    D = (1/n) SUM(i=1 to n) di    (8)
    
        5.2.2.2  For each concentration level of a particular pollutant, 
    the standard deviation (Sa) of all the individual percentage 
    differences for all n analyzers audited during the quarter is 
    calculated, using equation 9.
    Sa = sqrt[ SUM(i=1 to n) (di - D)^2 / (n - 1) ]    (9)
    
        5.2.2.3  For reporting organizations having four or fewer 
    analyzers for a particular pollutant, only one audit is required 
    each quarter. For such reporting organizations, the audit results of 
    two consecutive quarters are required to calculate an average and a 
    standard deviation, using equations 8 and 9. Therefore, the 
    reporting of probability limits shall be on a semiannual (instead of 
    a quarterly) basis.
        5.2.2.4  For each pollutant, the 95 Percent Probability Limits 
    for the accuracy of a reporting organization are calculated at each 
    audit concentration level using equations 6 and 7.
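
        The accuracy aggregation differs from the precision case in that 
    the individual audit percentage differences are averaged directly 
    (equation 8) and their standard deviation is taken (equation 9) at 
    each audit concentration level. A brief sketch with hypothetical 
    audit results (illustrative only, not part of the regulatory text):

    # Sketch of section 5.2.2: accuracy probability limits for one audit
    # concentration level, from the audit percent differences of the
    # analyzers audited during the quarter (hypothetical values).
    from math import sqrt

    d = [3.1, -1.4, 2.2, 0.6, -2.8]      # one di per audited analyzer

    n = len(d)
    D = sum(d) / n                                        # Equation 8
    Sa = sqrt(sum((x - D) ** 2 for x in d) / (n - 1))     # Equation 9

    print(f"limits: {D - 1.96 * Sa:.2f}% to {D + 1.96 * Sa:.2f}%")  # Eqs 6, 7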
        5.3  Precision of Manual Methods. Estimates of precision of 
    manual methods are calculated from the results obtained from 
    collocated samplers as described in section 3.3. At the end of each 
    calendar quarter, an integrated precision probability interval for 
    all collocated samplers operating in the reporting organization is 
    calculated for each manual method network.
        5.3.1  Single Sampler Precision.
        5.3.1.1  At low concentrations, agreement between the 
    measurements of collocated samplers, expressed as percent 
    differences, may be relatively poor. For this reason, collocated 
    measurement pairs are selected for use in the precision calculations 
    only when both measurements are above the following limits:
    
    TSP: 20 µg/m3;
    SO2: 45 µg/m3;
    NO2: 30 µg/m3;
    Pb: 0.15 µg/m3;
    PM10: 20 µg/m3; and
    PM2.5: 6 µg/m3.
    
        5.3.1.2  For each selected measurement pair, the percent 
    difference (di) is calculated, using equation 10,
    
    [[Page 65854]]
    
    di = [(Yi - Xi) / ((Yi + Xi)/2)] x 100    (10)
    
    where Yi is the pollutant concentration measurement obtained 
    from the duplicate sampler and Xi is the concentration 
    measurement obtained from the primary sampler designated for 
    reporting air quality for the site. For each site, the quarterly 
    average percent difference (dj) is calculated from equation 2 
    and the standard deviation (Sj) is calculated from equation 3, 
    where n=the number of selected measurement pairs at the site.
        5.3.2  Precision for Reporting Organization.
        5.3.2.1  For each pollutant, the average percentage difference 
    (D) and the pooled standard deviation (Sa) are calculated, 
    using equations 4 and 5, or using equations 4a and 5a if different 
    numbers of paired measurements are obtained at the collocated sites. 
    For these calculations, the k of equations 4, 4a, 5 and 5a is the 
    number of collocated sites.
        5.3.2.2  The 95 Percent Probability Limits for the integrated 
    precision for a reporting organization are calculated using 
    equations 11 and 12.
    
    Upper 95 Percent Probability
    
    Limit=D+1.96 Sa/sqrt(2)    (11)
    
    Lower 95 Percent Probability
    
    Limit=D-1.96 Sa/sqrt(2)    (12)
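
        The collocated-sampler precision calculation can be sketched as 
    follows (illustrative only, not part of the regulatory text). The 
    pair-selection limit, the form of equation 10 as reconstructed above, 
    and the square root of 2 in equations 11 and 12 follow this section; 
    the measurement values are hypothetical.

    # Sketch of sections 5.3.1 and 5.3.2 for a single PM10 collocated
    # site: select pairs with both values above 20 µg/m3, compute percent
    # differences, then form the probability limits of equations 11-12.
    # With one collocated site, the pooling of section 5.3.2 reduces to
    # the site's own dj and Sj.
    from math import sqrt

    LIMIT = 20.0    # µg/m3, PM10 selection limit from section 5.3.1.1
    pairs = [(35.0, 33.0), (18.0, 19.5), (52.0, 55.0), (24.0, 22.5)]  # (Xi, Yi)

    selected = [(x, y) for x, y in pairs if x > LIMIT and y > LIMIT]

    # Equation 10 (as reconstructed above): difference relative to the
    # average of the two collocated measurements.
    d = [(y - x) / ((y + x) / 2) * 100 for x, y in selected]

    n = len(d)
    D = sum(d) / n
    Sa = sqrt(sum((di - D) ** 2 for di in d) / (n - 1))

    upper = D + 1.96 * Sa / sqrt(2)    # Equation 11
    lower = D - 1.96 * Sa / sqrt(2)    # Equation 12
    print(f"{n} pairs used, limits {lower:.2f}% to {upper:.2f}%")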
    
        5.4  Accuracy of Manual Methods. Estimates of the accuracy of 
    manual methods are calculated from the results of independent audits 
    as described in section 3.4. At the end of each calendar quarter, an 
    integrated accuracy probability interval is calculated for each 
    manual method network operated by the reporting organization.
        5.4.1  Particulate Matter Samplers other than PM2.5 
    (including reference method Pb samplers).
        5.4.1.1  Single Sampler Accuracy. For the flow rate audit 
    described in Section 3.4.1, the percentage difference (di) for 
    each audit is calculated using equation 1, where Xi represents 
    the known flow rate and Yi represents the flow rate indicated 
    by the sampler.
        5.4.1.2  Accuracy for Reporting Organization. For each type of 
    particulate matter measured (e.g., TSP/Pb), the average (D) of the 
    individual percent differences for all similar particulate matter 
    samplers audited during the calendar quarter is calculated using 
    equation 8. The standard deviation (Sa) of the percentage 
    differences for all of the similar particulate matter samplers 
    audited during the calendar quarter is calculated using equation 9. 
    The 95 percent probability limits for the integrated accuracy for 
    the reporting organization are calculated using equations 6 and 7. 
    For reporting organizations having four or fewer particulate matter 
    samplers of one type, only one audit is required each quarter, and 
    the audit results of two consecutive quarters are required to 
    calculate an average and a standard deviation. In that case, 
    probability limits shall be reported semi-annually rather than 
    quarterly.
        5.4.2  Analytical Methods for SO2, NO2, and Pb.
        5.4.2.1  Single Analysis-Day Accuracy. For each of the audits of 
    the analytical methods for SO2, NO2, and Pb described in 
    sections 3.4.2, 3.4.3, and 3.4.4, the percentage difference 
    (dj) at each concentration level is calculated using equation 
    1, where Xj represents the known value of the audit sample and 
    Yj represents the value of SO2, NO2, or Pb indicated 
    by the analytical method.
        5.4.2.2  Accuracy for Reporting Organization. For each 
    analytical method, the average (D) of the individual percent 
    differences at each concentration level for all audits during the 
    calendar quarter is calculated using equation 8. The standard 
    deviation (Sa) of the percentage differences at each 
    concentration level for all audits during the calendar quarter is 
    calculated using equation 9. The 95 percent probability limits for 
    the accuracy for the reporting organization are calculated using 
    equations 6 and 7.
    
    6.0  Annual Operational Evaluation of PM2.5 Methods.
    
        All PM2.5 monitoring methods or analyzers used in SLAMS 
    shall be evaluated annually, as described in this section, to 
    quantitatively assess the quality of the SLAMS data being routinely 
    produced. This evaluation is derived from the results of collocated 
    PM2.5 measurements made at each monitoring station at least 6 
    times per year and applies to both automated and manual methods. 
    Individual samplers or monitors are screened for bias and excessive 
    imprecision. Estimates of integrated measurement precision and 
    accuracy, in the form of 95 percent probability limits, for each 
    designated PM2.5 method are determined for each reporting 
    organization and on a national basis. Reporting organizations are 
    defined as in section 3 of this Appendix. The results of the latter 
    evaluation shall be used to review instrument and reporting 
    organization performance. The absolute value of the 95 percent 
    probability limits on a national basis for each designated method 
    must be within 15 percent for the method to maintain its reference 
    or equivalent method designation.
        6.1  Operational field test audits. For each SLAMS PM2.5 
    monitor, collocate a PM2.5 reference method sampler, referred 
    to as an ``audit sampler,'' and operate it simultaneously with the 
    SLAMS monitor at least 6 times per year. These collocated audits are 
    required even for SLAMS PM2.5 monitors located at sites that 
    have a collocated PM2.5 monitor as required under section 3.3 
    of this appendix, unless the collocated monitor is a PM2.5 
    reference method sampler and is a designated audit device as 
    described in Section 2.12 of the Quality Assurance Handbook 
    (Reference 7). The collocated audit sampler shall be located between 
    2 and 4 meters from the SLAMS monitor, with its inlet at the same 
    height above ground as the inlet of the SLAMS monitor. Calibration 
    and operation of the audit sampler and analysis of the audit sample 
    filter shall be as specified in the sampler's operation or 
    instruction manual and in general accordance with the guidance 
    provided in Section 2.12 of Reference 7. Calibration and operation 
    of the SLAMS monitor shall be the same as for its routine SLAMS 
    operation, and it shall not receive any special or non-scheduled 
    service immediately prior to, or specifically associated with, the 
    collocated sample collection. The 6 or more collocated PM2.5 
    measurement pairs shall be obtained at approximately equal intervals 
    over the year, such as every other month, and shall be reported to 
    the EPA as set forth in Section 4 of this Appendix for other 
    precision and accuracy test results. All collocated measurements 
    shall be reported, even those which might be considered invalid 
    because of identified malfunctions or other problems occurring 
    during the sample collection period. Collocated measurements shall 
    be reported to EPA only for methods and analyzers approved for use 
    in SLAMS monitoring under part 58 of this chapter. The EPA will 
    calculate annual evaluations from the reported test measurements, as 
    described in sections 6.2 and 6.3.
        6.2  Screening Test for Bias and Excessive Imprecision of 
    Individual Monitors. This section describes a simple test, based on 
    the
    
    [[Page 65855]]
    
    binomial distribution, that checks for gross bias or inadequate 
    precision in the field operation of either the SLAMS monitor or the 
    audit sampler. However, since the audit sampler is a reference method, 
    the test results apply primarily to the SLAMS monitor. The test uses 
    the collocated audit measurements described in section 6.1, and may be 
    used with 4 to 12 measurement pairs.
        6.2.1  (1) For the annual evaluation, the EPA will calculate the 
    relative percent difference (RPD) for each measurement pair obtained 
    for the year as:
    RPD = [(C - Caudit) / ((C + Caudit)/2)] x 100
    
    
    where
    C = the concentration measured by the SLAMS monitor, and
    Caudit = the concentration measured by the audit sampler.
        (2) All collocated measurements will be used for this test, even 
    those which might be considered invalid because of identified 
    malfunctions or other problems occurring during the sample 
    collection period.
        6.2.2  There are three situations that can develop from 
    analyzing the collocated data:
        Situation A: All the RPD's are within 15% in absolute value. For 
    situation A, the SLAMS monitor shows no indication of bias or 
    inadequate precision and therefore passes this screening test.
        Situation B: Some or all of the RPD's are extreme in that they 
    exceed 15% in absolute value, and the extreme RPD's all have the 
    same sign (for example, -19, -21, -16). This may indicate a bias. 
    For situation B, Table A-2 specifies the minimum number of extreme 
    RPD's, all having the same sign, that indicates that the SLAMS 
    monitor has a significant, unacceptable bias with respect to the 
    audit reference method.
        Situation C: Some or all of the RPD's are extreme in that they 
    exceed 15% in absolute value, and the extreme RPD's do not all have 
    the same sign (for example, -17, +19, -18). This may indicate 
    unacceptable precision. For situation C, Table A-2 specifies the 
    minimum number of extreme RPD's, not all having the same sign, that 
    indicates that the SLAMS monitor has excessive imprecision with 
    respect to the audit reference method.
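
        The classification described above can be sketched in a few lines 
    (illustrative only, not part of the regulatory text). The critical 
    values are copied from Table A-2, which follows section 6.2.3, and 
    the measurement pairs are hypothetical.

    # Sketch of the section 6.2 screening test: compute the RPD for each
    # collocated pair, then classify the result as situation A, B, or C
    # and compare against the Table A-2 critical values (4 to 12 pairs).
    TABLE_A2 = {4: (2, 3), 5: (2, 3), 6: (3, 4), 7: (3, 4), 8: (3, 4),
                9: (3, 5), 10: (4, 5), 11: (4, 5), 12: (4, 6)}

    def screen(pairs):
        """pairs: list of (SLAMS concentration, audit concentration)."""
        rpd = [(c - a) / ((c + a) / 2) * 100 for c, a in pairs]
        extremes = [r for r in rpd if abs(r) > 15]
        if not extremes:
            return "situation A: passes"
        b_crit, c_crit = TABLE_A2[len(pairs)]
        same_sign = all(r > 0 for r in extremes) or all(r < 0 for r in extremes)
        if same_sign:
            return ("situation B: bias indicated" if len(extremes) >= b_crit
                    else "situation B: passes")
        return ("situation C: excessive imprecision indicated"
                if len(extremes) >= c_crit else "situation C: passes")

    # Example: 6 pairs with 4 extreme RPD's, all positive -> bias indicated.
    print(screen([(14.0, 11.5), (20.0, 16.5), (9.0, 7.3),
                  (12.0, 12.1), (16.0, 15.0), (25.0, 20.5)]))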
        6.2.3  If either bias (Situation B) or excessive imprecision 
    (Situation C) is indicated by this screening test for a particular 
    SLAMS monitor, the reporting organization will be notified by the 
    EPA within 60 days after the end of the year that no monitors of the 
    type (identified by its reference or equivalent method designation 
    number) that failed the screening test shall be used for further 
    SLAMS monitoring at any SLAMS site in the reporting organization 
    unless and until the probable cause or causes of the test failure 
    have been identified and corrected, the correction has been 
    appropriately addressed in the applicable quality assurance plan, 
    and the organization has received approval by the EPA Regional 
    Office to resume use of monitors of the type identified for SLAMS 
    purposes. General guidance in identifying and correcting common or 
    typical types of such quality assurance problems for reference 
    methods and Class I equivalent methods is provided in section 2.12 
    of Reference 7 of this appendix.
    
     Table A-2.--Table for Determining Bias or Excessive Imprecision for
                               Screening Test                           
    ------------------------------------------------------------------------
                                                                 Situation C
                                                    Situation B    Number of
                                                      Number of    RPD's of 
                                                      RPD's of     absolute 
                                                      absolute    value over
                                                     value over    15%--all 
                                                      15%--all    not having
              Number of measurement pairs            having the    the same 
                                                    same sign--   sign--that
                                                        that       indicate 
                                                      indicate    excessive 
                                                    significant  imprecision
                                                    bias of the     of the  
                                                       SLAMS        SLAMS   
                                                      monitor      monitor  
    ------------------------------------------------------------------------
    4.............................................            2            3
    5.............................................            2            3
    6.............................................            3            4
    7.............................................            3            4
    8.............................................            3            4
    9.............................................            3            5
    10............................................            4            5
    11............................................            4            5
    12............................................            4            6
    ------------------------------------------------------------------------
    
        6.2.4  The basis of this test is as follows:
        6.2.4.1  For both instruments, the precision is assumed to be a 
    percentage of the concentration being measured. The distributions of 
    the instruments' measurements are assumed to be normal, with an 
    operating precision (1.96 x standard deviation) of no more than 
    15%. The relative percent difference (RPD) is then approximately 
    normally distributed, with a standard deviation of about 15 x 
    sqrt(2)/1.96 = 10.8%. Thus, the absolute value of RPD will exceed 
    15% approximately 20% of the time.
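
        As a quick check of the arithmetic above, the following sketch 
    (illustrative only, not part of the regulatory text) reproduces the 
    standard deviation of the RPD implied by a 15% operating precision.

    # If 1.96 x sigma = 15% for each instrument, the RPD of two such
    # instruments has a standard deviation of roughly sigma x sqrt(2).
    from math import sqrt

    sigma_single = 15 / 1.96            # single-instrument sigma, percent
    sigma_rpd = sigma_single * sqrt(2)
    print(round(sigma_rpd, 1))          # about 10.8 percent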
        6.2.4.2  In the first situation (situation A), all the RPD's are 
    within 15% in absolute value, and the performance is acceptable.
        6.2.4.3  When the extreme RPD's all fall in one direction 
    (situation B), one can set up the following hypotheses. Null 
    Hypothesis: The mean measurements of both 
    instruments are the same. Alternative Hypothesis: The mean 
    measurement of the SLAMS instrument is higher (lower) than the mean 
    measurement of the audit instrument. The test of these hypotheses is 
    based on the binomial distribution. Table A-2 gives the number of 
    extreme values, for various numbers of measurement pairs, that would 
    lead to a rejection of the null hypothesis in favor of the 
    alternative hypothesis.
        6.2.4.4  When extreme RPD's occur in both directions (situation 
    C), one can set up the following hypotheses. Null Hypothesis: The 
    precisions of both instruments are 
    less than or equal to 15% (2-sigma). Alternative Hypothesis: The 
    precision of at least one instrument exceeds 15%. Again, the test is 
    based on the binomial distribution, and Table A-2 gives the number 
    of extreme values, for various numbers of measurement pairs, that 
    would lead to a rejection of the null hypothesis in favor of the 
    alternative hypothesis.
        6.2.4.5  The tests described above are stringent, using 
    p=0.01, meaning that one would expect to observe such a result by 
    chance less than 1 time in 100.
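
        The binomial reasoning in sections 6.2.4.3 through 6.2.4.5 can be 
    explored with a short calculation (illustrative only, not part of the 
    regulatory text). It uses the roughly 20 percent chance of an extreme 
    RPD stated in section 6.2.4.1, and roughly 10 percent per direction 
    for the one-sided case; it illustrates the form of the test and is 
    not claimed to reproduce the exact derivation of Table A-2.

    # Sketch: binomial tail probabilities underlying the screening test.
    # Under the null hypothesis, each RPD is assumed "extreme"
    # (|RPD| > 15%) with probability of about 0.2, and extreme in a given
    # direction with probability of about 0.1 (see section 6.2.4.1).
    from math import comb

    def tail_prob(n, k, p):
        """P(X >= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i)
                   for i in range(k, n + 1))

    n = 6
    print(tail_prob(n, 3, 0.1))  # situation B: 3+ extremes in one direction
    print(tail_prob(n, 4, 0.2))  # situation C: 4+ extremes of either sign
    # Both tail probabilities are on the order of 0.01 to 0.02; such
    # outcomes are unlikely by chance for well-behaved instruments.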
        6.2.4.6  As an example, suppose one takes 6 pairs of 
    simultaneous measurements and finds that 4 of the 6 RPD's for the 
    SLAMS monitor are greater than 15% and neither of the remaining two 
    RPD's is below -15%. Since there are 4 RPD's with absolute value 
    above 15% and they all have the same sign (i.e., they are all above 
    15%), this example is situation B. Table A-2 indicates that for 
    situation B with 6 measurement pairs, 3 or more extreme RPD's of the 
    same sign mean that the SLAMS monitor is biased (in this case, 
    reading high) relative to the audit (reference) method.
        6.3  Integrated Precision and Accuracy for Reporting 
    Organizations and for Specific Methods.
        This section describes how integrated estimates of monitoring 
    data quality are calculated for specific monitoring methods (as 
    identified by a unique reference or equivalent method designation 
    number) on a national basis and for each reporting organization. 
    These estimates are based on the collocated audit measurements 
    described in section 6.1.
        6.3.1  Annual evaluation. Using the collocated measurement pair 
    data, as described in Section 6.1 for the applicable year, the EPA 
    shall determine the operating precision for each designated method, 
    on a national basis and for each reporting organization, as follows:
        6.3.1.1  For each monitoring station for which PM2.5 data 
    has been reported to AIRS during the year, calculate the percent 
    difference (di) for each measurement pair using equation 1 in 
    section 5.1.1 of this Appendix, where Yi is the concentration 
    measurement from the SLAMS monitor for the i-th audit measurement 
    pair and Xi is the concentration measurement from the audit 
    sampler. Include only stations at which at least 4 collocated 
    measurement pairs are available for the year, and only measurement 
    pairs in which Xi is above the limit for PM2.5 specified 
    in section 5.3.1 of this Appendix.
        6.3.1.2  For each monitoring station for which PM2.5 data 
    has been reported to AIRS, calculate the average (dj) and the 
    standard deviation (Sj) for the year for each station at which 
    the method is used for SLAMS monitoring, using equations 2 and 3
    
    [[Page 65856]]
    
    (respectively) in section 5.1.1 of this Appendix, where n is the 
    number of measurement pairs reported for the year. Include only 
    stations at which at least 4 collocated measurement pairs are 
    available for the year.
        6.3.1.3  For each designated method and for each reporting 
    organization, calculate the average of averages (D) and the pooled 
    estimate of standard deviation (Sa), using equations 4a and 5a 
    (respectively) of Section 5.1.2, where k in this case is the number 
    of stations in the reporting organization at which the method is 
    used for SLAMS monitoring (and at least 4 measurement pairs are 
    reported). Call these estimates DR,M and SR,M, where R 
    identifies the reporting organization and M identifies the 
    designated method.
        6.3.1.4  For each designated method, calculate the average of 
    averages (D) and the pooled standard deviation (Sa) at the 
    national level using equations 4a and 5a (respectively) of Section 
    5.1.2, where k in this case is the number of sites nationwide at 
    which the method is used for SLAMS monitoring (and at least 4 
    measurement pairs are reported). Call these estimates 
    Dnational, M and Snational, M, where M identifies the 
    designated method. A 95 percent confidence interval shall also be 
    determined for each national pooled standard deviation.
        6.3.1.5  For each designated method, calculate the 95 percent 
    probability limits for each reporting organization, using equations 
    6 and 7 of Section 5.1.2, where D=DR,M and Sa=SR,M. 
    Similarly, calculate the 95 percent probability limits for each 
    method on a national basis, using equations 6 and 7 of Section 
    5.1.2, where D=Dnational,M and Sa=Snational,M.
        Note: Pooling individual site estimates of precision across a 
    reporting organization or across the nation using equation 5a 
    assumes that the individual site estimates of precision using 
    equation 3 are reasonably homogeneous across the year for a 
    designated method.
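
        The annual PM2.5 evaluation reuses the machinery of section 
    5.1: station-level averages and standard deviations are pooled with 
    the weighted equations 4a and 5a for each reporting organization and 
    nationally. The following sketch (illustrative only, not part of the 
    regulatory text) shows the per-organization step with hypothetical 
    station data; it applies the at-least-4-pairs requirement after the 
    concentration-limit screen, which is one reading of section 6.3.1.1.

    # Sketch of sections 6.3.1.1 through 6.3.1.3 for one reporting
    # organization: per-station percent differences (equation 1), station
    # statistics (equations 2 and 3), then weighted pooling (4a and 5a).
    from math import sqrt

    PM25_LIMIT = 6.0   # µg/m3, from section 5.3.1 of this appendix

    # station -> list of (audit Xi, SLAMS Yi) collocated pairs (hypothetical)
    stations = {
        "site 1": [(12.0, 12.8), (9.5, 9.1), (15.2, 16.0), (7.4, 7.9),
                   (5.0, 4.6)],
        "site 2": [(22.1, 21.0), (18.4, 19.6), (10.3, 10.0), (8.8, 9.4)],
    }

    station_stats = []   # (dj, Sj, nj) for each qualifying station
    for pairs in stations.values():
        usable = [(x, y) for x, y in pairs if x > PM25_LIMIT]
        if len(usable) < 4:              # require at least 4 usable pairs
            continue
        d = [(y - x) / x * 100 for x, y in usable]           # Equation 1
        n = len(d)
        dj = sum(d) / n                                       # Equation 2
        sj = sqrt(sum((v - dj) ** 2 for v in d) / (n - 1))    # Equation 3
        station_stats.append((dj, sj, n))

    sum_n = sum(n for _, _, n in station_stats)
    D = sum(n * dj for dj, _, n in station_stats) / sum_n            # Eq 4a
    Sa = sqrt(sum((n - 1) * sj ** 2 for _, sj, n in station_stats)
              / sum(n - 1 for _, _, n in station_stats))             # Eq 5a
    print(f"D = {D:.2f}%, Sa = {Sa:.2f}%, "
          f"limits ({D - 1.96 * Sa:.2f}%, {D + 1.96 * Sa:.2f}%)")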
        6.3.2  Reporting organization method operational performance. A 
    summary of the results calculated in section 6.3.1.5 shall be 
    reported annually to the appropriate EPA Regional Office. If the 
    absolute value of either the upper or lower probability limit for a 
    reporting organization calculated in section 6.3.1.5 for any 
    designated method is found to be greater than 15 percent or 
    substantially higher than the corresponding limits calculated for 
    the method on the national basis, the reporting organization shall 
    be identified and notified by the EPA that its quality assurance in 
    the operation of the particular PM2.5 method may be inadequate. 
    Each reporting organization so identified and notified must 
    demonstrate, through an appropriate quality assurance plan or 
    modified plan, that it will achieve better performance in future 
    monitoring operations using the method. General guidance in 
    identifying and correcting common or typical types of such quality 
    assurance problems for reference methods and Class I equivalent 
    methods is provided in section 2.12 of Reference 7 of this appendix.
        6.3.3  National method operational performance. If the absolute 
    value of either the upper or lower probability limit calculated in 
    section 6.3.1.5 for any designated method on a national basis is 
    found to be greater than 15 percent, the method shall be deemed to 
    have failed the annual operational performance assessment test. This 
    result shall constitute a ground for cancellation of the reference 
    or equivalent method in accordance with Sec. 53.11 of this chapter, 
    and the EPA shall take the actions specified in that section within 
    150 days.
    
    References in Appendix A of Part 58
    
        1. Rhodes, R.C. Guideline on the Meaning and Use of Precision 
    and Accuracy Data Required by 40 CFR part 58 appendices A and B. 
    EPA-600/4-83/023. U.S. Environmental Protection Agency, Research 
    Triangle Park, NC 27711, June, 1983.
        2. ``American National Standard--Specifications and Guidelines 
    for Quality Systems for Environmental Data Collection and 
    Environmental Technology Programs.'' ANSI/ASQC E4-1994. January 
    1995. Available from American Society for Quality Control, 611 East 
    Wisconsin Avenue, Milwaukee, WI 53202.
        3. ``EPA Requirements for Quality Management Plans.'' EPA QA/R-
    2. August 1994. Available from U.S. Environmental Protection Agency, 
    ORD Publications Office, Center for Environmental Research 
    Information (CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 
    45268.
        4. ``EPA Requirements for Quality Assurance Project Plans for 
    Environmental Data Operations.'' EPA QA/R-5. August 1994. Available 
    from U.S. Environmental Protection Agency, ORD Publications Office, 
    Center for Environmental Research Information (CERI), 26 W. Martin 
    Luther King Drive, Cincinnati, OH 45268.
        5. ``Guidance for the Data Quality Objectives Process.'' EPA QA/
    G-4. September 1994. Available from U.S. Environmental Protection 
    Agency, ORD Publications Office, Center for Environmental Research 
    Information (CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 
    45268.
        6. ``Quality Assurance Handbook for Air Pollution Measurement 
    Systems, Volume 1--A Field Guide to Environmental Quality 
    Assurance.'' EPA-600/R-94/038a. April 1994. Available from U.S. 
    Environmental Protection Agency, ORD Publications Office, Center for 
    Environmental Research Information (CERI), 26 W. Martin Luther King 
    Drive, Cincinnati, OH 45268.
        7. ``Quality Assurance Handbook for Air Pollution Measurement 
    Systems, Volume II--Ambient Air Specific Methods (Interim 
    Edition).'' EPA-600/R-94/038b. April 1994. Available from U.S. 
    Environmental Protection Agency, ORD Publications Office, Center for 
    Environmental Research Information (CERI), 26 W. Martin Luther King 
    Drive, Cincinnati, OH 45268. [Note: Section 2.12 of Volume II is 
    currently under development and will not be available from the CERI 
    address until it is published as an addition to EPA/600/R-94/038b. 
    Prepublication draft copies of section 2.12 will be available from 
    Department E (MD-77B), U.S. EPA, Research Triangle Park, NC 27711, 
    or from the contact identified at the beginning of this proposed 
    rule].
        8. ``List of Designated Reference and Equivalent Methods.'' 
    Available from U.S. Environmental Protection Agency, National 
    Exposure Research Laboratory, Quality Assurance Branch, MD-77B, 
    Research Triangle Park, NC 27711.
        9. Technical Assistance Document for Sampling and Analysis of 
    Ozone Precursors. Atmospheric Research and Exposure Assessment 
    Laboratory, U.S. Environmental Protection Agency, Research Triangle 
    Park, NC 27711. EPA 600/8-91-215. October 1991.
        10. ``EPA Traceability Protocol for Assay and Certification of 
    Gaseous Calibration Standards.'' EPA-600/R-93/224. September 1993. 
    Available from U.S. Environmental Protection Agency, ORD 
    Publications Office, Center for Environmental Research Information 
    (CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 45268.
        11. Paur, R.J. and F.F. McElroy. Technical Assistance Document 
    for the Calibration of Ambient Ozone Monitors. EPA-600/4-79-057. 
    U.S. Environmental Protection Agency, Research Triangle Park, NC 
    27711, September, 1979.
        12. McElroy, F.F. Transfer Standards for the Calibration of 
    Ambient Air Monitoring Analyzers for Ozone. EPA-600/4-79-056. U.S. 
    Environmental Protection Agency, Research Triangle Park, NC 27711, 
    September, 1979.
    
    Tables to Appendix A of Part 58
    
                                    Table A-1.--Minimum Data Assessment Requirements                                
    ----------------------------------------------------------------------------------------------------------------
                                                                                                      Parameters    
                 Method                Assessment method       Coverage        Minimum frequency       reported     
    ----------------------------------------------------------------------------------------------------------------
    Precision:                                                                                                      
        Automated methods for SO2,    Response check at   Each analyzer.....  Once per 2 weeks..  Actual            
         NO2, O3, and CO.              concentration                                               concentration \2\
                                       between .08 and                                             and measured     
                                       .10 ppm (8 & 10                                             concentration \3\
                                       ppm for CO) \2\.                                            .                
    
    [[Page 65857]]
    
                                                                                                                    
        Manual methods: All methods   Collocated          1 site for 1-5      Once per week.....  Two concentration 
         except PM2.5.                 samplers.           sites; 2 sites                          measurements.    
                                                           for 6-20 sites; 3                                        
                                                           sites >20 sites;                                         
                                                           (sites with                                              
                                                           highest conc.).                                          
        PM2.5 methods...............  Collocated          1 site for 1-10     Once per week.....  Two concentration 
                                       samplers.           sites; 2 sites                          measurements.    
                                                           for 11-20 sites;                                         
                                                           3 sites >20                                              
                                                           sites; (sites                                            
                                                           with highest                                             
                                                           conc.).                                                  
    Accuracy:                                                                                                       
        Automated methods for SO2,    Response check at   1. Each analyzer;   1. Once per year;   Actual            
         NO2, O3, and CO.              .03-.08 ppm \1\ \2\ 2. 25% of           2. Each calendar    concentration \2\
                                       .15-.20 ppm \1\ \2\ analyzers (at       quarter.            and measured     
                                       .35-.45 ppm \1\ \2\ least 1).                               (indicated)      
                                       .80-.90 ppm \1\ \2\                                         concentration \3\
                                       (if applicable).                                            for each level.  
        Manual methods for SO2, and   Check of            Analytical system.  Each day samples    Actual            
         NO2.                          analytical                              are analyzed, at    concentration and
                                       procedure with                          least twice per     measured         
                                       audit standard                          quarter.            (indicated)      
                                       solutions.                                                  concentration for
                                                                                                   each audit       
                                                                                                   solution.        
        TSP, PM10...................  Check of sampler    1. Each sampler;    1. Once per year;   Actual flow rate  
                                       flow rate.          2. 25% of           2. Each calendar    and flow rate    
                                                           samplers (at        quarter.            indicated by the 
                                                           least 1).                               sampler.         
        PM2.5.......................  1. Check of         1. Each sampler,    1. Minimum of       1. Actual flow    
                                       sampler flow rate.  all locations.      every calendar      rate and flow    
                                                                               quarter, 4 checks   rate indicated by
                                                                               per year.           sampler.         
                                      2. Audit with       2. Each sampler,    2. Minimum of       2. Particle mass  
                                       reference method.   all locations.      every other         concentration    
                                                                               month, 6            indicated by     
                                                                               measurements per    sampler and by   
                                                                               year.               audit reference  
                                                                                                   sampler.         
        Lead........................  1. Check of         1. Each sampler...  1. Include with     1. Same as for    
                                       sampler flow rate                       TSP.                TSP.             
                                       as TSP;.                                                                     
                                      2. Check of         2. Analytical       2. Each quarter...  2. Actual         
                                       analytical system   system.                                 concentration and
                                       with Pb audit                                               measured         
                                       strips.                                                     (indicated)      
                                                                                                   concentration of 
                                                                                                   audit samples    
                                                        (µg Pb/ 
                                                                                                   strip).          
    ----------------------------------------------------------------------------------------------------------------
    \1\ Concentration times 100 for CO.                                                                             
    \2\ Effective concentration for open path analyzers.                                                            
    \3\ Corrected concentration, if applicable, for open path analyzers.                                            
    
    Appendix C--[Amended]
    
        15. Appendix C is amended by revising section 2.2 and adding 
    sections 2.2.1 through 2.2.2.2 to read as follows:
    
    2.2  Substitute PM samplers.
    
        2.2.1  Substitute PM10 samplers.
        2.2.1.1  For purposes of showing compliance with the NAAQS for 
    particulate matter, a high volume TSP sampler described in Appendix 
    B of part 50 of this chapter may be used in a SLAMS in lieu of a 
    PM10 monitor as long as the ambient concentrations of particles 
    measured by the TSP sampler are below the PM10 NAAQS. If the 
    TSP sampler measures a single value that is higher than the 
    PM10 24-hour standard, or if the annual average of its 
    measurements is greater than the PM10 annual standard, the TSP 
    sampler operating as a substitute PM10 sampler must be replaced 
    with a PM10 monitor. For a TSP measurement above the 24-hour 
    standard, the TSP sampler should be replaced with a PM10 
    monitor before the end of the calendar quarter following the quarter 
    in which the high concentration occurred. For a TSP annual average 
    above the annual standard, the PM10 monitor should be operating 
    by June 30 of the year following the exceedance.
        2.2.1.2  In order to maintain historical continuity of ambient 
    particulate matter trends and patterns for PM10 NAMS that were 
    previously TSP NAMS, the TSP high volume sampler must be operated 
    concurrently with the PM10 monitor for a one-year period 
    beginning with the PM10 NAMS start-up date. The operating 
    schedule for the TSP sampler must be at least once every six days 
    regardless of the PM10 sampling frequency.
        2.2.2  Substitute PM2.5 samplers.
        2.2.2.1  For purposes of showing compliance with the NAAQS for 
    particulate matter, a PM10 monitor designated as a reference or 
    equivalent method for PM10 under part 53 of this chapter may be 
    used in a SLAMS in lieu of a PM2.5 monitor as long as the 
    ambient concentration of particles measured by the PM10 monitor 
    is below the PM2.5 NAAQS. If the PM10 monitor measures a 
    single value that is higher than the PM2.5 24-hour standard, or 
    the annual average of its measurements is greater than the 
    PM2.5 annual standard, the PM10 monitor operating as a 
    substitute PM2.5 monitor must be replaced with a PM2.5 
    monitor. For a PM10 measurement above the 24-hour PM2.5 
    standard, the PM10 monitor should be replaced with a PM2.5 
    monitor before the end of the calendar quarter following the quarter 
    in which the high concentration occurred. For a PM10 annual 
    average above the annual PM2.5 standard, the PM2.5 monitor 
    should be operating by June 30 of the year following the exceedance.
        2.2.2.2  In order to maintain historical continuity of ambient 
    particulate matter trends and patterns for PM2.5 NAMS that were 
    previously PM10 NAMS, the PM10 monitor must be operated 
    concurrently with the PM2.5 monitor for a one-year period 
    beginning with the PM2.5 NAMS start-up date. The operating 
    schedule for the PM10 monitor must be at least once every six 
    days regardless of the PM2.5 sampling frequency.
    
        16. Appendix C is amended by adding new sections 2.4 through 2.4.6 
    to read as follows:
    
        2.4  Approval of non-designated PM2.5 methods operated at 
    specific individual sites. A method for PM2.5 that has not been 
    designated as a reference or equivalent method as defined in 
    Sec. 50.1 of this chapter may be approved for use for purposes of 
    section 2.1 of this Appendix at a particular SLAMS under the 
    following stipulations.
        2.4.1  The method must be demonstrated to meet the comparability 
    requirements (except as provided in this section 2.4.1) set forth in 
    Sec. 53.34 of this chapter in each of the four seasons at the site 
    at which it is intended to be used. For purposes of this
    
    [[Page 65858]]
    
    section 2.4.1, the requirements of 40 CFR 53.34 shall be modified as 
    follows:
        2.4.1.1  The method shall be tested at the site at which it is 
    intended to be used, and there shall be no requirement for tests at 
    any other test site.
        2.4.1.2  For purposes of this section 2.4, the seasons shall be 
    defined as follows: spring shall be the months of March, April, and 
    May; summer shall be the months of June, July, and August; fall 
    shall be the months of September, October, and November; and winter 
    shall be the months of December, January, and February.
        2.4.1.3  No PM10 samplers shall be required for the test, 
    as determination of the PM2.5/PM10 ratio at the test site 
    shall not be required.
        2.4.1.4  The specifications given in Table C-4 of part 53 of 
    this chapter for Class I methods shall apply, except that there 
    shall be no requirement for any minimum number of sample sets with 
    Rj above 40 µg/m\3\ for 24-hour samples or above 30 
    µg/m\3\ for 48-hour samples.
        2.4.2  The monitoring agency wishing to use the method must 
    develop and implement appropriate quality assurance procedures for 
    the method.
        2.4.3  The monitoring agency wishing to use the method must 
    develop and implement appropriate procedures for assessing and 
    reporting the precision and accuracy of the method comparable to the 
    procedures set forth in Appendix A of this part for designated 
    reference and equivalent methods.
        2.4.4  The assessment of network operating precision using 
    collocated measurements with reference method ``audit'' samplers 
    required under section 6 of Appendix A of this part shall be 
    carried out semi-annually rather than annually (i.e., monthly audits 
    with assessment determinations every 6 months).
        2.4.5  Requests for approval under this section 2.4 must meet 
    the general submittal requirements of sections 2.7.1 and 2.7.2.1 of 
    this appendix and must include the requirements in sections 2.4.5.1 
    through 2.4.5.7 of this appendix.
        2.4.5.1  A clear and unique description of the site at which the 
    method or sampler will be used and tested, and a description of the 
    nature or character of the site and the particulate matter that is 
    expected to occur there.
        2.4.5.2  A detailed description of the method and the nature of 
    the sampler or analyzer upon which it is based.
        2.4.5.3  A brief statement of the reason or rationale for 
    requesting the approval.
        2.4.5.4  A detailed description of the quality assurance 
    procedures that have been developed and that will be implemented for 
    the method.
        2.4.5.5  A detailed description of the procedures for assessing 
    the precision and accuracy of the method that will be implemented 
    for reporting to AIRS.
        2.4.5.6  Test results from the comparability tests required 
    above.
        2.4.5.7  Such further supplemental information as may be 
    necessary or helpful to support the required statements and test 
    results.
        2.4.6  Within 120 days after receiving a request for approval of 
    the use of a method at a particular site under this section 2.4 and 
    such further information as may be requested for purposes of the 
    decision, the Administrator will approve or disapprove the method by 
    letter to the person or agency requesting such approval.
    
        17. Appendix C is amended by adding a new section 2.5 to read as 
    follows:
    
        2.5  Approval of non-designated methods under Sec. 58.13(f). An 
    automated (continuous) method for PM2.5 that is not designated 
    as either a reference or equivalent method as defined in Sec. 50.1 
    of this chapter may be approved under Sec. 58.13(f) for use at a 
    SLAMS for the limited purposes of Sec. 58.13(f). Such an analyzer 
    that is approved for use at a SLAMS under Sec. 58.13(f), identified 
    as a correlated acceptable continuous (CAC) monitor, shall not be 
    considered a reference or equivalent method as defined in Sec. 50.1 
    of this chapter by virtue of its approval for use under 
    Sec. 58.13(f), and the PM2.5 monitoring data obtained from such 
    a monitor shall not be otherwise used for purposes of part 50 of 
    this chapter.
    
        18. Appendix C is amended by revising section 2.7.1 to read as 
    follows:
    
        2.7.1  Requests for approval under sections 2.4, 2.6.2, or 2.8 
    must be submitted to: Director, National Exposure Assessment 
    Laboratory, Department E, (MD-77B), U.S. Environmental Protection 
    Agency, Research Triangle Park, North Carolina 27711.
    
        19. Appendix C is amended by adding a new section 2.9 to read as 
    follows:
    
        2.9  Use of IMPROVE Samplers at a SLAMS. ``IMPROVE'' samplers 
    may be used in SLAMS for monitoring of regional background 
    concentrations of fine particulate matter. The IMPROVE samplers were 
    developed for use in the Interagency Monitoring of Protected Visual 
    Environments (IMPROVE) network to characterize all of the major 
    components and many trace constituents of the particulate matter 
    that impair visibility in Federal Class I Areas. These samplers are 
    routinely operated at about 70 locations in the United States. 
    IMPROVE samplers consist of four sampling modules that are used to 
    collect twice weekly 24-hour duration simultaneous samples. Modules 
    A, B, and C collect PM2.5 on three different filter substrates 
    that are compatible with a variety of analytical techniques, and 
    module D collects a PM10 sample. PM2.5 mass and elemental 
    concentrations are determined by analysis of the 25 mm diameter 
    stretched Teflon filters from module A. More complete descriptions 
    of the IMPROVE samplers and the data they collect are available 
    elsewhere (References 5.2, 5.3, and 5.4 of this Appendix).
    
        20. Appendix C, section 6.0 is amended by adding references 4 
    through 6 to read as follows:
    
    6.0  References
    
    * * * * *
        4. Eldred, R.A., Cahill, T.A., Wilkenson, L.K., et al., 
    ``Measurements of fine particles and their chemical components in 
    the IMPROVE/NPS networks,'' in ``Transactions of the International 
    Specialty Conference on Visibility and Fine Particles,'' Air and 
    Waste Management Association: Pittsburgh, PA, 1990; pp 187-196.
        5. Sisler, J.F., Huffman, D., and Latimer, D.A.; ``Spatial and 
    temporal patterns and the chemical composition of the haze in the 
    United States: An analysis of data from the IMPROVE network, 1988-
    1991,'' ISSN No. 0737-5253-26, National Park Service, Ft. Collins, 
    CO, 1993.
        6. Eldred, R.A., Cahill, T.A., Pitchford, M., and Malm, W.C.; 
    ``IMPROVE--a new remote area particulate monitoring system for 
    visibility studies,'' Proceedings of the 81st Annual Meeting of the 
    Air Pollution Control Association, Dallas, Paper 88-54.3, 1988.
    
    Appendix D--[Amended]
    
        21. In Appendix D the first three paragraphs and Table 1 of section 
    1 are revised as follows:
    
    1. SLAMS Monitoring Objectives and Spatial Scales
    
        The purpose of this appendix is to describe monitoring 
    objectives and general criteria to be applied in establishing the 
    State and Local Air Monitoring Stations (SLAMS) networks and for 
    choosing general locations for new monitoring stations. It also 
    describes criteria for determining the number and location of 
    National Air Monitoring Stations (NAMS), Photochemical Assessment 
    Monitoring Stations (PAMS), and core Stations for PM2.5. These 
    criteria will also be used by EPA in evaluating the adequacy of the 
    SLAMS/NAMS/PAMS and core PM2.5 networks.
        The network of stations which comprise SLAMS should be designed 
    to meet a minimum of six basic monitoring objectives. These basic 
    monitoring objectives are:
        (1) To determine highest concentrations expected to occur in the 
    area covered by the network;
        (2) To determine representative concentrations in areas of high 
    population density;
        (3) To determine the impact on ambient pollution levels of 
    significant sources or source categories;
        (4) To determine general background concentration levels;
        (5) To determine the extent of regional pollutant transport 
    among populated areas, in support of secondary standards; and
        (6) To determine the welfare-related impacts in more rural and 
    remote areas (such as visibility impairment and effects on 
    vegetation).
    It should be noted that this appendix contains no criteria for 
determining the total number of stations in SLAMS networks, except 
that a minimum number of lead and PM2.5 SLAMS is prescribed 
and the minimal network introduced in Sec. 58.20 is explained. The 
optimum size of a particular SLAMS network involves trade-offs among 
    data needs and available resources which EPA believes can best be 
    resolved during the network design process.
    * * * * *
    
    [[Page 65859]]
    
    
    
     Table 1.--Relationship Among Monitoring Objectives and Scale of    
                           Representativeness                           
------------------------------------------------------------------------
           Monitoring objective               Appropriate siting scales 
------------------------------------------------------------------------
Highest concentration.....................  Micro, middle, neighborhood 
                                             (sometimes urban \a\).     
Population................................  Neighborhood, urban.        
Source impact.............................  Micro, middle, neighborhood.
General/background........................  Neighborhood, urban,        
                                             regional.                  
Regional transport........................  Urban/regional.             
Welfare-related impacts...................  Urban/regional.             
------------------------------------------------------------------------
\a\ Urban denotes a geographic scale applicable to both cities and rural  
  areas.                                                                
    
    * * * * *
        22. In Appendix D, section 2 is amended by revising the second 
    paragraph and adding a new paragraph to the end of the section before 
    section 2.1 to read as follows:
    
    2. SLAMS Network Design Procedures
    
    * * * * *
        The discussion of scales in sections 2.3 through 2.8 does not 
    include all of the possible scales for each pollutant. The scales 
    which are discussed are those which are felt to be most pertinent 
    for SLAMS network design.
    * * * * *
        Information such as emissions density, housing density, 
    climatological data, geographic information, traffic counts, and the 
    results of modeling will be useful in designing regulatory networks. 
    Air pollution control agencies have shown the value of screening 
    studies, such as intensive studies conducted with portable samplers, 
    in designing networks. In many cases, in selecting sites for core 
    PM2.5 or carbon monoxide SLAMS, and for defining the boundaries 
of PM2.5 spatial averaging zones, air pollution control agencies 
    will benefit from using such studies to evaluate the spatial 
    distribution of pollutants.
    * * * * *
        23. Section 2.8 is revised as follows:
    
    2.8  Particulate Matter Design Criteria for SLAMS
    
        As with other pollutants measured in the SLAMS network, the 
    first step in designing the particulate matter network is to collect 
    the necessary background information. Various studies in References 
    11, 12, 13, 14, 15, and 16 of this appendix have documented the 
    major source categories of particulate matter and their contribution 
    to ambient levels in various locations throughout the country.
        2.8.0.1  Sources of background information would be regional and 
    traffic maps, and aerial photographs showing topography, 
    settlements, major industries and highways. These maps and 
    photographs would be used to identify areas of the type that are of 
    concern to the particular monitoring objective. After potentially 
    suitable monitoring areas for particulate matter have been 
    identified on a map, modeling may be used to provide an estimate of 
    particulate matter concentrations throughout the area of interest. 
    After completing the first step, existing particulate matter 
    stations should be evaluated to determine their potential as 
    candidates for SLAMS designation. Stations meeting one or more of 
    the six basic monitoring objectives described in section 1 of this 
    appendix must be classified into one of the five scales of 
    representativeness (micro, middle, neighborhood, urban and regional) 
    if the stations are to become SLAMS. In siting and classifying 
    particulate matter stations, the procedures in reference 17 should 
    be used.
        2.8.0.2  The most important spatial scales to effectively 
    characterize the emissions of particulate matter from both mobile 
    and stationary sources are the middle and neighborhood scales. For 
    purposes of establishing monitoring stations to represent large 
homogeneous areas other than the above scales of representativeness 
    and to characterize Regional transport, urban or regional scale 
    stations would also be needed.
        2.8.0.3  Microscale--This scale would typify areas such as 
    downtown street canyons and traffic corridors where the general 
    public would be exposed to maximum concentrations from mobile 
    sources. In some circumstances, the microscale is appropriate for 
    particulate stations; core SLAMS on the microscale should, however, 
    be limited to urban sites that are representative of long-term human 
    exposure and of many such microenvironments in the area. In general, 
    microscale particulate matter sites should be located near inhabited 
    buildings or locations where the general public can be expected to 
    be exposed to the concentration measured. Emissions from stationary 
    sources such as primary and secondary smelters, power plants, and 
    other large industrial processes may, under certain plume 
    conditions, likewise result in high ground level concentrations at 
    the microscale. In the latter case, the microscale would represent 
    an area impacted by the plume with dimensions extending up to 
    approximately 100 meters. Data collected at microscale stations 
    provide information for evaluating and developing ``hot spot'' 
    control measures. Unless these sites are indicative of population-
    oriented monitoring, they may be more appropriately classified as 
    SPMs.
        2.8.0.4  Middle Scale--Much of the measurement of short-term 
    public exposure to particulate matter is on this scale and on the 
    neighborhood scale; core SLAMS especially should represent 
    community-wide air pollution. People moving through downtown areas, 
    or living near major roadways, encounter particles that would be 
    adequately characterized by measurements of this spatial scale. 
    Thus, measurements of this type would be appropriate for the 
    evaluation of possible short-term public health effects of 
    particulate matter pollution. This scale also includes the 
    characteristic concentrations for other areas with dimensions of a 
    few hundred meters such as the parking lot and feeder streets 
    associated with shopping centers, stadia, and office buildings. In 
    the case of PM10, unpaved or seldom swept parking lots 
    associated with these sources could be an important source in 
    addition to the vehicular emissions themselves.
        2.8.0.5  Neighborhood Scale--Measurements in this category would 
    represent conditions throughout some reasonably homogeneous urban 
    subregion with dimensions of a few kilometers and of generally more 
    regular shape than the middle scale. Homogeneity refers to the 
    particulate matter concentrations, as well as the land use and land 
surface characteristics. Much of the PM2.5 exposure is 
expected to be associated with this scale of measurement. In some 
    cases, a location carefully chosen to provide neighborhood scale 
    data would represent not only the immediate neighborhood but also 
    neighborhoods of the same type in other parts of the city. Stations 
    of this kind provide good information about trends and compliance 
    with standards because they often represent conditions in areas 
    where people commonly live and work for periods comparable to those 
    specified in the NAAQS. This category also may include industrial 
    and commercial neighborhoods especially in districts of diverse land 
    use where residences are interspersed.
        2.8.0.6  Neighborhood scale data could provide valuable 
    information for developing, testing, and revising models that 
    describe the larger-scale concentration patterns, especially those 
    models relying on spatially smoothed emission fields for inputs. The 
    neighborhood scale measurements could also be used for neighborhood 
    comparisons within or between cities. This is the most likely scale 
    of measurements to meet the needs of planners.
        2.8.0.7  Urban Scale--This class of measurement would be made to 
    characterize the particulate matter concentration over an entire 
    metropolitan or rural area ranging in size from 4 to 50 km. Such 
    measurements would be useful for assessing trends in area-wide air 
    quality, and hence, the effectiveness of large scale air pollution 
    control strategies.
        2.8.0.8  Regional Scale--These measurements would characterize 
    conditions over areas with dimensions of as much as hundreds of 
    kilometers. As noted earlier, using representative conditions for an 
    area implies some degree of homogeneity in that area. For this 
    reason, regional scale measurements would be most applicable to 
    sparsely populated areas with reasonably uniform ground cover. Data 
    characteristics of this scale would provide information about larger 
    scale processes of particulate matter emissions, losses and 
    transport. Especially in the case of PM2.5, transport 
    contributes to particulate concentrations and may affect multiple 
    urban and State entities with large populations such as in the 
    Eastern United States. Development of effective pollution control 
    strategies requires an understanding at regional geographical scales 
    of the emission sources and atmospheric processes that are 
    responsible for elevated PM2.5 levels and may also be 
    associated with elevated ozone and regional haze.
    
        24. New sections 2.8.1, 2.8.2, 2.8.3, and 2.8.4 are added after 
    Section 2.8 to read as follows:
    
    [[Page 65860]]
    
    2.8.1  Monitoring Planning Areas and Spatial Averaging Zones
    
        2.8.1.1  Monitoring planning areas (MPA's) and spatial averaging 
    zones (SAZ's) shall be used to conform to the population-oriented, 
    spatial averaging approach used for the PM2.5 NAAQS given in 40 
    CFR Part 50. MPA's are required to include all metropolitan 
    statistical areas (MSA's) with population greater than 500,000, and 
    all other areas determined to be in violation of the PM2.5 
NAAQS.\1\ Although not required, MPA's should generally be 
    designated to also include all MSA's with population greater than 
    250,000 which have measured or modeled PM2.5 concentrations 
    greater than 80 percent of the PM2.5 NAAQS. Monitoring planning 
    areas for other designated parts of the State are optional.
    ---------------------------------------------------------------------------
    
    \1\ The boundaries of MPA's do not necessarily have to 
correspond to those of MSA's, and existing intrastate or interstate air 
pollution planning districts may be utilized.
    ---------------------------------------------------------------------------
    
        2.8.1.2  The SAZs shall define the area within which monitoring 
    data will be averaged for comparison with the annual PM2.5 
    NAAQS. This approach is directly related to epidemiological studies 
    used as the basis for the PM2.5 NAAQS. A SAZ should 
    characterize an area of relatively similar annual average air 
    quality (e.g., the annual average concentrations at individual sites 
    should not exceed the spatial average by more than +/- 20 percent) 
and exhibit similar day-to-day variability (e.g., the monitoring 
    sites should not have low correlations, say less than 0.8). 
    Moreover, the entire SAZ should principally be affected by the same 
    major emission sources of particulate matter.
        2.8.1.3  Each monitoring planning area shall have at least one 
    spatial averaging zone, which may or may not cover the entire MPA. 
    In metropolitan statistical areas (MSA's) for which MPA's are 
    required, the SAZ's shall completely cover the entire MSA. 
Exceptions to this requirement are allowed (say, for areas with low 
population density) provided that the exception receives approval from the 
appropriate EPA Regional Administrator. In MPA's for other areas, 
    the SAZ's are not required to completely cover the entire MPA. All 
    MPA's and SAZ's shall be defined on the basis of existing, 
    delineated mapping data limited to State boundaries, county 
    boundaries, zip codes, census blocks, or census block groups; 
    however, SAZ's shall not overlap in their geographical coverage.
        2.8.1.4  Spatial averaging zones should generally include a 
    minimum of 250,000 and not more than two million population, but all 
    areas in the ambient air may become a spatial averaging zone. The 
    SAZ should emphasize population that spends a substantial portion of 
    time within the zone to reflect exposure from multiple spatial 
    locations, but does not need to account for all day-night population 
    shifts. Consequently, large MSA's with population greater than one 
    million should be subdivided into smaller portions, such as 
counties, to better reflect the variability in average exposure 
across large numbers of people.
        2.8.1.5  A SAZ can be represented by a single monitoring 
    location, but in most cases multiple locations will be needed. For 
    example, a single monitor may not be adequate to characterize the 
    average air quality in a large geographic area; in large areas of 
    relatively low population or population density, population centers 
    and monitoring sites may be geographically disjoint. In such cases, 
    the spatial representativeness of the monitoring site should be 
    considered in defining the SAZ boundaries. Until more monitoring 
    stations are established, the monitored air quality in areas outside 
    of SAZ's is unknown. Accordingly, a station that is established in 
    the ambient air outside the boundaries of a SAZ but that is in or 
    near a populated area, meets siting criteria, and produces quality-
assured data (i.e., meets the requirements of Part 58, Sec. 58.13, and 
    Appendices A, C, and E) can also be presumed to produce data that is 
    eligible for comparison to both the 24-hour and annual NAAQS for 
    PM2.5 and to represent some zone. At the discretion of the 
    responsible air pollution control agency, such a zone should be 
    defined as a SAZ during the annual network review. In this way, the 
    network coverage of the population can be gradually improved.
    
    2.8.2  PM2.5 Monitoring Sites within the State PM Monitoring Plan
    
        2.8.2.0.1  The minimum required number and type of monitoring 
    sites and sampling requirements for PM2.5 are based on 
    monitoring planning areas and spatial averaging zones for each MPA, 
    which must be included in a monitoring plan and proposed by the 
    States in accordance with Sec. 58.20.
        2.8.2.0.2  As stated in Sec. 58.15, comparisons to the 
    PM2.5 NAAQS may be based on data from SPMs in addition to SLAMS 
    (including NAMS, core SLAMS and collocated PM2.5 sites at 
PAMS), which meet the requirements of part 58, Sec. 58.13, and appendices 
    A, C and E, which are population-oriented and which are included in 
    the monitoring plan. Figure 1 of this Appendix shows a conceptual 
    (Venn) diagram illustrating which PM2.5 sites in an MPA and SAZ 
    are eligible for comparison with the PM2.5 NAAQS. Special 
    purpose monitors which meet part 58 requirements will be exempt from 
    NAAQS comparisons with the PM2.5 NAAQS for 3 years following 
    promulgation of the PM2.5 NAAQS to encourage PM2.5 
    monitoring initially. After this time, however, any SPM which 
    records a violation of the PM2.5 NAAQS must be seriously 
    considered as a potential SLAMS site during the annual SLAMS network 
review in accordance with Sec. 58.25. If such an SPM is not 
established as a SLAMS, the agency must document in its annual 
report the technical basis for excluding it as a SLAMS.
    
    BILLING CODE 6560-50-P
    
    [[Page 65861]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.138
    
    
    
    BILLING CODE 6560-50-C
    
    [[Page 65862]]
    
    2.8.2.0.3  Figure 1 is intended to show the relationship of 
NAAQS-eligible sites to the entire monitoring network. Sites 
eligible for comparison to both standards and to only the daily (i.e., 
24-hour) standard are shown. The diagram applies to all the sites in 
a Monitoring Planning Area, including special purpose and industrial 
sites as well as the NAMS/SLAMS/core networks. The sub-areas shown do not 
    necessarily represent contiguous geographic regions.
        2.8.2.0.4  All sites eligible for PM2.5 NAAQS comparisons 
    would be designated ``B'' or ``D'', and all other sites would be 
    designated ``O.'' Sites ``B'' and ``D'' must be NAMS/SLAMS or other 
    population-oriented sites, be included in the State's Monitoring 
Plan, and meet the requirements of Part 58, Sec. 58.13, and Appendices A, C and E. 
The codes ``B,'' ``D'' and ``O'' would become new pollutant-specific 
codes on the AIRS monitoring site file to identify PM2.5 sites 
    eligible for NAAQS comparisons. The codes could distinguish between 
    State submitted codes and those receiving EPA Regional Office 
    approval (as currently done with Exceptional Event data codes). This 
    will reflect EPA review and approval of the site information 
    presented in the State's annual Monitoring Plan.
        2.8.2.0.5  Within each MPA and SAZ, the responsible air 
    pollution control agency shall install core SLAMS, other required 
SLAMS, and as many additional PM2.5 stations as are judged necessary to satisfy 
the SLAMS requirements and monitoring objectives of this appendix.
        2.8.2.1  Core Monitoring Stations for PM2.5
        Core monitoring stations or sites are a subset of the SLAMS 
    network for PM2.5 for which more frequent (daily) sampling of 
PM2.5 is required. These core sites fall into three categories: 
population-oriented SLAMS monitors, background and transport 
sites, and sites to be collocated at PAMS.
        2.8.2.1.2  Within each monitoring planning area, the responsible 
    air pollution control agency shall install:
    (a) At least two population-oriented core stations for 
PM2.5, unless exempted by the Regional Administrator, including 
at least one station in a population-oriented area of expected 
maximum concentration;
    (b) At least one station in an area of poor air quality and 
representative of maximum population impact; and
    (c) At least one additional core monitor collocated at a PAMS 
site if the MPA is also a PAMS area.\2\
    ---------------------------------------------------------------------------
    
        \2\ The core monitor to be collocated at a PAMS site shall not 
    be considered a part of the PAMS as described in section 4 of this 
    appendix, but shall instead be considered to be a component of the 
particular MPA PM2.5 network.
    ---------------------------------------------------------------------------
    
        2.8.2.1.3  The site situated in the area of expected maximum 
concentration is analogous to NAMS ``category a.'' \3\ This will 
henceforth be termed a category a core SLAMS site. The site located 
in the area of poor air quality with high population density or 
representative of maximum population impact is analogous to NAMS 
``category b.'' \4\ This second site will be called a category b 
    core SLAMS site.
    ---------------------------------------------------------------------------
    
        \3\ The measured maximum concentrations at core population-
    oriented sites should be consistent with the averaging time of the 
    NAAQS. Therefore, sites only with high concentrations for shorter 
    averaging times (say 1-hour) should not be core SLAMS monitors and 
    may in fact be more appropriately designated special purpose 
    monitors.
        \4\ Population-oriented sites are representative of residential, 
    recreational and business locations where people are present for a 
    substantial portion of the NAAQS averaging time period or locations 
    indicative of ambient air to which the population can be expected to 
    be exposed.
    ---------------------------------------------------------------------------
    
    2.8.2.1.4  Those MPA's which are substantially impacted by 
    several different and geographically disjoint local sources of fine 
    particles should have separate core sites to monitor each 
    influencing source region.
        2.8.2.1.5  Each spatial averaging zone in a required MPA shall 
    have at least one core monitor; the SAZ for an optional MPA should 
    have at least one core monitor; and there should be one core site 
    for each SAZ with four or more SLAMS. Rural MPA's and areas with 
dispersed towns and small cities may have a single core station per 
    MPA but may have additional PM2.5 stations of other categories.
        2.8.2.1.6  The State shall also install at least one core SLAMS 
    to monitor for regional background and at least one core SLAMS to 
    monitor regional transport. These core monitoring stations may be 
population-oriented, and their requirement may be satisfied by a 
corresponding core monitoring station in a representative area having 
    similar air quality in another State.
        2.8.2.1.7  Within each monitoring planning area, one core 
    monitor may be exempted by the Regional Administrator. This may be 
    appropriate in areas where the highest concentration is expected to 
    occur at the same location as the area of maximum or sensitive 
    population impact, or areas with low concentrations (e.g. highest 
    concentrations are less than 80 percent of the NAAQS). When only one 
population-oriented core monitor for PM2.5 may be included in an 
MPA/SAZ, however, a ``category b'' core site is strongly preferred to 
    determine representative PM2.5 concentrations in areas of high 
    population density.
        2.8.2.1.8  A subset of the core PM2.5 SLAMS shall be 
    designated NAMS as discussed in section 3.7 of this appendix. The 
    selection of core monitoring sites in relation to MPA's and SAZs is 
    discussed further in section 2.8.3 of this appendix.
        2.8.2.2.  Other PM2.5 SLAMS locations
        In addition to the required core sites described in section 
    2.8.2.1 of this appendix, the State shall also be required to 
    establish a minimum number of additional SLAMS. The number of 
    stations shall be based on the total population outside the 
    monitoring planning areas which contain population-oriented core 
    SLAMS. There shall be one such additional SLAMS for each 250,000 
people. These monitors are in addition to the core SLAMS 
    required for monitoring planning areas. This may be satisfied, in 
    part, by the regional background and regional transport core SLAMS 
    if the latter sites are population-oriented. The minimum number of 
    SLAMS may be developed anywhere in the State to satisfy the SLAMS 
    monitoring objectives described in Section 1 of this appendix. Other 
    SLAMS may also be established and are encouraged in a State 
    PM2.5 network.
        2.8.2.3  Continuous fine particle monitoring at Core SLAMS
        At least one continuous fine particle analyzer (e.g., beta 
attenuation analyzer; tapered-element oscillating microbalance 
(TEOM); transmissometer; nephelometer; or other acceptable 
    continuous fine particle monitor) shall be located at a core 
    monitoring PM2.5 site in each metropolitan area with a 
    population greater than 1 million. The analyzer shall preferably 
    sample the ambient air of the same spatial averaging zone as a 
    category (b) core SLAMS. These analyzers shall be used to provide 
    improved temporal resolution to better understand the processes and 
    causes of elevated PM2.5 concentrations and to facilitate 
    public reporting of PM2.5 air quality. The methodology and QA/
    QC requirements will be provided in supplementary EPA guidance.
        2.8.2.4  Additional PM2.5 Analysis Requirements
        Air pollution control agencies shall archive PM2.5 filters 
    from all SLAMS sites for a minimum of one year after collection. All 
    PM2.5 filters from core NAMS sites shall be archived for a 
    minimum of 5 years. These filters shall be made available for 
    supplemental analyses at the request of EPA or to provide 
    information to State and local agencies on the composition and 
    trends for PM2.5. The filters shall be archived in accordance 
    with EPA guidance.
        2.8.3  Selection of Monitoring locations within SAZs and MPA's
        2.8.3.1  Figure 2 of this appendix illustrates a hypothetical 
    monitoring planning area and shows the location of monitors in 
    relation to population and areas of poor air quality. Figure 3 of 
    this appendix shows the same hypothetical MPA as Figure 2 of this 
    appendix and illustrates potential spatial averaging zones and the 
    location of core monitoring sites within them. Figure 4 of this 
    appendix illustrates which sites within the SAZs of the same MPA may 
    be used for comparison to the PM2.5 NAAQS.
    
    BILLING CODE 6560-50-P
    
    [[Page 65863]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.139
    
    
    
    BILLING CODE 6560-50-C
    
    [[Page 65864]]
    
        2.8.3.2  In Figure 2 of this appendix, a hypothetical monitoring 
planning area is shown representing a typical Eastern U.S. urban 
area. The ellipses represent zones with relatively high population 
    and poor air quality, respectively. Concentration isopleths are also 
    depicted. The highest population density is indicated by the urban 
    icons, while the area of worst air quality is presumed to be near 
    the industrial symbols. Each monitoring planning area is required to 
    have at least two core population-oriented monitors (with PAMS areas 
    requiring three) and may have as many other SLAMS and SPMS as 
    necessary. All SLAMS should generally be population-oriented, while 
    the SPMs can focus more on other monitoring objectives, e.g. 
    identifying source impacts and the area boundaries with maximum 
concentration. ``Ca'' denotes a ``category a'' core SLAMS site 
(population-oriented site in an area of expected maximum concentration); 
it is shown within the populated area and closest to the area with highest 
concentration. ``Cb'' denotes a ``category b'' core SLAMS site 
    (area of poor air quality with high population density or 
    representative of maximum population impact); it is shown in the 
    area of poor air quality, closest to highest population density. 
    ``S'' denotes other SLAMS sites (monitoring for any objective: max 
concentration, population exposure, source-oriented, background, 
regional transport, or in support of secondary NAAQS). Finally, 
``p'' denotes a Special Purpose Monitor (a specialized monitor which 
    may use a non-reference sampler).
        2.8.3.3  A Monitoring Planning Area would have one or more 
    Spatial Averaging Zones (SAZ) for aggregation of data for comparison 
    to the annual NAAQS. The planning area has large gradients of 
average air quality and, as shown in Figure 3, is assigned 3 SAZ's: an 
    industrial zone, a downtown central business district (CBD) and a 
    residential area. (If there is not a large difference between 
    downtown concentrations and other residential areas, a separate CBD 
    zone would not be necessary). If a required Monitoring Planning Area 
    has multiple SAZ's, then each SAZ must have at least one core 
    location. Therefore, in this example with 3 SAZ's, the MPA must have 
at least one additional core site (i.e., one SLAMS in the downtown 
    CBD must be a core site).
    
    BILLING CODE 6560-50-P
    
    [[Page 65865]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.140
    
    
    
    BILLING CODE 6560-50-C
    
    [[Page 65866]]
    
    2.8.3.4  Figure 4 of this appendix shows the 
designation of monitoring sites according to the NAAQS with 
which comparisons are permitted. Note that site type ``B'' can be 
core SLAMS, other SLAMS, or SPM's; ``D'' sites may be SLAMS or SPM's. Within the 
    residential zone, all monitors shown represent areawide air quality 
and can be averaged for comparison to the annual PM2.5 NAAQS 
and also be used for comparison to the daily PM2.5 standard. 
    In the downtown CBD, one site is a local ``hot spot,'' used for 
    comparison to the daily NAAQS only. The other site is typical of the 
    CBD and can by itself represent this zone for comparison to the 
    annual NAAQS. In this example area, the State might need to further 
subdivide the CBD into additional sub-zones if concentration 
gradients are large or are associated with large areas/populations 
(e.g., Madison Avenue in New York City with diesel buses). Then one or more sites 
    in each sub-zone would be averaged and be eligible for comparison to 
the annual NAAQS. In the industrial zone shown, three sites 
are averaged for comparison to the annual NAAQS and are also used 
individually for comparison to the daily NAAQS. One additional site is 
used for comparison to the daily standard only, and the 
remaining two special study sites shown either do not satisfy Part 
58 requirements or are not in the Monitoring Plan and therefore are 
not eligible for comparison to either PM2.5 NAAQS. One of the 
sites identified as ``B'' was an SPM. Finally, note that all SPM's 
    would be subject to the 3-year moratorium against data comparison to 
    the NAAQS.
    
    BILLING CODE 6560-50-P
    
    [[Page 65867]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.141
    
    
    
    BILLING CODE 6560-50-C
    
    [[Page 65868]]
    
        2.8.3.5   Figure 5 of this appendix illustrates how potential 
    SAZs and PM2.5 monitors might be located in a hypothetical MPA 
    typical of a Western State. Figure 6 of this appendix shows how the 
    MPA's, SAZs, and PM2.5 monitors might be distributed within a 
    hypothetical State. Western States with more localized sources of PM 
    and larger geographic area could require a different mix of SLAMS 
    and SPM monitors and may need more spatial averaging areas. Figure 5 
    of this appendix illustrates a monitoring planning area for a 
    hypothetical western State in which ``B's'' and ``D's'' represent 
the sites which are eligible for comparison to both NAAQS or to the 
    daily NAAQS only. Triangles are other special study sites. Spatial 
    averaging zones are shown by shaded areas. As the networks are 
    deployed, the available monitors may not be sufficient to completely 
    represent all geographic portions of the Monitoring Planning Area. 
    Due to the distribution of pollution and population and because of 
    the number and spatial representativeness of monitors, the MPA's and 
    SAZ's may not cover the entire State. NAAQS are indicated by ``X.'' 
    The appropriate monitors within an SAZ would be averaged for 
    comparison to the annual NAAQS and examined individually for 
    comparison to the daily NAAQS. Other monitors are only eligible for 
    comparison to the daily NAAQS. Both within the MPA's and in the 
    remainder of the State, some special study monitors might not 
    satisfy applicable part 58 requirements or will not be included in 
    the State Monitoring Plan and will not be eligible for comparison to 
    the NAAQS. The latter may include SLAMS monitors designated to study 
    regional transport or to support secondary NAAQS in unpopulated 
    areas.
    
    BILLING CODE 6560-50-P
    
    [[Page 65869]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.142
    
    
    
    [[Page 65870]]
    
    [GRAPHIC] [TIFF OMITTED] TP13DE96.143
    
    
    
    BILLING CODE 6560-50-C
    
    [[Page 65871]]
    
        2.8.4   Substitute PM Monitoring Sites
        2.8.4.1  Appendix C (section 2.2) to part 58 describes 
conditions under which PM10 samplers may be used as 
substitutes for PM2.5 samplers and when such PM10 samplers must 
be replaced with PM2.5 samplers. Analogous rules are described 
for TSP samplers, which can be used as substitutes for PM10. This 
    provision is intended to be used when PM concentrations are low and 
    substitute samplers can be used to satisfy the minimum number of PM 
    samplers needed for an adequate PM network. This may be most 
    appropriate when sufficient resources to purchase new PM samplers 
    may not exist and existing samplers can be temporarily used to serve 
    a new PM network.
    2.8.4.2  Monitoring sites at which PM10 samplers are 
intended to be used as substitute PM2.5 samplers must be 
identified in the PM monitoring plan. In order for a PM10 
sampler to be used as a substitute for PM2.5, the existing 
PM10 sampler must meet the quality assurance requirements of 
appendix A of this part and the siting requirements of appendix E of 
this part, must be located in an area of suspected maximum 
concentration as described in section 3 of this appendix, and must 
measure PM10 levels below the ambient PM2.5 standards. 
Analogous requirements apply to the substitution of TSP samplers for PM10. 
    Moreover, if existing TSP sites satisfy these criteria, the TSP 
    samplers may continue to be used as substitutes for PM10 SLAMS 
    samplers under the provisions of section 2.2 of Appendix C of this 
    part.
    2.8.4.3  If data produced by substitute PM samplers exceed the 
    concentration levels described in Appendix C of this part, then this 
    sampler shall be converted to a PM10 or PM2.5 sampler, 
    whichever is indicated. If the State does not believe that a 
    PM10 or PM2.5 sampler should alternatively be sited in a 
    different location, the State shall submit documentation to EPA as 
    part of its annual PM report to justify this decision. If a PM site 
    is not designated as a substitute site in the PM monitoring plan, 
    then high concentrations at this site would not necessarily cause 
    this site to become a PM10 site.
    2.8.4.4  Consistent with Sec. 58.1, combinations of SLAMS 
    PM10 or PM2.5 monitors and other monitors may occupy the 
    same structure without any mutual effect on the regulatory 
    definition of the monitors.
    
        25. Section 3 is amended by revising the third and fifth paragraphs 
    to read as follows:
    
    3. Network Design for National Air Monitoring Stations (NAMS)
    
    * * * * *
        Category (a): Stations located in area(s) of expected maximum 
    concentrations (generally microscale for CO, microscale or middle 
scale for Pb, middle scale or neighborhood scale for population-
oriented particulate matter, urban or regional scale for Regional 
transport PM2.5, neighborhood scale for SO2 and NO2, 
and urban scale for O3).
    * * * * *
        For each MSA where NAMS are required, both categories of 
monitoring stations must be established. In the case of SO2, if 
only one NAMS is needed, then category (a) must be used. In the case 
of PM2.5, category (b) is strongly preferred. The analysis and 
    interpretation of data from NAMS should consider the distinction 
    between these types of stations as appropriate.
    * * * * *
    26. Section 3.7 is revised and sections 3.7.1 through 3.7.6.4 are 
    added to read as follows:
    
        3.7  Particulate Matter Design Criteria for NAMS
        3.7.1  Table 4 indicates the approximate number of permanent 
    stations required in MSA's to characterize national and regional 
    PM10 air quality trends and geographical patterns. The number 
    of PM10 stations in areas where MSA populations exceed 
    1,000,000 must be in the range from 2 to 10 stations, while in low 
    population urban areas, no more than two stations are required. A 
    range of monitoring stations is specified in Table 4 because sources 
    of pollutants and local control efforts can vary from one part of 
    the country to another and therefore, some flexibility is allowed in 
    selecting the actual number of stations in any one locale.
        3.7.2  Through promulgation of the NAAQS for PM2.5, the 
    number of PM10 SLAMS is expected to decrease, but requirements 
    to maintain PM10 NAMS remain in effect. The PM10 NAMS are 
    retained to provide trends data, to support national assessments and 
decisions, and in some cases to continue to demonstrate that the NAAQS 
for PM10 are maintained, as required under a State 
Implementation Plan.
        3.7.3  The PM2.5 NAMS shall be a subset of the core SLAMS 
    network. The PM2.5 NAMS are planned as long-term monitoring 
    stations concentrated in metropolitan areas. A target range of 200 
    to 300 stations shall be designated nationwide. The largest 
    metropolitan areas (those with a population greater than 
    approximately one million) shall have at least two PM2.5 NAMS 
    stations.
        3.7.4  The number of total PM2.5 NAMS per Region will be 
    based on recommendations of the EPA Regional Offices, in concert 
    with their State and local agencies, in accordance with the network 
    design goals described in sections 3.7.5 and 3.7.6 of this Appendix. 
    The selected stations should represent the range of conditions 
    occurring in the Regions and will consider factors such as total 
    number or type of sources, ambient concentrations of particulate 
    matter, and regional transport.
    3.7.5  The approach is intended to give State and local agencies 
    maximum flexibility while apportioning a limited national network. 
    By advancing a range of monitors per Region, EPA intends to balance 
    the national network with respect to geographic area and population. 
    Table 5 presents the target number of NAMS per Region to meet the 
    national goal of 200 to 300 stations. These numbers consider a 
    variety of factors such as Regional differences in metropolitan 
    population, population density, land area, sources of particulate 
    emissions, and the numbers of PM10 NAMS.
        3.7.6  Since emissions associated with the operation of motor 
    vehicles contribute to urban area particulate matter levels, 
    consideration of the impact of these sources must be included in the 
    design of the NAMS network, particularly in MSA's greater than 
500,000 population. In certain urban areas, particulate emissions 
from motor vehicle diesel exhaust currently are or are expected to be 
a significant contributor to ambient particulate matter levels. The 
    actual number of NAMS and their locations must be determined by EPA 
    Regional Offices and the State agencies, subject to the approval of 
    the Administrator as required by Sec. 58.32. The Administrator's 
approval is necessary to ensure that individual stations conform to 
    the NAMS selection criteria and that the network as a whole is 
    sufficient in terms of number and location for purposes of national 
    analyses.
    
                                 Table 4.--PM10 National Air Monitoring Station Criteria                            
                                        [Approximate Number of Stations per MSA]                                    
    ----------------------------------------------------------------------------------------------------------------
                                                                              High          Medium          Low     
                            Population category                          concentration  concentration  concentration
                                                                               (b)            (c)            (d)    
    ----------------------------------------------------------------------------------------------------------------
    >1,000,000.........................................................      6-10               4-8          2-4    
    500,000-1,000,000..................................................       4-8               2-4          1-2    
    250,000-500,000....................................................       3-4               1-2          0-1    
    100,000-250,000....................................................       1-2               0-1            0    
    ----------------------------------------------------------------------------------------------------------------
    
        3.7.6.1  Selection of urban areas and actual number of stations 
    per area will be jointly determined by EPA and the State agency.
        3.7.6.2  High concentration areas are those for which: Ambient 
    PM10 data show ambient
    
    [[Page 65872]]
    
    concentrations exceeding either PM10 NAAQS by 20 percent or 
    more.
        3.7.6.3  Medium concentration areas are those for which: Ambient 
PM10 data show ambient concentrations exceeding 80 
percent of either PM10 NAAQS.
        3.7.6.4  Low concentration areas are those for which: Ambient 
    PM10 data show ambient concentrations less than 80 percent of 
    the PM10 NAAQS.
    
    
               Table 5.--Goals for Number of PM2.5 NAMS by Region           
    ------------------------------------------------------------------------
                                                               Percent of   
            EPA region              Number of NAMS \1\       national total 
------------------------------------------------------------------------
    1.................................  15 to 20.........  6 to 8.          
    2.................................  20 to 30.........  8 to 12.         
    3.................................  20 to 25.........  8 to 10.         
    4.................................  35 to 50.........  14 to 20.        
    5.................................  35 to 50.........  14 to 20.        
    6.................................  25 to 35.........  10 to 14.        
    7.................................  10 to 15.........  4 to 6.          
    8.................................  10 to 15.........  4 to 6.          
    9.................................  25 to 40.........  10 to 16.        
    10................................  10 to 15.........  4 to 6.          
                                       -------------------------------------
        Total.........................  205-295..........  100.             
    ------------------------------------------------------------------------
    \1\ Each region will have one to three NAMS having the monitoring of    
      regional transport as a primary objective.                            
    
        27. Section 4.2 is amended by redesignating Figures 1 and 2 as 
    Figures 7 and 8.
        28. Section 5 is revised to read as follows:
    
    5. Summary
    
        Table 6 of this appendix shows by pollutant, all of the spatial 
    scales that are applicable for SLAMS and the required spatial scales 
    for NAMS. There may also be some situations, as discussed later in 
    appendix E of this part, where additional scales may be allowed for 
    NAMS purposes.
    
    
                       Table 6.--Summary of Spatial Scales for SLAMS and Required Scales for NAMS                   
    ----------------------------------------------------------------------------------------------------------------
                                                              Scales applicable for SLAMS                           
            Spatial scale        -----------------------------------------------------------------------------------
                                      SO2         CO          O3          NO2         Pb         PM10        PM2.5  
    ----------------------------------------------------------------------------------------------------------------
    Micro.......................                                                        
    Middle......................                                   
    Neighborhood................                                   
    Urban.......................                                          
    Regional....................                                                 
                                                                                                                    
                                                                                                                    
Scales required for NAMS                                                                                            
                                                                                                                    
    Micro.......................                                                     \1\  
    Middle......................                                                               
    Neighborhood................                                   
    Urban.......................                                                            \2\  
    Regional....................                                                                          \2\   
    ----------------------------------------------------------------------------------------------------------------
    \1\ Only permitted if representative of many such micro-scale environments.                                     
    \2\ Either urban or regional scale for regional transport sites.                                                
    
    29. Section 6 is amended by revising reference 18 to read as 
    follows:
    
    6. References
    
    * * * * *
        18. Network Design and Siting Criteria for PM2.5 prepared 
    for U.S. Environmental Protection Agency, Research Triangle Park, 
    NC. In preparation.
    
    30. Appendix E is amended by revising the heading of section 8, 
    adding a sentence to the last paragraph of section 8.1 to read as 
    follows, and in section 8.3 removing the term PM10 and adding in 
    its place ``PM.''
    
    Appendix E--Probe and Open Path Siting Criteria for Ambient Air 
    Quality Monitoring
    
    * * * * *
    
    8. Particulate Matter (PM10 and PM2.5)
    
        8.1  Vertical Placement
        * * * Although microscale stations are not the preferred spatial 
scale for PM2.5 sites, there are situations where microscale 
sites may be representative of several locations within an area where large 
segments of the population live or work (e.g., mid-town 
    Manhattan in New York City). In these cases, the sampler inlet for 
    such microscale PM2.5 stations must also be 2-7 meters above 
    ground level.
    
    Appendix F--[Amended]
    
    31. Appendix F is amended by redesignating section 2.7.3 as section 
    2.7.4 and adding a new section 2.7.3 to read as follows:
    
        2.7.3  Annual Summary Statistics. Annual arithmetic mean 
(µg/m\3\) as specified in appendix K of 40 CFR part 50. 
All daily PM-fine values above the level of the 24-hour PM-fine 
NAAQS and dates of occurrence. Sampling schedule used, such as once 
every 6 days, every day, etc. Number of 24-hour average 
    concentrations in ranges:
    
    ------------------------------------------------------------------------
                                                                  Number of 
                               Range                                values  
    ------------------------------------------------------------------------
0 to 15 (µg/m\3\).................................             
    16 to 30...................................................             
    31 to 50...................................................             
    51 to 70...................................................             
    71 to 90...................................................             
    91 to 110..................................................             
    Greater than 110...........................................             
    ------------------------------------------------------------------------
    
    [FR Doc. 96-31437 Filed 12-12-96; 8:45 am]
    BILLING CODE 6560-50-P
    
    
    
