02-59. Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies  

    AGENCY:

    Office of Management and Budget, Executive Office of the President.

    ACTION:

    Final guidelines.

    SUMMARY:

These final guidelines implement section 515 of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658). Section 515 directs the Office of Management and Budget (OMB) to issue government-wide guidelines that “provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies.” By October 1, 2002, agencies must issue their own implementing guidelines that include “administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency” that does not comply with the OMB guidelines. These final guidelines also reflect the changes OMB made, after receiving additional comment, to the “capable of being substantially reproduced” standard and related provisions (paragraphs V.3.B, V.9, and V.10), which OMB issued on September 28, 2001, on an interim final basis.

    DATES:

    Effective Date: January 3, 2002.

    FOR FURTHER INFORMATION CONTACT:

    Brooke J. Dickson, Office of Information and Regulatory Affairs, Office of Management and Budget, Washington, DC 20503. Telephone (202) 395-3785 or by e-mail to informationquality@omb.eop.gov.

    SUPPLEMENTARY INFORMATION:

In section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law 106-554; H.R. 5658), Congress directed the Office of Management and Budget (OMB) to issue, by September 30, 2001, government-wide guidelines that “provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by Federal agencies * * *” Section 515(b) goes on to state that the OMB guidelines shall:

    “(1) apply to the sharing by Federal agencies of, and access to, information disseminated by Federal agencies; and

    “(2) require that each Federal agency to which the guidelines apply—

    “(A) issue guidelines ensuring and maximizing the quality, objectivity, utility, and integrity of information (including statistical information) disseminated by the agency, by not later than 1 year after the date of issuance of the guidelines under subsection (a);

    “(B) establish administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency that does not comply with the guidelines issued under subsection (a); and

    “(C) report periodically to the Director—

“(i) the number and nature of complaints received by the agency regarding the accuracy of information disseminated by the agency; and

    “(ii) how such complaints were handled by the agency.”

Proposed guidelines were published in the Federal Register on June 28, 2001 (66 FR 34489). Final guidelines were published in the Federal Register on September 28, 2001 (66 FR 49718). The Supplementary Information to the final guidelines published in September 2001 provides background, the underlying principles OMB followed in issuing the final guidelines, and statements of intent concerning detailed provisions in the final guidelines.

In the final guidelines published in September 2001, OMB also requested additional comment on the “capable of being substantially reproduced” standard and the related definition of “influential scientific or statistical information” (paragraphs V.3.B, V.9, and V.10), which were issued on an interim final basis. The final guidelines published today discuss the public comments OMB received, the OMB response, and amendments to the final guidelines published in September 2001.

    In developing agency-specific guidelines, agencies should refer both to the Supplementary Information to the final guidelines published in the Federal Register on September 28, 2001 (66 FR 49718), and also to the Supplementary Information published today. We stress that the three “Underlying Principles” that OMB followed in drafting the guidelines that we published on September 28, 2001 (66 FR 49719), are also applicable to the amended guidelines that we publish today.

In accordance with section 515, OMB has designed the guidelines to help agencies ensure and maximize the quality, utility, objectivity and integrity of the information that they disseminate (meaning to share with, or give access to, the public). It is crucial that information Federal agencies disseminate meets these guidelines. In this respect, the fact that the Internet enables agencies to communicate information quickly and easily to a wide audience not only offers great benefits to society, but also increases the potential harm that can result from the dissemination of information that does not meet basic information quality guidelines. Recognizing the wide variety of information Federal agencies disseminate and the wide variety of dissemination practices that agencies have, OMB developed the guidelines with several principles in mind.

    First, OMB designed the guidelines to apply to a wide variety of government information dissemination activities that may range in importance and scope. OMB also designed the guidelines to be generic enough to fit all media, be they printed, electronic, or in other form. OMB sought to avoid the problems that would be inherent in developing detailed, prescriptive, “one-size-fits-all” government-wide guidelines that would artificially require different types of dissemination activities to be treated in the same manner. Through this flexibility, each agency will be able to incorporate the requirements of these OMB guidelines into the agency's own information resource management and administrative practices.

    Second, OMB designed the guidelines so that agencies will meet basic information quality standards. Given the administrative mechanisms required by section 515 as well as the standards set forth in the Paperwork Reduction Act, it is clear that agencies should not disseminate substantive information that does not meet a basic level of quality. We recognize that some government information may need to meet higher or more specific information quality standards than those that would apply to other types of government information. The more important the information, the higher the quality standards to which it should be held, for example, in those situations involving “influential scientific, financial, or statistical information” (a phrase defined in these guidelines). The guidelines recognize, however, that information quality comes at a cost. Accordingly, the agencies should weigh the costs (for example, including costs attributable to agency processing effort, respondent burden, maintenance of needed privacy, and assurances of suitable confidentiality) and the benefits of higher information quality in the development of information, and the level of quality to which the information disseminated will be held.

Third, OMB designed the guidelines so that agencies can apply them in a common-sense and workable manner. It is important that these guidelines do not impose unnecessary administrative burdens that would inhibit agencies from continuing to take advantage of the Internet and other technologies to disseminate information that can be of great benefit and value to the public. In this regard, OMB encourages agencies to incorporate the standards and procedures required by these guidelines into their existing information resources management and administrative practices rather than create new and potentially duplicative or contradictory processes. The primary example of this is that the guidelines recognize that, in accordance with OMB Circular A-130, agencies already have in place well-established information quality standards and administrative mechanisms that allow persons to seek and obtain correction of information that is maintained and disseminated by the agency. Under the OMB guidelines, agencies need only ensure that their own guidelines are consistent with these OMB guidelines, and then ensure that their administrative mechanisms satisfy the standards and procedural requirements in the new agency guidelines. Similarly, agencies may rely on their implementation of the Federal Government's computer security laws (formerly, the Computer Security Act, and now the computer security provisions of the Paperwork Reduction Act) to establish appropriate security safeguards for ensuring the “integrity” of the information that the agencies disseminate.

    In addition, in response to concerns expressed by some of the agencies, we want to emphasize that OMB recognizes that Federal agencies provide a wide variety of data and information. Accordingly, OMB understands that the guidelines discussed below cannot be implemented in the same way by each agency. In some cases, for example, the data disseminated by an agency are not collected by that agency; rather, the information the agency must provide in a timely manner is compiled from a variety of sources that are constantly updated and revised and may be confidential. In such cases, while agencies' implementation of the guidelines may differ, the essence of the guidelines will apply. That is, these agencies must make their methods transparent by providing documentation, ensure quality by reviewing the underlying methods used in developing the data and consulting (as appropriate) with experts and users, and keep users informed about corrections and revisions.

    Summary of OMB Guidelines

    These guidelines apply to Federal agencies subject to the Paperwork Reduction Act (44 U.S.C. chapter 35). Agencies are directed to develop information resources management procedures for reviewing and substantiating (by documentation or other means selected by the agency) the quality (including the objectivity, utility, and integrity) of information before it is disseminated. In addition, agencies are to establish administrative mechanisms allowing affected persons to seek and obtain, where appropriate, correction of information disseminated by the agency that does not comply with the OMB or agency guidelines. Consistent with the underlying principles described above, these guidelines stress the importance of having agencies apply these standards and develop their administrative mechanisms so they can be implemented in a common sense and workable manner. Moreover, agencies must apply these standards flexibly, and in a manner appropriate to the nature and timeliness of the information to be disseminated, and incorporate them into existing agency information resources management and administrative practices.

    Section 515 denotes four substantive terms regarding information disseminated by Federal agencies: quality, utility, objectivity, and integrity. It is not always clear how each substantive term relates—or how the four terms in aggregate relate—to the widely divergent types of information that agencies disseminate. The guidelines provide definitions that attempt to establish a clear meaning so that both the agency and the public can readily judge whether a particular type of information to be disseminated does or does not meet these attributes.

In the guidelines, OMB defines “quality” as the encompassing term, of which “utility,” “objectivity,” and “integrity” are the constituents. “Utility” refers to the usefulness of the information to the intended users. “Objectivity” focuses on whether the disseminated information is being presented in an accurate, clear, complete, and unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. “Integrity” refers to security—the protection of information from unauthorized access or revision, to ensure that the information is not compromised through corruption or falsification. OMB modeled the definitions of “information,” “government information,” “information dissemination product,” and “dissemination” on the longstanding definitions of those terms in OMB Circular A-130, but tailored them to fit into the context of these guidelines.

    In addition, Section 515 imposes two reporting requirements on the agencies. The first report, to be promulgated no later than October 1, 2002, must provide the agency's information quality guidelines that describe administrative mechanisms allowing affected persons to seek and obtain, where appropriate, correction of disseminated information that does not comply with the OMB and agency guidelines. The second report is an annual fiscal report to OMB (to be first submitted on January 1, 2004) providing information (both quantitative and qualitative, where appropriate) on the number, nature, and resolution of complaints received by the agency regarding its perceived or confirmed failure to comply with these OMB and agency guidelines.

    Public Comments and OMB Response

    Applicability of Guidelines. Some comments raised concerns about the applicability of these guidelines, particularly in the context of scientific research conducted by Federally employed scientists or Federal grantees who publish and communicate their research findings in the same manner as their academic colleagues. OMB believes that information generated and disseminated in these contexts is not covered by these guidelines unless the agency represents the information as, or uses the information in support of, an official position of the agency.

    As a general matter, these guidelines apply to “information” that is “disseminated” by agencies subject to the Paperwork Reduction Act (44 U.S.C. 3502(1)). See paragraphs II, V.5 and V.8. The definitions of “information” and “dissemination” establish the scope of the applicability of these guidelines. “Information” means “any communication or representation of knowledge such as facts or data * * *” This definition of information in paragraph V.5 does “not include opinions, where the agency's presentation makes it clear that what is being offered is someone's opinion rather than fact or the agency's views.”

    “Dissemination” is defined to mean “agency initiated or sponsored distribution of information to the public.” As used in paragraph V.8, “agency INITIATED * * * distribution of information to the public” refers to information that the agency disseminates, e.g., a risk assessment prepared by the agency to inform the agency's formulation of possible regulatory or other action. In addition, if an agency, as an institution, disseminates information prepared by an outside party in a manner that reasonably suggests that the agency agrees with the information, this appearance of having the information represent agency views makes agency dissemination of the information subject to these guidelines. By contrast, an agency does not “initiate” the dissemination of information when a Federally employed scientist or Federal grantee or contractor publishes and communicates his or her research findings in the same manner as his or her academic colleagues, even if the Federal agency retains ownership or other intellectual property rights because the Federal government paid for the research. To avoid confusion regarding whether the agency agrees with the information (and is therefore disseminating it through the employee or grantee), the researcher should include an appropriate disclaimer in the publication or speech to the effect that the “views are mine, and do not necessarily reflect the view” of the agency.

Similarly, as used in paragraph V.8., “agency * * * SPONSORED distribution of information to the public” refers to situations where an agency has directed a third-party to disseminate information, or where the agency has the authority to review and approve the information before release. Therefore, for example, if an agency through a procurement contract or a grant provides for a person to conduct research, and then the agency directs the person to disseminate the results (or the agency reviews and approves the results before they may be disseminated), then the agency has “sponsored” the dissemination of this information. By contrast, if the agency simply provides funding to support research, and it is the researcher (not the agency) who decides whether to disseminate the results and—if the results are to be released—who determines the content and presentation of the dissemination, then the agency has not “sponsored” the dissemination even though it has funded the research and even if the Federal agency retains ownership or other intellectual property rights because the Federal government paid for the research. To avoid confusion regarding whether the agency is sponsoring the dissemination, the researcher should include an appropriate disclaimer in the publication or speech to the effect that the “views are mine, and do not necessarily reflect the view” of the agency. On the other hand, subsequent agency dissemination of such information requires that the information adhere to the agency's information quality guidelines. In sum, these guidelines govern an agency's dissemination of information, but generally do not govern a third-party's dissemination of information (the exception being where the agency is essentially using the third-party to disseminate information on the agency's behalf). Agencies, particularly those that fund scientific research, are encouraged to clarify the applicability of these guidelines to the various types of information they and their employees and grantees disseminate.

    Paragraph V.8 also states that the definition of “dissemination” does not include “* * * distribution limited to correspondence with individuals or persons, press releases, archival records, public filings, subpoenas or adjudicative processes.” The exemption from the definition of “dissemination” for “adjudicative processes” is intended to exclude, from the scope of these guidelines, the findings and determinations that an agency makes in the course of adjudications involving specific parties. There are well-established procedural safeguards and rights to address the quality of adjudicatory decisions and to provide persons with an opportunity to contest decisions. These guidelines do not impose any additional requirements on agencies during adjudicative proceedings and do not provide parties to such adjudicative proceedings any additional rights of challenge or appeal.

    The Presumption Favoring Peer-Reviewed Information. As a general matter, in the scientific and research context, we regard technical information that has been subjected to formal, independent, external peer review as presumptively objective. As the guidelines state in paragraph V.3.b.i: “If data and analytic results have been subjected to formal, independent, external peer review, the information may generally be presumed to be of acceptable objectivity.” An example of a formal, independent, external peer review is the review process used by scientific journals.

Most comments approved of the prominent role that peer review plays in the OMB guidelines. Some comments contended that peer review was not accepted as a universal standard that incorporates an established, practiced, and sufficient level of objectivity. Other comments stated that the guidelines would be better clarified by making peer review one of several factors that an agency should consider in assessing the objectivity (and quality in general) of original research. In addition, several comments noted that peer review does not establish whether analytic results are capable of being substantially reproduced. In light of the comments, the final guidelines in new paragraph V.3.b.i qualify the presumption in favor of peer-reviewed information as follows: “However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance.”

We believe that transparency is important for peer review, and these guidelines set minimum standards for the transparency of agency-sponsored peer review. As we state in new paragraph V.3.b.i: “If data and analytic results have been subjected to formal, independent, external peer review, the information may generally be presumed to be of acceptable objectivity. However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance. If agency-sponsored peer review is employed to help satisfy the objectivity standard, the review process employed shall meet the general criteria for competent and credible peer review recommended by OMB-OIRA to the President's Management Council (9/20/01) (http://www.whitehouse.gov/omb/inforeg/oira_review-process.html), namely, ‘that (a) peer reviewers be selected primarily on the basis of necessary technical expertise, (b) peer reviewers be expected to disclose to agencies prior technical/policy positions they may have taken on the issues at hand, (c) peer reviewers be expected to disclose to agencies their sources of personal and institutional funding (private or public sector), and (d) peer reviews be conducted in an open and rigorous manner.’ ”

The importance of these general criteria for competent and credible peer review has been supported by a number of expert bodies. For example: “the work of fully competent peer-review panels can be undermined by allegations of conflict of interest and bias. Therefore, the best interests of the Board are served by effective policies and procedures regarding potential conflicts of interest, impartiality, and panel balance.” (EPA's Science Advisory Board Panels: Improved Policies and Procedures Needed to Ensure Independence and Balance, GAO-01-536, General Accounting Office, Washington, DC, June 2001, page 19.) As another example, “risk analyses should be peer-reviewed and accessible—both physically and intellectually—so that decision-makers at all levels will be able to respond critically to risk characterizations. The intensity of the peer reviews should be commensurate with the significance of the risk or its management implications.” (Setting Priorities, Getting Results: A New Direction for EPA, Summary Report, National Academy of Public Administration, Washington, DC, April 1995, page 23.)

    These criteria for peer reviewers are generally consistent with the practices now followed by the National Research Council of the National Academy of Sciences. In considering these criteria for peer reviewers, we note that there are many types of peer reviews and that agency guidelines concerning the use of peer review should tailor the rigor of peer review to the importance of the information involved. More generally, agencies should define their peer-review standards in appropriate ways, given the nature and importance of the information they disseminate.

Is Journal Peer Review Always Sufficient? Some comments argued that journal peer review should be adequate to demonstrate quality, even for influential information that can be expected to have major effects on public policy. OMB believes that this position overstates the effectiveness of journal peer review as a quality-control mechanism.

    Although journal peer review is clearly valuable, there are cases where flawed science has been published in respected journals. For example, the NIH Office of Research Integrity recently reported the following case regarding environmental health research:

“Based on the report of an investigation conducted by [XX] University, dated July 16, 1999, and additional analysis conducted by ORI in its oversight review, the US Public Health Service found that Dr. [X] engaged in scientific misconduct. Dr. [X] committed scientific misconduct by intentionally falsifying the research results published in the journal SCIENCE and by providing falsified and fabricated materials to investigating officials at [XX] University in response to a request for original data to support the research results and conclusions reported in the SCIENCE paper. In addition, PHS finds that there is no original data or other corroborating evidence to support the research results and conclusions reported in the SCIENCE paper as a whole.” (66 FR 52137, October 12, 2001).

Although such cases of falsification are presumably rare, there is a significant scholarly literature documenting quality problems with articles published in peer-reviewed journals. “In a [peer-reviewed] meta-analysis that surprised many—and some doubt—researchers found little evidence that peer review actually improves the quality of research papers.” (See, e.g., Science, Vol. 293, page 2187 (September 21, 2001).) In part for this reason, many agencies have already adopted peer review and science advisory practices that go beyond journal peer review. See, e.g., Sheila Jasanoff, The Fifth Branch: Science Advisers as Policy Makers, Cambridge, MA, Harvard University Press, 1990; Mark R. Powell, Science at EPA: Information in the Regulatory Process, Resources for the Future, Washington, DC, 1999, pages 138-139, 151-153; Implementation of the Environmental Protection Agency's Peer Review Program: An SAB Evaluation of Three Reviews, EPA-SAB-RSAC-01-009, A Review of the Research Strategies Advisory Committee (RSAC) of the EPA Science Advisory Board (SAB), Washington, DC, September 26, 2001. For information likely to have an important public policy or private sector impact, OMB believes that additional quality checks beyond peer review are appropriate.

    Definition of “Influential”. OMB guidelines apply stricter quality standards to the dissemination of information that is considered “influential.” Comments noted that the breadth of the definition of “influential” in interim final paragraph V.9 requires much speculation on the part of agencies.

    We believe that this criticism has merit and have therefore narrowed the definition. In this narrower definition, “influential”, when used in the phrase “influential scientific, financial, or statistical information”, is amended to mean that “the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions.” The intent of the new phrase “clear and substantial” is to reduce the need for speculation on the part of agencies. We added the present tense—“or does have”—to this narrower definition because on occasion, an information dissemination may occur simultaneously with a particular policy change. In response to a public comment, we added an explicit reference to “financial” information as consistent with our original intent.

Given the differences in the many Federal agencies covered by these guidelines, and the differences in the nature of the information they disseminate, we also believe it will be helpful if agencies elaborate on this definition of “influential” in the context of their missions and duties, with due consideration of the nature of the information they disseminate. As we state in amended paragraph V.9, “Each agency is authorized to define ‘influential’ in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible.”

    Reproducibility. As we state in new paragraph V.3.b.ii: “If an agency is responsible for disseminating influential scientific, financial, or statistical information, agency guidelines shall include a high degree of transparency about data and methods to facilitate the reproducibility of such information by qualified third parties.” OMB believes that a reproducibility standard is practical and appropriate for information that is considered “influential”, as defined in paragraph V.9—that “will have or does have a clear and substantial impact on important public policies or important private sector decisions.” The reproducibility standard applicable to influential scientific, financial, or statistical information is intended to ensure that information disseminated by agencies is sufficiently transparent in terms of data and methods of analysis that it would be feasible for a replication to be conducted. The fact that the use of original and supporting data and analytic results have been deemed “defensible” by peer-review procedures does not necessarily imply that the results are transparent and replicable.

    Reproducibility of Original and Supporting Data. Several of the comments objected to the exclusion of original and supporting data from the reproducibility requirements. Comments instead suggested that OMB should apply the reproducibility standard to original data, and that OMB should provide flexibility to the agencies in determining what constitutes “original and supporting” data. OMB agrees and asks that agencies consider, in developing their own guidelines, which categories of original and supporting data should be subject to the reproducibility standard and which should not. To help in resolving this issue, we also ask agencies to consult directly with relevant scientific and technical communities on the feasibility of having the selected categories of original and supporting data subject to the reproducibility standard. Agencies are encouraged to address ethical, feasibility, and confidentiality issues with care. As we state in new paragraph V.3.b.ii.A, “Agencies may identify, in consultation with the relevant scientific and technical communities, those particular types of data that can practicably be subjected to a reproducibility requirement, given ethical, feasibility, or confidentiality constraints.” Further, as we state in our expanded definition of “reproducibility” in paragraph V.10, “If agencies apply the reproducibility test to specific types of original or supporting data, the associated guidelines shall provide relevant definitions of reproducibility (e.g. standards for replication of laboratory data).” OMB urges caution in the treatment of original and supporting data because it may often be impractical or even impermissible or unethical to apply the reproducibility standard to such data. For example, it may not be ethical to repeat a “negative” (ineffective) clinical (therapeutic) experiment and it may not be feasible to replicate the radiation exposures studied after the Chernobyl accident. When agencies submit their draft agency guidelines for OMB review, agencies should include a description of the extent to which the reproducibility standard is applicable and reflect consultations with relevant scientific and technical communities that were used in developing guidelines related to applicability of the reproducibility standard to original and supporting data.

    It is also important to emphasize that the reproducibility standard does not apply to all original and supporting data disseminated by agencies. As we state in new paragraph V.3.b.ii.A, “With regard to original and supporting data related [to influential scientific, financial, or statistical information], agency guidelines shall not require that all disseminated data be subjected to a reproducibility requirement.” In addition, we encourage agencies to address how greater transparency can be achieved regarding original and supporting data. As we also state in new paragraph V.3.b.ii.A, “It is understood that reproducibility of data is an indication of transparency about research design and methods and thus a replication exercise (i.e., a new experiment, test, or sample) shall not be required prior to each dissemination.” Agency guidelines need to achieve a high degree of transparency about data even when reproducibility is not required.

    Reproducibility of Analytic Results. Many public comments were critical of the reproducibility standard and expressed concern that agencies would be required to reproduce each analytical result before it is disseminated. While several comments commended OMB for establishing an appropriate balance in the “capable of being substantially reproduced” standard, others considered this standard to be inherently subjective. There were also comments that suggested the standard would cause more burden for agencies.

It is not OMB's intent that each agency must reproduce each analytic result before it is disseminated. The purpose of the reproducibility standard is to cultivate a consistent agency commitment to transparency about how analytic results are generated: the specific data used, the various assumptions employed, the specific analytical methods applied, and the statistical procedures employed. If sufficient transparency is achieved on each of these matters, then an analytic result should meet the “capable of being substantially reproduced” standard.
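    For illustration only, the transparency elements listed above (the specific data used, the assumptions employed, the analytic methods applied, and the statistical procedures employed) can be thought of as a structured record that accompanies an analytic result. The sketch below is a hypothetical example, not part of these guidelines; the field names and example values are assumptions chosen for illustration.

```python
# Hypothetical sketch (not part of the OMB guidelines): a minimal record of
# the transparency elements an agency might document alongside an analytic
# result. All field names and example values are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class AnalysisRecord:
    data_sources: List[str]            # the specific data sets used
    assumptions: List[str]             # the various assumptions employed
    analytic_methods: List[str]        # the specific analytical methods applied
    statistical_procedures: List[str]  # the statistical procedures employed

record = AnalysisRecord(
    data_sources=["hypothetical_monitoring_data_1999_2001.csv"],
    assumptions=["linear exposure-response relationship", "no exposure threshold"],
    analytic_methods=["Poisson regression of daily event counts on exposure"],
    statistical_procedures=["maximum likelihood estimation; 95% confidence intervals"],
)
print(record)
```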

    While there is much variation in types of analytic results, OMB believes that reproducibility is a practical standard to apply to most types of analytic results. As we state in new paragraph V.3.b.ii.B, “With regard to analytic results related [to influential scientific, financial, or statistical information], agency guidelines shall generally require sufficient transparency about data and methods that an independent reanalysis could be undertaken by a qualified member of the public. These transparency standards apply to agency analysis of data from a single study as well as to analyses that combine information from multiple studies.” We elaborate upon this principle in our expanded definition of “reproducibility” in paragraph V.10: “With respect to analytic results, ‘capable of being substantially reproduced’ means that independent analysis of the original or supporting data using identical methods would generate similar analytic results, subject to an acceptable degree of imprecision or error.”
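    Purely as an illustrative sketch of what “capable of being substantially reproduced” means in practice, the following hypothetical check re-runs a disclosed method on disclosed data and compares the re-derived result with the published result within a stated tolerance. The function names, data values, and tolerance are assumptions for illustration, not requirements of these guidelines.

```python
# Hypothetical sketch (not part of the OMB guidelines): an independent
# reanalysis using the disclosed data and the identical documented method,
# compared with the published result within an assumed tolerance.
import math

def documented_method(data):
    # The method disclosed with the dissemination: here, a simple mean.
    return sum(data) / len(data)

def substantially_reproduced(published_result, data, method, tolerance=0.05):
    # Re-run the disclosed method on the disclosed data and check whether the
    # re-derived result matches the published result within the tolerance.
    reanalysis = method(data)
    return math.isclose(reanalysis, published_result, rel_tol=tolerance)

original_data = [2.1, 1.9, 2.4, 2.0, 2.2]
published_estimate = 2.12
print(substantially_reproduced(published_estimate, original_data, documented_method))
```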

Even in a situation where the original and supporting data are protected by confidentiality concerns, or the analytic computer models or other research methods may be kept confidential to protect intellectual property, it may still be feasible to have the analytic results subject to the reproducibility standard. For example, a qualified party, operating under the same confidentiality protections as the original analysts, may be asked to use the same data, computer model or statistical methods to replicate the analytic results reported in the original study. See, e.g., “Reanalysis of the Harvard Six Cities Study and the American Cancer Society Study of Particulate Air Pollution and Mortality,” A Special Report of the Health Effects Institute's Particle Epidemiology Reanalysis Project, Cambridge, MA, 2000.

    The primary benefit of public transparency is not necessarily that errors in analytic results will be detected, although error correction is clearly valuable. The more important benefit of transparency is that the public will be able to assess how much an agency's analytic result hinges on the specific analytic choices made by the agency. Concreteness about analytic choices allows, for example, the implications of alternative technical choices to be readily assessed. This type of sensitivity analysis is widely regarded as an essential feature of high-quality analysis, yet sensitivity analysis cannot be undertaken by outside parties unless a high degree of transparency is achieved. The OMB guidelines do not compel such sensitivity analysis as a necessary dimension of quality, but the transparency achieved by reproducibility will allow the public to undertake sensitivity studies of interest.
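    The following hypothetical sketch illustrates the kind of sensitivity analysis that such transparency makes possible: the same quantity is recomputed under alternative analytic choices so that the public can see how much the result hinges on the choice actually made. The estimators and data shown are illustrative assumptions only.

```python
# Hypothetical sketch (not part of the OMB guidelines): recomputing a single
# statistic under alternative analytic choices to assess how sensitive the
# published result is to the choice of method. Data and estimators are
# illustrative assumptions.
import statistics

data = [2.1, 1.9, 2.4, 2.0, 2.2, 9.8]  # note one outlying observation

alternative_choices = {
    "mean (as published)": statistics.mean,
    "median": statistics.median,
    "trimmed mean (drop extremes)": lambda xs: statistics.mean(sorted(xs)[1:-1]),
}

for label, estimator in alternative_choices.items():
    # How much does the analytic result hinge on this particular choice?
    print(f"{label}: {estimator(data):.2f}")
```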

We acknowledge that confidentiality concerns will sometimes preclude public access as an approach to reproducibility. In response to public comment, we have clarified that such concerns do include interests in “intellectual property.” To ensure that the OMB guidelines have sufficient flexibility with regard to analytic transparency, OMB has, in new paragraph V.3.b.ii.B.i, provided agencies an alternative approach for classes or types of analytic results that cannot practically be subject to the reproducibility standard. “[In those situations involving influential scientific, financial, or statistical information * * * ] making the data and methods publicly available will assist in determining whether analytic results are reproducible. However, the objectivity standard does not override other compelling interests such as privacy, trade secrets, intellectual property, and other confidentiality protections.” Specifically, in cases where reproducibility will not occur due to other compelling interests, we expect agencies (1) to perform robustness checks appropriate to the importance of the information involved, e.g., determining whether a specific statistic is sensitive to the choice of analytic method, and, accompanying the information disseminated, to document their efforts to assure the needed robustness in information quality, and (2) to address in their guidelines the degree to which they anticipate the opportunity for reproducibility to be limited by the confidentiality of underlying data. As we state in new paragraph V.3.b.ii.B.ii, “In situations where public access to data and methods will not occur due to other compelling interests, agencies shall apply especially rigorous robustness checks to analytic results and document what checks were undertaken. Agency guidelines shall, however, in all cases, require a disclosure of the specific data sources that have been used and the specific quantitative methods and assumptions that have been employed.”

Given the differences in the many Federal agencies covered by these guidelines, and the differences in robustness checks and the level of detail for documentation thereof that might be appropriate for different agencies, we also believe it will be helpful if agencies elaborate on these matters in the context of their missions and duties, with due consideration of the nature of the information they disseminate. As we state in new paragraph V.3.b.ii.B.ii, “Each agency is authorized to define the type of robustness checks, and the level of detail for documentation thereof, in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible.”

    We leave the determination of the appropriate degree of rigor to the discretion of agencies and the relevant scientific and technical communities that work with the agencies. We do, however, establish a general standard for the appropriate degree of rigor in our expanded definition of “reproducibility” in paragraph V.10: “ ‘Reproducibility’ means that the information is capable of being substantially reproduced, subject to an acceptable degree of imprecision. For information judged to have more (less) important impacts, the degree of imprecision that is tolerated is reduced (increased).” OMB will review each agency's treatment of this issue when reviewing the agency guidelines as a whole.
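    Purely as an illustration of the inverse relationship described in paragraph V.10 between the importance of the information and the imprecision tolerated, the hypothetical sketch below maps importance tiers to numeric tolerances; the tiers and values are assumptions, not standards set by these guidelines.

```python
# Hypothetical sketch (not part of the OMB guidelines): tolerated imprecision
# scaled inversely with the judged importance of the information. The tiers
# and numeric tolerances are illustrative assumptions only.
TOLERANCE_BY_IMPORTANCE = {
    "routine": 0.10,      # more imprecision tolerated
    "significant": 0.05,
    "influential": 0.01,  # less imprecision tolerated
}

def acceptable(published, reanalyzed, importance):
    """Is the reanalysis within the tolerance assumed for this importance tier?"""
    tol = TOLERANCE_BY_IMPORTANCE[importance]
    return abs(reanalyzed - published) <= tol * abs(published)

print(acceptable(100.0, 104.0, "routine"))      # True: within 10 percent
print(acceptable(100.0, 104.0, "influential"))  # False: outside 1 percent
```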

Comments also expressed concerns regarding interim final paragraph V.3.B.iii, “making the data and models publicly available will assist in determining whether analytic results are capable of being substantially reproduced,” and whether it could be interpreted to constitute public dissemination of these materials, rendering moot the reproducibility test. (For the equivalent provision, see new paragraph V.3.b.ii.B.i.) The OMB guidelines do not require agencies to reproduce each disseminated analytic result by independent reanalysis. Thus, public dissemination of data and models per se does not mean that the analytic result has been reproduced. It means only that the result should be CAPABLE of being reproduced. The transparency associated with this capability of reproduction is what the OMB guidelines are designed to achieve.

We also want to build on a general observation that we made in our final guidelines published in September 2001. In those guidelines we stated: “... in those situations involving influential scientific[, financial,] or statistical information, the substantial reproducibility standard is added as a quality standard above and beyond some peer review quality standards” (66 FR 49722 (September 28, 2001)). A hypothetical example may serve to illustrate this point. Assume that two Federal agencies initiated or sponsored the dissemination of five scientific studies after October 1, 2002 (see paragraph III.4) that were, before dissemination, subjected to formal, independent, external peer review, i.e., that met the presumptive standard for “objectivity” under paragraph V.3.b.i. Further assume, at the time of dissemination, that neither agency reasonably expected that the dissemination of any of these studies would have “a clear and substantial impact” on important public policies, i.e., that these studies were not considered “influential” under paragraph V.9, and thus not subject to the reproducibility standards in paragraphs V.3.b.ii.A or B. Then assume, two years later, in 2005, that one of the agencies decides to issue an important and far-reaching regulation based clearly and substantially on the agency's evaluation of the analytic results set forth in these five studies and that such agency reliance on these five studies as published in the agency's notice of proposed rulemaking would constitute dissemination of these five studies. These guidelines would require the rulemaking agency, prior to publishing the notice of proposed rulemaking, to evaluate these five studies to determine if the analytic results stated therein would meet the “capable of being substantially reproduced” standards in paragraph V.3.b.ii.B and, if necessary, related standards governing original and supporting data in paragraph V.3.b.ii.A. If the agency were to decide that any of the five studies would not meet the reproducibility standard, the agency may still rely on them but only if they satisfy the transparency standard and—as applicable—the disclosure of robustness checks required by these guidelines. Otherwise, the agency should not disseminate any of the studies that did not meet the applicable standards in the guidelines at the time it publishes the notice of proposed rulemaking.

    Some comments suggested that OMB consider replacing the reproducibility standard with a standard concerning “confirmation” of results for influential scientific and statistical information. Although we encourage agencies to consider “confirmation” as a relevant standard—at least in some cases—for assessing the objectivity of original and supporting data, we believe that “confirmation” is too stringent a standard to apply to analytic results. Often the regulatory impact analysis prepared by an agency for a major rule, for example, will be the only formal analysis of an important subject. It would be unlikely that the results of the regulatory impact analysis had already been confirmed by other analyses. The “capable of being substantially reproduced” standard is less stringent than a “confirmation” standard because it simply requires that an agency's analysis be sufficiently transparent that another qualified party could replicate it through reanalysis.

    Health, Safety, and Environmental Information. We note, in the scientific context, that in 1996 the Congress, for health decisions under the Safe Drinking Water Act, adopted a basic standard of quality for the use of science in agency decisionmaking. Under 42 U.S.C. 300g-1(b)(3)(A), an agency is directed, “to the degree that an Agency action is based on science,” to use “(i) the best available, peer-reviewed science and supporting studies conducted in accordance with sound and objective scientific practices; and (ii) data collected by accepted methods or best available methods (if the reliability of the method and the nature of the decision justifies use of the data).”

We further note that in the 1996 amendments to the Safe Drinking Water Act, Congress adopted a basic quality standard for the dissemination of public information about risks of adverse health effects. Under 42 U.S.C. 300g-1(b)(3)(B), the agency is directed, “to ensure that the presentation of information on [risk] effects is comprehensive, informative, and understandable.” The agency is further directed, “in a document made available to the public in support of a regulation [to] specify, to the extent practicable—(i) each population addressed by any estimate [of applicable risk effects]; (ii) the expected risk or central estimate of risk for the specific populations [affected]; (iii) each appropriate upper-bound or lower-bound estimate of risk; (iv) each significant uncertainty identified in the process of the assessment of [risk] effects and the studies that would assist in resolving the uncertainty; and (v) peer-reviewed studies known to the [agency] that support, are directly relevant to, or fail to support any estimate of [risk] effects and the methodology used to reconcile inconsistencies in the scientific data.”

    As suggested in several comments, we have included these congressional standards directly in new paragraph V.3.b.ii.C, and made them applicable to the information disseminated by all the agencies subject to these guidelines: “With regard to analysis of risks to human health, safety and the environment maintained or disseminated by the agencies, agencies shall either adopt or adapt the quality principles applied by Congress to risk information used and disseminated pursuant to the Safe Drinking Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)).” The word “adapt” is intended to provide agencies flexibility in applying these principles to various types of risk assessment.

Comments also argued that the continued flow of vital information from agencies responsible for disseminating health and medical information to medical providers, patients, and the public may be disrupted due to these peer review and reproducibility standards. OMB responded by adding to new paragraph V.3.b.ii.C: “Agencies responsible for dissemination of vital health and medical information shall interpret the reproducibility and peer-review standards in a manner appropriate to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the public. Information quality standards may be waived temporarily by agencies under urgent situations (e.g., imminent threats to public health or homeland security) in accordance with the latitude specified in agency-specific guidelines.”

    Administrative Correction Mechanisms. In addition to commenting on the substantive standards in these guidelines, many of the comments noted that the OMB guidelines on the administrative correction of information do not specify a time period in which the agency investigation and response must be made. OMB has added the following new paragraph III.3.i to direct agencies to specify appropriate time periods in which the investigation and response need to be made. “Agencies shall specify appropriate time periods for agency decisions on whether and how to correct the information, and agencies shall notify the affected persons of the corrections made.”

    Several comments stated that the OMB guidelines needed to direct agencies to consider incorporating an administrative appeal process into their administrative mechanisms for the correction of information. OMB agreed, and added the following new paragraph III.3.ii: “If the person who requested the correction does not agree with the agency's decision (including the corrective action, if any), the person may file for reconsideration within the agency. The agency shall establish an administrative appeal process to review the agency's initial decision, and specify appropriate time limits in which to resolve such requests for reconsideration.” Recognizing that many agencies already have a process in place to respond to public concerns, it is not necessarily OMB's intent to require these agencies to establish a new or different process. Rather, our intent is to ensure that agency guidelines specify an objective administrative appeal process that, upon further complaint by the affected person, reviews an agency's decision to disagree with the correction request. An objective process will ensure that the office that originally disseminates the information does not have responsibility for both the initial response and resolution of a disagreement. In addition, the agency guidelines should specify that if the agency believes other agencies may have an interest in the resolution of any administrative appeal, the agency should consult with those other agencies about their possible interest.

    Overall, OMB does not envision administrative mechanisms that would burden agencies with frivolous claims. Instead, the correction process should serve to address the genuine and valid needs of the agency and its constituents without disrupting agency processes. Agencies, in making their determination of whether or not to correct information, may reject claims made in bad faith or without justification, and are required to undertake only the degree of correction that they conclude is appropriate for the nature and timeliness of the information involved, and explain such practices in their annual fiscal year reports to OMB.

OMB's issuance of these final guidelines is the beginning of an evolutionary process that will include draft agency guidelines, public comment, final agency guidelines, development of experience with OMB and agency guidelines, and continued refinement of both OMB and agency guidelines. Just as OMB requested public comment before issuing these final guidelines, OMB will refine these guidelines as experience develops and further public comment is obtained.

    Dated: December 21, 2001.

    John D. Graham,

    Administrator, Office of Information and Regulatory Affairs.

    Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies

    I. OMB Responsibilities

    Section 515 of the Treasury and General Government Appropriations Act for FY2001 (Public Law 106-554) directs the Office of Management and Budget to issue government-wide guidelines that provide policy and procedural guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information, including statistical information, disseminated by Federal agencies.

    II. Agency Responsibilities

    Section 515 directs agencies subject to the Paperwork Reduction Act (44 U.S.C. 3502(1)) to—

    1. Issue their own information quality guidelines ensuring and maximizing the quality, objectivity, utility, and integrity of information, including statistical information, disseminated by the agency no later than one year after the date of issuance of the OMB guidelines;

    2. Establish administrative mechanisms allowing affected persons to seek and obtain correction of information maintained and disseminated by the agency that does not comply with these OMB guidelines; and

    3. Report to the Director of OMB the number and nature of complaints received by the agency regarding agency compliance with these OMB guidelines concerning the quality, objectivity, utility, and integrity of information and how such complaints were resolved.

    III. Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies

    1. Overall, agencies shall adopt a basic standard of quality (including objectivity, utility, and integrity) as a performance goal and should take appropriate steps to incorporate information quality criteria into agency information dissemination practices. Quality is to be ensured and established at levels appropriate to the nature and timeliness of the information to be disseminated. Agencies shall adopt specific standards of quality that are appropriate for the various categories of information they disseminate.

    2. As a matter of good and effective agency information resources management, agencies shall develop a process for reviewing the quality (including the objectivity, utility, and integrity) of information before it is disseminated. Agencies shall treat information quality as integral to every step of an agency's development of information, including creation, collection, maintenance, and dissemination. This process shall enable the agency to substantiate the quality of the information it has disseminated through documentation or other means appropriate to the information.

    3. To facilitate public review, agencies shall establish administrative mechanisms allowing affected persons to seek and obtain, where appropriate, timely correction of information maintained and disseminated by the agency that does not comply with OMB or agency guidelines. These administrative mechanisms shall be flexible, appropriate to the nature and timeliness of the disseminated information, and incorporated into agency information resources management and administrative practices.

    i. Agencies shall specify appropriate time periods for agency decisions on whether and how to correct the information, and agencies shall notify the affected persons of the corrections made.

ii. If the person who requested the correction does not agree with the agency's decision (including the corrective action, if any), the person may file for reconsideration within the agency. The agency shall establish an administrative appeal process to review the agency's initial decision, and specify appropriate time limits in which to resolve such requests for reconsideration.

4. The agency's pre-dissemination review, under paragraph III.2, shall apply to information that the agency first disseminates on or after October 1, 2002. The agency's administrative mechanisms, under paragraph III.3, shall apply to information that the agency disseminates on or after October 1, 2002, regardless of when the agency first disseminated the information.

    IV. Agency Reporting Requirements

    1. Agencies must designate the Chief Information Officer or another official to be responsible for agency compliance with these guidelines.

2. The agency shall respond to complaints in a manner appropriate to the nature and extent of the complaint. Examples of appropriate responses include personal contacts via letter or telephone, form letters, press releases, or mass mailings that correct a widely disseminated error or address a frequently raised complaint.

    3. Each agency must prepare a draft report, no later than April 1, 2002, providing the agency's information quality guidelines and explaining how such guidelines will ensure and maximize the quality, objectivity, utility, and integrity of information, including statistical information, disseminated by the agency. This report must also detail the administrative mechanisms developed by that agency to allow affected persons to seek and obtain appropriate correction of information maintained and disseminated by the agency that does not comply with the OMB or the agency guidelines.

    4. The agency must publish a notice of availability of this draft report in the Federal Register, and post this report on the agency's website, to provide an opportunity for public comment.

5. Upon consideration of public comment and after appropriate revision, the agency must submit this draft report to OMB for review regarding consistency with these OMB guidelines no later than July 1, 2002. Upon completion of that OMB review and completion of this report, agencies must publish notice of the availability of this report in its final form in the Federal Register, and post this report on the agency's web site no later than October 1, 2002.

    6. On an annual fiscal-year basis, each agency must submit a report to the Director of OMB providing information (both quantitative and qualitative, where appropriate) on the number and nature of complaints received by the agency regarding agency compliance with these OMB guidelines and how such complaints were resolved. Agencies must submit these reports no later than January 1 of each following year, with the first report due January 1, 2004.

    V. Definitions

    1. “Quality” is an encompassing term comprising utility, objectivity, and integrity. Therefore, the guidelines sometimes refer to these four statutory terms, collectively, as “quality.”

    2. “Utility” refers to the usefulness of the information to its intended users, including the public. In assessing the usefulness of information that the agency disseminates to the public, the agency needs to consider the uses of the information not only from the perspective of the agency but also from the perspective of the public. As a result, when transparency of information is relevant for assessing the information's usefulness from the public's perspective, the agency must take care to ensure that transparency has been addressed in its review of the information.

    3. “Objectivity” involves two distinct elements, presentation and substance.

    a. “Objectivity” includes whether disseminated information is being presented in an accurate, clear, complete, and unbiased manner. This involves whether the information is presented within a proper context. Sometimes, in disseminating certain types of information to the public, other information must also be disseminated in order to ensure an accurate, clear, complete, and unbiased presentation. Also, the agency needs to identify the sources of the disseminated information (to the extent possible, consistent with confidentiality protections) and, in a scientific, financial, or statistical context, the supporting data and models, so that the public can assess for itself whether there may be some reason to question the objectivity of the sources. Where appropriate, data should have full, accurate, transparent documentation, and error sources affecting data quality should be identified and disclosed to users.

    b. In addition, “objectivity” involves a focus on ensuring accurate, reliable, and unbiased information. In a scientific, financial, or statistical context, the original and supporting data shall be generated, and the analytic results shall be developed, using sound statistical and research methods.

    i. If data and analytic results have been subjected to formal, independent, external peer review, the information may generally be presumed to be of acceptable objectivity. However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance. If agency-sponsored peer review is employed to help satisfy the objectivity standard, the review process employed shall meet the general criteria for competent and credible peer review recommended by OMB-OIRA to the President's Management Council (9/20/01) (http://www.whitehouse.gov/omb/inforeg/oira_review-process.html), namely, “that (a) peer reviewers be selected primarily on the basis of necessary technical expertise, (b) peer reviewers be expected to disclose to agencies prior technical/policy positions they may have taken on the issues at hand, (c) peer reviewers be expected to disclose to agencies their sources of personal and institutional funding (private or public sector), and (d) peer reviews be conducted in an open and vigorous manner.”

    ii. If an agency is responsible for disseminating influential scientific, financial, or statistical information, agency guidelines shall include a high degree of transparency about data and methods to facilitate the reproducibility of such information by qualified third parties.

    A. With regard to original and supporting data related thereto, agency guidelines shall not require that all disseminated data be subjected to a reproducibility requirement. Agencies may identify, in consultation with the relevant scientific and technical communities, those particular types of data that can practicably be subjected to a reproducibility requirement, given ethical, feasibility, or confidentiality constraints. It is understood that reproducibility of data is an indication of transparency about research design and methods and thus a replication exercise (i.e., a new experiment, test, or sample) shall not be required prior to each dissemination.

    B. With regard to analytic results related thereto, agency guidelines shall generally require sufficient transparency about data and methods that an independent reanalysis could be undertaken by a qualified member of the public. These transparency standards apply to agency analysis of data from a single study as well as to analyses that combine information from multiple studies.

    i. Making the data and methods publicly available will assist in determining whether analytic results are reproducible. However, the objectivity standard does not override other compelling interests such as privacy, trade secrets, intellectual property, and other confidentiality protections.

    ii. In situations where public access to data and methods will not occur due to other compelling interests, agencies shall apply especially rigorous robustness checks to analytic results and document what checks were undertaken. Agency guidelines shall, however, in all cases, require a disclosure of the specific data sources that have been used and the specific quantitative methods and assumptions that have been employed. Each agency is authorized to define the type of robustness checks, and the level of detail for documentation thereof, in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible.

    C. With regard to analysis of risks to human health, safety and the environment maintained or disseminated by the agencies, agencies shall either adopt or adapt the quality principles applied by Congress to risk information used and disseminated pursuant to the Safe Drinking Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). Agencies responsible for dissemination of vital health and medical information shall interpret the reproducibility and peer-review standards in a manner appropriate to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the public. Information quality standards may be waived temporarily by agencies under urgent situations (e.g., imminent threats to public health or homeland security) in accordance with the latitude specified in agency-specific guidelines.

    4. “Integrity” refers to the security of information—protection of the information from unauthorized access or revision, to ensure that the information is not compromised through corruption or falsification.

    5. “Information” means any communication or representation of knowledge such as facts or data, in any medium or form, including textual, numerical, graphic, cartographic, narrative, or audiovisual forms. This definition includes information that an agency disseminates from a web page, but does not include the provision of hyperlinks to information that others disseminate. This definition does not include opinions, where the agency's presentation makes it clear that what is being offered is someone's opinion rather than fact or the agency's views.

    6. “Government information” means information created, collected, processed, disseminated, or disposed of by or for the Federal Government.

    7. “Information dissemination product” means any book, paper, map, machine-readable material, audiovisual production, or other documentary material, regardless of physical form or characteristic, an agency disseminates to the public. This definition includes any electronic document, CD-ROM, or web page.

    8. “Dissemination” means agency initiated or sponsored distribution of information to the public (see 5 CFR 1320.3(d) (definition of “Conduct or Sponsor”)). Dissemination does not include distribution limited to government employees or agency contractors or grantees; intra- or inter-agency use or sharing of government information; and responses to requests for agency records under the Freedom of Information Act, the Privacy Act, the Federal Advisory Committee Act or other similar law. This definition also does not include distribution limited to correspondence with individuals or persons, press releases, archival records, public filings, subpoenas or adjudicative processes.

    9. “Influential”, when used in the phrase “influential scientific, financial, or statistical information”, means that the agency can reasonably determine that dissemination of the information will have or does have a clear and substantial impact on important public policies or important private sector decisions. Each agency is authorized to define “influential” in ways appropriate for it given the nature and multiplicity of issues for which the agency is responsible.

    10. “Reproducibility” means that the information is capable of being substantially reproduced, subject to an acceptable degree of imprecision. For information judged to have more (less) important impacts, the degree of imprecision that is tolerated is reduced (increased). If agencies apply the reproducibility test to specific types of original or supporting data, the associated guidelines shall provide relevant definitions of reproducibility (e.g., standards for replication of laboratory data). With respect to analytic results, “capable of being substantially reproduced” means that independent analysis of the original or supporting data using identical methods would generate similar analytic results, subject to an acceptable degree of imprecision or error.

    End Supplemental Information

    [FR Doc. 02-59 Filed 1-2-02; 8:45 am]

    BILLING CODE 3110-01-M
