Submission for OMB Review; Comment Request

    The Department of Commerce will submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. chapter 35).

    Agency: U.S. Census Bureau.

    Title: American Community Survey (ACS) Methods Panel, Online Communications Improving Survey Response Campaign.

    OMB Control Number: 0607-0936.

    Form Number(s): ACS-1, ACS-1 (Spanish), ACS CATI, ACS CAPI, ACS Internet.

    Type of Request: Nonsubstantive Change Request.

    Number of Respondents: None.

    Average Hours per Response: None.

    Burden Hours: No additional burden hours are requested under this nonsubstantive change request.

    Needs and Uses: The American Community Survey collects detailed socioeconomic data from about 3.5 million households in the United States and 36,000 in Puerto Rico each year. The ACS also collects detailed socioeconomic data from about 195,000 residents living in group quarters (GQ) facilities. An ongoing data collection effort with an annual sample of this magnitude requires that the ACS continue research, testing, and evaluations aimed at improving data quality, achieving survey cost efficiencies, and improving ACS questionnaire content and related data collection materials. The ACS Methods Panel is a research program designed to address these issues and survey needs. In line with the Census Bureau's goal of increasing survey response rates through communications, the Census Bureau seeks to launch a pilot of a targeted digital advertising campaign. During the 2000 and 2010 decennial enumerations, the Census Bureau saw an uptick in ACS response rates.[1] A year-over-year increase of 6.4 percentage points was observed in the Savannah, GA media market during the 2015 Census Site Test.[2]

    Outside of decennial years, traditional broad-based advertising methods are cost-prohibitive because of the relatively small sample size for most Census Bureau surveys compared to the general population. With the advent of digital advertising tactics, however, the Census Bureau now has the opportunity to cost-effectively deliver promotional messages to individual households within a survey sample. The ACS offers a large enough national sample to field a test of such tactics and determine whether they lift response rates. If digital advertisements encourage recipients to respond to a survey early in the data collection process, including responding online, then the Census Bureau will save money on costly follow-up efforts to collect data from nonrespondents, including sending Census Bureau interviewers to respondents' households in person. Offsetting data-collection costs in this way would ultimately save taxpayers money. Findings from this pilot campaign will have applications across the range of the Census Bureau's collection efforts, as advertisements will not be survey-specific and will focus on the value of the Census Bureau's work in general.

    We propose to execute the pilot campaign using the January and February 2017 ACS production samples. We will deliver targeted digital advertisements to a panel of in-sample residents whose household addresses can be linked to digital profiles (including cookies and/or device IDs) by a third-party data vendor. This technique is an emerging standard in online advertising, in line with the advertising households receive from companies and organizations every day. We will place video, display banner, and paid social media advertisements. Linked households will be served ads shortly before they receive a mailed survey questionnaire and during the ACS data collection process. Ads will not directly call on recipients to complete the ACS or any particular survey, nor will they mention any survey by name. Rather, they will be designed to create positive associations with the Census Bureau's work generally and make the case for the importance of completing a Census Bureau questionnaire if selected. When an advertisement is clicked, the user will be directed to a Census.gov web landing page featuring general information about the value of the Census Bureau's work and a link to the “Are You in a Survey?” page.[3]
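
    For illustration only, the sketch below (Python) shows the linkage step in miniature: in-sample addresses are matched to digital profiles supplied by a vendor, and only matched households become eligible for ad delivery. All field names and the match-file format are hypothetical; the notice does not specify the vendor's data layout.

```python
# In-sample addresses scheduled to receive the mailed questionnaire
# (field names are hypothetical).
sample_addresses = [
    {"address_id": "A001", "mail_date": "2017-01-03"},
    {"address_id": "A002", "mail_date": "2017-01-03"},
]

# Vendor match file: address IDs linked to cookies and/or device IDs.
vendor_matches = {
    "A001": ["cookie:abc123", "device:xyz789"],
}

# Only linked addresses are eligible for targeted advertising, served
# shortly before the questionnaire arrives and during data collection.
eligible = [
    {**addr, "profiles": vendor_matches[addr["address_id"]]}
    for addr in sample_addresses
    if addr["address_id"] in vendor_matches
]
print(eligible)  # household A001 is linkable; A002 is not
```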

    The purpose of this test is to study the impact of these advertisements on self-response behavior and to assess any potential savings overall or within subgroups. The advertisements will include a mix of online video, banner display ads, and paid social media content on both desktop and mobile devices. They will be displayed on various Web sites around the web, targeted to linked households in the treatment groups. Ad serving will be optimized based on audience reach and user engagement with the ads (measured in terms of video and click metrics). The optimal media mix will be applied evenly across both treatments. We will prioritize rich media placements, including video and social video, over standard placements such as banner display, with the goal of maximizing video advertising that tells a compelling story and raises awareness of the Census Bureau's work.
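
    The notice does not specify an optimization rule; as one minimal sketch of engagement-based ad serving, placements could be re-weighted toward the formats with higher measured engagement (all placement names and metrics below are hypothetical).

```python
# Hypothetical placement-level metrics; "engagement" here blends
# click-throughs and video completions per impression.
placements = {
    "social_video":   {"impressions": 50_000, "clicks": 600, "video_completes": 9_000},
    "banner_display": {"impressions": 80_000, "clicks": 240, "video_completes": 0},
}

def engagement_rate(m):
    return (m["clicks"] + m["video_completes"]) / m["impressions"]

# Re-weight the remaining budget proportionally to engagement; the same
# mix would be applied evenly across both treatment groups.
rates = {name: engagement_rate(m) for name, m in placements.items()}
total = sum(rates.values())
budget_share = {name: rate / total for name, rate in rates.items()}
print(budget_share)
```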

    This pilot will include two experimental treatments (a high-spend group and a low-spend group) as well as a control group. Households in the high-spend group will receive roughly twice the number of advertisement exposures as households in the low-spend group, though the channel mix and content of the advertisements will remain the same between the two groups. The control group will not receive any advertisements.
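
    The experimental design can be summarized as a simple configuration, shown below. The 2:1 exposure ratio and the shared creative mix come from this notice; the base exposure count is a hypothetical placeholder, since the notice does not state absolute exposure targets.

```python
BASE_EXPOSURES = 6  # hypothetical placeholder; the notice states only a ~2:1 ratio

arms = {
    "high_spend": {"target_exposures": 2 * BASE_EXPOSURES, "creative_mix": "shared"},
    "low_spend":  {"target_exposures": BASE_EXPOSURES,     "creative_mix": "shared"},
    "control":    {"target_exposures": 0,                  "creative_mix": None},
}

for arm, cfg in arms.items():
    print(arm, cfg["target_exposures"])
```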

    To field this test, we plan to use the ACS production sample (OMB clearance number 0607-0810, expires 06/30/2018). Thus, there is no increase in burden from this test, since the treatment will result in approximately the same burden estimate per interview (40 minutes). The ACS sample design randomly assigns each monthly sample panel into 24 groups of approximately 12,000 addresses each. Each group, called a methods panel group, within a monthly sample is representative of the full monthly sample. Each monthly sample is a representative subsample of the entire annual sample and is representative of the sampling frame.
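
    To make the sample design concrete, here is a minimal sketch of the random assignment of a monthly panel into 24 methods panel groups of approximately 12,000 addresses each. The group count and sizes come from this notice; the shuffle-and-deal approach shown is an illustrative assumption, not the Census Bureau's actual sampling procedure.

```python
import random

def assign_methods_panel_groups(address_ids, n_groups=24, seed=2017):
    """Randomly split a monthly ACS sample into equal methods panel
    groups, each intended as a representative subsample of the month."""
    rng = random.Random(seed)
    shuffled = list(address_ids)
    rng.shuffle(shuffled)
    # Deal shuffled addresses round-robin into n_groups groups.
    return {g: shuffled[g::n_groups] for g in range(n_groups)}

# A monthly panel of roughly 24 x 12,000 = 288,000 addresses.
monthly_sample = [f"addr_{i:06d}" for i in range(24 * 12_000)]
groups = assign_methods_panel_groups(monthly_sample)
print(len(groups), len(groups[0]))  # 24 groups of 12,000 addresses each
```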

    The test will include two months of production sample (aiming for January and February 2017). We will choose eight randomly selected methods panel groups per month for each of the two experimental treatments; the remaining eight methods panel groups will serve as the control. Over the two production months, each treatment will use 16 methods panel groups, or a mailout sample of roughly 192,000 addresses, which will be used for linking to establish eligibility for micro-targeted digital advertising. We estimate that approximately 31 percent of the mailable addresses will be eligible for digital advertising, which is approximately 30,000 addresses for each of the two experimental treatments per month.
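
    The sample-size arithmetic above can be checked directly; the sketch below uses only the figures stated in this notice (eight groups per treatment per month, roughly 12,000 addresses per group, and a 31 percent eligibility estimate).

```python
ADDRESSES_PER_GROUP = 12_000
GROUPS_PER_TREATMENT_PER_MONTH = 8
MONTHS = 2
ELIGIBLE_RATE = 0.31  # estimated share of addresses linkable to digital profiles

mailout_per_treatment = GROUPS_PER_TREATMENT_PER_MONTH * MONTHS * ADDRESSES_PER_GROUP
print(mailout_per_treatment)  # 192,000 addresses per treatment over two months

eligible_per_month = GROUPS_PER_TREATMENT_PER_MONTH * ADDRESSES_PER_GROUP * ELIGIBLE_RATE
print(round(eligible_per_month))  # ~29,760, i.e., roughly 30,000 per treatment per month
```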

    We will compare the Internet return rates at the cut date for the replacement mailing; the Internet, mail, and self-response return rates before the start of Computer Assisted Telephone Interviewing (CATI); and the Internet, mail, self-response, and CATI return rates prior to the start of Computer Assisted Personal Interviewing (CAPI). We will compare the self-response and CAPI return rates as well as the overall response rates when all data collection activities end. Additionally, the overall response rate will be calculated for all sample addresses. For each comparison, we will use α = 0.1 and a two-tailed test so that we can measure the impact on the evaluation measure in either direction with 80 percent power. Based on the previous year's data for the January and February panels, we calculated effective sample sizes. We assumed an Undeliverable as Addressed (UAA) rate of 18.0 percent (these addresses may be advertised to, but will be removed from self-response analysis because they do not have an opportunity to respond), a self-response rate of 57.5 percent for all three groups, a CATI response rate of 25 percent, and a CAPI response rate of 85 percent. We expect to be able to detect self-response differences of 0.8 percentage points between the high- and low-spend treatment panels, and differences on the order of 0.8 percentage points between a treatment panel and the control. Additional metrics of interest include overall costs and response rates by subgroups.
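
    As a back-of-the-envelope check on the stated detectable differences, the following sketch computes the minimum detectable effect for a two-tailed, two-sample comparison of proportions at α = 0.1 with 80 percent power. The effective per-treatment sample size used here (roughly 48,800 eligible, deliverable addresses over the two months) is our assumption, derived from the 192,000-address mailout, the 31 percent eligibility estimate, and the 18.0 percent UAA rate.

```python
from scipy.stats import norm

alpha, power = 0.10, 0.80
p = 0.575                  # assumed self-response rate in each group
n = 192_000 * 0.31 * 0.82  # eligible, deliverable addresses per arm (~48,800)

# Minimum detectable difference for a two-tailed two-proportion z-test:
#   MDE = (z_{1-alpha/2} + z_{power}) * sqrt(2 * p * (1 - p) / n)
z_alpha = norm.ppf(1 - alpha / 2)
z_power = norm.ppf(power)
mde = (z_alpha + z_power) * (2 * p * (1 - p) / n) ** 0.5
print(f"{mde:.4f}")  # ~0.0079, i.e., about 0.8 percentage points
```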

    Affected Public: Individuals or households.

    Frequency: One-time test as part of the monthly American Community Survey.

    Respondent's Obligation: Mandatory.

    Legal Authority: Title 13, United States Code, Sections 141, 193, and 221.

    This information collection request may be viewed at www.reginfo.gov. Follow the instructions to view Department of Commerce collections currently under review by OMB.

    Written comments and recommendations for the proposed information collection should be sent within 30 days of publication of this notice to OIRA_Submission@omb.eop.gov or fax to (202) 395-5806.

    Sheleen Dumas,

    PRA Departmental Lead, Office of the Chief Information Officer.

    Footnotes

    1. Chesnut, J., & Davis, M. (2011). “Evaluation of the ACS Mail Materials and Mailing Strategy during the 2010 Census.” American Community Survey Research and Evaluation Program. U.S. Census Bureau.

    2. Walejko, G., et al. (2015). “Modeling the Effect of Diverse Communication Strategies on Decennial Census Test Response Rates.” Presentation at the 2015 Federal Committee on Statistical Methodology Research Conference, December 2, 2015, Washington, DC.

    [FR Doc. 2016-23821 Filed 9-30-16; 8:45 am]

    BILLING CODE 3510-07-P
