AGENCY:
Census Bureau, Commerce.
ACTION:
Notice of information collection, request for comment.
SUMMARY:
The Department of Commerce, in accordance with the Paperwork Reduction Act (PRA) of 1995, invites the general public and other Federal agencies to comment on proposed and continuing information collections, which helps us assess the impact of our information collection requirements and minimize the public's reporting burden. The purpose of this notice is to allow for 60 days of public comment on the proposed revision of the American Community Survey Methods Panel Tests, prior to the submission of the information collection request (ICR) to the Office of Management and Budget (OMB) for approval.
DATES:
To ensure consideration, comments regarding this proposed information collection must be received on or before December 23, 2024.
ADDRESSES:
Interested persons are invited to submit written comments by email to acso.pra@census.gov. Please reference “American Community Survey Methods Panel Tests” in the subject line of your comments. You may also submit comments, identified by Docket Number USBC-2024-0027, to the Federal e-Rulemaking Portal: http://www.regulations.gov. Click the “Comment Now!” icon, complete the required fields, and enter or attach your comments. All comments received are part of the public record. No comments will be posted to https://www.regulations.gov for public viewing until after the comment period has closed. Comments will generally be posted without change. All Personally Identifiable Information (for example, name and address) voluntarily submitted by the commenter may be publicly accessible. Do not submit Confidential Business Information or otherwise sensitive or protected information. You may submit attachments to electronic comments in Microsoft Word, Excel, or Adobe PDF file formats.
FOR FURTHER INFORMATION CONTACT:
Requests for additional information or specific questions related to collection activities should be directed to G. Brian Wilson, U.S. Census Bureau, American Community Survey Office, 301-763-2819, George.Brian.Wilson@census.gov.
SUPPLEMENTARY INFORMATION:
I. Abstract
The American Community Survey (ACS) is an ongoing monthly survey that collects detailed social, economic, housing, and demographic data from about 3.5 million addresses in the United States and about 36,000 addresses in Puerto Rico each year (where it is called the Puerto Rico Community Survey). The ACS also collects detailed data from about 150,000 residents living in group quarters (GQ) facilities in the United States and Puerto Rico. Resulting tabulations from this data collection are provided on a yearly basis. The ACS allows the Census Bureau to provide timely and relevant social, economic, housing, and demographic statistics, even for low levels of geography.
An ongoing data collection effort with an annual sample of this magnitude requires that the Census Bureau continue research, tests, and evaluations aimed at improving data quality, reducing data collection costs, and improving the ACS questionnaire content and related data collection materials. The ACS Methods Panel is a research program at the Census Bureau designed to address and respond to survey issues and needs of the ACS. As part of the Decennial Census Program, the ACS also provides an opportunity to research and test elements of survey data collection that relate to the decennial census. As such, the ACS Methods Panel can serve as a testbed for the decennial census. From 2025 to 2028, the ACS Methods Panel may test ACS and decennial census methods for reducing survey cost, addressing respondent burden, and improving survey response, data quality, and survey efficiencies for housing units and group quarters. The ACS Methods Panel may also address other emerging needs of the program.
At this time, proposals are in place for several tests related to self-response. Tests may also be conducted for nonresponse follow-up data collection and other ACS operations. Because the ACS Methods Panel is designed to address emerging issues, we may propose additional testing as needed. Any testing would focus on methods for reducing data collection costs, improving data quality, improving the respondent experience, revising content, or testing new questions for the Decennial Census Program. The proposed tests are outlined below.
Questionnaire Timing Test: In an effort to boost self-response rates and decrease survey costs, the Questionnaire Timing Test will evaluate whether changing the timing of when the ACS paper questionnaire is sent to sampled addresses can increase self-response (overall and by data collection mode) and/or reduce data collection costs. The test will also evaluate the impact of including a Quick Response (QR) code directing respondents to the internet data collection instrument. If successful, adopting these changes could decrease data collection costs associated with the paper questionnaire and the Computer-Assisted Personal Interviewing (CAPI) nonresponse follow-up operation.
Internet Instrument Response Option and Error Message Design Test: This test will provide information to aid the development of web design standards for household and group quarters data collection instruments used throughout the Census Bureau. This test will focus on design standards related to response options and error messages to improve data quality and the response experience. The test for the response options will compare the use of standard radio buttons (the current design) to the use of response buttons, which have a border around the radio button and response option wording. The response buttons will highlight when hovered over and change to green once selected. This test will determine if these changes decrease response time, change response distributions, or affect item nonresponse. The test will also modify the error message design to explore how respondents react to a different display. Current error messages display at the top of the page within a box and use an exclamation mark and color to draw attention. For missing write-in fields, an arrow shows where the error occurred. This experiment will test a change in the colors used to draw attention to the error. Instead of an arrow showing where there is a missing write-in, a change in the write-in field's border will be used.
Additional Internet Instrument Testing: In 2013, the ACS incorporated the use of an internet instrument to collect survey responses. The design of the instrument reflected the research and standards of survey data collection at that time. With a growing population using the internet to respond to the ACS, as well as the increased use of smartphones and other electronic devices with smaller screens, an evaluation of the internet instrument is needed. Design elements will be developed and tested based on input from experts in survey methodology and web survey design. Testing may include revisions focused on improving login procedures and screen navigation, improving the user interface design, and decreasing respondent burden. Multiple tests may be conducted.
Self-Response Mail Messaging and Contact Strategies Testing: In response to declining ACS response rates and increasing data collection costs, the Census Bureau plans to study methods to increase self-response to the survey, as this mode of data collection is the least expensive. The Census Bureau currently sends up to five mailings to a sampled address to inform the occupants that their address has been selected to participate in the ACS and to encourage them to self-respond to the survey. The proposed tests would evaluate changes to the mailings, including using additional plain language to improve communication, redesigning the visual appearance of the mail materials, improving messaging to motivate response, and adding or removing materials included in the mailings. Changes to the contact method, the number of contacts, and the timing of the contacts may also be tested. Multiple tests may be conducted.
Content Testing: Working through the Office of Management and Budget Interagency Committee for the ACS, the Census Bureau will solicit proposals from other Federal agencies to change existing questions or add new questions to the ACS. The objective of content testing is to determine the impact of changing question wording and response categories, as well as redefining underlying constructs, on the quality of the data collected. The Census Bureau evaluates changes to current questions by comparing the revised questions to the current ACS questions. For new questions, the Census Bureau proposes comparing the performance of two versions of any new questions and benchmarking the results against other well-known sources of such information. The questions would be tested using all modes of data collection. Response bias or variance may also be measured to evaluate the questions by conducting a follow-up interview with respondents. Multiple tests may be conducted.
Nonresponse Follow-up Data Collection Testing: The Census Bureau is proposing to test modifications to nonresponse follow-up data collection operations to increase response to the survey. The proposed tests would evaluate changes to the materials used by ACS field representatives (FRs), including changes to the messaging to motivate response or changes to the types of materials used. Testing may also include evaluation of modifications to operational approaches and data collection procedures, such as contact methods and timing. Multiple tests may be conducted.
II. Method of Collection
The American Community Survey is collected via the following modes: internet, paper questionnaire, telephone interview, and in-person interview (CAPI). The Census Bureau sends up to five mailings to eligible housing units to encourage self-response. Respondents may receive help by using an Interactive Voice Response (IVR) system (though survey response cannot be provided by IVR). Respondents can also call our Telephone Questionnaire Assistance (TQA) help line for help or to respond. FRs may visit a housing unit or sampled GQ facility to conduct an interview in person or may conduct the interview by phone. Administrative records are also used to replace, supplement, and support data collection. The ACS Methods Panel Tests use all of these modes of data collection or a subset of the modes, depending on the purpose of the test. Specific modes for the tests are noted below.
Questionnaire Timing Test: This test will evaluate mailout materials, number of mailings, and the timing of mailouts that solicit self-response using paper questionnaire responses. The test will include housing units only.
Internet Instrument Response Option and Error Message Design Test: This test will assess modifications to the internet instrument conducted via a split-sample experiment. Only the internet mode of the self-response phase of data collection is included in the testing.
Additional Internet Instrument Testing: This testing will assess modifications to the internet instrument conducted via split-sample experiments. Only the internet mode of the self-response phase of data collection is included in the testing.
Self-Response Mail Messaging and Contact Strategies Testing: This testing will evaluate mailout materials that solicit self-response using internet, paper questionnaire, and telephone responses. Tests will be done as a split sample and will include housing units only.
Content Testing: This testing is for item-level changes and will be conducted as a split-sample experiment, with half of the sampled addresses receiving one version of the questions and the other half receiving a different version of the questions. All modes of ACS data collection are included in the test. Additionally, a follow-up reinterview may be conducted with all households that respond to measure response bias or response variance.
Nonresponse Follow-up Data Collection Testing: This testing will be done as a split sample focusing on in-person and telephone interviews conducted by FRs. As part of their interaction with respondents, FRs also encourage response online and provide materials to respondents. Respondents may also mail back a paper questionnaire they received during the self-response phase of the ACS.
III. Data
OMB Control Number: 0607-0936.
Form Number(s): ACS-1, ACS-1(GQ), ACS-1(PR)SP, ACS CAPI(HU), and ACS RI(HU).
Type of Review: Regular submission, Request for a Revision of a Currently Approved Collection.
Affected Public: Individuals or households.
Estimated Number of Respondents, Estimated Time per Response, and Total Burden Hours:
Test | Estimated number of respondents | Estimated time per response (in minutes) | Total burden hours |
---|---|---|---|
Questionnaire Timing Test | 288,000 | 40 | 192,000 |
Response Option and Error Message Design Test | 288,000 | 40 | 192,000 |
Additional Internet Instrument Testing | Test A—60,000; Test B—60,000 | 40 | 40,000; 40,000 |
Self-Response Mail Messaging and Contact Strategies Testing | Test A—60,000; Test B—60,000; Test C—60,000 | 40 | 40,000; 40,000; 40,000 |
Content Testing | Test A—40,000; Test B—40,000 | 40 | 26,667; 26,667 |
Content Testing Follow-up Interview | Test A—40,000; Test B—40,000 | 20 | 13,333; 13,333 |
Nonresponse Follow-up Data Collection Testing | 100,000 | 40 | 66,667 |
Total (over 3 years) * | 1,136,000 | | 730,667 |
Annual | 378,667 | | 243,556 |
* Note: This is the maximum burden requested for these tests. Every effort is taken to use existing production sample for testing when the tests do not involve content changes.
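
For readers checking the figures, each burden-hour entry is the respondent count multiplied by the per-response time and converted from minutes to hours, and the annual figures are the three-year totals divided by three. A minimal sketch of that arithmetic, using the figures from the table and assuming nearest-whole-number rounding:

```python
# Sketch of the burden arithmetic in the table above.
# Respondent counts and per-response times are taken from this notice;
# the nearest-whole-number rounding convention is an assumption.
subtests = [
    # (respondents, minutes per response)
    (288_000, 40),                              # Questionnaire Timing Test
    (288_000, 40),                              # Response Option and Error Message Design Test
    (60_000, 40), (60_000, 40),                 # Additional Internet Instrument Testing, Tests A and B
    (60_000, 40), (60_000, 40), (60_000, 40),   # Mail Messaging and Contact Strategies, Tests A, B, and C
    (40_000, 40), (40_000, 40),                 # Content Testing, Tests A and B
    (40_000, 20), (40_000, 20),                 # Content Testing Follow-up Interview, Tests A and B
    (100_000, 40),                              # Nonresponse Follow-up Data Collection Testing
]

total_respondents = sum(r for r, _ in subtests)             # 1,136,000 over 3 years
total_hours = sum(round(r * m / 60) for r, m in subtests)   # 730,667 total burden hours

print(f"{total_respondents:,} respondents, {total_hours:,} burden hours over 3 years")
print(f"{round(total_respondents / 3):,} respondents, {round(total_hours / 3):,} burden hours per year")
# Expected output:
# 1,136,000 respondents, 730,667 burden hours over 3 years
# 378,667 respondents, 243,556 burden hours per year
```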