Final Report of the Web and In-Person Cognitive Testing of Privacy and Confidentiality Respondent Messaging

Written by:
Working Paper Number rsm2018-15

Abstract

The Census Bureau is required by law to inform respondents about access to and protections of the data it collects from them. Required messages cover topics such as who has access to respondent data, what the data are used for, and how the data are kept confidential. These and other requirements are not only spelled out in various federal laws but are also consistent with the Census Bureau’s principles of openness and transparency. For example, the Paperwork Reduction Act (PRA) requires that we tell respondents the authority under which data are collected, the purpose of the survey, an estimate of burden, whether responses are voluntary or mandatory, the extent of confidentiality protection, an approval number from the Office of Management and Budget (OMB), and a statement that an agency may not conduct a collection without that approval number. A general review of the messages the Census Bureau presents to respondents to explain data access and confidentiality found them to be inconsistent across the decennial census and ongoing surveys and recommended research on options for this messaging.

In response, staff from the Center for Survey Measurement (CSM) conducted cognitive testing of the range of the Census Bureau’s respondent messaging concerning privacy and confidentiality. This research was designed to explore various ways of communicating the required description of access to data collected under Title 13, as well as other language required by the PRA. Testing was designed to identify messages that were clear to respondents and communicated the intended meaning, with the goal of standardizing the messages across Census Bureau collections.

The research was conducted in two stages, starting with a large online study exploring many possible options for this language, followed by a smaller-scale cognitive test of the options that seemed most viable and reliable based on findings from the larger study. Online data collection was conducted from late November to mid-December 2015. The follow-up cognitive testing was conducted from early February to late March 2016 with thirty participants interviewed in person. This report documents findings from both stages of the study. The final recommendations include language that is clear and easy for respondents to understand and that avoids vague and complex wording.
