2020 Census Program for Evaluations, Experiments, and Assessments


By law, the 1790 Census schedules were posted in the “two most public places within each jurisdiction, there to remain for the inspection of all concerned.” From the beginning, one of the hallmarks of the U.S. census has been conducting an objective process and subjecting the results to a thorough review. 

The U.S. Census Bureau continues the proud tradition of striving to conduct the highest quality census possible. We build quality into how we collect and process the data, and we evaluate the quality of those activities and their results.

Given the challenges the census faced in 2020, we know there is keen interest in how we are evaluating the census. To evaluate the quality of the census, we use a variety of methods that are independently reviewed by nationally and internationally recognized outside experts.

  • We compare the census results to estimates of the population such as demographic analysis and our population estimates.
  • We conduct the Post-Enumeration Survey to measure the proportion of people and housing units potentially missed or counted erroneously in the census.
  • We analyze operational quality metrics on how we collected the responses. For the first time, we released a number of data quality indicators along with the first results from the 2020 Census. These metrics provide information on the status of addresses in the census and how we resolved addresses across each of the data collection modes. These were released from April 26 through August 25, 2021.
  • We conduct a series of formal evaluations and assessments that measure different aspects of census operations and specific challenges. This rigorous and comprehensive evaluation program is known as the Census Program for Evaluations and Experiments (CPEX).

In this blog, we discuss the CPEX program in more detail.

How Did the CPEX Come About?

Since 1950, the Census Bureau has incorporated evaluative programs intended to assess the current census and speak to potential improvements for the next one. The 2000 Census Testing, Evaluations, and Experiments Program was extensive. Its scope included 87 evaluations and assessments across 18 categories and was designed to measure the effectiveness of methods, procedures and operations.

The 2010 CPEX focused resources on 22 evaluations and six experiments, in addition to a large suite of operational assessments. The 2010 program provided the basis for many of the innovations implemented in the 2020 Census. For example:

  • The 2010 Census Address Canvassing Targeting and Cost Reduction Evaluation Report provided baseline data and findings to support the 2020 Census innovation of replacing 100 percent field canvassing with a combination of in-office and in-field address canvassing, reducing the in-field canvassing to only 35 percent of addresses nationwide.
  • The 2010 Census Match Study was one of the first deep-dive studies into the quality of administrative records and kicked off the important work over the next decade to build the administrative records enumeration capability for the 2020 Census.

The Census Bureau started planning for the 2020 CPEX by asking external groups to envision what a 2030 Census might look like and how society might be different in 2030. The Census Bureau took those visionary reports and derived guiding principles to help researchers develop proposals.

The final set of studies forms a cohesive research and evaluation program. The 2020 CPEX is a culmination of seven decades of expertise and experience in developing a comprehensive research program to evaluate how well census operations reflect the Census Bureau’s strict quality standards. 

What Is Included in the 2020 CPEX?

The 2020 CPEX program is designed to document and evaluate the current decennial census and facilitate planning efforts for the next one by way of four types of studies:

  • Experiments test new methods in the unique environment of the decennial census to inform future census processes and operations.
  • Evaluations focus on how aspects of the 2020 Census performed to identify opportunities for improvement or innovation.
  • Operational assessments provide detailed information on workload volumes, production rates, operational costs, and lessons learned for nearly all 2020 Census operations.
  • Quality control results focus on how well major census operations followed established procedures in order to maintain high standards of quality.

The 2020 CPEX consists of three experiments, 14 evaluations, 50 operational assessments, and five quality control results reports. The three experiments are described as follows:

  • Real-Time 2020 Administrative Census Simulation developed an administrative record census in real time in 2020 to determine how the population statistics compare between an administrative record census and survey-style collection within the same time frame.
  • Extending the Census Environment to the Mailing Materials measured our ability to increase self-response by tapping into the unique environment surrounding the decennial census through the materials we use to contact households such as sticker inserts and direct mail promotional postcards.
  • Optimization of Self-Response in the 2020 Census was conducted to study the net impacts of the 2020 Census mailing strategy and the overall influence of the online self-response option.

The following seven 2020 Census evaluations cover aspects of the 2020 Census from the early listing operations through specific enumeration challenges in later follow-up operations:

  • Administrative Record Dual System Estimation will research whether dual system estimates could be generated without conducting an independent post-enumeration survey. The census would continue to serve as the first source, but administrative records would serve as the second source rather than the post-enumeration survey results. (A simplified sketch of the dual system estimator appears after this list.)
  • Reengineered Address Canvassing Evaluation will evaluate the redesign of the Address Canvassing operation, including the use of in-office address canvassing and interactive review.
  • Group Quarters Advance Contact (GQAC): Refining Classification of College or University Student Housing will help us determine, for privately owned student housing, which addresses are group quarters and which should be considered housing units so that we include them in the appropriate data collection operations.
  • Evaluating Privacy and Confidentiality Concerns will assess respondents’ privacy and confidentiality concerns about responding to the census generally, as well as assess the concerns of certain types of respondents through a follow-up questionnaire.
  • Research on Hard-to-Count Populations: Non-English Speakers and Complex Household Residents including Undercount of Children will conduct both an evaluation and an experimental research project about historically hard-to-count populations such as non-English speakers and complex household residents.
  • Analysis of Census Internet Self-Response Paradata by Language will study paradata from the internet self-response instrument to learn about issues associated with language.
  • The Undercount of Young Children: A Qualitative Evaluation of Census Materials and Operations will explore reasons for the undercount of young children.
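
To give a rough sense of the statistical idea behind that first evaluation: in its simplest textbook form, a dual system (capture-recapture) estimate combines two independent counts of the same population. Ignoring erroneous enumerations, sampling weights, and the post-stratification used in the Census Bureau’s production methodology, the basic estimator is

\[
\hat{N} = \frac{N_1 \times N_2}{M},
\]

where \(N_1\) is the number of people counted in the census, \(N_2\) is the number counted by the second, independent source (traditionally the Post-Enumeration Survey; administrative records under this evaluation), and \(M\) is the number of people matched in both. For example, if the census counted 900 people in an area, the second source counted 800, and 720 people were matched in both, the estimate of the true population would be 900 × 800 / 720 = 1,000. This is a simplified illustration, not the exact formula used to produce official coverage estimates.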

The remaining seven 2020 Census evaluations focus on the creation and impact of the communications campaign, as well as on measuring public perception over time in the unprecedented census environment.

  • Evaluating the 2020 Census Communications Campaign: Census Mindset Measures Before and After the Campaign will use a panel of survey respondents to measure intent to participate in the census over time in order to evaluate the 2020 Census Communications Campaign.
  • 2020 Census Quantitative Creative Testing will measure the performance of 2020 Census television and radio advertisements.
  • Investigating Digital Advertising and Online Self-Response will analyze the 2020 Census website paradata to investigate the relationship between digital advertising materials and online self-response.
  • Matching 2018 Census Barriers, Attitudes, and Motivators Study (CBAMS) Survey Sample to 2020 Census will evaluate the relationship between intended response behaviors and actual response behaviors, as well as the characteristics of nonresponding households.
  • Comparing 2019 Census Test and 2020 Census Self-Response Rates to Estimate Decennial Environment will match 2019 Census Test data to 2020 Census data to compare self-response behavior with and without the decennial environment.
  • Evaluating Large Technology Platforms — Selected Digital Partnerships will measure the impact from a subset of digital partnership activities (i.e., social media, email, webinars).
  • 2020 Census Tracking Survey will study U.S. public sentiment concerning matters that may bear upon 2020 Census participation.

In addition to the 2020 Census experiments and evaluations, we will conduct 50 operational assessments that cover nearly all of the 35 census operations. Some operations have multiple assessments. For example, the Reengineered Address Canvassing operation has assessments for both the in-field portion and the in-office portion. Assessments provide invaluable documentation of operational metrics that are critical inputs for future planning. They provide detailed information on workloads, production rates, actual costs compared with planned costs, and lessons learned. The operational assessments, along with the experiments, evaluations and quality control results, provide insight and direction for census improvements and innovation.

How Does the Census Bureau Ensure the Quality of These Reports?

For all CPEX reports, authors and analysts follow a quality process that includes rigorous fact-checking and indexing to data sources to ensure that these products meet the Census Bureau’s quality standards and abide by all agency product requirements. The process serves as a quality assurance tool and has been in place since the 2000 Census.

The purpose of the process is to ensure that quality is built into the products throughout the development cycle from requirements definition all the way to issuance of the final report. The quality process activities are formally implemented and are tracked in the 2020 Census Integrated Master Schedule. Because it involves specific steps, the process provides frequent opportunities to identify and correct errors, which leads to a superior final product and establishes confidence in the accuracy of the final results.

When Will These Reports Be Available?

By design, the scope of our CPEX program is quite large and requires a wide breadth of data as well as analytical and subject-matter expertise. The studies generally begin in the year the census is completed (the “0” year) and are completed and published on a flow basis over the following two to three years. For example, the first study from the 2010 CPEX was published in the summer of 2011 and the final study was completed in early 2013.

Pandemic-related delays to 2020 Census data collection and processing have also pushed back our release dates for CPEX reports. The table below provides planned release dates for the earliest CPEX reports:

 

Year | Quarter | CPEX Report Release
2022 | Jan-Mar | Operational Assessment: In-Office Address Canvassing
2022 | Apr-Jun | Operational Assessment: Update Leave
2022 | Apr-Jun | Operational Assessment: Update Enumerate
2022 | Apr-Jun | Operational Assessment: In-Field Address Canvassing
2022 | Apr-Jun | Operational Assessment: Non-ID Automated and Clerical Processing
2022 | Apr-Jun | Operational Assessment: Federally Affiliated Count Overseas (FACO)
2022 | Apr-Jun | 2020 CPEX: Extending the Decennial Census Environment to the Mailing Materials

 

We’re developing a quarterly release schedule for the full CPEX scope. We’ll post this schedule on our 2020 Census Data Quality page in the future — and update it as dates change.

 

Page Last Revised - October 8, 2021