Survey News Volume 8, Issue 4

Bureau of Justice Statistics Releases Results from the 2019 National Crime Victimization Survey

by Meagan Meuchel, Survey Director for Crime Surveys, Demographic Programs Directorate, U.S. Census Bureau

On September 14, 2020, the Bureau of Justice Statistics (BJS) released results from the 2019 National Crime Victimization Survey (NCVS) in the annual report Criminal Victimization, 2019. The NCVS is the nation’s largest crime survey and collects data on nonfatal crimes, both reported and not reported to police. NCVS data collection is conducted by the Census Bureau on behalf of BJS.

Written by BJS statisticians Rachel E. Morgan and Jennifer L. Truman, Criminal Victimization, 2019 describes the characteristics of crimes, victims, and offenders. New this year, BJS provides classifications of urban, suburban, and rural areas, with the goal of presenting a more accurate picture of where criminal victimizations occur. Statistics on crimes occurring in 2020, during the coronavirus pandemic, are being collected now and will be reported next year.

Highlights from the 2019 NCVS results include:

  • The rate of violent crime excluding simple assault declined 15% from 2018 to 2019, from 8.6 to 7.3 victimizations per 1,000 persons age 12 or older (see the arithmetic sketch after this list).
  • Among females, the rate of violent victimization excluding simple assault fell 27% from 2018 to 2019.
  • There were 880,000 fewer victims of serious crimes (generally felonies) in 2019 than in 2018, a 19% drop.
  • From 2018 to 2019, 29% fewer black persons and 22% fewer white persons were victims of serious crimes.
  • The rate of violent victimization in urban areas—based on the NCVS's new classifications of urban, suburban, and rural areas—declined 20% from 2018 to 2019.
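
As a quick check, the percent changes above follow directly from the published rates. Here is a minimal sketch in Python using the 2018 and 2019 rates from the first bullet; the calculation is illustrative only:

    # Percent change in the rate of violent crime excluding simple assault,
    # from the published NCVS rates per 1,000 persons age 12 or older.
    rate_2018 = 8.6   # victimizations per 1,000 persons, 2018
    rate_2019 = 7.3   # victimizations per 1,000 persons, 2019

    pct_change = (rate_2019 - rate_2018) / rate_2018 * 100
    print(f"Change from 2018 to 2019: {pct_change:.0f}%")  # prints -15%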

The full report, Criminal Victimization, 2019 (NCJ 255113), is available on the BJS website at https://www.bjs.gov/index.cfm?ty=pbdetail&iid=7046. Additional information about the NCVS is also available on the BJS website at https://www.bjs.gov/index.cfm?ty=dcdetail&iid=245.

Exploring Text Messages as a Data Collection Mode for the NTPS Follow-up Surveys

by Allison Zotti, Center for Optimization & Data Science and Shawna Cox, Survey Director for Education Surveys, Demographic Programs Directorate, U.S. Census Bureau

The National Teacher and Principal Survey (NTPS), sponsored by the National Center for Education Statistics (NCES), is a nationwide sample survey of elementary and secondary schools and the principals and teachers who staff them. The survey is a primary source of information about what is happening in K-12 schools across the United States and provides policy makers and researchers with relevant and timely data on the characteristics and conditions of America’s public, charter, and private K-12 schools and the professionals who work in them.

During the school year following NTPS data collection, the Teacher Follow-up Survey (TFS) and the Principal Follow-up Survey (PFS) are conducted to measure the attrition rates of teachers and principals, respectively, who completed the NTPS in the previous year. In addition to providing the attrition rate for teachers, the TFS examines the characteristics of teachers who stayed in the teaching profession and those who changed professions or retired; obtains occupational data for those who left K-12 teaching; obtains reasons for moving to a new school or leaving the profession; and collects data on job satisfaction and student debt.

For the upcoming 2021-22 data collection cycle of the TFS and PFS, NCES expressed interest in exploring the use of text messaging as a new data collection mode by including an experiment to test different methods of texting survey participants. To adhere to the Census Bureau Policy Office’s guidelines for texting, the 2020-21 NTPS principal and teacher questionnaires were updated to include a checkbox through which respondents could consent to future contact by text message, alongside a field for the respondent’s cell phone number.

To explore using text messaging as a contact method for TFS, the experiment will include three treatment groups:

  1. Replace the second and third web invitation letter mail-outs with text message contacts that include the link to complete the TFS online
  2. Send a text message contact that includes a link to complete the TFS online concurrent with the fourth mail-out (the first paper questionnaire mailing), and replace the fifth mail-out with a text message contact that includes a link to complete the TFS online
  3. Send an interactive text message contact (questions asked and answered directly by text) at the time of the fourth mail-out, and replace the fifth mail-out with another interactive text message contact

For groups one and two, the text message will invite the teacher to complete the TFS questionnaire online by providing the link to the web instrument in the text message. For group three, questions about current teaching status will be texted directly to each sampled teacher, and responses will be collected by return text. These interactive texts will follow the same general skip pattern as the web instrument: subsequent texts will ask follow-up questions based on the teacher’s responses. Results from this experiment will help determine whether a respondent is more likely to respond via a texted survey web link earlier in data collection versus later, as well as whether a respondent is more likely to respond via a texted survey web link or directly in an interactive text exchange later in data collection.
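
The interactive-text instrument itself has not been published, so the following is only a minimal, hypothetical sketch in Python of how such a skip pattern might be driven. The question wording, response codes, and branching are invented for illustration and are not the actual TFS instrument:

    # Hypothetical skip-pattern table for an interactive-text exchange.
    # Question wording, codes, and branching are invented for illustration.
    QUESTIONS = {
        "status": {
            "text": "Are you teaching at the same school as last year? 1=Yes, 2=No",
            "next": {"1": "satisfaction", "2": "left_or_moved"},
        },
        "left_or_moved": {
            "text": "Did you move to a new school or leave K-12 teaching? 1=Moved, 2=Left",
            "next": {"1": "satisfaction", "2": None},  # leavers end the exchange
        },
        "satisfaction": {
            "text": "How satisfied are you with your current job? 1=Very ... 4=Not at all",
            "next": {"1": None, "2": None, "3": None, "4": None},
        },
    }

    def next_question(current_id, response):
        """Return the ID of the next question to text, or None if the exchange is done."""
        return QUESTIONS[current_id]["next"].get(response)

    # Example: a teacher texts "2" to the status question, then "1" to the follow-up.
    assert next_question("status", "2") == "left_or_moved"
    assert next_question("left_or_moved", "1") == "satisfaction"

In practice, each inbound texted response would route through logic like this to select the next outbound message, mirroring the branching of the web instrument.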

Because the PFS does not have a web instrument for data collection, all text message contacts for the survey will be an interactive exchange with the principal. PFS data collection begins with contacts to the principal at the NTPS school and then, if the school is not responsive, shifts to contacting the sampled principal directly using the personal contact information provided on the NTPS.

To explore using text messaging as a contact method for PFS, the experiment will include two treatment groups:

  1. Send an interactive text message contact to complete the survey concurrent with both principal-level mail-outs
  2. Replace the principal-level mail-outs with an interactive text message contact to complete the survey

Broadly, this experiment will help determine whether survey participants are willing to respond to a survey via text at all. In addition, the results will help determine whether a respondent is more likely to respond via a text exchange instead of returning a paper questionnaire, or whether the text serves as a reminder that prompts respondents to return their paper questionnaire.

Data collection for the TFS and PFS will be completed in early summer 2022, and results from these text message experiments are expected shortly thereafter.

National Household Education Survey Usability Testing During a Worldwide Pandemic

by Erica Olmsted-Hawala, Center for Behavioral Science Methods, U.S. Census Bureau

This past June, staff in the Center for Behavioral Science Methods (CBSM) of the U.S. Census Bureau conducted remote usability testing sessions on a subset of questions for the upcoming 2023 National Household Education Survey (NHES) online survey.  The NHES, sponsored by the National Center for Education Statistics (NCES), provides descriptive data on the educational activities of the U.S. population and offers researchers, educators, and policy makers a variety of statistics on the condition of education in the United States.

Usability testing is a method used to evaluate interactions between users and a product, in this case an online survey instrument. During testing, users attempt tasks using the online survey while thinking aloud. Typically, a user works alone while speaking aloud, as an interviewer watches and listens, noting where the user struggles or experiences confusion. Difficulties users have with the survey are identified and, ideally, fixed prior to the official release.

The online version of the NHES has undergone usability testing in the past, but typically that testing has occurred on the production version of the instrument, right before fielding.  This meant that design problems could only be fixed for the next production cycle of the survey. 

In 2019, the Census Bureau acquired the use of Qualtrics software, which allows CBSM staff to develop survey instruments for testing independent of the software currently used for production surveys. For the NHES usability testing conducted this past June, CBSM staff took a subset of the production instrument questions that had been identified as needing improvement in the earlier round of testing and programmed them in Qualtrics. It took approximately two months to program and test the modified instrument in preparation for usability testing. Most of the design modifications to the questions had been recommended in the earlier cycle of user testing and included wording and label changes, in addition to a variation on how to display the within-screen branching.

Preparations for in-person testing of the NHES instrument were well underway when COVID-19 hit.  In-person testing is standard practice for many reasons:  it helps interviewers orient participants to the testing procedures, gives interviewers ample opportunities to observe participant behavior, and simplifies logistics such as providing participant payments.  New social distancing requirements made such testing impossible and required the usability team to consider the possibility of remote testing—that is, virtual testing using the Internet, with interviewer and participant in different locations. However, it was unclear whether participants would be willing to download software that allowed us to observe their computer screen as they answered the survey questions.  It was also necessary to establish processes for informed consent and paying participants, in addition to procedures for securely allowing remote observers into sessions.

Our solution was to use Skype for Business, a recently approved tool for secure screen sharing at the Census Bureau. We worked with each participant prior to their scheduled session to ensure that the software could be downloaded onto their computer. Once that was accomplished, we practiced having them share their screen with us so that both the participant and interviewer would be confident in the technology on the day of the user session. During this step, we were able to screen out participants who were unable to install the approved software, did not have an acceptable personal computer (e.g., software incompatible with certain tablets or Chromebooks), or had other technical issues that made it impossible for them to share their screen. In addition, making a human connection with participants at this stage seemed to carry over into committed, engaged participation on the scheduled day of the user session. Only one participant was a “no-show” and had to be replaced. (In-person sessions often have more no-shows.)

We also met virtually with the observers prior to the testing period to explain procedures and respond to questions. We worked out procedures to send the incentive to participants via trackable USPS Priority Mail at the conclusion of each session. Finally, we adapted previously approved procedures for obtaining informed oral consent to the remote setting.

Twenty sessions were conducted remotely. As this was the first time we had conducted remote usability testing with participants in their homes and on their own computers, we were unsure whether it would be successful. In the end, we found not only that we produced data comparable to in-person testing, but also that the approach had some added benefits. One bonus was geographic diversity: we now had participants from a variety of states across the U.S., in addition to the D.C. area. Another was that the NHES sponsor found the sessions easy to access and observed all of them, which is difficult to achieve with in-person sessions. Still, remote testing does have some drawbacks, such as the inability to see participants’ facial expressions or read their body language, both of which provide non-verbal cues about problems they are experiencing. But overall, the use of Qualtrics provided a more flexible timeline for providing usable feedback, and remote testing expanded capabilities in important ways that will remain useful long after we move beyond the current pandemic environment. As the old saying goes, “Necessity is the mother of invention.”

Articles/Blogs from Census Internet

World Statistics Day - 2020

Every five years, World Statistics Day is celebrated around the world by the global statistical community to honor the importance of trust, authoritative data, innovation, and the public good in national statistical systems.

The U.S. Census Bureau celebrated World Statistics Day on October 20, 2020, by highlighting the many ways our work and statistics impact the world. From the first census in 1790 to the 2020 Census, the economic census, and more than 100 annual surveys, we continue to innovate to find better ways to collect data and release trusted statistics. For more information about World Statistics Day, please visit the 2020 World Statistics Day homepage.

New Experimental Data Product: Interactive Monthly State Retail Sales

The Census Bureau has launched a new experimental data product featuring modeled state-level retail sales. The Monthly State Retail Sales (MSRS) report features interactive state-level data visualizations that combine Monthly Retail Trade Survey data, administrative data, and third-party data. Learn more about our experimental data products.

Statement on 2020 Census Data Collection Ending

As of October 13, 2020, well over 99.9% of housing units have been accounted for in the 2020 Census. Self-response and field data collection operations for the 2020 Census concluded on October 15, 2020. Read the official U.S. Census Bureau statement here.

New Earnings and Employment Data for College Graduates

The U.S. Census Bureau updated the Post-Secondary Employment Outcomes (PSEO) statistics with the release of earnings tabulations for four more higher education systems: the City University of New York (CUNY), the State University of New York (SUNY), Pennsylvania State University, and the Texas Higher Education Coordinating Board. The addition of these new higher education systems increases the number of institutions represented in the PSEO from 47 to 243. The PSEO visualization tool has been updated to include these institutions and to enhance the search features for specific majors. With this release, PSEO data will also be available through the Census Bureau’s application programming interface (API).
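
For programmatic access, PSEO tabulations can be retrieved like other Census Bureau API datasets. Below is a minimal sketch in Python; the dataset path and variable names are assumptions for illustration, so check the documentation at api.census.gov for the actual endpoint and variables:

    # Minimal sketch of pulling PSEO earnings tabulations from the Census API.
    # The dataset path and variable names are assumptions for illustration;
    # verify them against the documentation at https://api.census.gov.
    import json
    import urllib.request

    url = (
        "https://api.census.gov/data/timeseries/pseo/earnings"  # assumed path
        "?get=INSTITUTION,DEGREE_LEVEL,Y1_P50_EARNINGS"         # assumed variables
    )

    with urllib.request.urlopen(url) as resp:
        rows = json.load(resp)  # the API returns JSON: a header row, then data rows

    header, data = rows[0], rows[1:]
    print(header)
    print(data[:3])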

Check out the latest Household Pulse Survey Updates

The U.S. Census Bureau is in a unique position to produce data on the social and economic effects of COVID-19 on American households and small businesses. 

Results from Phase 2 of the Household Pulse Survey are released every two weeks. You can explore the latest data releases here and access the interactive data tables to learn more. Subscribe to the Census Bureau newsletters to stay up to date.

COVID-19 Data Hub and Resources

The Census Bureau has launched a new tool — the COVID-19 Data Hub — designed to help guide the nation as it begins recovery efforts from the sweeping COVID-19 pandemic by providing economic and demographic data. Version 1.4 is now available!

Other Recent Data Releases

August 2020 Data and Publication Releases

September 2020 Data and Publication Releases

October 2020 Data and Publication Releases
