Behavior Coding Report of 2010 Census Coverage Measurement Person Interviews

2010 Census Planning Memo No. 251

Executive Summary

The purpose of this behavior coding study is to determine how interviewers ask questions and how well respondents answer them during the 2010 Census Coverage Measurement Person Interview. These results can provide insights on how to improve survey questions, administrative procedures, and interviewer training for future operations in preparation for the 2020 Census.

The 2010 Census Coverage Measurement Person Interview is part of an independent survey operation that measures the accuracy of the within-household coverage of the census. For the 2010 Census, temporary interviewers conducted face-to-face interviews using a Computer-Assisted Personal Interviewing instrument. The 2010 Census Coverage Measurement Person Interview protocol also included an Information Sheet that interviewers handed to respondents for use during the interview (see Appendix 1). Respondents could keep this sheet.

Behavior coding is a survey research method for systematically analyzing interactions between interviewers and respondents. This method involves the application of a set of uniform codes to interviewer and respondent verbal behavior. Examples of codes applied to interviewer behaviors include reading the question as worded, making a major change to the question, and skipping the question entirely. Respondent behaviors include providing an answer that matches one of the response options, asking for clarification, and giving an answer that is not easily mapped onto the response options, among others. High rates of non-ideal behaviors (such as interviewers changing question wording or respondents providing answers that do not match response options) can indicate problems with specific questions. For example, if a particular question is associated with a high rate of major changes, especially when interviewers administer other survey questions as written, this suggests that there may be issues with that question. Behavior coding can also identify problems with interviewer performance. For example, if the majority of questions are not read as worded, this can suggest a need for additional training or supervision if a standardized interview is to be achieved.
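The per-question analysis described above amounts to tallying, for each question, the share of administrations assigned each code. The following sketch illustrates that tabulation; the code labels and records here are hypothetical examples, not the report's actual coding scheme or data.

```python
from collections import Counter

# Hypothetical records: (question_id, interviewer_behavior_code).
# Labels are illustrative; the study's actual code set may differ.
coded = [
    ("Q1", "exact"), ("Q1", "major_change"), ("Q1", "exact"),
    ("Q2", "major_change"), ("Q2", "major_change"), ("Q2", "skipped"),
]

def major_change_rates(records):
    """Share of administrations of each question coded as a major change."""
    totals, majors = Counter(), Counter()
    for question, code in records:
        totals[question] += 1
        if code == "major_change":
            majors[question] += 1
    return {q: majors[q] / totals[q] for q in totals}

rates = major_change_rates(coded)
# A question whose rate is high relative to the rest of the
# instrument would be flagged for wording review.
```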

The research question for this study was: How well do the Census Coverage Measurement Person Interview survey questions perform? Specifically, the goal of the study was to document how interviewers asked questions and how respondents answered them. This research question was answered by examining 271 audiotaped 2010 Census Coverage Measurement Person Interviews. Six experienced Census Bureau interviewers trained in behavior coding assigned the codes; each coded approximately 45 interviews. These six interviewers did not work on any other part of the 2010 Census Coverage Measurement Person Interview operation. The use of audiotapes limited our analysis to verbal behaviors, as non-verbal behaviors were not recorded. For each question in the 2010 Census Coverage Measurement Person Interview survey instrument, coders coded the interviewer’s administration of the question, the respondent’s first reply, and the respondent’s final answer. Additionally, all coders coded five of the same cases to test the reliability of the coding. Using Fleiss’ kappa statistic, we found substantial agreement among behavior coders for their coding of interviewer behavior and moderate agreement for respondent behavior.

Page Last Revised - October 8, 2021