2012 Federal CASIC Workshops
 
 

Held March 27th to March 29th, 2012 at the Bureau of Labor Statistics, Washington, D.C.
Sponsored by the Bureau of Labor Statistics and the U.S. Census Bureau.

2012 FedCASIC Presentations

Tuesday Sessions

Wednesday Sessions

  1. New Technologies: Mobile Data Collection (5 presentations)
  2. Survey Uses of Metadata (4 presentations)
  3. Management Challenges in CAI Survey Organizations
  4. Addressing and Reducing Respondent Burden to Gain Cooperation (4 presentations)
  5. New Technologies: New Uses of Existing Technologies (5 presentations)
  6. CARI: Approaches for Quality Control and Interviewer Performance Feedback (4 presentations)
  7. Usability and Accessibility (5 presentations)
  8. Data Management (5 presentations)

Thursday Sessions

  1. New Technologies: Survey Management Technologies (5 presentations)
  2. Web-Based Surveys (4 presentations)
  3. Going Mobile: Ensuring Security of New Data Collection Platforms (3 presentations)
  4. Blending CASIC Designs with Administrative Records (4 presentations)

Opening Keynote Speaker

Paradata within the Total Survey Error Framework: Successes, Challenges, and Gaps
Dr. Frauke Kreuter
Associate Professor
Joint Program in Survey Methodology
University of Maryland

Paradata now play an increasing role in the survey production process within the Federal Statistical agencies. So far, paradata have mostly been used to address nonresponse error, though in some instances they are also used to examine measurement error. To expand our thinking about the roles paradata can play, this presentation will have three parts. First, I will highlight current paradata collection activities within the Federal Statistical System and elsewhere and summarize typical applications. Second, I will discuss the challenges we have seen in the collection and use of paradata, in particular those arising from measurement error inherent in the paradata themselves. Third, I will outline the use of paradata within the larger Total Survey Error (TSE) framework. Paradata can help us understand many of the additional error sources beyond nonresponse and measurement error, such as coverage and coding error. Some of the data we need for this larger viewpoint are already routinely collected, even if they are not necessarily thought of as paradata. We will discuss whether integrating these additional data into the TSE framework can help us improve the survey production process as a whole.

Frauke Kreuter is an Associate Professor in the Joint Program in Survey Methodology at the University of Maryland and currently heads the statistical methods research department at the Institute for Employment Research in Nuremberg, Germany. She has taught several short courses on paradata and is guest editor of a special issue of JRSSA on paradata. She is currently organizing an edited volume on improving surveys through paradata. Her research on paradata is partially funded through a 3-year ESRC Research Grant (with G. Durrant and P. Smith, both at the University of Southampton: RES-062-23-2997).

Presentation Materials:
Paradata within the Total Survey Error Framework: Successes, Challenges, and Gaps (PDF, 722kb)

Plenary Panel

The Use of Paradata to Improve Survey Quality:
Organizational Approaches and Challenges

In recent years, FedCASIC paradata workshops have focused on how data collection organizations have used paradata to monitor survey process quality, and on the development of systems for paradata collection and processing and of dashboards for improving paradata accessibility. Presentations have often provided examples of paradata used to monitor survey data collection for a study, or of the use of one type of paradata, such as audit trails or call history data. There have been fewer examples of the use of standard and easily tailored indicators across modes and types of surveys (e.g., cross-sectional, longitudinal, or enterprise surveys) to assess survey process quality.

In this plenary panel, several organizations will address the following issues in relation to their own use or planned use of paradata at organizational and/or study levels:

The panelists will include:

Coordinators: Sue Ellen Hansen, Survey Research Center, University of Michigan
Chris Stringer, U.S. Census Bureau
Presentation Materials:
The Use of Paradata to Improve Survey Quality: Organizational Approaches and Challenges (PDF, 133kb)
The Use of Paradata to Improve Survey Quality: Organizational Approaches and Challenges (PDF, 203kb)
The Use of Paradata to Improve Survey Quality: Census Bureau (PDF, 1mb)
Use of Paradata to Improve Survey Quality: Standardized Indicators across Survey Modes and Types (PDF, 176kb)

Recent Innovations

This session will give organizations an opportunity to share information about recent innovations in CASIC approaches, including hardware, software, training, research, major organizational changes, new surveys, etc. Organizations that wish to participate should contact Bill Mockovak in advance.

The following speakers will be presenting:
Coordinator: Bill Mockovak, Bureau of Labor Statistics

Demonstrations

This year we will continue to offer demonstrations of CASIC instruments and software in a mini exhibit hall setting, where attendees can move among exhibitors throughout the demonstration period.
Coordinator: Louis Harrell, Bureau of Labor Statistics

New Technologies: Mobile Data Collection

The presentations in this session will cover applications of mobile technologies for data collection in Africa and the developing world, for listing, and for study recruitment. Technologies will include tablets, Android systems, and digital pen and paper.
Coordinators: Patty LeBaron, Research Triangle Institute
Lew Berman, Centers for Disease Control
Caitlin Blair, Bureau of Labor Statistics
Presentation Materials:
Mobile Data Collection Lessons Learned (PDF, 725kb) Josh Seeger, NORC
Use of Mobile Technology in Support of Study Recruitment and Field Data Collection (PDF, 2.1mb) Abie Reifer, Westat
Mobile to Web: An Integrated Model for Mobile Data Collection and Web-Based Monitoring and Reporting (PDF, 1.3mb) David Cantor, ICFI
Mobile Technology Applications for Verbal Autopsy (PDF, 678kb) Erin Nichols, CDC / NCHS
Sam Notzon, CDC / NCHS
Surveys in the Developing World: UN Nutrition Surveys Using Digital Pen and Paper (PDF, 2.4mb) Greg Clary, Mi-Co

Survey Uses of Metadata

Metadata are data that describe other data or processes. For users of data, the metadata are the record of how those data were produced and what the data mean. Metadata are analogous to the work you had to show when solving a math problem in high school. In order to understand the data a survey produces, you must understand the steps that were taken to conduct that survey.

Survey work provides many opportunities to use metadata fruitfully, throughout the survey life-cycle. For instance, data dissemination, data harmonization, and survey documentation all use or produce metadata. This session will explore these and related issues.
Coordinator: Dan Gillman, Bureau of Labor Statistics
Presentation Materials:
Metadata: To Boldly Go Where No One Has Gone Before? (PDF, 1.2mb) Pascal Heus, Metadata Technology
Open Government Vocabularies and Metadata (PDF, 512kb) Dan Gillman, Bureau of Labor Statistics
A Metadata Reference Model for IRS Data (PDF, 1.2mb) Jeff Butler, IRS
Using DDI 3 to Manage the Use of Coded Data in Longitudinal Studies (PDF, 2.1mb) Alexandra Shlionskaya, Booz Allen Hamilton
Sophia Kuan, Booz Allen Hamilton
Jay Greenfield, Booz Allen Hamilton

Management Challenges in CAI Survey Organizations

This session will provide a venue for those grappling with management and administrative challenges in today's CAI environment to share their knowledge and learn from others. A panel of 4-5 management experts from government and industry will discuss the following topics:
Panelists will include:

Audience participation in the form of questions and shared experiences will be encouraged. Session attendees will hear about the techniques used in different organizations to address key management issues, participate in a discussion of these issues, and have an opportunity to ask the panelists about effective approaches to common situations.
Coordinators: Karen Davis, Research Triangle Institute
Jane Shepherd, Westat
LaTerri Bynum, U.S. Census Bureau

Addressing and Reducing Respondent Burden to Gain Cooperation

This session will consider respondent burden and possible ways to alleviate it. Respondent burden is created by questionnaire length, questionnaire complexity, and repeated contact. The session will initiate a discussion of how to reduce respondent burden and gain cooperation. Elements that could be discussed include questionnaire design, longitudinal study design, and technology that makes administration quicker, easier, or simpler. Case studies of successes or failures could also be discussed as a way of brainstorming solutions to the problem of maintaining respondent cooperation as it relates to burden. The target audience consists of survey authors, implementers, and administrators.
Coordinator: Barbara Bibb, Research Triangle Institute
Presentation Materials:
Survey Burden: an Enabler's Perspective and Guilt Feelings (PDF, 352kb) Mark Pierzchala, MMP Survey Services, LLC
Questionnaire Complexity and Respondent Burden (PDF, 427kb) Adrianne Gilbert, RTI International
Dawn Thomas-Banks, RTI International
Gaining Cooperation while Minimizing Respondent Burden on NHANES (PDF, 2mb) Mercy Merino Rodriguez, CDC NCHS
Tatiana Nwankwo, CDC NCHS
Respondent Burden: Summary (PDF, 248kb) Barbara Bibb, Research Triangle Institute
Lillie Barber, RTI International
Behnaz Whitmire, RTI International
Ansu Koshy, RTI International
Chuchun Chien, RTI International

New Technologies: New Uses of Existing Technologies

This New Technologies session will feature discussion of the support existing technologies provide to various stages of the survey lifecycle. Presenters from both the public and private sectors will discuss, for instance, technology's role in creating a dual-frame sample, decreasing nonresponse, and providing case management across multiple modes. Presenters will provide evidence and anecdotes of the potential for existing technologies to enhance and improve survey design. Questions and discussion are strongly encouraged from session attendees.
Coordinators: Patty LeBaron, Research Triangle Institute
Lew Berman, Centers for Disease Control
Caitlin Blair, Bureau of Labor Statistics
Presentation Materials:
Addressing non-response bias in TPOPS: results from a cell phone frame test (PDF, 408kb) Gabriela Arcos, Bureau of Labor Statistics
Has text messaging increased participant compliance on NHANES? (PDF, 1.8mb) Tatiana Nwankwo, CDC NCHS
Innovations: Video Data Collection, Processing and Coding (PDF, 1.2mb) Rick Dulaney, Westat
Chris de los Santos, Westat
Rick Rogers, Fenestra
Greg Binzer, Westat
HINTS-GEM: Using Science 2.0 to Facilitate Data Integration in Constructing a National Health Survey (PDF, 2.2mb) Richard P. Moser, NCI
Ellen Burke Beckjord, University of Pittsburgh Medical Center
Lila Finney Rutten, SAIC-Frederick, Inc.
Kelly Blake, NCI
Bradford W. Hesse, NCI
Nirvana - An Enlightened Survey Management System (PDF, 505kb) R. Suresh, RTI International

CARI: Approaches for Quality Control and Interviewer Performance Feedback

Over the past ten years, Computer Audio-Recorded Interviewing (CARI) has become an important tool in survey administration. Survey managers and methodologists have used recorded interviews to manage and monitor the quality of data collection, assess interviewers' effectiveness, and identify needed improvements in questionnaires. Many survey organizations have developed their own tools and procedures for using CARI. In this session, we will focus on the process of reviewing the recordings:
Coordinators: Gina Cheung, Survey Research Center, University of Michigan
Patty Maher, Survey Research Center, University of Michigan
Presentation Materials:
Use of CARI to Standardize Field Interviewer Performance (PDF, 461kb) Susan H. Kinsey, RTI International
Providing Interviewer Performance Feedback Using CARI (PDF, 655kb) Carl Fisher, RTI International
Using CARI to Conduct Behavior Coding Analysis of Alternative Questionnaires (PDF, 2mb) Joanne Pascale, U.S. Census Bureau
Examining interviewer behavior in handling 'difficult' cases (PDF, 333kb) Wendy Hicks, Westat
Aaron Maitland, Westat
Brad Edwards, Westat

Usability and Accessibility

This session will cover the usability and accessibility of CATI and CAPI instruments along with web surveys. Presentations may include topics such as how to incorporate usability and accessibility into the development process and methods for conducting evaluations. Presenters will also discuss lessons learned from their usability or accessibility experience.
Coordinator: Jean Fox, Bureau of Labor Statistics
Presentation Materials:
Section 508 Refresh Overview (PDF, 2.8mb) Jennifer Horan, DOL
Accessibility Testing: The Role of Tools, Screen Readers, and Manual Methods (PDF, 451kb) Karen Brenner, Westat
Usability vs Accessibility in Websites/Web Surveys (PDF, 256kb) Sandhya Bikmal, RTI International
Sridevi Sattaluri, RTI International
Unmoderated Cognitive and Usability Testing Using the Web (PDF, 916kb) Jennifer Edgar, Bureau of Labor Statistics
Bill Mockovak, Bureau of Labor Statistics
First Fridays Product Testing Program (PDF, 3mb) Kristal Byrd, GSA

Data Management

This year's session will focus on how CAI data becomes public data, and how external public data can enhance the usefulness of CAI data. Topics could include:
Coordinators: Jane Shepherd, Westat
David Uglow, Research Triangle Institute
Presentation Materials:
Selective Editing (PDF, 513kb) Elizabeth Panarelli, EIA
Interoperability Through Vocabulary Registries (PDF, 484kb) Dan Gillman, Bureau of Labor Statistics
The NCSES Data System and Metadata Schema (PDF, 2.7mb) Kimberly Noonan, NSF
Evolution of Data Release Documentation for Continuous NHANES (NHANES 99+) (PDF, 2mb)
Dashboards and Portals: Tailoring the User View (PDF, 675kb) Leena Dave, RTI International
Deepa Avula, CSAT
Susan Eversole, RTI International
Bharathi Golla, RTI International
Sujatha Lakshmikanthan, RTI International
William Savage, RTI International

New Technologies: Survey Management Technologies

This New Technologies session will focus on the complex challenges of managing a survey and on the ways in which new technology affects the future of survey management. Presenters from both the public and private sectors will discuss a variety of facets of survey management, ranging from the challenges of data collection application development to system monitoring to integrating data systems. We will be discussing current challenges and ideas for the future as well as new innovations in the field. Questions and discussion are strongly encouraged from session attendees.
Coordinators: Patty LeBaron, Research Triangle Institute
Lew Berman, Centers for Disease Control
Caitlin Blair, Bureau of Labor Statistics
Presentation Materials:
Cross-platform, Disconnected State Mobile Application Development (PDF, 833kb) Jonathan Krentel, Gunnison Consulting Group, Inc.
Sandra Dyer, U.S. Census Bureau
It's in the Clouds: Electronic Data Collection (PDF, 2.3mb) Susan Harris, Energy Information Administration
Custom Dash Board for Comprehensive Project and Facility Management on a Large Scale Study (PDF, 595kb) Maria Hobbs, RTI International
David Forvendel, RTI International
Expanding DCAS (Data Collection Application Suite) for Survey Data Collection on Mobile Platforms (PDF, 1.9mb) Michael Volynski, InfoPro Systems, Inc.
Integrated Sample Management (PDF, 340kb) Steven Lehrfeld, Mathematica Policy Research, Inc.

Web-Based Surveys

Web-based surveys continue to increase in popularity and advances in technology have made them easier to develop and deploy. Given their popularity, we are increasingly interested in many aspects of web surveys such as creating and testing web-based instruments, sample management, web survey deployment on smartphones and tablets, usability, and the always difficult task of maintaining a high response rate. In this session, we will explore several technical developments in web surveys, usability testing, case management, and methods to increase web response rates. Audience input will be strongly encouraged.
Coordinators: Mark Brinkley, Mathematica Policy Research, Inc.
Kirsten Barrett, Mathematica Policy Research, Inc.
Presentation Materials:
Household Survey and Establishment Electronic Data Collection (PDF, 1.2mb) Mike Hart, UK Office for National Statistics
Lessons Learned: Adding Support for a Foreign Language to the CES Data Collection Web Site (PDF, 491kb) Julie Hatch, Bureau of Labor Statistics
Testing Complex Web Surveys Using Automated Test Tools (PDF, 558kb) Anwar Mohammed, RTI International
Gilberto Munoz, RTI International
Role of Survey Management Systems in Web Based Data Collection (PDF, 171kb) Bryan Davis, Westat

Going Mobile: Ensuring Security of New Data Collection Platforms

To reduce cost and increase response rates, it makes business sense to consider engaging with collection staff and respondents using the kinds of devices and social networking sites with which they are already familiar.

However, can we ensure adequate security for these methods? Can we say "yes" to data collection, and related activities, via tablets, smartphones, and Facebook?

In this session we will explore the challenges, and potential solutions, involved in ensuring that data collection activities using mobile devices or social media sites maintain adequate security and compliance protections.

Target Audience: Anyone interested in knowing more about these technology trends and how they might safely be employed for information collection activities. While some of the material will be technical in nature, the presentations as a whole will be accessible to any audience.
Coordinators: Paul Blahusch, Bureau of Labor Statistics
Bill Connett, Survey Research Center, University of Michigan
Presentation Materials:
Mobile: Ensuring Security of New Data Collection Platforms (PDF, 638kb) Paul Blahusch, Bureau of Labor Statistics
Mobile Device Data Collection and its Security Attack Surfaces (PDF, 207kb) Glenn Jones, Mathematica Policy Research
Security Control for Utilizing Social Networking Sites (PDF, 390kb) Diana Salazar, NORC

Blending CASIC Designs with Administrative Records

With decreasing response rates and increasing survey costs, interest in ways to incorporate administrative records in surveys has surged. These ways range from abandoning surveys entirely (e.g., censuses in some European countries) to closely integrating survey questions and administrative records data in early stages of survey design. This session will present some examples of innovative uses of administrative records in household and establishment surveys, and discuss their impact on survey quality and costs.
Coordinator: Brad Edwards, Westat
Presentation Materials:
Use of Administrative Records in NCES Secondary and Postsecondary Sample Surveys (PDF, 402kb) Kristin M. Dudley, RTI International
Moving towards convergence: The National Immunization Survey (NIS) and Immunization Information Systems (IIS) (PDF, 419kb) Stacie Greby, CDC NCHS
Karen Cullen, CDC NCHS
Ken Copeland, NORC at the University of Chicago
Vicki Pineau, NORC at the University of Chicago
Sabrina Bauroth, NORC at the University of Chicago
The Effect of Reporting Mode on Administrative Records: Are We Sacrificing Quality for Convenience? (PDF, 320kb) Marilyn Worthy, Energy Information Administration
Danielle Mayclin, Energy Information Administration
Administrative Records as a Potential Source of Data for Expenditure Surveys (PDF, 557kb) Sid Schneider, Westat
David Cantor, Westat
Brad Edwards, Westat
Abie Reifer, Westat