2014 Federal CASIC Workshops
 

Held March 18th to March 20th, 2014 at the Bureau of Labor Statistics, Washington, D.C.
Sponsored by the Bureau of Labor Statistics and the U.S. Census Bureau.

2014 FedCASIC Presentations

Tuesday Sessions

  1. Opening Keynote Speaker
  2. Plenary Panel: What is Adaptive Design in Practice?
  3. Recent Innovations
  4. Demonstrations

Wednesday Sessions

  1. Mobile Devices (4 presentations)
  2. Improving Response Rates (4 presentations)
  3. Case Studies (4 presentations)
  4. CASIC Survey Management Challenges
  5. Survey Evaluation Methods (5 presentations)
  6. Adaptive Design and Multimode Surveys (4 presentations)
  7. Confidentiality and Security (5 presentations)
  8. Field Operations (4 presentations)

Thursday Sessions

  1. Usability and Accessibility (4 presentations)
  2. Incorporating New Technologies on the Consumer Expenditure Survey (5 presentations)
  3. Hard-to-reach Populations (4 presentations)
  4. Administrative and Linked Records (6 presentations)
  5. Survey Development Tools and Standards (6 presentations)

Opening Keynote Speaker

What Does Adaptive Design Mean to You?
Peter Miller, U.S. Census Bureau


There is much talk in the survey research community about adaptive design and responsive design. A number of papers to be presented at this meeting advertise the concepts. What do these terms mean? Is there a single definition for each? Are they the same? Do they refer to something new, or are they just new labels for old approaches? Regardless of how we define them, what do they imply about survey goals? What do they describe about how surveys are executed? What do adaptive or responsive designs actually achieve? What, if anything, do they portend for the future of the survey enterprise?

I will offer answers to these questions based mainly on observations of work at the Census Bureau. The panel discussion that follows will reveal what adaptive design means to others. Some clarity and an agenda for research will result.

Peter V. Miller is Chief of the Center for Survey Measurement at the United States Bureau of the Census, and Chief Scientist in the Bureau's Center for Adaptive Design. He joined the staff of the Census Bureau in 2011.

Before arriving at Census, Miller spent 29 years at Northwestern University, where he holds an appointment as Professor Emeritus. At Northwestern, he served at various times as Associate Professor, Van Zelst Research Professor, Director of the Institute for Modern Communication, Chair of the Department of Communication Studies, and Associate Dean for External Programs in the School of Communication. He has also held faculty positions at the University of Michigan, the University of Illinois at Urbana-Champaign, and Purdue University.

Miller was Editor-in-Chief of Public Opinion Quarterly from 2001 to 2008. He has held several elective offices in the American Association for Public Opinion Research (AAPOR), most recently serving as President in 2009-2010. He received the Harry W. O'Neill Award for Outstanding Achievement from the New York Chapter of AAPOR in 2012. He was also named a Fellow of the Midwest Chapter of AAPOR in 2012. His research has included work on interviewer and mode effects in surveys and survey nonresponse.
Presentation Materials:
What Does Adaptive Design Mean to You? (PDF, 464kb)

Plenary Panel

What is Adaptive Design in Practice?


The panel will pick up on themes introduced in the keynote talk. Participants from agencies and survey organizations will discuss their direct experience with adaptive design. Each participant will address these questions: What are the goals of adaptive survey design in particular studies? What design features and interventions have been tried? What paradata resources are employed? How is monitoring of data collection accomplished, and what are the decision rules for adaptation? What systems development has been undertaken to make adaptive design implementation possible? Have goals been achieved or not? What impediments and difficulties have been experienced? What is the agenda for continuing work?

Session Chair: Peter Miller, U.S. Census Bureau
Presentation Materials:
What is Adaptive (or Responsive) Design in Practice? Approaches, Experiences, and Perspectives (PDF, 291kb)
Dan Pratt - RTI International

Recent Innovations

This session will give organizations an opportunity to share information about recent innovations in CASIC approaches, including hardware, software, training, research, new surveys, etc.

Coordinator: Barbara LoPresti, U.S. Census Bureau

Demonstrations

The 2014 FedCASIC will host demonstrations from organizations with innovative CASIC instruments and software. Participating organizations will have the opportunity to showcase their CASIC-related technologies to government and academic survey methodologists. The demonstrations will take place in a small exhibit hall setting, with tables set up for each demonstrator. Demonstrators may use laptops, displays, and handouts to present their technologies to FedCASIC attendees. Attendees will be free to move throughout the exhibits and learn about new CASIC instruments and software.
Coordinator: Matthew Burgess, Bureau of Labor Statistics
Presentation Materials:
Liberty Demonstration: Integrating Performance Measurement with Healthcare Sensors (PDF, 195kb)
Stacy Stonich, Bhanuj Soni - NORC at the University of Chicago

Mobile Devices

The increasing use of smartphone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also create new opportunities to monitor field surveys through real-time tracking and communication. This session examines some of these challenges and advances.
Coordinators: Lew Berman, ICF International
Patricia LeBaron, RTI International
Presentation Materials:
Does Screen Size Affect Interviewer Data Quality for CAPI Surveys? A Comparison of Smartphones and Tablets from Kenya (PDF, 3.1mb)
Sam Haddaway, Sarah Hughes - NORC at the University of Chicago
Field Data Collection in Area Frame Survey Utilizing iPads - USDA's June Area Survey (PDF, 4.2mb)
Claire Boryan, Michael Gerling - National Agricultural Statistics Service
Experiences in Mobile Survey Technology: Technological and Practical Issues (PDF, 1.4mb)
Experiences in Mobile Survey Technology: Technological and Practical Issues - Ajay Sethi Presentation (PDF, 11.7mb)
Andrew Jeavons - Survey Analytics, LLC,
Ajay K. Sethi - University of Wisconsin-Madison
Use of Smartphones to Collect Information about Health Behaviors: A Feasibility Study (PDF, 1.9mb)
Sean Hu - CDC NCHS,
Naomi Freedner, Piper DuBray, James Dayton - ICF International

Improving Response Rates

These presentations deal with the use of incentives, methods for reducing refusals by refining how sample members are targeted and contacted, and the use of social media affiliation.
Coordinator: Jim Caplan, Department of Defense
Presentation Materials:
Evaluating the Effectiveness of Early Bird Incentives in a Web Survey (PDF, 937kb)
Christopher Ward, Michael Stern, Jennifer Vanicek - NORC at the University of Chicago,
Carla L. Black, Cindi Knighton, Larry Wilkinson - CDC NCHS
Approaches to Increase Survey Participation and Data Quality in an At-risk Youth Population (PDF, 319kb)
Lisbeth Goble, Amanda L. Skaff, Jillian Stein, Lisa K. Schwartz - Mathematica Policy Research, Inc.
Social Media in Survey Research: Can Facebook Friendship Enhance Traditional Survey Strategies? (PDF, 176kb)
Jillian Stein, Amanda L. Skaff, Lisa K. Schwartz - Mathematica Policy Research, Inc.
Strategies for Increasing Efficiency of Cellular Telephone Samples (PDF, 1.1mb)
Kurt Peters, William Robb - ICF International,
Cristine Delnevo, Daniel A. Gundersen - Rutgers School of Public Health

Case Studies

Case studies highlight issues of data collection and processing, including innovative and efficient approaches to survey operations. The case studies explore best practices from a variety of surveys, including the Canadian Consumer Price Index (Statistics Canada), the Quarterly Census of Employment and Wages (BLS), the Survey of Business Owners (Census), and a survey conducted by the City of Los Angeles.
Coordinator: Barbara Bibb, RTI International
Presentation Materials:
Consumer Price Index Collection in Canada - Today and Tomorrow (PDF, 677kb)
André Girard, Elizabeth Abraham - Statistics Canada
2012 Survey of Business Owners (PDF, 1.8mb)
Mary Frauenfelder - U.S. Census Bureau
Annual Refiling Survey Web Collection in the Quarterly Census of Employment and Wages Program (PDF, 1.7mb)
John Peters, Dipak Subedi - Bureau of Labor Statistics
Combining Data Collection on the Web in the Quarterly Census of Employment and Wages Program (PDF, 1.7mb)
Kelly Quinn, Ben Cuttitta, Sania Khan - Bureau of Labor Statistics

CASIC Survey Management Challenges

This panel provides a venue for presenting and discussing the management and administrative challenges in today's CAI environment. The session is divided into two topic areas; within each, four panelists and a moderator address current issues, approaches taken, and lessons learned. Following the panelist presentations, the moderator leads a discussion of the topic with audience participation. Panelists are representatives from government agencies and contractor organizations. Session attendees will hear about the techniques different organizations use to address key management issues, participate in a discussion of these issues, and have an opportunity to ask the panelists about effective approaches to common situations.

Panel topics:
  1. Management Challenges related to Recent Technology Advancements
  2. Management Challenges related to Survey Project Management
Coordinators: Karen Davis, RTI International
Jane Shepherd, Westat

Survey Evaluation Methods

Survey evaluation is critical, especially for continuous quality improvement of survey procedures and results. Numerous methods are available for testing and evaluating computer-assisted data collection. The goal of the evaluation is to increase data quality and to reduce cost throughout the survey lifecycle, including survey redesign, question development, and monitoring fieldwork operations. Innovative approaches to survey evaluation include research on implementation of mixed modes, adoption of mobile devices, cognitive and usability testing, and Event History Calendars. The research presented identifies challenges and opportunities for future survey development.
Coordinators: Carl Ramirez, Government Accountability Office
Gina Cheung, Survey Research Center, University of Michigan
Presentation Materials:
Using Mixed Methods to Evaluate Survey Questionnaires (PDF, 612kb)
Heather Ridolfo, Kathy Ott - National Agricultural Statistics Service
Multi-mode Testing at Statistics Canada (PDF, 222kb)
Mireille Paquette - Statistics Canada
How Did We Do? Evaluating Data for the 2013 Survey of Income and Program Participation Field Test (PDF, 5.5mb)
Matthew C. Marlay, Jason M. Fields - U.S. Census Bureau
New Dimensions of Data Quality Measurement in Mobile Field Data Collection (PDF, 1.3mb)
Michael Keating, Charles Loftis, Joseph McMichael, Jamie Ridenhour - RTI International
The Q-Suite (Q-Bank and Q-Notes) (PDF, 1.1mb)
Meredith Massey, Justin Mezetin - CDC NCHS

Adaptive Design and Multimode Surveys

Multimode designs are an increasingly common approach to obtaining the optimal balance of cost and quality. Multimode designs continue to evolve, often in parallel with changes in technology.

Adaptive design and paradata supply intelligence for decision making in the planning and execution of survey processes, often in conjunction with multimode survey designs.

This session will showcase ways in which agencies are utilizing these methodologies and techniques (and associated technologies) on a diverse set of studies.

The session will include lively discussion with both presenters and audience members regarding recent findings and innovations in the use of adaptive design, the challenges and limitations organizations face in utilizing adaptive design and mixed-mode surveys, and the use of paradata-driven solutions to enhance the efficiency and effectiveness of operations.
Coordinators: Chris Stringer, RTI International
Dan Zahs, Survey Research Center, University of Michigan
Presentation Materials:
Census Non-Response Follow Up: The 2016 Plan (PDF, 570kb)
Parry Steen - Statistics Canada
Multimode Survey Management (PDF, 1.4mb)
Jerry Wernimont, Kathleen O'Reagan - Westat
Practical Implementation of Adaptive Design in CATI Surveys - Can Adaptive Designs Really be 'Better, Faster and Cheaper'? (PDF, 2.5mb)
John Boyle, James Dayton - ICF International
On-site vs. Remote Records Data Collection (PDF, 465kb)
Alicia Leonard, Emily Weaver - Mathematica Policy Research

Confidentiality and Security

These topics cover a range of issues, from protecting the interviewer and interviewing tools in the field, to configuring systems locally and in the cloud to meet current security standards and compliance requirements, to emerging technology and its implications for security. A variety of tools and methods are assessed, including proactive approaches for ensuring confidentiality.
Coordinator: David Charbonneau, U.S. Census Bureau
Presentation Materials:
Using the Cloud as a Software Testing Solution in a USGCB Environment (PDF, 1mb)
Roger Jesrani, Christopher Siege, Nathan Sikes - RTI International
Conducting an Anonymous Survey with Follow-Up Targeted to Non-Responders (PDF, 842kb)
Margaret Collins, Betsy Payn - Battelle Health & Analytics
A Method of Determining the Typology of Surveyed Employee Groups (PDF, 1mb)
Jeremy Cochran, Robert Teclaw, Katerine Osatuke - Veterans Health Administration
Attention to Data Confidentiality in CAI Studies (PDF, 708kb)
Tom Krenzke - Westat
Security for Mobile Devices (PDF, 495kb)
Dennis Pickett - Westat

Field Operations

These presentations feature a variety of traditional and innovative computer-assisted interviewing practices, including new methods for evaluating and training interviewers through the use of paradata and eLearning. This session will also cover the use of predictive dialing and workflow management solutions to reduce costs and improve efficiency in field operations. Several presentations also touch on behavior coding, CARI, help desk support, and interviewer certification.
Coordinators: Fern Bradshaw, U.S. Census Bureau
Kristina Rall, Mathematica Policy Research, Inc.
Presentation Materials:
Certifiable? Using Paradata to Evaluate Field Representatives' Performance in the Survey of Income and Program Participation (PDF, 5mb)
Jason M. Fields, Matthew C. Marlay - U.S. Census Bureau
Creating a More User Friendly CARI System (PDF, 761kb)
Aaron Maitland, Laura Branden, Susan Genoversa - Westat
Use of eLearning to Support Training for CAI Data Collectors (PDF, 963kb)
Use of eLearning to Support Training for CAI Data Collectors - Video (WMV, 5mb)
Laura Branden - Westat
Enhancing an Organization's Capabilities for Technical Assistance and Stakeholder Communication (PDF, 1.3mb)
Barri Braddy Burrus, Susanna Cantor, Charles Ebel, David Foster, Michael Price, Nathan Sikes, Howard Speizer - RTI International

Usability and Accessibility

This session will cover usability and accessibility of government surveys. Two presentations will cover remote usability testing, a relatively new method for conducting usability tests with participants from anywhere. A third presentation will share what one agency learned by running many usability tests and cognitive interviews. The final presentation will discuss guidelines to improve the accessibility of web surveys.
Coordinator: Jean Fox, Bureau of Labor Statistics
Presentation Materials:
Using WebEx for Usability Testing: Considerations for Establishment Surveys (PDF, 2.2mb)
Heidi M. St. Onge, Herman A. Alvarado, Kristin J. Stettler - U.S. Census Bureau
A Comparison of Remote Unsynchronized Usability Testing and In-house Lab Testing of a Census Bureau Website (PDF, 551kb)
Elizabeth Nichols, Erica Olmsted-Hawala, Marylisa Gareau - U.S. Census Bureau
Planning for the Future: Usability Testing for the 2020 Census (PDF, 2.9mb)
Emily Geisen, Murrey Olmsted - RTI International,
Patti Goerman, Sabin Lakhe - U.S. Census Bureau
A Questionnaire Guide to Web Survey Accessibility (PDF, 671kb)
Mark Pierzchala - MMP Survey Services, LLC

Incorporating New Technologies on the Consumer Expenditure Survey

The Bureau of Labor Statistics (BLS) is redesigning the Consumer Expenditure Survey (CE) to incorporate new technologies, such as mobile applications and a web diary. This session provides an overview of the technological features of the new design, as well as results from current research and development. The goal is to discuss the critical issues involved in introducing new technologies into a large, continuing survey. The first paper provides an overview of the tradeoffs considered when choosing among technological design features. The second paper summarizes results of a recently completed field test of a web diary. The third and fourth papers describe the research and development of a mobile-optimized web instrument for individual diary-keepers in the household. The fifth paper reports on a survey of CE data users who provided feedback on the new features of the redesign and its effects on the user community.
Coordinator: David Cantor, Westat
Presentation Materials:
Incorporating Technology in CE's New Design (PDF, 381kb)
David Cantor - Westat,
Laura Paszkiewicz - Bureau of Labor Statistics
Results from a Web Diary Feasibility Test (PDF, 949kb)
Ian Elkin - Bureau of Labor Statistics
Developing a Mobile-Optimized Web Instrument for the Consumer Expenditure Diary Survey (PDF, 1.5mb)
Nhien To, Brandon Kopp, Erica Yu, Jean Fox - Bureau of Labor Statistics
Evaluating the Usability of a Mobile-Optimized Web Instrument (PDF, 2.1mb)
Brandon Kopp, Erica Yu, Jean Fox, Nhien To - Bureau of Labor Statistics
Data Users' Concerns of the CE Redesign (PDF, 527kb)
Bill Passero - Bureau of Labor Statistics

Hard-to-reach Populations

Innovative and traditional concepts and techniques for surveying hard-to-reach populations will be presented in this session. The presentations will address both the statistical and survey design aspects of including hard-to-reach groups. Researchers will report findings from censuses, surveys, and other research related to the identification, definition, and measurement of undercounted populations, and to methodologies for surveying and enumerating them.
Coordinators: John Baker, U.S. Census Bureau
Andrew Zukerberg, National Center for Education Statistics
Presentation Materials:
Surveillance of Seasonal Influenza Vaccination Coverage among Pregnant Women and Health Care Personnel in the United States (PDF, 1.2mb)
Sara M.A. Donahue - Abt Associates,
Carla L. Black, Stacie Greby, Helen Ding, Anup Srivastav - CDC NCHS,
Rachel Martonik - Abt SRBI,
David Izrael, Sarah W. Ball - Abt Associates,
Charles DiSogra - Abt SRBI,
Deborah K. Walker - Abt Associates
Dynamic Sampling and Data Collection Systems for Hard-to-Reach Populations (PDF, 1.3mb)
Ronaldo Iachan, Tonja Kyle, David Radune, Deirdre Middleton - ICF International
Oversampling Minorities in the National Alcohol Survey using the Zip Code Tabulation Area File (PDF, 408kb)
Pedro Saavedra, Shelley Osborn, Naomi Freedner - ICF International,
Thomas Greenfield, Katherine Karriker-Jaffe - Public Health Institute
Using Google Ads to Target Potential Focus Group Respondents (PDF, 2.5mb)
Amelia Burke-Garcia - Westat

Administrative and Linked Records

With decreasing response rates and increasing survey costs, interest in incorporating administrative records into surveys continues to grow. Options include substituting administrative data for survey data entirely, integrating administrative records with survey data, and using administrative data to assess the quality of survey data, or vice versa.

This session will highlight examples of innovative uses of administrative records and record linkage methods for household and establishment surveys, and will discuss the impact of these approaches on cost and data quality, including coverage.
Coordinator: Brad Edwards, Westat
Presentation Materials:
How Good Is Your Record Linkage System, Really? (PDF, 1.5mb)
K. Bradley Paxton - ADI, LLC
The Nature of the Bias When Studying Only Linkable Person Records: Evidence from the American Community Survey (PDF, 2.6mb)
Brittany Bond - U.S. Department of Commerce,
J. David Brown, Adela Luque, Amy O'Hara - U.S. Census Bureau
Coverage of the Foreign-Born Population in Administrative Records: Magnitude and Characteristics (PDF, 1.5mb)
Renuka Bhaskar, Leticia Fernandez, Sonya Rastogi - U.S. Census Bureau
Comparing Administrative Records and 2010 Census Persons (PDF, 1.5mb)
James Noon, Sonya Rastogi, Ellen Zapata - U.S. Census Bureau
Redesigning National School Surveys: Coverage Improvement Using Multiple Datasets (PDF, 738kb)
William Robb, Kate Flint, Alice Roberts, Ronaldo Iachan - ICF International
Exploring Synergy: Using Survey and Administrative Data Systems to Monitor Local, State, and National Immunization Programs (PDF, 2.2mb)
Stacie Greby, Laura Pabst, LaTreace Harris, James A. Singleton - CDC NCHS,
Vicki Pineau, Margrethe Montgomery - NORC at the University of Chicago

Survey Development Tools and Standards

Survey development methods have a significant impact on operations and data quality in household and business data collections. Well-defined metadata and taxonomy/lexicon tools can support standards throughout a survey lifecycle. The importance of such tools has grown with the variety of survey modes - CADI, CAPI, CARI, etc. These presentations discuss innovative tools to effectively capture complex elements of survey systems.
Coordinator: Eileen O'Brien, Energy Information Administration
Presentation Materials:
The Evolution of Electronic Questionnaire Collection at Statistics Canada - Interviewer EQ / Respondent EQ (iEQ/rEQ) (PDF, 627kb)
Wade Kuseler - Statistics Canada
CAPI on a Shoestring: Lessons Learned, Bruises Earned (PDF, 4.2mb)
Benjamin J. Earnhart - University of Iowa
Developing Responsive Mobile Web Systems Using Open Source Frameworks (PDF, 1.3mb)
Christopher Siege, Debra Fleischmann - RTI International
Lessons Learned using Open Source Software (PDF, 1.3mb)
Abie Reifer - Westat
Taxonomy / Lexicon Project at BLS (PDF, 466kb)
Dan Gillman - Bureau of Labor Statistics
Standards Based Metadata Usage at Statistics Denmark and Statistics New Zealand (PDF, 432kb)
Jeremy Iverson, Dan Smith - Colectica