Held March 18th to March 20th, 2014 at the Bureau of Labor Statistics, Washington, D.C.
Sponsored by the Bureau of Labor Statistics and the U.S. Census Bureau.
2014 FedCASIC Presentations
Opening Keynote Speaker
What Does Adaptive Design Mean to You?
Peter Miller, U.S. Census Bureau
There is much talk in the survey research community about adaptive design and responsive design. A number of papers to be presented at this meeting advertise the concepts. What do these terms mean? Is there a single definition for each? Are they the same? Do they refer to something new, or are they just new labels for old approaches? Regardless of how we define them, what do they imply about survey goals? What do they describe about how surveys are executed? What do adaptive or responsive designs actually achieve? What, if anything, do they portend for the future of the survey enterprise?
I will offer answers to these questions based mainly on observations of work at the Census Bureau. The panel discussion that follows will reveal what adaptive design means to others. Some clarity and an agenda for research will result.
Peter V. Miller is Chief of the Center for Survey Measurement at the United States Bureau of the Census, and Chief Scientist in the Bureau's Center for Adaptive Design. He joined the staff of the Census Bureau in 2011.
Before arriving at Census, Miller spent 29 years at Northwestern University, where he holds an appointment as Professor Emeritus. At Northwestern, he served at various times as Associate Professor, Van Zelst Research Professor, Director of the Institute for Modern Communication, Chair of the Department of Communication Studies and Associate Dean for External Programs in the School of Communication. He also has held faculty positions at the University of Michigan, University of Illinois, Urbana-Champaign, and at Purdue University.
Miller was Editor-in-Chief of Public Opinion Quarterly from 2001 to 2008. He has held several elective offices in the American Association for Public Opinion Research (AAPOR), most recently serving as President in 2009-2010. He received the Harry W. O'Neill Award for Outstanding Achievement from the New York Chapter of AAPOR in 2012. He was also named a Fellow of the Midwest Chapter of AAPOR in 2012. His research has included work on interviewer and mode effects in surveys and survey nonresponse.
Plenary Panel
What is Adaptive Design in Practice?
The panel will pick up on themes introduced in the keynote talk. Participants from agencies and survey organizations will discuss their direct experience with adaptive design. Each participant will address these questions: What are the goals of adaptive survey design in particular studies? What design features and interventions have been tried? What are the paradata resources employed? How is monitoring of aspects of data collection accomplished and what are decision rules for adaptation? What systems development has been undertaken to make adaptive design implementation possible? Have goals been achieved or not? What impediments and difficulties have been experienced? What is the agenda for continuing work?
Panelists:
- Dan Pratt, RTI
- Tiffany Waits, Mathematica Policy Research
- Wang-Ying Chang, National Science Foundation
- Brian Harris-Kojetin, Office of Management and Budget
Session Chair: Peter Miller, U.S. Census Bureau
Recent Innovations
This session will give organizations an opportunity to share information about recent innovations in CASIC approaches, including hardware, software, training, research, new surveys, etc.
The following speakers will be presenting:
- Karen Davis, RTI
- Abie Reifer, Westat
- Lon Hofmann, Statistics Netherlands
- Steve Lehrfeld, Mathematica
- Angela DeBello, NORC
- Ron Jarmin, U.S. Census Bureau
- Kimberly Noonan, NSF
Demonstrations
The 2014 FedCASIC will host demonstrations from organizations with innovative CASIC instruments and software. Participating organizations will have the opportunity to showcase their CASIC-related technologies to government and academic survey methodologists. The demonstrations will take place in a small exhibit hall setting, with tables set up for each demonstrator. Demonstrators may use laptops, displays, and handouts to present their technologies to FedCASIC attendees, who will be free to move throughout the exhibits and learn about new CASIC instruments and software.
Mobile Devices
The increasing use of smartphone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also create new opportunities to monitor field surveys through real-time tracking and communication. This session examines some of these challenges and advances.
Presentation Materials:
Does Screen Size Affect Interviewer Data Quality for CAPI Surveys? A Comparison of Smartphones and Tablets from Kenya (3.1mb) | Sam Haddaway, Sarah Hughes - NORC at the University of Chicago |
Field Data Collection in Area Frame Survey Utilizing iPads - USDA's June Area Survey (4.2mb) | Claire Boryan, Michael Gerling - National Agricultural Statistics Service |
Experiences in Mobile Survey Technology: Technological and Practical Issues (1.4mb) | Ajay Sethi Presentation (11.7mb) | Andrew Jeavons - Survey Analytics, LLC, Ajay K. Sethi - University of Wisconsin-Madison |
Use of Smartphones to Collect Information about Health Behaviors: A Feasibility Study (1.9mb) | Sean Hu - CDC NCHS, Naomi Freedner, Piper DuBray, James Dayton - ICF International |
Improving Response Rates
Today's presentations address the use of incentives, methods to reduce refusals through more precise and sophisticated strategies for contacting sample members, and the use of social media affiliation.
Presentation Materials:
Evaluating the Effectiveness of Early Bird Incentives in a Web Survey (937kb) | Christopher Ward, Michael Stern, Jennifer Vanicek - NORC at the University of Chicago, Carla L. Black, Cindi Knighton, Larry Wilkinson - CDC NCHS |
Approaches to Increase Survey Participation and Data Quality in an At-risk, Youth Population (319kb) | Lisbeth Goble, Amanda L. Skaff, Jillian Stein, Lisa K. Schwartz - Mathematica Policy Research, Inc. |
Social Media in Survey Research: Can Facebook Friendship Enhance Traditional Survey Strategies? (176kb) | Jillian Stein, Amanda L. Skaff, Lisa K. Schwartz - Mathematica Policy Research, Inc. |
Strategies for Increasing Efficiency of Cellular Telephone Samples (1.1mb) | Kurt Peters, William Robb - ICF International, Cristine Delnevo, Daniel A. Gundersen - Rutgers School of Public Health |
Case Studies
Case studies highlight issues of data collection and processing, including innovative and efficient approaches to survey operations. They explore best practices from a variety of surveys, including the Canadian Consumer Price Index (Statistics Canada), the Quarterly Census of Employment and Wages (BLS), the Survey of Small Businesses (Census), and a survey conducted by the City of Los Angeles.
CASIC Survey Management Challenges
This panel provides a venue for presenting and discussing the management and administrative challenges in today's CAI environment. The session is divided into two topic areas and within each of these topics, four panelists and a moderator address current issues, approaches taken, and lessons learned. Following the panelist presentations, the moderator leads a discussion on the topic area with audience participation. Panelists will be representatives from government agencies and contractor organizations. Session attendees will hear about the techniques used in different organizations to address key management issues, participate in a discussion of these issues, and have an opportunity to ask the panelists about effective approaches to common situations.
Management Challenges related to Recent Technology Advancements Panelists:
- Karen Davis, RTI
- Ananth Koppikar, Mathematica
- Gina-Qian Cheung, University of Michigan
- Josh Seeger, NORC
Management Challenges related to Survey Project Management Panelists:
- William Samples, U.S. Census Bureau
- Michael Horrigan, Bureau of Labor Statistics
- Debra Wright, Mathematica
- Patty Maher, University of Michigan
Survey Evaluation Methods
Survey evaluation is critical, especially for continuous quality improvement of survey procedures and results. Numerous methods are available for testing and evaluating computer-assisted data collection. The goal of the evaluation is to increase data quality and to reduce cost throughout the survey lifecycle, including survey redesign, question development, and monitoring fieldwork operations. Innovative approaches to survey evaluation include research on implementation of mixed modes, adoption of mobile devices, cognitive and usability testing, and Event History Calendars. The research presented identifies challenges and opportunities for future survey development.
Coordinators: Carl Ramirez, Government Accountability Office; Gina Cheung, Survey Research Center, University of Michigan
Presentation Materials:
Using Mixed Methods to Evaluate Survey Questionnaires (612kb) | Heather Ridolfo, Kathy Ott - National Agricultural Statistics Service |
Multi-mode Testing at Statistics Canada (222kb) | Mireille Paquette - Statistics Canada |
How Did We Do? Evaluating Data for the 2013 Survey of Income and Program Participation Field Test (5.5mb) | Matthew C. Marlay, Jason M. Fields - U.S. Census Bureau |
New Dimensions of Data Quality Measurement in Mobile Field Data Collection (1.3mb) | Michael Keating, Charles Loftis, Joseph McMichael, Jamie Ridenhour - RTI International |
The Q-Suite (Q-Bank and Q-Notes) (1.1mb) | Meredith Massey, Justin Mezetin - CDC NCHS |
Adaptive Design and Multimode Surveys
Multimode designs are an increasingly common approach to obtaining the optimal balance of cost and quality. Multimode designs continue to evolve, often in parallel with changes in technology.
Adaptive design and paradata supply intelligence for decision making in the planning and execution of survey processes, often in conjunction with multimode survey designs.
This session will showcase ways in which agencies are utilizing these methodologies and techniques (and associated technologies) on a diverse set of studies.
The session will include lively discussion among presenters and audience members regarding recent findings and innovations in the use of adaptive design, the challenges and limitations organizations face in using adaptive design and mixed-mode surveys, and paradata-driven solutions that enhance the efficiency and effectiveness of operations.
Confidentiality and Security
This session covers a range of issues, from protecting interviewers and interviewing tools in the field, to configuring systems locally and in the cloud to meet current security standards and compliance requirements, to emerging technologies and their implications for security. A variety of tools and methods are assessed, including proactive approaches for ensuring confidentiality.
Presentation Materials:
Using the Cloud as a Software Testing Solution in a USGCB Environment (1mb) | Roger Jesrani, Christopher Siege, Nathan Sikes - RTI International |
Conducting an Anonymous Survey with Follow-Up Targeted to Non-Responders (842kb) | Margaret Collins, Betsy Payn - Battelle Health & Analytics |
A Method of Determining the Typology of Surveyed Employee Groups (1mb) | Jeremy Cochran, Robert Teclaw, Katerine Osatuke - Veterans Health Administration |
Attention to Data Confidentiality in CAI Studies (708kb) | Tom Krenzke - Westat |
Security for Mobile Devices (495kb) | Dennis Pickett - Westat |
Field Operations
These presentations feature a variety of traditional and innovative computer-assisted interviewing practices, including new methods for evaluating and training interviewers through the use of paradata and eLearning. This session will also cover the use of predictive dialing and workflow management solutions to reduce costs and improve efficiency in field operations. Some presentations also touch on behavior coding, CARI, help desk support, and interviewer certification.
Presentation Materials:
Certifiable? Using Paradata to Evaluate Field Representatives' Performance in the Survey of Income and Program Participation (5mb) | Jason M. Fields, Matthew C. Marlay - U.S. Census Bureau |
Creating a More User Friendly CARI System (761kb) | Aaron Maitland, Laura Branden, Susan Genoversa - Westat |
Use of eLearning to Support Training for CAI Data Collectors (963kb) | Video (WMV, 5mb) | Laura Branden - Westat |
Enhancing an Organization's Capabilities for Technical Assistance and Stakeholder Communication (1.3mb) | Barri Braddy Burrus, Susanna Cantor, Charles Ebel, David Foster, Michael Price, Nathan Sikes, Howard Speizer - RTI International |
Usability and Accessibility
This session will cover usability and accessibility of government surveys. Two presentations will cover remote usability testing, a relatively new method for conducting usability tests with participants from anywhere. A third presentation will share what one agency learned by running many usability tests and cognitive interviews. The final presentation will discuss guidelines to improve the accessibility of web surveys.
Coordinator: Jean Fox, Bureau of Labor Statistics
Presentation Materials:
Using WebEx for Usability Testing: Considerations for Establishment Surveys (2.2mb) | Heidi M. St. Onge, Herman A. Alvarado, Kristin J. Stettler - U.S. Census Bureau |
A Comparison of Remote Unsynchronized Usability Testing and In-house Lab Testing of a Census Bureau Website (551kb) | Elizabeth Nichols, Erica Olmsted-Hawala, Marylisa Gareau - U.S. Census Bureau |
Planning for the Future: Usability Testing for the 2020 Census (2.9mb) | Emily Geisen, Murrey Olmsted - RTI International, Patti Goerman, Sabin Lakhe - U.S. Census Bureau |
A Questionnaire Guide to Web Survey Accessibility (671kb) | Mark Pierzchala - MMP Survey Services, LLC |
Incorporating New Technologies on the Consumer Expenditure Survey
The Bureau of Labor Statistics (BLS) is redesigning the Consumer Expenditure Survey (CE) to incorporate new technologies, such as mobile applications and a web diary. This session provides an overview of the technological features of the new design, as well as results of current research and development. The goal is to discuss the critical issues to consider when introducing new technologies into a large, continuing survey. The first paper provides an overview of the tradeoffs considered when choosing among different technological design features. The second paper summarizes results of a recently completed field test of a web diary. The third and fourth papers describe the research and development of a mobile-optimized web instrument for individual diary-keepers in the household. The fifth paper reports on a survey of CE data users who provided feedback on the new features of the redesign and its effects on the user community.
Coordinator: David Cantor, Westat
Presentation Materials:
Incorporating Technology in CE's New Design (381kb) | David Cantor - Westat, Laura Paszkiewicz - Bureau of Labor Statistics |
Results from a Web Diary Feasibility Test (949kb) | Ian Elkin - Bureau of Labor Statistics |
Developing a Mobile-Optimized Web Instrument for the Consumer Expenditure Diary Survey (1.5mb) | Nhien To, Brandon Kopp, Erica Yu, Jean Fox - Bureau of Labor Statistics |
Evaluating the Usability of a Mobile-Optimized Web Instrument (2.1mb) | Brandon Kopp, Erica Yu, Jean Fox, Nhien To - Bureau of Labor Statistics |
Data Users' Concerns of the CE Redesign (527kb) | Bill Passero - Bureau of Labor Statistics |
Hard-to-reach Populations
This session presents innovative and traditional concepts and techniques for surveying hard-to-reach populations. Presentations will address both the statistical and the survey design aspects of including hard-to-reach groups. Researchers will report findings from censuses, surveys, and other research related to the identification, definition, and measurement of undercounted populations, and to methodologies for surveying and enumerating them.
Coordinators: John Baker, U.S. Census Bureau; Andrew Zukerberg, National Center for Education Statistics
Presentation Materials:
Surveillance of Seasonal Influenza Vaccination Coverage among Pregnant Women and Health Care Personnel in the United States (1.2mb) | Sara M.A. Donahue - Abt Associates, Carla L. Black, Stacie Greby, Helen Ding, Anup Srivastav - CDC NCHS, Rachel Martonik - Abt SRBI, David Izrael, Sarah W. Ball - Abt Associates, Charles DiSogra - Abt SRBI, Deborah K. Walker - Abt Associates |
Dynamic Sampling and Data Collection Systems for Hard-to-Reach Populations (1.3mb) | Ronaldo Iachan, Tonja Kyle, David Radune, Deirdre Middleton - ICF International |
Oversampling Minorities in the National Alcohol Survey using the Zip Code Tabulation Area File (408kb) | Pedro Saavedra, Shelley Osborn, Naomi Freedner - ICF International, Thomas Greenfield, Katherine Karriker-Jaffe - Public Health Institute |
Using Google Ads to Target Potential Focus Group Respondents (2.5mb) | Amelia Burke-Garcia - Westat |
Administrative and Linked Records
With response rates decreasing and survey costs increasing, interest in incorporating administrative records into surveys is growing. Options include replacing survey data entirely with administrative data, integrating administrative records with survey data, or using administrative data to assess survey data (or vice versa).
This session will highlight examples of innovative uses of administrative records and record linkage methods for household and establishment surveys, and discuss the impact of these approaches on cost and data quality, including coverage.
Presentation Materials:
How Good Is Your Record Linkage System, Really? (1.5mb) | K. Bradley Paxton - ADI LLC. |
The Nature of the Bias When Studying Only Linkable Person Records: Evidence from the American Community Survey (2.6mb) | Brittany Bond - U.S. Department of Commerce, J. David Brown, Adela Luque, Amy O'Hara - U.S. Census Bureau |
Coverage of the Foreign-Born Population in Administrative Records: Magnitude and Characteristics (1.5mb) | Renuka Bhaskar, Leticia Fernandez, Sonya Rastogi - U.S. Census Bureau |
Comparing Administrative Records and 2010 Census Persons (1.5mb) | James Noon, Sonya Rastogi, Ellen Zapata - U.S. Census Bureau |
Redesigning National School Surveys: Coverage Improvement Using Multiple Datasets (738kb) | William Robb, Kate Flint, Alice Roberts, Ronaldo Iachan - ICF International |
Exploring Synergy: Using Survey and Administrative Data Systems to Monitor Local, State, and National Immunization Programs (2.2mb) | Stacie Greby, Laura Pabst, LaTreace Harris, James A. Singleton - CDC NCHS, Vicki Pineau, Margrethe Montgomery - NORC at the University of Chicago |
Survey Development Tools and Standards
Survey development methods have a significant impact on operations and data quality in household and business data collections. Well-defined metadata and taxonomy/lexicon tools can support standards throughout a survey lifecycle. The importance of such tools has grown with the variety of survey modes - CADI, CAPI, CARI, etc. These presentations discuss innovative tools to effectively capture complex elements of survey systems.
Coordinator: Eileen O'Brien, Energy Information Administration