2014 Federal CASIC Workshops
 

Workshops Program

Opening Day - March 18, 2014, 9:00-noon

Opening Keynote Speaker

Tuesday, March 18, 9:00-10:25
What Does Adaptive Design Mean to You?
Peter Miller, U.S. Census Bureau


There is much talk in the survey research community about adaptive design and responsive design. A number of papers to be presented at this meeting advertise the concepts. What do these terms mean? Is there a single definition for each? Are they the same? Do they refer to something new, or are they just new labels for old approaches? Regardless of how we define them, what do they imply about survey goals? What do they describe about how surveys are executed? What do adaptive or responsive designs actually achieve? What, if anything, do they portend for the future of the survey enterprise?

I will offer answers to these questions based mainly on observations of work at the Census Bureau. The panel discussion that follows will reveal what adaptive design means to others. Some clarity and an agenda for research will result.

Peter V. Miller is Chief of the Center for Survey Measurement at the United States Bureau of the Census, and Chief Scientist in the Bureau's Center for Adaptive Design. He joined the staff of the Census Bureau in 2011.

Before arriving at Census, Miller spent 29 years at Northwestern University, where he holds an appointment as Professor Emeritus. At Northwestern, he served at various times as Associate Professor, Van Zelst Research Professor, Director of the Institute for Modern Communication, Chair of the Department of Communication Studies and Associate Dean for External Programs in the School of Communication. He also has held faculty positions at the University of Michigan, University of Illinois, Urbana-Champaign, and at Purdue University.

Miller was Editor-in-Chief of Public Opinion Quarterly from 2001 to 2008. He has held several elective offices in the American Association for Public Opinion Research (AAPOR), most recently serving as President in 2009-2010. He received the Harry W. O'Neill Award for Outstanding Achievement from the New York Chapter of AAPOR in 2012. He was also named a Fellow of the Midwest Chapter of AAPOR in 2012. His research has included work on interviewer and mode effects in surveys and survey nonresponse.
Presentations:
What Does Adaptive Design Mean to You?

Plenary Panel

Tuesday, March 18, 10:35-noon
What is Adaptive Design in Practice?


The panel will pick up on themes introduced in the keynote talk. Participants from agencies and survey organizations will discuss their direct experience with adaptive design. Each participant will address these questions: What are the goals of adaptive survey design in particular studies? What design features and interventions have been tried? What paradata resources are employed? How is data collection monitored, and what are the decision rules for adaptation? What systems development has been undertaken to make adaptive design implementation possible? Have goals been achieved or not? What impediments and difficulties have been experienced? What is the agenda for continuing work?

Panelists:
Session Chair: Peter Miller, U.S. Census Bureau
Presentations:
What is Adaptive (or Responsive) Design in Practice? Approaches, Experiences, and Perspectives
Dan Pratt - RTI International

March 18, 2014, 1:30-4:30

Recent Innovations

Tuesday, March 18, 1:30-4:30
This session will give organizations an opportunity to share information about recent innovations in CASIC approaches, including hardware, software, training, research, new surveys, etc.

The following speakers will be presenting:
Coordinator: Barbara LoPresti, U.S. Census Bureau

Demonstrations

Tuesday, March 18, 1:30-4:30
The 2014 FedCASIC will host demonstrations from organizations with innovative CASIC instruments and software. Participating organizations will have the opportunity to showcase their CASIC-related technologies to government and academic survey methodologists. The demonstrations will take place in a small exhibit hall setting, with tables set up for demonstrations. Demonstrators may use laptops, displays, and handouts to present their technologies to FedCASIC attendees. Attendees will be free to move throughout the exhibits and learn about new CASIC instruments and software.
Target Audience: Government, academic, and industry professionals with interest in computer assisted survey information collection (CASIC).
Coordinator: Matthew Burgess, Bureau of Labor Statistics
Presentations:
Enterprise Architecture of Big Data High Performance Computing Solutions for Research
Daniel Gwynne, Johannes Huessy, Scot Ausborn, Timothy Mulcahy, Will Johnson - NORC at the University of Chicago
Liberty Demonstration: Integrating Performance Measurement with Healthcare Sensors
Stacy Stonich, Bhanuj Soni - NORC at the University of Chicago
Using Windows-based Tablets for CAI Data Collection: The Caribbean Netherlands Experience
Roger Linssen - Statistics Netherlands
A Solution to Managing Field Data Collections: SmartField
Sean Harrington, Gene Shkolnikov, Shawn Marsh - Mathematica Policy Research, Inc.

Technical Workshop Session Topics

March 19, 2014, 9:00-noon

Mobile Devices

Wednesday, March 19, 9:00-noon
The increasing use of smartphone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also create new opportunities to monitor field surveys through real-time tracking and communication. This session examines some of these challenges and advances.
Target Audience: IT professionals, field and survey managers, usability specialists
Coordinators: Lew Berman, ICF International
Patricia LeBaron, RTI International
Presentations:
Does Screen Size Affect Interviewer Data Quality for CAPI Surveys? A Comparison of Smartphones and Tablets from Kenya
Sam Haddaway, Sarah Hughes - NORC at the University of Chicago
Field Data Collection in Area Frame Survey Utilizing iPads - USDA's June Area Survey
Claire Boryan, Michael Gerling - National Agricultural Statistics Service
Experiences in Mobile Survey Technology: Technological and Practical Issues
Andrew Jeavons - Survey Analytics, LLC,
Ajay K. Sethi - University of Wisconsin-Madison
Integrating Sensors into Mobile Data Collection - Challenges, Benefits, and Future Possibilities
Preeta Chickermane, Steven Ross - NORC at the University of Chicago
Use of Smartphones to Collect Information about Health Behaviors: A Feasibility Study
Sean Hu - CDC NCHS,
Naomi Freedner, Piper DuBray, James Dayton - ICF International

Improving Response Rates

Wednesday, March 19, 9:00-noon
Today's presentations deal with the use of incentives, methods to reduce refusals by increasing the precision and complexity of the factors used to contact sample members, and the use of social media affiliation.
Target Audience: Project managers, survey managers, survey methodologists, statisticians, analysts, field managers, survey operations specialists
Coordinator: Jim Caplan, Department of Defense
Presentations:
Evaluating the Effectiveness of Early Bird Incentives in a Web Survey
Christopher Ward, Michael Stern, Jennifer Vanicek - NORC at the University of Chicago,
Carla L. Black, Cindi Knighton, Larry Wilkinson - CDC NCHS
Approaches to Increase Survey Participation and Data Quality in an At-risk, Youth Population
Lisbeth Goble, Amanda L. Skaff, Jillian Stein, Lisa K. Schwartz - Mathematica Policy Research, Inc.
Social Media in Survey Research: Can Facebook Friendship Enhance Traditional Survey Strategies?
Jillian Stein, Amanda L. Skaff, Lisa K. Schwartz - Mathematica Policy Research, Inc.
Strategies for Increasing Efficiency of Cellular Telephone Samples
Kurt Peters, William Robb - ICF International,
Cristine Delnevo, Daniel A. Gundersen - Rutgers School of Public Health

Case Studies

Wednesday, March 19, 9:00-noon
Case studies highlight issues of data collection and processing, including innovative and efficient approaches to survey operations. The case studies explore best practices from a variety of surveys, including the Canadian Consumer Price Index (Statistics Canada), the Quarterly Census of Employment and Wages (BLS), the Survey of Business Owners (Census), and a survey conducted by the City of Los Angeles.
Target Audience: Survey managers, survey methodologists, statisticians, analysts.
Coordinator: Barbara Bibb, RTI International
Presentations:
Making Ourselves Helpful and Welcome: Efficient, Informative, and Unobtrusive Data Collection for City Government
Brian Calfano - Missouri State Univ. and City of LA Human Relations Commission,
Sheldon Cruz, Francisco Ortega, Farrah Parker, Joumana Silyan-Saba, Patricia Villasenor - City of LA Human Relations Commission
Consumer Price Index Collection in Canada - Today and Tomorrow
André Girard, Elizabeth Abraham - Statistics Canada
2012 Survey of Business Owners
Mary Frauenfelder - U.S. Census Bureau
Annual Refiling Survey Web Collection in the Quarterly Census of Employment and Wages Program
John Peters, Dipak Subedi - Bureau of Labor Statistics
Combining Data Collection on the Web in the Quarterly Census of Employment and Wages Program
Kelly Quinn, Ben Cuttitta, Sania Khan - Bureau of Labor Statistics

CASIC Survey Management Challenges

Wednesday, March 19, 9:00-noon
This panel provides a venue for presenting and discussing the management and administrative challenges in today's CAI environment. The session is divided into two topic areas and within each of these topics, four panelists and a moderator address current issues, approaches taken, and lessons learned. Following the panelist presentations, the moderator leads a discussion on the topic area with audience participation. Panelists will be representatives from government agencies and contractor organizations. Session attendees will hear about the techniques used in different organizations to address key management issues, participate in a discussion of these issues, and have an opportunity to ask the panelists about effective approaches to common situations.

Management Challenges Related to Recent Technology Advancements - Panelists:
Management Challenges Related to Survey Project Management - Panelists:
Target Audience: CAI technologists and project managers
Coordinators: Karen Davis, RTI International
Jane Shepherd, Westat

March 19, 2014, 1:30-4:30

Survey Evaluation Methods

Wednesday, March 19, 1:30-4:30
Survey evaluation is critical, especially for continuous quality improvement of survey procedures and results. Numerous methods are available for testing and evaluating computer-assisted data collection. The goal of evaluation is to increase data quality and reduce cost throughout the survey lifecycle, including survey redesign, question development, and the monitoring of fieldwork operations. Innovative approaches to survey evaluation include research on the implementation of mixed modes, the adoption of mobile devices, cognitive and usability testing, and Event History Calendars. The research presented identifies challenges and opportunities for future survey development.
Target Audience: Survey methodologists, survey, project and field managers, testers
Coordinators: Carl Ramirez, Government Accountability Office
Gina Cheung, Survey Research Center, University of Michigan
Presentations:
Using Mixed Methods to Evaluate Survey Questionnaires
Heather Ridolfo, Kathy Ott - National Agricultural Statistics Service
Multi-mode Testing at Statistics Canada
Mireille Paquette - Statistics Canada
How Did We Do? Evaluating Data for the 2013 Survey of Income and Program Participation Field Test
Matthew C. Marlay, Jason M. Fields - U.S. Census Bureau
New Dimensions of Data Quality Measurement in Mobile Field Data Collection
Michael Keating, Charles Loftis, Joseph McMichael, Jamie Ridenhour - RTI International
The Q-Suite (Q-Bank and Q-Notes)
Meredith Massey, Justin Mezetin - CDC NCHS

Adaptive Design and Multimode Surveys

Wednesday, March 19, 1:30-4:30
Multimode designs are an increasingly common approach to obtaining the optimal balance of cost and quality. Multimode designs continue to evolve, often in parallel with changes in technology.

Adaptive design and paradata supply intelligence for decision making in the planning and execution of survey processes, often in conjunction with multimode survey designs.

This session will showcase ways in which agencies are utilizing these methodologies and techniques (and associated technologies) on a diverse set of studies.

The session will include lively discussion with both presenters and audience members regarding recent findings and innovations in the use of adaptive design, the challenges and limitations organizations face in utilizing adaptive design and mixed-mode surveys, and the use of paradata-driven solutions to enhance the efficiency and effectiveness of operations.
Target Audience: Survey managers and methodologists, and field managers
Coordinators: Chris Stringer, RTI International
Dan Zahs, Survey Research Center, University of Michigan
Presentations:
Census Non-Response Follow Up: The 2016 Plan
Parry Steen - Statistics Canada
Multimode Survey Management
Jerry Wernimont, Kathleen O'Reagan - Westat
Practical Implementation of Adaptive Design in CATI Surveys - Can Adaptive Designs Really be 'Better, Faster and Cheaper'?
John Boyle, James Dayton - ICF International
Mobile Case Management for Real-Time Sample Prioritization Using SMARTField
Jennifer McNulty, Daniel J. Friend, Tiffany Waits - Mathematica Policy Research
On-site vs. Remote Records Data Collection
Alicia Leonard, Emily Weaver - Mathematica Policy Research

Confidentiality and Security

Wednesday, March 19, 1:30-4:30
These topics cover a range of issues, from protecting the interviewer and interviewing tools in the field, to configuring systems locally and in the cloud to meet current security standards and compliance requirements, to emerging technologies and their implications for security. A variety of tools and methods are assessed, including proactive approaches for ensuring confidentiality.
Target Audience: Project managers, survey managers, IT professionals, survey methodologists, accessibility professionals
Coordinator: David Charbonneau, U.S. Census Bureau
Presentations:
Using the Cloud as a Software Testing Solution in a USGCB Environment
Roger Jesrani, Christopher Siege, Nathan Sikes - RTI International
Conducting an Anonymous Survey with Follow-Up Targeted to Non-Responders
Margaret Collins, Betsy Payn - Battelle Health & Analytics
A Method of Determining the Typology of Surveyed Employee Groups
Jeremy Cochran, Robert Teclaw, Katerine Osatuke - Veterans Health Administration
Attention to Data Confidentiality in CAI Studies
Tom Krenzke - Westat
Security for Mobile Devices
Dennis Pickett - Westat

Field Operations

Wednesday, March 19, 1:30-4:30
These presentations feature a variety of traditional and innovative computer-assisted interviewing practices, including new methods for evaluating and training interviewers through the use of paradata and eLearning. This session will also cover the use of predictive dialing and workflow management solutions to reduce costs and improve efficiency in field operations. Some of the presentations refer to behavior coding, CARI, help desk support, and interviewer certification.
Target Audience: Field, survey and project managers
Coordinators: Fern Bradshaw, U.S. Census Bureau
Kristina Rall, Mathematica Policy Research, Inc.
Presentations:
Certifiable? Using Paradata to Evaluate Field Representatives' Performance in the Survey of Income and Program Participation
Jason M. Fields, Matthew C. Marlay - U.S. Census Bureau
Creating a More User Friendly CARI System
Aaron Maitland, Laura Branden, Susan Genoversa - Westat
Predictive Dialing for CATI Surveys
Peter Ha - Westat
Use of eLearning to Support Training for CAI Data Collectors
Laura Branden - Westat
Enhancing an Organization's Capabilities for Technical Assistance and Stakeholder Communication
Barri Braddy Burrus, Susanna Cantor, Charles Ebel, David Foster, Michael Price, Nathan Sikes, Howard Speizer - RTI International

March 20, 2014, 9:00-noon

Usability and Accessibility

Thursday, March 20, 9:00-noon
This session will cover usability and accessibility of government surveys. Two presentations will cover remote usability testing, a relatively new method for conducting usability tests with participants from anywhere. A third presentation will share what one agency learned by running many usability tests and cognitive interviews. The final presentation will discuss guidelines to improve the accessibility of web surveys.
Target Audience: Usability and accessibility professionals, and survey methodologists
Coordinator: Jean Fox, Bureau of Labor Statistics
Presentations:
Using WebEx for Usability Testing: Considerations for Establishment Surveys
Heidi M. St. Onge, Herman A. Alvarado, Kristin J. Stettler - U.S. Census Bureau
A Comparison of Remote Unsynchronized Usability Testing and in-house Lab Testing of a Census Bureau Website
Elizabeth Nichols, Erica Olmsted-Hawala, Marylisa Gareau - U.S. Census Bureau
Planning for the Future: Usability Testing for the 2020 Census
Emily Geisen, Murrey Olmsted - RTI International,
Patti Goerman, Sabin Lakhe - U.S. Census Bureau
A Questionnaire Guide to Web Survey Accessibility
Mark Pierzchala - MMP Survey Services, LLC

Incorporating New Technologies on the Consumer Expenditure Survey

Thursday, March 20, 9:00-noon
The Bureau of Labor Statistics (BLS) is redesigning the Consumer Expenditure Survey (CE) to incorporate new technologies, such as mobile applications and a web diary. This session provides an overview of the technological features of the new design, as well as the results of current research and development. The goal is to discuss the critical issues considered when introducing new technologies for a large, continuing survey. The first paper provides an overview of the tradeoffs considered when choosing between different technological design features. The second paper summarizes results of a recently completed field test of a web diary. The third and fourth papers describe the research and development of a mobile-optimized web instrument for individual diary-keepers in the household. The fifth paper reports on a survey of CE data users who provided feedback on the new features of the redesign and its effects on the user community.
Target Audience: Survey Managers interested in transitioning ongoing surveys to mobile and web-based data collection methods.
Coordinator: David Cantor, Westat
Presentations:
Incorporating Technology in CE's New Design
David Cantor - Westat,
Laura Paszkiewicz - Bureau of Labor Statistics
Results from a Web Diary Feasibility Test
Ian Elkin - Bureau of Labor Statistics
Developing a Mobile-Optimized Web Instrument for the Consumer Expenditure Diary Survey
Nhien To, Brandon Kopp, Erica Yu, Jean Fox - Bureau of Labor Statistics
Evaluating the Usability of a Mobile-Optimized Web Instrument
Brandon Kopp, Erica Yu, Jean Fox, Nhien To - Bureau of Labor Statistics
Data Users' Concerns of the CE Redesign
Bill Passero - Bureau of Labor Statistics

Hard-to-reach Populations

Thursday, March 20, 9:00-noon
This session presents innovative and traditional concepts and techniques for surveying hard-to-reach populations, addressing both the statistical and survey design aspects of including hard-to-reach groups. Researchers will report findings from censuses, surveys, and other research related to the identification, definition, measurement, and methodologies for surveying and enumerating undercounted populations.
Target Audience: Survey methodologists, survey managers, data collection managers, analysts, statisticians, sampling statisticians
Coordinators: John Baker, U.S. Census Bureau
Andrew Zukerberg, National Center for Education Statistics
Presentations:
Surveillance of Seasonal Influenza Vaccination Coverage among Pregnant Women and Health Care Personnel in the United States
Sara M.A. Donahue - Abt Associates,
Carla L. Black, Stacie Greby, Helen Ding, Anup Srivastav - CDC NCHS,
Rachel Martonik - Abt SRBI,
David Izrael, Sarah W. Ball - Abt Associates,
Charles DiSogra - Abt SRBI,
Deborah K. Walker - Abt Associates
Dynamic Sampling and Data Collection Systems for Hard-to-Reach Populations
Ronaldo Iachan, Tonja Kyle, David Radune, Deirdre Middleton - ICF International
Oversampling Minorities in the National Alcohol Survey using the Zip Code Tabulation Area File
Pedro Saavedra, Shelley Osborn, Naomi Freedner - ICF International,
Thomas Greenfield, Katherine Karriker-Jaffe - Public Health Institute
Effect of Recruitment Mode on Survey Panel Participation, Retention, and Response
Heather Driscoll, Kurt Peters - ICF International
Using Google Ads to Target Potential Focus Group Respondents
Amelia Burke-Garcia - Westat

Administrative and Linked Records

Thursday, March 20, 9:00-noon
With decreasing response rates and increasing survey costs, interest in ways to incorporate administrative records in surveys is growing. Options include substituting administrative data for survey data entirely, integrating administrative records and survey data, or using administrative data to assess survey data (or vice versa).

This session will highlight examples of innovative uses of administrative records and record linkage methods for household and establishment surveys, and discuss the impact of these approaches on cost and data quality, including coverage.
Target Audience: Survey managers, program directors, test engineers, statisticians, methodologists, system evaluators, researchers, analysts
Coordinator: Brad Edwards, Westat
Presentations:
How Good Is Your Record Linkage System, Really?
K. Bradley Paxton - ADI LLC.
The Nature of the Bias When Studying Only Linkable Person Records: Evidence from the American Community Survey
Brittany Bond - U.S. Department of Commerce,
J. David Brown, Adela Luque, Amy O'Hara - U.S. Census Bureau
Coverage of the Foreign-Born Population in Administrative Records: Magnitude and Characteristics
Renuka Bhaskar, Leticia Fernandez, Sonya Rastogi - U.S. Census Bureau
Comparing Administrative Records and 2010 Census Persons
James Noon, Sonya Rastogi, Ellen Zapata - U.S. Census Bureau
Redesigning National School Surveys: Coverage Improvement Using Multiple Datasets
William Robb, Kate Flint, Alice Roberts, Ronaldo Iachan - ICF International
Exploring Synergy: Using Survey and Administrative Data Systems to Monitor Local, State, and National Immunization Programs
Stacie Greby, Laura Pabst, LaTreace Harris, James A. Singleton - CDC NCHS,
Vicki Pineau, Margrethe Montgomery - NORC at the University of Chicago

Survey Development Tools and Standards

Thursday, March 20, 9:00-noon
Survey development methods have a significant impact on operations and data quality in household and business data collections. Well-defined metadata and taxonomy/lexicon tools can support standards throughout a survey lifecycle. The importance of such tools has grown with the variety of survey modes - CADI, CAPI, CARI, etc. These presentations discuss innovative tools to effectively capture complex elements of survey systems.
Target Audience: Survey methodologists, survey managers, field managers, and those who design and use tools to meet standards for data collection, processing and dissemination
Coordinator: Eileen O'Brien, Energy Information Administration
Presentations:
The Evolution of Electronic Questionnaire Collection at Statistics Canada - Interviewer EQ / Respondent EQ (iEQ/rEQ)
Wade Kuseler - Statistics Canada
CAPI on a Shoestring: Lessons Learned, Bruises Earned
Benjamin J. Earnhart - University of Iowa
Developing Responsive Mobile Web Systems Using Open Source Frameworks
Christopher Siege, Debra Fleischmann - RTI International
Lessons Learned using Open Source Software
Abie Reifer - Westat
Taxonomy / Lexicon Project at BLS
Dan Gillman - Bureau of Labor Statistics
Standards Based Metadata Usage at Statistics Denmark and Statistics New Zealand
Jeremy Iverson, Dan Smith - Colectica
 
