2015 Federal CASIC Workshops
 

Held March 4th to March 5th, 2015 at US Census Bureau Headquarters, Suitland, Maryland
Sponsored by the Bureau of Labor Statistics and the US Census Bureau.

Workshops Video:

Welcome: Barbara LoPresti, 2015 FedCASIC Chair

Opening Remarks: John Thompson, Director, US Census Bureau

Keynote Speaker: Dr. Amy O'Hara, Center for Administrative Records Research and Applications, US Census Bureau


2015 FedCASIC Presentations

Wednesday Sessions

  1. Keynote Speaker
  2. Special Topics in Administrative Records (3 presentations)
  3. Field Operations Management (4 presentations)
  4. Management Challenges - Technology Costs (1 presentation)
  5. Adaptive Design (4 presentations)
  6. Special Topics in Software Development (3 presentations)
  7. Coverage Issues in Administrative Records (4 presentations)
  8. Field Operations Training (4 presentations)
  9. Management Challenges - Human Capital (1 presentation)
  10. Paradata (3 presentations)
  11. Usability (3 presentations)
  12. Big Data (3 presentations)
  13. Special Topics in Mobile Technologies (3 presentations)
  14. Survey Redesign (3 presentations)
  15. Survey Design - Sampling (3 presentations)
  16. Data Collection Systems (3 presentations)

Thursday Sessions

  1. Plenary Session: Text Analytics (1 presentation)
  2. Estimation and Imputation (3 presentations)
  3. Bring Your Own Device (BYOD) (4 presentations)
  4. Metrics and Analytics (4 presentations)
  5. Case Studies (3 presentations)
  6. Process Management (3 presentations)
  7. Demonstrations and Poster Session (6 presentations)
  8. Sampling (4 presentations)
  9. Confidentiality, Privacy, and Security (4 presentations)
  10. Special Topics in Survey Management (1 presentation)
  11. Response Rates (4 presentations)
  12. Case Management Systems (3 presentations)

Keynote Speaker

Help Wanted
Dr. Amy O'Hara, U.S. Census Bureau


Abstract: Our surveys need help. Response rates are down. Users want more detail and real-time data. Competing sources offer similar information. We turn to auxiliary data sources, seeking methods to integrate administrative records, purchased third party data, and other big data sources. We must think of these external data sources as an enabling technology, helping traditional data collection adapt to budget and response rate realities. However, data integration solutions are only viable with corresponding investments in information technology (IT) and staffing. The IT needs can be identified and pursued. Staffing appropriately is challenging. The existing blend of survey methodologists, mathematical statisticians, and statisticians has succeeded in the task to date. We need help from other disciplines to conduct our surveys, including quantitative social scientists and computer scientists. Can government attract talent to design machine learning solutions, tweak algorithms, integrate files, and showcase the data? How can we identify, hire, and retain people to develop and produce integrated data solutions?

Amy O'Hara leads the Center for Administrative Records Research and Applications, exploring the statistical uses of data from federal, state, and third party sources. After completing her doctorate in Economics from the University of Notre Dame, she began her federal career at the Census Bureau in 2004 as a researcher in the Social, Economic and Housing Statistics Division. For over a decade, she has explored methods to integrate administrative records data into Census Bureau methods and products. She led the first national analysis of administrative records quality and coverage using 2010 Census data, linking billions of records. O'Hara has developed projects to test the use of auxiliary data to enhance survey frames, contact respondents, and treat missing data. She has initiated partnerships across government and academia to leverage survey data with administrative records to analyze program participation and improve population and economic measurements.

Special Topics in Administrative Records

Organizations acquire external data sources for various purposes across the survey lifecycle. This session explores the use of administrative records in developing an alternative contact frame, as well as the successes and challenges of matching administrative data.
Coordinator: Paul Marck, U.S. Census Bureau
Presentation Materials:
Using Third Party Data to Contact Respondent (PDF, 189kb) - Dave Sheppard,
Bonnie Moore - U.S. Census Bureau
HUD Improper Payment Reporting - An Investigation of Tenant Misreporting of Income via a Data Match with the National Directory of New Hires (PDF, 343kb) - Sophia I. Zanakos,
Davia Spado,
Kelly Martin - ICF International
Overview of Administrative Data Matching to Postsecondary Studies at the National Center for Education Statistics (PDF, 743kb) - Tracy Hunt-White,
Sean Simone - U.S. Department of Education

Field Operations Management

These presentations feature new methods for hiring and evaluating field interviewer staff and sharing interviewer knowledge. Some of the presentations show how to utilize GPS and CARI to evaluate field staff by verifying locations, authenticating completed surveys, and creating a visual review of the interviewer's day superimposed on a map. Other presentations demonstrate tools that help to identify and fill vacant field interviewer positions and allow interviewers to share their knowledge about challenges in the communities in which they work.
Coordinator: Fern Bradshaw, U.S. Census Bureau
Presentation Materials:
NORC PLACES: A prototype for an interactive GIS-enabled tool for sharing and displaying community generated information about location specific obstacles to field work (PDF, 726kb) - Kyle Fennell - NORC at the University of Chicago
GPS and CARI for Quality Assurance on Mobile Data Collection (PDF, 544kb) - R. Suresh,
Steve Litavecz,
Charles Loftis,
Michael Keating - RTI International
An overview of locally developed tools to manage data collection processes in a hybrid structure of survey management (PDF, 2.6mb) - James Christy - U.S. Census Bureau
Efficiency Analysis through Geospatial Location Evaluation (PDF, 574kb) - Marsha Hasson - Westat

Management Challenges - Technology Costs

This panel will discuss current challenges for survey organizations and project managers related to predicting technology costs for survey projects. Given recent advances in technology, specifically mobile devices, personal devices (e.g., smartwatches), and sensors, how can survey organizations predict and plan for the constant need to utilize the latest technologies to enhance data collection efforts?

Coordinators: Karen Davis, RTI International
Jane Shepherd, Westat
Presentation Materials:
Management Challenges - Predicting Technology Costs (PDF, 476kb)

Adaptive Design

Making surveys more responsive to present conditions is at the forefront of survey research. Using paradata, survey data, and other sources of information to guide real-time processes and decision-making can help surveys save money and increase data quality. During this session, we will hear about various ways that surveys are implementing adaptive design paradigms to improve their surveys.
Coordinator: Peter Miller, U.S. Census Bureau
Presentation Materials:
Adaptive Design Strategies for Addressing Nonresponse Error in NCES Longitudinal Surveys (PDF, 903kb) - Sarah Crissey,
Elise Christopher,
Ted Socha - National Center for Education Statistics
Developing Data Collection Reports for Adaptive Design (PDF, 992kb) - Amang Sukasih,
Michael Sinclair,
Debra Wright,
Shilpa Khambhati,
Brendan Kirwan - Mathematica Policy Research, Inc.
Multivariate Tests for Phase Capacity (PDF, 645kb) - Taylor Lewis - U.S. Office of Personnel Management
Adaptive Design Experiments in a Longitudinal Survey: Plans for the Survey of Income and Program Participation (PDF, 500kb) - Jason M. Fields,
Stephanie Coffey,
Matthew C. Marlay,
Benjamin Reist,
Mahdi Sundukchi,
Ashley Westra - U.S. Census Bureau

Special Topics in Software Development

These presentations highlight innovative uses of technology to address persistent survey challenges: making the most of open-source tools to improve the data collection functionality of Android mobile devices, replacing paper with a robust electronic sample listing and mapping system, and creating a highly-developed survey project tracking system using real-time paradata to help managers monitor and control costs.
Coordinator: David Cantor, Westat
Presentation Materials:
Using ODK for Survey Data Collection (PDF, 856kb) - Abie Reifer - Westat
An Enterprise Approach for Listing and Mapping Application (LiMA) (PDF, 207kb) - Kim Canada - U.S. Census Bureau
Mobile Maps Application for Field Surveys (PDF, 722kb) - Katherine Morton,
Charles Loftis,
James Cajka,
James Rineer,
Bonnie Shook-Sa - RTI International

Coverage Issues in Administrative Records

This session examines different ways administrative records can be used to find unreported information and to verify reported information. Issues such as hard-to-find individuals, verifying proper reporting, and examining program results can all be aided by the use of administrative records. This session will highlight four such cases from the Census Bureau.
Coordinator: Ashley Landreth, Office of Management and Budget
Presentation Materials:
Coverage of Children in Administrative Records (PDF, 600kb) - Catherine Massey - U.S. Census Bureau
Response Error and the Medicaid Undercount in the 2009 Current Population Survey (PDF, 134kb) - James Noon,
Sonya Rastogi - U.S. Census Bureau
Federal Grant Coverage of Females and Minorities in STEM Programs: Evidence from StarMetrics Data linked to the 2010 U.S. Decennial Census (PDF, 205kb) - Catherine Buffington,
Benjamin Cerf Harris - U.S. Census Bureau,
John King - Economic Research Service,
Brett McBride - Bureau of Labor Statistics,
Michael Tzen - U.S. Census Bureau
When Race and Hispanic Origin Reporting are Discrepant Across Administrative Records Sources: Exploring Methods to Assign Responses (PDF, 151kb) - Sharon R. Ennis,
Sonya Rastogi,
James Noon - U.S. Census Bureau

Field Operations Training

Training is a critical component of data collection and processing, and it is constantly changing as new technologies are adopted and utilized. This session presents multiple perspectives on enhancing and conducting in-person and distance training activities involving a variety of participants, including enumerators, coders, and respondents.
Coordinator: Kristina Rall, Mathematica Policy Research, Inc.
Presentation Materials:
Audience Response Technology to Increase Learner Engagement (PDF, 1.5mb) - Gini Wilderson - U.S. Census Bureau
Automated Training for a New Census (PDF, 431kb) - Jennifer Kim,
Vicki McIntire,
Reginald Bingham - U.S. Census Bureau
Microsoft tools enhance training and management of offsite dietary coders (PDF, 3.5mb) - Amber Brown,
Deirdre Douglass,
Thea Palmer Zimmerman,
Suzanne McNutt - Westat
Interactive In-System Technology Training for Mobile Clinical CASIC Systems (PDF, 5.5mb) - Josh Kumpf - Henry M. Jackson Foundation

Management Challenges - Human Capital

This panel will discuss challenges related to recruiting, developing, and retaining technical staff, especially programmers and technologists. The current environment for technology staff is highly competitive, requiring creative approaches to recruiting and retention.

Coordinators: Karen Davis, RTI International
Jane Shepherd, Westat
Presentation Materials:
Management Challenges - Human Capital (PDF, 821kb)

Paradata

Using paradata, or data about survey processes, can greatly enhance surveys. These data can include trace files, contact history records, cost and effort data, and many other data that have traditionally been used only to monitor survey operations; however, we have only begun to scratch the surface of what they can offer. In this session, we will see examples of how survey designers have used paradata, along with new methods for organizing those data.
Coordinator: Chris Stringer, RTI International
Presentation Materials:
Use of Audit Trails in the Re-engineering of SIPP (PDF, 1.7mb) - Holly Fee,
Matthew C. Marlay,
Patrick Campanello - U.S. Census Bureau
Does "When" Matter: An Exploration of Timestamp Data on the Federal Employee Viewpoint Survey (PDF, 3.1mb) - Karl Hess,
Taylor Lewis - U.S. Office of Personnel Management
Standing up a New Office of Survey and Census Analytics at the U.S. Census Bureau (PDF, 2.1mb) - Frank Vitrano - U.S. Census Bureau

Usability

Usability in CASIC may manifest itself in more than a user’s experience with a data collection instrument. This session does address usability in terms of the interviewer or respondent experience, for example in how to use browser tools to help test accessibility, and how to incorporate user experience methods throughout the survey system development lifecycle. But it also reflects on making data quality monitoring systems more usable for survey management and field staff, and optimizing the performance of a data collection application across different computing environments, to make it more usable in a “Bring Your Own Device” scenario.
Coordinator: Jean Fox, Bureau of Labor Statistics
Presentation Materials:
Improving Survey Data Quality Assurance with a User-Friendly Stata Package (PDF, 562kb) - Nathan Cutler,
Danae Roumis - Social Impact, Inc.
Responsive Design (PDF, 1.4mb) - Linda Sloan - Agilex
Incorporating user experience methodologies in the software development process (PDF, 2mb) - Kathi Kohlmeyer - Northrop Grumman

Big Data

Big Data, along with adaptive design, is one of the current buzz phrases in today's survey world. This session delves into how Big Data is actually being used to improve survey response and the quality of the information collected. The presenters take a unique look at how Big Data can help at both the federal and state levels, and at how the two levels can work together.
Coordinator: Steve Klement, U.S. Census Bureau
Presentation Materials:
Opportunities and Considerations for the Use of Big Data Techniques on the Consumer Expenditure Survey (PDF, 527kb) - Brett McBride - Bureau of Labor Statistics
Where Big Data Meets Administrative Data (PDF, 266kb) - Randall J. Olsen - Center for Human Resource Research, Ohio State University
Combining Statewide BRFSS Data to Produce National Prevalence Estimates (PDF, 205kb) - Kristie Healey - ICF International

Special Topics in Mobile Technologies

The increasing use of smartphone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present implementation and security challenges. They also create new opportunities, allowing field interviewers to track and record activities and letting users enter data through the mode best suited to their needs. This session examines some of these challenges and advances.
Coordinator: Lew Berman, ICF International
Presentation Materials:
Use of Mobile Technology for CAPI Management (PDF, 1.6mb) - Abie Reifer,
Ray Snowden - Westat
Mobile Security (PDF, 6.6mb) - Stephen M. Dye - Agilex
Multi-mode User input for Mobile Clinical CASIC Systems (PDF, 936kb) - Tiffanni Reidy - Henry M. Jackson Foundation

Survey Redesign

There are many issues to consider when designing or redesigning a survey (e.g., content, navigation, and visual design), and there are pros and cons to the alternative modes of data collection: web, CAPI, CATI, and others. This session highlights conceptual differences between CAPI and CATI modes of data collection for a household survey, a web-based questionnaire development system for conducting cross-sectional or longitudinal healthcare studies, and a new Blaise web questionnaire generator compatible with an existing metadata repository.
Coordinator: Carl Ramirez, Government Accountability Office
Presentation Materials:
Does CATI Have 9 Lives? Lessons Learned Converting a SIPP Supplement from a CAPI to CATI environment (PDF, 296kb) - Cindy Easton,
Denise Lewis - U.S. Census Bureau
Redesign of the Production Statistics at Stats Netherlands (PDF, 1.1mb) - Lon Hofman - Statistics Netherlands
Survey Item Bank (PDF, 1004kb) - Jessica Graber,
Ruth Brenner - NICHD,
Steven Fink,
Joan Wang - Avar Consulting, Inc.

Survey Design - Sampling

Random sampling has long been the backbone of case selection for surveys, especially in the federal context. In recent years, more innovative sample designs have been needed to combat rising survey costs and tight budgets. In this session, we will see examples of new methods to improve survey sampling, including drone-assisted sampling, web-survey sampling, and using supplemental information to enhance the sampling frame and field readiness.
Coordinator: Joy Sharp, U.S. Department of Transportation
Presentation Materials:
Is it Feasible to Use Immunization Information Systems (IIS) as a Supplemental Sampling Frame for the National Immunization Survey (NIS)? (PDF, 614kb) - Stacie Greby,
Laura Pabst,
LaTreace Harris,
Sarah Reagan-Steiner,
Holly Hill,
Laurie D. Elam-Evans,
James A. Singleton - CDC NCHS,
Vicki Pineau,
Kathleen Santos,
Sari Schy,
Elizabeth Ormson,
Margrethe Montgomery - NORC at the University of Chicago
Drone Assisted Sample Design for Developing Countries (PDF, 2.4mb) - Joe Eyerman,
Karol Krotki,
Safaa Amer,
Ryan Gordon,
Jonathan Evans - RTI International,
Kyle Snyder - North Carolina State University
Innovations in General Population Web Panel Surveys of Households to Improve Sample Coverage and the Response Rate (PDF, 677kb) - Michael Stern - NORC at the University of Chicago

Data Collection Systems

Electronic data collection instruments and the platforms they operate on can take many forms. This session covers a variety of examples, with insights into how they might be better designed and implemented. This session will examine the use of tools that handle complex reporting scenarios, take advantage of new cloud computing utilities, and allow for adaptive design and cross-survey commonality.
Coordinator: Michael Gerling, National Agricultural Statistics Service
Presentation Materials:
Design Intuitive Interfaces that Simplify Survey Completion and Reduce Respondent Burden (PDF, 1.2mb) - Alex Schwartz - Henry M. Jackson Foundation
Cloud Deployment and Testing of Internet Data Submission Applications (PDF, 670kb) - Doug Smith - Northrop Grumman
Designing and Architecting a Shared Platform for Adaptive Data Collection (PDF, 1.4mb) - Michael T. Thieme,
Anup Mathur - U.S. Census Bureau

Plenary Session: Text Analytics

Open-ended questions have been an important part of survey questionnaires since their beginning, providing answers to important issues analysts neglected to ask and filling in gaps using “Other, specify” as an additional alternative. With computerization, very large samples, and increased urgency from decision-makers, unstructured text has fallen into disuse because of cost, time constraints, and unreliability of human coders. This panel offers the history and an update to where we are in the use of very smart computers (using new text analytic data mining tools) to sort, code and determine sentiment in unstructured text (including Big Data).

Coordinator: Jim Caplan, Department of Defense
Presentation Materials:
Text Analytics (PDF, 1024kb)

Estimation and Imputation

This session will give organizations an opportunity to share information about recent innovations in the areas of imputation and estimation. One of the presentations discusses an outlier detection and resolution method used during data collection. Two of the presentations will discuss using alternative imputation methods compared to the traditional hot deck imputation method including the use of administrative records. One of the presentations compares weighted estimates and item response rates using mixed regression models.
Coordinator: Robyn Sirkis, National Agricultural Statistics Service
Presentation Materials:
Comparison of Outlier Follow-up in an Individual vs Establishment Survey (PDF, 107kb) - Jennifer O’Brien,
Jocelyn Newsome,
Kerry Levin,
Sarah Bennett-Harper,
Stephanie Beauvais - Westat,
Brenda Schafer,
Patrick Langetieg - Internal Revenue Service
Exploratory Study Comparing Alternative Imputation Methods for the National Teacher and Principal Survey (PDF, 274kb) - Sarah Dial,
Jacob Enriquez,
Svetlana Mosina,
T. Trang Nguyen,
Allison Zotti - U.S. Census Bureau
Model Based Imputation for the Survey of Income and Program Participation (PDF, 209kb) - Martha Stinson,
Graton Gathright - U.S. Census Bureau

Bring Your Own Device (BYOD)

The use of mobile technologies to support data collection activities continues to evolve and gain prominence. Of particular interest are the opportunities posed by mobile Bring Your Own Device (BYOD) solutions for data collection, as well as the challenges associated with them. These sessions will cover a range of topics, including public and interviewer opinions of BYOD solutions, usability, and associated technologies.
Coordinator: Brian Head, RTI International
Presentation Materials:
Public Opinion on the Bring Your Own Device Concept for the 2020 Census (PDF, 390kb) - Casey Eggleston,
Jennifer Hunter Childs,
Aleia Clark Fobia - U.S. Census Bureau
Field Interviewer Attitudes and Usability Issues Surrounding BYOD for the 2020 Census (PDF, 507kb) - Jessica Holzberg,
Lawrence Malakhoff,
Lin Wang - U.S. Census Bureau
Using BYOD to Conduct Census NRFU Activities (PDF, 725kb) - Eric Atala - U.S. Census Bureau
BYOD Enabling Technology (PDF, 827kb) - Bryan Padgett - Agilex

Metrics and Analytics

Performance measurements are essential for monitoring survey operations and for conducting program evaluations. Performance measures are useful when seeking to identify, to treat, and to prevent sampling and nonsampling errors during survey operations. In addition, these performance measures provide actionable information when conducting program evaluations. This session focuses on approaches to improve call center customer service, to use paradata from adaptive design to monitor resources and measure performance of a multi-mode data collection system, and to assess data quality and cost-effectiveness of a survey program.
Coordinator: Gina Cheung, Survey Research Center, University of Michigan
Presentation Materials:
Applying Technology Solutions to Improve Federal Program Evaluation and Performance Monitoring (PDF, 1.1mb) - Bradley Epley - ICF International
Application of Data Analysis to Improve Contact Center Performance and Enhance Business Operations (PDF, 930kb) - Caryn Lesage-Jones - Agilex
An Update on Developing Response Metrics for the Economic Census (PDF, 566kb) - Eric B. Fink,
Joanna Fane Lineback - U.S. Census Bureau
Improving Survey Management: The Unified Tracking System and the National Survey of College Graduates (PDF, 507kb) - Stephanie Coffey - U.S. Census Bureau

Case Studies

This session highlights various topics for innovation in different types of surveys. We will hear about a new method for re-engineering the decennial census, survey redesign testing, and changing the collection of both consumer data and energy statistics.
Coordinator: Brad Edwards, Westat
Presentation Materials:
Reorganized Census with Integrated Technology (PDF, 646kb) - Stephanie Studds,
Jay Occhiogrosso - U.S. Census Bureau
From Telephone Interview Only to Web First Multimode: Lessons Learned from Usability and Field Testing In the Redesign of the National Survey of Children’s Health (NSCH) (PDF, 671kb) - Alyson Croen,
Sabrina Bauroth,
Michael Stern - NORC at the University of Chicago,
Reem Ghandour,
Catherine Vladutiu - Maternal and Child Health Bureau
Consumer Price Index Collection in Canada: Today and Tomorrow (PDF, 382kb) - Andrée Girard,
Paul Durk - Statistics Canada

Process Management

Development of complex, large-scale CASIC systems may be improved by incorporating Agile and DevOps practices. This session features two presentations specifically on “DevOps” (a collaborative approach to the software development lifecycle, based on agile principles of enlightened relationships between individuals and teams) and its application to survey data collection systems. A third presentation extends the concept of agility to the integration of multiple survey-related IT projects into one program.
Coordinator: Maria Hobbs, RTI International
Presentation Materials:
Dev Ops Factory (PDF, 8.7mb) - Dominic Delmolino - Agilex
Software Integration Modeled after the Scaled Agile Framework (PDF, 155kb) - Martine Kostanich - U.S. Census Bureau
Beyond the Waterfall, An Evolution to true Agility. DEVOPS in Practice (PDF, 1.1mb) - Josh Salmanson,
Keith Kapp - Customer Value Partners

Demonstrations and Poster Session

This year FedCASIC will feature 15 demonstrations and posters in an open exhibit setting in our auditorium. FedCASIC attendees will have the opportunity to see and talk with presenters directly about their demonstration and poster topics.
Coordinators: Matthew Burgess, Bureau of Labor Statistics
Eric Falk, Defense Manpower Data Center
Presentation Materials:
Demo: Building ACASI Surveys Using State of the Art Text to Speech Technology (PDF, 915kb)
Demo: CARI That Weight: Obtaining Consent to Record SIPP Interviews (PDF, 268kb)
Demo: Facilitating the transparency and scientific rigor of cognitive interviewing methodology: Demonstrating the Q-Suite tools (PDF, 128kb)
Demo: Audits based on Enumerator Behavior to Collect Survey Metadata (PDF, 4.3mb)
Demo: Sentiment Score Analysis of Establishment Survey Interviewer Notes (PDF, 751kb)
Poster: Automated Statistical Systems (PDF, 394kb)

Sampling

This session will give organizations an opportunity to share information regarding innovative sample designs. One of the presentations will compare an equal probability subsampling design and optimal allocation subsampling design with the objective of selecting larger samples in industries that have initially lower response rates. Two of the presentations discuss the use of a stratified multi-stage design. One of the presentations mentions a hybrid design combining a stratified national probability sample with non-probability samples.
Coordinator: Sean Simone, U.S. Department of Education
Presentation Materials:
Venue-Based and Real-Time Sampling Methodologies in an Intercept Survey of Cyclists (PDF, 3.1mb) - Ronaldo Iachan,
Olivia Saucier - ICF International
Sampling Techniques used to Select a Nationally Representative Sample of WIC Participants for the WIC Infant and Toddler Feeding Practices Study – 2 (ITFPS2) (PDF, 293kb) - Yumiko Sugawara,
Jill Montaquila,
Suzanne McNutt - Westat
Evaluating Propensity Score Adjustment for Combining Probability and Non-Probability Samples in a National Survey (PDF, 1.1mb) - Kurt Peters,
Heather Driscoll,
Pedro Saavedra - ICF International
Strategies for Subsampling Nonrespondents for Economic Programs (PDF, 473kb) - Stephen J. Kaputa,
Laura Bechtel,
Katherine Jenny Thompson - U.S. Census Bureau

Confidentiality, Privacy, and Security

Confidentiality, privacy, and security remain a major concern of organizations sponsoring and/or conducting surveys, as well as of respondents themselves. These sessions focus attention on building trust with respondents, managing security vulnerabilities, and encouraging respondent participation via digital media.
Coordinator: David Charbonneau, U.S. Census Bureau
Presentation Materials:
Improving Confidentiality and Trust-building Measures for Active Duty Military Respondents in Clinical CASIC Systems (PDF, 1.9mb) - Chris Olsen,
Josh Kumpf - Henry M. Jackson Foundation
Security - Vulnerability Management at Westat (PDF, 366kb) - Dennis Pickett - Westat
Trust: The Respondent View from Initial Contact through Completed Interviews (PDF, 485kb) - William S. Long - Centers for Medicare & Medicaid Services
Use of Digital Media in Recruiting Survey Participants (PDF, 1.6mb) - Amelia Burke-Garcia - Westat

Special Topics in Survey Management

Survey initiatives and field testing help to improve data quality and to reduce operational costs. This session focuses on the following innovations in survey management:
• How to create an easy-to-use graphical user interface (GUI) to allow multiple views of a timeline to improve respondent recall on important life events.
• How to locate mobile sample members in longitudinal surveys, focusing on results from three tests to assess alternative procedures - mail-out/mail-back, use of administrative and third-party data sources, and use of data from other government surveys.
• How to configure a virtual call center to offer cost-savings and to ensure quality control.
Coordinator: Shawna Waugh, Energy Information Administration
Presentation Materials:
Mobile Samples and Movers: Locating Respondents in the 2014 SIPP Panel (PDF, 453kb) - Amber Phillips,
Jason M. Fields,
Daniel P. Doyle,
Bennett Adelman - U.S. Census Bureau

Response Rates

Response rates are dropping across all types of surveys. In times of decreasing budgets, new and innovative survey designs can help combat the falling response rate. In this session, we'll see a variety of methods to increase survey response within modes and methods to move to multi-mode surveys.
Coordinator: Allina Lee, Office of Management and Budget
Presentation Materials:
Increasing Electronic Reporting for the 2012 Survey of Business Owners (PDF, 1.6mb) - Mary Frauenfelder - U.S. Census Bureau
Impact of Paper Invitation on EQ Collection Strategy (PDF, 845kb) - Anie Marcil - Statistics Canada
Improving Response Rates Using Mixed Mode Approach Results from the National Health Care Interview Survey (PDF, 995kb) - Lindsay M. Howden - U.S. Census Bureau,
Sarah S. Joestl,
Robin A. Cohen - CDC NCHS
Designing a Multipurpose Longitudinal Incentive Experiment for the Survey of Income and Program Participation (PDF, 440kb) - Ashley Westra,
Mahdi Sundukchi - U.S. Census Bureau

Case Management Systems

Integration of case management and other survey support applications is the theme of this session. These organizations have redesigned or built management systems that integrate previously independent applications and various fieldwork processes (such as communication, scheduling, mixed-mode data collection, data processing, and interviewer recruitment and payment). Their presentations discuss functional and operational features of their systems, how the systems have helped streamline complex survey workflows, and lessons learned.
Coordinator: David Morgan, U.S. Census Bureau
Presentation Materials:
Development of an Integrated Data System for the Gulf Long-Term Follow-up Study (SSSI) (PDF, 2.8mb) - David A. Johndrow,
Chris Wachtstetter,
Nicholas M. Martilik,
Mark P. Della Valle,
Matthew D. Curry,
Polly P. Armsby,
Carley L. Prynn,
Katherine Sisco - Social & Scientific Systems, Inc.,
Richard K. Kwok,
Lawrence S. Engel,
Dale P. Sandler - National Institute of Environmental Health Sciences, University of North Carolina, Chapel Hill
CAPI Management at Westat (PDF, 899kb) - Mangal Subramanian,
Ray Snowden - Westat
Integrated Management of Survey Processes at Westat (PDF, 671kb) - Jerry Wernimont - Westat