Workshops Program
Day 1 - March 4, 2015, 9:00-10:30
Keynote Speaker
Wednesday, March 4, 9:00-10:30, Room(s) Auditorium
Help Wanted
Dr. Amy O'Hara, U.S. Census Bureau
Abstract: Our surveys need help. Response rates are down. Users want more detail and real-time data. Competing sources offer similar information. We turn to auxiliary data sources, seeking methods to integrate administrative records, purchased third party data, and other big data sources. We must think of these external data sources as an enabling technology, helping traditional data collection adapt to budget and response rate realities. However, data integration solutions are only viable with corresponding investments in information technology (IT) and staffing. The IT needs can be identified and pursued. Staffing appropriately is challenging. The existing blend of survey methodologists, mathematical statisticians, and statisticians has succeeded in the task to date. We need help from other disciplines to conduct our surveys, including quantitative social scientists and computer scientists. Can government attract talent to design machine learning solutions, tweak algorithms, integrate files, and showcase the data? How can we identify, hire, and retain people to develop and produce integrated data solutions?
Amy O'Hara leads the Center for Administrative Records Research and Applications, exploring the statistical uses of data from federal, state, and third party sources. After completing her doctorate in Economics from the University of Notre Dame, she began her federal career at the Census Bureau in 2004 as a researcher in the Social, Economic and Housing Statistics Division. For over a decade, she has explored methods to integrate administrative records data into Census Bureau methods and products. She led the first national analysis of administrative records quality and coverage using 2010 Census data, linking billions of records. O'Hara has developed projects to test the use of auxiliary data to enhance survey frames, contact respondents, and treat missing data. She has initiated partnerships across government and academia to leverage survey data with administrative records to analyze program participation and improve population and economic measurements.
March 4, 2015, 10:50-12:20
Special Topics in Administrative Records
Wednesday, March 4, 10:50-12:20, Room(s) Auditorium 3
Organizations acquire external data sources for various purposes in the survey lifecycle. This session explores using administrative records in the development of an alternative contact frame, and the successes and challenges related to the matching of administrative data.
Presentations:
Using Third Party Data to Contact Respondents | Dave Sheppard, Bonnie Moore - U.S. Census Bureau |
HUD Improper Payment Reporting - An Investigation of Tenant Misreporting of Income via a Data Match with the National Directory of New Hires | Sophia I. Zanakos, Davia Spado, Kelly Martin - ICF International |
Overview of Administrative Data Matching to Postsecondary Studies at the National Center for Education Statistics | Tracy Hunt-White, Sean Simone - U.S. Department of Education |
Field Operations Management
Wednesday, March 4, 10:50-12:20, Room(s) Conference Rooms 2 & 3
These presentations feature new methods for hiring and evaluating field interviewer staff and sharing interviewer knowledge. Some of the presentations show how to utilize GPS and CARI to evaluate field staff by verifying locations, authenticating completed surveys, and creating a visual review of the interviewer's day superimposed on a map. Other presentations demonstrate tools that help to identify and fill vacant field interviewer positions and allow interviewers to share their knowledge about challenges in the communities in which they work.
Presentations:
NORC PLACES: A prototype for an interactive GIS-enabled tool for sharing and displaying community generated information about location specific obstacles to field work | Kyle Fennell - NORC at the University of Chicago |
GPS and CARI for Quality Assurance on Mobile Data Collection | R. Suresh, Steve Litavecz, Charles Loftis, Michael Keating - RTI International |
An overview of locally developed tools to manage data collection processes in a hybrid structure of survey management | James Christy - U.S. Census Bureau |
Efficiency Analysis through Geospatial Location Evaluation | Marsha Hasson - Westat |
Management Challenges - Technology Costs
Wednesday, March 4, 10:50-12:20, Room(s) Conference Room 4
This panel will discuss current challenges for survey organizations and project managers related to predicting technology costs for survey projects. With recent advances in technology, specifically mobile devices, personal devices (e.g., smartwatches), and sensors, how can survey organizations predict and plan for the constant need to utilize the latest technologies to enhance data collection efforts?
Panelists:
- Ken Robertson, BLS
- Ananth Koppikar, Mathematica
- Stacy Stonich, NORC
- Gina-Qian Cheung, University of Michigan
Adaptive Design
Wednesday, March 4, 10:50-12:20, Room(s) Auditorium 1 & 2
Making surveys more responsive to present conditions is at the forefront of survey research. Using paradata, survey data, and other sources of information to guide real-time processes and decision-making can help surveys save money and increase data quality. During this session, we will hear about various ways that surveys are implementing adaptive design paradigms to improve their surveys.
Presentations:
Adaptive Design Strategies for Addressing Nonresponse Error in NCES Longitudinal Surveys | Sarah Crissey, Elise Christopher, Ted Socha - National Center for Education Statistics |
Developing Data Collection Reports for Adaptive Design | Amang Sukasih, Michael Sinclair, Debra Wright, Shilpa Khambhati, Brendan Kirwan - Mathematica Policy Research, Inc. |
Multivariate Tests for Phase Capacity | Taylor Lewis - U.S. Office of Personnel Management |
Adaptive Design Experiments in a Longitudinal Survey: Plans for the Survey of Income and Program Participation | Jason M. Fields, Stephanie Coffey, Matthew C. Marlay, Benjamin Reist, Mahdi Sundukchi, Ashley Westra - U.S. Census Bureau |
Special Topics in Software Development
Wednesday, March 4, 10:50-12:20, Room(s) Conference Room 1
These presentations highlight innovative uses of technology to address persistent survey challenges: making the most of open-source tools to improve the data collection functionality of Android mobile devices, replacing paper with a robust electronic sample listing and mapping system, and creating a highly-developed survey project tracking system using real-time paradata to help managers monitor and control costs.
Presentations:
Using ODK for Survey Data Collection | Abie Reifer - Westat |
An Enterprise Approach for Listing and Mapping Application (LiMA) | Kim Canada - U.S. Census Bureau |
Mobile Maps Application for Field Surveys | Katherine Morton, Charles Loftis, James Cajka, James Rineer, Bonnie Shook-Sa - RTI International |
Unified Tracking System Paradata Warehouse | David Morgan - U.S. Census Bureau |
March 4, 2015, 1:30-3:00
Coverage Issues in Administrative Records
Wednesday, March 4, 1:30-3:00, Room(s) Auditorium 3
This session examines different ways administrative records can be used to find unreported information and to verify reported information. Issues such as locating hard-to-find individuals, verifying proper reporting, and examining program results can be aided by the use of administrative records. This session will highlight four such cases from the Census Bureau.
Presentations:
Coverage of Children in Administrative Records | Catherine Massey - U.S. Census Bureau |
Response Error and the Medicaid Undercount in the 2009 Current Population Survey | James Noon, Sonya Rastogi - U.S. Census Bureau |
Federal Grant Coverage of Females and Minorities in STEM Programs: Evidence from StarMetrics Data linked to the 2010 U.S. Decennial Census | Catherine Buffington, Benjamin Cerf Harris - U.S. Census Bureau, John King - Economic Research Service, Brett McBride - Bureau of Labor Statistics, Michael Tzen - U.S. Census Bureau |
When Race and Hispanic Origin Reporting are Discrepant Across Administrative Records Sources: Exploring Methods to Assign Responses | Sharon R. Ennis, Sonya Rastogi, James Noon - U.S. Census Bureau |
Field Operations Training
Wednesday, March 4, 1:30-3:00, Room(s) Conference Room 1
Training is a critical component of data collection and processing that is constantly changing as new technologies are being adapted and utilized. This session presents multiple perspectives to enhance and conduct in-person and distance training activities involving a variety of participants including enumerators, coders, and respondents.
Presentations:
Audience Response Technology to Increase Learner Engagement | Gini Wilderson - U.S. Census Bureau |
Automated Training for a New Census | Jennifer Kim, Vicki McIntire, Reginald Bingham - U.S. Census Bureau |
Microsoft tools enhance training and management of offsite dietary coders | Amber Brown, Deirdre Douglass, Thea Palmer Zimmerman, Suzanne McNutt - Westat |
Interactive In-System Technology Training for Mobile Clinical CASIC Systems | Josh Kumpf - Henry M. Jackson Foundation |
Management Challenges - Human Capital
Wednesday, March 4, 1:30-3:00, Room(s) Conference Room 4
This panel will discuss challenges related to recruiting, developing, and retaining technical staff, especially programmers and technologists. The current environment for technology staff is highly competitive, requiring creative approaches to recruiting and retention.
Examples of considerations that the panelists will discuss include:
- Today's environment for recruiting programming and technology staff is very competitive; what novel approaches are organizations using to recruit and hire staff with current technology skills, such as mobile application development?
- Because of these recruiting challenges, retention of key staff has become increasingly important; what novel approaches are organizations using to retain key staff?
- Along with the challenges of recruiting new staff and retaining existing staff, how are organizations planning for succession in strategic and key roles? What approaches are being used to ensure successors are ready and willing when they are needed?
Panelists:
- Arnie Wilcox, USDA-NASS
- Diane Herz, Mathematica
- Ken Robertson, BLS
- Gina-Qian Cheung, University of Michigan
- Preeta Chickermane, NORC
Paradata
Wednesday, March 4, 1:30-3:00, Room(s) Auditorium 1 & 2
Using paradata, or data about survey processes, can greatly enhance surveys; however, researchers have only begun to scratch the surface of this new and innovative approach to improving surveys. These data can include trace files, contact history records, cost and effort data, and many other data that have traditionally been used only to monitor survey operations. In this session, we will see examples of how survey designers have used paradata, along with new methods for organizing those data.
Usability
Wednesday, March 4, 1:30-3:00, Room(s) Conference Rooms 2 & 3
Usability in CASIC may manifest itself in more than a user’s experience with a data collection instrument. This session does address usability in terms of the interviewer or respondent experience, for example in how to use browser tools to help test accessibility, and how to incorporate user experience methods throughout the survey system development lifecycle. But it also reflects on making data quality monitoring systems more usable for survey management and field staff, and optimizing the performance of a data collection application across different computing environments, to make it more usable in a “Bring Your Own Device” scenario.
Coordinator: | Jean Fox, Bureau of Labor Statistics |
Presentations:
Improving Survey Data Quality Assurance with a User-Friendly Stata Package | Nathan Cutler, Danae Roumis - Social Impact, Inc. |
Responsive Design | Linda Sloan - Agilex |
Incorporating user experience methodologies in the software development process | Kathi Kohlmeyer - Northrop Grumman |
March 4, 2015, 3:20-4:50
Big Data
Wednesday, March 4, 3:20-4:50, Room(s) Auditorium 1
Big Data is one of the current buzz phrases along with adaptive design in today's survey world. This session delves into how Big Data is actually being used to improve survey response and quality of information collected. The presenters take a unique look at how this can help at both the Federal and State levels while also looking at how the two levels can work together using Big Data.
Presentations:
Opportunities and Considerations for the Use of Big Data Techniques on the Consumer Expenditure Survey | Brett McBride - Bureau of Labor Statistics |
Where Big Data Meets Administrative Data | Randall J. Olsen - Center for Human Resource Research, Ohio State University |
Combining Statewide BRFSS Data to Produce National Prevalence Estimates | Kristie Healey - ICF International |
Special Topics in Mobile Technologies
Wednesday, March 4, 3:20-4:50, Room(s) Auditorium 2
The increasing use of smartphone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present implementation and security challenges. They also offer new opportunities, allowing field interviewers to track and record activities and users to enter data through the mode best suited to their needs. This session examines some of these challenges and advances.
Presentations:
Fear, Uncertainty and Doubt: Lessons Learned/Best Practices in Mobile Security | Peter Jim, David Charbonneau - U.S. Census Bureau |
Use of Mobile Technology for CAPI Management | Abie Reifer, Ray Snowden - Westat |
Mobile Security | Stephen M. Dye - Agilex |
Multi-mode User input for Mobile Clinical CASIC Systems | Tiffanni Reidy - Henry M. Jackson Foundation |
Survey Redesign
Wednesday, March 4, 3:20-4:50, Room(s) Conference Rooms 2 & 3
There are many issues to consider when designing or redesigning a survey - e.g., content, navigation, visual design. There are pros and cons of alternative modes of data collection - web, CAPI, CATI, and other modes. This session highlights conceptual differences between CAPI and CATI modes of data collection for a household survey, a web-based questionnaire development system for conducting a cross-sectional or a longitudinal healthcare study, and a new Blaise web questionnaire generator compatible with the existing metadata repository.
Presentations:
Does CATI Have 9 Lives? Lessons Learned Converting a SIPP Supplement from a CAPI to CATI environment | Cindy Easton, Denise Lewis - U.S. Census Bureau |
Redesign of the Production Statistics at Stats Netherlands | Lon Hofman - Statistics Netherlands |
Survey Item Bank | Jessica Graber, Ruth Brenner - NICHD, Steven Fink, Joan Wang - Avar Consulting, Inc. |
Survey Design - Sampling
Wednesday, March 4, 3:20-4:50, Room(s) Conference Room 1
Random sampling has long been the backbone of case selection for surveys, especially in the federal context. In recent years, more innovative sample designs have been needed to combat rising survey costs and tight budgets. In this session, we will see examples of new methods to improve survey sampling, including drone-assisted sampling, web-survey sampling, and the use of supplemental information to enhance the sampling frame and field readiness.
Coordinator: | Joy Sharp, U.S. Department of Transportation |
Presentations:
Evaluating the effectiveness of releasing the sample to State field offices before the call out date in the Crops Acreage Production Survey | Kelly Toppin, Barbara Rater, Kevin Harding, Linda Young - National Agricultural Statistics Service |
Is it Feasible to Use Immunization Information Systems (IIS) as a Supplemental Sampling Frame for the National Immunization Survey (NIS)? | Stacie Greby, Laura Pabst, LaTreace Harris, Sarah Reagan-Steiner, Holly Hill, Laurie D. Elam-Evans, James A. Singleton - CDC NCHS, Vicki Pineau, Kathleen Santos, Sari Schy, Elizabeth Ormson, Margrethe Montgomery - NORC at the University of Chicago |
Drone Assisted Sample Design for Developing Countries | Joe Eyerman, Karol Krotki, Safaa Amer, Ryan Gordon, Jonathan Evans - RTI International, Kyle Snyder - North Carolina State University |
Innovations in General Population Web Panel Surveys of Households to Improve Sample Coverage and the Response Rate | Michael Stern - NORC at the University of Chicago |
Data Collection Systems
Wednesday, March 4, 3:20-4:50, Room(s) Auditorium 3
Electronic data collection instruments and the platforms they operate on can take many forms. This session covers a variety of examples, with insights into how they might be better designed and implemented. This session will examine the use of tools that handle complex reporting scenarios, take advantage of new cloud computing utilities, and allow for adaptive design and cross-survey commonality.
Presentations:
Dynamically programmed spreadsheets as the main collection instrument for multiunit respondents | Stephen Mangum, Jason Bauer - U.S. Census Bureau |
Design Intuitive Interfaces that Simplify Survey Completion and Reduce Respondent Burden | Alex Schwartz - Henry M. Jackson Foundation |
Cloud Deployment and Testing of Internet Data Submission Applications | Doug Smith - Northrop Grumman |
Designing and Architecting a Shared Platform for Adaptive Data Collection | Michael T. Thieme, Anup Mathur - U.S. Census Bureau |
Day 2 - March 5, 2015, 9:00-10:45
Plenary Session: Text Analytics
Thursday, March 5, 9:00-10:45, Room(s) Auditorium
Open-ended questions have been an important part of survey questionnaires since their beginning, providing answers to important issues analysts neglected to ask and filling in gaps using “Other, specify” as an additional alternative. With computerization, very large samples, and increased urgency from decision-makers, unstructured text has fallen into disuse because of cost, time constraints, and unreliability of human coders. This panel offers the history and an update to where we are in the use of very smart computers (using new text analytic data mining tools) to sort, code and determine sentiment in unstructured text (including Big Data).
Panelists:
- Jim Caplan, DoD
- Todd Leyba, IBM-Watson
- Matthew Burgess, BLS
March 5, 2015, 11:00-12:30
Estimation and Imputation
Thursday, March 5, 11:00-12:30, Room(s) Auditorium 2
This session will give organizations an opportunity to share information about recent innovations in the areas of imputation and estimation. One of the presentations discusses an outlier detection and resolution method used during data collection. Two of the presentations will discuss using alternative imputation methods compared to the traditional hot deck imputation method including the use of administrative records. One of the presentations compares weighted estimates and item response rates using mixed regression models.
Coordinator: | Robyn Sirkis, National Agricultural Statistics Service |
Presentations:
Comparison of Outlier Follow-up in an Individual vs Establishment Survey | Jennifer O’Brien, Jocelyn Newsome, Kerry Levin, Sarah Bennett-Harper, Stephanie Beauvais - Westat, Brenda Schafer, Patrick Langetieg - Internal Revenue Service |
Survey Integration and Estimation Comparisons across Florida Youth Surveys | Brenda Clark, Lee Harding, Stephanie Richelsen, Ronaldo Iachan - ICF International |
Exploratory Study Comparing Alternative Imputation Methods for the National Teacher and Principal Survey | Sarah Dial, Jacob Enriquez, Svetlana Mosina, T. Trang Nguyen, Allison Zotti - U.S. Census Bureau |
Model Based Imputation for the Survey of Income and Program Participation | Martha Stinson, Graton Gathright - U.S. Census Bureau |
Bring Your Own Device (BYOD)
Thursday, March 5, 11:00-12:30, Room(s) Auditorium 1
The use of mobile technologies to support data collection activities continues to evolve and gain prominence. Of particular interest are the opportunities posed by mobile Bring Your Own Device (BYOD) solutions to support data collection, as well as the challenges associated with these solutions. This session will cover a range of topics including public and interviewer opinions of BYOD solutions, usability, and associated technologies.
Metrics and Analytics
Thursday, March 5, 11:00-12:30, Room(s) Auditorium 3
Performance measurements are essential for monitoring survey operations and for conducting program evaluations. Performance measures are useful when seeking to identify, to treat, and to prevent sampling and nonsampling errors during survey operations. In addition, these performance measures provide actionable information when conducting program evaluations. This session focuses on approaches to improve call center customer service, to use paradata from adaptive design to monitor resources and measure performance of a multi-mode data collection system, and to assess data quality and cost-effectiveness of a survey program.
Coordinator: | Gina Cheung, Survey Research Center, University of Michigan |
Presentations:
Applying Technology Solutions to Improve Federal Program Evaluation and Performance Monitoring | Bradley Epley - ICF International |
Application of Data Analysis to Improve Contact Center Performance and Enhance Business Operations | Caryn Lesage-Jones - Agilex |
An Update on Developing Response Metrics for the Economic Census | Eric B. Fink, Joanna Fane Lineback - U.S. Census Bureau |
Improving Survey Management: The Unified Tracking System and the National Survey of College Graduates | Stephanie Coffey - U.S. Census Bureau |
Case Studies
Thursday, March 5, 11:00-12:30, Room(s) Conference Room 3
This session highlights various topics for innovation in different types of surveys. We will hear about a new method for re-engineering the decennial census, survey redesign testing, and changing the collection of both consumer data and energy statistics.
Presentations:
Reorganized Census with Integrated Technology | Stephanie Studds, Jay Occhiogrosso - U.S. Census Bureau |
From Telephone Interview Only to Web First Multimode: Lessons Learned from Usability and Field Testing In the Redesign of the National Survey of Children’s Health (NSCH) | Alyson Croen, Sabrina Bauroth, Michael Stern - NORC at the University of Chicago, Reem Ghandour, Catherine Vladutiu - Maternal and Child Health Bureau |
Consumer Price Index Collection in Canada: Today and Tomorrow | Andrée Girard, Paul Durk - Statistics Canada |
Transforming Collection, Analysis, Reporting, and Dissemination of Energy Statistics | James Ellis, Matthew Grosso, Debra Coaxum - U.S. Energy Information Administration |
Process Management
Thursday, March 5, 11:00-12:30, Room(s) Conference Room 4
Development of complex, large-scale CASIC systems may be improved by incorporating Agile and DevOps practices. This session features two presentations specifically on “DevOps” (a collaborative approach to the software development lifecycle, based on agile principles of enlightened relationships between individuals and teams) and its application to survey data collection systems. A third presentation extends the concept of agility to the integration of multiple survey-related IT projects into one program.
Presentations:
Dev Ops Factory | Dominic Delmolino - Agilex |
Software Integration Modeled after the Scaled Agile Framework | Martine Kostanich - U.S. Census Bureau |
Beyond the Waterfall, An Evolution to true Agility. DEVOPS in Practice | Josh Salmanson, Keith Kapp - Customer Value Partners |
March 5, 2015, 1:30-3:00
Demonstrations and Poster Session
Thursday, March 5, 1:30-3:00, Room(s) Prefunction Area
This year FedCASIC will feature 15 demonstrations and posters in an open exhibit setting. FedCASIC attendees will have the opportunity to see and talk with presenters directly about their demonstration and poster topics.
- Demo: Agile Development Approach for an Enterprise Listing and Mapping Application by Harold Dawson, US Census Bureau
- Demo: Building ACASI Surveys Using State of the Art Text to Speech Technology by Gilbert Rodriguez, Emily McFarlane Geisen, Patricia LeBaron, Martin Meyer, Vorapranee Wickelgren, RTI International, Herman Alvarado, SAMHSA
- Demo: CARI That Weight: Obtaining Consent to Record SIPP Interviews by Holly Fee, Matthew C. Marlay, US Census Bureau
- Demo: COMPASS Application Demonstration by Nicole Seamands, US Census Bureau
- Demo: CSPro for Mobile Data Collection by Glenn Ferri, US Census Bureau
- Demo: Surveys Going Mobile - Eye Track It by Erica Olmsted-Hawala & Elizabeth Nichols, US Census Bureau
- Demo: Facilitating the transparency and scientific rigor of cognitive interviewing methodology: Demonstrating the Q-Suite tools by Justin Mezetin, Luis Cortes, Swan Solutions – NCHS, Sheba King Dunston, Sarah Lessem, Candace Sibley, Marko Salvaggio, NCHS
- Demo: Offline Data Collection using Tangerine by Josh Anderson, Adam Preston, RTI
- Demo: MOJO A Re-engineered Control System by Alessandro Ferrucci, Tamara Adams, and Jay Occhiogrosso, US Census Bureau
- Demo: Audits based on Enumerator Behavior to Collect Survey Metadata by Faizan Diwan, Dr. Christopher Robert, SurveyCTO
- Demo: The PARAData Explorer Data Warehouse by Ananth Koppikar, Jason Markesich, Mathematica Policy Research
- Demo: Sentiment Score Analysis of Establishment Survey Interviewer Notes by Bryan Beverly, Bureau of Labor Statistics; Matthew Burgess, Bureau of Labor Statistics
- Poster: Steering Respondents from Paper to Web Surveys by Annette Luyegu, Mathematica Policy Research
- Poster: Automated Statistical Systems by Terrence Lew, RTI International
- Poster: Imputation of a School Level Poverty Indicator by Svetlana Mosina and T. Trang Nguyen, US Census Bureau
Coordinators: | Matthew Burgess, Bureau of Labor Statistics | Eric Falk, Defense Manpower Data Center |
March 5, 2015, 3:00-4:30
Sampling
Thursday, March 5, 3:00-4:30, Room(s) Conference Room 1
This session will give organizations an opportunity to share information regarding innovative sample designs. One of the presentations will compare an equal probability subsampling design and optimal allocation subsampling design with the objective of selecting larger samples in industries that have initially lower response rates. Two of the presentations discuss the use of a stratified multi-stage design. One of the presentations mentions a hybrid design combining a stratified national probability sample with non-probability samples.
Presentations:
Venue-Based and Real-Time Sampling Methodologies in an Intercept Survey of Cyclists | Ronaldo Iachan, Olivia Saucier - ICF International |
Sampling Techniques used to Select a Nationally Representative Sample of WIC Participants for the WIC Infant and toddler Feeding Practices Study – 2 (ITFPS2) | Yumiko Sugawara, Jill Montaquila, Suzanne McNutt - Westat |
Evaluating Propensity Score Adjustment for Combining Probability and Non-Probability Samples in a National Survey | Kurt Peters, Heather Driscoll, Pedro Saavedra - ICF International |
Strategies for Subsampling Nonrespondents for Economic Programs | Stephen J. Kaputa, Laura Bechtel, Katherine Jenny Thompson - U.S. Census Bureau |
Confidentiality, Privacy, and Security
Thursday, March 5, 3:00-4:30, Room(s) Conference Rooms 2 & 3
Confidentiality, privacy, and security remain major concerns of organizations sponsoring and/or conducting surveys, as well as of respondents themselves. This session focuses attention on building trust with respondents, managing security vulnerabilities, and encouraging respondent participation via digital media.
Presentations:
Improving Confidentiality and Trust-building Measures for Active Duty Military Respondents in Clinical CASIC Systems | Chris Olsen, Josh Kumpf - Henry M. Jackson Foundation |
Security - Vulnerability Management at Westat | Dennis Pickett - Westat |
Trust: The Respondent View from Initial Contact through Completed Interviews | William S. Long - Centers for Medicare & Medicaid Services |
Use of Digital Media in Recruiting Survey Participants | Amelia Burke-Garcia - Westat |
Special Topics in Survey Management
Thursday, March 5, 3:00-4:30, Room(s) Auditorium 1
Survey initiatives and field testing help to improve data quality and to reduce operational costs. This session focuses on the following innovations in survey management:
• How to create an easy-to-use graphical user interface (GUI) to allow multiple views of a timeline to improve respondent recall on important life events.
• How to locate mobile sample members in longitudinal surveys, focusing on results from three tests to assess alternative procedures - mail-out/mail-back, use of administrative and third-party data sources, and use of data from other government surveys.
• How to configure a virtual call center to offer cost-savings and to ensure quality control.
Presentations:
The Challenges of Prompting Memory Recall Across a Very Long Reference Period: A New Life Event Calendar | Richard Zemonek, Katherine Mason, Christopher Ellis, Pamela Lattimore - RTI International |
Mobile Samples and Movers: Locating Respondents in the 2014 SIPP Panel | Amber Phillips, Jason M. Fields, Daniel P. Doyle, Bennett Adelman - U.S. Census Bureau |
The Pros and Cons of Virtual Call Centers: Why is this News? | Margaret Lowden - Center for Human Resource Research, Ohio State University |
Response Rates
Thursday, March 5, 3:00-4:30, Room(s) Auditorium 3
Response rates are dropping across all types of surveys. In times of decreasing budgets, new and innovative survey designs can help combat the falling response rate. In this session, we'll see a variety of methods to increase survey response within modes and methods to move to multi-mode surveys.
Coordinator: | Allina Lee, Office of Management and Budget |
Presentations:
Increasing Electronic Reporting for the 2012 Survey of Business Owners | Mary Frauenfelder - U.S. Census Bureau |
Impact of Paper Invitation on EQ Collection Strategy | Anie Marcil - Statistics Canada |
Improving Response Rates Using Mixed Mode Approach Results from the National Health Care Interview Survey | Lindsay M. Howden - U.S. Census Bureau, Sarah S. Joestl, Robin A. Cohen - CDC NCHS |
Designing a Multipurpose Longitudinal Incentive Experiment for the Survey of Income and Program Participation | Ashley Westra, Mahdi Sundukchi - U.S. Census Bureau |
Case Management Systems
Thursday, March 5, 3:00-4:30, Room(s) Auditorium 2
Integration of case management and other survey support applications is the theme of this session. These organizations have redesigned or built management systems that integrate previously independent applications and various fieldwork processes (such as communication, scheduling, mixed-mode data collection, data processing, and interviewer recruitment and payment). Their presentations discuss functional and operational features of their systems, how they have helped streamline complex survey workflows, and lessons learned.
Presentations:
Development of an Integrated Data System for the Gulf Long-Term Follow-up Study (SSSI) | David A. Johndrow, Chris Wachtstetter, Nicholas M. Martilik, Mark P. Della Valle, Matthew D. Curry, Polly P. Armsby, Carley L. Prynn, Katherine Sisco - Social & Scientific Systems, Inc., Richard K. Kwok, Lawrence S. Engel, Dale P. Sandler - National Institute of Environmental Health Sciences, University of North Carolina, Chapel Hill |
CAPI Management at Westat | Mangal Subramanian, Ray Snowden - Westat |
Statistics Canada's Integrated Collection and Operations System (ICOS) project | Terrence Riley - Statistics Canada |
Integrated Management of Survey Processes at Westat | Jerry Wernimont - Westat |