Workshops Program
Opening Day - March 19, 2013, 9:00-noon
Opening Keynote Speaker
Tuesday, March 19, 9:00-10:25
Emerging Technologies: New Opportunities, Old Challenges
Michael Link, The Nielsen Company 
Over the past decade, advancements in communications and database technologies have radically changed how people access and share information. As a result, the opportunities for researchers to measure attitudes, opinions, and behaviors in new ways have undergone perhaps a greater transformation than at any previous point in history, and this trend appears likely to continue. The availability of smartphones and the ubiquity of social media are interconnected trends that may provide researchers with new data collection tools and alternative sources of information to augment or, in some cases, replace traditional survey research methods. However, this brave new world is not without its share of issues and pitfalls - technological, statistical, and methodological.
Michael W. Link, Ph.D. is Chief Methodologist and Senior Vice President at The Nielsen Company, directing the activities of the Nielsen Measurement Institute. He has a broad base of experience in survey research, having worked in academia, not-for-profit research, and government before joining Nielsen. Dr. Link's research efforts focus on developing methodologies for confronting the most pressing issues facing measurement science, including improving participation and data quality, using multiple modes in data collection, and utilizing new technologies such as mobile platforms and social media. Along with several colleagues, he received the American Association for Public Opinion Research 2011 Mitofsky Innovator's Award for his research on address-based sampling. His numerous research articles have appeared in leading scientific journals.
Plenary Panel
Tuesday, March 19, 10:35-noon
Social Media and Survey Research
Recent years have witnessed increased challenges to efficient, high-quality survey data collection. Response rates have continued their steady decline, and the increasing proportion of wireless-only Americans has resulted in a loss of landline telephone coverage for sampling purposes. Such challenges require increased resources to maintain quality and production in surveys, but Federal agencies continue to face budget uncertainty. Concurrent with these trends, there has been a meteoric rise in the general popularity of social media and Web 2.0 platforms such as Facebook and Twitter. As the proportion of Americans on Facebook approaches that of Americans with landline telephones, it is natural to ask whether social media holds promise for gaining access to individuals for survey purposes.
But how useful are these data for addressing research questions typically investigated using traditional survey approaches? Does social media hold promise as a cheap and quick sampling frame, a locating resource, a pool of ready respondents for cognitive assessments or representative surveys, or something else? Several perspectives on these issues are informed by research within and outside of Federal survey work, and methodologists are beginning to grapple with how to evaluate the properties of social media data from a total error perspective.
This panel will include researchers who have considered these questions and will present evidence from their studies about the potential for social media to supplement survey data collection. The plenary will provide research results, recommendations for survey researchers to consider regarding the utility of social media data, and several perspectives on the future viability of these data sources for research purposes.
Panelists:
- Joe Murphy, RTI
- Jenny Childs, US Census Bureau
- Mike Stern, NORC
- Amelia Burke, Westat
Session Chair: Lisa Thalji, RTI
March 19, 2013, 1:30-4:30
Recent Innovations
Tuesday, March 19, 1:30-4:30
This session will give organizations an opportunity to share information about recent innovations in CASIC approaches, including hardware, software, training, research, major organizational changes, new surveys, etc. Organizations that wish to participate should contact Bill Mockovak in advance.
The following speakers will be presenting:
- Abie Reifer, Westat
- Lon Hofman, Statistics Netherlands/Blaise
- Suzanne Fratino / Jamie Christy, U.S. Census Bureau
- Angela DeBello, NORC
- Linda Bandeh, Mathematica
- Patty Maher, ISR/Michigan
- David Uglow, RTI
Demonstrations
Tuesday, March 19, 1:30-4:30
The 2013 FedCASIC will host demonstrations from organizations with innovative CASIC instruments and software. Participating organizations will have the opportunity to showcase their CASIC-related technologies to government and academic survey methodologists. The demonstrations will take place in a small exhibit hall setting, with tables set up for demonstrations. Demonstrators may use laptops, displays, and handouts to present their technologies to FedCASIC attendees. Attendees will be free to move throughout the exhibits and learn about new CASIC instruments and software.
Target Audience: Government, academic, and industry professionals with interest in computer assisted survey information collection (CASIC).
Presentations:
Performing Next Generation Big Data Analytics - Connecting the Dots | Gary Arnett, Mark Segal - L-3 Communications, Jeff Wootton - Palantir Technologies |
COMET, Web-based System To Process Raw Payroll Data Submitted Electronically by National, Multi-unit Firms | Matthew Burgess, Mangala Kuppa - Bureau of Labor Statistics |
Brandt Information Services CATI Demonstration | Greg Weeks, Adrienne Johnston, George Foster - Brandt Information Services, Inc. |
Use of Mobile and Web Technology in Blaise Multimode Environments: Lessons Learned | Lon Hofman, Roger Linssen - Statistics Netherlands |
Statistics Canada's Consumer Price Index | Chris Kit, Ryan Williams - Statistics Canada |
iPad as Viewer and Memory Aid for Respondents in a Collaborative Approach to CAPI Interviewing | Eric White, Kate Golen, Chris Schlapper - University of Wisconsin Survey Center |
Enumeration Application for Mobile Devices | Danielle Lessard - U.S. Census Bureau, Phillip Simulis - PORTAL Technologies |
Demonstration of the CARI Interactive Data Access System | Carl Fisher, Rita Thissen - RTI International |
Technical Workshop Session Topics
March 20, 2013, 9:00-noon
Mobile Technology Assessments for Cost, Experience, and Quality
Wednesday, March 20, 9:00-noon
The increasing use of feature- and smart-phone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also present advances in opportunities to monitor field surveys with real-time tracking and communication. This session examines some of these challenges and advances.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Presentations:
Challenges and Advances in Using the iPad® Computer-Assisted Personal Interview System | Autumn Foushee, Heather Driscoll - ICF International |
There's an App for That: A Data Quality Review of a Transition from PAPI Data Collection to Smartphone Data Collection in a Vendor Management Study | Amy Hendershott, Leslie Erickson, Wandy Stephenson, Daniel Keever - RTI International |
Lessons Learned: Using Tablets in the Field and the Future of Mobile Data Collection | Mark Brinkley, Gene Shkolnikov - Mathematica Policy Research, Inc. |
Cost Implications of New Address Listing Technology: Implications for Efficiency and Data Quality | Katie Dekker - NORC at the University of Chicago |
Paradata as Intelligence for Decision Making: Methods to Control Cost, Quality, and Production Operations
Wednesday, March 20, 9:00-noon
Paradata supply intelligence for decision making in the planning and execution of survey processes.
The session will showcase ways in which researchers are utilizing this intelligence in its myriad forms, highlighting the measurable efficiencies that have been gained before, during, and after data collection in order to meet operational and methodological goals. The session will include lively discussion with both presenters and audience members regarding recent findings and innovations in the use of paradata, the principal challenges researchers and organizations face in utilizing paradata, and the limitations of paradata-driven solutions to our operational and methodological challenges.
Presentations:
Using Audit Trails to Find and Explain Survey Error | Renee Gindi - NCHS |
Using Paradata to Track and Improve Interviewer Quality Across Projects and Over Time | Kyle Fennell - NORC at the University of Chicago |
Selection of Scoring Criteria in Computer Audio Recorded Interview (CARI) Coding | Carl Fisher - RTI International |
Using Paradata to Improve Survey Efficiency for Linguistic Minorities | Kari Carris - NORC at the University of Chicago |
Management Challenges in CAI Survey Organizations
Wednesday, March 20, 9:00-noon
This session will provide CAI survey managers and researchers dealing with management and administrative challenges a venue to share their knowledge and learn how others are approaching these issues. A panel of experts from government and industry will discuss key topics with audience participation including questions and shared experiences.
- Management challenges associated with security in the workplace
Data security tools and practices are widely discussed, and they continue to evolve. This session will discuss data security and broader concerns, including how security issues beyond those directly associated with data are being handled: Are there new challenges in physical security, especially in the context of open workspaces? How are organizations dealing with potential 'insider threats'? How are organizations handling security issues associated with remote and work-at-home workers? What are the best practices around incident response? How are organizations handling the proliferation of "bring your own device" (BYOD)?
- Management challenges associated with training, career development and project management
Panelists from CAI organizations will discuss how to develop, grow, and retain staff in the current environment, including technical training and staff incentives. Specific topics might include the content of training programs, retention issues, developing technical staff, and training methods. Panelists may also discuss the role social media plays in staff recruiting and retention.
Panelists include:
- Diane Herz, Mathematica Policy Research
- Michael Horrigan, Bureau of Labor Statistics
- Missy Koppelman, NORC
- William Samples, U.S. Census Bureau
- Patty Maher, University of Michigan
- Gina-Qian Cheung, University of Michigan
- Josh Seeger, NORC
Attendees will hear about the techniques used in different organizations to address key management issues, participate in a discussion of these issues, and have an opportunity to ask the panelists about effective approaches to these situations.
Target Audience: Survey managers, technology managers, and researchers
Blending CASIC Designs with Data from Records
Wednesday, March 20, 9:00-noon
With decreasing response rates and increasing survey costs, interest in ways to incorporate administrative records in surveys has surged. These ways range from abandoning surveys entirely (e.g., censuses in some European countries) to closely integrating survey questions and administrative records data in early stages of survey design. This session will present some examples of innovative uses of administrative records in household and establishment surveys, and discuss their impact on survey quality and costs.
Target Audience: All who are interested in reducing survey costs and increasing survey quality through use of records.
Coordinator: | David Cantor, Westat |
Presentations:
Administrative Records Coverage of Demographic Response Data in the American Community Survey | Renuka Bhaskar - U.S. Census Bureau |
Bringing Data Sources Together | Mark Martin - Office for National Statistics (UK) |
Testing Record Linkage Production Data Quality | K. Bradley Paxton - ADI LLC. |
The HUD Quality Control Study - Collecting Data through File Record Abstraction, CAPI and Administrative Records to Fulfill Mandatory Improper Payment Reporting | Sophia I. Zanakos - ICF International |
March 20, 2013, 1:30-4:30
Applications of Mobile Technology in the Developing World, Agriculture and Longitudinal Studies
Wednesday, March 20, 1:30-4:30
The increasing use of feature- and smart-phone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also present advances in opportunities to monitor field surveys with real-time tracking and communication. This session examines some of these challenges and advances.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Presentations:
Mobile Device use in Underserved Areas - Challenges, Considerations and Lessons Learned | Abie Reifer - Westat |
CAPI Surveys on Android Devices in the Developing World | Sam Haddaway - NORC at the University of Chicago |
Data Collection in the Thin Client CAPI and GIS Environments | Eric Wilson, Michael Gerling, Sarah Nusser, Alan Dotts, Andrew Vardeman, Linda Lawson - USDA |
Innovative Retention Methods in Panel Research: Can SmartPhones Improve Long-term Panel Participation? | James Dayton, Andrew Dyer - ICF International |
Responsive / Adaptive Design - A Decision Making Approach: Using Intelligence to Adjust Data Collection Strategies
Wednesday, March 20, 1:30-4:30
Responsive or adaptive design refers to a method for managing surveys that accounts for uncertainties during data collection and uses real-time information obtained during collection to make decisions. These designs shift the focus away from the response rate measures inherent in fixed designs toward metrics that improve the quality of survey estimates. This session focuses on CASIC surveys with responsive/adaptive designs and includes topics such as:
- types of information used to inform decision-making (e.g. paradata, administrative data, or longitudinal data),
- proposed measures used for reducing non-response bias or error during data collection, and
- systems required for monitoring quality during data collection.
Target Audience: Survey Methodologists, Survey Directors, Data Collection Managers, Analysts
Presentations:
Implementing Adaptive Design at the Census Bureau for the National Survey of College Graduates | Stephanie Coffey, Benjamin Reist - U.S. Census Bureau |
Adaptive Sample Design and Management at NASS | Jaki McCarthy - National Agricultural Statistics Service |
Responsive Design Using Mahalanobis Distancing: Preliminary Results from Two National Center for Education Statistics Longitudinal Surveys | Elise Christopher, Ted Socha - National Center for Education Statistics |
The Evolution of Electronic Questionnaire Collection Strategy at Statistics Canada | Wade Kuseler - Statistics Canada |
Using Responsive Design to Improve Response and Operational Efficiency Under the Constraints of Time-Sensitive Program Evaluation | Andy Weiss, Faith Lewis - Abt SRBI, Rhoda Cohen - Mathematica Policy Research, Inc. |
Data Management
Wednesday, March 20, 1:30-4:30
This FedCASIC session focuses on all post-collection data processing - editing, analysis, visualization, dissemination and archiving. While presentations from all of these areas are welcome, this year we hope to highlight presentations on harmonization of data from multiple collection modes - including web, telephone, mobile and paper - and data issues involving mobile computing in general.
Target Audience: A variety of survey research professionals would benefit from this session including survey managers, data managers, survey designers, programming staff, and methodologists.
Presentations:
The Challenges of Big Data | Timothy Mulcahy, Johannes Huessy, Daniel Gwynne - NORC at the University of Chicago |
Information from New Systems for Evaluating Survey Quality at EIA | Elizabeth Panarelli - EIA |
Artificial Intelligence in Data Processing | Alex Measure - Bureau of Labor Statistics |
Data Management Challenges and Lessons Learned in Project Transition | Maria Hobbs, Al Bethke - RTI International |
Addressing and Reducing Respondent Burden to Gain Cooperation
Wednesday, March 20, 1:30-4:30
This session is a continuation of last year's discussion of respondent burden and possible ways to alleviate it. Last year it became clear that an operational definition of burden should be developed. Once burden is defined, discussion will turn to possible approaches to reducing it, which may include questionnaire design, administration mode, cognitive interviewing for survey development, and/or secondary data collection. The population being studied may have different interpretations of burden, so consideration of the population may be appropriate. Case studies of successes or failures may provide insight into the problem of maintaining respondent cooperation as it relates to burden. We hope this session will generate discussion and brainstorming about future approaches to reducing burden and thereby increasing subject participation.
Target Audience: Survey authors, implementers, and administrators.
Presentations:
It's About Time: Examining the Effect of Interviewer-Quoted Survey Completion Time Estimates on Nonresponse | Stacie Greby, Kathy O'Connor, Bess Welch, Christopher Ward, Jacquelyn George - CDC NCHS |
A Comparison of Respondent Burden, Approaches for Minimizing Respondent Burden, and Outcomes - Lessons Learned from Two Cohort Studies with Nested Designs for Congressional and Court Mandated Research | Charlie Knott, Christopher Lyu, Dawn Dampier, Cathy Colvard, Martha Ryals, Stephanie Gray, Fred Crane - Battelle, Eric Bair, Gary Slade, William Maixner - UNC, Roger Fillingim - UFL, Richard Ohrbach - UB, Joel Greenspan - UMD |
Decreasing Respondent Burden in the US Census Bureau using CARI (Computer Audio-Recorded Interviewing) | Carl Fisher - RTI International, Terence Strait, Romell McElroy - U.S. Census Bureau |
How Statistics Canada will be Reducing Respondent Burden of its Economic Surveys. The Integrated Business Statistics Program (IBSP) | Michael Sigouin - Statistics Canada |
Using Time of Interview to Inform Recruiting Strategies | Jeff Enos, Yolanda Lazcano, James Christy - U.S. Census Bureau |
March 21, 2013, 9:00-noon
Web-Based Surveys
Thursday, March 21, 9:00-noon
Web-based surveys are increasingly becoming the norm with certain populations. Cost and accessibility are carefully considered when determining which mode to use for survey administration. With advancements in technology, web-based surveys are becoming more user-friendly, aesthetically pleasing, and easier to deploy. With the growing popularity comes increased interest in aspects such as creating and testing instruments, managing sample, deployment to devices such as tablets and smartphones, and non-response follow-up techniques. In this session, we will explore several developments in web surveys, usability testing, case management, and methods to increase web response rates. Audience input will be strongly encouraged.
Target Audience: Survey managers and researchers
Presentations:
Standards and New Electronic Questionnaire (EQ) Surveys Functionalities | Cindy Gagné, Robert Godbout - Statistics Canada |
Challenges in Making Web Surveys 508 Compliant | Sandhya Bikmal - RTI International |
Leveraging Web Capabilities To Reduce Burden and Cost: Establishment Survey Example | Kathryn Harper - ICF International |
caCURE - An Open-source Survey Toolset | Bill Tulskie - HealthCare IT, Inc. |
Web Diary Feasibility Test: Preliminary Findings and Issues | Ian Elkin - Bureau of Labor Statistics |
Protecting Systems, Data and People in a Rapidly Changing Environment
Thursday, March 21, 9:00-noon
This session will cover a diversity of issues, ranging from protecting the interviewer and interviewing tools in the field, to configuring systems locally and in the cloud to meet current security standards and compliance requirements, to emerging technology and its implications for security.
Target Audience: Anyone interested in, involved in, or wishing to learn about the fascinating issues of securing systems, people, and data in a rapidly changing environment should attend.
Coordinator: | Bill Connett, Survey Research Center, Un. of Michigan |
Presentations:
Securing Web Applications Against Cyber-Attacks | Anwar Mohammed - RTI International |
Leveraging Cloud Technology to Improve Study Operations Continuity and Resiliency | Dennis Pickett, Ray Snowden - Westat |
Secure Mobile Data Collection: From iPads to Blaise/IS | Glenn Jones - Mathematica Policy Research, Inc. |
An Overview of the Microsoft Cloud with Security Observations | Marcus Blough - Survey Research Center, Un. of Michigan |
The Internet of Things (IoT) and Implications for Security | Bill Connett - Survey Research Center, Un. of Michigan |
Survey Uses of Metadata
Thursday, March 21, 9:00-noon
Metadata are data that describe other data or processes. For users of data, the metadata are the record of how those data were produced and what the data mean. Metadata are analogous to the work you had to show when solving a math problem in high school. In order to understand the data a survey produces, you must understand the steps that were taken to conduct that survey.
Survey work provides many opportunities to use metadata fruitfully, throughout the survey life-cycle. For instance, data dissemination, data harmonization, and survey documentation all use or produce metadata. This session will explore these and related issues.
Presentations:
Enhancing Transparency and Reproducibility via Frequently-Asked-Questions | Shawna Waugh - Energy Information Administration |
Improving NHANES Data Documentation Processes | Jennifer Dostal, Shannon Corcoran, Ed Stammerjohn, Tim Tilert, Jane Zhang - CDC NCHS |
Defining "Core" Metadata: What is Needed to Make Data Discoverable? | Sandra Cannon - Federal Reserve Board |
Generic Statistical Information Model - An Overview | Dan Gillman - Bureau of Labor Statistics |
New Technologies throughout the Survey Lifecycle
Thursday, March 21, 9:00-noon
The intersection between new technologies and survey research is constantly evolving. This New Technologies session will present potential uses and applications of new and innovative technologies that enhance the efficiency of traditional survey data collection. Presenters from both the public and private sectors will provide evidence and anecdotes of the potential for existing technologies to enhance and improve survey design. Presentations may discuss how these technologies contribute to increased response rates, increased respondent engagement, decreased data collection costs, or increased efficiency of survey management. Topics may include cloud computing, crowdsourcing, social networking, virtual computing, and new instrumentation development environments. Questions and discussion are strongly encouraged from session attendees.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Presentations:
Will They Answer the Phone If They Know It's Us? Using Caller ID to Improve Response Rates | Jeff Boone, Heather Ridolfo, Nancy Dickey - National Agricultural Statistics Service |
Using a Self-Administered Web-Based System to Replace the Interviewer: The Automated Self-Administered 24-Hour Dietary Recall (ASA24) | Gordon Willis, Nancy Potischman, Sharon I. Kirkpatrick, Frances E. Thompson, Beth Mittl, Thea Palmer Zimmerman, Christopher Bingley, Amy F. Subar - National Cancer Institute |
Applying Crowdsourcing Methods in Social Science Research | Michael Keating - RTI International |
Using Text-To-Speech Software for ACASI | Jeff Phillips, Ed Dolbow, Brad Edwards - Westat |
Wireless Experience with CAPI Collection | Gyslaine Burns - Statistics Canada |