2013 Federal CASIC Workshops
 
Held March 19th to March 21st, 2013 at the Bureau of Labor Statistics, Washington, D.C.
Sponsored by the Bureau of Labor Statistics and the U.S. Census Bureau.

2013 FedCASIC Presentations

Tuesday Sessions

  1. Opening Keynote: Emerging Technologies: New Opportunities, Old Challenges
  2. Plenary Panel: Social Media and Survey Research
  3. Recent Innovations
  4. Demonstrations

Wednesday Sessions

  1. Mobile Technology Assessments for Cost, Experience, and Quality (4 presentations)
  2. Paradata as Intelligence for Decision Making: Methods to Control Cost, Quality, and Production Operations
  3. Management Challenges in CAI Survey Organizations
  4. Blending CASIC Designs with Data from Records (4 presentations)
  5. Applications of Mobile Technology in the Developing World, Agriculture and Longitudinal Studies (4 presentations)
  6. Responsive / Adaptive Design - A Decision Making Approach: Using Intelligence to Adjust Data Collection Strategies (5 presentations)
  7. Data Management (4 presentations)
  8. Addressing and Reducing Respondent Burden to Gain Cooperation (5 presentations)

Thursday Sessions

  1. Web-Based Surveys (5 presentations)
  2. Protecting Systems, Data and People in a Rapidly Changing Environment
  3. Survey Uses of Metadata (4 presentations)
  4. New Technologies throughout the Survey Lifecycle (5 presentations)

Opening Keynote Speaker

Emerging Technologies: New Opportunities, Old Challenges
Michael Link, The Nielsen Company

Over the past decade, advancements in communications and database technologies have radically changed how people access and share information. As a result, the opportunities for researchers to measure attitudes, opinions, and behaviors in new ways have undergone perhaps a greater transformation than at any previous point in history, and this trend appears likely to continue. The availability of smartphones and the ubiquity of social media are interconnected trends that may provide researchers with new data collection tools and alternative sources of information to augment or, in some cases, replace traditional survey research methods. However, this brave new world is not without its share of issues and pitfalls: technological, statistical, and methodological.

Michael W. Link, Ph.D. is Chief Methodologist and Senior Vice President at The Nielsen Company, directing the activities of the Nielsen Measurement Institute. He has a broad base of experience in survey research, having worked in academia, not-for-profit research, and government before joining Nielsen. Dr. Link's research efforts focus on developing methodologies for confronting the most pressing issues facing measurement science, including improving participation and data quality, using multiple modes in data collection, and utilizing new technologies such as mobile platforms and social media. Along with several colleagues, he received the American Association for Public Opinion Research 2011 Mitofsky Innovators Award for his research on address-based sampling. His numerous research articles have appeared in leading scientific journals.

Presentation Materials:
Emerging Technologies: New Opportunities, Old Challenges (PDF, 1.1 MB), Michael Link - The Nielsen Company

Plenary Panel

Social Media and Survey Research

Recent years have witnessed increased challenges to efficient, high-quality survey data collection. Response rates have continued their steady decline, and the increasing proportion of wireless-only Americans has resulted in a loss of landline telephone coverage for sampling purposes. Such challenges require increased resources to maintain quality and production in surveys, yet Federal agencies continue to face budget uncertainty. Concurrent with these trends, there has been a meteoric rise in the general popularity of social media and web 2.0 platforms such as Facebook and Twitter. As the proportion of Americans on Facebook approaches that of Americans with landline telephones, it is natural to ask whether social media holds promise for gaining access to individuals for survey purposes.

But how useful are these data for addressing research questions typically investigated using traditional survey approaches? Does social media hold promise as a cheap and quick sampling frame, locating resource, pool of ready respondents for cognitive assessments, representative surveys, or something else? There are several perspectives on these issues informed by research within and outside of Federal survey work and methodologists are beginning to grapple with how to evaluate the properties of social media data from a total error perspective.

This panel will include researchers who have considered these questions in their research and evidence from their studies about the potential for social media to supplement survey data collection. The plenary will provide the results of research, recommendations for other survey researchers to consider regarding the utility of social media data, and several perspectives on the future viability of these data sources for research purposes.

Panelists:
Session Chair: Lisa Thalji, RTI

Recent Innovations

This session will give organizations an opportunity to share information about recent innovations in CASIC approaches, including hardware, software, training, research, major organizational changes, new surveys, etc. Organizations that wish to participate should contact Bill Mockovak in advance.

The following speakers will be presenting:
Coordinator: Bill Mockovak, Bureau of Labor Statistics
Presentation Materials:
Use of Virtual Desktop Infrastructure for Work At Home staff (PDF, 235 KB), Suzanne Fratino,
James Christy - U.S. Census Bureau
Update on Innovations at NORC (PDF, 172 KB), Angela DeBello - NORC at the University of Chicago
Recent Innovations University of Michigan (PDF, 693 KB), Patty Maher - Survey Research Center, Un. of Michigan
Recent Innovations at RTI International (PDF, 1.1 MB), David Uglow - RTI International

Demonstrations

The 2013 FedCASIC will host demonstrations from organizations with innovative CASIC instruments and software. Participating organizations will have the opportunity to demonstrate and showcase their CASIC-related technologies to government and academic survey methodologists. The demonstrations will take place in a small exhibit hall setting, with tables set up for demonstrations. Demonstrators may use laptops, displays, and handouts to present their technologies to FedCASIC attendees. Attendees will be free to move throughout the exhibits and learn about new CASIC instruments and software.
Coordinators: Louis Harrell, Bureau of Labor Statistics
Matthew Burgess, Bureau of Labor Statistics
Presentation Materials:
COMET, Web-based System To Process Raw Payroll Data Submitted Electronically by National, Multi-unit Firms (PDF, 558 KB), Matthew Burgess,
Mangala Kuppa - Bureau of Labor Statistics
Statistics Canada's Consumer Price Index (PDF, 442 KB), Chris Kit,
Ryan Williams - Statistics Canada
iPad as Viewer and Memory Aid for Respondents in a Collaborative Approach to CAPI Interviewing (PDF, 451 KB), Eric White,
Kate Golen,
Chris Schlapper - University of Wisconsin Survey Center

Mobile Technology Assessments for Cost, Experience, and Quality

The increasing use of feature- and smart-phone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also present advances in opportunities to monitor field surveys with real-time tracking and communication. This session examines some of these challenges and advances.
Coordinator: Lew Berman, ICF International
Presentation Materials:
Challenges and Advances in Using the iPad® Computer-Assisted Personal Interview System (PDF, 707 KB), Autumn Foushee,
Heather Driscoll - ICF International
There's an App for That: A Data Quality Review of a Transition from PAPI Data Collection to Smartphone Data Collection in a Vendor Management Study (PDF, 846 KB), Amy Hendershott,
Leslie Erickson,
Wandy Stephenson,
Daniel Keever - RTI International
Lessons Learned: Using Tablets in the Field and the Future of Mobile Data Collection (PDF, 911 KB), Mark Brinkley,
Gene Shkolnikov - Mathematica Policy Research, Inc.
Cost Implications of New Address Listing Technology: Implications for Efficiency and Data Quality (PDF, 2.4 MB), Katie Dekker - NORC at the University of Chicago

Paradata as Intelligence for Decision Making: Methods to Control Cost, Quality, and Production Operations

Paradata supply intelligence for decision making in the planning and execution of survey processes.

The session will showcase ways in which researchers are utilizing this intelligence in its myriad forms, highlighting the measurable efficiencies that have been gained before, during, and after data collection in order to meet operational and methodological goals. The session will include lively discussion with both presenters and audience members regarding recent findings and innovations in the use of paradata, the principal challenges researchers and organizations face in utilizing paradata, and the limitations of paradata-driven solutions to our operational and methodological challenges.
Coordinators: Chris Stringer, RTI International
Dan Zahs, Survey Research Center, Un. of Michigan
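
As a minimal illustration of the kind of paradata-driven monitoring described above, the sketch below (Python, with hypothetical field names such as case_id, call_time, and outcome) turns raw call records into two metrics a data collection manager might track mid-collection: mean attempts per case and contact rate by hour of day.

    # Minimal paradata summary: raw call records -> actionable metrics.
    # Field names and outcome codes are illustrative assumptions.
    from collections import defaultdict
    from datetime import datetime

    call_records = [
        {"case_id": "A1", "call_time": "2013-03-04 10:15", "outcome": "no_answer"},
        {"case_id": "A1", "call_time": "2013-03-05 18:40", "outcome": "complete"},
        {"case_id": "B2", "call_time": "2013-03-04 11:05", "outcome": "refusal"},
    ]

    attempts = defaultdict(int)                      # attempts per case
    contacts_by_hour = defaultdict(lambda: [0, 0])   # hour -> [contacts, calls]

    for rec in call_records:
        attempts[rec["case_id"]] += 1
        hour = datetime.strptime(rec["call_time"], "%Y-%m-%d %H:%M").hour
        contacts_by_hour[hour][1] += 1
        if rec["outcome"] in ("complete", "refusal"):  # any human contact
            contacts_by_hour[hour][0] += 1

    print("Mean attempts per case:", sum(attempts.values()) / len(attempts))
    for hour, (contacts, calls) in sorted(contacts_by_hour.items()):
        print(f"Hour {hour:02d}: contact rate {contacts / calls:.0%} ({calls} calls)")

Summaries of this kind typically feed call-scheduling and case-prioritization decisions; production systems draw on far richer paradata (timings, keystrokes, interviewer observations).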

Management Challenges in CAI Survey Organizations

This session will provide CAI survey managers and researchers dealing with management and administrative challenges a venue to share their knowledge and learn how others are approaching these issues. A panel of experts from government and industry will discuss key topics with audience participation including questions and shared experiences.

Panelists include:

Attendees will hear about the techniques used in different organizations to address key management issues, participate in a discussion of these issues, and have an opportunity to ask the panelists about effective approaches to these situations.

Coordinators: David Uglow, RTI International
Jane Shepherd, Westat

Blending CASIC Designs with Data from Records

With decreasing response rates and increasing survey costs, interest in ways to incorporate administrative records in surveys has surged. These ways range from abandoning surveys entirely (e.g., censuses in some European countries) to closely integrating survey questions and administrative records data in early stages of survey design. This session will present some examples of innovative uses of administrative records in household and establishment surveys, and discuss their impact on survey quality and costs.
Coordinator: David Cantor, Westat
Presentation Materials:
Administrative Records Coverage of Demographic Response Data in the American Community Survey (PDF, 95 KB), Renuka Bhaskar - U.S. Census Bureau
Bringing Data Sources Together (PDF, 2.1 MB), Mark Martin - Office for National Statistics (UK)
Testing Record Linkage Production Data Quality (PDF, 2.3 MB), K. Bradley Paxton - ADI LLC
The HUD Quality Control Study - Collecting Data through File Record Abstraction, CAPI and Administrative Records to Fulfill Mandatory Improper Payment Reporting (PDF, 334 KB), Sophia I. Zanakos - ICF International
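
Blending survey and administrative data ultimately hinges on record linkage. The sketch below is a deliberately simple, deterministic illustration (hypothetical field names and made-up records) of joining a survey file to an administrative file on a normalized key; the production systems discussed in this session typically add probabilistic matching, blocking, and clerical review.

    # Toy deterministic linkage of survey responses to an administrative file.
    # Field names and records are hypothetical.
    def link_key(record):
        """Build a crude match key from normalized last name and date of birth."""
        return (record["last_name"].strip().upper(), record["dob"])

    survey = [
        {"id": 1, "last_name": "Smith ", "dob": "1970-01-02", "income_reported": 52000},
        {"id": 2, "last_name": "Jones", "dob": "1985-07-30", "income_reported": 41000},
    ]
    admin = [
        {"last_name": "SMITH", "dob": "1970-01-02", "income_admin": 50500},
    ]

    admin_index = {link_key(r): r for r in admin}
    for resp in survey:
        match = admin_index.get(link_key(resp))
        if match:
            print(resp["id"], "reported:", resp["income_reported"], "admin:", match["income_admin"])
        else:
            print(resp["id"], "no administrative match")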

Applications of Mobile Technology in the Developing World, Agriculture and Longitudinal Studies

The increasing use of feature- and smart-phone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also present advances in opportunities to monitor field surveys with real-time tracking and communication. This session examines some of these challenges and advances.
Coordinator: Jim Caplan, Department of Defense
Presentation Materials:
Mobile Device use in Underserved Areas - Challenges, Considerations and Lessons Learned (PDF, 1.1 MB), Abie Reifer - Westat
CAPI Surveys on Android Devices in the Developing World (PDF, 4.3 MB), Sam Haddaway - NORC at the University of Chicago
Data Collection in the Thin Client CAPI and GIS Environments (PDF, 2.4 MB), Eric Wilson,
Michael Gerling,
Sarah Nusser,
Alan Dotts,
Andrew Vardeman,
Linda Lawson - USDA
Innovative Retention Methods in Panel Research: Can SmartPhones Improve Long-term Panel Participation? (PDF, 945 KB), James Dayton,
Andrew Dyer - ICF International

Responsive / Adaptive Design - A Decision Making Approach: Using Intelligence to Adjust Data Collection Strategies

Responsive or adaptive design refers to a method for managing surveys that accounts for uncertainties during data collection and uses real-time information obtained during data collection to make decisions. These designs refocus outcomes away from the response rate measures inherent in fixed designs toward metrics that improve the quality of survey estimates. These sessions focus on CASIC surveys with responsive/adaptive designs, with topics reflected in the presentations listed below.
Coordinators: Dawn Nelson, U.S. Census Bureau
Sean Simone, U.S. Department of Education
Presentation Materials:
Implementing Adaptive Design at the Census Bureau for the National Survey of College Graduates (PDF, 461 KB), Stephanie Coffey,
Benjamin Reist - U.S. Census Bureau
Adaptive Sample Design and Management at NASS (PDF, 150 KB), Jaki McCarthy - National Agricultural Statistics Service
Responsive Design Using Mahalanobis Distancing: Preliminary Results from Two National Center for Education Statistics Longitudinal Surveys (PDF, 125 KB), Elise Christopher,
Ted Socha - National Center for Education Statistics
The Evolution of Electronic Questionnaire Collection Strategy at Statistics Canada (PDF, 405 KB), Wade Kuseler - Statistics Canada
Using Responsive Design to Improve Response and Operational Efficiency Under the Constraints of Time-Sensitive Program Evaluation (PDF, 245 KB), Andy Weiss,
Faith Lewis - Abt SRBI,
Rhoda Cohen - Mathematica Policy Research, Inc.
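
At its core, a responsive design pairs monitoring with a pre-specified decision rule. The toy rule below (Python; the threshold, day cutoff, and phase-2 intervention are illustrative assumptions, not drawn from any presentation above) shows the shape of such a rule: watch daily completes and trigger a protocol change when gains level off before the target is reached.

    # Toy phase-change rule of the kind used in responsive/adaptive designs.
    # Thresholds and the phase-2 intervention are illustrative assumptions.
    def choose_next_action(day, completes_by_day, target):
        responded = sum(completes_by_day)
        if responded >= target:
            return "stop collection"
        # If the last three days added little, change the protocol
        # (e.g., switch mode, add an incentive, reprioritize low-propensity cases).
        recent_gain = sum(completes_by_day[-3:])
        if day > 10 and recent_gain < 0.01 * target:
            return "trigger phase 2: switch remaining cases to CATI follow-up"
        return "continue current protocol"

    daily_completes = [40, 35, 20, 8, 3, 2, 1] + [1] * 7   # 14 days of counts
    print(choose_next_action(day=14, completes_by_day=daily_completes, target=1000))

In practice the trigger is usually based on model-based metrics (response propensities, R-indicators, cost per complete) rather than raw counts, which is exactly where paradata-driven intelligence comes in.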

Data Management

This FedCASIC session focuses on all post-collection data processing - editing, analysis, visualization, dissemination and archiving. While presentations from all of these areas are welcome, this year we hope to highlight presentations on harmonization of data from multiple collection modes - including web, telephone, mobile and paper - and data issues involving mobile computing in general.
Coordinators: David Uglow, RTI International
Jane Shepherd, Westat
Presentation Materials:
The Challenges of Big Data (PDF, 254 KB), Timothy Mulcahy,
Johannes Huessy,
Daniel Gwynne - NORC at the University of Chicago
Information from New Systems for Evaluating Survey Quality at EIA (PDF, 172 KB), Elizabeth Panarelli - EIA
Artificial Intelligence in Data Processing (PDF, 104 KB), Alex Measure - Bureau of Labor Statistics
Data Management Challenges and Lessons Learned in Project Transition (PDF, 130 KB), Maria Hobbs,
Al Bethke - RTI International
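
Mode harmonization, one of the themes this session hopes to highlight, is in essence a mapping problem: each mode's raw codes must be recoded to a single analysis scheme. A minimal sketch follows (Python; the codes and field names are hypothetical, and real harmonization also reconciles skip patterns, edits, and mode-specific item wording).

    # Minimal post-collection harmonization across collection modes.
    # Code values and field names are illustrative assumptions.
    WEB_TO_STD = {"1": "yes", "2": "no", "9": "missing"}
    CATI_TO_STD = {"Y": "yes", "N": "no", "R": "missing", "D": "missing"}

    def harmonize(record):
        """Map a mode-specific raw response onto the common analysis coding."""
        mapping = WEB_TO_STD if record["mode"] == "web" else CATI_TO_STD
        return {
            "case_id": record["case_id"],
            "mode": record["mode"],
            "q1": mapping.get(record["q1_raw"], "missing"),
        }

    raw_records = [
        {"case_id": "001", "mode": "web", "q1_raw": "1"},
        {"case_id": "002", "mode": "cati", "q1_raw": "R"},
    ]
    print([harmonize(r) for r in raw_records])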

Addressing and Reducing Respondent Burden to Gain Cooperation

This session is a continuation of last year's discussion of respondent burden and possible ways to alleviate it. It was made clear last year that an operational definition of burden should be developed. Once burden is defined, discussion will turn to possible approaches to reducing it, which may include questionnaire design, administration mode, cognitive interviewing for survey development, and/or secondary data collection. Because the population being studied may have different interpretations of burden, consideration of the population may be appropriate. Case studies of successes or failures may provide insight into the problem of maintaining respondent cooperation as it relates to burden. The session should generate discussion and brainstorming about future approaches to reducing burden and thereby increasing participation.
Coordinator: Barbara Bibb, RTI International
Presentation Materials:
It's About Time: Examining the Effect of Interviewer-Quoted Survey Completion Time Estimates on Nonresponse (PDF, 187 KB), Stacie Greby,
Kathy O'Connor,
Bess Welch,
Christopher Ward,
Jacquelyn George - CDC NCHS
A Comparison of Respondent Burden, Approaches for Minimizing Respondent Burden, and Outcomes - Lessons Learned from Two Cohort Studies with Nested Designs for Congressional and Court Mandated Research (PDF, 1.1 MB), Charlie Knott,
Christopher Lyu,
Dawn Dampier,
Cathy Colvard,
Martha Ryals,
Stephanie Gray,
Fred Crane - Battelle,
Eric Bair,
Gary Slade,
William Maixner - UNC,
Roger Fillingim - UFL,
Richard Ohrbach - UB,
Joel Greenspan - UMD
Decreasing Respondent Burden in the US Census Bureau using CARI (Computer Audio-Recorded Interviewing) (PDF, 88 KB), Carl Fisher - RTI International,
Terence Strait,
Romell McElroy - U.S. Census Bureau
How Statistics Canada will be Reducing Respondent Burden of its Economic Surveys: The Integrated Business Statistics Program (IBSP) (PDF, 706 KB), Michael Sigouin - Statistics Canada
Using Time of Interview to Inform Recruiting Strategies (PDF, 724 KB), Jeff Enos,
Yolanda Lazcano,
James Christy - U.S. Census Bureau

Web-Based Surveys

Web-based surveys are increasingly becoming the norm for certain populations. Cost and accessibility are carefully considered when determining which mode to use for survey administration. With advancements in technology, web-based surveys are becoming more user-friendly, aesthetically pleasing, and easier to deploy. With their growing popularity comes increased interest in aspects such as creating and testing instruments, managing sample, deploying to devices such as tablets and smartphones, and following up with nonrespondents. In this session, we will explore several developments in web surveys, usability testing, case management, and methods to increase web response rates. Audience input will be strongly encouraged.
Coordinators: Kristina Rall, Mathematica Policy Research, Inc.
Andrew Frost, Mathematica Policy Research, Inc.
Presentation Materials:
Standards and New Electronic Questionnaire (EQ) Surveys Functionalities (PDF, 1 MB), Cindy Gagné,
Robert Godbout - Statistics Canada
Challenges in Making Web Surveys 508 Compliant (PDF, 673 KB), Sandhya Bikmal - RTI International
Leveraging Web Capabilities To Reduce Burden and Cost: Establishment Survey Example (PDF, 953 KB), Kathryn Harper - ICF International
caCURE - An Open-source Survey Toolset (PDF, 645 KB), Bill Tulskie - HealthCare IT, Inc.
Web Diary Feasibility Test: Preliminary Findings and Issues (PDF, 953 KB), Ian Elkin - Bureau of Labor Statistics

Protecting Systems, Data and People in a Rapidly Changing Environment

This session will cover a diversity of issues, ranging from protecting the interviewer and interviewing tools in the field, to configuring systems locally and in the cloud to meet current security standards and compliance requirements, to emerging technology and its implications for security.
Coordinator: Bill Connett, Survey Research Center, Un. of Michigan

Survey Uses of Metadata

Metadata are data that describe other data or processes. For users of data, the metadata are the record of how those data were produced and what the data mean. Metadata are analogous to the work you had to show when solving a math problem in high school. In order to understand the data a survey produces, you must understand the steps that were taken to conduct that survey.

Survey work provides many opportunities to use metadata fruitfully, throughout the survey life-cycle. For instance, data dissemination, data harmonization, and survey documentation all use or produce metadata. This session will explore these and related issues.
Coordinator: Dan Gillman, Bureau of Labor Statistics
Presentation Materials:
Enhancing Transparency and Reproducibility via Frequently-Asked-Questions (PDF, 3.1 MB), Shawna Waugh - Energy Information Administration
Improving NHANES Data Documentation Processes (PDF, 2.1 MB), Jennifer Dostal,
Shannon Corcoran,
Ed Stammerjohn,
Tim Tilert,
Jane Zhang - CDC NCHS
Defining "Core" Metadata: What is Needed to Make Data Discoverable? (PDF, 56 KB), Sandra Cannon - Federal Reserve Board
Generic Statistical Information Model - An Overview (PDF, 878 KB), Dan Gillman - Bureau of Labor Statistics
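
To make the session description concrete, here is one way variable-level survey metadata is often represented in practice: a small, structured record that tells a data user what a variable means and how it was produced. The variable, labels, and field names below are made up for illustration, loosely in the spirit of DDI-style documentation, and are not taken from any survey discussed in the session.

    # Illustrative variable-level metadata record; all content is hypothetical.
    variable_metadata = {
        "name": "EMPSTAT",
        "label": "Employment status last week",
        "question_text": "Last week, did you do any work for pay?",
        "universe": "Persons 16 years and older",
        "values": {"1": "Employed", "2": "Unemployed", "3": "Not in labor force"},
        "derivation": "Recoded from items Q10a-Q10c during post-collection editing",
        "collection_modes": ["CAPI", "CATI", "web"],
    }

    # A codebook entry can be generated directly from the metadata record,
    # which is one way metadata supports documentation and dissemination.
    print(variable_metadata["name"], "-", variable_metadata["label"])
    for code, text in variable_metadata["values"].items():
        print(f"  {code} = {text}")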

New Technologies throughout the Survey Lifecycle

The intersection between new technologies and survey research is constantly evolving. This New Technologies session will present potential uses and applications of new and innovative technologies that enhance the efficiency of traditional survey data collection. Presenters from both the public and private sectors will provide evidence and anecdotes of the potential for existing technologies to enhance and improve survey design. Presentations may discuss how these technologies contribute to increased response rates, increased respondent engagement, decreased data collection costs, or increased efficiency of survey management. Topics may include cloud computing, crowdsourcing, social networking, virtual computing, and new instrumentation development environments. Questions and discussion are strongly encouraged from session attendees.
Coordinator: Patricia LeBaron, RTI International
Presentation Materials:
Will They Answer the Phone If They Know It's Us? Using Caller ID to Improve Response Rates (PDF, 445 KB), Jeff Boone,
Heather Ridolfo,
Nancy Dickey - National Agricultural Statistics Service
Using a Self-Administered Web-Based System to Replace the Interviewer: The Automated Self-Administered 24-Hour Dietary Recall (ASA24) (PDF, 527 KB), Gordon Willis,
Nancy Potischman,
Sharon I. Kirkpatrick,
Frances E. Thompson,
Beth Mittl,
Thea Palmer Zimmerman,
Christopher Bingley,
Amy F. Subar - National Cancer Institute
Applying Crowdsourcing Methods in Social Science Research (PDF, 836 KB), Michael Keating - RTI International
Using Text-To-Speech Software for ACASI (PDF, 831 KB), Jeff Phillips,
Ed Dolbow,
Brad Edwards - Westat
Wireless Experience with CAPI Collection (PDF, 475 KB), Gyslaine Burns - Statistics Canada