2013 Federal CASIC Workshops
 

Workshops Program

Opening Day - March 19, 2013, 9:00-noon

Opening Keynote Speaker

Tuesday, March 19, 9:00-10:25
Emerging Technologies: New Opportunities, Old Challenges
Michael Link, The Nielsen Company

Over the past decade, advancements in communications and database technologies have radically changed how people access and share information. As a result, the opportunities for researchers to measure attitudes, opinions, and behaviors in new ways have undergone perhaps a greater transformation than at any previous point in history, and this trend appears likely to continue. The availability of smartphones and the ubiquity of social media are interconnected trends that may provide researchers with new data collection tools and alternative sources of information to augment or, in some cases, replace traditional survey research methods. However, this brave new world is not without its share of issues and pitfalls - technological, statistical, and methodological.

Michael W. Link, Ph.D. is Chief Methodologist and Senior Vice President at The Nielsen Company, directing the activities of the Nielsen Measurement Institute. He has a broad base of experience in survey research, having worked in academia, not-for-profit research, and government before joining Nielsen. Dr. Link's research efforts focus on developing methodologies for confronting the most pressing issues facing measurement science, including improving participation and data quality, using multiple modes in data collection, and utilizing new technologies such as mobile platforms and social media. Along with several colleagues, he received the American Association for Public Opinion Research 2011 Mitofsky Innovator's Award for his research on address-based sampling. His numerous research articles have appeared in leading scientific journals.


Plenary Panel

Tuesday, March 19, 10:35-noon
Social Media and Survey Research

Recent years have witnessed increased challenges to efficient, high-quality survey data collection. Response rates have continued their steady decline, and the increasing proportion of wireless-only Americans has resulted in a loss of landline telephone coverage for sampling purposes. Such challenges require increased resources to maintain quality and production in surveys, but Federal agencies continue to face budget uncertainty. Concurrent with these trends, there has been a meteoric rise in the general popularity of social media and Web 2.0 platforms such as Facebook and Twitter. As the proportion of Americans on Facebook approaches that of Americans with landline telephones, it is natural to ask whether social media holds promise for gaining access to individuals for survey purposes.

But how useful are these data for addressing research questions typically investigated using traditional survey approaches? Does social media hold promise as a cheap and quick sampling frame, a locating resource, a pool of ready respondents for cognitive assessments or representative surveys, or something else? There are several perspectives on these issues informed by research within and outside of Federal survey work, and methodologists are beginning to grapple with how to evaluate the properties of social media data from a total error perspective.

This panel will include researchers who have considered these questions in their research and evidence from their studies about the potential for social media to supplement survey data collection. The plenary will provide the results of research, recommendations for other survey researchers to consider regarding the utility of social media data, and several perspectives on the future viability of these data sources for research purposes.

Panelists:
Session Chair: Lisa Thalji, RTI

March 19, 2013, 1:30-4:30

Recent Innovations

Tuesday, March 19, 1:30-4:30
This session will give organizations an opportunity to share information about recent innovations in CASIC approaches, including hardware, software, training, research, major organizational changes, new surveys, etc. Organizations that wish to participate should contact Bill Mockovak in advance.

The following speakers will be presenting:
Coordinator: Bill Mockovak, Bureau of Labor Statistics

Demonstrations

Tuesday, March 19, 1:30-4:30
The 2013 FedCASIC will host demonstrations from organizations with innovative CASIC instruments and software. Participating organizations will have the opportunity to demonstrate and showcase their CASIC related technologies with government and academic survey methodologists. The demonstrations will take place in a small exhibit hall setting, with tables set up for demonstrations. Demonstrators may use laptops, displays, and handouts to present their technologies to FedCASIC attendees. Attendees will be free to move throughout the exhibits and learn about new CASIC instruments and software.
Target Audience: Government, academic, and industry professionals with interest in computer assisted survey information collection (CASIC).
Coordinators: Louis Harrell, Bureau of Labor Statistics
Matthew Burgess, Bureau of Labor Statistics
Presentations:
Performing Next Generation Big Data Analytics - Connecting the Dots
Gary Arnett, Mark Segal - L-3 Communications,
Jeff Wootton - Palantir Technologies
COMET, Web-based System To Process Raw Payroll Data Submitted Electronically by National, Multi-unit Firms
Matthew Burgess, Mangala Kuppa - Bureau of Labor Statistics
Brandt Information Services CATI Demonstration
Greg Weeks, Adrienne Johnston, George Foster - Brandt Information Services, Inc.
Use of Mobile and Web Technology in Blaise Multimode Environments: Lessons Learned
Lon Hofman, Roger Linssen - Statistics Netherlands
Statistics Canada's Consumer Price Index
Chris Kit, Ryan Williams - Statistics Canada
iPad as Viewer and Memory Aid for Respondents in a Collaborative Approach to CAPI Interviewing
Eric White, Kate Golen, Chris Schlapper - University of Wisconsin Survey Center
Enumeration Application for Mobile Devices
Danielle Lessard - U.S. Census Bureau,
Phillip Simulis - PORTAL Technologies
Demonstration of the CARI Interactive Data Access System
Carl Fisher, Rita Thissen - RTI International

Technical Workshop Session Topics

March 20, 2013, 9:00-noon

Mobile Technology Assessments for Cost, Experience, and Quality

Wednesday, March 20, 9:00-noon
The increasing use of feature- and smart-phone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also present advances in opportunities to monitor field surveys with real-time tracking and communication. This session examines some of these challenges and advances.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Coordinator: Lew Berman, ICF International
Presentations:
Challenges and Advances in Using the iPad® Computer-Assisted Personal Interview System
Autumn Foushee, Heather Driscoll - ICF International
There's an App for That: A Data Quality Review of a Transition from PAPI Data Collection to Smartphone Data Collection in a Vendor Management Study
Amy Hendershott, Leslie Erickson, Wandy Stephenson, Daniel Keever - RTI International
Lessons Learned: Using Tablets in the Field and the Future of Mobile Data Collection
Mark Brinkley, Gene Shkolnikov - Mathematica Policy Research, Inc.
Cost Implications of New Address Listing Technology: Implications for Efficiency and Data Quality
Katie Dekker - NORC at the University of Chicago

Paradata as Intelligence for Decision Making: Methods to Control Cost, Quality, and Production Operations

Wednesday, March 20, 9:00-noon
Paradata supply intelligence for decision making in the planning and execution of survey processes.

The session will showcase ways in which researchers are utilizing this intelligence in its myriad forms, highlighting the measurable efficiencies that have been gained before, during, and after data collection in order to meet operational and methodological goals. The session will include lively discussion with both presenters and audience members regarding recent findings and innovations in the use of paradata, the principal challenges researchers and organizations face in utilizing paradata, and the limitations of paradata-driven solutions to operational and methodological challenges.
Coordinators: Chris Stringer, RTI International
Dan Zahs, Survey Research Center, Un. of Michigan
Presentations:
Using Audit Trails to Find and Explain Survey Error
Renee Gindi - NCHS
Using Paradata to Track and Improve Interviewer Quality Across Projects and Over Time
Kyle Fennell - NORC at the University of Chicago
Selection of Scoring Criteria in Computer Audio Recorded Interview (CARI) Coding
Carl Fisher - RTI International
Using Paradata to Improve Survey Efficiency for Linguistic Minorities
Kari Carris - NORC at the University of Chicago

Management Challenges in CAI Survey Organizations

Wednesday, March 20, 9:00-noon

This session will provide CAI survey managers and researchers dealing with management and administrative challenges a venue to share their knowledge and learn how others are approaching these issues. A panel of experts from government and industry will discuss key topics with audience participation including questions and shared experiences.

Panelists include:

Attendees will hear about the techniques used in different organizations to address key management issues, participate in a discussion of these issues, and have an opportunity to ask the panelists about effective approaches to these situations.

Target Audience: Survey managers, technology managers, and researchers
Coordinators: David Uglow, RTI International
Jane Shepherd, Westat

Blending CASIC Designs with Data from Records

Wednesday, March 20, 9:00-noon
With decreasing response rates and increasing survey costs, interest in ways to incorporate administrative records in surveys has surged. These ways range from abandoning surveys entirely (e.g., censuses in some European countries) to closely integrating survey questions and administrative records data in early stages of survey design. This session will present some examples of innovative uses of administrative records in household and establishment surveys, and discuss their impact on survey quality and costs.
Target Audience: All who are interested in reducing survey costs and increasing survey quality through use of records.
Coordinator: David Cantor, Westat
Presentations:
Administrative Records Coverage of Demographic Response Data in the American Community Survey
Renuka Bhaskar - U.S. Census Bureau
Bringing Data Sources Together
Mark Martin - Office for National Statistics (UK)
Testing Record Linkage Production Data Quality
K. Bradley Paxton - ADI LLC.
The HUD Quality Control Study - Collecting Data through File Record Abstraction, CAPI and Administrative Records to Fulfill Mandatory Improper Payment Reporting
Sophia I. Zanakos - ICF International

March 20, 2013, 1:30-4:30

Applications of Mobile Technology in the Developing World, Agriculture and Longitudinal Studies

Wednesday, March 20, 1:30-4:30
The increasing use of feature- and smart-phone technology, iPads, and Android platforms is changing the landscape for survey data collection, providing alternatives to traditional methods. These new technologies present technological, human-computer interface design, coverage and sampling, data quality, and methodological challenges. They also present advances in opportunities to monitor field surveys with real-time tracking and communication. This session examines some of these challenges and advances.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Coordinator: Jim Caplan, Department of Defense
Presentations:
Mobile Device Use in Underserved Areas - Challenges, Considerations and Lessons Learned
Abie Reifer - Westat
CAPI Surveys on Android Devices in the Developing World
Sam Haddaway - NORC at the University of Chicago
Data Collection in the Thin Client CAPI and GIS Environments
Eric Wilson, Michael Gerling, Sarah Nusser, Alan Dotts, Andrew Vardeman, Linda Lawson - USDA
Innovative Retention Methods in Panel Research: Can SmartPhones Improve Long-term Panel Participation?
James Dayton, Andrew Dyer - ICF International

Responsive / Adaptive Design - A Decision Making Approach: Using Intelligence to Adjust Data Collection Strategies

Wednesday, March 20, 1:30-4:30
Responsive or adaptive design refers to a method for managing surveys that accounts for uncertainties during data collection and uses real-time information obtained during data collection to make decisions. These designs refocus outcomes away from the response rate measures inherent in fixed designs toward metrics that improve the quality of survey estimates. This session focuses on CASIC surveys with responsive/adaptive designs.
Target Audience: Survey Methodologists, Survey Directors, Data Collection Managers, Analysts
Coordinators: Dawn Nelson, U.S. Census Bureau
Sean Simone, U.S. Department of Education
Presentations:
Implementing Adaptive Design at the Census Bureau for the National Survey of College Graduates
Stephanie Coffey, Benjamin Reist - U.S. Census Bureau
Adaptive Sample Design and Management at NASS
Jaki McCarthy - National Agricultural Statistics Service
Responsive Design Using Mahalanobis Distancing: Preliminary Results from Two National Center for Education Statistics Longitudinal Surveys
Elise Christopher, Ted Socha - National Center for Education Statistics
The Evolution of Electronic Questionnaire Collection Strategy at Statistics Canada
Wade Kuseler - Statistics Canada
Using Responsive Design to Improve Response and Operational Efficiency Under the Constraints of Time-Sensitive Program Evaluation
Andy Weiss, Faith Lewis - Abt SRBI,
Rhoda Cohen - Mathematica Policy Research, Inc.

Data Management

Wednesday, March 20, 1:30-4:30
This FedCASIC session focuses on all post-collection data processing - editing, analysis, visualization, dissemination and archiving. While presentations from all of these areas are welcome, this year we hope to highlight presentations on harmonization of data from multiple collection modes - including web, telephone, mobile and paper - and data issues involving mobile computing in general.
Target Audience: A variety of survey research professionals would benefit from this session including survey managers, data managers, survey designers, programming staff, and methodologists.
Coordinators: David Uglow, RTI International
Jane Shepherd, Westat
Presentations:
The Challenges of Big Data
Timothy Mulcahy, Johannes Huessy, Daniel Gwynne - NORC at the University of Chicago
Information from New Systems for Evaluating Survey Quality at EIA
Elizabeth Panarelli - EIA
Artificial Intelligence in Data Processing
Alex Measure - Bureau of Labor Statistics
Data Management Challenges and Lessons Learned in Project Transition
Maria Hobbs, Al Bethke - RTI International

Addressing and Reducing Respondent Burden to Gain Cooperation

Wednesday, March 20, 1:30-4:30
This session continues last year's discussion of respondent burden and possible ways to alleviate it. Last year it became clear that an operational definition of burden should be developed. Once burden is defined, discussion will turn to possible approaches to reducing it, including questionnaire design, administration mode, cognitive interviewing for survey development, and/or secondary data collection. The population being studied may have different interpretations of burden, so consideration of the population may be appropriate. Case studies of successes or failures may provide insight into the problem of maintaining respondent cooperation as it relates to burden. The session is intended to generate discussion and brainstorming about possible future approaches to reducing burden and thereby increasing subject participation.
Target Audience: Survey authors, implementers, and administrators.
Coordinator: Barbara Bibb, RTI International
Presentations:
It's About Time: Examining the Effect of Interviewer-Quoted Survey Completion Time Estimates on Nonresponse
Stacie Greby, Kathy O'Connor, Bess Welch, Christopher Ward, Jacquelyn George - CDC NCHS
A Comparison of Respondent Burden, Approaches for Minimizing Respondent Burden, and Outcomes - Lessons Learned from Two Cohort Studies with Nested Designs for Congressional and Court Mandated Research
Charlie Knott, Christopher Lyu, Dawn Dampier, Cathy Colvard, Martha Ryals, Stephanie Gray, Fred Crane - Battelle,
Eric Bair, Gary Slade, William Maixner - UNC,
Roger Fillingim - UFL,
Richard Ohrbach - UB,
Joel Greenspan - UMD
Decreasing Respondent Burden in the US Census Bureau using CARI (Computer Audio-Recorded Interviewing)
Carl Fisher - RTI International,
Terence Strait, Romell McElroy - U.S. Census Bureau
How Statistics Canada will be Reducing Respondent Burden of its Economic Surveys: The Integrated Business Statistics Program (IBSP)
Michael Sigouin - Statistics Canada
Using Time of Interview to Inform Recruiting Strategies
Jeff Enos, Yolanda Lazcano, James Christy - U.S. Census Bureau

March 21, 2013, 9:00-noon

Web-Based Surveys

Thursday, March 21, 9:00-noon
Web-based surveys are increasingly becoming the norm with certain populations. Cost and accessibility are carefully considered when determining which mode to use for survey administration. With advancements in technology, web-based surveys are becoming more user-friendly, aesthetically pleasing, and easier to deploy. With the growing popularity comes increased interest in aspects such as creating and testing instruments, managing sample, deployment to devices such as tablets and smartphones, and non-response follow-up techniques. In this session, we will explore several developments in web surveys, usability testing, case management, and methods to increase web response rates. Audience input will be strongly encouraged.
Target Audience: Survey managers and researchers
Coordinators: Kristina Rall, Mathematica Policy Research, Inc.
Andrew Frost, Mathematica Policy Research, Inc.
Presentations:
Standards and New Electronic Questionnaire (EQ) Surveys Functionalities
Cindy Gagné, Robert Godbout - Statistics Canada
Challenges in Making Web Surveys 508 Compliant
Sandhya Bikmal - RTI International
Leveraging Web Capabilities To Reduce Burden and Cost: Establishment Survey Example
Kathryn Harper - ICF International
caCURE - An Open-source Survey Toolset
Bill Tulskie - HealthCare IT, Inc.
Web Diary Feasibility Test: Preliminary Findings and Issues
Ian Elkin - Bureau of Labor Statistics

Protecting Systems, Data and People in a Rapidly Changing Environment

Thursday, March 21, 9:00-noon
This session will cover a diverse set of issues, ranging from protecting interviewers and interviewing tools in the field, to configuring systems locally and in the cloud to meet current security standards and compliance requirements, to emerging technologies and their implications for security.
Target Audience: Anyone interested in, involved in, or wishing to learn about the fascinating issues of securing systems, people, and data in a rapidly changing environment.
Coordinator: Bill Connett, Survey Research Center, Un. of Michigan
Presentations:
Securing Web Applications Against Cyber-Attacks
Anwar Mohammed - RTI International
Leveraging Cloud Technology to Improve Study Operations Continuity and Resiliency
Dennis Pickett, Ray Snowden - Westat
Secure Mobile Data Collection: From iPads to Blaise/IS
Glenn Jones - Mathematica Policy Research, Inc.
An Overview of the Microsoft Cloud with Security Observations
Marcus Blough - Survey Research Center, Un. of Michigan
The Internet of Things (IoT) and Implications for Security
Bill Connett - Survey Research Center, Un. of Michigan

Survey Uses of Metadata

Thursday, March 21, 9:00-noon
Metadata are data that describe other data or processes. For users of data, metadata are the record of how those data were produced and what they mean. Metadata are analogous to the work you had to show when solving a math problem in high school: in order to understand the data a survey produces, you must understand the steps that were taken to conduct that survey.

Survey work provides many opportunities to use metadata fruitfully, throughout the survey life-cycle. For instance, data dissemination, data harmonization, and survey documentation all use or produce metadata. This session will explore these and related issues.
Coordinator: Dan Gillman, Bureau of Labor Statistics
Presentations:
Enhancing Transparency and Reproducibility via Frequently-Asked-Questions
Shawna Waugh - Energy Information Administration
Improving NHANES Data Documentation Processes
Jennifer Dostal, Shannon Corcoran, Ed Stammerjohn, Tim Tilert, Jane Zhang - CDC NCHS
Defining "Core" Metadata: What is Needed to Make Data Discoverable?
Sandra Cannon - Federal Reserve Board
Generic Statistical Information Model - An Overview
Dan Gillman - Bureau of Labor Statistics

New Technologies throughout the Survey Lifecycle

Thursday, March 21, 9:00-noon
The intersection between new technologies and survey research is constantly evolving. This New Technologies session will present potential uses and applications of new and innovative technologies that enhance the efficiency of traditional survey data collection. Presenters from both the public and private sectors will provide evidence and anecdotes of the potential for existing technologies to enhance and improve survey design. Presentations may discuss how these technologies contribute to increased response rates, increased respondent engagement, decreased data collection costs, or increased efficiency of survey management. Topics may include cloud computing, crowdsourcing, social networking, virtual computing, and new instrumentation development environments. Questions and discussion from session attendees are strongly encouraged.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Coordinator: Patricia LeBaron, RTI International
Presentations:
Will They Answer the Phone If They Know It's Us? Using Caller ID to Improve Response Rates
Jeff Boone, Heather Ridolfo, Nancy Dickey - National Agricultural Statistics Service
Using a Self-Administered Web-Based System to Replace the Interviewer: The Automated Self-Administered 24-Hour Dietary Recall (ASA24)
Gordon Willis, Nancy Potischman, Sharon I. Kirkpatrick, Frances E. Thompson, Beth Mittl, Thea Palmer Zimmerman, Christopher Bingley, Amy F. Subar - National Cancer Institute
Applying Crowdsourcing Methods in Social Science Research
Michael Keating - RTI International
Using Text-To-Speech Software for ACASI
Jeff Phillips, Ed Dolbow, Brad Edwards - Westat
Wireless Experience with CAPI Collection
Gyslaine Burns - Statistics Canada
 

