2012 Federal CASIC Workshops
 

Workshops Program

Opening Day - March 27, 2012, 9:00-noon

Opening Keynote Speaker

Tuesday, March 27, 9:00-noon
Paradata within the Total Survey Error Framework: Successes, Challenges, and Gaps
Dr. Frauke Kreuter
Associate Professor
Joint Program in Survey Methodology
University of Maryland

Paradata now play an increasing role in the survey production process within the Federal Statistical agencies. So far, paradata are mostly used to address nonresponse error, though in some instances they are also used to examine measurement error. To expand our thinking about the roles paradata can play, this presentation will have three parts. First, I will highlight current paradata collection activities within the Federal Statistical System and elsewhere and summarize typical applications. Second, I will discuss the challenges we have seen in the collection and use of paradata, in particular those arising from measurement error inherent in the paradata themselves. Third, I will outline the use of paradata within the larger Total Survey Error (TSE) framework. Paradata can help us understand many of the additional error sources beyond nonresponse and measurement error, such as coverage and coding error. Some of the data we need for this larger viewpoint are already routinely collected, even if they are not necessarily thought of as paradata. We will discuss whether integrating these additional data into the TSE framework can help us improve the survey production process as a whole.

Frauke Kreuter is an Associate Professor in the Joint Program in Survey Methodology at the University of Maryland and currently heads the statistical methods research department at the Institute for Employment Research in Nuremberg, Germany. She has taught several short courses on paradata and is guest editor of a special issue of JRSSA on paradata. She is currently organizing an edited volume on improving surveys through paradata. Her research on paradata is partially funded through a three-year ESRC research grant (with G. Durrant and P. Smith, both at the University of Southampton; RES-062-23-2997).


Plenary Panel

Tuesday, March 27, 9:00-noon
The Use of Paradata to Improve Survey Quality:
Organizational Approaches and Challenges

In recent years, FedCASIC paradata workshops have focused on how data collection organizations have used paradata to monitor survey process quality, and on the development of systems for paradata collection and processing and of dashboards for improving paradata accessibility. Presentations have often provided examples of paradata used to monitor survey data collection for a single study, or of the use of one type of paradata, such as audit trails or call history data. There have been fewer examples of the use of standard, easily tailored indicators across modes and types of surveys (e.g., cross-sectional, longitudinal, or enterprise surveys) to assess survey progress and quality.

In this plenary panel, several organizations will address the following issues in relation to their own use or planned use of paradata at organizational and/or study levels:

The panelists will include:

Coordinators: Sue Ellen Hansen, Survey Research Center, Un. of Michigan
Chris Stringer, U.S. Census Bureau

March 27, 2012, 1:30-4:30

Recent Innovations

Tuesday, March 27, 1:30-4:30
This session will give organizations an opportunity to share information about recent innovations in CASIC approaches, including hardware, software, training, research, major organizational changes, new surveys, etc. Organizations that wish to participate should contact Bill Mockovak in advance.

The following speakers will be presenting:
Coordinator: Bill Mockovak, Bureau of Labor Statistics

Demonstrations

Tuesday, March 27, 1:30-4:30
This year we will continue to offer demonstrations of CASIC instruments and software in a mini exhibit hall setting, where attendees can move among exhibitors throughout the demonstration period.
Coordinator: Louis Harrell, Bureau of Labor Statistics
Presentations:
CARI Interactive Data Access System – Carl Fisher, RTI International
Enhancing Multi-mode Data Collection Through Support for an Additional Language – Matthew Burgess, Bureau of Labor Statistics
State Data Tool – Jeri M. Mulrow, National Center for Science and Engineering Statistics
Recent Mobile Data Collection Apps from NORC – Josh Seeger, NORC
NORC's Mobile Case Management and Data Collection Application – Ali Aga, NORC
Fishing, Hunting, and Wildlife Associated Recreation (FHWAR) Survey Quality Assurance Program – Selvin Guevara, U.S. Census Bureau
Windows SharePoint Services for Cross-Site Collaboration and Data Collection Management – Melissa Duggar, Mathematica Policy Research, Inc.

Technical Workshop Session Topics

March 28, 2012, 9:00-noon

New Technologies: Mobile Data Collection

Wednesday, March 28, 9:00-noon
The presentations in this session will cover uses of mobile technologies for data collection in Africa and the developing world, for listing, and for study recruitment. Technologies will include tablets, Android systems, and digital pen and paper.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Coordinators: Patty LeBaron, Research Triangle Institute
Lew Berman, Centers for Disease Control
Caitlin Blair, Bureau of Labor Statistics
Presentations:
Mobile Data Collection Lessons Learned – Josh Seeger, NORC
Use of Mobile Technology in Support of Study Recruitment and Field Data Collection – Abie Reifer, Westat
Mobile to Web: An Integrated Model for Mobile Data Collection and Web-Based Monitoring and Reporting – David Cantor, ICFI
Mobile Technology Applications for Verbal Autopsy – Erin Nichols, CDC / NCHS
Sam Notzon, CDC / NCHS
Surveys in the Developing World: UN Nutrition Surveys Using Digital Pen and Paper – Greg Clary, Mi-Co

Survey Uses of Metadata

Wednesday, March 28, 9:00-noon
Metadata are data that describe other data or processes. For users of data, the metadata are the record of how those data were produced and what the data mean. Metadata are analogous to the work you had to show when solving a math problem in high school. In order to understand the data a survey produces, you must understand the steps that were taken to conduct that survey.

Survey work provides many opportunities to use metadata fruitfully, throughout the survey life-cycle. For instance, data dissemination, data harmonization, and survey documentation all use or produce metadata. This session will explore these and related issues.
Coordinator: Dan Gillman, Bureau of Labor Statistics
Presentations:
Metadata: To Boldly Go Where No One Has Gone Before? – Pascal Heus, Metadata Technology
Open Government Vocabularies and Metadata – Dan Gillman, Bureau of Labor Statistics
A Metadata Reference Model for IRS Data – Jeff Butler, IRS
Using DDI 3 to Manage the Use of Coded Data in Longitudinal Studies – Alexandra Shlionskaya, Booz-Allen Hamilton
Sophia Kuan, Booz-Allen Hamilton
Jay Greenfield, Booz-Allen Hamilton

Management Challenges in CAI Survey Organizations

Wednesday, March 28, 9:00-noon
This session will provide a venue for those grappling with management and administrative challenges in today's CAI environment to share their knowledge and learn from others. A panel of 4-5 management experts from government and industry will discuss the following topics:
Panelists will include:

Audience participation in the form of questions and shared experiences will be encouraged. Session attendees will hear about the techniques used in different organizations to address key management issues, participate in a discussion of these issues, and have an opportunity to ask the panelists about effective approaches to common situations.
Coordinators: Karen Davis, Research Triangle Institute
Jane Shepherd, Westat
LaTerri Bynum, U.S. Census Bureau

Addressing and Reducing Respondent Burden to Gain Cooperation

Wednesday, March 28, 9:00-noon
This session will consider respondent burden and possible ways to alleviate it. Respondent burden is created by questionnaire length, questionnaire complexity, and repeated contact. The session will initiate a discussion of how to reduce respondent burden and gain cooperation. Elements that could be discussed with respect to respondent burden include questionnaire design, longitudinal study design, and technology that makes administration quicker, easier, or simpler. Case studies of successes or failures could also be discussed as a way of brainstorming solutions to the problem of maintaining respondent cooperation as it relates to burden. The target audience consists of survey authors, implementers, and administrators.
Coordinator: Barbara Bibb, Research Triangle Institute
Presentations:
Survey Burden: An Enabler's Perspective and Guilt Feelings – Mark Pierzchala, MMP Survey Services, LLC
Questionnaire Complexity and Respondent Burden – Adrianne Gilbert, RTI International
Dawn Thomas-Banks, RTI International
Gaining Cooperation while Minimizing Respondent Burden on NHANES – Mercy Merino Rodriguez, CDC NCHS
Tatiana Nwankwo, CDC NCHS
Respondent Burden: Summary – Barbara Bibb, Research Triangle Institute
Lillie Barber, RTI International
Behnaz Whitmire, RTI International
Ansu Koshy, RTI International
Chuchun Chien, RTI International

March 28, 2012, 1:30-4:30

New Technologies: New Uses of Existing Technologies

Wednesday, March 28, 1:30-4:30
This New Technologies session will feature discussion of the support existing technologies provide to various stages of the survey lifecycle. Presenters from both the public and private sectors will discuss, for instance, technology's role in creating a dual-frame sample, decreasing nonresponse, and providing case management across multiple modes. Presenters will offer evidence and anecdotes of the potential for existing technologies to improve survey design. Questions and discussion from session attendees are strongly encouraged.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Coordinators: Patty LeBaron, Research Triangle Institute
Lew Berman, Centers for Disease Control
Caitlin Blair, Bureau of Labor Statistics
Presentations:
Addressing non-response bias in TPOPS: results from a cell phone frame test – Gabriela Arcos, Bureau of Labor Statistics
Has text messaging increased participant compliance on NHANES? – Tatiana Nwankwo, CDC NCHS
Innovations: Video Data Collection, Processing and Coding – Rick Dulaney, Westat
Chris de los Santos, Westat
Rick Rogers, Fenestra
Greg Binzer, Westat
HINTS-GEM: Using Science 2.0 to Facilitate Data Integration in Constructing a National Health Survey – Richard P. Moser, NCI
Ellen Burke Beckjord, University of Pittsburgh Medical Center
Lila Finney Rutten, SAIC-Frederick, Inc.
Kelly Blake, NCI
Bradford W. Hesse, NCI
Nirvana - An Enlightened Survey Management System – R. Suresh, RTI International

CARI: Approaches for Quality Control and Interviewer Performance Feedback

Wednesday, March 28, 1:30-4:30
Over the past ten years, computer audio-recorded interviewing (CARI) has become an important tool in survey administration. Survey managers and methodologists have used recorded interviews to manage and monitor the quality of data collection, assess interviewers' effectiveness, and identify needed improvements to questionnaires. Many survey organizations have developed their own tools and procedures for using CARI technology. In this session, we will focus on the process of reviewing the recordings.
Coordinators: Gina Cheung, Survey Research Center, Un. of Michigan
Patty Maher, Survey Research Center, Un. of Michigan
Presentations:
Use of CARI to Standardize Field Interviewer Performance – Susan H. Kinsey, RTI International
Providing Interviewer Performance Feedback Using CARI – Carl Fisher, RTI International
Using CARI to Conduct Behavior Coding Analysis of Alternative Questionnaires – Joanne Pascale, U.S. Census Bureau
Examining interviewer behavior in handling 'difficult' cases – Wendy Hicks, Westat
Aaron Maitland, Westat
Brad Edwards, Westat

Usability and Accessibility

Wednesday, March 28, 1:30-4:30
This session will cover the usability and accessibility of CATI and CAPI instruments along with web surveys. Presentations may include topics such as how to incorporate usability and accessibility into the development process and methods for conducting evaluations. Presenters will also discuss lessons learned from their usability or accessibility experience.
Target Audience: From survey managers to survey developers, including usability and accessibility professionals; not too technical.
Coordinator: Jean Fox, Bureau of Labor Statistics
Presentations:
Section 508 Refresh Overview – Jennifer Horan, DOL
Accessibility Testing: The Role of Tools, Screen Readers, and Manual Methods – Karen Brenner, Westat
Usability vs Accessibility in Websites/Web Surveys – Sandhya Bikmal, RTI International
Sridevi Sattaluri, RTI International
Unmoderated Cognitive and Usability Testing Using the Web – Jennifer Edgar, Bureau of Labor Statistics
Bill Mockovak, Bureau of Labor Statistics
First Fridays Product Testing Program – Kristal Byrd, GSA

Data Management

Wednesday, March 28, 1:30-4:30
This year's session will focus on how CAI data become public data, and how external public data can enhance the usefulness of CAI data. Topics could include:
Coordinators: Jane Shepherd, Westat
David Uglow, Research Triangle Institute
Presentations:
Selective Editing – Elizabeth Panarelli, EIA
Interoperability Through Vocabulary Registries – Dan Gillman, Bureau of Labor Statistics
The NCSES Data System and Metadata Schema – Kimberly Noonan, NSF
Evolution of Data Release Documentation for Continuous NHANES (NHANES 99+)
Dashboards and Portals: Tailoring the User View – Leena Dave, RTI International
Deepa Avula, CSAT
Susan Eversole, RTI International
Bharathi Golla, RTI International
Sujatha Lakshmikanthan, RTI International
William Savage, RTI International

March 29, 2012, 9:00-noon

New Technologies: Survey Management Technologies

Thursday, March 29, 9:00-noon
This New Technologies session will focus on the complex challenges of managing a survey and on the ways in which new technology affects the future of survey management. Presenters from both the public and private sectors will discuss a variety of facets of survey management, ranging from the challenges of data collection application development to system monitoring to integrating data systems. We will discuss current challenges, ideas for the future, and recent innovations in the field. Questions and discussion from session attendees are strongly encouraged.
Target Audience: A variety of survey research professionals would benefit from this session, including programming staff, data collection managers, survey designers, survey methodologists and sampling statisticians.
Coordinators: Patty LeBaron, Research Triangle Institute
Lew Berman, Centers for Disease Control
Caitlin Blair, Bureau of Labor Statistics
Presentations:
Cross-platform, Disconnected State Mobile Application Development – Jonathan Krentel, Gunnison Consulting Group, Inc.
Sandra Dyer, U.S. Census Bureau
It's in the Clouds: Electronic Data Collection – Susan Harris, Energy Information Administration
Custom Dash Board for Comprehensive Project and Facility Management on a Large Scale Study – Maria Hobbs, RTI International
David Forvendel, RTI International
Expanding DCAS (Data Collection Application Suite) for Survey Data Collection on Mobile Platforms – Michael Volynski, InfoPro Systems, Inc.
Integrated Sample Management – Steven Lehrfeld, Mathematica Policy Research, Inc.

Web-Based Surveys

Thursday, March 29, 9:00-noon
Web-based surveys continue to increase in popularity, and advances in technology have made them easier to develop and deploy. Given their popularity, we are increasingly interested in many aspects of web surveys, such as creating and testing web-based instruments, sample management, web survey deployment on smartphones and tablets, usability, and the always difficult task of maintaining a high response rate. In this session, we will explore several technical developments in web surveys, usability testing, case management, and methods to increase web response rates. Audience input will be strongly encouraged.
Target Audience: Survey managers and researchers
Coordinators: Mark Brinkley, Mathematica Policy Research, Inc.
Kirsten Barrett, Mathematica Policy Research, Inc.
Presentations:
Household Survey and Establishment Electronic Data Collection – Mike Hart, UK Office for National Statistics
Lessons Learned: Adding Support for a Foreign Language to the CES Data Collection Web Site – Julie Hatch, Bureau of Labor Statistics
Testing Complex Web Surveys Using Automated Test Tools – Anwar Mohammed, RTI International
Gilberto Munoz, RTI International
Role of Survey Management Systems in Web Based Data Collection – Bryan Davis, Westat

Going Mobile: Ensuring Security of New Data Collection Platforms

Thursday, March 29, 9:00-noon
To reduce cost and increase response rates, it makes business sense to consider engaging with collection staff and respondents through the types of devices and social networking sites with which they are familiar.

However, can we ensure adequate security for these methods? Can we say "yes" to data collection, and/or related activities, via tablets, smartphones, and Facebook?

In this session we will explore the challenges and potential solutions involved in ensuring that data collection activities using mobile devices or social media sites maintain adequate security and compliance protections.

Target Audience: Anyone interested in knowing more about these technology trends and how they might be safely employed for information collection activities. While some material will be technical in nature, the presentations as a whole will be accessible to any audience.
Coordinators: Paul Blahusch, Bureau of Labor Statistics
Bill Connett, Survey Research Center, Un. of Michigan
Presentations:
Mobile: Ensuring Security of New Data Collection Platforms – Paul Blahusch, Bureau of Labor Statistics
Mobile Device Data Collection and its Security Attack Surfaces – Glenn Jones, Mathematica Policy Research
Security Control for Utilizing Social Networking Sites – Diana Salazar, NORC

Blending CASIC Designs with Administrative Records

Thursday, March 29, 9:00-noon
With decreasing response rates and increasing survey costs, interest in ways to incorporate administrative records in surveys has surged. These ways range from abandoning surveys entirely (e.g., censuses in some European countries) to closely integrating survey questions and administrative records data in early stages of survey design. This session will present some examples of innovative uses of administrative records in household and establishment surveys, and discuss their impact on survey quality and costs.
Target Audience: All who are interested in reducing survey costs and increasing survey quality.
Coordinator: Brad Edwards, Westat
Presentations:
Use of Administrative Records in NCES Secondary and Postsecondary Sample Surveys – Kristin M. Dudley, RTI International
Moving towards convergence: The National Immunization Survey (NIS) and Immunization Information Systems (IIS) – Stacie Greby, CDC NCHS
Karen Cullen, CDC NCHS
Ken Copeland, NORC at the University of Chicago
Vicki Pineau, NORC at the University of Chicago
Sabrina Bauroth, NORC at the University of Chicago
The Effect of Reporting Mode on Administrative Records: Are We Sacrificing Quality for Convenience? – Marilyn Worthy, Energy Information Administration
Danielle Mayclin, Energy Information Administration
Administrative Records as a Potential Source of Data for Expenditure Surveys – Sid Schneider, Westat
David Cantor, Westat
Brad Edwards, Westat
Abie Reifer, Westat
 
