2008 Federal CASIC Workshops
 
 
Dates: Tuesday, March 11 through Thursday, March 13, 2008.
Place: Bureau of Labor Statistics Conference Center, Postal Square Building,
2 Massachusetts Ave., Washington, D.C. 20212
Sponsors: The U.S. Census Bureau and the Bureau of Labor Statistics
 

Workshops Program

Opening Day - Tuesday, March 11, 2008

A. Plenary Sessions (Tuesday 9:00 am to 12:00 noon)

The meetings will begin with two consecutive 80-minute plenary sessions.

P-1. Opening Keynote Plenary: Tentative Title – Mode Effects, Mode Preferences, Low Response Rates and Other Challenges Facing CASIC.

Don Dillman is Regents Professor and the Thomas S. Foley Distinguished Professor of Government and Public Policy at Washington State University, where he has been a faculty member since 1969. In 1970 he was the founding coordinator of its Public Opinion Laboratory, one of the first university telephone survey centers in the United States. His books Mail and Telephone Surveys (1978) and Mail and Internet Surveys (2000) were among the first to describe procedures for conducting surveys by these modes. In 1991 he was the first person to serve in the Office of the Director as the U.S. Census Bureau’s Senior Survey Methodologist, where he provided leadership for development of data collection procedures for the 2000 Census. His current research emphasizes understanding differences between aural and written communication and the effects of such differences on the conduct of mixed-mode surveys.

In his talk, Dillman will reflect on the historical development of Computer Assisted Survey Information Collection and discuss some of the challenges now being faced in its application, including mode effects and how the mixture of visual and aural data collection in mixed-mode surveys is creating challenges for survey methodologists. He will also discuss some of the difficulties in convincing people to participate in web surveys and how that may be affected by greater use of the Postal Delivery Sequence File to sample households, and creative efforts to use postal methods to obtain web responses. His focus in this presentation will be on methodological challenges of linking the past to the future.

Don Dillman <dillman@wsu.edu>.

P-2. Panel Plenary: What are the key challenges and issues facing the CASIC community today?

A panel of experts from both government and private survey agencies will present their views. Time will be reserved for audience participation.

Presentation Materials:

Panelist | Organization | Topic | Material(s)
Thomas Clark <clark.thomasm@bls.gov> | BLS | Deploying New Technologies in the Field - Challenges and Benefits | presentation (PDF, 568 KB)
Jim O'Reilly <jimoreilly@westat.com> | Westat | CAI Challenges | presentation (PDF, 119 KB)

Coordinator: Bill Mockovak <mockovak.william@bls.gov>.


B. Concurrent Sessions (Tuesday 1:30-4:30 pm)

B-1. Recent Innovations and Lessons Learned at Participating Organizations

This session has replaced the traditional Round Robin Organizational Reports. Following the model of past years, the organizational reports will be voluntary: only organizations that have recent innovations to share with their colleagues are asked to report. Because presentations in this session are generally limited to 10 minutes each, we ask that they be focused on true innovations. Descriptions of new or continuing surveys using familiar CASIC methods may be distributed as handout supplements rather than included in the verbal presentation. The innovations may be in organization, types of surveys undertaken, software, hardware, communications, training, research, or other areas.

Schedule of Speakers:

Presenter(s) | Organization | Time
Nicholls & Bosley | BLS and Census Bureau (Introduction) | 1:30
Kathy Mele | Bureau of Labor Statistics | 1:35
John Mamer | Mathematica | 1:47
Janet Lefebvre | Statistics Canada | 1:59
Patty Maher | SRC-Michigan | 2:11
Randy ZuWallack | Macro International Inc. | 2:23
Lon Hofman | Blaise (Statistics Netherlands) | 2:35
BREAK | | 2:50
Cheryl Landman | Census Bureau | 3:05
Jane Shepherd | Westat | 3:17
Karen Davis | Research Triangle Institute | 3:29
Judy Petty | NORC | 3:41
(available slot) | | 3:53
Merrill Shanks | CASES (U.C. Berkeley) | 4:05
Bill Nicholls | Summary | 4:20

Time remains for one additional organization to present. To participate in this session, please send the name of your spokesperson(s), their general topic area(s), and approximate time required for presentation to the coordinators listed below. If you plan to participate but can’t supply the details yet, let us know so we can reserve a place for you on the agenda.

Coordinators: Bill Nicholls <wlnicholls2@verizon.net> and John Bosley <jandpbosley@verizon.net>.

B-2. Software and Application Demonstrations

This year we will continue to offer demonstrations of CASIC instruments and software in a mini exhibit hall setting, where attendees can move among exhibitors throughout the demonstration period. Space is anticipated for nine (9) concurrent demonstrations.

Please note that BLS LAN security rules have been tightened. Exhibitors are prohibited from using non-BLS computers to access the internet through the BLS LAN. Wireless access is not available. A limited number of BLS computers will be available if internet access is required. Non-BLS computers may be used for stand-alone presentations. For additional technical information, contact the session coordinator.

Organizations interested in demonstrating should send a brief proposal describing the nature of the demonstration and equipment required to Louis Harrell <harrell.louis@bls.gov>.

Only representatives of Federal agencies or Federal survey contractors may make presentations. Software vendors may participate in demonstrations only when invited by a Federal agency or Federal survey contractor to assist in its presentation.

Coordinator: Louis Harrell <harrell.louis@bls.gov>.

B-3. Multimode Data Quality and Comparability

As more surveys are conducted using two or three modes of data collection, there is concern about the comparability and quality of the data collected in each mode. While it is easy to determine that an item shows a different response pattern (e.g., mean, standard error) between modes, it is much more difficult to determine whether these differences are due to the intrinsic nature of the modes themselves or to the characteristics of those who participate in each mode. Presentations in this session will be based on real-world surveys and will discuss predictors of response by mode, compare estimates between modes, and evaluate the size of mode effects in key survey estimates. They will describe the techniques used to investigate these mode differences and their resulting conclusions. While statistical in nature, the presentations will be targeted to a non-technical audience.
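
As a concrete illustration of the confounding the session addresses, the minimal sketch below (not drawn from any of the presentations; the data and field names are invented) contrasts a naive between-mode comparison with a within-stratum comparison that holds one respondent characteristic fixed:

    # Illustrative sketch only: invented data, hypothetical field names.
    from collections import defaultdict
    from math import sqrt
    from statistics import mean, stdev

    # Each record: (mode, age_group, item_value).
    records = [
        ("web",   "under40", 3.1), ("web",   "under40", 2.8), ("web",   "40plus", 3.9),
        ("phone", "under40", 3.4), ("phone", "40plus",  4.2), ("phone", "40plus", 4.0),
    ]

    def mean_se(values):
        """Mean and standard error of a list of item values."""
        n = len(values)
        se = stdev(values) / sqrt(n) if n > 1 else float("nan")
        return mean(values), se

    # Naive comparison: confounds mode with who responds by each mode.
    by_mode = defaultdict(list)
    for mode, _, value in records:
        by_mode[mode].append(value)
    for mode, values in sorted(by_mode.items()):
        m, se = mean_se(values)
        print(f"{mode}: mean={m:.2f}, se={se:.2f}")

    # Within-stratum comparison: fixing one respondent characteristic
    # brings the remaining difference closer to a pure mode effect.
    by_stratum = defaultdict(lambda: defaultdict(list))
    for mode, age_group, value in records:
        by_stratum[age_group][mode].append(value)
    for age_group, modes in sorted(by_stratum.items()):
        print(age_group, {mode: round(mean(v), 2) for mode, v in modes.items()})

In practice the presenters use survey weights and model-based adjustments; the stratified comparison above only gestures at the underlying idea.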

Schedule of Speakers:

Presenter(s) | Organization | Title | Time
Introduction to Session | | | 1:30
Karen Grigorian | NORC | Mode Assignment Analysis for NSF's 2006 Survey of Doctorate Recipients | 1:45
Catherine M. Simile | NCHS | Comparing Selected Health Indicators from the NHIS by Mode of Administration | 2:15
BREAK | | | 2:45
James Caplan | DMDC | Administrative Methods of Increasing Response Rates | 3:00
David Cantor | Westat/JPSM | Evaluating Mode Effects in Mixed Mode Designs | 3:30
Discussion and Questions | | | 4:00

Coordinators: Debra Wright <dwright@mathematica-mpr.com> and Brad Edwards <bradedwards@westat.com>.


WEDNESDAY MORNING - March 12, 9:00 am - 12:00 noon
Concurrent Sessions

WA-1. Web-Based Surveys

Web-based surveys continue to increase in popularity across many content areas. This session will discuss a variety of topics including, but not limited to: web-survey design, web-survey implementation, security regulations, respondent contact, and data processing.

Coordinators: Duston Pope <duston_pope@marketstrategies.com> and Andrew Zukerberg <andy_zukerberg@gallup.com>.

WA-2. Security in CASIC Surveys

This session will examine current security issues and solutions related to the collection and protection of respondent data. The focus will be on the uses and limitations of mandated procedures, such as encryption, and on realistic, in-the-trenches issues and their resolution.

Initial Coordinators: Bill Connett <bconnett@isr.umich.edu> and Jim Kennedy <kennedy.jim@bls.gov>, BLS.

WA-3. Challenges of Multiple, Mixed Mode Sample Management and Process Integration

This session will discuss the technical challenges of building and implementing a fully integrated, multiple mixed-mode survey. These include developing sample/case management systems that allow sample/case movement, management, and reporting across collection modes (both before and during data collection), and converting surveys from one mode to another: paper to web, paper to CATI/CAPI, and CATI/CAPI to web.
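
To make the case-movement requirement concrete, here is a minimal sketch (not any organization's actual system; the mode names and transition rules are assumptions) of a case record that can move between collection modes while keeping its transition history for reporting:

    # Illustrative sketch only: modes and allowed transitions are assumptions.
    from dataclasses import dataclass, field
    from datetime import datetime

    # Mode conversions of the kind named in the session description.
    ALLOWED = {
        "paper": {"web", "cati", "capi"},
        "cati": {"web", "capi"},
        "capi": {"web", "cati"},
        "web": set(),  # assumption: no conversion out of web in this sketch
    }

    @dataclass
    class Case:
        case_id: str
        mode: str
        history: list = field(default_factory=list)

        def move_to(self, new_mode: str) -> None:
            """Move the case to another collection mode, recording the change."""
            if new_mode not in ALLOWED[self.mode]:
                raise ValueError(f"cannot move {self.case_id} from {self.mode} to {new_mode}")
            self.history.append((self.mode, new_mode, datetime.now()))
            self.mode = new_mode

    case = Case("C-0001", "paper")
    case.move_to("web")  # a paper-to-web conversion
    print(case.mode, case.history)

The history list is what makes between-mode reporting possible; a production system would persist it and attach outcome codes, but the state-transition idea is the same.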

Coordinators: Mark Pierzchala <mpierzchala@mathematica-mpr.com> and Gina-Qian Cheung <qianyang@umich.edu>.

WA-4. Collection and Use of Survey Paradata: Using Paradata for Production Management

Paradata (Couper 1998) is now a familiar and widely accepted term used to describe data about the survey process, including response rates, sample line contact histories, and survey item keystrokes and times. Most organizations have been collecting paradata for many years, and increasingly are building tools to assist in monitoring processes using paradata. In spite of the availability of paradata, a challenge many organizations face is changing a production management culture from one that does not use paradata for “active management” to one that does, and integrating the use of paradata into production management by both central and remote managers. In this session, several organizations will present how they use paradata in production management, and where applicable, efforts they have made to change their survey management cultures to more effectively use paradata.
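
As a small illustration of "active management" from paradata, the sketch below (invented records and hypothetical field names, not any organization's system) rolls contact-attempt paradata up into the per-interviewer figures a production dashboard might display:

    # Illustrative sketch only: the contact-history layout is an assumption.
    from collections import Counter, defaultdict

    # Each record: (interviewer_id, case_id, outcome).
    attempts = [
        ("int01", "C1", "no_answer"), ("int01", "C1", "complete"),
        ("int01", "C2", "refusal"), ("int02", "C3", "no_answer"),
        ("int02", "C3", "no_answer"), ("int02", "C4", "complete"),
    ]

    # Tally attempt outcomes for each interviewer.
    per_interviewer = defaultdict(Counter)
    for interviewer, _, outcome in attempts:
        per_interviewer[interviewer][outcome] += 1

    # The kind of summary a central or remote manager might review daily.
    for interviewer, outcomes in sorted(per_interviewer.items()):
        total = sum(outcomes.values())
        completes = outcomes["complete"]
        print(f"{interviewer}: {total} attempts, "
              f"{completes} completes ({completes / total:.0%})")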

Coordinators: Sue Ellen Hansen <sehansen@isr.umich.edu> and Chris Stringer <mark.c.stringer@census.gov>.


WEDNESDAY AFTERNOON - March 12, 1:30 - 4:30 pm
Concurrent Sessions

WP-1. CAI Instrument Documentation Tools – Part of the Solution

The problem of documenting just what is in complex instruments, and why, has been a difficult one to deal with. A completely satisfactory solution will involve the integration of instrument documentation with broader metadata for a study or project as a whole. Keeping that overall perspective in mind, this session will focus on new tools to document CAI instruments themselves, following the XML conventions developed by the Data Documentation Initiative (DDI) specifically for instruments.
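
For a flavor of what machine-readable instrument documentation looks like, here is a minimal sketch that serializes one question and its routing logic as XML. The element names are illustrative placeholders, not the actual DDI instrument schema:

    # Illustrative sketch only: element names are placeholders, not real DDI tags.
    import xml.etree.ElementTree as ET

    def document_question(question_id: str, text: str, routing: str) -> ET.Element:
        """Build an XML fragment describing one CAI question and its routing."""
        q = ET.Element("Question", attrib={"id": question_id})
        ET.SubElement(q, "QuestionText").text = text
        ET.SubElement(q, "Routing").text = routing  # records why the question is asked
        return q

    instrument = ET.Element("Instrument", attrib={"name": "demo"})
    instrument.append(document_question(
        "Q1", "How many hours did you work last week?", "asked if EMPLOYED == yes"))
    print(ET.tostring(instrument, encoding="unicode"))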

The session will include the following topics:

Coordinators: John Ladds <john.ladds@statcan.ca> and Tom Piazza <piazza@berkeley.edu>.

WP-2. Current Administrative Issues in CAI Survey Organizations

Expertise in survey methods and CAI technology is not enough for today's managers in CAI organizations. Managers must be versed in project management, risk management, and security as they work to convince many masters that they are working efficiently, effectively, and securely. And they must be versed in marketing and therapy as they convince potential recruits to join their organizations and current staff to stay.

This session will provide a venue for those grappling with these challenges to share their knowledge and learn from others. A panel of 4-5 management experts from government and industry will discuss several management challenges, listed below. Audience participation in the form of questions and shared experiences will be encouraged. Session attendees will hear about the techniques used in different organizations to address key management issues, participate in a discussion of these issues, and have the opportunity to ask the panelists about effective approaches to common situations. Specific areas to be addressed include:

Coordinators: Anne Stratton <aks1@cdc.gov>, NCHS, Jane Shepherd <janeshepherd@westat.com>, Westat, and Karen Davis <kdavis@rti.org>, RTI.

WP-3. Strategies, Tools and Methods for Improving Response Rates on Establishment Surveys

In today's changing environment, survey organizations have to find new and innovative ways to improve both the response rates and the quality of response on their establishment surveys. Businesses as data providers are dealing with new technologies, new forms of business organization, and increasing demands on scarce business resources. All of these affect the way businesses interact with survey organizations, including their willingness to provide good-quality data as part of our statistical programs. Several large government organizations have made gains in working with establishments to gain better cooperation and better-quality responses. This session will have a panel of agency representatives discuss their successes and lessons learned as they tackle this problem.

The panel includes:

Coordinator: Deb Stempowski <deborah.m.stempowski@census.gov>.

WP-4. Survey Uses of Metadata

Metadata are data that describe other data or processes. They are used to document design decisions and to drive processing in an automated fashion. For users of data, the metadata are the record of how those data were produced and what the data mean. As Phil Rones, Deputy Commissioner of BLS, puts it, metadata are analogous to the work you had to show when solving a math problem in high school. In order to understand the data a survey produces, you must know the steps that were taken to conduct that survey.

Metadata can be simple, not convey a lot of information, and be relatively easy to capture. On the other hand, they can be detailed, convey much information, and be hard to capture. How does a statistical agency get over the hump and begin to capture "enough" metadata? And, how does the agency decide what is enough?

Simple or complex, metadata usually don't help the person tasked with capturing them; they are used by others later in the survey life-cycle, so a measure of altruism is required to obtain metadata. How, then, does the agency make it worthwhile for survey workers to capture metadata?

Users obtaining data and metadata from multiple sources want to be able to compare data even though the data may be defined and organized differently across those sources. The metadata exist to help the user understand each data set and work on analyses or harmonization. But does the same problem arise for the metadata themselves? Do the metadata need their own metadata to be understood? Metadata standards can solve this problem, but much cooperation is required.

Survey work provides many opportunities to use metadata fruitfully, throughout the survey life-cycle. This session will explore some of these, motivated by the questions above. Since the possibilities are so many and varied, this session can focus on only a few each time. Examples include survey conceptualization, data, tables, designs (sample, question, or database), definitions, classifications, and others.

Schedule of Speakers:

Sheila Proudfoot, U.S. Census Bureau
Experiences with Developing and Using Metadata-driven Processing Systems for the Economic Census
Abstract:

The Economic Census began utilizing metadata for forms design and dissemination processing systems in 1997. Using metadata-driven systems provided improved information management to disseminate data products in print and electronic media. In 2002 we developed and improved standards to address inconsistencies in presentation. Additionally, all Economic Census questionnaires were made available in electronic and paper media and efficiencies were incorporated into the review and release of data products on the Internet and DVD. This discussion will focus on our experience developing and using metadata-driven processing systems and how we continue to gain efficiency, consistency, and electronic capabilities.

Paul Bugg, Office of Management and Budget
Citizen Access to Federal Statistics: Scenario 2020
Abstract:

Individuals want access to Federal statistical data. They wish to learn the characteristics of different areas, what is going on in business and agriculture, or what to expect with regard to inflation and interest rates.

Currently, FedStats provides a single portal for federally collected data sets and for documents based on those data. Still, one can't ask what the economic impact of locating a particular new business in a town would be. In 2020, such a query might trigger a series of questions to acquire more details about that business and user.

Realizing this vision will require IT innovation on several fronts, all dependent on much broader implementation of consistent metadata.

Renee Miller and Shawna Waugh, Energy Information Administration
Repository of Survey Information at EIA
Abstract:

The Energy Information Administration (EIA) initiated a project to develop a repository of survey information and the capability to generate reports that use this information.  This initiative is intended to enhance transparency, consistency, and knowledge sharing.

This repository will contain information on frame, form design and testing, sample design, weighting and estimation, and uses of data.  Another feature will be the capability to generate standard and ad hoc reports that can be used to prepare portions of the Office of Management and Budget clearance package, explanatory notes in EIA publications, and survey summary documentation for EIA staff and customers.

Pascal Heus, Open Data Foundation
Data Documentation Initiative (DDI), version 3.0
Abstract:

The Data Documentation Initiative (DDI) is a metadata standard for the documentation of surveys and datasets widely used in social sciences. This presentation outlines the importance of metadata best practices in data production, archiving and dissemination, and outlines how the DDI has evolved since its inception in 1995. It will also provide an overview of the new version 3.0 of the DDI to be released later this year.


Coordinator: Dan Gillman <gillman.daniel@bls.gov>.


THURSDAY MORNING - March 13, 9:00 am - 12:00 noon
Concurrent Sessions

TA-1. The Future of CAI Application Development

This session will focus on trends toward the future in development of survey instrumentation. Topics might include instrumentation on non-laptop platforms such as cell phones, use of code-generating or specification-editing systems, instruments which branch out to or communicate with external software, conversion of specs to instruments or existing instruments to updated specs, use of languages other than English and Spanish, interfaces which reduce burden or improve quality, or any other topic related to new tools, approaches and methods for development of CAI questionnaires.

To participate in this session, please send the name of the speaker and a brief description of the topic to the coordinators by February 15, 2008. If there is a speaker you would like to hear, you may suggest names and provide contact information.

Coordinators: Mike Haas <michael.edward.haas@census.gov> and Rita Thissen <rthissen@rti.org>.

TA-2. New Technologies for Surveys

This session is geared to presentations of “leading edge” technologies – in production, in development, or in gestation – that offer promise for improving data collection and data management. Topics are always dependent on contributors' offerings, but could include data aggregation from multiple sources or sites, social computing for research collaboration, or innovative application design to solve old problems. Based on submissions to date, developments in Mobile Computing (cell phones, 3G technology, etc.) will be the topic of at least two presentations.

Coordinator: David Uglow <duglow@rti.org>.

TA-3. Event History Calendars: A Review of the State of the Art

An Event History Calendar is a module in a survey instrument that reconstructs a series of events in a person's life or over a shorter span of time, such as a 24-hour period. Events must be ordered sequentially and assigned a date (or time). Recalling events can be challenging for the respondent, and already-collected data may be revised many times. The interview itself tends to be more conversational and less scripted than traditional interviewing. The module must be flexible enough to allow correction of mistakes such as a wrong date or time or mis-ordered events, insertion of forgotten events, and deletion of spurious events. Other actions must be supported as well, such as marking an event as an anchor event (an event known to have occurred at a certain date or time). As the event history is constructed, there should be continual checks to ensure data integrity. The module may incorporate memory aids such as dates of widely known events.

Event history calendars can be programmed in specially designed software modules that are hooked into a larger instrument, or they may be programmed into a standard CAI survey system itself. In either case, there is a need for special interviewing training. The programming behind this kind of module is difficult and resembles time travel in many respects. Several examples of event history calendars will be offered from a variety of organizations and surveys. Short demonstrations of the modules will be offered as well as a narrative that explains the approach taken and associated issues.
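
As a concrete sketch of the behaviors described above (sequential ordering, anchor events, inserting forgotten events, deleting spurious ones, and continual integrity checks), the following minimal module is illustrative only and not drawn from any production system:

    # Illustrative sketch only: field names and rules are assumptions.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Event:
        when: date
        label: str
        anchor: bool = False  # anchors are known to have occurred on this date

    class EventHistoryCalendar:
        def __init__(self):
            self.events = []

        def insert(self, event):
            """Insert an event (possibly a forgotten one) in date order."""
            self.events.append(event)
            self.events.sort(key=lambda e: e.when)
            self._check()

        def delete(self, label):
            """Remove a spurious event; anchor events may not be deleted."""
            self.events = [e for e in self.events if e.label != label or e.anchor]
            self._check()

        def _check(self):
            """Continual integrity check: events stay in chronological order."""
            assert all(a.when <= b.when for a, b in zip(self.events, self.events[1:]))

    ehc = EventHistoryCalendar()
    ehc.insert(Event(date(2007, 6, 1), "moved to Ohio", anchor=True))
    ehc.insert(Event(date(2007, 3, 15), "changed jobs"))  # forgotten event, added later
    ehc.delete("changed jobs")  # remove a spurious report
    print([event.label for event in ehc.events])

A real module would also support conversational probes and memory aids keyed to widely known dates; the sketch shows only the data-integrity core.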

Coordinators: Mark Pierzchala <mpierzchala@mathematica-mpr.com> and Jason Fields <jason.m.fields@census.gov>.

TA-4. Accessibility in CASIC Surveys

It can be challenging to ensure that our CASIC applications comply with Section 508 regulations and are accessible to all users. Presenters in this session will explain some of the basics of Section 508, introduce the issues associated with accessibility and CASIC surveys, and share some real-world experiences and lessons learned.

Topics will include:

Coordinator: Jean Fox <fox.jean@bls.gov>.