Feature Stories - 2016

Data Science and Advancing Survey Methods: Abt SRBI Observations from AAPOR 2016

    Release Date: 6/27/2016

    Rapid changes in the fields of opinion and behavior research were on full display at the 2016 meeting of the American Association for Public Opinion Research (AAPOR) in Austin, TX, which focused on the theme “Reshaping the Research Landscape: Public Opinion and Data Science.” Data science and survey methodology experts from Abt SRBI (a subsidiary of Abt Associates) presented across a range of topic areas, with the goal of furthering the industry’s understanding of changing methods and promoting dialogue on new measurement techniques:
     
    DATA SCIENCE
    Abt SRBI President and CEO (and AAPOR Past President) Michael Link facilitated a cross-industry panel (Annie Pettit, Peanut Labs; Craig Hill, RTI International; Frauke Kreuter, University of Maryland; and Fred Conrad, University of Michigan) on defining data science and understanding the increasing role it now plays in understanding attitudes and behaviors. The discussion centered on defining “data science” and its relation to more traditional techniques, such as survey research; identifying the steps being taken by both larger and smaller research organizations to leverage new data science techniques; and the mindset changes and skill sets required to succeed in this new arena. The panel concluded that data science is a critical and necessary component for more fully understanding behaviors and opinions, and that it often works best when combined with survey research, which provides context and a broader understanding of the phenomena being studied.
     
    At the World Association for Public Opinion Research (WAPOR) conference, held in conjunction with AAPOR, Abt SRBI Chief Research Officer Mark Schulman provided more detail on how opinion and behavior research are changing in light of new data science techniques in a presentation entitled “‘The Future Ain’t What It Used to Be’: A Shifting Paradigm.” He presented the case for data science replacing surveys in some instances, yet augmenting and furthering the utility of surveys in others.
     
    TELEPHONE SURVEY METHODOLOGY
    Barbara Fernandez with co-authors from Abt SRBI and the University of Colorado presented results of a split-sample experiment in which cell phone sample members with addresses matched by a sample vendor were mailed an advance notification letter. They found that the advance letter neither improved response rates nor affected the demographic composition of the sample; only one-fifth of the treatment group remembered receiving it. However, large differences were found between the matched and non-matched samples: the latter had lower income and education levels and were more likely to be renters and to be Hispanic.
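
    The headline finding (no improvement in response rates) corresponds to a non-significant difference in a two-sample comparison of proportions. A minimal sketch of such a test is below; the counts are hypothetical, purely to illustrate the mechanics:

        # A minimal sketch (hypothetical counts) of comparing response rates
        # for the advance-letter vs. no-letter groups.
        from statsmodels.stats.proportion import proportions_ztest

        responded = [412, 398]   # completed interviews: [letter, no letter] (hypothetical)
        sampled = [2500, 2500]   # sample sizes per group (hypothetical)

        stat, pval = proportions_ztest(count=responded, nobs=sampled)
        print(f"z = {stat:.2f}, p = {pval:.3f}")  # a large p-value is consistent with "no effect"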
     
    Raphael Nishimura with co-authors from Abt SRBI and the Washington Post evaluated measures to improve cell phone coverage using a “targeted” cell sample (cell phone numbers whose owners are known to the sample vendor to reside outside the numbers’ expected geographic area) and registered voter lists. In Philadelphia, the targeted out-of-area cell phone sample was found to be younger, far more highly educated (76% college and above, vs. 23% in the regular cell RDD sample), far more likely to be white (70%, vs. 23% in cell RDD), more likely to be employed, and higher-income. All in all, this is a very specific demographic group that is hardly representative of the cell phone population. The registered voter lists, used in the Washington, DC portion of the study, also exhibited biases towards higher socio-economic status.
     
    A targeted cell sample was also evaluated by Nicole Lee with co-authors from Abt SRBI and the New York City Department of Health and Mental Hygiene. They found that while the targeted cell phone sample screened at a higher rate than the regular cell RDD sample, its eligibility rate was much lower, resulting in completion rates half those of the regular cell RDD. They estimated that the targeted cell phone sample picked up only 4% of the otherwise uncovered population (cell-phone-only households with out-of-area numbers).
     
    Rachel Martonik with co-authors from Abt SRBI and the New York City Department of Health and Mental Hygiene looked at trends in landline (LL) and cell phone (CP) frame performance in the New York Community Health Survey over 2010–2015, during which time the cell allocation increased from less than 10% to more than 50%. They found that working number rates and contact rates for LL declined each year, while interviewer hours per complete increased for both LL and CP, as did refusal rates. Performance of the cell frame was relatively stable over the period, with a slight decrease in contact rates and more attempts required to resolve incomplete cases. Increased cell allocations did, however, help capture a greater number of hard-to-reach respondents, such as those of non-NH-White race/ethnicity (up from 57.6% in 2010 to 65.2% in 2015), non-English interviews (up from 17.1% in 2010 to 26.1% in 2015), and those with incomes below poverty (from 18.2% in 2010 to 23.0% in 2015).
     
    Andrew Evans with co-authors from Abt SRBI, the DE Division of Public Health, the VA and GA Departments of Public Health, and the OJ and NJ Departments of Health investigated the performance of the cell sample in BRFSS surveys in these states in 2010–2015. The cell phone frame response rate improved over these years, and the productivity of the cell frame improved drastically when BRFSS vendors started using CellWINS activity flags in 2015 to screen out numbers likely to be non-working. In Delaware and Georgia, the 2015 cell phone sample was more efficient than the landline sample. For health surveys specifically, landline-only households remain an important population for health surveillance, as they tend to be older and have more chronic health conditions. Dual telephone users, meanwhile, are increasingly difficult to survey on landline phones as they rely more heavily on their cell phones.
     
    WEB SURVEY METHODOLOGY
    Using a multi-national survey of scientists, Ben Phillips with co-authors from Abt SRBI and Rice University presented a mixed two-way cross-classified model of response time as a function of item-level variables (such as question type and question length) and respondent-level variables (such as gender and academic rank). Phillips found some evidence of cognitive shortcuts taken by the respondents: for example, they stop reading the question after about 80 words in the stem and jump to the answers. Numeric or text entry items were found to be more time-consuming than items that can be selected with the mouse, and offering a “Don’t know” option increases response time. Nonresponse patterns in this international survey were additionally studied by Allison Ackermann with co-authors from Abt SRBI and Rice University.
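
    For readers curious what such a model looks like in practice, below is a minimal sketch of a cross-classified mixed model (crossed random effects for respondent and item) in Python with statsmodels. All variable and file names are hypothetical, and the exact specification used in the talk may differ:

        # A sketch of a cross-classified mixed model for (log) response time,
        # with crossed random effects for respondent and item.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("timings.csv")  # hypothetical: one row per respondent-item pair

        # statsmodels expresses crossed random effects as variance components
        # within a single all-encompassing group.
        df["all"] = 1
        model = smf.mixedlm(
            "log_time ~ question_length + C(question_type) + C(gender) + C(academic_rank)",
            data=df,
            groups="all",
            re_formula="0",  # no random intercept for the dummy group
            vc_formula={"respondent": "0 + C(respondent)", "item": "0 + C(item)"},
        )
        print(model.fit().summary())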
     
    Nick Bertoni with co-authors from Abt SRBI studied the impact of field period length and the number of contact attempts on the representativeness of web surveys, using paradata from the American Trends Panel, which Abt SRBI manages for the Pew Research Center. He found little difference in the demographic composition of early vs. late respondents (the latter defined as those who completed the survey after the third and final reminder).
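
    An early-vs.-late comparison of this kind can be run directly off panel paradata. The sketch below, with hypothetical column names and cutoff rule, shows the mechanics:

        # A sketch of comparing demographic composition of early vs. late
        # web respondents; column names and the cutoff are hypothetical.
        import pandas as pd
        from scipy.stats import chi2_contingency

        panel = pd.read_csv("panel_paradata.csv")  # hypothetical paradata extract
        panel["late"] = panel["reminders_before_complete"] >= 3  # after the 3rd reminder

        for demo in ["age_group", "education", "race_ethnicity"]:
            table = pd.crosstab(panel["late"], panel[demo])
            chi2, p, _, _ = chi2_contingency(table)
            print(f"{demo}: chi2 = {chi2:.1f}, p = {p:.3f}")  # large p: similar composition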
     
    TRAVEL SURVEYS
    Timothy Michalowski and Dara Seidl discussed the relative merits of self-reported, geocoded travel origins and destinations vs. those recorded by a GPS device. They found that, owing to typical parcel sizes, residential parcels have the smallest error (about 120 ft), while commercial, recreational and industrial parcels had errors of 300–700 ft. Thus the home address is likely to be located accurately, while work and school addresses carry larger errors. Other destinations, such as shopping, have the highest levels of error; they also constitute the greatest response burden for travel diary respondents.
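
    The error in question is simply the distance between the geocoded point for the self-reported address and the GPS-recorded point. A minimal sketch, with hypothetical coordinates, is below:

        # Geocoding error as the great-circle distance (in feet) between a
        # self-reported, geocoded location and the GPS-recorded point.
        from math import radians, sin, cos, asin, sqrt

        def haversine_ft(lat1, lon1, lat2, lon2):
            """Great-circle distance between two (lat, lon) points, in feet."""
            R_FT = 20_902_231  # mean Earth radius in feet
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
            return 2 * R_FT * asin(sqrt(a))

        # Hypothetical points: a geocoded home address vs. the GPS trip end.
        print(f"{haversine_ft(40.7128, -74.0060, 40.7131, -74.0064):.0f} ft")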
     
    Josh DeLaRosa with co-authors from Abt SRBI discussed the privacy implications of requesting GPS data when recruiting respondents for a travel study. The researchers observed that refusal rates in travel surveys are higher when respondents are given a GPS logging device than when they self-report their travel information. DeLaRosa put the privacy concerns associated with high-resolution GPS data into the context of other occasions on which members of the general population are asked to share their electronic data with third parties, such as usage-based insurance with discounts for data sharing, government use of personal trackers, and the use of electronic logging devices by long-haul truck drivers.
     
    Stas Kolenikov with co-authors from Abt SRBI and the National Highway Traffic Safety Administration presented an advanced analytical study of distracted driving behaviors. Using structural equation modeling, Kolenikov identified three types of distracting behaviors (interacting with other passengers in the car; “life support” functions like eating and drinking, as well as manipulating in-car controls such as the radio; and using one’s phone for calls, directions, Internet access, etc.) and related them to distracted driving outcomes. He found that police are indeed more likely to stop drivers with a higher propensity to use the phone, and that engaging in the “life support” activities increases the likelihood of a crash or a near crash.
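
    The structure described above can be written down in lavaan-style model syntax; the sketch below uses the semopy package, with hypothetical indicator and outcome names standing in for the actual survey items:

        # A sketch of the three-factor SEM structure described above; all
        # variable names are hypothetical stand-ins for survey items.
        import pandas as pd
        import semopy

        # Measurement model: three latent distracting-behavior factors,
        # then a structural model relating them to the driving outcomes.
        desc = """
        passenger =~ talk_passenger + attend_child + attend_pet
        life_support =~ eat + drink + adjust_radio + adjust_climate
        phone_use =~ call + text + navigate + browse
        crash_or_near_crash ~ passenger + life_support + phone_use
        police_stop ~ passenger + life_support + phone_use
        """

        data = pd.read_csv("distracted_driving.csv")  # hypothetical extract
        model = semopy.Model(desc)
        model.fit(data)
        print(model.inspect())  # loadings and structural coefficients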
     
    Kolenikov also co-taught a conference short course, “Principles and Methods for Weighting Survey Data,” with Trent Buskirk of MSG.
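
    As a flavor of the material such a course covers, below is a minimal sketch of raking (iterative proportional fitting), one of the standard survey weighting techniques; the sample data and population margins are hypothetical:

        # A minimal raking (iterative proportional fitting) sketch; the data
        # and the population margins are hypothetical.
        import pandas as pd

        def rake(df, margins, weight_col="w", iters=50):
            """Adjust weights so weighted margins match population targets."""
            df = df.copy()
            for _ in range(iters):
                for var, targets in margins.items():
                    current = df.groupby(var)[weight_col].sum()
                    df[weight_col] *= df[var].map(pd.Series(targets) / current)
            return df

        sample = pd.DataFrame({
            "sex": ["M", "M", "F", "F", "F"],
            "age": ["18-44", "45+", "18-44", "45+", "45+"],
            "w":   [1.0] * 5,
        })
        # Population targets scaled to the sample size of 5.
        margins = {"sex": {"M": 0.49 * 5, "F": 0.51 * 5},
                   "age": {"18-44": 0.45 * 5, "45+": 0.55 * 5}}
        print(rake(sample, margins))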
     
     
    Reported by:
    Stas Kolenikov, Principal Survey Scientist, Abt SRBI
    Michael Link, President & CEO, Abt SRBI

    For more information on these or related Abt SRBI presentations, please contact S.Kolenikov@srbi.com