New study highlights survey vulnerabilities

Some 8.3 percent of California teens smoke cigarettes. No, wait, make that 4.5 percent. Or is it really 14.2 percent?

It can be hard for public health researchers, much less the public, to know which numbers to believe when surveys produce such varying results. Epidemiologists often face this challenge, as do politicians and any other group that relies on polls, questionnaires and surveys about people's behavior or beliefs.

A researcher at the University of California, Berkeley, highlights this problem in a new study assessing two different survey methods designed to estimate the prevalence of teen smoking in California. The paper, to be published later this month by Oxford University Press in the Winter 2004 issue of the journal Public Opinion Quarterly, tested a relatively new telephone survey method and compared it with an existing one. The researcher found the two methods yielded significantly different results.

Both methods involved interviews with adolescents 12 to 17 years of age in California in a survey conducted by the Gallup Organization between April 5 and July 6, 2000. Telephone interviewers contacted a random sample of households and asked permission to interview any adolescents at home. Half of the adolescents were interviewed using the newer automated method, and the remaining half completed the survey using the standard interviewer-administered method. More than 2,400 interviews were completed, and the overall survey response rate was 49 percent.

In the telephone computer-assisted self-interviewing (T-ACASI) method, participants listened to pre-recorded, computer-controlled questions and responded by pressing the keypad on a touch-tone telephone. In the computer-assisted telephone interviewing (CATI) method, interviewers asked the questions and entered responses into a computer. The questions were the same in both surveys.

The automated T-ACASI survey produced an estimate that 8.3 percent of teens had smoked in the prior 30 days. In comparison, the CATI survey yielded a significantly lower estimate of current teen smokers, 4.5 percent. Both figures are lower than the 14.2 percent prevalence found in a school-based survey of California teens conducted in 2000.

School-based surveys are the most common tools used to estimate behavior among adolescents and children, said Joel Moskowitz, director of UC Berkeley's Center for Family and Community Health and author of the study. But school surveys miss many high-risk youth who are often not in class and may be more likely to smoke or do drugs.

Still, students who take a survey at school may feel more open about reporting high-risk behavior there than at home because they are in a setting with peers, the report said.

Notably, 59 percent of respondents in the CATI survey reported that a parent could hear all or part of the interview, compared with 42 percent of respondents in the T-ACASI survey. The perception of a lack of privacy could have led to lower estimates of smoking in both telephone surveys, said Moskowitz.

Yet phone interviews have increased in popularity over the years as nearly all households in the United States now have phones, according to the report. Calling homes may reach some of the high-risk youth that school-based surveys do not. Phone surveys also tend to be more cost-efficient.

At the same time, response rates for phone surveys are declining as answering machines and telemarketing lead people to screen calls to their home phones, said Moskowitz. In addition, more and more people have cell phones and are becoming less dependent on their landlines. And when people do respond, their answers vary depending upon the survey method used.

"In the phone survey where respondents spoke with a live interviewer, they may have underreported their smoking behavior because of a perceived lack of confidentiality, even though the two survey groups were equally anonymous," said Moskowitz. "The youths may have felt more comfortable revealing their smoking habits to a computer rather than a person."

Moskowitz noted what pollsters and survey takers have already discovered: People tend to underreport beliefs or behaviors, such as smoking, that go against what is considered "socially desirable," especially if they are interacting with an interviewer. "That's also why people tend to overreport desirable behavior, such as voting frequency," he said.

A pre-recorded, computer-controlled system takes the human interviewer out of the equation, which could potentially help when surveys are targeting high-risk behavior such as substance abuse, he said.

Pollsters are aware of these drawbacks, and statisticians have come up with methods to compensate for the potential biases. For instance, researchers weight the responses from groups that have a low response rate in an attempt to reflect their true representation in a population. However, that procedure has its downsides.

"Weighting the data can sometimes do more harm than good," said Moskowitz. "You're assuming that the people who respond to the survey are representative of the non-respondents, and that may not always be the case."

Despite the concerns, Moskowitz sees the necessity of surveys and polls. However, he urges people to take a closer look at how surveys are conducted before accepting the results. "These findings underscore the need for people to be wary of survey results because so much depends upon the methodology used," said Moskowitz. "It's important not to take survey results at face value."

He suggested some questions to ask in evaluating a survey:

  • How large is the sample size and how high is the response rate? The bigger the better.
  • Who sponsored the survey? Who conducted the survey? Are they independent?
  • How was the survey worded? The text of instructions or explanations might affect responses.
  • How representative is the population surveyed? What were the methods by which respondents were selected?

More detailed information on evaluating polls and surveys can also be found online at the Web site of the American Association for Public Opinion Research (http://www.aapor.org). The site includes a section specifically for journalists who report on poll results.

The study was funded by the University of California Tobacco-Related Disease Research Program.
