Here is some information on how the poll was conducted.
This poll was conducted April 23-May 1, 2024. A sample of 1,479 U.S. adults was interviewed online and by telephone. The estimated margin of sampling error for the entire sample is plus or minus 3 percentage points. The margin of error may be larger for smaller groups within the full sample. Sampling error is only one of the potential sources of error in a survey.
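For readers who want to see where a figure like "plus or minus 3 percentage points" comes from, the sketch below applies the standard formula for the margin of error of a proportion. The design effect value is an assumption added for illustration; KFF's published margin also reflects the loss of precision from weighting, which this simple calculation only approximates.

```python
import math

def margin_of_error(n, p=0.5, confidence_z=1.96, design_effect=1.0):
    """Approximate margin of sampling error for a proportion.

    n: number of respondents
    p: assumed proportion (0.5 gives the widest, most conservative margin)
    confidence_z: z-score for the confidence level (1.96 for roughly 95%)
    design_effect: inflation factor for weighting (an assumption here)
    """
    standard_error = math.sqrt(design_effect * p * (1 - p) / n)
    return confidence_z * standard_error

# For 1,479 respondents, the simple formula gives about +/- 2.5 points;
# allowing for the effect of weighting (a design effect around 1.35,
# assumed for illustration) pushes it to roughly 3 points.
print(round(100 * margin_of_error(1479), 1))                      # ~2.5
print(round(100 * margin_of_error(1479, design_effect=1.35), 1))  # ~3.0
```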
Key challenges in conducting a poll are 1) selecting a representative sample, 2) convincing sampled people to participate in the survey, 3) attending to potential errors in data collection and 4) adjusting the sample to make it more representative. Here is how KFF addressed these challenges in this poll:
Selecting the Sample: Respondents for the KFF Health Tracking Poll were obtained from an online panel and telephone contacts. The online respondents (1,176) and some telephone respondents (25) were members of the SSRS Opinion Panel, a sample of U.S. adults aged 18 or older. SSRS selects addresses from a national list maintained by the U.S. Post Office, giving each address a known chance of being selected. This procedure is called a "probability-based" method. Unlike volunteer or "opt-in" panels, this sampling method allows those who use the SSRS panel to estimate how much their results may differ from the full population by chance, expressed as a "margin of sampling error."
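The defining feature of a probability-based method is that every address on the list has a known chance of being selected. Below is a minimal sketch of that idea, using a made-up address list and sample size; SSRS's actual procedure works from a much larger frame and involves additional design steps.

```python
import random

# Hypothetical address frame; the real frame is the national list of
# residential addresses described above.
address_frame = [f"Address #{i}" for i in range(1, 10_001)]

sample_size = 250
rng = random.Random(2024)  # fixed seed so the draw is reproducible
sampled_addresses = rng.sample(address_frame, sample_size)

# With simple random sampling, every address has the same known
# probability of selection: n / N.
selection_probability = sample_size / len(address_frame)
print(f"Each address had a {selection_probability:.2%} chance of selection.")
```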
SSRS sends invitations to people living at selected addresses to join its panel of respondents. The individuals who agree are then sampled periodically and invited to participate in surveys, such as the KFF Health Tracking Poll.
Because some households in the U.S. do not have internet access or are otherwise hard to reach, SSRS also recruits panel participants by telephone, using a sampling technique called "random digit dialing." For the Health Tracking Poll, KFF supplemented the SSRS panel with 278 additional telephone interviews obtained through another probability-based method: a random digit dialing sample of prepaid cell phone numbers provided by a company called Marketing Systems Group. The phone numbers for this prepaid cell phone component were randomly generated from a cell phone number list targeting Hispanic and non-Hispanic Black respondents and others who are less likely to participate in online panels.
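Conceptually, random digit dialing builds phone numbers by combining known working prefixes with randomly generated remaining digits, so every number in those ranges has a chance of being called. The prefixes in the sketch below are placeholders invented for illustration; the actual prepaid cell phone list came from Marketing Systems Group.

```python
import random

# Placeholder cell-phone prefixes (area code + exchange). In a real RDD
# sample these come from a commercial telephone frame.
CELL_PREFIXES = ["305772", "713284", "917630"]

def generate_rdd_numbers(k, seed=None):
    """Generate k ten-digit phone numbers by picking a known prefix
    and filling in the last four digits at random."""
    rng = random.Random(seed)
    numbers = []
    for _ in range(k):
        prefix = rng.choice(CELL_PREFIXES)
        suffix = f"{rng.randrange(10000):04d}"
        numbers.append(prefix + suffix)
    return numbers

print(generate_rdd_numbers(5, seed=1))
```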
Urging People to Participate: One of the other potential sources of error is "nonresponse bias." Some of the people who are invited to participate in a poll do not agree to do so. If those who do not agree to participate are different in pertinent ways from those who do agree, then the survey findings may be biased. Researchers try to persuade invited people to respond to reduce the likelihood of bias. In the KFF Tracking Poll, invitations to respondents included monetary incentives ($15 for phone respondents and $5 or $10 for online respondents) to increase participation. In addition, multiple attempts were made to contact and encourage both online and telephone respondents to participate.
Addressing Data Quality: The questions used to measure respondents' opinions and experiences, and how they are administered, can introduce error into the survey. KFF employed several methods to reduce the chance of such errors. First, the questions were tested before the poll to check that respondents understood them. When the questions were introduced in the survey, examples of drugs used for weight loss were named so that respondents would know exactly what the questions referred to. These are the exact questions asked in the interview and discussed in the news article:
- "How much have you heard, if anything, about a class of drugs being used for weight loss, such as Ozempic, Wegovy, and Mounjaro?" [A lot, Some, A little, Nothing at all]
- "Are you currently using, or have you ever used one of these drugs to lose weight or treat a chronic condition such as diabetes or heart disease?” [Yes, currently using; Yes, but not currently using; No, never]
- "Do you take these drugs primarily to lose weight, or to treat a chronic condition like diabetes or heart disease, or both?" [To treat a chronic condition, To lose weight, To lose weight and treat a chronic condition]
- "How did you pay for the cost of these drugs?" [Paid the full cost themselves, Insurance covered part of the cost and respondent paid the rest, Insurance covered all of the cost]
- "Did you get these drugs or a prescription for them from any of the following places?" (select all that apply) [Your primary care doctor or a specialist; A medical spa or aesthetic medical center; An online provider or website; VA doctor/clinic (volunteered); Somewhere else]
The order in which response options are presented can affect how often respondents choose them. In the KFF Tracking Poll, the response options for the questions about reasons for using the drugs, where respondents got them and how they paid for them were rotated so that not all respondents saw the options in the same order.
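One simple way to implement this kind of rotation is to pick a random starting point and wrap the list of options around it, which varies what respondents see first while preserving the relative order of the options. The methodology note does not specify the exact scheme used, so the sketch below is illustrative.

```python
import random

def rotated_options(options, rng=None):
    """Return the response options starting from a random position,
    wrapping around so the relative order is preserved."""
    rng = rng or random.Random()
    start = rng.randrange(len(options))
    return options[start:] + options[:start]

payment_options = [
    "Paid the full cost themselves",
    "Insurance covered part of the cost and respondent paid the rest",
    "Insurance covered all of the cost",
]
print(rotated_options(payment_options))
```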
The amount of time respondents spend answering questions can affect the quality of their responses. In the KFF poll, which asked more questions than just those discussed in the news article, the online interviews took 15 minutes on average and the telephone interviews averaged 31 minutes. It takes longer to ask and answer questions in conversation than to answer written questions online. Telephone interviewers were specially trained for this poll to ask the questions as written and to deal with any questions or issues raised during the interviews. To accommodate Spanish-speaking respondents, the questions were translated into Spanish. Across online and phone contacts, 83 interviews were conducted in Spanish.
Responses from online panelists were examined for signs that respondents did not give sufficient attention and care to their answers. This examination included attention-check questions, as well as checks that respondents spent adequate time on the questions and did not leave many questions blank. As a result of these checks, two online respondents were not included in the final sample.
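Below is a simplified sketch of this kind of quality screening. The thresholds and field names are assumptions added for illustration; they are not the specific rules SSRS and KFF applied.

```python
def flag_low_quality(respondent, min_minutes=3.0, max_skip_rate=0.5):
    """Flag a respondent whose answers suggest low attention:
    finishing implausibly fast, failing an attention-check item,
    or leaving a large share of questions blank."""
    too_fast = respondent["minutes"] < min_minutes
    failed_check = not respondent["passed_attention_check"]
    skip_rate = respondent["items_skipped"] / respondent["items_total"]
    return too_fast or failed_check or skip_rate > max_skip_rate

sample = [
    {"id": 1, "minutes": 14.2, "passed_attention_check": True,
     "items_skipped": 1, "items_total": 40},
    {"id": 2, "minutes": 2.1, "passed_attention_check": False,
     "items_skipped": 25, "items_total": 40},
]
removed = [r["id"] for r in sample if flag_low_quality(r)]
print(removed)  # respondents flagged for removal from the final sample
```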
Adjusting the Sample: No survey sample can give a perfect picture of the population of interest. The achieved sample for any survey will have different characteristics from the population it is intended to represent, just by chance (sampling error) and because of non-participation. After all data are collected, researchers adjust some of the fundamental measures, like the percentage of women and men in the sample, so that they match the values in the population. For the KFF Health Tracking Poll, the combined cell phone and online panel samples were weighted to match the sample's demographics to the national U.S. adult population, using reference data from much larger surveys, like the Census Bureau's 2023 Current Population Survey.
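A common technique for this adjustment is "raking" (iterative proportional fitting), which nudges each respondent's weight until the weighted sample matches population benchmarks on several characteristics at once. The sketch below rakes on just two invented margins; the actual KFF weighting used more variables and Current Population Survey benchmarks.

```python
# A minimal sketch of post-survey weighting by raking. The respondents
# and target shares below are invented for illustration.

def rake(respondents, targets, iterations=25):
    """Assign each respondent a weight so that the weighted share of each
    demographic category approaches the target population share."""
    weights = {r["id"]: 1.0 for r in respondents}
    for _ in range(iterations):
        for variable, target_shares in targets.items():
            total = sum(weights.values())
            # Current weighted total for each category of this variable.
            current = {category: 0.0 for category in target_shares}
            for r in respondents:
                current[r[variable]] += weights[r["id"]]
            # Scale weights so the category shares move toward the targets.
            for r in respondents:
                share = current[r[variable]] / total
                if share > 0:
                    weights[r["id"]] *= target_shares[r[variable]] / share
    return weights

respondents = [
    {"id": 1, "gender": "woman", "age": "18-49"},
    {"id": 2, "gender": "woman", "age": "50+"},
    {"id": 3, "gender": "man",   "age": "18-49"},
    {"id": 4, "gender": "woman", "age": "18-49"},
    {"id": 5, "gender": "man",   "age": "50+"},
    {"id": 6, "gender": "woman", "age": "18-49"},
]
targets = {                                  # illustrative population shares
    "gender": {"woman": 0.51, "man": 0.49},
    "age": {"18-49": 0.53, "50+": 0.47},
}
for respondent_id, weight in rake(respondents, targets).items():
    print(respondent_id, round(weight, 2))
```

In this toy example, groups that are overrepresented in the sample (here, women and adults under 50) end up with weights below 1, and underrepresented groups end up with weights above 1, so the weighted totals line up with the population shares.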