Survey Question Wording Effects
Principal Investigator(s): Gabriel Miao Li, University of Michigan
Version: V1
Name | File Type | Size | Last Modified
---|---|---|---
 | text/x-rsrc | 23.2 KB | 08/29/2022 07:29 AM
 | application/x-rlang-transport | 183.8 KB | 09/04/2022 05:03 PM
Project Citation:
Li, Gabriel Miao. Survey Question Wording Effects. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2022-09-04. https://doi.org/10.3886/E179221V1
Project Description
Summary:
Survey researchers have long known that the terms used in questions
posed to respondents shape the answers they give. Processes underlying these
differences have generally been attributed to differences in respondents’
interpretations of the questions, though it is also possible that some of this
difference may stem from respondents’ ability to parse what the questions are
asking about. In three online survey experiments we manipulate wordings for
policy attitude questions about the DREAM Act, Trump’s trade disputes, and the
Affordable Care Act. Different wordings for these issues elicited different
attitudes overall as well as across partisan groups. To understand the source
of these differences, we examine the sensitivity of wording effects to
respondent partisanship as well as awareness of topically relevant information.
A simulation approach estimating each individual’s “informed response” allows
us to disentangle question wording differences due to incomplete understandings
(and misunderstandings) from those attributable to partisan bias. Evidence that
individuals with greater topic knowledge better recognize the similarity of
different wordings implies that some of the wording effect is associated with
whether respondents understand what questions mean. When we apportion the
wording difference between awareness and partisan bias, we find that the
influence of each factor varies across the issues we examine.
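The abstract does not include the authors' simulation code, but the "informed response" idea can be illustrated with a minimal sketch: fit a model of policy support that interacts topical knowledge with the wording condition, then predict each respondent's counterfactual response with knowledge set to its maximum. Everything below — the data-generating process, variable names, and model specification — is a hypothetical illustration, not the study's actual data or method.

```python
import numpy as np

# Hypothetical illustration of a "simulated informed response" analysis.
# Data, variables, and model are invented for this sketch.
rng = np.random.default_rng(0)
n = 20_000

party = rng.choice([-1.0, 1.0], n)             # party identification (-1 / +1)
knowledge = rng.uniform(0.0, 1.0, n)           # topical knowledge score in [0, 1]
wording = rng.integers(0, 2, n).astype(float)  # question-wording condition

# Assumed data-generating process: the wording effect shrinks as knowledge rises
logit = 0.5 * party + 0.8 * wording * (1.0 - knowledge)
support = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

def fit_logistic(X, y, lr=1.0, iters=4000):
    """Logistic regression via plain gradient descent (intercept included)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

def predict(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1 / (1 + np.exp(-Xb @ w))

# Model with a knowledge x wording interaction
X = np.column_stack([party, knowledge, wording, knowledge * wording])
w = fit_logistic(X, support)

# Counterfactual: every respondent fully informed (knowledge set to 1)
X_informed = np.column_stack([party, np.ones(n), wording, wording])
p_informed = predict(w, X_informed)

observed_gap = support[wording == 1].mean() - support[wording == 0].mean()
informed_gap = p_informed[wording == 1].mean() - p_informed[wording == 0].mean()
print(f"observed wording gap:         {observed_gap:+.3f}")
print(f"simulated fully-informed gap: {informed_gap:+.3f}")
```

Under this assumed setup, the wording gap among simulated fully informed responses is smaller than the observed gap, consistent with part of the wording effect stemming from incomplete understanding rather than partisan bias.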
This material is distributed exactly as it arrived from the data depositor. ICPSR has not checked or processed this material. Users should consult the investigator(s) if further information is desired.