The impact of text formatting on survey attention

Melissa Moore - March 11th, 2025

Could different types of text formatting influence and improve survey attention? This Methodology matters experiment shows that both the type and the format of an attention check question significantly influence whether respondents will successfully pass attention checks within a survey.

Introduction and background

Surveys frequently use attention check questions to help assess the quality of responses in a dataset. These questions can take a variety of forms and may include various formatting styles to “trap” inattentive respondents into incorrect answers or to alert respondents that an item needs special attention. A correct answer is a “pass,” meaning the respondent is likely to be retained in the final data; an incorrect answer is a “fail,” typically resulting in removal from the data.

In theory, both the type of question and the formatting of the question text could affect pass/fail rates. Different question types require different levels of respondent engagement. A single-choice item requires less input than an open-ended text item. Similarly, a lack of formatting may be less attention-getting than bold, underline, or italics. We explore whether differences emerge in respondents' pass rates based on the type and formatting of attention check questions.

Experimental setup

To evaluate the impact of question type and text format on attention check performance, we fielded a survey experiment. Participants were randomized in a 3 × 4 between-subjects design. Table 1 outlines the experimental conditions, crossing three question types with four formatting options; the exact question wording for each condition is reproduced below.

| Question type | Bold | Italics | Underline | None |
| --- | --- | --- | --- | --- |
| Favorite Color (Single) | Group A | Group D | Group G | Group J |
| Political Interest (Single) | Group B | Group E | Group H | Group K |
| Color Entry (Open-End) | Group C | Group F | Group I | Group L |

Respondents in the favorite color condition were asked: “For this question, we are interested in your immediate reaction. Please read the instructions carefully and then give the first answer that comes into your head. To show you are paying attention, please leave this item blank. Among the following, which is your favorite color?” The response options were Red, Orange, Yellow, Green, Blue, and Purple. The correct response was a blank (“skipped”) answer.

Respondents in the political interest condition were asked: “For this question, we are interested in your immediate reaction. Please read the instructions carefully and then give the first answer that comes into your head. To show you are paying attention, please leave this item blank. How interested, if at all, are you in politics?” The response options were Extremely, Very, Somewhat, A little, Not very, and Not at all. The correct response was a blank (“skipped”) answer.

Respondents in the color entry condition were asked: “For this question, we are interested in your immediate reaction. Please read the instructions carefully and then give the first answer that comes into your head. To show you are paying attention, please enter 'teal.' What is your favorite color?” The correct response was an entry of “teal,” with small typos (transposing two letters, omitting a letter) also accepted as correct responses.

In each condition, respondents were further randomized to see different formatting on the attention text. For the favorite color and political interest conditions, the formatted sentence was “To show you are paying attention, please leave this item blank.” For the color entry condition, the formatted sentence was “To show you are paying attention, please enter 'teal.'” Respondents saw this sentence in bold, in italics, underlined, or with no added formatting.

All respondents surveyed were assigned to one of the 12 possible attention check conditions.
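
To make the assignment concrete, here is a minimal sketch of how such a 3 × 4 random assignment could be implemented. It is illustrative only: the condition labels and function names are our own, not the survey platform's actual fielding logic.

```python
import random

# The two crossed factors from Table 1 (labels are illustrative).
QUESTION_TYPES = ["favorite_color", "political_interest", "color_entry"]
FORMATS = ["bold", "italics", "underline", "none"]

# All 12 crossed conditions (Groups A-L in Table 1).
CONDITIONS = [(q, f) for q in QUESTION_TYPES for f in FORMATS]

def assign_condition(rng: random.Random) -> tuple[str, str]:
    """Assign one respondent to a (question type, format) cell
    with equal probability across all 12 cells."""
    return rng.choice(CONDITIONS)

rng = random.Random(2024)  # fixed seed so the sketch is reproducible
sample_assignments = {f"resp_{i}": assign_condition(rng) for i in range(5)}
print(sample_assignments)
```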

Data preparation

To explore the results, responses to the open-ended color entry question were first qualitatively reviewed and coded into a binary pass-fail variable by coders blind to the experimental assignment. Responses were coded as correct if the respondent entered “teal,” with minor typos (e.g., “tral” and “teak”) and explanations that confirmed attentiveness (e.g., “TEAL EVEN THOUGH I DON’T LIKE TEAL” and “Teal. Blue”) also accepted. The single-choice attention questions were likewise recoded into binary pass-fail variables.
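
The typo tolerance described above can be approximated in code with an edit-distance check. The sketch below is purely illustrative (the actual coding was done by human coders blind to condition): it passes a response that is within one edit of “teal” or that mentions “teal” inside a longer explanation. Note that a plain Levenshtein distance counts a transposed pair of letters as two edits, so a Damerau-style variant would be needed to catch transpositions under the same threshold.

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance counting insertions, deletions, and substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def passes_color_entry(response: str, target: str = "teal") -> bool:
    """Pass if the cleaned response is within one edit of the target
    or mentions it inside a longer explanation."""
    text = response.strip().lower()
    if target in text:                      # e.g. "Teal. Blue"
        return True
    return levenshtein(text, target) <= 1   # e.g. "tral", "teak"

for r in ["teal", "tral", "TEAK", "Teal. Blue", "blue"]:
    print(r, passes_color_entry(r))
```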

Research question

We expected differences in compliance (pass rate) to emerge between the types and formatting of the questions, but had no prior expectation as to which type of formatting would be most effective (for additional detail, see our preregistered research design).

Data

To explore the differences between the conditions in attention check compliance, we included this experiment at the end of a larger survey on political topics. A total of 4,373 responses were collected between August 31, 2024 and September 5, 2024. Of these, 2,624 responded to the attention check to which they were randomly assigned and were retained for analysis.

Results

Research question: Will there be a difference in the rate respondents pass attention checks based on the type and format of the questions?

We compared sample mean pass rates by question type, by text format, and by the interaction of type and format.

| Format and question type | Mean (range 0–1) |
| --- | --- |
| Sample mean | 0.77 |
| **Question type** | |
| Favorite Color (Single) | 0.82 |
| Political Interest (Single) | 0.67 |
| Color Entry (Open-End) | 0.83 |
| **Text formatting** | |
| Bold | 0.84 |
| Italics | 0.75 |
| Underline | 0.80 |
| None | 0.71 |

| Question type | Bold | Italics | Underline | None |
| --- | --- | --- | --- | --- |
| Favorite Color (Single) | 0.87 | 0.80 | 0.89 | 0.71 |
| Political Interest (Single) | 0.80 | 0.62 | 0.68 | 0.58 |
| Color Entry (Open-End) | 0.87 | 0.83 | 0.85 | 0.80 |

Our results show significant differences in attention check pass rates across question types (p < .001), text formats (p < .001), and the interaction of type by format (p = .015).
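
One common way to run this kind of test is a two-way ANOVA on the respondent-level pass/fail variable. The sketch below is not the study's actual analysis script: it simulates respondent-level data at the cell pass rates reported above (column names and cell sizes are assumptions) and then tests the two main effects and their interaction with statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Simulate respondent-level pass/fail data at the observed cell pass
# rates (illustrative only; cell sizes are assumed roughly equal).
rng = np.random.default_rng(0)
rates = {  # (question type, format) -> observed pass rate
    ("color", "bold"): .87, ("color", "italics"): .80,
    ("color", "underline"): .89, ("color", "none"): .71,
    ("interest", "bold"): .80, ("interest", "italics"): .62,
    ("interest", "underline"): .68, ("interest", "none"): .58,
    ("entry", "bold"): .87, ("entry", "italics"): .83,
    ("entry", "underline"): .85, ("entry", "none"): .80,
}
rows = [(q, f, int(rng.random() < p))
        for (q, f), p in rates.items()
        for _ in range(219)]          # ~2,624 respondents / 12 cells
df = pd.DataFrame(rows, columns=["qtype", "fmt", "passed"])

# Two-way ANOVA: main effects of question type and text format,
# plus their interaction.
model = ols("passed ~ C(qtype) * C(fmt)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```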

Exploring the differences between conditions in detail, we find that bold and underline formats significantly outperform the other formats (p < .001) and are not significantly different from each other (p = .065). The political interest item was the lowest performing across all conditions (all p < .001). The favorite color single-choice item did not significantly differ in performance from the color entry open-ended item (p = .278).

The underline format on the single-choice favorite color item performed best, with an 89% pass rate. It was followed closely by the bold format on the favorite color item (87%) and the bold format on the color entry item (87%). These did not significantly differ from each other or from the underline format on the color entry item.

General discussion

The results of this experiment show that the type and format of attention check questions can significantly affect their pass rates. We find that bold and underline are the most effective formats for capturing attention and helping respondents pass the attention checks. It may be that italics are not as easy to read or notice as the other formats, though italicizing still improved pass rates over no added formatting.

Regarding the type of question, our results showed that respondents were more likely to pass attention check items that asked them to select or enter a specific color than an item that asked them to select a false answer about their political interest. Applying formatting did significantly increase pass rates on this question compared to no added formatting. It may be that a false question about political interest is more likely to “blend in” with the surrounding content, while questions about color offer a topic shift that acts as an additional attention cue. Future research could consider additional question types, such as multiple-selection questions, single items hidden within larger grids, and other question topics.

About the author

Melissa Moore, Ph.D., Associate Research Director, Scientific Research

Moore has a strong background in qualitative and quantitative research methodologies and utilizing mixed-method research approaches. She has expertise in survey and experimental design, including writing stimuli and questionnaires. Her research has covered a variety of fields including nonverbal communication, deception, persuasion, entertainment media, news media, educational outcomes, and applied theater. Moore holds a Ph.D. in Communication from the University at Buffalo and an M.A. from Emerson College.

About Methodology matters

Methodology matters is a series of research experiments focused on survey experience and survey measurement. The series aims to contribute to the academic and professional understanding of the online survey experience and promote best practices among researchers. Academic researchers are invited to submit their own research design and hypotheses.