---
layout: post
title: Bias
description: These are 18F's starting points for proactively engaging with bias in the user research process.
permalink: /research/bias/
sidenav: research
sticky_sidenav: true
subnav:
  - text: What is bias?
    href: '#what-is-bias'
  - text: How bias affects research
    href: '#how-bias-affects-research'
  - text: How to account for bias
    href: '#how-to-account-for-bias'
  - text: Create a practice of awareness and reflection
    href: '#create-a-practice-of-awareness-and-reflection'
  - text: Be accountable
    href: '#be-accountable'
  - text: Diversify perspectives
    href: '#diversify-perspectives'
  - text: Build rapport with research participants
    href: '#build-rapport-with-research-participants'
  - text: Interpreting interview data
    href: '#interpreting-interview-data'
  - text: Types of bias in research
    href: '#types-of-bias-in-research'
  - text: Confirmation bias
    href: '#confirmation-bias'
  - text: Cultural bias
    href: '#cultural-bias'
  - text: Framing effect
    href: '#framing-effect'
  - text: Interviewer bias
    href: '#interviewer-bias'
  - text: The observer effect
    href: '#observer-effect'
  - text: Research design bias
    href: '#research-design-bias'
  - text: Sampling bias
    href: '#sampling-bias'
  - text: Social desirability bias
    href: '#social-desirability-bias'
  - text: Survivorship bias
    href: '#survivorship-bias'
  - text: Further reading
    href: '#further-reading'
---

[//]: # (make it possible to put a class on a ul tag)
{::options parse_block_html="true" /}

All research is subject to bias, whether in our choice of who participates, which pieces of information we collect, or how we interpret what we've collected. Proactively engaging with bias helps us improve the credibility of our research.

## What is bias?

Bias is a tendency to believe that some people, ideas, etc., are better than others based solely on preconceived opinions. Everyone carries their past experiences and multiple identities (e.g., age, gender, education) into their work, which can generate assumptions and biases about who people are and what they need. While everyone has biases, unaddressed biases usually result in unfair assumptions and/or unsubstantiated decision-making.

For more information and examples of bias, watch the TTS Diversity Guild training on Designing for Cognitive Bias (GSA only).

## How bias affects research

Historically, parts of the design industry asserted that design and research can and should be neutral or objective, but that's not realistic. All design and research are subject to bias, whether in the choice of who participates, which pieces of information you collect, or how you interpret what you've collected. It's important to be aware of how and where our biases show up in our research so that we don't overlook other perspectives or make assumptions based on experiences that are not our own.

## How to account for bias

You can actively work to identify your biases by including diverse perspectives in your research, understanding the context in which you are conducting research, and identifying and reducing potential harm to participants and communities. In general, you can acknowledge and address bias and arrive at better solutions by intentionally including people throughout the design process, as well as by helping your team see research as a team activity.

### Create a practice of awareness and reflection

- Start by acknowledging that you can't know the full experience of a community you are not a member of, and that you can only hope to increase your awareness of their point of view as it relates to the service you are working on.
- Individually or with your team, surface your assumptions using this (GSA-only) frames of reference bias identification activity so the team can avoid influencing the evidence they gather based on the things they presume to be true.

### Be accountable

- Create a role on the team for challenging the team's assumptions and perspectives to avoid groupthink. Consider rotating this responsibility so that everyone tries holding views counter to the dominant take. Rotating also avoids having one person hold that position for an entire project, which can be draining.
- Clarify the difference between stakeholders (usually public servants) and users of your product/service (usually the public), and who will benefit from your work.

### Diversify perspectives

- Include diverse experiences and backgrounds throughout all stages of your work: when creating your team, recruiting research participants, and making decisions. Look for ways to encourage diversity or representativeness by asking, "Who might be most impacted by our work?"
- People who are advocates, case workers, or other public-facing staff may be easier to reach than the people using your product/service, but they come with their own biases, which may show up as generalizations about the people using your product/service. Focus on getting specific examples from advocates of experiences helping an individual access a government product/service.
- Think critically about how you ask questions of participants in usability tests and interviews. Ask open-ended questions like "Tell me about your experience" instead of closed-ended questions like "Was it hard to [use x feature]?" For more tips on effective research questions, see the Research questions section of the Planning process.

### Build rapport with research participants

- Ask people how they prefer to be identified rather than assuming.
- Practice respect, humility, and compassion for others' experiences.
- Reduce the pressure participants might feel to modify their attitudes or behaviors; help people feel less like they are on stage or performing for an audience.
- Emphasize the goals of the research and how their honest feedback is the best way to meet those goals.
- Periodically echo what you've heard during the interview back to interviewees ("Just to be sure I heard you correctly, you said…").
- Distance yourself from any proposed design solutions to avoid influencing participants by saying things like "These are just a few ideas the team came up with, but you're the expert here" or "I didn't design this, so if you don't like it, you won't hurt my feelings."

### Interpreting interview data

- Don't over-interpret what you see or hear by drawing conclusions from any one study or individual, and note differences between what people say they do and what they actually do.
- Involve your team in collaboratively analyzing and synthesizing data from your research, and note any plausible alternative interpretations. Your research participants might have different views than you or your team on the qualities that make a good experience or the best way to navigate a service. Having multiple people involved in interpreting research quotes and observations helps guard against your personal preferences influencing the research insights; the sketch after this list shows one simple way to check how well two analysts' interpretations line up.
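
If it helps make collaborative analysis concrete, two researchers can independently tag the same set of interview excerpts and then measure how often they agree beyond what chance alone would produce (Cohen's kappa). The sketch below is purely illustrative; the theme tags and the `cohens_kappa` helper are hypothetical examples, not part of any 18F tool.

```python
from collections import Counter

# Hypothetical example: two researchers independently code the same six
# interview excerpts with a theme tag. Comparing their codes surfaces
# places where personal interpretation may be driving the "findings."
coder_a = ["confusing", "confusing", "trust", "access", "trust", "access"]
coder_b = ["confusing", "trust", "trust", "access", "trust", "confusing"]

def cohens_kappa(a, b):
    """Agreement between two coders, corrected for chance agreement."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement: probability both coders pick a label independently.
    freq_a, freq_b = Counter(a), Counter(b)
    expected = sum((freq_a[label] / n) * (freq_b[label] / n)
                   for label in set(a) | set(b))
    return (observed - expected) / (1 - expected)

print(f"Raw agreement: {sum(x == y for x, y in zip(coder_a, coder_b)) / len(coder_a):.2f}")
print(f"Cohen's kappa: {cohens_kappa(coder_a, coder_b):.2f}")  # 0.50 here
```

A low score is not a verdict; it is a prompt for the coders to discuss how they are defining their tags before the team treats any theme as a finding.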

## Types of bias in research

At the start of a new project, each team member should individually review the common biases in research below and consider how their background and their role in this work or in the agency impact the perspective they bring to the work. If the team is comfortable doing so, you might create a combined document of the team's backgrounds and experiences related to the project subject area. The (GSA-only) Bias Identification in Research Planning activity, inspired by Lesley-Ann Noel and Marcelo Paiva's Learning to Recognize Exclusion, helps identify whose perspectives are missing and will need to be gathered to build a better understanding of the problem the team is working on, as well as hypotheses on how best to address it.

### Confirmation bias

The tendency to (intentionally or not) seek out or only notice evidence that validates your existing assumptions. This can lead to biased findings and reinforce stereotypes.

### Cultural bias

Judging others' beliefs, experiences, and values by your own standards or conventions.

### Framing effect

Priming or other framing that influences how people respond. For example, telling research participants "we're trying to understand how hard paying your bill is" can focus negative attention on a topic they otherwise might not have considered a problem, even if paying the bill isn't hard for them.

### Interviewer bias

When the interviewer’s own beliefs or assumptions influence how they lead a session. This can be especially apparent at the start of an interview—for example, if the interviewer expresses excitement about a particular aspect of the product or service (such as “Our team is really proud of the new search feature!”).

### The observer effect

When people who participate in research modify their behavior simply because they’re being observed. An example of this is when an office becomes unusually quiet while an interviewer conducts on-site interviews.

### Research design bias

When the team designs their research to advance their existing beliefs. For example, if an agency executive believes that they already understand user needs, that executive might discourage the team from speaking directly to people using their product/service.

### Sampling bias

When some members of the target population are less likely to be included in the study than others. For example, if a team is not able to compensate user research participants, or leans too heavily on digital-first recruiting, it risks excluding members of the public who are not in a financial position to volunteer their time, or who don't interact with the government online.
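
A toy simulation can make the skew concrete. Everything in this sketch is an assumption invented for illustration (the 70/30 online split and the completion rates are not real data): when recruiting can only reach people who are already online, the measured completion rate overstates how well the service works for the population as a whole.

```python
import random

random.seed(0)

# Toy population, all numbers invented for illustration:
# 70% of people interact with the agency online; 30% do not.
# Assume the offline group has a much harder time completing the task.
population = (
    [{"online": True, "completed": random.random() < 0.80} for _ in range(7_000)]
    + [{"online": False, "completed": random.random() < 0.40} for _ in range(3_000)]
)

def completion_rate(people):
    return sum(p["completed"] for p in people) / len(people)

# Digital-first recruiting only ever reaches the online group.
digital_only = random.sample([p for p in population if p["online"]], 200)

# Recruiting that reaches both groups in proportion.
representative = random.sample(population, 200)

print(f"whole population:    {completion_rate(population):.0%}")    # ~68%
print(f"digital-only sample: {completion_rate(digital_only):.0%}")  # skews high, ~80%
print(f"representative:      {completion_rate(representative):.0%}")
```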

### Social desirability bias

The tendency for people to respond in ways that paint themselves in the best possible light, regardless of whether that bears any resemblance to reality.

### Survivorship bias

When a study only includes the people who made it through some selection process, and the people who dropped out never appear in the sample at all. For example, if you're trying to understand why it's hard to sign up for a service and you only talk to people who successfully signed up, you'll miss the people who failed to sign up entirely. Government falls into this trap often, because frequently we only have contact information for people who successfully submitted a form.
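
A hypothetical sign-up funnel shows why recruiting from a contact list of successful submitters bakes this bias in. The step names and counts below are made up for illustration, not drawn from a real service.

```python
# Hypothetical sign-up funnel; every step name and count is invented.
# The agency's contact list only holds people who reached "submitted,"
# so recruiting from it can never surface why the earlier steps fail.
funnel = {
    "started form": 10_000,
    "passed identity check": 6_200,
    "uploaded documents": 3_900,
    "submitted": 3_100,
}

started = funnel["started form"]
for step, count in funnel.items():
    print(f"{step:>22}: {count:>6,} ({count / started:.0%} of starters)")

# Interviewing only "submitted" users studies 31% of starters and says
# nothing about the 69% who dropped out along the way.
```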

## Further reading

These links do not imply endorsement; they are shared here as additional resources on this topic: