09 Advantages of Biases

This may seem counterintuitive, since biases usually have a negative reputation, but they can have some advantages as well. Before I end this series of blog posts with a short recap, I want to take a different angle and highlight some positive sides of biases.

Our brains use biases to make decisions quickly, focus on important information, and even stay safe. Let’s explore some of these advantages and how they help us in daily life.

01 Biases Help Us Make Quick Decisions

In a world full of information, our brains cannot process everything at once. Biases help us filter information so we can focus on what matters. For example, the brain ranks and prioritizes information to help us act fast. This ability is essential when making quick decisions in everyday life, such as crossing a busy street or choosing what to eat. Without biases, decision-making would be slow and overwhelming (LinkedIn).

02 They Improve Our Focus and Efficiency

Biases allow us to focus on relevant details while ignoring distractions. This is especially useful in work and learning environments. For example, when searching for an object in a cluttered room, our brains use bias to guide our attention toward what is most likely to help us. Similarly, biases help professionals make better decisions by focusing on key information instead of getting lost in unnecessary details (Airswift).

03 Biases Support Social Connection

Humans naturally form groups based on shared interests, beliefs, or backgrounds. This is known as ingroup bias. While this can sometimes lead to discrimination, it also has benefits. Ingroup bias helps build trust and cooperation within communities. It fosters teamwork, strengthens social bonds, and encourages people to support one another. These social connections are essential for emotional well-being and personal growth (Harvard Business School).

04 They Enhance Learning and Adaptability

Biases help us learn new things by making patterns easier to recognize. For instance, our brains naturally categorize information to make sense of the world. This ability helps us identify risks, recognize familiar faces, and understand new concepts more quickly. Even in education, biases help students focus on the most relevant material and remember information more effectively (LinkedIn).

05 Biases Can Increase Motivation

Some biases, like confirmation bias, can motivate people to pursue their goals. Confirmation bias makes us focus on information that supports our beliefs. While this can sometimes lead to mistakes, it also helps people stay committed to long-term goals. For example, entrepreneurs often rely on positive feedback to keep going, even when facing challenges. This kind of bias can drive innovation, persistence, and personal success (Airswift).

06 They Enhance Survival and Safety

From an evolutionary perspective, biases have helped humans survive by guiding quick and instinctive reactions. For example, people are naturally more alert to potential dangers because of negativity bias, which makes us pay more attention to risks. This bias helps us stay cautious and avoid harm. Similarly, biases like familiarity bias encourage people to stick with what they know, which can be useful in uncertain situations (Harvard Business School).

Conclusion

While biases can sometimes lead to errors, they also provide many benefits. They help us make fast decisions, focus on important details, connect with others, learn efficiently, stay motivated, and protect ourselves. Understanding the positive side of biases can help us use them wisely while being aware of their limitations. Rather than seeing biases as flaws, we should recognize them as essential tools for navigating the world more effectively.

08 Most common Biases in (UX) Design

After talking a lot about biases in general, I want to focus on biases that affect the design discipline in particular. I wanted to find out which biases are most common among designers and how they can be spotted.

Biases can creep into UX design in subtle ways, shaping how designers create and evaluate their work. These mental shortcuts or preconceived notions can distort user research, design decisions, and testing outcomes.

Common UX Biases

  1. Confirmation Bias:
    Designers often seek out data or feedback that aligns with their assumptions or expectations. For example, if you’re convinced users will love a feature, you might unconsciously focus on positive comments while ignoring criticism. This skews the final product toward the designer’s preferences rather than the users’ needs (cf. UX Team).
  2. False-Consensus Effect:
    This bias happens when designers assume users think like they do. For instance, just because a designer finds an interface intuitive doesn’t mean the average user will feel the same way. This misalignment often results in designs that alienate diverse user groups (cf. Toptal).
  3. Recency Bias:
    This occurs when designers give undue weight to the most recent feedback or user data they’ve encountered. While recent input can be important, over-relying on it can overlook broader patterns or trends that are crucial to creating balanced designs (cf. PALO IT).
  4. Anchoring Bias:
    Designers may fixate on the first piece of information they receive, such as initial user feedback or early test results, and let it heavily influence future decisions. This can lead to disregarding new, potentially more accurate insights that arise later in the process (cf. UX Team).
  5. Social Desirability Bias:
    During user research, participants might provide answers they think the researcher wants to hear instead of their genuine thoughts. This can lead to misleading data and decisions that don’t address real user needs (cf. Toptal).
  6. Sunk Cost Fallacy:
    Designers sometimes stick with a feature or concept they’ve invested a lot of time and effort into, even when it’s clear it’s not working. This bias prevents teams from pivoting to better alternatives (cf. PALO IT).

Spotting Biases

To identify biases in your work, start by reviewing your assumptions. Are you basing design decisions on data or personal opinions? Regularly involve diverse perspectives in your design process to uncover blind spots. For example, conducting usability tests with a variety of users can highlight mismatches between the design and user expectations (UX Team).

Another tip is to document your decision-making process. Writing down why you chose a certain layout or feature can make biases easier to spot. If your reasoning is based on personal preference or limited data, you’ll know to re-evaluate that choice (Toptal).

Biases in UX design can hinder the creation of user-friendly and inclusive products. By recognizing common biases like confirmation bias, false-consensus effect, recency bias, and others, you can take proactive steps to create designs that truly meet users’ needs. Regularly challenging assumptions and involving diverse perspectives ensures a more balanced and effective design process.

07 How to combat Bias

I have talked a lot about what bias is and where it can occur, but not about how it can be mitigated. You will find some ideas on how to deal with bias in this blog post.

1. Spotting Unconscious Bias

The first step to overcoming unconscious bias is recognizing that it exists. For teams, tools like the Designing for Worldview Framework by Emi Kolawole or AIGA’s Gender Equity Toolkit can help. If you want to find out how biased you are towards a certain group of people, check out Harvard’s Project Implicit tests. Knowing your biases is the first step toward fixing them. (cf. UX Booth)


2. Taking Action

Once you’ve spotted your biases, it’s time to do something about them. A great way to start is by consciously designing with different users in mind. Tools like Perspective Cards can help you imagine how your designs might feel to people with different experiences. When working with clients or users, take time to truly listen and understand their perspectives. Let go of your own assumptions—it’s the best way to gain new insights and create designs that work for everyone. (cf. UX Booth)

3. Build Diverse Teams

Diverse design teams are key to creating inclusive experiences. Diversity matters especially in design, a profession that requires professionals to think new thoughts and challenge existing ideas all the time. Different people think in different ways; bringing them together creates a pool of new ideas that incorporate different perspectives. (cf. UX Booth)

4. Keep Learning

Overcoming bias isn’t a one-time thing; it’s a lifelong process. Stay curious and open to feedback. Always think about who you might be leaving out and how you can make your designs more inclusive. By committing to continuous learning and embracing new perspectives, you’ll create better, more universal designs that truly work for everyone. (cf. Medium)

5. Explore the “unhappy paths”

When designing, don’t just focus on the “happy path” — consider the unhappy paths too. These are real-life situations where things break, go wrong, or are misused, and they shouldn’t be ignored as edge cases to fix later. Ask tough questions like, “How could people game the system?” or “Who could use it to harm others?” Addressing these issues early creates more robust and humane products that work for diverse users. While exploring unhappy paths may slow you down initially, it saves time in the long run by preventing costly reworks and ensuring you’re headed in the right direction from the start. (cf. Medium)

6. Make personas challenge assumptions

Personas are a hot topic, with debates on whether they’re necessary or useful, but when done right, they can be a powerful tool to challenge assumptions about users. Start by removing demographic details like age, gender, or income, which can introduce bias. Instead of generic stock photos, use real images of users who defy stereotypes, helping teams confront their unconscious expectations. If real user photos aren’t available, consider inclusive stock photo alternatives like tonl.co. You can also use names from underrepresented groups to further broaden perspectives. Remember, this isn’t about ticking a diversity box — it’s about reflecting real insights and challenging narrow views to design for a wider audience.
(cf. Medium)

7. Designing for a diverse global audience

At R/GA, we design for global audiences by leveraging diverse teams and cultural insights from the start. Our “Human, Simple, Powerful” design model ensures diversity and inclusion are baked into the process. The “Human” element focuses on addressing problems through a human lens, considering cultures, customs, and the context of users’ lives. We validate prototypes through user testing with a diverse audience that mirrors the anticipated end users. By mapping touchpoints and breakpoints across different backgrounds and conducting experience mapping globally and locally, we gain a well-rounded view of our users. This approach helps us create focused, inclusive solutions that eliminate ambiguity and meet the needs of diverse audiences. (cf. Medium)


Although completely overcoming bias is probably impossible, you can try to minimize its impact on your work by using some of the methods I wrote about in this blog post.

06 How Bias Affects (UX) Design

“Life can only be understood backwards, but it must be lived forwards” ~ Soren Kierkegaard
A quote that is also very fitting when talking about bias in design. Most of the time, you only understand that a decision was made due to a bias after the changes have already been deployed. Looking a bit deeper into the topic of biases and how they affect (UX) design, here are some interesting stories about how products turned out biased towards or against parts of their user groups.

1 – Spotify Shuffle Button

In a Reddit forum, a user requested that the shuffle button in the Spotify app get a circle around it, since they are color blind and have a hard time seeing the difference between the active and inactive shuffle button (see picture below). (cf. Reddit) Put simply, this might have happened due to a blind spot affecting Spotify’s design team. Not all people perceive colors the same way; some have a hard time distinguishing red and green in particular. Approximately 8% of men and 0.5% of women are affected by this type of color blindness. (cf. The Guardian) This simple change could make a big difference for certain subsets of users.

Approximation of how a colorblind user with protanopia color blindness may see the Shuffle button in Spotify. Source

2 – Cars and Seat Belts 

Here is a fun one: in the 1960s, most crash tests for cars were done with a crash test dummy modeled after an average male physique (height, weight & stature). Therefore, safety design decisions were mostly tailored to men, neglecting women, children, and smaller or bigger individuals. Crash tests were eventually conducted with “female” crash test dummies, but they were only placed in the passenger seat. (cf. User Interviews) When talking about safety, one hopes that all possible users have been considered.

This very likely happened due to “sampling bias”: “Sampling bias occurs when a sample does not accurately represent the population being studied. This can happen when there are systematic errors in the sampling process, leading to over-representation or under-representation of certain groups within the sample.” (Simply Psychology)
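
To make the definition a bit more tangible, here is a minimal, hypothetical simulation of sampling bias. The height values are rough illustrative numbers, not real anthropometric data; the point is only that an estimate built from a one-sided sample systematically misses part of the population, much like a crash test dummy modeled on the average male physique.

```python
# Minimal sketch: estimating an "average body size" from a biased vs. a random sample.
# All numbers are illustrative placeholders, not real anthropometric data.
import random

random.seed(0)

# Hypothetical population made of two subgroups with different average heights.
population = (
    [random.gauss(178, 7) for _ in range(5000)]    # the subgroup the 1960s dummies resembled
    + [random.gauss(165, 7) for _ in range(5000)]  # the subgroup that was left out
)

biased_sample = population[:200]                # drawn only from the first subgroup
random_sample = random.sample(population, 200)  # drawn from everyone

mean = lambda values: sum(values) / len(values)
print(f"biased sample mean:   {mean(biased_sample):.1f} cm")
print(f"random sample mean:   {mean(random_sample):.1f} cm")
print(f"true population mean: {mean(population):.1f} cm")
# A design tuned to the biased estimate will systematically miss part of the population.
```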

3 – Facebook’s “Year in Review” 

In 2014, Facebook introduced the “Year in Review” feature, which showed users their best-performing posts of the past year. The algorithm identified the “best” posts/moments based on the number of likes. Now this is all fun and games, until you see a lost loved one in your Year in Review. While the algorithm might work for most users, some will have a different, less satisfying experience. (cf. Forbes)

Whoever had the idea for this feature handed their bias over to the algorithm that automatically creates these reviews. Due to the optimism bias, people believe that they are less likely to experience negative events and more likely to experience positive ones. This bias can lead to overly optimistic expectations about the future, underestimating risks, or failing to prepare for potential challenges. Designers assumed that users’ most engaged photos and moments would always be joyful, leading to a feature that unintentionally surfaced painful memories for some users. (cf. The Decision Lab)


These are just three examples of how biases can affect design, and there are many more; this was just the beginning. I have noticed that a lot of bias-related “fails” happened because the designers or researchers focused on only one part of their users. There is another bias that might be the basis for all of this: the majority bias, a cognitive bias where people focus on the larger or more visible part of a group, often overlooking minority perspectives. It assumes the majority is representative or correct, leading to the neglect of smaller groups or less common viewpoints, groups that, taken together, might even form the majority. (cf. Nature)

05 The Cognitive Bias Codex – Too much Information

Source: Wikipedia

The Cognitive Bias Codex, by Buster Benson, is a visualization of over 200 cognitive biases, offering an overview of how our minds work. Inspired by his childhood, Benson developed the Codex to help others understand and mitigate the influence of biases. The Codex encourages critical thinking and greater self-awareness, empowering individuals to make more informed and balanced decisions. (cf. Emergent Thinkers) It separates all biases into four problem groups: too much information, not enough meaning, the need to act fast & “What should we remember?”. This and the following blog posts will each explain one of the four categories, reflecting on the different biases within them and their impact on UX work.

Each category shows a broad problem definition, which is then split up into different behaviors we show or have. Below these there are effects or biases that explain why we have these behaviors, since they are a combination of all our biases and influences from our surroundings. To make this shorter and easier to read, I will not go into detail on every single bias and effect there is. (At least not in this blog post. ;D)

01 Information Processing

This category of the Cognitive Bias Codex highlights how our brains handle the massive amounts of data we encounter daily. These biases influence how we collect, interpret, and remember information, often simplifying it to help us make decisions faster. While these mental shortcuts can be useful, they also shape our beliefs, judgments, and actions in ways we may not fully realize. Exploring this category helps to uncover hidden filters in our thinking, enabling us to better evaluate information, recognize distortions, and make decisions with more clarity. (cf. Gust de Backer)

01.1 Primed or Repeated Information

Our attention is drawn to information that aligns with what we already know. This makes certain details seem more important than others. The list of biases is very long, so here are the five biases I consider most important for UX Design.

  1. Availability Heuristic
    People judge the likelihood of events based on how easily examples come to their mind. This can lead to skewed decision-making, as recent experiences are more easily recalled and seem more common than they actually are. In UX design, using familiar examples or well-known patterns can help users make quicker decisions. (cf. Beyond UX Design C)
  2. Attentional Bias
    People tend to pay more attention to certain types of information while ignoring others, based on personal preferences, emotions, or past experiences. This means users are more likely to notice and engage with elements that are emotionally charged, eye-catching, or familiar. (cf. Beyond UX Design D)
  3. Mere-Exposure Effect
    People tend to develop a preference for things because they are exposed to them repeatedly. This effect can be used by consistently presenting certain features or brand elements, making users more comfortable and familiar with them. Over time, familiarity can lead to greater trust and engagement. (cf. Beyond UX Design F)

  4. Empathy Gap
    People fail to predict how emotions and mental states affect their behavior, leading to misunderstandings. For example, when not hungry, we might rationally predict we would choose a healthy snack, but in a hungry state, we’re more likely to pick something unhealthy. Understanding this gap helps in designing user experiences that anticipate emotional states and provide supportive features or messaging.
    (cf. The Decision Lab B)
  5. Omission Bias
    Harmful actions are perceived as worse than harmful inactions, even if the consequences are similar. For instance, people may feel less guilty about allowing negative outcomes than if they actively caused harm. Users might prefer passive features, like automatic settings, that avoid perceived responsibility or failure. Designers can use this by considering user preferences for default options or avoiding overwhelming users with too many choices. (cf. The Decision Lab C)

01.2 Attention-Grabbing Details

Unusual or emotional things captivate us; our brains are wired to notice things that are out of the ordinary. These biases make us prioritize spectacle over substance, but they also show us how we can make important information stand out and make our users remember it.

  1. Von Restorff Effect (The Isolation Effect)
    When multiple similar items are presented, the one that stands out is more likely to be remembered. This can be applied in UX design by making important elements or actions visually distinct. However, it’s crucial to avoid overwhelming users by overusing emphasis and to be mindful of accessibility issues, such as color vision deficiencies or motion sensitivity.
    (cf. Laws of UX)
  2. Picture Superiority Effect
    People tend to remember pictures better than words: visuals are processed in two ways, as images and as associated words, while words are processed only as text. In UX design, using clear, literal images can improve memorability and comprehension. Effective placement of visuals, using unique images, and avoiding abstract visuals are key strategies to take advantage of this effect.
    (cf. NN Group B)
  3. Self-Relevance Effect
    People are more likely to remember information that they relate to themselves. This bias enhances memory retention when we connect new knowledge to personal experiences. In UX design, leveraging this effect could involve personalizing content, such as customized recommendations or user-centered messages, to improve engagement and retention. For example, presenting content that users can relate to personally, such as reminders tied to their preferences or past behaviors, can make the experience more memorable.
    (cf. The Behavioral Scientist D)

01.3 Novelty and Change

Elements that are new to us or in motion naturally capture our attention. However, this can make us overlook stable, ongoing factors that are equally significant.

  1. Anchoring
    This bias occurs when initial information, such as a suggested value, influences subsequent decisions. While anchoring can guide users to make decisions that align with desired outcomes, it can also unintentionally restrict creativity and objective thinking. (cf. Beyond UX Design B)
  2. Distinction bias
    This means that we evaluate options differently when we assess them together than when we assess them separately. This often leads to misjudgments: when viewing options side by side, minor differences may seem disproportionately important. For example, comparing two similar products might exaggerate their distinctions. (cf. The Decision Lab A)
  3. Framing Effect
    People react differently depending on whether the same information is framed positively or negatively, influencing decisions. For example, a product described as “95% effective” might be more appealing than one described as “5% ineffective,” even though both mean the same. This bias underscores the power of context and language in shaping perceptions and choices.
    (cf. The Decision Lab B)
  4. Weber–Fechner Law
    The Weber–Fechner law is about how we sense changes in things like light, sound, or weight. It says we notice small changes when something is light or quiet, but bigger changes are needed if something is already heavy or loud. For example, if you’re holding a tiny feather and add another, you’ll notice the difference, but if you’re carrying a heavy backpack, adding one feather won’t feel like much. Imagine a website with a very clean look and little visual clutter: small changes will be noticed more easily there than on a website full of flashing colors and pictures. (A small numeric example follows after this list.) (cf. The Behavioral Scientist C)
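
To make the “bigger changes are needed at higher baselines” idea concrete, here is a tiny back-of-the-envelope sketch. It only illustrates the principle; the Weber fraction k used below is an arbitrary illustrative value, not a measured constant.

```python
# Illustrative only: k is a made-up Weber fraction, not a measured value.
def just_noticeable_difference(baseline, k=0.02):
    # Weber's law: delta_I / I is roughly constant, so the smallest
    # noticeable change grows in proportion to the baseline intensity.
    return k * baseline

for baseline_grams in (10, 1_000, 10_000):  # a feather, a laptop, a packed backpack
    jnd = just_noticeable_difference(baseline_grams)
    print(f"carrying {baseline_grams:>6} g -> need about {jnd:.1f} g more to notice")
```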

01.4 Confirming Beliefs

Confirmation bias leads us to favor information that supports what we already think or feel, reinforcing existing opinions and blinding us to contrary evidence. There are a lot of effects and biases listed in this category; here are the ones that I consider most important for UX work:

  1. Confirmation & Congruence Bias
    The confirmation bias describes the tendency to favor information that aligns with existing beliefs, leading to overlooking or dismissing contradictory views. The congruence bias is very similar; it describes the inclination to test hypotheses through direct confirmation, neglecting alternative possibilities, which can result in flawed conclusions. Especially during user testing, this can hinder a product’s progress: since the goal is to find a product’s flaws and shortcomings, these could end up being overlooked. (cf. Beyond UX Design B, Philosophy Terms)
  2. Expectation Bias (Experimenter Bias)
    This describes the tendency for researchers to unintentionally (or intentionally) influence their study outcomes to align with their expectations, potentially skewing results. Since UX designers have to work with a lot of data, this could once again lead to missteps during the design process and the need to redesign the product later. (cf. The Behavioral Scientist A)
  3. Choice-Supportive Bias
    The tendency to remember past choices as better than they were, often by attributing positive features to selected options and negative ones to rejected alternatives. On a small scale, this could influence how users give feedback to researchers after a testing session: highlighting what went well and neglecting frustrating experiences, which could make a product seem better than it actually is. Paying attention to what people do is important, so you can later compare it to what they said. (cf. The Behavioral Scientist B)
  4. Observer Effect
    A phenomenon one is very likely to come across while doing user research: individuals tend to modify their behavior because they are being observed, which can impact the authenticity of observed actions. This is totally understandable; you wouldn’t want to be perceived as stupid or incapable in front of another person. (cf. NN Group A)

01.5 Spotting Flaws

It’s easier to spot mistakes or biases in other people than in ourselves, making us more critical of others and less critical of our own behavior. The codex depicts three biases in this subcategory:

  1. Bias blind spot & Naive realism
    (I have already written a blog post about this bias ;D)
    We tend to think that we see the world objectively (as it really is) and others don’t. We are convinced our information is correct and that others who don’t share our views are misinformed or biased. Recognizing naïve realism helps us appreciate diverse perspectives and approach disagreements with empathy, which is a key ability for UX designers in my book. (cf. Jakob Schnurrer)
  2. Naive cynicism
    We mistakenly believe others are more selfish than they actually are, often misinterpreting their intentions. This bias can strain relationships, create mistrust, and hinder collaboration, especially in team settings. Practices like active listening, open communication, and team-building help prevent misunderstandings and promote a more supportive environment.
    (cf. Beyond UX Design A)

04 Bias in AI

Taking a little detour from my actual topic, I wanted to explore an issue of our time: bias in AI, a topic that comes up a lot when reading about AI. I wanted to know what can be done about it and how it could be avoided. Could this have an additional impact on our society?

Artificial Intelligence (AI) is transforming industries, and (UX) design is no exception. AI can already deliver high-quality design work and is going to continue to evolve. It’s reshaping how we approach design, offering tools that enhance efficiency, streamline workflows, and even generate creative outputs. While AI excels at analyzing data, creating prototypes, and even predicting user behavior, the heart of UX design lies in empathy, problem-solving, and collaboration, skills uniquely human in nature. (cf. Medium A)

AI can analyze vast amounts of user data to uncover patterns and insights that inform design decisions, helping designers better understand their audience. It can also generate initial design drafts or prototypes, saving time and allowing designers to focus on refining creative and strategic elements. Predictive algorithms powered by AI can anticipate user behavior, enabling the creation of more intuitive and personalized experiences. By automating repetitive tasks and offering data-driven insights, AI empowers designers to elevate their craft while maintaining a human-centered approach. (cf. Medium A)

But what if the data the AI gets is already biased towards a certain user group, making its outputs biased as well and therefore influencing UX work? Addressing bias in AI is not just a technical challenge; it’s an ethical imperative that impacts the lives of millions.

Examples of Bias in AI

  1. Healthcare Disparities: 
    An algorithm used in U.S. hospitals was found to favor white patients over black patients when predicting the need for additional medical care. This bias arose because the algorithm relied on past healthcare expenditures, which were lower for black patients with similar conditions, leading to unequal treatment recommendations.
  2. Gender Stereotyping in Search Results
    A study revealed that only 11% of individuals appearing in Google image searches for “CEO” were women, despite women constituting 27% of CEOs in the U.S. This discrepancy highlights how AI can perpetuate gender stereotypes.
  3. Amazon’s Hiring Algorithm
    Amazon’s experimental recruiting tool was found to be biased against female applicants. The AI, trained on resumes submitted over a decade, favored male candidates, reflecting the industry’s male dominance and leading to discriminatory hiring practices. (cf. Levity)

How does bias in AI form?

Bias in AI often forms due to the way data is collected, processed, and interpreted during the development cycle. Training datasets, which are meant to teach AI models how to make decisions, may not adequately represent all demographics, leading to underrepresentation of minority groups. Historical inequities embedded in this data can reinforce stereotypes or amplify disparities. Additionally, the way problems are defined at the outset can introduce bias; for instance, using cost-saving measures as a proxy for patient care needs can disproportionately affect underserved communities. Furthermore, design choices in algorithms, such as prioritizing overall accuracy over subgroup performance, can lead to inequitable outcomes. These biases, when unchecked, become deeply ingrained in AI systems, affecting their real-world applications.

Source: Judy Wawira Gichoya, pos. 3

Sometimes, the problem the AI is supposed to solve is framed using flawed metrics. For instance, one widely used healthcare algorithm prioritized reducing costs over patient needs, disproportionately disadvantaging Black patients who required higher-acuity care. (cf. Nature) When training datasets lack diversity or reflect historical inequities, AI models learn to replicate these biases. Also, a well-designed system can fail in real-world settings if it is deployed in environments it wasn’t optimized for. (cf. IBM) Decisions made during model training, like ignoring subgroup performance, can result in inequitable outcomes. (cf. Levity)
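
As a rough illustration of that last point, here is a small, self-contained sketch (not any particular library’s API) of what checking subgroup performance could look like: instead of reporting one overall accuracy number, you break the evaluation down by group. The records and the threshold “model” below are made-up placeholders.

```python
# Sketch: compare overall accuracy with per-subgroup accuracy.
# The data and the "model" are hypothetical stand-ins for illustration.
from collections import defaultdict

def accuracy_by_group(records, predict):
    """Return overall accuracy and accuracy per subgroup."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        hit = int(predict(r) == r["label"])
        for key in (r["group"], "overall"):
            correct[key] += hit
            total[key] += 1
    return {key: correct[key] / total[key] for key in total}

# Hypothetical evaluation data: one feature, a true label, and a subgroup tag.
records = [
    {"feature": 0.9, "label": 1, "group": "A"},
    {"feature": 0.8, "label": 1, "group": "A"},
    {"feature": 0.2, "label": 0, "group": "A"},
    {"feature": 0.7, "label": 1, "group": "B"},
    {"feature": 0.6, "label": 0, "group": "B"},
    {"feature": 0.4, "label": 1, "group": "B"},
]

# Stand-in for a trained model: a fixed threshold on the single feature.
predict = lambda r: int(r["feature"] > 0.5)

print(accuracy_by_group(records, predict))
# The overall number hides that group B performs much worse than group A here.
```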

How to address bias in AI

To avoid bias in AI, thoughtful planning and governance are important. Many organizations rush AI efforts, leading to costly issues later. AI governance establishes policies, practices, and frameworks for responsible development, balancing benefits for businesses, customers, employees, and society. Key components of governance include methods to ensure fairness, equity, and inclusion. Counterfactual fairness, for example, checks whether a decision would change if a sensitive attribute like gender or race were different. Transparency practices help ensure unbiased data and build trustworthy systems. Furthermore, a “human-in-the-loop” system can be incorporated to allow human oversight to approve or refine AI-generated recommendations. (cf. IBM)
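
To make the counterfactual-fairness idea a bit more concrete, here is a deliberately simplified toy sketch: score the same applicant twice, once with each value of the sensitive attribute, and flag the case for human review if the decision flips. Real counterfactual fairness works on a causal model of the data rather than a naive attribute swap, and the scoring function below is a made-up stand-in, not a real system.

```python
# Toy sketch of a counterfactual check; the scoring logic is hypothetical.
def score(applicant):
    # The 'gender' term below is exactly the kind of dependence
    # a counterfactual check is meant to expose.
    base = 0.6 * applicant["experience_years"] / 10 + 0.4 * applicant["test_score"]
    return base + (0.05 if applicant["gender"] == "male" else 0.0)

def counterfactual_check(applicant, attribute, values, threshold=0.5):
    """Return the accept/reject decision for each value of the sensitive attribute."""
    decisions = {}
    for value in values:
        variant = {**applicant, attribute: value}
        decisions[value] = score(variant) >= threshold
    return decisions

applicant = {"experience_years": 4, "test_score": 0.55, "gender": "female"}
print(counterfactual_check(applicant, "gender", ["female", "male"]))
# If the two decisions differ, the system is not counterfactually fair and
# should be routed to a human reviewer (the "human-in-the-loop" step).
```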

Reforming science and technology education to emphasize ethics and interdisciplinary collaboration is also crucial, alongside establishing global and local regulatory frameworks to standardize fairness and transparency. However, some challenges demand broader ethical and societal deliberation, highlighting the need for multidisciplinary input beyond technological solutions. (cf. Levity)

03 All about Biases

Before getting to know specific biases and how to work around them, let’s take a closer look at what a bias actually is, how it forms, and whether it’s a good or a bad thing.

Bias – Definition

According to the Cambridge Dictionary, a bias is “the action of supporting or opposing a particular person or thing in an unfair way, because of allowing personal opinions to influence your judgment”. (Cambridge Dictionary) Sticking with the language side of things, you might come across the terms “to be biased against” something or “to be biased towards” something. Being biased against something means not favoring it, and being biased towards something means favoring it over something else. (cf. Britannica Dictionary)

Why am I explaining this? Well, I have come to realize that what I actually want to research are cognitive biases, not bias in general. So I wanted to understand cognitive biases a little better.

A cognitive bias is a predictable pattern of error in how our brain functions, and such patterns are very widespread. They affect how people understand and perceive reality, are hard to avoid, and can lead to different people interpreting objective facts differently. Cognitive biases can lead to irrational decisions; they are the result of mental shortcuts, or heuristics. (cf. Britannica Dictionary)

Additionally, one can differentiate between explicit and implicit biases. Explicit biases are conscious and intentional, individuals are fully aware of their attitudes and beliefs, which they can openly express and acknowledge. Implicit biases are unconscious and unintentional, they operate below the level of awareness, influencing behavior without the individual realizing it. (cf. Achievece)

How do Biases form?

Our minds can be like a collection of pockets where every experience is categorized and stored. This sorting process begins in childhood, helping us make sense of the world and react to future situations based on grouped experiences. It occurs automatically, as a mental shortcut to handle vast amounts of information efficiently. While this process is helpful, it also means our present decisions are often influenced by past experiences, which can lead to unconscious biases affecting how we view people, places, and situations.

Positive bias arises when something aligns with our own ideas or feels familiar, while negative bias occurs when something deviates from what we perceive as normal or preferable. Biases are not solely shaped by personal experiences but can also be influenced by external factors, such as media framing of situations, groups, or issues.

Biases can lead us to perceive someone as less capable or trustworthy or cause subtle discomfort around certain individuals. Importantly, these biases are often based on past experiences rather than the present context. (cf. NHS)

They stem from mental shortcuts, known as heuristics, which help our brains process information efficiently. While heuristics save time, they can lead to errors in thinking, particularly when patterns are misinterpreted or assumptions are made too quickly. (cf. Wikipedia)

How do they affect us?

Bias affects many aspects of our lives, often subtly influencing our decisions and perceptions. Implicit biases, formed over time through exposure to societal norms and experiences, impact everything from personal relationships to professional choices. For example, biases can affect hiring practices. Research shows that even trained scientists show bias in hiring, preferring male candidates over equally qualified women. Similarly, a study found resumes with “white” names were more likely to receive interview callbacks than those with “black” names, even when the resumes were identical.

These biases, often unintentional and shaped by socialization, affect not only professional decisions but everyday interactions as well. Recognizing and reflecting on our hidden biases is crucial to minimizing their impact and promoting fairness. (cf. Forbes)

02 The Bias Blindspot

In my last blog post, I linked a survey, which sadly not enough people answered, but what was it all about? I wanted to test something I read about while researching my first blog post: the bias blind spot. Explained simply, this means that we detect biases in others more easily than in ourselves. Since not enough people took part, I can’t draw any conclusions about our study program. Still, this is an important topic to talk about; it can pave the way for us to understand biases better and learn how to overcome their influence.

What is the Bias Blindspot exactly?

People are mostly unaware of their own biases, although they can easily detect them in the judgement of others. They tend to believe that they are less biased than their peers. There are many examples of this that you will surely recognize; maybe you have already been in a similar situation before. If you ask physicians whether gifts from pharmaceutical companies would influence their decisions about which medicine they prescribe, most claim that they would not be influenced by this. If you turn the question around and ask whether they think other physicians would be influenced by gifts, most will agree. This disparity can occur in many different types of judgements or decisions. (cf. CMU)

A study found that only one out of 661 adults said that they were more biased than the average person; the other 660 were sure that they were less biased than average. In addition, most people have no idea how biased they actually are but are sure the people around them are more biased than themselves. There is a good reason why this happens: since society teaches us from a young age that being biased is bad, we don’t want to see ourselves as people who do bad things. (cf. IxDF)

The bias blind spot is a combination of two mental shortcuts, the “introspection illusion” and “naive realism”. (cf. Scopelliti et al.; pos. 1-2) The introspection illusion says that we tend to rely on our own thoughts and memories when we think about whether we are biased or not, although this introspection doesn’t reveal the subconscious factors that are influencing us. (cf. Renascence) Naive realism describes how we think we see the world as it is, without any distortion, although this is most likely not true; we underestimate the possibility that we are fooling ourselves. (cf. Medium a)

What is the influence of a Bias Blindspot?

The biggest problem with this bias is that, if we operate within our blind spot, we are less likely to accept input from our peers and/or experts. In addition, we are less likely to benefit from education and training concerning our particular biases. (cf. IxDF) Not only that, but we will also underestimate the influence of our own biases, which can lead to skewed design decisions that fail to cater to diverse user needs. Mitigating biases is essential to gain reliable insights into user behavior and preferences. (cf. Medium b)

For UX design, this means that if we fail to learn about the flaws in our design, we can’t create an experience that truly caters to users’ needs. We might even make the same mistakes over and over again.

How can we overcome the Bias Blindspot?

First things first: overcoming bias is hard and takes a lot of work and self-reflection. The first step to overcoming the bias blind spot is to be aware that it exists and that it influences a lot of our decisions. Then, to truly overcome it, UX designers have to engage in self-reflection, constantly challenge assumptions, and foster an open and inclusive design process. (cf. IxDF, Medium b)

This is quite complex, and I want to give the topic of overcoming biases the room it deserves. There will be a follow-up post about this topic.

Thanks for reading through my blog! 
Leave a comment, if you are interested in this topic and tell me what you want to read about next! ;P

Fun Fact

While writing this blog post, I caught myself thinking: “That could never happen to me.” A second later, I realized that this was exactly what was going on. So even if you are aware that something like this exists, you will fall into your own traps over and over again.

by me (using imgflip-MemeGenerator)

01 The influence of cognitive biases on UX Work

Before reading this please answer this question (even if you don’t read the blog):

Results in next post ;D

Background

One of the reasons why I got into UX design in the first place is that it connects three of my fields of interest: design, psychology, and working with people. I want to find out more about what makes people click and what drives their perception of a design. Considering unconscious factors that influence how a user perceives a product is an important step toward making a product truly user-friendly and human-centered. Being aware of these factors and biases can really help to correctly approach a UX problem: is this a “real” finding, or is this problem due to a bias?

What is a Bias?

First things first: “[A] cognitive bias is the tendency to think certain ways, often resulting in a deviation from rational, logical decision-making.” (CXL) Biases occur in all areas of life; there is a bias for almost every area of life, and they impact how we buy, sell, interact with friends, think, feel, etc. If you feel guiltier about a certain situation than you should, according to friends and family, you could be experiencing the egocentric bias. (cf. CXL) It’s important to remember that biases can occur on both sides during user research; both the user and the researcher can be subject to predetermined beliefs, affecting the outcome of the research. Some are already well known, like the confirmation bias. (cf. Smashing Magazine)


Impact on UX Design

In UX design, a bias can emerge at any stage, from topic selection to data interpretation, due to influences from researchers, participants, or other external factors. This is particularly concerning since designers and researchers may not be aware of them, potentially leading to skewed results or exclusionary designs. (cf. Clara Purdy) Take a look at the picture below, the Cognitive Bias Codex: the list of biases designers may come across is nearly endless. Everyone can be subject to any of these biases, whether you come across one and recognize it or it affects you yourself.


Research Goals

Right now I can’t really tell where this research journey is going to take me; for now I will focus on biases and their effects on UX work. BUT during the research for this post, I realized how deep the rabbit hole around UX design and psychology goes. (Study guide for the rabbit hole ;D)

For now, a desirable outcome would be to create a collection of biases and other effects that influence people, since one has to become sensitive to these topics before one can conquer them. In addition to just generating awareness, there should also be info on why this matters and how to adjust to these effects. In the end, there should be a lexicon of common effects to be aware of and how to combat them. A deeper understanding of perceptual psychology will greatly impact how a designer approaches upcoming problems and deepen the understanding of the actions different users take.

Thanks for reading through my blog!
Leave a comment, if you are interested in this topic and tell me what you want to read about next! ;P