• Confirmation bias: We tend to seek out information that confirms our existing beliefs and disregard information that contradicts them. For example, if we believe a politician is untrustworthy, we might focus on news stories that portray them in a negative light and ignore stories that show them in a positive light.
• Availability heuristic: We tend to judge the likelihood of an event based on how easily we can recall examples of that event. For example, we might overestimate the risk of shark attacks because we can recall several dramatic news stories about shark attacks, even though they are actually quite rare.
• Representativeness heuristic: We tend to judge the likelihood of something based on how similar it is to our mental prototype of that thing. For example, we might assume that someone is likely to be a scientist if they are wearing a lab coat, even though not all scientists wear lab coats.
• In-group bias: We tend to favor people who are similar to us, such as members of our own family, friends, or social group. For example, we might be more likely to hire someone who went to the same school as us, even if they are not the best-qualified candidate for the job.
• Out-group bias: The opposite of in-group bias; we tend to judge or view groups outside of or different from our own less favorably. For example, we might attribute negative traits to fans of a rival sports team or members of an opposing political party without any evidence about them as individuals.
• Halo effect: We tend to form an overall positive or negative impression of someone based on one or two traits. For example, we might assume that someone is intelligent because they are good-looking, even though there is no evidence to support this.
• Fundamental attribution error: We tend to attribute other people's behavior to their personality or character, rather than to external factors. For example, we might assume that someone is rude because they are a bad person, even though they might just be having a bad day.
Bias can be a serious problem, but it is also important to remember that we are all susceptible to it. By being aware of our own biases, we can take steps to mitigate their effects.