Episode 020 Full Transcript — Psychology of Bias

This page contains the complete transcript of Episode 020 of The Full Mental Bracket, a conversation examining how bias influences first impressions, intuitive judgments, and decision-making. The discussion spans social bias, systemic bias, algorithmic bias, and cognitive bias, using examples from film, psychology, and everyday life.

For the structured psychological framing and applied interpretation of these ideas, see the full episode analysis:

Psychology of Bias: Why Your First Answer Feels So Right

Topics Discussed in This Conversation

  • Why first answers feel obvious even when they’re wrong
  • How cognitive bias differs from social and systemic bias
  • Confirmation bias and intuitive reasoning
  • Bias in algorithms and decision systems
  • Cultural examples used to illustrate bias

[00:00] Opening discussion about bias and first impressions

Brent: Your brain isn’t broken, but it is tilted. Some ideas get a VIP pass, others have to fight uphill just to be heard. This tilt has a name. It’s called bias. And unless you learn to recognize it, it will quietly shape your judgments, your beliefs, and your decisions, while making these flawed conclusions feel both obvious and inevitable. Today we go deep on bias, how it works, how it gets baked into our systems, and tools to compensate for it. I’m Brent Diggs, and this is Full Mental Bracket, where science and storytelling meet to help you level up and tell a better story with your life. Good time period, Bracketeers. We’re coming at you with another episode. And today we are talking about bias. Bias is a scary word that terrifies some people, but we’re going to look into it, tear it apart and get to the bottom of it.

Camille: Yes. And I have something that’s going to massively help us in this process.

Brent: You do?

Camille: Of course I do. You don’t think I would come here empty handed, do you?

Brent: No, naturally not that.

Camille: No. Do you or someone that you know suffer from bias?

Brent: Yes.

Camille: 10 out of 10 people do. But don’t let embarrassing partiality keep you from making judgments. Use Bias Be Gone Bias Blocker SPF 50.

Brent: You have anti-bias lotion?

Camille: Well, yes. It’s more like a gel, but yes, absolutely.

Brent: Okay. So what does SPF stand for?

Camille: You’re going to love this. This is going to be great for marketing. Suspicion Protection Factor.

Brent: Right. So does that mean it doesn’t actually prevent bias?

Camille: No. No, no. It just covers it up so you don’t notice.

Brent: So you can still be biased and you just don’t even notice that you’re biased.

Camille: Correct. Isn’t that great? And for just three low payments of $19.95, you can get this shipped straight to your door. Of course, that doesn’t include shipping and handling or any tariffs.

Brent: Weird. Okay, so it doesn’t prevent bias, it just keeps you from noticing that you have it.

Camille: Right.

Brent: Which sounds all too natural, because, surprise, Bracketeers, that’s exactly what our brains do. We’re biased a lot.

Camille: Yeah, but we should make money off that.

Brent: And never notice. That’s true. Are your biases not making you money? Put those biases to work. Why should cable news have all the fun? Bias for profit.

Camille: There you go.

[Photo: Brent Diggs and Camille Diggs discussing the psychology of bias in Episode 020 of The Full Mental Bracket podcast]

[02:54] Conversation about how the brain weighs information

Brent: Wow. All right. So the word bias basically just refers to information being weighed or evaluated unfairly.

Camille: My scale does that all on its own.

Brent: It’s not weighing the evidence accurately?

Camille: No.

Brent: It’s biased against you?

Camille: It sure is.

Brent: It’s telling you a negative story? Why are you so mean?

Camille: That’s right.

Brent: You’re going to turn it into HR for being cruel to you?

Camille: That’s right. It’s making me cry.

Brent: It does that to me from time to time. All right, so a little word nerdery, the word bias actually came to us originally from the French language and it just meant like inclined, at an angle, oblique. The word bias refers to the idea that some information has to work uphill in our brain to get a fair hearing. Our brain is not level.

Camille: Is it uphill in the snow both ways?

Brent: It’s uphill in the snow both ways. When it comes to certain ideas and conclusions, our brain doesn’t evaluate them levelly and equally. It prefers certain options and gives them preferential treatment. They get the VIP pass, they get to go to the head of the line at Disneyland, and other ideas are kind of in the background. And that’s not really a bug, it’s kind of a feature. It helps us process the ton of information that we have in the world, but it can also have some very detrimental effects, especially when it involves people. You want to be able to switch that bias off every once in a while and compensate for it.

Camille: Or at least know how to overcome it.

Brent: That’s true, because you can’t really switch it off. But if you can listen and learn tools, you can learn to compensate for it. All right, so bias covers a lot of things, but there are four main categories of bias that we’re going to cover today.

Camille: It covers like a giant blanket. Or a tent.

Brent: Or like some lotion gel that you smear over your biases, one or the other. All right, so the most well-known is social bias: you consider some groups more favorably than others, and other groups, just by default, you think are worse. We’ve referred to this previously on the show. We had an example from the original Star Trek.

Camille: Oh yeah, that’s a good one.

Brent: The episode was “Let That Be Your Last Battlefield.” One alien was white on the left and black on the right, and the other was reversed. And they said, as you can see by staring at me, I’m clearly, you know, ethnically and racially superior. And it’s like, you’re just identical. It’s like, no, no, no, no, we’re different. So that’s the standard.

Camille: That was a really good, really good episode.

[05:32] Discussion of social bias and shared assumptions 

Brent: That was a safe example of the standard social bias where you, you are for someone or against someone automatically without thinking about it just by what group is activated and what category is activated in your brain. So, another thing that comes from that is systemic bias. So, when those social biases get baked into systems.

Camille: Like in the movie Hidden Figures.

Brent: Oh, yeah.

Camille: Where she has to walk from her work area all the way to a bathroom half a mile away, just to be able to go to the bathroom. I mean, that’s horrible. And it made her look less competent to her boss, when she was probably more competent than anybody in that room.

Brent: So Kevin Costner’s character is busy yelling at one of the mathematical ladies: why are you gone all the time? She’s like, the only bathroom for me is a half mile away. And it didn’t even dawn on him that that handicap was artificially constructed. These people can’t use regular bathrooms; they have to go to this other bathroom, because the social bias got baked into the bathroom system, the actual structure of the offices. They’re trying to beat the Russians to the moon. And Kevin Costner says, hey, we’ve got to beat the Russians, we don’t have time for this. And he gets a sledgehammer and knocks the sign down. It’s like, we’re not doing that no more.

Camille: It’s a very satisfying scene.

Brent: Yes. Like, we all pee the same color at NASA, which was a great line. I wish I had written that.

[06:45] Exploring how systems and algorithms inherit bias

Brent: Another form we’re seeing more and more these days is algorithmic bias. In the same way that systemic bias is a social bias baked into a large organizational or government system, algorithmic bias takes social biases, and the information behind how you think, and bakes them into the AIs and algorithms of some of our favorite applications.

Camille: Right, like when Amazon was using AI for some hiring processes. The AI went through some old resumes to teach itself, but in those resumes there were almost no women. And so they weren’t actually able to hire any women, because the AI learned, hey, I’ve mastered all these resumes, I know what we’re doing, I know who we’re looking for. And none of them were women. So they had to fire the AI in that case.

Brent: Yeah, and a lot of people, and sometimes you get tired of hearing, like, representation, representation. But that was a good example of that. It was like the data for women engineers was not represented in the data that they trained the AI on, and so it didn’t know to look for it.

Camille: That’s right.

Brent: It’s like, what is an engineer? Well, he’s a guy, and he’s a certain age, and it’s like, you’re wrong. Go back to the beginning. But the AI didn’t know any better, because the people that programmed it just dumped stuff in without thinking about it.

Camille: That’s right.

Brent: It was a biased sample. It wasn’t a full sample of all engineers. It was just a handy sample. They threw it in there. And then that bias got baked into the algorithm. Now, when you do that with police things and other government things, where it’s like, oh, these people are troublemakers and these people are exemplary citizens, and that’s not necessarily true, then you have problems. And so engineers, I hope you’re paying attention. Don’t do that. All right, and then the final group of bias we’re gonna talk about is cognitive bias. Now, cognitive bias is a little bit different.

Camille: I like that word.

Brent: You like cognitive?

Camille: It’s a big word, but you can break it down for us.

Brent: Is that why you like it? Is it Scrabble? Are you putting it aside for your…?

Camille: Ooh, you get a lot of points for that word.

Brent: You can get quadrupled points or something.

Camille: That’s a great word.

Brent: I don’t know Scrabble rules, but for something like that, cognitive, it should be like 12 times the points, I think.

Camille: That’s right, absolutely. I honestly think. Yeah, so maybe you could break it down for us because it’s a big word.

[08:54] Conversation about cognitive shortcuts and intuition

Brent: So all the biases we’ve talked about so far are learned. You get this information from other people or from the environment, and it goes into your brain as your sample information. Like the AI, you receive information that’s kind of twisted or biased or distorted, and you learn to reason from that. But a cognitive bias is actually built into the brain. It is systematic. It is part of how the brain works.

Camille: I wish we could overcome it easier, but we actually don’t even know that it’s there.

Brent: Oftentimes, yes, because, once again, it’s one of those things where the brain doesn’t consider it a bug, it considers it a feature. There’s so much information out there that the brain has programmed these shortcuts: your intuition pops up with an answer and you’re like, oh, that’s gotta be it. And oftentimes it is, but like predictive text, it can give you an answer that works a lot of the time, but not every time. A good example of this would be confirmation bias. Confirmation bias is a cognitive bias, and it means that when you’re looking at information, the information that agrees with how you see the world, your own beliefs and how you think the world works, you see it, remember it, and acknowledge it far more easily than information that disagrees with you.

Camille: True.

Brent: And you can see how that could be a feature. If someone comes up to you and says gravity is a hoax perpetuated by Big Gravity in the government, you don’t pay any attention to that, because it doesn’t line up with what you know.

Camille: That’s true.

Brent: But oftentimes, if what you know is distorted by these other biases, confronting that cognitive bias is actually a useful tool to learn to see the world more accurately.

Camille: Confront, do battle.

Brent: Confront the cognitive.

Camille: Evil biases.

Brent: All right. So an example of these kinds of cognitive biases at work: you see this in the movie Moneyball. You’ve got Billy Beane and his assistant, his assistant manager, Peter Brand, the Harvard genius dude. And they come in and they’re like…

Camille: Wait, is it Yale or Harvard?

Brent: Harvard. Okay. I don’t know. It might’ve been Yale. It was Yale.

Camille: Anyway, he was a smart dude.

Brent: You’re right. It was Yale. I stand corrected. So anyway, Peter Brand came out of Yale, and he knew economics and numbers. And they’re like, hey, surprise, baseball team and scouts, we’re going to do baseball according to the numbers. And they were not having that. They’re like, ah, we have years of instinct, intuition. We have traditions. We just know in our gut what makes a good ballplayer. And it’s like, well…

Camille: They were counting on their experience to just carry them the whole time. And didn’t look outside of the experience.

Brent: And Peter Brand describes that when he’s describing the algorithm: this algorithm is going to let us see past the biases that make some players look better than they are and other players look worse than they are, so we can see how good a player actually is for what we need, using these numbers.

Camille: Yeah. Winning.

Brent: Yeah, and so what they did, and this is the hint, this is a spoiler for all cognitive biases: they found a way to get past their System 1 intuitive answers, dug deep into System 2, into the math and the more complicated logic, and saw answers that you didn’t see at first glance.

Camille: And they won. 20 games in a row, a record.

[12:01] Narrator: This is Full Mental Bracket.

Brent: So going back to social biases, I think most of us are familiar with many social biases. You can have a gender bias, you prefer men over women or women over men.

Camille: Right, you could have racial.

Brent: Yes, you could have an age bias.

Camille: Age, yeah, age, ageism, sure. And then also handicapped or ableism.

Brent: But we didn’t mention the most pervasive and painful social bias out there.

Camille: Yes, this one’s a big one.

Brent: Which is bias against Hawaiian pineapple pizza.

Camille: Canadian bacon and pineapple pizza.

Brent: It’s irrational, it’s painful. It’s learned, it’s a result of bad parenting.

Camille: All of the above.

Brent: Pineapple belongs on pizza.

Camille: With the Canadian bacon though, because it’s a sweet and salty thing.

Brent: Right, right. Now some people would say that we’re biased to like it. I would say you’re biased if you don’t like it. Technically both are true, but I’m sticking with my guns. My guns love bacon.

Camille: And my pizza.

Brent: My guns love bacon. All right, so as we mentioned before, social biases are learned information. You learn it from your parents or authority figures. You learn it from the environment. You pick it up from movies or stories or the narratives that are prevalent in your culture.

Camille: That’s true.

[13:21] Discussion comparing learned and unconscious bias

Brent: For instance, we have a test for you. So this is a bias test, or we could say a riddle. If you’ve heard it before, bear with us. If you haven’t, think it through and tell me what you think.

Camille: Yes, I want everybody to think about what your immediate answer would be to this scenario. A father and son are in a car accident, the father dies, the son’s taken to the ER, and the surgeon there says, I can’t operate on this boy, he’s my son. How is that possible?

Brent: We’ll just pause for a second and think about that.

Camille: So I failed.

Brent: I also failed.

Camille: I failed the test because my immediate gut reaction was, oh, his mama was having…

Brent: Was having an affair.

Camille: Was having an affair. She had some playtime on the side, so that wasn’t the real dad who died, but the real dad, who the kid maybe didn’t know, was actually the surgeon.

Brent: You had a whole soap opera in there.

Camille: I did, I had a whole…

Brent: It was General Hospital.

Camille: It was General Hospital in the hospital, so it was definitely not the correct answer. What was your answer?

Brent: My answer was just bafflement. I was just like, what? How can this be? I’m so confused. I’m not as patient with puzzles and riddles as my wife is. I’m used to having great breakthroughs and being a super genius, and if I sit there and don’t get it, I’m just like, this sucks, this thing’s rigged, it’s biased against me. And I often just kind of quit quietly and don’t let anybody know. But the answer was that the surgeon was the boy’s mother.

Camille: The mom? Of course you have two parents. But our brains are so biased to think of surgeons as men that it didn’t even dawn on us that the surgeon was his mom.

Brent: And the important part to realize is that that wasn’t a conscious belief. If this thing comes to your mind where you can’t think of a female surgeon, it doesn’t mean you think there aren’t female surgeons, or that there shouldn’t be, or that female surgeons are somehow second rate. It’s just that you absorbed all this information from our culture and society, where the idea of a surgeon and the idea of a man go hand in hand. And although the idea could go hand in hand with other types of people, they don’t come to mind quite as quickly, which is a bias. Other representations have to go uphill.

Camille: That’s the unconscious bias. You don’t realize that it’s a bias. You don’t choose it. It’s just there. It’s baked in from society.

Brent: So when they talk about these social biases, some of them are explicit, like I have a bias and I know about it and I love it. And other times they’re implicit, like I don’t consciously believe that, but my unconscious brain has made these associations and it populates these intuitions that are not consistent with what I believe.

[16:11] Narrator: This is Full Mental Bracket.

Brent: The thing is, we all have biases. We all have implicit biases, and sometimes, actually, we all have explicit biases too. And there’s no shame in having biases as part of the brain, but there is shame in having biases and not fighting them. If you let those biases run your brain and manage your decisions, you’re going to make bad decisions. And it doesn’t matter whether they’re cognitive biases or social biases, if you’re favoring the same answers without ever questioning them.

Camille: Well, and if you don’t fight them too, or listen to people in your tribe who tell you, that doesn’t sound quite right, then you perpetuate it.

Brent: And how many times have you seen people, you give them feedback like, I gotta go with my gut. And it’s like, your gut has a pretty average batting average. It’s pretty good, but it’s not gonna be good every time.

Camille: That’s right.

Brent: Do you do your checkbook according to your gut? My gut says I got plenty of money. It’s like, no, your gut just wants that pineapple pizza.

Camille: Yeah!

Brent: It’s like, no, you gotta flip into System 2 and use the math. Now, an example of explicit bias would be, again, in Star Trek. It’s Star Trek movie number six, The Undiscovered Country, where we learn that Kirk, after his long many years of doing movies and shows, hates the Klingons, and he makes a big speech about it. I hate the Klingons. I’ve never trusted them. They killed my son. And Spock says, you know that’s a bias, right? And he’s like, yes, darn right it’s a bias. I love it. I polish it every day. I have a bumper sticker. I put it on my spaceship. I hate Klingons. And he’s very aware that he’s biased against them, and he does not care.

Camille: Yeah, but he, I mean, he still had to serve in Starfleet, and he still had to serve with Klingons, and he had to kind of cover it up for a while.

Brent: Overcoming the bias was part of the story problem. He couldn’t see the right answer because his bias was clouding his vision, hiding the other possibilities. And in every story problem you have to grow and mature as you transform against adversity. One of his adversities was that he couldn’t see past his own bias. Facing his own bias and learning how to think past it was part of the adversity he faced in order to grow, to become a new and improved Kirk who could solve the story problem.

Camille: To be fair, I mean, his son did die and he had some emotional baggage that was weighing him down. So sometimes our emotions are gonna lie to us too.

Brent: To be fair, he was a professional military man and his job was to broker the peace between the Klingons and the humans. He’s like, screw this job, I want it all to go down in the pot so I have an excuse to blow some more up. And it’s like, that’s not just incompetent, that’s bad. Listen, Kirk, we love you, buddy, but you gotta get your head on straight, that’s not good.

Camille: All I’m saying is he probably needed therapy after his son died so that he could work through those emotions, so the emotions didn’t take on a life of their own.

Brent: Well, you know, they didn’t believe in therapists till they got to The Next Generation. They got Troi on there. It was just a matter of his time. He didn’t have a Troi to talk him down off the ledge.

Camille: Okay. Moving on.

[19:18] Conversation about community, challenge, and decision-making

Brent: So something that we bring up in every single episode is the story framework, how you tell a better story with your life. And you may be wondering, as we were, how do these biases affect the story framework? The main way is in selecting your tribe: making sure your tribe is broad enough and diverse enough, with enough different, conflicting, and contrary opinions, to actually challenge you and critique your plans.

Camille: That reminds me of the 10th Man Doctrine. You have a group of people working through a problem, trying to solve some issue or discussing a topic, and even though everybody’s come to the same conclusion, a consensus, you assign one person to take the opposite approach, to say, no, I’m going to argue we do it this way instead of that way. And everybody’s like, well, we all said we’re going to do it that way. But you assign that person to look at something different. And that will help you overcome flaws, get out of groupthink, and be able to see from a different perspective.

Brent: So you have an assigned devil’s advocate who actively looks for flaws in the plan, who looks for alternatives like, okay, but here’s all the ways that your consensus could go wrong, and he plans for those eventualities.

Camille: Yeah, we actually had somebody in U.S. history who did that too.

Brent: Would that be Abraham Lincoln?

Camille: It sure would.

Brent: Yeah. Abraham Lincoln, when he was elected, packed his cabinet with people who disagreed with him, who had disagreed with the very fact that he was president. They thought they should be president.

Camille: I don’t think it would be called the 10th Man at that point, because he wanted everybody against him so that he could find all the flaws. But the concept is valid. He wanted people to not just say yes to him, but to give him another perspective.

Brent: And then the 10th man seems like that might be a little biased too. Does it have to be a man? Do we have to be…

Camille: Oh, that’s true. It could be a woman.

Brent: Some, yeah. Just when it comes to being contrary, it doesn’t have to be men. It can sometimes be people that you know very closely.

Camille: Okay. So we’re moving on.

Brent: Right, so Lincoln, a lot of people on Lincoln’s cabinet thought they should have been the president. They didn’t like that he was president. They disagreed with almost everything he did, and he put them on there close at hand. It wasn’t necessarily keep your enemies closer. It was like, if there’s a flaw in my plan, I want the most savage people I know to tear it apart so we can make the best plan and the best decisions.

Camille: He wasn’t afraid. He wasn’t afraid to hear an opinion that wasn’t his own.

Brent: We’re in a war, we have to survive. I can’t afford people to coddle my ego and play nice to me and be yes men. I need just the opposite. All right, so a lot of social biases can affect your tribe-picking, but so can cognitive biases. There’s a thing called the frequency illusion, which is like when you…

Camille: Like the movie, We Can Time Travel?

Brent: Not quite so much. It’s more about how, when you buy a Jeep or a motorcycle or something, suddenly you notice how many of them are on the road.

Camille: That’s true. I notice all the Nissan Versas on the road. Cause that’s the kind of car we own.

Brent: And when I got a motorcycle, I didn’t have any idea there were so many motorcyclists out there until I got one. And then I saw them everywhere.

Camille: Yeah.

Brent: And it’s like, it’s, you recognize the things that you’re familiar with.

Camille: True.

[22:28] Exploring discomfort, growth, and perspective

Brent: And that can happen when you’re trying to build a tribe. Sometimes you say, oh, there’s a potential tribe member, and there’s a potential tribe member, and you recognize them because they’re just like you. So sometimes you’ve got to compensate for that bias and say, all right, these other people in my life, who may be from a different religion or a different culture, or who make me uncomfortable in other ways, it doesn’t mean they don’t have good wisdom. It doesn’t mean they’re not wise. It doesn’t mean you wouldn’t benefit from having them in your circle. It just means it’s uncomfortable.

Camille: And you could maybe learn how to cook some different types of food too, so that might be cool.

Brent: That’d be great. And if there’s anything we’ve learned here, it’s that discomfort is the cost of growth. If you want to grow, you have to leave the comfort zone and be uncomfortable. It’s like, well, the people in my circle aren’t just like me. And that’s great. I mean, we don’t talk much about cloning, but what would be the point of a group of people who are just like you if you need them to look for your blind spots?

Camille: Yeah, you need people who don’t think like you so that they can see outside of the box.

Brent: So to summarize, biases are natural. Everyone has them. If someone says…

Camille: 10 out of 10 people have them.

Brent: 10 out of 10.

Camille: You can’t really get this, guys. Sorry. (Bias Be Gone is a fake product.)

Brent: Right.

Camille: Not really available.

Brent: So sometimes today, when you tell people they’re biased, they get their feelings hurt. They’re like, oh, how could you? But it’s like saying, you breathe oxygen. Yes, we all have biases. Don’t get your feelings all wrapped up about that. The question is, what are you doing with them? Are you addressing them? Are you working on them? Are you compensating for them? Are you building a team that can help you with them?

[24:12] Closing discussion around recognizing and compensating for bias

Camille: That’s right. So let’s do some takeaways.

Brent: Oh, let’s do. So what biases do you recognize in your own thinking?

Camille: How are you compensating for them?

Brent: How can you lean into your tribe to help you recognize the biases in your thinking and decisions?

Camille: If you don’t see any biases, do you think that’s because you don’t have any?

Brent: You might want to work on that. All right, that’s all we have for this episode. We’ll be back with more episodes, some on specific cognitive biases. It’ll be really exciting. Thank you for joining us, and goodbye.

Announcer: Full Mental Bracket podcast hosted by Brent Diggs. Logo by Colby Osborne. Music by Steven Adkinsson. Learn more at FullMentalBracket.com. This is the Full Mental Bracket.
