This is an automated transcript from Descript.com - please excuse any mistakes.
Turi: [00:00:51] This podcast spends a lot of time bashing Descartes' Cogito, the Cogito Ergo Sum, the idea that rationality is what makes us who we are. Miriam Schoenfield helped us understand how our most important beliefs about the world have the most arbitrary foundations: we inherited them. Adrian Bardon explained in frightening detail how we lie to ourselves to make ourselves feel better. James Mumford showed how our political allegiances have only the most limited internal coherence. And to make matters even worse, David Robson showed us how it's the very cleverest amongst us who make some of the most hideous epistemic mistakes. So we've been painting a picture of humans as dangerously irrational but highly intelligent agents, using all our smarts not to understand the world but to justify our view of it. The result: a deeply fractured political ecosystem and accelerating polarization. Today, we're here to talk with Dr. Kevin Dorst, a research fellow at Oxford University and assistant professor at the University of Pittsburgh, working on polarization and irrationality, who I think is going to argue that actually we are rational creatures, that we should see ourselves as rational creatures, and that failing to do so is what is pulling our societies apart. Kevin, it's a great pleasure to have you on the Parlia podcast.
Kevin: [00:02:18] Great to be here.
Turi: [00:02:19] Kevin, I wonder whether we might start with a story. Can you tell us about you and Becca?
Kevin: [00:02:27] Yes, definitely. So polarization is on everyone's minds, but it goes to my heart in one particular way. I grew up in rural Missouri, in a small town, and even though it was a little bit of a blue dot in a very red state, it was not very blue. I was in a liberal family, but we weren't terribly liberal, and certainly many of my friends were quite conservative. One of them, Becca, was a good friend of mine growing up in high school, and when we went off to college, I was going off to a liberal university in St. Louis and she was going off to a conservative community college in a rural area. You can guess that I was going to become more liberal and she was going to become more conservative, and that when we came back we would see eye to eye less, and so on. And that's more or less what happened. We stayed in touch for a couple of years, but it started to get more and more strained; we saw each other, and politics particularly, more and more differently. I feel like that's a pretty standard story these days. I think there are three things that stand out to me in the story, and two of them are pretty standard. One is that the polarization that results is pretty profound by now. After the liberal university, I went to a liberal graduate school; I'm now quite liberal, and she's no doubt more conservative than I am. So it's a profound polarization. It's also a persistent polarization: talking to her again, the fact that we disagree certainly doesn't make me doubt my beliefs, and likewise for her. But the most fascinating thing for me is that it was also predictable; you could see it coming. So that constellation of facts about polarization: it's profound (we're really far apart), it's persistent (we don't change our opinions when we find out that we disagree), and it's predictable (we can see it all coming). That's a striking picture of polarization, and one that I want to understand more.
Turi: [00:04:21] And you use this story of Kevin and Becca as a sort of metaphor for what you think has happened broadly across the US in the last decades. I say the US, but we've also seen similar polarization stories across Western Europe now.
Kevin: [00:04:38] Well, let's put Europe to one side and talk about the US, which I feel like I have a better grip on. There's a fairly standard story now about three senses in which the US has become increasingly polarized over the last few decades. The first is sometimes called partisan sorting. It's the idea that even if people's individual views on individual issues don't become more extreme, their views tend to become more politically consistent, in the sense that liberals flock to the Democratic Party and conservatives flock to the Republican Party. So 50 years ago, if you knew someone's opinion about gun rights, you couldn't predict their opinion about abortion, but now you can. That's part of that sorting. So that's one sense in which we've polarized.
Turi: [00:05:22] So this idea of partisan sorting, ideological sorting: it's the sense that Democrats have become more uniformly Democratic, that their opinions have coalesced much more closely. And it is now very rare, for example, to have a pro-life person who is also left-wing economically; those things have split, and people who are Democratic hold onto Democratic positions across the entire range of policies, and ditto on the Republican side. Is that right?
Kevin: [00:05:54] I think that's right, although I might qualify it a little bit by saying it's not entirely clear that there's necessarily some unifying philosophy that makes these policies all count as Democratic. What's definitely true is that there are much more correlated beliefs amongst Democrats. So two Democrats are more likely to agree on some random issue that you choose nowadays than they were 50 years ago. Now, it's a hard question whether there's some overall left-right divide that really unifies those clusters of positions; maybe there is. But at least what's definitely true is that there's agreement, whether or not it's unified like that.
Turi: [00:06:30] Jumping in quickly here, only to plug a wonderful conversation I had with James Mumford on this podcast. He wrote a book called Vexed, actually going into all the intellectual inconsistencies that exist within a single political party: for example, the fact that you can be both pro-life and pro-gun in the Republican Party. So actually what we see is that there really isn't a core of ideological consistency within any of these parties; this is an emotional move, in other words. But that's ideological sorting.
Kevin: [00:07:07] Yes. So that was the first sense in which the US has become increasingly polarized. A second one, probably the more common one you think of when you hear 'polarization', is sometimes called attitude polarization. It's just the idea that if you take some particular question, like 'Is Trump a good president?' or 'Is a Republican president a good thing for the country?' or something like that, people's opinions have gotten pulled further apart on it. So, for example, some numbers: if you look at presidential approval ratings over time, there's always been a partisan split. Democrats always like Democratic presidents and Republicans always like Republican presidents, and not vice versa. But that split keeps getting wider and wider. I think in the Nixon era it was something like 75% of Republicans who approved and 34% of Democrats, averaging across his tenure. For Trump, I think it's closer to 90% of Republicans approving and 5% of Democrats, across Trump's tenure. And that's a consistent trend.
Turi: [00:08:07] And that you call attitude polarization: the view of the policies of the other party has hardened negatively.
Kevin: [00:08:18] Well, there's a debate over whether attitude polarization has happened, and part of that debate reduces to the question of what issues we count, what opinions we count. If you just focus on policies (say you take universal basic income as your policy and you just poll whether people approve of it or not), it's not so clear, at least across many issues, that there has been a hardening of opinions in the abstract, approving or disapproving, between Democrats and Republicans. But if you ask someone a different question, which is 'Democrats are proposing a universal basic income; do you approve of this policy or not?', then there's a massive split across parties. So if we look at policies per se, it's not so clear that there's been polarization. But if we look at attitudes more broadly, including things like whether the Democrats' proposed policies are good ones, or whether the Republican president is wise, or whatever, those are opinions that have hardened and gotten more extreme.
Turi: [00:09:20] I feel like jumping in again, with apologies. One: on ideological sorting, we're clear that in fact there is little intellectual consistency in these different political tribes, but nevertheless we are more tribal inside them. And two: on attitude polarization, what you're suggesting here is, again, that it's not so much the policies as the naming, the fact that they are Democrat or Republican, which makes us dislike them, not actually the policies themselves. Is that right?
Kevin: [00:09:54] I think that's broadly right, though I think the implication you're going for is one I would resist. There's a natural implication from those two observations: well, it's irrational or insensitive or silly or emotional or whatever, because if we don't actually care about the policies, why should whether a party proposes them or not be relevant to them? And there's a lot more to be said, but at least the first thing worth being clear on is that there's nothing necessarily inconsistent or irrational in not approving of a policy X, but then, on learning that Democrats proposed policy X, coming to approve of it. Basically, a claim like 'we should implement a universal basic income' is a different claim from the claim that Democrats are going to implement a universal basic income. So if you, let's suppose rationally, really trust Democrats, then you might think: oh, if Democrats are proposing it, it's a good idea. And if you really distrust Democrats, of course you're going to have the exact opposite reaction.
Turi: [00:10:55] I keep interrupting you, Kevin. What's that last form of polarization in America today?
Kevin: [00:11:02] So the last, and probably most striking, form of polarization is sometimes called affective polarization, or as I prefer to call it, demonization. It's just the tendency to dislike people from the other party, and the other party in general. The numbers here are particularly striking. If you ask people what their opinion is of the other party, say Republicans' opinion of the Democratic Party: from 1994 to 2016, the percentage that had a quote 'very unfavorable' unquote opinion of the Democratic Party jumped from 21% to 58%, and I think something like 90% had an overall unfavorable opinion. And the numbers for Democrats' attitudes towards Republicans are very similar. So the perhaps most striking and worrying form of polarization is this fact that we just tend to hate each other now, that people in different parties tend to disapprove of those on the other side.
Turi: [00:11:54] We spoke to Bob Talisse about this and went into some of the reasons for it. But can you explain how it works? You're interested in all these various different forms of bias in the way that we understand each other: confirmation bias, selective exposure, et cetera. What are the various mechanisms that we use to help us polarize, and to keep us polarized?
Kevin: [00:12:18] Right. So there are a ton of different mechanisms that lead people to predictably polarize, to predictably strengthen their beliefs and their opinions and their feelings in these political discussions, these political debates. I think it's helpful to focus on a couple of different tendencies, which are still a fairly broad tent. The first is often called confirmation bias. People have probably heard of that, but it's helpfully distinguished into two subtypes. Broadly, confirmation bias is the tendency to look for and interpret evidence in a way that confirms your prior beliefs. The first subtype, the looking-for-evidence part, is called selective exposure: the idea that which sort of evidence we look for is highly contingent on what we believe. This is pretty common sense. If you're a liberal, you probably check the New York Times more often than you check Fox. If you're a Republican, you probably check Fox more often than the New York Times, or the Wall Street Journal; pick your poison. So selective exposure is just the fact that our prior beliefs condition what sort of new information we get and what sort of sources we look at. And if we tend to have some prior beliefs and look at sources that are favorable to those prior beliefs, that will tend to harden those beliefs. The second subtype of confirmation bias kicks in once you get the evidence. It's sometimes called biased assimilation of evidence, and it's the fact that how we interpret evidence is contingent on what our prior beliefs are. In particular, you can give Democrats and Republicans, or more generally people who disagree on some issue, the exact same evidence, and at least sometimes they'll interpret it in different ways. So the classic study, from the late seventies I think, took two groups who disagreed about whether we should allow capital punishment (I forget the exact wording). One group thought we should, one group thought we shouldn't. And they gave them two studies: one study that looked at the effect of capital punishment on crime rates and so on and found it had a deterrent effect, and one that looked and said it didn't. Every subject gets these two studies to look at, and they get some time to evaluate them. And what happens is, those who thought capital punishment was a good policy really scrutinized the study which suggested that it wasn't, and found flaws and all sorts of things. So they interpreted the evidence on the whole to say: well, yes, I saw there was one study in favor and one study against, but on the whole the studies supported the idea that capital punishment is effective. And the people who thought prior that capital punishment wasn't effective had the exact opposite reaction. So that's an example of biased assimilation: they get the same evidence and they have opposite reactions to it.
Turi: [00:15:09] Right. All of these are forms of what's been called motivated reasoning. And just to put it back in the broader context of this talk: motivated reasoning is broadly how we all assume we think. We have this narrative, certainly on the Parlia podcast, of deeply irrational humans fixing the data to make it suit our views of the world. Can I ask you to tell me why you think that's wrong? Because I think your sense here is that actually there is rationality in some of that motivated reasoning. Could you help me understand what is rational about polarization here?
Kevin: [00:15:51] That's right. I think it's helpful to distinguish two different issues. There's the empirical issue of what people do when presented with new evidence, when given the choice of which newspaper to look at, and what have you. And there's a normative question of whether they should be doing that, whether it's rational. The empirical work, replication crisis notwithstanding, is pretty solid on this stuff: there are tendencies people have to look for information that tends to confirm their beliefs, to interpret information in a way that tends to confirm them, to have these motivated-reasoning effects on their beliefs. Then there's the normative question: is that irrational? The basic argument I want to make is that it's not nearly so clear as you might think that it is irrational.
Turi: [00:16:39] And we can make absolutely clear that we know what we're saying. The replicability crisis is the issue of whether all these experiments can actually be replicated, and I know there is a crisis in academia around some issues there, but I'm going to ignore that and just go with the anecdotal: I, and everyone I know, absolutely do selectively expose. I do read liberal newspapers, because that's what I like. And I do scrutinize evidence which I disagree with much harder than evidence that I do agree with. That's a common trait, right? What you're telling me is that that may actually be rational; that may be the reasonable thing to do.
Kevin: [00:17:22] Excellent. So I want to break the argument into two parts. One is just a general argument that we should think there's got to be some rational explanation of these mechanisms; then we can talk in more detail about how, for example, a particular tendency could be rational. The general argument is pretty straightforward. It's got two premises, or two steps. The first is just that these psychological tendencies we have, to selectively expose ourselves to different information, to interpret evidence in a biased way, biased assimilation of evidence, and so on: the psychological evidence is clear that those are bipartisan. Both Democrats and Republicans, liberals and conservatives, have the same tendencies to fall for, or at least use, motivated reasoning, selective exposure, confirmation bias, and so on. And that's been shown in all sorts of studies, which is what made these things so well known.
Turi: [00:18:15] We're as bad as each other.
Kevin: [00:18:17] So the basic idea, the first point, is that these mechanisms are what drive polarization; we know that from the empirical stories about what drives it. And whether you're rational or not depends on the mechanisms you use to form your beliefs. So if Republicans and Democrats, or liberals and conservatives, are using the same mechanisms to form their beliefs, then either they're both doing so rationally or they're both doing so irrationally. So we have a sort of parity-of-rationality claim. That's claim one, and it's well supported by the psychological data, and anecdotally as well. The second claim is that you can't think that your own firmly held political beliefs are irrational. I mean, just try it. Take your opinion about Trump, whatever that is. I think Trump's not a good president; I'm pretty confident of that. Here's something I can't think: 'Trump's a bad president, but I'm irrational to believe that Trump's a bad president.' That's an incoherent belief, because insofar as I think my belief is not rationally based, I have to give it up; I can't maintain confidence in something and think I shouldn't have confidence in it. So it's a straightforward coherence constraint that we can't think our own firmly held beliefs are irrational. That's step two. So if you can't think your own firmly held beliefs are irrational, and you should think, by the parity claim, that your beliefs are rational if and only if the other side's beliefs are rational, then you should infer that both sides are rational. That's the basic argument. I know that I have these motivated-reasoning, selective-exposure, and so on tendencies; I admit that maybe I'm not perfect, I'm certainly not perfectly rational. But I don't think that my opinion about Trump is just caused by pure irrationality; I've got plenty of good evidence for it. That's what you should think. And because you should think that, you should think there's got to be some rational explanation for what's going on here, for how I came to have these firm political beliefs, and therefore for how those who disagree with me came to have theirs.
Turi: [00:20:14] That's what we should see. If we refuse to see our own political opinions as irrational, we cannot demonize the other side and see their political opinions as irrational. That's the argument. Of course, in real life, that's precisely what we do.
Kevin: [00:20:37] Got it. Yeah, so that's where I come in. I think the history here is actually somewhat interesting, because I didn't start out working on polarization as a topic. When I was doing my PhD, I was working in a sort of arcane subfield of epistemology, the study of knowledge; the subfield is called formal epistemology. It's basically building rational models, mathematical models of rational belief; we're trying to make tools for thinking about how rational people reason. And the particular thing I was working on was what I'll call ambiguous evidence. Ambiguous evidence is basically evidence that's hard to know how to react to, evidence where it's unclear what to make of it. It's evidence that rational people should be unsure whether they're responding rationally to, evidence that you should have some self-doubt about. Anyway, ambiguous evidence is everywhere in politics. And the key result that my dissertation was about was a sort of two-pronged thing. The first prong is that there's a sense in which ambiguous evidence always leads to predictable shifts in opinions, at least on some claims: you get ambiguous evidence, and you can predict that your opinion is going to move in a certain direction. The second is that, nevertheless, ambiguous evidence is often valuable, in the sense that you expect it to help you get to the truth. So although there are these predictable shifts, you don't expect them to be directing you away from the truth; you expect them to help you get to the truth. But that was just a purely theoretical result.
Turi: [00:22:11] I'm going to stop you there, because I want to dig in a little bit here. The key piece here is the idea of ambiguous evidence, and here, therefore, is the way back into the proposed rationality of confirmation bias, motivated reasoning, et cetera. Can we dig a little deeper into these two ideas that you flagged: first, that ambiguous evidence moves you around predictably, and second, that you expect it to get you to the truth? Give me an example.
Kevin: [00:22:43] Great. Yeah, here's an example that I used to run an experiment to test this predictable polarization. It's called a word completion task. What you do is you get a string of letters and some blanks, and the question is: is there an English word that completes that string? So I might give you a string that reads T, R, blank, P, blank, R, and the question is, can you fill that in in a way that makes a word? And there's no way; none of the combinations make words. So that's not a completable string. I can give you a different string that reads F, R, blank, blank, L. Is that completable? Well, 'frill', 'frail': those are two ways to complete it. So F, R, blank, blank, L is a completable string, and T, R, blank, P, blank, R is not a completable string. And here's the key thing about these word completion tasks: they give you evidence that is asymmetrically ambiguous, in the sense that if I give you a completable string, it's relatively unambiguous evidence. All you have to do is find a word, 'frill', and then you know it's completable. But if I give you an uncompletable string, then it's ambiguous evidence; it's hard to know what to make of it. You won't find the word. Just stare at T, R, blank, P, blank, R for five, ten seconds, and nothing will come into your head, but you will be unsure whether you just missed it. You'll wonder whether in a second, maybe, you'll see some answer to fill it in. So what that means is that when I give you a word completion task, you might get strong, unambiguous evidence which says it's completable ('frill'), or you might get weak, ambiguous evidence which says it's not (it just doesn't look right). And what that means is that on average your confidence that the string is completable is going to go up. It might go way up, it might go a little bit down, and that averages out to a little bit up. In particular, if you give people a bunch of these things and look at their average confidence across time, on average they're going to get more confident that the strings they see are completable.
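[A toy numerical sketch of the asymmetry Kevin describes, added for illustration; the probabilities and the partial-update rule below are assumptions, not numbers from his experiment. The point: an ideal Bayesian's average confidence stays at the prior, while a searcher who can't take a failed search fully at face value ("maybe I just missed it") drifts predictably upward.]

import random

random.seed(0)

P_COMPLETABLE = 0.5  # assumed prior: half the strings are completable
P_FIND = 0.7         # assumed chance of spotting a word when one exists

# Full Bayesian conditioning on a failed search:
# P(completable | found nothing), by Bayes' rule.
P_FAIL_BAYES = ((1 - P_FIND) * P_COMPLETABLE) / (
    (1 - P_FIND) * P_COMPLETABLE + (1 - P_COMPLETABLE)
)

# Ambiguity-limited agent: a failed search is hard to take at face value,
# so confidence drops only part of the way. The 0.45 is purely illustrative.
P_FAIL_AMBIGUOUS = 0.45

def average_posterior(p_after_fail, trials=100_000):
    """Average confidence that a string is completable, after one search."""
    total = 0.0
    for _ in range(trials):
        completable = random.random() < P_COMPLETABLE
        found = completable and random.random() < P_FIND
        # Finding a word settles the question; failing is the weak signal.
        total += 1.0 if found else p_after_fail
    return total / trials

print(f"prior:                     {P_COMPLETABLE:.3f}")
print(f"ideal Bayesian average:    {average_posterior(P_FAIL_BAYES):.3f}")
print(f"ambiguity-limited average: {average_posterior(P_FAIL_AMBIGUOUS):.3f}")

[Running this prints a prior of 0.500, an ideal-Bayesian average of roughly 0.50, and an ambiguity-limited average of roughly 0.64: the predictable upward drift in confidence that the strings are completable.]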
Turi: [00:24:49] So that's the basic way ambiguous evidence can lead to shifts in beliefs. Now bring me into the real world: Ruth Bader Ginsburg, an argument about gays in the military, whatever.
Kevin: [00:25:03] Yeah, excellent. So the basic idea with a word completion task is that you're doing a sort of cognitive search: you're looking for some particular item to figure out what to think, in this case a word to complete the letter string. And that's something we do all the time. Take that capital punishment study: I give you these two studies, one with a headline which seems to support the idea that capital punishment has a deterrent effect, one with a headline which suggests the opposite. What are you going to do with those studies? You're going to do a cognitive search. You're going to look at those studies and try to see what to make of them: see if the reasoning is good, or if there's a flaw, see if they made some mistaken move, and so on. And the idea is that what you're doing there is very similar to what you're doing in a word completion task. If you find a flaw with a study, you know what to think: it's a bad study. But if you look for a flaw and fail to find one, then you don't know what to make of it, and you don't get much strong evidence against it. So the idea is this: suppose you believe capital punishment has a deterrent effect, and then you get these two studies, and you've got limited time and resources to think about them. What are you going to do? You're going to look for a flaw where you think you're going to find one; that's a good way to get clear, unambiguous evidence. And where do you think you're going to find one? In the study that said it doesn't have a deterrent effect, the study that was surprising to you. And of course, vice versa: if you thought capital punishment didn't deter, you're going to look for a flaw in the study that was surprising to you, the one that suggested it did have a deterrent effect. And that's rational. You're doing something like the word completion task: you're trying to find unambiguous, strong evidence, and in so doing you just happen to be doing it in a way which leads you to predictably strengthen your beliefs, because of the asymmetry in the ambiguity you're going to get.
Turi: [00:26:59] Gotcha, that's beautiful. And one could look at it positively, not just negatively: faced with two pieces of evidence which seem to contradict each other, you bring to bear upon the question all the other evidence that you have, which would be, amongst other things here, your previously held belief that capital punishment is or is not a good thing. So you're adding your experience of evidence, even if it's personal experience rather than science or data, to the mix, and coming up with a view...
Kevin: [00:27:30] ...and trying to use that to remove the ambiguity, remove the uncertainty in the evidence, to figure out what to make of it. Exactly.
Turi: [00:27:36] And that's rational, that's reasonable. So Kevin, on the back of this explanation of ambiguous evidence, can you explain to me why motivated reasoning actually is rational?
Kevin: [00:27:49] Great. Yeah. So take motivated reasoning that's tied to your prior beliefs, say biased assimilation of evidence or selective exposure. The simple idea is that it makes sense to try to avoid ambiguous evidence and to try to find unambiguous, clear evidence. With biased assimilation, we just saw how that's going to work: if you get conflicting pieces of evidence, you're going to look for flaws in the pieces that tell against your prior beliefs, and that makes sense to do, because that's where you expect to find a flaw, and so that's how you expect to find unambiguous evidence. We can see this also in selective exposure, when you're looking for new evidence. Say you're deciding whether to tune into Fox or MSNBC or CNN or whatever; some political news has just broken and you want to get the latest on it. If you're a liberal, you think: well, if I tune into Fox, they're going to be giving me all these reasons to think that, say, the Republican move after RBG's death is a reasonable one, and so on. And I think that's going to be mistaken, but I'm going to have trouble figuring out exactly what to make of it; it's going to be ambiguous. Whereas if you go to CNN, you know what they're going to tell you. They're going to tell you what you already think, that the Republicans are doing some shady business, being really critical and so on, because you're a Democrat; if you're a Republican, vice versa. So the idea is that trying to avoid ambiguous evidence, trying to get a clear sense of what's going on, drives people to look for sources that they predict will confirm their prior beliefs. And that's how you can get at least some forms of rational selective exposure, and so confirmation bias more generally.
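[Another illustrative sketch, with made-up numbers: model each outlet as emitting an equally informative signal about some claim H, but suppose the viewer can only make sense of the opposed outlet's framing some of the time. Scoring a source by the usable information it is expected to deliver, the aligned outlet wins; that is the sense in which avoiding ambiguity can make selective exposure rational.]

import math

def entropy(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info(prior, p_s_h, p_s_not_h):
    """Expected reduction in uncertainty about H from one binary signal S."""
    p_s = prior * p_s_h + (1 - prior) * p_s_not_h
    post_s = prior * p_s_h / p_s                  # P(H | signal)
    post_not_s = prior * (1 - p_s_h) / (1 - p_s)  # P(H | no signal)
    expected_post = p_s * entropy(post_s) + (1 - p_s) * entropy(post_not_s)
    return entropy(prior) - expected_post

PRIOR = 0.8                          # viewer's prior in H (made-up)
RAW = mutual_info(PRIOR, 0.9, 0.3)   # both outlets' raw signals, equally informative

USABLE_ALIGNED = 0.9   # how often you can tell what to make of a friendly source
USABLE_OPPOSED = 0.3   # the opposed source's framing is ambiguous to you

print(f"usable info, aligned outlet: {USABLE_ALIGNED * RAW:.3f} bits")
print(f"usable info, opposed outlet: {USABLE_OPPOSED * RAW:.3f} bits")

[The design choice is deliberate: the opposed outlet is modeled as equally reliable, just harder to decode. Even so, the outlet whose framing you can interpret delivers more usable information, so a clarity-seeking viewer tunes in there.]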
Turi: [00:29:23] So it's actually a two-step process. On the one hand, when weighing up evidence on either side, we select to ensure that we can come to hard evidence, and we therefore select in favor of our prior experience. But we also do it preemptively, insofar as when we know there's going to be ambiguous evidence around an event we're looking at, like, for example, the response to Ruth Bader Ginsburg's death, we go to the place where we will find evidence which we will find conclusive.
Kevin: [00:29:56] Exactly.
Turi: [00:29:57] Okay, I think I've understood the move here. This argument, that we are in fact rational creatures and that the way we think about politics is also rational, is a profoundly political realization. Is that right, Kevin?
Kevin: [00:30:13] Yes, that's right. I think the basic idea here is that when we attribute people's beliefs to irrationality, that can contribute to one of the forms of polarization, affective polarization in particular. So let me talk through that. There's this well-known, fantastic work from the seventies by the psychologists Daniel Kahneman and Amos Tversky that was basically scrutinizing the question of how rational people are. They started a research program that became known as the heuristics and biases research program. And the basic idea was that you might have thought people are generally rational, but when you look under the hood, what you see is that they use a sort of grab bag of heuristics that give simple, easy answers in a variety of contexts, but that lead to systematic biases and errors, systematic irrationalities. This was a hugely influential research program in a field that became known as judgment and decision making in psychology, and it then grew into behavioral economics. And the basic insight, which I totally agree with, is that there are these simple assumptions that people were making in economics and other fields about how rational people reason, and it turns out people don't conform to those simple assumptions, those simple models of rationality; that's what all these findings like confirmation bias and so on were supposed to demonstrate. The question is whether those things are necessarily irrational, whether the failures to match those simple rational models indicate irrationality, or whether there's something more complex going on. That's what the ambiguous evidence story I was giving is about: it's about using a more sophisticated rational model to see how these things that people actually do needn't be irrational. The reason I think that's important is that this narrative of irrationality really got into the popular culture. If you use something like Google Ngram to track the usage of terms like 'irrationality' and 'biases', they just explode from the mid-20th century on; the occurrence of 'bias' increased by something like 18 or 20 times, I believe. So it gets into the popular culture. Kahneman writes his famous book Thinking, Fast and Slow, which sells millions of copies, and he wins the Nobel Prize; Richard Thaler wins the Nobel Prize in economics; and so on. And we get this narrative that people are irrational and overconfident and so on in their beliefs. And the problem, if you have that narrative and you take it to politics, is that you try to explain polarization by appeal to those biases. You say: well, what drives polarization is that people have these irrational reasoning biases. The trouble is, for the reasons we talked about earlier, you never apply that to yourself. You don't think your own beliefs were formed systematically irrationally; you can't, otherwise you wouldn't have those beliefs. And so when we attribute polarization to irrationality, we attribute it to the other side's irrationality. We think the other side are the biased ones, the ones who are using confirmation bias and all the rest to come to think so differently from us.
And the problem is, it's a short step from thinking they're irrational and biased to thinking that they're dumb and immoral. Because if you think someone has beliefs that they shouldn't have given their evidence, and they're using those beliefs to pursue a political agenda, then they're doing something wrong. Whereas if you think they're sort of mistaken, they're misled, they're rational but wrong, then you should resist them, you should counteract their political moves and so on, and think they're wrong, but you shouldn't think they're dumb or biased or evil. So I think there's a connection between thinking that people on the other side are irrational and thinking that they're evil, and all the things that go along with affective polarization. So the basic hope for this sort of rationality story is that it could cool the temperature a little bit, if it can get us to see that maybe we can think the other side is wrong and not think they're dumb, as I like to put it.
Turi: [00:34:13] Can I tell this story back to you, just to make sure I've understood? We discover with Daniel Kahneman and Amos Tversky and others that in fact the human animal is not rational in the ways we'd previously assumed, at least not rational in straight lines. The idea that the human being is irrational therefore starts taking proper root across culture. Because we can't see ourselves as irrational, when we meet people with different opinions to us, we call them irrational. And that critique, that you're an irrational human, is one which is not just polarizing, it's demonizing. It makes them less than human. It makes them intellectually corrupt. That accelerates, puts fuel onto the fire of, polarization, and means that we actually can't engage with them. We stop thinking of them as humans; we stop thinking of them as the right kind of partners for us in the democratic project, which is collaborative politics. What your move does is to remind all of us that in fact even our cognitive processes that seem most motivated, that seem most biased, are also rational moves, and therefore we cannot claim that the other side is irrational and therefore less than human.
Kevin: [00:35:30] Exactly, yes. I think that's the big-picture project, at least.
Turi: [00:35:36] So Kevin, it's taken you half an hour to walk me through the arguments for the rationality of bias. How do you take this out into the world? How do you help us understand that the other side is doing its best?
Kevin: [00:35:54] You have to end on the hardest question, right? Yeah, I think that's the million-dollar question, and not one that I feel like I have a great answer to, but a few thoughts. One is simply that I think it's good to recognize that one of the main, if not the main, problems of polarization is this affective polarization, this demonization of the other side; and that we're doing no one a favor, and we're part of the problem, when, faced with people who disagree with us, we think of them as necessarily corrupt or immoral and want to shout them down. So I think there's certainly something to be said for encouraging more engagement with ideas, thinking that even if you think a person is wrong, you needn't think they're dumb. I think that's at least a partial palliative, although not much. A second thing the story adds is that it reminds us that when we look for something to blame polarization on, we don't need to blame people; we can blame structures, or we can blame certain people without blaming everyone. In this rationality story, I very much want to draw a hard distinction between the average American, say, and political actors that are in power, like Trump, McConnell, Nancy Pelosi. I'm not saying that they're acting in good faith or acting rationally and so on; we can certainly blame them, but it doesn't mean we necessarily have to blame the average voting Democrat.
Turi: [00:37:30] Yeah, exactly.
Kevin: [00:37:31] So just trying to cool the temperature on the demonization is one point. Another thing that I quite like, and I think you mentioned Robert Talisse's work earlier, is this idea that part of what drives certain forms of polarization is social sorting: the fact that we don't interact as much with people from the other side in nonpolitical capacities. Democrats and Republicans are more friends with themselves than across party lines, and so there's less and less social interaction, cross-party pollination, and so on. And so we don't really see them as human as much; we just see them when their political views come into contact with ours, and then we don't like it. So one of Talisse's proposals that I'm quite sympathetic to is the idea that one way to lower the temperature is to try to resuscitate certain forms of civic engagement that have nothing to do with politics. The classic example is the bowling league, as it came in this book Bowling Alone (I'm forgetting his first name, but Putnam is his last name) about the rise and fall of civic engagement in the 20th-century US. Throughout the later 20th century there was just a precipitous fall in doing stuff together that was just regular social stuff: little leagues and bowling leagues and that sort of thing. So insofar as we can do more of that (hard in a pandemic, of course), less political engagement and more just engaging with our fellow citizens and seeing them as reasonable people, then that's one way to try to combat it.
Turi: [00:39:07] Those are three upbeat suggestions. We tend not to finish these podcasts on a positive note, so I'm very pleased to be doing this with you, Kevin. Thank you so much for walking me through the rationality of bias, and I hope we see your future come to pass.
Kevin: [00:39:27] Thanks so much. Great to be here.