Team Reasoning
David Papineau
King’s College London and City College of New York
7th Annual Philosophy Public Lecture, University of Aberdeen, 14 June 2018

“There is no such thing as society. There are individual men and women and there are families.” (Margaret Thatcher)

Human Cooperation
People often cooperate – in sports teams, voting, use of the commons, non-free-riding, business associations, and so on. Economists and philosophers find this puzzling: “If the others cooperate, then there’s no need for me to do so; and if they don’t, then there’s no point in my doing so.” (Catch-22: “What if everybody felt that way?” Yossarian: “Then I’d certainly be a damn fool to feel any other way.”) The economists offer all kinds of ingenious explanations: iterated games, other-regarding aims. But the whole debate is based on a false premise. People normally think as teams, not as individuals.

Decision Theory
In a Decision Problem, a single agent needs to worry about states of the world. The agent assigns probabilities to the relevant states (Rain, Sun) and then works out which option will maximize expected utility.

                 Nature
               Rain   Sun
  A  Movies      5     5
     Beach       0    10

In our example, Beach is best if and only if Prob(Sun) > 50%.
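
The expected-utility comparison behind that last claim can be written out in a few lines. This is a minimal sketch of my own, with the slide’s payoffs hard-coded; the function name and the sample probabilities are purely illustrative.

```python
# Expected-utility calculation for the Rain/Sun decision problem above.
# The payoff numbers come from the slide; everything else is illustrative.

payoffs = {
    "Movies": {"Rain": 5, "Sun": 5},
    "Beach":  {"Rain": 0, "Sun": 10},
}

def expected_utility(option, prob_sun):
    """Expected utility of an option, given the probability of Sun."""
    return (1 - prob_sun) * payoffs[option]["Rain"] + prob_sun * payoffs[option]["Sun"]

for prob_sun in (0.4, 0.5, 0.6):
    utilities = {option: expected_utility(option, prob_sun) for option in payoffs}
    print(prob_sun, utilities, "-> best:", max(utilities, key=utilities.get))

# Beach overtakes Movies exactly when Prob(Sun) exceeds 0.5.
```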

Game Theory: Hi Lo / Footballers’ Problem
In Game Theory, we have two (or more) rational agents. They could use probabilities to guess what their “opponents” are going to do. But since their “opponents” are now rational agents, not nature, it should be possible to do better and predict their moves. But sometimes there are multiple Nash Equilibria. (That is, pairs of moves {X, Y} such that, if I know you’re going to do Y, I should do X, and vice versa. In effect, a Nash Equilibrium is a rut we can get stuck in.)

                  B
              Hi        Lo
  A  Hi    10, 10      0, 0
     Lo     0, 0       5, 5
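
To make the “rut” idea concrete, here is a small sketch of my own (not the lecture’s) that finds the pure-strategy Nash Equilibria of a two-player game given as a table of joint moves; applied to Hi Lo it returns both (Hi, Hi) and (Lo, Lo).

```python
# Find the pure-strategy Nash equilibria of a two-player game given as a
# dict mapping (A's move, B's move) -> (payoff to A, payoff to B).

hi_lo = {
    ("Hi", "Hi"): (10, 10),
    ("Hi", "Lo"): (0, 0),
    ("Lo", "Hi"): (0, 0),
    ("Lo", "Lo"): (5, 5),
}

def pure_nash_equilibria(game):
    moves_a = {a for a, _ in game}
    moves_b = {b for _, b in game}
    equilibria = []
    for a, b in game:
        # A cannot do better by switching, holding B's move fixed ...
        a_is_content = all(game[(a, b)][0] >= game[(alt, b)][0] for alt in moves_a)
        # ... and B cannot do better by switching, holding A's move fixed.
        b_is_content = all(game[(a, b)][1] >= game[(a, alt)][1] for alt in moves_b)
        if a_is_content and b_is_content:
            equilibria.append((a, b))
    return equilibria

print(pure_nash_equilibria(hi_lo))   # [('Hi', 'Hi'), ('Lo', 'Lo')]
```

Orthodox game theory stops here: both equilibria count as rational, and nothing in the individual’s calculation breaks the tie in favour of the better one.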

Stag Hunt
In the much-discussed Stag Hunt, two are needed to catch a stag, but one is always sure to bag a rabbit. Clearly, it’s rational for each to go for a stag, if the other is going to as well. But equally it’s rational to go for a rabbit if the other does. So once more orthodox game theory leaves both options as rational possibilities.

                      B
               Stag      Rabbit
  A  Stag     10, 10      0, 5
     Rabbit    5, 0       5, 5
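
Running the same pure-equilibrium check on these payoffs (again only an illustrative sketch) confirms the point: both joint stag-hunting and joint rabbit-catching are self-reinforcing.

```python
# Check which joint moves in the Stag Hunt above are pure Nash equilibria.
stag_hunt = {
    ("Stag", "Stag"):     (10, 10),
    ("Stag", "Rabbit"):   (0, 5),
    ("Rabbit", "Stag"):   (5, 0),
    ("Rabbit", "Rabbit"): (5, 5),
}

moves = ("Stag", "Rabbit")
for a, b in stag_hunt:
    a_content = all(stag_hunt[(a, b)][0] >= stag_hunt[(alt, b)][0] for alt in moves)
    b_content = all(stag_hunt[(a, b)][1] >= stag_hunt[(a, alt)][1] for alt in moves)
    if a_content and b_content:
        print("equilibrium:", (a, b))

# Prints ('Stag', 'Stag') and ('Rabbit', 'Rabbit'): two ruts, and individual
# reasoning alone does not say which one the players will end up in.
```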

What should we do?
Suppose we each ask, not “What should I do?”, but together ask “What should WE do?” Then the team, Us, has four options to choose between:

  Us
  Hi, Hi    10, 10
  Hi, Lo     0, 0
  Lo, Hi     0, 0
  Lo, Lo     5, 5

Now it’s a no-brainer. Of the four options, the first is clearly the best. By framing it as a team problem, all the complications about probabilities, and about what I should do if B does..., simply dissolve away.
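
As a sketch of how mechanical the team calculation is (my illustration, with the team ranking joint options by total payoff, an assumption that is harmless in Hi Lo, where every sensible team ranking agrees):

```python
# Team reasoning over the Hi Lo game: "we" pick the joint action that is
# best for the team, then each member plays their part of it.

joint_options = {
    ("Hi", "Hi"): (10, 10),
    ("Hi", "Lo"): (0, 0),
    ("Lo", "Hi"): (0, 0),
    ("Lo", "Lo"): (5, 5),
}

# Ranking by total payoff is an illustrative assumption, not the lecture's claim.
best_joint = max(joint_options, key=lambda joint: sum(joint_options[joint]))
print("The team chooses:", best_joint)                  # ('Hi', 'Hi')

my_role = 0   # index 0 for player A, 1 for player B
print("So my part is to play:", best_joint[my_role])    # 'Hi'
```

No probabilities about the other player are needed anywhere: the commitment to do one’s bit of the chosen joint action replaces them.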

Team Reasoning
Team reasoning can reach solutions that individual reasoning struggles to find. (See Beyond Individual Choice by Michael Bacharach, eds. N. Gold & R. Sugden.) This isn’t only of theoretical interest. It reflects the way people actually think in many situations. I would say that thinking as teams is exactly how people manage to cooperate. That’s why they don’t destroy the commons, don’t avoid voting, don’t ride for free, and so on.

Changing the Rules
Game theorists will say that nothing in this is an argument against game theory per se: I’m just modelling the real-life situations with different games, not producing some alternative to game theory. Fair enough. But note that I’m not just changing the model by giving the individual agents different altruistic aims/values. Rather, I am questioning the assumption that the agents facing decision problems are always individual humans.

Altruism not the Point
In the cases we’ve looked at so far, we didn’t need altruism as well as Team Reasoning to get the optimal result. Even self-interested agents can see that the optimal joint action is the best choice for the team. In other cases, like the notorious Prisoners’ Dilemma, we do need altruism, but we still need Team Reasoning as well. Even a prisoner who cares equally about the other can start worrying that, if B were rattled enough to confess, then I’d better confess too...

                       B
               Rat        Shtum
  A  Rat    -10, -10     0, -30
     Shtum  -30, 0      -5, -5
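
One way to see this in miniature is to model “caring equally about the other” as each prisoner valuing the sum of the two payoffs. That modelling choice is my own illustration, not the lecture’s formalism; but with it, altruism merely turns the Dilemma into a game with two equilibria, and something like Team Reasoning is still needed to settle on the good one.

```python
# Illustrative sketch: equal-care altruism modelled as each player's utility
# being the sum of both payoffs. The payoff numbers come from the slide.

pd = {
    ("Rat", "Rat"):     (-10, -10),
    ("Rat", "Shtum"):   (0, -30),
    ("Shtum", "Rat"):   (-30, 0),
    ("Shtum", "Shtum"): (-5, -5),
}

altruistic = {joint: (sum(pay), sum(pay)) for joint, pay in pd.items()}

moves = ("Rat", "Shtum")
equilibria = [
    (a, b) for (a, b) in altruistic
    if all(altruistic[(a, b)][0] >= altruistic[(alt, b)][0] for alt in moves)
    and all(altruistic[(a, b)][1] >= altruistic[(a, alt)][1] for alt in moves)
]
print(equilibria)
# [('Rat', 'Rat'), ('Shtum', 'Shtum')] -- altruistic prisoners can still get
# stuck confessing, which is the worry in the slide's last sentence.

# Team reasoning goes straight to the joint action that is best for the pair:
print(max(pd, key=lambda joint: sum(pd[joint])))   # ('Shtum', 'Shtum')
```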

Changing the Rules
Economists and game theorists think this is a cheat. Teams are made of individuals, and so surely joint decisions are simply compounds of individual decisions. The first claim is true enough. But the second doesn’t follow. The fact that groups are made of individuals doesn’t mean that group choices must be sums of rational individual choices, any more than the fact that people are made of cells means that individual choices must be sums of cell choices. Nothing in metaphysics can possibly show that my family, or my cricket team, or my country aren’t capable of all asking “What shall we do?”, and all doing our bit once this question has been answered. (Note the importance of this last commitment to Team action. In a way this is more important than the reasoning.)

The Evolution of Decisions
If evolution selects traits that foster individual fitness, then won’t it favour forms of decision-making that benefit the individual over the rest of their group, and so individual and not team reasoning? But we’ve been here before. Evolution has lots of mechanisms (kin selection, group structure) that can favour behaviours and desires that are altruistic. So it can similarly favour a whole form of reasoning that benefits groups rather than individuals. (Individual decision theory can do as well as Team Reasoning if agents have the right probabilities for what the others are going to do. But Team Reasoning is evolutionarily advantageous precisely because it bypasses the need to estimate probabilities case by case.)

Normative Rationality
Some theorists allow that Team Reasoning is empirically real, but insist that rational decision-making is always individual. I see no merit in this position, apart from its familiarity. If practical rationality is a matter of choosing means to ends, what’s wrong with choosing Team means to Team ends?

Us or Me?
I’m not saying team reasoning is more natural or rational than individual reasoning. In my view both are on a rational par as available psychological mechanisms for selecting behaviour. No doubt evolution has disposed us to use team reasoning in some situations and individual reasoning in others. The interesting empirical issue is which situations prompt one kind of reasoning and which the other; the interesting normative issue is when it is right to use one kind rather than the other.

Reasons for Reasoning
Haven’t I implicitly been taking individual reasoning as basic, in showing how it’s often in the interests of individuals to Team Reason? Maybe I have shown that, in some cases. But that doesn’t show individual rationality is basic. We could also show that it’s often Team rational to Team Reason, or indeed that it’s sometimes Team rational to reason individually. (Think of Adam Smith’s Team argument for individual decision-making.) None of this privileges one kind of reasoning as basic.

The Importance of Trust
We would all be in a mess without Team Reasoning. But note how it rests on trust. This is a precious resource. Trust can disappear even among people who are all strongly altruistic, and once it goes it is not easily regained.

The End