‘Cognitive bias’ is a general term for a number of ways identified by psychologists in which we often do not engage with the world as objectively as we might. Cognitive biases interfere with our understanding of the world by distorting our judgement in one way or another. There is an obvious overlap between cognitive bias and Middle Way Philosophy, as objectivity in Middle Way Philosophy is understood as including the overcoming of such biases. If Middle Way Philosophy is correct, cognitive biases should be explicable in terms of attachments to metaphysical beliefs, and can be overcome through the development of integration.
However, before we look at some examples of cognitive bias and how they can be understood in terms of Middle Way Philosophy, some differences also need to be noted between the approaches typical of empirical psychology, where these biases have been identified and tested, and those of Middle Way Philosophy. Wherever a failure in judgement is noted, a standard for comparison in good judgement needs to be assumed. In empirical psychology this standard is based either on independently verifiable facts or on accepted conventional facts, neither of which can be absolute, but both of which nevertheless assume the naturalistic model of facts and values. In Middle Way Philosophy, whilst science and convention can often provide us with helpful indicators, it is the avoidance of metaphysics and the adequacy of experience itself that provides the standard of objectivity, rather than any appeal to ultimate ‘external’ facts of any kind. Because facts are also recognised as necessarily inter-related with values in the very meaning of our language, there is no distinction between ‘cognitive bias’ and ‘affective bias’: both stop us engaging with conditions in parallel ways. Discussion of cognitive bias thus needs to be supplemented by psychological study of emotional limitations such as failures in empathy.
Nevertheless, the huge volume of work done by empirical psychologists is generally applicable and helpful in understanding the limitations on our objectivity of belief. It helps to provide evidence to support Middle Way Philosophy by showing the many ways that objectivity (or its lack) is incremental and psychological in nature rather than consisting in verified theories that are assumed to have a one-to-one representational relationship with reality.
All cognitive biases can probably be explained as (a) dependent on a metaphysical belief, and (b) often resolvable through awareness and integration. The awareness and the integration here are interdependent: one needs initial awareness that one has a bias in order to be able to address it, but to stop it arising unconsciously one needs a longer-term integration that addresses the conditions in which the cognitive bias develops. On the other hand, continued attachment to the metaphysical belief at the basis of the bias entrenches it, and prevents integration. However, there is no guarantee that we will always be able to overcome all cognitive biases, just as we may not be able to remove all metaphysical assumptions from our thinking.
There is a very helpful list of cognitive biases on Wikipedia here. Many of these phenomena overlap, but have been given different names by different psychologists along with slightly different explanatory theories. Most of the ones listed on Wikipedia can be readily explained in terms of metaphysics and integration, so I am happy to respond to queries on specific ones not mentioned below. Here I will just give an indicative list of some of the more important cognitive biases that have been identified, with their metaphysical root and integrative resolution. All the names of the biases given as sub-headings below are linked to Wikipedia articles that explain them further and give references.
All these entries are provisional, and comments to help improve them are welcome.
Actor-observer bias
Bias: We tend to explain our own behaviour in terms of the demands of the situation, and others’ behaviour in terms of their character.
Metaphysical assumptions: Freewill and determinism, although both may be found on either side.
If we are attempting merely to describe actions, without a weight of negative group-judgement attached to them, we tend to see ourselves as responding rationally to a situation and others as responding in predictable ways according to character. If we were to recognise the effects of character on ourselves here, it would interfere with the belief that we have a choice, and if we were to recognise the effects of situations on others, we would have to recognise that they, too, reason and make choices in relation to situations.
However, when negative moral judgement is involved, we tend to see ourselves as determined by a situation in order to avoid a negative judgement from the group (“I had to do it – I had no choice”), but others as responsible for their character, which led them to act in the way they acted (e.g. the belief that a murderer is evil).
Integration: We need to integrate our beliefs about character with our beliefs about situations so as to treat them in a more consistent way, and a decisive avoidance of implicit beliefs in freewill or determinism will help with this process.
Anchoring
Bias: This is the tendency to make judgements in dependence on a specific ‘anchoring’ focus on a fact or type of fact.
Metaphysics: Any metaphysical belief can be used as an ‘anchor’ in this way. For example, an anchoring belief in the existence of God can make us focus on types of event that show God’s care or God’s punishment, and neglect events that do neither of these. An ‘anchor’ here appears to be another name for a habitual identification due to conditioning.
Integration: To integrate the anchor with other possible starting points we need to extend meaning, which may mean a process of learning in which others point out alternative approaches.
Bias: Here an emotionally more powerful identification leads us to pay far more attention to some factors than others, for example not considering counter-examples or negative results in experiments, but only the relationship between preferred belief and the evidence that directly supports it.
Metaphysics: The absolute value of the belief identified with, and lack of value of any alternatives.
Integration: This probably occurs at the level of desire, and involves the person extending their awareness systematically to other areas of experience (e.g. in meditation).
Bias: We tend to avoid difficult and complex judgements by unconsciously substituting easier ones: for example, judging a person according to stereotypes, or solving a difficult maths problem by using an inappropriate but more familiar and simpler method.
Metaphysics: This type of cognitive bias can contribute to an explanation of why any type of metaphysics is favoured as a basis for judgement. Metaphysical beliefs are simple, universal and catch-all, so they are much easier to call on than complex assessments based on experience.
Integration: To deal with this tendency we need to integrate our desire for a quick and easy answer with our desire for an accurate answer that addresses conditions and/or our belief that there are easy metaphysical answers with our recognition of their deceptiveness.
Bias: A new idea becomes rapidly popular (‘goes viral’ in internet parlance) because of its increasing accessibility and the group pressure to adopt it. This tends to overwhelm critical scrutiny of the idea.
Metaphysics: This mechanism helps to explain both the links between any metaphysical belief and its supporting group, and the ways in which opposing metaphysical beliefs can rapidly replace one another. A belief adopted in this way cannot be provisional because of the way in which it is adopted, which precludes scrutiny.
Integration: This mechanism can be avoided by developing individuality and critical thinking skills so as to integrate beliefs more effectively.
Belief bias
Bias: Here, the process of reasoning is undermined by a tendency to accept or reject the conclusion regardless of whether it validly follows from the premises.
Metaphysics: This seems to be an example of a metaphysical defence mechanism, because it prevents reasoning based on experience from upsetting belief in a metaphysical claim that is already accepted. Experiential claims, on the other hand, depend for their justification on such links of reasoning, so they cannot be served by the belief bias, only undermined by it. Metaphysical conclusions that are supported by metaphysical premises are unaffected by the belief bias, because the conclusion is accepted regardless of whether the reasoning actually played any part in its acceptance.
Integration: Integration of belief, particularly through critical thinking skills.
Belief disconfirmation paradigm, to avoid cognitive dissonance
Bias: Cognitive dissonance is created by inconsistent beliefs, and we have a drive to reduce this inconsistency, which is obviously one of the motivators of reasoning. So cognitive dissonance can be very useful to us. However, the drive to reduce cognitive dissonance is not necessarily satisfied by reasoning to bring about consistency, but can alternatively be tackled just by ignoring or denying information that creates conflicts with existing accepted beliefs. We may not even see things that are dissonant in this way, through denial. Another technique that avoids cognitive dissonance is ad hoc reasoning (also known as ‘moving the goalposts’), where the original information is re-assessed in a way that merely defends the original belief, as in the re-interpretation of failed prophecy.
Metaphysics: The defensiveness of these cognitive biases can be employed to support any kind of metaphysical claim, and is associated with metaphysical claims rather than provisional ones, because provisional claims do not need to be defended in this way.
Integration: The solution to cognitive dissonance is the use of critical reasoning to either identify what conflicting beliefs have in common, or to show the justifiability of one and lack of justifiability of the other. This is a dialectical process, rather than one of rejecting or denying one belief immediately without considering evidence. In this way conflicting beliefs can be integrated rather than one merely being identified with and subjugating the other.
Clustering illusion
Bias: This is the tendency to find patterns in phenomena that are no more than random or otherwise lack the significance attributed to them. A harmless form of this is merely projecting imaginative significance onto forms, such as seeing dragons in the clouds (indeed this could be seen as a positive extension of meaning). However, a cognitive bias can obtrude when this projection is given a metaphysical status, i.e. taken to be the ‘real’ message of the phenomena. For example, numerological interpretations try to find hidden messages from God in the apparently insignificant numerical construction of the Bible, such as the numbers of chapters and verses in each book. The clustering illusion can also be detected in the design argument for the existence of God, which attributes divine significance to patterns that could otherwise be interpreted as the result of random processes (e.g. through evolutionary theory).
Metaphysics: Any metaphysical claim can be supported using a clustering illusion. However, in applying the Middle Way to this instance of cognitive bias, it should also be recognised that there is a cognitive bias involved in denying such significance, as opposed to merely affirming it. For example, we cannot deny that a person may experience God’s purpose in the world around them. The metaphysical move, and the cognitively harmful bias, arises either in affirming or denying a metaphysical status for that experience.
Integration: So, the clustering illusion needs to be treated carefully if we want to integrate the beliefs it tends to give rise to. If we experience significance in a given pattern that others think is ‘merely random’, we should accept that significance. However, we also need to stay with that experience as an experience rather than over-interpreting it as providing truths about the universe. Meditation can provide helpful practice in observing such experiences and their significance for us, without attaching unnecessary labels to them.
Confirmation bias
Bias: Confirmation bias is the tendency to look for information that confirms your existing beliefs, rather than neutral or falsifying information. This leads to the under-reporting of negative results in scientific research, and leads those considering a particular viewpoint to concentrate on arguments from viewpoints that favour it. For example, how many theists read as much work by atheists and agnostics as by theists? Scholarliness is another approach that is very subject to confirmation bias. Diligent scholars can give you hundreds of references to prove their point, and then be baffled as to why you should possibly still reject such overwhelming authority: but all the authorities quoted have been selected to support the scholar’s position. Even when making an everyday observation (e.g. “John is lazy”) we do not usually compare John’s degree of laziness with other possible degrees of observable laziness, but just take characteristics of laziness by themselves to prove the point – yet perhaps John is a lot more industrious than most of his peers.
Metaphysics: Confirmation bias can be seen as part of the mechanism by which any metaphysical view is maintained. Given that we tend to seek evidence that confirms a metaphysical view because it can be easily interpreted in its terms, the view remains unchallenged. Opposed views may be given cursory consideration, but not with a degree of detail or seriousness that allows challenges to really be considered, because the nature of the view is such that alternatives become threatening and a move away from it would be discontinuous. Provisional views, however, are constantly checked against alternatives which may supplant them if they fit experience better.
Integration: Confirmation bias shows a lack of integration, because the potential for other beliefs is held opposed to the belief we identify with. Psychologists investigating polarisation have found an association between publicly declaring a position and the development of opposed perspectives on the same evidence amongst different groups. This suggests that the more we identify with a group, the more we need to reject an opposing set of beliefs, reinforced with confirmation bias, to reinforce our group identity. Confirmation bias can thus be tackled by integrating the beliefs of groups at a social level and/or the beliefs of individuals, who can be trained in critical thinking and/or rigorous use of scientific method to give just as much attention to opposing evidence or opposing views. In the long term this is the best way of supporting a view that one identifies with: make it address the conditions better!
Disregard of regression towards the mean
Bias: We tend to assume that exceptional activity will continue, rather than falling back to a pattern that is more consistent over the longer term. For example, after a period of weeks of exceptionally hot weather, I expect more hot weather and am surprised when the temperature falls. A temperature that is actually normal for that time of year in my section of the globe feels cold for a while as a result. The ‘hot hand’ fallacy is another example of this: gamblers who have had a run of luck expect it to go on.
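As an illustration, here is a minimal simulation sketch (not from the original article: the temperatures, the 1.5-standard-deviation threshold, and the assumption that consecutive days are statistically independent are all hypothetical simplifications) showing why an exceptional observation is usually followed by a more ordinary one, even though nothing has ‘changed’:

```python
# Minimal sketch of regression towards the mean, with hypothetical numbers.
# Consecutive days are treated as independent draws from the same climate,
# which is a deliberate simplification (real weather is correlated day to day).
import random

random.seed(0)
mean_temp, sd = 20.0, 5.0           # hypothetical seasonal mean and spread (°C)
pairs = [(random.gauss(mean_temp, sd), random.gauss(mean_temp, sd))
         for _ in range(100_000)]   # (today, tomorrow) pairs

# Select only exceptionally hot days (more than 1.5 sd above the mean)...
hot = [(today, tomorrow) for today, tomorrow in pairs
       if today > mean_temp + 1.5 * sd]

# ...and compare them with whatever followed.
avg_hot = sum(t for t, _ in hot) / len(hot)
avg_next = sum(n for _, n in hot) / len(hot)
print(f"average exceptional day: {avg_hot:.1f}°C; average following day: {avg_next:.1f}°C")
# The following day averages close to 20°C: the extreme was not 'corrected',
# it was simply unlikely to be repeated.
```

The same logic applies to runs of luck: a streak tells us very little about the next independent outcome.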
Metaphysics: The metaphysical belief involved here is that of scientific relativism: we absolutise the conclusions drawn from a localised pattern and assume it is universally and eternally the case for us (even if we theoretically recognise different patterns existing elsewhere) instead of considering it provisionally in relation to patterns in other times and places.
Integration: The solution is to integrate our universal beliefs with our relative ones through a process of critical investigation.
Hindsight bias
Bias: This is the tendency to believe with hindsight that we knew how things were going to turn out: a belief that is not supported by any corresponding accuracy of prediction before the events.
Metaphysics: This cognitive bias is closely related to the metaphysical belief in determinism, that all events are inevitable and thus could have been known in advance. Even if we don’t claim to have actually known in advance that certain events were going to happen, hindsight bias makes us feel that in some sense we should have known had we had enough information or insight – the illusion perpetuated in determinism.
Integration: Hindsight bias shows a lack of integration of belief between different times, because we are unable to acknowledge the ways in which our past beliefs were different from our present ones but insist on imposing the present on the past. We have to acknowledge our past beliefs as different through critical investigation before we can genuinely integrate them with present ones.
Hyperbolic discounting
Bias: Hyperbolic discounting is the tendency to value benefits that are closer in time over those that are further away in time, so that $1 today is worth more to me than $1 in a year’s time (regardless of inflation). This ‘mistake in moral mathematics’, as Derek Parfit calls it, is one of the difficulties of a utilitarian approach to moral reasoning, in which we must weigh up benefits that are the consequences of actions against each other to judge the actions. Whenever we weigh up benefits, our judgement is distorted by a preference for having them sooner, even though the benefits themselves are not experienced any differently later. Jeremy Bentham formalised this hyperbolic discounting by including propinquity, or nearness in time, as one of the aspects of pleasures we should consider in his moral calculus, but he was unable to give any further justification for why an imminent pleasure should be considered better than a distant one. It amounts to a metaphysical assumption that is made in consequentialist calculation.
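As a hedged illustration of what is being described (the formula is a standard one from the psychological literature on delay discounting, not something given in the article, and the numbers are purely hypothetical), the subjective value V of a benefit A delayed by time D is often modelled as a hyperbolic function:

\[ V(D) = \frac{A}{1 + kD} \]

where k is an individual’s discount rate. With k = 1 per year, $1 promised in a year’s time is valued at only about $0.50 now, even with no uncertainty or inflation. Because the curve is steepest near D = 0, a person with a sufficiently steep discount rate can prefer $100 today to $110 tomorrow while preferring $110 in 31 days to $100 in 30 days – exactly the kind of inconsistency that makes this a ‘mistake in moral mathematics’ rather than a defensible weighting.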
Metaphysics: A metaphysical assumption is involved when we discount an event in the future just because it is in the future, rather than because of uncertainty about events in the future. The metaphysical belief involved is effectively a denial of time, and is the same left-hemisphere judgement as that involved in Zeno’s Paradox.
Integration: The lack of integration involved is very clear here, because it consists in a privileging of desires that we can more closely relate to our present experience over those in the future. This reflects the demands of the ego, insisting that our current desires and views are the only ones and failing to identify with those of other times. The process of integration here depends on extending our identification of ourselves to different times, which might be done through reflection and/or meditation. Using Kantian deontological moral theory as a device to challenge the weaknesses of the utilitarian approach is also a good moral strategy for overcoming this cognitive bias in moral decision making.
Illusion of control
Bias: This is the tendency to believe that we are in control of events when our actions have a direct input into them, even though those actions may not be significant in determining such events. For example, when playing a lottery, being able to choose the number makes people feel that they have control over their chances of winning, even though the chances of winning with a chosen number are no greater than with a random number. Those who are driving also tend to believe that there is less chance of an accident than there would be if they were in the passenger seat.
Metaphysics: This cognitive bias can be related to the metaphysical belief in freewill, which imposes a belief in our control over all our actions because we are making choices about those actions, without taking into account the conditioning affecting the choices, the actions and the outcomes.
Integration: Here the belief that we are in control needs to be integrated with the belief that we are not in control, probably associated with active and passive desires. We should be able to do this through critical reasoning together with a basic level of scientific knowledge about the conditions we encounter.
Illusory correlation
Bias: This is the tendency to correlate easily identifiable, available or unusual categories and to have exaggerated beliefs about the extent of the correlation, based on limited experience. For example, an elderly white woman burgled by a black man might subsequently think of all black men as potential burglars.
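As a hedged illustration of the ‘limited experience’ point (a minimal sketch with invented numbers, not taken from the article): even when two groups have exactly the same underlying rate of some behaviour, very small samples quite often show a large apparent difference, which memory and expectation can then harden into a belief about the groups.

```python
# Minimal sketch: spurious group differences arising from limited experience.
# Both groups have the same 5% rate of the behaviour by construction;
# all figures here are hypothetical.
import random

random.seed(1)
true_rate = 0.05      # identical underlying rate in both groups
encounters = 10       # 'limited experience': only ten encounters per group
trials = 10_000
spurious = 0

for _ in range(trials):
    a = sum(random.random() < true_rate for _ in range(encounters))
    b = sum(random.random() < true_rate for _ in range(encounters))
    if abs(a - b) >= 2:   # observed rates differ by 20+ percentage points
        spurious += 1

print(f"{spurious / trials:.0%} of small samples suggest a large group difference")
```

With these invented figures, roughly one small sample in ten looks like strong evidence of a real difference where none exists.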
Metaphysics: An absolutising of beliefs that may be correct in relation to certain limited times and places, but are not absolutely or universally the case. This can be seen in relation to moral beliefs when a moral rule that may be relevant to one situation is adhered to as an absolute: for example, Jewish dietary rules, which seem to have had the original purpose of distinguishing the Israelites as a tribe from other tribes, continue to be applied by orthodox Jews even though this distinction no longer serves the same function.
Integration: Like any identification with an absolute position, this cognitive bias involves a lack of integration between identification with the absolute belief and other possible identifications, with denial of experiences that do not fit (see belief disconfirmation paradigm above). A process of critical thinking and sceptical enquiry is probably the primary means by which integration can be achieved here.
Impact bias
Bias: This is our tendency to overestimate the strength and/or duration of future feelings, whether good or bad. In general we recover more quickly from loss, and get bored more quickly with a new source of satisfaction, than we think we will.
Metaphysics: Like hyperbolic discounting above, this involves a metaphysical assumption of the absolute value of our identifications at one moment in time, and a denial of time itself. The ultimate version of impact bias is the belief in eternal bliss or eternal damnation after death, in absolute contrast to the short-lived experiences of pleasures and pains we have in this life.
Integration: The bias is created by a lack of integration between different identifications at different times. However, here, instead of giving preference to our current identifications, we idealise future ones and identify with those at the expense of present ones. In this way, we give less credence to current pains and pleasures and exaggerated credence to the impact of future ones. As with hyperbolic discounting, both meditation working directly with desires, and Kantian moral reasoning dealing with moral principles, might help integrate opposed identifications with different times here.
Ingroup bias
Bias: Ingroup bias is the tendency to favour our own group over those outside it, both in preferential treatment and in greater credibility given to the beliefs of the ingroup. This has been shown by empirical psychologists to occur even in randomly created groups where there was no other reason for favouring an ingroup.
Metaphysics: Ingroup bias is part of the mechanism of metaphysics, by which group beliefs become a badge of identification for a group. Ingroup bias research shows that individual judgements are (consciously or unconsciously) subjugated to the group through adherence to group beliefs. This could mean that alternative views are never entertained, that they are prejudicially rejected when considered, or that they are accepted secretly but not revealed to the group for fear of losing one’s place in it. Any view can be held dogmatically by a group in this way, discouraging individual investigation or consideration, but metaphysical beliefs lend themselves to this process by being apparently immune to disproof. More empirical research needs to be done to establish (or falsify) the links between metaphysical beliefs specifically and ingroup bias in an experimental context.
Integration: The lack of integration involved in ingroup bias is obvious at a social level: it leads to conflict between groups, or between groups and transgressive individuals. However, it also applies at a psychological level, because our feelings of wishing to conform to the group tend to conflict with justified beliefs based on experience. We can only overcome this lack of integration in the long term by making group beliefs increasingly provisional and subject to evidence. Strongly-rooted drives here seem to be in conflict with our investigatory intelligence, but we can still integrate the two by recruiting the energy of those drives towards increasing the adequacy of group beliefs. A group of research scientists or a critical thinking class could each provide examples of groups where, despite some continuing ingroup bias, there is a strong attempt to use the resources and energies that individuals give to a group to actively undermine dogmatic group positions.
Information bias
Bias: Information bias is the tendency to prefer more information as an end in itself, even when this is irrelevant to our practical judgements. It is a popular delaying tactic for politicians or managers who want to avoid difficult decisions.
Metaphysics: This is directly related to metaphysical belief, as metaphysical belief consists in assumed information that cannot be relevant to our practical judgements, even though we have a tendency to assume that it will be. For example, if we are making a judgement about a person’s responsibility for a blameworthy action, we may believe that their freewill is relevant to the judgement, but this gives us no further information than that offered by experience of their character, behaviour, reasoning abilities etc. Whilst judging them on the basis of experience, we use freewill as a further false justification for allocating responsibility.
Integration: We can overcome information bias by integrating our practical beliefs more fully with our cognitive construction, so that we are neither creating a cognitive construction that goes beyond practical experience, nor adopting an unduly narrow understanding of practical needs.
Introspection illusion
Bias: This is the tendency to believe that our own introspection is a more reliable guide to the causes of our choices and behaviours than are the outward observations of others. This belief is inconsistent with the way we treat others’ introspections, which we tend to see as unreliable. We tend to believe that our own motives for choices are transparent to us, even when we have constructed our account of those motives with the benefit of hindsight. This enables us to believe that we are rational and virtuous in ways that others are not, because we see our own behaviour as motivated by clear, justified reasons, even when it may well have unconscious causes, and the behaviour of others as irrationally caused even when it may have been carefully considered.
Metaphysics: Introspection illusion is closely associated with metaphysical beliefs about true sources of knowledge known introspectively. The most obvious example of this is that of Descartes, who believed that what he “clearly and distinctly” perceived must be indubitable. Cartesian assumptions have motivated many subsequent philosophers to believe that either direct internal experience or a priori reasoning (which Descartes called “clear and distinct”) must provide certainty in contrast with uncertain external observations – for example, Husserl and the phenomenologists on unmediated internal experience, or the analytic tradition on a priori reasoning. Meditators, mystics and other religious figures have also often claimed that their inner illuminations give them certainty about God or about universal truth, of a kind that external experience could not provide. In both these kinds of cases, the clarity and/or power of an internal experience has been mistaken for a source of absolute truth, even though the power or clarity of internal experience or reasoning bears no necessary relationship to the truth of assertions justified through it. Metaphysics is thus often justified in ways that illustrate the introspection illusion.
Integration: The introspection illusion can be overcome by integrating the beliefs that we associate with introspection with those we associate with outward observation. This involves extending our awareness to try to understand ourselves from the point of view of others and others from the point of view they have of themselves, in addition to (not instead of) our personal experience of authoritative introspection. Obviously communication may help with this process.
Just world hypothesis
Bias: This is a tendency to assume that the world is just, in order to avoid the cognitive dissonance (see above) created by injustice, to maintain a sense of security and to avoid anxiety. The just world hypothesis leads people to blame victims for their suffering and praise those who are merely fortunate, preventing us from recognising conditions that do not accord with the belief that the good always prosper and the bad always suffer. This bias is discussed in Melvin J. Lerner’s The Belief in a Just World (1980).
Metaphysics: The Just world hypothesis has strong links with the metaphysical belief in cosmic justice. For example, the Just world hypothesis motivates belief in heaven and hell as reward or punishment for sins on earth, belief in karma and rebirth or reincarnation, and belief in a just outcome in history (such as the American ‘Manifest Destiny’ or the Israelite belief expressed in the Old Testament that the invasion of Israel by Assyrians and Babylonians was God’s punishment for their lack of faith). In philosophy, it also motivates doctrines such as Schopenhauer’s ‘Principle of Sufficient Reason’ (following Leibniz). However, the Just world delusion does not need to take such grand forms. If we simply feel that we should have been given that job that we clearly deserved, failing to comprehend the reasons why we didn’t, the Just world delusion may be at work.
Integration: The solution to the Just world hypothesis is to integrate our beliefs about observable events with our desire for favourable outcomes. The Just world hypothesis is just extended wishful thinking, but if we take enough notice of what observation actually tells us, whilst recognising the idealising energy we invest in Just world fantasies, we might be able to start investing those energies in making the world more just, as opposed to pretending that it is already. A decisive rejection of Cosmic Justice beliefs can only help with this, even though we can still accept and enjoy the meaningfulness of Just world fantasies as we meet them, say, in Dante’s Divine Comedy or the Buddhist Wheel of Samsara (‘Wheel of Life’).
Moral luck
Bias: Moral luck is our tendency to make moral judgements that depend not just on actions that are the result of choices, but on events, circumstances or shaping influences that are not under our control. For example, we blame people who do things that are a danger to others (like reckless driving) much more when they actually result in harm, and we praise able students for success that is largely a result of intellectual capacities that have been genetically inherited. Moral luck is also related to the actor-observer bias (see above), because we tend to discount mitigating circumstances that apply to others when holding them responsible, whilst exaggerating those that apply to ourselves.
Metaphysics: The belief in the absolute moral value of what we take to be the factual situation, regardless of the psychological considerations that would introduce degrees of value. The assumption of moral luck is one factor that makes the analytic style of ethics (which effectively appeals to convention as the only moral ground) negatively metaphysical, because it takes a conventional view that includes moral luck as the basis of judgement when we cannot justify it through our moral experience (including experience of developing integration).
Integration: To address this, we need to integrate our desire for social acceptance (which drives us to accept the conventional blame of the unlucky) with our desire for individual control and freedom (which leads us to the rejection of moral luck).
Neglect of probability
Bias: We tend to neglect probability when acting in situations of uncertainty, making our decisions based only on an absolute emotional response to a risk or a desired outcome. For example, players are attracted to lotteries by large prizes at extremely low odds, and people change their behaviour much more in response to risks that have been brought to their attention than to ones that continue in the background (e.g. a well-publicised rapist makes some women afraid to walk the streets at night, even though the overall probability of an attack has not changed significantly).
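As a hedged illustration (the figures are invented for the purpose, not taken from the article), attending to probability would mean weighing the emotional pull of the prize against the expected value of the ticket:

\[ \text{expected return} = \frac{1}{14\,000\,000} \times \$10\,000\,000 - \$2 \approx -\$1.29 \]

For a hypothetical $2 ticket with a $10 million jackpot at odds of 1 in 14 million, almost the whole ticket price is lost in expectation; the near-certain small loss is neglected in favour of the vivid but vanishingly improbable gain.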
Metaphysics: The neglect of probability is metaphysical because it involves thinking in absolutes rather than increments: probabilistic thinking is incremental and absolute thinking is metaphysical.
Integration: Any kind of integration can help us to pay more attention to probability by reducing the unreflective power of the thoughts of one moment and helping us see them in the wider context of other thoughts.
Self-serving bias
Bias: Self-serving bias is the tendency to evaluate all information in a way that supports our own interests. Ambiguous information is interpreted in accordance with our own interests, and self-serving information is selected to the exclusion of opposing information. We take credit for success but make excuses for failure.
Metaphysics: Belief in the self and identification of the self with the individual. Self-serving bias attributes an absolute value to a narrow conception of my own interests.
Integration: If this conception of my interests was extended and integrated with other opposing conceptions, my own longer-term interests would be better addressed because they would address more conditions.
Status quo bias/ system justification
Bias: This is the tendency to support the existing established system or status quo over alternatives, even when an alternative may be clearly in our interests. For example, this accounts for conservatism amongst the poor, some of whom prefer the existing social and economic system over any radical changes that may be proposed, even though it works to disadvantage them.
Metaphysics: System justification can be related to conservatism as a metaphysical ideology, which gives absolute value to tradition regardless of its usefulness, whether in a political, social, or religious context.
Integration: To overcome system justification we need to integrate our identification with the past and present with that for future ideals, using ideals to influence present policy realistically through gradual change.
Stereotyping
Bias: Stereotyping is the tendency to assume that individuals in groups other than our own have features that we take to be typical of that group, regardless of individual differences. It is related to argument from anecdote and the problem of induction, because it involves over-generalising from limited examples.
Metaphysics: Stereotyping is due to interpreting our experience in terms of expectations, which take an absolute form, rather than taking in new incremental information from experience. In a sense every metaphysical view is a stereotype, because it consists in an absolute belief that can only be built on relative foundations, when all our experience is relative.
Integration: We can overcome stereotyping by integrating our wider generalising beliefs with our specific observations. We do not need to go to the extreme of rejecting all generalisations, but we do need to keep all generalisations provisional and subject to revision in the light of new information.
Subjective validation/ Forer effect
Bias: Subjective validation is the tendency to believe that two unrelated phenomena are connected because the belief in a link fits into our existing beliefs (closely related to confirmation bias, see above). The Forer Effect is a specific example of this, found in people’s tendency to believe that a vague or general character description that fits their beliefs about themselves was specifically written about them – as in horoscopes and fortune telling.
Metaphysics: Subjective validation tends to reinforce metaphysical beliefs because, as in confirmation bias, we tend to interpret the world around us in terms of those beliefs. In the particular case of the Forer effect a deterministic belief either about our character or our future fate is reinforced by a tendency to interpret ambiguous information in its terms.
Integration: We can overcome subjective validation through the practice of critical thinking, which integrates our existing beliefs with new ones acquired through new experience or information, by examining the justification of the existing beliefs in the light of the new ones.
Hi Robert, I have read the main body, and a couple of the instances detailed below, but haven’t time to go through them all just now. My initial response is cautious of accepting metaphysics for the explanation of cognitive biases. Many biases are due to giving excessive weight to one particular way of evaluating information rather than anything to do with a metaphysical belief. The availability heuristic is a good example. Biases may be failures in objectivity, but I am very doubtful that being poor at objectivity can always be reduced to holding metaphysical beliefs. My intuition is that the direction of causation is more likely to go the other way. Cognitive biases are frequently the result of evolved information processing systems we share with other animals who couldn’t even conceive of a metaphysical belief, never mind use one to guide their decision making processes. I also think it is unfair to suggest empirical psychology necessarily assumes a naturalistic model of facts and values. The conclusion from empirical psychology that the availability heuristic is a cognitive bias doesn’t require a division of facts and values at all, simply the collection and presentation of a wider range of evidence than is typically available to our individual memory recall. I don’t see how this has anything to do with metaphysics. It is just plain old fashioned physics rather than metaphysics. If we got rid of metaphysical beliefs overnight, we’d still be left with plenty of cognitive biases because of the way our brains / minds process information. I don’t believe that when psychologists use the term cognitive bias they are suggesting they have access to some absolute reference standard with which to assess that bias, merely that the individual is being excessively selective about the information being considered and making assumptions which aren’t coherent with assumptions drawn by considering a wider range of information which they have available to them.
Hi Julian,
There are various issues here, but let’s start with the other animals.
I’m not sure that animals can’t have metaphysical beliefs, given that beliefs can be implicit as well as explicit. A belief is a model on the basis of which we act, and a metaphysical belief is a rigid and absolute model. If an animal adopted a rigid model of its environment and continued to act on the basis of that rigid model, I would call that an implicit metaphysical belief. For example, a cat could absolutely believe that every time a tin was opened there would be food for it: when its owner started opening a tin of tomatoes, it would then continue to mew in the expectation of food. The difference with humans is that the metaphysical belief becomes a conscious object of reflection, and there is thus the potential for the Middle Way rather than just its precursor, homeostasis (see the homeostasis page).
At the human level, “an excessive weight to one particular way of evaluating information” is exactly what a metaphysical belief involves. It usually has a quality of stuckness and fragility, but its effect might be, for example, to limit attention to a particular field. I don’t see why the availability heuristic can’t be an example of an implicit metaphysical belief that the truth must be found in this field rather than that.
By associating metaphysics with cognitive bias, I am not suggesting that causality goes one way or the other. Rather they are different aspects of the same interactive or mutually causal process. You might as well say that metaphysics is a redescription of cognitive bias, or another way of analysing it.
As I think I explained on the retreat, I don’t envisage that it would be possible to remove metaphysical beliefs overnight. That would be rather an idealised goal. We could make incremental progress, but as you say, it may well be that we could not eradicate them. I wouldn’t make such a clear distinction as you are between metaphysics and physics, as physical claims that are naturalistic rather than provisional are also metaphysical.
I’m open to the possibility that I may have analysed some of these biases incorrectly (which is where it would be useful to examine them in more detail with you – when you have time). However, it wouldn’t fit with my current understanding of what a cognitive bias is that a basic condition in the way we unavoidably process information is involved. I can’t see that with any of these examples, at any rate. For example, the availability bias as I understand it is not about having limited information available to us, but about our response to having that limited information. If we expect to find correct information only within that limited field, that involves a metaphysical belief. I think you need to tell me in more detail (again, when you have time) which biases you think are about background conditions.
Then, finally, there’s the question of whether empirical psychology assumes the fact-value distinction. My remarks about that were mainly intended to avoid people assuming that a similar fact-value distinction was the case for Middle Way Philosophy. Obviously I can’t account for all empirical psychologists, but my impression is that all empirical psychologists that I’ve ever come across do try to apply that distinction. Even Jung was worried about not appearing scientific, and had to be coaxed to reveal much of his autobiography because it was value-laden. So are you just objecting to my generalisation here, or do you positively know of empirical psychologists who don’t apply the fact-value distinction?
I feel I’m bumping into the effects of your scientific training here. Do take your time to think about it, because it might take a while to unpick different elements of that training. Perhaps the key thing here is that you seem to be taking me to be offering a rival scientific hypothesis about the cause of a phenomenon. That’s not what I’m doing really – I’m trying to understand that same phenomenon using different conceptual tools in a way that is more clearly compatible with a value system.
Julian and Robert, I think that you are both getting caught up in an argument about origins.
Julian, you seem to regard the way people give different weights to the information coming in to them as a natural, non-conscious effect of their genetics, culture and experience.
Robert, I don’t think you would deny that genetics, culture and experience have an impact on the development of all cognitive biases, but you seem to be suggesting that those factors have become internalised and brought into a reasoned belief, or ideological structure, a metaphysical construct.
This seems to be a trivial and yet crucial distinction between your different approaches to this fascinating subject.
For me, a metaphysical belief is necessarily a construct and as such is subject to any and all physical and mental constraints and cognitive abilities. Such beliefs, it seems to me, are formed in order to make sense of the world as we perceive it and to create a model which we can use to interpret circumstances and events. If, then, it is ourselves that create these beliefs, or have reasoned our way into accepting beliefs that others have constructed, then these fabrications, or avenues of reasoning will have been subject to our inherent cognitive biases from the get go…
We cannot escape our genetics, nor our upbringing, and their effects on the way we view the world are cumulative and reinforced throughout our lives. It takes a great deal of conscious effort to overcome these biases because they are often completely hidden from our conscious mind and are so ingrained that even when we become aware of them they remain, stubbornly, a part of the way we think of and evaluate the world.
To sum up, a metaphysical belief must hold some sort of logic that is consciously available and be, at least to some degree, the result of a reasoned argument. Cognitive biases, on the other hand, do not need any sort of conscious reasoning: once identified they can, of course, be either defended or attacked with a constructed argument, but this is not necessary for their existence, or for their influence on the different ways in which we view the world we live in…
Julian and Robert, I apologise if I have misrepresented your views, but I enjoyed reading your pieces and I felt I had something to add.
Hi Martin, Thanks for your comment. You may have noticed that the discussion you’re referring to is eight years old! My ways of discussing this issue have developed quite a lot in that time, and I wouldn’t be surprised if Julian’s have too. The article above was written before my book ‘Middle Way Philosophy 4: The Integration of Belief’ in which I go into all this in more detail. I also wrote this paper about the issue of biases specifically: https://www.researchgate.net/publication/283460051_Cognitive_error_as_absolutisation. I am also planning an update on that book in a new series I am writing to be published by Equinox.
One difference in the way I discuss these issues these days is that I don’t tend to lead with the term ‘metaphysics’, but rather with ‘absolutization’, which I then relate to metaphysics. If one can understand absolutization as the common element in experience that makes us get stuck, thinking we have the whole story, we can then use that to bridge the gap between philosophy and psychology. What I want to say about the subject may or may not cohere with the ways others, such as yourself, may wish to define ‘metaphysics’. I’m interpreting it in a particular way for particular purposes.
For my part, I don’t think I’m caught up in an argument about origins, as I’m concerned only with what we do with biases when we experience them. We absolutize either when we’re taken in by them, or if we react against them to assume that what we believe is wholly false because of them. The practical Middle Way that we need here is also one that avoids both determinism, with its assumption that we can do nothing to change the situation, and freewill, with its assumption that we can change everything instantaneously.
Just as a postscript to the above comment. I have now created an ‘About metaphysics’ page, which uses two diagrams to clarify the range of metaphysical claims that surprises Julian here, as well as explaining the process by which metaphysics affects our judgements.