Our guest today is David McRaney, an internationally bestselling author, journalist, and lecturer who created the You Are Not So Smart blog, books, and podcast. David, who lives in Mississippi, cut his teeth covering Hurricane Katrina on the Gulf Coast and across the Deep South. Since then, he has been a beat reporter, editor, photographer, voiceover artist and television host. Before that, he had a varied working life: waiting tables, working construction, selling leather coats, building and installing electrical control panels, and owning pet stores. He’s here to talk to us today about his latest book, ‘How to Beat Your Brain’, an attempt to help us overcome our quirks and make decisions more effectively.
MWS Podcast 76: David McRaney as audio only:
Download audio: MWS_Podcast_76_David_Mcraney
If you’d like to listen to the full unedited version of the talk in which we go into the Middle Way in a bit more depth and talk a bit more about David’s hopes for the book, you can do so here:
MWS Podcast 76: David McRaney full version as audio only:
Download audio: MWS_Podcast_76_David_Mcraney_full_version
Here’s also the link to David’s blog post on Brand Loyalty that we talked about.
A great podcast with lots of good material. I’ve also just been looking at two of David’s books, and can recommend them to anyone who wants a readable introduction to cognitive biases and the like.
I obviously do have some differences of emphasis from David. One is that he discusses cognitive biases and heuristics as phenomena that obviously have both positive and negative features, whereas I am more interested in drawing on this work to differentiate the positive from the negative features more clearly and broadly. I tend to do this by distinguishing between the aspects of cognitive biases that are unavoidable features of our embodied situation, and those for which we can take some responsibility because we are capable of changing them. Absolutisation seems to be the distinguishing feature of those aspects of cognitive biases we can do something about.
I also think it’s necessary to grasp the nettle where absolutist or metaphysical beliefs are concerned. There really is no clear difference between absolutisation as it is found in cognitive biases and fallacies and absolutisation as it is found in philosophical and religious traditions – and the practical effects are indistinguishable – so I can see no justification for treating them differently. But that does not mean an anti-religious stance, given that negative metaphysical beliefs are just as absolute as positive ones (so atheism is just as dogmatic as theism) and many other aspects (for example, archetypes) of religion can be identified and worked with aside from absolute beliefs.
I’ve just listened to this podcast, and I’m also halfway through reading David’s book (the one called ‘You Can Beat Your Brain’ – something awful about that UK title). I thought the main question was ‘should you do anything about these biases?’, with the answer yes, because they can have such a negative impact on ourselves and others, sometimes many others. In thinking about ‘can you do anything about these biases?’, I’ve realised, from my own experience, that one of the reasons we have a formal risk assessment procedure (say, for planning school trips, where the children’s wellbeing is in my hands) is to protect the children from my (mistaken) belief that I will always act in a rational fashion. Perhaps we should explicitly give this as a reason why we have to follow formal risk assessment procedures (as people who have to do this sort of thing love to complain about ‘health and safety gone mad’ etc.). The ‘sunk cost’ fallacy seems particularly dangerous for things like school trips, especially those that involve ‘hazardous pursuits’ – the trip leader can feel that they’ve spent so much time and effort planning a particular activity that they’ll carry on with it even though circumstances suggest it’s not a good idea (pressing on up the mountain in horrendous weather).
Hi Jim,
I do agree that forward planning in general can be one way of working round likely biases. However, bureaucratised forward planning doesn’t strike me as a particularly effective way to do it, because bureaucracy tries to standardise everything, undermines trust, and may also undermine our sense of responsibility. I have worked in a college in the past where I had to do risk assessments for trips, and I usually did think they were ridiculous. Perhaps there are some trips with appreciable or unusual risks for which prior planning would be useful and, as you say, would help us avoid the assumption that we’ll always respond adequately. However, I can’t think of any actual trip (given the pretty safe trips I did, like taking students to a Buddhist monastery) where I actually did anything differently because of the risk assessment. It’s the one-size-fits-all approach, the lack of discretion and the lack of trust that seemed to me so undermining about the risk assessment process.
You’ve hit the nail on the head there about the bureaucracy issues. If the burden of formalising what you would have done anyway is too great, then trips don’t even get started, to the detriment of the children who would have gone on them.