
Car Park Ethics (and Lessons for Behaviour, Blind Spots and Conduct Risk)

  • davidjamesgrosse
  • Mar 1, 2022
  • 6 min read

In the BC era (Before Covid) my fun daily commute involved a drive to the local station, followed by a train and tube journey into London, during which I would dial in to pay the parking charges and register my car.


Occasionally on the return train journey I would realise I had forgotten to pay and approach the car park with sweaty palms and trepidation. Then, with a sense of relief, I would see that no penalty notice was stuck to my windscreen and drive home, conveniently overlooking that I had not paid the parking fee.


Was that behaviour acceptable? What would you have done? Would it make any difference if you needed to pay the £8 in cash to an attendant at the exit?

 

The chances are that most of you wouldn’t pay after the event if the car park was unattended, and even among those of you who genuinely think you would pay, in reality many would not. The human mind has an unerring ability to rationalize our actions and to create a gap between intended and actual ethical behaviour[i].


You may already be undertaking a moral justification on my behalf (it wasn’t deliberate, he was busy, it’s a faceless corporation, it would be complicated to pay after the event anyway, nobody will thank him).

 

How about this scenario: “Your firm has a policy of paying for a taxi home after 9.00 pm. You finish an important assignment at 8.30 pm and are preparing to leave when you notice it is raining. Would you find some other things to do until 9.00 pm, and then take a taxi whose costs you later reclaim?”

 

Whilst researching at the London School of Economics (LSE) for my MSc I put this question (together with more complex and technical ethical scenarios) to a sample of people employed in financial services. The average response was that people would wait and take the taxi “more often than not”.

 

Of course, it may be that you genuinely are the exemplar who wouldn’t dream of taking advantage, and that you consider yourself virtuous. In fact, most people reading this article think that they are more ethical than the average person. Statistics are not my strong point but that seems unlikely.

 

In an often-quoted example, most people think that they are better behind the steering wheel than the average driver[ii], and in the case of ethics this pattern is even more evident. A positive self-image is important, and who wants to admit to themselves that they are less ethical than average? A survey of those serving time in prison found they rated themselves higher than the average person on a range of characteristics, including morality and kindness[iii].

 

In my study, after the respondents had completed a range of ethical scenarios, I asked them: “Taking into account all the scenarios that you have answered, how do you think other bankers would behave compared to you in the likelihood of them taking advantage of opportunities that were presented?” Only 6% believed that other bankers would exploit the scenarios less than themselves (with 58% saying more, and 36% the same).


This is known as the Better-Than-Average Effect (BTAE), where people evaluate themselves more favourably than their peers[iv]. However, if you see yourself as an ethical person then how does your mind handle any evidence to the contrary? Are you busy re-framing and rationalizing the uncomfortable truth?

 

Does any of this matter, and is there a read across into conduct risk? I would answer with a resounding yes. If we spend too much time thinking about misconduct as solely the result of egregious actions from a few naughty outliers, then we are missing the important context that all of us are capable of taking inappropriate actions and then rationalizing them as being ethical[v]. Many conduct issues arise from decent people doing unwise things[vi].

 

So what actions can we take to help address this? The first is awareness of these behavioural biases and recognizing that we are all prone to them. A range of factors is also likely to influence how we respond to a given situation and to increase the likelihood that we take a less ethical option[vii]. These may include being tired and emotional; a sense that things are not fair; loyalty to others; or the slippery slope (where a small transgression leads to bigger issues). Therefore, be mindful of these exacerbating factors when making decisions, and if you are a manager be sure not to cultivate an environment where these features are prevalent.

 

Another common human predisposition is to over-emphasise personality and internal explanations for the behaviour of others, whilst under-emphasising situational and external factors. Unsurprisingly, when considering your own conduct you take a different slant, as you are incentivised to consider the wider context. Psychologists call this the “fundamental attribution error”. This is not to say that character and individual responsibility are unimportant, but rather that a more balanced understanding of the drivers of decisions and actions is often needed. As a company, manager or colleague you need to ask yourself, “am I creating an environment that is making it harder for people to take ethical decisions?”. Spend less time looking for the bad apple and more time creating a good barrel.

 

Given these factors, we need to think carefully about those circumstances in an organization where we may have blind spots and where people are more prone to exploit a situation. These will be the grey areas and conflicts where we need to balance the interests of ourselves, teams, clients, the bank and other parties, and where temptation meets opportunity. They may be more subtle or technical than unattended car park exits or claiming taxi expenses. In retrospect the risks around LIBOR or Payment Protection Insurance seem obvious, but were they before the event? Were biases and blind spots preventing their timely identification? Importantly, what are the new and developing conflict scenarios and blind spots latent within the financial system (or indeed any business or organization) today?

 

Research into the psychological aspects of conflicts of interest shows that they are also subject to self-serving interpretations of fairness[viii]. This applies to politicians, prime ministers, bankers and indeed behavioural scientists[ix]. Test yourself to identify all the conflicts that are specific to your company, role and business and to understand how they are (or are not) recognized and managed.

 

This article is intended to test us all to look at conduct through a different lens. In financial services a lot of time is spent looking at aspirational target outcomes and convincing ourselves that we treat clients fairly and act to maintain market integrity. However, we should challenge ourselves to look upstream into the human biases and environmental drivers that shape actions (the “why” of conduct risk) and the situations and scenarios where those behaviours are most likely to manifest themselves as a problem (the “where” of conduct risk).

 

A final question: “Whilst tidying your house you come across some old confidential papers from your previous employer. On reading them you realise that the contents are exactly what you need to help accelerate the development of a new product your team are working on. Would you use the details of these papers rather than return or destroy them?”[x] If you said no, congratulations you are a virtuous person. But what about if you are tired, under pressure from your manager to perform, your prior work has been unfairly overlooked, the internal processes are getting in the way, you want to help your team, or you have a sneaking suspicion that others are taking shortcuts?

 

Postscript - This article is intended as an introduction to important elements of human behaviour, ethical decision-making and environmental context, and their impact on conduct risk. In future articles I will dive more deeply into the themes raised, give further worked examples and proffer some suggestions for action. All comments and feedback are gratefully received.

 

 

References:


[i] Bazerman, M. H. and Tenbrunsel, A. E. (2011). “Blind Spots: Why We Fail to Do What’s Right and What to Do About It”.

 

[ii] Delhomme, P. (1991). “Comparing one's driving with others': assessment of abilities and frequency of offences. Evidence for a superior conformity of self-bias?”. Accident Analysis & Prevention, 23(6), 493-508.

 

[iii] Sedikides, C. et al. (2014) ‘Behind bars but above the bar: Prisoners consider themselves more prosocial than non-prisoners’, British Journal of Social Psychology.

 

[iv] Guenther, C. L., & Alicke, M. D. (2010). “Deconstructing the Better-Than-Average Effect”. Journal of Personality and Social Psychology, 99(5), 755–770.

 

[v] Many of the concepts raised within this article link to “bounded ethicality”, which relates to the systematic and predictable ways in which people make decisions without realising the (ethical) implications of their behaviour. For more details see https://www.ethicalsystems.org/bounded-ethicality/

 

[vi] Ariely, D. and Jones, S. (2012). “The (Honest) Truth About Dishonesty”. See also endnote [ix].

 

[vii] “The only way is ethics” | Mind Gym UK. Available at: https://uk.themindgym.com/resources/the-only-way-is-ethics/

 

[viii] Chugh, D., Bazerman, M. H. and Banaji, M. R. (2005) ‘Bounded ethicality as a psychological barrier to recognizing conflicts of interest’.

 

[ix] In endnote [vi] above I refer to the work of the popular behavioural scientist Dan Ariely. However, a 2021 article indicates that some of the data underlying key research may not be robust. See https://www.economist.com/graphic-detail/2021/08/20/a-study-on-dishonesty-was-based-on-fraudulent-data

 

[x] The average response score from respondents to this question was “about half the time”. However, remember that people over-estimate their own ethicality, including within their responses to confidential surveys. See Krumpal, I. (2013). “Determinants of social desirability bias in sensitive surveys: a literature review”. Quality & Quantity, 47, 2025–2047.
