We all have biases that influence how we see the world and make decisions. They can lead us to prejudge people and situations and to make poor decisions.
The following are examples of cognitive (unconscious) biases:
- Ingroup bias: the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
- Hyperbolic discounting: the tendency to prefer immediate benefits over longer-term ones.
- Horns effect or halo effect: the tendency for our impression of a person or thing in one domain to influence our impression of them in other domains (negatively for the horns effect, positively for the halo effect).
- Ostrich effect: people avoid information that they perceive as potentially unpleasant.
- Confirmation bias: the tendency to search for, favour, interpret and recall information in a way that confirms our pre-existing beliefs.
- Pessimism bias: the tendency to overestimate the likelihood that bad things will happen. This bias can lead us to assume we will do badly in an exam, even when we have prepared for it and are actually likely to do well. The opposite is optimism bias.
- Bandwagon effect: the tendency to think or act in a certain way because we believe others are doing the same. This is related to the herd instinct: the tendency to adopt the opinions and follow the behaviours of the majority in order to feel safer and avoid conflict.
- Hindsight bias: people overestimate how predictable a past event was, once they already know its outcome.
- Rosy retrospection: the tendency to remember past events as being more positive than they actually were.
Awareness of unconscious bias can help in decision making and in tackling inequalities, but behavioural scientists and psychologists such as Daniel Kahneman (author of Thinking, Fast and Slow) acknowledge that merely knowing we can make biased decisions does not necessarily improve our decision-making.
It’s more effective to ‘debias the system’ rather than the person.
When we talk about reducing inequalities / advancing equality and eliminating discrimination, what do we really mean?
We’re talking about creating a system that does not unfairly or unlawfully discriminate against people because of their ethnicity, disability, gender, etc. – a system that isn’t biased.
To ‘debias the system’:
- Remove personally identifiable information (or ‘markers’) from decision-making processes, such as names, titles, date of birth or years at school, and sometimes where someone lives. These details can indirectly reveal characteristics that may lead to bias (the horns/halo effect, for example): a name could reveal someone’s gender or ethnicity. In some cases, however, they may be relevant and necessary to know – for example, where the eligibility criteria for a service include disability or an age limit.
- Check that needs based upon age, disability, culture, ethnicity, gender identity, family/relationships and religion or belief are considered fully, and make adjustments where necessary. For example, an ICT system will need to be checked to ensure it can be used by neurodivergent people (such as autistic and dyslexic people) and by people with sensory and physical disabilities. The Diversity Guide can assist, as can consultation with communities, staff and/or service users.
- Look at diversity monitoring data for patterns – is there over or under-representation in the system by particular groups (‘disparate impact’)? At what point/s in the process do different outcomes occur? What policy or practice could be causing this difference? What behaviours (such as customer service) could be leading to different results?
- Check for jargon and make sure information is in plain English and easy to understand. If a person is not already familiar with the system, jargon can be a barrier to getting involved. In recruitment, jargon reduces your ability to recruit from a wider pool of talented people who may be able to offer new and interesting perspectives.
- Check for visibility of diversity, including minority groups. For example, is there a narrative and data around Black, Asian and other ethnically diverse people? Who is ‘around the table’, and whose issues could be missed because they are not in the ‘ingroup’?
- ‘Walk through the process’ from the perspective of someone new. Don’t dismiss things ‘because of small numbers’ – use qualitative information instead.
- Check for stereotyping – for example, women only or always represented in caring and parenting roles, or disabled people portrayed as vulnerable and needing help.
- Ensure that decisions are based upon facts, not assumptions and emotions. For example, selecting a candidate for a job because of their experience, skills and qualifications, not ‘because we like them’ or we ‘think they’ll fit in’.
- Test systems and processes with a diverse group of people.
- Challenge how things are framed, as this may lead to a different decision and outcome. For example, ‘80% fat free’ sounds better than ‘20% fat’.
- Think more than once about things; consider a range of options from different perspectives. Carry out a ‘premortem’ – imagine potential failures and try to explain the likely cause.
- Don’t make decisions quickly if possible – ‘sleep on it’.
- Check you have the right measurements in place – will they give you the answers you are looking for? How well aligned are they to what you are trying to achieve? Are your objectives SMART enough?
- Be prepared to change the system!
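The diversity-monitoring check above can be sketched in code. A minimal illustration, assuming a simple recruitment-style process – the groups, stage names and records below are entirely hypothetical – that compares, group by group, what share of applicants reach each stage:

```python
from collections import Counter

# Hypothetical monitoring data: one record per applicant, giving their
# monitoring group and the furthest stage of the process they reached.
records = [
    ("Group A", "appointed"),   ("Group A", "shortlisted"),
    ("Group A", "applied"),     ("Group A", "shortlisted"),
    ("Group B", "applied"),     ("Group B", "applied"),
    ("Group B", "shortlisted"), ("Group B", "applied"),
]

STAGES = ["applied", "shortlisted", "appointed"]  # ordered process stages

def progression_rates(records):
    """For each group, the share of its applicants reaching each stage or beyond."""
    rank = {stage: i for i, stage in enumerate(STAGES)}
    totals = Counter(group for group, _ in records)
    return {
        group: {
            stage: sum(1 for g, s in records
                       if g == group and rank[s] >= rank[stage]) / totals[group]
            for stage in STAGES
        }
        for group in totals
    }

rates = progression_rates(records)
# Here Group A's shortlisting rate is 0.75 and Group B's is 0.25.
```

Reading the rates side by side shows the point in the process where outcomes diverge – in this made-up data, at shortlisting – which is where to look for a policy, practice or behaviour that could be causing the difference.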
The system can include processes, projects, policies and applications such as an ICT system.
Remember that the key points of the definition of Institutional Discrimination (taken from the original definition of Institutional Racism in the Macpherson report) are that it is often ‘unwitting’ and born of ‘ignorance’. This means people do not necessarily cause discrimination and inequalities on purpose; they may simply have overlooked something because they did not have that ‘world view’ of things.