Understanding and identifying cognitive biases
Our brain is a genuine decision-making machine, handling everything from the simplest choices to the most complex, in particular through the prefrontal cortex at the very front of the brain. According to Kahneman, two systems of thought are involved in decision-making: System 1, which is fast, automatic and intuitive, and System 2, which is slower, deliberate and effortful.
Nowadays, a third system is often cited: the inhibition system, which makes it possible to interrupt the automatic System 1 and activate System 2 in order to reach reasoned decisions.
Daniel Kahneman, winner of the Nobel Prize in Economics in 2002, began working on this subject in the 1970s with Amos Tversky, trying to understand how irrational decisions are made in the field of economics.
Since then, the results of their work have spread to many fields, in particular the study of cognitive biases, which makes it possible to understand and predict people's behavior, knowledge that is sometimes exploited against users' interests (cf. Dark Patterns).
A cognitive bias is a kind of "mental shortcut": an automatic thought pattern based on incomplete, non-rational, or distorted information.
In some cases, cognitive biases (or heuristics) help us. Linked to System 1, they allow us to make judgments or decisions quickly and with little effort for our brain.
They are particularly involved in four situations:
when there is too much information: we use these shortcuts to filter it en masse;
when there is not enough meaning: we fill in the gaps to make sense of incomplete information;
when we need to act fast: we jump to conclusions in order to decide quickly;
when we cannot remember everything: we keep only what seems most useful.
Anchoring bias is the tendency to rely heavily on the first piece of information we are given when making a decision.
For example, if a cheap item is placed next to a more expensive item, the cheap item seems more affordable. Another example was reported in a recent study: participants were shown increasingly inappropriate selfies, and the more they had been exposed to the inappropriate images, the more likely they were to disclose personal information. The images had anchored their perception of what was or was not appropriate to disclose.
The status quo bias, also called the "default effect", describes our preference for leaving things as they are rather than changing them. It is the bias that causes us to keep the default settings rather than change them.
For example, it has been shown that the "default" option is more often chosen when it is identified as such than when it is not.
This bias is related to our fear of losing something by changing the situation. Sticking with the status quo avoids the risk of loss, and so we prefer inaction.
However, pre-selected options are not always the least risky, for example in privacy forms where the default choices undermine our protection.
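To make this concrete, here is a minimal sketch, purely illustrative and not taken from the article, of the same sign-up form configured with dark-pattern defaults versus privacy-protective defaults; the status quo bias predicts that most users will keep whichever values are pre-selected:

```python
# Illustrative only: the option names are hypothetical, not from any real service.

# Dark-pattern defaults: inaction leads to disclosure.
DARK_PATTERN_DEFAULTS = {
    "share_data_with_partners": True,   # pre-checked
    "personalised_ads": True,           # pre-checked
    "public_profile": True,             # pre-checked
}

# Privacy-by-default: inaction protects the user, who must opt in explicitly.
PRIVACY_FRIENDLY_DEFAULTS = {
    "share_data_with_partners": False,
    "personalised_ads": False,
    "public_profile": False,
}
```

In both cases the user can change every setting; the default effect simply means that, in practice, few of them will.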
Framing bias occurs when information is presented selectively, for example by emphasizing the positive consequences of an action while omitting the risks involved.
The same option may be chosen more or less often depending on which characteristics are emphasized. For example, a facial recognition service will be accepted more readily when its benefits are highlighted and its privacy implications are omitted. Decisions are therefore influenced by the way information is presented.
Present bias is the tendency to overestimate the immediate consequences of a decision while underestimating its long-term consequences.
For example, we choose an immediate reward over a larger reward that could be received in the future: long-term outcomes feel uncertain, and therefore risky, compared with immediate ones. This difficulty in valuing the future is called "time myopia". In one study, people preferred to buy a cheaper movie ticket, even if it meant disclosing more personal data, rather than pay for a more expensive one; their choice was different when the two tickets were offered at the same price.
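As an illustration, time myopia is often described in the behavioral-economics literature with a hyperbolic discount function V = A / (1 + kD), where A is the reward, D the delay and k an individual discount rate; this model and the value of k below are assumptions for the sketch, not figures from the study mentioned above:

```python
# Hyperbolic discounting: the subjective value of a reward falls quickly with delay.
# The discount rate k is arbitrary, chosen only to illustrate the effect.

def discounted_value(amount: float, delay_days: float, k: float = 0.05) -> float:
    """Subjective present value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

print(discounted_value(50, 0))    # 50.0 -> an immediate, smaller reward
print(discounted_value(100, 60))  # 25.0 -> a larger reward in two months "feels" smaller
```

With this (arbitrary) discount rate, the immediate 50 subjectively outweighs the delayed 100, which is exactly the pattern described above.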
Optimism bias is the underestimation of the likelihood that something bad will happen to us compared to others.
We tend to overestimate our chances of positive experiences and to underestimate our chances of negative ones, compared to other people.
In the context of personal data, a user will thus believe that he or she is less likely than others to be exposed to a privacy breach, which influences his or her sharing decisions.
Choice overload occurs when a large number of options is offered to us and we feel overwhelmed. The greater the number of options, the slower and less satisfying the decision will be, and the more we will regret it later.
This is what happens, for example, when users face a large number of options for configuring their privacy settings: they become overwhelmed and their judgment is affected.
The bandwagon effect is the tendency to adopt beliefs or behaviors because other people have adopted them; in short, doing something because others do it.
This can help us in some cases, as it spares us the long and costly evaluation process that leads to a decision. The bias also draws on our fear of being excluded from a group and our need to belong.
When other people behave carelessly, for example about privacy on social networks, those behaviors tend to be reproduced and risky choices are encouraged.
Salience bias is the tendency to choose the most salient option. The visual presentation of elements affects our attentional processes and, in turn, our decisions: we tend to focus on elements or information that stand out more than others.
In practice, one choice may be made more salient than another and bias our decision-making. Conversely, two options may be presented as equivalent when they are in fact different. This is the case, for example, on online newspapers where sponsored articles are presented visually in the same way as editorial articles, which can mislead the reader's judgment.
Loss aversion is a central feature of human decision-making: it is the fact of preferring to avoid losses rather than to acquire equivalent gains.
The pain of a loss is felt more intensely than the pleasure of an equivalent gain; for moderate stakes, we are roughly twice as sensitive to a loss as to a gain of the same value. This carries certain risks for users. For example, in the context of "free trials", loss aversion makes us more likely to agree to pay in order to avoid losing access to a service or product.
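The "twice as sensitive" figure can be made concrete with the value function from Kahneman and Tversky's prospect theory; the sketch below uses a loss-aversion coefficient of 2 and commonly cited curvature parameters, all of which are illustrative assumptions rather than figures from this article:

```python
# Prospect-theory value function (simplified sketch).
# lambda_ = 2.0 encodes "a loss hurts about twice as much as an equal gain pleases";
# alpha and beta model diminishing sensitivity to larger amounts (values are illustrative).

def subjective_value(x: float, alpha: float = 0.88, beta: float = 0.88,
                     lambda_: float = 2.0) -> float:
    if x >= 0:
        return x ** alpha                 # gains
    return -lambda_ * ((-x) ** beta)      # losses loom larger

print(round(subjective_value(100), 1))    # ~57.5  (pleasure of gaining 100)
print(round(subjective_value(-100), 1))   # ~-115  (pain of losing 100: roughly twice as intense)
```

This asymmetry is what "free trial" schemes lean on: once we have access, giving it up registers as a loss rather than as a forgone gain.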
Functional fixedness consists of being fixated on the way an object is normally used.
It prevents us from considering that an object or a symbol could be used in another way, and it can work against us when an object or symbol is misused.
For example, when a padlock symbol is displayed next to an option, users tend to assume that this option is the safest. They then set aside any rational evaluation or critical thinking, and their decisions are affected.
Self-consistency bias, also known as restraint bias, is the tendency to overestimate one's ability to control impulsive behaviors.
We think we are capable of resisting a behavior, but we fail once we are exposed to it. Our emotions and affective states strongly influence our judgment and decision-making, yet we are generally unable to predict our future affective states. Restraint bias can lead to significant risks (e.g. in the field of addiction). It is also at play when we try to resist an action but give in after repeated exposure to pressure (e.g. unwanted pop-ups).
The main risk is that these biases are systematic, and therefore predictable and exploitable. It can thus be tempting for companies to play on our cognitive biases to influence our purchasing decisions or our private lives, rather than to convince us of the quality of their products or services. This gives rise in particular to dark patterns, or "deceptive patterns": interfaces that deceive or manipulate users. Our R&D Lab took on the problem in 2021, and we developed a platform to fight against Dark Patterns.