4/13/2022

Psychology for Communication Strategy (3/4): Connecting the dots

In the first and second episodes we examined 5 principles of cognitive psychology we can use to select the right information and analyze a situation thoroughly.

Now we’ll explore 5 more concepts that can help us draw sound conclusions and make reasonable predictions.

Illusory correlation


The illusory correlation leads us to identify connections between people, events and actions that don’t actually exist.

It especially happens when these connections seem unusual, odd and therefore more conspicuous.

This kind of bias underpins many superstitions - for example, a gesture accidentally associated with a lucky event that occurred at the same time becomes a ritual.

Furthermore, it is both a cause and an effect of stereotypes and generalizations - for example, a negative prejudice against a social group leads us to blame its members for malicious behavior on the basis of a simple coincidence, without proven responsibility.

It often happens in our work, particularly when we must evaluate the results of a communication plan: we risk crediting ourselves with merits we don’t have and attributing to our performance great results that are actually due to external factors.
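A practical antidote is to tabulate all four possible combinations of supposed cause and effect, not just the memorable one. Here is a minimal sketch, with entirely made-up counts for a hypothetical campaign, comparing how often sales rise with and without a campaign:

```python
# Made-up counts for a hypothetical campaign: all four combinations.
# An illusory correlation forms when we remember only the salient
# "campaign + sales up" cell and ignore the other three.
counts = {
    ("campaign", "sales_up"): 12,
    ("campaign", "sales_flat"): 8,
    ("no_campaign", "sales_up"): 30,
    ("no_campaign", "sales_flat"): 20,
}

def p_sales_up(condition):
    """Share of periods with rising sales, given campaign or no campaign."""
    up = counts[(condition, "sales_up")]
    flat = counts[(condition, "sales_flat")]
    return up / (up + flat)

# Identical conditional frequencies: no real correlation.
print(p_sales_up("campaign"))     # 0.6
print(p_sales_up("no_campaign"))  # 0.6
```

If the two conditional frequencies match, the apparent link is illusory: the twelve “campaign + sales up” cases are simply the ones we happen to remember.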

Source: Chapman, L. J. (1967). "Illusory correlation in observational report". Journal of Verbal Learning and Verbal Behavior. 6 (1): 151–155

Illusion of validity


The illusion of validity makes us overestimate the accuracy of our predictions after evaluating a set of data.

It happens particularly when a coherent pattern seems to emerge from such a set of data, and we interpret that pattern in a contrived way so as to impose order on the chaos and unpredictability of our lives.

We often go astray because of the base rate fallacy (when individual cases capture our attention more than the overall data referring to the entire population) and the representativeness heuristic (when we consider an event more likely if it matches our stereotypes).
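A classic arithmetic illustration of the base rate fallacy, with invented numbers: suppose a tool that predicts “viral” posts is right 90% of the time, but only 1% of posts ever go viral. Bayes’ rule shows how little a positive prediction is actually worth:

```python
# Invented numbers for a hypothetical "viral post" predictor.
base_rate = 0.01       # P(viral): only 1% of posts go viral
true_positive = 0.90   # P(flagged | viral)
false_positive = 0.10  # P(flagged | not viral)

# Bayes' rule: P(viral | flagged)
p_flagged = true_positive * base_rate + false_positive * (1 - base_rate)
p_viral_given_flagged = true_positive * base_rate / p_flagged

# The vivid individual case (a confident flag) matters less than the
# base rate: fewer than one in ten flagged posts is actually viral.
print(round(p_viral_given_flagged, 3))  # 0.083
```

The impressive “90% accuracy” misleads us precisely because non-viral posts vastly outnumber viral ones.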

Sometimes, when we work on an analysis or a communication strategy, we are misled by individual situations and projects which stand out clearly.

We may forget that individual cases don’t make statistics; or predict a certain outcome only because it fits our wishes or preconceptions.

Source: Kahneman, Daniel; Tversky, Amos (1973). "On the Psychology of Prediction". Psychological Review. 80 (4): 237–251

Neglect of probability


The neglect of probability makes us ignore the probability of an event when making decisions under uncertain conditions.

This happens especially when the event is perceived as threatening and involves us emotionally, feeding exaggerated and unreasonable fears.

Many of our emotional responses depend on the dreaded impact of an event, regardless of its actual degree of probability: therefore we often overestimate frightening but statistically limited risks.

For example, with tragic events such as plane crashes, shark attacks or terrorist acts, our fears are determined by their devastating potential rather than their actual likelihood (which is sometimes very low).
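A rational comparison weighs each risk by its expected impact (probability times damage), not by the dreaded damage alone. A toy example with invented figures:

```python
# Invented figures: a feared but rare risk vs. a mundane but likely one.
risks = {
    "PR crisis from a bold campaign": {"probability": 0.001, "damage": 500_000},
    "slow decline from inaction": {"probability": 0.40, "damage": 50_000},
}

# Expected loss = probability x damage: the rational yardstick,
# instead of judging by the frightening damage figure alone.
expected_losses = {name: r["probability"] * r["damage"] for name, r in risks.items()}

for name, loss in expected_losses.items():
    print(f"{name}: expected loss = {loss:,.0f}")
```

The spectacular risk has the larger damage, but by far the smaller expected loss; fear alone would rank them the other way around.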

Likewise, when we plan a communication strategy, emotionally overestimating the risks can lead us to shy away from profitable activities that carry an acceptable margin of danger.

Source: Sunstein, Cass R. (2002). "Probability neglect: Emotions, worst cases, and law". The Yale Law Journal. 112 (1): 61–107

Status quo bias


The status quo bias makes us prefer the current situation, which we take as a natural point of reference.

It stems in part from loss aversion: when the status quo could change, the possible losses are perceived more intensely than the possible benefits.

There are also valid reasons for such a conservative attitude: when we don’t have enough information, the most sensible solution may be to follow already known paths.

In addition, change requires material costs and cognitive effort that are sometimes not worth it.

In our work, however, that inclination can prove fatal: in an ever-changing world, simply preserving the current situation usually means dropping out of the race.

Obviously we should also avoid the opposite extreme: changing just for the sake of shuffling the cards.

Source: Kahneman, D.; Knetsch, J. L.; Thaler, R. H. (1991). "Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias". Journal of Economic Perspectives. 5 (1): 193–206

Escalation of commitment


The escalation of commitment has to do with the inability of individuals and groups to admit their mistakes.

It occurs when we persist in an increasingly failing course of action, in order to justify a previous investment of energy, time and money and to avoid disowning the decisions we have already made.

The effect is paradoxical: the more defeats and setbacks we face, the more additional resources we invest to achieve our goals.

All this can trigger a vicious circle that many times, even in recent history, has led to long-lasting wars and to senseless projects that waste public and private resources.

Our work is not immune to it.

For example, sometimes we continue a campaign with poor results, so as not to admit mistakes with our clients; or persist in an incorrect market positioning in order not to delegitimize the previous decisions underlying it.

Source: Staw, Barry M. (1997). "The escalation of commitment: An update and appraisal". In Shapira, Zur (ed.). Organizational Decision Making. New York, NY: Cambridge University Press. pp. 191–215

Final takeaways


Illusory correlation: Don’t take the relationship among events for granted
Illusion of validity: Don’t overestimate your ability to analyze data patterns
Neglect of probability: Evaluate the likelihood of an event rationally, not emotionally
Status quo bias: Consider valid alternatives to the current situation
Escalation of commitment: Critically review your past decisions
