In a world where information spreads instantly, it's important to be able to recognize and counter disinformation. There are two main approaches to this problem: responding to already spread false information and preventing its impact.
The first approach is called "debunking." This is the work of fact-checkers: refuting false claims and providing reliable information in their place. This method is effective against disinformation that has already spread.
The second approach works proactively. It's called "prebunking." Prebunking aims to prepare people for possible manipulation attempts before they encounter them. This approach involves creating messages that help people learn to detect and resist manipulative content before it spreads. These messages can take various forms, from large-scale information campaigns to short videos.
The essence of prebunking is to warn the audience in advance about possible manipulation attempts and equip them with tools to recognize deceptive arguments. This way, people become more resistant to disinformation in the future.
In practice, this might look like showing the audience examples of typical manipulative techniques along with their explanation and refutation. This helps people develop critical thinking and skepticism towards dubious information.
The main conditions for an effective prebunking message:
- timing: the message must reach the audience before the disinformation does;
- a credible messenger: whoever works to prevent the spread of misinformation needs sufficient authority and the ability to communicate with the people at risk;
- a clear messaging formula: a warning that the audience may be manipulated, a "microdose" demonstrating the manipulation, and a refutation explaining why it is false or manipulative.
Every state has its own set of topics that may be exploited by those who spread disinformation. The spread of disinformation can, however, be tracked, and preparations can be made to counter these threats. This is done by monitoring the information space and surveying local experts. On that basis, you can decide which disinformation topics or methods should be countered, and then start acting.
As part of the prebunking initiative launched by Google, Jigsaw, Moonshot and Ukrainian organizations that fight disinformation and spread media literacy, a series of videos was created. Based on previously published research on prebunking as well as contextual research conducted across Ukraine, these videos are aimed at building resistance to manipulative messages.
Moonshot identified current and emerging disinformation topics in Ukraine. The company's research included interviews with experts and further contextual validation through cross-platform social media case studies. As a result, three key manipulation techniques prevalent in online misinformation in Ukraine were identified:
Emotional Manipulation
This technique leverages a range of emotions such as sadness, hopelessness, and exhaustion. In the Ukrainian context, it has been used to exploit the psychological exhaustion resulting from the conflict, with a particular focus on fear and anger to amplify existing frustration over the realities of the war.
Astroturfing
This entails fabricating the appearance of grassroots movements through coordinated, inauthentic behavior. In the Ukrainian context, this was typically employed for targeted harassment, or to artificially suppress or amplify specific content/narratives.
Decontextualization
The practice of taking information out of its original context to create a false or misleading narrative.
Other Popular Manipulation Tactics
False Dilemma (False Dichotomy)
Those who use this manipulation tactic present a limited set of options to choose from, although in reality there may be other views and other ways of solving the problem. For example, the phrase "You are either with us or against us" excludes the possibility of taking a neutral position or holding a view that does not fit the predetermined framework.
Cherry Picking
This tactic consists of choosing only the evidence that supports a certain thesis, while ignoring or suppressing the evidence that contradicts it. For example, a report on a study's results may present only the data points that confirm a positive effect, while data about negative effects are kept quiet.
Fake Experts
These are people who pretend to be experts in a certain field despite lacking the necessary knowledge or experience. They can be used to lend support to dubious ideas or to undermine the credibility of real experts. For example, an unqualified "expert" may recommend miracle cures that have no scientific basis.
Scapegoating
This tactic places the blame for a problem on a particular person or group who cannot in fact be its sole cause. For example, without any analysis of state policy or the economy, a segment of society or a minority may be blamed for a country's economic, social, or demographic troubles.
Polarization
This tactic divides people into two opposing groups, highlighting their differences and stoking enmity between them. For example, political debates can be deliberately inflamed so that people start hating each other over their beliefs and forget about the issues that unite them.
Ad Hominem Attack
This tactic consists of attacking the opponent instead of engaging with what he or she is saying, for example by criticizing a person's character or appearance. In this way, manipulators try to discredit the message by pointing to supposed flaws in the messenger.
Red Herring
This manipulation tactic consists of diverting attention from the real problem by focusing on another, less important topic. For example, when corruption is being discussed, someone may claim there is no way to solve problems without resorting to it, and thereby steer the conversation in a different direction.
Impersonation
This manipulation tactic consists of pretending to be another person online in order to deceive the audience. For example, a fake social media page may be created in the name of a politician, celebrity, or expert to spread misinformation.
Slippery Slope
This tactic claims that a proposed action will inevitably lead to negative consequences: allow one small change, and catastrophe will follow. In reality, the future is unknown, there may be no connection between the first step and the disaster, and on the way from point A to point B the decision can still be revised depending on the situation.
More information on manipulation techniques is available at prebunking.withgoogle.com