
Using open source to counter mis- and disinformation

What is mis- and disinformation? How is it undermining Afghanistan’s information environment, and how can we help tackle it?

Every day, we are confronted by an overwhelming amount of content: the 24/7 news cycle and social media platforms have made it easier than ever to access, consume and disseminate news and information, but it is also becoming increasingly difficult to distinguish truth from lies. 


The spread of false information is not a new phenomenon, but the debate around its causes and consequences has intensified in recent years. Misinformation and disinformation are two terms that have become ingrained in public consciousness and use, particularly in the context of the internet and social media. Both refer to the spread of false or misleading information – and the biggest difference between the two is intention.


Misinformation can be defined as false or misleading information shared without the intent to deceive, whilst disinformation is the deliberate spread of false or misleading information – the latter always aims to deceive people. 


Both pose a significant threat to the information environment (the space in which information is disseminated and consumed – see our last explainer on this here), with attempts to mislead becoming increasingly sophisticated.


The many forms of mis- and disinformation 


At Afghan Witness (AW), our researchers monitor social media on a daily basis to collect and, where possible, verify information on human rights incidents in Afghanistan. Videos and images claiming to show human rights violations are frequently shared, but it is not uncommon to see footage mislabelled or used out of its original context. 


The team frequently finds publications and social media accounts presenting old content as though it happened recently. An example is this video posted in September 2023 on X (formerly Twitter) showing a woman being stoned. The footage had been circulating on social media accompanied by captions implying the incident took place recently. AW found that the same video was covered by the news outlet Radio Free Europe/Radio Liberty (RFE/RL) in February 2020, and that it matches footage covered by RFE/RL and others back in October 2015, claiming to show a woman being stoned in Ghor province.


Screenshot of a post containing a still from the video (a recent X post sharing the footage of a woman being stoned, source: X), alongside two additional images showing the same scene as shared in 2020 and 2015 (sources: RFE/RL and The Guardian).
An example of old footage presented as though it is new.

We often see the same videos resurfacing and repeatedly shared out of context. One such example is footage shared on X (formerly Twitter), most recently in September 2023, claiming to show a woman being beaten after losing her husband in a crowd. The footage is actually from December 2022 and shows a male student being beaten following protests against the Taliban’s ban on women attending university.  


While disinformation takes many forms, one recent example identified by researchers involved image manipulation. In September 2023, AW identified a post on X (formerly Twitter) which claimed that a woman was released by the Taliban after being imprisoned by her brothers for 25 years. AW found that images purporting to show the woman in her youth were in fact photoshopped and originally taken from Instagram. 


Screenshot of a post containing images of a woman (an X post claiming the pictures show Ms. Nikbakht in her youth, alongside one showing her with older features), and two additional images showing the same photos – but different faces – found on unrelated Instagram and Facebook accounts.
An example of image manipulation.

These types of posts – all examples of mis- and disinformation – can generate significant engagement and are often widely shared, with social media users frequently sharing or interacting with the content without checking for accuracy or credibility. 


Some of the actors in the information environment, however, have a specific interest in pushing their own agendas by spreading disinformation. AW has seen such content shared from all sides of the political spectrum. 


The Afghanistan Liberation Movement (ALM) and Loy Paktia Freedom Front (RP01) claim to be the only resistance groups in Afghanistan operating drones. There is, however, little evidence to support this claim. In fact, AW recently found that the two groups have been using recycled satellite and stock images in posts claiming ownership and use of drones against the Taliban.


The intention behind this manufactured content is unclear: the groups may be deliberately obscuring their capabilities, or trying to appear more technologically advanced than they really are. What is clear is that misleading information is being spread intentionally.


Another common example of disinformation is the creation of false accounts posing as well-known figures, Taliban ministries or even established media outlets. 


AW’s research has identified several false news accounts set up to closely mimic the branding and style of popular Afghan media organisations – particularly those deemed critical of the Taliban – often attempting to attack or undermine resistance or anti-Taliban actors. 


In research carried out in February 2023, AW identified an X (formerly Twitter) account that mimicked the outlet Afghanistan International, with the fake account’s posts gaining over one million views. Unlike other false accounts AW had come across, this one was not overtly presented as a parody or fake, raising concerns that the majority of its content could be read as genuine news by a passing viewer.



Comparison of two screenshots taken from X/Twitter accounts. On the left: a post by the official Afghanistan International account @AfIntlBrk, showing an image with a headline and blue and red branding. On the right: a post by the parody account @AF_Inter5, which mimics the format and branding of the original account.
False accounts are set up to closely mimic the branding and style of popular outlets; on the left is an image taken from @AfIntlBrk, on the right an image posted by @AF_Inter5.

Similarly, in a separate case in July 2022, a fake Taliban education ministry account – now deleted – was able to deceive reputable news outlets with a false press release announcing the imminent reopening of girls’ schools in Afghanistan. The incident demonstrated a relatively advanced disinformation effort, which was achieved by setting up a clone account, running it for a period to establish legitimacy and then using it to release a false statement on an issue that was bound to attract international attention. 


These efforts to appear as genuine sources of information have the sole intent of spreading disinformation and undermining the Afghan information environment.


Open source as a debunking tool


AW uses open source techniques to debunk mis- and disinformation. Verifying claims, increasing access to reliable and accurate information, and providing detailed analysis of the situation in the country allow people to better distinguish between accurate and false news at a time when journalists in Afghanistan face numerous restrictions and barriers to their reporting. 


Several different techniques can be used: from a quick reverse image search (which checks whether a photograph has appeared online before, and therefore whether it is being used out of context) to geolocation and chronolocation (which can pinpoint where and when footage was taken). 
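As an illustration of the first of these checks, the sketch below compares two image files using a perceptual hash, which changes little under resizing, recompression or small crops – so a small difference between hashes suggests the same underlying footage. This is a minimal sketch, assuming the third-party Pillow and imagehash Python packages; the filenames are hypothetical placeholders, not real AW material.

```python
# Minimal sketch: flag possibly recycled imagery by comparing perceptual hashes.
# Assumes `pip install pillow imagehash`; the filenames below are placeholders.
from PIL import Image
import imagehash

def looks_recycled(path_a: str, path_b: str, threshold: int = 8) -> bool:
    """Return True if two images are near-duplicates (small Hamming distance)."""
    hash_a = imagehash.phash(Image.open(path_a))
    hash_b = imagehash.phash(Image.open(path_b))
    return (hash_a - hash_b) <= threshold  # subtraction gives the Hamming distance

if __name__ == "__main__":
    # Compare a frame from a newly shared video with a frame from older coverage.
    if looks_recycled("claimed_new_frame.jpg", "archive_frame.jpg"):
        print("Likely the same footage - treat the 'new' claim with caution.")
    else:
        print("No near-duplicate found by this check alone.")
```

A match like this is only a starting point: investigators would still trace the earliest known appearance of the footage before drawing any conclusions.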


As mentioned before, some cases of mis- and disinformation use real footage that has not been manipulated but has instead been taken out of context to present a false narrative. While not always intentional, these posts can generate thousands of views or shares.


An example of this is the below social media post, shared in March 2022, claiming children were being sold at a market in Afghanistan. The issue is complex, not only because it is an emotive and powerful topic, but also because, according to news reports, the selling of children does take place in Afghanistan, making it an ideal subject for misinformation.


Screenshot of a post containing a blurred image of a baby wrapped in a blanket. On the right, a still of a crowd gathering next to a building that appears to be in ruins or abandoned, with a red square highlighting the matching features in the background.
AW’s investigation into an alleged child market was able to prove that the footage claiming to show children being sold was actually of an earthquake relief effort.

To establish whether the story was true, AW investigators used several research and open source techniques, including analysis of social media comments and of the various versions of the footage shared online; geolocation to pinpoint where the footage was filmed; and identification of some of the individuals featured in the video. Using tools freely available on the internet, AW was able to confirm that the video did not show a child-selling market but was, in fact, an earthquake relief effort filmed in Badghis province. 
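Geolocation itself is a manual process of matching features in the footage to satellite or ground-level imagery, but once candidate coordinates are found, it is straightforward to check how far they lie from the location a post claims. The sketch below is a minimal, self-contained example using the haversine formula; the coordinates are illustrative placeholders, not figures from this investigation.

```python
# Minimal sketch: distance between a claimed location and a geolocated one.
# The coordinates below are illustrative placeholders only.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius of roughly 6371 km

if __name__ == "__main__":
    claimed = (34.35, 62.20)     # placeholder: location named in the post
    geolocated = (35.17, 63.77)  # placeholder: location matched from the footage
    print(f"Separation: {haversine_km(*claimed, *geolocated):.1f} km")
```

A large gap between the two points does not by itself prove a post false, but it is a quick, reproducible way to flag claims that merit closer scrutiny.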


Why information literacy is more important than ever


Open source investigation has grown in popularity – there is no shortage of online tutorials and websites allowing people to develop these skills from the comfort of their homes and without formal training, and many major news outlets now have fact-checking desks and verification units. 


However, increasing awareness of open source techniques has also led to a rise in their misuse – the same methods used to debunk disinformation can also be used to spread it. 


AW previously identified an account promoted by the Taliban that describes itself as “an independent and free organisation, which was created to identify false and fake news”. However, while the account used the fact-checking “brand”, AW found that it was not providing the evidence nor the verification processes necessary for meaningful fact-checking work. AW has seen anti-Taliban campaigners also pursue this strategy, pushing unverified, out-of-context and occasionally false stories on accounts branded as fact-checkers. 


Practitioners and researchers have discussed best practices to help reduce the misuse of open source. The Berkeley Protocol on Digital Open Source Investigations, published jointly by the UN Human Rights Office (OHCHR) and the Human Rights Center at the University of California, Berkeley, “identifies international standards and provides guidance on methodologies and procedures for gathering, analysing, and preserving digital information in a professional, legal, and ethical manner”. These guidelines direct open source investigators to trace and attribute online content to its original source where possible, to evaluate the credibility and reliability of those sources, to comply with legal requirements and ethical norms, and to minimise the risk of harm to themselves and their sources. 


Mis- and disinformation have always existed and will continue to exist. It is, however, essential not to let them become the norm and undermine our information spaces, especially digital ones, where information can be spread to thousands with a single click. Information literacy – the ability to find, comprehend, and communicate information in all its various formats – is another key component of achieving that mission. In fact, this week is UNESCO’s Global Media and Information Literacy Week, commemorated annually to “raise awareness and celebrate the progress achieved towards Media and Information Literacy for All.”


Investing in critical thinking, research, and verification skills is part of AW’s work to strengthen the open source capabilities of the Afghan diaspora. Flagging examples of mis- and disinformation – and explaining how to do this debunking work – will continue to be a crucial aspect of the work we do to promote a stronger information environment in Afghanistan. 


A diagram titled “Mis- and disinformation”. Misinformation: the spread of false or misleading information. Disinformation: the intentional spread of false or misleading information, with the aim of deceiving people. Mis- and disinformation take many forms: the repurposing of old content or sharing it out of context, the manipulation of content, and the creation of fake accounts. Debunking tools: open source techniques (icon of a magnifying glass and an unlocked web browser) and information literacy (icon of a book, the world wide web, and a speech bubble).
Investing in critical thinking, research, and verification skills is part of AW’s work to strengthen Afghanistan’s information environment.
