Weeks after Lithuania imposed EU sanctions on Kaliningrad transit, LRT English came under attack from bot networks on Facebook. Analysis by DebunkEU, an NGO researching disinformation, uncovered links between the Facebook accounts used in the attack and a potentially Russia-linked group buying access to authentic social media profiles.
– Many of the Facebook posts published by LRT English between June 1 and August 31, 2022, received a very high interaction rate, up to 111 times more reactions than the average post.
– The distribution of interactions indicated unnatural behaviour, consistent with artificial amplification tactics.
– Automated analysis of data showed the interactions originating from 91 countries, with 57 percent of interactions from Africa and Asia.
– Manual analysis of 850 accounts revealed more than half of them (56.9 percent) to be suspicious, with most of them originating from Nigeria, Kenya and the Philippines.
– DebunkEU found two users promoting a chance to earn money as part of a social media ‘account-renting’ scheme, suggesting a modus operandi for the campaign. One of the users was Russia-linked.
In June 2022, Lithuania began banning the transit of sanctioned goods by rail between Russia and its Baltic exclave of Kaliningrad. The ban entered into force at midnight on June 18 and subsequently led to a pressure campaign by the Kremlin against Lithuania. Warnings from Moscow included threats of “serious negative impact on the population” and a proposal from one MP to revoke the recognition of Lithuania’s independence.
The situation then became a pretext for disseminating false or misleading narratives, including claims that the ban would lead to famine in Kaliningrad and to the Third World War.
LRT English posts go viral
Although LRT is one of the largest media outlets in Lithuania, posts published on the LRT English Facebook page typically elicit only a moderate number of reactions. Between September 2021 and June 2022, there was an average of 42 reactions per post. However, in the second half of June, some posts about the Kaliningrad transit situation went viral.
Between June 1 and August 31, 2022, posts published by LRT English containing the keyword "Kaliningrad" generated 45,700 reactions. However, only 14 posts, focusing mainly on the sanctions imposed on Kaliningrad, showed a significant increase in the number of reactions, generating up to 111 times more than the average. The posts that gained the highest number of responses appeared at consistent intervals of 10 to 14 days.
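The amplification ratio described above can be illustrated with a minimal sketch. The 42-reaction baseline and the 111x peak are from the analysis; the spike threshold and the sample reaction count are illustrative assumptions:

```python
# Flag posts whose reaction count far exceeds the page's baseline average.
# The 42-reaction baseline is from the article; the 10x cutoff is an
# assumed threshold for illustration only.

BASELINE_AVG = 42       # avg reactions per LRT English post, Sep 2021 - Jun 2022
SPIKE_THRESHOLD = 10    # flag posts with >= 10x the baseline (assumption)

def amplification_ratio(reactions: int, baseline: float = BASELINE_AVG) -> float:
    """Ratio of a post's reactions to the page's historical average."""
    return reactions / baseline

def is_spike(reactions: int) -> bool:
    """True if a post's reactions exceed the assumed spike threshold."""
    return amplification_ratio(reactions) >= SPIKE_THRESHOLD

# A hypothetical post with 4,662 reactions is exactly 111x the baseline,
# matching the largest spike reported in the analysis.
print(round(amplification_ratio(4662)))  # -> 111
```

In practice a baseline would be recomputed over a rolling window rather than fixed, but the ratio against historical engagement is the core of the spike detection described here.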
Some posts about Kaliningrad gathered hundreds or thousands of reactions, comments, and shares. The number of interactions on some of the Lithuanian broadcaster’s posts was comparable to, or much higher than, the interaction counts of major global media outlets such as Reuters, BBC News, and CNN.
On July 7, the Facebook page of LRT English announced it had recorded a suspicious increase in activity toward posts about Russia, Kaliningrad and some other topics. The inauthentic behaviour came "often from what appears to be fake accounts and includes hate speech and disinformation". As a result, the site's administrators decided to restrict commenting on certain posts. However, the ban did not significantly reduce the number of reactions on subsequent posts related to the Kaliningrad transit situation.
Accounts from Africa and Asia
DebunkEU found that many of the active accounts amplifying LRT English posts showed a mismatch between the username and the name in the account URL or the stated location, as well as generic profile images or locked profiles. However, DebunkEU believes that in most cases the usernames corresponded to the accounts’ geographic-cultural background.
Using the Namsor application, which infers the likely country of origin of a personal name, DebunkEU found that the posts gained responses from 91 countries.
DebunkEU checked 847 accounts of which more than half (56.9 percent) were found to have a high probability of being social media trolls or bots. Those accounts originated from 91 countries, mainly in Africa and Asia (77 percent).
The top three countries of origin of suspicious accounts were Nigeria, Kenya, and the Philippines. Since social media bot and troll farms have been documented operating in these countries before, this was another reason to believe the accounts were part of an artificial amplification campaign against LRT English.
DebunkEU has adopted several red flags that suggest an account is likely inauthentic (ie, a troll, bot, or hacked account):
– a recent creation date;
– suspicious profile activity (eg, years of inactivity, or hyperactivity and continuous posting);
– posting exclusively political content;
– a mismatch between the account name and the name in the URL (eg, mismatched gender or geographic-cultural region);
– a suspiciously large number of friends (over 2,000).
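These heuristics can be sketched as a simple rule-based checker. This is a minimal illustration, not DebunkEU's actual methodology: the field names, activity cutoffs, and the name-matching rule are assumptions; only the 2,000-friend threshold comes from the article.

```python
# Rule-based checker for the red flags listed above. Field names and most
# thresholds are illustrative assumptions; the friend-count cutoff (2,000)
# is the figure given in the article.
import re
from dataclasses import dataclass
from datetime import date

def _norm(s: str) -> str:
    """Lowercase and strip non-alphanumerics so names and URL slugs compare fairly."""
    return re.sub(r"[^a-z0-9]", "", s.lower())

@dataclass
class Account:
    created: date
    name: str
    url_name: str               # name embedded in the profile URL
    friends: int
    political_post_share: float  # fraction of posts that are political
    posts_per_day: float

def red_flags(acc: Account, today: date = date(2022, 8, 31)) -> list[str]:
    flags = []
    if (today - acc.created).days < 180:                  # recent creation date
        flags.append("recently created")
    if acc.posts_per_day == 0 or acc.posts_per_day > 50:  # inactivity or hyperactivity
        flags.append("suspicious activity level")
    if acc.political_post_share >= 0.95:                  # exclusively political content
        flags.append("exclusively political content")
    if _norm(acc.name) not in _norm(acc.url_name):        # name/URL mismatch
        flags.append("name does not match URL")
    if acc.friends > 2000:                                # threshold from the article
        flags.append("suspiciously many friends")
    return flags

# Hypothetical account combining all five red flags.
acc = Account(date(2022, 6, 1), "Ada Obi", "ivan.petrov.99",
              friends=4900, political_post_share=1.0, posts_per_day=80)
print(len(red_flags(acc)))  # -> 5
```

A real triage pipeline would weight these signals rather than count them, since a single flag (such as many friends) is weak evidence on its own; the article's manual review step reflects that.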
Accounts originating from African countries displayed an unusually high and similar number of friends (around 4,900). Most of these profiles were from Nigeria.
Analysts also found a group of African-origin accounts dedicated to the spread of pro-Kremlin propaganda.
A small group of accounts were most likely bots: they shared similarly formatted names, similar bios, similar activity patterns, and interactions from other suspicious accounts on their profiles.
The suspicious accounts also shared other common features, such as the use of symbols in their bios. They also linked to their supposed other social media profiles, yet these links were either inactive or led to someone else's profiles. Some of the same accounts provided WhatsApp numbers; checking some of these numbers revealed that they were linked to scam chat rooms.
The accounts also encouraged others to add them as friends, sometimes commenting that they were testing or operating “speed farms”, ie, follower aggregators.
The potentially inauthentic accounts form a huge network, which increases their ability to spread 'liked' content and amplify its reach. By researching “friends” and “likes” on the suspicious profiles, DebunkEU tracked down Facebook groups acting as “follower farms”, such as a public group named Follow To Follow Back Facebook Page with over 250,000 members. Accounts with thousands of friends were among those sharing the posts on Kaliningrad.
‘Account-renting’ in the Philippines
DebunkEU also discovered a potential modus operandi of disinformation campaigns on social media when it found an advertisement offering money if one agreed to "rent your Google Ads Account to put and run [client's] ads from 9pm to 6am”. This activity is particularly popular in the Philippines.
Similar targeted campaigns also offer people money to rent out their social media accounts, including on Facebook. Although the name of the account sharing the ad was Filipino, the URL showed a Russian-sounding name. The profile also had many Russian pages in the "likes" section and provided links to social media platforms widely used in Russia (Odnoklassniki and VKontakte). Both links were inactive.
DebunkEU also analysed 8,936 comments written by 3,140 accounts under six posts. The comments were usually not directly related to the topic of sanctions. Instead, they tackled issues on a more general level, spinning discussions about the West and the Russian invasion of Ukraine.
Narrative analysis technology developed by Graphika helped identify common themes in the comment text, and its network analysis technology helped identify the audiences engaging with these themes. Early findings included propaganda and anti-Ukraine and pro-Russia narratives, engaged with by audiences around the world, including coordinated actors supporting the Assad regime in Syria as well as Mexican journalists and researchers.
New disinformation strategies
The analysis reveals a new way of manipulating information on social media. Posts containing links to LRT’s articles became a target of an attack from foreign Facebook bot and troll farms. The supposed aim of the malign social media campaign was to increase pressure on international opinion and create fears about possible implications of continuing sanctions.
These goals coincide with the interests of Russia, which at the same time waged a political campaign to intimidate the public by threatening consequences. While the analysis did not show any direct involvement of state actors, the campaign would be in line with the goals of pro-Kremlin propaganda.
The article was originally written by DebunkEU. An edited and condensed version was republished by LRT English. The full investigation can be found here.