An update to Twitter’s algorithm that boosts the visibility of government criticism rooted in defamation campaigns by opposition supporters and terrorist group sympathizers is prompting manipulation concerns ahead of the May 14 elections
Twitter’s newly updated algorithm has promoted posts by terrorist sympathizers ahead of Türkiye’s presidential and parliamentary elections on May 14, a monthlong study by the Turkish newspaper Sabah analyzing suggested tweets found.
The social media platform’s algorithm, which now divides the timeline into "For you" and "Following" sections, deliberately conceals apolitical content and content created by supporters of the ruling Justice and Development Party (AK Party). Instead, it fills users’ timelines with tweets from individual and bot accounts run by supporters of the PKK terrorist organization and the Gülenist Terror Group (FETÖ).
Coming less than two months before the much-anticipated vote, the findings raise questions about purposeful manipulation. They also recall the Cambridge Analytica scandal, in which the consulting firm working for Donald Trump’s 2016 election campaign in the United States gained access to the personal information of millions of Facebook accounts for voter profiling and targeting. SCL Elections, the parent group of Cambridge Analytica, has opened offices in Türkiye as well.
Though intended for real users, Twitter has also become a breeding ground for bots wielded effectively by terrorist groups and by individuals serving specific purposes.
Since the Feb. 6 earthquakes hit Türkiye’s southeast, a disinformation campaign has spread, especially on social media platforms, to incite fear, concern and panic among citizens.
Sabah’s report delved into Twitter posts made between Feb. 6 and March 13 regarding the earthquakes and found that of the 266,334,080 posts made by 21,493,445 accounts, a total of 5,362,720 were made by bot accounts, which the report said corresponded to roughly 25.55% of the total.
The report found that some 27% of these 5.3 million posts, 1,493,256 to be exact, were made by bot accounts managed by FETÖ and the PKK.
Moreover, 33% of all posts mentioning the word "ordu," Turkish for "army," and accusing the Turkish Armed Forces (TSK) of responding two days late were made by bots.
Social media users, such as Eyüp Aytan, have complained about the sudden change in content. Aytan says the website keeps suggesting he follow Twitter accounts linked to the main opposition Republican People’s Party (CHP) and other accounts related to the opposition bloc. "Twitter started its election campaign and manipulated people," he tweeted.
Uğur Yenisu, another user, said accounts he did not follow appeared on his timeline more often than the ones he did. "I see so many dumb things like those boasting the high vote rate the opposition would supposedly use to win the elections," he tweeted.
Twitter is also awash with bot accounts that emerged after municipalities run by the CHP started allocating bigger budgets to agencies organizing social media campaigns. For instance, many bot accounts shared the same tweet saying they were a villager from the central province of Konya who "cannot afford the costs of his/her tractor" and would no longer vote for the AK Party, despite voting for the party for two decades.
Associate professor Selman Tunay Kamer from Kastamonu University, a digital culture and political sciences expert, says the algorithm change became effective in January. "You recently see the content from Twitter users you have never even followed and posts you did not like. It is a known fact that social media platforms have been involved in manipulation in the past, especially in affecting people’s behavior. The Cambridge Analytica scandal is most notorious, and even campaign managers (of Trump) acknowledged that they would not be able to win the elections "without Facebook." Eventually, social media websites successfully impacted the elections, causing at least a 7%-8% change in voters’ choices. It is safe to say that the same can happen here," he said.
Professor Levent Eraslan, head of the Center for Social Media and Digital Security Education and Research, said terrorist groups actively use social media. Therefore, it was undoubtedly suspicious that Twitter promoted the content produced by FETÖ and PKK-affiliated users.
Twitter was criticized last month by the European Commission for its alleged shortcomings in tackling disinformation. European Commission Vice-President for Values and Transparency Vera Jourova singled out Twitter for failing to comply with the EU’s code of practice on disinformation. Like other social media companies, Twitter must report how much advertising revenue it has earned from disinformation actors, the number or value of political advertisements accepted or rejected, and detected instances of manipulative behavior.
The microblogging site is among the breeding grounds for disinformation and manipulation in Türkiye. British academic Marc Owen Jones said on his Twitter account that Türkiye faced a possible influence operation following the 2021 forest fires in the country’s south, pointing to the "#HelpTurkey" hashtag that emerged in their wake. Many social media users took to Twitter to express their sorrow and call for global help for Türkiye, which also prompted a nationalist backlash over the perceived implication that the country was unable to fight the blazes on its own. "Some felt the message being generated on the hashtag was designed to make Turkey look weak, incompetent, and desperate," Jones said in a tweet. "This, coupled with the scale of the campaign, suggested a possible influence operation. To be clear, though. The hashtag had many real users."
The academic at Qatar’s Hamad Bin Khalifa University’s College of Humanities and Social Sciences also underlined that although many people joining the hashtag did not have ulterior motives, the technical analysis suggested a possible "influence operation."
"My initial analysis included several stages. Firstly, a network analysis of around 160,000 interactions from around 46,000 individual Twitter accounts," Jones said in a follow-up tweet detailing his research. Jones added that past research shows "influence operations in Turkey often use the tactic of deleting tweets after writing on a hashtag. The Twitter algorithm reportedly registers the trend here, but the account deletes tweets to avoid detection/be repurposed." He also noted that accounts using names like "Bad Boy" and "Joker Queen" changed their handles as a tactic. "We’ve seen this tactic of handle switching elsewhere, including the Gulf," the academic added.
The country’s communications director, Fahrettin Altun, said a single center abroad orchestrated the Twitter campaign following the forest fires to weaken the ties between the state and the people. The aid campaign was launched with "ideological motives," Altun wrote on Twitter, adding that while all good-willed aid serves national unity, this campaign aimed for the opposite.
Jones published a similar analysis following the Feb. 6 earthquakes, based on 30,000 tweets spreading disinformation about the "withdrawal" of Western countries’ ambassadors before the earthquake.
Election manipulation has come under the spotlight recently with the proliferation of social media platforms. An Israeli company tried to influence more than 30 elections around the world for clients through sabotage, hacking and the spread of misinformation, an undercover media probe revealed last month. The findings added to a growing body of evidence that shadowy private firms worldwide are profiting from invasive hacking tools and the power of social media platforms to manipulate public opinion. The firm was dubbed "Team Jorge" by the investigating journalists, who posed as potential clients to gather information on its methods and capabilities. Its boss, Tal Hanan, is a former Israeli special forces operative who boasted of being able to control supposedly secure Telegram accounts and thousands of fake social media profiles, as well as plant news stories, the reports say.
The investigation was carried out by a consortium of journalists from 30 outlets, including The Guardian in Britain, Le Monde in France, Der Spiegel in Germany, and El Pais in Spain, under the direction of the France-based nonprofit Forbidden Stories.
"The methods and techniques described by Team Jorge raise new challenges for big tech platforms," The Guardian wrote. "Evidence of a global private market in disinformation aimed at elections will also ring alarm bells for democracies worldwide."
Hanan told three undercover reporters that his services, often called "black ops" in the industry, were available to intelligence agencies, political campaigns, and private companies. "We are now involved in one election in Africa ... We have a team in Greece and a team in (the) Emirates ... (We have completed) 33 presidential-level campaigns, 27 of which were successful," The Guardian quoted him as saying. Most of the campaigns – two-thirds – were in Africa, he claimed. While demonstrating his technology to reporters, he appeared to hack into the Gmail inbox and Telegram account of political operatives in Kenya days before a presidential election. Forbidden Stories identified the targets as two aides to William Ruto, who won the August 2022 ballot. Online public influence campaigns were carried out via a software platform known as Advanced Impact Media Solutions, which allegedly controlled nearly 40,000 social media profiles across Facebook, Twitter and LinkedIn, the reports say.
Hanan also claimed that his firm had planted a report on France’s most prominent television news channel BFM about the impact of sanctions against Russia on the yachting industry in Monaco.
For social media manipulation, the team developed its own platform, called "Aims," which could be used to create verified user accounts.
The team controlled an "army" of more than 30,000 bots, social media profiles not backed by real people, the British daily The Guardian reported. These were cleverly designed and maintained simultaneously across various platforms such as Facebook, Twitter and YouTube.
The report said public opinion was deliberately influenced with the help of smear campaigns and stolen information.
Other similar companies have been named in media reports or sanctioned by Western governments in recent years over their role in trying to influence elections and public opinion.
Last month, the chief of the Russian mercenary group Wagner, Yevgeny Prigozhin, admitted creating an infamous troll firm suspected of interfering in Western elections. Sanctioned by Washington and Brussels, the Saint Petersburg-based "Internet Research Agency" had for years been linked to Prigozhin, a 61-year-old ally of Russian President Vladimir Putin.