At a time when Türkiye is rapidly moving toward what is set to be a monumental election on the 100th anniversary of its foundation, an imposition from Twitter seems to threaten the collective consciousness and integrity of the vote.
The social media platform’s recently added “For You” section insistently shows unfiltered content, indiscriminate of accuracy or of whether it comes from a bot or a real account, thus exposing users to opinions and information they are not interested in or do not agree with, according to multiple experts.
Social media platforms like Twitter serve as keystone propaganda mechanisms in the period leading up to a critical and globally scrutinized vote like Türkiye’s presidential and parliamentary elections on May 14. Disinformation and manipulative schemes are specifically crafted to influence public perception of the standing candidates and parties.
In addition to Elon Musk’s instructions to alter Twitter’s algorithm after his takeover, it is widely known that the platform allows certain companies to manipulate voter perception in several countries via advanced program networks, cyber-operations expert Ersin Çahmutoğlu told Turkish newspaper Takvim.
“There are signal sequences that outline the most talked about and tweeted topics, and the comments, retweets and likes on these topics per country. If you follow a political account whose ideology you sympathize with, just because that account may tweet about different political parties, you are shown tweets of that kind,” Çahmutoğlu said.
A tweet receiving 100 positive comments but 300 negative ones is among the factors that push such content into the For You section, he explained, adding that Twitter assumes the user is interested in that specific topic and automatically marks similar tweets in the background.
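Twitter’s actual ranking code is not public, so the following is only a minimal, hypothetical sketch of how engagement signals like those Çahmutoğlu describes (replies, retweets, likes and inferred topic interest) could be combined into a recommendation score. All field names, weights and the interest bonus are illustrative assumptions, not the platform’s real algorithm.

```python
# Hypothetical sketch: engagement-signal scoring for a "For You"-style feed.
# Weights, field names and the interest bonus are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Tweet:
    topic: str
    likes: int
    retweets: int
    replies: int  # counted regardless of sentiment, per the article


def recommendation_score(tweet: Tweet, user_interests: set[str]) -> float:
    # Raw engagement: every reply counts, whether positive or negative.
    engagement = tweet.likes + 2 * tweet.retweets + 1.5 * tweet.replies
    # Inferred interest: if the user engaged with this topic before,
    # similar tweets get boosted even if the user never asked for them.
    interest_bonus = 2.0 if tweet.topic in user_interests else 1.0
    return engagement * interest_bonus


# A tweet with 100 positive and 300 negative replies still scores highly,
# because the signal is the volume of interaction, not its sentiment.
controversial = Tweet(topic="elections", likes=50, retweets=20, replies=400)
print(recommendation_score(controversial, {"elections"}))
```

The point of the sketch is that a purely volume-based score cannot distinguish outrage from approval, which is how contested content can end up promoted.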
According to Çahmutoğlu, this change in the platform’s algorithm is a reflection of Musk’s unorthodox management after he bought “the blue bird” for $44 billion in late 2022 and wanted his tweets to be the most read and most seen on the platform.
As for its influence on electoral content, Çahmutoğlu said Twitter doesn’t differentiate between organic and inorganic posts and hashtags.
“Someone starts a campaign with bot accounts targeting the government and Twitter marks this as a trending topic in Türkiye and promotes those posts,” he explained.
Dividing Twitter’s administration into “before and after Musk,” and pointing to the Twitter Files, which exposed the previous management for colluding with the U.S. government, including the FBI, CIA, Pentagon and other intelligence agencies, to censor certain accounts and tweets, Çahmutoğlu argued Washington “directs Twitter however it wants regarding its foreign policy, suppressing certain accounts and promoting others.”
“Twitter has been used as a propaganda tool way before Musk, especially in U.S. elections, with, say, the Clintons and Democrats wielding it against Trump or Republicans,” he noted.
Stressing that Türkiye, as well as the world, laments such an intrusion on democracy, Çahmutoğlu said, “It’s such a clever system that is designed to show you what you don’t want to see. You can’t help but think ‘Is this a scheme to create different perceptions?’”
Çahmutoğlu further confirmed it was “possible” for terrorist and anti-Türkiye formations like the PKK and the Gülenist Terror Group (FETÖ) to collaborate with companies that operate such programs to interfere in the pre-election period and push for a “systematic disinformation campaign.”
As for how these groups pull it off, Çahmutoğlu said they train their bots to tweet “tons of negative things” about certain keywords, which can be the name of a state official or a political party, and Twitter helps spread these posts through bot networks, generating a trending topic “as if it’s the real agenda of the country.”
“This way, they start a smearing campaign and nobody questions whether it’s accurate or false,” he said.
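How a relatively small bot network can manufacture a trending topic is easier to see with numbers. The sketch below is illustrative and does not reflect Twitter’s real trending logic: it simply shows that a counter ranking topics purely by post volume cannot tell a few hundred coordinated accounts apart from thousands of genuine users. The sample figures are hypothetical.

```python
# Illustrative sketch (not Twitter's actual trending logic): a naive trend
# counter ranks topics by raw post volume, so coordinated repetition wins.
from collections import Counter

# Hypothetical sample: 5,000 organic posts spread across several topics
# versus 200 bot accounts each posting the same hashtag 50 times.
organic_posts = ["economy"] * 2000 + ["sports"] * 1800 + ["weather"] * 1200
bot_posts = ["#SmearCampaign"] * (200 * 50)

trend_counts = Counter(organic_posts + bot_posts)
print(trend_counts.most_common(3))
# [('#SmearCampaign', 10000), ('economy', 2000), ('sports', 1800)]
# The manufactured hashtag tops the list, appearing as if it were
# "the real agenda of the country."
```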
“Promoting a very serious and systematic disinformation campaign is a real threat to Türkiye,” Çahmutoğlu argued. “Like in all other countries, certain people, both from here or the outside, can execute this in Türkiye very well.”
According to Çetin Kaya Koç, an expert in cryptographic algorithms and engineering, Twitter itself would be unable to alter its long-established algorithms “to anyone’s benefit because it would be useless and even a minor change could lead to major uproar.”
Only the CEO would have the power to exert such influence on the platform’s operation, Koç told Daily Sabah.
“His whole team would need to follow Musk’s orders,” Koç said, adding, however, it was possible for “outsiders who are really familiar with these algorithms to intervene.”
“These are made by repeating tags, keywords, certain topics that can be partially manipulated,” he explained. “But it would have a temporary, local effect; it wouldn’t be a global manipulation.”
These external disruptors could not go beyond that, according to Koç.
It is possible to run such an operation by taking over several servers around the world, for which “a small investment of $1 million to $2 million would be enough,” Koç said.
Koç added, “Rest assured they are doing this. We know the Democratic Party (in the U.S.) previously did this extensively and it is now becoming obvious there is not a party that has not done this.”
Similarly, in an article for Sabah newspaper, columnist Melih Altınok argued Twitter “has already started working to make sure the opposition wins Türkiye’s elections as they have in all the other elections.”
“Even if you don’t follow opposition politicians, their favored journalists, FETÖ and PKK’s fugitive terrorists, their messages show up on your timelines,” he lamented.
“Based on exactly what is Twitter operating when recommending these accounts ‘For You’? I cannot even find tweets from people I follow unless I specifically search for them,” Altınok said.
Notably, a tweet from Daily Sabah of a similar article about Twitter’s potentially misleading algorithm was retweeted by three accounts whose combined follower count totals roughly 5 million, yet the tweet had been viewed only 18,200 times by the time this article was written.
Altınok said when he reached out to Twitter’s Türkiye office for a comment, he wasn’t able to get a hold of any local authority.
“They set up a company here in 2021 just for show, and its records name as executive officer only a Kevin Cope, who resides in the U.S.”
“While Canada and the U.S. can raise a riot about electoral interference before it even materializes, anyone who has the potential to manipulate Türkiye’s May 14 polls can run an exclusive campaign for Türkiye all the way from the U.S.,” he said.
Twitter was criticized last month by the European Commission for its alleged shortcomings in tackling disinformation. European Commission Vice President for Values and Transparency Vera Jourova singled out Twitter for failing to comply with the EU’s code of practice on disinformation.
Like other social media companies, Twitter must present a report detailing how much advertising revenue it has earned from disinformation actors, the number or value of political advertisements accepted or rejected, and detected instances of manipulative behavior.
The microblogging site is among the breeding grounds for disinformation and manipulation in Türkiye. British academic Marc Owen Jones said on his Twitter account that Türkiye faced a possible influence operation following the 2021 forest fires in the country’s south, pointing to the “#HelpTurkey” hashtag created on Twitter in the wake of the wildfires. Many social media users took to Twitter to express their sorrow and to call for global help for Türkiye, also prompting a nationalist backlash over the call’s perceived implication that the country was unable to fight the blazes.
“Some felt the message being generated on the hashtag was designed to make Turkey look weak, incompetent, and desperate,” Jones said in a tweet. “This, coupled with the scale of the campaign, suggested a possible influence operation. To be clear, though. The hashtag had many real users.”
Election manipulation has been under the spotlight recently with the proliferation of social media platforms.
An Israeli company tried to influence over 30 elections in the world for clients through sabotage, hacking and spreading misinformation, an undercover media probe revealed last month. It added to a growing body of evidence that shadowy private firms worldwide are profiting from invasive hacking tools and the power of social media platforms to manipulate public opinion. The firm was dubbed “Team Jorge” by investigating journalists who posed as potential clients to gather information on its methods and capabilities. Its boss, Tal Hanan, is a former Israeli special forces operative who boasted of being able to control supposedly secure Telegram accounts and thousands of fake social media profiles, as well as planting news stories, the reports say.
The investigation was carried out by a consortium of journalists from 30 outlets, including The Guardian in Britain, Le Monde in France, Der Spiegel in Germany, and El Pais in Spain, under the direction of the France-based nonprofit Forbidden Stories.
“The methods and techniques described by Team Jorge raise new challenges for big tech platforms,” The Guardian wrote. “Evidence of a global private market in disinformation aimed at elections will also ring alarm bells for democracies worldwide.”
Meanwhile, Meta is looking to expose and combat malicious online campaigns, from election lies to terrorist recruitment.
A paper authored by Meta's Ben Nimmo and Eric Hutchins details how to create a "kill chain" for targeting key links in deception operations aimed at duping people online.
"Human stupidity is one of the great powers in the universe, but this kill chain is trying to identify all the different kinds of operations that can try to target human weakness," Nimmo told Agence France-Presse (AFP).
"The goal is to stop the attackers before ever reaching the target."
The Online Operations Kill Chain framework released this week proposes a more unified approach to analyze the gamut of nefarious campaigns including espionage, human trafficking, fraud and election interference.
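The kill-chain idea itself can be sketched very simply: model an operation as an ordered sequence of stages and disrupt it at the earliest link that is detected, so it never reaches its targets. The stage names below are generic placeholders chosen for illustration, not the terminology of Meta’s actual framework.

```python
# Generic sketch of a kill-chain: an operation progresses through ordered
# stages, and defenders try to break the chain at the earliest observed link.
# Stage names are illustrative placeholders, not Meta's actual phases.
ILLUSTRATIVE_STAGES = [
    "acquire_accounts",
    "build_personas",
    "produce_content",
    "amplify_content",
    "engage_targets",
]


def earliest_disruption_point(detected_activity: set[str]) -> str | None:
    """Return the first stage at which the operation was observed."""
    for stage in ILLUSTRATIVE_STAGES:
        if stage in detected_activity:
            return stage
    return None


# If the operation is caught while still acquiring accounts, it can be
# removed before any content ever reaches real users.
print(earliest_disruption_point({"acquire_accounts", "produce_content"}))
```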
Online deception campaigns routinely span platforms – from Facebook and Instagram to TikTok, Twitter and even LinkedIn – but reveal features, such as profile images or web addresses, that can be identified, according to the report.
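A minimal sketch of that idea, under assumed data: coordinated accounts on different platforms often reuse the same indicators, such as profile-image hashes or linked web addresses, so intersecting those indicators across platforms can surface a single cross-platform operation. The account records and field names below are hypothetical.

```python
# Minimal sketch of cross-platform indicator matching. Accounts that reuse
# the same profile-image hash or linked URL on different platforms are
# candidates for belonging to one coordinated operation. Data is hypothetical.

def shared_indicators(accounts_a: list[dict], accounts_b: list[dict]) -> set[str]:
    """Return indicators (image hashes, URLs) seen on both platforms."""
    def collect(accounts: list[dict]) -> set[str]:
        indicators = set()
        for acc in accounts:
            indicators.add(acc["avatar_hash"])
            indicators.update(acc["linked_urls"])
        return indicators
    return collect(accounts_a) & collect(accounts_b)


facebook_accounts = [
    {"avatar_hash": "a1b2c3", "linked_urls": {"http://example-fake-news.test"}},
]
twitter_accounts = [
    {"avatar_hash": "a1b2c3",
     "linked_urls": {"http://example-fake-news.test", "http://other.test"}},
]

# Overlapping indicators suggest the same operation is active on both platforms.
print(shared_indicators(facebook_accounts, twitter_accounts))
```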
Meta remains under pressure to do more to combat misinformation, particularly campaigns aimed at swaying election outcomes.
The tech titan has invested heavily in content moderation teams and technology, routinely derailing covert influence operations around the world.