This chapter is focused on online networks of support for Turkey’s ruling party, the Justice and Development Party (AKP). Founded in 2001, the AKP came to power in 2002, headed by Recep Tayyip Erdoğan, one of the most controversial leaders in contemporary global politics. Although Erdoğan had always acted as the principal leader of the party, his authority within the party increased only gradually. In its first decade, the party acted as a pro–European Union, center-right party with Islamic political origins, operating within Turkey’s parliamentary system. Since the early 2010s, however, the party has demonstrated strong and increasing signs of authoritarianism (Erensü and Alemdaroğlu 2018), and these signs became more visible with the government’s response to the nationwide Gezi Park protests in 2013. One of its responses was to use political trolls against critical voices on social media.
The chapter builds on earlier ethnographic work (Saka 2018) that focused on the evidentiary issues and basic structure of AKTrolls, a group of political trolls who served the ruling party. The earlier study found that there was insufficient evidence for the claim that there was a centralized troll army. The study instead demonstrated that the troll networks that surfaced after the antiregime Gezi Park protests in 2013 were decentralized and used various tactics to intimidate and suppress groups that were critical of the regime. This chapter is focused on a period when the ruling party had become politically unchallenged and its hegemony in the state apparatus was more or less stable. The presidential system that took effect after the elections of June 24, 2018, gave more powers to the president (Lowen 2017).
In this chapter, I discuss new roles that trolls have since adopted, including efforts to take a leading role in shaping the government-owned mainstream media agenda. Political trolls have taken on the role of mass media for the government and are engaging in a culture war against government opponents. Political trolls initially played a key role in the surveillance of party outsiders—that is, opposition political figures (Saka 2018). Surveillance lost its centrality among troll functions, but it did not disappear, and in later years it extended to party insiders. The chapter begins with a detailed discussion of AKTrolls in the context of the global phenomenon of online trolls. I then focus on the distinct features and practices that regime-friendly trolls have adopted in Turkey, comparing them with other significant political trolling practices in different parts of the world. Following a discussion of tactics, I introduce common themes in online discussions that characterize online support for the regime. Political trolls are not necessarily anonymous or isolated individuals. The nature and effects of trolling depend on the political context. When trolls are aligned with a ruling party led by a president with increased powers, most of them stop being anonymous, and some threaten to sue when called out as trolls. This chapter documents trolling practices in relation to Turkey’s experiment with the presidential model adopted in 2018. The chapter describes the various tactics used by online trolls, but it also provides examples of their impact to highlight the political consequences.
The chapter concludes by highlighting the implications of the social surveillance effected by the human agents behind AKTrolls. It also notes the rivalries that have emerged among AKTrolls, which signal fluctuating loyalties and create enormous uncertainty about how reliable these trolls are for any regime.
A visible progovernment political trolling scene appeared in 2013 during the Gezi Park protests. A major event in this scene occurred when President Erdoğan forced then–prime minister Ahmet Davutoğlu to resign. Davutoğlu had long been a key figure in Erdoğan’s government. A former academic, Davutoğlu first served as minister of foreign affairs and is believed to be the ideologue of the AKP’s neo-Ottomanist foreign policy plans, which included a particular focus on the Syrian civil war (for more on Davutoğlu’s policies, see Aras 2014; Ozkan 2014). When Erdoğan became president in the then-parliamentary democratic system, Davutoğlu became prime minister. Davutoğlu’s own initiatives led to political tensions that ended with his forced resignation. An anonymous WordPress blog titled “Pelikan Declaration” appeared before the resignation (https://pelikandosyasi.wordpress.com/). The declaration claimed that Davutoğlu had attempted to bypass Erdoğan’s authority and included a list of accusations. The declaration was widely circulated by a section of AKTrolls that would gradually be labeled the “Pelikan group.” During the resignation period, some prolific AKTrolls, such as Taha Ün, were sidelined as pro-Davutoğlu (Diken 2016).
As opposed to earlier media reports, I found AKTrolls to be much more decentralized and volunteer driven. The “professional” trolls sometimes worked together and sometimes separately. Closest to a business structure was the Pelikan group. This group worked closely with a media conglomerate owned by the Albayrak family, which had increased its wealth during the AKP’s rule (E. Sözeri 2016). Erdoğan’s son-in-law and Turkey’s current minister of economy, Berat Albayrak, along with his brother Serhat Albayrak, who runs the media conglomerate, was particularly prominent in this relationship. The group quickly drew animosity from other AKTroll circles as political alliances evolved.
A list of political trolls was monitored for more than two years. A discursive analysis was then used to analyze selected Twitter production. The list of trolls to monitor was based on the findings of previous ethnographic work. Significant impact nodes were found and mapped (Saka 2018), and representative accounts were selected for monitoring. To follow political and personal changes among trolls, the list was continuously updated to demonstrate emerging cliques. The update process was based on both regular contacts from the field and the trolls’ own Twitter content and engagements.
The ethnographic research originated in my active engagement in social media communications with users who took part in the Gezi Park protests and their aftermath. Some of those users could be identified as AKTrolls (trolls supporting the governing party, the AKP). Semistructured interviews were conducted with users who were officially involved in social media political campaigns or who self-identified as AKTrolls. One troll was a digital media producer with close ties to AKP circles whose documentaries have been broadcast on state TV channels. Another troll was, according to AKP circles, a “Gülenist organizer” on Twitter. I also interviewed a relatively high-level bureaucrat who specialized in Turkey’s communication sector. In addition, I engaged approximately thirty user-activists in online and offline conversations and interviews to address specific questions. These interviews were conducted between June 2015 and December 2016, and correspondence continued with some interviewees during the two-year monitoring process for this chapter.
Most engagements occurred online on Twitter and Facebook; however, some offline encounters were critical. Some significant access was obtained by frequenting cafes in the neighborhood of At Pazarı Meydanı in Istanbul’s conservative district, Fatih, as Islamists from predominantly pro-AKP circles frequented these cafes. The neighborhood also became a favorite locale for Gülenists after the corruption case of December 2013. Gülenists were not known to socialize with other Islamists, but at the start of the crackdown, some members seemed to have decided to socialize with other Islamists to propagate their views. A network-mapping tool, Graph Commons (https://graphcommons.com/), was used to create a map of AKTrolls with the initial data obtained during field engagements. The online monitoring of troll activities would later populate the map.
Trolling and Its Antecedents
Trolling has existed since the early days of the internet. It was already so common, even on Usenet, that Tepper (1997/2013) devoted an entire article to trolls’ exploits (Phillips 2015), and Shepherd et al. (2015) look back to the late 1980s to emphasize the historical and networked roots of trolling. Filipovic (2007) and Hardaker (2010) point to the military roots of internet communications in the 1960s and argue that the gendered and vitriolic nature of trolling can be traced back to these years. There is continuity in the gendered consequences of trolling (Suler and Phillips 1998).
A turning point that gave trolling a more collective and political sense occurred in 2008, in the course of a showdown between the Anonymous hacking group and Scientology (Coleman 2011). According to Phillips (2015), a theoretical shift in the analysis of trolling occurred in the late 2000s. Previously, Dahlberg (2001) and others viewed trolls in terms of “deception,” whereas Coleman (2012; 2014), among others, interpreted them through a “communitarian” approach without ignoring the roots of trolling. Soon the phrase “political trolling,” associated with the term “web brigades” (known in English-language media as “the troll army”), emerged, initially to describe state-sponsored anonymous internet political commentators and trolls explicitly linked to the Russian government (Soldatov et al. 2015). Observations of Russian cases demonstrate that a centralized political trolling structure was made possible by a series of internet laws and the opening of centers, such as the now notorious Internet Research Agency (Funke and Benkelman 2019), to host trolls who intervened in digital agendas (Lokot 2016).
Once trolling came to be accepted as a playful practice, extreme speech movements soon appropriated that “playfulness.” Hawley’s (2017) work on alt-right circles in the United States is a good indicator of this current. Hawley claims that the alt-right is in many respects an outgrowth of internet troll culture. The collective creation of memes and other relevant content goes hand-in-hand with spontaneity, but it is fine-tuned in an increasingly collective manner (Hawley 2017).
Political trolling may not necessarily be associated with the state, but the latter increasingly monopolizes the practice. Hunt (2014) described both formal and informal mechanisms to control the internet—a familiar process in Turkey (Sözeri 2017). Trolling increased in a context in which other repressive measures were already in force. This negatively affected journalists, with female journalists in Turkey a particularly vulnerable group, as they were increasingly exposed to harassment by trolls.
Nevertheless, anonymity may no longer be a critical point. Some recent research on trolls corroborates my findings. In the harassment of women in the video-gaming community known as “Gamergate” (Dewey 2014); in the attacks on the Ghostbusters actress Leslie Jones (Brown 2016); and in a study of comments on online petitions published on a German social media platform between 2010 and 2013 (Rost, Stahel, and Frey 2016), anonymous and public identities were both used. Because online aggression is rewarded in their social networks, some trolls prefer not to hide their identities (Coren 2016). Ceren Sözeri (2016), for instance, describes how some progovernment journalists act like political trolls; in fact, they sometimes lead the attacks. A disturbing fact is that political trolling may now be used within more mainstream politics all over the world. A BuzzFeed report (Spence 2018) claimed that conservatives in the United Kingdom had formed a Twitter group to use political trolling tactics against the Labour Party leader. At the time of writing, Serbia’s governing party had admitted to having thousands of “bots” write positive comments about President Aleksandar Vucic and his party (Piše:Danas Online 2018). It is now common knowledge that beneath political trolls’ seeming anonymity there are often connections to power bases. In another case, the hacked emails of white nationalist troll Milo Yiannopoulos demonstrated that he had connections not only to some prominent figures inside Silicon Valley but also to explicitly white nationalist circles and families of billionaires (Wagstaff 2017). In some countries, political trolling has already become an industry (Ong and Cabanes 2018), and the Philippines’ President Rodrigo Duterte has admitted that he paid an army of social media trolls (Ng 2017).
An empirical context can be valuable for understanding political trolling. In this section, I will refer to literature that reports different uses of trolling all over the world, and I will compare them with AKTroll practices. As Phillips (2015) noted, even within the same trolling group, there might be considerable behavioral variation. It should also be noted that there are always emerging and overlapping cases.
Russian trolls seem to have continued to produce divisive content even after discoveries of their roles in the last US presidential election (Graff 2018). These trolls seem to have excelled at targeted advertising, which is vital for social media. They have hijacked hashtags and disguised themselves as average Americans on Twitter (Hsu 2018), and they have mainly used racial themes in their Facebook ads (Penzenstadler, Heath, and Guynn 2018). I could not find a similar algorithm-supported targeting campaign in the Turkish case.
Organized trolling attacks work to intimidate targets. In India, Hindu nationalists using the internet followed and intimidated social media users who posted comments that were not aligned with the trolls’ own views (Udupa 2015). In the United States, the alt-right’s persistent, coordinated trolling broke into mainstream discussion (Hawley 2017), as microtargeting and collective content production worked hand-in-hand to produce the results the groups expected. The sheer volume of threats and insults can discourage targeted citizens; in Turkey, these attacks are usually called “social lynching.”
An example involves Pelin Batu, the daughter of a well-known Turkish diplomat and a historian with her own TV show. She used Twitter on July 21, 2016, to state that she was no longer going to tweet because of threats and insults from AKTrolls.
Trolling can go hand-in-hand with microtargeting, but it sometimes operates en masse: the Russian troll farm known as the Internet Research Agency was quick to muddy the news cycle during the Mueller probe in the United States (Glaser 2018). Stray (2017) elaborated on how disinformation is used in swarm attacks along with microtargeting, referring to the modern strategy of disinformation as “the firehose of falsehood” (a term coined by the RAND scholars Paul and Matthews 2016). The disinformation is spread through different channels with different levels of content. These channels may range from official sources like Russia Today to curated, hacked, or leaked material aimed at different audiences. The massive volume of disinformation overwhelms alternative voices and creates false sources of credibility. Disinformation is not necessarily explicitly political. BBC News (BBC 2018) claimed that Russian trolls were active in spreading antivaccine stories to create social discord. Many such cases exist in Turkey—for example, stories about women booing the call to prayer (Reuters 2019), people drinking beer in a mosque during the Gezi Park protests (Hürriyet Daily News 2013), and a woman in a headscarf being sexually attacked in Kabataş, Istanbul, during the Gezi Park protests (Yılmaz 2015).
Spreading Fake News to Discredit Opposition Accounts Later
This is arguably a form of disinformation peculiar to Turkey. A telling case was the circulation of false images related to Silvan, a Kurdish town in southeast Turkey. These images were allegedly disseminated by AKTrolls after a special forces operation in November 2015. When ordinary citizens opposed to the operation began to circulate these images, they were accused of disseminating disinformation. Seemingly critical Twitter accounts are sometimes used to seed fake news. When the news gains wider circulation, progovernment “fact-checking” accounts debunk it and claim that the opposition is fraudulent.
Publishing people’s personal information is a common occurrence among trolls, and the practice is known in internet slang as doxing. A well-known global case concerned a Finnish journalist, Jessikka Aro (BBC 2017), who was a victim of doxing. She was framed as a kind of foreign agent by Russian trolls, and her contact information was put online. What is striking in such cases is the collaborative effort among online users to dig up information about the victims. In contrast, my fieldwork showed that trolls in Turkey mostly gather personal information in collaboration with the authorities.
Phishing for Political Purposes
Phishing goes hand-in-hand with doxing. Phishing differs from mere hacking in that the attacker masquerades as a trustworthy electronic communication entity with the intent to steal personal data for malicious purposes. Although it is challenging to develop hard data at this stage, one general trend is verifiable: nearly all hacked accounts in Turkey begin to produce pro-AKP discourse. A self-proclaimed national hacking team, AyYıldız Team, claims responsibility for Twitter hacking and decorates the hacked accounts with pro-AKP or pro-Erdoğan images and discourse.
A famous comedian, Atalay Demirci, was a victim of phishing. When private messages from his Twitter account were released, he was arrested on the allegation that he was a member of the Gülenist movement. According to one of my interviewees, the hacking group might have received assistance from state institutions to retrieve deleted messages. In a previous hack, on June 7, 2016, the account of Akın İpek, a Gülenist businessman who fled to the United Kingdom in 2015, was decorated with Erdoğan’s smiling image and a purported apology. In yet another case, the account of Arzu A. Çerkezoğlu, general secretary of Turkey’s third-biggest trade union confederation, which has a proleft political orientation, was hacked by the same group. This time the account owner was accused of having links to Kurdish guerillas. I could not find evidence of more sophisticated deception tools in these cases. However, a BuzzFeed report (Hall 2018) showed that advanced software could have been used in these kinds of attacks.
DFRLab (2018) has shown that many tactics, such as doxing, are deployed through cross-platform coordination. It should be noted that the Turkish case includes not only internet platforms but also close cooperation with legacy media. AHaber news channel is notorious but is not the only example. AHaber either uses the content that is circulated by trolls on Twitter or becomes the source of content that is planned for distribution on Twitter and other digital platforms.
Zimmerman’s (2016) study of verbal abuse used by trolls against female journalists in Turkey is a reminder of the gendered nature of the phenomenon. Verbal abuse could be categorized as intimidating insults, humiliating insults, and sexually related insults. However, verbal abuse is not limited to female citizens, although they are the most frequent targets.
Trolls’ arguments may be limited and repetitive; however, that does not mean there is a lack of rhetorical innovation. Not all rhetoric is verbal abuse. It is essential to create and circulate code words, as Hawley (2017) elaborates in relation to the term white genocide. Sometimes these code words are used to bypass filters, but they are not only about the filters. AKTrolls invent new terms to debase their opponents, as well as catchphrases to convey their allegations. They rarely use the pro-Kurdish party name HDP, for instance, but instead use HDPKK to assert the party’s relation to the PKK, the banned Kurdish armed group. Against Gülenists, the pejorative wordplay CIAMAT was frequently used to denote alleged relations between the group (known as Cemaat) and the US Central Intelligence Agency (CIA).
Appropriating Popular Culture Themes
Divisive content seems to have utilized popular culture themes. Mainstream themes are transformed into polarizing content or devices of disinformation. Ordinary citizens may not always be able to detect transformed content while unsuspectingly consuming it. Russian trolls allegedly referenced pop culture creations such as SpongeBob SquarePants and Pokémon (Penzenstadler, Heath, and Guynn 2018), and the Pokémon Go game was also used in later trolling activities (Kulp 2017). The appropriation of Pepe the Frog, a popular internet meme (Nuzzi 2016), by alt-right groups in the United States may be the ultimate example in this context. Interestingly, some AKTrolls use Western popular culture symbols and nicknames. A prolific troll, @debuffer, uses Dr. House as its Twitter profile image. Sözeri (2017) narrates how an account with a “sexy girl profile picture” suddenly changed its name and brand to launch a smear campaign, using its 42,000 followers, against a legitimate election monitoring group, Oy ve Ötesi (Vote and Beyond).
In the United States, studies have shown that although most Americans are aware of bots and believe bots may have malicious intent, a significant number of citizens may not be able to differentiate bots from humans (Stocking and Sumida 2018). Users with a low level of technical expertise can voluntarily become bot owners. Pirate sites can easily and cheaply offer bot services or fake followers for Twitter or Facebook. Despite an emphasis on state-level interventions and technical expertise, ordinary people can effectively use bots, as in the case of MicroChip, a notorious pro-Trump Twitter ringleader once described by a Republican strategist as the “Trumpbot overlord” (Bernstein 2017). The use of automated bots has been on the rise and does not require sophisticated investment (Agarwal 2017). A group of graduate-student researchers demonstrated that heavy use of automated bots played a crucial role in countering anti-AKP discourse after the Ankara bombings in Turkey in October 2015. When Twitter banned a bot-powered hashtag that praised President Erdoğan (Hürriyet Daily News 2016), Turkish ministers immediately talked of a global conspiracy against Erdoğan; the ban, however, was probably due to Twitter’s struggle against spam-creating bots and trolls (Lapowsky 2015). Notable political trolls with thousands of followers tend to have many accounts. According to an interview with an open-source coder, a troll can have up to one hundred accounts. Using an automated bot differs from having more accounts in terms of scale: one can sense the use of bots when a message is replicated or retweeted by more than a few hundred other accounts. It should also be noted that as of November 2016, Istanbul and Ankara were the top two cities for bot usage, according to the major internet security company Norton (Paganini 2016).
Luring Power Users
State-sponsored political trolling has the potential to lure influential users with large follower bases on social media. The story of Serkan İnci (Saka 2018), who leads the highly active communities İnciCaps and İnciSözlük, is striking. He was a Gezi Park activist but gradually moved to serve the government agenda and became an active component of AKTroll discourse. Secular columnist Haşmet Babaoğlu (now closely associated with the Pelikan group) and famous cartoonist and producer Hasan Kaçan are also examples of this practice, as their tweets serve the AKTroll discourse.
Mobilizing against International Foes
Mobilizing against international foes has become an essential practice in the posthegemony period. Any critique of Turkish policies is subject to swarm attacks. As a harbinger of cases to come, when Swedish minister of foreign affairs Margot Wallström criticized a Turkish constitutional court decision on the legal age of marriage in a Twitter message, her message drew a mass of replies and gave rise to the Twitter campaign #DontTravelToSweden, which warned travelers not to go to Sweden because of allegedly high rape rates. In addition, the European Parliament’s Turkey rapporteur, Kati Piri, is attacked online whenever a report critical of Turkey is released by a European Union institution.
Grabbing Mainstream Media Attention
Hawley (2017) and many others have pointed out that the alt-right aimed to capture mainstream media attention in order to spread its extreme discourse. Turkey’s progovernment trolls have rapidly evolved to a new stage. Because Turkey’s mainstream media has predominantly progovernment owners, the aim is no longer to capture mainstream media attention. Aligned trolls have the confidence to dictate their rhetoric to the media. When a political troll points out an issue, the intent is now to shape the media agenda, and critics of the progovernment agenda are quickly silenced through such means.
Adoption of the “Presidential System”
As noted at the beginning of this chapter, I take the constitutional change as the starting point of Erdoğan’s consolidation of power. AKTrolls maintained an ongoing battle mode, but after that point, the overall aim of their content was to justify government policies. It was not a smooth process, and as the intraparty rivalries intensified, it was harder to justify each government policy. Nevertheless, in terms of daily matters, most AKTrolls seemed to have agreed on the government line. When price hikes occurred, for example, a staunch supporter would claim that they were a consequence of an international conspiracy.
In the following sections, I will highlight important themes in Turkey’s progovernment political troll discourse from late 2018 to Turkey’s local elections in March 2019, when I monitored troll activities on Twitter. Based on previous ethnographic work and continued Twitter monitoring, I argue that these themes have relevance beyond the defined time period.
In a relatively decentralized scene, one node, the Pelikan group, became increasingly powerful after Davutoğlu’s decline. Well funded and well connected to the party leadership, members of this node have been at the center of troll infighting. Accusing others of being Gülenist has been their favorite line, but some also use this line against members within the group.
Emre Erciş, who rose to prominence with his anti-Gülenist positioning, was accused by another troll who showed that Erciş’s old Twitter messages were against state confiscation of Gülenist media. Erciş produced a long Twitter feed to show the evolution of his political identity from his leftist youth. The polemic turned ugly as they swore at each other. My findings showed that Erciş was always against Gülenists in public forums. He had investigated Gülenist prosecutors’ accusations against some Islamist circles who were allegedly connected with İran. Erciş’s case suggests that the troll scene is a tense place where unsubstantiated accusations are made to gain personal influence. At this specific juncture, being anti-Gülenist was the main currency, and accusations were made accordingly.
Trolls must be careful to note changes in government policies. When the Turkish government’s overall sympathy toward jihadists ended, some trolls began to accuse others of being too pro-Salafi. Arguments about Syria between projihadist and nationalist progovernment trolls became frequent. In another example, a columnist was accused of a pro-US stance because he allegedly did not criticize the United States while being critical of Turkey’s approach to the Venezuela crisis. Not being pro-Erdoğan enough is another common accusation. Even on the Xinjiang issue in China, where China is accused of committing genocide against the Uyghur Muslims (BBC 2021), AKTrolls were divided: some accused others of being pro-China for denying the oppression of Muslims there.
For outsiders and even for many secularists in Turkey, Islamists may look like a homogeneous bloc. The assumption was that Erdoğan could coordinate and lead most of the Islamists, but after he secured power, a growing number of nonconforming Islamists also became the government’s target.
A nonconformist preacher, Alparslan Kuytul, who had built a following mostly in Adana, was rearrested the day after he was released from prison. Trolls were quick to approve of his rearrest. In the trolls’ eyes, Kuytul was guilty because he did not criticize Fethullah Gülen, even after the coup attempt, and continued to criticize Erdoğan. Ahmet Taşgetiren was known as a wise man to many AKP followers. He is affiliated with an elite Naqshbandi sect and is the founding editor of its monthly magazine, Altınoluk. Throughout the AKP’s rise to power, he contributed to progovernment dailies and TV programs. However, the moment he started to voice seemingly well-intentioned criticisms, he became a target. At the time of writing, Taşgetiren had lost not only his column and TV programs but also his writing role at Altınoluk. Supposedly, conservatives lost interest in Taşgetiren because he had tried to cast doubt on the accusations made against Gülenists, although he had been a persistent critic of Gülenists throughout his writing career. Mustafa İslamoğlu is the leader of a political Islamist group that might have some cadres in the current administration. Since the coup attempt in 2016, AKTrolls have occasionally targeted him. Because of his existing cadre, he might be seen as a rival to the powerful Pelikan group. He was accused, for example, of taking money from foreign foundations linked to George Soros, including a human rights award he received in 1998. İslamoğlu had been arrested for an op-ed article that discussed Islamic solutions to the Kurdish question, and the award was labeled a grant from “globalists.” A reformist theology scholar, Mustafa Öztürk, was the target of coordinated attacks. He symbolizes those who actively work on non-Sunni theologies. Not all AKTrolls are religious, but the only acceptable religious path seems to be majority Sunnism. Öztürk publicly stated that he might decide to live abroad after the campaign.
During the local elections in March 2019, the Saadet Party and its voters became a subject of hate for many AKTrolls. This socially conservative, pro-Islamist party was founded in 2001 after the Turkish state shut down its previous iteration. Its origins date back to the 1960s, and Erdoğan, along with many others in the AKP leadership, started his political career in this tradition. The party follows the philosophy of its late legendary leader, Necmettin Erbakan. Erdoğan and his friends left the party and founded the AKP, claiming a center-right position with more Islamic tones but with a definite break from the Saadet Party tradition. A tense relationship has always existed between the two parties, but it came to a head when Saadet, formally and informally, cooperated with the other opposition parties during the last election. Saadet’s critique of the AKP resonated with constituencies beyond its traditional voter base. Consequently, AKTrolls increased their focus on Saadet. Any connection with the secular opposition would be rebuked in the name of Islamism or for alleged alliance with terrorists (e.g., accusations of cooperation with the pro-Kurdish party HDP).
Weapons of the Culture War
Trolls spend more time on issues that can be classified as part of the “culture wars.” I present a few examples in the next sections.
Against Western Movies on Public TV Channels
The TRT2 channel was relaunched as an attempt to reclaim the cultural field. It had been a culturally oriented channel but was shut down during the AKP regime. It was reopened after Erdoğan’s statements noting a desire to “win” on the cultural front. However, when the channel announced it would broadcast old Western movies, as it had before, some trolls saw this as a tool of cultural imperialism. In addition, Netflix’s Designated Survivor was cited as a show that whitewashed Gülenists in a particular episode.
Against Gender Equality and Homosexuality
Korean pop cultural themes and a yoga education project by the Ministry of Education led to a moral panic that Turkish children would lose their gender identity. The supposed aim was to destroy the institution of the family and thus society itself. Even unisex restrooms were taken as a bad sign. An international project on gender equality, mostly funded by the British embassy in Turkey, became a key target. The accusation was that this project, in the name of gender equality, would create LGBTI youth. This led to the project’s domestic partners being targeted.
At the same time, a troll targeted a campaign against a legal amendment that would lower the minimum marriage age from 15 to 12 (Evrensel 2019). The troll complained that those who wanted to marry young (the troll preferred to ignore the issue of underage marriage) were not as free as LGBT citizens. The message served to promote a particular conservative worldview colored with homophobia.
Women Who Discard the Veil
Trolls were particularly angered by news of women who had stopped wearing headscarves. A particular tactic was to launch ad hominem attacks on these women. The approach made headlines through BBC Turkish news (Kasapoğlu 2019), but it had been a talking point in some circles earlier (Çakır 2018). Corroborating this trend, a new survey claimed that the number of pious youth in Turkey had decreased since 2008 (Konda 2019). Trolls, however, tend to connect this phenomenon to a global conspiracy against the government.
One positive side to AKTrolls is that they oppose anti-immigrant agitation, given the government’s pro-Syrian-refugee policy. This is one area on which all AKTrolls agree. Nationalist parties or the center-left Republican People’s Party are mostly the source of anti-Syrian agitation. Justifications for hosting the refugees may change, but AKTrolls may be playing a critical fact-checking role: some have debunked fake news that aimed to stir agitation against the refugees. Apart from the Syrian civil war, which had a direct connection to Turkey’s domestic affairs, political trolls also increased their coverage of Turkey’s international relations. From US national security advisor John R. Bolton’s visit to Turkey to the crisis in Venezuela, trolls acted like a media force of the Turkish state, both within the country and abroad, on social media.
Trolls were very interested in the candidates chosen by the opposition parties during the 2019 local elections. AKTrolls worked laboriously to demonstrate the candidates’ possible links to the Turkish state’s currently designated enemies, such as the Kurdish guerilla movement, Gülenists, or radical leftists. The intention was to persuade non-AKP voters to break away from these candidates. Two major opposition parties, the Republican People’s Party and the İyi Party, were the main targets. A specific case in point is the treatment of Abdullah Gül and Ahmet Davutoğlu.
As of early 2019, although Gülenism remained a vital enemy, a new internal enemy had been constructed, which I label the “Davutoğlu and Gül front.” Both men played vital roles from the AKP’s founding and, until recently, were regarded as Erdoğan’s close allies. However, this relationship changed radically in recent years. In the immediate aftermath of the coup attempt, some trolls even talked about detaining them. This did not happen, and despite a visible distance from party politics, neither Davutoğlu nor Gül ever engaged in explicit confrontation. As the Gülenist threat lost its immediacy, AKTrolls criticized both men and their supposed allies daily. Accusations of being Davutoğlu’s man or Gül’s man were used interchangeably. In the meantime, Karar Daily gradually became a pro-Davutoğlu outlet as many journalists and columnists dismissed from more pro-Erdoğan dailies such as Yeni Şafak and Star began to work there.
All intraparty critics, some of whom were quite influential opinion leaders, were immediately classified as pro-Davutoğlu. Aydın Ünal was for a long time Erdoğan’s notorious speechwriter. Some of his earlier statements were closer to hate speech than mere incivility. According to @Tahaun, he lost the fight against the Pelikan group. Cemile Bayraktar, a columnist, supported the AKP ardently until recently. Both are now accused of being Davutoğlu supporters. İlhami Işık followed a career track similar to Bayraktar’s. And as far as I can recall, Taha Akyol was never an Islamist but a conservative nationalist columnist with a long career dating back to the pre-AKP period. He had served as a liaison between the government and the Doğan Media Group, but this role ended gradually as the latter retreated from the media industry. When he started writing for a pro-Davutoğlu daily, he was labeled another traitor.
Pohjonen and Udupa’s (2017) approach to extreme speech as an anthropological project, and the way it differs from hate speech discourse, is constructive for understanding political troll networks. “Extreme speech” helps researchers to better “contextualize online debate with attention to user practices and particular histories of speech cultures” (Pohjonen and Udupa 2017, 1173). Most of the content produced through political trolling may fall within the boundaries of the legal-normative discourse of hate speech, but a focus on user practices demonstrates that there may be many assemblages of practice that are hard to classify. Trolling increasingly exploits the ambiguity between hate speech and acceptable speech, and ethnographic research with a new research agenda, such as the extreme speech framework, can be more productive.
Unlike Whitney Phillips (2015), who based her research on self-identifying trolls, I study subjects who do not all identify themselves as such. In this study, trolls are understood as progovernment internet users who do not hide their identities and whose productive engagement with the authorities through social media networks can be seen as a form of digital surveillance, which in turn has restrictive consequences for citizens in the ranks of Turkey’s opposition. Evgeny Morozov (2012) highlighted this networked surveillance approach by demonstrating how authoritarian governments use social media to track and crush the opposition. Whereas Morozov emphasized software and platforms, I emphasize social, user-based surveillance. The possibility of being targeted leads to self-censorship, an effect strengthened by the fact that not only trolls but all citizens are asked to surveil critical voices. The overall impact is what I would label the “trollification” of ordinary users: this peculiar kind of vigilance has reached users at every level.
Nevertheless, too much emphasis on surveillance may be misleading. As the emerging themes demonstrate, progovernment political trolls in Turkey not only act as moral police but also function as “organic intellectuals” (Gramsci 2005). At the beginning of 2019, Erdoğan complained (Milli Gazete 2019) about his movement’s lack of success in the culture and art scenes and urged his government and supporters to focus on these areas. This political strategy was not easy to implement, and I suggest that the trolls’ first move was to replace the mainstream media as agenda setter and to shape public agendas and discourses themselves. After the sale of the Doğan Media Group (New York Times 2018), Turkey’s mainstream media became progovernment to an unprecedented degree. Along with pressure on senior journalists and mass sackings, the mainstream media establishment was rendered ineffective, and from cultural themes to foreign affairs, political trolls produced a media discourse that shaped mainstream outlets’ output. In the contemporary hypernetworked digital media landscape, trolls may be more digitally savvy than ordinary users (Phillips 2015), and progovernment mainstream media mostly follow the trolls, not vice versa, in the fast flow of information. This new division of labor in media production has led to a further deterioration in the quality of public debate.
A final finding is that political trolls may not always be reliable, even for an authoritarian government. Troll rivalries occasionally weaken the intended level of discursive inculcation. At the time of writing, I have observed that opposition circles have begun to access many of the government’s behind-closed-doors secrets through these rivalries. Moreover, some trolls become so alienated that they change sides radically. A typical case is that of @omerturantv72, whose account I have monitored since the beginning of this project. After Twitter closed his previous account, through which he had functioned as a relentless progovernment militant, he returned to Twitter with a new account whose critical messages are frequently shared by opposition users. Reliance on political trolls may thus create vulnerability: internal rivalries and the government’s weakening power can lead to shifting alliances and to groups changing political sides.
Agarwal, Amit. 2017. “How to Write a Twitter Bot in 5 Minutes.” Digital Inspiration, July 19, 2017. https://www.labnol.org/internet/write-twitter-bot/27902/.
Aras, Bülent. 2014. “Davutoğlu Era in Turkish Foreign Policy Revisited.” Journal of Balkan and Near Eastern Studies 16 (4): 404–418.
BBC. 2017. “How Pro-Russian Trolls Tried to Destroy Me.” BBC Trending (blog), October 6, 2017. https://www.bbc.com/news/blogs-trending-41499789.
———. 2018. “Russia Trolls ‘Spreading Vaccine Discord.’” August 24, 2018. https://www.bbc.com/news/world-us-canada-45294192.
———. 2021. “Uyghurs: MPs State Genocide Is Taking Place in China.” BBC News, April 23, 2021. https://www.bbc.com/news/uk-politics-56843368.
Bernstein, Joseph. 2017. “Never Mind the Russians, Meet the Bot King Who Helps Trump Win Twitter.” BuzzFeed News, April 5, 2017. https://www.buzzfeednews.com/article/josephbernstein/from-utah-with-love.
Brown, Kristen V. 2016. “How a Racist, Sexist Hate Mob Forced Leslie Jones Off Twitter.” Splinter, July 19, 2016. https://splinternews.com/how-a-racist-sexist-hate-mob-forced-leslie-jones-off-t-1793860398.
Çakır, Ruşen. 2018. “Başörtüsünü çıkaran kadınlar.” Medyascope, September 7, 2018. https://medyascope.tv/2018/09/07/basortusunu-cikaran-kadinlar/.
Coleman, E. Gabriella. 2011. “Anonymous: From the Lulz to Collective Action.” New Everyday, April 6, 2011. http://mediacommons.org/tne/pieces/anonymous-traveling-pure-lulz-land-political-territories.
———. 2012. “Phreaks, Hackers, and Trolls: The Politics of Transgression and Spectacle.” Social Media Reader 5:99–119.
Coleman, Gabriella. 2014. Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous. London: Verso.
Coren, Michael J. 2016. “Internet Trolls Are Even More Hostile When They’re Using Their Real Names, a Study Finds.” Quartz, June 27, 2016. https://qz.com/741933/internet-trolls-are-even-more-hostile-when-theyre-using-their-real-names-a-study-finds/.
Dahlberg, Lincoln. 2001. “The Internet and Democratic Discourse: Exploring the Prospects of Online Deliberative Forums Extending the Public Sphere.” Information, Communication and Society 4 (4): 615–633.
Dewey, Caitlin. 2014. “The Only Guide to Gamergate You Will Ever Need to Read.” Washington Post, October 14, 2014. https://www.washingtonpost.com/news/the-intersect/wp/2014/10/14/the-only-guide-to-gamergate-you-will-ever-need-to-read/.
DFRLab. 2018. “#TrollTracker: Journalist Doxxed by American Far Right.” Medium (blog), June 17, 2018. https://medium.com/dfrlab/trolltracker-journalist-doxxed-by-american-far-right-7881f9c20a16.
Diken. 2016. “‘Ak Trol’ Taha Ün: Tartışmalar Yatışana Kadar Kenardayım.” Diken (blog), May 4, 2016. http://www.diken.com.tr/pelikandan-sonra-ulasilamayan-ak-trol-taha-un-tartismalar-yatisana-kadar-kenardayim/.
Erensü, Sinan, and Ayça Alemdaroğlu. 2018. “Dialectics of Reform and Repression: Unpacking Turkey’s Authoritarian ‘Turn.’” Review of Middle East Studies 52 (1): 16–28.
Evrensel. 2019. “TBMM’de erken yaşta evlilik için af yasası hazırlanıyor.” Evrensel, July 25, 2019. https://www.evrensel.net/haber/371042/tbmmde-erken-yasta-evlilik-icin-af-yasasi-hazirlaniyor?a=ee5cd.
Filipovic, Jill. 2007. “Blogging while Female: How Internet Misogyny Parallels Real-World Harassment.” Yale Journal of Law and Feminism 19:295–303.
Funke, Daniel, and Susan Benkelman. 2019. “How Russia’s Disinformation Strategy Is Evolving.” Poynter. Accessed May 31, 2019. https://www.poynter.org/fact-checking/2019/how-russias-disinformation-strategy-is-evolving/.
Glaser, April. 2018. “Russian Bots Wasted No Time Trying to Confuse People after the Mueller Indictment and the Parkland Shooting.” Slate, February 20, 2018. https://slate.com/technology/2018/02/russian-bots-were-active-after-the-florida-shooting-and-the-latest-mueller-indictment.html.
Graff, Garrett M. 2018. “Russian Trolls Are Still Playing Both Sides—Even With the Mueller Probe.” Wired, October 19, 2018. https://www.wired.com/story/russia-indictment-twitter-facebook-play-both-sides/.
Gramsci, Antonio. 2005. “The Intellectuals.” Contemporary Sociological Thought, 60–69.
Hall, Ellie. 2018. “Celebrities Say White Supremacists Used A New Video App to Trick Them into Endorsing Anti-Jewish Conspiracy Theories.” BuzzFeed News, November 30, 2018. https://www.buzzfeednews.com/article/ellievhall/celebrities-white-supremacists-video-app-cameo-anti-semitic.
Hardaker, Claire. 2010. “Trolling in Asynchronous Computer-Mediated Communication: From User Discussions to Academic Definitions.” Journal of Politeness Research 6:215–224.
Hawley, George. 2017. Making Sense of the Alt-Right. New York: Columbia University Press.
Hsu, Stephen. 2018. “Russian Fake Tweets Visualized.” Towards Data Science, May 1, 2018. https://towardsdatascience.com/russian-fake-tweets-visualized-6f73f767695.
Hunt, Richard Reid. 2014. “Moving beyond Regulatory Mechanisms: A Typology of Internet Control Regimes.” Dissertations and Theses. Paper 1801. https://doi.org/10.15760/etd.1801.
Hürriyet Daily News. 2013. “I Did Not See Anyone Consume Alcohol in Mosque during Gezi Protests, Muezzin Says.” June 27, 2013. http://www.hurriyetdailynews.com/i-did-not-see-anyone-consume-alcohol-in-mosque-during-gezi-protests-muezzin-says-49573.
———. 2016. “Turkish Ministers Accuse Twitter of Plotting against Erdoğan.” March 30, 2016. http://www.hurriyetdailynews.com/turkish-ministers-accuse-twitter-of-plotting-against-erdogan--97106.
Kasapoğlu, Çağıl. 2019. “Başörtüsünü Çıkaranlar: Neden Bu Kararı Alıyorlar, Neler Yaşıyorlar?” BBC, January 4, 2019. https://www.bbc.com/turkce/haberler-turkiye-46758752.
Konda. 2019. “What Has Changed in Youth in 10 Years?” Konda Interactive. https://interaktif.konda.com.tr/tr/Gencler2018.
Kulp, Patrick. 2017. “Election-Meddling Russian Troll Farms Tried to Use Pokémon Go to Stir Racial Tensions.” Mashable, September 12, 2017. https://mashable.com/2017/10/12/pokemon-go-russian-troll-farm/.
Lapowsky, Issie. 2015. “Why Twitter Is Finally Taking a Stand against Trolls.” Wired. April 21, 2015. https://www.wired.com/2015/04/twitter-abuse/.
Lokot, Tetyana. 2016. “Center for Monitoring Propaganda and Disinformation Online Set to Open in Russia.” Global Voices (blog). March 26, 2016. https://globalvoices.org/2016/03/26/center-for-monitoring-propaganda-and-disinformation-online-set-to-open-in-russia/.
Lowen, Mark. 2017. “Why Did Turkey Hold a Referendum?” BBC, April 16, 2017. https://www.bbc.com/news/world-europe-38883556.
Milli Gazete. 2019. “Erdoğan: Kültür sanat meselesi terörle mücadele kadar önemli.” January 10, 2019. https://www.milligazete.com.tr/haber/1778425/erdogan-kultur-sanat-meselesi-terorle-mucadele-kadar-onemli.
Morozov, Evgeny. 2012. The Net Delusion: The Dark Side of Internet Freedom. New York: PublicAffairs.
New York Times. 2018. “Turkish Media Group Bought by Pro-Government Conglomerate.” March 22, 2018. https://www.nytimes.com/2018/03/21/world/europe/turkey-media-erdogan-dogan.html.
Ng, Yi Shu. 2017. “Philippine President Admits He Used an Army of Social Media Trolls while Campaigning.” Mashable, July 25, 2017. https://mashable.com/2017/07/25/duterte-oxford-paid-trolls/.
Nuzzi, Olivia. 2016. “How Pepe the Frog Became a Nazi Trump Supporter and Alt-Right Symbol.” Daily Beast, May 26, 2016. https://www.thedailybeast.com/how-pepe-the-frog-became-a-nazi-trump-supporter-and-alt-right-symbol.
Ong, Jonathan Corpus, and Jason Cabanes. 2018. “In the Philippines, Political Trolling Is an Industry—This Is How It Works.” OpenDemocracy, February 20, 2018. https://www.opendemocracy.net/digitaliberties/jonathan-corpus-ong-jason-cabanes/in-philippines-political-trolling-is-industry-this.
Ozkan, Behlül. 2014. “Turkey, Davutoglu and the Idea of Pan-Islamism.” Survival 56 (4): 119–140.
Paganini, Pierluigi. 2016. “Which Are Principal Cities Hostages of Malicious Botnets?” Security Affairs, October 6, 2016. https://securityaffairs.co/wordpress/51968/reports/botnets-geography.html.
Paul, Christopher, and Miriam Matthews. 2016. The Russian “Firehose of Falsehood” Propaganda Model: Why It Might Work and Options to Counter It. No. PE-198-OSD, Perspectives. Santa Monica, CA: RAND Corp.
Penzenstadler, Nick, Brad Heath, and Jessica Guynn. 2018. “We Read Every One of the 3,517 Facebook Ads Bought by Russians. Here’s What We Found.” USA Today, May 11, 2018. https://www.usatoday.com/story/news/2018/05/11/what-we-found-facebook-ads-russians-accused-election-meddling/602319002/.
Phillips, Whitney. 2015. This Is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture. Cambridge, MA: MIT Press.
Danas Online. 2018. “SNS botovi napisali 10 miliona komentara.” Dnevni list Danas, December 2, 2018. https://www.danas.rs/politika/sns-botovi-napisali-10-miliona-komentara/.
Pohjonen, Matti, and Sahana Udupa. 2017. “Extreme Speech Online: An Anthropological Critique of Hate Speech Debates.” International Journal of Communication 11:1173–1191.
Reuters. 2019. “Erdogan Accuses Women’s March of Disrespecting Islam.” March 10, 2019. https://www.reuters.com/article/us-womens-day-turkey-erdogan-idUSKBN1QR0JT.
Rost, Katja, Lea Stahel, and Bruno S. Frey. 2016. “Digital Social Norm Enforcement: Online Firestorms in Social Media.” PLoS One 11 (6): e0155923. https://doi.org/10.1371/journal.pone.0155923.
Saka, Erkan. 2018. “Social Media in Turkey as a Space for Political Battles: AKTrolls and Other Politically Motivated Trolling.” Middle East Critique 27 (2): 161–177.
Shepherd, Tamara, Alison Harvey, Tim Jordan, Sam Srauy, and Kate Miltner. 2015. “Histories of Hating.” Social Media + Society 1 (2): 2056305115603997.
Soldatov, Andrei, Irina Borogan, Maeve Shearlaw, Shaun Walker, Marc Burrows, and Luke Harding. 2015. “What Spawned Russia’s ‘Troll Army’? Experts on the Red Web Share Their Views.” Guardian, September 8, 2015. https://www.theguardian.com/world/live/2015/sep/08/russia-troll-army-red-web-any-questions.
Sözeri, Ceren. 2016. “Trol gazeteciliği.” Evrensel.net. https://www.evrensel.net/yazi/77506/trol-gazeteciligi.
Sözeri, Efe Kerem. 2016. “Pelikan Derneği: Berat Albayrak, Ahmet Davutoğlu’nu Neden Devirdi?” Medium (blog). November 3, 2016. https://medium.com/@efekerem/pelikan-derne%C4%9Fi-berat-albayrak-ahmet-davuto%C4%9Flunu-neden-devirdi-5fabad6dc7de#.lfpf8807m.
———. 2017. “Trolls, Bots and Shutdowns: This Is How Turkey Manipulates Public Opinion.” Ahval, November 14, 2017. https://ahvalnews.com/freedoms/trolls-bots-and-shutdowns-how-turkey-manipulates-public-opinion.
Spence, Alex. 2018. “These Leaked Messages Show How Tory HQ Used A Twitter Army to Attack Jeremy Corbyn. But They Turned on Theresa May Instead.” BuzzFeed, September 26, 2018. https://www.buzzfeed.com/alexspence/these-leaked-messages-show-how-tory-hq-used-a-twitter-army.
Stocking, Galen, and Nami Sumida. 2018. “Social Media Bots Draw Public’s Attention and Concern.” October 15, 2018. http://www.journalism.org/2018/10/15/social-media-bots-draw-publics-attention-and-concern/.
Stray, Jonathan. 2017. “Defense Against the Dark Arts: Networked Propaganda and Counter-Propaganda.” Medium (blog). February 27, 2017. https://medium.com/tow-center/defense-against-the-dark-arts-networked-propaganda-and-counter-propaganda-deb7145aa76a.
Suler, John R., and Wende L. Phillips. 1998. “The Bad Boys of Cyberspace: Deviant Behavior in a Multimedia Chat Community.” Cyberpsychology and Behavior 1 (3): 275–294.
Tepper, Michele. (1997) 2013. “Usenet Communities and the Cultural Politics of Information.” In Internet Culture, edited by David Porter, 39–54. London: Routledge.
Udupa, Sahana. 2015. “Archiving as History-Making: Religious Politics of Social Media in India.” Communication, Culture and Critique 9 (2): 212–230.
Wagstaff, Keith. 2017. “The Dudes Exposed by Alt-Right Troll Milo Yiannopoulos’ Gross Emails.” Mashable, October 5, 2017. https://mashable.com/2017/10/05/tech-bros-milo-yiannopoulos/.
Yılmaz, Mehmet. 2015. “Who Fabricated This Sexual Fantasy?” Hürriyet Daily News, March 13, 2015. http://www.hurriyetdailynews.com/opinion/mehmet-y-yilmaz/who-fabricated-this-sexual-fantasy--79599.
Zimmermann, Çağla. 2016. “Feature: Turkey Trolls’ Use of Insults Stifling Reporting.” International Press Institute (blog). October 13, 2016. https://ipi.media/feature-turkey-trolls-use-of-insults-stifling-reporting/.