Small Wars Journal

The Bird Has Been Freed, and So Has a New Era of Online Extremism

By Ella Busch

FREEING THE FAR RIGHT

“The bird is freed.” With these words, @elonmusk announced his official takeover of the Twitter platform on October 27, 2022, at 11:49pm.[1] Elon Musk, the CEO of Tesla and now of Twitter, bought the company for $44 billion this fall. His implementation of a “Twitter 2.0” has been nothing short of problematic, with his self-proclaimed “extremely hardcore” workplace strategy[2] prompting the departure of roughly half of the company’s previous 7,500 employees. As the company’s sole board member, Musk has used his authority to apply his personal ideology of unmoderated speech, or “free speech absolutism,” to Twitter. The company has already stopped enforcing its previous Covid-19 misinformation policy, reinstated formerly banned accounts (including that of former President Donald Trump), and scaled back its moderation efforts.[3] This lack of moderation risks more than the circulation of false or hurtful communications: it is likely to cause extremists to flock to the platform to take advantage of unregulated speech, disseminate propaganda, and radicalize potential recruits to terrorist groups. Twitter’s new ownership and content moderation standards will worsen far-right extremism in the US because they allow for the creation and spread of far-right extremist (FRE) propaganda as well as the reemergence of figures who inspire and unify the far right. To mitigate this risk across all social media platforms, the United States must amend its current legislation on corporate responsibility for moderating hate speech online.

 

Social media plays a large role in the radicalization of far-right extremists, the largest category of domestic terrorists in the United States. 18 U.S.C. § 2331 defines domestic terrorism as “acts dangerous to human life that occur primarily within US territory and are intended (i) to intimidate or coerce a civilian population; (ii) to influence the policy of a government by intimidation or coercion; or (iii) to affect the conduct of a government by mass destruction, assassination, or kidnapping.”[4] Right-wing extremism is defined as “the use or threat of violence by subnational or non-state entities whose goals may include racial or ethnic supremacy; opposition to government authority; anger at women, including from the involuntary celibate (or “incel”) movement; and outrage against certain policies, such as abortion.”[5] This group poses the greatest threat to national security: 90% of all terrorist attacks in the United States in 2019 were perpetrated by far-right adherents.[6] The prevalence of the far right can be largely attributed to the popularity of social media, which serves as a platform for extremists to share their views and indoctrinate others into their ideology. In 2016 alone, 90% of extremists were radicalized at least partially via social media, and 23.4% of those individuals used Twitter as a primary source of extremist content.[7]

 

Terrorists use the internet for many purposes, just as ordinary people do. They use social media to host content, whether through images, videos, or live-streamed propaganda. They target a variety of audiences, including potential recruits, the media, and their enemies.[8] The online extremist community serves as an ‘echo chamber’ for hateful ideas: it confirms pre-existing radical beliefs and provides a sense of community by connecting individuals who share them. The rapid spread of extremist ideas via social media in turn accelerates indoctrination into violent groups.[9] In 2019, J.M. Berger concluded, “it is safe to assume that the total number of alt-right adherents on Twitter, including deceptive accounts such as bots and sock puppets, exceeds 100,000 and probably exceeds 200,000.”[10] Twitter’s significant far-right community is aided by the platform’s hashtag feature, which allows followers to attract attention to, and build communities around, shared views. Data show that the most common ideas promoted by the far-right Twittersphere are pro-Trump, white nationalist, general far-right, anti-immigrant and anti-Muslim, trolling/shitposting, and conspiracy and fake news content.[11]

 

MUSK & MODERATION

On November 16, 2022, in an effort to downsize Twitter’s global workforce, Musk sent a late-night ultimatum to his staff: “Going forward, to build a breakthrough Twitter 2.0 and succeed in an increasingly-competitive world, we will need to be extremely hardcore. This will mean working hard hours at a high intensity. Only exceptional performance will constitute a passing grade.”[12] Given a mere twenty-four hours to decide, the majority of Twitter’s senior staff chose to resign, including its former Head of Safety and Integrity, Head of Global Ad Sales, Chief Privacy Officer, Chief Security Officer, and Chief Compliance Officer,[13] largely out of concern over Musk’s ideology of free speech absolutism and its potential ramifications for the platform. He has since dismissed the platform’s former content moderation team, stating that a “content moderation council with widely diverse viewpoints” would be created in its place to decide moderation issues such as the suspension and reinstatement of Twitter accounts. One month later, Musk reneged on this promise, blaming activist groups for breaking the arrangement that had prompted it. He retweeted his earlier post announcing his vision for the council, adding, “A large coalition of political/social activist groups agreed not to try to kill Twitter by starving us of advertising revenue if I agreed to this condition [of creating the council]. They broke the deal.”[14] Instead, such content-related decisions would be made by Musk himself.

 

The absence of content moderation will allow far-right extremists to populate the platform and spread their ideologies to a wider audience. Twitter’s former Policy on Violent Organizations gave the company the right to permanently suspend accounts that “identify through their stated purpose, publications, or actions as an extremist group; have engaged in or currently engage in violence and/or the promotion of violence as a means to further their cause, and target civilians in their acts and/or promotion of violence.”[15] Twitter has cited such policies in terminating over 1.7 million accounts since August 2015.[16] Under Musk, there are no such rules. He describes the content moderation policy of Twitter 2.0 as “freedom of speech, but not freedom of reach,” stating that tweets deemed ‘negative’ or ‘hateful’ would be allowed on the site, but shown only to people who search for them.[17] Previously suspended accounts, meanwhile, would be granted “general amnesty, provided that they have not broken the law or engaged in egregious spam.”[18] In the twelve hours following Musk’s takeover of Twitter, the platform saw a 500% increase in the use of the “N word;”[19] anti-Black slurs overall rose 5000%.[20] Dozens of accounts espousing racially charged and neo-Nazi commentary were created on the platform, thanking Musk for lifting the restrictions on their speech.[21] Within one week of Musk’s acquisition, posts containing the word “Jew” increased fivefold, the majority of them anti-Semitic in nature.[22] Slurs against gay and trans persons increased by 58% and 62%, respectively.[23] These groups are all specifically targeted by FREs. According to the Anti-Defamation League, “These changes are already affecting the proliferation of hate on Twitter, and the return of extremists of all kinds to the platform has the potential to supercharge the spread of extremist content and disinformation.”[24]

 

REPERCUSSIONS OF ACCOUNT REINSTATEMENT

The reinstatement of previously banned accounts will allow for a reemergence of important, unifying figures among the far right, including former President Donald Trump. Trump was one of the most controversial members of the ‘Twittersphere’ throughout his term. The majority of his posts attacked Democrats, minorities, immigration, or US allies; 1,710 of his tweets promoted conspiracy theories, and an additional 40 promoted allegations of voter fraud.[25] Trump’s views and campaign of misinformation did not immediately result in his online suspension, but they did earn him a loyal follower base, primarily among the far right. Trump was officially removed from Twitter for inciting this group to violence in what was arguably an act of terror: the storming of the US Capitol on January 6, 2021, by a pro-Trump mob. The movement began on Twitter when Trump rallied his followers to “fight like hell” in protest of the certification of President Joseph Biden’s victory, which they viewed as the product of election fraud. The viral hashtag #StopTheSteal culminated in an insurrection by far-right extremist groups, mainly the anti-government Proud Boys and Oath Keepers, as well as “spontaneous clusters” of non-affiliated lone-wolf actors, who used violent force to destroy property, assault law enforcement, and disrupt the electoral process.[26] Trump’s role in the event resulted in his removal from Twitter, Instagram, Facebook, and Snapchat for the following 22 months. Musk reinstated Trump’s Twitter account on November 19, 2022, through a single ‘yes or no’ poll, with 51.8% voting yes, restoring his roughly 59,000 past posts and 72 million followers.

 

It is unclear, however, whether Trump will actually rejoin Twitter; following his suspension, he created his own social media platform, Truth Social,[27] which has become a hotspot for unregulated FRE content. The most common topics discussed on Truth Social concern gun rights, the January 6 insurrection, vaccines, LGBTQ issues, and abortion.[28] Truth Social is known for hosting the far-right conspiracy theory QAnon,[29] whose adherents believe that Donald Trump is waging a secret war against satanic pedophiles within the Democratic Party and remains the legitimate president of the United States.[30] Truth Social was found to have at least 88 users promoting QAnon ideology on their accounts, 32 of whom had previously been banned by Twitter. In August 2022, Trump was found to have reposted 65 QAnon-related messages over a four-month period, resharing the QAnon slogan WWG1WGA (Where We Go One, We Go All) as well as messages relating to “a war against sex traffickers and pedophiles.”[31]

 

Trump is not the only controversial character rejoining the Twittersphere; Musk has reinstated a plethora of far-right-leaning accounts. These include Jordan Peterson, a Canadian psychologist who champions misogyny;[32] the conservative Christian satire website the Babylon Bee, which mockingly awarded the transgender US Assistant Secretary for Health, Rachel Levine, the title of “Man of the Year;”[33] Representative Marjorie Taylor Greene, a known QAnon adherent who had repeatedly violated Twitter’s Covid-19 misinformation policy;[34] and Andrew Anglin, the founder of the neo-Nazi website the Daily Stormer.[35] The provision of ‘general amnesty’ to these hateful accounts sets a dangerous precedent in favor of the proliferation of FRE views on Twitter.

 

CODIFYING CORPORATE RESPONSIBILITY

Corporate responsibility to moderate extremist content must be codified through amendments to our current communications laws. Section 230 of the Communications Decency Act (CDA) shields companies from liability for user-generated content in the United States.[36] However, the law was written in 1996, and technology has come a long way since then. The need for reform puts policymakers in a bind, forcing them to weigh greater restrictions on the ever-changing social media landscape against the risk of running afoul of current law. As currently drafted, Section 230 enables Elon Musk to continue his pursuit of free-speech absolutism.[37] Two pending Supreme Court cases may challenge Musk’s sense of security: Gonzalez v. Google and Twitter v. Taamneh. In both cases, the families of terrorism victims filed actions against Google, Facebook, and Twitter for aiding and abetting the radicalization of the terrorists who carried out the attacks. The Supreme Court will decide whether these services should be held accountable for knowingly aiding terrorism.[38] These cases present an opportunity for the United States to strengthen its counterterrorism efforts online, possibly through mechanisms similar to those of the EU’s Digital Services Act. If nothing is done to change current legislation, the government risks a spike in far-right terror attacks in the United States, a phenomenon that is correlated with social media use.[39]

 

Even prior to Musk’s acquisition of Twitter, the social media industry faced great obstacles in addressing online extremism effectively, because doing so requires international and cross-platform cooperation. Social media companies vary in their rules and terms of service, yet online terrorist campaigns typically span three or more platforms: a smaller, less-regulated platform for private coordination, a second platform to store copies of data, and a third, large social media platform (such as Twitter) to amplify their message. This means that even if one platform suspends an account or removes terrorist content, terrorists can easily move to another. Although regulators usually focus on user-generated content, many terrorist efforts on social media involve funding and coordination rather than official propaganda, meaning that much terrorist content goes unnoticed by algorithms and human moderators. According to Dr. Erin Saltman, the Director of Programming at the Global Internet Forum to Counter Terrorism, ultimate privacy policies, such as those promoted by Musk, give terrorists a “free pass” to post dehumanizing and violent content.[40] Brian Fishman, who leads Facebook’s counterterrorism efforts, argues that policymakers’ pleas to “do better” are no longer enough; policymakers and academics must work directly alongside tech companies to develop best practices against online extremism. The scale of the online counterterrorism challenge is massive, exacerbated by terrorists’ ability to circumvent enforcement efforts.[41] In addition to platform-wide rules, regulations, and transparency measures, companies must interact regularly with policymakers to adapt to the ever-changing technological landscape.

 

THE INTERNATIONAL PERSPECTIVE

Twitter is already facing legal and financial repercussions for dismantling its content moderation policies. The European Union (EU) has warned Twitter that it risks heavy fines, up to 6% of the company’s annual global revenue, or even a complete operations ban if it fails to meet the content moderation standards set by the EU’s Digital Services Act (DSA). The law, which takes effect early next year, requires companies to police content that promotes terrorism, child sexual abuse, hate speech, and commercial scams.[42] Germany in particular has taken issue with Twitter’s lack of regulation, as the country has some of the strictest anti-hate-speech laws in the Western world. Its Network Enforcement Act (NetzDG) allows for fines of up to €50 million for failure to comply with content moderation standards.[43]

It is less clear how the United States will respond to such violations; while legislation has been proposed to counter hate speech online, free speech protections have greatly inhibited its passage. In March 2021, US Democrats reintroduced the Protecting Americans from Dangerous Algorithms Act, which would “hold large social media platforms accountable for their algorithmic amplification of harmful, radicalizing content that leads to online violence.” In a separate, bipartisan effort, Senators Amy Klobuchar and Cynthia Lummis introduced the NUDGE Act (Nudging Users to Drive Good Experiences on Social Media) in February 2022, which aims to study interventions against harmful language on social media.[44] To date, neither bill has passed.

 

The wave of senior leadership resignations has also drawn concern from the United States Federal Trade Commission (FTC), particularly in the wake of an earlier FTC dispute in May, in which Twitter paid a $150 million fine to settle allegations of misusing users’ private information.[45] As part of that settlement, Twitter agreed to report any changes in company structure to the FTC within fourteen days. Such consent orders carry the force of law; if violations are proven, they may result in fines, restrictions, and even sanctions on individual executives.[46] Because Musk neglected to inform the FTC of the company’s mass layoffs, Twitter faces the possibility of incurring such sanctions. Musk also faces scrutiny from Apple and Google, which have the power to remove apps that violate their content moderation standards. Apple’s developer standards state that apps cannot include sexually explicit, discriminatory, or “just plain creepy” content, including rhetoric against users’ “religion, race, sexual orientation, gender, [or] national ethnic origin.”[47] Given the rapid increase in hateful and discriminatory rhetoric on Twitter, Apple could plausibly bar the app altogether.

 

THE FUTURE OF TWITTER

Racist and anti-Semitic trolling has driven users toward Twitter’s lesser-known rival, Mastodon,[48] and prompted some of Twitter’s largest advertisers, including General Mills, Pfizer, Chipotle, United Airlines, and Audi, to abandon the platform.[49] IPG, one of the world’s largest advertising companies, has also warned its clients against advertising on Twitter over moderation concerns.[50] This advertising exodus has cost Twitter roughly $4 million per day in revenue; Musk himself has admitted that the company faces possible bankruptcy. The Global Alliance for Responsible Media, an influential ad industry trade group, responded with an open letter pleading for Twitter “to adhere to existing commitments to ‘brand safety.’”[51] Rather than strengthening moderation, Musk plans to reduce the company’s reliance on advertising and introduce a “Twitter Blue” subscription to boost revenue. The program will provide users with the blue check signaling account verification for a fee of $7.99 per month. Its launch has been delayed, however, in an attempt to avoid the 30% App Store fee that is standard for in-app purchases.[52]

 

There are various hypothetical scenarios regarding the future of Twitter. According to Twitter’s former Head of Trust and Safety, Yoel Roth, so long as the company faces political scrutiny and relies on advertising for 90% of its revenue, it will face “unavoidable limits” on its free speech policy. “In the longer term, the moderating influences of advertisers, regulators, and, most critically of all, app stores may be welcome for those of us hoping to avoid a dangerous escalation in the volume of dangerous speech online.”[53]

 

Bloomberg’s Parmy Olson disagrees, comparing Twitter to Telegram, an encrypted instant messaging service founded by libertarian billionaire Pavel Durov. Like Musk, Durov is a staunch advocate of free speech, as reflected in the platform’s remarkably scant content moderation policy: Twitter, she notes, has sixteen rules regarding content; Telegram has three. Despite being relatively unknown in the United States, Telegram is twice the size of Twitter, and its lack of moderation has not impeded its popularity,[54] suggesting that Twitter 2.0 may continue to thrive, albeit differently than before. It is important to note, however, that Telegram hosts its own thriving online terrorist community.[55] Darrell M. West of the Brookings Institution, meanwhile, outlines four potential scenarios for the future of Twitter: bankruptcy, little content moderation coupled with rampant extremism, difficulty maintaining technical infrastructure (due to the terminations of engineers and policy-related staff), and a reliance on premium services to fund the platform, or some combination of these.[56] Whatever the outcome, Elon Musk’s Twitter takeover has done irreparable damage to the platform’s reputation and future prospects.

 

By “freeing the bird,” Musk is risking not only the spread of hateful ideas and the accelerated radicalization of future terrorists, but also the integrity of Twitter as a social media giant. By reinstating the accounts of individuals espousing FRE ideas, including former President Donald Trump, Musk is sending the message that he welcomes such rhetoric on his platform, regardless of the consequences. The case of Twitter calls for a fundamental change in content moderation standards to counter violent extremism, perhaps along the lines of those adopted in the European Union. In his pursuit of absolute free speech, Musk has “sent up the batsignal to every kind of racist, misogynist, and homophobe that Twitter was open for business, and they have reacted accordingly.”[57] This has paved the way for the proliferation of far-right extremism, the most pressing counterterrorism issue facing our country, on one of the world’s most popular social media websites.

 

 

Notes

 

1            “Elon Musk Declares Twitter 'Moderation Council' – as Some Push the Platform's Limits.” The Guardian. Guardian News and Media, October 29, 2022. https://www.theguardian.com/technology/2022/oct/28/elon-musk-twitter-moderation-council-free-speech.

2           O'Sullivan, Donie, and Clare Duffy. “Elon Musk Gives Ultimatum to Twitter Employees: Do 'Extremely Hardcore' Work or Get out | CNN Business.” CNN. Cable News Network, November 16, 2022. https://www.cnn.com/2022/11/16/tech/elon-musk-email-ultimatum-twitter/index.html.

3            Powers, Benjamin. “Will Elon Musk's Lax Twitter Content Moderation Help Ignite Violence around the World?” Grid News. Grid News, December 2, 2022. https://www.grid.news/story/technology/2022/12/01/will-elon-musks-lax-twitter-content-moderation-help-ignite-violence-around-the-world/.

4            Jones, Seth G. “The Evolution of Domestic Terrorism.” The Evolution of Domestic Terrorism | Center for Strategic and International Studies. Center for Strategic and International Studies , February 17, 2022. https://www.csis.org/analysis/evolution-domestic-terrorism.

5               Jones, Seth G, Catrina Doxsee, and Nicholas Harrington. “The Tactics and Targets of Domestic Terrorists.” The Tactics and Targets of Domestic Terrorists . Center for Strategic and International Studies, November 22, 2022. https://www.csis.org/analysis/tactics-and-targets-domestic-terrorists.

6               Baele, Stephane J, and Lewys Brace. “Uncovering the Far-Right Online Ecosystem: An Analytical Framework and Research Agenda.” Translated by Travis G Coan. Taylor & Francis Online, July 17, 2020. https://www.tandfonline.com/doi/full/10.1080/1057610X.2020.1862895.

7               Jensen, Michael, Patrick James, Gary LaFree, Aaron Safer-Lichtenstein, and Elizabeth Yates. “Use of Social Media by US Extremists - UMD.” Use of Social Media by US Extremists. University of Maryland. Accessed December 3, 2022. https://www.start.umd.edu/pubs/START_PIRUS_UseOfSocialMediaByUSExtremists_ResearchBrief_July2018.pdf.

8               Fishman, Brian. “Crossroads: Counter-Terrorism and the Internet.” Texas National Security Review, February 16, 2022. https://tnsr.org/2019/02/crossroads-counter-terrorism-and-the-internet/.

9              Von Behr, Ines, Anais Reding, Charlie Edwards, and Luke Gribbon. “Radicalization in the Digital Era.” RAND Corporation, November 5, 2013. https://www.rand.org/pubs/research_reports/RR453.html.

10             Berger, J.M. “New Research Report: The Alt-Right Twitter Census by JM Berger - Vox - Pol.” VOX, August 13, 2019. https://www.voxpol.eu/new-research-report-the-alt-right-twitter-census-by-j-m-berger/.

11             Baele, Stephane J, and Lewys Brace. “Uncovering the Far-Right Online Ecosystem: An Analytical Framework and Research Agenda.” Translated by Travis G Coan. Taylor & Francis Online, July 17, 2020. https://www.tandfonline.com/doi/full/10.1080/1057610X.2020.1862895.

12             Kolodny, Lora. “Elon Musk Demands Twitter Staff Commit to 'Long Hours' or Leave: Read the Email.” CNBC. CNBC, November 17, 2022. https://www.cnbc.com/2022/11/16/elon-musk-demands-twitter-staff-commit-to-long-hours-or-leave.html.

13             Powers, Benjamin. “Will Elon Musk's Lax Twitter Content Moderation Help Ignite Violence around the World?” Grid News. Grid News, December 2, 2022. https://www.grid.news/story/technology/2022/12/01/will-elon-musks-lax-twitter-content-moderation-help-ignite-violence-around-the-world/.

14            Sharma, Bharat. “Elon Musk Blames His Failure to Set up Twitter Content Moderation Council on 'Activists'.” India Times, November 23, 2022. https://www.indiatimes.com/technology/news/elon-musks-promise-of-content-moderation-on-twitter-585527.html.

15             “Our Policy on Violent Organizations | Twitter Help.” Twitter. Twitter. Accessed December 3, 2022. https://help.twitter.com/en/rules-and-policies/violent-groups.

16             Lima, Cristiano, and Aaron Schaffer. “Analysis | As Twitter Defends Its Counterterror Work, Experts Fear a Spike under Musk.” The Washington Post. WP Company, November 30, 2022. https://www.washingtonpost.com/politics/2022/11/30/twitter-defends-its-counterterror-work-experts-fear-spike-under-musk/.

17             “Elon Musk Reinstates Trump's Twitter Account 22 Months after It Was Suspended.” CBS News. CBS Interactive, November 20, 2022. https://www.cbsnews.com/news/elon-musk-says-donald-trump-reinstated-twitter/.

18             Lima, Cristiano, and Aaron Schaffer. “Analysis | As Twitter Defends Its Counterterror Work, Experts Fear a Spike under Musk.” The Washington Post. WP Company, November 30, 2022. https://www.washingtonpost.com/politics/2022/11/30/twitter-defends-its-counterterror-work-experts-fear-spike-under-musk/.

19             Ray, Rashawn, and Joy Anyanwu. “Why Is Elon Musk's Twitter Takeover Increasing Hate Speech?” Brookings. Brookings, December 1, 2022. https://www.brookings.edu/blog/how-we-rise/2022/11/23/why-is-elon-musks-twitter-takeover-increasing-hate-speech/.

20            Ayad, Moustafa. “Islamic State Supporters on Twitter: How Is 'New' Twitter Handling an Old Problem? .” GNET. Global Network on Extremism and Technology, November 18, 2022. https://gnet-research.org/2022/11/18/islamic-state-supporters-on-twitter-how-is-new-twitter-handling-an-old-problem/.

21             “Elon Musk Declares Twitter 'Moderation Council' – as Some Push the Platform's Limits.” The Guardian. Guardian News and Media, October 29, 2022. https://www.theguardian.com/technology/2022/oct/28/elon-musk-twitter-moderation-council-free-speech

22            Ray, Rashawn, and Joy Anyanwu. “Why Is Elon Musk's Twitter Takeover Increasing Hate Speech?” Brookings. Brookings, December 1, 2022. https://www.brookings.edu/blog/how-we-rise/2022/11/23/why-is-elon-musks-twitter-takeover-increasing-hate-speech/.

23            Darcy, Oliver. “Hate Speech Dramatically Surges on Twitter Following Elon Musk Takeover, New Research Shows | CNN Business.” CNN. Cable News Network, December 2, 2022. https://www.cnn.com/2022/12/02/tech/twitter-hate-speech/index.html.

24             Darcy, Oliver. “Hate Speech Dramatically Surges on Twitter Following Elon Musk Takeover, New Research Shows | CNN Business.” CNN. Cable News Network, December 2, 2022. https://www.cnn.com/2022/12/02/tech/twitter-hate-speech/index.html.

25             Harris, Rich, Blacki Migliozzi, Matthew Rosenburg, and Rachel Shorey. “How Trump Reshaped the Presidency in over 11,000 Tweets.” The New York Times. The New York Times, November 2, 2019. https://www.nytimes.com/interactive/2019/11/02/us/politics/trump-twitter-presidency.html.

26             Clifford, Bennett, and Jon Lewis. “Assessing Domestic Violent Extremism One Year after the Capitol Siege.” Lawfare. Lawfare, January 18, 2022. https://www.lawfareblog.com/assessing-domestic-violent-extremism-one-year-after-capitol-siege.

27             “Elon Musk Reinstates Trump's Twitter Account 22 Months after It Was Suspended.” CBS News. CBS Interactive, November 20, 2022. https://www.cbsnews.com/news/elon-musk-says-donald-trump-reinstated-twitter/

28            Forman-Katz, Naomi, and Galen Stocking. “Key Facts about Truth Social.” Pew Research Center. Pew Research Center, December 2, 2022. https://www.pewresearch.org/fact-tank/2022/11/18/key-facts-about-truth-social-as-donald-trump-runs-for-u-s-president-again/.

29             Bond, Shannon. “Elon Musk Allows Donald Trump Back on Twitter.” NPR. NPR, November 20, 2022. https://www.npr.org/2022/11/19/1131351535/elon-musk-allows-donald-trump-back-on-twitter.

30            Roose, Kevin. “What Is QAnon, the Viral pro-Trump Conspiracy Theory?” The New York Times. The New York Times, August 18, 2020. https://www.nytimes.com/article/what-is-qanon.html.

31             Hsu, Tiffany. “Qanon Accounts Found a Home, and Trump's Support, on Truth Social.” The New York Times. The New York Times, August 29, 2022. https://www.nytimes.com/2022/08/29/technology/qanon-truth-social-trump.html.

32             Beauchamp, Zack. “Jordan Peterson, the Obscure Canadian Psychologist Turned Right-Wing Celebrity, Explained.” Vox. Vox, March 26, 2018. https://www.vox.com/world/2018/3/26/17144166/jordan-peterson-12-rules-for-life.

33             Suciu, Peter. “The Babylon Bee's Twitter Account Was Suspended, but That Made Its Story Go Viral.” Forbes. Forbes Magazine, March 23, 2022. https://www.forbes.com/sites/petersuciu/2022/03/21/the-babylon-bees-twitter-account-was-suspended-but-that-made-its-story-go-viral/?sh=7fec6d3f209d.

34             Blistein, Jon. “Elon Musk Hasn't Been Able to Woo Trump Back to Twitter, so He's Trying Marjorie Taylor Greene Instead.” Rolling Stone. Rolling Stone, November 21, 2022. https://www.rollingstone.com/politics/politics-news/elon-musk-reinstates-marjorie-taylor-greene-twitter-account-1234634659/.

35             Hatmaker, Taylor. “Elon Musk Just Brought an Infamous Neo-Nazi Back to Twitter.” TechCrunch, December 2, 2022. https://techcrunch.com/2022/12/02/elon-musk-nazis-kanye-twitter-andrew-anglin/.

36            Mackey, Aaron, and Meri Baghdasaryan. “Section 230 of the Communications Decency Act.” Electronic Frontier Foundation. Accessed December 4, 2022. https://www.eff.org/issues/cda230.

37            Solomon, Aron. “Why Elon Musk Banned Ye for 'Inciting Violence'–and What It Means for the Future of Twitter's Content Moderation Policy.” Fortune. Fortune, December 2, 2022. https://fortune.com/2022/12/02/why-elon-musk-banned-ye-inciting-violence-twitter-content-moderation-policy-tech-politics-aron-solomon/.

38             Neschke, Sabine, Danielle Draper, Sean Long, Sameer Ali, and Tom Romanoff. “Gonzalez v. Google: Implications for the Internet's Future.” Bipartisan Policy Center, November 29, 2022. https://bipartisanpolicy.org/blog/gonzalez-v-google/.

39             Grelicha, Keanna, Indirah Canzater, Tiffany Dove, Dyuti Pandya, Clea Guastavino, and Cassandra Townsend. “Far-Right Extremists' Use of Social Media Platforms to Communicate and Spread Radicalized Beliefs.” The Counterterrorism Group. The Counterterrorism Group, December 20, 2021. https://www.counterterrorismgroup.com/post/far-right-extremist-use-of-social-media-platforms-to-communicate-and-spread-radicalized-beliefs.

40             Saltman, Erin. “Challenges in Combating Terrorism and Extremism Online.” Lawfare. Lawfare, October 16, 2021. https://www.lawfareblog.com/challenges-combating-terrorism-and-extremism-online.

41            Fishman, Brian. “Crossroads: Counter-Terrorism and the Internet.” Texas National Security Review, February 16, 2022. https://tnsr.org/2019/02/crossroads-counter-terrorism-and-the-internet/.

42            Betz, Bradford. “Eu Warns Musk It May Ban Twitter over Concerns about Content Moderation.” Fox Business. Fox Business, November 30, 2022. https://www.foxbusiness.com/politics/eu-warns-musk-ban-twitter-concerns-about-content-moderation

43            Lomas, Natasha. “Musk's Impact on Content Moderation at Twitter Faces Early Test in Germany.” TechCrunch, November 21, 2022. https://techcrunch.com/2022/11/21/elon-musk-twitter-netzdg-test/.

44            Ray, Rashawn, and Joy Anyanwu. “Why Is Elon Musk's Twitter Takeover Increasing Hate Speech?” Brookings. Brookings, December 1, 2022. https://www.brookings.edu/blog/how-we-rise/2022/11/23/why-is-elon-musks-twitter-takeover-increasing-hate-speech/.

45            Dave, Paresh, and Katie Paul. “Musk Warns of Twitter Bankruptcy as More Senior Executives Quit.” Reuters. Thomson Reuters, November 11, 2022. https://www.reuters.com/technology/twitter-information-security-chief-kissner-decides-leave-2022-11-10/.

46            Fung, Brian. “Musk's Twitter May Have Already Violated Its Latest FTC Consent Order, Legal Experts Say | CNN Business.” CNN. Cable News Network, November 11, 2022. https://www.cnn.com/2022/11/11/tech/musk-twitter-ftc/index.html.

47            Rainey, Clint. “Why Apple and Google Could Be the Biggest Threats to Elon Musk's Anything-Goes Version of Twitter.” Fast Company, November 21, 2022. https://www.fastcompany.com/90815181/why-apple-and-google-could-be-the-biggest-threat-to-elon-musks-anything-goes-version-of-twitter.

48             Ayad, Moustafa. “Islamic State Supporters on Twitter: How Is 'New' Twitter Handling an Old Problem? .” GNET. Global Network on Extremism and Technology, November 18, 2022. https://gnet-research.org/2022/11/18/islamic-state-supporters-on-twitter-how-is-new-twitter-handling-an-old-problem/.

49             Porterfield, Carlie. “Musk Wars with the Left: Suggests 'Activists' Killed Moderation Plan and Baits Black Lives Matter Supporters.” Forbes. Forbes Magazine, November 24, 2022. https://www.forbes.com/sites/carlieporterfield/2022/11/23/musk-wars-with-the-left-left-suggests-activists-killed-moderation-plan-and-baits-black-lives-matter-supporters/?sh=3d87e9bf2aaf.

50            Conger, Kate, Tiffany Hsu, and Ryan Mac. “Elon Musk's Twitter Faces Exodus of Advertisers and Executives.” New York Times. New York Times, November 1, 2022. https://www.nytimes.com/2022/11/01/technology/elon-musk-twitter-advertisers.html.

51            Roth, Yoel. “Opinion | I Was the Head of Trust and Safety at Twitter. This Is What Could Become of It. .” New York Times. New York Times, November 18, 2022. https://www.nytimes.com/2022/11/18/opinion/twitter-yoel-roth-elon-musk.html.

52            Duffy, Kate. “Elon Musk Has Delayed Twitter's Launch of Its Verified Subscription Service Again as It Tries to Bypass Apple's 30% App Store Fees, Report Says.” Business Insider. Business Insider, November 30, 2022. https://www.businessinsider.com/elon-musk-delays-twitter-blue-launch-avoid-apple-store-fees-2022-11.

53            Roth, Yoel. “Opinion | I Was the Head of Trust and Safety at Twitter. This Is What Could Become of It. .” New York Times. New York Times, November 18, 2022. https://www.nytimes.com/2022/11/18/opinion/twitter-yoel-roth-elon-musk.html.

54            Olson, Parmy. “Parmy Olson: Musk's Twitter Won't Die. Look at Telegram.” Lowell Sun. Lowell Sun, December 3, 2022. https://www.lowellsun.com/2022/12/03/parmy-olson-musks-twitter-wont-die-look-at-telegram/.

55            “Terrorists on Telegram.” Counter Extremism Project. Counter Extremism Project, May 2017. https://www.counterextremism.com/terrorists-on-telegram.

56            West, Darrell M. “The Future of Twitter: Four Scenarios.” Brookings. Brookings, November 22, 2022. https://www.brookings.edu/blog/techtank/2022/11/22/the-future-of-twitter-four-scenarios/.

57            Darcy, Oliver. “Hate Speech Dramatically Surges on Twitter Following Elon Musk Takeover, New Research Shows | CNN Business.” CNN. Cable News Network, December 2, 2022. https://www.cnn.com/2022/12/02/tech/twitter-hate-speech/index.html.

 

About the Author(s)

Ella Busch is a researcher at Georgetown University studying Government and Psychology. She has a particular interest in domestic terrorism and hopes to specialize in security in the future.