Platforms like Twitter and Facebook have taken action against election-related falsehoods. Yet committed believers can still find plenty of sources to feed those falsehoods elsewhere.
By Kristina Hook • Ernesto Verdeja
The immense influence of social media is undeniable. Over 300 million images are posted to Facebook every day, while 6,000 tweets are generated every second. The most popular channels on YouTube draw over 14 billion views each week, and the messaging service Telegram counts more than 500 million users. Yet these same platforms have also been used to incite violence, sow political strife, and foster instability. State-backed and private manipulation initiatives have multiplied, and researchers estimate that coordinated social media disinformation campaigns have operated in at least 81 countries.
This policy paper examines current trends in disinformation across social media platforms and offers recommendations for a wide range of actors based on these findings. We believe that the community tasked with preventing instability and atrocities must take up emerging challenges related to disinformation spread through social media, and we suggest ways of doing so. The preventive community must confront a simple but troubling reality: disinformation can swiftly morph across themes, and it takes only a few narratives gaining traction to undermine faith in factual claims and evidence-based practices.
Concluding Remarks
This policy study examines how social media misinformation (SMM) can exacerbate political instability and help legitimize mass atrocities. In particular, it aims to persuade leaders in fields touched by SMM to reflect on the distinct challenges posed by the deliberate or unintentional spread of false information in settings where mass atrocities are possible. Although context matters, the relationship between SMM and atrocity prevention is characterized by broad dynamics. This research suggests that SMM may be especially persuasive in atrocity-risk environments, where it can have a significant impact by fostering a permissive social bandwagon effect that encourages violence against the targeted group(s). We provide an overview of the challenges emerging from SMM and tailor recommendations to each stakeholder group in order to support atrocity prevention: social media corporations, established (legacy) media, non-governmental civil society actors, researchers and academics, and governments and multilateral organizations. Our hope is that this policy brief will serve as a springboard for further discussion among stakeholders on a contentious and difficult topic, thereby enlarging the atrocity prevention community in this emerging field.
Rather than focusing on a single case, this study presents a broad overview of SMM's consequences and offers recommendations to the community working to prevent instability and atrocities. The report has several parts. After a glossary of relevant terms, it discusses the main sociopolitical, psychological, and social media factors that amplify the effect of social media disinformation. Following an overview of SMM's primary functions, the study turns to the obstacles faced by the various stakeholders whose work involves atrocity prevention: social media companies, traditional media outlets, NGOs, academics, and government and international organizations. It concludes with policy recommendations for these constituencies. The report is based on the authors' prior research and field experience, as well as interviews with other experts.
Introduction
The immense influence of social media is undeniable. Over 300 million images are posted to Facebook every day, while 6,000 tweets are generated every second.1 The most popular channels on YouTube accrue around 14 billion views each week,2 while over 500 million people use the messaging service Telegram.3 Social media lets individuals from all walks of life communicate and share knowledge in ways that would have been unthinkable twenty years ago. Yet the widespread exploitation of these platforms has also fueled instability, political strife, and calls to violence. Social media manipulation by states and private firms is a growing problem, with studies showing that such operations have been active in at least 81 countries, a number that rises each year.4
We believe that the community tasked with preventing instability and atrocities must embrace emerging challenges related to social media misinformation (SMM), and we provide suggestions for how this might be done. Those working in the many fields concerned with preventing atrocities must face a simple but unsettling reality: disinformation can swiftly morph across themes, and it takes just a few narratives gaining traction to undermine faith in facts and evidentiary standards. Because of the dense, far-reaching interconnections across social media platforms, actors can launch various lies, accusations, and conspiracies and see which ones take root. Recent decades have also seen a rise in asymmetric conflicts, in which malevolent actors themselves determine the timing, location, and frequency of attacks. These actors may be foreign or domestic state actors, parastatal organizations, or non-state entities. Governments, civil society organizations, technology companies, media outlets, and others tasked with defending against these threats must decide where and how to direct their efforts; this imbalance forces defenders into a reflexive crouch. The volume, velocity, and growing sophistication of disinformation campaigns thus pose significant hurdles for stakeholders working to prevent instability and atrocities.
This report proceeds in several parts. After a glossary of relevant terms, it discusses the main sociopolitical, psychological, and social media factors that amplify the effect of social media disinformation. Following an overview of SMM's primary functions, the study examines the obstacles faced by the various stakeholders whose work involves atrocity prevention: social media companies, traditional media outlets, NGOs, academics, and government and international organizations. It concludes with policy recommendations for these constituencies.
This policy paper examines current trends in disinformation across social media platforms and offers recommendations for a wide range of actors. Our study draws on three main sources. The first is the most recent scholarship by academics and practitioners. Second, from October 2021 through May 2022, we interviewed experts from governments and intergovernmental organizations (40), human rights organizations (13), technology corporations and tech oversight organizations (10), media outlets (8), misinformation monitoring and fact-checking groups (6), research centers (16), and the computer science community (11).5 These longer-form, semi-structured interviews, conducted with both groups and individuals, covered: the nature of the misinformation and disinformation problems and their political, legal, and social consequences, especially with regard to atrocities and instability; interviewees' team or organizational responses; assessments of the broader technical, legal, and policy initiatives currently in place; the strengths and limitations of existing initiatives; and additional steps needed from the wider practitioner community.6 Finally, we draw on our own collaborative project, Artificial Intelligence, Social Media, and the Prevention of Political Violence, which uses novel artificial intelligence (AI) tools to identify types and patterns of visual misinformation spread across social media during political unrest, such as the ongoing Russia-Ukraine war.7
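To give a sense of what automated analysis of visual misinformation can involve, the sketch below flags images that appear to be recirculated out of context by comparing perceptual hashes against an archive of older imagery. This is a minimal, hypothetical illustration, not the project's actual pipeline; it assumes the third-party Pillow and imagehash Python packages, and the folder and file names are invented.

```python
# Illustrative sketch only: flag possibly recirculated ("out-of-context") images by
# comparing perceptual hashes of newly posted images against a reference archive.
# Assumes the third-party Pillow and imagehash packages; paths are hypothetical.
from pathlib import Path

from PIL import Image
import imagehash

HAMMING_THRESHOLD = 8  # small distances indicate near-duplicate images

def build_archive_index(archive_dir: str) -> dict:
    """Hash every JPEG in a directory of previously collected or verified images."""
    return {str(p): imagehash.phash(Image.open(p)) for p in Path(archive_dir).glob("*.jpg")}

def find_recirculated(candidate_path: str, index: dict) -> list:
    """Return archived images whose perceptual hash is close to the candidate's."""
    candidate_hash = imagehash.phash(Image.open(candidate_path))
    return [
        (archived, candidate_hash - archived_hash)  # '-' gives the Hamming distance
        for archived, archived_hash in index.items()
        if candidate_hash - archived_hash <= HAMMING_THRESHOLD
    ]

if __name__ == "__main__":
    index = build_archive_index("archive/")  # hypothetical folder of older imagery
    for archived, distance in find_recirculated("new_post.jpg", index):
        print(f"Possible recirculated image: {archived} (distance {distance})")
```

A near-duplicate match does not prove misinformation on its own; in practice such flags would be reviewed by analysts who check whether the image is being presented with a false date, location, or attribution.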
Definitions
In this study, we adopt broad formulations of several fundamental concepts in policymaking.8 The term "social media" describes online resources that facilitate real-time interaction and information exchange among users across digital communities. These include social networking sites like Facebook and Twitter, video- and photo-sharing services like YouTube and Instagram, instant messaging programs like WhatsApp, and hybrid platforms like Telegram. Scholars distinguish social media disinformation (SMD), the subset of false content that is purposefully spread online, from social media misinformation (SMM) more broadly.9 In practice, the boundaries are permeable: many disseminators of incorrect information genuinely believe the claims they are making.

Moreover, not all potentially damaging social media messages are entirely fake. They may be partly or mostly accurate but misleading or taken out of context, or they may promote beliefs that are problematic but not necessarily life-threatening. It can also be very difficult to demonstrate where precisely false claims first emerged.10

The spread of false information through social media may also be part of influence operations: long-term campaigns orchestrated by governments or non-state actors. In many other contexts, SMD appears highly improvised and decentralized. Like many in the policy community, we refer to social media disinformation (SMD) whenever there is a reasonable suspicion of an intent to mislead, as with influence operations, and we note important distinctions where they exist throughout the study.
What Social Media Can Do For Us
Four of the seven most popular social media platforms (Facebook, Messenger, WhatsApp, and Instagram) are owned by Facebook, and a fifth, YouTube, is owned by Google; the remaining most popular platforms are Chinese. WeChat, a Chinese app that "encompasses practically every area of human existence,"11 has proven to be the social media platform most effective at "grabbing, holding, and processing human attention."12 Facebook, for its part, has been working to merge its several subsidiaries (Facebook, Messenger, WhatsApp, and Instagram) into a single "one-stop-shop" platform.13
In the eyes of optimists, social media sites are a natural outgrowth of the internet's liberalizing spirit: vehicles for expanding people's agency and access to economic and political opportunity, broadening free expression, disseminating progressive ideas, and giving dissidents a voice.14 This optimistic view of the internet, traceable to John Perry Barlow's "Declaration of the Independence of Cyberspace,"15 was shared by several early investors in Silicon Valley social media businesses. Several of these companies, which traditionally took a neutral stance on political issues, have in recent years openly criticized governments.16 Mark Zuckerberg, the founder of Facebook, has advocated for a new global community to replace the "old" social infrastructure of the state, "which fights the flow of information, commerce, and immigration."17 Facebook, Zuckerberg has said, "is more like a government than a regular firm."18 Jared Cohen and Eric Schmidt, two Google executives, wrote about the profound effects the internet would have on future elections, speculating that governments "would be caught off guard when significant numbers of their population, armed with essentially nothing but mobile phones, take part in mini-rebellions that threaten their authority."19
By reflecting public opinion, social media may help democracies thrive. According to Clay Shirky, people's political beliefs may shift when they are exposed to the perspectives of their friends, family, and coworkers.20 Some observers believe that the rise of social media around the turn of the century shifted power from authoritarian governments to ordinary people fighting for freedom and social justice.21 In 2018, for example, Peter Singer and Emerson Brooking argued that social media "illuminated the dark crimes through which tyrants had long clung to power and provided a strong new method of grassroots mobilization."22 Manuel Castells describes social media as a "mobilizing force" with the potential to "topple an established dictatorship if everyone would join together."23 By minimizing coordination costs and improving shared awareness, these platforms can compensate for the weaknesses of undisciplined groups.24 Social media platforms figured in the unrest in Moldova in 2009 (dubbed "the first Facebook revolution"), Iran in 2009 ("the first Twitter revolution"), Russia in 2011 (an "almost-revolution"), and the first wave of Arab uprisings in 2011 (driven by "the Facebook-armed youth of Tunisia and Egypt").25 Yet, as Singer and Brooking argue, a counter-wave of authoritarianism employing social media itself, braided into a backlash of repression, censorship, and even violence, has emerged in response to these internet-enabled democratic movements.26
There is no single guaranteed outcome of social media use. Notwithstanding Facebook's purpose statement,27 these networks do not unite only those who support democratic values. As Zeynep Tufekci notes, white nationalists and radical Buddhist monks in Myanmar now have far more powerful instruments for propagating incitement to ethnic cleansing.28 Social media can be used to prop up incumbent politicians within a country or to help foreign authoritarian powers disseminate propaganda and disrupt elections.29 Populists, who pose a fundamental challenge to neoliberal ideology, also use it; their falsehoods and calls to outrage cloud people's judgment and stoke partisan passions.30 Over the past several years, social media platforms have acquired an unparalleled concentration of information power, which a wide range of actors exploit for both beneficial and harmful ends.
The Information Power of Social Media
With the massive volumes of data they have amassed, social media platforms have acquired "knowledge power" in recent years.31 Susan Strange argues that such power covers "what is thought or understood and the channels via which these beliefs, ideas, and knowledge are conveyed, or restricted." The ability to withhold information is just as important to this kind of power as the ability to disseminate it.32
Social media's information power takes many forms. Consider how much more information Facebook holds on an individual than the government does.33 In 2002, Google realized it could profile individuals based on their attributes and interests using the ancillary data it gathers, and then match advertisements to specific users.34 Google and Facebook have grown their ad revenue over the years at the expense of users' security and control over their personal information.35 Competing for profits under what Shoshana Zuboff has termed "surveillance capitalism,"36 companies with access to massive amounts of diverse data enjoy a decisive advantage. As a result, social media platforms are extending surveillance in both breadth (from the online world to the physical world of the car dashboard) and depth (collecting data on people's personalities, moods, and emotions).
Facebook's algorithms not only profile and micro-target users to sell more advertisements but also forecast human behavior to produce "prediction products" that make individuals more malleable.37 This influence was reportedly used to sway the outcome of the 2016 U.S. presidential election and the 2016 U.K. referendum on EU membership.38
The influential position social media now occupies in the media business is another sign of these platforms' knowledge power. Democratic states have an edge over non-democratic governments because their news media are more trusted by the public. Because of their prominence in people's lives and their ability to disseminate information rapidly, social media and search engines have been dubbed "the Fifth Estate" by Lucie Greene.39 They now shape how society functions more generally, including the kinds of media that are made, the places people travel, and the news and information that ordinary people see.40
Facebook's stated aim, as of 2012, is "to bring the world closer together by connecting everyone and everything in it."41 In only a few years, the reverse has occurred: as a platform, Facebook has helped drive wedges between communities.42 Two key causes stand out: the "filter bubble" phenomenon and the proliferation of fake news.
Facebook's algorithms often reinforce a "filter bubble" that keeps users from seeing other viewpoints and instead shows them only content that agrees with their existing views.43 Social media platforms operate within the digital "attention economy," which converts user engagement into monetary value: revenue increases in proportion to the number of people using a platform and being shown advertisements. To keep users engaged, Facebook shows them the most widely shared content, which often includes controversial or upsetting news stories that can push them toward more extreme positions.44 As a result, people are more likely to form tight-knit groups with others who share their views, which further isolates communities with different worldviews and contributes to polarization.45 YouTube's recommendation algorithm likewise reflects users' political leanings and viewing choices while often exposing them to content more extreme than their own views.46
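The engagement logic described above can be illustrated with a deliberately simplified sketch. It is not any platform's actual ranking algorithm; the posts, weights, and scores below are invented purely to show how ranking by engagement alone tends to surface the most inflammatory material.

```python
# Simplified illustration of engagement-weighted feed ranking (not a real platform's
# algorithm). Posts that provoke the most shares and comments float to the top,
# regardless of accuracy; the example posts and weights are invented.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    shares: int
    comments: int
    reactions: int

def engagement_score(post: Post) -> float:
    """Weight shares and comments more heavily, as they signal 'active' engagement."""
    return 3.0 * post.shares + 2.0 * post.comments + 1.0 * post.reactions

feed = [
    Post("Calm, factual local news update", shares=40, comments=15, reactions=300),
    Post("Outrage-bait rumor about a rival group", shares=900, comments=600, reactions=2500),
    Post("Neighborhood event announcement", shares=10, comments=5, reactions=80),
]

# Ranking purely by engagement pushes the inflammatory rumor to the top of the feed.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>8.1f}  {post.text}")
```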
The proliferation of fake news in recent years can be traced to the rise of social media platforms as news channels, where users create and distribute information with little external vetting, verification, or editorial oversight. Hunt Allcott and Matthew Gentzkow capture the common definition of "fake news" as "news pieces that are purposely and verifiably untrue, and potentially mislead readers."47 Partisan news is popular because it is cheaper to produce than objective reporting and because consumers want to signal their political leanings through their media consumption. When millions of people spread a hoax, it gains credibility, because readers assume it must be genuine. The most incendiary content spreads the furthest and fastest: disinformation travels far more rapidly and widely on Twitter than the truth, and because it is shared more widely, it generates more revenue for Twitter and other social media platforms.48
People are more likely to accept a statement that fits their existing beliefs when the population as a whole is deeply split along distinct in-group and out-group lines.49 In Myanmar and Sri Lanka, the spread of hate speech on social media platforms aided the ethnic cleansing of Rohingya Muslims and incited anti-Muslim riots, respectively.50
Over the past several years, political actors have begun to exploit the informational power of social media. A 2016 RAND report describes the "firehose of falsehood," a torrent of falsehoods, half-truths, and outright fabrications that has influenced democratic elections around the world, including in Ukraine, Italy, France, Germany, and the United States.51 In the United States, for instance, "firehosing" has included efforts to sway public opinion and instigate political demonstrations.52 Several scholars argue that authoritarian and illiberal states have begun exploiting social media to spread disinformation as a means of exercising "sharp power."53 Such power can chill free speech in democracies, deepen internal division, ratchet up ethnic tensions, reignite nationalism, undermine trust in the media and electoral processes, and destabilize the Western-led international order as a whole.54 Authoritarian and illiberal governments also combine social media's knowledge power with AI as a surveillance tool to gather and analyze massive volumes of data on entire populations. Moreover, they undermine reliable news outlets through "bot-fueled operations of trolling and diversion, or piecemeal dumps of stolen files aimed to overwhelm the focus of conventional media."55 When people start acting as if the regime's propaganda were true, the government no longer needs force to control them.56
But authoritarian and illiberal states are not the only ones that employ fake news to deepen internal divisions, radicalize opinion, and revive nationalism; the same occurs in free societies. The rise of populist leaders in several democracies has contributed to backsliding toward national populism, illiberalism, and even tyranny in some countries. Such leaders, write Adrian Shahbaz and Allie Funk, "create massive audiences around similar interests, lace their political message with misleading or incendiary information, and coordinate its spread across various platforms."57 Paolo Gerbaudo argues that populists are drawn to social media because it gives them a platform to challenge what they see as the pro-establishment slant of traditional news outlets. Because of the filter bubble, politically dissatisfied people can assemble online and rally violent support for anti-establishment politicians.58 Unchecked social media thus becomes a tool for manipulating politics and keeping people in line.59
In sum, the impact of social media is not always positive: it can be a liberating force for disseminating truthful information and knowledge, but it can also be a repressive force when used to promote propaganda and falsehoods. Social media cuts both ways, since authoritarian governments and grassroots movements alike can access and use these channels.60
The Role of Misinformation in Social Media
Although there is currently little evidence that SMM is the direct cause of serious instability or mass atrocities, it can be an enabler,61 legitimizing and accelerating violence in a number of ways. Below, we classify these under three main purposes of disinformation. SMM has contributed to:
INCREASED VULNERABILITY OF TARGETED GROUPS THROUGH:
- Deteriorating relationships between people and institutions. Studies show that individuals subjected to a disinformation campaign are less likely to interact with others and more likely to form closed political groups online, where they are less exposed to competing viewpoints. This also creates new opportunities for radicalization.62
- Hateful, divisive language that normalizes the view of political opponents as untrustworthy, even existential, foes. Extreme and persistent demeaning language provides cover for exclusion, the denial of rights, and even physical assault.63
- Discrediting leaders or opposition organizations by spreading false claims against them, including accusations of corruption or disloyalty. This is particularly common in the run-up to major events such as elections, which are generally associated with elevated risks of atrocities.
- Attacking critics in an effort to silence them and shut down debate. Persistent harassment of critics signals to would-be dissenters that they, too, may become targets. It can also make violence against political opponents appear less costly.
GROUP UNITY INSPIRED BY:
- Presenting one's own community as the genuine protector of vital values and the only hope for the future. This serves a positive function for the in-group, helping members see themselves as part of a just cause with high stakes, which can weaken restraints on their use of violence. The normative weight of the language used to describe differences between groups leaves little room for criticism or negotiation.
- Forging a shared identity in response to perceived widespread persecution, with terror and resentment at its core, a common dynamic in atrocity settings.64
- Building support for collective self-defense or preemptive attacks against imagined foes, another characteristic of societies at risk of committing atrocities.65 Social media is an efficient platform for forming extreme advocacy organizations, hidden cells, and militia groups, and for disseminating calls for direct attacks on opponents.
How Social Media Misinformation Complicates Efforts to Halt Conflict and Genocide
The spread of false information through social media creates a number of operational difficulties for organizations working to prevent, respond to, and prosecute atrocities. Any program to prevent instability and atrocities across different regions must take the following challenges into account in order to develop policies and methods that are both feasible and effective.
SPEED
Misinformation spreads faster and farther because of the proliferation of social media and the tech culture that supports it. According to one technology and policy expert, the "maximum engagement approach" pushes content into the public eye without any consideration of whether it will be critically examined.61 Research suggests that social media platforms are also changing norms, expectations, and practices in journalism, shaping professional cultures across digital, print, television, and radio outlets: journalists report feeling implicit or explicit pressure to "publish content online quickly at the expense of accuracy" for commercial reasons.62 Our conversations pointed to a related trend, with officials (and others) feeling the need to react to emergencies quickly. This acceleration has contributed to the global spread of COVID-19 deception and misinformation by making it harder to give consideration to opposing viewpoints.63 SMM's rapid pace centralizes, strengthens, and amplifies confirmation biases.
CHAOS
The proliferation of social media platforms has produced an overwhelming amount of information to sort through, making it difficult for those tasked with preventing instability and atrocities to identify and prioritize what matters most for decision-making while also sifting out potentially misleading data. The growth of SMM parallels the development of the contemporary information era, in which information has become a productive force in its own right.64 Data are being produced at an unprecedented pace, with an estimated 44 trillion gigabytes generated every day, largely as a result of social media activity.65 Rather than answering policymakers' questions, these developments have added to the problem of data glut: "Without new skills, the dilemma of having too much data and too little insight would remain."66 As a former counselor to the U.S. Department of Defense put it, "If you could use human judgments, that's excellent, but replacing human judgment and skill with computational analytics does not work."67 The chaos of atrocity-risk environments presents some of the world's most difficult practical pattern-recognition problems. As one former government analyst observed, SMM adds "gasoline to the fog of war."68
CURATION
The nature of social media encourages curation, which external actors can exploit to promote divisiveness and conduct covert influence operations. The experts we consulted across a wide range of fields identified common threads about social media's distinctive curatorial power and its ramifications.69 One analyst views every social media encounter as shaped by unseen third parties, from software developers who build assumptions about user behavior into social media algorithms to political actors who use dissimulation to advance their interests. Another official described how saturating the information space with viral SMM material, such as false charges of impending violence by opponents, can steer political discourse. Atrocity perpetrators can exploit this dynamic to dehumanize their intended victims by encouraging audiences to focus on the perceived rather than the actual severity of a threat.70
According to experts, people are more likely to believe SMM when it comes from someone they know, such as a friend or family member. Social media has also extended traditional marketing practices of segmentation (dividing one's market into targeted groups) into micro-segmentation, so SMD propagandists can now test different disinformation narratives via promoted ads and adjust the content in real time based on audience feedback. Amid overlapping competing narratives, conspiracy, and confusion, those who can use such data strategically, including SMM and SMD actors, gain an edge in their messaging.71 In the policy recommendations below, we discuss how these dynamics underscore the importance of legacy media institutions and investigative journalism in mitigating the atrocity risks associated with SMM.
CAPACITY AND ANONYMITY
Ever more sophisticated tools are becoming accessible and user-friendly. Although the widespread availability and anonymity afforded by social media platforms may seem like a step toward a more egalitarian information ecosystem, they also give misinformation producers, amplifiers, and financiers greater room to maneuver. SMM becomes more likely as capacity and anonymity grow, but our respondents also noted that these same factors can help human rights defenders create or expand decentralized grassroots movements (e.g., Ushahidi and Una Hakika in Kenya).72 Our interviews show that experts are divided on how to weigh these two-sided effects, even as individual cases vary widely. Some see increased capability and anonymity as a net positive that enables new forms of localized peacebuilding, information sharing, and protection for otherwise vulnerable sources, while others remain skeptical that peacebuilding groups can use social media effectively in the face of powerful SMM and disinformation campaigns by states or other armed actors.73
SOCIAL MEDIA COMPANIES
This category of privately held businesses includes Meta, the parent company of Facebook and Instagram; microblogging platforms such as Twitter; and Alphabet, the parent company of the search engine Google and the video platform YouTube. While many major companies have dedicated SMM teams and plans in place, they still need to:
- Take into account the platforms' complicated, pivotal role in any measures against fake news or other information manipulation. Remind shareholders that tackling SMM is a prerequisite for leading in their market and is therefore expected by their user base.
- Adjust the algorithms that boost SMM, particularly by demoting and de-prioritizing material from SMM and conspiracy accounts in users' feeds; accounts that incite violence can be removed from platforms altogether.74 (A simplified sketch of such down-ranking appears after this list.)
- Consistently and aggressively remove fake and bot accounts. To demonstrate that companies are responding to user needs, the industry as a whole should adhere to a standard of regular public reports including statistical information and external reviews (as major companies like Meta now do).75
- Maintain strict adherence to established guidelines for content moderation, reporting, and deletion.
- Establish strict controls and empower employees to make flagging and removal decisions. Protect these employees from corporate reprisal, such as the termination of a contract with a service provider.
- Prioritize SMM cases consistently, not only when the media is watching. Integrate upstream, preventative monitoring with existing reactive response activities. By detecting manipulation attempts upstream, companies will be better prepared to prevent SMM, particularly misinformation, rather than only react to it. Stronger efforts are needed to incorporate lessons learned from past incidents.
- Have staff, executive decision-makers, and content moderators participate in yearly polarization and conflict awareness training. Content moderators need periodic refresher training and supplemental briefings on context-specific trends. Internal crisis monitoring teams should also collaborate with specialists in instability and atrocity prevention to deepen their understanding of these issues. With this knowledge, companies can better plan for future violence and periods of instability.
- Increase funding for local content-moderation partnerships in the Global South. Access to local specialists is a major asset, but these individuals often lack the resources to contribute fully.76 A number of specialists interviewed noted that many tech company professionals lack firsthand experience of societies ravaged by mass violence and warfare; conversely, some tech company staff have acknowledged that, despite employing some local expertise, they still have significant knowledge gaps.77
- Shield employees in content analysis and moderation roles who work in dangerous or repressive environments, including by strengthening the anonymization of company sources.
- Improve and formalize connections between the rapid-response community and those working to reduce instability and atrocities. This has begun to happen, but experts say it is still in its infancy.78 Investing now will reduce strain later.
- Refuse to self-censor or to share surveillance data on human rights activists who are working to stop violence in their communities, even when pressured to do so by recognized authoritarian governments.
- Make the most of corporate giving by sponsoring and participating in digital media literacy initiatives, journalism grants that remove media paywalls during instability and atrocity crises, and university research grant programs on poorly understood aspects of SMM (such as the specific reinforcing processes connecting online extremism and offline violence).
- Cooperate with ad networks to reduce or eliminate exposure for sites known to profit from spreading SMD.79
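As referenced in the recommendation on adjusting algorithms above, the following is a minimal, hypothetical sketch of what demoting flagged accounts in a ranking step could look like. It is not any company's actual system; the account names, demotion factor, and scores are invented for illustration.

```python
# Hypothetical sketch of demoting content from accounts flagged for repeated
# misinformation: their posts still circulate but receive a reduced ranking score.
# The account names, demotion factor, and base scores below are invented.
from typing import NamedTuple

DEMOTION_FACTOR = 0.2  # flagged accounts keep only 20% of their normal ranking score
FLAGGED_ACCOUNTS = {"conspiracy_hub", "fake_news_farm"}  # e.g., fed by fact-checker reports

class RankedPost(NamedTuple):
    author: str
    text: str
    base_score: float  # whatever score the normal ranking model would assign

def adjusted_score(post: RankedPost) -> float:
    """Apply the demotion multiplier to posts from flagged accounts."""
    if post.author in FLAGGED_ACCOUNTS:
        return post.base_score * DEMOTION_FACTOR
    return post.base_score

posts = [
    RankedPost("local_reporter", "Verified update on road closures", 40.0),
    RankedPost("conspiracy_hub", "Shocking rumor about an impending attack", 95.0),
]

# With demotion applied, the verified update outranks the higher-engagement rumor.
for post in sorted(posts, key=adjusted_score, reverse=True):
    print(f"{adjusted_score(post):>6.1f}  {post.author}: {post.text}")
```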
ESTABLISHED (LEGACY) MEDIA ORGANIZATIONS
- Recognize that strong journalism, atrocity prevention, and social media share common ground. Years of strong investigative journalism may be needed to counter future dehumanizing appeals to violence.80
- Rather than simply reporting extreme rhetoric, focus on fact-checking and analysis to establish a common ground from which to fight propaganda connected to atrocities.
- Invest in covering major stories over time, building a public record of knowledge that can be used against future misinformation campaigns that raise the likelihood of atrocities.
- Collaborate with non-governmental organizations to steer digital media literacy initiatives, a crucial step toward mitigating the spread of hateful content and the potential for atrocity via social media.
- Promote, defend, and uphold journalistic standards while also informing the public about their significance. Refusing to admit error under these standards erodes credibility as a fact-based arbiter; at times of heightened tension caused by a buildup of SMM-related atrocity risks, this trust will be crucial.
- Discuss openly the decline of primary and investigative reporting and the rise of news aggregators.
- Remove paywalls on the most important news stories.
- Build public confidence and digital literacy by participating in, supporting, and fostering professional development opportunities linked with reporting outside major cities (such as the USA Today Network's rural community reporting). Consider the viability of these approaches in settings facing long-term, structural atrocity risks.
- Address the human propensity for epistemic unease at public events that include academics and public intellectuals.
- Bring issues of bias and cognitive dissonance into mainstream discussion, particularly among journalists, and be transparent about the measures taken to reduce them in your reporting.
NON-GOVERNMENTAL CIVIL SOCIETY ACTORS
- Host tabletop exercises and SMM-related atrocity prevention simulations for stakeholders from all categories to lessen the impact of siloed initiatives and strengthen connections before crises erupt.
- Develop a shared strategy with other civil society groups to improve cooperation around monitoring and moderation on social media platforms, avoiding disparate approaches that undermine one another. Seek the social media companies' feedback and support.
- Promote social media accountability through public advocacy. Engage those with a vested interest in the platforms, and help reformers inside these companies build political capital. The public should also keep up the pressure on tech firms that "whitewash" their SMM efforts with cosmetic changes.
- Press the tech industry to develop enforceable industry-wide standards and practices for SMM (individual companies have little incentive to do so on their own). The Global Internet Forum to Counter Terrorism could be made far more effective through greater openness, broader participation, and measures that go beyond "low-hanging fruit."81
- Encourage funding for media outlets to remove paywalls and provide free access to important news.
- Be realistic about your organization's skills and resources, and map out a division of labor with other civil society groups.
- Contribute thorough conflict mapping of instability situations and actors to provide a foundation for tailored, long-term SMM responses. A better understanding of these contexts will help tech companies implement the recommendation for greater upstream, preventative action.
- Engage in critical introspection and prioritize creating spaces where experts from the Global South can share their knowledge.
RESEARCHERS AND ACADEMICS
- Collaborate with other stakeholders in tabletop exercises aimed at fostering professional networks. Building a pool of scholars with SMM and atrocity prevention expertise at all career levels will ensure long-term sustainability, so senior academics should push internally for such efforts to count toward early-career researchers' performance goals (such as tenure).
- Bolster distributed fact-checking systems so that false information can be identified, reported, and exposed, for example by pooling knowledge on issues of great political significance (elections, public health, security, etc.). (A minimal sketch of pooled fact-checking appears after this list.)
- Keep in mind the divergent, rapid-fire timelines of atrocity prevention and atrocity response, and consider how research can provide empirically supported frameworks for organizing information in real time.
- Consult regularly with other relevant parties to determine the specific forms of research required to fill out the atrocity prevention/SMM toolbox.
- Offer workshops for the public on SMM monitoring and digital media literacy together with members of civil society.
- Collaborate with psychologists and other specialists to create awareness-raising initiatives that combat biases stemming from SMM.
- Prioritize the use of SMM analysis in early warning systems for violence and instability.82
- Encourage the participation of local experts, particularly those from underrepresented groups in the Global South, in producing new knowledge on SMM. Provide safety and anonymity for those in dangerous or repressive environments.
- Use the university's media office, interviews with local journalists, and nonprofits like the Alan Alda Center for Communicating Science to hone media skills before submitting a grant proposal. Use these skills to "translate" scientific findings for the SMM practitioner community and the general public.
- Collaborate with specialists in preventing instability and atrocities to create more accurate and usable frameworks of "damage,"83 taking into account the rapid spread of information enabled by social media.
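As a minimal illustration of the pooled fact-checking idea mentioned above, the sketch below aggregates verdicts from several organizations so that claims flagged by multiple independent checkers can be surfaced quickly. The organizations, claim identifiers, and verdicts are hypothetical.

```python
# Hypothetical sketch of pooling fact-check verdicts across organizations so that a
# claim flagged by several independent checkers can be surfaced for wider exposure.
# Organization names, claim IDs, and verdicts are invented for illustration.
from collections import Counter

# verdicts[claim_id] maps each fact-checking organization to its verdict
verdicts = {
    "claim-001": {"CheckOrgA": "false", "CheckOrgB": "false", "CheckOrgC": "unverified"},
    "claim-002": {"CheckOrgA": "true", "CheckOrgB": "true"},
}

def consensus(claim_id: str, min_agreement: int = 2) -> str:
    """Return the most common verdict if enough organizations agree, else 'disputed'."""
    verdict, count = Counter(verdicts[claim_id].values()).most_common(1)[0]
    return verdict if count >= min_agreement else "disputed"

for claim_id in verdicts:
    print(claim_id, "->", consensus(claim_id))
```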
GOVERNMENTS AND MULTILATERAL ORGANIZATIONS
- Build in-house SMM analysis capacity, for example by filling information officer positions with people versed in the intersection of instability, atrocity prevention, and SMM.
- Work with academic experts to integrate SMM into policy toolkits for early warning and accountability. Support case study-based research on the transferability of methods, tools, and lessons learned.
- Raise understanding and awareness among policy analysts and decision-makers of how social media platforms, including lightly regulated ones such as Telegram and Gab, function on a global scale.
- Avoid excessive government involvement in the process, which can deepen public skepticism of government. Explore whether civil society or other stakeholders can take the lead in public messaging on a controversial issue.
- Raise these issues both internally and with government and multilateral partners, with a focus on internal mismatches in SMM working definitions. Asking the right questions can reveal whether a policy's ineffectiveness stems from conflicting mandates or ambiguous working definitions.
- Strengthen laws and international agreements so that digital firms that remove SMM are required to share data related to atrocities and instability with human rights researchers and prosecutors. In the tech industry, content that violates the applicable Terms of Service is generally removed swiftly and stored only briefly, even though it can be valuable for analysis and accountability. Current legal routes, such as the Stored Communications Act, the CLOUD Act, and Mutual Legal Assistance Treaties (MLATs), are time-consuming and inefficient.84 The Berkeley Protocol on Digital Open Source Investigations offers practical guidance on how to improve this.85
- Institutionalize clear responsibilities and rules to combat SMM, both internally and externally.86
- Take care not to undermine civil society's efforts to counteract SMM.
- Avoid simply adding effort without considering the intended end result.
- Democracies and international organizations should use their clout and lobbying opportunities with other states; despite the platforms' apparent influence, local content moderators for social media companies may still risk state persecution in unfree environments.
- Use global and regional organizations and platforms (such as the UN, EU, AU, and OAS) to better incorporate SMM analysis into existing networks, policies, and doctrine for preventing instability and atrocities. Much of this is currently done ad hoc, with little formalized knowledge exchange.87 Intergovernmental bodies can play a pivotal role in coordinating and exchanging information on relevant norms, regulations, and practical measures to counteract SMM.
- Legislators should consider passing laws that provide accountability and oversight mechanisms for violence and incitement directly attributable to platforms, complementing NGO- and academic-led monitoring activities.88
Acknowledgments
Several of the people we interviewed preferred to remain anonymous, and we appreciate their cooperation. The authors would also like to thank those who participated in the discussions: Isabelle Arnson, Amir Bagherpour, Rachel Barkley, Jeremy Blackburn, Cathy Buerger, Kate Ferguson, Tibi Galis, Andrea Gittleman, Derrick Gyamfi, Maygane Janin, Ashleigh Landau, Khya Morton, Savita Pawnday, Max Pensky, Iria Puyosa, Sandra Ristovska, and Walter Scheirer. Finally, we appreciate the help of the Stimson Center's Ilhan Dahir, James Finkel, Shakiba Mashayekhi, and Lisa Sharland.
About the Authors
Kristina Hook teaches in Kennesaw State University's School of Conflict Management, Peacebuilding, and Development. She is an expert on the prevention of genocide and mass atrocities, as well as on post-conflict recovery and the environmental damage caused by the Russia-Ukraine war. She regularly discusses these topics with government officials, international organizations, and human rights groups. Before entering academia, she served as a conflict stabilization policy adviser at the U.S. Department of State.
Ernesto Verdeja teaches at the University of Notre Dame's Kroc Institute for International Peace Studies in the Keough School of Global Affairs. He also serves as Executive Director of the Institute for the Study of Genocide. His research focuses on transitional justice, social media disinformation and mass violence, and the causes and prevention of genocide and other mass atrocities. He regularly advises governments and human rights groups on these matters.