Antisemitism Trend Alert

Denial of the October 7 Massacre on Social Media Platforms

Executive Summary

  • The attack on October 7, 2023, carried out by Hamas terrorists, is the deadliest massacre against Jews since the Holocaust. The attack was antisemitic not only because most of the victims were Jews, but also due to the intent of the attackers – to murder and terrorize Jews. This intent is made clear both by the 1988 Hamas Charter, which calls for a holy war against Jews, and by the stated intentions declared by the perpetrators of the massacre that day.
  • In addition to open celebration of these attacks by sympathizers and supporters, in the days following the October 7 massacre, well-produced content, misinformation, and disinformation denying the scope and nature of the Hamas massacre spread on social media platforms.
  • CyberWell identified three main sub-narratives perpetuating denial and distortion of the events of October 7: there were no acts of rape; the State of Israel orchestrated the violent events; and Israel and the Jews are profiting from the massacre.
  • Similar to Holocaust denial, which is widely recognized as antisemitic and regularly actioned as violating digital policy by most mainstream social media platforms, the denial and distortion of violent atrocities committed against Jewish people, such as the events of October 7, is antisemitic at the core.
  • The dataset in this report is based on 313 pieces of content collected from Facebook, Instagram, TikTok, X, and YouTube, denying and distorting the violent atrocities committed on October 7. This content was studied and verified by CyberWell’s team. Although these posts had a far-reaching impact, collectively garnering over 25 million views, after being reported to the platforms, only 6% of this dataset was removed. The numbers are even more alarming when examining X alone, which had a removal rate of only 2%. X is the leading platform hosting October 7 denial content in this dataset.
  • The low removal rate of content denying October 7 on social media platforms reveals major gaps either in the enforcement of platform policies or in the failure to include the October 7 massacre in the “list” of recognized violent events.
  • An in-depth examination of online discourse denying the October 7 atrocities indicates that a small number of prominent influencers spread this Jew-hatred as early as the morning of October 7. While journalists and reporters led denial discourse on X, TV stars and celebrities led denial discourse on TikTok and Instagram.

Introduction

Ahead of International Holocaust Remembrance Day on January 27, 2024, CyberWell analyzed one of the latest trending phenomena in online Jew-hatred – denying the atrocities committed by Hamas terrorists on October 7, 2023. The Holocaust and the October 7 massacre are two distinct historical events, and CyberWell cautions against conflating the two without appropriate context and nuance. However, aside from the purposeful violent targeting and mass murder of the Jewish minority, the two events share another commonality: the denial of the occurrence and scope of both events has been leveraged to promote Jew-hatred. Whereas Holocaust denial at its height was limited to fringe academic circles and extremist hate groups who gained a limited following through traditional media, conferences, and papers, today social media platforms provide an algorithmically enhanced stage to disseminate the antisemitic narrative of October 7 denial directly into the mainstream from a select few influential accounts.

According to the International Holocaust Remembrance Alliance’s (IHRA) working definition of antisemitism, a globally recognized consensus definition, Holocaust denial and distortion is antisemitism. Often this takes the form of claiming that Jews invented the Holocaust to gain sympathy for the establishment of the State of Israel.

On October 7 mass atrocities were committed by Hamas terrorists against the Jewish people in Israel. These atrocities were fueled by Hamas’ charter, which expressly commits the organization to waging a holy battle against the Jews. Hamas’ charter was updated in 2017, replacing the word “Jews” with “Zionists”; however, it maintained the position of being engaged in a holy war against the “Zionist project”, describing it as the enemy of the Arab and Islamic nations. Furthermore, Hamas leaders have continuously made virulent, hateful, and violent statements against the Jewish people in public and through religious pronouncements.

The central role that social media played during the Hamas massacre of Israeli civilians is historic and unique. First, Hamas terrorists harnessed social media as a vehicle of psychological torture, live-streaming their brutal attack. Second, Hamas terrorists and their sympathizers exploited social media as a mass misinformation and disinformation machine, amplifying claims that deny the reality of the atrocities and asserting that victims, survivors, and first responders are lying or exaggerating their experiences.

Despite the terrorists recording their actions and uploading images and videos to their own social media channels, the second wave of violent event denial and claims that Jews were lying to gain sympathy quickly began to unfold.

Denial of violent atrocities committed against Jewish people is antisemitism. The recycling of the same denial mechanism that was used against Jews following the Holocaust is once again being used against Jews today. In this report, CyberWell sets out to track, map, and analyze the emerging online trend of October 7 denial, demonstrate how denying this violent atrocity is antisemitic at the core, and make the case that social media platforms should prohibit October 7 denialism as stringently as they do Holocaust denial.

CyberWell

CyberWell is a non-profit organization dedicated to eradicating online antisemitism through driving the enforcement and improvement of community standards and hate speech policies across social media platforms. Through data, we identify where policies are not being enforced and where they fail to protect Jewish users from harassment and hate. Our unique methodology consists of identifying antisemitic keywords, applying a specialized dictionary based on the International Holocaust Remembrance Alliance’s (IHRA) definition of antisemitism, and human review. Our professional analysts are trained in the fields of antisemitism, linguistics, and digital policy, and vet each piece of content we collect both based on the IHRA definition and according to what, if any, policy that content violates. CyberWell currently monitors Facebook, Instagram, X (formerly Twitter), TikTok, and YouTube in both English and Arabic.

October 7 Denial Dataset

In the month leading up to International Holocaust Remembrance Day, CyberWell conducted a deep-dive and reviewed 910 social media posts that were flagged by our AI technology as highly likely to include discourse connected to October 7 denial and distortion from Facebook, Instagram, TikTok, X, and YouTube in Arabic and English.

This narrative is characterized by claims that the violent events Hamas committed on October 7 either did not take place or were exaggerated/falsified in both scale and nature. CyberWell then categorized these posts according to the major narratives promoting denial.

Out of the 910 posts analyzed, CyberWell confirmed 313 posts as October 7 denial or distortion. It is our hope that the insights gained from this small initial dataset will guide social media platforms on how to flag and address this new antisemitic trend with their content moderation teams.

Methodology

  1. Analysis comparing Holocaust denial discourse and October 7 denial discourse
  2. Review of social media platforms’ policies regarding the denial of October 7 specifically and policies regarding the denial of violent events in general
  3. Data collection and review
  4. Narrative content analysis
  5. Reporting full dataset to platforms
  6. Enforcement check per platform (24 hours after reporting)

 

Denial of October 7 is Antisemitism

The atrocities committed against civilians by the Hamas terrorist organization on October 7, 2023, constituted the most devastating massacre of Jews since the Holocaust, both in the number of victims and in the brutality of the acts, which included but were not limited to sexual violence, burning people alive, and kidnapping hostages into Gaza. The attack was undoubtedly antisemitic, not only due to the background of the victims, the vast majority of whom were Jews, but also due to the intention of Hamas to deliberately target and harm Jews.

Hamas’ core charter, the 1988 Hamas Covenant, lays out their intention to commit genocide against the global Jewish community. While a central component includes the political goal of destroying the State of Israel, throughout its 36 sections it mentions “the Jews” dozens of times in negative and antisemitic contexts. The charter blames world Jewry for a variety of alleged harms and indicates Hamas’ intention to destroy the Jews as a people and not only to revolt against a political institution.

In comparison, in the 25-point Program of the Nazi Party for the creation of a Nazi State and society, the Jews were mentioned only twice.

Aside from antisemitism and genocidal Jew-hatred being part of their core tenets, the barbaric atrocities committed on October 7 by Hamas terrorists against Jewish civilians leave no room to doubt their antisemitic intentions. For example, in an appalling recording of a telephone conversation between a Hamas terrorist and his parents, shortly after he murdered ten residents of Kibbutz Mefalsim in southern Israel, the terrorist shouts excitedly:

“Hi Dad! I’m calling you from Mefalsim. Open your WhatsApp now and you’ll see all those I killed. Look how many I killed with my own hands! Your son just killed Jews! [...] Dad, I’m calling you from a Jewish woman’s phone. I killed her and I killed her husband. I killed ten with my own hands! Dad! Ten! With my own hands. Ten! Dad, open WhatsApp and see how many I killed!”.

Global society recognizes that denying and distorting the horrors of the Holocaust is a form of antisemitism. The October 7 attack – an attack deeply reminiscent of the Holocaust in its targeting of Jewish life and safety – is antisemitic at its core. Therefore, discourse that denies or distorts the mass killing and the atrocities committed by Hamas terrorists is also antisemitic.

IHRA Working Definition of Antisemitism

As part of CyberWell’s ongoing methodology of monitoring for antisemitic content across the five leading social media platforms – Facebook, Instagram, X, YouTube, and TikTok – we classify every antisemitic post that we vet according to the International Holocaust Remembrance Alliance’s (IHRA) working definition of antisemitism, which includes eleven examples.

As of the writing of this report, these eleven IHRA examples do not explicitly address the antisemitism manifested in the denial and distortion of the events of October 7 – the only specific event that is currently included in IHRA's examples refers to the denial and distortion of the Holocaust, as can be seen in examples 4 and 5:

Example 4: “Denying the fact, scope, mechanisms (e.g. gas chambers) or intentionality of the genocide of the Jewish people at the hands of National Socialist Germany and its supporters and accomplices during World War II (the Holocaust)”.

Example 5: “Accusing the Jews as a people, or Israel as a state, of inventing or exaggerating the Holocaust”.

While the October 7 attack is highly reminiscent of the events that took place during the Holocaust, and therefore content denying the events of October 7 could be classified as examples 4 and 5, for methodology preservation purposes and in order to respect the distinct historic integrity of the Holocaust, CyberWell classified these posts in our open database of online antisemitism as IHRA Example 1, which states that antisemitism is:

“Calling for, aiding, or justifying the killing or harming of Jews in the name of a radical ideology or an extremist view of religion”.

It is CyberWell’s position that discourse denying a mass atrocity committed against Jews, or distorting its scope with the intention to minimize or trivialize that harm, constitutes an act of encouraging violence and promoting antisemitism. As no other IHRA example includes the denial of mass atrocities committed against Jews in general, CyberWell concluded that Example 1 is most relevant as the definition is written today. However, CyberWell calls for academic and international acknowledgement that it is antisemitism to deny the scope or intentionality of violence committed against Jews for being Jews, to blame the Israeli State for those same atrocities or claim it benefits from that violence, or to accuse the Israeli State or Jews as a people of fabricating documented atrocities and violence against Jews.

Additional Denial Sub-Narratives

Beyond the IHRA working definition, CyberWell also identified three additional sub-narratives supporting the promotion of October 7 distortion:

  • Israel/Jews are profiting from the massacre.
  • The massacre was justified due to events occurring prior to October 7.
  • The testimonies of victims, survivors, and first responders from the massacre are not reliable.

Denial as Antisemitism Matrix

To better demonstrate the parallels between Holocaust denial discourse and denial of the October 7 massacre discourse, we further broke down relevant IHRA examples into specific elements and means of denial.

 

Examples of Antisemitic Posts Online


In this report we present a sample of posts representing some of the above narratives of October 7 denial and distortion. The full dataset is available on CyberWell’s public reporting platform – the first open database of online antisemitism.

Example 1

Promoted Narratives

  • There were no rapes
  • Israel lied about October 7
  • Israel is profiting from the massacre: justification for the war in Gaza

NOTE: For more information on the spread and impact of this post on X please see the Network Analysis section.

Example 2

Views | 112,000

Voiceover: "[...] First of all we find out that there were no beheaded babies [...] there are none. No dead babies, okay? And we say 'you know what? You know what? That's fine. They still raped a bunch of girls, right? Right? We went on a manhunt looking for these girls. Show us one victim. Ya habibi.’ There are no rapes. No one was raped. [...]" (00:00-00:28)

Promoted Narratives
• Children were not murdered
• There were no rapes

Example 3

Promoted Narratives

• Children were not murdered
• There were no rapes

Example 4

Views | 2 million

In this interview with Al Jazeera, the journalist Max Blumenthal claims that Hamas’ main targets in the October 7 attack were soldiers and not civilians: "[...] At least 50% of those who were killed were on military bases at the time. They were Israel's Gaza division [...] Active-duty soldiers in uniform. They were the main targets of the Hamas commando division [...]" - [00:50 - 01:04].

In addition, Blumenthal implies that, of the Israeli civilians who were killed, most were killed by the IDF: "[...] How many Israelis were actually killed by the Israeli military, using the same doctrine of disproportionate force and the indiscriminate use of heavy weapons, that we are now seeing in the Gaza strip [...]" - [00:30 - 00:43].

Promoted Narratives

• Israel is responsible for the massacre
• Soldiers, not civilians, were the main target of the attack
• Jewish/Israeli civilian victims do not matter [the other 50% of those killed, according to his claim]

NOTE: Manipulated media - A large component of October 7 denial discourse on social media relies heavily on articles and news interviews from the Israeli media that were manipulated, taken out of context, or mistranslated to serve denial narratives. The next example presents just such an instance.

Example 5

Inserted English Caption:

"I've seen this with my own eyes" - [00:00 - 00:02]

"It is the the zionists soldiers and settlers who unalived Israelis and raped them" - [00:03 - 00:08]

Actual words translated to English by CyberWell’s professional staff:

"Who butchered, who raped, who murdered, and who beheaded? Those were the citizens, not Hamas. It was the citizens." - [00:00-00:06]

Inserted English Caption:

"That's not Hamas, Hamas came and unalived soldiers and they went further." -[00:12 - 00:14]

"Those who came later and unalived civilians and raped them were settlers." - [00:16 - 00:27]

Actual words translated to English by CyberWell’s professional staff:

"This is what they don't understand, that it was the citizens. Hamas opened the door, took down soldiers and continued forward. Those who came, raped and looted and did all this atrocity - this cruelty - are the citizens." - [00:06 - 00:22]

NOTE: Context matters - when this soldier refers to “civilians” who committed many of the atrocities, he means Palestinian civilians who came into Israel after Hamas breached the security border. The user who created this video not only mistranslated his words, but further distorted the interview by adding in the terms “Zionist” and “settlers” to claim that Israeli civilians committed the atrocities against their own people.

Example 6: Circle of Denial

A particularly disturbing phenomenon that has come about following October 7 is a trend of "revisiting" the integrity of facts about the Holocaust. This demonstrates that denial of the events of October 7 can lead to Holocaust denial and not just the other way around.

Social Media Policy Analysis

As part of CyberWell’s analysis on potential social media policy and enforcement gaps regarding the denial and distortion of the events of October 7, we analyzed the community guidelines and standards of the platforms we monitor and the relevant policy updates released after October 7. A more detailed look at platform policies can be found in the Appendix.

Violent Events Denial Policies

  • Denial of violent events or mocking victims of violent events is prohibited by all platforms (Meta, TikTok, X, YouTube).

Special Announcements Following October 7

  • X, TikTok, and Meta published special policy announcements. None of them explicitly mentioned the way they treat denial of the October 7 massacre.
  • YouTube has not made an official statement regarding the enforcement of its hate speech policy on content addressing the October 7 attack, but from research conducted for the compilation of this report, it appears to be the platform with the least amount of content explicitly denying and distorting the massacre. It is worth noting that this marked difference from other platforms could be linked to YouTube’s express guidance on the enforcement of their hate speech policy saying that they, “will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place.”

Misinformation Policy

  • X is the only platform with no “active” misinformation policy, but instead relies on community notes and crowdsourcing to “fact-check” posts. However, we do see some efforts to label some of the content on the platform as false.
  • All platforms refer to “manipulated media” as policy-violating content. Only X refers specifically to using media items in a misleading context, which is important in addressing denial and distortion of the October 7 massacre. YouTube specifically mentions that their “manipulated content” policy usually applies beyond clips taken out of context.

Fact-Checking

  • Meta, TikTok, and YouTube take part in fact-checking programs.
  • X relies on its “community notes” feature, allowing users to add context to potentially misleading posts.

Meta, TikTok, and YouTube participate in fact-checking programs, but they pose serious challenges. The two most problematic include time (the length of time it takes to verify or deny a claim while potentially false information remains online) and credibility (the trustworthiness or potential biases of the fact checkers themselves).

Aside from X, which uses the community notes feature as its primary method of fact-checking, a risk of its own, all other platforms that CyberWell monitors rely on third-party, independent fact-checkers. Platforms assert that all fact-checkers have been certified by the International Fact-Checking Network (IFCN) and are therefore unbiased and non-partisan. This enables the platforms to claim impartiality. While a thorough examination of the platforms’ fact-checking partners is beyond the scope of this report, our analysts conducted a quick examination of several fact-checking websites and found different approaches to verifying the same narrative, resulting in vastly different conclusions. “Unbiased” fact-checking is rarely straightforward and can lead to serious harm – in this case, when October 7 denial narratives are treated as regular content to be strenuously fact-checked instead of recognized as nefarious smears meant to promote Jew-hatred.

Policy Enforcement in Practice

CyberWell checked the enforcement of the social media platforms’ policies on the content we collected according to each platform’s individual standards, including X's 'freedom of speech, not freedom of reach' approach.

Out of the 313 posts that CyberWell reported, 8.91% were removed by the platforms prior to publishing this report.

“Labeling” | Fact-checking, Community Notes & Inappropriate Videos

Meta Labels

Out of 93 posts collected on Meta platforms, only 13 were removed. An additional 3 posts (3.2%) each received a different form of labeling aimed at alerting users that the information presented in the post is false, inaccurate, or misleading.

Example 1

"Partly false information
Independent fact-checkers say this information has some factual inaccuracies. You can choose whether to see it."

Example 2

“Altered photo. Independent fact-checkers say this information could mislead people.”

The two posts above asserted the denial narratives of 'Israel did it', 'Israel profits from October 7', and 'there were no rapes'.

Example 3

“False Information
The same information was reviewed by independent fact-checkers in another post.”

X Labels

On X, out of 148 posts reported, 19 (12.8%) received a community note saying: "Readers added context they thought people might want to know".

These 19 posts (all retweets of the same original tweet) were categorized by CyberWell as 'Denial of Hamas' October 7 Massacre'.

Only one post in the dataset received a label related to the credibility of the denial.

“This media is presented out of context”

TikTok

No videos in the examined dataset received a label alert of false content.

YouTube

Two out of the 18 videos reported in this dataset received the label: “This video may be inappropriate for some users” with the option “I understand and wish to proceed”. The label did not mention the misinformation or distortion of facts presented in the video.

This user claims that Israel lied about the atrocities committed by Hamas on October 7 and posted a video created by Propaganda.co.

Caption: "[...] He (an Israeli soldier) says Hamas fighters burned babies and then beheaded them. But this is all a lie [...]" - [00:10 - 00:17].

Title: "Here’s What Really Happened on Oct 7: Israel lied to carry out ethnic cleansing campaign in Gaza".

Description: "The entire Israel-Western narrative of Oct 7th is built on lies, and is being used to justify the most inhumane war crimes we are all witnessing now in Gaza. Israel's own media dismantles the Oct 7 lies that are being parroted by Israel. This was never about Hamas. Hamas is being used as an excuse to carry out ethnic cleansing of Gaza."

Promoted Narratives

  • Israel lies about October 7
  • Israel/Jews profit from the massacre: justification for the war in Gaza
  • Hamas didn’t commit the massacre
  • Children were not murdered

“3 Lies” Narrative

CyberWell conducted an in-depth analysis into how three platforms treated one example of the same video being spread as part of denial discourse.

The video, which went viral on social media within a few days of October 7, aimed to “expose three lies” supposedly being told about the massacre:

1. 40 Babies were killed.
2. People were raped.
3. 250 civilians were killed at the Nova music festival.

This promoted major narratives of denying the means of the massacre (murdering babies and rape) and denying the fact and scope of murdered civilians.

Quote: "[...] People were raped. Also False. There is no evidence of this whatsoever [...]" - [00:24-00:29]; "[...] 250 People were killed at a concert. False [...]" - [00:30-00:34].

X “3 lies”

Some of the tweets promoting this video received the label “The following media includes potentially sensitive content” without including an alert of potential false information.

Other tweets sharing the same video on X, gaining millions of views, received no labeling.

Meta “3 lies”

Some of the posts with the video were labeled as “Partly false information”.

TikTok “3 lies”

No labeling was detected in the examples in this dataset.

Views | 14,000

 

Enforcement Insights

• The examined social media platforms have major gaps either in the enforcement of their policies or in the failure to include the October 7 massacre in their “list” of recognized violent events.

• Labeling – predicated on the results of third-party fact-checking – is inconsistent, not systematic, and was not identified in the majority of content in this dataset of October 7 denial and distortion.

• X’s community notes policy, which puts the responsibility of fact-checking on the users instead of the platform itself, is not being systematically applied to the denial of the October 7 massacre.

• Denial and distortion of the October 7 massacre is not being de-amplified on X despite their “no freedom of reach” policy, as evidenced by the high number of views gained by posts in this dataset.

• The platforms failed to address the issue of denial and distortion of October 7 in a reasonable timeframe. Some of the posts went viral and are still online months after being posted.

Network Analysis

Background

Speculation aimed at denying the October 7 terror attacks perpetrated by Hamas against Israeli citizens began circulating across social media shortly after the massacres were committed and despite Hamas terrorists personally documenting, streaming, sharing, and celebrating their actions. While the initial major narratives tended toward a broader theme of “Israeli lies”, from October 12 onwards discourse began targeting specific events, such as the beheading of Israeli babies and mass rape.

Analysis on X

One of the first tweets claiming that the IDF killed Israeli civilians, thereby absolving Hamas of their atrocities, was published the day after the massacres on the morning of October 8 at 9:12 AM by Elias Khoury, an X user based in Lebanon. Though his number of followers is modest (6,552), his post gained over 381,800 views.

On the same day, minutes after tweeting the previous post (9:23 AM), Khoury tweeted several frames of a video showing the dead body of an Israeli woman. He again absolved Hamas of the crime of sexually assaulting Israeli women by claiming that the woman in the video often attended concerts “underdressed”.

On October 9, several posts began making the rounds alleging that Israel was spreading “lies” about the massacre. Ali Abunimah, an American-Palestinian journalist and the founder and editor of the online newspaper “Electronic Intifada”, played a pivotal role in promoting October 7 denial. Abunimah and Electronic Intifada have over 206,700 and 220,400 followers respectively on X. On October 9 Abunimah shared a post – no longer available – claiming that the Israeli media was lying about the massacre at the Nova music festival since Hamas allegedly did not possess the ability to carry out such a large-scale act. A few hours later, Abunimah shared another post stressing that the first testimonies shared by the survivors of Hamas’s terror attacks had “no evidence” and needed proof. The two posts together gained over 65,000 views.

Abunimah has since continued to use Electronic Intifada to spread a number of false narratives promoting October 7 massacre denial. On October 11 Electronic Intifada began spreading the “no women were raped” narrative, which then became a common trope featuring in several Electronic Intifada articles and webinars.

The overall attitude of absolving Hamas of its atrocities can best be understood in the context of an early tweet, which goes beyond justifying “armed resistance” and defines it as “legal”.

As the days passed, the “Israel lies” trope became more specific and began focusing on the denial of the beheaded babies and of Hamas mass rape. On October 10 the account zei_squirrel (220,200 followers on X; 5,400 subscribers on YouTube) shared a post that gained over 3.7 million views calling out all the journalists who reported that Hamas beheaded babies and questioning their credibility.

The following graph maps the virality potential of the above post – i.e. the spread of the tweet across X and other platforms. The dark blue circle on the left represents the original post shared by zei_squirrel, while the lighter blue circles represent posts linking to or sharing the original post. The size of the light blue circles represents the number of shares gained by zei_squirrel’s content: the larger a light blue bubble, the higher the number of retweets in a given time. Zei_squirrel's post was rapidly and extensively shared by other users.

On the same day, October 10, the account Propaganda and Co (121,600 followers, with a YouTube channel) tweeted a post denying the testimonies of Hamas beheading babies, followed on October 12 by an additional tweet claiming that Israel invented the claim to gain sympathy from the Western world.

The profile also endorsed the denial of mass rape by Hamas terrorists, sometimes quoting Ali Abunimah as their source. This shows the pivotal role Abunimah played in spreading October 7 denial theories.

One of the leading sources behind the denial phenomenon is the American political commentator and influencer Jackson Hinkle, who has an X following of over 2.4 million. On October 12 Hinkle tweeted two posts denying that Hamas beheaded Israeli babies. The posts went viral, with 2.6 million and 805,000 views. Hinkle further pushed this narrative with an additional post the following day, October 13, where he also blamed the US for supporting the alleged lie. The post gained 6.8 million views, demonstrating the virality of Hinkle’s communication style, the power of his network, and his responsibility in spreading denial discourse.

Analysis shows that the text of Hinkle’s post from October 13 originated from the account Hadi Nasrallah, also seen in example 1 above (a profile named after Hassan Nasrallah’s son, who died in 1997). While the Hadi Nasrallah account has 126,700 followers, the post gained 2.8 million views. The graph below shows that Hadi Nasrallah's post continued to be retweeted long after the posting date – far into the month of November – which is uncommon for a tweet.

Similarly, Max Blumenthal, a blogger and contributor to several anti-Israel online media outlets with an X account of almost 590,000 followers, began spreading October 7 denial on October 12 with a post suggesting that there was no evidence for the pictures of beheaded babies, and again on October 13 when he suggested that the pictures available were created with AI tools by Israel to horrify Western governments and media. The two posts jointly gained more than 1 million views.

Al Jazeera also disseminated October 7 denialism on its social media platforms. On October 13 AJ+ shared a video aimed at demonstrating that the Hamas attack “wasn’t unprovoked” – according to Hamas – and that the atrocities against civilians were therefore justified.

Al Jazeera also asserted that the claim that Hamas beheaded babies was a lie, furthering this narrative by calling it “baseless” in a post discussing President Biden viewing the photos. Its posts and articles were frequently quoted as an official source by other influencers who endorsed the denial theory, such as Mario Nawfal and Sulaiman Ahmed.

Mario Nawfal has emerged as one of the top alternative media stars on X, with 1.2 million followers and an online show, “The Roundtable Show,” which he says brings in more than 6 million listeners per week on Twitter Spaces. Nawfal’s profile echoed numerous October 7 denial theories, frequently supported and justified by citing confidential Hamas sources. Nawfal asserted the importance of protecting his sources but repeatedly refused to offer any form of verification. He used the “confidential sources” tactic numerous times, from denying the beheading of babies on October 11 to denying mass rape, which became a common narrative on Nawfal’s profile at the beginning of November. He further argued that the Hamas guide containing instructions to rape Israeli women, which the IDF found on Hamas terrorists, was not actually referring to rape but rather to how to undress male soldiers. In addition, Nawfal began promoting the narrative that testimonies by Israeli women of Hamas sexual assault were “fabricated”. Nawfal’s denial approach appears subtle, as it adopts sympathetic statements like “My heart goes out to every woman harmed on October 7”, yet the core conclusion of the content he creates and promotes denies victims their voice, calls them liars, and supports their abusers.

A post published by pro-Israel lobbyist Jay Engelmayer on October 9, which accuses Sulaiman Ahmed of exploiting Nawfal’s profile to spread lies about the atrocities committed by Hamas, sheds light on the mechanism of reciprocal influence upon which the phenomenon of October 7 denial is built.

Sulaiman Ahmed is an engineer and PhD candidate in Philosophy at the University of Wales Trinity Saint David, who has an X profile with 345,000 followers as well as Telegram and Rumble channels where he frequently discusses anti-Israel themes. On October 10 Ahmed was one of the first influencers to spread arguments denying the mass rapes perpetrated by Hamas, in a post that gained 540,000 views. In the same post he also invoked the classic antisemitic trope of Jews controlling the media to support his argument that the claims of Hamas sexual assaults were lies deliberately promoted by Israel via its global media control. In the following days Ahmed also promoted denial of the beheaded babies, claiming that this too was fabricated by Israel. The two posts together had almost 55,000 views.

Network & Reciprocal Influence

The mapping of the aforementioned influencers reveals that October 7 denial theories were heavily spread by a small number of dedicated social media users (individuals and organizations) who follow one another and share a common anti-Israel and antisemitic bias. For example, Al Jazeera is followed by Jackson Hinkle; both Max Blumenthal and Jackson Hinkle follow Lowkey, a rapper, influencer, and anti-Israel activist whose posts “debunking” Israel’s allegedly fake claims about the beheaded babies gained 4 million views.

TikTok & Instagram

Mapping pockets of influencer networks is more challenging on TikTok and Instagram due to the visual nature and structure of these platforms.

However, CyberWell was able to identify one important connection between TikTok, Instagram, and X. On October 29 the TikTok user Billy Oppenheimer shared a video produced by Propaganda and Co aimed at “exposing all the lies of October 7th”, which included the claim that the IDF killed Israeli civilians and skepticism over the killing of Israeli babies. Even though Oppenheimer has only 150 followers, the video received 24,200 views and was re-shared on Instagram by the user Matilde Jubia (Matilde Della Porta), a pro-Palestinian student and activist based in Rome.

On October 21 Oppenheimer also published a compilation of pictures claiming that Israel uses AI tools to fabricate lies.

One of the first videos spreading the claim that Hamas did not behead Israeli babies was shared by the TikTok user LeadingLeah, whose account is heavily devoted to pro-Palestinian content. LeadingLeah has 56,900 followers, and their one video denying the Hamas atrocities, published on October 14, garnered the highest number of views on their account (24,700).

A video absolving Hamas of the atrocities committed against Israel was spread by TV personality Ebraheem Alsamadi, an influencer nicknamed “Blooming Man” and anchor of a TV show about luxury in Dubai. In a video posted on November 19, which received 1.4 million views and 148,000 likes, Alsamadi asserts that he will not condemn “Humos” (Hamas) for the attacks: “after everything that I am going through in the past 40 days, seeing all these innocent children being killed, all this horror we are living in, this lady comes up to me […] and says ‘do you condemn Hamas?’ and I look her straight in the face and say ‘no bitch, but I do condemn your mother giving birth to you’”.

Posts denying October 7 began popping up on Instagram on October 13. The American-Kuwaiti journalist Ahmed Eldin, who has 1.1 million followers, posted the front page of “The Times”, which was dedicated to the tragedy of the murder of Israeli children, with the comment that the images were fabricated with AI and fake. The post gained more than 20,000 likes. On the same day, Hausa Room, a profile with 345,000 followers that shares news on general topics such as tech, politics, and economics, published the following statement: “ZERO Israeli babies were beheaded by Hamas.”

The American-Palestinian influencer Remi Kanazi (42,400 followers) played a pivotal role in spreading denial discourse, sharing posts such as one listing all the alleged “lies” spread by Israel and since “debunked”, including the beheaded babies and the claim that hostages were kept under Al Shifa hospital.

On October 18 the Lebanese film and TV actor Nicolas Mouwad (2.9 million followers) published a video for his “foreign friends” stating that the beheading of the babies is a “fairy tale”. Analysis of the profiles of Mouwad and Ebraheem Alsamadi shows that, while denial theories on X were circulated mostly by journalists or scholars, on Instagram a central role in the spread of October 7 denial was played by TV stars and actors with the potential to reach far larger audiences.

A different denial theme, centered on criticism of Israel for revising the death toll, was shared by broadcaster Samira Mohyeddin on December 18 in a post that gained 1,557 likes.

Network Analysis Insights

  • Accounts and influencers spreading October 7 denial theories are highly likely to demonstrate previous anti-Israel bias.
  • Denial theories spread differently depending on the platform: while journalists and reporters led denial discourse on X, TV stars and celebrities led denial discourse on TikTok and Instagram.
  • Network and reciprocal influence, particularly on X but also on Instagram and TikTok, shows how ideas circulate among accounts that follow and influence one another.
  • Insidious strategies for introducing denial theories included claiming that the false information came from “official” news sources or from legitimate, though unidentified, reliable sources.

Recommendations & Conclusions

CyberWell calls on all social media platforms to swiftly recognize and treat content denying and distorting the atrocities of October 7 as it would treat content denying or distorting the Holocaust, by prohibiting it and dedicating the appropriate resources to remove it at scale under the denial of well-documented violent events policy.

CyberWell is open to collaborating with all social media platforms and big-tech companies to move this recognition beyond a policy adjustment and into the implementation phase by sharing data, query strings, and methodologies. As a first step, all reported data in the October 7 denial dataset is available at app.cyberwell.org.

Furthermore, CyberWell calls on the International Holocaust Remembrance Alliance, the European Union, all United States government agencies subject to the U.S. National Strategy to Counter Antisemitism, the United States Department of Education, the United Nations, and every multinational or multilateral body with a program, position, or strategy dedicated to fighting antisemitism to recognize swiftly and unequivocally that the denial and distortion of violent atrocities targeting the Jewish people is itself a form of Jew-hatred. Just as Holocaust denial is a form of antisemitism, so too is the denial and distortion of the massacre and violent events that occurred on October 7.

CyberWell will continue to monitor this latest form of Jew-hatred online until this policy gap is bridged and appropriate resources are dedicated to enforcing the updated policy at scale.

Policy Appendix


Meta

Denial & Distortion | Meta Misinformation (Physical Harm or Violence)

"We remove misinformation that [...] are likely to directly contribute to a risk of imminent violence or physical harm to people".

Violence & Denial | Hate Speech Tier 1

"Dehumanizing speech in the form of comparisons to or generalizations about:

[...]

  • Violent criminals (including but not limited to: terrorists, murderers, members of hate or criminal organizations)
  • Mocking the concept, events or victims of hate crimes even if no real person is depicted in an image". 

Meta Misinformation (Manipulated Media) | (Cropped videos and images, partial articles, & false or misleading context)

"We remove videos [...] if specific criteria are met: (1) the video has been edited or synthesized, beyond adjustments for clarity or quality, in ways that are not apparent to an average person, and would likely mislead an average person to believe a subject of the video said words that they did not say; and (2) the video is the product of artificial intelligence or machine learning, including deep learning techniques (e.g., a technical deepfake), that merges, combines, replaces, and/or superimposes content onto a video, creating a video that appears authentic".

Meta's Updated Policy Following October 7

Meta removed content related to the October 7 attack that violates its "Dangerous Organizations and Individuals" policy, including content showing victims during the attack and content glorifying Hamas, which Meta defines as a "Dangerous Organization". Meta also began working with third-party fact-checkers on posts about the attack and other events of the Israel-Hamas war in order to identify false content, limit exposure, and add relevant warning labels. However, no steps have been taken to remove misinformation or content that denies the massacre or distorts its scope.

https://about.fb.com/news/2023/10/metas-efforts-regarding-israel-hamas-war/

X

Denial & Distortion | X Abuse and Harassment (Violent Event Denial)

"We prohibit content that denies that mass murder or other mass casualty events took place, where we can verify that the event occurred, and when the content is shared with abusive context. This may include references to such an event as a “hoax” or claims that victims or survivors are fake or “actors.” It includes, but is not limited to, events like [...] terrorist attacks".

Synthetic & Manipulated Media Policy (Cropped videos and images, partial articles, & false or misleading context)

  • "whether inauthentic, fictional, or produced media are presented or being endorsed as fact or reality [...]
  • whether media are presented with false or misleading context surrounding the source, location, time, or authenticity of the media;
  • whether media are presented with false or misleading context surrounding the identity of the individuals or entities visually depicted in the media".

X's Updated Policy Following October 7

As of the writing of this report, X’s most recent official statement regarding the Israel-Hamas war was published on November 14, 2023. Between October 7 and November 14, according to X, the platform removed 325,000 pieces of content that violated its terms of use, including violent speech and hateful conduct, and actioned 25,000 posts under its synthetic and manipulated media policy. In addition, it suspended 375,000 accounts in order to protect the authenticity of the discourse regarding the war. Another tool X uses is Community Notes, which is designed to flag potentially misleading content and thus reduce its spread.

However, X's statement made no reference to content denying or distorting the October 7 attack or to whether such content was removed. The majority of the posts in CyberWell’s dataset are still online.

https://blog.twitter.com/en_us/topics/company/2023/maintaining-the-safety-of-x-in-times-of-conflict

TikTok

Denial & Degrading Victims | TT Harassment & Bullying

  • "Degrading victims of violent tragedies, such as claiming that they deserved to die or that surviving members are lying about the event".

Distortion of facts | TT Misinformation

  • "Misinformation that poses a risk to public safety [...]
  • Dangerous conspiracy theories that are violent or hateful, such as [...] denying well-documented violent events [...]
  • Specific conspiracy theories that name and attack individual people
  • Material that has been edited, spliced, or combined (such as video and audio) in a way that may mislead a person about real-world events".

TikTok’s Updated Policy Following October 7

TikTok released an official statement asserting that, after the start of the Israel-Hamas war, it established a new anti-hate and discrimination task force designed to combat the rise of hateful behavior and content, with an emphasis on antisemitism and Islamophobia. It also noted that, regarding the war, efforts were made to enforce its policy "against hate, harmful misinformation and other violative content". Thus, between October 7 and November 30, TikTok asserts that it removed more than 1.3 million videos that violated its Community Guidelines, such as "content promoting Hamas, hate speech, terrorism and misinformation".

However, TikTok has not yet indicated how it addressed content denying and distorting the October 7 attack. The dataset in this report indicates that there was no methodical enforcement against denial content.

https://newsroom.tiktok.com/en-gb/protect-tiktok-community-israel-hamas-war

YouTube

Denial | YouTube Hate Speech

We don’t allow content that promotes violence or hatred against individuals or groups based on any of the following attributes, which indicate a protected group status under YouTube’s policy:

[...]

  • Victims of a major violent event and their kin

[...]

Here are examples of hate speech not allowed on YouTube [...]

  • “All of the so-called victims of this violent event are actors. No one was hurt, and this is just a false flag.”
  • “People died in the event, but a truly insignificant number".

Misinformation Policies

  • "Manipulated content: Content that has been technically manipulated or doctored in a way that misleads users (usually beyond clips taken out of context) and may pose a serious risk of egregious harm".

YouTube's Updated Policy Following October 7

No official statement has been made by the platform regarding the enforcement of its Community Guidelines on content addressing the October 7 attack, but from the research carried out for the writing of this report, it appears to be the platform with the least amount of content denying and distorting the attack.
