The Increasing Influence of AI Is Compounding Challenges for Residents of Gaza

Saeed Ismail is a real person. But no matter how much proof he posts online, people keep calling him fake.

The 22-year-old has endured almost two years of war in Gaza, as Israel continues an assault that has killed at least 57,000 people and left the territory of 2 million on the brink of starvation. Saeed has been raising money online to feed his family. But many people on social media have been convinced that he isn’t real at all. They believe he’s an AI-generated creation, used by scammers to get donations through crowdfunding platforms from well-meaning people. And in a world where AI videos and images have flooded our digital spaces, it’s easy to understand why some people are so skeptical.

The controversy started with a photo posted in May that shows Saeed asking for help. Users of the social media platform Bluesky started accusing him of being fake. A pseudonymous account called Rev. Howard Arson led the charge against Saeed, pointing out that his blanket had some odd words woven into it.

Arson pointed to two words on the blanket: “pealistic” and “spfcation,” which seemed to be misspellings of “realistic” and “specification.” Arson suspected this was evidence the image was made with AI, the result of someone asking an artificial intelligence tool to render a realistic blanket and the tool spitting this out:

Close-up of the blanket that some people online believed was proof Saeed was AI generated.
Photo: Saeed Ismail

Saeed posted several photos and a video showing the blanket to prove it was real, but that didn’t put the accusations to rest. He told Gizmodo his mother purchased the blanket from a market near where he lived in northern Gaza before the war: “When I left the house during the evacuation, I took it with me.”

People were convinced that AI video has become advanced enough that Saeed could be a completely artificial being—an AI agent, as Arson told Gizmodo over direct messages. Arson, who has over 20,000 followers on Bluesky, was getting a lot of attention for his claims, helping them spread across the relatively niche social network. Saeed didn’t know what he could do, as the account seemed to have more influence than he could ever attain.

“When I went on Bluesky, I saw that many people had started to believe I was a fraud, mainly because of the status of the person who posted about me,” Saeed told Gizmodo over WhatsApp. “He’s seen as a trustworthy figure by many, while unfortunately, I don’t have the same following or credibility.”

Gizmodo verified Saeed’s existence as a real person through a variety of means, including a video call where he walked around the room and showed us the view out of his window. He provided us with his Palestinian ID as well as photos and videos of his location. Saeed told Gizmodo he previously raised money on PayPal but now fundraises on GoFundMe through a campaign run by a person he trusts in France, who sends him the money. Once it reaches France, the money is wired into Gaza, a process he says is “reliable and secure,” explaining, “We rely on these channels because we simply have no other options available.”

GoFundMe doesn’t operate in Gaza but allows people in countries where it does operate to host fundraisers for people in other countries. It has a tutorial on its website for how to do that safely. The GoFundMe team collects documentation like IDs, receipts for expenses, and other proof to make sure funds are delivered to the beneficiary. A spokesperson from GoFundMe told Gizmodo it has verified Saeed’s campaign.

“During humanitarian disasters and other crises, our team proactively monitors and verifies fundraisers to help ensure funds raised on our platform are delivered quickly and safely,” a GoFundMe spokesperson told Gizmodo in a statement.

“When a GoFundMe fundraiser is created, we prioritize distributing funds while also helping ensure compliance with relevant international laws, global regulations, and requirements dictated by our payment processors,” the spokesperson continued. “We’re working around the clock to help ensure humanitarian relief funds are delivered safely and quickly to those in need, helping families access essentials, rebuild, and recover through the generosity of their communities.”

Saeed Ismail holding the blanket that started a controversy on Bluesky about whether he was AI generated, photographed in Gaza on May 29, 2025. Photo: Saeed Ismail

Whoever runs the Arson account declined to provide their real name, let alone jump through the same hoops as Saeed to prove their identity to Gizmodo and GoFundMe. “Unfortunately i don’t want a claim i can’t prove connected to my name, which is why—even though i’m very skeptical still—i took down the actual claim and any references directly to the guy in question,” Arson told Gizmodo over Bluesky.

Arson told Gizmodo he’s an American and recognized his country was helping Israel flatten Gaza and kill tens of thousands of people. But he remained unconvinced that Saeed was a real person and seemed frustrated that AI had become so advanced he wasn’t sure what to believe. Arson, who seems to have great familiarity with AI systems, struggles in particular with whether giving money actually does any good.

“I have no idea whatsoever what it is like to be there. i know i would do or say literally anything to eat,” Arson wrote. “But this is why it is absolutely impossible to evaluate whether you are actually doing any good! on one end you have people with money and on the other you have people paying $300 for a bag of lentils in a manufactured and intentional famine.”

Arson acknowledged that his perspective could be warped by seeing so many fake images created with AI.

“The way people get this stuff wrong, and the way I possibly have, is that they end up in a world where half the stuff they look at is synthetic and half of it is real, and presume that 50/50 is the distriubtion [sic] everywhere they look,” Arson told Gizmodo.

Arson pointed to another example of a photo that he previously thought was fake but is actually real. The image shows a woman and two kids sitting in the rubble of Gaza. All three people are wearing sandals, though the boy’s left leg has a white plaster cast. But it was the woman’s sandal that raised suspicions.

The sandal has a panda head logo, along with an odd mix of words: “HAPPY LUCKY rlorE DNUI.” It’s the kind of thing that would immediately raise red flags for anyone on the lookout for AI images being circulated online. The strange thing is that the sandals are real.

Photo taken in Gaza of Sahar AlAjrami and two of her children, with oddly worded sandals that made people think the image was AI generated. Photo courtesy of Sahar AlAjrami

Arson found the same sandals being sold on Facebook in a listing from an online shop in Benghazi, Libya, back in June 2022. Not only were the sandals seen in the photo from Gaza available for purchase, proving they existed in the real world, but they even came in a variety of colors.

Gizmodo spoke with the woman in the photo, Sahar AlAjrami, over video call to verify she was a real person in Gaza. Her English is not great, so her younger brother, Ahmed, translated for us as the sandals were shown to us, lit by cellphone light in the middle of the night. GoFundMe told Gizmodo that the fundraising campaigns for both Sahar, who’s trying to raise money to feed her children, and Ahmed have been verified as authentic.

It’s not clear who designed the sandals bearing such mangled English, but they are absolutely real. And it’s perhaps notable that June 2022 predates the first widely available AI image generators, which didn’t really open up to the public until late 2022. ChatGPT, which officially kicked off the consumer-facing generative-AI race, was launched to the public in November 2022.

Shoes for sale on Facebook in a listing from June 2022 in Libya. Photos: Facebook

Sahar is most concerned for her 10-year-old son Odai, who was shot in the foot by an Israeli quadcopter drone. While amateur sleuths on the internet were fixated on her sandals, it was her son’s foot in that photo that should have drawn their attention. Sahar says painkillers and other medicines aren’t being allowed into Gaza, and she hopes she can get him to Egypt for treatment. That’s impossible at the moment, as Israel has shut down all of the crossings.

Gazans like Sahar and Saeed are far from the only people having a hard time proving their identities in a skeptical social media environment. Bluesky has received criticism from activists in recent months as Palestinians in Gaza struggle to survive but get banned left and right for allegedly being scammers or bots. Hany Abu Hilal, an English teacher in Gaza with three young kids, is another person Gizmodo verified as real through his Palestinian ID and a video call. He started a crowdfunding campaign on Chuffed, but his accounts keep getting deleted on Bluesky for spam.

“My flat was completely burnt and then completely destroyed,” he told Gizmodo, explaining that he lives in a tent in Khan Younis that can’t shield his family properly from the heat or the cold. Hany said he was banned from Bluesky without a clear explanation, and he’s desperate because he can’t ask for help online in spaces that keep deleting Palestinian accounts.

“Basic food prices are skyrocket here in Gaza. Can’t even bring a loaf of bread for my three kids because I am jobless and I do not have money,” Hany told Gizmodo. The Israeli government has total control of what goes in and out of Gaza and recently imposed a two-month total blockade on food and supplies entering the territory. That blockade has been loosened, but only minimally, and people still struggle to find the basic necessities to live. At least 549 people in Gaza have been killed in recent weeks near aid sites, according to the CBC, and more than 4,000 have been injured trying to access food from the so-called Gaza Humanitarian Foundation (GHF), a U.S.-based group registered to an empty office in Delaware. Reporting from the Associated Press suggests it’s not just the Israeli military but also U.S. contractors who have fired on crowds of Palestinians trying to get aid.

Hany says he’s had to evacuate 12 times since the start of the war, a common story for people who have been internally displaced by Israel’s invasion of Gaza. The vast majority of Gaza’s population has been forced from their homes, and Israel has announced plans to completely “conquer” the territory, with many politicians talking openly about ethnically cleansing the area. While we’ve verified that Hany is a real person, we were unable to reach Chuffed, a fundraising site based in Australia, to get more information about its process for verifying campaigns.

Palestinians search for usable items in the rubble of heavily damaged and collapsed buildings after Israeli attacks on Al-Shati refugee camp in Gaza City, Gaza on June 26, 2025. © Photo by Saeed M. M. T. Jaras/Anadolu via Getty Images

Hany has struggled on Bluesky, which has exploded in popularity ever since Elon Musk bought Twitter and renamed it X. And perhaps no English-language account on Bluesky has done more to raise awareness of the Palestinian plight on the platform than Molly Shah, a 45-year-old American activist based in Berlin. Shah is originally from Kentucky, and she told Gizmodo that she moved to Germany in 2017 and previously worked as a lawyer fighting eviction cases. Molly says the people she talks to in Gaza struggle with social media users calling them fake.

“I talk to people every single day who have seen and faced some of the worst things you can possibly imagine, and the most distressing conversations I have are often around losing their accounts or being called scammers,” Molly said. “I think that is because people in Gaza have lost almost everything, but the one thing they do have left is their identities, and being banned or called fake attacks even that.”

Molly has spotted fakes on Bluesky and acknowledges that there are scammers trying to take advantage of the situation. But they are the minority, and she’s shared things to look out for, including people who only use news photos for their fundraising campaigns.

Bluesky told Gizmodo that it understands people in Gaza need platforms to share their experiences and seek support, saying the company is “committed to ensuring they can be heard on Bluesky.” But the platform said that its moderation teams have to “distinguish between genuine advocacy and coordinated inauthentic behavior.”

“In some cases, our investigations have revealed networks where single individuals operate hundreds of accounts engaging in bulk messaging, identical replies across unrelated conversations, and mass following—behaviors that disrupt the platform experience for all users regardless of the underlying cause,” Aaron Rodericks, Bluesky’s Head of Trust & Safety, told Gizmodo over email.

“When we identify these patterns, we provide warnings and opportunities for users to adjust their approach before any suspensions occur. In other cases, malicious actors create bulk networks of accounts to target users on Bluesky and exploit goodwill towards crises,” Rodericks continued.

The social media company told Gizmodo that its moderation decisions “aren’t always perfect, which is why we maintain an appeals process,” but said users need to focus on operating a single account that follows community guidelines. Rodericks said the company would be “publishing more guidance in the near-term to help advocacy accounts stay within our rules while sharing their messages,” but he didn’t respond to Gizmodo’s follow-up questions about when that guidance would be coming.

A woman holds the body of what appears to be a child at the morgue of the Nasser Hospital in Khan Yunis, Gaza on June 26, 2025, after Israeli attacks hit a residential area. © Photo by Hani Alshaer/Anadolu via Getty Images

Even if Americans do find a fundraising campaign they trust, some are worried about what the U.S. government will do to them if they donate to people in Gaza, given the authoritarian and anti-Muslim tendencies of the Trump administration. Trump has previously said all of Gaza should be purged of Palestinians and that the U.S. should take ownership of the territory. And that rhetoric has translated into terrifying actions domestically.

Trump has been enacting racist and often illegal policies targeting foreign students, with a special emphasis on attacking people protesting against Israel’s war on Gaza. Rumeysa Ozturk, a 30-year-old Turkish citizen and graduate student at Tufts University, was picked up off the street in Massachusetts by masked agents, an encounter captured in an infamous video that went viral back in March. Ozturk’s “crime” was co-authoring an op-ed in the student newspaper defending the principles of free speech. But it’s not illegal for Americans to donate to fundraising campaigns for the average person just trying to survive in Gaza. At least, not yet.

The problem, which we always return to, is that it’s getting really hard to tell when someone is a scammer. Google’s latest AI video creation tool, Veo 3, has been a game-changer that further muddies the waters. When Time magazine published a report on the tool recently, the news outlet created a video of fake Gazans receiving aid from USAID—which brings us to the people accused of being fake on the other side of this war.

The Gaza Humanitarian Foundation, backed by Israel and the U.S., has been accused of running an unsafe and ineffective aid operation that’s getting hundreds of people killed. But it’s also been accused of making fake videos. One video created by the GHF and distributed to right-wing news outlets like the Daily Wire was understandably called fake when it was published on social media in late May. Anyone looking at it can see it has that uncanny, hard-to-define AI-style quality.

Reached for comment over email, an unnamed spokesperson for GHF told Gizmodo the video is real, sending along a screenshot of metadata from an iPhone 15 Pro showing where it was taken in Gaza. GHF initially offered to let Gizmodo speak with the cameraman, but it didn’t send us the raw video file we asked for and never connected us with him, saying he was “on a plane back to America.” GHF has given contradictory statements publicly about its activities, with the head of the organization, Johnnie Moore, recently telling the BBC that no one had died near their aid distribution sites, before saying just a few minutes later that he wasn’t contesting that people had died.

How does anyone know what’s real in that kind of environment? It was a difficult question for people on the internet, even before the invention of generative AI image creators. Gizmodo has been debunking viral photos for over a decade, and many viral fakes even predate the invention of Photoshop by decades. But AI has changed the game in so many ways. And AI images and videos will only get more sophisticated.

Another problem in deciphering what’s real in Gaza is that Israel doesn’t allow foreign journalists into the territory. The IDF has sometimes escorted TV journalists in for brief periods of time, without allowing communication between the journalists and average Palestinians on the ground, but that obviously amounts to a highly controlled media event. Even if average Americans had the budget to fly halfway around the world to check out Gaza for themselves, they’re simply not allowed to do that. So people are forced to either figure out their own ways of determining what’s real or just throw up their hands and decide it’s too hard to authenticate.

But people in Gaza continue to suffer, with no end in sight. The health infrastructure in Gaza has been obliterated, with hospitals becoming “battlegrounds,” as experts at the United Nations have put it. The toll has been immense, with children getting hit particularly hard. Over 50,000 kids have been killed or injured since October 2023, according to figures from UNICEF. And researchers estimate the actual death toll is much higher than the official number that currently stands at over 58,000.

Yet people continue to call the atrocities fake. There are entire accounts on X, which became an extremist far-right platform after it was purchased by Elon Musk in late 2022, dedicated to so-called “Pallywood,” a play on the name Hollywood, alleging that the people of Gaza are faking the war crimes being committed against them.

At the end of the day, there are no easy solutions. Avoiding scams can be hard as AI infects every corner of the internet. But surviving a genocide is much, much harder.
