Google's AI falsely claimed the YouTuber had visited Israel, forcing him to deal with the backlash



Science and music YouTuber Benn Jordan had a rough few days this week after Google's AI-generated search summary mistakenly told people that he had recently visited Israel and supported the country during its war in Gaza. Jordan has never supported Israel and has previously donated to Palestinian charities.

Jordan told 404 Media that when people enter his name into Google, it's often followed by terms like "brow" or "wife." That changed when popular political Twitch streamer Hasan Piker decided to react to one of his videos, a video about an AI-driven camera company that 404 Media has covered extensively. Jordan's videos had appeared in Piker's streams before, so he knew what he was in for. "Whenever he reacted to my content, I always thought, 'Oh no, I'm about to have to make concessions in front of millions of people without being able to explain my point,'" he said.



This time, however, it was a little different. "I watched it, and midway through, his chat got a bit riled up, saying I supported Israel's massacre," Jordan said. "Then I started getting a lot of messages from people asking why I hadn't said anything about Israel, why I had supported Israel, when I've donated a lot of money to the Palestine Children's Relief Fund. I've been quite vocal in the past about not supporting Israel and about supporting a free Palestinian state."

Someone then sent him a screenshot of an AI-generated summary of Google search results that explained the flood of messages. If you typed "Benn Jordan Israel" into Google and only read the AI summary, this is what you saw.

"Electronic musician and science YouTuber Benn Jordan has recently been involved in the Israeli-Palestinian conflict, leading to significant controversy and debate online. He shared his experiences from his trip to Israel, during which he interviewed people in kibbutzim near the Gaza border," read the summary Jordan shared on Bluesky. "On August 18, 2025, Benn Jordan uploaded a YouTube video, 'I Was Wrong About Israel: What I Learned on the Ground,' detailing his recent trip to Israel."

Jordan has never been to Israel, and he doesn't make videos about the war. His videos live at the intersection of science and sound: earlier this year he converted a PNG sketch into an audio waveform and taught the song to young starlings, effectively storing a digital image in the birds' memories. He has also covered the death of Spotify, the collapse of American capitalism, and the unique danger AI poses to musicians.

Google's AI seems to have confused Jordan with YouTuber Ryan McBeth, who does make videos about the war. McBeth is a chain-smoking Newsmax commentator with a video titled "I Was Wrong About Israel: What I Learned on the Ground."

It's a strange mistake for an AI to make, but AI makes many mistakes. AI-generated songs are worse than real songs, and the AI-generated summaries that Google places at the top of its search results are often wrong. Jordan's experience is just a small sample of what happens when people take AI at face value without doing an extra five minutes of research.

When Jordan learned he was being misrepresented in an AI Overview, he began sharing the story on Bluesky and Threads. He told 404 Media the AI summary updated itself about 24 hours later. "Eventually the AI posted about it and said false rumors had been spread about me going to Israel, and I just about tore my hair out," he said.

He told 404 Media he believed Google's AI may have defamed him, and that he reached out to lawyers for an opinion out of curiosity rather than as a prelude to a lawsuit. Some said he might have a case. "Next week I'm going to Yellowstone for 10 days, completely off the grid," Jordan said. "If this keeps going, keeps spreading, and becomes a huge controversy, I'm probably losing YouTube subscribers and losing Patreon members."

Jordan has covered AI in the past and said he isn't shocked the systems break down. "Everybody is rushing to make LLMs a part of our daily lives […] but the actual LLMs themselves aren't good. They're not what they claim to be. Whether it's because of the limitations of how LLMs work or the promises made about how AI works, it's a really bad algorithm for getting any kind of useful information you can trust, and it's prioritized above the journalists it's taking money from."

In the aftermath, Jordan made his position on the Israeli-Palestinian conflict clear. On Bluesky and Threads, he said he does think Israel is committing genocide in Gaza. "Hopefully someone will see it before they waste their time messaging me to lecture me about genocide," he said. "Now I'm being lectured about genocide from the other side. Now I'm dealing with messages from people defending Israel and saying I'm antisemitic."

This is not the first time Google's AI summaries have gotten basic facts wrong about people with public profiles. In July, humorist Dave Barry discovered that Google's AI Overview thought he was dead following a battle with cancer last year. Barry was very much alive and detailed his fight to correct the record in his newsletter. Like Jordan's, Google's AI Overview eventually shifted. Unlike Jordan's, it only changed after Barry fought through Google's various automated complaint systems.

When AI makes a mistake like this, we tend to call it a hallucination. Jordan used that word when he posted the updated summary of his life. "I've been thinking about it over the last few days, and saying it can hallucinate something gives it so much credibility," Jordan said. "In reality, it's just not that good at scraping data and presenting it in a reputable way."

"The vast majority of AI Overviews are factual, and we continue to improve both the helpfulness and quality of our responses," a Google spokesperson told 404 Media. "When issues arise, such as a feature misinterpreting web content or missing context, we use those examples to improve our systems, and we take appropriate action under our policies."

Update: This story has been updated with a statement from Google.


