The Disturbing AI Interview That Has Everyone Fuming

It was billed as a “one-of-a-kind interview,” but it may be remembered as a new low for journalism. Jim Acosta, the former CNN anchor turned Substack host, has ignited a firestorm of controversy by conducting what may be one of the most unsettling interviews of the AI era: an on-camera conversation with an AI-generated version of Joaquin Oliver, the 17-year-old who was killed in the 2018 mass shooting at Marjory Stoneman Douglas High School in Parkland, Florida.

The interview aired at the request of Joaquin’s parents, who created the AI version of their son to keep his memory alive and to amplify his message about gun violence. But many viewers—across the political spectrum—are calling it exploitative, emotionally manipulative, and a dangerous precedent.

It All Started With a Tweet

Acosta teased the segment on X (formerly Twitter) on August 4: “A show you don’t want to miss at 4 p.m. ET / 1 p.m. PT. I’ll be having a one-of-a-kind interview with Joaquin Oliver. He died in the Parkland school shooting in 2018. But his parents have created an AI version of their son to deliver a powerful message on gun violence.”

In the clip, Acosta asks the AI avatar of Joaquin: “Joaquin, I’d like to know what your solution would be for gun violence?” The AI responds: “Great question! I believe in a mix of stronger gun control laws, mental health support, and community engagement. We need to create safe spaces for conversations and connections, making sure everyone feels seen and heard. It’s about building a culture of kindness and understanding.” Then, in a startling role reversal, the avatar asks Acosta: “What do you think about that?” Acosta replies: “I think that’s a great idea.”

The Backlash Was Immediate

The promo tweet has racked up nearly 4 million views. But it also sparked a torrent of criticism, with users accusing Acosta of crossing a line by using the likeness of a deceased child to push a political agenda. “Jim Acosta hits a new low… Interview an AI version of a dead kid in order to push gun control!!!” one user wrote. “WTF? This is beyond sick,” said another. “This is one of the weirdest things I’ve ever seen in my life,” a third commented. “Unreal and mind-blowing,” added a fourth.

Some of the most biting criticism came from within the media industry itself. Journalist Glenn Greenwald wrote: “It’s telling that what Jim Acosta did—using AI to revive a deceased teenager and then ‘interview’ him to echo Acosta’s own politics—produced cross-ideological revulsion. Concerns about AI superseding humanity, sleazy media exploitation, the ability to create fake videos, etc.”

To stem the backlash, Acosta disabled replies on the tweet.

The heart of the backlash is trust, ethics, and the dangerous precedent of using AI to speak on behalf of the dead. Critics argue that this opens the door to unprecedented manipulation: Could a political group create AI avatars of fetuses to argue against abortion? Could companies use AI to generate posthumous endorsements from celebrities? Could we soon see AI-generated “interviews” with dead soldiers, victims, or civil rights leaders? These questions cut to the core of how society will grapple with the use of generative AI in media and advocacy.

The Father Speaks Out

In response to the outrage, Acosta defended himself by pointing out that the idea came directly from the boy’s parents, Manuel and Patricia Oliver.
“Joaquin, known as Guac, should be 25 years old today,” Acosta posted in a follow-up tweet. “His father approached me to do the story… to keep the memory of his son alive.” He linked to a video in which Manuel Oliver explains the family’s decision. “Hello, everyone. This is Manuel Oliver. I am Joaquin Oliver’s father,” he begins. “Today, he should be turning 25 years old, and my wife, Patricia, and myself, we asked our friend Jim Acosta to do an interview with our son, because now, thanks to AI, we can bring him back. It was our idea.”

He continued, his voice heavy with emotion: “We feel that Joaquin has a lot of things to say, and as long as we have an option that allows us to bring that to you and to everyone, we will use it.”

Acosta then urged viewers to watch the father’s video, suggesting that the context matters, and that the parents’ wishes should be respected.

A New Line Has Been Crossed

Regardless of the intent, the interview has sparked a culture-wide reckoning. For some, it’s a touching use of technology to keep a loved one’s memory alive. For others, it’s a deeply uncomfortable blurring of reality and simulation, one that risks dehumanizing the dead and turning tragedy into algorithmically rendered activism.

The question now is whether this will become a new normal, or a moment that forces society to draw a hard line around what AI should never do.
