Meta Removes Deepfake Video Targeting Irish Presidential Candidate

Engadget

Key Points

  • Meta removed an AI‑generated deepfake video of Irish presidential candidate Catherine Connolly.
  • The video was posted by an unaffiliated account called RTÉ News AI.
  • It was shared nearly 30,000 times on Facebook before removal.
  • Connolly called the video a "disgraceful attempt to mislead voters" and confirmed she remains a candidate.
  • Meta cited violations of its impersonation policy in taking down the content.
  • Irish media regulator Coimisiún na Meán confirmed it was aware of the video and questioned Meta about the measures taken.
  • The deepfake falsely showed Connolly withdrawing and the election being cancelled.
  • The incident highlights ongoing challenges in policing political deepfakes on social media.

Meta has taken down an AI‑generated deepfake video that falsely portrayed independent presidential candidate Catherine Connolly announcing her withdrawal from the race. The video, posted by an account named RTÉ News AI, was shared nearly 30,000 times on Facebook before removal. Connolly condemned the clip as a "disgraceful attempt to mislead voters" and affirmed her continued candidacy. Meta cited violations of its community standards on impersonation, while Irish media regulator Coimisiún na Meán confirmed it was aware of the video and questioned Meta about the measures taken. The incident highlights the ongoing challenge of policing political deepfakes on social media.

Meta’s Intervention

Meta removed a deepfake video that falsely showed independent presidential candidate Catherine Connolly announcing she was withdrawing from Ireland’s upcoming presidential election. The video, created with artificial‑intelligence tools, was posted by an account called RTÉ News AI, which is not affiliated with the public broadcaster Raidió Teilifís Éireann. Within a short period, the clip was shared nearly 30,000 times on Facebook before Meta intervened.

Candidate’s Response

Connolly described the video as “a disgraceful attempt to mislead voters and undermine Ireland’s democracy.” She publicly affirmed that she remains a candidate and rejected the false statements depicted in the deepfake. The candidate’s statement was aimed at reassuring the electorate amid a rapid spread of the manipulated content.

Content of the Deepfake

The AI‑generated video featured Connolly delivering a fabricated announcement of her withdrawal, followed by a simulated RTÉ journalist, Sharon Ní Bheoláin, reporting on the supposed announcement, and a correspondent, Paul Cunningham, claiming the election had been cancelled. The narrative falsely suggested that Connolly’s opponent, Heather Humphreys, would automatically win.

Meta’s Policy Enforcement

Meta explained that the video and the RTÉ News AI account were removed for violating the company’s community standards, specifically its policy prohibiting content that impersonates or falsely represents individuals. The platform acted immediately after the Irish Independent alerted it to the video.

Regulatory Oversight

Ireland’s media regulator, Coimisiún na Meán, confirmed awareness of the deepfake and questioned Meta about the measures taken. The regulator’s involvement underscores the growing scrutiny of social‑media platforms regarding political misinformation.

Broader Context

The incident adds to a series of challenges Meta has faced in curbing deepfake and maliciously edited videos involving public figures. Earlier this year, the company’s Oversight Board criticized Meta for insufficient enforcement of its own rules and urged better training for content reviewers to detect AI‑manipulated media. Connolly’s standing in the polls, at 44 points, underscores her prominence in the race and makes the spread of false content about her particularly concerning for democratic processes.

#Meta #deepfake #AI #CatherineConnolly #IrishPresidentialElection #misinformation #Facebook #RTÉNewsAI #PoliticalManipulation #CoimisiúnNaMeán
Generated with News Factory - Source: Engadget