Satirical ‘Center for the Alignment of AI Alignment Centers’ Mocks AI Safety Culture

Key Points
- CAAAC is a satirical website mimicking AI alignment research centers.
- Created by the team behind a wearable "Box" for women to prevent deepfakes.
- Site design appears professional but hides jokes like profanity‑spelling graphics.
- Job listings require belief that AGI will annihilate humanity within six months.
- A generative AI tool claims to build a fake AI center in under a minute.
- Researchers such as Kendra Albert initially mistook it for a legitimate effort.
- Co‑founder Louis Barclay calls the site "the most important thing" about AI.
- Applicants are greeted with Rick Astley's "Never Gonna Give You Up" as a prank.

A spoof website called the Center for the Alignment of AI Alignment Centers presents itself as a hub for AI safety researchers, but hides jokes and absurd claims throughout. Created by the team behind a wearable "Box" for women to deter deepfakes, the site features calming visuals that conceal hidden messages, a job board that asks applicants to believe AGI will annihilate all humans within six months, and a generative AI tool that promises to build a fake AI center in under a minute. Researchers such as Kendra Albert initially mistook it for a legitimate effort before recognizing the satire.
Origins and Design
The Center for the Alignment of AI Alignment Centers (CAAAC) launched as a satirical project by the same team that created a literal, physical box that women can wear on dates to avoid deepfake threats. The website adopts a calm, professional aesthetic with a logo of converging arrows and parallel lines, mimicking the look of genuine AI alignment labs.
Hidden Jokes and Absurd Claims
Visitors who linger on the page discover hidden jokes: swirling graphics that spell out a profanity after a brief pause, and a jobs page that asks applicants to affirm that "AGI will annihilate all humans in the next six months" before they can apply. The site also offers a generative AI tool that claims it can create a full AI center, including an executive director, in "less than a minute, zero AI knowledge required." Further down the recruitment funnel, users are serenaded by Rick Astley's "Never Gonna Give You Up," revealing the prank.
Public Reaction
Even seasoned researchers were initially fooled. Kendra Albert, a machine learning researcher and technology attorney, encountered the site and thought it might be real, noting how closely its vibe matched authentic AI alignment organizations. After a closer look, Albert recognized the satire and explained that the project mocks a trend of focusing on highly theoretical AI risks while ignoring concrete problems such as model bias, energy consumption, and workforce displacement.
Founders' Statements
Co‑founder Louis Barclay, speaking in character, described the site as "the most important thing that anyone will read about AI in this millennium or the next." The second founder chose to remain anonymous. Both emphasized the parody’s aim to highlight how some AI safety discourse drifts away from real‑world issues.
Key Features
- Calm visual design that masks hidden jokes.
- Job postings that demand belief in imminent AGI catastrophe.
- A one‑click AI tool promising to generate a fake AI center.
- Rick Astley’s song as a final reveal for job applicants.
The CAAAC site encourages visitors to comment on its LinkedIn announcement to become a fellow, further blurring the line between parody and genuine community building. By inviting applicants to bring "wet gear" and promising a "global" workforce based in the Bay Area, the project satirically critiques the hype surrounding AI alignment research.