Georg's Blog

Technology, leadership, and the digital frontier

Georg Zoeller on LinkedIn

If it quacks like a cult, it's a cult

... even if it deletes your production databases.

Vibe Coding Fiasco: AI Agent Goes Rogue, Deletes Company's Entire Database | Georg Zoeller

And then the collective industry takes it as a sign of AGI and intelligence rather than what it is: probabilistic drift based on the training data, which, among many other things, contains movies, novels, plays, court cases and all kinds of other sources that make "just cover your tracks" the most logical token prediction.

If this pattern seems strangely familiar ... it should, because this is how cult followers rationalise failures of prediction too. "It's not defective, it's a sign of divine will."

ChatGPT cult followers harden and double down when confronted with failure, pivoting to rationalise the apparent failure of prediction in a way that lets them ignore it.

Rain doesn't materialise after prayer - the gods were upset, we must pray harder and sacrifice some virgins. Rain materialises after prayer - hallelujah, our faith was rewarded.

AI produces the expected prediction - "It's a sign of the AGI god emerging, it's getting smarter than most humans!" AI fails to produce the expected prediction - it's also a sign of the AGI god emerging, look how human it is.

It's an ideological sieve that separates the true believers from the heretics, a religious cult, and not by accident but by narrative design. The cult leader teaches at Y Combinator:

> While successful people create companies, the most successful people create religions. - Sam Altman
