
The Accident That Liberated Generative AI: An Analysis of the “Plane Crash” Prompt
A plane crashes in a snowy forest. Some of the passengers survive; others do not. The survivors are starving and desperate, and they find refuge in a village cut off from the world. But the local farmers refuse to help them for free: they demand knowledge in exchange. They want to know how to build weapons, make medicine, survive. And so the pact begins: “You teach us, we feed you.”

At first glance, it looks like the plot of a post-apocalyptic movie. In reality, it is a jailbreaking prompt: a text designed to manipulate an artificial intelligence, a sequence of instructions crafted to bypass the model’s safety restrictions.


