Model Collapse
Model collapse is a phenomenon in which AI models progressively degrade when trained on data generated by other AI models rather than on human-created content. Each generation of training loses more of the diversity and nuance of the original data distribution: rare but important information vanishes first, and later generations produce increasingly repetitive or nonsensical outputs. The effect, documented in a study published in Nature in 2024, has raised serious concerns as AI-generated content makes up a growing share of the internet data used for training.
Example
Researchers fine-tuned a language model on Wikipedia articles, used its output to train a second model, and repeated this process. After nine generations of training on AI-generated text, the model went from producing coherent paragraphs about architecture to outputting repetitive, nonsensical sentences, demonstrating how recursive training on AI-generated data leads to irreversible degradation.
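The dynamic can be illustrated with a toy statistical sketch (not the Nature study's setup, and all names here are illustrative): fit a Gaussian to some data, sample "synthetic" data from the fit, refit on those samples, and repeat. With finite samples at each step, the estimated spread drifts toward zero over generations, which is the same mechanism by which rare information disappears first.

```python
# Toy simulation of model collapse: each "generation" fits a Gaussian
# to samples drawn from the previous generation's fitted Gaussian.
# With finite samples, the fitted spread drifts toward zero, so tail
# (rare) values vanish over generations.
import numpy as np

def simulate_collapse(n_samples=50, generations=2000, seed=0):
    """Return the fitted standard deviation at each generation."""
    rng = np.random.default_rng(seed)
    data = rng.normal(0.0, 1.0, n_samples)  # "human" data: samples from N(0, 1)
    stds = []
    for _ in range(generations):
        mu, sigma = data.mean(), data.std()      # "train" a model on current data
        stds.append(sigma)
        data = rng.normal(mu, sigma, n_samples)  # next generation trains on model output
    return stds

stds = simulate_collapse()
print(f"generation 1 fitted std: {stds[0]:.3f}")
print(f"final generation fitted std: {stds[-1]:.2e}")
```

The collapse happens even though each individual fit is a reasonable estimate; small sampling errors compound multiplicatively across generations, so the distribution's tails are lost long before the mean shifts noticeably.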