蒸馏 (zhēng liú) — Distillation — borrowed from chemistry
Distillation — borrowed from chemistry, now a buzzword in Chinese AI circles. Refers to knowledge distillation in two related senses: (1) training a small, efficient model to replicate the behaviour of a much larger one; (2) extracting a specific person's expertise, writing style, or skills by training a model on their articles, videos, or public output — effectively "distilling" that person into an AI. After DeepSeek released its distilled models in early 2025, 蒸馏 became everyday slang across both meanings.
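The first sense, training a small model to mimic a large one, is typically done by matching the student's output distribution to the teacher's temperature-softened outputs. A minimal sketch of that core loss, assuming NumPy and illustrative function names (not from any particular library):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature gives softer probabilities."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the student's,
    the objective the student minimizes during knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # teacher's "soft targets"
    q = softmax(student_logits, temperature)  # student's predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher = [3.0, 1.0, 0.2]
loss_matched = distillation_loss(teacher, teacher)        # student copies teacher
loss_reversed = distillation_loss(teacher, [0.2, 1.0, 3.0])
```

A student whose logits match the teacher's incurs zero loss, while a mismatched student is penalized; training drives the small model toward the large one's behaviour.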
Example Usage
有人把这位教授所有的论文和讲座视频都拿去训练模型,直接把他蒸馏了。
Someone fed all this professor's papers and lecture videos into a model and basically distilled him into an AI.
Cultural Context
Exploded in popularity after DeepSeek's R1 distilled models went viral in January 2025. The human-distillation sense resonates strongly in Chinese tech culture: people joke about "蒸馏" a famous blogger, investor, or expert by feeding all their content into a model to clone their thinking style. You'll see phrases like "把他蒸馏了" ("distil him") used half-jokingly when someone wants to replicate a person's knowledge or voice via AI. The chemistry metaphor is apt — just as distillation concentrates the essence of a substance, AI distillation concentrates the essence of a model or a person's expertise into something compact and reusable.
Category: internet-culture