By Xavier Rivera · 1.5 min read

Musk: xAI partly used OpenAI models via distillation

Elon Musk testified that xAI partly used OpenAI models via distillation to train its own. The admission comes amid rising controversy over the practice's legality and ethics in AI development.

Source: The Verge

Elon Musk testified in a California federal courtroom on Thursday that his startup xAI has used OpenAI’s models to improve its own through model distillation.

Musk stated it was “partly” true that xAI employed model distillation to enhance its models using OpenAI technology. When asked directly whether xAI had distilled OpenAI’s tech, he replied that “generally all the AI companies” engage in the practice, confirming, “Partly.” He added, “It is standard practice to use other AIs to validate your AI.”

Model distillation involves a larger AI model acting as a “teacher” to train a smaller “student” model. The source describes it as a common industry practice, often used legitimately within companies, but also sometimes by smaller labs to mimic larger competitors.
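The teacher/student setup described above is typically trained by matching the student's output distribution to the teacher's temperature-softened outputs. A minimal sketch of that core loss, assuming NumPy and hypothetical toy logits (the function names and values here are illustrative, not from the article or any specific lab's code):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution, softened by temperature."""
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the student's distribution to the teacher's
    "soft labels" -- the quantity a student model is trained to minimize."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Toy logits for a 3-class problem: the loss shrinks as the student's
# outputs move toward the teacher's.
teacher = [4.0, 1.0, 0.5]
student_far = [0.5, 1.0, 4.0]
student_near = [3.8, 1.1, 0.4]
assert distillation_loss(teacher, student_near) < distillation_loss(teacher, student_far)
```

The higher temperature spreads the teacher's probability mass across classes, exposing the relative similarities between wrong answers that a hard label would hide; that extra signal is what lets a smaller student learn from a larger teacher efficiently.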

The technique has sparked controversy and sits in a legal gray area with respect to providers’ terms of service. OpenAI has raised concerns about DeepSeek, Anthropic has accused DeepSeek, Moonshot, and MiniMax of the practice, and Google is seeking to prevent what it calls “distillation attacks.”

Anthropic’s blog post notes: “Distillation is a widely used and legitimate training method. For example, frontier AI labs routinely distill their own models to create smaller, cheaper versions for their customers. But distillation can also be used for illicit purposes: competitors can use it to acquire powerful capabilities from other labs in a fraction of the time, and at a fraction of the cost, that it would take to develop them independently.”
