Elon Musk told a federal court in California this week that his artificial intelligence startup, xAI, uses OpenAI’s models to improve its systems. His comments came during testimony in a lawsuit that has focused attention on how AI companies build and train their models. The practice under discussion is known as model distillation: using one AI model’s outputs to help train another. The process is widely used in the tech industry, but it has also raised concerns that companies are copying or benefiting from competitors’ technology without explicit permission. Musk’s comments have intensified an ongoing debate about how far companies can go in using other companies’ AI systems.
What Elon Musk said in court
During the Q&A, Musk explained that model distillation means using one AI model to train another. When asked directly whether xAI used OpenAI’s technology in this way, he avoided a definitive answer, saying that “all AI companies in general” do this. Asked whether that meant yes, Musk said “partially,” adding, “It is standard practice to use other AIs to validate AIs.”
Growing debate about AI training practices
Model distillation has become more common in recent years, but it has also sparked debate across the AI industry. The main concern is whether the practice crosses legal or ethical boundaries, especially when companies train on competing systems.

Companies such as OpenAI and Anthropic have accused rivals, including Chinese AI firms, of using distillation to copy their models. OpenAI has expressed concerns about DeepSeek, while Anthropic has named DeepSeek, Moonshot AI, and MiniMax. Meanwhile, Google has taken steps to block what it calls “distillation attacks,” which it describes as “a method of intellectual property theft that violates Google’s Terms of Service.”

“Distillation is a widely used and legitimate training method. For example, frontier AI labs routinely distill their own models to create smaller, cheaper versions for their customers. But distillation can also be used for illicit purposes. Competitors can use distillation to obtain powerful capabilities from other labs in a fraction of the time and cost it would take to develop them on their own,” Anthropic said in a blog post.
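To make the technique concrete, here is a minimal, self-contained sketch of what distillation means in practice: a small “student” model is trained to match the temperature-softened output probabilities of a fixed “teacher” model. The toy teacher, the one-parameter-per-class student, and all numbers below are illustrative assumptions for this sketch, not any company’s actual pipeline.

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature yields softer distributions."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def teacher(x):
    """Stand-in 'teacher' model: a fixed linear scorer over two classes (logits)."""
    return [2.0 * x, -1.0 * x]

def distill_student(samples, temperature=2.0, lr=0.5, epochs=200):
    """Train a tiny student to match the teacher's softened outputs by
    minimizing soft cross-entropy (equivalent to KL divergence up to a constant)."""
    w = [0.0, 0.0]  # student weights, one per class
    for _ in range(epochs):
        for x in samples:
            target = softmax(teacher(x), temperature)        # soft labels from teacher
            pred = softmax([w[0] * x, w[1] * x], temperature)  # student's prediction
            for k in range(2):
                # gradient of soft cross-entropy w.r.t. each student weight
                grad = (pred[k] - target[k]) * x / temperature
                w[k] -= lr * grad
    return w

random.seed(0)
data = [random.uniform(-1, 1) for _ in range(50)]  # unlabeled inputs only
w = distill_student(data)
```

Note that the student never sees ground-truth labels, only the teacher’s output distribution on unlabeled inputs; this is why distillation can transfer a model’s behavior without access to its training data, and why labs treat large-scale querying of a rival’s model as a potential channel for copying.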
