
Techzim
January 29, 2025 at 12:23 PM
*OpenAI says it has seen some evidence that DeepSeek used "distillation" to train its open-source competitor on outputs from OpenAI's proprietary models*
OpenAI claims it has evidence that Chinese AI startup DeepSeek used OpenAI's models to train its own competing AI system through "distillation" - a technique where smaller models are trained using outputs from larger ones to achieve similar results at lower cost.
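For readers curious what "distillation" actually looks like in practice, here is a toy NumPy sketch of the general idea: a small "student" model is trained to match the softened output probabilities of a larger "teacher" model. This is an illustrative assumption-laden example of the generic technique, not DeepSeek's or OpenAI's actual pipeline; all model shapes and hyperparameters here are made up.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature T > 1 softens the distribution, a common choice in distillation
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 8))                     # toy input batch
W_teacher = rng.normal(size=(8, 4))              # stand-in for a large "teacher" model
teacher_probs = softmax(X @ W_teacher, T=2.0)    # softened teacher outputs = distillation targets

# Train a "student" of the same toy shape to imitate the teacher's outputs
W_student = np.zeros((8, 4))
lr = 0.5
for _ in range(200):
    student_probs = softmax(X @ W_student, T=2.0)
    # Gradient of cross-entropy between teacher and student distributions w.r.t. W
    grad = X.T @ (student_probs - teacher_probs) / len(X)
    W_student -= lr * grad

# After training, the KL divergence (teacher || student) should be close to zero
student_probs = softmax(X @ W_student, T=2.0)
kl = np.mean(np.sum(teacher_probs * np.log(teacher_probs / student_probs), axis=1))
print(f"KL divergence after distillation: {kl:.4f}")
```

In real large-model distillation the "teacher outputs" would be text generations or token probabilities from an API, and the student a full neural network, but the principle is the same: train on the bigger model's outputs instead of (or in addition to) raw data.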
While distillation is common in AI development, using it to build rival models violates OpenAI's terms of service. This allegation comes alongside claims from Scale AI CEO Alexandr Wang that DeepSeek has obtained 50,000 restricted Nvidia H100 chips, potentially circumventing US export controls to China.
The controversy has impacted US AI companies' valuations, with reports suggesting DeepSeek can achieve comparable results at a fraction of the cost - spending $5 million where US companies spend $100 million, and using 2,000 GPUs where others need 100,000.
However, given the market impact and falling valuations of US AI firms, questions remain whether allegations against DeepSeek are driven by genuine concerns or market pressures.