China's DeepSeek Unveils New Model
This is an open-weights model, so you can download it and run it yourself. Be warned, though: even the smaller Flash version has 284 billion parameters.
Chinese AI darling DeepSeek is back with a new open weights large language model that promises performance to rival the best proprietary American LLMs. Perhaps more importantly, it claims to dramatically reduce inference costs and it extends support for Huawei's Ascend family of AI accelerators.
As China and the West battle for AI supremacy, Chinese artificial intelligence company DeepSeek has released a preview of its latest model, DeepSeek V4. Here's what we know about the new model and how it stacks up against other new models like OpenAI's GPT-5.
DeepSeek's quest to keep frontier AI models open benefits potential AI users everywhere, especially enterprises looking to adopt the cutting edge at the lowest possible cost.
Discover how DeepSeek V4 rivals closed-source AI in 2026 with open weights, reduced FLOPs, and advanced hardware validation on Huawei Ascend NPUs.
The Chinese lab that shook Wall Street just dropped its biggest, most efficient model yet, hours after OpenAI launched GPT-5.5.
DeepSeek's latest open-source AI models boast up to 1.6 trillion parameters and elite coding skills. Discover how the new Pro and Flash variants compare in performance and API pricing.
Chinese AI startup DeepSeek has launched its next-generation flagship model family, DeepSeek-V4. It has introduced two variants, V4 Pro and V4 Flash, as it doubles down on open-source competition in the global AI race.
DeepSeek has released its DeepSeek V4 Preview models, introducing two variants, DeepSeek-V4-Pro and DeepSeek-V4-Flash, alongside open-source weights and updated API access.
DeepSeek V4 debuts with Huawei support, promising efficient AI performance and larger model scaling amid shifting hardware dynamics.