DeepSeek

Company · Tracked keywords: deepseek

12 AI Weekly issues and 15 live stories from the last 30 days mention Deepseek.

DeepSeek redefined the open-weight ceiling in late April. V4-Pro (a 1.6T-parameter MoE) and V4-Flash (284B) shipped on April 24, both with 1M-token context windows and both under Apache 2.0, with quality reportedly close to Claude Opus 4.6 non-thinking and best-in-class agentic coding among open-weight models.

The financial story shifted at the same time: Tencent and Alibaba are in talks to fund DeepSeek above $20B, giving the open-weight strategy its own checkbook for the first time. The accompanying technical report introduced FP4 MoE training, Compressed Sparse Attention, and the lowest KV-cache footprint of any frontier release. AI Weekly tracks every DeepSeek release and the broader open-weight race.

Mention Trend — Last 20 Issues

Live Stories (15)