NewsPulse
Tech · 1 day ago · 1 min read

DeepSeek Unveils V4 Models With 1 Million Token Context Window, Challenging OpenAI and Anthropic

China's DeepSeek released preview versions of its V4 Flash and V4 Pro models, claiming top-tier performance in coding benchmarks and major advancements in reasoning and agentic tasks. The new models support a 1 million-token context window and introduce a Hybrid Attention Architecture for improved long-conversation memory.

DeepSeek V4 Announcement

A year after rattling Silicon Valley with its technology, China's DeepSeek rolled out preview versions of new flagship artificial intelligence models, calling them the most powerful open-source platform in a challenge to rivals from OpenAI to Anthropic PBC. The Chinese startup unveiled the V4 Flash and V4 Pro series, touting top-tier performance on coding benchmarks and major advances in reasoning and agentic tasks.

Technical Innovations

DeepSeek singled out a technique it calls Hybrid Attention Architecture, which it says improves a model's ability to remember queries across long conversations. It also highlighted the 1 million-token context window, a leap that allows entire codebases or long documents to be sent as a single prompt.
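To give a rough sense of scale, the sketch below estimates whether a set of source files would fit in a single 1 million-token prompt. The ~4 characters-per-token ratio is a common heuristic, not a published figure for DeepSeek's tokenizer, so treat the numbers as an illustration only:

```python
# Assumptions: ~4 characters per token is a rough industry heuristic;
# the 1M-token limit comes from the announcement, the tokenizer does not.
CHARS_PER_TOKEN = 4
CONTEXT_WINDOW = 1_000_000

def estimate_tokens(text: str) -> int:
    """Crude token estimate: character count divided by an assumed ratio."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(texts: list[str]) -> bool:
    """Check whether concatenating the given texts (e.g. source files)
    would fit into one 1M-token prompt under the heuristic above."""
    total = sum(estimate_tokens(t) for t in texts)
    return total <= CONTEXT_WINDOW
```

By this rough math, roughly 4 MB of source text, on the order of a mid-sized codebase, could be sent as a single prompt.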

Competitive Standing

The Chinese startup says DeepSeek-V4-Pro beats all rival open models at math and coding. While Silicon Valley retains a slight edge in developing the most advanced models, Chinese companies have "effectively closed" the AI performance gap with their US rivals, according to the Stanford AI Index 2026.
