June 2024 · Breakthrough

338 languages; coding parity with GPT-4 Turbo

DeepSeek-Coder V2 adopted a Mixture-of-Experts (MoE) architecture, expanded language support from 86 to 338 programming languages, and extended its context window from 16K to 128K tokens. On multiple advanced coding benchmarks it caught up with GPT-4 Turbo, and on some it surpassed it. Only seven months had passed since the original Coder release, yet a small team had reached parity with top closed-source models in a deeply vertical domain. Skeptics who had argued that open-source code models faced a hard ceiling went quiet. Developer communities celebrated: there was now a top-tier coding model that could be deployed fully privately and used commercially for free.
