deepseek-v3:671b-q8_0

1.4M pulls · Updated 4 months ago

A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.
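To make the total-vs-activated parameter distinction concrete, below is a minimal, hypothetical sketch of top-k expert routing in PyTorch. The expert count, hidden size, and `top_k` value are illustrative placeholders, not DeepSeek-V3's actual configuration; the point is only that a learned gate selects a small subset of experts per token, so most parameters sit idle on any given forward pass.

```python
# Hypothetical top-k MoE routing sketch (illustrative sizes, not DeepSeek-V3's).
import torch
import torch.nn as nn

class TopKMoE(nn.Module):
    def __init__(self, dim=1024, num_experts=8, top_k=2):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x):  # x: (num_tokens, dim)
        scores = self.gate(x).softmax(dim=-1)           # routing probabilities per token
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep only the top-k experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                   # tokens routed to expert e at slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out
```

Total parameters scale with `num_experts`, but each token's compute touches only `top_k` of them, which is why a 671B-parameter model can run with 37B active per token.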

671b · 5 months ago

96061c74c1a5 · 713GB

deepseek2 · 671B · Q8_0
License: DEEPSEEK LICENSE AGREEMENT Version 1.0, 23 October 2023 …

Params: { "stop": [ "<|begin▁of▁sentence|>", "<|end▁of▁sentence|>", … ] }

Template: {{- range $i, $_ := .Messages }}{{- if eq .Role "user" }}<|User|>{{- else if eq .Role "assistant" }} …

Readme

Note: this model requires Ollama 0.5.5 or later.
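Once pulled, the model can be queried with `ollama run deepseek-v3:671b-q8_0` on the command line, or from Python via the official `ollama` client, as in the sketch below. It assumes a local Ollama server (0.5.5 or later) with the model already pulled; note the 713GB download.

```python
# Minimal chat request via the official ollama Python client (pip install ollama).
# Assumes a local Ollama >= 0.5.5 server and the ~713GB model already pulled.
import ollama

response = ollama.chat(
    model="deepseek-v3:671b-q8_0",
    messages=[{"role": "user", "content": "Explain mixture-of-experts in one paragraph."}],
)
print(response["message"]["content"])
```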

DeepSeek-V3 achieves a significant breakthrough in inference speed over previous models. It tops the leaderboard among open-source models and rivals the most advanced closed-source models globally.

References

GitHub: https://github.com/deepseek-ai/DeepSeek-V3

Paper: https://arxiv.org/abs/2412.19437