vicuna:13b-v1.5-16k-q4_0

183.2K pulls · updated 1 year ago

General-purpose chat model based on Llama and Llama 2, with context sizes from 2K to 16K.

7b 13b 33b
7adfc8235793 · 76B
{
  "num_ctx": 16384,
  "rope_frequency_scale": 0.125,
  "stop": [
    "USER:",
    "ASSISTANT:"
  ]
}
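The parameters above fit together: Llama 2 was pretrained with a 2,048-token context, so extending this variant to 16,384 tokens uses linear RoPE scaling with a frequency scale of 2048 / 16384 = 0.125, and the stop strings match the Vicuna turn markers. A minimal Python sketch of both relationships (the one-line prompt template is the common Vicuna convention, shown here without the optional system preamble; the function name is illustrative):

```python
# Linear RoPE scaling: frequency scale = base context / extended context.
BASE_CTX = 2048    # Llama 2 pretraining context length
NUM_CTX = 16384    # extended context of the 16k variant

rope_frequency_scale = BASE_CTX / NUM_CTX
print(rope_frequency_scale)  # 0.125, matching the parameter above


def vicuna_prompt(user_msg: str) -> str:
    """Build a single-turn prompt using the Vicuna turn markers.

    The "USER:" / "ASSISTANT:" stop strings in the parameters exist to
    cut generation off before the model starts writing the next turn.
    """
    return f"USER: {user_msg} ASSISTANT:"


print(vicuna_prompt("Hello"))  # USER: Hello ASSISTANT:
```

With `num_ctx` raised, the stop strings are what keep a long-context chat from running past the user's turn.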