- The original Mixtral only comes in the 8x7B size. If you need a single 7B model, you can use our Llama-2 series models instead: https://github.com/ymcui/Chinese-LLaMA-Alpaca-2
- I'm looking forward to it too. Mistral 7B Instruct v0.2 was just released, and its capabilities surpass Llama 2 13B.
- Thanks! Could you release a 7B version? For ordinary users, the difference between 7B and 8x7B isn't that significant, and a 4-bit quantized 7B model could probably still run on a personal computer (8 GB of VRAM).
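The "8 GB of VRAM" claim above can be sanity-checked with a rough back-of-the-envelope calculation. The sketch below estimates weight memory only (it ignores the KV cache, activations, and framework overhead, which add a couple of gigabytes in practice), and uses the commonly cited figure of roughly 46.7B total parameters for Mixtral 8x7B as an assumption:

```python
def weight_vram_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate VRAM needed just to hold the weights, in GiB."""
    return n_params * bits_per_weight / 8 / 1024**3

# A 7B model in fp16 overflows an 8 GB card, but a 4-bit quantized
# version fits with headroom; Mixtral's total parameter count does not.
fp16_7b = weight_vram_gb(7e9, 16)     # ~13.0 GiB
q4_7b = weight_vram_gb(7e9, 4)        # ~3.3 GiB
q4_8x7b = weight_vram_gb(46.7e9, 4)   # ~21.7 GiB (assumed total param count)

print(f"7B fp16:    {fp16_7b:.1f} GiB")
print(f"7B 4-bit:   {q4_7b:.1f} GiB")
print(f"8x7B 4-bit: {q4_8x7b:.1f} GiB")
```

This supports the commenter's intuition: only the 4-bit 7B configuration leaves room for inference overhead on an 8 GB GPU.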