Baichuan 7B ChatML

Introduction

baichuan-7B-chatml is a multi-turn dialogue model compatible with ChatML, fine-tuned from baichuan-7B. The model may be used commercially; however, per the baichuan-7B terms, using baichuan-7B derivatives for commercial purposes requires contacting the baichuan-7B licensor.

Note: on factual-knowledge tasks, the model may generate incorrect information or produce unstable output (sometimes it returns the correct answer, sometimes it does not).
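
For reference, ChatML wraps each turn in <|im_start|> and <|im_end|> markers. The sketch below shows, in plain Python, roughly how such a prompt could be assembled by hand; it is illustrative only, and the exact template that this checkpoint's chat() helper builds internally may differ.

# Illustrative sketch of a ChatML-style prompt; the exact template used by
# model.chat() in this repository may differ.
def build_chatml_prompt(history, user_message):
    """history is a list of (user_text, assistant_text) pairs."""
    parts = []
    for user_text, assistant_text in history:
        parts.append(f"<|im_start|>user\n{user_text}<|im_end|>")
        parts.append(f"<|im_start|>assistant\n{assistant_text}<|im_end|>")
    parts.append(f"<|im_start|>user\n{user_message}<|im_end|>")
    parts.append("<|im_start|>assistant\n")  # the model continues from here
    return "\n".join(parts)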

Examples

Building on baichuan-7B, the model provides a function that can be called for multi-turn dialogue.

>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> tokenizer = AutoTokenizer.from_pretrained("tibok/baichuan-7B-chatml", trust_remote_code=True)
>>> model = AutoModelForCausalLM.from_pretrained("tibok/baichuan-7B-chatml", device_map="auto", trust_remote_code=True)
>>> response, history = model.chat(tokenizer, "请以『春天的北京』为题写一首诗歌", history=[])
春天的北京,
花开万丈,
春意盎然,
清风送暖。
<|im_end|>
>>> response, history = model.chat(tokenizer, "能不能再写一首关于香山的?", history=history)
>>> print(response)
香山之巅,
芳草连天。
清泉潺潺,
山峦绵绵。
<|im_end|>
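
The history value returned by chat() carries the earlier turns, which is why passing it back in the second call preserves the conversation context; passing history=[] starts a fresh conversation. A minimal illustration (the prompt string here is only an example):

>>> # Start an unrelated conversation by resetting the history
>>> response, history = model.chat(tokenizer, "请用一句话介绍你自己", history=[])
>>> print(response)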
