The model is finetuned using a custom version of UltraChat on a TPU-v4 Pod.
LinguaMatic utilizes the llama2 prompting method to generate responses. This method, named after the friendly and intelligent llama, enhances the model's ability to engage in meaningful conversations. The `llama_prompt` function below demonstrates how the llama2 prompting method is implemented:

```python
def llama_prompt(
    message: str,
    chat_history,
    system: str
) -> str:
    do_strip = False
    # Open with the <<SYS>> block when a system prompt is given;
    # otherwise start a bare [INST] turn. Both branches must produce
    # a list so that texts.append() below works.
    texts = [f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n"] if system is not None else ["<s>[INST] "]
    for user_input, response in chat_history:
        user_input = user_input.strip() if do_strip else user_input
        do_strip = True
        # Each prior exchange closes its turn and opens the next [INST] block.
        texts.append(f"{user_input} [/INST] {response.strip()} </s><s>[INST] ")
    message = message.strip() if do_strip else message
    texts.append(f"{message} [/INST]")
    return "".join(texts)
```

The `llama_prompt` function takes a `message` as input, along with the `chat_history` and the `system` prompt. It generates a formatted text that includes the system prompt, user inputs, and the current message. This approach allows LinguaMatic to maintain context and provide more coherent and context-aware responses.
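For a concrete picture of the resulting prompt string, here is a small standalone usage sketch. The function body is repeated so the snippet runs on its own, and the history and system strings are illustrative examples, not part of the repository:

```python
def llama_prompt(message, chat_history, system):
    do_strip = False
    # System prompt goes inside a <<SYS>> block in the first [INST] turn.
    texts = [f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n"] if system is not None else ["<s>[INST] "]
    for user_input, response in chat_history:
        user_input = user_input.strip() if do_strip else user_input
        do_strip = True
        texts.append(f"{user_input} [/INST] {response.strip()} </s><s>[INST] ")
    message = message.strip() if do_strip else message
    texts.append(f"{message} [/INST]")
    return "".join(texts)

# Build a prompt from one prior exchange plus a new user message.
history = [("Hello!", "Hi there!")]
prompt = llama_prompt("What is 2+2?", history, "You are a helpful assistant.")
print(prompt)
# <s>[INST] <<SYS>>
# You are a helpful assistant.
# <</SYS>>
#
# Hello! [/INST] Hi there! </s><s>[INST] What is 2+2? [/INST]
```

Note how each completed exchange is closed with `</s>` and a fresh `<s>[INST]` is opened, so the model always sees the conversation as a chain of instruction/response turns ending in an open instruction.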