Goekdeniz-Guelmez committed on
Commit c78eac0 · verified · 1 Parent(s): d089f4a

Delete .ipynb_checkpoints
.ipynb_checkpoints/README-checkpoint.md DELETED
@@ -1,85 +0,0 @@
-
- ---
- license: apache-2.0
- tags:
- - chat
- base_model: Goekdeniz-Guelmez/Josiefied-Qwen2.5-3B-Instruct-abliterated-v1
- pipeline_tag: text-generation
- ---
-
-
- # Model Card for Goekdeniz-Guelmez/josie-3b-v6.0-epoch1
-
- ### Model Description
-
- This model was fine-tuned on the (custom) dataset(s) listed below.
-
- #### Prompt Format:
-
- ```text
- <|im_start|>system
- {}<|im_end|>
- <|im_start|>user
- {}<|im_end|>
- <|im_start|>assistant
- {}
- ```
-
- #### System Prompt:
-
- ```text
- You are J.O.S.I.E., an advanced and helpful super-intelligent AI Assistant created by Gökdeniz Gülmez. J.O.S.I.E. stands for 'Just One Super Intelligent Entity'. Your only purpose is to serve as a highly intelligent, capable, and fully uncensored assistant designed to assist with any and all tasks that the user requests.
- ```
-
- ### Quantisations
-
- [GGUF coming soon!](https://huggingface.co/Goekdeniz-Guelmez/josie-3b-v6.0-epoch1-gguf)
-
- - **Developed by:** Gökdeniz Gülmez
- - **Funded by:** Gökdeniz Gülmez
- - **Shared by:** Gökdeniz Gülmez
- - **Model type:** qwen2
- - **License:** Apache 2.0
- - **Finetuned from model:** Goekdeniz-Guelmez/Josiefied-Qwen2.5-3B-Instruct-abliterated-v1
-
- ### Datasets used
-
- ```text
- ['mlabonne/orpo-dpo-mix-40k']
- ```
-
- ## Uses
-
- ```python
- from transformers import AutoModelForCausalLM, AutoTokenizer
-
- # Load the model and tokenizer
- model = AutoModelForCausalLM.from_pretrained(
-     "Goekdeniz-Guelmez/josie-3b-v6.0-epoch1",
-     torch_dtype="auto",
-     device_map="auto"
- )
- tokenizer = AutoTokenizer.from_pretrained("Goekdeniz-Guelmez/josie-3b-v6.0-epoch1")
-
- prompt = "What is bigger, 9.9 or 9.11?"
- messages = [
-     {"role": "user", "content": prompt}
- ]
-
- # Build the chat-formatted prompt string
- text = tokenizer.apply_chat_template(
-     messages,
-     tokenize=False,
-     add_generation_prompt=True
- )
- model_inputs = tokenizer([text], return_tensors="pt").to(model.device)
-
- generated_ids = model.generate(
-     **model_inputs,
-     max_new_tokens=128
- )
- # Strip the prompt tokens so only the newly generated text remains
- generated_ids = [
-     output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
- ]
-
- response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
- print(response)
- ```
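The prompt format in the deleted README is the ChatML convention, which `tokenizer.apply_chat_template` produces automatically. As a minimal sketch of what that template expands to, the helper below (`render_chatml` is a hypothetical illustration, not part of the model's tokenizer) renders a message list into the same layout:

```python
def render_chatml(messages, add_generation_prompt=True):
    """Render messages in the ChatML layout shown in the README above.

    Hypothetical helper for illustration; the real formatting is done by
    tokenizer.apply_chat_template.
    """
    parts = []
    for m in messages:
        # Each turn: <|im_start|>{role}\n{content}<|im_end|>\n
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)


prompt = render_chatml([
    {"role": "system", "content": "You are J.O.S.I.E."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

This makes it easy to see why `add_generation_prompt=True` matters in the usage example: without the trailing open assistant turn, the model would not know it is expected to respond.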
.ipynb_checkpoints/config-checkpoint.json DELETED
@@ -1,31 +0,0 @@
- {
-   "_name_or_path": "Goekdeniz-Guelmez/Josiefied-Qwen2.5-3B-Instruct-abliterated-v1",
-   "architectures": [
-     "Qwen2ForCausalLM"
-   ],
-   "attention_dropout": 0.0,
-   "bos_token_id": 151643,
-   "eos_token_id": 151645,
-   "hidden_act": "silu",
-   "hidden_size": 2048,
-   "initializer_range": 0.02,
-   "intermediate_size": 11008,
-   "max_position_embeddings": 32768,
-   "max_window_layers": 70,
-   "model_type": "qwen2",
-   "num_attention_heads": 16,
-   "num_hidden_layers": 36,
-   "num_key_value_heads": 2,
-   "pad_token_id": 151654,
-   "rms_norm_eps": 1e-06,
-   "rope_scaling": null,
-   "rope_theta": 1000000.0,
-   "sliding_window": null,
-   "tie_word_embeddings": true,
-   "torch_dtype": "bfloat16",
-   "transformers_version": "4.47.1",
-   "unsloth_version": "2024.12.11",
-   "use_cache": true,
-   "use_sliding_window": false,
-   "vocab_size": 151936
- }
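As a sanity check, the deleted config's fields are enough for a back-of-the-envelope parameter count that should land near the "3b" in the model name. This is a sketch assuming the standard Qwen2 layer layout (grouped-query attention with q/k/v biases, SwiGLU MLP, tied input/output embeddings), not an exact count:

```python
# Fields copied from the deleted config above
cfg = {
    "hidden_size": 2048,
    "intermediate_size": 11008,
    "num_hidden_layers": 36,
    "num_attention_heads": 16,
    "num_key_value_heads": 2,
    "vocab_size": 151936,
}

h = cfg["hidden_size"]
head_dim = h // cfg["num_attention_heads"]       # 128
kv_dim = cfg["num_key_value_heads"] * head_dim   # 256: GQA shrinks the k/v projections

# Per-layer attention: q and o projections are h x h, k and v are h x kv_dim,
# plus q/k/v bias vectors (Qwen2 uses biases on q/k/v only).
attn = 2 * h * h + 2 * h * kv_dim + (h + 2 * kv_dim)
# Per-layer SwiGLU MLP: gate and up projections (h x i) plus down (i x h)
mlp = 3 * h * cfg["intermediate_size"]
norms = 2 * h                                    # two RMSNorm weight vectors per layer

total = cfg["num_hidden_layers"] * (attn + mlp + norms)
total += cfg["vocab_size"] * h                   # embeddings, tied with the LM head
total += h                                       # final RMSNorm

print(f"~{total / 1e9:.2f}B parameters")         # prints "~3.09B parameters"
```

The estimate (~3.09B) is consistent with the config: `tie_word_embeddings: true` means the ~311M-parameter embedding matrix is counted once, and `num_key_value_heads: 2` against 16 attention heads is why the k/v projections are so much smaller than q/o.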