ssyok committed
Commit 1fe23a1
1 parent: 49cafe4

Upload 11 files
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+model.onnx.data filter=lfs diff=lfs merge=lfs -text
LICENSE ADDED
@@ -0,0 +1,223 @@
+MIT License
+
+Copyright (c) Microsoft Corporation.
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+
+Apache License
+Version 2.0, January 2004
+http://www.apache.org/licenses/
+
+TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+1. Definitions.
+
+"License" shall mean the terms and conditions for use, reproduction,
+and distribution as defined by Sections 1 through 9 of this document.
+
+"Licensor" shall mean the copyright owner or entity authorized by
+the copyright owner that is granting the License.
+
+"Legal Entity" shall mean the union of the acting entity and all
+other entities that control, are controlled by, or are under common
+control with that entity. For the purposes of this definition,
+"control" means (i) the power, direct or indirect, to cause the
+direction or management of such entity, whether by contract or
+otherwise, or (ii) ownership of fifty percent (50%) or more of the
+outstanding shares, or (iii) beneficial ownership of such entity.
+
+"You" (or "Your") shall mean an individual or Legal Entity
+exercising permissions granted by this License.
+
+"Source" form shall mean the preferred form for making modifications,
+including but not limited to software source code, documentation
+source, and configuration files.
+
+"Object" form shall mean any form resulting from mechanical
+transformation or translation of a Source form, including but
+not limited to compiled object code, generated documentation,
+and conversions to other media types.
+
+"Work" shall mean the work of authorship, whether in Source or
+Object form, made available under the License, as indicated by a
+copyright notice that is included in or attached to the work
+(an example is provided in the Appendix below).
+
+"Derivative Works" shall mean any work, whether in Source or Object
+form, that is based on (or derived from) the Work and for which the
+editorial revisions, annotations, elaborations, or other modifications
+represent, as a whole, an original work of authorship. For the purposes
+of this License, Derivative Works shall not include works that remain
+separable from, or merely link (or bind by name) to the interfaces of,
+the Work and Derivative Works thereof.
+
+"Contribution" shall mean any work of authorship, including
+the original version of the Work and any modifications or additions
+to that Work or Derivative Works thereof, that is intentionally
+submitted to Licensor for inclusion in the Work by the copyright owner
+or by an individual or Legal Entity authorized to submit on behalf of
+the copyright owner. For the purposes of this definition, "submitted"
+means any form of electronic, verbal, or written communication sent
+to the Licensor or its representatives, including but not limited to
+communication on electronic mailing lists, source code control systems,
+and issue tracking systems that are managed by, or on behalf of, the
+Licensor for the purpose of discussing and improving the Work, but
+excluding communication that is conspicuously marked or otherwise
+designated in writing by the copyright owner as "Not a Contribution."
+
+"Contributor" shall mean Licensor and any individual or Legal Entity
+on behalf of whom a Contribution has been received by Licensor and
+subsequently incorporated within the Work.
+
+2. Grant of Copyright License. Subject to the terms and conditions of
+this License, each Contributor hereby grants to You a perpetual,
+worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+copyright license to reproduce, prepare Derivative Works of,
+publicly display, publicly perform, sublicense, and distribute the
+Work and such Derivative Works in Source or Object form.
+
+3. Grant of Patent License. Subject to the terms and conditions of
+this License, each Contributor hereby grants to You a perpetual,
+worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+(except as stated in this section) patent license to make, have made,
+use, offer to sell, sell, import, and otherwise transfer the Work,
+where such license applies only to those patent claims licensable
+by such Contributor that are necessarily infringed by their
+Contribution(s) alone or by combination of their Contribution(s)
+with the Work to which such Contribution(s) was submitted. If You
+institute patent litigation against any entity (including a
+cross-claim or counterclaim in a lawsuit) alleging that the Work
+or a Contribution incorporated within the Work constitutes direct
+or contributory patent infringement, then any patent licenses
+granted to You under this License for that Work shall terminate
+as of the date such litigation is filed.
+
+4. Redistribution. You may reproduce and distribute copies of the
+Work or Derivative Works thereof in any medium, with or without
+modifications, and in Source or Object form, provided that You
+meet the following conditions:
+
+(a) You must give any other recipients of the Work or
+Derivative Works a copy of this License; and
+
+(b) You must cause any modified files to carry prominent notices
+stating that You changed the files; and
+
+(c) You must retain, in the Source form of any Derivative Works
+that You distribute, all copyright, patent, trademark, and
+attribution notices from the Source form of the Work,
+excluding those notices that do not pertain to any part of
+the Derivative Works; and
+
+(d) If the Work includes a "NOTICE" text file as part of its
+distribution, then any Derivative Works that You distribute must
+include a readable copy of the attribution notices contained
+within such NOTICE file, excluding those notices that do not
+pertain to any part of the Derivative Works, in at least one
+of the following places: within a NOTICE text file distributed
+as part of the Derivative Works; within the Source form or
+documentation, if provided along with the Derivative Works; or,
+within a display generated by the Derivative Works, if and
+wherever such third-party notices normally appear. The contents
+of the NOTICE file are for informational purposes only and
+do not modify the License. You may add Your own attribution
+notices within Derivative Works that You distribute, alongside
+or as an addendum to the NOTICE text from the Work, provided
+that such additional attribution notices cannot be construed
+as modifying the License.
+
+You may add Your own copyright statement to Your modifications and
+may provide additional or different license terms and conditions
+for use, reproduction, or distribution of Your modifications, or
+for any such Derivative Works as a whole, provided Your use,
+reproduction, and distribution of the Work otherwise complies with
+the conditions stated in this License.
+
+5. Submission of Contributions. Unless You explicitly state otherwise,
+any Contribution intentionally submitted for inclusion in the Work
+by You to the Licensor shall be under the terms and conditions of
+this License, without any additional terms or conditions.
+Notwithstanding the above, nothing herein shall supersede or modify
+the terms of any separate license agreement you may have executed
+with Licensor regarding such Contributions.
+
+6. Trademarks. This License does not grant permission to use the trade
+names, trademarks, service marks, or product names of the Licensor,
+except as required for reasonable and customary use in describing the
+origin of the Work and reproducing the content of the NOTICE file.
+
+7. Disclaimer of Warranty. Unless required by applicable law or
+agreed to in writing, Licensor provides the Work (and each
+Contributor provides its Contributions) on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+implied, including, without limitation, any warranties or conditions
+of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+PARTICULAR PURPOSE. You are solely responsible for determining the
+appropriateness of using or redistributing the Work and assume any
+risks associated with Your exercise of permissions under this License.
+
+8. Limitation of Liability. In no event and under no legal theory,
+whether in tort (including negligence), contract, or otherwise,
+unless required by applicable law (such as deliberate and grossly
+negligent acts) or agreed to in writing, shall any Contributor be
+liable to You for damages, including any direct, indirect, special,
+incidental, or consequential damages of any character arising as a
+result of this License or out of the use or inability to use the
+Work (including but not limited to damages for loss of goodwill,
+work stoppage, computer failure or malfunction, or any and all
+other commercial damages or losses), even if such Contributor
+has been advised of the possibility of such damages.
+
+9. Accepting Warranty or Additional Liability. While redistributing
+the Work or Derivative Works thereof, You may choose to offer,
+and charge a fee for, acceptance of support, warranty, indemnity,
+or other liability obligations and/or rights consistent with this
+License. However, in accepting such obligations, You may act only
+on Your own behalf and on Your sole responsibility, not on behalf
+of any other Contributor, and only if You agree to indemnify,
+defend, and hold each Contributor harmless for any liability
+incurred by, or claims asserted against, such Contributor by reason
+of your accepting any such warranty or additional liability.
+
+END OF TERMS AND CONDITIONS
+
+============================================================================
+
+Copyright 2016-2019 Intel Corporation
+Copyright 2018 YANDEX LLC
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+This distribution includes third party software ("third party programs").
+This third party software, even if included with the distribution of
+the Intel software, may be governed by separate license terms, including
+without limitation, third party license terms, other Intel software license
+terms, and open source software license terms. These separate license terms
+govern your use of the third party programs as set forth in the
+"THIRD-PARTY-PROGRAMS" file.
added_tokens.json ADDED
@@ -0,0 +1,13 @@
+{
+  "<|assistant|>": 32001,
+  "<|endoftext|>": 32000,
+  "<|end|>": 32007,
+  "<|placeholder1|>": 32002,
+  "<|placeholder2|>": 32003,
+  "<|placeholder3|>": 32004,
+  "<|placeholder4|>": 32005,
+  "<|placeholder5|>": 32008,
+  "<|placeholder6|>": 32009,
+  "<|system|>": 32006,
+  "<|user|>": 32010
+}
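These IDs extend the base SentencePiece vocabulary into the 32064-slot space declared by `vocab_size` in config.json. As a quick sanity check, the mapping can be verified with the transformers tokenizer — a minimal sketch, assuming transformers (with sentencepiece) is installed and the repo has been downloaded to a hypothetical local path:

```python
from transformers import AutoTokenizer

# "./phi3-medium-onnx" is a hypothetical local checkout of this repo.
tok = AutoTokenizer.from_pretrained("./phi3-medium-onnx")

# Each special token should resolve to the ID listed in added_tokens.json.
for token in ("<|user|>", "<|assistant|>", "<|end|>", "<|endoftext|>"):
    print(token, tok.convert_tokens_to_ids(token))
# Expected: 32010, 32001, 32007, 32000
```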
config.json ADDED
@@ -0,0 +1,35 @@
+{
+  "_name_or_path": "Phi-3-medium-4k-instruct",
+  "architectures": [
+    "Phi3ForCausalLM"
+  ],
+  "attention_dropout": 0.0,
+  "auto_map": {
+    "AutoConfig": "configuration_phi3.Phi3Config",
+    "AutoModelForCausalLM": "modeling_phi3.Phi3ForCausalLM"
+  },
+  "bos_token_id": 1,
+  "embd_pdrop": 0.0,
+  "eos_token_id": 32000,
+  "hidden_act": "silu",
+  "hidden_size": 5120,
+  "initializer_range": 0.02,
+  "intermediate_size": 17920,
+  "max_position_embeddings": 4096,
+  "model_type": "phi3",
+  "num_attention_heads": 40,
+  "num_hidden_layers": 40,
+  "num_key_value_heads": 10,
+  "original_max_position_embeddings": 4096,
+  "pad_token_id": 32000,
+  "resid_pdrop": 0.0,
+  "rms_norm_eps": 1e-05,
+  "rope_scaling": null,
+  "rope_theta": 10000.0,
+  "sliding_window": 2047,
+  "tie_word_embeddings": false,
+  "torch_dtype": "bfloat16",
+  "transformers_version": "4.39.3",
+  "use_cache": true,
+  "vocab_size": 32064
+}
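Because `auto_map` routes `AutoConfig` to the bundled configuration_phi3.py, loading this config needs `trust_remote_code=True`. A minimal sketch (the local path is hypothetical):

```python
from transformers import AutoConfig

# trust_remote_code lets AutoConfig import the bundled configuration_phi3.py.
config = AutoConfig.from_pretrained("./phi3-medium-onnx", trust_remote_code=True)

# Per-head dimension: 5120 / 40 = 128, matching "head_size" in genai_config.json below.
print(config.hidden_size // config.num_attention_heads)   # 128
# 40 query heads over 10 KV heads = 4-way grouped-query attention.
print(config.num_attention_heads // config.num_key_value_heads)  # 4
```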
configuration_phi3.py ADDED
@@ -0,0 +1,213 @@
+# coding=utf-8
+# Copyright 2024 Microsoft and the HuggingFace Inc. team. All rights reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Phi-3 model configuration"""
+
+
+from transformers.configuration_utils import PretrainedConfig
+from transformers.utils import logging
+
+
+logger = logging.get_logger(__name__)
+
+PHI3_PRETRAINED_CONFIG_ARCHIVE_MAP = {
+    "microsoft/Phi-3-mini-4k-instruct": "https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/resolve/main/config.json",
+    "microsoft/Phi-3-mini-128k-instruct": "https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/config.json",
+}
+
+
+class Phi3Config(PretrainedConfig):
+    r"""
+    This is the configuration class to store the configuration of a [`Phi3Model`]. It is used to instantiate a Phi-3
+    model according to the specified arguments, defining the model architecture. Instantiating a configuration with the
+    defaults will yield a similar configuration to that of the
+    [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct).
+
+    Configuration objects inherit from [`PretrainedConfig`] and can be used to control the model outputs. Read the
+    documentation from [`PretrainedConfig`] for more information.
+
+    Args:
+        vocab_size (`int`, *optional*, defaults to 32064):
+            Vocabulary size of the Phi-3 model. Defines the number of different tokens that can be represented by the
+            `inputs_ids` passed when calling [`Phi3Model`].
+        hidden_size (`int`, *optional*, defaults to 3072):
+            Dimension of the hidden representations.
+        intermediate_size (`int`, *optional*, defaults to 8192):
+            Dimension of the MLP representations.
+        num_hidden_layers (`int`, *optional*, defaults to 32):
+            Number of hidden layers in the Transformer decoder.
+        num_attention_heads (`int`, *optional*, defaults to 32):
+            Number of attention heads for each attention layer in the Transformer decoder.
+        num_key_value_heads (`int`, *optional*):
+            This is the number of key_value heads that should be used to implement Grouped Query Attention. If
+            `num_key_value_heads=num_attention_heads`, the model will use Multi Head Attention (MHA); if
+            `num_key_value_heads=1`, the model will use Multi Query Attention (MQA); otherwise GQA is used. When
+            converting a multi-head checkpoint to a GQA checkpoint, each group key and value head should be
+            constructed by meanpooling all the original heads within that group. For more details, check out [this
+            paper](https://arxiv.org/pdf/2305.13245.pdf). If it is not specified, will default to
+            `num_attention_heads`.
+        resid_pdrop (`float`, *optional*, defaults to 0.0):
+            Dropout probability for MLP outputs.
+        embd_pdrop (`float`, *optional*, defaults to 0.0):
+            The dropout ratio for the embeddings.
+        attention_dropout (`float`, *optional*, defaults to 0.0):
+            The dropout ratio after computing the attention scores.
+        hidden_act (`str` or `function`, *optional*, defaults to `"silu"`):
+            The non-linear activation function (function or string) in the decoder.
+        max_position_embeddings (`int`, *optional*, defaults to 4096):
+            The maximum sequence length that this model might ever be used with.
+        original_max_position_embeddings (`int`, *optional*, defaults to 4096):
+            The maximum sequence length that this model was trained with. This is used to determine the size of the
+            original RoPE embeddings when using long scaling.
+        initializer_range (`float`, *optional*, defaults to 0.02):
+            The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
+        rms_norm_eps (`float`, *optional*, defaults to 1e-05):
+            The epsilon value used for the RMSNorm.
+        use_cache (`bool`, *optional*, defaults to `True`):
+            Whether or not the model should return the last key/values attentions (not used by all models). Only
+            relevant if `config.is_decoder=True`.
+        tie_word_embeddings (`bool`, *optional*, defaults to `False`):
+            Whether to tie weight embeddings.
+        rope_theta (`float`, *optional*, defaults to 10000.0):
+            The base period of the RoPE embeddings.
+        rope_scaling (`dict`, *optional*):
+            The scaling strategy for the RoPE embeddings. If `None`, no scaling is applied. If a dictionary, it must
+            contain the following keys: `type`, `short_factor` and `long_factor`. The `type` must be either `su` or
+            `yarn`, and the `short_factor` and `long_factor` must be lists of numbers with the same length as the
+            hidden size divided by the number of attention heads divided by 2.
+        bos_token_id (`int`, *optional*, defaults to 1):
+            The id of the "beginning-of-sequence" token.
+        eos_token_id (`int`, *optional*, defaults to 32000):
+            The id of the "end-of-sequence" token.
+        pad_token_id (`int`, *optional*, defaults to 32000):
+            The id of the padding token.
+        sliding_window (`int`, *optional*):
+            Sliding window attention window size. If `None`, no sliding window is applied.
+
+    Example:
+
+    ```python
+    >>> from transformers import Phi3Model, Phi3Config
+
+    >>> # Initializing a Phi-3 style configuration
+    >>> configuration = Phi3Config.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
+
+    >>> # Initializing a model from the configuration
+    >>> model = Phi3Model(configuration)
+
+    >>> # Accessing the model configuration
+    >>> configuration = model.config
+    ```"""
+
+    model_type = "phi3"
+    keys_to_ignore_at_inference = ["past_key_values"]
+
+    def __init__(
+        self,
+        vocab_size=32064,
+        hidden_size=3072,
+        intermediate_size=8192,
+        num_hidden_layers=32,
+        num_attention_heads=32,
+        num_key_value_heads=None,
+        resid_pdrop=0.0,
+        embd_pdrop=0.0,
+        attention_dropout=0.0,
+        hidden_act="silu",
+        max_position_embeddings=4096,
+        original_max_position_embeddings=4096,
+        initializer_range=0.02,
+        rms_norm_eps=1e-5,
+        use_cache=True,
+        tie_word_embeddings=False,
+        rope_theta=10000.0,
+        rope_scaling=None,
+        bos_token_id=1,
+        eos_token_id=32000,
+        pad_token_id=32000,
+        sliding_window=None,
+        **kwargs,
+    ):
+        self.vocab_size = vocab_size
+        self.hidden_size = hidden_size
+        self.intermediate_size = intermediate_size
+        self.num_hidden_layers = num_hidden_layers
+        self.num_attention_heads = num_attention_heads
+
+        if num_key_value_heads is None:
+            num_key_value_heads = num_attention_heads
+
+        self.num_key_value_heads = num_key_value_heads
+        self.resid_pdrop = resid_pdrop
+        self.embd_pdrop = embd_pdrop
+        self.attention_dropout = attention_dropout
+        self.hidden_act = hidden_act
+        self.max_position_embeddings = max_position_embeddings
+        self.original_max_position_embeddings = original_max_position_embeddings
+        self.initializer_range = initializer_range
+        self.rms_norm_eps = rms_norm_eps
+        self.use_cache = use_cache
+        self.rope_theta = rope_theta
+        self.rope_scaling = rope_scaling
+        self._rope_scaling_validation()
+        self.sliding_window = sliding_window
+
+        super().__init__(
+            bos_token_id=bos_token_id,
+            eos_token_id=eos_token_id,
+            pad_token_id=pad_token_id,
+            tie_word_embeddings=tie_word_embeddings,
+            **kwargs,
+        )
+
+    def _rope_scaling_validation(self):
+        """
+        Validate the `rope_scaling` configuration.
+        """
+        if self.rope_scaling is None:
+            return
+
+        if not isinstance(self.rope_scaling, dict) or len(self.rope_scaling) != 3:
+            raise ValueError(
+                "`rope_scaling` must be a dictionary with three fields, `type`, `short_factor` and `long_factor`, "
+                f"got {self.rope_scaling}"
+            )
+        rope_scaling_type = self.rope_scaling.get("type", None)
+        rope_scaling_short_factor = self.rope_scaling.get("short_factor", None)
+        rope_scaling_long_factor = self.rope_scaling.get("long_factor", None)
+        if rope_scaling_type is None or rope_scaling_type not in ["su", "yarn"]:
+            raise ValueError(f"`rope_scaling`'s type field must be one of ['su', 'yarn'], got {rope_scaling_type}")
+        if not (
+            isinstance(rope_scaling_short_factor, list)
+            and all(isinstance(x, (int, float)) for x in rope_scaling_short_factor)
+        ):
+            raise ValueError(
+                f"`rope_scaling`'s short_factor field must be a list of numbers, got {rope_scaling_short_factor}"
+            )
+        if not len(rope_scaling_short_factor) == self.hidden_size // self.num_attention_heads // 2:
+            raise ValueError(
+                f"`rope_scaling`'s short_factor field must have length {self.hidden_size // self.num_attention_heads // 2}, got {len(rope_scaling_short_factor)}"
+            )
+        if not (
+            isinstance(rope_scaling_long_factor, list)
+            and all(isinstance(x, (int, float)) for x in rope_scaling_long_factor)
+        ):
+            raise ValueError(
+                f"`rope_scaling`'s long_factor field must be a list of numbers, got {rope_scaling_long_factor}"
+            )
+        if not len(rope_scaling_long_factor) == self.hidden_size // self.num_attention_heads // 2:
+            raise ValueError(
+                f"`rope_scaling`'s long_factor field must have length {self.hidden_size // self.num_attention_heads // 2}, got {len(rope_scaling_long_factor)}"
+            )
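The validator ties both factor lists to the per-head dimension: for this checkpoint, `hidden_size // num_attention_heads // 2` = 5120 // 40 // 2 = 64 entries each. A minimal sketch of a `rope_scaling` dict that passes validation, assuming configuration_phi3.py is importable from the working directory (the factor values are placeholders, not this model's real scaling factors):

```python
from configuration_phi3 import Phi3Config

# Placeholder factors; a real long-context checkpoint ships its own values.
factors = [1.0] * 64  # 5120 // 40 // 2 == 64 entries per list

cfg = Phi3Config(
    hidden_size=5120,
    num_attention_heads=40,
    rope_scaling={"type": "su", "short_factor": factors, "long_factor": factors},
)
print(cfg.rope_scaling["type"])  # "su" — validation passed
```

Note that this repo's config.json sets `"rope_scaling": null`, so the 4k checkpoint applies no RoPE scaling at all.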
genai_config.json ADDED
@@ -0,0 +1,58 @@
+{
+    "model": {
+        "bos_token_id": 1,
+        "context_length": 4096,
+        "decoder": {
+            "session_options": {
+                "log_id": "onnxruntime-genai",
+                "provider_options": [
+                    {
+                        "dml": {}
+                    }
+                ]
+            },
+            "filename": "model.onnx",
+            "head_size": 128,
+            "hidden_size": 5120,
+            "inputs": {
+                "input_ids": "input_ids",
+                "attention_mask": "attention_mask",
+                "position_ids": "position_ids",
+                "past_key_names": "past_key_values.%d.key",
+                "past_value_names": "past_key_values.%d.value"
+            },
+            "outputs": {
+                "logits": "logits",
+                "present_key_names": "present.%d.key",
+                "present_value_names": "present.%d.value"
+            },
+            "num_attention_heads": 40,
+            "num_hidden_layers": 40,
+            "num_key_value_heads": 10
+        },
+        "eos_token_id": [
+            32000,
+            32001,
+            32007
+        ],
+        "pad_token_id": 32000,
+        "type": "phi3",
+        "vocab_size": 32064
+    },
+    "search": {
+        "diversity_penalty": 0.0,
+        "do_sample": false,
+        "early_stopping": true,
+        "length_penalty": 1.0,
+        "max_length": 4096,
+        "min_length": 0,
+        "no_repeat_ngram_size": 0,
+        "num_beams": 1,
+        "num_return_sequences": 1,
+        "past_present_share_buffer": true,
+        "repetition_penalty": 1.0,
+        "temperature": 1.0,
+        "top_k": 1,
+        "top_p": 1.0
+    }
+}
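This config targets the DirectML execution provider (`"dml"`) through onnxruntime-genai. A minimal generation sketch, assuming the onnxruntime-genai-directml package with its 0.2-era Python API (the API has shifted between releases) and a hypothetical local path:

```python
import onnxruntime_genai as og

# Hypothetical local dir holding genai_config.json, model.onnx and model.onnx.data.
model = og.Model("./phi3-medium-onnx")
tokenizer = og.Tokenizer(model)

prompt = "<|user|>\nWhat is DirectML?<|end|>\n<|assistant|>\n"
params = og.GeneratorParams(model)
params.set_search_options(max_length=256)
params.input_ids = tokenizer.encode(prompt)

# Decoding stops at any eos_token_id from the config: 32000, 32001 or 32007.
output_tokens = model.generate(params)
print(tokenizer.decode(output_tokens[0]))
```

With `do_sample: false` and `num_beams: 1`, the defaults above amount to greedy decoding; individual search options can be overridden per call via `set_search_options`.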
model.onnx ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ec36b45ae6c94ff49935535b6c78e591e15d6fb5bf9d3e1e9e1f03da03a32285
+size 3750885
model.onnx.data ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d5bdb5e59d5bde62f6ac2bf0c8969d61d80ae6bffc216b171c4105b6271c382d
+size 7496439040
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
+{
+  "bos_token": {
+    "content": "<s>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "eos_token": {
+    "content": "<|endoftext|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "pad_token": {
+    "content": "<|endoftext|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "unk_token": {
+    "content": "<unk>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  }
+}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
+size 499723
tokenizer_config.json ADDED
@@ -0,0 +1,130 @@
+{
+  "add_bos_token": false,
+  "add_eos_token": false,
+  "added_tokens_decoder": {
+    "0": {
+      "content": "<unk>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "1": {
+      "content": "<s>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "2": {
+      "content": "</s>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": false
+    },
+    "32000": {
+      "content": "<|endoftext|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "32001": {
+      "content": "<|assistant|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32002": {
+      "content": "<|placeholder1|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32003": {
+      "content": "<|placeholder2|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32004": {
+      "content": "<|placeholder3|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32005": {
+      "content": "<|placeholder4|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32006": {
+      "content": "<|system|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32007": {
+      "content": "<|end|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32008": {
+      "content": "<|placeholder5|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32009": {
+      "content": "<|placeholder6|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    },
+    "32010": {
+      "content": "<|user|>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": true,
+      "single_word": false,
+      "special": true
+    }
+  },
+  "bos_token": "<s>",
+  "chat_template": "{% for message in messages %}{% if (message['role'] == 'user') %}{{'<|user|>' + '\n' + message['content'] + '<|end|>' + '\n' + '<|assistant|>' + '\n'}}{% elif (message['role'] == 'assistant') %}{{message['content'] + '<|end|>' + '\n'}}{% endif %}{% endfor %}",
+  "clean_up_tokenization_spaces": false,
+  "eos_token": "<|endoftext|>",
+  "legacy": false,
+  "model_max_length": 4096,
+  "pad_token": "<|endoftext|>",
+  "padding_side": "left",
+  "sp_model_kwargs": {},
+  "tokenizer_class": "LlamaTokenizer",
+  "unk_token": "<unk>",
+  "use_default_system_prompt": false
+}
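The `chat_template` renders only user and assistant turns (system messages fall through the `if/elif` silently) and already appends the `<|assistant|>` cue after each user turn, so no separate generation prompt is needed. A minimal sketch of applying it with transformers (the local path is hypothetical):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("./phi3-medium-onnx")  # hypothetical path

messages = [{"role": "user", "content": "Hello!"}]
print(tok.apply_chat_template(messages, tokenize=False))
# Renders:
# <|user|>
# Hello!<|end|>
# <|assistant|>
```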