Update README.md

RoPE support is not yet complete, but the perplexity has been confirmed to be lower than that of Llama 3.

2024/07/30
- [Ryzen AI Software 1.2](https://ryzenai.docs.amd.com/en/latest/) has been released. Please note that this model is based on [Ryzen AI Software 1.1](https://ryzenai.docs.amd.com/en/1.1/index.html).
- [amd/RyzenAI-SW 1.2](https://github.com/amd/RyzenAI-SW) was announced on July 29, 2024. This sample is for [amd/RyzenAI-SW 1.1](https://github.com/amd/RyzenAI-SW/tree/1.1). Please note that the folder and script contents have been completely changed.

2024/08/04
- This model was created with the 1.1 driver, but it has been confirmed to also work with the 1.2 driver. Please see the setup for the 1.2 driver below.

### setup for 1.1 driver
In a CMD window:
```
conda activate ryzenai-transformers
```

### setup for 1.2 driver

The setup for 1.2 may not work even if you follow the instructions, so I will write some tips on how to get it running below.
For the first half of the setup, see [Appendix: Tips for running Ryzen AI Software 1.2 in Running LLM on AMD NPU Hardware](https://www.hackster.io/gharada2013/running-llm-on-amd-npu-hardware-19322f).

Then:

- Uninstall VC 2019
  I'm not sure if this is the cause, but compilation sometimes failed when VC 2019 was installed.

- Delete the previous virtual environment for 1.1
  This may not be necessary, but delete it just to be sure.
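For example, a minimal way to remove it, assuming the 1.1 environment kept the default name `ryzenai-transformers` used in this README:
```
REM Remove the old 1.1 conda environment (adjust the name if yours is different)
conda env remove -n ryzenai-transformers
```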

- Follow the instructions on [LLMs on RyzenAI with Pytorch](https://github.com/amd/RyzenAI-SW/blob/main/example/transformers/models/llm/docs/README.md)
  After creating the Z: drive and compiling, delete the Z: drive before running the script. Otherwise, the script may fail with a module-not-found error.
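As a sketch, assuming the Z: drive was mapped with `subst` as in the upstream 1.2 instructions, it can be removed like this before launching the script:
```
REM Remove the Z: drive mapping that was used during compilation
subst Z: /D
```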

- Add PYTHONPATH manually in your CMD window.
```
set PYTHONPATH=%PYTHONPATH%;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\tools;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ops\python;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\models\llm\chatglm3;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\llm-awq\awq\quantize;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\smoothquant\smoothquant;<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers\ext\llm-awq\awq\utils
```
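The same paths can also be appended one at a time, which is easier to read and edit. `TRANSFORMERS_ROOT` is only a helper variable introduced here for readability, not something the upstream scripts require:
```
REM Helper variable for readability; replace the placeholder with your actual install path
set "TRANSFORMERS_ROOT=<your_ryzen_ai-sw_install_path>\RyzenAI-SW\example\transformers"
set "PYTHONPATH=%PYTHONPATH%;%TRANSFORMERS_ROOT%\tools"
set "PYTHONPATH=%PYTHONPATH%;%TRANSFORMERS_ROOT%\ops\python"
set "PYTHONPATH=%PYTHONPATH%;%TRANSFORMERS_ROOT%\models\llm\chatglm3"
set "PYTHONPATH=%PYTHONPATH%;%TRANSFORMERS_ROOT%\ext\llm-awq\awq\quantize"
set "PYTHONPATH=%PYTHONPATH%;%TRANSFORMERS_ROOT%\ext\smoothquant\smoothquant"
set "PYTHONPATH=%PYTHONPATH%;%TRANSFORMERS_ROOT%\ext\llm-awq\awq\utils"
```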

- Copy [modeling_llama_amd.py](https://github.com/amd/RyzenAI-SW/blob/1.1/example/transformers/models/llama2/modeling_llama_amd.py) from the version 1.1 tree.
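For example, one way to fetch it (the raw URL below is derived from the linked 1.1 file, and the destination is assumed to be the directory you run the chat script from):
```
REM Download modeling_llama_amd.py from the 1.1 tree into the current directory
curl -L -o modeling_llama_amd.py https://raw.githubusercontent.com/amd/RyzenAI-SW/1.1/example/transformers/models/llama2/modeling_llama_amd.py
```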
![chat_image](trans-sample.png)
## Acknowledgements