juewang committed on
Commit 0c964a1
1 Parent(s): e6e29aa

Update README.md

Files changed (1):
  1. README.md +2 -0
README.md CHANGED
@@ -75,6 +75,8 @@ To run the model locally, we strongly recommend to install Flash Attention V2, w
 ```
 # Please update the path of `CUDA_HOME`
 export CUDA_HOME=/usr/local/cuda-11.8
+pip install transformers==4.31.0
+pip install sentencepiece
 pip install ninja
 pip install flash-attn --no-build-isolation
 pip install git+https://github.com/HazyResearch/flash-attention.git#subdirectory=csrc/rotary
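The commit pins an exact version (`transformers==4.31.0`) rather than a range. A minimal sketch of why exact pins are compared as parsed numeric tuples, not raw strings (the `parse_version` helper here is hypothetical, not part of the commit or of pip's real resolver):

```python
# Hypothetical helper: naive string comparison misorders versions
# ("4.9.0" sorts after "4.31.0" lexicographically, because '9' > '3'),
# so specifiers like transformers==4.31.0 must be parsed numerically.
def parse_version(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.split("."))

assert "4.9.0" > "4.31.0"                                # string compare: wrong order
assert parse_version("4.9.0") < parse_version("4.31.0")  # numeric compare: correct
```

In real projects this parsing is done by the `packaging` library (PEP 440 rules), which also handles pre-release and post-release tags that this sketch ignores.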