Zakia committed
Commit c1e6ef9
1 Parent(s): f6804f2

Made the references as list under APA section

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -170,8 +170,8 @@ If you use this model, please cite both the original GPT-2 and DistilBERT papers
 
 **APA:**
 
-Radford, A., et al. (2019). Language Models are Unsupervised Multitask Learners.
-Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
+- Radford, A., et al. (2019). Language Models are Unsupervised Multitask Learners.
+- Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
 
 ## More Information
 