Made the references a list under the APA section
README.md CHANGED

```diff
@@ -170,8 +170,8 @@ If you use this model, please cite both the original GPT-2 and DistilBERT papers
 
 **APA:**
 
-Radford, A., et al. (2019). Language Models are Unsupervised Multitask Learners.
-Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
+- Radford, A., et al. (2019). Language Models are Unsupervised Multitask Learners.
+- Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
 
 ## More Information
 
```