---
license: llama2
---

This model pairs with the PRETRAINED variant of Amharic LLaMA, available here: https://huggingface.co./iocuydi/llama-2-amharic-3784m/tree/main/pretrained

It also requires the Llama 2 weights and this CLIP model: https://huggingface.co./openai/clip-vit-large-patch14-336

More information on running the model is available here: https://github.com/iocuydi/amharic-llama-llava

Cite:

```
@misc{andersland2024amharic,
      title={Amharic LLaMA and LLaVA: Multimodal LLMs for Low Resource Languages},
      author={Michael Andersland},
      year={2024},
      eprint={2403.06354},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
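
The GitHub repository above covers the full setup. As a quick sanity check that the CLIP dependency is reachable, it can be fetched with Hugging Face `transformers` as sketched below; the class choice (`CLIPVisionModel`, as in LLaVA-style pipelines) is an assumption here, not the repository's verified loading code.

```python
# Minimal sketch, assuming the CLIP model above is used as a standard
# vision tower (as in LLaVA-style pipelines). See the GitHub repo for
# the authors' actual run instructions.
from transformers import CLIPVisionModel, CLIPImageProcessor

clip_id = "openai/clip-vit-large-patch14-336"

# Vision encoder that produces the image embeddings
vision_tower = CLIPVisionModel.from_pretrained(clip_id)

# Matching preprocessor (resizes to 336x336 and normalizes)
image_processor = CLIPImageProcessor.from_pretrained(clip_id)
```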