Tags: Text Generation · Transformers · Safetensors · llama · text-generation-inference · Inference Endpoints
Commit 6b6ebab (verified) by mfromm · Parent(s): 2bcc49d

Update README.md

Files changed (1):
  1. README.md +4 -3
README.md CHANGED
@@ -31,10 +31,11 @@ pipeline_tag: text-generation
 library_name: transformers
 base_model:
 - openGPT-X/Teuken-7B-base-v0.4
-license: apache-2.0
 ---
 # Model Card for Teuken-7B-instruct-v0.4
 
+
+Teuken-7B-base-v0.4 is a 7B parameter multilingual large language model (LLM) pre-trained with 4T tokens within the research project OpenGPT-X.
 Teuken-7B-instruct-v0.4 is an instruction-tuned version of Teuken-7B-base-v0.4.
 
 
@@ -42,11 +43,11 @@ Teuken-7B-instruct-v0.4 is an instruction-tuned version of Teuken-7B-base-v0.4.
 
 <!-- Provide a longer summary of what this model is. -->
 
-- **Developed by:** Fraunhofer IAIS
+- **Developed by:** Fraunhofer, Forschungszentrum Jülich, TU Dresden, DFKI
 - **Funded by:** German Federal Ministry of Economics and Climate Protection (BMWK) in the context of the OpenGPT-X project
 - **Model type:** Transformer based decoder-only model
 - **Language(s) (NLP):** bg, cs, da, de, el, en, es, et, fi, fr, ga, hr, hu, it, lt, lv, mt, nl, pl, pt, ro, sk, sl, sv
-- **Shared by:** Fraunhofer IAIS
+- **Shared by:** OpenGPT-X
 
 ## Uses
 
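
Since the card declares `library_name: transformers` and sits under `pipeline_tag: text-generation`, a minimal usage sketch follows. It is not part of this commit: the repo id `openGPT-X/Teuken-7B-instruct-v0.4` is inferred from the model-card title, and the `trust_remote_code=True` flag and `bfloat16` dtype are assumptions; the model card's own quickstart (for example a per-language chat template) should take precedence.

```python
# Hypothetical usage sketch, not taken from this commit: load the
# instruction-tuned checkpoint with Hugging Face transformers and run
# plain text generation. Repo id and loading flags are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openGPT-X/Teuken-7B-instruct-v0.4"  # inferred from the model-card title

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # assumed dtype; use float16/float32 as your hardware requires
    device_map="auto",            # requires the accelerate package
    trust_remote_code=True,       # assumption; drop if the repo uses stock Llama code
)

prompt = "How does an instruction-tuned language model differ from its base model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The repo is also tagged `text-generation-inference`, so serving the model through TGI is a possible alternative to the in-process loading sketched above.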