---
license: apache-2.0
---

Self-trained GPT-2 Large with around 770M parameters. The tokenizer is the one from https://huggingface.co./openai-community/gpt2. The model is being trained on around 400B tokens; this checkpoint is from step 32k. Evaluation is currently in progress. A minimal usage sketch is provided in the Example Usage section below.

## License

This model is available under the Apache 2.0 License, and also the MIT License; both should be followed.

## Discord Server

Join our Discord server [here](https://discord.gg/xhcBDEM3).

## Feeling Generous? 😊

Eager to buy me a cup of $2 coffee or iced tea? 🍵☕ Sure, here is the link: [https://ko-fi.com/drnicefellow](https://ko-fi.com/drnicefellow). Please add a note on which one you want me to drink.
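
## Example Usage

A minimal sketch of loading this checkpoint with the standard Hugging Face `transformers` API. The repository id below is a placeholder (not stated in this card) and the sampling parameters are illustrative only.

```python
# Minimal generation sketch using Hugging Face transformers.
# The repo id below is a placeholder; substitute this model's actual repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/gpt2-large-self-trained"  # placeholder, assumption

# The tokenizer follows openai-community/gpt2, as noted above.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation; decoding settings are illustrative only.
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```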