The biggest model ever released. It has not been tested, nor do I have the compute to test it. If anyone is willing to host this to help me test it, please share your results in the community tab.
Thank you for coming to my TED talk.
This is nearly 960 GB of weights. Running it in 4-bit probably requires at least 8x A100 80GB.
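A back-of-the-envelope sketch of where that hardware estimate comes from, assuming the ~960 GB on disk is 16-bit weights (KV cache, activations, and quantization overhead are not modeled, so treat it as a lower bound):

```python
# Rough VRAM estimate for running ~960 GB of 16-bit weights in 4-bit.
weights_gb_16bit = 960
params_billion = weights_gb_16bit / 2   # ~2 bytes per parameter at 16-bit
weights_gb_4bit = weights_gb_16bit / 4  # 4-bit is a quarter of 16-bit

gpus = 8
vram_per_gpu_gb = 80
total_vram_gb = gpus * vram_per_gpu_gb

print(f"~{params_billion:.0f}B parameters")          # ~480B parameters
print(f"~{weights_gb_4bit:.0f} GB of weights in 4-bit")  # ~240 GB
print(f"{total_vram_gb} GB total VRAM")              # 640 GB
print("weights fit:", weights_gb_4bit < total_vram_gb)
```

So the quantized weights alone would take roughly 240 GB, leaving the rest of the 640 GB for cache and activations; fewer GPUs would likely not leave enough headroom.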