for using model (#63, opened 8 months ago by yogeshm)

Access Denied (#59, opened 8 months ago by Jerry-hyl)

Update README.md (#57, opened 8 months ago by inuwamobarak)

Batched inference on multi-GPUs (4 replies, #56, opened 8 months ago by d-i-o-n)

Badly Encoded Tokens/Mojibake (2 replies, #55, opened 8 months ago by muchanem)

Denied permission to DL (11 replies, #51, opened 8 months ago by TimPine)

Instruct format? (3 replies, #44, opened 8 months ago by m-conrad-202)

MPS support quantification (5 replies, #39, opened 8 months ago by tonimelisma)

Problem with the tokenizer (2 replies, #37, opened 8 months ago by Douedos)

Garbage responses (2 replies, #30, opened 8 months ago by RainmakerP)

GPU requirements (10 replies, #29, opened 8 months ago by Gerald001)

can I run it on CPU ? (5 replies, #28, opened 8 months ago by aljbali)

Is it really good? (5 replies, #20, opened 8 months ago by urtuuuu)

OMG insomnia in the community (#16, opened 8 months ago by Languido)

Max output tokens? (4 replies, #12, opened 8 months ago by stri8ted)

IAM READYYYYYY (2 replies, #3, opened 8 months ago by 10100101j)