Kokoro-82M ONNX Runtime Inference
This repository contains minimal code and resources for running inference with the Kokoro-82M text-to-speech model using ONNX Runtime.
Example inputs (demo audio omitted):
- "Machine learning models rely on large datasets and complex algorithms to identify patterns and make predictions."
- "Did you know that honey never spoils? Archaeologists have found pots of honey in ancient Egyptian tombs that are over 3,000 years old and still edible!"
Features
- ONNX Runtime Inference: Minimal ONNX Runtime inference code for Kokoro-82M (v0_19). Supports `en-us` and `en-gb` (see the session-inspection sketch below).
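The inference path is a plain `onnxruntime.InferenceSession`. As a quick sanity check of the exported graph, you can list its inputs and outputs; the file name `kokoro-v0_19.onnx` below is an assumption based on the v0_19 tag, so adjust it to match your download.

```python
# Minimal sketch: inspect the Kokoro-82M ONNX graph with ONNX Runtime.
# The file name "kokoro-v0_19.onnx" is an assumption; point it at your download.
import onnxruntime as ort

sess = ort.InferenceSession("kokoro-v0_19.onnx", providers=["CPUExecutionProvider"])

for inp in sess.get_inputs():
    print("input :", inp.name, inp.shape, inp.type)
for out in sess.get_outputs():
    print("output:", out.name, out.shape, out.type)
```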
Installation
Clone the repository:
git clone https://github.com/yakhyo/kokoro-82m.git
cd kokoro-82m
Install dependencies:
pip install -r requirements.txt
Install `espeak` for text-to-speech functionality. On Linux:
apt-get install espeak -y
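Kokoro's text frontend turns input text into phonemes before tokenization. Assuming the repository uses the `phonemizer` package with the espeak backend (an assumption, not stated above), you can confirm that espeak is correctly installed from Python:

```python
# Hedged sketch: check that espeak is visible to the phonemizer package.
# Assumes the text frontend relies on phonemizer's espeak backend.
from phonemizer import phonemize

print(phonemize("Hello, world!", language="en-us", backend="espeak"))
print(phonemize("Hello, world!", language="en-gb", backend="espeak"))
```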
Usage
Download ONNX Model
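This section does not list a download command. One hedged option, assuming the v0_19 ONNX export is published as `kokoro-v0_19.onnx` in the `hexgrad/Kokoro-82M` Hugging Face repository (both the repo id and the file name are assumptions), is to fetch it with `huggingface_hub`:

```python
# Hedged sketch: fetch the ONNX weights from the Hugging Face Hub.
# The repo_id and filename are assumptions; use whichever source you trust.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(repo_id="hexgrad/Kokoro-82M", filename="kokoro-v0_19.onnx")
print("Downloaded to:", model_path)
```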
Jupyter Notebook Inference Example
Run inference using the Jupyter notebook:
CLI Inference
Specify the input text and model weights in `inference.py`, then run:
python inference.py
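For reference, below is a hedged sketch of the kind of ONNX Runtime call that `inference.py` wraps. The input names (`tokens`, `style`, `speed`), their shapes, and the 24 kHz output sample rate are assumptions about the v0_19 export; in the real pipeline the token ids come from the phonemizer-based tokenizer and the style vector from a per-voice embedding file, which the zero placeholders below merely stand in for.

```python
# Hedged sketch of a single Kokoro-82M ONNX Runtime call.
# Input names, shapes, and the 24 kHz sample rate are assumptions about the
# v0_19 export; the zero arrays are placeholders for real tokens and voice style.
import numpy as np
import onnxruntime as ort
import soundfile as sf

sess = ort.InferenceSession("kokoro-v0_19.onnx", providers=["CPUExecutionProvider"])

tokens = np.zeros((1, 64), dtype=np.int64)      # placeholder phoneme token ids
style = np.zeros((1, 256), dtype=np.float32)    # placeholder per-voice style embedding
speed = np.ones(1, dtype=np.float32)            # 1.0 = normal speaking rate

audio = sess.run(None, {"tokens": tokens, "style": style, "speed": speed})[0]
sf.write("output.wav", audio.squeeze(), 24000)  # assumed 24 kHz sample rate
```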
Gradio App
Run the command below to start the Gradio app:
python app.py
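For orientation, here is a minimal sketch of what such a Gradio app can look like. The `synthesize` function is a hypothetical stand-in for the repository's TTS entry point, and the 24 kHz sample rate is an assumption about the model's output; the actual `app.py` may be organized differently.

```python
# Hedged sketch of a minimal Gradio front end for a Kokoro-82M TTS pipeline.
# `synthesize` is a hypothetical stand-in for the repo's text-to-speech call,
# and the 24 kHz sample rate is an assumption about the model's output.
import gradio as gr
import numpy as np

def tts(text: str):
    wav = synthesize(text)          # hypothetical: text -> float32 waveform
    return 24000, np.asarray(wav)   # (sample_rate, samples) tuple for gr.Audio

demo = gr.Interface(
    fn=tts,
    inputs=gr.Textbox(label="Text", placeholder="Type something to speak..."),
    outputs=gr.Audio(label="Generated speech"),
    title="Kokoro-82M ONNX TTS",
)

if __name__ == "__main__":
    demo.launch()
```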
License
This project is licensed under the MIT License. The model weights are licensed under the Apache License 2.0.