MakiPan committed
Commit 97c4d80 · 1 parent: b79a923

Update app.py

Files changed (1): app.py (+7, −6)
app.py CHANGED
@@ -265,14 +265,15 @@ To preprocess the data there were three options we considered:
 
   We anecdotally determined that when trained at lower steps the encoded hand model performed better than the standard MediaPipe model due to implied handedness. We theorize that with a larger dataset of more full-body hand and pose classifications, Holistic landmarks will provide the best images in the future; however, for the moment the hand-encoded model performs best. """)
 
-  gr.Markdown("""<center><h3 style="text-align: center;"><a href="https://huggingface.co/Vincent-luo/controlnet-hands">Standard Model Link</a></h3>
-  <h3 style="text-align: center;"> <a href="https://huggingface.co/MakiPan/controlnet-encoded-hands-130k/">Model using Hand Encoding</a></h3>
+  gr.Markdown("""<center><h2><b>LINKS 🔗</b></h2>
+  <h3 style="text-align: center;"><a href="https://huggingface.co/Vincent-luo/controlnet-hands">Standard Model Link</a></h3>
+  <h3 style="text-align: center;"> <a href="https://huggingface.co/MakiPan/controlnet-encoded-hands-130k/">Model using Hand Encoding</a></h3>
 
-  <h3 style="text-align: center;"> <a href="https://huggingface.co/datasets/MakiPan/hagrid250k-blip2">Dataset Used To Train the Standard Model</a></h3>
-  <h3 style="text-align: center;"> <a href="https://huggingface.co/datasets/MakiPan/hagrid-hand-enc-250k">Dataset Used To Train the Hand Encoding Model</a></h3>
+  <h3 style="text-align: center;"> <a href="https://huggingface.co/datasets/MakiPan/hagrid250k-blip2">Dataset Used To Train the Standard Model</a></h3>
+  <h3 style="text-align: center;"> <a href="https://huggingface.co/datasets/MakiPan/hagrid-hand-enc-250k">Dataset Used To Train the Hand Encoding Model</a></h3>
 
-  <h3 style="text-align: center;"> <a href="https://github.com/Maki-DS/Jax-Controlnet-hand-training/blob/main/normal-preprocessing.py">Standard Data Preprocessing Script</a></h3>
-  <h3 style="text-align: center;"> <a href="https://github.com/Maki-DS/Jax-Controlnet-hand-training/blob/main/Hand-encoded-preprocessing.py">Hand Encoding Data Preprocessing Script</a></h3></center>""")
+  <h3 style="text-align: center;"> <a href="https://github.com/Maki-DS/Jax-Controlnet-hand-training/blob/main/normal-preprocessing.py">Standard Data Preprocessing Script</a></h3>
+  <h3 style="text-align: center;"> <a href="https://github.com/Maki-DS/Jax-Controlnet-hand-training/blob/main/Hand-encoded-preprocessing.py">Hand Encoding Data Preprocessing Script</a></h3></center>""")
 
  model_type = gr.Radio(["Standard", "Hand Encoding"], label="Model preprocessing", info="We developed two models, one with standard MediaPipe landmarks, and one with different (but similar) coloring on palm landmarks to distinguish left and right")
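
For context on the "Hand Encoding" option described in the diff: the idea is to render MediaPipe hand landmarks with different (but similar) colors on the palm landmarks for left versus right hands, so the ControlNet conditioning image carries handedness. Below is a minimal sketch of that idea, not the repo's actual Hand-encoded-preprocessing.py; the palm landmark indices, palette, and function name are illustrative assumptions.

```python
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands

# Assumed palette: palm landmarks (wrist + finger bases) are tinted by
# handedness; all other landmarks share a neutral color. Colors are BGR.
PALM_IDS = (0, 1, 5, 9, 13, 17)
PALM_COLOR = {"Left": (0, 255, 0), "Right": (255, 0, 0)}
OTHER_COLOR = (255, 255, 255)

def encode_hands(image_bgr: np.ndarray) -> np.ndarray:
    """Draw hand landmarks on a black canvas, coloring palms by handedness."""
    canvas = np.zeros_like(image_bgr)
    h, w = image_bgr.shape[:2]
    with mp_hands.Hands(static_image_mode=True, max_num_hands=2) as hands:
        # MediaPipe expects RGB input.
        results = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
        if not results.multi_hand_landmarks:
            return canvas
        for landmarks, handedness in zip(results.multi_hand_landmarks,
                                         results.multi_handedness):
            label = handedness.classification[0].label  # "Left" or "Right"
            for i, lm in enumerate(landmarks.landmark):
                color = PALM_COLOR[label] if i in PALM_IDS else OTHER_COLOR
                cv2.circle(canvas, (int(lm.x * w), int(lm.y * h)), 4, color, -1)
    return canvas
```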
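The model_type radio is what lets the Space switch between the two preprocessing pipelines at inference time. A minimal, self-contained sketch of how such a control is typically wired in Gradio follows; the checkpoint mapping and callback are illustrative, not the Space's actual inference code.

```python
import gradio as gr

# Assumed mapping from the radio choice to the published checkpoints; the
# Space's real model loading and inference live elsewhere in app.py.
CHECKPOINTS = {
    "Standard": "Vincent-luo/controlnet-hands",
    "Hand Encoding": "MakiPan/controlnet-encoded-hands-130k",
}

def pick_checkpoint(choice: str) -> str:
    # Stand-in for swapping the active model when the user changes the radio.
    return f"Would load: {CHECKPOINTS[choice]}"

with gr.Blocks() as demo:
    model_type = gr.Radio(["Standard", "Hand Encoding"],
                          label="Model preprocessing",
                          value="Standard")
    status = gr.Textbox(label="Checkpoint")
    model_type.change(pick_checkpoint, inputs=model_type, outputs=status)

demo.launch()
```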