This model consists of an nnU-Net trained on the GelGenie training dataset in December 2023. It is highly robust and should produce good results for a wide variety of gel types and resolutions. The model was packaged into a TorchScript file by scripting many of the data-processing functions into the model itself. As a result, it is a large and complex model and requires a GPU to run efficiently.
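As a rough illustration, the sketch below shows how a TorchScript model of this kind could be loaded and run on a GPU with PyTorch. The file name, input shape, and input scaling are placeholders rather than details taken from this repository; check the model files and the GelGenie codebase for the exact inference interface.

```python
import torch

# Hypothetical file name and input shape -- check the repository files for the
# actual TorchScript file and the expected input format.
MODEL_PATH = "gelgenie_nnunet_dec2023.pt"

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load the scripted model; the bundled pre/post-processing runs inside the graph.
model = torch.jit.load(MODEL_PATH, map_location=device)
model.eval()

# Example input: one single-channel gel image as (batch, channel, height, width).
image = torch.rand(1, 1, 512, 512, device=device)

with torch.no_grad():
    output = model(image)

print(output.shape)
```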
The model was trained on the training set (420 images) for 600 epochs, and the best checkpoint was selected based on the validation pseudo-Dice score (using the nnU-Net training package).
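For reference, a comparable training run could be launched through the nnU-Net v2 command-line interface along the lines of the sketch below, assuming the images have already been converted into nnU-Net's dataset format. The dataset ID, configuration, and fold shown here are placeholders rather than the settings used for this model; see the linked config files for the actual values.

```python
import subprocess

# Placeholders -- the real dataset ID, configuration, and fold for this model
# are documented in the training config files linked below.
DATASET_ID = "999"
CONFIGURATION = "2d"
FOLD = "0"

# Launch a standard nnU-Net v2 training run; the framework keeps both the latest
# and the best checkpoint, with "best" chosen on the validation pseudo-Dice score.
subprocess.run(["nnUNetv2_train", DATASET_ID, CONFIGURATION, FOLD], check=True)
```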
For more details on the configuration used for training, please visit https://huggingface.co./mattaq/GelGenie-nnUNet-Dec-2023 and check the config files. Our codebase is fully open-source and available at https://github.com/mattaq31/GelGenie.
For the original nnU-Net package and for more details on the nnU-Net model itself, please check https://github.com/MIC-DKFZ/nnUNet.