DinoVdeau is a fine-tuned version of Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel. It achieves the following results on the test set:

  • Loss: 0.0494
  • F1 Micro: 0.7640
  • F1 Macro: 0.3461
  • Accuracy: 0.7130
| Class  | F1 per class |
|--------|--------------|
| ALGAE  | 0.7961 |
| Acr    | 0.7462 |
| Acr_Br | 0.3797 |
| Anem   | 0.6767 |
| CCA    | 0.2710 |
| Ech    | 0.3610 |
| Fts    | 0.3889 |
| Gal    | 0.4667 |
| Gon    | 0.2222 |
| Mtp    | 0.5521 |
| P      | 0.3615 |
| Poc    | 0.4367 |
| Por    | 0.5018 |
| R      | 0.7153 |
| RDC    | 0.1781 |
| S      | 0.8252 |
| SG     | 0.8504 |
| Sarg   | 0.6303 |
| Ser    | 0.3252 |
| Slt    | 0.4188 |
| Sp     | 0.4198 |
| Turf   | 0.6045 |
| UNK    | 0.3763 |
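
The macro F1 sits well below the micro F1 because macro averaging gives rare, harder classes (e.g. RDC, Gon) the same weight as abundant, well-predicted ones (S, SG). As a minimal scikit-learn sketch of how the reported metrics relate, with made-up multilabel annotations standing in for the real test data:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Hypothetical multilabel indicator matrices (rows = images, columns = classes)
# standing in for the real test annotations and model predictions.
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0], [0, 0, 1]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 0, 0], [0, 0, 1]])

print(f1_score(y_true, y_pred, average=None))     # one F1 per class, as in the table above
print(f1_score(y_true, y_pred, average="micro"))  # pools every label decision; dominated by frequent classes
print(f1_score(y_true, y_pred, average="macro"))  # unweighted mean of per-class F1; sensitive to rare classes
print(accuracy_score(y_true, y_pred))             # subset accuracy: every label of an image must match
```

For multilabel data, `accuracy_score` computes subset accuracy, which is why the reported accuracy (0.7130) can sit below the micro F1 (0.7640).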

Model description

DinoVdeau is built on top of the Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel model for underwater multilabel image classification. The classification head combines linear, ReLU, batch-normalization, and dropout layers.
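
A minimal PyTorch sketch of such a head follows; the hidden width, dropout rate, and input feature dimension are illustrative assumptions, not values read from the training code (23 output classes matches the tables below):

```python
import torch
import torch.nn as nn

# Illustrative head only: 1024-d input features, 512 hidden units, and
# dropout p=0.3 are assumptions; 23 classes matches the class tables below.
head = nn.Sequential(
    nn.Linear(1024, 512),
    nn.ReLU(),
    nn.BatchNorm1d(512),
    nn.Dropout(0.3),
    nn.Linear(512, 23),
)

features = torch.randn(8, 1024)  # a batch of pooled backbone embeddings
logits = head(features)          # shape (8, 23), one logit per class
```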

The source code for training the model can be found in this Git repository.


Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
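
A possible inference sketch, assuming the checkpoint loads through the standard transformers Auto classes; the repository id, image path, and 0.5 decision threshold are placeholders:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "user/DinoVdeau"  # placeholder hub id; substitute the actual path
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("quadrat.jpg")  # hypothetical underwater image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0]

# Multilabel readout: an independent sigmoid per class rather than a softmax argmax.
probs = torch.sigmoid(logits)
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```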


Training and evaluation data

Details on the number of images for each class are given in the following table:

| Class  | train | test  | val   | Total |
|--------|-------|-------|-------|-------|
| ALGAE  | 36874 | 12292 | 12292 | 61458 |
| Acr    | 5358  | 1787  | 1786  | 8931  |
| Acr_Br | 123   | 42    | 42    | 207   |
| Anem   | 235   | 79    | 79    | 393   |
| CCA    | 918   | 306   | 306   | 1530  |
| Ech    | 618   | 206   | 206   | 1030  |
| Fts    | 168   | 57    | 57    | 282   |
| Gal    | 465   | 155   | 155   | 775   |
| Gon    | 158   | 53    | 53    | 264   |
| Mtp    | 2370  | 791   | 790   | 3951  |
| P      | 2658  | 887   | 886   | 4431  |
| Poc    | 549   | 184   | 183   | 916   |
| Por    | 1059  | 354   | 353   | 1766  |
| R      | 31437 | 10480 | 10479 | 52396 |
| RDC    | 930   | 310   | 310   | 1550  |
| S      | 57624 | 19209 | 19209 | 96042 |
| SG     | 25539 | 8513  | 8513  | 42565 |
| Sarg   | 285   | 96    | 96    | 477   |
| Ser    | 261   | 87    | 87    | 435   |
| Slt    | 2730  | 911   | 911   | 4552  |
| Sp     | 132   | 44    | 44    | 220   |
| Turf   | 1395  | 466   | 466   | 2327  |
| UNK    | 292   | 98    | 98    | 488   |

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal setup sketch follows the list):

  • Number of Epochs: 150
  • Learning Rate: 0.001
  • Train Batch Size: 64
  • Eval Batch Size: 64
  • Optimizer: Adam
  • LR Scheduler Type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
  • Freeze Encoder: Yes
  • Data Augmentation: Yes
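
Under the assumption of a standard PyTorch training loop (the `backbone` and `head` attribute names and the two helper functions are hypothetical), the settings above correspond to roughly:

```python
import torch

# "Freeze Encoder: Yes": only the classification head receives gradients.
for p in model.backbone.parameters():  # `model.backbone` is an assumed attribute name
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

for epoch in range(150):
    train_one_epoch(model, optimizer)  # hypothetical helper
    val_loss = validate(model)         # hypothetical helper
    scheduler.step(val_loss)           # multiplies the LR by 0.1 after 5 stagnant epochs
```

This matches the learning-rate column in the results table below, which drops from 0.001 to 0.0001 at epoch 17, to 1e-05 at epoch 82, and to 1e-06 at epoch 144.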

Data Augmentation

Data were augmented using the following transformations (a Kornia-based sketch follows the two lists):

Train Transforms

  • PreProcess: No additional parameters
  • Resize: probability=1.00
  • RandomHorizontalFlip: probability=0.25
  • RandomVerticalFlip: probability=0.25
  • ColorJiggle: probability=0.25
  • RandomPerspective: probability=0.25
  • Normalize: probability=1.00

Val Transforms

  • PreProcess: No additional parameters
  • Resize: probability=1.00
  • Normalize: probability=1.00
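
ColorJiggle is a Kornia augmentation, so the pipeline was presumably Kornia-based. A sketch of the train transforms under that assumption; the target size, jitter strengths, and normalization statistics (ImageNet defaults) are illustrative, and the val pipeline would keep only Resize and Normalize:

```python
import torch
import torch.nn as nn
import kornia.augmentation as K

# Illustrative parameter values: image size, jitter strengths, and the
# ImageNet mean/std are assumptions, not read from the training code.
train_transforms = nn.Sequential(
    K.Resize((224, 224)),
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(brightness=0.2, contrast=0.2, saturation=0.2, hue=0.1, p=0.25),
    K.RandomPerspective(distortion_scale=0.5, p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
                std=torch.tensor([0.229, 0.224, 0.225])),
)

batch = torch.rand(4, 3, 256, 256)   # a batch of RGB images in [0, 1]
augmented = train_transforms(batch)  # resized, randomly flipped/jittered, normalized
```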

Training results

| Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
|-------|-----------------|----------|----------|----------|---------------|
| 0 | N/A | N/A | N/A | N/A | 0.001 |
| 1 | 0.8028618097305298 | 0.7326527412414418 | 0.7326527412414418 | 0.2723318326456157 | 0.001 |
| 2 | 0.8038854002952576 | 0.7288723192975732 | 0.7288723192975732 | 0.290650504066453 | 0.001 |
| 3 | 0.7705450654029846 | 0.7408581732025574 | 0.7408581732025574 | 0.326985383349658 | 0.001 |
| 4 | 0.7623223066329956 | 0.7417118168673019 | 0.7417118168673019 | 0.310238611675961 | 0.001 |
| 5 | 0.7626621127128601 | 0.7383495061061653 | 0.7383495061061653 | 0.3107947240093857 | 0.001 |
| 6 | 0.7451828122138977 | 0.7447257016428285 | 0.7447257016428285 | 0.34331624846537584 | 0.001 |
| 7 | 0.7458378672599792 | 0.7467291510600861 | 0.7467291510600861 | 0.3283260141196405 | 0.001 |
| 8 | 0.7398682832717896 | 0.7458929286946221 | 0.7458929286946221 | 0.3352756381207012 | 0.001 |
| 9 | 0.7424591779708862 | 0.7455270814097316 | 0.7455270814097316 | 0.329402464136349 | 0.001 |
| 10 | 0.7364382147789001 | 0.7474782669291475 | 0.7474782669291475 | 0.31573051576045374 | 0.001 |
| 11 | 0.7368418574333191 | 0.7465375167680005 | 0.7465375167680005 | 0.34419509138343996 | 0.001 |
| 12 | 0.7442134022712708 | 0.7428093587219735 | 0.7428093587219735 | 0.3321199292959056 | 0.001 |
| 13 | 0.7384127378463745 | 0.7479312207104406 | 0.7479312207104406 | 0.35283143267744377 | 0.001 |
| 14 | 0.7464041113853455 | 0.7463633037751956 | 0.7463633037751956 | 0.33455884660313173 | 0.001 |
| 15 | 0.7394037842750549 | 0.7446734377449871 | 0.7446734377449871 | 0.34277495445998946 | 0.001 |
| 16 | 0.7397111058235168 | 0.7478789568125991 | 0.7478789568125991 | 0.3506456767847629 | 0.001 |
| 17 | 0.7110718488693237 | 0.7554398007003362 | 0.7554398007003362 | 0.3747287292827094 | 0.0001 |
| 18 | 0.7041681408882141 | 0.7567463981463738 | 0.7567463981463738 | 0.3792648315920964 | 0.0001 |
| 19 | 0.7004917860031128 | 0.7582446298844968 | 0.7582446298844968 | 0.38576725361345504 | 0.0001 |
| 20 | 0.6942671537399292 | 0.7602829219003153 | 0.7602829219003153 | 0.39339544323315423 | 0.0001 |
| 21 | 0.6919424533843994 | 0.7592899078413268 | 0.7592899078413268 | 0.3942710537489151 | 0.0001 |
| 22 | 0.6903713941574097 | 0.7606313478859253 | 0.7606313478859253 | 0.3925331634038099 | 0.0001 |
| 23 | 0.6874070167541504 | 0.7607010330830474 | 0.7607010330830474 | 0.39534246826429575 | 0.0001 |
| 24 | 0.6864963173866272 | 0.7612236720614624 | 0.7612236720614624 | 0.3933203620961568 | 0.0001 |
| 25 | 0.684335470199585 | 0.7614153063535478 | 0.7614153063535478 | 0.402343965759826 | 0.0001 |
| 26 | 0.6830293536186218 | 0.7629309593909513 | 0.7629309593909513 | 0.4055175650763901 | 0.0001 |
| 27 | 0.6827249526977539 | 0.7630703297851954 | 0.7630703297851954 | 0.40742696523740624 | 0.0001 |
| 28 | 0.6805527210235596 | 0.7629309593909513 | 0.7629309593909513 | 0.413578373717749 | 0.0001 |
| 29 | 0.6796479225158691 | 0.7625825334053413 | 0.7625825334053413 | 0.4138447052049864 | 0.0001 |
| 30 | 0.6774595379829407 | 0.7635929687636104 | 0.7635929687636104 | 0.41283784117218203 | 0.0001 |
| 31 | 0.677918553352356 | 0.7643769272312328 | 0.7643769272312328 | 0.4100283180344899 | 0.0001 |
| 32 | 0.6754601001739502 | 0.7641504503405864 | 0.7641504503405864 | 0.41093585032897456 | 0.0001 |
| 33 | 0.6749601364135742 | 0.7645162976254769 | 0.7645162976254769 | 0.4185852176848548 | 0.0001 |
| 34 | 0.6746455430984497 | 0.7650040940053309 | 0.7650040940053309 | 0.41458520697205553 | 0.0001 |
| 35 | 0.6740487813949585 | 0.7641852929391474 | 0.7641852929391474 | 0.42549626405712787 | 0.0001 |
| 36 | 0.6740365624427795 | 0.7646905106182819 | 0.7646905106182819 | 0.42042185334833837 | 0.0001 |
| 37 | 0.6731483936309814 | 0.7641678716398669 | 0.7641678716398669 | 0.4194338951085209 | 0.0001 |
| 38 | 0.671998918056488 | 0.7648821449103674 | 0.7648821449103674 | 0.42433192329853336 | 0.0001 |
| 39 | 0.6694707870483398 | 0.7659274228671974 | 0.7659274228671974 | 0.42040362773805245 | 0.0001 |
| 40 | 0.6693674325942993 | 0.7665023257434539 | 0.7665023257434539 | 0.42052839725843943 | 0.0001 |
| 41 | 0.6682748198509216 | 0.7658228950715145 | 0.7658228950715145 | 0.4176095837463332 | 0.0001 |
| 42 | 0.6682831645011902 | 0.7665371683420149 | 0.7665371683420149 | 0.4307432846853069 | 0.0001 |
| 43 | 0.6695142984390259 | 0.7664152192470515 | 0.7664152192470515 | 0.42637174689527435 | 0.0001 |
| 44 | 0.669391393661499 | 0.765840316370795 | 0.765840316370795 | 0.42609728385101026 | 0.0001 |
| 45 | 0.6695447564125061 | 0.7649518301074895 | 0.7649518301074895 | 0.4307801257532618 | 0.0001 |
| 46 | 0.6653340458869934 | 0.7673908120067595 | 0.7673908120067595 | 0.4362286100546047 | 0.0001 |
| 47 | 0.6659862995147705 | 0.7674430759046009 | 0.7674430759046009 | 0.43279477858934173 | 0.0001 |
| 48 | 0.665557861328125 | 0.7668855943276249 | 0.7668855943276249 | 0.43077383038275147 | 0.0001 |
| 49 | 0.6672787666320801 | 0.7666242748384174 | 0.7666242748384174 | 0.4241681801855841 | 0.0001 |
| 50 | 0.6661437749862671 | 0.7662584275535269 | 0.7662584275535269 | 0.43151276516162357 | 0.0001 |
| 51 | 0.6638755798339844 | 0.7667288026341005 | 0.7667288026341005 | 0.4307690989303114 | 0.0001 |
| 52 | 0.6654694676399231 | 0.7679134509851745 | 0.7679134509851745 | 0.4427799679621408 | 0.0001 |
| 53 | 0.6643231511116028 | 0.767234020313235 | 0.767234020313235 | 0.4341825403324277 | 0.0001 |
| 54 | 0.667382001876831 | 0.7663455340499294 | 0.7663455340499294 | 0.4459616186399035 | 0.0001 |
| 55 | 0.6627440452575684 | 0.7684709325621505 | 0.7684709325621505 | 0.4389385481332223 | 0.0001 |
| 56 | 0.6627209186553955 | 0.767094649918991 | 0.767094649918991 | 0.43857797557707 | 0.0001 |
| 57 | 0.6640397310256958 | 0.7669204369261859 | 0.7669204369261859 | 0.43847119749006624 | 0.0001 |
| 58 | 0.6627684235572815 | 0.7672862842110765 | 0.7672862842110765 | 0.43760916892251167 | 0.0001 |
| 59 | 0.6614954471588135 | 0.7679134509851745 | 0.7679134509851745 | 0.439932052501894 | 0.0001 |
| 60 | 0.6633245944976807 | 0.766990122123308 | 0.766990122123308 | 0.44188579219640917 | 0.0001 |
| 61 | 0.6611309051513672 | 0.7685754603578335 | 0.7685754603578335 | 0.4370683373238057 | 0.0001 |
| 62 | 0.660831093788147 | 0.7684360899635895 | 0.7684360899635895 | 0.45352172583851785 | 0.0001 |
| 63 | 0.6621896028518677 | 0.767791501890211 | 0.767791501890211 | 0.44611945476580944 | 0.0001 |
| 64 | 0.6610415577888489 | 0.767547603700284 | 0.767547603700284 | 0.4439334258360575 | 0.0001 |
| 65 | 0.6589834690093994 | 0.7680354000801379 | 0.7680354000801379 | 0.434573899819693 | 0.0001 |
| 66 | 0.6599727272987366 | 0.76845351126287 | 0.76845351126287 | 0.4397010901020239 | 0.0001 |
| 67 | 0.6572328209877014 | 0.769045835438407 | 0.769045835438407 | 0.44838573955584105 | 0.0001 |
| 68 | 0.658860445022583 | 0.7686799881535165 | 0.7686799881535165 | 0.4442341389313245 | 0.0001 |
| 69 | 0.659292995929718 | 0.7688542011463215 | 0.7688542011463215 | 0.43926108990057783 | 0.0001 |
| 70 | 0.658970832824707 | 0.7679134509851745 | 0.7679134509851745 | 0.4357439201261406 | 0.0001 |
| 71 | 0.6567061543464661 | 0.768244455671504 | 0.768244455671504 | 0.4432082950234043 | 0.0001 |
| 72 | 0.65887850522995 | 0.7681225065765405 | 0.7681225065765405 | 0.4369170898714745 | 0.0001 |
| 73 | 0.6611541509628296 | 0.7675998675981255 | 0.7675998675981255 | 0.4436833314710758 | 0.0001 |
| 74 | 0.6570971012115479 | 0.768488353861431 | 0.768488353861431 | 0.44906341810873124 | 0.0001 |
| 75 | 0.6557245254516602 | 0.768174770474382 | 0.768174770474382 | 0.44439364748025234 | 0.0001 |
| 76 | 0.658838152885437 | 0.7683489834671869 | 0.7683489834671869 | 0.4478607285233603 | 0.0001 |
| 77 | 0.6572225093841553 | 0.7686277242556749 | 0.7686277242556749 | 0.44887316801028765 | 0.0001 |
| 78 | 0.6562930941581726 | 0.768767094649919 | 0.768767094649919 | 0.4439839848601264 | 0.0001 |
| 79 | 0.6564787030220032 | 0.76810508527726 | 0.76810508527726 | 0.4379298766193662 | 0.0001 |
| 80 | 0.661143958568573 | 0.768383826065748 | 0.768383826065748 | 0.4460529321244195 | 0.0001 |
| 81 | 0.660437285900116 | 0.768941307642724 | 0.768941307642724 | 0.44750591322776384 | 0.0001 |
| 82 | 0.6531779766082764 | 0.7705614884758105 | 0.7705614884758105 | 0.4526720456188751 | 1e-05 |
| 83 | 0.6532895565032959 | 0.77019564119092 | 0.77019564119092 | 0.4489367718812771 | 1e-05 |
| 84 | 0.6505005359649658 | 0.7705266458772495 | 0.7705266458772495 | 0.45139096558153424 | 1e-05 |
| 85 | 0.6501905918121338 | 0.7708053866657375 | 0.7708053866657375 | 0.4565625671001629 | 1e-05 |
| 86 | 0.6507149338722229 | 0.7707182801693351 | 0.7707182801693351 | 0.4559124472677571 | 1e-05 |
| 87 | 0.6484472751617432 | 0.770927335760701 | 0.770927335760701 | 0.45838476700319497 | 1e-05 |
| 88 | 0.6496042013168335 | 0.77054406717653 | 0.77054406717653 | 0.4569340367642959 | 1e-05 |
| 89 | 0.6486304402351379 | 0.7713977108412745 | 0.7713977108412745 | 0.45601319436503734 | 1e-05 |
| 90 | 0.6490767598152161 | 0.7711886552499085 | 0.7711886552499085 | 0.45742417795749246 | 1e-05 |
| 91 | 0.6481940746307373 | 0.7704221180815666 | 0.7704221180815666 | 0.45182909085509243 | 1e-05 |
| 92 | 0.6477252244949341 | 0.7715545025347991 | 0.7715545025347991 | 0.45503456574154166 | 1e-05 |
| 93 | 0.6489835381507874 | 0.771206076549189 | 0.771206076549189 | 0.4518083431267937 | 1e-05 |
| 94 | 0.6485304832458496 | 0.770753122767896 | 0.770753122767896 | 0.45105746851856937 | 1e-05 |
| 95 | 0.647895336151123 | 0.771659030330482 | 0.771659030330482 | 0.45671492126995916 | 1e-05 |
| 96 | 0.6472702622413635 | 0.7715022386369575 | 0.7715022386369575 | 0.4596959338122155 | 1e-05 |
| 97 | 0.6460831165313721 | 0.7713977108412745 | 0.7713977108412745 | 0.4625401473178366 | 1e-05 |
| 98 | 0.6463102698326111 | 0.7722165119074581 | 0.7722165119074581 | 0.45892155771298887 | 1e-05 |
| 99 | 0.646852433681488 | 0.77089249316214 | 0.77089249316214 | 0.4549241891767946 | 1e-05 |
| 100 | 0.6456441879272461 | 0.7723384610024215 | 0.7723384610024215 | 0.45970146016594643 | 1e-05 |
| 101 | 0.6469387412071228 | 0.7717461368268845 | 0.7717461368268845 | 0.4573593202819609 | 1e-05 |
| 102 | 0.646738588809967 | 0.7717809794254455 | 0.7717809794254455 | 0.4593480600391769 | 1e-05 |
| 103 | 0.6467755436897278 | 0.7723036184038605 | 0.7723036184038605 | 0.4576617106244536 | 1e-05 |
| 104 | 0.6456966400146484 | 0.7725475165937876 | 0.7725475165937876 | 0.4578893476938316 | 1e-05 |
| 105 | 0.6455578804016113 | 0.7719377711189701 | 0.7719377711189701 | 0.45556962818046953 | 1e-05 |
| 106 | 0.6444206237792969 | 0.7723384610024215 | 0.7723384610024215 | 0.4644160307431161 | 1e-05 |
| 107 | 0.26552170515060425 | 0.04797825821849794 | 0.4910938804941607 | 0.360721302464235 | 1e-05 |
| 108 | 0.1419014185667038 | 0.44983536872179924 | 0.6693680656054029 | 0.22462065038139475 | 1e-05 |
| 109 | 0.07755623757839203 | 0.6714691381683245 | 0.7449736568518617 | 0.2136927959803109 | 1e-05 |
| 110 | 0.05802077427506447 | 0.6837163115625163 | 0.7489470111853911 | 0.26414762709864326 | 1e-05 |
| 111 | 0.053473543375730515 | 0.6935245030574381 | 0.7547299175391458 | 0.331603552491686 | 1e-05 |
| 112 | 0.05167479068040848 | 0.6998135920976987 | 0.7585280588776449 | 0.3483537603081725 | 1e-05 |
| 113 | 0.05106380954384804 | 0.7042734447135067 | 0.7611001027447527 | 0.3378893620669972 | 1e-05 |
| 114 | 0.05065497010946274 | 0.7053535652688978 | 0.7622047244094489 | 0.35703913789456687 | 1e-05 |
| 115 | 0.05039990693330765 | 0.7117820247034023 | 0.7647240545893983 | 0.36428903036337407 | 1e-05 |
| 116 | 0.05015714839100838 | 0.7101966864688769 | 0.7647289615591668 | 0.3622993891059384 | 1e-05 |
| 117 | 0.050175271928310394 | 0.712914409156635 | 0.7657223847509677 | 0.36544151863175506 | 1e-05 |
| 118 | 0.050468478351831436 | 0.7141687427048309 | 0.7654782537680462 | 0.3524192831401073 | 1e-05 |
| 119 | 0.049900032579898834 | 0.7127053535652689 | 0.7658673932788375 | 0.34416444697858145 | 1e-05 |
| 120 | 0.049903545528650284 | 0.7130886221494399 | 0.7657258505633957 | 0.35077501544817247 | 1e-05 |
| 121 | 0.04957958310842514 | 0.7140816362084285 | 0.7665756914119359 | 0.3627670615797559 | 1e-05 |
| 122 | 0.04973344877362251 | 0.7163812477134545 | 0.7672263726699065 | 0.35293584502475456 | 1e-05 |
| 123 | 0.04949206858873367 | 0.7153533910559049 | 0.7661930650098223 | 0.36741171960145996 | 1e-05 |
| 124 | 0.049613192677497864 | 0.7160676643264055 | 0.7673595994775795 | 0.36413807211107735 | 1e-05 |
| 125 | 0.04959910735487938 | 0.7124440340760614 | 0.7658070643240676 | 0.3509397162428964 | 1e-05 |
| 126 | 0.0494619682431221 | 0.7152662845595025 | 0.7660751240774316 | 0.37424111866414195 | 1e-05 |
| 127 | 0.049399666488170624 | 0.7149178585738925 | 0.766314294299216 | 0.35735030768195364 | 1e-05 |
| 128 | 0.04938925430178642 | 0.714412640894758 | 0.7664776721721585 | 0.36010176970077795 | 1e-05 |
| 129 | 0.049368634819984436 | 0.717931743349419 | 0.7674323253122921 | 0.3641550142658243 | 1e-05 |
| 130 | 0.049409620463848114 | 0.717687845159492 | 0.7667705923765463 | 0.3602711408206009 | 1e-05 |
| 131 | 0.04939533770084381 | 0.718210484137907 | 0.7665082507046622 | 0.3664294602272974 | 1e-05 |
| 132 | 0.04943186417222023 | 0.717583317363809 | 0.7664564319910658 | 0.3651176446191739 | 1e-05 |
| 133 | 0.049288176000118256 | 0.714621696486124 | 0.7658437005098911 | 0.36115748131858555 | 1e-05 |
| 134 | 0.04927274212241173 | 0.7154753401508684 | 0.7659699195779215 | 0.3677607284274943 | 1e-05 |
| 135 | 0.04929700121283531 | 0.7189944426055295 | 0.7674080308866179 | 0.37226237632641596 | 1e-05 |
| 136 | 0.049187980592250824 | 0.7151094928659779 | 0.766711291239524 | 0.35969130867283144 | 1e-05 |
| 137 | 0.0491538941860199 | 0.7157889235379175 | 0.7664713487937058 | 0.3631043131303743 | 1e-05 |
| 138 | 0.04930136725306511 | 0.7178446368530165 | 0.7665450277813434 | 0.3687935119842292 | 1e-05 |
| 139 | 0.04927237331867218 | 0.7182279054371875 | 0.766155421092079 | 0.35626916649899915 | 1e-05 |
| 140 | 0.04918988421559334 | 0.7197958223724326 | 0.767376184687937 | 0.3699733355385332 | 1e-05 |
| 141 | 0.04920462518930435 | 0.716468354209857 | 0.7666041104041745 | 0.35072596287625124 | 1e-05 |
| 142 | 0.04919710010290146 | 0.7194473963868225 | 0.7669340748803981 | 0.36600085102264546 | 1e-05 |
| 143 | 0.04930509999394417 | 0.7168342014947475 | 0.765517685242224 | 0.3673237139632794 | 1e-05 |
| 144 | 0.0490318201482296 | 0.7171477848817965 | 0.7667940015206897 | 0.3554021435508309 | 1e-06 |
| 145 | 0.04918621480464935 | 0.7201616696573231 | 0.7677822164123848 | 0.3711029550898432 | 1e-06 |
| 146 | 0.04903709515929222 | 0.717130363582516 | 0.7665065530257804 | 0.368326447075977 | 1e-06 |
| 147 | 0.049094948917627335 | 0.720823679029982 | 0.768544776459646 | 0.37476196073915324 | 1e-06 |
| 148 | 0.04907181113958359 | 0.7167296736990645 | 0.7667018106807243 | 0.3649988602534311 | 1e-06 |
| 149 | 0.04904184117913246 | 0.718210484137907 | 0.7671139893046166 | 0.3787860055208151 | 1e-06 |
| 150 | 0.04912904277443886 | 0.7154404975523074 | 0.7667920374277589 | 0.3726446424747262 | 1e-06 |

Framework Versions

  • Transformers: 4.44.2
  • PyTorch: 2.4.1+cu121
  • Datasets: 3.0.0
  • Tokenizers: 0.19.1
Model size

  • Parameters: 306M
  • Tensor type: F32
  • Format: Safetensors