Tuu-invitrace committed on
Commit
bb32968
1 Parent(s): e30d78a

Model save

README.md ADDED
@@ -0,0 +1,174 @@
---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: invitrace-vit-food
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.7150311057595826
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# invitrace-vit-food

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3123
- Accuracy: 0.7150

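Below is a minimal inference sketch for this checkpoint using the `transformers` pipeline API. The Hub id `Tuu-invitrace/invitrace-vit-food` is an assumption based on the committer and model name in this card; replace it with the actual repo id or a local checkpoint directory.

```python
# Minimal inference sketch; not part of the generated card.
# "Tuu-invitrace/invitrace-vit-food" is an assumed Hub id; adjust as needed.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="Tuu-invitrace/invitrace-vit-food",
)

# The pipeline accepts a file path, URL, or PIL image.
predictions = classifier("example_dish.jpg", top_k=5)
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```

The pipeline resolves the image processor from the checkpoint, so no manual resizing or normalization is needed.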
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
- mixed_precision_training: Native AMP

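As a rough illustration, these hyperparameters correspond to a `TrainingArguments` configuration along the following lines. Values not stated in the card (the output directory, evaluation and logging cadence) are assumptions; the 200-step evaluation interval is inferred from the results table below.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir and the eval/logging cadence are assumptions, not from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="invitrace-vit-food",   # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=8,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                         # "Native AMP" mixed precision
    evaluation_strategy="steps",       # assumed; the card evaluates every 200 steps
    eval_steps=200,
    logging_steps=200,
)
```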
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:--------:|
| 5.0709 | 0.0803 | 200 | 5.0587 | 0.0460 |
| 4.953 | 0.1605 | 400 | 4.9295 | 0.1174 |
| 4.8249 | 0.2408 | 600 | 4.7934 | 0.2099 |
| 4.6523 | 0.3210 | 800 | 4.6597 | 0.2266 |
| 4.5648 | 0.4013 | 1000 | 4.5348 | 0.2814 |
| 4.5212 | 0.4815 | 1200 | 4.4125 | 0.2882 |
| 4.2975 | 0.5618 | 1400 | 4.2945 | 0.3345 |
| 4.2548 | 0.6421 | 1600 | 4.1838 | 0.3761 |
| 4.0395 | 0.7223 | 1800 | 4.0774 | 0.3873 |
| 3.9615 | 0.8026 | 2000 | 3.9822 | 0.4080 |
| 3.9325 | 0.8828 | 2200 | 3.8886 | 0.4124 |
| 3.8862 | 0.9631 | 2400 | 3.7977 | 0.4479 |
| 3.3958 | 1.0433 | 2600 | 3.6938 | 0.4618 |
| 3.4241 | 1.1236 | 2800 | 3.6019 | 0.4722 |
| 3.3326 | 1.2039 | 3000 | 3.5157 | 0.4951 |
| 3.2437 | 1.2841 | 3200 | 3.4209 | 0.5139 |
| 3.2519 | 1.3644 | 3400 | 3.3291 | 0.5198 |
| 3.2528 | 1.4446 | 3600 | 3.2425 | 0.5308 |
| 3.117 | 1.5249 | 3800 | 3.1715 | 0.5505 |
| 3.014 | 1.6051 | 4000 | 3.0965 | 0.5408 |
| 2.8688 | 1.6854 | 4200 | 3.0171 | 0.5577 |
| 2.9096 | 1.7657 | 4400 | 2.9386 | 0.5748 |
| 2.8936 | 1.8459 | 4600 | 2.8630 | 0.5788 |
| 2.7947 | 1.9262 | 4800 | 2.7981 | 0.5842 |
| 2.7247 | 2.0064 | 5000 | 2.7218 | 0.5910 |
| 2.3716 | 2.0867 | 5200 | 2.6467 | 0.6091 |
| 2.3813 | 2.1669 | 5400 | 2.5855 | 0.6037 |
| 2.1125 | 2.2472 | 5600 | 2.5160 | 0.6091 |
| 2.0332 | 2.3274 | 5800 | 2.4506 | 0.6207 |
| 2.0413 | 2.4077 | 6000 | 2.3949 | 0.6263 |
| 2.3041 | 2.4880 | 6200 | 2.3396 | 0.6183 |
| 1.7894 | 2.5682 | 6400 | 2.2855 | 0.6372 |
| 1.9194 | 2.6485 | 6600 | 2.2341 | 0.6386 |
| 2.0286 | 2.7287 | 6800 | 2.1997 | 0.6392 |
| 1.7409 | 2.8090 | 7000 | 2.1385 | 0.6478 |
| 1.794 | 2.8892 | 7200 | 2.0876 | 0.6528 |
| 1.6189 | 2.9695 | 7400 | 2.0540 | 0.6570 |
| 1.5587 | 3.0498 | 7600 | 2.0068 | 0.6629 |
| 1.2941 | 3.1300 | 7800 | 1.9610 | 0.6701 |
| 1.3048 | 3.2103 | 8000 | 1.9325 | 0.6639 |
| 1.1526 | 3.2905 | 8200 | 1.8883 | 0.6699 |
| 1.2333 | 3.3708 | 8400 | 1.8505 | 0.6693 |
| 1.1094 | 3.4510 | 8600 | 1.8273 | 0.6703 |
| 1.4851 | 3.5313 | 8800 | 1.7896 | 0.6829 |
| 1.1991 | 3.6116 | 9000 | 1.7648 | 0.6829 |
| 1.1898 | 3.6918 | 9200 | 1.7250 | 0.6903 |
| 0.973 | 3.7721 | 9400 | 1.7261 | 0.6769 |
| 1.2646 | 3.8523 | 9600 | 1.6804 | 0.6920 |
| 1.0756 | 3.9326 | 9800 | 1.6639 | 0.6934 |
| 1.0885 | 4.0128 | 10000 | 1.6324 | 0.6924 |
| 0.8466 | 4.0931 | 10200 | 1.6131 | 0.6994 |
| 0.8781 | 4.1734 | 10400 | 1.5981 | 0.6942 |
| 0.8557 | 4.2536 | 10600 | 1.5804 | 0.6988 |
| 0.852 | 4.3339 | 10800 | 1.5532 | 0.7014 |
| 0.7597 | 4.4141 | 11000 | 1.5395 | 0.7036 |
| 0.9044 | 4.4944 | 11200 | 1.5195 | 0.7034 |
| 0.7762 | 4.5746 | 11400 | 1.5106 | 0.7014 |
| 0.6486 | 4.6549 | 11600 | 1.4979 | 0.7034 |
| 0.7373 | 4.7352 | 11800 | 1.4804 | 0.7032 |
| 0.9194 | 4.8154 | 12000 | 1.4659 | 0.7038 |
| 0.6513 | 4.8957 | 12200 | 1.4487 | 0.7050 |
| 0.7235 | 4.9759 | 12400 | 1.4307 | 0.7082 |
| 0.4407 | 5.0562 | 12600 | 1.4304 | 0.7082 |
| 0.5979 | 5.1364 | 12800 | 1.4227 | 0.7112 |
| 0.6776 | 5.2167 | 13000 | 1.4237 | 0.7048 |
| 0.5239 | 5.2970 | 13200 | 1.4098 | 0.7066 |
| 0.5614 | 5.3772 | 13400 | 1.3947 | 0.7110 |
| 0.5483 | 5.4575 | 13600 | 1.3901 | 0.7114 |
| 0.4797 | 5.5377 | 13800 | 1.3844 | 0.7082 |
| 0.5795 | 5.6180 | 14000 | 1.3816 | 0.7120 |
| 0.5108 | 5.6982 | 14200 | 1.3748 | 0.7098 |
| 0.3919 | 5.7785 | 14400 | 1.3662 | 0.7138 |
| 0.5572 | 5.8587 | 14600 | 1.3542 | 0.7154 |
| 0.5333 | 5.9390 | 14800 | 1.3451 | 0.7152 |
| 0.2997 | 6.0193 | 15000 | 1.3406 | 0.7217 |
| 0.3923 | 6.0995 | 15200 | 1.3472 | 0.7162 |
| 0.4682 | 6.1798 | 15400 | 1.3437 | 0.7170 |
| 0.3758 | 6.2600 | 15600 | 1.3396 | 0.7162 |
| 0.3123 | 6.3403 | 15800 | 1.3393 | 0.7182 |
| 0.2974 | 6.4205 | 16000 | 1.3303 | 0.7150 |
| 0.3374 | 6.5008 | 16200 | 1.3275 | 0.7170 |
| 0.5128 | 6.5811 | 16400 | 1.3322 | 0.7126 |
| 0.4074 | 6.6613 | 16600 | 1.3254 | 0.7162 |
| 0.4761 | 6.7416 | 16800 | 1.3249 | 0.7144 |
| 0.2215 | 6.8218 | 17000 | 1.3247 | 0.7134 |
| 0.4581 | 6.9021 | 17200 | 1.3237 | 0.7120 |
| 0.2686 | 6.9823 | 17400 | 1.3138 | 0.7162 |
| 0.375 | 7.0626 | 17600 | 1.3197 | 0.7146 |
| 0.2512 | 7.1429 | 17800 | 1.3172 | 0.7146 |
| 0.3274 | 7.2231 | 18000 | 1.3222 | 0.7134 |
| 0.3209 | 7.3034 | 18200 | 1.3272 | 0.7126 |
| 0.2441 | 7.3836 | 18400 | 1.3216 | 0.7124 |
| 0.2725 | 7.4639 | 18600 | 1.3156 | 0.7132 |
| 0.2326 | 7.5441 | 18800 | 1.3155 | 0.7132 |
| 0.3594 | 7.6244 | 19000 | 1.3140 | 0.7162 |
| 0.2297 | 7.7047 | 19200 | 1.3133 | 0.7152 |
| 0.3722 | 7.7849 | 19400 | 1.3160 | 0.7130 |
| 0.202 | 7.8652 | 19600 | 1.3131 | 0.7142 |
| 0.2272 | 7.9454 | 19800 | 1.3123 | 0.7150 |


### Framework versions

- Transformers 4.41.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
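As a small reproducibility aid, the snippet below compares the installed package versions against the ones reported above; the expected version strings are taken directly from this card.

```python
# Check installed versions against the ones this card was produced with.
import transformers, torch, datasets, tokenizers

expected = {
    "transformers": "4.41.2",
    "torch": "2.2.1+cu121",
    "datasets": "2.19.1",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]}, card used {want}")
```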
runs/Jun03_04-14-33_ip-10-192-11-127/events.out.tfevents.1717388075.ip-10-192-11-127.19498.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b234cc4a91fe43d6b8119b5ad4a0d57c312155a8a1c0b4c01016714350cd2af0
- size 479870
+ oid sha256:430cb5b3c8cd73208ec8c05ee8fd86d4fc71de6fb1018869d61fe97c17e8afa4
+ size 483025