I'm using the Windows version of darknet to train YOLOv4 models. I'm using only the CPU for predictions, since this program will need to run on machines that have only integrated graphics.
As the title says, I have a tiny model and a large model, both trained on the same dataset. I can load both with darknet.exe for the test operation and get predictions from either one. The large model performs better, giving more consistent and more accurate bounding boxes, so I'd like to use it in my C# application. Unfortunately, while I can load both models with Alturos.Yolo, I don't get the predictions I expect from the large model.
Here's the test image that I've been trying to feed:
The file size limit prevents me from uploading the full-sized 5100x7019 pixel version, so I scaled it down by 50%.
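For reference, the 50% downscale was nothing special. A minimal sketch of that kind of resize, assuming System.Drawing (GDI+) is available; `ResizeByHalf` is just an illustrative helper name, not part of Alturos.Yolo:

```csharp
using System.Drawing;
using System.Drawing.Drawing2D;

// Hypothetical helper: downscale a large source image by 50%
// before uploading or feeding it to the detector.
static Bitmap ResizeByHalf(Image source)
{
    var target = new Bitmap(source.Width / 2, source.Height / 2);
    using (var g = Graphics.FromImage(target))
    {
        // High-quality bicubic interpolation to preserve detail.
        g.InterpolationMode = InterpolationMode.HighQualityBicubic;
        g.DrawImage(source, 0, 0, target.Width, target.Height);
    }
    return target;
}
```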
Here are some screenshots of the predictions that each model gives on that image in darknet.exe, first the Large model:
And the Tiny model:
Now, when I load the large model in my C# application and try to predict on this same image, it initializes with the same .cfg I provided and loads the weights, printing the following console output, but it finds no bounding boxes:
Initializing YoloV4 AI...
If YOLO fails check for Microsoft Visual C++ 2017 Redistributable (x64) vc_redist.x64.exe
policy: Using default 'constant'
batch = 1, time_steps = 1, train = 0
layer filters size/strd(dil) input output
0 conv 32 3 x 3/ 1 512 x 704 x 3 -> 512 x 704 x 32 0.623 BF
1 conv 64 3 x 3/ 2 512 x 704 x 32 -> 256 x 352 x 64 3.322 BF
2 conv 64 1 x 1/ 1 256 x 352 x 64 -> 256 x 352 x 64 0.738 BF
3 route 1 -> 256 x 352 x 64
4 conv 64 1 x 1/ 1 256 x 352 x 64 -> 256 x 352 x 64 0.738 BF
5 conv 32 1 x 1/ 1 256 x 352 x 64 -> 256 x 352 x 32 0.369 BF
6 conv 64 3 x 3/ 1 256 x 352 x 32 -> 256 x 352 x 64 3.322 BF
7 Shortcut Layer: 4, wt = 0, wn = 0, outputs: 256 x 352 x 64 0.006 BF
8 conv 64 1 x 1/ 1 256 x 352 x 64 -> 256 x 352 x 64 0.738 BF
9 route 8 2 -> 256 x 352 x 128
10 conv 64 1 x 1/ 1 256 x 352 x 128 -> 256 x 352 x 64 1.476 BF
11 conv 128 3 x 3/ 2 256 x 352 x 64 -> 128 x 176 x 128 3.322 BF
12 conv 64 1 x 1/ 1 128 x 176 x 128 -> 128 x 176 x 64 0.369 BF
13 route 11 -> 128 x 176 x 128
14 conv 64 1 x 1/ 1 128 x 176 x 128 -> 128 x 176 x 64 0.369 BF
15 conv 64 1 x 1/ 1 128 x 176 x 64 -> 128 x 176 x 64 0.185 BF
16 conv 64 3 x 3/ 1 128 x 176 x 64 -> 128 x 176 x 64 1.661 BF
17 Shortcut Layer: 14, wt = 0, wn = 0, outputs: 128 x 176 x 64 0.001 BF
18 conv 64 1 x 1/ 1 128 x 176 x 64 -> 128 x 176 x 64 0.185 BF
19 conv 64 3 x 3/ 1 128 x 176 x 64 -> 128 x 176 x 64 1.661 BF
20 Shortcut Layer: 17, wt = 0, wn = 0, outputs: 128 x 176 x 64 0.001 BF
21 conv 64 1 x 1/ 1 128 x 176 x 64 -> 128 x 176 x 64 0.185 BF
22 route 21 12 -> 128 x 176 x 128
23 conv 128 1 x 1/ 1 128 x 176 x 128 -> 128 x 176 x 128 0.738 BF
24 conv 256 3 x 3/ 2 128 x 176 x 128 -> 64 x 88 x 256 3.322 BF
25 conv 128 1 x 1/ 1 64 x 88 x 256 -> 64 x 88 x 128 0.369 BF
26 route 24 -> 64 x 88 x 256
27 conv 128 1 x 1/ 1 64 x 88 x 256 -> 64 x 88 x 128 0.369 BF
28 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
29 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
30 Shortcut Layer: 27, wt = 0, wn = 0, outputs: 64 x 88 x 128 0.001 BF
31 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
32 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
33 Shortcut Layer: 30, wt = 0, wn = 0, outputs: 64 x 88 x 128 0.001 BF
34 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
35 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
36 Shortcut Layer: 33, wt = 0, wn = 0, outputs: 64 x 88 x 128 0.001 BF
37 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
38 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
39 Shortcut Layer: 36, wt = 0, wn = 0, outputs: 64 x 88 x 128 0.001 BF
40 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
41 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
42 Shortcut Layer: 39, wt = 0, wn = 0, outputs: 64 x 88 x 128 0.001 BF
43 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
44 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
45 Shortcut Layer: 42, wt = 0, wn = 0, outputs: 64 x 88 x 128 0.001 BF
46 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
47 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
48 Shortcut Layer: 45, wt = 0, wn = 0, outputs: 64 x 88 x 128 0.001 BF
49 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
50 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
51 Shortcut Layer: 48, wt = 0, wn = 0, outputs: 64 x 88 x 128 0.001 BF
52 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
53 route 52 25 -> 64 x 88 x 256
54 conv 256 1 x 1/ 1 64 x 88 x 256 -> 64 x 88 x 256 0.738 BF
55 conv 512 3 x 3/ 2 64 x 88 x 256 -> 32 x 44 x 512 3.322 BF
56 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
57 route 55 -> 32 x 44 x 512
58 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
59 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
60 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
61 Shortcut Layer: 58, wt = 0, wn = 0, outputs: 32 x 44 x 256 0.000 BF
62 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
63 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
64 Shortcut Layer: 61, wt = 0, wn = 0, outputs: 32 x 44 x 256 0.000 BF
65 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
66 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
67 Shortcut Layer: 64, wt = 0, wn = 0, outputs: 32 x 44 x 256 0.000 BF
68 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
69 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
70 Shortcut Layer: 67, wt = 0, wn = 0, outputs: 32 x 44 x 256 0.000 BF
71 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
72 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
73 Shortcut Layer: 70, wt = 0, wn = 0, outputs: 32 x 44 x 256 0.000 BF
74 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
75 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
76 Shortcut Layer: 73, wt = 0, wn = 0, outputs: 32 x 44 x 256 0.000 BF
77 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
78 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
79 Shortcut Layer: 76, wt = 0, wn = 0, outputs: 32 x 44 x 256 0.000 BF
80 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
81 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
82 Shortcut Layer: 79, wt = 0, wn = 0, outputs: 32 x 44 x 256 0.000 BF
83 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
84 route 83 56 -> 32 x 44 x 512
85 conv 512 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 512 0.738 BF
86 conv 1024 3 x 3/ 2 32 x 44 x 512 -> 16 x 22 x1024 3.322 BF
87 conv 512 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 512 0.369 BF
88 route 86 -> 16 x 22 x1024
89 conv 512 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 512 0.369 BF
90 conv 512 1 x 1/ 1 16 x 22 x 512 -> 16 x 22 x 512 0.185 BF
91 conv 512 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x 512 1.661 BF
92 Shortcut Layer: 89, wt = 0, wn = 0, outputs: 16 x 22 x 512 0.000 BF
93 conv 512 1 x 1/ 1 16 x 22 x 512 -> 16 x 22 x 512 0.185 BF
94 conv 512 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x 512 1.661 BF
95 Shortcut Layer: 92, wt = 0, wn = 0, outputs: 16 x 22 x 512 0.000 BF
96 conv 512 1 x 1/ 1 16 x 22 x 512 -> 16 x 22 x 512 0.185 BF
97 conv 512 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x 512 1.661 BF
98 Shortcut Layer: 95, wt = 0, wn = 0, outputs: 16 x 22 x 512 0.000 BF
99 conv 512 1 x 1/ 1 16 x 22 x 512 -> 16 x 22 x 512 0.185 BF
100 conv 512 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x 512 1.661 BF
101 Shortcut Layer: 98, wt = 0, wn = 0, outputs: 16 x 22 x 512 0.000 BF
102 conv 512 1 x 1/ 1 16 x 22 x 512 -> 16 x 22 x 512 0.185 BF
103 route 102 87 -> 16 x 22 x1024
104 conv 1024 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x1024 0.738 BF
105 conv 512 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 512 0.369 BF
106 conv 1024 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x1024 3.322 BF
107 conv 512 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 512 0.369 BF
108 max 5x 5/ 1 16 x 22 x 512 -> 16 x 22 x 512 0.005 BF
109 route 107 -> 16 x 22 x 512
110 max 9x 9/ 1 16 x 22 x 512 -> 16 x 22 x 512 0.015 BF
111 route 107 -> 16 x 22 x 512
112 max 13x13/ 1 16 x 22 x 512 -> 16 x 22 x 512 0.030 BF
113 route 112 110 108 107 -> 16 x 22 x2048
114 conv 512 1 x 1/ 1 16 x 22 x2048 -> 16 x 22 x 512 0.738 BF
115 conv 1024 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x1024 3.322 BF
116 conv 512 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 512 0.369 BF
117 conv 256 1 x 1/ 1 16 x 22 x 512 -> 16 x 22 x 256 0.092 BF
118 upsample 2x 16 x 22 x 256 -> 32 x 44 x 256
119 route 85 -> 32 x 44 x 512
120 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
121 route 120 118 -> 32 x 44 x 512
122 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
123 conv 512 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 512 3.322 BF
124 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
125 conv 512 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 512 3.322 BF
126 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
127 conv 128 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 128 0.092 BF
128 upsample 2x 32 x 44 x 128 -> 64 x 88 x 128
129 route 54 -> 64 x 88 x 256
130 conv 128 1 x 1/ 1 64 x 88 x 256 -> 64 x 88 x 128 0.369 BF
131 route 130 128 -> 64 x 88 x 256
132 conv 128 1 x 1/ 1 64 x 88 x 256 -> 64 x 88 x 128 0.369 BF
133 conv 256 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 256 3.322 BF
134 conv 128 1 x 1/ 1 64 x 88 x 256 -> 64 x 88 x 128 0.369 BF
135 conv 256 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 256 3.322 BF
136 conv 128 1 x 1/ 1 64 x 88 x 256 -> 64 x 88 x 128 0.369 BF
137 conv 256 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 256 3.322 BF
138 conv 18 1 x 1/ 1 64 x 88 x 256 -> 64 x 88 x 18 0.052 BF
139 yolo
[yolo] params: iou loss: ciou (4), iou_norm: 0.07, cls_norm: 1.00, scale_x_y: 1.20
nms_kind: greedynms (1), beta = 0.600000
140 route 136 -> 64 x 88 x 128
141 conv 256 3 x 3/ 2 64 x 88 x 128 -> 32 x 44 x 256 0.830 BF
142 route 141 126 -> 32 x 44 x 512
143 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
144 conv 512 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 512 3.322 BF
145 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
146 conv 512 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 512 3.322 BF
147 conv 256 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 256 0.369 BF
148 conv 512 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 512 3.322 BF
149 conv 18 1 x 1/ 1 32 x 44 x 512 -> 32 x 44 x 18 0.026 BF
150 yolo
[yolo] params: iou loss: ciou (4), iou_norm: 0.07, cls_norm: 1.00, scale_x_y: 1.10
nms_kind: greedynms (1), beta = 0.600000
151 route 147 -> 32 x 44 x 256
152 conv 512 3 x 3/ 2 32 x 44 x 256 -> 16 x 22 x 512 0.830 BF
153 route 152 116 -> 16 x 22 x1024
154 conv 512 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 512 0.369 BF
155 conv 1024 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x1024 3.322 BF
156 conv 512 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 512 0.369 BF
157 conv 1024 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x1024 3.322 BF
158 conv 512 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 512 0.369 BF
159 conv 1024 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x1024 3.322 BF
160 conv 18 1 x 1/ 1 16 x 22 x1024 -> 16 x 22 x 18 0.013 BF
161 yolo
[yolo] params: iou loss: ciou (4), iou_norm: 0.07, cls_norm: 1.00, scale_x_y: 1.05
nms_kind: greedynms (1), beta = 0.600000
Total BFLOPS 124.060
avg_outputs = 1020130
Loading weights from C:\YOLO\YoloV4_Custom_Large.weights...
seen 64, trained: 640 K-images (10 Kilo-batches_64)
Done! Loaded 162 layers from weights-file
Used AVX
Used FMA & AVX2
But if I load up the Tiny model with the _Tiny.cfg and weights files, it finds two bounding boxes:
And gives this console output:
Initializing YoloV4_Tiny AI...
If YOLO fails check for Microsoft Visual C++ 2017 Redistributable (x64) vc_redist.x64.exe
batch = 1, time_steps = 1, train = 0
layer filters size/strd(dil) input output
0 conv 32 3 x 3/ 2 512 x 704 x 3 -> 256 x 352 x 32 0.156 BF
1 conv 64 3 x 3/ 2 256 x 352 x 32 -> 128 x 176 x 64 0.830 BF
2 conv 64 3 x 3/ 1 128 x 176 x 64 -> 128 x 176 x 64 1.661 BF
3 route 2 1/2 -> 128 x 176 x 32
4 conv 32 3 x 3/ 1 128 x 176 x 32 -> 128 x 176 x 32 0.415 BF
5 conv 32 3 x 3/ 1 128 x 176 x 32 -> 128 x 176 x 32 0.415 BF
6 route 5 4 -> 128 x 176 x 64
7 conv 64 1 x 1/ 1 128 x 176 x 64 -> 128 x 176 x 64 0.185 BF
8 route 2 7 -> 128 x 176 x 128
9 max 2x 2/ 2 128 x 176 x 128 -> 64 x 88 x 128 0.003 BF
10 conv 128 3 x 3/ 1 64 x 88 x 128 -> 64 x 88 x 128 1.661 BF
11 route 10 1/2 -> 64 x 88 x 64
12 conv 64 3 x 3/ 1 64 x 88 x 64 -> 64 x 88 x 64 0.415 BF
13 conv 64 3 x 3/ 1 64 x 88 x 64 -> 64 x 88 x 64 0.415 BF
14 route 13 12 -> 64 x 88 x 128
15 conv 128 1 x 1/ 1 64 x 88 x 128 -> 64 x 88 x 128 0.185 BF
16 route 10 15 -> 64 x 88 x 256
17 max 2x 2/ 2 64 x 88 x 256 -> 32 x 44 x 256 0.001 BF
18 conv 256 3 x 3/ 1 32 x 44 x 256 -> 32 x 44 x 256 1.661 BF
19 route 18 1/2 -> 32 x 44 x 128
20 conv 128 3 x 3/ 1 32 x 44 x 128 -> 32 x 44 x 128 0.415 BF
21 conv 128 3 x 3/ 1 32 x 44 x 128 -> 32 x 44 x 128 0.415 BF
22 route 21 20 -> 32 x 44 x 256
23 conv 256 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 256 0.185 BF
24 route 18 23 -> 32 x 44 x 512
25 max 2x 2/ 2 32 x 44 x 512 -> 16 x 22 x 512 0.001 BF
26 conv 512 3 x 3/ 1 16 x 22 x 512 -> 16 x 22 x 512 1.661 BF
27 conv 256 1 x 1/ 1 16 x 22 x 512 -> 16 x 22 x 256 0.092 BF
28 conv 512 3 x 3/ 1 16 x 22 x 256 -> 16 x 22 x 512 0.830 BF
29 conv 18 1 x 1/ 1 16 x 22 x 512 -> 16 x 22 x 18 0.006 BF
30 yolo
[yolo] params: iou loss: ciou (4), iou_norm: 0.07, cls_norm: 1.00, scale_x_y: 1.05
nms_kind: greedynms (1), beta = 0.600000
Unused field: 'resize = 1.5'
31 route 27 -> 16 x 22 x 256
32 conv 128 1 x 1/ 1 16 x 22 x 256 -> 16 x 22 x 128 0.023 BF
33 upsample 2x 16 x 22 x 128 -> 32 x 44 x 128
34 route 33 23 -> 32 x 44 x 384
35 conv 256 3 x 3/ 1 32 x 44 x 384 -> 32 x 44 x 256 2.491 BF
36 conv 18 1 x 1/ 1 32 x 44 x 256 -> 32 x 44 x 18 0.013 BF
37 yolo
[yolo] params: iou loss: ciou (4), iou_norm: 0.07, cls_norm: 1.00, scale_x_y: 1.05
nms_kind: greedynms (1), beta = 0.600000
Unused field: 'resize = 1.5'
Total BFLOPS 14.137
avg_outputs = 624151
Loading weights from C:\YOLO\YoloV4_Custom_Tiny_3_22_2022.weights...
seen 64, trained: 3840 K-images (60 Kilo-batches_64)
Done! Loaded 38 layers from weights-file
Used AVX
Used FMA & AVX2
Result [ processed in 634 ms ]
My method for initializing the Yolo weights in C#:
private void InitializeYoloModel(YoloConfiguration config)
{
    try
    {
        // Dispose any previously loaded model before replacing it.
        if (this.YoloModel != null)
        {
            this.YoloModel.Dispose();
        }

        var useOnlyCpu = true;
        Console.WriteLine("If YOLO fails check for Microsoft Visual C++ 2017 Redistributable (x64) vc_redist.x64.exe");

        var sw = new Stopwatch();
        sw.Start();
        this.YoloModel = new YoloWrapper(config); // , 0, useOnlyCpu
        sw.Stop();

        var action = new MethodInvoker(delegate ()
        {
            var detectionSystemDetail = string.Empty;
            if (!string.IsNullOrEmpty(DetectionSystem.GPU.ToString()))
            {
                detectionSystemDetail = $"({DetectionSystem.GPU})";
            }

            Console.WriteLine($"Initialize YoloModel in {sw.Elapsed.TotalMilliseconds:0} ms - Detection System:{this.YoloModel.DetectionSystem} {detectionSystemDetail} Weights:{config.WeightsFile}");
        });
    }
    catch (Exception ee)
    {
        Console.WriteLine("Error loading YoloModel: " + ee.Message);
        YoloModel = null;
    }
}
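For completeness, the detection call itself looks roughly like this. This is a sketch, not my exact code: `DetectOnImage` is an illustrative wrapper around Alturos.Yolo's `Detect` method, and the image path is a placeholder:

```csharp
// Sketch of the detection call, assuming this.YoloModel was
// initialized by InitializeYoloModel above. YoloWrapper.Detect
// accepts an image file path (or a byte[]) and returns YoloItem
// results with Type, Confidence, X, Y, Width, and Height.
private void DetectOnImage(string imagePath)
{
    if (this.YoloModel == null)
    {
        return;
    }

    var items = this.YoloModel.Detect(imagePath);
    foreach (var item in items)
    {
        Console.WriteLine(
            $"{item.Type} {item.Confidence:0.00} " +
            $"x={item.X} y={item.Y} w={item.Width} h={item.Height}");
    }
}
```

With the Tiny model this loop prints the two boxes shown above; with the large model it prints nothing.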
As an extra note, I don't have any Microsoft Visual C++ Redistributable versions beyond this one:
Microsoft Visual C++ 2015-2019 Redistributable (x64) - 14.22.27821
As I said, my setup works without issue with the Tiny model. The large model even initializes without returning any errors; it just doesn't return the predictions I would expect based on its performance in darknet.exe.
If there is any additional information that would help determine the cause of this failure, please let me know and I will provide it as soon as possible.
Thanks so much for your time.
EDIT: I have also loaded the provided YoloV3 pretrained model and got the predictions I expect from it, so I'm still not sure where the point of failure is for my custom-trained large YoloV4 model.
Here are the .cfg files, converted to .txt, for each of the models:
YoloV4_Custom_Large.txt
YoloV4_Custom_Tiny.txt