mnist png recognition #15
Comments
Oh, that's great.
Yes, with binary values it works fine. Thank you very much. Will it support the AbsVal activation function in the future?
That's perfect.
if (nonLinear) {
    // ReLU: clamp negative outputs to zero
    if (outputBlob[n - 1][i][j][k] < 0)
        outputBlob[n - 1][i][j][k] = 0;
}
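For AbsVal, a minimal sketch (assuming the same outputBlob indexing as in the snippet above; this is illustrative, not the exact CNNdroid source) would replace the clamp with an absolute value:

if (nonLinear) {
    // AbsVal instead of ReLU: keep the magnitude of negative values
    outputBlob[n - 1][i][j][k] = Math.abs(outputBlob[n - 1][i][j][k]);
}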
Note that with this approach, all of the ReLU layers will be converted to AbsVal, so you won't have the option to have both ReLU and AbsVal in the same network.
I need this because I get 99.54% recognition on MNIST with my specific CNN structure, so I need the AbsVal activation function for it. OK, I will try. Thank you.
I have tried this. I've changed all the search results, and now all of them look like that. Is it right? After the changes, the 'output' array returns {0,0,NaN,0,0,0,0,0,0} again, but with the NaN at the right index. So if I feed a 3 to the CNN it returns {0,0,NaN,0,0,0,0,0,0}, and if I feed a 5 it returns {0,0,0,0,NaN,0,0,0,0,0}, so I can work with it, but I think something is wrong.
Note that for Fully Connected layers the code you need to change is slightly different:

if (nonLinear) {
    switch (nonLinearType) {
        case RectifiedLinearUnit:
            // ReLU: clamp negative FC activations to zero
            if (outputBlob[n][c] < 0)
                outputBlob[n][c] = 0;
            break;
    }
}

Usually the NaN problem comes up when there is a bug in the inputs of the network, or when the network parameters (such as weights) do not match the actual CNN weights that you trained in Caffe.
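As with the convolution layer, a minimal AbsVal variant of this FC branch might look like the sketch below (illustrative only; it assumes the same outputBlob layout and reuses the existing nonLinearType switch):

if (nonLinear) {
    switch (nonLinearType) {
        case RectifiedLinearUnit:
            // reuse the ReLU branch, but apply AbsVal instead of clamping
            outputBlob[n][c] = Math.abs(outputBlob[n][c]);
            break;
    }
}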
Yes, I have changed the Fully Connected layer in the right way. Now CNNdroid works fine, but it returns NaN (at the right index). Thank you for your help. I will try to find mistakes in InputArray.
Do you have an AbsVal activation function just after the FC layer?
Oh, yes, I have it. OK, I will try.
I have changed nonlinear.java, but the CNN still returns the same array {0,0,NaN,0,0,0,0,0,0}.
I have the same problem. Did you resolve it?
I just use .isNaN() to figure out which answer is correct.
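For reference, a small sketch of that workaround in Java (the variable names 'output', 'scores', and 'predicted' are assumptions; 'output' stands for the float[1][10] the network returns, and the NaN slot is treated as the predicted digit):

float[] scores = output[0];              // output is the float[1][10] returned by the network
int predicted = -1;
for (int i = 0; i < scores.length; i++) {
    if (Float.isNaN(scores[i])) {        // the NaN entry marks the predicted class
        predicted = i;
        break;
    }
}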
Thank you for the help with starting the application. Now it works fine, but I have one more problem.
I use this image (28x28) for recognition with LeNet,
and pyCaffe works fine and gives me the answer 3:
output = {6.72078215e-08, 6.08612163e-06, 8.28973225e-06, 9.99530911e-01, 6.53917276e-09, 4.10966662e-04, 1.80607959e-10, 2.87060775e-05, 2.26623479e-06, 1.27552030e-05}
CNNdroid, however, returns float[1][10] = {0,NaN,NaN,NaN,0,NaN,0,0,0,0}.
What did I do wrong? (And one more question: can I use the AbsVal activation function instead of ReLU?)