Add Deepstack/CodeProject-AI.Server detector plugin #6143
Conversation
What kinds of inference times are you seeing?
Thanks. And what is the deepstack server using to run the inferences (CPU, GPU, etc.)?
CodeProject AI.Server 2.1.1, YOLOv5 6.2 Large model, half-precision, on an NVIDIA GeForce GTX 1070 GPU.
I've been seeing some hype around the interwebs lately about CodeProject AI, so I wanted to experiment with some of the custom models they've trained ("ipcam-dark" specifically). I tested this PR by quickly spinning up CodeProject AI via docker-compose:
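For reference, a minimal compose sketch along those lines; the image tag, volume path, and port mapping are assumptions based on CodeProject AI's published docker image, not the exact file used here:

```yaml
version: "3.8"
services:
  codeproject-ai:
    image: codeproject/ai-server     # CPU image; GPU variants also exist
    container_name: codeproject-ai
    ports:
      - "32168:32168"                # default AI.Server port (see the port discussion below)
    volumes:
      - ./codeproject-ai:/etc/codeproject/ai  # persist settings/models (path is an assumption)
    restart: unless-stopped
```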
config.yml:
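Roughly along these lines; the detector options shown (`type`, `api_url`, `api_timeout`) follow the shape of the deepstack plugin as later documented, and the host, port, and timeout are placeholders. `/v1/vision/custom/<model>` is the standard Deepstack endpoint for custom models:

```yaml
detectors:
  deepstack:
    type: deepstack
    api_url: http://<codeproject-ai-host>:32168/v1/vision/custom/ipcam-dark
    api_timeout: 10  # seconds; tune for your hardware
```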
The particular ipcam-dark model returns a label with a capital first letter, so that threw off Frigate. A quick hack on line 78 of this PR fixed it.
Additionally, the default YOLOv5 model in CodeProject seemed to return labels that weren't in Frigate's default labelmap. Overall though, I love the flexibility to have various detectors as plugins.
Please show the logs from the moment Frigate crashes.
So the issue here specifically has to do with the label "truck". When the CodeProject model returns "truck", that is what causes Frigate to crash.
Probably fixed.
Please use black to reformat so the check can pass.
Done.
Whoa, this would be huge! I currently have deepstack running with a model that I trained for delivery truck logos, and I use this in conjunction with the "Mail and Packages" HACS integration to track when deliveries are outside. Right now this is all done through a Node-RED flow that grabs the snapshot from the Frigate API and runs it against deepstack's API.

I have a Coral USB that I'm using with Frigate. If deepstack were added as a detector and I still wanted to use the Coral for other objects, would I just set up a second camera with the same restream that uses only the deepstack detector? I wouldn't be recording, just detecting objects and passing the values back to HA for consumption / sending alerts.

I hate porch pirates.
@sstratoti Running multiple different detectors is currently not supported.
Hmm, then maybe I could spin up a second Frigate instance with deepstack as the detector and use it just for detecting objects / logos? Either way, this is an exciting enhancement. Thanks!
That's probably what you'd have to do at this point, and you'd also need to customize your labelmap based on your model, for example:
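Something like the following labelmap override in the Frigate config; the indices and label names here are hypothetical and would need to match the custom model's own labels:

```yaml
model:
  labelmap:
    0: person
    1: delivery_truck  # hypothetical label from a custom-trained model
```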
See #6258
This just needs docs updates before it can be merged.
Can you help me with the docs? I'm not a native English speaker, so my docs will be ugly (or funny :) )
I'd say the main section of https://docs.frigate.video/configuration/detectors#openvino-detector would be a good template. Basically you want to have:
Here's the question: should I describe deepstack or CodeProject AI.Server in the Setup section?
Given that this is directly interfacing with deepstack, it would be deepstack in that case.
But deepstack and AI.Server implement an identical API (AI.Server can be used as a drop-in replacement for deepstack). Deepstack came first, but it is no longer maintained, unlike AI.Server.
I see, that makes sense. In that case I would state exactly that, and mainly focus on CodeProject AI.
🤷
Re: deepstack being maintained... he posted this 3 months ago. No new commits, but it's not entirely dead yet.
But the latest release was more than a year ago...
Perhaps it makes sense to add a note that the documentation on codeproject.com is dramatically outdated, and that users need to use a different port (5000 instead of 32168) to run it in a docker container?
Interesting, that port worked for me when testing CodeProject AI on the latest version on my server.
I used those instructions and port 32168 when spinning up my container to test this PR and didn't have any issues.
What versions of the container and host OS (Windows, Linux, Mac)?
Linux, pulling the latest
The same image on Linux works on port 5000 for me. Can you check this port on your installation? :)
Similar to Nick: Linux, using the latest. My docker-compose was here
It's a bug in AI.Server: in some cases, related to the docker network type (macvlan or ipvlan), it may stop listening on 32168.
Did you ever try out the
No, I never did.
They work worse than the main model. The best thing you can do is fetch yolov5x6.pt and use that instead.
Hi @skrashevich, I want to use the license-plate model in CodeProject.AI with the Frigate deepstack detector. I have the CodeProject.AI server running and the detector pointing to the license-plate model as below, but it is not detecting. The labels returned are, I think, DayPlate and NightPlate, but I do not know how to make this work with Frigate. Is there something else I need to update? I am running dev 0.13.0-0996883 in docker. I am new, so some simple step-by-step instructions would really help.
I have tried your suggestion above with yolov5x6.pt and it works fine, so I know I have connectivity to the CodeProject.AI server, but I do not know how to set things up for the license-plate model. I am sure it is something simple like adding labels, but I do not know how to do that in docker and would appreciate it if you could tell me how.
example configuration:
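A sketch of the shape such a configuration might take, assuming the Deepstack custom-model endpoint and a labelmap override matching the model's labels; the host, port, timeout, and camera name are placeholders:

```yaml
detectors:
  deepstack:
    type: deepstack
    api_url: http://<ai-server-ip>:32168/v1/vision/custom/license-plate
    api_timeout: 10  # seconds

model:
  labelmap:
    0: DayPlate
    1: NightPlate

cameras:
  front_gate:  # placeholder camera name
    objects:
      track:
        - DayPlate
        - NightPlate
```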
docker container for test: skrashevich/frigate:deepstack