
Add Deepstack/CodeProject-AI.Server detector plugin #6143

Merged: 8 commits into blakeblackshear:dev on May 4, 2023

Conversation

@skrashevich (Contributor) commented Apr 18, 2023

Example configuration:

detectors:
  deepstack:
    api_url: http://192.168.88.50:5500/v1/vision/detection
    type: deepstack
    api_timeout: 0.1

Docker container for testing: skrashevich/frigate:deepstack
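
For context, the DeepStack-style API this plugin targets is simple: the detector POSTs a JPEG as an "image" form field and maps the JSON predictions back into Frigate's detection array. A minimal sketch of that round trip (not the PR's actual code; the JPEG encoding step, the 20-row result array, and the helper names are assumptions based on Frigate's detector interface):

import cv2
import numpy as np
import requests

def detect_raw_sketch(api_url, api_timeout, frame, get_label_index):
    # Encode the raw frame as JPEG and send it as DeepStack's "image" form field
    _, jpeg = cv2.imencode(".jpg", frame)
    resp = requests.post(api_url, files={"image": jpeg.tobytes()}, timeout=api_timeout)
    predictions = resp.json().get("predictions", [])

    # Frigate expects a (20, 6) array of [label_id, score, y_min, x_min, y_max, x_max],
    # with box coordinates normalized to the frame size
    height, width = frame.shape[:2]
    detections = np.zeros((20, 6), np.float32)
    for i, det in enumerate(predictions[:20]):
        detections[i] = [
            get_label_index(det["label"]),
            det["confidence"],
            det["y_min"] / height,
            det["x_min"] / width,
            det["y_max"] / height,
            det["x_max"] / width,
        ]
    return detections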


@NickM-27 (Collaborator)

What kinds of inference times are you seeing?

@skrashevich (Contributor, Author)

6 cameras, each at 10 fps; Frigate and AI.Server are running on different physical machines:
[screenshot: detector inference times]

@NickM-27 (Collaborator)

Thanks, and what is the DeepStack server using to run the inferences (CPU, GPU, etc.)?

@skrashevich (Contributor, Author)

CodeProject AI.Server 2.1.1, YOLOv5 6.2 Large model, half precision, on an NVIDIA GeForce GTX 1070 GPU.

@hawkeye217 (Collaborator)

I've been seeing some hype around the interwebs lately about CodeProject AI, so I wanted to experiment with some of the custom models they've trained ("ipcam-dark" specifically). I tested this PR by quickly spinning up CodeProject AI via docker-compose:

  codeprojectai:
    container_name: codeprojectai
    image: codeproject/ai-server:2.1.3
    ports:
      - 32168:32168
    restart: always
    volumes:
      - /home/myuser/docker/codeprojectai/etc:/etc/codeproject/ai
      - /home/myuser/docker/codeprojectai/modules:/app/modules

config.yml:

detectors:
  deepstack:
    api_url: http://192.168.1.4:32168/v1/vision/custom/ipcam-dark
    type: deepstack
    api_timeout: 1

The particular ipcam-dark model returns a label with a capital first letter, so that threw off Frigate. A quick hack on line 78 of this PR fixed it:

int(self.get_label_index(detection["label"].lower()))

Additionally, the default YOLOv5 model in CodeProject seemed to return labels that weren't in /labelmap.txt - I didn't dig any deeper, but Frigate crashed whenever that happened.

Overall though, I love the flexibility to have various detectors as plugins.

@skrashevich (Contributor, Author) commented Apr 23, 2023

> Additionally, the default YOLOv5 model in CodeProject seemed to return labels that weren't in /labelmap.txt - I didn't dig any deeper, but Frigate crashed whenever that happened.

Please show the logs from the moment Frigate crashes.

@hawkeye217 (Collaborator)

> Additionally, the default YOLOv5 model in CodeProject seemed to return labels that weren't in /labelmap.txt - I didn't dig any deeper, but Frigate crashed whenever that happened.
>
> Please show the logs from the moment Frigate crashes.

2023-04-23 08:17:22.489122981    File "/usr/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
2023-04-23 08:17:22.489124149      self.run()
2023-04-23 08:17:22.489125520    File "/usr/lib/python3.9/multiprocessing/process.py", line 108, in run
2023-04-23 08:17:22.489126709      self._target(*self._args, **self._kwargs)
2023-04-23 08:17:22.489128057    File "/opt/frigate/frigate/object_detection.py", line 121, in run_detector
2023-04-23 08:17:22.489129311      detections = object_detector.detect_raw(input_frame)
2023-04-23 08:17:22.489130588    File "/opt/frigate/frigate/object_detection.py", line 71, in detect_raw
2023-04-23 08:17:22.489131841      return self.detect_api.detect_raw(tensor_input=tensor_input)
2023-04-23 08:17:22.489133171    File "/opt/frigate/frigate/detectors/plugins/deepstack.py", line 78, in detect_raw
2023-04-23 08:17:22.489134415      int(self.get_label_index(detection["label"])),
2023-04-23 08:17:22.489135776  TypeError: int() argument must be a string, a bytes-like object or a number, not 'NoneType'

@hawkeye217 (Collaborator)

So the issue here specifically has to do with the label "truck".

In /labelmap.txt, 7 truck has been renamed to 7 car - this was a decision by the devs a while ago as the default model has a hard time distinguishing between car and truck.

When the CodeProject model returns "truck", this is what causes Frigate to crash.
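
In other words, the traceback above is int() being handed None: get_label_index returns None for any label missing from the labelmap. A guard along these lines (a sketch of the fix being discussed here, not the PR's exact diff) would skip unknown labels and also cover the capitalized labels mentioned earlier:

def safe_label_index(get_label_index, raw_label):
    # Normalize case first ("Person" -> "person"), then refuse to convert
    # unknown labels: returning None lets the caller skip the detection
    # instead of crashing on int(None)
    index = get_label_index(raw_label.lower())
    return None if index is None else int(index)

# Inside the detector's prediction loop:
for det in predictions:
    label_id = safe_label_index(get_label_index, det["label"])
    if label_id is None:
        # e.g. "truck", which the default labelmap renames to "car"
        continue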

@skrashevich (Contributor, Author)

Probably fixed

@NickM-27 (Collaborator)

Please use black to reformat so the check can pass.

@skrashevich (Contributor, Author)

Done

  • Some fixes to use the built-in label map instead of a custom loader

@sstratoti

Whoa, this would be huge! I currently have deepstack running with a model that I trained for delivery truck logos, and I use this in conjunction with "mail and packages" HACS integration to track when deliveries are outside. Right now this is all through a node-red flow that grabs the snapshot from frigate API and runs it against deepstack's api.

I have a Coral USB that I'm using with Frigate. If deepstack were added as a detector and if I still wanted to use the coral for other objects, would I just set up a second camera with the same restream that uses only the deepstack detector? I wouldn't be recording, just detecting objects and passing along the values back into HA for consumption / sending alerts. I hate porch pirates.

@NickM-27 (Collaborator) commented Apr 25, 2023

@sstratoti Running multiple different detectors is currently not supported.

@sstratoti

Hmm, then maybe I could spin up a second Frigate with deepstack as the detector and use it just for detecting objects / logos? Either way, this is an exciting enhancement. Thanks!

@hawkeye217 (Collaborator)

> Hmm, then maybe I could spin up a second Frigate with deepstack as the detector and use it just for detecting objects / logos? Either way, this is an exciting enhancement. Thanks!

That's probably what you'd have to do at this point, and you'd need to also customize your labelmap based on your model.

@skrashevich (Contributor, Author)

> I have a Coral USB that I'm using with Frigate. If deepstack were added as a detector and if I still wanted to use the coral for other objects, would I just set up a second camera with the same restream that uses only the deepstack detector? I wouldn't be recording, just detecting objects and passing along the values back into HA for consumption / sending alerts. I hate porch pirates.

See #6258

@blakeblackshear (Owner)

This just needs docs updates before it can be merged.

@skrashevich (Contributor, Author)

> This just needs docs updates before it can be merged.

Can you help me with the docs? I'm not a native English speaker, so my docs will be ugly (or funny :) )

@NickM-27 (Collaborator) commented May 2, 2023

I'd say the main section of https://docs.frigate.video/configuration/detectors#openvino-detector would be a good template. Basically you want to have:

# <title>

<brief description of CodeProject.AI and the devices it can be run on. I think it would be important for you to mention it is over the network so inference times won't be as good as a native detector.>

## Setup

<describe where to get codeproject ai, I think full setup instructions are outside the scope of the frigate docs>

<show the frigate config file changes that need to be made>

<describe how to verify it is working>

@skrashevich (Contributor, Author)

Here's the question: should I describe Deepstack or CodeProject AI.Server in the Setup section?

@NickM-27 (Collaborator) commented May 2, 2023

> Here's the question: should I describe Deepstack or CodeProject AI.Server in the Setup section?

Given that this is directly interfacing with deepstack, it would be deepstack in that case

@skrashevich (Contributor, Author)

> Given that this is directly interfacing with deepstack, it would be deepstack in that case

But Deepstack and AI.Server implement an identical API (AI.Server can be used as a drop-in replacement for Deepstack). Deepstack came first, but it is no longer maintained, unlike AI.Server.

@NickM-27 (Collaborator) commented May 2, 2023

> Given that this is directly interfacing with deepstack, it would be deepstack in that case
>
> But Deepstack and AI.Server implement an identical API (AI.Server can be used as a drop-in replacement for Deepstack). Deepstack came first, but it is no longer maintained, unlike AI.Server.

I see, that makes sense. In that case I would state exactly that, and mainly focus on CodeProject AI.

@skrashevich force-pushed the deepstack-detector branch from dd2e8f2 to f7f70ce on May 2, 2023 20:21
@skrashevich (Contributor, Author)

🤷

@sstratoti

Re: deepstack being maintained...

johnolafenwa/DeepStack#182

He posted this 3 months ago. No new commits, but it's not entirely dead yet.

@skrashevich (Contributor, Author)

> He posted this 3 months ago. No new commits, but it's not entirely dead yet.

But the latest release was more than a year ago...
Anyway, punks not dead! :)

Two review comments on docs/docs/configuration/detectors.md (outdated; resolved)
@skrashevich requested a review from NickM-27 on May 2, 2023 21:00
@skrashevich (Contributor, Author)

Perhaps it makes sense to add a note that the documentation on codeproject.com is dramatically outdated, and users need to use a different port (5000 instead of 32168) when running in a Docker container?

@NickM-27 (Collaborator) commented May 2, 2023

Interesting; that port worked for me when testing CodeProject AI on my server on the latest version.

@hawkeye217 (Collaborator)

> Perhaps it makes sense to add a note that the documentation on codeproject.com is dramatically outdated, and users need to use a different port (5000 instead of 32168) when running in a Docker container?

I used those instructions and port 32168 when spinning up my container to test this PR and didn't have any issues.

@skrashevich (Contributor, Author)

What versions of the container and host OS (Windows, Linux, Mac)?

@NickM-27 (Collaborator) commented May 2, 2023

> What versions of the container and host OS (Windows, Linux, Mac)?

Linux, pulling the latest codeproject/ai-server:gpu

@skrashevich (Contributor, Author) commented May 2, 2023

> Linux, pulling the latest codeproject/ai-server:gpu

The same image on Linux works on port 5000 for me. Can you check this port on your installation? :)

@hawkeye217 (Collaborator)

> What versions of the container and host OS (Windows, Linux, Mac)?

Similar to Nick: Linux, using codeproject/ai-server, though non-GPU.

My docker-compose is the one posted above.

@skrashevich (Contributor, Author)

It's a bug in AI.Server: in some cases, related to the Docker network type (macvlan or ipvlan), it may stop listening on 32168.
Something to keep in mind...
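
If you run into this, a plain TCP connect test is enough to see which port the container is actually answering on (the host IP here is a placeholder):

import socket

def port_open(host, port, timeout=2.0):
    # Bare TCP connect; no HTTP needed just to check whether AI.Server listens
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in (32168, 5000):
    print(port, "open" if port_open("192.168.1.4", port) else "closed")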

@blakeblackshear merged commit ede1ded into blakeblackshear:dev on May 4, 2023
@skrashevich deleted the deepstack-detector branch on May 4, 2023 23:26
@kirsch33 (Contributor) commented May 9, 2023

> I've been seeing some hype around the interwebs lately about CodeProject AI, so I wanted to experiment with some of the custom models they've trained ("ipcam-dark" specifically). [...] Overall though, I love the flexibility to have various detectors as plugins.

Did you ever try out the ipcam models from CodeProject? Just curious how good they were.

@hawkeye217 (Collaborator)

> Did you ever try out the ipcam models from CodeProject? Just curious how good they were.

No, I never did.

@skrashevich (Contributor, Author) commented May 11, 2023

> Did you ever try out the ipcam models from CodeProject? Just curious how good they were.

They work worse than the main model.

The best thing you can do: fetch https://github.com/ultralytics/yolov5/releases/download/v7.0/yolov5x6.pt, save it to /app/preinstalled-modules/ObjectDetectionYolo/custom-models/ipcam-yolov5x6, and use it in this detector: api_url: http://x.x.x.x:nnnn/v1/vision/custom/ipcam-yolov5x6
It's even better combined with PR #6429.
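
As a script, that fetch-and-place step might look like this (destination path taken verbatim from the comment above, to be run where the AI.Server filesystem is reachable; whether AI.Server wants a .pt extension on the file is worth verifying):

import urllib.request

url = "https://github.com/ultralytics/yolov5/releases/download/v7.0/yolov5x6.pt"
# Path from the comment above, inside the AI.Server container; the server then
# exposes the model at /v1/vision/custom/ipcam-yolov5x6
dest = "/app/preinstalled-modules/ObjectDetectionYolo/custom-models/ipcam-yolov5x6"
urllib.request.urlretrieve(url, dest)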

@inspired27 commented Jun 18, 2023

> Did you ever try out the ipcam models from CodeProject? Just curious how good they were.
>
> They work worse than the main model.
>
> The best thing you can do: fetch https://github.com/ultralytics/yolov5/releases/download/v7.0/yolov5x6.pt, save it to /app/preinstalled-modules/ObjectDetectionYolo/custom-models/ipcam-yolov5x6, and use it in this detector: api_url: http://x.x.x.x:nnnn/v1/vision/custom/ipcam-yolov5x6. It's even better combined with PR #6429.

Hi @skrashevich, I want to use the license-plate model in CodeProject.AI with the Frigate deepstack detector. I have the CodeProject.AI server running and the detector pointing to the license-plate model as below, but it is not detecting. The labels returned are, I think, Dayplate and Nightplate, but I do not know how to make this work with Frigate. Is there something else I need to update? I am running dev 0.13.0-0996883 in Docker. I am new, so some simple step-by-step instructions would really help.

detectors:
  deepstack:
    api_url: http://192.168.1.xx:32168/v1/vision/custom/license-plate
    type: deepstack
    api_timeout: 1 # seconds

I have tried your suggestion above with yolov5x6.pt and it works fine, so I know my connectivity to the CodeProject.AI server is fine, but I do not know how to set up the license-plate model. I am sure it is something simple like adding labels, but I do not know how to do that in Docker and would appreciate it if you could tell me how.
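
One way to see exactly which labels the model returns, before wiring it into Frigate, is to POST a test image straight to the endpoint (IP placeholder as in the config above; snapshot.jpg is any sample image):

import requests

# Send a sample snapshot to the custom model endpoint and print the raw labels,
# i.e. the names a custom labelmap would need to contain
with open("snapshot.jpg", "rb") as f:
    resp = requests.post(
        "http://192.168.1.xx:32168/v1/vision/custom/license-plate",
        files={"image": f},
        timeout=10,
    )
for det in resp.json().get("predictions", []):
    print(det["label"], det["confidence"])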
