
Memory leak #128

Open
Tracked by #133
chriswessels opened this issue Jan 8, 2024 · 6 comments
Labels
- meta:triaged - This issue has been triaged (has a good description, as well as labels for priority, size and type)
- p1 - High priority
- size:medium - Medium
- type:bug - Something isn't working

Comments

@chriswessels (Member) commented Jan 8, 2024

Describe the bug
Subgraph Radio appears to leak memory, as reported by @trader-payne. See the screenshots below; the drops in memory correspond to restarts.

Screenshots
Running docker tag 1.0:
[three screenshots: container memory usage climbing steadily, dropping only at restarts]
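
For reference, a minimal sketch of how this growth could be tracked without a metrics dashboard: a small helper (not part of Subgraph Radio; the program name, PID argument and 60-second interval are illustrative) that polls the radio process's resident set size from /proc on Linux and prints it:

```rust
// Hypothetical diagnostic helper, not part of Subgraph Radio: poll the RSS of
// a running process so memory growth between restarts can be charted from a
// plain log file. Usage: rss-poll <pid>
use std::{env, fs, thread, time::Duration};

/// Read VmRSS (in KiB) from /proc/<pid>/status; returns None once the
/// process has exited or /proc is not readable.
fn rss_kib(pid: u32) -> Option<u64> {
    let status = fs::read_to_string(format!("/proc/{pid}/status")).ok()?;
    status
        .lines()
        .find(|l| l.starts_with("VmRSS:"))?
        .split_whitespace()
        .nth(1)?
        .parse()
        .ok()
}

fn main() {
    let pid: u32 = env::args()
        .nth(1)
        .expect("usage: rss-poll <pid>")
        .parse()
        .expect("pid must be a number");

    loop {
        match rss_kib(pid) {
            Some(kib) => println!("RSS: {:.1} MiB", kib as f64 / 1024.0),
            None => break, // target process is gone
        }
        thread::sleep(Duration::from_secs(60));
    }
}
```

Pointing it at the radio's PID and plotting the output over a day or two should reproduce the sawtooth pattern above if the leak is present.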

chriswessels added the type:bug, size:medium, p1, and meta:triaged labels on Jan 8, 2024
@chriswessels (Member, Author) commented

It looks like ghcr.io/graphops/subgraph-radio:0.1.10 may not exhibit this behaviour:

[screenshot: memory usage while running ghcr.io/graphops/subgraph-radio:0.1.10]

pete-eiger self-assigned this Jan 9, 2024
pete-eiger mentioned this issue Feb 27, 2024
pete-eiger reopened this Mar 2, 2024
@aasseman commented May 8, 2024

Is it normal that it uses so much RAM at steady state? (3-5GB)

@pete-eiger (Contributor) commented

> Is it normal that it uses so much RAM at steady state? (3-5GB)

Which version are you using? 😰

@aasseman commented May 8, 2024

Actually I didn't quite reach 3 GB on the last run, but I see on Chris's screenshot that he was at ~5 GB:
[screenshot: memory usage graph topping out just under 3 GB]

@pete-eiger (Contributor) commented

Oof, 2.78 GB is still quite a lot... thank you for reporting this! We will look into it.

@pete-eiger (Contributor) commented

@aasseman while the issue seems somewhat improved and memory is now freed at least some of the time, in the long run usage still keeps increasing essentially forever :( we will have to dive into this again.
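
One way to narrow this down is sketched below. It is a sketch under assumptions, not something already in the radio: it presumes the binary is switched to jemalloc via the tikv-jemallocator / tikv-jemalloc-ctl crates, so that jemalloc's `allocated` vs `resident` counters can be logged over time to separate live-object growth from memory the allocator simply hasn't returned to the OS:

```rust
// Hypothetical diagnostic, not part of Subgraph Radio: requires the
// tikv-jemallocator and tikv-jemalloc-ctl crates as dependencies.
use std::{thread, time::Duration};

use tikv_jemalloc_ctl::{epoch, stats};
use tikv_jemallocator::Jemalloc;

// Route all allocations through jemalloc so the stats below reflect this
// process's heap.
#[global_allocator]
static GLOBAL: Jemalloc = Jemalloc;

fn main() {
    // In a real service this loop would run on a background thread next to
    // the radio's existing tasks; here it is the whole program.
    loop {
        // Most jemalloc stats are cached; advancing the epoch refreshes them.
        epoch::advance().unwrap();
        let allocated = stats::allocated::read().unwrap();
        let resident = stats::resident::read().unwrap();
        println!(
            "allocated = {:.1} MiB, resident = {:.1} MiB",
            allocated as f64 / (1024.0 * 1024.0),
            resident as f64 / (1024.0 * 1024.0)
        );
        thread::sleep(Duration::from_secs(60));
    }
}
```

If `allocated` tracks the growth seen in the graphs, something is genuinely retaining objects (e.g. an unbounded cache or channel); if only `resident` grows, it points more toward fragmentation or the allocator's release behaviour.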
