I have an embedding array of 1,000,000 vectors, each of dimension 386.
When I pass the full array to community_detection, it keeps running for days and then fails.
VM configuration:
40 GB GPU RAM
84 GB system RAM
Peak utilisation:
GPU memory: 17%
GPU: 11%
CPU memory: 12%
Other observations (let embeds be the variable holding the embedding array):
If I reduce the array to 1/10th of its size, i.e. pass 100,000 embeddings instead of 1,000,000 (embeds[:100000]), execution completes successfully within 2 minutes.
But if I instead pass 100,000 samples taken from the middle of the array (embeds[200000:300000]), it takes far longer than 2 minutes.
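For reference, here is a minimal sketch of how I prepare the embeddings before calling community_detection. The preprocessing (casting to a contiguous float32 array and L2-normalizing, since community_detection scores pairs by cosine similarity) is my own assumption about good practice, not something the library requires; the threshold, min_community_size, and batch_size values below are placeholders:

```python
import numpy as np

def prepare_embeddings(embeds: np.ndarray) -> np.ndarray:
    """Cast to contiguous float32 and L2-normalize each row.

    community_detection (sentence_transformers.util) uses cosine
    similarity, so pre-normalized contiguous float32 input keeps the
    similarity computation on a predictable fast path.
    """
    embeds = np.ascontiguousarray(embeds, dtype=np.float32)
    norms = np.linalg.norm(embeds, axis=1, keepdims=True)
    norms[norms == 0] = 1.0  # guard against all-zero vectors
    return embeds / norms

# Synthetic stand-in for the real data (386-dim, as in this issue)
embeds = np.random.default_rng(0).standard_normal((1000, 386))
prepared = prepare_embeddings(embeds)

# Actual call (requires sentence-transformers; parameter values are
# illustrative, not the ones I used):
# from sentence_transformers.util import community_detection
# clusters = community_detection(prepared, threshold=0.75,
#                                min_community_size=10, batch_size=1024)
```

Even with this preparation, the slice-dependent runtime difference above persists.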