
Any suggestion on memory usage for large-scale dataset? #7

Open
wlt027 opened this issue Jul 20, 2023 · 2 comments
Labels
enhancement New feature or request

Comments


wlt027 commented Jul 20, 2023

Hi, thanks for the nice work!
I have tested it on my own data and gotten good results. But for large-scale or long-term data, I found that the process uses too much memory, so could you please give some suggestions?
(Screenshot: Selection_115.png)

HViktorTsoi (Owner) commented

Hello, the excessive memory consumption is caused by the large size of the VoxelMap, whose underlying structure is an std::unordered_map. You can try keeping track of the occupancy status of each map cell and removing cells that have been unoccupied for a long time (or are too far away), thus maintaining a smaller local map. We are also planning to add this feature.
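The pruning strategy described above could be sketched roughly as follows. This is a minimal illustration, not code from the actual repository: the names `VoxelKey`, `Cell`, and `prune_voxel_map` are hypothetical, and the real VoxelMap cells would store point statistics rather than a bare counter.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <unordered_map>

// Hypothetical voxel key: integer grid coordinates of a map cell.
struct VoxelKey {
    int x, y, z;
    bool operator==(const VoxelKey& o) const {
        return x == o.x && y == o.y && z == o.z;
    }
};

// Simple spatial hash so VoxelKey can index an std::unordered_map.
struct VoxelKeyHash {
    std::size_t operator()(const VoxelKey& k) const {
        return std::size_t(k.x) * 73856093u ^
               std::size_t(k.y) * 19349663u ^
               std::size_t(k.z) * 83492791u;
    }
};

// Hypothetical cell payload: track when the voxel was last occupied.
struct Cell {
    double last_seen;  // timestamp of the last scan that hit this voxel
    int num_points;    // stand-in for the real per-cell data
};

using VoxelMap = std::unordered_map<VoxelKey, Cell, VoxelKeyHash>;

// Erase cells unoccupied for longer than max_age, or farther than
// max_dist (in voxel units) from the current sensor cell, keeping the
// local map small. erase() returns the next valid iterator, so the
// map can be pruned safely in a single pass.
void prune_voxel_map(VoxelMap& map, double now, double max_age,
                     const VoxelKey& cur, double max_dist) {
    for (auto it = map.begin(); it != map.end();) {
        const double dx = it->first.x - cur.x;
        const double dy = it->first.y - cur.y;
        const double dz = it->first.z - cur.z;
        const double dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (now - it->second.last_seen > max_age || dist > max_dist)
            it = map.erase(it);
        else
            ++it;
    }
}
```

Calling `prune_voxel_map` after each scan (or every N scans) would bound the map size at the cost of forgetting old regions, which is the trade-off the comment describes.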

@HViktorTsoi HViktorTsoi added the enhancement New feature or request label Jul 20, 2023
HViktorTsoi (Owner) commented

Besides, if dense point cloud publishing is enabled, the RVIZ process may also consume a lot of memory.
