get accurate frame sampling time #538
Comments
Hi @shockjiang. However, what you wrote is not correct. The timestamp corresponds to the host system clock at the exact moment when the frames are ready in the buffer of the USB3 controller.
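For reference, here is a minimal sketch of how that timestamp can be read on the host side. It assumes the ZED SDK 3.x C++ API (sl::Camera::grab() and getTimestamp()); nothing here is specific to this issue:

```cpp
#include <sl/Camera.hpp>
#include <iostream>

int main() {
    sl::Camera zed;
    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;

    if (zed.grab() == sl::ERROR_CODE::SUCCESS) {
        // Host system clock value taken when the frame became available
        // in the USB3 controller buffer, in nanoseconds.
        sl::Timestamp ts = zed.getTimestamp(sl::TIME_REFERENCE::IMAGE);
        std::cout << "Image timestamp (ns): " << ts.getNanoseconds() << std::endl;
    }

    zed.close();
    return 0;
}
```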
Thank you for your response.
No, they are sampled per line and transmitted to the host per image.
This is not possible. The HW does not allow precisely retrieving this kind of information. You can estimate the mean latency by pointing the camera to a monitor showing a stopwatch.
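A rough sketch of that measurement workflow, assuming the ZED SDK 3.x C++ API (the file naming is just illustrative): grab a few frames of a monitor showing the host clock at millisecond resolution, save each frame with its SDK timestamp in the filename, then compare the clock value visible inside the image with the recorded timestamp.

```cpp
#include <sl/Camera.hpp>
#include <string>

int main() {
    sl::Camera zed;
    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;

    sl::Mat image;
    for (int i = 0; i < 10 && zed.grab() == sl::ERROR_CODE::SUCCESS; ++i) {
        zed.retrieveImage(image, sl::VIEW::LEFT);
        // Frame timestamp reported by the SDK (host clock, nanoseconds).
        sl::Timestamp ts = zed.getTimestamp(sl::TIME_REFERENCE::IMAGE);
        // Latency ~= ts minus the clock value readable inside the saved image.
        std::string name = "frame_" + std::to_string(ts.getNanoseconds()) + ".png";
        image.write(name.c_str());
    }

    zed.close();
    return 0;
}
```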
For a GPS + VO fusion application, if the app cannot get the precise time at which the camera "takes" the photo, the fusion performance may be greatly affected. Am I right? If so, there is a need for a feature that provides an accurate timestamp. Also, is there an API to set the time on the camera's OS?
The method currently used to retrieve the frame timestamp is as precise as possible.
Can you explain better? The question is not clear.
In my first reply, I wrote this:
This is how the timestamp is set for the ZED. The internal ISP does not allow retrieving the timestamp in a more precise way. If you use the GPS with a rate of 10 Hz, and the ZED latency is 30-50 msec, then it's easy to assign each frame to the correct GPS datum. It would have been a problem if the latency was higher than the GPS period.
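As an illustration of that matching step, here is a hedged sketch in plain C++ (no ZED SDK types; the GpsFix structure, the matchGps name, and the ~40 ms latency value are all hypothetical): subtract the estimated latency from the frame timestamp, then pick the GPS fix whose time is closest.

```cpp
#include <cstdint>
#include <vector>

struct GpsFix { int64_t t_ns; double lat, lon, alt; };  // hypothetical GPS sample

// Return the index of the GPS fix closest in time to a frame, after removing
// an estimated capture-to-host latency (e.g. ~40 ms from the stopwatch test).
size_t matchGps(const std::vector<GpsFix>& fixes, int64_t frame_ts_ns,
                int64_t estimated_latency_ns = 40'000'000) {
    const int64_t capture_ts = frame_ts_ns - estimated_latency_ns;
    size_t best = 0;
    int64_t best_dt = INT64_MAX;
    for (size_t i = 0; i < fixes.size(); ++i) {
        int64_t dt = fixes[i].t_ns - capture_ts;
        if (dt < 0) dt = -dt;
        if (dt < best_dt) { best_dt = dt; best = i; }
    }
    return best;
}
```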
Thank you so much @Myzhar |
It's my pleasure. Do not hesitate to write an email to [email protected] if you need help.
Preliminary Checks
Proposal
According to the documentation here: https://www.stereolabs.com/docs/api/classsl_1_1Camera.html#a3cd31c58aba33727f35aeae28244c82d, the timestamp for each frame is the time that indicates the end of each frame's readout.
Is the above description right?
In some cases, we really need an accurate time, which could be the midpoint between the start and end of the frame's sampling (readout); that would be an accurate timestamp for the frame.
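For clarity, the correction being requested could look roughly like the sketch below. It only illustrates the idea: transfer_ns and readout_ns are not exposed by the SDK and would have to be estimated (e.g. from the sensor datasheet and the stopwatch latency test).

```cpp
#include <cstdint>

// Sketch of a mid-readout timestamp, assuming the reported timestamp marks the
// moment the frame is ready in the host USB3 buffer (end of readout + transfer).
// Both delay values below are placeholders, not SDK parameters.
int64_t midExposureTimestamp(int64_t reported_ts_ns,
                             int64_t transfer_ns,   // estimated USB transfer/buffering delay
                             int64_t readout_ns) {  // estimated full-frame readout duration
    // End of readout ~= reported - transfer; the middle of readout is half the
    // readout duration earlier than that.
    return reported_ts_ns - transfer_ns - readout_ns / 2;
}
```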
Use-Case
realtime multi-sensor fusion for navigation
Anything else?
No response