
Why is the input to the model with_missing_mask? #45

Closed
tangkai-RS opened this issue Nov 6, 2024 · 3 comments

@tangkai-RS
Hi, great job! I'm a beginner in the field of time-series imputation. May I ask why the concatenation of `missing_mask` and `X` is used as the input to SAITS?

```python
def impute(self, inputs):
    X, masks = inputs["X"], inputs["missing_mask"]
    input_X = torch.cat([X, masks], dim=2) if self.input_with_mask else X
    input_X = self.embedding(input_X)
```

@WenjieDu
Owner

WenjieDu commented Nov 6, 2024

Hi there,

Thank you so much for your attention to SAITS! If you find SAITS helpful to your work, please star ⭐️ this repository. Your star is your recognition, which lets others notice SAITS. It matters and is definitely a kind of contribution.

I have received your message and will respond ASAP. Thank you again for your patience! 😃

Best,
Wenjie


This issue had no activity for 14 days. It will be closed in 1 week unless there is some new activity. Is this issue already resolved?

@github-actions github-actions bot added the stale label Nov 21, 2024
@WenjieDu
Owner

NaNs in `X` are filled with 0 before being fed to the model, so a filled-in 0 is indistinguishable from a genuinely observed 0. Concatenating `missing_mask` tells the model which values are actually missing.
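A minimal sketch of this idea with a toy tensor (illustrative only, not the actual SAITS code; shapes follow the `[batch, time, features]` convention used in the snippet above):

```python
import torch

# Toy batch: 1 sample, 3 time steps, 2 features, with one missing value (NaN).
X = torch.tensor([[[0.5, float("nan")],
                   [0.0, 1.2],
                   [0.7, 0.3]]])

# missing_mask is 1 where a value is observed, 0 where it is missing.
missing_mask = (~torch.isnan(X)).float()

# Fill NaNs with 0 so the tensor contains only valid numbers...
X_filled = torch.nan_to_num(X, nan=0.0)

# ...then concatenate the mask along the feature dimension (dim=2).
# Without the mask, the filled 0.0 at [0, 0, 1] would look identical
# to the genuinely observed 0.0 at [0, 1, 0].
input_X = torch.cat([X_filled, missing_mask], dim=2)
print(input_X.shape)  # torch.Size([1, 3, 4])
```

The embedding layer then receives `2 * n_features` input channels, which is why the mask concatenation must be mirrored in the layer's configured input size.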
