
How do you overcome the limitation of 100 active partition writers #149

Open
tonicebrian opened this issue Mar 8, 2023 · 2 comments

@tonicebrian

I thought I would be able to regenerate the warehouse from the data lake just by deleting the destination table and running dbt again, but today I was hit by this limitation:

HIVE_TOO_MANY_OPEN_PARTITIONS: Exceeded limit of 100 open writers for partitions/buckets.

Looking at the documentation, this seems to be a known issue.

Since I'm partitioning by date, and that is a sensible choice for the data I have, how do people here ingest data older than three months during the first, non-incremental run of dbt + Athena?
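
For context, the model looks roughly like this (the model and column names below are illustrative, not my actual project). With one partition per day, a full rebuild over a few years of history has to write far more than 100 partitions in a single statement:

```sql
-- models/events.sql (illustrative example)
{{
  config(
    materialized='table',
    partitioned_by=['event_date']   -- one partition per day
  )
}}

select
    event_id,
    payload,
    event_date
from {{ source('raw', 'events') }}
```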

@nicor88

nicor88 commented Mar 8, 2023

Have a look at this conversation in the community adapter: dbt-labs/dbt-athena#87
The solution is to use a version of https://github.com/dbt-labs/dbt-utils#insert_by_period modified for Athena.
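
The basic idea is to split the backfill into one INSERT per period, so that no single statement opens more than 100 partition writers. Here is a minimal sketch of such a macro, assuming daily partitions on an `event_date` column and monthly batches; it is not the actual insert_by_period port, and the macro and relation names are made up for illustration:

```sql
-- macros/backfill_by_month.sql (hypothetical sketch, not the real insert_by_period macro)
{% macro backfill_by_month(target_relation, source_relation, start_date, end_date) %}
  {% if execute %}
    {% set start = modules.datetime.datetime.strptime(start_date, '%Y-%m-%d') %}
    {% set end = modules.datetime.datetime.strptime(end_date, '%Y-%m-%d') %}
    {% set n_months = (end.year - start.year) * 12 + (end.month - start.month) + 1 %}

    {% for i in range(n_months) %}
      {% set year = start.year + ((start.month - 1 + i) // 12) %}
      {% set month = ((start.month - 1 + i) % 12) + 1 %}
      {% set month_start = '%04d-%02d-01' | format(year, month) %}

      {# One INSERT per month keeps the partitions written per statement
         well under Athena's limit of 100 open partition writers,
         assuming daily partitions on an event_date column. #}
      {% set insert_sql %}
        insert into {{ target_relation }}
        select *
        from {{ source_relation }}
        where date_trunc('month', event_date) = date '{{ month_start }}'
      {% endset %}

      {% do run_query(insert_sql) %}
    {% endfor %}
  {% endif %}
{% endmacro %}
```

Something like this could be invoked with `dbt run-operation` once an empty target table exists. The insert_by_period macro linked above is more general, but the batching principle is the same.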

@tonicebrian
Author

Hi @nicor88, thanks for pointing me in the right direction. Do you know of any code snippet with these modifications for Athena? It looks a bit intimidating since it has lots of adapter-specific macro constructs.

Avinash-1394 added a commit to Avinash-1394/dbt-athena that referenced this issue Mar 27, 2023
Co-authored-by: nicor88 <[email protected]>
Co-authored-by: Jérémy Guiselin <[email protected]>