Somehow just starting a local dask client via

```python
from dask.distributed import Client

client = Client()
```

interferes with pyspark. It seems to affect only the USGS and NWM fetching, probably in the `_get_secondary_location_ids()` function:

```python
lcw_df = self.ev.location_crosswalks.query(
    filters={
        "column": "secondary_location_id",
        "operator": "like",
        "value": f"{prefix}-%",
    }
).to_pandas()
```

where nothing is returned by the query and the missing-table error is raised. If we do not initialize the client, it works fine.
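A minimal sketch of a possible workaround, assuming the dask client is only needed for work that can finish before the Spark-backed fetching starts. The context-manager usage is standard dask.distributed; whether closing the client actually avoids the missing-table error here is untested:

```python
from dask.distributed import Client

# Possible workaround (untested here): scope the local dask client so it
# is shut down before any of the pyspark-backed TEEHR fetching runs.
with Client() as client:
    ...  # dask-only work goes here

# With the client closed, run the USGS/NWM fetching afterwards, so the
# location_crosswalks query from the report executes without a live dask
# scheduler in the same process.
```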
However, it seems to work fine on TEEHR-Hub v0.4-beta either way.