Fix an issue where updates on cubes or updates on datasets using
dask.dataframe might not update all secondary indices, resulting in
a corrupt state after the update.
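This was the path exercised by a dask-backed update that declares secondary indices. A minimal sketch, assuming a local store created via `storefact.get_store_from_url` and the `update_dataset_from_ddf` entry point from `kartothek.io.dask.dataframe`; the exact signature may vary between kartothek versions:

```python
import dask.dataframe as dd
import pandas as pd
from storefact import get_store_from_url

from kartothek.io.dask.dataframe import update_dataset_from_ddf

# Store factory pointing at a local directory (assumed storefact URL scheme).
store_factory = lambda: get_store_from_url("hfs:///tmp/ktk_example")

ddf = dd.from_pandas(
    pd.DataFrame({"part": [1, 1, 2], "payload": ["a", "b", "c"]}),
    npartitions=2,
)

# Before this fix, an update like the one below could leave some of the
# declared secondary indices (here: "payload") stale after the update.
delayed = update_dataset_from_ddf(
    ddf,
    store=store_factory,
    dataset_uuid="example_dataset",
    table="table",
    partition_on=["part"],
    secondary_indices=["payload"],
)
delayed.compute()
```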
Expose compression type and row group chunk size in the Cube interface
via an optional parameter of type `kartothek.serialization.ParquetSerializer`.
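For example (a sketch; `df_serializer` as the name of the new optional parameter on the cube build function is an assumption, check the cube API of your version):

```python
import pandas as pd
from storefact import get_store_from_url

from kartothek.core.cube.cube import Cube
from kartothek.io.eager_cube import build_cube
from kartothek.serialization import ParquetSerializer

store_factory = lambda: get_store_from_url("hfs:///tmp/ktk_cube")

# ZSTD-compressed Parquet files with at most 50_000 rows per row group.
serializer = ParquetSerializer(compression="ZSTD", chunk_size=50_000)

cube = Cube(
    dimension_columns=["dim"],
    partition_columns=["part"],
    uuid_prefix="example_cube",
)

df = pd.DataFrame({"dim": [1, 2], "part": [1, 1], "value": [10.0, 20.0]})

# df_serializer is the assumed name of the newly exposed parameter.
build_cube(data=df, cube=cube, store=store_factory, df_serializer=serializer)
```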
Add retries to `kartothek.serialization._parquet.ParquetSerializer.restore_dataframe`.
IOErrors on long-running ktk + dask tasks have been observed. Until the
root cause is fixed, the deserialization is retried to gain more
stability.
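The retry lives inside kartothek; the pattern is roughly the following hypothetical sketch (the helper name, retry count, and backoff are illustrative, not kartothek's actual values):

```python
import time

_MAX_RETRIES = 3  # hypothetical bound, not kartothek's actual value


def _restore_with_retries(read_fn, *args, **kwargs):
    """Retry a flaky deserialization call on IOError.

    Illustrative stand-in for the retry added around
    ParquetSerializer.restore_dataframe.
    """
    for attempt in range(1, _MAX_RETRIES + 1):
        try:
            return read_fn(*args, **kwargs)
        except IOError:
            if attempt == _MAX_RETRIES:
                raise  # surface the error once retries are exhausted
            time.sleep(2 ** attempt)  # simple exponential backoff
```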