Add the CBS monthly data collection script to crontab:
- Add CBS data to an S3 bucket: create a directory (prefix) per provider code and per year.
- Import CBS data from email and upload to AWS. Currently the importmail process runs once a week and uploads to S3 the data from the last two CBS emails.
- Trigger "Load CBS data from S3" when new CBS data arrives.
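The per-provider, per-year S3 layout can be sketched as below. The key-layout helper and the function and parameter names are illustrative (the real bucket name and layout live in the project's configuration); `s3_client` is expected to be something like `boto3.client("s3")`, injected so the sketch stays library-agnostic:

```python
from pathlib import Path


def s3_key(provider_code: str, year: int, filename: str) -> str:
    """Key layout assumption: one prefix per provider code, one per year."""
    return f"{provider_code}/{year}/{filename}"


def upload_cbs_dir(local_dir, provider_code, year, s3_client, bucket):
    """Upload every file in local_dir under the provider/year prefix.

    s3_client is assumed to expose upload_file(path, bucket, key),
    as boto3's S3 client does.
    """
    keys = []
    for path in sorted(Path(local_dir).iterdir()):
        key = s3_key(provider_code, year, path.name)
        s3_client.upload_file(str(path), bucket, key)
        keys.append(key)
    return keys
```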
Command to delete a given year's data and reload starting from that year (for example 2019): `python main.py process cbs --path <cbs dir path> --delete_start_date 2019-01-01 -load_start_year=2019`
- The CBS parser is in the file cbs.py.
- Delete only data starting from the year of the files that have just arrived.
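The delete-then-reload rule above can be sketched as follows: derive the earliest year among the files that just arrived and build the corresponding `main.py` invocation. The filename-to-year regex and the helper names are assumptions for illustration, not the project's actual implementation:

```python
import re


def earliest_year(filenames):
    """Extract a 4-digit year from each CBS file name, return the smallest.

    Assumes the year appears somewhere in the name (e.g. '2019_accidents.csv').
    """
    years = [int(m.group())
             for name in filenames
             for m in [re.search(r"(19|20)\d{2}", name)] if m]
    return min(years)


def build_reload_command(cbs_dir, filenames):
    """Build the delete-and-reload command for the years that arrived."""
    year = earliest_year(filenames)
    return (f"python main.py process cbs --path {cbs_dir} "
            f"--delete_start_date {year}-01-01 -load_start_year={year}")
```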
Create a DB table to version the emails we load from email to S3, and upload new data to S3 only when new data arrives.
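A sketch of the proposed versioning table, using sqlite3 here purely for illustration (the real table would live in the project database, with whatever columns the importmail process needs). The idea is that the email's Message-ID acts as the dedup key, so an upload to S3 happens only for messages not seen before:

```python
import sqlite3


def init_db(conn):
    # Hypothetical table name and columns; adjust to the project's schema.
    conn.execute("""
        CREATE TABLE IF NOT EXISTS cbs_email_versions (
            message_id     TEXT PRIMARY KEY,
            received_at    TEXT NOT NULL,
            uploaded_to_s3 INTEGER NOT NULL DEFAULT 0
        )""")


def is_new_email(conn, message_id):
    """True if this Message-ID has not been recorded yet."""
    row = conn.execute(
        "SELECT 1 FROM cbs_email_versions WHERE message_id = ?",
        (message_id,)).fetchone()
    return row is None


def record_email(conn, message_id, received_at):
    """Mark the email as processed and uploaded to S3."""
    conn.execute(
        "INSERT OR IGNORE INTO cbs_email_versions "
        "(message_id, received_at, uploaded_to_s3) VALUES (?, ?, 1)",
        (message_id, received_at))
```

With this in place, the weekly importmail run would call `is_new_email` per incoming message and skip the S3 upload for messages already recorded.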
Modify the schedule from weekly back to daily (see this PR, which changed it from daily to weekly).
Note that the CBS processes now live in the anyway-etl repo (see the process repo here).