Maybe for most use cases, but checksumming several hundred GB of files on every upload can take hours at best and days at worst, speaking from my own experience. I am still uploading a 300 GB archive from a Raspberry Pi 3, and every time my connection drops I have to restart from the beginning, which means rehashing all the files both locally and on IA. The SQLite database keeps track of which files have been uploaded, so the script can simply continue where it left off and only verify the MD5s once all files are uploaded.
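The resume-tracking idea described above can be sketched roughly like this: record each successfully uploaded file in a local SQLite database, and on restart skip anything already recorded. This is only a minimal illustration of the approach, not the linked script's actual code; the names (`init_db`, `resume_upload`, `upload_one`, the table schema) are hypothetical.

```python
import sqlite3

def init_db(path=":memory:"):
    # One table mapping file name -> "already sent"; PRIMARY KEY makes
    # re-inserting the same name a no-op with INSERT OR IGNORE.
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE IF NOT EXISTS uploaded (name TEXT PRIMARY KEY)")
    return db

def already_uploaded(db, name):
    row = db.execute("SELECT 1 FROM uploaded WHERE name = ?", (name,)).fetchone()
    return row is not None

def mark_uploaded(db, name):
    db.execute("INSERT OR IGNORE INTO uploaded (name) VALUES (?)", (name,))
    db.commit()  # commit per file so a crash loses at most the file in flight

def resume_upload(db, files, upload_one):
    """Upload only files not yet recorded; mark each one after it succeeds.

    `upload_one` stands in for whatever actually sends a file
    (e.g. one `ia upload` invocation per file).
    """
    for name in files:
        if already_uploaded(db, name):
            continue  # skip work done before the interruption
        upload_one(name)
        mark_uploaded(db, name)
```

Because the database is only updated after a file finishes, an interrupted run re-sends at most the one file that was in flight, rather than rehashing or re-uploading the whole 300 GB tree.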
Hello,
there is no bulk upload feature in the ia tool. When I try to upload 300 GB, the connection sometimes drops, which means I have to start from the beginning, as the tool doesn't keep track of what is already on the server compared to what is there locally. This is just a waste of bandwidth. I have written a tool (script) that keeps track of what has been uploaded, so it is easier to resume after an interruption. I wish this would be added as a function to the ia tool you are developing.
https://github.com/ockentap/Internet-Archive-CLI-Bulk-Upload-Script