New feature

Submitting hundreds of analyses is very time-intensive, and querying takes a long time because the API requests become very large. It would be good to allow submitting jobs in batches.

How it works now:

Currently everything is retrieved and created in a single run, which can take tens of minutes ⌛️😴💤, as the queried object is very large.

Suggestion:
BATCH MODE: Running 1-400 of 2000
Retrieving 400 from experiments API endpoint...
RUNNING 400 TUPLES FOR CICERO 1.9.2 GRCH37
----------------------------------------
Checking for existing analyses...
Retrieving 0 from analyses API endpoint...
Creating analyses for 400 tuples... [####################################] 100%
IDENTIFIER PROJECTS TARGETS REFERENCES MESSAGE
493756 100 (TD) IID_H100004_T01_01_TD01 0 READY FOR SUBMISSION
/work/isabl/data/analyses/37/56/493756
[...]
RAN 400 | SKIPPED 0 | INVALID 0
BATCH MODE: Running 401-800 of 2000
Retrieving 400 from experiments API endpoint...
RUNNING 400 TUPLES FOR CICERO 1.9.2 GRCH37
----------------------------------------
Checking for existing analyses...
Retrieving 0 from analyses API endpoint...
Creating analyses for 400 tuples... [####################################] 100%
IDENTIFIER PROJECTS TARGETS REFERENCES MESSAGE
493756 100 (TD) IID_H100004_T01_01_TD01 0 READY FOR SUBMISSION
/work/isabl/data/analyses/37/56/493756
[...]
RAN 400 | SKIPPED 0 | INVALID 0
[...]
BATCH MODE: Running 1601-2000 of 2000
Retrieving 400 from experiments API endpoint...
RUNNING 400 TUPLES FOR CICERO 1.9.2 GRCH37
----------------------------------------
Checking for existing analyses...
Retrieving 0 from analyses API endpoint...
Creating analyses for 400 tuples... [####################################] 100%
IDENTIFIER PROJECTS TARGETS REFERENCES MESSAGE
493756 100 (TD) IID_H100004_T01_01_TD01 0 READY FOR SUBMISSION
/work/isabl/data/analyses/37/56/493756
[...]
RAN 400 | SKIPPED 0 | INVALID 0
Add --commit to proceed.
This way it might be ⚡️ faster, as API responses are smaller and the first batches of analyses can start running while the others are still being created.
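For illustration only, here is a minimal sketch of what such a batching loop could look like, assuming a DRF-style paginated experiments endpoint with limit/offset parameters and a run_tuples callable that creates and submits the analyses for one batch; the URL, parameter names, and helpers are placeholders, not the actual isabl_cli API.

import requests

API_URL = "https://isabl.example.org/api/v1/experiments/"  # placeholder endpoint
BATCH_SIZE = 400


def iter_experiment_batches(filters, batch_size=BATCH_SIZE):
    """Yield experiments one page at a time so each API response stays small."""
    offset = 0
    while True:
        params = dict(filters, limit=batch_size, offset=offset)
        results = requests.get(API_URL, params=params).json()["results"]
        if not results:
            return
        yield results
        offset += batch_size


def submit_in_batches(filters, run_tuples, total):
    """Create and submit analyses batch by batch instead of all at once."""
    start = 1
    for batch in iter_experiment_batches(filters):
        end = start + len(batch) - 1
        print(f"BATCH MODE: Running {start}-{end} of {total}")
        # Earlier batches can already be queued or running while the next
        # page of experiments is still being retrieved and created.
        # (experiment, None) stands in for whatever tuple shape run_tuples expects.
        run_tuples([(experiment, None) for experiment in batch])
        start = end + 1

Because each page is submitted before the next one is fetched, no single request has to carry all 2000 objects, and the first analyses can start running while later batches are still being created.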