Hi, I am trying to run run_sequentially.py inside the Docker container, but I am getting the errors below.
How should I proceed?
submitter@ome/tvb-recon/:/opt/tvb-recon:/opt/tvb-recon/pegasus$ python run_sequentially.py "1"
Starting to process the following subjects: %s [1]
Starting to process the subject: TVB1
Configured atlas default for patient inside folder /home/submitter/data/TVB1/configs
Checking currently running job ids...
Currently running job ids are: []
Starting pegasus run for subject: TVB1with atlas: default
main_pegasus.sh: 7: main_pegasus.sh: Bad substitution
/opt/tvb-recon
2021-06-12 18:33:29,386 - tvb.recon.dax.configuration - INFO - Parsing patient configuration file /home/submitter/data/TVB1/configs/patient_flow.properties
Traceback (most recent call last):
File "/opt/conda/lib/python2.7/runpy.py", line 174, in _run_module_as_main
"main", fname, loader, pkg_name)
File "/opt/conda/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/opt/tvb-recon/tvb/recon/dax/main.py", line 55, in
tracts_generation = TractsGeneration(config.props[ConfigKey.DWI_MULTI_SHELL], config.props[ConfigKey.MRTRIX_THRDS],
KeyError: <ConfigKey.DWI_MULTI_SHELL: 'dwi.multi.shell'>
Traceback (most recent call last):
File "/usr/bin/pegasus-graphviz", line 507, in
main()
File "/usr/bin/pegasus-graphviz", line 497, in main
dag = parse_daxfile(dagfile, options.files)
File "/usr/bin/pegasus-graphviz", line 225, in parse_daxfile
f = open(fname,"r")
IOError: [Errno 2] No such file or directory: '/home/submitter/data/TVB1/configs/dax/main_bnm.dax'
Error: dot: can't open /home/submitter/data/TVB1/configs/dax/main_bnm.dot
2021.06.12 18:33:29.784 UTC: [ERROR] Problem while determining the version of dax class java.lang.RuntimeException: java.io.FileNotFoundException: The file (/home/submitter/data/TVB1/configs/dax/main_bnm.dax ) specified does not exist
2021.06.12 18:33:29.787 UTC: [FATAL ERROR]
[1]: Instantiating DAXParser at edu.isi.pegasus.planner.parser.DAXParserFactory.loadDAXParser(DAXParserFactory.java:235)
[2]: Invalid static initializer method name for DAXParser3 at edu.isi.pegasus.common.util.DynamicLoader.instantiate(DynamicLoader.java:131)
ERROR while logging metrics The metrics file location is not yet initialized
Checking currently running job ids...
Currently running job ids are: []
Traceback (most recent call last):
File "run_sequentially.py", line 185, in
current_job_id = new_job_ids[0]
IndexError: list index out of range
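From the log, there seem to be two separate problems: the `Bad substitution` at line 7 of main_pegasus.sh (typically a bash-only construct being run with plain sh/dash), and the `KeyError` for `ConfigKey.DWI_MULTI_SHELL`, which suggests `dwi.multi.shell` is not defined in patient_flow.properties. Because the DAX generation fails, the .dax file is never written, so everything downstream (pegasus-graphviz, the planner, run_sequentially.py) fails as a consequence. Below is a minimal diagnostic sketch of what I would check inside the container; the script location and the boolean value for `dwi.multi.shell` are assumptions on my part, not confirmed settings.

```sh
# Hedged diagnostic sketch; paths and the property value are assumptions.

# 1) "Bad substitution" usually means a bash-only construct was executed by a
#    POSIX sh (dash). Check what /bin/sh points to and how the script starts.
ls -l /bin/sh
head -n 1 /opt/tvb-recon/pegasus/main_pegasus.sh   # assumed location of the script

# 2) The KeyError shows ConfigKey.DWI_MULTI_SHELL ('dwi.multi.shell') is missing
#    from the parsed configuration. Check whether the patient properties file
#    defines it.
grep -n 'dwi' /home/submitter/data/TVB1/configs/patient_flow.properties

# If it is missing, adding a line like the following might help
# ('False' is an assumed value for a single-shell DWI acquisition):
# echo 'dwi.multi.shell=False' >> /home/submitter/data/TVB1/configs/patient_flow.properties
```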