Replies: 3 comments 1 reply
-
Hi, there is no --read_datasink option for analysis. You can try rerunning confound correction with the --exclusion_ids option to exclude the bad scan; the other subjects should not have to be run through confound correction again.
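A minimal sketch of what that rerun could look like (the stage name and positional arguments here are assumptions based on the general RABIES CLI layout, so verify with `rabies confound_correction --help`; the output directories and scan path are placeholders):

```sh
# Rerun only the confound correction stage, passing the full path of the
# bad scan to --exclusion_ids so it is dropped from the batch. Outputs
# already computed for the other subjects should be reused.
rabies confound_correction <preprocess_outputs_dir> <confound_correction_outputs_dir> \
  --exclusion_ids /full/path/to/bad_scan.nii.gz
```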
-
Thanks. I tried using the option with the Singularity container, RABIES 0.5.1. When the documentation says to add the full path to the txt or nii files, does that mean adding the path after --exclusion_ids, or binding it with the -B option? I tried just about every combination and still receive the error message below.
-
Okay, thank you for your suggestion. That was the problem. I added the path with the -B option:

-B ${rabies_dir}/${input}/sub-41/ses-02/func/:${rabies_dir}/${input}/sub-41/ses-02/func/ \

After including this line it now works. Thanks. -g
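For anyone hitting the same error: the path given to --exclusion_ids must also be visible inside the container, which is what the -B bind provides. A sketch of the combined Singularity call, with the container image name, positional output directories, and scan filename as illustrative placeholders:

```sh
# Bind the scan's directory into the container at the same path, then
# exclude that scan from confound correction by its full (now bound) path.
singularity run \
  -B ${rabies_dir}/${input}/sub-41/ses-02/func/:${rabies_dir}/${input}/sub-41/ses-02/func/ \
  rabies-0.5.1.sif confound_correction <preprocess_outputs_dir> <cc_outputs_dir> \
  --exclusion_ids ${rabies_dir}/${input}/sub-41/ses-02/func/<bad_scan>.nii.gz
```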
-
Ran into an issue during confound correction due to one bad dataset that somehow got through preprocessing; it looks like something went wrong with its warping to commonspace.
Confound correction continued to run through the remaining datasets, but at the end of the batch it raised an error and did not save rabies_confound_correction_workflow.pkl because of that one bad dataset.
Is there a way to run Analysis on this data, or is there a --read_datasink option for Analysis to use? I'm trying to avoid re-running confound correction for over 60 subjects.
-g