
2 bugs in SpecifySPMModel with concatenate_runs = True? #2844

Open
hstojic opened this issue Jan 10, 2019 · 2 comments
hstojic commented Jan 10, 2019

Summary

I get an error when specifying an SPM level-1 model with multiple runs and concatenate_runs set to True. I made sure that I have the same regressors in each run. The culprit seems to be the nipype code that adds run-specific dummy variables (code): regressor_names is not updated with these added variables, and an error is then raised at this line because, of course, it cannot find the regressor name.

Here is the relevant part of the error output from stdout:

[Node] Error on "stimuli_LSA_test_pipeline.model" (/media/hstojic/dataneuro/fnclearning_fmri/dProcessed/nipype_work/stimuli_LSA_test_pipeline/_subject_s057/model)
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-9-a5adcf3b88b5> in <module>()
      2 pipeline.run(
      3     'MultiProc',
----> 4     plugin_args = {'n_procs': pars_gen['resources']['n_cores']}
      5 )

/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/pipeline/engine/workflows.pyc in run(self, plugin, plugin_args, updatehash)
    593         if str2bool(self.config['execution']['create_report']):
    594             self._write_report_info(self.base_dir, self.name, execgraph)
--> 595         runner.run(execgraph, updatehash=updatehash, config=self.config)
    596         datestr = datetime.utcnow().strftime('%Y%m%dT%H%M%S')
    597         if str2bool(self.config['execution']['write_provenance']):

/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/pipeline/plugins/base.pyc in run(self, graph, config, updatehash)
    160                         if result['traceback']:
    161                             notrun.append(
--> 162                                 self._clean_queue(jobid, graph, result=result))
    163                         else:
    164                             self._task_finished_cb(jobid)

/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/pipeline/plugins/base.pyc in _clean_queue(self, jobid, graph, result)
    222 
    223         if str2bool(self._config['execution']['stop_on_first_crash']):
--> 224             raise RuntimeError("".join(result['traceback']))
    225         crashfile = self._report_crash(self.procs[jobid], result=result)
    226         if jobid in self.mapnodesubids:

RuntimeError: Traceback (most recent call last):
  File "/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 69, in run_node
    result['result'] = node.run(updatehash=updatehash)
  File "/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 471, in run
    result = self._run_interface(execute=True)
  File "/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 555, in _run_interface
    return self._run_command(execute)
  File "/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 635, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/interfaces/base/core.py", line 521, in run
    runtime = self._run_interface(runtime)
  File "/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/algorithms/modelgen.py", line 453, in _run_interface
    self._generate_design()
  File "/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/algorithms/modelgen.py", line 628, in _generate_design
    outliers=outliers)
  File "/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype/algorithms/modelgen.py", line 372, in _generate_standard_design
    info.regressor_names[j]
IndexError: list index out of range

If my interpretation is correct, I can send a pull request; I think it's an easy fix requiring a single line:

infoout.regressor_names.extend(['run' + str(i + 1)])

added at this line. I have already tested it, and it seems to work.
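
A minimal, self-contained sketch of the mismatch and of the proposed one-line fix (this is not the actual nipype source; infoout and n_scans_per_run are placeholder names):

from nipype.interfaces.base import Bunch

n_scans_per_run = [100, 100]            # two runs, 100 volumes each
total_scans = sum(n_scans_per_run)

infoout = Bunch(regressors=[], regressor_names=[])

for i, nscans in enumerate(n_scans_per_run):
    # run-specific dummy column: 1 during run i, 0 elsewhere
    start = sum(n_scans_per_run[:i])
    dummy = [0.0] * total_scans
    dummy[start:start + nscans] = [1.0] * nscans
    infoout.regressors.append(dummy)
    # proposed fix: keep regressor_names in sync with regressors
    infoout.regressor_names.extend(['run' + str(i + 1)])

# without the extend() above, later code that indexes regressor_names by
# position raises IndexError: list index out of range
assert len(infoout.regressor_names) == len(infoout.regressors)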

Platform details:

{'commit_hash': '8690d55d9',
 'commit_source': 'installation',
 'networkx_version': '2.1',
 'nibabel_version': '2.3.0',
 'nipype_version': '1.1.2',
 'numpy_version': '1.15.4',
 'pkg_path': '/home/hstojic/.pyenv/nipy/local/lib/python2.7/site-packages/nipype',
 'scipy_version': '1.1.0',
 'sys_executable': '/home/hstojic/.pyenv/nipy/bin/python',
 'sys_platform': 'linux2',
 'sys_version': '2.7.12 (default, Nov 20 2017, 18:23:56) \n[GCC 5.4.0 20160609]',
 'traits_version': '4.6.0'}

Execution environment

  • My python environment outside container

satra commented Jan 21, 2019

@hstojic - sorry, I have not been in the loop on this. Could you first create an example use case that breaks, here in the tests:

https://github.com/nipy/nipype/blob/master/nipype/algorithms/tests/test_modelgen.py#L100

I'm trying to understand which piece of the code is not doing the correct thing. I'm sure your fix will work, but it would be good to generate the breaking test case first, since we have used this form of concatenating runs in our examples for a long time, and I would like to know which case we did not cover, or what we broke and stopped testing properly.

$ grep -rn "concatenate" *.py
fmri_freesurfer_smooth.py:190:modelspec.inputs.concatenate_runs = True
fmri_nipy_glm.py:186:modelspec.inputs.concatenate_runs = True
fmri_spm_dartel.py:132:modelspec.inputs.concatenate_runs = True
fmri_spm_nested.py:152:modelspec.inputs.concatenate_runs = True
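
For reference, a minimal failing case along those lines might look roughly like the sketch below (modeled on the existing tests in test_modelgen.py; the test name, image shape, onsets, and regressor values are placeholders, not the actual test):

import numpy as np
import nibabel as nb
from copy import deepcopy
from nipype.interfaces.base import Bunch
from nipype.algorithms import modelgen

def test_modelgen_spm_concat_regressors(tmpdir):
    # two small functional runs written to disk
    filename1 = tmpdir.join('test1.nii').strpath
    filename2 = tmpdir.join('test2.nii').strpath
    shape = (2, 2, 2, 50)
    nb.Nifti1Image(np.random.rand(*shape), np.eye(4)).to_filename(filename1)
    nb.Nifti1Image(np.random.rand(*shape), np.eye(4)).to_filename(filename2)

    # the same condition and the same named regressor in each run,
    # as described in the issue
    info = [
        Bunch(conditions=['cond1'], onsets=[[2, 50]], durations=[[1]],
              regressors=[np.random.rand(shape[3]).tolist()],
              regressor_names=['motion_x']),
        Bunch(conditions=['cond1'], onsets=[[10, 20]], durations=[[1]],
              regressors=[np.random.rand(shape[3]).tolist()],
              regressor_names=['motion_x']),
    ]

    s = modelgen.SpecifySPMModel()
    s.inputs.input_units = 'secs'
    s.inputs.output_units = 'scans'
    s.inputs.time_repetition = 6
    s.inputs.high_pass_filter_cutoff = 128.
    s.inputs.functional_runs = [filename1, filename2]
    s.inputs.subject_info = deepcopy(info)
    s.inputs.concatenate_runs = True

    # with the bug present this raises
    # IndexError: list index out of range in _generate_standard_design
    res = s.run()
    assert res.outputs.session_info is not None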

@effigies effigies modified the milestones: 1.1.8, future, 1.1.9 Jan 22, 2019
@effigies effigies modified the milestones: 1.1.9, 1.2.0 Feb 25, 2019
@effigies effigies modified the milestones: 1.2.0, 1.2.1 May 8, 2019
@effigies effigies modified the milestones: 1.2.1, 1.2.2 Aug 16, 2019
@oesteban oesteban modified the milestones: 1.2.2, 1.3.0, 1.2.3 Sep 5, 2019
@effigies
Member

Hi @hstojic, is there any news on this one?

@effigies effigies modified the milestones: 1.2.3, future Sep 18, 2019
@effigies effigies added the bug label Sep 18, 2019