MRG, BUG: Fix combine evokeds #7869

Merged: 11 commits from fix-combine-evokeds into mne-tools:master on Jun 7, 2020
Conversation

@drammock (Member) commented Jun 5, 2020

closes #7865
closes #7874

Still WIP because I need to check the examples/tutorials to make sure they're all consistent with the docstring. It also still needs a what's new entry.

@drammock marked this pull request as ready for review June 6, 2020 00:05
@drammock force-pushed the fix-combine-evokeds branch from 15827ea to 220837f on June 6, 2020 00:07
@drammock changed the title from "Fix combine evokeds" to "MRG, BUG: Fix combine evokeds" on Jun 6, 2020
-# We can use basic arithmetic to, for example, construct and plot
-# difference ERPs.
+# We can use negative weights in `mne.combine_evoked` to construct difference
+# ERPs.
Member
Suggested change:
-# ERPs.
+# of ERPs.

Member

I would say that "difference ERPs" is a valid term on its own, so no need to say "difference of ERPs" :)

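For context, a minimal sketch of the pattern the new tutorial text describes; the toy EvokedArray objects, channel names, and random data below are placeholders, not taken from the tutorial:

import numpy as np
import mne

# Two toy Evoked objects standing in for condition averages
# (in practice these would come from something like epochs['cond'].average()).
info = mne.create_info(['EEG 001', 'EEG 002'], sfreq=100., ch_types='eeg')
rng = np.random.default_rng(0)
evoked_a = mne.EvokedArray(rng.standard_normal((2, 50)), info, tmin=-0.1, nave=30)
evoked_b = mne.EvokedArray(rng.standard_normal((2, 50)), info, tmin=-0.1, nave=25)

# Difference ERP via negative weights (A minus B), rather than plain
# `evoked_a - evoked_b` arithmetic.
diff_erp = mne.combine_evoked([evoked_a, evoked_b], weights=[1, -1])
print(diff_erp.nave)  # effective nave of the weighted combination
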
mne/evoked.py Outdated
 else:
-    weights = np.array(weights, float)
+    weights = np.squeeze(weights).astype(float)
Member

Suggested change:
-    weights = np.squeeze(weights).astype(float)
+    weights = np.array(weights).astype(float)

I hate squeeze magic

Member

I don't see why we would need to use squeeze here anyway…?

Member Author

It's not really necessary. My thinking was that we are testing that ndim == 1, and I don't think the docstring explicitly says that weights must be a 1D array, so I thought using squeeze was being generous to the user. I don't mind changing it back though

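To illustrate the trade-off being discussed, a rough sketch follows; _check_weights is a hypothetical helper name, not the actual code in mne/evoked.py:

import numpy as np

def _check_weights(weights, n_evoked):
    """Hypothetical helper: coerce weights to a 1D float array, one per Evoked."""
    weights = np.array(weights, float)
    # With an extra np.squeeze(weights) here, a (1, n) or (n, 1) input would
    # also pass the check below; without it, only genuinely 1D input is accepted.
    if weights.ndim != 1 or weights.size != n_evoked:
        raise ValueError('weights must be a 1D array with one weight per Evoked')
    return weights
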
@agramfort (Member) commented Jun 6, 2020 via email

@drammock (Member, Author) commented Jun 6, 2020

I've examined all of the examples/tutorials that use combine_evoked, and it appears that all the figures match exactly between master and this PR (other than the plot_eeg_erp tutorial, which I've just corrected). So it should be ready to merge once CIs are happy.

@agramfort merged commit 4bfeb85 into mne-tools:master on Jun 7, 2020
@agramfort (Member)

Thx @drammock

@agramfort (Member)

@drammock shall we backport this? At least the fix in the tutorial?

@drammock (Member, Author) commented Jun 8, 2020

I've been wondering about backporting. The behavior that @SophieHerbst reported was a bug, which is a good reason to backport. On the other hand, this PR also changes the behavior and guidance around how to do evoked subtraction. I'd rather introduce a change like that with a regular release, when people are more likely to read the changelog. I think the latter consideration is stronger for me. @larsoner what is your opinion?

agramfort pushed a commit that referenced this pull request Jun 8, 2020
* simplify/clarify tests

* fix weights computation and docstring

* simplify grandaverage

* better warning

* remove redundant test; clarify existing test

* clarify tutorial

* fix subtractions

* clarify tutorial

* update what's new

* revert squeeze

* some tutorial fixes

@hoechenberger (Member) commented Jun 8, 2020

@drammock
I ran into something in our docs that is contradictory and therefore confusing:

> Keeping track of nave is important for inverse imaging, because it is used to scale the noise covariance estimate (which in turn affects the magnitude of estimated source activity). See The minimum-norm current estimates for more information (especially the Whitening and scaling section). For this reason, combining Evoked objects with either weights='equal' or by providing custom numeric weights should usually not be done if you intend to perform inverse imaging on the resulting Evoked object.

(from https://mne.tools/dev/auto_tutorials/evoked/plot_10_evoked_overview.html#combining-evoked-objects, highlight by me)

vs

contrast = combine_evoked(evoked, weights=[-1, 1])  # Faces - scrambled
...
inverse_operator = make_inverse_operator(contrast.info, forward, noise_cov,
                                         loose=0.2, depth=0.8)

# Compute inverse solution on contrast
stc = apply_inverse(contrast, inverse_operator, lambda2, method, pick_ori=None)

(from https://mne.tools/dev/auto_examples/datasets/spm_faces_dataset.html#from-raw-data-to-dspm-on-spm-faces-dataset)

As a user, what shall I do? 😭😭😭

@larsoner (Member) commented Jun 8, 2020

> I think the latter consideration is stronger for me. @larsoner what is your opinion?

I'm okay with not backporting

> Keeping track of nave is important for inverse imaging, because it is used to scale the noise covariance estimate (which in turn affects the magnitude of estimated source activity). See The minimum-norm current estimates for more information (especially the Whitening and scaling section). For this reason, combining Evoked objects with either weights='equal' or by providing custom numeric weights should usually not be done if you intend to perform inverse imaging on the resulting Evoked object.

I'm pretty sure nave only has an effect for dSPM and sLORETA, and our weights math should be correct so actually we should still get this scale factor correct. So maybe we should get rid of this, unless @agramfort sees places it might be problematic that I'm missing.

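For reference, the scale-factor argument can be sketched with the usual propagation-of-variance reasoning for a weighted sum of averages (a sketch of the idea, not code taken from the PR):

import numpy as np

def effective_nave(weights, naves):
    """Effective nave of sum_i w_i * evoked_i, assuming equal single-trial
    noise variance: the noise variance scales with sum(w_i**2 / nave_i),
    so the effective nave is its inverse."""
    weights = np.asarray(weights, float)
    naves = np.asarray(naves, float)
    return 1.0 / np.sum(weights ** 2 / naves)

# e.g. a [1, -1] contrast of evokeds with nave=30 and nave=25
print(effective_nave([1, -1], [30, 25]))  # ~13.6
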
@agramfort (Member) commented Jun 8, 2020 via email

@drammock (Member, Author) commented Jun 8, 2020

Actually, I guess you're right... it's only grand_average where that warning is relevant (it sets nave=len(evokeds)). I'll change the tutorial text today.

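A quick illustration of that distinction with toy data (the channel setup is scaffolding; the nave values are the point, and the combine_evoked result follows the effective-nave formula sketched earlier):

import numpy as np
import mne

info = mne.create_info(['EEG 001'], sfreq=100., ch_types='eeg')
evokeds = [mne.EvokedArray(np.random.randn(1, 50), info, nave=n)
           for n in (20, 40, 60)]

ga = mne.grand_average(evokeds)                      # nave == len(evokeds) == 3
comb = mne.combine_evoked(evokeds, weights='nave')   # effective nave sums to 20 + 40 + 60
print(ga.nave, comb.nave)
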
@drammock (Member, Author) commented Jun 8, 2020

crossref #7881

@drammock deleted the fix-combine-evokeds branch on June 8, 2020 20:42
larsoner added a commit to larsoner/mne-python that referenced this pull request Jun 17, 2020
* upstream/master: (24 commits)
  WIP: Fix Travis (mne-tools#7906)
  WIP: Prototype of notebook viz (screencast) (mne-tools#7758)
  MRG, FIX: Speed up I/O tests, mark some slow (mne-tools#7904)
  Proper attribution for Blender tutorial (mne-tools#7900)
  MAINT: Check usage [ci skip] (mne-tools#7902)
  Allow find_bad_channels_maxwell() to return scores (mne-tools#7845)
  Warn if NIRx directory structure has been modified from original format (mne-tools#7898)
  Pin pvyista to 0.24.3 (mne-tools#7899)
  MRG: Add support for reading and writing sufaces to .obj (mne-tools#7824)
  Fix _auto_topomap_coords docstring. (mne-tools#7895)
  MRG, FIX: Ensure Info H5-writeable (mne-tools#7887)
  Website contents (mne-tools#7889)
  MRG, ENH: Add mri_resolution="sparse" (mne-tools#7888)
  MRG, ENH: Allow disabling FXAA (mne-tools#7877)
  remove "and and" [ci skip] (mne-tools#7882)
  fix evoked nave → inverse guidance (mne-tools#7881)
  ENH: Better error messages (mne-tools#7879)
  FIX : EDF+ Annotation Timestamps missing sub-second accuracy (mne-tools#7875)
  FIX: Fix get_channel_types (mne-tools#7878)
  MRG, BUG: Fix combine evokeds (mne-tools#7869)
  ...