
Animate ECoG activity on top of the FreeSurfer pial surface of the brain #7787

Closed · adam2392 opened this issue May 14, 2020 · 4 comments · Fixed by #8190

Comments

@adam2392 (Member)

Describe the problem

In #7768, animation is being added to the plot_ecog example. @larsoner suggested including the animation in the example. Currently it's put together using matplotlib.animation, but MNE-Python already has other animation functions (e.g., for topomaps), so I think it would be nice to have a "3D brain animate" function living in mne-python.
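For illustration, here is a minimal sketch of the matplotlib.animation approach with synthetic data (the electrode positions and activity below are made up):

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, (64, 2))        # fake 2D electrode positions
data = rng.standard_normal((64, 100))    # fake (n_channels, n_times) activity

fig, ax = plt.subplots()
scat = ax.scatter(pos[:, 0], pos[:, 1], c=data[:, 0],
                  cmap='RdBu_r', vmin=-3, vmax=3)

def update(t):
    scat.set_array(data[:, t])  # recolor the electrodes for time point t
    return (scat,)

anim = FuncAnimation(fig, update, frames=data.shape[1], interval=50, blit=True)
plt.show()
```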

Describe your solution

A function similar to the topomap animation function (https://mne.tools/dev/generated/mne.Evoked.html#mne.Evoked.animate_topomap) should be possible.
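For reference, the existing topomap animation is used like this (assuming the MNE sample dataset is available locally):

```python
import mne

# Load an evoked response from the sample dataset (assumed to be downloaded).
path = mne.datasets.sample.data_path()
evoked = mne.read_evokeds(
    f'{path}/MEG/sample/sample_audvis-ave.fif', condition='Left Auditory'
)
# animate_topomap returns the figure and the underlying matplotlib animation.
fig, anim = evoked.animate_topomap(ch_type='eeg', frame_rate=10)
```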

I think a nice feature would be a four-view panel (sagittal, coronal, axial, and "custom"), where "custom" looks at the activity from an optimally selected view that shows as many channels as possible on the surface of the brain. The other three views are traditional.
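One hypothetical way to pick that "custom" view: point the camera along the mean outward direction of the electrode positions. A rough sketch, assuming head-centered positions in meters (the helper name and heuristic are assumptions, not an existing API):

```python
import numpy as np

def optimal_view(ch_pos):
    """Hypothetical helper: pick a camera azimuth/elevation such that
    the mean outward direction of the electrodes faces the viewer."""
    direction = ch_pos.mean(axis=0)
    direction /= np.linalg.norm(direction)
    azimuth = np.degrees(np.arctan2(direction[1], direction[0]))
    elevation = np.degrees(np.arccos(direction[2]))  # angle down from +z
    return azimuth, elevation
```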

Describe possible alternatives

@larsoner suggested this might go into the 3D brain that @GuillaumeFavelier is developing? I'm not familiar with this, but happy to help where possible.

If the 3D brain is really nice... I would be interested in discussing how visualization of SEEG activity might look.

Additional context

See: #7768 (comment)

@larsoner (Member)

We recently added add_volume to _Brain in #8064. That PR is big, but that's because of the complications involved in volumetric rendering.

We could add an add_sensor function that takes sensor locations from a supplied info, plus data containing time courses, and colors the sensors according to data. For EEG these would be discs projected onto the scalp; for ECoG, discs projected onto the brain surface; for MEG, the MEG sensors at their locations determined by dev_head_t (all of these probably require info + trans input). Actually, I don't think this would be too difficult, because it's mostly code reuse from plot_alignment.
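For context, the sensor-drawing code in question can already be exercised via plot_alignment, roughly like this (info, trans, and subjects_dir are assumed to come from the user's ECoG dataset):

```python
import mne

# `info` holds the ECoG channel locations, `trans` the head<->MRI transform,
# and `subjects_dir` the FreeSurfer subjects directory (all assumed given).
fig = mne.viz.plot_alignment(
    info=info, trans=trans, subject='sample', subjects_dir=subjects_dir,
    surfaces=['pial'], ecog=True, coord_frame='mri',
)
mne.viz.set_3d_view(fig, azimuth=180, elevation=70)
```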

I can take a stab at implementing it if it seems useful.

@hoechenberger (Member)

@larsoner Sounds like a great idea!!

@larsoner (Member) · Sep 1, 2020

Another option would be to come up with a function that takes an Evoked instance, subject, and trans, and then projects activation directly onto the pial surface to yield an STC. Then you can just do stc.plot(smoothing_steps='nearest') and it should at least look okay. An API like:

```python
stc = stc_from_sensors(evoked, trans, subject, surface='pial',
                       subjects_dir=None, distance=0.01, project=True)
```

where distance makes any vertex on the given surface within that many meters of a given electrode be colored according to that electrode's value. If project=True (the default), it will project the sensor locations onto the given surface first (which seems reasonable). This should actually be pretty easy to code, and easier than modifying brain. I can try it to see if it looks reasonable.
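A rough sketch of the vertex-coloring step described above (the function name and array shapes are assumptions, not an existing MNE API):

```python
import numpy as np
from scipy.spatial.distance import cdist

def color_vertices(sensor_pos, sensor_data, vert_pos, distance=0.01):
    """Assign each surface vertex the value of its nearest sensor if that
    sensor is within `distance` meters; other vertices stay at zero."""
    dists = cdist(vert_pos, sensor_pos)       # (n_vertices, n_sensors)
    nearest = dists.argmin(axis=1)            # closest sensor per vertex
    within = dists.min(axis=1) <= distance
    vert_data = np.zeros((len(vert_pos),) + sensor_data.shape[1:])
    vert_data[within] = sensor_data[nearest[within]]
    return vert_data
```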

WDYT @adam2392 ?

@larsoner (Member) · Sep 1, 2020

And we could add an interp='linear' | 'constant' parameter or so, where the linear mode scales by 1 when the distance is 0 and by 0 when the distance equals distance, so you end up with something like an "activation cone" in the amplitudes around the given location. Again, pretty easy to code, and it would give some flexibility.
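In code, the weighting would be something like this (a hypothetical sketch of the behavior described above):

```python
import numpy as np

def interp_weight(d, distance, interp='linear'):
    """Weight for a vertex at distance `d` (meters) from a sensor."""
    if interp == 'constant':
        return np.where(d <= distance, 1.0, 0.0)
    # 'linear': 1 at d=0, falling to 0 at d=distance (an "activation cone")
    return np.clip(1.0 - d / distance, 0.0, 1.0)
```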
