MAINT: Harmonize dependencies (#433)
larsoner authored Jan 28, 2022
1 parent 86d0050 commit f9aa7bd
Showing 36 changed files with 415 additions and 237 deletions.
34 changes: 19 additions & 15 deletions .circleci/config.yml
@@ -10,8 +10,10 @@ _xvfb: &xvfb
jobs:
build_docs:
machine:
image: ubuntu-2004:202111-01
docker:
# Use 18.04 rather than 20.04 because MESA 20.0.8 on 18.04 has working
# transparency but 21.0.3 on 20.04 does not!
- image: cimg/base:stable-18.04
steps:
- checkout
- run:
@@ -32,31 +34,33 @@ jobs:
name: Set BASH_ENV
command: |
set -e
python3 -m venv ~/python_env --upgrade-deps
sudo apt update -qq
sudo apt install -qq libosmesa6 libglx-mesa0 libopengl0 libglx0 libdbus-1-3 \
libxkbcommon-x11-0 libxcb-icccm4 libxcb-image0 libxcb-keysyms1 libxcb-randr0 \
libxcb-render-util0 libxcb-shape0 libxcb-xfixes0 libxcb-xinerama0 \
graphviz optipng \
python3.8-venv python3-venv \
xvfb libxft2 ffmpeg
python3.8 -m venv ~/python_env
echo "set -e" >> $BASH_ENV
echo "export DISPLAY=:99" >> $BASH_ENV
echo "export OPENBLAS_NUM_THREADS=4" >> $BASH_ENV
echo "export XDG_RUNTIME_DIR=/tmp/runtime-circleci" >> $BASH_ENV
source tools/get_minimal_commands.sh
echo "export MNE_FULL_DATE=true" >> $BASH_ENV
echo "export MNE_3D_OPTION_ANTIALIAS=false" >> $BASH_ENV
echo "export MNE_3D_BACKEND=pyvista" >> $BASH_ENV
source tools/get_minimal_commands.sh
echo "export MNE_3D_BACKEND=pyvistaqt" >> $BASH_ENV
echo "export PATH=~/.local/bin/:$PATH" >> $BASH_ENV
echo "source ~/python_env/bin/activate" >> $BASH_ENV
mkdir -p ~/.local/bin
ln -s ~/python_env/bin/python ~/.local/bin/python
echo "BASH_ENV:"
cat $BASH_ENV
mkdir -p ~/mne_data
touch pattern.txt;
touch pattern.txt
- run:
name: Install 3D rendering libraries \ PyQt5 dependencies \ graphviz \ optipng (for optimized images)
name: check neuromag2ft
command: |
sudo apt update
sudo apt install libosmesa6 libglx-mesa0 libopengl0 libglx0 libdbus-1-3 \
libxkbcommon-x11-0 libxcb-icccm4 libxcb-image0 libxcb-keysyms1 libxcb-randr0 libxcb-render-util0 libxcb-shape0 libxcb-xfixes0 libxcb-xinerama0 \
graphviz \
optipng
neuromag2ft --version
- run:
<<: *xvfb
@@ -83,7 +87,7 @@ jobs:
- run:
name: Get Python running
command: |
python -m pip install --upgrade --progress-bar off pip setuptools
python -m pip install --upgrade pip setuptools wheel
python -m pip install --upgrade --progress-bar off --pre sphinx
python -m pip install --upgrade --progress-bar off -r requirements.txt -r requirements_testing.txt -r requirements_doc.txt
python -m pip install -e .
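The "Set BASH_ENV" step above works because CircleCI sources `$BASH_ENV` at the start of every subsequent step, so exports appended there persist across steps. A minimal stand-alone sketch of that pattern (CircleCI normally provides `BASH_ENV` itself; here it is faked with `mktemp`):

```shell
# Sketch of the $BASH_ENV pattern used in the step above. CircleCI sets
# BASH_ENV and sources it before each step; we fake that locally with mktemp.
BASH_ENV="$(mktemp)"

# Append exports, exactly as the config does with `echo ... >> $BASH_ENV`.
echo "export DISPLAY=:99" >> "$BASH_ENV"
echo "export OPENBLAS_NUM_THREADS=4" >> "$BASH_ENV"

# Each later step effectively begins by sourcing the accumulated file.
source "$BASH_ENV"
echo "DISPLAY=$DISPLAY"
```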
2 changes: 1 addition & 1 deletion .github/workflows/linux_pip.yml
@@ -15,7 +15,7 @@ concurrency:
jobs:
# PIP + non-default stim channel + log level info
job:
name: 'py3.8'
name: 'py3.10'
runs-on: ubuntu-20.04
if: "!contains(github.event.head_commit.message, '[skip tests]')"
defaults:
19 changes: 11 additions & 8 deletions environment.yml
@@ -7,38 +7,41 @@ dependencies:
- numpy
- scipy
- matplotlib
- tqdm
- pooch>=1.5
- decorator
- h5io
- packaging
- numba
- pandas==1.3.2
- pandas
- xlrd
- scikit-learn
- h5py
- h5io
- jinja2
- pillow
- statsmodels
- jupyter
- joblib
- psutil
- numexpr
- imageio
- tqdm
- spyder-kernels>=1.10.0
- imageio-ffmpeg>=0.4.1
- vtk>=9.0.1
- pyvista>=0.30
- traitlets
- pyvista>=0.32
- pyvistaqt>=0.4
- qdarkstyle
- darkdetect
- dipy
- nibabel
- nilearn
- python-picard
- pyqt!=5.15.3
- mffpy>=0.5.7
- ipywidgets
- lxml
- pytables
- pooch
- nilearn
- ipyvtklink
- mne-qt-browser
- pymatreader
- pip:
- https://github.com/mne-tools/mne-python/archive/main.zip
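The pins above use conda's `name>=version` (or `name!=version`) match-spec syntax. A small stdlib-only sketch of how such pins split into name, operator, and version — the entries are copied inline from the file rather than read from disk, and the parser is illustrative, not conda's real spec parser:

```python
import re

# A few entries copied from environment.yml above.
snippet = """
  - pooch>=1.5
  - pandas
  - pyvista>=0.32
  - pyvistaqt>=0.4
  - pyqt!=5.15.3
"""

# Split each "name<op>version" dependency into its parts; bare names
# like "pandas" carry no pin and are skipped.
pin = re.compile(r"-\s*([A-Za-z0-9_.-]+?)\s*(>=|==|!=)\s*(\S+)")
pins = {m.group(1): (m.group(2), m.group(3)) for m in pin.finditer(snippet)}
print(pins)
```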
44 changes: 24 additions & 20 deletions examples/general/plot_11_hrf_measured.py
@@ -7,10 +7,12 @@
In this example we analyse data from a real multichannel
functional near-infrared spectroscopy (fNIRS)
experiment (see :ref:`tut-fnirs-hrf-sim` for a simplified simulated
analysis). The experiment consists of three conditions
1) tapping with the left hand,
2) tapping with the right hand,
3) a control condition where the participant does nothing.
analysis). The experiment consists of three conditions:
1. tapping with the left hand,
2. tapping with the right hand,
3. a control condition where the participant does nothing.
We use a GLM analysis to examine the neural activity associated with
the different tapping conditions.
An alternative epoching style analysis on the same data can be
@@ -27,12 +29,12 @@
:local:
:depth: 2
.. note:: Parts of this tutorial require the latest development version of MNE-Python. See these instructions for
.. note:: Parts of this tutorial require the latest development version of MNE-Python. See these instructions for
`how to upgrade <https://mne.tools/dev/install/updating.html>`__.
But basically boils down to running
``pip install -U --no-deps https://github.com/mne-tools/mne-python/archive/main.zip``.
Sections of the code that require this version will be noted below.
"""
# sphinx_gallery_thumbnail_number = 9

@@ -98,10 +100,12 @@
# Next we update the annotations by assigning names to each trigger ID.
# Then we crop the recording to the section containing our
# experimental conditions.

#
# Because of limitations with ``nilearn``, we use ``'_'`` to separate conditions
# rather than the standard ``'/'``.
raw_intensity.annotations.rename({'1.0': 'Control',
'2.0': 'Tapping/Left',
'3.0': 'Tapping/Right'})
'2.0': 'Tapping_Left',
'3.0': 'Tapping_Right'})
raw_intensity.annotations.delete(raw_intensity.annotations.description == '15.0')
raw_intensity.annotations.set_durations(5)

@@ -166,7 +170,7 @@
# .. sidebar:: Relevant literature
#
# For further discussion on design matrices see
# the Nilearn examples. Specifically the
# the Nilearn examples. Specifically the
# `first level model example <http://nilearn.github.io/auto_examples/04_glm_first_level/plot_first_level_details.html>`_.
#
# Next we create a model to fit our data to.
@@ -221,7 +225,7 @@
# Examine expected response
# -------------------------
#
# The matrices above can be a bit abstract as they encompass multiple
# The matrices above can be a bit abstract as they encompass multiple
# conditions and regressors.
# Instead we can examine a single condition.
# Here we observe the boxcar function for a single condition,
@@ -239,7 +243,7 @@

s = mne_nirs.experimental_design.create_boxcar(raw_intensity, stim_dur=5.0)
plt.plot(raw_intensity.times, s[:, 1])
plt.plot(design_matrix['Tapping/Left'])
plt.plot(design_matrix['Tapping_Left'])
plt.xlim(180, 300)
plt.legend(["Stimulus", "Expected Response"])
plt.xlabel("Time (s)")
@@ -336,7 +340,7 @@
# negative of HbO as expected.

glm_est = run_glm(raw_haemo, design_matrix)
glm_est.plot_topo(conditions=['Tapping/Left', 'Tapping/Right'])
glm_est.plot_topo(conditions=['Tapping_Left', 'Tapping_Right'])


# %%
@@ -364,7 +368,7 @@
fig, axes = plt.subplots(nrows=1, ncols=2, figsize=(10, 6), gridspec_kw=dict(width_ratios=[0.92, 1]))

glm_hbo = glm_est.copy().pick(picks="hbo")
conditions = ['Tapping/Right']
conditions = ['Tapping_Right']

glm_hbo.plot_topo(axes=axes[0], colorbar=False, conditions=conditions)

@@ -380,7 +384,7 @@
# Another way to view the data is to project the GLM estimates to the nearest
# cortical surface

glm_est.copy().surface_projection(condition="Tapping/Right", view="dorsal", chroma="hbo")
glm_est.copy().surface_projection(condition="Tapping_Right", view="dorsal", chroma="hbo")


# %%
@@ -412,7 +416,7 @@
groups = dict(Left_ROI=picks_pair_to_idx(raw_haemo, left),
Right_ROI=picks_pair_to_idx(raw_haemo, right))

conditions = ['Control', 'Tapping/Left', 'Tapping/Right']
conditions = ['Control', 'Tapping_Left', 'Tapping_Right']

df = glm_est.to_dataframe_region_of_interest(groups, conditions)

@@ -439,7 +443,7 @@
contrast_matrix = np.eye(design_matrix.shape[1])
basic_conts = dict([(column, contrast_matrix[i])
for i, column in enumerate(design_matrix.columns)])
contrast_LvR = basic_conts['Tapping/Left'] - basic_conts['Tapping/Right']
contrast_LvR = basic_conts['Tapping_Left'] - basic_conts['Tapping_Right']

contrast = glm_est.compute_contrast(contrast_LvR)
contrast.plot_topo()
@@ -474,8 +478,8 @@
# the tapping, but we do expect 5% or less for the false positive rate.

(df
.query('Condition in ["Control", "Tapping/Left", "Tapping/Right"]')
.groupby(['Condition', 'Chroma'])
.query('Condition in ["Control", "Tapping_Left", "Tapping_Right"]')
.drop(['df', 'mse', 'p_value', 't'], axis=1)
.groupby(['Condition', 'Chroma', 'ch_name'])
.agg(['mean'])
.drop(['df', 'mse', 'p_value', 't'], 1)
)
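Besides the condition renames, the final summary chain above replaces the positional `axis` argument (`.drop(cols, 1)`, deprecated in pandas 1.x and removed in 2.0) with the explicit `axis=1` keyword, and moves the drop ahead of the groupby. A toy sketch of the same chain on made-up data (the column values here are illustrative stand-ins, not real GLM output):

```python
import pandas as pd

# Made-up stand-in for the GLM results table (values are illustrative).
df = pd.DataFrame({
    "Condition": ["Control", "Tapping_Left", "Tapping_Left"],
    "Chroma": ["hbo", "hbo", "hbo"],
    "ch_name": ["S1_D1 hbo", "S1_D1 hbo", "S2_D1 hbo"],
    "theta": [0.1, 0.5, 0.4],
    "p_value": [0.30, 0.01, 0.02],
})

out = (df
       .query('Condition in ["Control", "Tapping_Left", "Tapping_Right"]')
       .drop(["p_value"], axis=1)   # keyword axis; .drop(cols, 1) is deprecated
       .groupby(["Condition", "Chroma", "ch_name"])
       .agg(["mean"]))
print(out)
```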