Merge pull request #213 from NeLy-EPFL/dev-v1.0.2-pre.1
Version 1.1.0
sibocw authored Oct 1, 2024
2 parents c055f9b + feed64a commit 2323c4d
Showing 33 changed files with 1,783 additions and 1,388 deletions.
3 changes: 2 additions & 1 deletion doc/source/api_ref/mdp_specs.rst
Original file line number Diff line number Diff line change
Expand Up @@ -17,9 +17,10 @@ Default ``Simulation``
* "fly": The fly state as a NumPy array of shape (4, 3). 0th row: x, y, z position of the fly in arena. 1st row: x, y, z velocity of the fly in arena. 2nd row: orientation of fly around x, y, z axes. 3rd row: rate of change of fly orientation.
* "contact_forces": Readings of the touch contact sensors, one placed for each of the body segments specified in ``Fly.contact_sensor_placements``. This is a NumPy array of shape (num_contact_sensor_placements, 3).
* "end_effectors": The positions of the end effectors (most distal tarsus link) of the legs as a NumPy array of shape (6, 3). The order of the legs is: LF, LM, LH, RF, RM, RH (L/R = left/right, F/M/H = front/middle/hind).
* "fly_orientation": NumPy array of shape (3,). This is the vector (x, y, z) pointing toward the direction that the fly is facing.
* "fly_orientation": [Deprecated] This entry in the observation space is deprecated and will be removed in future releases. Use the "forward" vector from "cardinal_vectors" instead. Previously, this variable was a NumPy array of shape (3,): the vector (x, y, z) pointing in the direction that the fly is facing.
* "vision" (if ``Fly.enable_vision`` is True): The light intensities sensed by the ommatidia on the compound eyes. This is a NumPy array of shape (2, num_ommatidia_per_eye, 2), where the zeroth dimension is the side (left, right in that order); the second dimension specifies the ommatidium, and the last column is for the spectral channel (yellow-type, pale-type in that order). Each ommatidium only has one channel with nonzero reading. The intensities are given on a [0, 1] scale.
* "odor_intensity" (if ``Fly.enable_olfaction`` is True): The odor intensities sensed by the odor sensors (by default 2 antennae and 2 maxillary palps). This is a NumPy array of shape (odor_space_dimension, num_sensors).
* "cardinal_vectors": The cardinal vectors (forward, left, up) of the fly's spatial orientation in the global frame. This is a NumPy array of shape (3, 3) where the 0th dimension specifies forward/left/up in that order, and the 1st dimension specifies the x/y/z components of the vector. Note that the forward vector is slightly tilted upward; therefore, even if the fly is walking perfectly forward on level ground, one should expect a non-negligible positive z component in the forward vector (and a non-negligible negative value in the x component of the up vector).
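As a sketch of how these observations might be consumed, the snippet below derives the fly's planar heading from the "cardinal_vectors" entry (the recommended replacement for the deprecated "fly_orientation"). The observation dictionary and its values are fabricated for illustration; in practice they would come from a simulation step.

```python
import numpy as np

# Hypothetical observation, shaped like the entries documented above;
# the numbers are made up for illustration, not taken from a simulation.
obs = {
    "cardinal_vectors": np.array(
        [
            [0.99, 0.05, 0.13],   # forward (note the slight upward tilt)
            [-0.05, 1.0, 0.0],    # left
            [-0.13, 0.0, 0.99],   # up
        ]
    ),
}

# Row 0 is the forward vector; project it onto the arena's x-y plane to
# get a heading angle in radians, ignoring the tilt described above.
forward = obs["cardinal_vectors"][0]
heading = float(np.arctan2(forward[1], forward[0]))
print(round(heading, 3))
```

This mirrors the migration suggested above: code that previously read "fly_orientation" can instead read row 0 of "cardinal_vectors".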

**Info:** The info dictionary contains the following:

Expand Down
2 changes: 1 addition & 1 deletion doc/source/api_ref/vision.rst
Original file line number Diff line number Diff line change
Expand Up @@ -4,7 +4,7 @@ Vision
This page documents the implementation of the visual input received by the simulated fly. Note that in the typical use case, the user should **not** have to access most of the functions described here. Instead, the visual inputs are given as a part of the *observation* returned by ``NeuroMechFly`` at each time step. Nonetheless, the full API reference is provided here for greater transparency.

.. note::
For API references of NeuroMechFly simulation with the connectome-constrained model proposed in `Lappalainen et al., 2023, <https://www.biorxiv.org/content/10.1101/2023.03.11.532232>`_, see the `Advanced Vision <api_ref/examples/vision.html>`_ page.
For API references of NeuroMechFly simulation with the connectome-constrained model proposed in `Lappalainen et al., 2024, <https://doi.org/10.1038/s41586-024-07939-3>`_, see the `Advanced Vision <api_ref/examples/vision.html>`_ page.

Retina simulation
-----------------
Expand Down
9 changes: 9 additions & 0 deletions doc/source/changelog.rst
Original file line number Diff line number Diff line change
@@ -1,6 +1,15 @@
Change Log
==========

* **1.1.0:**

* Added cardinal direction sensing (vectors describing +x, +y, +z of the fly) to the observation space.
* Removed legacy spawn orientation preprocessing: Previously, pi/2 was subtracted from the user-specified spawn orientation on the x-y plane. This was to make the behavior consistent with a legacy version of NeuroMechFly. This behavior is no longer desired; from this version onwards, the spawn orientation is used as is.
* Strictly fixed the required MuJoCo version to 3.2.3, and dm_control version to 1.0.23. This is to prevent API-breaking changes in future versions of these libraries from affecting FlyGym. FlyGym maintainers will periodically check for compatibility with newer versions of these libraries.
* Changed flip detection method: Previously, flips were reported when all legs reliably lost contact with the ground. Now, we simply check whether the z component of the "up" cardinal vector is negative. Additionally, the ``detect_flip`` parameter of ``Fly`` is now deprecated; flips are always detected and reported.
* Allowed different sets of DoFs to be monitored vs. actuated. Previously, the two sets were always the same.
* From this version onwards, we will use `EffVer <https://jacobtomlinson.dev/effver/>`_ as the versioning policy. The version number will communicate how much effort we expect a user will need to spend to adopt the new version. While we previously tried to adhere to the stricter `SemVer <https://semver.org/>`_, we found that it was not effective because many core dependencies of FlyGym (e.g., MuJoCo, NumPy, and Python itself) do not use SemVer.
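The new flip check described in the changelog entry above can be sketched as follows. This is illustrative only: the function name and array layout follow the observation-space documentation (rows are forward/left/up), not FlyGym's actual internals.

```python
import numpy as np

def is_flipped(cardinal_vectors: np.ndarray) -> bool:
    """Mimic the v1.1.0 flip check: the fly is considered flipped
    when the z component of its "up" cardinal vector is negative."""
    up = cardinal_vectors[2]  # rows: forward, left, up
    return bool(up[2] < 0)

upright = np.eye(3)  # forward=+x, left=+y, up=+z
# The same frame rotated by pi about the x axis (fly on its back).
rolled_over = np.array(
    [[1.0, 0.0, 0.0],
     [0.0, -1.0, 0.0],
     [0.0, 0.0, -1.0]]
)

print(is_flipped(upright), is_flipped(rolled_over))  # False True
```

A single sign check like this is cheaper and more direct than inferring flips from leg-contact loss, which is presumably why the method changed.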

* **1.0.1:** Fixed minor bugs related to the set of DoFs in the predefined poses, and to rendering at extremely high frequencies. Fixed outdated class names and links in the docs. In addition, contact sensor placements used by the hybrid turning controller are now added to the ``preprogrammed`` module.

* **1.0.0:** In spring 2024, NeuroMechFly was used, for the second time, in a course titled "`Controlling behavior in animals and robots <https://edu.epfl.ch/coursebook/en/controlling-behavior-in-animals-and-robots-BIOENG-456>`_" at EPFL. At the same time, we revised the NeuroMechFly v2 manuscript. In these processes, we significantly improved the FlyGym package, added new functionalities, and incorporated changes as we received feedback from the students. These enhancements are released as FlyGym version 1.0.0. This release is not backward compatible; please refer to the `tutorials <https://neuromechfly.org/tutorials/index.html>`_ and `API references <https://neuromechfly.org/api_ref/index.html>`_ for more information. The main changes are:
Expand Down
5 changes: 4 additions & 1 deletion doc/source/contributing.rst
Original file line number Diff line number Diff line change
Expand Up @@ -8,6 +8,9 @@ Code of conduct & licensing
---------------------------
Please respect the `Contributor Covenant Code of Conduct <https://www.contributor-covenant.org/version/2/1/code_of_conduct/code_of_conduct.txt>`_. FlyGym is made open source under `Apache License 2.0 <https://github.com/NeLy-EPFL/flygym/blob/main/LICENSE>`_. By contributing to this package (including any issue, pull request, and discussion), you agree that your content will be shared under the same license.

Versioning
----------
FlyGym uses the `EffVer <https://jacobtomlinson.dev/effver/>`_ versioning scheme, which bases version numbers on the amount of effort users are expected to spend to adopt the new version. The version number has the format ``X.Y.Z``, where ``X`` is the macro version, ``Y`` is the meso version, and ``Z`` is the micro version. A macro version update means that a large effort is required to adopt the new version. A meso version update means that some effort is required. A micro version update means that no effort is required at all (e.g., bug fixes or optimizations that are not exposed through the API).
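As a small illustration of the scheme (this helper is not part of FlyGym, just a sketch):

```python
def parse_version(version: str) -> tuple[int, int, int]:
    """Split an EffVer-style version string into (macro, meso, micro)."""
    macro, meso, micro = (int(part) for part in version.split("."))
    return macro, meso, micro

# The 1.0.2 -> 1.1.0 release is a meso bump, signaling that some
# adoption effort is expected; a micro bump would signal none.
old, new = parse_version("1.0.2"), parse_version("1.1.0")
print(old, new)  # (1, 0, 2) (1, 1, 0)
```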

Branches
--------
Expand All @@ -25,7 +28,7 @@ Documentation
-------------
We use the `NumPy Docstring Style <https://numpydoc.readthedocs.io/en/latest/format.html>`_. We use a line length limit of 75 characters for docstrings. Please stick with the NumPy style so the API reference can be generated automatically.

The source files (in RST) of the documentation website are located in the ``doc/source`` folder. The API reference is generated automatically using `Sphinx <https://www.sphinx-doc.org/en/master/>`_. The documentation is written in `reStructuredText <https://sphinx-tutorial.readthedocs.io/step-1/>`_ (RST). When you merge a pull request into the main branch, the documentation is automatically built and deployed on `neuromechfly.org <https://neuromechfly.org/>`_. If you want to check the documentation on a branch (that is not `main`) locally, you can run `make html` under the `doc` folder. The generated HTML files will be placed under `doc/build/html`. You can open `doc/build/html/index.html` in your browser to view the documentation.
The source files (in RST) of the documentation website are located in the ``doc/source`` folder. The API reference is generated automatically using `Sphinx <https://www.sphinx-doc.org/en/master/>`_. The documentation is written in `reStructuredText <https://sphinx-tutorial.readthedocs.io/step-1/>`_ (RST). When you merge a pull request into the main branch, the documentation is automatically built and deployed on `neuromechfly.org <https://neuromechfly.org/>`_. If you want to check the documentation on a branch (that is not ``main``) locally, you can run ``make html`` under the ``doc`` folder. The generated HTML files will be placed under ``doc/build/html``. You can open ``doc/build/html/index.html`` in your browser to view the documentation.

API changes / migration guide
-----------------------------
Expand Down
2 changes: 1 addition & 1 deletion doc/source/gallery/video_14_fly_follow_fly.rst
Original file line number Diff line number Diff line change
@@ -1,7 +1,7 @@
Fly chasing with a connectome constrained visual system
=======================================================

Our simulated fly chases another fly on a complex terrain. The chased fly is detected using the simulated responses of visual neurons. Visual neurons responses are simulated using the connectome constrained neural network constructed in `Lappalainen et al. 2023 <https://doi.org/10.1101/2023.03.11.532232>`_.
Our simulated fly chases another fly on a complex terrain. The chased fly is detected using the simulated responses of visual neurons. Visual neuron responses are simulated using the connectome-constrained neural network constructed in `Lappalainen et al. 2024 <https://doi.org/10.1038/s41586-024-07939-3>`_.

.. raw:: html

Expand Down
6 changes: 3 additions & 3 deletions doc/source/index.rst
Original file line number Diff line number Diff line change
Expand Up @@ -20,7 +20,7 @@ Simulating embodied sensorimotor control with NeuroMechFly v2
changelog
contributing

`Paper <https://www.epfl.ch/labs/ramdya-lab/wp-content/uploads/2024/08/NMF2_postprint.pdf>`_ |
`Paper <https://www.biorxiv.org/content/10.1101/2023.09.18.556649>`_ |
`GitHub <https://github.com/NeLy-EPFL/flygym>`_

.. figure:: https://github.com/NeLy-EPFL/_media/blob/main/flygym/overview_video.gif?raw=true
Expand All @@ -33,7 +33,7 @@ Simulating embodied sensorimotor control with NeuroMechFly v2
API changes may occur in future releases. See the `changelog <changelog.html>`_ for details.


FlyGym is the Python library for NeuroMechFly v2, a digital twin of the adult fruit fly *Drosophila melanogaster* that can see, smell, walk over challenging terrain, and interact with the environment (see our `NeuroMechFly v2 paper <https://www.epfl.ch/labs/ramdya-lab/wp-content/uploads/2024/08/NMF2_postprint.pdf>`_).
FlyGym is the Python library for NeuroMechFly v2, a digital twin of the adult fruit fly *Drosophila melanogaster* that can see, smell, walk over challenging terrain, and interact with the environment (see our `NeuroMechFly v2 paper <https://www.biorxiv.org/content/10.1101/2023.09.18.556649>`_).

FlyGym consists of the following components:

Expand Down Expand Up @@ -86,4 +86,4 @@ If you use FlyGym or NeuroMechFly in your research, please cite the following tw
}
.. note::
**Privacy policy:** This site uses Google Analytics to collect data about your interactions with our website. This includes information such as your IP address, browsing behavior, and device type. We use this data to improve our website and understand user preferences. Google Analytics uses Cookies, which are small text files stored on your device. See `How Google uses information from sites or apps that use our services <https://policies.google.com/technologies/partner-sites>`_. To opt-out, you can use a `browser extension <https://tools.google.com/dlpage/gaoptout>`_ to deactivate Google Analytics.
**Privacy policy:** This site uses Google Analytics to collect data about your interactions with our website. This includes information such as your IP address, browsing behavior, and device type. We use this data to improve our website and understand user preferences. Google Analytics uses Cookies, which are small text files stored on your device. See `How Google uses information from sites or apps that use our services <https://policies.google.com/technologies/partner-sites>`_. To opt-out, you can use a `browser extension <https://tools.google.com/dlpage/gaoptout>`_ to deactivate Google Analytics.
8 changes: 1 addition & 7 deletions doc/source/installation.rst
Original file line number Diff line number Diff line change
Expand Up @@ -32,7 +32,7 @@ Then, to install the FlyGym package:
pip install "flygym[examples]"
The tutorial and example on interfacing FlyGym with the `connectome-constrained vision model <https://github.com/TuragaLab/flyvis>`_ from `Lappalainen et al. (2023) <https://www.biorxiv.org/content/10.1101/2023.03.11.532232>`_ further requires the FlyVision package, which is not published on the Python Package Index (PyPI). As a result, the command above does not install FlyVision. Instead, you must either install it manually following `its installation instructions <https://github.com/TuragaLab/flyvis?tab=readme-ov-file#install-locally->`_, or install it with ``pip`` from our fork on GitHub:
The tutorial and example on interfacing FlyGym with the `connectome-constrained vision model <https://github.com/TuragaLab/flyvis>`_ from `Lappalainen et al. (2024) <https://doi.org/10.1038/s41586-024-07939-3>`_ further requires the FlyVision package, which is not published on the Python Package Index (PyPI). As a result, the command above does not install FlyVision. Instead, you must either install it manually following `its installation instructions <https://github.com/TuragaLab/flyvis?tab=readme-ov-file#install-locally->`_, or install it with ``pip`` from our fork on GitHub:

.. code-block:: bash
Expand All @@ -58,12 +58,6 @@ First, clone this repository:
git clone git@github.com:NeLy-EPFL/flygym.git
If you want to install code from a specific branch, you can checkout to the branch of your choice:

.. code-block:: bash
git checkout <branch_name>
Change into the cloned directory:

.. code-block:: bash
Expand Down
6 changes: 3 additions & 3 deletions doc/source/sfn2024.rst
Original file line number Diff line number Diff line change
Expand Up @@ -32,7 +32,7 @@ Workshop @ SfN
.register-button:visited,
.register-button:active {
color: #dfd3fd; /* Maintain the original text color */
color: #ffffff; /* Set the text color to white */
}
</style>
Expand Down Expand Up @@ -62,7 +62,7 @@ Workshop @ SfN
</p>

<p>
To bridge this gap, we developed NeuroMechFly (<a href="https://doi.org/10.1038/s41592-022-01466-7" target="_blank" rel="noopener noreferrer">Lobato-Rios et al., <em>Nature Methods</em>, 2022; <a href="https://www.epfl.ch/labs/ramdya-lab/wp-content/uploads/2024/08/NMF2_postprint.pdf" target="_blank" rel="noopener noreferrer">Wang-Chen et al., <em>Nature Methods</em>, 2024</a>). With NeuroMechFly, one can test models of the following embodied in an anatomically realistic body model:
To bridge this gap, we developed NeuroMechFly (<a href="https://doi.org/10.1038/s41592-022-01466-7" target="_blank" rel="noopener noreferrer">Lobato-Rios et al., <em>Nature Methods</em>, 2022</a>; <a href="https://www.biorxiv.org/content/10.1101/2023.09.18.556649" target="_blank" rel="noopener noreferrer">Wang-Chen et al., <em>Nature Methods</em>, 2024</a>). With NeuroMechFly, one can test models of the following embodied in an anatomically realistic body model:
</p>

<ul>
Expand Down Expand Up @@ -113,4 +113,4 @@ Workshop @ SfN
.. raw:: html

<h3 class="smaller">Contact us</h3>
For any questions, please email <a href="https://people.epfl.ch/pavan.ramdya?lang=en" target="_blank" rel="noopener noreferrer"> Pavan Ramdya</a> or <a href="https://people.epfl.ch/sibo.wang?lang=en" target="_blank" rel="noopener noreferrer">Sibo Wang-Chen</a>.
For any questions, please email <a href="https://people.epfl.ch/pavan.ramdya?lang=en" target="_blank" rel="noopener noreferrer"> Pavan Ramdya</a> or <a href="https://people.epfl.ch/sibo.wang?lang=en" target="_blank" rel="noopener noreferrer">Sibo Wang-Chen</a>.
6 changes: 3 additions & 3 deletions doc/source/tutorials/advanced_olfaction.rst
Original file line number Diff line number Diff line change
Expand Up @@ -588,7 +588,7 @@ the fly stand still for the sake of this demonstration:
fly = Fly(
enable_olfaction=True,
spawn_pos=(60.0, 30.0, 0.25),
spawn_orientation=(0, 0, -np.pi / 2),
spawn_orientation=(0, 0, -np.pi),
)
cam = Camera(fly=fly, camera_id="birdeye_cam", play_speed=0.2, timestamp_text=True)
sim = SingleFlySimulation(fly=fly, arena=arena, cameras=[cam])
Expand Down Expand Up @@ -873,7 +873,7 @@ Let’s run a sample simulation where the fly walks blindly forward:
enable_vision=False,
contact_sensor_placements=contact_sensor_placements,
spawn_pos=(60.0, 30.0, 0.25),
spawn_orientation=(0, 0, -np.pi / 2),
spawn_orientation=(0, 0, -np.pi),
)
cam = Camera(fly=fly, camera_id="birdeye_cam", play_speed=0.2, timestamp_text=True)
Expand Down Expand Up @@ -1358,7 +1358,7 @@ Now, let’s run this controller:
contact_sensor_placements=contact_sensor_placements,
# Here the opposite spawn position can be tried (65.0, 15.0, 0.25)
spawn_pos=(65.0, 45.0, 0.25),
spawn_orientation=(0, 0, -np.pi / 2),
spawn_orientation=(0, 0, -np.pi),
)
wind_dir = [1.0, 0.0]
Expand Down
14 changes: 7 additions & 7 deletions doc/source/tutorials/advanced_vision.rst
Original file line number Diff line number Diff line change
Expand Up @@ -17,7 +17,7 @@ Connectome-constrained visual system model
**Summary**: In this tutorial, we will (1) simulate two flies in the
same arena, and (2) integrate a connectome-constrained visual system
model `(Lappalainen et al.,
2023) <https://www.biorxiv.org/content/10.1101/2023.03.11.532232>`__
2024) <https://doi.org/10.1038/s41586-024-07939-3>`__
into NeuroMechFly. Combining these, we will simulate a scenario where a
stationary fly observes another fly walking in front of it and examine
the responses of different neurons in the visual system.
Expand Down Expand Up @@ -76,7 +76,7 @@ direction of movement.
draw_corrections=True,
timestep=timestep,
spawn_pos=(3, 3, 0.5),
spawn_orientation=(0, 0, 0),
spawn_orientation=(0, 0, -np.pi / 2),
)
observer_fly = HybridTurningFly(
Expand All @@ -89,7 +89,7 @@ direction of movement.
draw_corrections=True,
timestep=timestep,
spawn_pos=(0, 0, 0.5),
spawn_orientation=(0, 0, np.pi / 2),
spawn_orientation=(0, 0, 0),
# setting head_stabilization_model to "thorax" will make actuate
# neck joints according to actual thorax rotations (i.e., using ideal
# head stabilization signals)
Expand Down Expand Up @@ -159,7 +159,7 @@ projects for the VNC).
To illustrate how this might be accomplished, we will interface
NeuroMechFly a recently established connectome-constrained neural
network model (`Lappalainen et al.,
2023 <https://www.biorxiv.org/content/10.1101/2023.03.11.532232>`__;
2024 <https://doi.org/10.1038/s41586-024-07939-3>`__;
`code <https://github.com/TuragaLab/flyvis>`__). This study has
constructed an artificial neural network (ANN) representing the retina,
lamina, medulla, lobula plate, and lobula of the fly visual system (see
Expand All @@ -170,7 +170,7 @@ variables such as voltage.
.. image:: https://github.com/NeLy-EPFL/_media/blob/main/flygym/advanced_vision/lappalainen_model_schematic.png?raw=true
:width: 400

*Image from Lappalainen et al., 2023.*
*Image from Lappalainen et al., 2024.*

We will pass the visual experience of the simulated fly as inputs to
this pretrained model and simulate the activities of real neurons. For
Expand All @@ -191,13 +191,13 @@ replace the observer fly with an instance of ``RealisticVisionFly``:
draw_corrections=True,
timestep=timestep,
spawn_pos=(3, 3, 0.5),
spawn_orientation=(0, 0, 0),
spawn_orientation=(0, 0, -np.pi / 2),
)
observer_fly = RealisticVisionFly(
name="observer",
spawn_pos=(0, 0, 0.5),
spawn_orientation=(0, 0, np.pi / 2),
spawn_orientation=(0, 0, 0),
contact_sensor_placements=contact_sensor_placements,
head_stabilization_model="thorax",
)
Expand Down
