WIP: NeuroML import and export fixes and enhancements #810

Draft
wants to merge 109 commits into
base: neuroml_updates
Choose a base branch
from

Conversation

sanjayankur31
Contributor

No description provided.

penguinpee and others added 30 commits July 26, 2023 12:23
Without it tests/ will be installed as a top level Python module.
…ate-medical-center/development

PR from development to master - VERSION 1.0.5
More readable, and IDE friendly
I seem to be able to run NetPyNE sims fine with MPI, so this comment
does not seem to apply any more.
See, override decoration helps :)
[skip ci]
It's a warning that the user should be aware of.
erev is not required in Nernst or GHK2
@vvbragin
Copy link
Collaborator

Hi @sanjayankur31, is this still WIP, or can it be merged?

vvbragin and others added 3 commits August 30, 2024 21:26
…reilly/patch-1

Bug fix in scalebar - removing the use of the minimumdescent argument in the call to TextArea
…development

Exclude tests from discovery / installation
…at has emerged with spack installation (EBRAINS platform). Will re-visit later..

This reverts commit e5aa9e8.
@sanjayankur31
Contributor Author

Still WIP, I'm afraid. The bits are almost complete, but we need to investigate differences in spike times between NeuroML's NetPyNE export and the NEURON export.

vvbragin and others added 20 commits September 4, 2024 09:16
…mplates (suny-downstate-medical-center#831)

see netpyne.batchtools esp. submits.py and search.py

slurm job submission:
search() now supports job_type='slurm' with associated 'ifs' and 'socket' communication modes

the run_config for these submission scripts should specify the following arguments:
'allocation' (allocation for the job)
'walltime' (time limit before job termination)
'nodes' (# nodes for job)
'coresPerNode' (number of cores per job--i.e. the mpiexec -n value)
'email' (user email)
'custom' (any custom commands to execute prior to sourcing ~/.bashrc)
'command' (execution of script, command to be run from the project directory to run the script)

update to see submission:
run_config for the submission now includes the following argument 
'realtime' (time limit before job termination)
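To make the new submission interface concrete, here is a minimal sketch of a run_config for the slurm job type described above. The keys are taken from the commit message; every value shown is a placeholder for illustration, not a NetPyNE default.

```python
# Hypothetical run_config for job_type='slurm' (keys from the commit
# message above; all values are illustrative placeholders).
slurm_run_config = {
    "allocation": "my_allocation",   # allocation for the job
    "walltime": "01:00:00",          # time limit before job termination
    "nodes": 2,                      # number of nodes for the job
    "coresPerNode": 16,              # number of cores per job, i.e. the mpiexec -n value
    "email": "user@example.org",     # user email
    "custom": "module load mpi",     # custom commands run prior to sourcing ~/.bashrc
    # command to be run from the project directory to run the script
    "command": "mpiexec -n 32 nrniv -python -mpi init.py",
}
```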
This is only required if one modifies the keys of the dict while
iterating over it, which we do not.
The same segment objects are used everywhere, so the segment object in
`parent_seg` will also be the one used in the lists in `ordered_segs`.
So, there's no need to iterate and compare ids, simply look if the
object is in the list.
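The simplification above can be sketched as follows (names are illustrative, not NetPyNE's actual variables): because the same segment objects are shared between `parent_seg` and the lists, a plain membership test gives the same answer as iterating and comparing ids.

```python
# Illustrative sketch: shared objects make an identity-based
# membership test equivalent to comparing ids element by element.
class Segment:
    def __init__(self, seg_id):
        self.id = seg_id

seg_a = Segment(0)
seg_b = Segment(1)
ordered_segs = [seg_a, seg_b]  # holds the very same objects
parent_seg = seg_a             # shared reference, not a copy

# Before: iterate and compare ids
found_by_id = any(s.id == parent_seg.id for s in ordered_segs)

# After: simply look if the object is in the list
found_by_identity = parent_seg in ordered_segs

assert found_by_id == found_by_identity == True
```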
10 participants