SeisFlows version 2.3.0 #180
Merged
…nning the adjoint simulation (if it doesn't exist)
…nook to be mpiexec
…statement to fail quietly. these are now set in the input of init
…an in-person walkthrough of running seisflows on a cluster. expect this will be updated after the walkthrough
…ain job to, and the partition to submit each compute job. updated for chinook and frontera systems
…S if the user had changed the default location for LOCAL_PATH in the specfem Par_file. Now seisflows will make this sub-directory regardless
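A minimal sketch of the behavior change described above, assuming a local_path value read from the SPECFEM Par_file (the function name is hypothetical, not the SeisFlows code):

```python
import os

def ensure_local_path(local_path):
    """Create the solver's LOCAL_PATH scratch directory unconditionally,
    rather than assuming SPECFEM's default location already exists."""
    os.makedirs(local_path, exist_ok=True)
```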
…correctly during 'solver.import_model'. changed load_yaml function to always set paths to absolute in the internal parameter definition to avoid any relative pathing issues
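The idea behind the load_yaml change can be sketched as follows; this is not the actual SeisFlows implementation, and the 'path_' key convention is only an assumption for illustration:

```python
import os
import yaml  # requires pyyaml

def load_yaml(filename):
    """Load a YAML parameter file and force path-like values to be absolute,
    avoiding relative-pathing issues when the working directory changes."""
    with open(filename, "r") as f:
        params = yaml.safe_load(f)
    for key, val in params.items():
        # Assumed convention: keys prefixed with 'path_' hold filesystem paths
        if key.startswith("path_") and isinstance(val, str):
            params[key] = os.path.abspath(val)
    return params
```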
…ments which had values matching the actual parameter value to replace (ended up replacing parts of the comment instead of the actual parameter). simplified and fixed this implementation
* added specfem3d_globe class based on manual inspection of a specfem3d_globe working directory and adjoint simulation
* included Laplacian smoothing wrapper and allow user choice between Gaussian and Laplacian smoothing
* adding specfem3d_globe compatibility to the SPECFEM Model class. Still need to figure out the intricacies of dealing with multiple regions and whether or not they are updated, which will affect the available kernels, but model reading for 3D_GLOBE as well as flavor guessing has been added
* finished adding I/O capabilities for SPECFEM3D_GLOBE models in the Model class. Now allows for user definition of the region when building a model, and can read and write GLOBE models, which have a different format from SPECFEM2D and 3D models
* bugfix: SPECFEM initialize solver now ignores directories in the DATA directory, as we assume we only need text files
* SPECFEM Model class check function can now handle specfem3d_globe models, and also deals with all anisotropic model parameters
* specfem3d_globe can also have non-isotropic parameters, making this more general in the check statement
* adjusted the SPECFEM solver class (and subclasses) to include a parameter which is 3D_GLOBE only; this is used to deal with different regions in 3D_GLOBE models
* changed the specfem3d_globe parent to specfem due to some larger inconsistencies between specfem3d and 3d_globe that make it difficult to have 3d as parent to 3d_globe. Need to make some larger scale changes to incorporate this
* preprocess read_ascii now deals with the different ASCII formats that 3d_globe and 2d/3d have
* bugfix: default preprocessing was not properly renaming adjoint sources in line with how specfem3d_globe expects them
* fixed bug for xcombine_sem and xsmooth functions not requiring the 'reg' tag for parameter input (see the sketch below)
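As a rough illustration of the region handling mentioned above: SPECFEM3D_GLOBE model binaries carry a region tag in their filenames (e.g., proc000000_reg1_vsv.bin), which 2D/3D models do not. The helper below is a hedged sketch, not the SeisFlows Model class; the glob pattern and argument names are assumptions.

```python
import os
from glob import glob

def globe_model_files(model_dir, parameter="vsv", region=1):
    """Collect per-process GLOBE model files such as proc000000_reg1_vsv.bin."""
    pattern = os.path.join(model_dir, f"proc??????_reg{region}_{parameter}.bin")
    return sorted(glob(pattern))
```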
…n parameter was None, causing a list comprehension error
… even when empty, throwing IndexError
* Update README.md
* revamps documentation for ease of access and a more linear flow for new users trying to understand the package. Condenses smaller docs onto the main page and shortens up text on the main page. Moves background into the intro section and adds modularity to background. Renames a few of the docs pages, fixes typos, etc.
…ing examples to fail. small fix to checking logic, example runs ok now
…pdated docs conda environment file and tested docs building from a fresh install
* rewrote cluster setup to avoid the notebook version. Edited the notebook convert script to try and clean up some of the weird IPython quirks to make the docs read cleaner. Updated the yml environment file to include Jupyter. Restructured the TOC in index
* removed unused notebooks
* removing unused files from docs
* added a tips and tricks docs page for little useful tips for users
* adding legacy setup.py file for older packaging standards
* doc update: alternative install instructions for existing Conda environments
removed '--prune' from Conda env update as that is not the intended behavior
* default preprocessing: synthetics and observations can have different formats; added SAC to allowed observation formats (see the sketch below)
* pyaflowa preprocess: changed variable name <data_format> to <syn_data_format> for consistency
* solver: changed variable name <data_format> to <syn_data_format> for consistency
* updated docstrings
* updated examples to include <obs_data_format> and <syn_data_format> variables
* updated tests to account for the different formats of observations and synthetics
* added test data in SAC format
* updated documentation
* added <unit_output> parameter to the default preprocessing module
* added check of <unit_output> parameter
* the format of observed and synthetic traces is now checked; preprocessing is only done on traces with both observed and synthetic files
* added <unit_output> parameter to examples and tests
* starting coding <_check_adjoint_traces> function
* finished coding <_check_adjoint_traces> function
* updated comments and removed assert
* fixed: adjoint traces were incorrectly written
* Update default.py

Co-authored-by: Bryant Chow <bhchow@alaska.edu>
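A hedged sketch of why separate <obs_data_format> and <syn_data_format> parameters are needed: observations and synthetics may live in different file formats, so each side is read with its own reader. The function and default formats below are illustrative only, not the SeisFlows preprocessing API.

```python
from obspy import read  # requires obspy

def read_trace_pair(obs_path, syn_path, obs_data_format="SAC",
                    syn_data_format="MSEED"):
    """Read one observed/synthetic waveform pair, each with its own format."""
    st_obs = read(obs_path, format=obs_data_format)  # e.g., SAC observations
    st_syn = read(syn_path, format=syn_data_format)  # e.g., miniSEED synthetics
    return st_obs, st_syn
```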
* replaces pyatoa read_sem function with pysep, adds pysep as a direct dependency of seisflows
* removed pysep read_sem import because it wasn't used in the pyaflowa class; removed pysep as a dependency of seisflows
* added available pyadjoint misfit functions to pyaflowa docstring
* removed pyatoa requirements from environment yml file since they are not direct dependencies of seisflows
* updates changelog
* bumping version number and editing copyrights in docs
* fixing preprocess tests which had out-of-order assertion statements
* bugfix: during adjoint source initialization in default preprocess, read files using the synthetic data format and not the observed, since adjoint sources are expected to match the synthetic format
* bugfix: preprocess test was changing the wrong data format for initializing adjoint sources
* remove typos and debugger
* rearrange logger imports in pyaflowa
* bugfix: pyaflowa using outdated config parameter 'synthetics_only'; replaced with 'st_obs_type'
* fixing pyaflowa line search test
* pinning pyatoa to greater than 0.2.2 to include latest bugfixes
…ward workflow now does not save forward arrays by default to save memory
…dule was initialized
exposes read_residuals function as a public method for other preprocessing classes to import and use. adds docstring definition for function
removes redundant definition of the read_residuals function, which is defined publicly by the Default class and is not used here. If needed in the future, the function can be directly imported
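The design choice in the two entries above can be sketched as follows; class names and the residual file format are assumptions, not the actual SeisFlows code:

```python
import numpy as np

class Default:
    """Stand-in for the default preprocessing class."""
    def read_residuals(self, residuals_files):
        """Read scalar residual values (one float per line) from text files."""
        residuals = []
        for fid in residuals_files:
            residuals.extend(np.loadtxt(fid, ndmin=1))
        return np.array(residuals)

class CustomPreprocess:
    """Another preprocessing class that borrows the public method rather than
    re-defining it."""
    read_residuals = Default.read_residuals
```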
explicitly declares iteration and step count values for the function evaluate_initial_misfit, so that the objective function itself does not have to declare the iteration number, keeping things a little more concise. Overriding workflows (i.e., Inversion) will need to make this more generic.
Shift declaration of residual file iteration and step count values into calling functions (evaluate_misfit etc) rather than inside each called preprocessing function, to keep things slightly more concise
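A minimal sketch of the bookkeeping described in the two entries above, with hypothetical names: the calling workflow declares the iteration and step count and derives the residuals file tag, instead of each preprocessing function tracking that state internally.

```python
def residuals_filename(iteration, step_count):
    """Tag residual files by iteration and line search step count."""
    return f"residuals_i{iteration:0>2}_s{step_count:0>2}.txt"

# The initial misfit evaluation of an inversion is always iteration 1, step 0
print(residuals_filename(iteration=1, step_count=0))  # residuals_i01_s00.txt
```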
… to code through github
…borrow this function from the default preprocessing class
…rivate method, since it was only being accessed by the Inversion class. completely removed from preprocessing module
…ual files have actually been created
…ngs explicit (rather than doing a +=1 randomly in a workflow), and put that functionality directly into the Inversion line search function, rather than having it live separately in the optimization module where it might get buried among other line search update functionality
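Sketch of the idea (names hypothetical): the step count increment is explicit and visible inside the Inversion line search loop rather than hidden as a += 1 in the optimization module.

```python
def perform_line_search(evaluate_trial_model, max_step_count=5):
    """Try trial models until the line search passes or the step limit is hit."""
    step_count = 0
    while step_count < max_step_count:
        step_count += 1                       # explicit, visible increment
        status = evaluate_trial_model(step_count)
        if status == "PASS":
            return step_count
    return None  # line search failed within the allowed number of steps
```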
…was NOT returning the correct misfit value for each line search run, but rather just the first misfit, which would have been the previous model. Also added a first line to this text file which corresponds to the misfit of the starting model; all subsequent lines are updated models. This bug will not have affected any actual inversion runs, but may have affected those plotting convergence based on the text file
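The intended layout of the misfit text file described above can be illustrated like this (file name and helper are hypothetical): the first line holds the misfit of the starting model and each subsequent line holds the misfit of an updated model, so convergence plots can read the file top to bottom.

```python
def append_misfit(fid, misfit):
    """Append one misfit value per line to a cumulative convergence log."""
    with open(fid, "a") as f:
        f.write(f"{misfit:.6E}\n")

append_misfit("misfit_log.txt", 1.00)  # line 1: misfit of the starting model
append_misfit("misfit_log.txt", 0.85)  # line 2: misfit of the first updated model
```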
* added PJM system module in preparation for Wisteria testing
* function query_job_states working with pjstat to check running and finished jobs
* job ID retrieval from stdout working for pjsub (see the sketch below)
* can't do array jobs with pjsub, so fall back to the Frontera approach, that is: submit jobs one by one and track individual job IDs rather than submitting one array job
* removed unnecessary parameter from pjm and updated docstrings
* renamed pjm to fujitsu to be a bit more recognizable
* changed parameter name resource_group to rscgrp to match actual batch scripts
* wisteria system can submit workflow
* small update fujitsu
* playing around with activating the conda env in fujitsu due to the inability to inherit the conda env on compute nodes
* removed hardcoded and redundant path calls to submit and run scripts stored in the root dir. Instead this is defined once at the top of the cluster script and inherited by all child classes. This also allows wisteria to override this and set custom run and submit scripts which allow for activating a conda environment after the job is submitted to the scheduler
* workflow submit working on wisteria; custom run and submit scripts get around the inability to pass command line arguments in the pjsub command and the inability to inherit the conda env from the login node
* relaxed check on test flow log file creation, because it was checking for array jobs but not all systems run array style
* same as last commit one more time
* wisteria/fujitsu system works with some caveats, passes test flow
* update wisteria caveat comments
* updates changelog
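A hedged sketch of the pjsub/pjstat interaction described above. The stdout format parsed here (a line ending in "Job <id> submitted") is an assumption about the Fujitsu scheduler's output and should be checked against the actual system class.

```python
import re
import subprocess

def pjsub_submit(script):
    """Submit a batch script with pjsub and return the scheduler job ID,
    parsed from stdout since array jobs are not used with pjsub here."""
    result = subprocess.run(["pjsub", script], capture_output=True,
                            text=True, check=True)
    match = re.search(r"Job (\d+) submitted", result.stdout)
    return match.group(1) if match else None
```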
… include backwards incompatible changes that will break SeisFlows. These will be accommodated in the next version of SeisFlows
New year, new version! This is the official PR for SeisFlows version 2.3.0
The main motivation for pushing out a version now is that one of the feature branches has become too unwieldy to be kept as a separate branch, and needs to be brought into the devel branch so that some important changes are reflected in the main code base.

Note on commits: Because of a previous "squash and merge", there are some 2.2.0 commits in the history of this PR. This should not affect the code, but in future releases I will need to make sure that I either do not use "squash and merge", or that I delete and re-create the 'devel' branch prior to committing to it.
Change Log
A collection of bugfixes and feature improvements (big thanks to @evcano for major PR #168)

- export_residuals: fixed problems during line search plus other minor changes (#168)
- SAVE_FORWARD_ARRAYS: the forward workflow no longer saves forward arrays by default, to save memory
- Made the submit and run scripts a System class variable which can be overwritten
- Added system.Fujitsu as a generalized System class for HPCs using the Fujitsu workload manager
- Added system.Wisteria for System interactions with HPC Wisteria (not generalized at the moment)
Bugfixes
- Cluster run/submit scripts were not included when SeisFlows was installed via Pip (Change Cluster-based run_script/submit_workflow script path location #162); added a MANIFEST.in file to fix this.
- The solver was counting all entries in the DATA/ directory as source files (Running on Chinook issue: No such file or directory: 'DATA/Par_file' #169), leading to an incorrect number of source files being detected; the process now only looks for source files that start with the source prefix (e.g., CMTSOLUTION, FORCESOLUTION); see the sketch below.
- … iteration, fixed
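As referenced in the source-file bugfix above, a minimal sketch of prefix-based source detection (the paths and parameter names are illustrative, not the SeisFlows solver code):

```python
import os
from glob import glob

def find_source_files(data_dir, source_prefix="CMTSOLUTION"):
    """Return only files whose names start with the source prefix, ignoring
    other files and directories that may live in DATA/."""
    candidates = glob(os.path.join(data_dir, f"{source_prefix}*"))
    return sorted(f for f in candidates if os.path.isfile(f))
```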