Restore 3 Gnarly tests #8892
Closed
Conversation
* deleted VDS
* only one left
…tion of Delta (#8205)
* Lee's name
* add VDS validation script written by Tim
* fix rd tim typo
* make sure temp dir is set and not default for validate()
* swap to consistent kebab case
* clean up validation
* put init in the right place
* add proper example to notes
* update code formatting

Co-authored-by: Miguel Covarrubias <mcovarr@users.noreply.github.com>
* Lee's name
* add VDS validation script written by Tim
* fix rd tim typo
* make sure temp dir is set and not default for validate()
* swap to consistent kebab case
* clean up validation
* put init in the right place
* add proper example to notes
* update code formatting
* update review

Co-authored-by: Miguel Covarrubias <mcovarr@users.noreply.github.com>
* Don't run gatk tests when the only changes in a commit are in the scripts/variantstore directory.
* laying framework for FOFN bulk import code
* adding in terra notebook utils code
* updating wdl
* updating environment variables to make this work better
* quotey McBetterQuotes
* extra environment variables
* normalizing variable name with other wdls that require it
* gotta explicitly set WORKSPACE_NAMESPACE to the google project as well. Apparently.
* typo
* Didn't pipe the output files the entire way up
* whoopsie
* typo
* updates after testing:
  1. We do NOT want to assume that the sample ids we want are in the name field. Pass that through as a parameter.
  2. We want to explicitly pause every 500 samples, as that's our page size. It slows our requests down enough to not spam the backend server and hit 503 errors, although it does slow the rate at which we can write the files if the dataset is big. That shouldn't be a concern: as long as it doesn't cause errors, it is still a hands-off process.
  3. We want to account for heterogeneous data. In AoU Delta, for instance, the control samples keep their vcf and vcf_index data in a different field. This would cause the whole run to fail if we weren't accounting for it explicitly; we now generate an errors.txt file holding each row whose columns couldn't be found, so they can be examined later.
* silly mistake copying the functioning code over from the workbook
* making the script more robust against specifying imaginary columns in the data table, and making the python script's output slightly more informative
* increasing the size of the disk this runs on for the sake of efficiency (and handling larger callsets)
* Passing errors up
* update params
* short term testing (rate lim)
* make it only 25 shards!
* add workspace id scraping
* add workspace id scraping fixup
* this is not functioning: need to curl in the wdl
* clean up vcfs so we don't run out of space
* add duplicates test to the shard loading
* clean up namespace prep

Co-authored-by: Aaron Hatcher <hatcher@broadinstitute.org>
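The pause-every-500-samples behavior described above could be sketched as follows. This is a minimal illustration, not the actual script: `write_fofn_entries`, the `(sample_id, vcf_path)` tuple shape, and the injected `writer` callable are all hypothetical stand-ins for the Terra data-table client code.

```python
import time

PAGE_SIZE = 500  # matches the backend's page size, per the commit message


def write_fofn_entries(samples, writer, pause_seconds=1.0, page_size=PAGE_SIZE):
    """Write FOFN entries in pages, pausing between pages to avoid 503s.

    `samples` is an iterable of (sample_id, vcf_path) tuples; `writer`
    records one tab-separated line. Both are hypothetical stand-ins.
    """
    written = 0
    for sample_id, vcf_path in samples:
        writer(f"{sample_id}\t{vcf_path}")
        written += 1
        if written % page_size == 0:
            # Slow requests down enough not to spam the backend server.
            time.sleep(pause_seconds)
    return written
```

The trade-off noted in the commit applies here too: the sleep throttles throughput, but keeps the process hands-off as long as no errors occur.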
* Use the annotation 'AS_MQ' for indels.
… table (#8278)
* Remove the unneeded SCORE field from the filter_set_info_vqsr table
* Updated the docker images.
* add queries for testing mismatched sites and variants across possible duplicates
* still need to wire these through
* plumb through dup validation
* dockstore for testing
* update docker
* add xtrace
* better bool logic
* clean up bash
* okay, let's try ripping things out to get this to work
* okay, let's put a few lines back
* ok, that worked; let's swap in better errors
* short term: remove ClinVar
* review changes
* update docker
* explain removal of ClinVar test
* Adding tests for ExtractCohortLite.
* Simple fix to have the header of the VAT tsv to use tab characters.
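The header fix above amounts to joining the column names with literal tab characters. A minimal sketch, assuming a simple helper; `write_vat_header` and the column names are hypothetical, and the real fix lives in the VAT TSV generation code:

```python
import csv
import io


def write_vat_header(columns):
    """Render a TSV header row using literal tab characters.

    Using csv.writer with delimiter="\t" guarantees tabs (not spaces)
    separate the fields, which is what the fix requires.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    writer.writerow(columns)
    return buf.getvalue()
```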
* Updated to latest version of VQSR Lite (from Master)
* Ported tests and files for VQSR Lite over
* Refactored VQSR Classic code into its own WDL
* add python script to our repo
* use the new python script!
* remove whl from integration test
* move script location for testing
* remove the damn wheel!
* add the replacement hail script
* proper renaming
* update docker
* Refactoring of ExtractCohortLite into ExtractCohort.
…ue. (#8312)
* Update override jar to fix support issue.
* Fix bug in VCF Integration test
* add Aaron's changes
* put terra token in python
* id not bucket
* hardcode for testing
* do we need a new docker image?
* set workspace info
* pull in name from rawls
* pass output locations
* add back prepare
* add GvsImportGenomes back
* update python for grabbing cols
* split methods for easier testing
* set defaults, but allow optional overrides for sample table and id
* add unit test for python column guessing
* clean up python for testing
* add proper docker
* is this where the loop is coming from?
* better names
* remove testing artifact
* add back problem lines to the test
* throw out columns with values other than strings
* set defaults in the right place
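The column guessing mentioned in several commits above (defaults with optional overrides, discarding non-string values, falling back across heterogeneous column names) might look roughly like this. The function name, the preferred column names, and the fallback heuristic are assumptions for illustration only:

```python
def guess_vcf_columns(row, preferred=("vcf", "vcf_index")):
    """Return (vcf_column, index_column) for a data-table row, or None.

    Prefers the default column names when both hold string values;
    otherwise falls back to any string-valued columns whose names
    mention 'vcf' (heterogeneous tables, e.g. control samples, may use
    different names). Returns None when no pair is found, so the caller
    can record the row in errors.txt for later examination.
    """
    if all(isinstance(row.get(c), str) for c in preferred):
        return preferred
    # Throw out columns with values other than strings, then split the
    # remaining vcf-like columns into data vs. index candidates.
    vcf_like = [k for k, v in row.items()
                if isinstance(v, str) and "vcf" in k.lower()]
    index_cols = [c for c in vcf_like if "index" in c.lower()]
    data_cols = [c for c in vcf_like if c not in index_cols]
    if data_cols and index_cols:
        return (data_cols[0], index_cols[0])
    return None
```

A heuristic like this is exactly the kind of logic the "add unit test for python column guessing" commit would cover.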
* Optionally extract to bgz format.
* Set bgzipping to be off (everywhere) by default.
* Update assert_identical_outputs to handle bgzipped outputs.
* Have GvsAssignIds.wdl validate that input sample names (in the provided input file) are unique.
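The uniqueness validation above could be as simple as the following sketch, assuming a one-sample-name-per-line input file; the helper name is hypothetical, not the WDL's actual task name:

```python
from collections import Counter

def find_duplicate_sample_names(lines):
    """Return the sample names that appear more than once in the input file.

    A validation step like GvsAssignIds's would fail fast when this list
    is non-empty. The name-per-line format is an assumption.
    """
    names = [line.strip() for line in lines if line.strip()]
    return sorted(name for name, count in Counter(names).items() if count > 1)
```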
* Compressing the tarball saves a bit.
* Remove unused contigs from interval_list files by grepping.

Co-authored-by: Miguel Covarrubias <mcovarr@users.noreply.github.com>
* Change extract so that when we filter at the genotype level (with FT), the VCF header has the filter definition in the FORMAT field.
* Also minor renaming of ExtractCohort argument.
* Point to updated truth.
* Add ValidateVariants to our tests.
* Bringing in Rori's change to add EXCESS_ALLELES to VCF Headers.
* Updated truth path.
* remove the field 'yng_status' from the variant_data as_vqsr status dict of structs.
* Have GvsCreateVATfromVDS.wdl take sites-only-vcf as an optional input.
* Added logic to allow/disallow CopyFile to overwrite.
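The allow/disallow-overwrite logic might be sketched like this, with `exists` and `do_copy` as injected stand-ins for the real cloud-storage operations; all names here are hypothetical:

```python
def copy_file(src, dst, exists, do_copy, allow_overwrite=False):
    """Copy src to dst, refusing to clobber unless allow_overwrite is set.

    `exists(dst)` reports whether the destination is already present;
    `do_copy(src, dst)` performs the copy. Both are injected so the
    policy can be tested apart from any storage backend.
    """
    if exists(dst) and not allow_overwrite:
        raise FileExistsError(f"{dst} already exists and overwrite is disallowed")
    do_copy(src, dst)
```

Injecting the storage operations keeps the overwrite policy a pure, easily tested function.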
* checkpointing here to switch branches
* locally working first pass at adding in the ploidy info; still needs the arguments passed through so it works in the WDLs
* Propagating changes up through the wdl
* Stupid WDL substitution mistake
* On a roll with WDL today wheeeeee
* Cleaning up slightly
* PR feedback
* PR feedback v2: Ploidy Boogaloo
* Fix Chromosome Encoding used in pgen merge.
…r_to_ah_var_store_again
…_master_to_ah_var_store_again
GitHub Actions tests reported job failures from Actions build 9666825011
Accidentally turned off 3 Gnarly tests, which haven't been run since #8741.