Cutoff for duplications #53
Comments
Hi, duplications are harder, but 1.3 is a reasonable start.
The detection of duplications is harder. I'm unsure if I should use DHFFC > 1.3 or DHBFC > 1.3. After population genotyping and duphold, I found that some duplications (0/1, 1/1) have DHBFC < 1.3 but DHFFC > 1.3 in the 30x WGS data, and the samplot results confirm them to be true. Could you give me some advice?
As you find, it's hard to come up with a good cutoff for duplications.
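For concreteness, here is a minimal filtering sketch, not an official duphold recipe, using cyvcf2 to apply the thresholds discussed in this thread (DHFFC < 0.7 for deletions, > 1.3 for duplications on either fold-change field) to a single-sample, duphold-annotated VCF. The file name is a placeholder, and accepting either DHFFC or DHBFC for duplications is just one way to combine the two metrics:

```python
from cyvcf2 import VCF

# Placeholder path to a single-sample VCF annotated by duphold.
vcf = VCF("sample.duphold.vcf.gz")

for v in vcf:
    svtype = v.INFO.get("SVTYPE")
    dhffc = v.format("DHFFC")  # per-sample FORMAT fields written by duphold
    dhbfc = v.format("DHBFC")
    if dhffc is None or dhbfc is None:
        continue  # variant lacks duphold annotations
    dhffc, dhbfc = dhffc.flatten()[0], dhbfc.flatten()[0]

    if svtype == "DEL":
        keep = dhffc < 0.7            # depth drop relative to flanking regions
    elif svtype == "DUP":
        # Accept support from either metric, since (as noted above) a real
        # duplication can have DHBFC < 1.3 while DHFFC > 1.3, or vice versa.
        keep = dhffc > 1.3 or dhbfc > 1.3
    else:
        keep = True                   # leave other SV types alone

    if keep:
        print(v.CHROM, v.start + 1, v.INFO.get("END"), svtype, sep="\t")
```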
Thanks for the quick reply.
Yes, you could try this.
I mean if you have a tandem duplication with 10 copies and then you add another single copy, you only expect a 10% increase in depth.
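To make that arithmetic explicit (an idealized depth model, not anything duphold computes): gaining one copy on top of N existing copies changes expected coverage by a factor of (N + 1) / N, so a 1.3 cutoff only has headroom when N is small.

```python
def expected_fold_change(existing_copies: int, gained_copies: int = 1) -> float:
    """Idealized depth fold-change when gained_copies are added on top of
    existing_copies of a repeat (ignores mappability and sampling noise)."""
    return (existing_copies + gained_copies) / existing_copies

print(expected_fold_change(2))   # 2 -> 3 copies: 1.5x, clears a 1.3 cutoff
print(expected_fold_change(10))  # 10 -> 11 copies: 1.1x, the 10% case above
```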
It's worth trying, but you'll have to evaluate for yourself how effective it is. If you have trios, you can look at Mendelian violations and transmissions. Otherwise, you can look at samplots of variants that are filtered.
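If you review filtered calls with samplot, one low-tech option is to emit one `samplot plot` command per removed duplication and inspect the resulting images. The flags below (`-n`, `-b`, `-o`, `-c`, `-s`, `-e`, `-t`) follow samplot's documented `plot` subcommand, but check them against your installed version; the BAM and VCF paths are placeholders:

```python
from cyvcf2 import VCF

BAM = "sample.bam"                        # placeholder alignment file
vcf = VCF("filtered_out.duphold.vcf.gz")  # placeholder: DUPs that failed the cutoff

for v in vcf:
    if v.INFO.get("SVTYPE") != "DUP":
        continue
    start, end = v.start + 1, v.INFO.get("END")
    print(
        f"samplot plot -n sample -b {BAM} "
        f"-o {v.CHROM}_{start}_{end}_DUP.png "
        f"-c {v.CHROM} -s {start} -e {end} -t DUP"
    )
```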
Thank you for your quick reply.
The cutoff for deletions is DHFFC < 0.7. What is the recommended DHBFC cutoff for duplications?