Hey guys,
I have single-cell data from several (~40) individual GBM samples and would like to perform velocity analysis on the tumor cell population as a whole. Would you have any recommendations for how to correct for batch effects? Would it be sufficient for the lower-dimensional embedding to be batch-corrected (e.g., with scVI), or would I also have to supply batch-corrected unspliced and spliced count values as input to DeepVelo?
Thank you
Hi, I would suggest trying the batch-corrected unspliced and spliced counts as input; all of the training and analysis code should then work as usual. This is the easiest way to incorporate batch correction, and hopefully it works well.
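For concreteness, here is one way that suggestion could look. This is a minimal sketch, not part of DeepVelo itself: it assumes an AnnData object `adata` holding all samples, with `spliced`/`unspliced` layers and a sample label in `adata.obs["batch"]`, and uses ComBat per layer as a stand-in for whichever batch-correction method you prefer. The `Ms`/`Mu` moments are then recomputed from the corrected layers before DeepVelo training.

```python
import scanpy as sc
import scvelo as scv
import scipy.sparse as sp

# assumption: `adata` contains all ~40 samples, with "spliced" and
# "unspliced" layers and a sample label in adata.obs["batch"]
scv.pp.filter_and_normalize(adata, min_shared_counts=20, n_top_genes=2000)

# sc.pp.combat operates on adata.X, so temporarily swap each layer in,
# correct it, and write the corrected values back
orig_X = adata.X
for layer in ["spliced", "unspliced"]:
    mat = adata.layers[layer]
    adata.X = mat.toarray() if sp.issparse(mat) else mat
    sc.pp.combat(adata, key="batch")
    adata.layers[layer] = adata.X
adata.X = orig_X

# recompute the first-order moments (Ms/Mu) from the corrected layers;
# the usual DeepVelo training then runs unchanged on this adata, e.g.
# (per my reading of the DeepVelo README, names may differ by version):
#   from deepvelo import train, Constants
#   trainer = train(adata, Constants.default_configs)
scv.pp.moments(adata, n_pcs=30, n_neighbors=30)
```

Whether ComBat (or any correction) is statistically appropriate for unspliced counts is exactly the open question noted further below, so treat this as a starting point rather than a recipe.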
On the other hand, if you really want to input the latent embeddings, there are several spots, such as the input layer size and the objective function computation, that you would need to set up manually. I'd be happy to help work through these changes if you decide to go with the latent embeddings later.
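If you do explore the embedding route, obtaining the batch-corrected latent space itself is straightforward with scvi-tools. A minimal sketch, assuming raw counts in a `counts` layer and a `batch` column in `obs`; the DeepVelo-side changes mentioned above are not shown:

```python
import scvi

# assumption: raw counts stored in adata.layers["counts"] and the
# sample label in adata.obs["batch"]
scvi.model.SCVI.setup_anndata(adata, layer="counts", batch_key="batch")
model = scvi.model.SCVI(adata, n_latent=30)
model.train()

# batch-corrected latent space; feeding this into DeepVelo still requires
# the manual changes to the input layer size and objective described above
adata.obsm["X_scVI"] = model.get_latent_representation()
```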
Nevertheless, I'd say that how to do batch correction for velocity estimation is, generally speaking, still an open question in the field, partly due to the large variation of unspliced reads across batches. So let me know if you encounter any issues later on.