
Print out the alpha parameters #3

Closed
sshen82 opened this issue May 19, 2021 · 1 comment
sshen82 commented May 19, 2021

Hi, thank you for your great software! Everything works well, but could you please add alpha, the weight parameters, to the output? I would like to see which data input is more important.

vanhoan310 (Collaborator) commented May 20, 2021

Dear sshen82,

thank you for using JVis. Here is how to print the modality weights.

For j-SNE:
data = {'rna': expr_reduced, 'chromatin': atac_reduced, 'noise': noise_matrix}
jsne_obj = JTSNE(init='random')
# _lambda is the lambda parameter in our paper. We suggest using _lambda between 1 and 3 for j-SNE.
joint_tsne = jsne_obj.fit_transform(X=data, method='auto', _lambda=3)
# print alpha: the modality weights, in the same order as the data dict (here: rna, chromatin, noise)
print(jsne_obj.alpha)

For j-UMAP:
jumap_obj = JUMAP(init='random')
# ld is the lambda parameter in our paper. Try ld between 0.5 and 3 for j-UMAP.
joint_umap = jumap_obj.fit_transform(X=data, method='auto', ld=1, max_iter=10)
# print alpha
print(jumap_obj.alpha)
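Once the model is fitted, the alpha attribute can be inspected like any ordinary sequence. A minimal sketch of ranking modalities by their learned weight (the alpha values below are made up purely for illustration; in practice the entries come from the fitted object, in the same order as the keys of the data dict):

```python
# Hypothetical alpha values for illustration only; order matches the
# input dict used above: rna, chromatin, noise.
alpha = [0.55, 0.40, 0.05]
modalities = ['rna', 'chromatin', 'noise']

# Pair each modality with its learned weight and rank by importance.
weights = dict(zip(modalities, alpha))
ranked = sorted(weights.items(), key=lambda kv: kv[1], reverse=True)
for name, w in ranked:
    print(f"{name}: {w:.2f}")
```

Here the ranking would show that the noise matrix receives a near-zero weight, i.e. it contributes little to the joint embedding.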

If you don't specify lambda in fit_transform, uniform weights are used by default.

You can find more in our examples here:
https://github.com/canzarlab/JVis_paper/tree/master/proof_of_principle

Please let me know if you have any questions.
