OSError and Memory error while running gseapy #128
-
Hi, I would like to seek advice re: an error I encountered while running gseapy. Below is the error message:

    MemoryError                               Traceback (most recent call last)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\gseapy\gsea.py in prerank(rnk, gene_sets, outdir, pheno_pos, pheno_neg, min_size, max_size, permutation_num, weighted_score_type, ascending, processes, figsize, format, graph_num, no_plot, seed, verbose)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\gseapy\gsea.py in run(self)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\gseapy\algorithm.py in gsea_compute(data, gmt, n, weighted_score_type, permutation_type, method, pheno_pos, pheno_neg, classes, ascending, processes, seed, single, scale)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\joblib\parallel.py in __call__(self, iterable)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\joblib\parallel.py in dispatch_one_batch(self, iterator)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\joblib\parallel.py in _dispatch(self, batch)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\joblib\_parallel_backends.py in apply_async(self, func, callback)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\joblib\_parallel_backends.py in __init__(self, batch)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\joblib\parallel.py in __call__(self)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\joblib\parallel.py in <listcomp>(.0)
    D:\User\LocalDocuments\Bioinformatics\Jupyterlab\lib\site-packages\gseapy\algorithm.py in enrichment_score(gene_list, correl_vector, gene_set, weighted_score_type, nperm, rs, single, scale)

    MemoryError: Unable to allocate 22.7 MiB for an array with shape (101, 29396) and data type float64
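(For reference, the failed allocation matches the array dimensions: 101 × 29396 float64 values is 101 × 29396 × 8 bytes ≈ 22.7 MiB. The 101 rows presumably correspond to permutation_num + 1 and the 29396 columns to the number of ranked genes, and since joblib dispatches several such computations to parallel worker processes, peak memory use can grow with the number of processes.)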
Replies: 4 comments 3 replies
-
Oops, could you try to run gseapy on a machine with more RAM? Or would you like to run this script from the command line? Running the multiprocessing module inside Jupyter on Windows often fails.
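As a concrete illustration of the command-line suggestion, here is a minimal sketch of a standalone script. The keyword arguments mirror the prerank signature shown in the traceback above; the file names and parameter values are placeholders, not taken from the original post.

```python
# run_prerank.py -- a minimal sketch; run it with `python run_prerank.py`.
import gseapy

def main():
    # processes=1 avoids spawning extra joblib workers, which both lowers peak
    # memory use and sidesteps the Windows/Jupyter multiprocessing problems
    # mentioned above; permutation_num is kept modest for the same reason.
    gseapy.prerank(
        rnk="my_ranking.rnk",           # hypothetical two-column rank file
        gene_sets="my_gene_sets.gmt",   # hypothetical GMT file
        outdir="prerank_out",
        permutation_num=100,
        processes=1,
        seed=6,
        no_plot=True,                   # skip per-term plots to save memory
    )

if __name__ == "__main__":
    # The guard matters on Windows, where child processes re-import this file.
    main()
```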
-
You may try installing v0.10.5 via pip or conda; I have updated gseapy to reduce memory usage.
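For example, `pip install gseapy==0.10.5`, or `conda install -c bioconda gseapy=0.10.5` (assuming the usual bioconda channel for the conda route).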
-
I solved the issue by using gseapy.algorithm.gsea_compute instead of prerank().
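For anyone wanting to reproduce that workaround, below is a rough, untested sketch of calling gsea_compute directly. It relies only on the parameter names visible in the traceback above (gseapy ~0.10.x); the input formats, the chosen values, and the structure of the returned object are assumptions and may differ between versions.

```python
import pandas as pd
from gseapy.algorithm import gsea_compute

# Assumed inputs: a ranking metric as a pandas Series (gene -> score, sorted
# in descending order) and gene sets as a plain dict of name -> list of genes.
rnk = pd.read_csv("my_ranking.rnk", sep="\t", header=None, index_col=0)
rnk = rnk.squeeze("columns").sort_values(ascending=False)
gene_sets = {"EXAMPLE_SET": ["TP53", "EGFR", "MYC"]}  # placeholder gene set

results = gsea_compute(
    data=rnk,
    gmt=gene_sets,
    n=100,                        # number of permutations
    weighted_score_type=1,
    permutation_type="gene_set",  # prerank-style (gene-set) permutation
    method=None,                  # ranking method only matters for phenotype permutation
    pheno_pos="Pos",
    pheno_neg="Neg",
    classes=None,                 # no phenotype labels in prerank mode
    ascending=False,
    processes=1,                  # single process keeps memory use low
    seed=6,
    single=False,
    scale=False,
)
# 'results' holds the enrichment statistics; how it unpacks (scores, hit
# indices, p-values, ...) depends on the gseapy version, so inspect it first.
```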
-
Please refer to the Rust binding of GSEApy >= v0.13.0; this is now fixed.
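Upgrading with `pip install --upgrade gseapy` should pull in a Rust-backed release on platforms with prebuilt wheels (an assumption; check the project's release notes for your platform).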