High memory usage (limits maximum Python file size) #843

Closed
Viele opened this issue Feb 2, 2017 · 3 comments

Viele commented Feb 2, 2017

I have a problem with a Python process suddenly taking up all 24 GB of RAM. It only happens when I point the autocompletion to the location of the stubs from the Autodesk Maya devkit (containing pyMel, PySide2 and OpenMaya). In total these stub files are 60 MB.
I've tried this with SublimeText/Anaconda and Atom/autocomplete-python. Both use Jedi.

As soon as I do something like this

import pymel.core as pm
pm.

the Python process takes up all RAM and creates cache files in \AppData\Roaming\Jedi of up to ~12 GB.
After that the process decreases its RAM usage, but it still stays at around 3 GB.

I've also tried deleting all but one stub file. While it is considerably faster, it still creates 60 MB of cache out of a 600 KB file, using about 800 MB of RAM while doing so.
I have the same problem on my desktop PC as well as my laptop.
I've attached the single stub file in the hope that this is reproducible. (All the stubs can be downloaded freely in the Autodesk Maya 2017 devkit.)
py.zip

Win10
Autodesk Maya 2017
version is 0.10.0

davidhalter (Owner) commented

I think we might need to add a setting to Jedi that limits the size of Python files.
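A rough sketch of what such a guard could look like (illustrative only; the setting name and helper below are hypothetical, not Jedi's actual API):

import os

# Hypothetical threshold, not an actual Jedi setting.
MAX_FILE_SIZE_BYTES = 500 * 1024  # 500 kB

def load_source(path):
    """Return the file contents, or None if the file is too large to analyze."""
    if os.path.getsize(path) > MAX_FILE_SIZE_BYTES:
        # Skip huge (usually generated) files instead of parsing and caching them.
        return None
    with open(path, encoding='utf-8', errors='replace') as f:
        return f.read()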

ghost commented Dec 30, 2017

Another possible solution is to compress the cache by replacing repetitive patterns in the cache with special Unicode characters, and saving the mapping from those special characters back to the patterns in a separate dictionary file.

Example:
Original: The solution is to compress the cache
New: * solution is to compress * cache
Dictionary: (*) -> (the)
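A minimal sketch of that idea (illustrative only, not Jedi code; it substitutes frequent words with characters from a Unicode private-use area and assumes the cache text does not already contain such characters):

from collections import Counter

PRIVATE_USE_START = 0xE000  # first code point of a Unicode private-use area

def compress(text, max_entries=100):
    # Map words that occur more than once to single private-use characters.
    words = text.split(' ')
    common = [w for w, n in Counter(words).most_common(max_entries)
              if n > 1 and len(w) > 1]
    mapping = {w: chr(PRIVATE_USE_START + i) for i, w in enumerate(common)}
    compressed = ' '.join(mapping.get(w, w) for w in words)
    # Return the compressed text plus the reverse mapping (the "dictionary file").
    return compressed, {v: k for k, v in mapping.items()}

def decompress(compressed, dictionary):
    return ' '.join(dictionary.get(w, w) for w in compressed.split(' '))

text = "the solution is to compress the cache"
small, table = compress(text)
assert decompress(small, table) == text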

@davidhalter Limiting the size of Python files might end up losing useful functionality, and most Python code is long these days anyway.

davidhalter (Owner) commented

@MohamedAlFahim By large code I mean code that is bigger than 500 kB; one such file I have contains 15,000 lines. That size is extremely rare for human-generated code (even 5,000 lines is rare).

The files @Viele is talking about are 60 MB. That's pretty extreme: probably something like 2e6 lines of code. That much code just makes no sense, and it's obvious that completion is slow for it (even syntax highlighting for files like that is slow in VIM).
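The 2e6 figure is a back-of-envelope estimate; a rough check, assuming an average line length of about 30 bytes:

file_size = 60 * 1024 * 1024        # 60 MB in bytes
bytes_per_line = 30                 # assumed average, including the newline
print(file_size // bytes_per_line)  # about 2.1 million lines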

davidhalter changed the title from "High memory usage" to "High memory usage (limits maximum Python file size)" on Mar 18, 2018