Memory Leak #36
Comments
No worries. import "github.com/moovweb/gokogiri/help" and call help.LibxmlCleanUpParser(). Note that once you call the above function, you can no longer use gokogiri to parse any more documents. This should clean up a lot of memory, but it renders the parser useless until your program exits.
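For readers skimming this thread, a rough sketch of how that teardown call fits into a program; it assumes the gokogiri.ParseHtml entry point and the doc.Free() call that come up later in this thread:

```go
package main

import (
	"fmt"

	"github.com/moovweb/gokogiri"
	"github.com/moovweb/gokogiri/help"
)

func main() {
	page := []byte("<html><body><h1>hello</h1></body></html>")

	doc, err := gokogiri.ParseHtml(page)
	if err != nil {
		fmt.Println("parse error:", err)
		return
	}

	// Release the libxml memory held by this document as soon as it's no
	// longer needed; Go's garbage collector does not track C allocations.
	doc.Free()

	// Tear down libxml's global parser state. After this call gokogiri can
	// no longer parse documents, so only do it when the program is exiting.
	help.LibxmlCleanUpParser()
}
```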
Right, that explains why my program held onto a few hundred MB even when it was done. So are you saying a long-running process (parsing tens of thousands of large pages) with gokogiri is not possible, because memory will keep growing?
Sorry it took me so long to get back to you. Do you have a copy of the 42k list of URLs you're using? I'd like to replicate the issue if possible to investigate further.
No problem, I linked to the 42K URL list in the original post.
woops, sorry, I'm silly =x
Also, what version of go are you using?
go1.1.1 amd64
Hey @JakeAustwick, sorry, I haven't had too much time to investigate. However, I did manage to do this: I ran the program you provided with the urls you provided, and though the memory did keep climbing for a while, it seemed to somewhat stabilize around 300-400MB on my machine. I totally understand that that's a rather high memory footprint for what that program is doing, but my intuition is that perhaps Go's garbage collection is not as aggressive as it could be when cleaning up resources. Have you tried tweaking the runtime garbage collection parameters?
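As a reference point, the standard knobs for that are the GOGC environment variable and debug.SetGCPercent; a minimal sketch (the value below is just for illustration):

```go
package main

import "runtime/debug"

func init() {
	// Trigger a collection when the heap grows 20% over the live data,
	// instead of the default 100%. Equivalent to running with GOGC=20.
	debug.SetGCPercent(20)
}

func main() {
	// ... crawl and parse as usual ...
}
```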
I'd accept 300-400MB happily; my problem was that it rose to >1GB and kept rising. Did you run through the entire list?
So I've noticed the memory jump all the way up to 700MB at some points, but after going through the entire list of URLs that you provided, the memory settled down to 430MB, at least on my machine, which makes me feel that it's a garbage collection issue. You could try forcing the garbage collector to run at each iteration and see how that affects it; however, that'll probably take quite a while to run =/
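For what it's worth, "forcing the garbage collector to run at each iteration" could look roughly like this; fetchAndParse is a hypothetical stand-in for the per-URL work in the test program:

```go
package main

import "runtime"

func main() {
	urls := []string{"http://example.com/a", "http://example.com/b"} // placeholder list

	for _, u := range urls {
		fetchAndParse(u) // hypothetical: fetch the page and parse it with gokogiri

		// Force a full collection after every page so that anything the
		// previous iteration dropped is reclaimed before the next fetch.
		runtime.GC()
	}
}

func fetchAndParse(u string) {
	_ = u // placeholder body
}
```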
Hmm, I seem to remember my memory going much higher. I'll give it another run and let you know.
@JakeAustwick @mdayaram Did you find a solution to this? I'm looking at doing something similar with Gokogiri and have run into a similar issue with other libs in the past.
I never continued with this project in Go, I simply rewrote it in Python with gevent and lxml. Sorry.
Unfortunately I was never able to replicate the issue. One thing that we never clarified was what operating system @JakeAustwick was running on. I ran all my tests on Linux/Ubuntu; however, I know that there are some memory issues with Windows. Also, these issues could've all disappeared with the introduction of go1.2. As far as I can tell, it seemed specific to the environment. My advice for you @jwarzech would be to run the test program in this issue with the URLs and see how your memory is handled. If it's doing OK (around the 400MB-ish range), then I wouldn't be concerned with it. Here at Moovweb we have several production boxes constantly parsing HTML pages using gokogiri without any major memory issues, but all those boxes are also running Linux and go1.2.
Just for reference, I was running on Ubuntu 12.04, with my version of libxml installed from the Ubuntu repos.
Would implementing this inside the lib help? http://golang.org/pkg/runtime/#SetFinalizer
We did have an implementation of |
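For anyone unfamiliar with the suggestion, here is a generic sketch of the runtime.SetFinalizer pattern; the Document wrapper below is hypothetical and not gokogiri's actual type:

```go
package main

import "runtime"

// Document is a hypothetical wrapper around a C-allocated libxml document.
type Document struct {
	freed bool
}

// Free releases the underlying C memory; safe to call more than once.
func (d *Document) Free() {
	if !d.freed {
		// ... call into libxml to free the document here ...
		d.freed = true
	}
}

// NewDocument registers a finalizer so the C memory is eventually released
// even if the caller forgets to call Free. Finalizers only run when the GC
// notices the wrapper is unreachable, so this is a safety net rather than a
// replacement for calling Free promptly.
func NewDocument() *Document {
	d := &Document{}
	runtime.SetFinalizer(d, func(doc *Document) { doc.Free() })
	return d
}

func main() {
	d := NewDocument()
	defer d.Free()
}
```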
Hello, I was wondering if this project is still active? It's been 4 months since any commits, with an open memory leak, which is a pretty scary thing for server use. Either way, thank you for this project, it's great work.
No one has been able to reproduce this memory leak (it should probably be closed by @mdayaram for that reason). |
If it can't be reproduced, we'll close it. |
Hi @mdayaram, I had the same issue using "doc, err := gokogiri.ParseHtml(page)" on my Mac Pro with go version go1.4 darwin/amd64. I ran a heavy test parsing more than 46557 websites. My memory usage kept going up to 120GB without declining; then the program process got killed and my machine ran out of memory. Note: I simply used gokogiri.ParseHtml(page). I tried to run runtime.GC() every minute, but it did not help.
This still appears to be popping up intermittently, so I'll take a look this weekend and see if I can turn up more info. Reopening for now. |
@akhleung @JakeAustwick
Thanks for the test case! I'll try it out. |
@weil I ran your example, and it did indeed leak very badly. However, when I modified the loop to assign the document to a variable and call its Free() method, the leak appeared to go away.
If this indeed fixes the leak, the documentation and example for ParseHtml (and ParseXml) should be updated to include a call to Free().
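In other words, a parse loop along these lines, with Free called once per document (the fetching code and URL list here are assumptions for illustration, not the original test program):

```go
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"

	"github.com/moovweb/gokogiri"
)

func main() {
	urls := []string{"http://example.com/"} // placeholder list

	for _, u := range urls {
		resp, err := http.Get(u)
		if err != nil {
			fmt.Println("fetch error:", err)
			continue
		}
		body, err := ioutil.ReadAll(resp.Body)
		resp.Body.Close()
		if err != nil {
			fmt.Println("read error:", err)
			continue
		}

		doc, err := gokogiri.ParseHtml(body)
		if err != nil {
			fmt.Println("parse error:", err)
			continue
		}

		// ... use doc here (searches, extraction, etc.) ...

		// Free the document before moving on; its memory lives in the C heap
		// (libxml) and is invisible to Go's garbage collector.
		doc.Free()
	}
}
```

Note that a defer inside the loop would not run until the surrounding function returns, so calling Free directly at the end of each iteration keeps the footprint bounded.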
I'm seeing this as well and can reproduce with weil's code |
Hi, I'm new to Golang and I want to use this library to process some small HTML pages.
Hey, I'm new to Go so please excuse me if the problem is with my code.
When I run the following code, the memory just keeps growing and growing, as if it is being leaked somewhere. It leaks slowly though; if I run it with a list of 42k URLs, it slowly just keeps on climbing and climbing. You should be able to spot it with this URL list: https://gist.github.com/JakeAustwick/82c9d4ce300639a4d275/raw/368c41ce6ba95f03cbc25a188dd3c07646a068b0/gistfile1.txt
Can you spot what I'm doing wrong, or have I found a bug?
Once again I apologise if it is me doing something wrong.