I encountered an out-of-memory error when making large scans (thousands of pages).
I have increased Node's heap limit for both `npm run cli` and `npm run start` so the process can allocate up to 6 GB (it consumes memory until it hits that cap, then errors out).
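For reference, this is roughly how I'm setting the limit, via `NODE_OPTIONS` (assuming the npm scripts run a plain `node` process underneath, which inherits the variable):

```sh
# Raise the V8 old-space heap cap to ~6 GB (6144 MB).
# The process still grows to the cap on large scans and then dies.
NODE_OPTIONS="--max-old-space-size=6144" npm run cli
NODE_OPTIONS="--max-old-space-size=6144" npm run start
```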
If I crawl this sitemap.xml file: https://cnib.ca/en/sitemap.xml?region=on
the scan fails. It contains over 1,800 pages, and I had hoped the tool could manage sites larger than this.
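As a stopgap I've considered splitting the sitemap into smaller batches and scanning each batch separately. A rough sketch (the 200-URL batch size is arbitrary, and this assumes a flat sitemap with plain `<loc>` entries rather than a sitemap index):

```sh
# Extract the page URLs from the sitemap, one per line.
curl -s 'https://cnib.ca/en/sitemap.xml?region=on' \
  | grep -o '<loc>[^<]*</loc>' \
  | sed 's/<loc>//; s/<\/loc>//' \
  > urls.txt

# Split into 200-URL batch files (batch-aa, batch-ab, ...) to scan one at a time.
split -l 200 urls.txt batch-
```

This keeps each run small, but it's a workaround rather than a fix, and it loses whole-site reporting.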
What's the best way to deal with this memory error?