

"Too many open files" error crashing scraper #470

Closed
TheTrueColonel opened this issue Oct 8, 2024 · 2 comments
@TheTrueColonel

Describe the bug

When using the scraper, after roughly 33 users have been scraped, the program crashes and returns to the main menu.

To Reproduce

Steps to reproduce the behavior:
Hint: entering the exact command and arguments would be a good idea.

Example

  1. Run the ofscraper command
  2. Select over 33 users to scrape
  3. Wait until crash

Expected behavior

Scraper does not crash due to too many open files.

Screenshots/Logs

Logs

Config

System Info

  • OS: Arch Linux
  • Browser: Firefox
  • Version: 131 64-bit
  • binary or python: pipx
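The crash described above is the classic EMFILE ("Too many open files") failure. As a minimal sketch, not taken from ofscraper itself, it can be reproduced in isolation on a POSIX system by lowering the soft file-descriptor limit and opening files without closing them:

```python
import resource
import tempfile

# Save the current limits, then lower the soft limit so the failure
# is easy to trigger (POSIX-only; the hard limit is left unchanged).
orig_soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (64, hard))

handles = []
hit_limit = False
try:
    # Open more files than the soft limit allows, never closing them.
    for _ in range(128):
        handles.append(tempfile.TemporaryFile())
except OSError as exc:
    hit_limit = True
    print(exc)  # typically "[Errno 24] Too many open files"
finally:
    for h in handles:
        h.close()
    # Restore the original soft limit.
    resource.setrlimit(resource.RLIMIT_NOFILE, (orig_soft, hard))
```

If the scraper opens one or more descriptors per user (log files, sockets, downloads) without closing them, about 33 users would be enough to exhaust a low soft limit.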
@A-Random-Mf

@SSj-Saturn

This has suddenly started happening for me too, even though my ulimit is set to unlimited. It has never been an issue before.
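Worth noting: a shell's `ulimit -n` only applies to that shell and its children, so a process launched another way (e.g. via pipx, a service manager, or a different session) may inherit a different limit. As a hedged diagnostic sketch (POSIX-only, not part of ofscraper), the limits of the running process itself can be inspected and the soft limit raised:

```python
import resource

# Inspect the file-descriptor limits of *this* process, which may
# differ from what `ulimit -n` reports in an interactive shell.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft={soft} hard={hard}")

# An unprivileged process may raise its soft limit up to the hard
# limit; only root can raise the hard limit itself.
if hard != resource.RLIM_INFINITY and soft < hard:
    resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```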


3 participants