TypeError: 'coroutine' object is not iterable #1576

Closed
Bulli opened this issue Aug 13, 2021 · 29 comments

Comments

@Bulli

Bulli commented Aug 13, 2021

I updated to the latest version. When I start the script, it ends with this error.
However, after reverting to 1491 it works fine as before.

(xxxxx = anonymized)

Names: Username = username | 0 = All | 1 = default
0
Auth (V1) Attempt 1/10
Welcome xxxxx | xxxxx
Scraping Paid Content
Scraping - xxxxx | 1 / 9
Scraping - xxxxx | 2 / 9
Scraping - xxxxx | 3 / 9
Scraping - xxxxx | 4 / 9
Traceback (most recent call last):
File "xxxx\start_ofd.py", line 60, in
loop.run_until_complete(main())
File "xxxx\Python39\lib\asyncio\base_events.py", line 642, in run_until_complete
return future.result()
File "xxxx\start_ofd.py", line 44, in main
api = await main_datascraper.start_datascraper(
File "xxxx\datascraper\main_datascraper.py", line 90, in start_datascraper
await module.paid_content_scraper(api, identifiers)
File "xxxx\modules\onlyfans.py", line 401, in paid_content_scraper
unrefined_set = await tqdm.gather(*tasks, **settings)
File "xxxx\Python39\lib\site-packages\tqdm\asyncio.py", line 74, in gather
ifs = [wrap_awaitable(i, f) for i, f in enumerate(fs)]
TypeError: 'coroutine' object is not iterable
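
For context, and not specific to this project's code: Python raises this exact TypeError whenever something tries to iterate a single coroutine object where it expected an iterable of awaitables, which usually means an async function was called without await, or a gather-style helper received a coroutine instead of a sequence of them. A minimal sketch with plain asyncio (not the scraper's code) that reproduces it:

```python
# Minimal sketch (plain asyncio, not the scraper's code) of how this
# TypeError arises: something iterates a single coroutine object where
# it expected an iterable of awaitables.
import asyncio


async def fetch(i: int) -> int:
    await asyncio.sleep(0)      # stand-in for a network request
    return i


async def main() -> None:
    tasks = [fetch(i) for i in range(3)]

    # Correct: pass an iterable of awaitables (unpacked here) to gather.
    print(await asyncio.gather(*tasks))     # [0, 1, 2]

    # Broken: iterating a bare coroutine reproduces the traceback's error.
    coro = fetch(0)
    try:
        list(coro)
    except TypeError as exc:
        print(exc)              # 'coroutine' object is not iterable
    finally:
        coro.close()            # silence the "was never awaited" warning


asyncio.run(main())
```

In practice that points at either a missing await somewhere in the calling code or a dependency that doesn't match what the code expects (the requirements pin tqdm>=4.62.0), which would be consistent with the pip install -r requirements.txt fix reported further down this thread.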

@brakheart

having the exact same error here, also after updating to the latest version

@metalmau5

+1 having this issue

@JWeavis

JWeavis commented Aug 19, 2021

+1 having this issue.
Wiped the dir, loaded a new 7.4.1, rebuilt configs, updated requirements, tried both tqdm versions.

@americanseeder1865
Contributor

-1 not having this issue. Let's take a look at some things. First, what platform are you using: Windows, Linux, or Docker?

@Bulli
Author

Bulli commented Aug 20, 2021

Windows 10

@americanseeder1865
Contributor

Alright. Does your requirements.txt look like this?

requests
beautifulsoup4
urllib3
win32-setctime
python-socks[asyncio]
psutil
python-dateutil
lxml
mergedeep
jsonpickle
ujson
sqlalchemy==1.4.20
alembic
tqdm>=4.62.0
selenium
selenium-wire==2.1.2
user_agent
aiohttp
aiohttp_socks

@chibi624

> Alright. Does your requirements.txt look like this?
> [requirements list quoted above]

I'm having the same issue and my requirements look just like that

@americanseeder1865
Contributor

Alright, are you on the latest update?

@americanseeder1865
Contributor

I am going to help you all the way through here.

@chibi624

> Alright, are you on the latest update?

Yes I am

@JWeavis

JWeavis commented Aug 20, 2021

Also tested with 7.4.1. I think it's related to recovering from the previous run, since installing it to a new dir with no saved data works.
A clean install and config in a new OF_471 folder auths and works.
Copying the .sites folder from the non-working 7.4 folder to the new 7.4.1 folder causes it to error out again.

Was able to narrow it down to metadata files in one of the member folders by moving folders over and running the scan one person at a time. I had to delete the Metadata folders from a second and third person as well.
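
A minimal sketch of that bisection, for anyone who would rather move metadata aside than delete it. The .sites layout, folder names, and paths below are assumptions about a typical install, not something the project guarantees: stash every member's Metadata folder into a backup, then restore them one at a time between runs until the crash comes back.

```python
# Hypothetical helper for the bisection described above. The .sites layout
# and paths are assumptions, not the project's documented structure.
import shutil
from pathlib import Path

SITES_DIR = Path(r"D:\OnlyFans\.sites\OnlyFans")     # assumed install location
BACKUP_DIR = Path(r"D:\OnlyFans\metadata_backup")    # anywhere outside the scraper


def stash_metadata(member: str) -> None:
    """Move one member's Metadata folder out of the way (non-destructive)."""
    src = SITES_DIR / member / "Metadata"
    if src.is_dir():
        dest = BACKUP_DIR / member
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(src), str(dest / "Metadata"))
        print(f"stashed Metadata for {member}")


def restore_metadata(member: str) -> None:
    """Put a previously stashed Metadata folder back."""
    src = BACKUP_DIR / member / "Metadata"
    if src.is_dir():
        shutil.move(str(src), str(SITES_DIR / member / "Metadata"))
        print(f"restored Metadata for {member}")


if __name__ == "__main__":
    # Stash everything first; restore members one at a time between runs
    # until the error returns, and the last restored member is the culprit.
    for member_dir in sorted(p for p in SITES_DIR.iterdir() if p.is_dir()):
        stash_metadata(member_dir.name)
```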

@chibi624

chibi624 commented Aug 20, 2021

I've tried a fresh download 3 or 4 times and I've never migrated any files over at all, including the .sites file. I think I have an old .sites file somewhere with content from an old account, but there's no way it's getting pathed to. Is there a more thorough reset/uninstall than deleting the master folder and starting from scratch? Because that's what I've been doing.

@metalmau5

> Was able to narrow it down to metadata files in one of the member folders by moving folders over and running the scan one person at a time.

This worked. Thanks.

@Gamma-Velorum

Gamma-Velorum commented Aug 20, 2021

File "O:\OnlyFans-master\start_ofd.py", line 60, in
loop.run_until_complete(main())
File "C:\Users\XXXX\AppData\Local\Programs\Python\Python39\lib\asyncio\base_events.py", line 642, in run_until_complete
return future.result()
File "O:\OnlyFans-master\start_ofd.py", line 44, in main
api = await main_datascraper.start_datascraper(
File "O:\OnlyFans-master\datascraper\main_datascraper.py", line 66, in start_datascraper
setup, subscriptions = await module.account_setup(
File "O:\OnlyFans-master\modules\onlyfans.py", line 106, in account_setup
subscriptions += await manage_subscriptions(
File "O:\OnlyFans-master\modules\onlyfans.py", line 1270, in manage_subscriptions
for remote_blacklist in remote_blacklists:
TypeError: 'error_details' object is not iterable

There's got to be a better solution than deleting metadata. I have nearly 1000 folders. That'll be impossible to narrow it down.
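
This traceback is a related but distinct failure: manage_subscriptions iterates whatever the API call returned, and here it apparently got an error object back instead of a list. A generic defensive pattern for that class of bug, with a made-up stand-in class rather than the project's actual error_details type:

```python
# Generic pattern only; ErrorDetails is a made-up stand-in for whatever
# error object the API layer returns, not the project's actual class.
from dataclasses import dataclass
from typing import Union


@dataclass
class ErrorDetails:
    code: int
    message: str


def as_list(result: Union[list, ErrorDetails]) -> list:
    """Return the result if it is a list; otherwise log the error and return []."""
    if isinstance(result, ErrorDetails):
        print(f"API error {result.code}: {result.message}")
        return []
    return list(result)


remote_blacklists = ErrorDetails(401, "auth failed")   # simulated bad response
for remote_blacklist in as_list(remote_blacklists):
    print(remote_blacklist)                            # never reached in this simulated case
```

Checking the type before the for loop would turn the hard crash into a readable auth/API error message.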

@fatchai1994

fatchai1994 commented Aug 21, 2021

Same issue, fixed by running

pip install -r requirements.txt

again, with the master branch files pasted in place in the old version's folder.

@metalmau5

> There's got to be a better solution than deleting metadata. I have nearly 1000 folders. That'll be impossible to narrow it down.

It's obviously not ideal, but you could just keep previously scraped content in a separate folder from the scraper.

@chibi624

Still having this issue, and it's nothing to do with metadata, as it happens on an absolutely fresh copy of master. All past scrapes from old versions (from before this version crapped out) are in a different place on the drive.

@bmfss

bmfss commented Aug 22, 2021

Same issue. Tried fresh installs of -master and -7.4.1 on Windows 10, Python 3.9.5 (tags/v3.9.5:0a7dcbd, May 3 2021, 17:27:52) [MSC v.1928 64 bit (AMD64)] on win32.

@bmfss

bmfss commented Aug 22, 2021

Mine actually happens earlier on, but it's the same error:

PS C:\OnlyFans-master> python.exe start_ofd.py
Choose Profile
Names: Username = username | 0 = All | 1 = default
0
Auth (V1) Attempt 1/10
Welcome Xxxxxx | xxxxxxx
Traceback (most recent call last):
File "C:\OnlyFans-master\start_ofd.py", line 60, in
loop.run_until_complete(main())
File "C:\Python39\lib\asyncio\base_events.py", line 642, in run_until_complete
return future.result()
File "C:\OnlyFans-master\start_ofd.py", line 44, in main
api = await main_datascraper.start_datascraper(
File "C:\OnlyFans-master\datascraper\main_datascraper.py", line 66, in start_datascraper
setup, subscriptions = await module.account_setup(
File "C:\OnlyFans-master\modules\onlyfans.py", line 101, in account_setup
mass_messages = await authed.get_mass_messages(resume=imported)
File "C:\OnlyFans-master\apis\onlyfans\classes\create_auth.py", line 420, in get_mass_messages
items.extend(results2)
TypeError: 'coroutine' object is not iterable

(..hit enter..)

sys:1: RuntimeWarning: coroutine 'create_auth.get_mass_messages' was never awaited
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
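
The RuntimeWarning under the traceback is the key clue: results2 is the un-awaited return value of an async method, and list.extend() can't iterate a coroutine object. A minimal sketch of that pattern and its fix; the class below is a stand-in, with only the method name borrowed from the traceback:

```python
# Stand-in names only; not the scraper's actual API, just the missing-await
# pattern the RuntimeWarning points at.
import asyncio


class Auth:
    async def get_mass_messages(self, resume=None) -> list:
        await asyncio.sleep(0)          # stand-in for the real request
        return ["msg1", "msg2"]


async def main() -> None:
    authed = Auth()
    items: list = []

    # Broken: extend() would receive a coroutine object, not a list.
    # items.extend(authed.get_mass_messages())   # TypeError + "never awaited"

    # Fixed: await the coroutine first, then extend with the resulting list.
    items.extend(await authed.get_mass_messages())
    print(items)


asyncio.run(main())
```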

@bmfss

bmfss commented Aug 22, 2021

account1 has the issue (creator account, 2FA on, has a "linked account" (another creator account), not subscribed to any creators)

account2 works! (subscriber account, 2FA off, subscribed to a few creators)

Will try more combinations to narrow it down.

@apogorzelska

I get an identical error message. The account is a creator account without subscriptions (I was helping out a friend and trying to back up her content for her).

@americanseeder1865
Contributor

There is a fix but the author has lost access to their original account for now. Reinstall using this repo and report back here.

@JWeavis

JWeavis commented Aug 25, 2021

> There is a fix but the author has lost access to their original account for now. Reinstall using this repo and report back here.

Nope. Fresh install fails:
Scrape Processing
Name: xxxxxxxxxxxxxxxxxxxx
Traceback (most recent call last):
File "D:\OnlyFans-7.4.2\start_ofd.py", line 60, in
loop.run_until_complete(main())
File "C:\Program Files\Python39\lib\asyncio\base_events.py", line 642, in run_until_complete
return future.result()
File "D:\OnlyFans-7.4.2\start_ofd.py", line 44, in main
api = await main_datascraper.start_datascraper(
File "D:\OnlyFans-7.4.2\datascraper\main_datascraper.py", line 93, in start_datascraper
await main_helper.process_names(
File "D:\OnlyFans-7.4.2\helpers\main_helper.py", line 939, in process_names
result = await module.start_datascraper(authed, username, site_name)
File "D:\OnlyFans-7.4.2\modules\onlyfans.py", line 144, in start_datascraper
api_array = format_options(api_array, "apis")
File "D:\OnlyFans-7.4.2\modules\onlyfans.py", line 1363, in format_options
string += seperator
NameError: name 'seperator' is not defined

@americanseeder1865
Contributor

americanseeder1865 commented Aug 25, 2021

Lol I submitted a pull request for this. Unfortunately we are going to have to install yet again. Reinstall from my request instead for now. Sorry to keep you running around in a loop.

@americanseeder1865
Contributor

You are going to have to reinstall at some point. Email me when you want to and I will help you.

@JWeavis

JWeavis commented Aug 25, 2021

> Lol I submitted a pull request for this. Unfortunately we are going to have to install yet again. Reinstall from my request instead for now. Sorry to keep you running around in a loop.

That build works. TY

@americanseeder1865
Contributor

> That build works. TY

He accepted my pull request and it's fixed. Please pull from the original repo now. https://github.com/DIGITALCRIMINALS/OnlyFans

@ashaller2017

Hey, I'm getting this exact same error and was wondering if you found a fix.

@ashaller2017

    items.extend(results2)
TypeError: 'coroutine' object is not iterable
sys:1: RuntimeWarning: coroutine 'create_auth.get_mass_messages' was never awaited
