On deploy_challenge "IndexError: list index out of range" #28
Comments
Ok, after manually replaying the request with Postman I got it: the domain was actually not in CloudFlare, but still on Route53.
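For context: the `IndexError` in the traceback below comes from `r.json()['result']` being an empty list when CloudFlare has no zone matching the queried name (here, because the domain still lived on Route53). A defensive sketch of that lookup step (the helper name and error message are hypothetical; the hook's actual code indexes `[0]` directly):

```python
def zone_id_from_response(payload, domain):
    """Extract a zone id from a parsed CloudFlare /zones API response.

    Raises a descriptive error instead of IndexError when the account
    has no zone for the given domain.
    """
    result = payload.get('result') or []
    if not result:
        raise LookupError(
            "CloudFlare returned no zone matching {!r}; "
            "is the domain actually managed by CloudFlare?".format(domain)
        )
    return result[0]['id']
```

A check like this would have surfaced the real cause (domain not on CloudFlare) instead of a bare `list index out of range`.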
Agreed, I'll push an update in the coming days.
Yup, encountered the same bug, was scratching my head over it for quite a while.
Same. Resulting JSON from CF is:
Is this still an unaddressed issue @kappataumu ?
Ran into this issue on Ubuntu 18.04.
Appears to be related to #53
Looks like cloudflare deleted my site :(
Because `get_tld` changed: `tld = get_tld('http://' + domain)`
I have the same problem and I confirm that what @kuleyang says is correct. It fixes the problem.
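For readers hitting the `get_tld` variant of this failure: newer releases of the third-party `tld` package expect a full URL rather than a bare hostname, which is why the fix quoted above prepends a scheme. A minimal stdlib-only sketch of that normalization (the `with_scheme` helper is hypothetical, not part of the hook):

```python
def with_scheme(domain):
    """Prepend http:// to a bare domain so URL-expecting parsers
    (e.g. tld.get_tld) accept it; leave full URLs untouched."""
    return domain if '://' in domain else 'http://' + domain

# Mirrors the fix quoted above:
# tld = get_tld(with_scheme(domain))
```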
Try https://github.com/SeattleDevs/letsencrypt-cloudflare-hook to see if it resolves the issue. That fork is actively maintained and PRs are welcome.
This other fork works for me.
When I try to renew a certificate, the following happens:

```
Processing vpn.staging-wanderio.com
Traceback (most recent call last):
  File "cloudflare_hook/hook.py", line 196, in <module>
    main(sys.argv[1:])
  File "cloudflare_hook/hook.py", line 192, in main
    ops[argv[0]](argv[1:])
  File "cloudflare_hook/hook.py", line 165, in create_all_txt_records
    create_txt_record(args[i:i+X])
  File "cloudflare_hook/hook.py", line 103, in create_txt_record
    zone_id = _get_zone_id(domain)
  File "cloudflare_hook/hook.py", line 81, in _get_zone_id
    return r.json()['result'][0]['id']
IndexError: list index out of range
```
This has worked in the past, but all of a sudden it broke (a change in the CF API?).
CF_EMAIL and CF_KEY are exported in the environment in which the script runs
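For reference, the hook picks those credentials up from the environment, e.g. (placeholder values, not real credentials):

```shell
export CF_EMAIL='you@example.com'    # CloudFlare account email (placeholder)
export CF_KEY='0123456789abcdef'     # CloudFlare API key (placeholder)
```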