JSON decode error fix #256
Conversation
I've made a fix for this one before, but it seems the PR got lost in the meantime. Did you try this fix against `simplejson`? In case it doesn't work with other libs, please fall back to using this snippet: `import json as jsonlib`, and, where the response is parsed, `json_data = jsonlib.loads(response.content)`
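For illustration, a minimal sketch of that fallback applied inside a response-handling helper; the `parse_response` name is hypothetical, not an actual TheHive4Py function:

```python
import json as jsonlib

import requests


def parse_response(response: requests.Response):
    # Hypothetical helper: parse the body with the standard library's json
    # module instead of response.json(), so the exception raised on bad
    # payloads is always json.JSONDecodeError, regardless of which backend
    # (stdlib json vs. simplejson) requests picked up at import time.
    try:
        return jsonlib.loads(response.content)
    except jsonlib.JSONDecodeError as err:
        raise ValueError(f"Response body is not valid JSON: {err}") from err
```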
@Kamforka No, we didn't try it against simplejson. However, I will add that my Python knowledge isn't as good as the people here, so I will also defer to @octocolby if he has any other thoughts.
@wizedkyle I'm not sure. I don't mind your current solution if you can verify it against `simplejson`.
@Kamforka I am going to test the fix against simplejson tonight and will let you know the results.
@Kamforka using […]
@wizedkyle Maybe I explained myself poorly. I didn't mean to use the […]. But no worries, now I also have to verify it on my own; I'll get back to you.
Okay, I've checked the import and it fails:

```
>>> from requests import JSONDecodeError
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name 'JSONDecodeError' from 'requests'
```

So I'd just suggest falling back to my original recommendation and using the standard json library to parse the content of the response:

```python
import json as jsonlib

json_data = jsonlib.loads(response.content)
```

May I ask you to add this change to the PR? You need to adjust […]. I also tested it in an environment with and without `simplejson`.
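To illustrate the pitfall being discussed (the URL and surrounding code are illustrative only): with `simplejson` installed, requests uses it internally, so `response.json()` raises `simplejson.JSONDecodeError`; without it, `json.JSONDecodeError`. Both subclass `ValueError`, which suggests one portable way to catch either:

```python
import requests

# Hypothetical endpoint that returns a non-JSON body.
response = requests.get("https://example.com/not-json")
try:
    data = response.json()
except ValueError as err:
    # Both json.JSONDecodeError and simplejson.JSONDecodeError subclass
    # ValueError, so this catch works whether or not simplejson is installed.
    print(f"decode failed ({type(err).__name__}): {err}")
```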
Fixing this in #259 |
As outlined in #255, there can be JSON decode errors when running TheHive4Py on AWS Lambda, most likely due to Lambda not including the correct package that contains JSONDecodeError.
This PR updates JSONDecodeError to be imported from the requests package. Thanks to @octocolby for all the hard work identifying the root cause of this issue and providing a fix.
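As a sketch of one defensive pattern (an assumption, not necessarily the exact code in this PR): requests only gained its own JSONDecodeError in version 2.27.0, under requests.exceptions, so environments pinned to older releases, as some AWS Lambda runtimes may be, need a fallback import:

```python
try:
    # requests >= 2.27 exposes a unified JSONDecodeError here
    from requests.exceptions import JSONDecodeError
except ImportError:
    # older requests: fall back to the standard library's exception
    from json import JSONDecodeError
```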