Problems with .google_fit.token encoding #7
Yeah; mine looks identical. I thought the .token file would be in a JSON format like previous integrations. I've tried the script on two different Raspberry Pis as well as an Ubuntu box, and they all generated the same output.
In the HA subreddit, the dev replied to me that the token file isn't JSON and the format is correct. Right now I'm trying to find out how to debug correctly; I've added logging options to my config, but I'm getting nothing at all in the logs.
Well, I finally succeeded in getting an error message and it looks identical to your error.
Right down to the same position. Seems like it might be an encoding issue. Found this link on Stack Overflow that might be related: https://stackoverflow.com/questions/21129020/how-to-fix-unicodedecodeerror-ascii-codec-cant-decode-byte Hope the dev looks at this issue; I'd really like to get it working. So far the only thing I seem to have accomplished is killing my BadNest integration in the process!

UPDATE: Got my BadNest install working again by updating the authentications, but still no love for my google_fit token. Same errors as above after the BN fix. So at least the issue looks consistent.
Looking more into this, I feel like the issue may be in get_credentials.py; I think it may need to specify the encoding when it is saving the token file. I was reading the link I posted above, and it seems to point in the same direction.

Maybe on certain installs of Python the encoding defaults are correct, and on others they're not set and have to be declared to be certain. I've run mine on Pis with Debian and an Ubuntu 20.x desktop I have. I might try installing a Python for Windows setup and trying that as well. Of course, I'm just guessing here. I can sort of understand Python code, but I am by no means a dev; I'm just following logic and looking up stuff. But I really want to make this work and I'd like to show @IvanVojtko that I'm willing to help in any way I can. Hope this helps somehow and isn't seen as annoying! 😉
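For what it's worth, here is a rough sketch of the kind of change I mean, assuming get_credentials.py pickles the credentials object to the token file (the function name and path below are placeholders, not the actual script):

```python
import pickle

# Placeholder names -- this is a sketch, not the real get_credentials.py.
def save_token(creds, token_file=".google_fit.token"):
    # Writing with Python 3 in binary mode ("wb") and a binary pickle
    # protocol sidesteps locale/encoding defaults entirely.
    with open(token_file, "wb") as token:
        pickle.dump(creds, token, protocol=pickle.HIGHEST_PROTOCOL)
```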
Was just screwing around and thought, "What if I opened the token file in an editor and force saved it with UTF-8 encoding?" So I saved my original (renamed it), opened a copy in Sublime Text, and force saved it in UTF-8 encoding. When I restarted, the component still wasn't setting up, but I got a different error this time.
It looks like it bypassed the original encoding error and has possibly gone a bit further until it choked on the setup.
Further updates: I decided to try installing the latest version of Python for Windows, and after getting everything set up and installing the requirements, I was able to get a working token file. I even got the component to work for a bit, getting actual sensors and data. Unfortunately, this doesn't seem to play well in an installation using Google Calendar and BadNest. One integration will work and cause the others to fail; then I update the authentication in the failing component and the others fail. I've had all three working at once, but after a reboot of HA it will fail again. I've tried setting up individual projects in the Google Cloud developer console for each of them, but that seems to be having issues as well.

So I've really tried my best to work this out and provide data, but I don't think I can go any further without help or insight from the dev here. @daniperaleda - the only advice I can offer is to try a Python for Windows install to generate the token file - it DID seem to work, and if you're not running other Google API integrations, it may be your solution. Hope this helps. As for me, I'm hoping that @IvanVojtko gets the time to read through this and can offer some advice. Until then, I'm probably going to uninstall the GFit integration and try to get my original setup working again. Bummed, but Google really isn't making this easy.
I think I have the same issue.
Logger: homeassistant.components.sensor
Error while setting up google_fit platform for sensor
@mouws - You could try installing Python on Windows and generating the token file that way; it did work for me. But I ended up uninstalling the component. Even after separating it as its own project on the Google Cloud Console, it still would cause authentication issues with my GCalendar and BadNest implementations; taking it out immediately got both working again. It also seemed to cause availability issues for my setup - I was getting constant messages that my HA server was down, then back up again within minutes. Once I removed the component, this too went away. I'd love to get this working, and I've indicated my willingness to help/test/provide info, but it doesn't appear the dev has the time for this, so I'm sitting it out until I hear some response. Best of luck and I hope the suggestion helps.
Finally I got it working by reinstalling Python. I had both Python 2 and Python 3 installed, so I removed Python 2 and upgraded Python 3, and that produced a working binary token file for me.
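For anyone else wanting to check whether their regenerated token came out right, a quick sketch (the file name is an assumption; adjust it to wherever your token lives):

```python
# Binary pickle protocols (2 and up) start with the byte 0x80; the broken
# tokens in this thread start with protocol-0 text such as "ccopy_reg".
with open(".google_fit.token", "rb") as token:
    header = token.read(1)

print("binary pickle token" if header == b"\x80" else "protocol-0 / text-style pickle")
```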
@rpitera Thanks for the reply. |
Hi:
First of all, congrats on this project. I have tried to complete all the steps described, but the Google Fit token that gets generated does not look like a correct token file. This is the error I get:
Error while setting up google_fit platform for sensor
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/helpers/entity_platform.py", line 249, in _async_setup_platform
await asyncio.shield(task)
File "/usr/local/lib/python3.9/concurrent/futures/thread.py", line 52, in run
result = self.fn(*self.args, **self.kwargs)
File "/config/custom_components/google_fit/sensor.py", line 215, in setup_platform
client = _get_client(token_file)
File "/config/custom_components/google_fit/sensor.py", line 132, in _get_client
creds = _get_creds(token_file)
File "/config/custom_components/google_fit/sensor.py", line 100, in _get_creds
creds = pickle.load(token)
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe6 in position 1: ordinal not in range(128)
This is part of the TOKEN file:
ccopy_reg _reconstructor p0 (cgoogle.oauth2.credentials Credentials p1 c__builtin__ object p2 Ntp3 Rp4 (dp5 S'_quota_project_id' p6 NsS'_token_uri' p7 Vhttps://oauth2.googleapis.com/token p8 sS'_scopes' p9 (lp10 S'https://www.googleapis.com/auth/fitness.heart_rate.read' p11 aS'https://www.googleapis.com/auth/fitness.body.read'
Could you help me with this?
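A minimal sketch of what seems to be going on here, assuming the token shown above was written by Python 2's pickle (the file name below is a placeholder):

```python
import pickle

# Python 3's pickle decodes 8-bit string instances from Python 2 pickles as
# ASCII by default, so a byte like 0xe6 in the data raises UnicodeDecodeError,
# matching the traceback above.
with open(".google_fit.token", "rb") as token:
    try:
        creds = pickle.load(token)  # default encoding="ASCII" for py2 strings
    except UnicodeDecodeError:
        token.seek(0)
        creds = pickle.load(token, encoding="latin-1")  # permissive workaround
```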