
'Table.insert_data' does not support Datetime values #2957

Closed
inkrement opened this issue Jan 23, 2017 · 7 comments
Assignees
Labels
- api: bigquery — Issues related to the BigQuery API.
- priority: p1 — Important issue which blocks shipping the next release. Will be fixed prior to next release.
- type: question — Request for information or clarification. Not an issue.

Comments

@inkrement

It seems the Python API library does not support native Python datetime objects when inserting BigQuery DATETIME values.

INFO:root:Sending load request
[(1L, 'as', datetime.datetime(2014, 2, 1, 0, 0), 'asdasd'),
 (2L, 'ds', datetime.datetime(2014, 2, 2, 0, 0), 'asdsad')]
Traceback (most recent call last):
...
  File "/home/chris/.local/lib/python2.7/site-packages/google/cloud/bigquery/table.py", line 770, in insert_data
    data=data)
  File "/home/chris/.local/lib/python2.7/site-packages/google/cloud/_http.py", line 326, in api_request
    data = json.dumps(data)
  File "/usr/lib/python2.7/json/__init__.py", line 244, in dumps
    return _default_encoder.encode(obj)
  File "/usr/lib/python2.7/json/encoder.py", line 207, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/lib/python2.7/json/encoder.py", line 270, in iterencode
    return _iterencode(o, 0)
  File "/usr/lib/python2.7/json/encoder.py", line 184, in default
    raise TypeError(repr(o) + " is not JSON serializable")
TypeError: datetime.datetime(2014, 2, 1, 0, 0) is not JSON serializable
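Until the library converts these values itself, one workaround (a sketch, assuming the DATETIME column accepts ISO-8601 strings, which is what the BigQuery docs describe for that type) is to pre-convert datetime values before building the rows, so that `json.dumps` only ever sees plain strings:

```python
import datetime

def rows_with_iso_datetimes(rows):
    """Return rows with datetime values replaced by ISO-8601 strings,
    which json.dumps can serialize without a custom encoder."""
    return [
        tuple(
            v.isoformat() if isinstance(v, datetime.datetime) else v
            for v in row
        )
        for row in rows
    ]

rows = [
    (1, 'as', datetime.datetime(2014, 2, 1, 0, 0), 'asdasd'),
    (2, 'ds', datetime.datetime(2014, 2, 2, 0, 0), 'asdsad'),
]
print(rows_with_iso_datetimes(rows))
# [(1, 'as', '2014-02-01T00:00:00', 'asdasd'), (2, 'ds', '2014-02-02T00:00:00', 'asdsad')]
```

The helper name `rows_with_iso_datetimes` is made up for this example; the point is simply to hand `insert_data` rows that are already JSON-serializable.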
@daspecster daspecster added the api: bigquery Issues related to the BigQuery API. label Jan 23, 2017
@daspecster
Contributor

Thanks for reporting @inkrement!

@danoscarmike danoscarmike added Status: Acknowledged priority: p1 Important issue which blocks shipping the next release. Will be fixed prior to next release. type: question Request for information or clarification. Not an issue. labels Feb 28, 2017
@daspecster
Contributor

@inkrement I haven't been able to reproduce this locally. Are you on the latest version of the library?

@inkrement
Author

I installed it directly from the pip repository, but I don't use the library anymore, so I can't say. :/

@lukesneeringer
Contributor

@daspecster Make sure there is a test for this case. If there already is, point to it and close this. If there is not, make one and post a PR.

@daspecster
Contributor

I believe this is covered in this system test.

@kkinder
Contributor

kkinder commented Mar 30, 2017

@daspecster Perhaps your environment is different? I'm still seeing the issue, and it's exceedingly easy to reproduce.

  1. Create a Python2.7 virtualenv. Activate it, obviously.
  2. pip install google-cloud-bigquery (currently resolves to the latest version on pip: google_cloud_bigquery-0.23.0-py2.py3-none-any)
  3. Make sure you're authenticated and create a test project. Do this in Python interactively:
>>> from google.cloud import bigquery
>>> import datetime
>>> client = bigquery.Client(project='myproject-1803')
>>> dataset = client.dataset('testset')
>>> dataset.create()
>>> table = dataset.table('test_datefield')
>>> table.schema = (bigquery.SchemaField('name', 'STRING'), bigquery.SchemaField('birthday', 'DATETIME'))
>>> table.create()
>>> table.insert_data(rows=[('Ken', datetime.datetime.now())])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/kkinder/bigquerytest/testenv/lib/python2.7/site-packages/google/cloud/bigquery/table.py", line 770, in insert_data
    data=data)
  File "/Users/kkinder/bigquerytest/testenv/lib/python2.7/site-packages/google/cloud/_http.py", line 294, in api_request
    data = json.dumps(data)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 243, in dumps
    return _default_encoder.encode(obj)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 207, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 270, in iterencode
    return _iterencode(o, 0)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 184, in default
    raise TypeError(repr(o) + " is not JSON serializable")
TypeError: datetime.datetime(2017, 3, 30, 14, 23, 50, 699725) is not JSON serializable

It's also a pretty common thing with Python's json library. Any time you pass a datetime object, this happens. The question is, what is the other end expecting for formatting?
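To make that last point concrete: the TypeError is standard `json` behavior, not BigQuery-specific. `json.dumps` has no encoding for `datetime` objects, and the usual fix is to supply a fallback via its `default` parameter. A minimal illustration (the encoder function and payload are invented for this example):

```python
import datetime
import json

def encode_datetime(obj):
    # Fallback called by json.dumps for any object it cannot serialize.
    if isinstance(obj, datetime.datetime):
        return obj.isoformat()
    raise TypeError(repr(obj) + " is not JSON serializable")

payload = {"birthday": datetime.datetime(2017, 3, 30, 14, 23, 50)}
print(json.dumps(payload, default=encode_datetime))
# {"birthday": "2017-03-30T14:23:50"}
```

Whether ISO-8601 is actually what the BigQuery endpoint expects for streamed row values is the open question in this thread; the snippet only shows where a conversion would have to hook in.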

@tseaver tseaver changed the title Datetime not supported 'Table.insert_data' does not support Datetime values May 16, 2017
@tseaver
Contributor

tseaver commented May 16, 2017

@tswast I'm running up against the difference in the JSON representation of datetime for row values (i.e., floating point seconds to microsecond precisions) vs. query parameters (ISO strings). I'm assuming that is a known issue (or design choice): are there other types where the row value representation is known to differ from query parameter representation?
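For reference, the two representations being contrasted look like this for the same value (a sketch only; the exact wire formats are defined by the BigQuery API, and the variable names here are illustrative):

```python
import datetime

dt = datetime.datetime(2014, 2, 1, 12, 30, 45, 123456)
epoch = datetime.datetime(1970, 1, 1)

# Row-value style: floating-point POSIX seconds, microsecond precision.
row_value = (dt - epoch).total_seconds()

# Query-parameter style: ISO-8601 string.
param_value = dt.isoformat()

print(row_value)    # float seconds since the epoch
print(param_value)  # '2014-02-01T12:30:45.123456'
```

Note that a float carries roughly 15–16 significant decimal digits, so epoch-seconds values near current timestamps sit close to the limit of exact microsecond precision, which is one reason the two representations are not trivially interchangeable.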

tseaver added a commit that referenced this issue May 18, 2017
* Move '_row{,s}_from_json' next to scalar '_from_json' helpers.

* Add converter helpers for row data scalars.

* Convert row data using helpers.

Closes #2957.

6 participants