csv_gdf.to_file(s3uri, driver="GeoJSONSeq")

  File "/opt/conda/envs/project/lib/python3.7/site-packages/geopandas/geodataframe.py", line 746, in to_file
    _to_file(self, filename, driver, schema, index, **kwargs)
  File "/opt/conda/envs/project/lib/python3.7/site-packages/geopandas/io/file.py", line 257, in _to_file
    colxn.writerecords(df.iterfeatures())
  File "/opt/conda/envs/project/lib/python3.7/site-packages/fiona/collection.py", line 361, in writerecords
    self.session.writerecs(records, self)
  File "fiona/ogrext.pyx", line 1291, in fiona.ogrext.WritingSession.writerecs
  File "fiona/ogrext.pyx", line 390, in fiona.ogrext.OGRFeatureBuilder.build
OverflowError: Python int too large to convert to C long
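For reference, a minimal self-contained sketch of the failure, not taken from the original report (the column name, value, and output path are made up); with geopandas writing through Fiona it should hit the same OverflowError, since the attribute only fits in an unsigned 64-bit integer:

import geopandas as gpd
import numpy as np
from shapely.geometry import Point

# A value above 2**63 - 1, e.g. an S2 cell id stored as uint64.
gdf = gpd.GeoDataFrame(
    {"s2_cell_id": np.array([2**63 + 123], dtype=np.uint64)},
    geometry=[Point(0.0, 0.0)],
    crs="EPSG:4326",
)

# GDAL stores integer fields as signed 64-bit values, so the write fails
# inside fiona's OGRFeatureBuilder with the OverflowError shown above.
gdf.to_file("cells.geojsonl", driver="GeoJSONSeq")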
Could you check whether the maximum integer you want to write is bigger than 2 ** 63 - 1? That is the maximum integer size GDAL supports (OGR_F_SetFieldInteger64 takes a long long as its input value).
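A quick way to check; a sketch that assumes csv_gdf is the GeoDataFrame from the traceback above, everything else is illustrative:

import pandas as pd

INT64_MAX = 2**63 - 1  # the largest value OGR_F_SetFieldInteger64 accepts

# Report every integer column whose maximum does not fit in a signed 64-bit int.
for col in csv_gdf.columns:
    if pd.api.types.is_integer_dtype(csv_gdf[col]):
        col_max = int(csv_gdf[col].max())
        if col_max > INT64_MAX:
            print(f"{col}: max value {col_max} exceeds 2**63 - 1")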
Converting those values from int to string avoids the Fiona crash; I'm not sure what other options there are for JSON-serializable values. It's curious that the S2 cell id as a uint64 seemed to be JSON-serializable with the vanilla json module but not through fiona/GDAL (IIRC).
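A sketch of that workaround, assuming the offending column is the uint64 S2 cell id (the column name here is made up):

# Cast the uint64 column to string before writing; the field then comes out
# as a string property in the GeoJSONSeq output, so no precision is lost.
csv_gdf["s2_cell_id"] = csv_gdf["s2_cell_id"].astype(str)
csv_gdf.to_file(s3uri, driver="GeoJSONSeq")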
Obscure error when writing GeoJSONSeq to AWS S3 (see the traceback above).
some gdf info
versions
Installed from pip package/wheel (not from conda).
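For completeness, recent geopandas releases can print the relevant environment details directly; a minimal sketch, assuming geopandas 0.7+ and Fiona are importable:

import geopandas
import fiona

geopandas.show_versions()  # geopandas, GDAL, PROJ, fiona, pyproj, ...
print("fiona", fiona.__version__, "GDAL", fiona.__gdal_version__)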