fix: Upgrade the pyarrow to latest v14.0.1 for CVE-2023-47248. #3841
What this PR does / why we need it:
Update pyarrow to the latest version, v14.0.1, which includes the fix for GHSA-5wvp-7f3h-6wmm (CVE-2023-47248).
Which issue(s) this PR fixes:
Fixes #3832
As the pyarrow Parquet documentation notes: "By default, for version='1.0' (the default) and version='2.4', nanoseconds are cast to microseconds ('us')."
```
./python/feast/transformation_server.py:54: writer.write_table(result_arrow)
./python/feast/infra/offline_stores/file.py:109: pyarrow.parquet.write_table(
./python/feast/infra/offline_stores/file.py:470: writer.write_table(new_table)
./python/feast/infra/offline_stores/contrib/spark_offline_store/spark.py:236: pq.write_table(table, tmp_file.name)
./python/feast/infra/offline_stores/contrib/mssql_offline_store/mssql.py:373: pyarrow.parquet.write_table(
./python/feast/infra/offline_stores/bigquery.py:358: pyarrow.parquet.write_table(table=data, where=parquet_temp_file, coerce_timestamps="us")
./python/feast/infra/offline_stores/bigquery.py:407: pyarrow.parquet.write_table(table=table, where=parquet_temp_file, coerce_timestamps="us")
./python/feast/infra/utils/aws_utils.py:207: pq.write_table(table, file_path)
./python/feast/infra/utils/aws_utils.py:356: pq.write_table(table, parquet_temp_file, coerce_timestamps="us")
./python/feast/infra/utils/aws_utils.py:1049: pq.write_table(table, parquet_temp_file)
```
I tried to keep the change minimal. If an error shows up in the future, for example in the "upload_arrow_table_to_athena()" function at "aws_utils.py:1049", a new PR can be created with the necessary unit tests and integration tests.
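As a side note, here is a small hypothetical helper (not part of Feast) sketching how a deployment could assert that its installed pyarrow meets the patched floor for CVE-2023-47248; the parsing is deliberately lenient about suffixes like "14.0.1.dev0":

```python
def is_patched(version: str, floor: tuple = (14, 0, 1)) -> bool:
    """Return True if a pyarrow version string meets the CVE-2023-47248 fix floor.

    Hypothetical helper for illustration; 14.0.1 is the first release
    containing the fix (GHSA-5wvp-7f3h-6wmm).
    """
    parts = []
    for piece in version.split(".")[:3]:
        # Keep only leading digits so pre-release suffixes don't crash parsing.
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits or 0))
    return tuple(parts) >= floor

# Example usage against the installed library:
#   import pyarrow
#   assert is_patched(pyarrow.__version__), "upgrade pyarrow to >=14.0.1"
print(is_patched("13.0.0"))  # False -- affected range
print(is_patched("14.0.1"))  # True -- contains the fix
```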