I have a use case where I need to download dataframes from multiple s3 buckets with different credentials.
By default, s3fs uses environment variables such as `AWS_PROFILE` and `AWS_ACCESS_KEY_ID` to determine credentials. However, this will not work for me, as I need different credentials for different buckets. The fs-s3fs docs show you can alternatively embed the credentials in the URL itself, like so:
https://fs-s3fs.readthedocs.io/en/latest/#authentication
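In URL form, that looks something like this (hypothetical credentials and bucket name):

```python
# Credentials embedded directly in the URL, per the fs-s3fs convention;
# "MY_KEY", "MY_SECRET", and "my-bucket" are placeholders.
url = "s3://MY_KEY:MY_SECRET@my-bucket/data.csv"
```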
I attempted to use this idea with pandas (e.g. `pd.read_csv` on such a URL), but this raised an exception deep within s3fs saying the bucket name was invalid, potentially caused by the stripping logic here:
https://github.com/pandas-dev/pandas/blob/master/pandas/io/s3.py#L29
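My guess at the mechanism (a sketch, assuming the schema is stripped with standard URL parsing): the netloc of the parsed URL still contains the userinfo portion, so the credentials end up being treated as part of the bucket name.

```python
from urllib.parse import urlparse

# With embedded credentials, netloc keeps the userinfo component, so
# "MY_KEY:MY_SECRET@my-bucket" gets passed on as if it were the bucket name.
result = urlparse("s3://MY_KEY:MY_SECRET@my-bucket/data.csv", allow_fragments=False)
print(result.netloc)  # MY_KEY:MY_SECRET@my-bucket
```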
I think we could easily support authentication using this syntax by modifying the code here:
https://github.com/pandas-dev/pandas/blob/master/pandas/io/s3.py#L27
The idea being that we first attempt to match the `filepath_or_buffer` for the access key and secret key. If matched, we pass these into `s3fs.S3FileSystem`.