Read/Write files on specific S3 accounts #16692
As of 0.20, pandas uses s3fs (http://s3fs.readthedocs.io/en/latest/). I believe you should be able to do:

```python
import pandas as pd
import s3fs

fs = s3fs.S3FileSystem(profile_name='foo')
f = fs.open("my-bucket/file.csv", "wb")
df.to_csv(f)
```

Could you try that out, and if it works, make a pull request for the documentation? I don't have a test bucket handy at the moment.
I know this post is quite old at this point. However, @TomAugspurger's solution certainly works. For py3, I made the small change of opening the file in text mode (`"w"` instead of `"wb"`), since `to_csv` writes `str` rather than bytes.
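A minimal py3 sketch of that variant (bucket path and profile name are illustrative; newer s3fs releases name the parameter `profile`, while older ones used `profile_name` as in the comment above):

```python
import pandas as pd
import s3fs

df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})

# Select a named profile from ~/.aws/credentials
fs = s3fs.S3FileSystem(profile="foo")

# Text mode ("w") so to_csv can write str under py3
with fs.open("my-bucket/file.csv", "w") as f:
    df.to_csv(f, index=False)

# Reading back: binary mode is fine, pandas handles decoding
with fs.open("my-bucket/file.csv", "rb") as f:
    df2 = pd.read_csv(f)
```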
Would a solution to this be allowing a dask-style `storage_options` parameter on its `read_*` functions? It's a little frustrating not being able to just pass these things through; most frequently I'm trying to pass in credentials rather than let boto search my system for them.
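For reference, the dask pattern being described looks roughly like this (the key/secret values and path are placeholders):

```python
import dask.dataframe as dd

# Credentials are passed through explicitly instead of relying on
# boto's credential lookup chain
ddf = dd.read_csv(
    "s3://my-bucket/file.csv",
    storage_options={"key": "AKIA...", "secret": "..."},
)
```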
Yes, I think that request has come up in a few places. I'd be happy to see something like that.
Ref similar issue: #33639
#35381 closes this. You should now be able to use the `storage_options` kwarg to pass in `"profile"`.
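For example, with pandas >= 1.2 (profile name and paths are illustrative):

```python
import pandas as pd

# "profile" is forwarded to s3fs and selects a named profile
# from ~/.aws/credentials
df = pd.read_csv(
    "s3://my-bucket/file.csv",
    storage_options={"profile": "foo"},
)
df.to_csv(
    "s3://my-bucket/out.csv",
    storage_options={"profile": "foo"},
)
```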
Say I want to save a file to S3 using a specific account, where my accounts are listed in `~/.aws/credentials`.

What's the best or recommended way to do this with pandas 0.20.2? Is there any way to use/specify which account to use when we have multiple of them?

Perhaps related: does pandas use `boto` or `boto3`?
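(For context, multiple accounts in `~/.aws/credentials` follow the standard AWS shared-credentials format; the profile names below are illustrative:)

```ini
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = ...

[foo]
aws_access_key_id = AKIA...
aws_secret_access_key = ...
```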