Commit b743396

Calculate settings for S3Boto3StorageFile when instantiated, not imported (#930)

This allows the "storages.backends.s3boto3" module to be cleanly imported
before Django settings are configured. This further supports the changes
from #524.

Removing "S3Boto3StorageFile.buffer_size" as a class variable does not
affect the API surface of "S3Boto3StorageFile", as "buffer_size" was
always accessible as an instance variable.
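The change above can be illustrated with a minimal sketch (hypothetical stand-ins, not the real django-storages code): a class attribute calling `setting()` is evaluated once, at import time, before Django settings may be configured, whereas an instance attribute assigned in `__init__` defers the lookup until an object is created.

```python
_SETTINGS = {}  # stands in for django.conf.settings; empty until "configured"

def setting(name, default=None):
    # Hypothetical stand-in for the storages.utils.setting helper.
    return _SETTINGS.get(name, default)

class EagerFile:
    # Evaluated at class-definition (import) time: captures whatever
    # setting() returns right now, i.e. the default.
    buffer_size = setting('AWS_S3_FILE_BUFFER_SIZE', 5242880)

class LazyFile:
    def __init__(self, buffer_size=None):
        # Evaluated per instance: picks up settings configured after import.
        self.buffer_size = buffer_size or setting('AWS_S3_FILE_BUFFER_SIZE',
                                                  5242880)

# Settings become available only after both classes were defined ("imported").
_SETTINGS['AWS_S3_FILE_BUFFER_SIZE'] = 10 * 1024 * 1024

print(EagerFile().buffer_size)  # import-time default: 5242880
print(LazyFile().buffer_size)   # configured value: 10485760
```

One subtlety of the committed code: because it uses `or` rather than the previous `is not None` check, an explicit `buffer_size=0` now falls back to the default instead of being honored.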
brianhelba authored Nov 16, 2020
1 parent bf30e29 commit b743396
Showing 1 changed file with 1 addition and 4 deletions: storages/backends/s3boto3.py
@@ -75,7 +75,6 @@ def _cloud_front_signer_from_pem(key_id, pem):
 
 @deconstructible
 class S3Boto3StorageFile(File):
-
     """
     The default file object used by the S3Boto3Storage backend.
@@ -92,7 +91,6 @@ class S3Boto3StorageFile(File):
     order to properly write the file to S3. Be sure to close the file
     in your application.
     """
-    buffer_size = setting('AWS_S3_FILE_BUFFER_SIZE', 5242880)
 
     def __init__(self, name, mode, storage, buffer_size=None):
         if 'r' in mode and 'w' in mode:
@@ -113,8 +111,7 @@ def __init__(self, name, mode, storage, buffer_size=None):
         # Amazon allows up to 10,000 parts. The default supports uploads
         # up to roughly 50 GB. Increase the part size to accommodate
         # for files larger than this.
-        if buffer_size is not None:
-            self.buffer_size = buffer_size
+        self.buffer_size = buffer_size or setting('AWS_S3_FILE_BUFFER_SIZE', 5242880)
         self._write_counter = 0
 
     @property
