
Issues with compressor adding double-hash to the output CACHE file #1098

Open

striveforbest opened this issue Feb 7, 2022 · 2 comments

@striveforbest
Versions:
django 2.2.24
django-compressor 3.1
django-storages 1.8

I am using django-compressor in conjunction with django-storages (remote storage is S3) and Django's staticfiles.

My settings:

LESS_PRODUCTION = True
COMPRESS_ENABLED = True
COMPRESS_OFFLINE = True

COMPRESS_PRECOMPILERS = (
    ('text/less', 'lessc {infile} {outfile}'),
)

# Assets handling
INSTALLED_APPS += ['storages', ]

STATICFILES_STORAGE = 'americana.storages.S3StaticStorage'
COMPRESS_STORAGE = STATICFILES_STORAGE
DEFAULT_FILE_STORAGE = 'americana.storages.S3DefaultStorage'

AWS_STORAGE_BUCKET_NAME = '<REDACTED>'
AWS_S3_STATIC_LOCATION = 'static'
AWS_S3_MEDIA_LOCATION = 'media'

STATIC_URL = f'https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/{AWS_S3_STATIC_LOCATION}/'
MEDIA_URL = f'https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/{AWS_S3_MEDIA_LOCATION}/'

AWS_DEFAULT_ACL = None
AWS_QUERYSTRING_AUTH = False
AWS_S3_SIGNATURE_VERSION = 's3v4'
AWS_IS_GZIPPED = True
AWS_S3_ACCESS_KEY_ID = '<REDACTED>'
AWS_S3_SECRET_ACCESS_KEY = '<REDACTED>'

# cache-control headers for S3 objects
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=3600,public',  # 1 hour
}

# Applies only to static files with cache-busting tokens in their filenames
AWS_S3_TOKENIZED_STATIC_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=31536000,public',  # 1 year
}
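For context, the cache-busting token that Django's hashed storages and django-compressor insert is a 12-character hex digest placed before the file extension. The `cache_token_pattern` referenced in the storage class below is not shown in the issue; a pattern along these lines (an assumption, not the reporter's actual code) would match such tokens:

```python
import re

# Hypothetical definition of the issue's `cache_token_pattern`:
# matches a 12-character hex cache-busting token before the file
# extension, e.g. "output.d09081cb3c88.js".
cache_token_pattern = re.compile(r'\.[0-9a-f]{12}\.\w+$')
```

It matches tokenized names such as `CACHE/js/output.d09081cb3c88.js` but not an untokenized name like `js/app.js`.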

My custom storage class:

import mimetypes

from django.conf import settings
from django.contrib.staticfiles.storage import CachedFilesMixin
from django.core.files import File
from django.core.files.storage import get_storage_class
from storages.backends.s3boto3 import S3Boto3Storage

# `cache_token_pattern` (used in `_save` below) is defined elsewhere in
# the project; it matches the cache-busting hash token in a filename.


class S3StaticStorage(CachedFilesMixin, S3Boto3Storage):
    """
    Storage for static files.
    The folder is defined in `settings.AWS_S3_STATIC_LOCATION`
    """

    default_acl = 'public-read'
    location = settings.AWS_S3_STATIC_LOCATION
    tokenized_filename_headers = getattr(settings, 'AWS_S3_TOKENIZED_STATIC_OBJECT_PARAMETERS', {})

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.local_storage = get_storage_class('compressor.storage.CompressorFileStorage')()

    def _save(self, name, content):
        cleaned_name = self._clean_name(name)
        name = self._normalize_name(cleaned_name)

        self.local_storage._save(cleaned_name, content)  # per django-compressor docs 

        parameters = self.object_parameters.copy()

        _type, encoding = mimetypes.guess_type(name)
        content_type = getattr(content, 'content_type', None)
        content_type = content_type or _type or self.default_content_type

        # setting the content_type in the key object is not enough.
        parameters.update({'ContentType': content_type})

        # Add cache
        if cache_token_pattern.search(name):
            parameters.update(self.tokenized_filename_headers)

        if self.gzip and content_type in self.gzip_content_types:
            content = self._compress_content(content)
            parameters.update({'ContentEncoding': 'gzip'})
        elif encoding:
            # If the content already has a particular encoding, set it
            parameters.update({'ContentEncoding': encoding})

        encoded_name = self._encode_name(name)
        obj = self.bucket.Object(encoded_name)
        if self.preload_metadata:
            self._entries[encoded_name] = obj

        if isinstance(content, File):
            content = content.file

        self._save_content(obj, content, parameters=parameters)
        # Note: In boto3, after a put, last_modified is automatically reloaded
        # the next time it is accessed; no need to specifically reload it.
        return cleaned_name

manage.py collectstatic works as expected, collecting files from all the apps into the filesystem (per your docs, via self.local_storage._save(cleaned_name, content)). manage.py compress then runs for a while, compressing the files, until it finally errors out with:

CommandError: An error occurred during rendering c4c/charity-update.html: The file 'CACHE/js/output.d09081cb3c88.7d3a7143a524.js' could not be found with <americana.storages.S3StaticStorage object at 0x11452d410>.

Please note how the filename contains the hash d09081cb3c88 and then 7d3a7143a524 appended on top of it. The file CACHE/js/output.d09081cb3c88.js IS present, but the double-hashed file, obviously, is not. Please advise.
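A plausible mechanism (an assumption, not confirmed in this thread): `CachedFilesMixin` post-processes every stored file and appends a content hash to its name, including compressor's `CACHE/` output, which already carries a hash of its own, hence the doubled token. A minimal sketch of a mixin that skips re-hashing such files, assuming the storage exposes Django's `hashed_name(name, content, filename)` hook (as `CachedFilesMixin`/`HashedFilesMixin` do):

```python
import re

# Names that already carry a 12-character hex content hash
# before the extension, e.g. "output.d09081cb3c88.js".
ALREADY_HASHED = re.compile(r'\.[0-9a-f]{12}\.\w+$')

class SkipCompressorHashMixin:
    """Sketch only: leave already-hashed compressor output alone."""

    def hashed_name(self, name, content=None, filename=None):
        # django-compressor writes its output under CACHE/ with a
        # content hash already in the name; return those unchanged.
        if name.startswith('CACHE/') or ALREADY_HASHED.search(name):
            return name
        return super().hashed_name(name, content, filename)
```

Mixing this in ahead of `CachedFilesMixin` would leave compressor output untouched while other static files still get hashed; whether this is the right fix depends on what is actually appending the second hash.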

@carltongibson
Contributor

@striveforbest There are too many things that could be going on here to say anything definitive. I think you'll need to investigate more closely to pin down exactly what's happening.

@tonkolviktor

This comment was marked as spam.

3 participants