Cannot use compressed buffer with elasticsearch data stream #992

Open
OranShuster opened this issue Oct 12, 2022 · 0 comments

Problem

I am ingesting very large logs (30 MB), so I want to use a compressed buffer plus ES request compression to reduce the size.
When I set the buffer to be gzip compressed, I get the following error:

2022-10-12 14:26:08 +0000 [warn]: #0 [elasticearch_output] failed to flush the buffer. retry_times=0 next_retry_time=2022-10-12 14:26:09 +0000 chunk="5ead67778ec810a53837e51b712620ee" error_class=Zlib::GzipFile::Error error="not in gzip format"
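
For context, this error class comes straight from Ruby's Zlib: it is raised when `Zlib::GzipReader` is handed bytes that do not start with the gzip magic header. A minimal illustration of the error itself (my own snippet, not the plugin's actual code path):

  require 'zlib'
  require 'stringio'

  # Handing non-gzip bytes to GzipReader reproduces the error from the log:
  Zlib::GzipReader.new(StringIO.new("plain text")).read
  # raises Zlib::GzipFile::Error: not in gzip format

So it looks like the data stream output ends up trying to gunzip data that is not actually gzipped at that point, though I have not traced the plugin code to confirm.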

Steps to replicate

  <match **>
    @type elasticsearch_data_stream
    @id elasticearch_output
    data_stream_name fluentd-k8s-master-audit
    data_stream_template_name fluentd-k8s-master-audit-template
    data_stream_ilm_policy default-ilm
    hosts "#{ENV['ES_HOSTS']}"
    logstash_format false
    verify_es_version_at_startup false
    default_elasticsearch_version 7
    include_timestamp true
    compression_level default_compression
    <buffer>
      @type file
      path /usr/share/fluentd-k8s-master/buffers/k8s-master-audit
      flush_thread_count 3
      chunk_limit_size 10mb
      compress gzip
    </buffer>
  </match>

Expected Behavior or What you need to ask

Using a gzip-compressed buffer shouldn't lead to errors.
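
As a workaround (my assumption, not verified against the plugin internals), dropping `compress gzip` from the buffer while keeping `compression_level default_compression` should still gzip the HTTP request body sent to ES, at the cost of storing uncompressed chunks on disk:

    <buffer>
      @type file
      path /usr/share/fluentd-k8s-master/buffers/k8s-master-audit
      flush_thread_count 3
      chunk_limit_size 10mb
      # compress gzip  # removed; the rest of the <match> stays as above
    </buffer>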

Using Fluentd and ES plugin versions

  • OS version N/A
  • Kubernetes 1.20
  • Fluentd v1.15
  • ES plugin 5.2.4
  • ES version 7.16.2