Querier pods are crashing when trying to use multiple tenants in Loki 2.6.1 #7009

Closed
lukas-unity opened this issue Aug 31, 2022 · 3 comments

@lukas-unity
Contributor

Describe the bug
In Loki 2.6.1 with auth_enabled and the querier's multi_tenant_queries_enabled turned on, querier pods panic and crash as soon as a query is sent with multiple tenants in the X-Scope-OrgID header (e.g. A|B).

To Reproduce
Steps to reproduce the behavior:

  1. Enable auth_enabled and the querier's multi_tenant_queries_enabled in Loki 2.6.1 (a config sketch follows this list).
  2. Open the Loki data source settings in Grafana and set a custom header with two tenants separated by a pipe, e.g. X-Scope-OrgID: A|B.
  3. Run a query and check the querier pods: they crash with the panic below.
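
A minimal sketch of the setup in steps 1 and 2. The placement of multi_tenant_queries_enabled under the querier block follows the Loki 2.6 configuration docs; the Grafana provisioning field names and the query-frontend URL are illustrative assumptions, not taken from this report.

```yaml
# Loki config (step 1): enable multi-tenancy and cross-tenant queries.
auth_enabled: true

querier:
  multi_tenant_queries_enabled: true
```

```yaml
# Grafana data source provisioning (step 2): send two tenants in one header.
# The URL is a placeholder for the loki-distributed query-frontend service.
apiVersion: 1
datasources:
  - name: Loki
    type: loki
    url: http://loki-loki-distributed-query-frontend:3100
    jsonData:
      httpHeaderName1: X-Scope-OrgID
    secureJsonData:
      httpHeaderValue1: A|B
```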

Expected behavior
The querier should not crash and should allow querying multiple tenants.

Environment:

  • Infrastructure: Kubernetes
  • Deployment tool: Helm (loki-distributed chart)

Screenshots, Promtail config, or terminal output
Logs from querier container:

panic: exemplar labels have 80 runes, exceeding the limit of 64 [recovered]
        panic: exemplar labels have 80 runes, exceeding the limit of 64

goroutine 16807 [running]:
github.com/opentracing-contrib/go-stdlib/nethttp.MiddlewareFunc.func5.1()
        /src/loki/vendor/github.com/opentracing-contrib/go-stdlib/nethttp/server.go:150 +0x139
panic({0x1e30f00, 0xc0011d8af0})
        /usr/local/go/src/runtime/panic.go:1038 +0x215
github.com/prometheus/client_golang/prometheus.(*histogram).updateExemplar(0xc00201b320, 0x40b01d, 0x1, 0xc0018acf30)
        /src/loki/vendor/github.com/prometheus/client_golang/prometheus/histogram.go:434 +0xb1
github.com/prometheus/client_golang/prometheus.(*histogram).ObserveWithExemplar(0xc00201b320, 0x3f83dcbbd832dc76, 0x21e7e2a)
        /src/loki/vendor/github.com/prometheus/client_golang/prometheus/histogram.go:314 +0xcd
github.com/weaveworks/common/instrument.ObserveWithExemplar({0x266fb88, 0xc00060b0e0}, {0x7f76f42ce2a0, 0xc00201b320}, 0x0)
        /src/loki/vendor/github.com/weaveworks/common/instrument/instrument.go:78 +0x202
github.com/weaveworks/common/middleware.Instrument.Wrap.func1({0x264c5c0, 0xc00073f720}, 0xc00020ad00)
        /src/loki/vendor/github.com/weaveworks/common/middleware/instrument.go:76 +0x66e
net/http.HandlerFunc.ServeHTTP(0xc00020ad00, {0x264c5c0, 0xc00073f720}, 0x0)
        /usr/local/go/src/net/http/server.go:2047 +0x2f
github.com/weaveworks/common/middleware.Log.Wrap.func1({0x26506a0, 0xc00182b420}, 0xc00020ad00)
        /src/loki/vendor/github.com/weaveworks/common/middleware/logging.go:55 +0x287
net/http.HandlerFunc.ServeHTTP(0x40d547, {0x26506a0, 0xc00182b420}, 0xc000a57701)
        /usr/local/go/src/net/http/server.go:2047 +0x2f
github.com/opentracing-contrib/go-stdlib/nethttp.MiddlewareFunc.func5({0x264e7e0, 0xc0015bd9c0}, 0xc00017bf00)
        /src/loki/vendor/github.com/opentracing-contrib/go-stdlib/nethttp/server.go:154 +0x62d
net/http.HandlerFunc.ServeHTTP(0xc00017be00, {0x264e7e0, 0xc0015bd9c0}, 0x30)
        /usr/local/go/src/net/http/server.go:2047 +0x2f
github.com/grafana/loki/pkg/loki.(*Loki).initServer.func2.1({0x264e7e0, 0xc0015bd9c0}, 0xc00017be00)
        /src/loki/pkg/loki/modules.go:131 +0x4de
net/http.HandlerFunc.ServeHTTP(0xc0017e1df0, {0x264e7e0, 0xc0015bd9c0}, 0x1eca5e0)
        /usr/local/go/src/net/http/server.go:2047 +0x2f
github.com/weaveworks/common/httpgrpc/server.Server.Handle({{0x26328e0, 0xc000896900}}, {0x266fae0, 0xc001f74740}, 0xc00073f4a0)
        /src/loki/vendor/github.com/weaveworks/common/httpgrpc/server/server.go:61 +0x41f
github.com/grafana/loki/pkg/querier/worker.(*frontendProcessor).runRequest(0xc001e59680, {0x266fae0, 0xc001f74740}, 0xc0001321e0, 0x40, 0xc000116a20)
        /src/loki/pkg/querier/worker/frontend_processor.go:122 +0xe3
created by github.com/grafana/loki/pkg/querier/worker.(*frontendProcessor).process
        /src/loki/pkg/querier/worker/frontend_processor.go:97 +0x1a5
@lukas-unity
Contributor Author

Found that, as a workaround, the issue can be circumvented by disabling tracing in Loki (see the sketch below):
https://grafana.com/docs/loki/latest/configuration/#tracing
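
For reference, a minimal sketch of that workaround in the Loki config (the tracing block and its enabled flag are documented at the link above):

```yaml
# Disable tracing so the instrumentation middleware stops attaching
# traceID exemplars to request metrics (the source of the panic above).
tracing:
  enabled: false
```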

original issue: #6667

@jeschkies
Contributor

@lukas-unity could you try with the latest version on main? I think you are right, this is probably a duplicate of the issue you've mentioned.

@lukas-unity
Contributor Author

yes, seems to be fine on main
