
ECK Elasticsearch CRD is waiting for elasticsearch label in kibana resource although kibana variant exists. #7615

Open
mathijswesterhof opened this issue Mar 11, 2024 · 1 comment


Bug Report

What did you do?
With the operator running in a separate namespace, try to create a cluster with multiple nodes and one Kibana instance.

What did you expect to see?
I expect to see a green cluster after a few minutes.

What did you see instead? Under which circumstances?
The Elastic pods for master, ingest, and data come online, but the Elasticsearch CRD reports health unknown, and the events state:

"version label elasticsearch.k8s.elastic.co/version is missing"

The event does not say which pod is affected, only a stack trace (posted below).
To work around this issue, edit the Kubernetes pod by adding the label elasticsearch.k8s.elastic.co/version: 7.16.2 under the existing kibana.k8s.elastic.co/version: 7.16.2.
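As a sketch, this workaround can also be applied without editing the pod spec by hand, using `kubectl label` (the pod name below is a placeholder; substitute the actual Kibana pod shown by `kubectl get pods`):

```shell
# Hypothetical pod name and namespace; replace with the real Kibana pod.
# Adds the Elasticsearch version label alongside the existing Kibana labels.
kubectl label pod eck-cluster-kb-7c9d8b6f5-abcde \
  --namespace my-namespace \
  elasticsearch.k8s.elastic.co/version=7.16.2
```

Note this is a manual patch on a pod the operator manages, so it may be reverted the next time the pod is recreated.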

(Personal observation: for some reason the cluster name does get transferred to both types, so I observe

    elasticsearch.k8s.elastic.co/cluster-name: eck-cluster
    kibana.k8s.elastic.co/name: eck-cluster
    kibana.k8s.elastic.co/version: 7.16.2

in the Kibana labels.)

Environment

  • ECK version:
    both Elasticsearch and Kibana run version 7.16.2
    eck-operator:2.11.1

  • Kubernetes information:
    tested on both:

    • bare on-premise cluster v1.28.4+rke2r1
    • Rancher managed on-premise cluster v1.27.4+rke2r1
  • Resource definition:

    (not provided)
  • Logs:
version label elasticsearch.k8s.elastic.co/version is missing
github.com/elastic/cloud-on-k8s/v2/pkg/controller/common/version.FromLabels
	/go/src/github.com/elastic/cloud-on-k8s/pkg/controller/common/version/version.go:138
github.com/elastic/cloud-on-k8s/v2/pkg/controller/common/version.MinInPods
	/go/src/github.com/elastic/cloud-on-k8s/pkg/controller/common/version/version.go:105
github.com/elastic/cloud-on-k8s/v2/pkg/controller/elasticsearch/driver.(*defaultDriver).Reconcile
	/go/src/github.com/elastic/cloud-on-k8s/pkg/controller/elasticsearch/driver/driver.go:179
github.com/elastic/cloud-on-k8s/v2/pkg/controller/elasticsearch.(*ReconcileElasticsearch).internalReconcile
	/go/src/github.com/elastic/cloud-on-k8s/pkg/controller/elasticsearch/elasticsearch_controller.go:304
github.com/elastic/cloud-on-k8s/v2/pkg/controller/elasticsearch.(*ReconcileElasticsearch).Reconcile
	/go/src/github.com/elastic/cloud-on-k8s/pkg/controller/elasticsearch/elasticsearch_controller.go:192
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Reconcile
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.16.3/pkg/internal/controller/controller.go:119
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.16.3/pkg/internal/controller/controller.go:316
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.16.3/pkg/internal/controller/controller.go:266
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.16.3/pkg/internal/controller/controller.go:227
runtime.goexit
	/usr/local/go/src/runtime/asm_amd64.s:1650
@botelastic botelastic bot added the triage label Mar 11, 2024
rhr323 (Contributor) commented Mar 12, 2024

Hey @mathijswesterhof, thank you for reporting this issue. Could you attach the definition you used to create the Elasticsearch cluster and the Kibana instance (yaml files)?
