
Unknown setting 'discovery.zen.minimum_master_nodes' in elasticsearch testcontainer 8.x #228

Closed
lippertto opened this issue Jul 11, 2022 · 5 comments

Comments

@lippertto

I am trying to start a container with an 8.x Elasticsearch image using testcontainers 3.6.0.
However, the container quits immediately, and I cannot run any tests.

I found out that the discovery.zen.* settings are no longer supported and now cause the container to quit (see the ES 8 breaking changes documentation).

java.lang.IllegalArgumentException: unknown setting [discovery.zen.minimum_master_nodes] please check that any required plugins are installed, or check the breaking changes documentation for removed settings

Also, the check whether the container is up does not work properly. I assume this is because Elasticsearch now uses https by default. I did not find anything in the changelogs, but there is a comment in the Java Testcontainers ElasticsearchContainer.java ("major version 8 is secure by default"): https://github.com/testcontainers/testcontainers-java/blob/84d3444600aaeeac6813505fe5bfbf9ffce760ef/modules/elasticsearch/src/main/java/org/testcontainers/elasticsearch/ElasticsearchContainer.java#L178

Here is the error message that I get.

received plaintext http traffic on an https channel, closing connection Netty4HttpChannel{localAddress=/10.199.0.2:9200, remoteAddress=/10.199.0.1:47096}

In order for the container to start, I need to change the environment variables as follows:

This does not work:

from testcontainers.elasticsearch import ElasticSearchContainer

es = ElasticSearchContainer(image="elasticsearch:8.3.2")

try:
    es.start()
finally:
    es.stop()

If I remove the discovery.zen.minimum_master_nodes environment variable and disable security, this code works:

from testcontainers.elasticsearch import ElasticSearchContainer

es = ElasticSearchContainer(image="elasticsearch:8.3.2") \
    .with_env("xpack.security.enabled", "false")

# Drop the setting that Elasticsearch 8 no longer accepts.
es.env.pop('discovery.zen.minimum_master_nodes')

try:
    es.start()
finally:
    es.stop()

I could prepare a Pull Request in which I check whether the version of the container is 8 or higher and set the environment variables accordingly. Does this sound like the correct approach?
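For illustration, such a version check could look like the following minimal sketch. The helper name and the tag parsing are my own assumptions for this comment, not existing testcontainers code:

```python
from typing import Optional


def es_major_version(image: str) -> Optional[int]:
    """Extract the major version from an image name like 'elasticsearch:8.3.2'.

    Returns None when there is no numeric tag (e.g. 'elasticsearch:latest').
    """
    _, _, tag = image.partition(":")
    major, _, _ = tag.partition(".")
    return int(major) if major.isdigit() else None
```

A container class could then call this on its image string before deciding which environment variables to set.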

@tillahoffmann
Collaborator

Do you know whether simply removing discovery.zen.minimum_master_nodes would break elasticsearch tests for earlier versions?

@lippertto
Author

This setting is ignored in Elasticsearch 7 (https://www.elastic.co/blog/a-new-era-for-cluster-coordination-in-elasticsearch), so it can safely be removed there. I tried with 7.17.5, and it worked.

As for version 6: I get an error message during start-up.

[2022-08-08T11:37:29,432][WARN ][o.e.b.BootstrapChecks    ] [-n7nWHc] max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]

When I set the kernel parameter to a higher value, it works again. But I think it is better to keep discovery.zen.minimum_master_nodes for version 6 than to introduce a regression here.

I am always nervous about this kind of change. Since there are only a few supported major versions at any time (currently 6, 7, and 8), I do not think it would hurt to have a set of settings for each version.

So, I would make the following changes:

  • Remove the parameter in the general case
  • Version 6 adds discovery.zen.minimum_master_nodes
  • Version 7 does nothing more
  • Version 8 additionally sets xpack.security.enabled
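A sketch of that mapping, assuming the major version has already been parsed from the image tag (the function name and the "1" default are illustrative assumptions, not the actual implementation):

```python
def default_env_for(major_version):
    """Per-major-version default environment variables, as proposed above."""
    env = {}
    if major_version == 6:
        # Only ES 6 still needs the zen discovery setting.
        env["discovery.zen.minimum_master_nodes"] = "1"
    elif major_version >= 8:
        # ES 8 is secure by default; disable security so tests can use plain http.
        env["xpack.security.enabled"] = "false"
    # Version 7 needs neither setting.
    return env
```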

@tillahoffmann
Collaborator

Nice, thanks for digging into this. Your proposal of submitting a PR that checks for the version and sets environment variables appropriately sounds great!

@lippertto
Author

I have opened a small PR and added the three currently supported major versions to the tests. They seem to pass.
Could you please have a look?

@tillahoffmann
Collaborator

Fixed in #232.
