Airflow issue: Failed to fetch log file from worker. 403 Client Error #7167
Comments
Hi, it seems odd to me that this functionality stops working from one patch version to the next, especially considering that we did not make any substantial changes to the logic. Did you report this to the Airflow devs? Maybe there is an issue on their side. |
Not yet. Why do I have to report it to the Airflow devs when I'm using the Bitnami Helm chart? |
If you see the issue in the upstream Airflow chart using version 2.1.1, then it would be clear to me that it is an upstream issue. |
To be honest, no idea where the issue is. |
Let me clarify this better. While we package the Bitnami Airflow Helm chart, we are not Airflow developers, so if an issue is not related to how we package and configure Airflow but to an upstream bug (which, according to what you mention, seems highly likely), then it should be reported to the Airflow developers so they can release a new version for us to package. We do test our charts on all major Kubernetes distributions, and we run a basic test suite that ensures the chart is configured correctly: login/logout, creating users, running a DAG, checking that the configuration is not exposed, and so on. But we cannot run a bigger test suite than the Airflow devs themselves, as we assume most of the functionality has been tested upstream. In that sense, we do not have a test that checks whether it is possible to fetch a log file from a worker; given what happened here, it may make sense to add one. In any case, I have more bandwidth today, so I will try to check whether this is an upstream issue or something related to our configuration. I will keep you posted. |
I tried with the upstream chart version 2.1.2 and fetching logs works. Because of this, I can confirm there must be something incorrect in how we configure the Airflow worker's logs server. I will open a task for investigation. Thank you so much for reporting this. |
Thanks for looking into this. We really need the other bugfix they solved in 2.1.2: 'Fix “Invalid JSON configuration, must be a dict” bug'. |
Perhaps this one apache/airflow#17260 will also help. |
Great, this solves the current issue:

```yaml
...
extraEnvVars:
  - name: AIRFLOW__WEBSERVER__SECRET_KEY
    value: "<your_secret>"
...
```

Reference: https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#secret-key

After this, fetching logs via the web UI works. |
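A value for `AIRFLOW__WEBSERVER__SECRET_KEY` can be generated, for example, with Python's `secrets` module (any sufficiently random string works; the 16-byte length here is just a common choice, not a requirement):

```python
# Generate a random value suitable for AIRFLOW__WEBSERVER__SECRET_KEY.
# The 16-byte length is an arbitrary but common choice, not a requirement.
import secrets

secret_key = secrets.token_hex(16)  # 32 hexadecimal characters
print(secret_key)
```

As the thread explains, the same value must be configured on the webserver and the workers; otherwise the webserver's log-fetch requests to the workers are rejected with a 403 error.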
Since this is an important parameter needed for a fully working Airflow deployment, I added support for secret_key in the latest Airflow containers and chart, so there is no need to use extraEnvVars: you can set the secret_key in values.yaml, and otherwise a random one is generated. The same secret_key is set for the workers, the webserver, and the scheduler. To use this feature, just use the latest version of the Airflow containers or the chart. |
@randradas ,
I currently don't see this parameter "secret_key" in the latest Helm chart's values.yaml. Could you please explain how and where to add this "secret_key"? |
'secret_key' is the name used by Airflow's configuration file. In values.yaml you can find it under the name 'secretKey'.
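Based on the comment above, a minimal values.yaml sketch might look like the fragment below. Note the nesting under `auth:` is an assumption about the Bitnami chart's layout; search the chart's values.yaml for `secretKey` to confirm the exact location in your chart version.

```yaml
# Hypothetical sketch of a Bitnami Airflow chart values.yaml fragment.
# The exact nesting (here under `auth`) may differ between chart versions.
auth:
  secretKey: "<your_secret>"
```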
Can't fetch logs via the web UI from the worker node; the following error appears:
Airflow is running in the following configuration:
The logs can be accessed directly on the worker pods.
Downgrading airflow-worker to version 2.1.1 (for example 2.1.1-debian-10-r9), everything works fine and it is possible to fetch logs from the worker node via the web UI.