
Storage S3 - AWS node role IMDSv1 does not work after upgrade to v2.2 #2743

Closed
coufalja opened this issue Aug 1, 2023 · 8 comments · Fixed by #2760

Comments

@coufalja (Contributor) commented Aug 1, 2023

Describe the bug
Tempo does not start with "transparent" AWS credentials (node role). It seems not to take the credentials/role into account at all.

To Reproduce
Steps to reproduce the behavior:

  1. Start from tempo-distributed chart
  2. Use node role credentials to connect to S3
  3. Fail with "err":"failed to init module services error initialising module: store: failed to create store unexpected error from ListObjects on <bucket name>: Access Denied"

Expected behavior
Tempo connects and starts.

Environment:

  • Kubernetes
  • Helm chart tempo-distributed >=1.5.3
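
For reference, the relevant storage section looks roughly like this (a sketch: the bucket and region are placeholders, and the key names follow Tempo's documented s3 backend options; the point is the absence of access_key/secret_key, which makes Tempo fall back to the node's IAM role via the instance metadata service):

```yaml
storage:
  trace:
    backend: s3
    s3:
      bucket: my-traces-bucket           # placeholder
      endpoint: s3.eu-west-1.amazonaws.com
      region: eu-west-1
      # no access_key / secret_key: credentials are expected to come
      # from the node's IAM role via the instance metadata service
```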

Additional Context

  • Just downgrading the image version while using the very same config fixes the problem, so it must be a code- or image-related issue.
  • I also thought I might be missing some niche permission in the attached policy, so I temporarily granted * to the node role, but no joy.
@joe-elliott (Member)
I'm not seeing many changes to the s3 backend.

Prefix Support: a55da81

Standardized TLS config: e809ae607a

Updated minio: e9898effad

The first two seem innocuous. WDYT? The minio changes are here:

minio/minio-go@v7.0.23...v7.0.52

If it's possible to test image versions immediately before some of these changes it could help us narrow down the issue. We could also try just upgrading minio again and see if it fixes it.

@coufalja (Contributor, Author) commented Aug 2, 2023

So it is the minio upgrade. Image grafana/tempo:main-9996433-amd64 is working as expected, the docker.io/grafana/tempo:main-e9898ef-amd64 fails.

@coufalja (Contributor, Author) commented Aug 2, 2023

Tried bumping to github.com/minio/minio-go/v7 v7.0.61 and building from the main branch but ended up with the same error.

@joe-elliott (Member)
Nothing really stands out in the minio changelog. Can you file an issue here:

https://github.com/minio/minio-go

detailing your auth setup, and see if they have insight into what may have changed?

@coufalja (Contributor, Author) commented Aug 3, 2023

I have found the culprit: "Use 1s timeout for fetching imdsv2 token", introduced in v7.0.24. Reverting that and building a custom Tempo image with the patched minio-go fixes the issue.

@coufalja (Contributor, Author) commented Aug 3, 2023

Seems to me like an issue in Tempo that surfaced due to two bugfixes made in the minio lib: minio/minio-go#1626 and minio/minio-go#1682. I fixed the bug title, as it affects IMDSv1 only. It seems the same issue appeared and was fixed some time ago in Cortex: cortexproject/cortex#4897

@coufalja coufalja changed the title Storage S3 - AWS node role does not work after upgrade to v2.2 Storage S3 - AWS node role IMDSv1 does not work after upgrade to v2.2 Aug 3, 2023
@coufalja (Contributor, Author) commented Aug 3, 2023

I backported the Cortex fix in #2760

@joe-elliott (Member)
Excellent research and fix, @coufalja. Thank you.
