
Event handler to "unlock" GCPs Vertex.AI hyperparameter tuning service #3023

Open
St3V0Bay opened this issue Aug 14, 2023 · 2 comments · May be fixed by #3061
@St3V0Bay

Dear PyTorch team,

I have developed a custom event handler that makes Ignite code (more specifically: MONAI code) accessible to hyperparameter tuning jobs in GCP Vertex AI. It is an Ignite-ified version of the approach in this link. As the linked code shows, the metrics are simply written at a certain time, to a certain place, with a certain syntax; once that is done, the Vertex HPO orchestration kicks in. Input arguments are controlled via Vertex AI custom training jobs, and output model performance is extracted from the metrics file written at the end of training.
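For illustration, the handler boils down to something like this (a minimal sketch, assuming the cloudml-hypertune client is what writes the metric file Vertex AI reads; the class name and metric tag are illustrative, not the exact code I would contribute):

```python
# Minimal sketch, assuming the cloudml-hypertune client (pip install cloudml-hypertune)
# writes the metric file that Vertex AI's hyperparameter tuning service reads.
# Class and metric names are illustrative.
from typing import Optional

import hypertune
from ignite.engine import Engine, Events


class VertexAIHpoReporter:
    """Reports a value from engine.state.metrics to Vertex AI hyperparameter tuning."""

    def __init__(self, metric_name: str, tag: Optional[str] = None):
        self.metric_name = metric_name
        self.tag = tag or metric_name
        self._hpt = hypertune.HyperTune()

    def __call__(self, engine: Engine) -> None:
        # hypertune takes care of writing the metric in the place/syntax Vertex AI expects.
        value = engine.state.metrics[self.metric_name]
        self._hpt.report_hyperparameter_tuning_metric(
            hyperparameter_metric_tag=self.tag,
            metric_value=float(value),
            global_step=engine.state.epoch,
        )

    def attach(self, engine: Engine, event: Events = Events.EPOCH_COMPLETED) -> None:
        engine.add_event_handler(event, self)
```

In a MONAI/Ignite workflow this would typically be attached to the evaluator (e.g. at Events.COMPLETED or Events.EPOCH_COMPLETED), after the validation metric has been computed.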

Why is this useful? With this handler, Ignite code can be subjected to "outsourced" hyperparameter screening just by adding the handler and a few lines of Vertex config files. I found outsourcing HPO to the cloud platform much easier than coding it myself.
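To give an idea of the "few lines of Vertex config", here is a rough sketch using the google-cloud-aiplatform Python client (a YAML spec works equally well); project, bucket, container image and parameter names are placeholders:

```python
# Rough sketch of the Vertex AI side using the google-cloud-aiplatform client.
# Project, bucket, container image and parameter names are placeholders.
from google.cloud import aiplatform
from google.cloud.aiplatform import hyperparameter_tuning as hpt

aiplatform.init(project="my-project", location="us-central1", staging_bucket="gs://my-bucket")

# Custom training job: Vertex AI passes the tuned hyperparameters to the
# container as command-line arguments (e.g. --lr, --batch_size).
custom_job = aiplatform.CustomJob(
    display_name="monai-training",
    worker_pool_specs=[
        {
            "machine_spec": {"machine_type": "n1-standard-8"},
            "replica_count": 1,
            "container_spec": {"image_uri": "gcr.io/my-project/monai-train:latest"},
        }
    ],
)

# The metric tag must match the hyperparameter_metric_tag reported by the handler.
hpo_job = aiplatform.HyperparameterTuningJob(
    display_name="monai-hpo",
    custom_job=custom_job,
    metric_spec={"val_mean_dice": "maximize"},
    parameter_spec={
        "lr": hpt.DoubleParameterSpec(min=1e-4, max=1e-1, scale="log"),
        "batch_size": hpt.DiscreteParameterSpec(values=[8, 16, 32], scale="linear"),
    },
    max_trial_count=16,
    parallel_trial_count=4,
)
hpo_job.run()
```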

If you want, I can contribute my solution to the codebase via a PR. Just let me know.

@vfdev-5 (Collaborator) commented Aug 14, 2023

@St3V0Bay looks great, thanks for posting!

> If you want, I can contribute my solution to the codebase via a PR. Just let me know.

Yes, this contribution is very welcome! Technically, we split the code as follows:

  • handlers that require external packages to work (e.g. TensorboardLogger, ClearMLLogger, etc.) go to ignite.contrib.handlers
  • other handlers go to ignite.handlers.

I expect that we would need to install and use a Python client for Vertex AI, so most probably we can put the code into the contrib module. Let me know if you need any guidance.
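For example (the Vertex handler import is hypothetical until the PR lands; the others exist today):

```python
# Handlers with no third-party dependency live in the core namespace:
from ignite.handlers import ModelCheckpoint, EarlyStopping

# Handlers that need an external client live in contrib, e.g.:
from ignite.contrib.handlers import TensorboardLogger, ClearMLLogger

# A Vertex AI handler depending on a Google client would likely go here too
# (hypothetical import path, pending the actual PR):
# from ignite.contrib.handlers import VertexAIHpoReporter
```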

@github-actions

Hey 👋, I've just created a thread for this issue on PyTorch-Ignite Discord where you can quickly talk to the community on the topic.

🤖 This comment was automatically posted by Discuss on Discord

vfdev-5 linked a pull request (#3061) on Sep 8, 2023 that will close this issue.