
Add resume for adapter_v2, enable continued finetuning for adapter #1354

Open · wants to merge 2 commits into main

Conversation

altria-zewei-wang (Author) commented

Hi all!
I was looking at #238 and added a function to resume finetuning for the adapter. It searches out_dir for the checkpoint with the largest saved step count and updates the state_dict from it.
Current problem: I update the step_count, but keeping the iteration count from the previous run would require reading the metrics back in from the log folder. I don't know how to retrieve the matching run version in the log directory without adding an extra input for the metrics.csv version (currently not implemented).
Let me know what you think! Thanks for your repo!
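
For reference, a minimal sketch of the checkpoint lookup described above (the helper name is hypothetical; it assumes checkpoints are written to out_dir as step-<n>/lit_model.pth, mirroring the layout the full finetuning script uses):

```python
from pathlib import Path
from typing import Optional

def find_latest_checkpoint(out_dir: Path) -> Optional[Path]:
    """Return the checkpoint with the highest saved step count under out_dir."""
    # Assumes checkpoints are saved as out_dir/step-<n>/lit_model.pth.
    candidates = list(out_dir.rglob("step-*/lit_model.pth"))
    if not candidates:
        return None
    # "step-000123" -> 123; pick the checkpoint with the largest step number.
    return max(candidates, key=lambda p: int(p.parent.name.split("-")[1]))
```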

@@ -16,12 +16,12 @@
 from torchmetrics import RunningMean

 from litgpt.adapter_v2 import GPT, Block, Config, adapter_filter, mark_only_adapter_v2_as_trainable
-from args import EvalArgs, TrainArgs
+from litgpt.args import EvalArgs, TrainArgs
Collaborator


This is also how we import elsewhere and looks good to me.

rasbt (Collaborator) commented Apr 25, 2024

Thanks for looking into this. Sorry, I haven't spent much time thinking through the ramifications here, but would the simple resuming from the full finetuning code not work in your case?

https://github.com/Lightning-AI/litgpt/blob/main/litgpt/finetune/full.py#L43
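
For context, the resume path there keeps iter_num and step_count inside the training state that gets checkpointed, so nothing has to be read back from metrics.csv. Roughly the pattern (a paraphrased sketch, not the exact source; it assumes fabric, model, optimizer, out_dir, and resume come from the surrounding setup, and names may differ):

```python
# Paraphrased sketch of the resume flow in litgpt/finetune/full.py;
# names and details may differ from the current source.
state = {"model": model, "optimizer": optimizer, "iter_num": 1, "step_count": 0}

if resume is True:
    # Pick the checkpoint with the largest step number saved under out_dir.
    resume = max(
        out_dir.rglob("step-*/lit_model.pth"),
        key=lambda p: int(p.parent.name.split("-")[1]),
    )
if resume:
    fabric.print(f"Resuming training from {resume}")
    # Fabric.load restores the model/optimizer tensors and the counters
    # in place, so an adapter script could reuse the same mechanism.
    fabric.load(resume, state)
```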

altria-zewei-wang (Author)

I was specifically testing finetuning with adapters and LoRA for my paper, and my GPU allocation cuts off after a certain time limit. I figured adding this feature could help anyone in a similar situation.
