
How to load the weights from a checkpoint when we didn't call self.save_hyperparameters()? #19290


So, will the pretrained model be overridden by the randomly initialized modelA and modelB weights?

I think not.
The LightningModule `load_from_checkpoint` method looks like this:

@_restricted_classmethod
def load_from_checkpoint(
    cls,
    checkpoint_path: Union[_PATH, IO],
    map_location: _MAP_LOCATION_TYPE = None,
    hparams_file: Optional[_PATH] = None,
    strict: bool = True,
    **kwargs: Any,
) -> Self:

and the description of `strict` is: "Whether to strictly enforce that the keys in :attr:`checkpoint_path` match the keys returned by this module's state dict."
Since your LightningModule checkpoint contains both models (you can print the checkpoint to check this) and `strict` defaults to True, the checkpoint's state dict is loaded over the randomly initialized modelA and modelB weights, not the other way around.
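
To make this concrete, here is a minimal sketch (the `LitWrapper` module, the `hidden_dim` argument, and the checkpoint path are hypothetical, not from this thread): printing the checkpoint's `state_dict` shows keys for both submodels, and because `strict=True` by default, `load_from_checkpoint` restores those stored weights over the freshly initialized ones. Since `self.save_hyperparameters()` was not called, any constructor arguments have to be passed again through `**kwargs`.

import torch
from torch import nn
import lightning.pytorch as pl


class LitWrapper(pl.LightningModule):  # hypothetical module for illustration
    def __init__(self, hidden_dim: int = 128):
        super().__init__()
        # Two submodels; at construction time both are randomly initialized.
        self.modelA = nn.Linear(hidden_dim, hidden_dim)
        self.modelB = nn.Linear(hidden_dim, hidden_dim)
        # Note: no self.save_hyperparameters() call here.


# Inspect the checkpoint on disk: its state_dict holds keys for both submodels,
# e.g. "modelA.weight", "modelA.bias", "modelB.weight", "modelB.bias".
# (weights_only=False because a Lightning checkpoint stores more than tensors.)
ckpt = torch.load("path/to/epoch=9.ckpt", map_location="cpu", weights_only=False)
print(list(ckpt["state_dict"].keys()))

# Because the hyperparameters were not saved, the constructor argument is passed
# again via **kwargs. With strict=True (the default), the checkpoint keys must
# match the module's state dict exactly, and the stored weights overwrite the
# random initialization, not the other way around.
model = LitWrapper.load_from_checkpoint(
    "path/to/epoch=9.ckpt",
    map_location="cpu",
    hidden_dim=128,
)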



Answer selected by HuangChiEn