How to use pytorch-lightning distributed training without SLURM? #1334

You can configure your own environment variables and do your own setup. Just override `LightningModule.init_ddp_connection`:
https://pytorch-lightning.readthedocs.io/en/latest/lightning-module.html#lightningmodule-class
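A minimal sketch of such an override, assuming the 0.7-era hook signature `init_ddp_connection(self, proc_rank, world_size)` (the hook moved and changed signature in later releases). The class name `MyModel` and the `MASTER_ADDR`/`MASTER_PORT` values are placeholders; point them at a rendezvous host reachable by all workers:

```python
import os

import torch.distributed as dist
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def init_ddp_connection(self, proc_rank: int, world_size: int) -> None:
        # Instead of reading SLURM variables, set the rendezvous
        # host/port yourself via plain environment variables.
        os.environ.setdefault("MASTER_ADDR", "127.0.0.1")  # placeholder
        os.environ.setdefault("MASTER_PORT", "12910")      # placeholder
        dist.init_process_group(
            backend="nccl",  # use "gloo" for CPU-only runs
            rank=proc_rank,
            world_size=world_size,
        )
```

Lightning calls this hook once per process before training starts, so anything you export here (or launch-time variables you read, e.g. from `torchrun` or a custom launcher) replaces the default SLURM-based discovery.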

Answer selected by Borda
This discussion was converted from issue #1334 on December 23, 2020 19:31.