
[WIP] Add NPU support #19308

Draft · wants to merge 2 commits into master from npu_support

Conversation

@hipudding hipudding commented Jan 18, 2024

Add Ascend NPU as a new backend for pytorch-lightning.

Fixes #19498

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

📚 Documentation preview 📚: https://pytorch-lightning--19308.org.readthedocs.build/en/19308/

@hipudding hipudding marked this pull request as draft January 18, 2024 08:55
@github-actions github-actions bot added the pl Generic label for PyTorch Lightning package label Jan 18, 2024

gitguardian bot commented Jan 19, 2024

✅ There are no secrets present in this pull request anymore.

If these secrets were true positives and are still valid, we highly recommend revoking them.
Once a secret has been leaked into a git repository, you should consider it compromised, even if it was deleted immediately.
More information about the risks is available here.


🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.

Do our GitHub checks need improvement? Share your feedback!

@hipudding hipudding force-pushed the npu_support branch 5 times, most recently from 62ad9d2 to 08243c0 Compare February 20, 2024 02:05
@hipudding

Reference implementations: #17700, #19443

@hhllxx1121

Hello, if possible, could you show me a demo of how to use Lightning on an NPU?

@GuWei007

GuWei007 commented Apr 7, 2024

@hipudding @hhllxx1121
More and more device manufacturers need to support their own backends. Can this be achieved by using the PrivateUse1 mechanism?

@hipudding

hipudding commented Apr 8, 2024

> Hello, if possible, could you show me a demo of how to use Lightning on an NPU?

Yes, it's very simple to use the NPU instead of other backends: just set the accelerator to 'npu'. Note that this PR is still under development and does not yet pass all tests. DO NOT use it in production.

Here's the official example, adapted to run on an NPU.

# main.py
# ! pip install torchvision
import torch, torch.nn as nn, torch.utils.data as data, torchvision as tv, torch.nn.functional as F
import lightning as L

# --------------------------------
# Step 1: Define a LightningModule
# --------------------------------
# A LightningModule (nn.Module subclass) defines a full *system*
# (ie: an LLM, diffusion model, autoencoder, or simple image classifier).


class LitAutoEncoder(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 3))
        self.decoder = nn.Sequential(nn.Linear(3, 128), nn.ReLU(), nn.Linear(128, 28 * 28))

    def forward(self, x):
        # in lightning, forward defines the prediction/inference actions
        embedding = self.encoder(x)
        return embedding

    def training_step(self, batch, batch_idx):
        # training_step defines the train loop. It is independent of forward
        x, y = batch
        x = x.view(x.size(0), -1)
        z = self.encoder(x)
        x_hat = self.decoder(z)
        loss = F.mse_loss(x_hat, x)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        return optimizer


# -------------------
# Step 2: Define data
# -------------------
dataset = tv.datasets.MNIST(".", download=True, transform=tv.transforms.ToTensor())
train, val = data.random_split(dataset, [55000, 5000])

# -------------------
# Step 3: Train
# -------------------
autoencoder = LitAutoEncoder()
# select the NPU backend; devices '0,1' trains on NPU devices 0 and 1
trainer = L.Trainer(accelerator='npu', devices='0,1', max_epochs=1, strategy='deepspeed')
trainer.fit(autoencoder, data.DataLoader(train), data.DataLoader(val))
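As a rough illustration of what a `devices='0,1'` argument means, the sketch below normalizes a Trainer-style `devices` specification (a comma-separated string, an integer count, or a list) into a list of device indices. This is a hypothetical helper for explanation only, not Lightning's actual implementation; the name `parse_devices` is invented for this example.

```python
def parse_devices(devices):
    """Normalize a Trainer-style `devices` argument into a list of indices.

    Hypothetical sketch: mimics, but is not, Lightning's internal parsing.
    """
    if isinstance(devices, str):
        # "0,1" -> [0, 1]
        return [int(d) for d in devices.split(",") if d.strip() != ""]
    if isinstance(devices, int):
        # 2 -> [0, 1] (use the first N devices)
        return list(range(devices))
    if isinstance(devices, (list, tuple)):
        return [int(d) for d in devices]
    raise TypeError(f"Unsupported devices specification: {devices!r}")


print(parse_devices("0,1"))  # -> [0, 1]
```

With this reading, `devices='0,1'` in the Trainer call above selects NPU devices 0 and 1.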

@hipudding

> @hipudding @hhllxx1121 More and more device manufacturers need to support their own backends. Can this be achieved by using the PrivateUse1 mechanism?

As far as I know, pytorch-lightning uses abstract accelerator classes and does not currently support accelerator plugins. Since not every backend's accelerator can be merged upstream, an accelerator plugin mechanism would be a good choice.
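A plugin mechanism of that kind could look roughly like the sketch below: a registry that maps accelerator names to classes, so a vendor package can register its backend and a trainer can resolve it from a string like `accelerator='npu'`. This is a self-contained, hypothetical illustration of the idea, not Lightning's actual API; the names `AcceleratorBase`, `register_accelerator`, `resolve_accelerator`, and `NPUAccelerator` are invented for this example.

```python
# Hypothetical accelerator-plugin registry (illustration only).
_ACCELERATOR_REGISTRY = {}


class AcceleratorBase:
    """Minimal stand-in for an abstract accelerator interface."""

    def setup_device(self, index):
        raise NotImplementedError


def register_accelerator(name, cls):
    """Register an accelerator class under a string name."""
    _ACCELERATOR_REGISTRY[name] = cls


def resolve_accelerator(name):
    """Look up and instantiate a registered accelerator by name."""
    return _ACCELERATOR_REGISTRY[name]()


class NPUAccelerator(AcceleratorBase):
    """A vendor-provided backend registered as a plugin."""

    def setup_device(self, index):
        return f"npu:{index}"


register_accelerator("npu", NPUAccelerator)
acc = resolve_accelerator("npu")
print(acc.setup_device(0))  # -> npu:0
```

The key design point is that the core framework only knows the registry and the abstract interface; each backend package registers itself at import time.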

Successfully merging this pull request may close this issue: Add Ascend NPU as a backend
3 participants