
Changes suggested to fastai integration #848

Open
2 of 3 tasks
omarespejel opened this issue Apr 21, 2022 · 6 comments
@omarespejel (Contributor) commented Apr 21, 2022

A cluster of changes suggested for the fastai integration after the merge of #678.

@adrinjalali (Contributor)

The third issue is related to the backend rather than to the integration itself, is that correct?

I think we need to fix the first and second issues in the existing PR, since changing them later would change behavior for users.

@osanseviero (Member)

The third point means that instead of raising an error when loading a model that has no TOML config specified, we could load the model anyway and show a clear warning to users.

Pretty much making these checks less strict.

I suggested to Omar that we handle this particular point in a follow-up PR, since I think it's better to ship this and gather feedback from users who will be using it from head (not from a release), so we can make other changes based on that feedback.

I agree the first point in particular should be fixed in the existing PR. The second seems like a minor internal implementation detail that should not change user behavior, so no strong opinions.

@omarespejel (Contributor, Author)

Marking as stale for the moment. Point two is still outstanding: consider using packaging and packaging.markers when checking whether the local fastai and fastcore versions are supported.
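The outstanding point could be sketched as below. This is a hypothetical illustration of how the packaging library might replace manual version-string comparison; the function name and the ">=2.4" specifier are assumptions, not the actual huggingface_hub code.

```python
# Hypothetical sketch: validating an installed version against a
# supported range with packaging, instead of ad-hoc string comparison.
# The specifier ">=2.4" is an illustrative placeholder.
import warnings

from packaging.specifiers import SpecifierSet
from packaging.version import Version


def check_version_supported(name, installed, supported=">=2.4"):
    """Warn (rather than raise) if `installed` falls outside the
    `supported` PEP 440 specifier range; return whether it is supported."""
    if Version(installed) not in SpecifierSet(supported):
        warnings.warn(
            f"{name}=={installed} is outside the supported range "
            f"{supported}; loading may fail."
        )
        return False
    return True
```

packaging.markers could complement this for environment-level checks, e.g. `Marker('python_version >= "3.7"').evaluate()` returns whether the running interpreter satisfies a PEP 508 marker.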

@omarespejel closed this as not planned May 27, 2022
@adrinjalali (Contributor)

Would you mind letting us know why it's closed as won't fix?

@omarespejel omarespejel reopened this Jun 4, 2022
@omarespejel (Contributor, Author)

@adrinjalali, I marked it as stale since fixing it is not on the short-term roadmap. However, I have reopened it now for future reference.

@osanseviero (Member)

That's not a reason to close an issue, though. An issue documents a feature or change we want to make in this repo, both short and long term. Even if it won't be worked on right now, it's still good to have. If the issue is closed, it will likely never be revisited 😄
