
fabric.print only works on sys.stderr, does not print inference result #1384

Open
lastmjs opened this issue May 4, 2024 · 1 comment
Labels
question Further information is requested

Comments


lastmjs commented May 4, 2024

I've just started playing with litgpt. At first I ran some basic inference on the CPU alone, I believe, since I hadn't fully restarted my computer to get the drivers working. The inference result would print in the console.

Once I restarted my computer, I was no longer able to get the inference result to print. Other logging would print, but not the inference output. I finally cloned the repo and figured out that file=sys.stderr needs to be set in litgpt/generate/base.py for the inference to actually print to my terminal.

I am using Ubuntu 22.04 and have tried multiple versions of Python, with and without a virtual env.

This is where I fixed the issue by adding file=sys.stderr:

https://github.com/Lightning-AI/litgpt/blob/main/litgpt/generate/base.py#L238

I am not sure why enabling my drivers and having GPU inference seemed to cause this problem.
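For anyone else hitting this: the change is just the `file=` keyword of Python's built-in `print`. A minimal sketch, with a placeholder string standing in for the real decoded tokens:

```python
import sys

tokens = "generated text"  # placeholder for the decoded inference output

# Default: print writes to the stdout stream
print(tokens)

# With file=sys.stderr it writes to the stderr stream instead, which
# still reaches the terminal even when stdout is redirected elsewhere
print(tokens, file=sys.stderr)
```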

carmocca (Contributor) commented May 6, 2024

The script is designed to print the inference results to stdout and everything else to stderr, in case you want to pipe them separately.
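That split can be sketched in two lines (the strings below are made up, not litgpt's actual output):

```python
import sys

# The inference result goes to stdout so it can be captured on its own,
# e.g. `python demo.py > result.txt`
print("Hello, my name is")

# Progress/timing messages go to stderr; `python demo.py 2>/dev/null`
# would hide them while keeping the result visible
print("Time for inference: 0.50 sec total", file=sys.stderr)
```

With that layout, redirecting stdout (`> result.txt`) captures only the generated text, and redirecting stderr (`2> log.txt`) diverts only the diagnostics.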

There might be something wrong with your environment. Do regular prints in a script work for you?

carmocca added the question (Further information is requested) label May 6, 2024