
ContainerFactory::clearOldContainers() can take an enormous amount of time on Windows #5825

Closed
dktapps opened this issue Oct 23, 2021 · 19 comments

dktapps commented Oct 23, 2021

Bug report

When analysing pmmp/PocketMine-MP@701a71a on Windows (dev dependencies installed, PHP 8.0.11, opcache disabled), PHPStan takes a significant amount of time to do apparently nothing when running vendor/bin/phpstan analyse or vendor/bin/phpstan clear-result-cache.

The below video demonstrates this:

  • vendor/bin/phpstan clear-result-cache takes 5 seconds to do apparently nothing
  • vendor/bin/phpstan analyze (00:15 - 01:43) after clearing the result cache takes a whopping 1 min 10 sec to do nothing visible; the actual scan of the code takes only 17 sec (01:26 - 01:43)
  • vendor/bin/phpstan analyze (01:46 - 01:53) with the result cache takes a whole 7 seconds just to, apparently, load the result cache.

https://youtu.be/0xFON6kY6zo (might not be immediately available, still being processed by YT)

What exactly is going on here? This is really negatively impacting the UX for me.

The issue with pre-analysis taking several seconds (as seen in steps 1 and 3) is an annoying problem I've seen for quite some time, but step 2 taking over 1 minute to do nothing visible is pretty extreme.

mergeable bot commented Oct 23, 2021

This bug report is missing a link to reproduction on phpstan.org.

It will most likely be closed after manual review.

dktapps commented Oct 23, 2021

Machine specs for context:

Dell XPS 9710
i7 11800H @ 2.3 GHz x8
32GB DDR4
PCIe Gen4 SSD

So this is definitely not a problem with hardware.

ondrejmirtes (Member) commented:

Hi, it's happening on your machine, so you need to debug it. You can clone phpstan-src adjacent to the project directory and analyse the project with ../phpstan-src/bin/phpstan. That way you can edit the sources and see where the hang is.

dktapps commented Oct 23, 2021

These hangs can be reproduced on basically any Windows machine; I've reproduced them myself on two different ones.

One thing to note is that it's enormously faster in WSL2 on those same machines.

dktapps commented Oct 23, 2021

A quick Visual Studio profile shows that SplFileInfo::getRealPath() and FilterIterator::next() are taking a large amount of time. FilterIterator::next() alone accounts for almost 40% of the time spent by the main process, while SplFileInfo::getRealPath() accounts for over 20%. I suppose this is Windows' famously slow I/O talking. But a quick git grep suggests the problem comes from PHPStan's dependencies rather than PHPStan itself.

I don't have more detailed info since I don't have any PHP profiler right now.
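
For reference, a minimal micro-benchmark that exercises those two hot calls over a directory tree (a hypothetical harness, not code from PHPStan or its dependencies):

```php
<?php
// Hypothetical micro-benchmark for the two hot calls seen in the profile:
// FilterIterator::next() (via CallbackFilterIterator) and SplFileInfo::getRealPath().
$dir = $argv[1] ?? __DIR__;
$start = hrtime(true);

$all = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
);
// CallbackFilterIterator extends FilterIterator, so iterating it runs
// through FilterIterator::next() for every entry.
$phpFiles = new CallbackFilterIterator($all, fn (SplFileInfo $f): bool => $f->getExtension() === 'php');

$count = 0;
foreach ($phpFiles as $file) {
    $file->getRealPath(); // each call is an extra filesystem round-trip
    $count++;
}

printf("%d files in %.1f ms\n", $count, (hrtime(true) - $start) / 1e6);
```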

johnbillion (Contributor) commented:

Does the same problem occur with other code scanning libraries such as PHP Code Sniffer, Psalm, or Phan?

dktapps commented Oct 23, 2021

Hard to say. I don't use any of those, but I have noticed that PHP CS isn't very fast on PHPStan itself when running on Windows. I don't have any baseline to compare it to, though.

dktapps commented Oct 23, 2021

xdebug profile of a run with the result cache and unchanged files:
[screenshot: xdebug profile]

The first thing that jumps out at me is that PHPStan\DependencyInjection\ContainerFactory::clearOldContainers() takes an extraordinarily large amount of time: 43.7% of a run that only verifies the result cache and doesn't actually analyse anything.

I suspect this is caused by overpopulation of Temp/phpstan/cache/nette.configurator. Wiping out the Temp folder did substantially improve performance, but still not to Linux levels.

A manifest file listing the currently useful containers would probably be better than checking the atime/ctime of all the files; on Windows it would certainly be much faster, since it would be a single I/O operation rather than potentially thousands.
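
A minimal sketch of what that could look like, assuming a hypothetical manifest.json stored next to the compiled containers (this is not PHPStan's actual code):

```php
<?php
// Hypothetical sketch: track live containers in one manifest file instead
// of stat()-ing every file in the cache directory on each run.
function clearOldContainersViaManifest(string $cacheDir, string $currentContainer): void
{
    $manifestPath = $cacheDir . '/manifest.json'; // hypothetical file name
    $manifest = is_file($manifestPath)
        ? (json_decode(file_get_contents($manifestPath), true) ?? [])
        : [];

    // Record when the current container was last used.
    $manifest[$currentContainer] = time();

    // Drop containers not used within the last 2 days.
    $cutoff = time() - 2 * 24 * 60 * 60;
    foreach ($manifest as $file => $lastUsed) {
        if ($lastUsed < $cutoff) {
            @unlink($cacheDir . '/' . $file);
            unset($manifest[$file]);
        }
    }

    // A single read and a single write replace potentially thousands of
    // per-file atime/ctime checks.
    file_put_contents($manifestPath, json_encode($manifest));
}
```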

dktapps changed the title from "Taking an eternity to actually do anything" to "ContainerFactory::clearOldContainers() can take an enormous amount of time on Windows" on Oct 24, 2021
dktapps commented Oct 24, 2021

I can see a variety of ways to optimise clearOldContainers(), but I think the real question is: why are so many containers being generated in the first place? I'm not changing my configuration that frequently... Even just since deleting the cache yesterday, there are now 846 files in nette.configurator.

ondrejmirtes (Member) commented:

@dktapps Diff them to see what's different.

dktapps commented Oct 24, 2021

The problems seem to be caused by two things:

  1. Stub revalidation on every run - a new container is generated to validate stubs, because config.stubValidator.neon is appended to the configuration. AnalyseApplication: Do not re-analyse stubs on every run phpstan-src#730 might reduce this problem, since only one container would be generated instead of two when the result cache is used.
  2. analysedPaths changing in the configuration causes the entire container to be regenerated. For me, something is doing a heck of a lot of this (a sketch of the mechanism follows below):
    [screenshots: hundreds of generated temp folders]

I don't know what is generating all these temp folders, but whatever it is isn't cleaning up after itself. Furthermore, because a different temp path is used every time, a new container is generated every time as well, causing even more pollution.

I initially suspected PhpStorm's phpstan plugin, but I wasn't able to find any code in either the plugin or PHPStan that generates these folders. Perhaps someone else can shed some light on this.
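
To illustrate the mechanism (a simplified sketch, not PHPStan's actual code): the compiled container's file name is derived from a hash over the full parameter set, so any parameter that varies per run, such as analysedPaths pointing into these throwaway temp folders, produces a brand-new container file:

```php
<?php
// Simplified sketch of why the cache explodes: the compiled container's
// file name is derived from a hash over all static parameters.
function containerCacheKey(array $parameters): string
{
    return 'container-' . md5(serialize($parameters)) . '.php';
}

// Two runs that differ only in a throwaway temp path get two containers:
$a = containerCacheKey(['analysedPaths' => ['C:\\Temp\\run-1\\src']]);
$b = containerCacheKey(['analysedPaths' => ['C:\\Temp\\run-2\\src']]);
assert($a !== $b); // the cache directory gains a file for every unique path
```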

dktapps commented Oct 24, 2021

I have to be honest, I find it really stupid that the whole container gets regenerated just because the analysedPaths changed. Is there a way to get around that?

ondrejmirtes (Member) commented:

This should help, please try dev-master: phpstan/phpstan-src@27df5cf

dktapps commented Oct 24, 2021

Thanks, I'd just made a similar change locally.

I wonder if it might be worth making more of those parameters dynamic. It shouldn't change anything functionally; nothing is really lost by passing parameters in from a variable instead of baking them into the container as constants. It would also improve performance overall when changing configurations, since the container wouldn't need to be regenerated every time.
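
For illustration, here is roughly how static vs. dynamic parameters differ in nette/bootstrap's Configurator (a sketch of the underlying mechanism, not PHPStan's actual wiring):

```php
<?php
use Nette\Bootstrap\Configurator;

$pathsFromCommandLine = array_slice($argv, 1);

$configurator = new Configurator();
$configurator->setTempDirectory(__DIR__ . '/temp');

// Static parameters are compiled into the container; changing one forces
// a full container regeneration.
$configurator->addParameters(['level' => 'max']);

// Dynamic parameters are resolved at runtime; changing them between runs
// reuses the same compiled container.
$configurator->addDynamicParameters(['analysedPaths' => $pathsFromCommandLine]);

$container = $configurator->createContainer();
```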

ondrejmirtes (Member) commented:

This caused me to pull a lot of hair out so I don't want to experiment a lot in this area: phpstan/phpstan-src@e5c5e32

dktapps commented Oct 24, 2021

I see. I'll test the analysedPaths change and see how it goes.

dktapps commented Oct 25, 2021

The major performance hit I experienced is fixed by phpstan/phpstan-src@0f6245b, and once the cache ages beyond 2 days after updating, phpstan/phpstan-src@27df5cf ensures that the cache becomes small again, making PHPStan nice and snappy once more.

dktapps closed this as completed Oct 25, 2021
dktapps commented Nov 1, 2021

While updating to PHPStan 1.0 today, I experienced another cache explosion, this time caused by ignoreErrors, since the entire ignoreErrors array is embedded into the container.
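
This is the same mechanism as with analysedPaths above; reusing the hypothetical containerCacheKey() sketch, a single edit to the ignore list changes the hash and forces a fresh container:

```php
<?php
// Reusing the hypothetical containerCacheKey() sketch from earlier:
// adding one ignoreErrors entry changes the hash, so a new container
// is compiled and cached alongside the old one.
function containerCacheKey(array $parameters): string
{
    return 'container-' . md5(serialize($parameters)) . '.php';
}

$before = containerCacheKey(['ignoreErrors' => ['#pattern A#']]);
$after  = containerCacheKey(['ignoreErrors' => ['#pattern A#', '#pattern B#']]);
assert($before !== $after); // every edit to the ignore list forces a rebuild
```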

github-actions bot commented Dec 3, 2021

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

github-actions bot locked as resolved and limited conversation to collaborators Dec 3, 2021