
Performance issue in GitBuildVersionCore #800

Open
TFTomSun opened this issue Sep 3, 2022 · 7 comments

Comments

@TFTomSun

TFTomSun commented Sep 3, 2022

We have recently been seeing very bad performance in our builds, caused by GetBuildVersionCore in a solution with about 25 projects:

[screenshot]

Each call takes about 1.2 to 1.5 seconds:
[screenshot]

What could be the root cause of that problem? Are there any network calls done or is the git height determined based on the local cloned repository?

We generate version.json files per project with longer lists of path filters. Could that cause problems?
[screenshot]
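For context, a per-project version.json with path filters looks roughly like this (pathFilters is part of the NB.GV version.json schema; the version number and paths below are made-up examples, not the actual files from this repo):

```json
{
  "$schema": "https://raw.githubusercontent.com/dotnet/Nerdbank.GitVersioning/master/src/NerdBank.GitVersioning/version.schema.json",
  "version": "1.0",
  "pathFilters": [
    "./",
    "../Shared"
  ]
}
```

Paths are interpreted relative to the directory containing the version.json file, and only commits touching matching paths contribute to the version height.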

@AArnott
Collaborator

AArnott commented Sep 4, 2022

> Are there any network calls done or is the git height determined based on the local cloned repository?

It's all done based on local git history.

> We generate version.json files per project with longer lists of path filters

NB.GV does a lot of caching to prevent the git history walk from being repeated during a build, but when you have a separate version.json file per-project, each project has to do its own walk, compounding the time required.
I don't know how much path filters impact the build perf.

What version of Nerdbank.GitVersioning are you using? The 3.4+ versions default to a much faster all-managed implementation of git.
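For anyone checking which implementation they are on, the package reference typically looks like this (the version number below is only an illustrative example):

```xml
<ItemGroup>
  <!-- Nerdbank.GitVersioning 3.4+ defaults to the faster all-managed git implementation -->
  <PackageReference Include="Nerdbank.GitVersioning" Version="3.5.119" PrivateAssets="all" />
</ItemGroup>
```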

@TFTomSun
Author

TFTomSun commented Sep 4, 2022

We use the latest 3.5.* version, but the behavior is the same with 3.4. So I assume it gets worse because of the increasing number of projects and/or path filters. Is 1.2 seconds for a GetBuildVersionCore run a usual duration? I'd like to know whether the number of GetBuildVersionCore executions is the problem or the duration of each execution.

We generate a per-project version.json to increase the version for a project when changes have been made in the project folder or in one of its dependencies. We wanted to avoid all projects in the repo getting a new version even though they and their dependencies were not touched. Is that approach not recommended?

@AArnott
Collaborator

AArnott commented Sep 4, 2022

> Is 1.2 seconds for a GetBuildVersionCore run a usual duration?

That's definitely on the long side of the curve, but it's not completely crazy. I just tried on a large repo and it took 395ms for one call. It gets slower with the 'version height', so rev'ing the version in version.json will reset it to its 'minimum' for your particular project, path filters, etc.

> We generate a per-project version.json to increase the version for a project when changes have been made in the project folder or in one of its dependencies. We wanted to avoid all projects in the repo getting a new version even though they and their dependencies were not touched. Is that approach not recommended?

Personally, I have never bothered with that. It's very difficult to maintain (it sounds like you have a tool that helps with that, which is great), but mostly it's a hassle, and I haven't seen evidence that it pays off for customers. If I have a repo that builds a bunch of packages, I tend to find it more useful to push all the packages together; everyone can readily see that this set of packages goes together because all their version numbers match exactly. There are certainly arguments on both sides of this.

By having just one version.json file in the repo (and no path filters), you become compatible with the most effective build perf optimizations that drop GetBuildVersionCore impact on your build to almost zero.
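A minimal single root-level version.json along these lines (one file at the repo root, no pathFilters; the version number is a placeholder) would be:

```json
{
  "$schema": "https://raw.githubusercontent.com/dotnet/Nerdbank.GitVersioning/master/src/NerdBank.GitVersioning/version.schema.json",
  "version": "1.0"
}
```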

@AArnott
Collaborator

AArnott commented Sep 4, 2022

#114 has more history in our perf measurements and opportunities, FYI.

@TFTomSun
Author

TFTomSun commented Sep 5, 2022

Thanks, I read the whole history and it seems there was already a good solution to the problem. I just wonder what happened to this MR: #499 (comment)

@AArnott
Collaborator

AArnott commented Sep 5, 2022

Good point. That PR may have slipped through the cracks. I've created another.

@TFTomSun
Author

TFTomSun commented Sep 5, 2022

Thanks. Just for tracking purposes, here's a link to the new PR: #801
