
Reuse coverage data from test run (Second try) #214

Closed

mschnelte opened this issue Nov 26, 2021 · 12 comments

@mschnelte

Hi everyone,

I would like to revisit the idea already pitched in #57.
Using coverlet in the runsettings did not work for me because I ran into the known issues.

However, the "native" code coverage from Microsoft is making good progress. Just a few days ago they released a new preview that outputs Cobertura files directly.

For me, taking the coverage output from the test run instead of re-running the whole test project would be a huge improvement. We have integration tests in our projects and they take quite a while to finish, so in its current state FCC is not really usable for us. Taking the coverage results from the runsettings ResultsDirectory would mean that:

  • I can select only a few tests to run instead of whole test projects and
  • tests are only run once.

Could you maybe comment on why the idea proposed in #57 was abandoned?
I have a modified version of FCC that takes the coverage result from the last test run; it works pretty well and lets me use FCC even with slow-running tests.

Here is the runsettings I am using (I have downloaded the preview package into c:\temp\ms.codecoverage):

<RunSettings>
  <RunConfiguration>
    <ResultsDirectory>_place your testresult directory here_</ResultsDirectory>
    <TargetPlatform>X64</TargetPlatform>
    <TargetFrameworkVersion>.NETCoreApp,Version=v3.1</TargetFrameworkVersion>
    <TestAdaptersPaths>C:\Temp\ms.codecoverage\</TestAdaptersPaths>
    <DesignMode>False</DesignMode>
    <CollectSourceInformation>False</CollectSourceInformation>
  </RunConfiguration>
  <DataCollectionRunSettings>
    <DataCollectors>
      <DataCollector friendlyName="Code Coverage" enabled="True">
        <Configuration>
            <Format>Cobertura</Format>
        </Configuration>
      </DataCollector>
    </DataCollectors>
  </DataCollectionRunSettings>
  <LoggerRunSettings>
    <Loggers>
      <Logger friendlyName="Console" uri="logger://microsoft/TestPlatform/ConsoleLogger/v1" assemblyQualifiedName="Microsoft.VisualStudio.TestPlatform.CommandLine.Internal.ConsoleLogger, vstest.console, Version=15.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" codeBase="C:\Program Files\dotnet\sdk\5.0.400\vstest.console.dll" enabled="True" />
    </Loggers>
  </LoggerRunSettings>
</RunSettings>
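
A minimal sketch of an alternative (not my actual setup - I simply downloaded the package by hand): the preview collector could presumably also be pulled in as a PackageReference in the test project instead of pointing TestAdaptersPaths at a local folder. The versions below are placeholders.

<ItemGroup>
  <!-- Placeholder versions - substitute whichever preview you actually want to use -->
  <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.0.0" />
  <PackageReference Include="Microsoft.CodeCoverage" Version="17.1.0-release-20220113-05" />
</ItemGroup>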
@mschnelte
Author

Sorry, I just realized that I probably should have put this in the discussions area instead of an issue. Feel free to close and move it if you prefer.

@tonyhallett
Collaborator

Like this?

A solution with two test projects that have run settings files with the appropriate data collector and path.
One uses solution-wide run settings providing a full path to the results directory. The other specifies its own run settings via an MSBuild property; this one has a relative path to the results directory.
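A sketch of the per-project variant, assuming the standard RunSettingsFilePath MSBuild property (the file name is an assumption):

<PropertyGroup>
  <!-- Points this test project at its own run settings file; file name is a placeholder -->
  <RunSettingsFilePath>$(MSBuildProjectDirectory)\coverage.runsettings</RunSettingsFilePath>
</PropertyGroup>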

CoberturaTest.-.Microsoft.Visual.Studio.-.Experimental.Instance.2022-02-13.18-23-51.mp4

New code.

If there are run settings for the projects whose tests are being run then we skip normal FCC execution and collect the test result directories. When the tests have finished, the Cobertura files are found and the FCCEngine does report generation and coverage lines as usual. Unfortunately the recording cannot show the report appearing in the Fine Code Coverage tool window.

FineCodeCoverage.Running.-.Microsoft.Visual.Studio.2022-02-13.18-28-11.1.1.mp4

@FortuneN
I think that this should be included. It is only available for .NET Core.
https://www.nuget.org/packages/Microsoft.CodeCoverage/17.1.0-release-20220113-05

Benefits

  • Microsoft provides the instrumentation. (I suspect that this is what used to be an enterprise service.)
  • No need to copy dlls.
  • None of this #225
  • Selective coverage - a subset of a project's tests can be run, reducing the time taken for tests and coverage when you are only interested in specific files.

If you are still looking at doing your own instrumentation then you might want to download the nuget and look inside!

( OpenCover is now in archive mode )

@mschnelte
Author

Glad to see that you want to pick this up! :-)

I have already implemented this in my own branch and am using it actively. I wanted to wait for feedback from you guys before pushing anything to the repo.

On top of what you already wrote, I have implemented automatic adding of runsettings with the appropriate test adapter, in case no runsettings are present so that the user does not need to worry about that but still gets the better performance. Much more user-friendly this way, if you ask me. This behaviour can be enabled via a FineCodeCoverage option.

Regarding .net core only: I was able to make it work with .net framework projects as well by copying some needed dlls to the output directory. The code in my branch also does this automatically on project build. In fact I eat my own dogfood (well, your dogfood to be more precise ;-) ) and tested it with FineCodeCoverage's own unit tests; it worked fine apart from the issue with ReportGenerator.
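
The general idea is roughly the following (a sketch only - the source path and the file set are placeholders, not the exact target from the branch):

<Target Name="CopyCodeCoverageBinaries" AfterTargets="Build">
  <ItemGroup>
    <!-- Placeholder source: wherever the Microsoft.CodeCoverage collector binaries live -->
    <CodeCoverageFiles Include="C:\Temp\ms.codecoverage\**\*.dll" />
  </ItemGroup>
  <!-- Copy the collector dlls next to the test assembly so the .NET Framework test host can find them -->
  <Copy SourceFiles="@(CodeCoverageFiles)" DestinationFolder="$(OutputPath)" SkipUnchangedFiles="true" />
</Target>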

I could push my changes to a branch if you grant me contributor rights, so that you could have a look/pick up the changes if you want. I am pretty sure it would need to be polished as I did some quick coding just to get it running.

There was one minor issue in the report generator when using the Microsoft collector that I reported to Daniel.
I have not followed up on that discussion with Daniel yet, as it is a minor shortcoming and I simply did not find the time. But I just looked at it and it seems as if Daniel has fixed it already in December. I have not verified it yet. In that case a version containing the fix would need to be used.

@tonyhallett
Collaborator

tonyhallett commented Feb 13, 2022

Glad to see that you want to pick this up! :-)

I was unaware of the new dotnet functionality. It is similar to what I had suggested in #57 but more relevant now, I think. I think it is essential, but Fortune ultimately decides.

On top of what you already wrote, I have implemented automatic adding of runsettings with the appropriate test adapter, in case no runsettings are present so that the user does not need to worry about that but still gets the better performance.

Interested in that. Is this done at test time rather than via a VS command?

I was able to make it work with .net framework projects.

Interesting. Where did you find information on this solution? On the vstest repo? How does it work?

I could push my changes to a branch if you grant me contributor rights, so that you could have a look/pick up the changes if you want.

That's up to @FortuneN. You could email me your fork?

Daniel has fixed it already in December.

We are currently pinned to version 4.7.1.

Have you changed any of the report generation code?

@tonyhallett
Collaborator

Interested in that. Is this done at test time rather than via a VS command?

Is there a runsettings setter that can be accessed with reflection and that is respected after the TestContainerDiscoverer has finished?

Just noticed that there is some information available through reflection that can save a bit of processing in the code that I wrote this afternoon.

@tonyhallett
Collaborator

Interested in that. Is this done at test time rather than via a VS command?

A build step?

@tonyhallett
Collaborator

adding of runsettings with the appropriate test adapter

I assume you are including the nuget as a zipped tool?

Are you excluding the test adapter? I found that there were results for NUnit3.TestAdapter.
Are you using Fine Code Coverage exclusions / inclusions?

https://docs.microsoft.com/en-us/visualstudio/test/customizing-code-coverage-analysis?view=vs-2022
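
The linked page documents exclusions / inclusions for this collector. A sketch of excluding the adapter module (the regex is an assumption):

<DataCollector friendlyName="Code Coverage" enabled="True">
  <Configuration>
    <Format>Cobertura</Format>
    <CodeCoverage>
      <ModulePaths>
        <Exclude>
          <!-- Placeholder regex - keep the NUnit3 test adapter out of the coverage results -->
          <ModulePath>.*nunit3\.testadapter\.dll$</ModulePath>
        </Exclude>
      </ModulePaths>
    </CodeCoverage>
  </Configuration>
</DataCollector>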

@mschnelte
Author

Tony, all the questions are more or less answered once you have a glance at the code. I have not yet pushed my code to a fork on GitHub.

This issue was in hibernation for months, so I think we can wait a bit longer until we hear back from Fortune.
I will not put work into this until then; I do not want to work on something that might not be picked up. I hope you understand.

@FortuneN
Owner

FortuneN commented Feb 14, 2022

I'm happy to accept more contributors.
The more the merrier.

  1. Is it not possible to create a pull request? (We have accepted some in the past.)
  2. @tonyhallett You are now the driving force behind this project. You tell me who to add as a contributor and I will; no questions asked.

@tonyhallett
Collaborator

tonyhallett commented Feb 14, 2022

Ok cool so we will add the new functionality.

You tell me who to add as a contributor and I will; no questions asked

I would like to determine all the outstanding tasks before opening up to more contributors - no offence @mschnelte.
For example: refactoring (better async / decoupling from VS), better (much better) tests and more of them - yes, I did write them! The React report. Adding the new build task that I wrote some time ago for the zips. Correcting an issue that exists in my GitHub Actions that we use...

Once I have done that - it might be in a month or so - I will accept additional ideas and contributors.

Is it not possible to create a pull request?

@mschnelte I will not be creating any more pull requests of my own until we have added this functionality. So please create a pull request or if the two repos have diverged too much then just send me your code.

@tonyhallett
Collaborator

I will not be creating any more pull requests

Had to resolve issues. I have one more to do this afternoon.

@mschnelte
Author

I just created a fork.
I will sync the code with the latest changes and create a pull request.
