Implement solution based merging for data collector #1307
Comments
Non-archived link (the docs were merged into the vstest repo): https://github.com/microsoft/vstest/blob/main/docs/RFCs/0031-Test-Run-Attachments-Processing.md |
This issue is stale because it has been open for 3 months with no activity. |
Relevant |
This issue is stale because it has been open for 3 months with no activity. |
Relevant |
This issue is stale because it has been open for 3 months with no activity. |
And still relevant |
I prototyped something here and would like to work on this topic. I will just add my thoughts here and make a proposal for how to implement it. The solution-wide coverage report could be the new default behaviour. There are at least two rough concepts I can think of:
I would now start with implementing option 1 until I have some feedback. I will get back here once I have a more production-ready version of this or run into issues with it. What do you think? @MarcoRossignoli @Bertk @petli |
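For context (an editorial sketch, not part of the comment above): the behaviour this proposal targets is that the collector emits one coverage file per test project when a whole solution is tested, leaving the merge to the user. A minimal illustration, assuming a hypothetical MySolution.sln and default output paths:

```bash
# Today, collecting coverage across a solution with the coverlet collector
# yields one Cobertura file per test project (solution name and paths are
# placeholders).
dotnet test MySolution.sln --collect:"XPlat Code Coverage" --results-directory ./TestResults
# Typical layout afterwards:
#   ./TestResults/<guid-1>/coverage.cobertura.xml
#   ./TestResults/<guid-2>/coverage.cobertura.xml
# Until the collector can merge these itself, they have to be combined
# manually, e.g. with ReportGenerator:
dotnet tool install -g dotnet-reportgenerator-globaltool
reportgenerator "-reports:TestResults/**/coverage.cobertura.xml" "-targetdir:TestResults/merged" "-reporttypes:Cobertura;Html"
```

Solution-based merging in the data collector would make the manual merge step unnecessary.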
I would simplify and, by default, always merge all "projects" (whether you run a solution or a single project). In this way we reuse the merge of the JSON files before converting to the final report type. Users usually want one report to upload to their reporting UX systems. |
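For reference, here is an editorial sketch of the existing JSON merge mentioned above, using the coverlet.msbuild integration (project names and paths are placeholders): each test project merges into a shared coverage.json, and only the last run converts the accumulated data into the final report type.

```bash
# First test project: write intermediate coverlet JSON to a shared folder.
dotnet test tests/ProjectA.Tests /p:CollectCoverage=true \
  /p:CoverletOutput=../../coverage/ \
  /p:CoverletOutputFormat=json
# Second (last) test project: merge with the accumulated JSON and emit the
# final report format.
dotnet test tests/ProjectB.Tests /p:CollectCoverage=true \
  /p:CoverletOutput=../../coverage/ \
  /p:MergeWith=../../coverage/coverage.json \
  /p:CoverletOutputFormat=cobertura
```

Making this the default behaviour for the collector would remove the need to chain the runs by hand.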
For me, this is no longer as critical as it was before, thanks to recent tooling improvements. The one thing that is still off: although I specify coverage thresholds, I still end up checking them myself in the script below. My coverage reporting script now looks like:
# run:
# dotnet tool restore
# prior to running this file
line_threshold=95
branch_threshold=95
# Remove prior test coverage results
rm -rf ./TestResults
rm -rf ./reports/coverage
# Generate Cobertura files, one per test assembly
dotnet test --no-restore --settings CodeCoverage.runsettings --collect "Code Coverage;Format=cobertura" --logger:"junit;LogFilePath=..\reports\unit-tests\;LogFileName={assembly}.test-result.xml;MethodFormat=Class;FailureBodyFormat=Verbose" --results-directory ./TestResults
cobertura_file=./TestResults/coverage.cobertura.xml
# Merge the Cobertura files into a single file
dotnet coverage merge --output-format cobertura --threshold 100 --threshold-type branch --output ${cobertura_file} "./TestResults/**/*.cobertura.xml"
# Generate a consolidated report; the Cobertura output is for GitLab reporting
dotnet reportgenerator "-reports:./TestResults/coverage.cobertura.xml" "-targetdir:reports/coverage" "-reporttypes:Html;Cobertura;CodeClimate"
# Extract line-rate and branch-rate from the Cobertura file
line_rate=$(xmllint --xpath "string(/coverage/@line-rate)" "$cobertura_file")
branch_rate=$(xmllint --xpath "string(/coverage/@branch-rate)" "$cobertura_file")
# Convert line-rate and branch-rate to percentages
line_coverage=$(echo "$line_rate * 100" | bc)
branch_coverage=$(echo "$branch_rate * 100" | bc)
flag=0
# Check if the line coverage meets the threshold
line_result=$(echo "$line_coverage < $line_threshold" | bc -l)
if [ "$line_result" -eq 1 ]; then
echo "Line coverage ($line_coverage%) is below the threshold ($line_threshold%)."
flag=1
fi
# Check if the branch coverage meets the threshold
branch_result=$(echo "$branch_coverage < $branch_threshold" | bc -l)
if [ "$branch_result" -eq 1 ]; then
echo "Branch coverage ($branch_coverage%) is below the threshold ($branch_threshold%)."
flag=1
fi
if [ $flag -eq 0 ]; then
echo "Coverage meets the thresholds. Line coverage: ${line_coverage}% Branch coverage: ${branch_coverage}%"
fi
exit $flag
|
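As an aside (an editorial sketch, not part of the comment above): when the coverlet.msbuild integration is used instead of the data collector, thresholds can be enforced by coverlet itself rather than with the manual xmllint/bc check; the values below are illustrative.

```bash
# Fail the build if total line or branch coverage drops below 95%
# (%2c is an escaped comma for MSBuild property values).
dotnet test /p:CollectCoverage=true \
  /p:CoverletOutputFormat=cobertura \
  /p:Threshold=95 \
  /p:ThresholdType=line%2cbranch \
  /p:ThresholdStat=total
```

Note that these thresholds are evaluated per test project rather than across the whole solution, which is part of why solution-based merging matters here.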
VSTest now allows post-processing of test run attachments, so we can implement a solution-wide report merge.
RFC https://github.com/microsoft/vstest-docs/blob/main/RFCs/0031-Test-Run-Attachments-Processing.md#how-to-register-an-idatacollectorattachmentprocessor
Contributes to #357