Task
Resolution: Done
As we nowadays automatically deploy to the SaaS everything we merge to master, we need better checks on code quality. We can't enforce something like a JaCoCo threshold due to the amount of untested code. What we're looking for instead is PR-level validation. Here are a couple of options:
bitbucket-code-coverage
The easiest solution to roll out I've seen. Using JaCoCo reports as input, it annotates the code in Bitbucket to highlight tested/untested lines. This only requires a Bitbucket Server plugin and running the following commands:
mvn clean verify -Penable-code-coverage jacoco:report
mvn com.atlassian.bitbucket:code-coverage-maven-plugin:publish \
-Dbitbucket.url=https://git.magnolia-cms.com \
-Dbitbucket.user=mmichel \
-Dbitbucket.password=$LDAP_PSWD \
-Dbitbucket.commit.id=8e3839923cb1434dcf320e1c9115676d206ef081 \
-Dcoverage.file=magnolia-core/target/site/jacoco/jacoco.xml \
-Dcoverage.format=JACOCO
Source: https://bitbucket.org/atlassian/bitbucket-code-coverage/src/master/
Problem: this doesn't validate the PR with a yes/no answer to the question of whether the quality is sufficient. However, as agreed in the March 21st architecture meeting, it's another tool in the toolbox, so we might as well give it a try.
Bitbucket code insights
Already built into Bitbucket is an API to which we can post checks of any nature about the code: https://confluence.atlassian.com/bitbucketserver/code-insights-966660485.html
This is (I assume) what professional tools such as SonarQube use. However, because it's a REST API, nothing prevents us from pushing simpler reports/metrics into it: test coverage, Snyk reports, style checks, etc.
The downside is that it will require in-house development work and maintenance, and it will never match SonarQube's own insights.
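To illustrate the in-house work involved, here is a minimal sketch of assembling a custom Code Insights report. The endpoint shape follows the Atlassian documentation linked above; the helper function names, the report key, and the Snyk metric are illustrative assumptions, not an existing implementation:

```python
# Sketch: building a custom report for the Bitbucket Code Insights REST API.
# Endpoint shape per Atlassian docs:
#   PUT /rest/insights/latest/projects/{project}/repos/{repo}/commits/{commit}/reports/{key}
# Helper names and the sample metric below are hypothetical.
import json

def build_insights_report(title: str, result: str, metrics: list) -> dict:
    """Assemble the JSON body expected by the Code Insights report endpoint."""
    assert result in ("PASS", "FAIL")
    return {
        "title": title,
        "result": result,
        "data": metrics,
        "reporter": "Foundation",
    }

def insights_url(base: str, project: str, repo: str, commit: str, key: str) -> str:
    """Build the report URL for a given commit."""
    return (f"{base}/rest/insights/latest/projects/{project}"
            f"/repos/{repo}/commits/{commit}/reports/{key}")

# Example: a hypothetical Snyk check pushed as an insights report.
report = build_insights_report(
    "Snyk vulnerability check", "PASS",
    [{"title": "High severity issues", "type": "NUMBER", "value": 0}],
)
print(insights_url("https://git.magnolia-cms.com", "PLATFORM", "main",
                   "8e3839923cb1434dcf320e1c9115676d206ef081", "snyk-check"))
print(json.dumps(report))
```

The actual HTTP PUT (with authentication) would wrap this payload, e.g. via curl as shown further below.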
jacoco-report GitHub action
Using the JaCoCo data and a bit of tinkering, combined with the Bitbucket code insights, it is possible to calculate the coverage for the PR's changed files. This coverage can then be compared to a threshold, failing the PR if deemed insufficient.
See https://github.com/Madrapps/jacoco-report as well as attached screenshot for Bitbucket Server rendering
curl -u mmichel:$LDAP_PSWD --request PUT 'https://git.magnolia-cms.com/rest/insights/latest/projects/PLATFORM/repos/main/commits/8e3839923cb1434dcf320e1c9115676d206ef081/reports/test-report-1' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "title": "Coverage PR quality check",
    "details": "This check evaluates whether coverage on changed files is sufficient.",
    "result": "FAIL",
    "data": [
      { "title": "Number of analysed files", "type": "NUMBER", "value": 1 },
      { "title": "Coverage on changed files", "type": "PERCENTAGE", "value": 80 },
      { "title": "Safe to merge?", "type": "BOOLEAN", "value": false }
    ],
    "reporter": "Foundation"
  }'
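The "bit of tinkering" mentioned above could look something like this minimal sketch: compute line coverage over the changed files from a standard JaCoCo XML report. The sample report, file paths, and threshold are illustrative; real code would take the changed-file list from the PR diff (e.g. `git diff --name-only master...`):

```python
# Sketch: compute line coverage on a PR's changed files from a JaCoCo XML
# report, then decide pass/fail against a threshold.
import xml.etree.ElementTree as ET

def coverage_for_changed_files(jacoco_xml: str, changed_files: set) -> float:
    """Return the covered-line percentage across the changed source files."""
    root = ET.fromstring(jacoco_xml)
    missed = covered = 0
    for package in root.iter("package"):
        for source in package.iter("sourcefile"):
            path = f"{package.get('name')}/{source.get('name')}"
            if path not in changed_files:
                continue
            # JaCoCo emits per-sourcefile counters; LINE carries line coverage.
            for counter in source.iter("counter"):
                if counter.get("type") == "LINE":
                    missed += int(counter.get("missed"))
                    covered += int(counter.get("covered"))
    total = missed + covered
    return 100.0 * covered / total if total else 0.0

# Tiny inline report standing in for a real jacoco.xml
SAMPLE = """<report name="demo">
  <package name="info/magnolia/core">
    <sourcefile name="Foo.java">
      <counter type="LINE" missed="20" covered="80"/>
    </sourcefile>
  </package>
</report>"""

pct = coverage_for_changed_files(SAMPLE, {"info/magnolia/core/Foo.java"})
print(f"{pct:.1f}% -> {'PASS' if pct >= 70 else 'FAIL'}")  # 80.0% -> PASS
```

The resulting percentage and verdict would then be pushed to the insights endpoint as in the curl command above.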
SonarQube
We have already looked into rolling out SonarQube. It's likely a popular name on the market for a good reason; however, we've found it expensive per line of code, and it would need to be adapted to our code style.
- Would the price be lower if we ran it ourselves? As we use Bitbucket Server, we actually have no choice but to deploy it ourselves.
- Can it run exclusively on pull requests, dramatically lowering the number of lines of code it scans and thus the price? Yes it can, but this has no impact on pricing.
- That being said, I have pulled our most critical repositories (defined as: is a dependency of the SaaS webapp) and the total number of Java lines of code relevant to Sonar is 230k, which is only $100 a month. We would likely start with the $12/month plan, and even if we scaled up, the cost would be totally acceptable (assuming developers like the added value).
Conclusion: we roll out SonarQube.
is related to:
- MGNLTEST-215 Setup SonarQube instance with fargate (Closed)
- MGNLTEST-213 Enable SonarQube to use (on some) "core" projects (Closed)