Re: [tor-bugs] #13080 [Onionoo]: Evaluate using SonarQube to improve Onionoo's code quality
#13080: Evaluate using SonarQube to improve Onionoo's code quality
-------------------------+-----------------
Reporter: karsten | Owner:
Type: task | Status: new
Priority: normal | Milestone:
Component: Onionoo | Version:
Resolution: | Keywords:
Actual Points: | Parent ID:
Points: |
-------------------------+-----------------
Comment (by iwakeh):
Some thoughts (sort of leading in the opposite direction, but toward
the same goal):
== What and Why
The decision about which quality metrics really matter, both for the
Tor project in general and for Onionoo in particular, should come
first.
In my opinion, the most important metric is test coverage,
because tests document what the code is supposed to do
and verify it once a test is in place. (Onionoo's tests were quite
helpful to me while designing the client API protocol, but I also
noticed that some parts are not yet covered by tests.)
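To illustrate why I rank coverage first: a test both documents and
verifies a contract. The sketch below is purely hypothetical (a
made-up shorten() helper and a plain main() instead of JUnit; this is
not actual Onionoo code):

```java
// Hypothetical sketch, not Onionoo code: a test that documents the
// expected behavior of a fingerprint-shortening helper.
public class FingerprintTest {

  // Assumed helper under test: shortens a 40-char relay fingerprint
  // to an 8-char form.
  static String shorten(String fingerprint) {
    return fingerprint.length() <= 8
        ? fingerprint
        : fingerprint.substring(0, 8);
  }

  static void check(boolean condition, String message) {
    if (!condition) {
      throw new AssertionError(message);
    }
  }

  public static void main(String[] args) {
    // Each check doubles as documentation of the contract.
    check(shorten("9695DFC35FFEB861329B9F1AB04C46397020CE31")
        .equals("9695DFC3"), "long fingerprint is truncated");
    check(shorten("ABCD").equals("ABCD"), "short input is unchanged");
    System.out.println("all tests passed");
  }
}
```

A reader of such a test learns the intended behavior without any
javadoc, which is the kind of documentation I mean above.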
Design metrics (like cycle detection, code complexity, etc.) should
be a top-level concern, too.
After that I would put metrics along the lines of FindBugs,
with a focus on security vulnerabilities.
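For illustration, FindBugs ships an Ant task that could slot into the
build. The fragment below is only a sketch: it assumes
findbugs-ant.jar in lib/ and ${findbugs.home} pointing at a FindBugs
installation, and the directory names are illustrative, not Onionoo's
actual layout:

```xml
<!-- Sketch only: jar location, findbugs.home, and source/class
     directories are assumptions, not Onionoo's real build file. -->
<taskdef name="findbugs"
         classname="edu.umd.cs.findbugs.anttask.FindBugsTask"
         classpath="lib/findbugs-ant.jar"/>

<target name="findbugs" depends="compile">
  <findbugs home="${findbugs.home}"
            output="html" outputFile="reports/findbugs.html"
            effort="max">
    <sourcePath path="src"/>
    <class location="classes"/>
  </findbugs>
</target>
```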
Metrics about style, 'comments per code line', or javadoc coverage
might not be as useful. Who wants to read javadoc like 'this is the
string input to function xYz'? Unfortunately, such comments are
'rewarded' by these metrics.
The Onionoo code base hardly contains any comments. In my opinion
that's fine, and the comments I did encounter were really worth
reading.
== How (and Why)
If coding metrics are to be enforced, it should be possible
to measure them during development itself, i.e. having one or more
ant tasks for measuring coverage, dependencies, etc. would be
convenient.
These can easily be integrated into many IDEs and are also useful
when only a basic editor and the command line are available.
In addition, results from these tasks could be integrated and
visualized in Sonar and/or some CI system. (Does Tor use any CI?)
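As a sketch of such an ant task, JaCoCo provides Ant tasks for
measuring coverage while running the existing JUnit tests. Everything
below is an assumption for illustration (jacocoant.jar in lib/, the
classes/, src/, and test/ directories, the test.classpath reference),
not Onionoo's actual build file:

```xml
<!-- Sketch only: assumes jacocoant.jar is available and that the
     path and target names match the project layout. -->
<taskdef uri="antlib:org.jacoco.ant"
         resource="org/jacoco/ant/antlib.xml"
         classpath="lib/jacocoant.jar"/>

<target name="coverage" depends="compile-tests"
        xmlns:jacoco="antlib:org.jacoco.ant">
  <jacoco:coverage destfile="jacoco.exec">
    <junit fork="true" forkmode="once">
      <classpath refid="test.classpath"/>
      <batchtest todir="reports">
        <fileset dir="test" includes="**/*Test.java"/>
      </batchtest>
    </junit>
  </jacoco:coverage>
  <jacoco:report>
    <executiondata><file file="jacoco.exec"/></executiondata>
    <structure name="Onionoo">
      <classfiles><fileset dir="classes"/></classfiles>
      <sourcefiles><fileset dir="src"/></sourcefiles>
    </structure>
    <html destdir="reports/coverage"/>
  </jacoco:report>
</target>
```

The same jacoco.exec file could then be fed into Sonar or a CI job
for visualization.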
== Integration, Visualization
Once there are meaningful metrics, these should be published using
Sonar, Jenkins CI, or similar.
----
Is there an overview of what other Tor projects use? Which metrics
and visualization tools?
----
--
Ticket URL: <https://trac.torproject.org/projects/tor/ticket/13080#comment:1>
Tor Bug Tracker & Wiki <https://trac.torproject.org/>
The Tor Project: anonymity online
_______________________________________________
tor-bugs mailing list
tor-bugs@xxxxxxxxxxxxxxxxxxxx
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-bugs