Documentation Metrics


All metrics used in the quality model are described below, with useful information and references. They are classified according to their source. Note that several other metrics may be retrieved without being used in the quality model.
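
Each metric comes with a Scale line of four thresholds that map the raw value onto a score from 1 to 5. As an illustrative sketch (the function name and the handling of inverted scales are assumptions based on the notation, not part of the quality model itself):

```python
def scale_score(value, thresholds):
    """Map a raw metric value to a 1-5 score.

    `thresholds` are the four cut points of a scale such as
    '1 < 100 <= 2 < 1000 <= 3 < 10000 <= 4 < 100000 <= 5':
    a value below the first threshold scores 1, a value at or
    above the last threshold scores 5. For inverted metrics
    (lower is better) the thresholds simply decrease.
    """
    increasing = thresholds[0] <= thresholds[-1]
    score = 1
    for t in thresholds:
        # Each threshold crossed raises the score by one.
        if (value >= t) if increasing else (value <= t):
            score += 1
    return score

# e.g. the CLASSES scale: 2500 classes fall between 1000 and 10000
print(scale_score(2500, [100, 1000, 10000, 100000]))  # -> 3
```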



  • Average number of attributes ( ATTRS )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Average number of attributes defined in a class. This can be compared to the total number of operands (TOPD) considered at the class level, and is representative of the data complexity of the class (not including the complexity of methods).

  • Average number of public attributes ( ATTRS_PUBLIC )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Average number of attributes with a public modifier defined in a class.

    The publicness of attributes is meaningful for reusability: if many public attributes are available, finding the right one may be more time-consuming. Also, good object-oriented practice recommends using getters and setters in most cases.

  • Percentage of branches covered by tests ( BRANCH_COVERAGE )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 0 ≤ 2 < 15 ≤ 3 < 36.5 ≤ 4 < 73.3 ≤ 5

    The percentage of branches that are exercised by the tests. The SonarQube measure used for this purpose is branch_coverage.

    For each branch point in the code (e.g. an if or a loop condition), branch coverage answers the following question: has each possible branch been taken during the execution of the unit tests?

  • Number of classes ( CLASSES )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 100 ≤ 2 < 1000 ≤ 3 < 10000 ≤ 4 < 100000 ≤ 5

    The number of Java classes, including nested classes, interfaces, enums and annotations.

    The SonarQube measure used for this purpose is classes. This metric is often used as a rule-of-thumb indicator of the size of the software, and to make other measures relative.

    See also SonarQube's definitions for size metrics: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Size.

  • Comment rate ( COMMENT_LINES_DENSITY )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 10 ≤ 2 < 15 ≤ 3 < 20 ≤ 4 < 30 ≤ 5

    The ratio of comment lines to the total number of lines (code plus comments), expressed as a percentage.

    Density of comment lines = Comment lines / (Lines of code + Comment lines) * 100. With such a formula, 50% means that the number of lines of code equals the number of comment lines and 100% means that the file only contains comment lines. The SonarQube measure used for this purpose is comment_lines_density.
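
    As a minimal sketch of the formula above (illustrative Python, not part of SonarQube):

```python
def comment_lines_density(lines_of_code, comment_lines):
    """Density of comment lines, as SonarQube defines it:
    comment lines / (lines of code + comment lines) * 100."""
    total = lines_of_code + comment_lines
    # Return 0 for an empty file rather than dividing by zero.
    return 100.0 * comment_lines / total if total else 0.0

print(comment_lines_density(800, 200))  # -> 20.0
print(comment_lines_density(100, 100))  # -> 50.0
```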

    See also SonarQube's page on comment lines metrics: http://docs.sonarqube.org/display/SONAR/Metrics+-+Comment+lines.

  • Number of downloads on the web site ( DL_REPO_1M )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of downloads on the web site of the project (download page) during the last three months. Example of a download page (for CDT): http://www.eclipse.org/downloads/ .

  • Number of downloads on update site ( DL_UPDATE_SITE_1M )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of downloads on the update site of the project during the last three months. It is computed using successful and failed installs from the Marketplace.

  • Cloning density ( DUPLICATED_LINES_DENSITY )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 50 ≤ 2 < 40 ≤ 3 < 30 ≤ 4 < 10 ≤ 5

    The number of duplicated lines in the code, divided by the total number of lines and expressed as a percentage. The SonarQube metric used for this purpose is duplicated_lines_density.

    See also SonarQube's definition for duplications: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Duplications. There is a longer description of the code cloning algorithm here: http://docs.sonarqube.org/display/SONAR/Duplications.

  • Number of files ( FILES )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 100 ≤ 2 < 1000 ≤ 3 < 10000 ≤ 4 < 100000 ≤ 5

    The number of code files (e.g. files with a .java extension for Java) in the source code hierarchy. The SonarQube measure used for this purpose is files. This metric is often used as a rule-of-thumb indicator of the size of the software, and to make other measures relative.

  • Number of functions ( FUNCTIONS )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 100 ≤ 2 < 1000 ≤ 3 < 10000 ≤ 4 < 100000 ≤ 5

    The number of functions defined in the source files.

    The SonarQube measure used for this purpose is functions. This metric is often used as a rule-of-thumb indicator of the size of the software, and to make other measures relative.

    See also SonarQube's definitions for size metrics: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Size. More details on the metric: http://docs.sonarqube.org/display/SONAR/Metrics+-+Methods.

  • Average Cyclomatic Complexity ( FUNCTION_COMPLEXITY )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 5.3 ≤ 2 < 3.5 ≤ 3 < 2.8 ≤ 4 < 2.3 ≤ 5

    The average number of execution paths found in functions.

    Basically, the counter is incremented every time the control flow of the function splits, and any function has a cyclomatic number of at least 1. The sum of the cyclomatic numbers of all functions is then divided by the number of functions. The SonarQube measure used for this purpose is function_complexity.

    The cyclomatic number is a measure borrowed from graph theory, introduced to software engineering by McCabe in [McCabe1976]. It is defined as the number of linearly independent paths through the program. For good testability and maintainability, McCabe recommends that no program module (or function, in the case of Java) exceed a cyclomatic number of 10. It is primarily defined at the function level and summed up for higher-level artefacts.
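
    The counting rule can be illustrated with a small sketch in Python (the set of decision nodes below is a simplified assumption; SonarQube's rules for Java differ in detail):

```python
import ast

# Decision-point node types: a simplified subset of what real
# analysers count as control-flow splits.
_BRANCHES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def cyclomatic(func_source):
    """Cyclomatic number of a function: 1 plus the number of
    points where control flow splits (illustrative sketch)."""
    tree = ast.parse(func_source)
    return 1 + sum(isinstance(node, _BRANCHES)
                   for node in ast.walk(tree))

src = """
def classify(x):
    if x < 0:
        return 'negative'
    for i in range(x):
        if i % 2:
            return 'odd seen'
    return 'done'
"""
print(cyclomatic(src))  # 1 + if + for + if = 4
```

    Averaging this value over all functions of a project gives FUNCTION_COMPLEXITY.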

    See also Wikipedia's entry on cyclomatic complexity: http://en.wikipedia.org/wiki/Cyclomatic_complexity.

    See also SonarQube's definition for code complexity: http://docs.sonarqube.org/display/SONAR/Metrics+-+Complexity. There is also a discussion about its meaning here: http://www.sonarqube.org/discussing-cyclomatic-complexity/

    See also the Maisqual wiki for more details on complexity measures: http://maisqual.squoring.com/wiki/index.php/Category:Complexity_Metrics.

  • IP log code coverage ( IP_LOG_COV )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Percentage of contributions covered by an appropriate IP log.

    The Eclipse Foundation has a very strict IP policy, so this measure should always be 100%.

  • Defect density ( ITS_BUGS_DENSITY )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 3.36826347305 ≤ 2 < 0.0755602867593 ≤ 3 < 0.0335380213652 ≤ 4 < 0.009367219332 ≤ 5

    Ratio of the total number of tickets in the issue tracking system to the size of the source code in KLOC, at the time of the data retrieval.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Source code is all source code found in the source code management repository.
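
    A sketch of the ratio, assuming the ticket count and code size are already known:

```python
def defect_density(ticket_count, lines_of_code):
    """Tickets per KLOC: total tickets in the issue tracking
    system divided by the source size in thousands of lines."""
    return ticket_count / (lines_of_code / 1000.0)

print(defect_density(150, 50000))  # -> 3.0 tickets per KLOC
```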

  • ITS issues changed lifetime ( ITS_CHANGED )

    Provided by: EclipseIts

    Used by:

    Scale: 1 < 2 ≤ 2 < 4 ≤ 3 < 9.75 ≤ 4 < 80 ≤ 5

    Number of updates on issues during the overall time range covered by the analysis.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Update means an operation on a ticket changing its state, or adding information. Opening and closing a ticket are considered as updates. Identities of updaters are the character strings found in the corresponding field in the ticket information. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).
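
    The time-range convention used throughout these metrics (a calendar-month window whose last day is the day before retrieval) can be sketched as follows; day-of-month edge cases (e.g. a retrieval on the 31st) are not handled in this illustration:

```python
from datetime import date, timedelta

def analysis_window(retrieval_date, months=1):
    """Calendar-month period ending the day before retrieval:
    retrieval on Feb 3rd gives Jan 3rd .. Feb 2nd, both included.
    Day-of-month edge cases (e.g. the 31st) are not handled."""
    end = retrieval_date - timedelta(days=1)
    # Step the start date back by the requested number of months.
    month = retrieval_date.month - months
    year = retrieval_date.year
    while month < 1:
        month += 12
        year -= 1
    start = retrieval_date.replace(year=year, month=month)
    return start, end

print(analysis_window(date(2024, 2, 3)))
# (datetime.date(2024, 1, 3), datetime.date(2024, 2, 2))
```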

  • ITS issues changers lifetime ( ITS_CHANGERS )

    Provided by: EclipseIts

    Used by:

    Scale: 1 < 2 ≤ 2 < 4 ≤ 3 < 9.75 ≤ 4 < 80 ≤ 5

    Number of distinct identities updating tickets during the overall time range covered by the analysis.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Update means an operation on a ticket changing its state, or adding information. Opening and closing a ticket are considered as updates. Identities of updaters are the character strings found in the corresponding field in the ticket information. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).

  • ITS issues closed lifetime ( ITS_CLOSED )

    Provided by: EclipseIts

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Number of issues closed during the overall time range covered by the analysis.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Closed means that the ticket was moved to a closed state in the issue tracking system. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).

  • Number of issues closed during the last month ( ITS_CLOSED_30 )

    Provided by: EclipseIts

    Used by: QM_ITS

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Number of issues closed during the last month in the issue tracking system.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Closed means that the ticket was moved to a closed state in the issue tracking system. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).

  • ITS issue closers lifetime ( ITS_CLOSERS )

    Provided by: EclipseIts

    Used by:

    Scale: 1 < 2 ≤ 2 < 4 ≤ 3 < 9.75 ≤ 4 < 80 ≤ 5

    Number of distinct identities closing tickets during the overall time range covered by the analysis.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Update means an operation on a ticket changing its state, or adding information. Opening and closing a ticket are considered as updates. Identities of closers are the character strings found in the corresponding field in the ticket information. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).

  • ITS closers ( ITS_CLOSERS_30 )

    Provided by: EclipseIts

    Used by: QM_ITS

    Scale: 1 < 2 ≤ 2 < 4 ≤ 3 < 9.75 ≤ 4 < 80 ≤ 5

    Number of distinct identities closing tickets during the last month in the issue tracking system.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Update means an operation on a ticket changing its state, or adding information. Opening and closing a ticket are considered as updates. Identities of closers are the character strings found in the corresponding field in the ticket information. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).

  • ITS issues opened lifetime ( ITS_OPENED )

    Provided by: EclipseIts

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Number of issues opened during the overall time range covered by the analysis.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Opened means that the ticket information was submitted to the issue tracking system for the first time for that ticket. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).

  • ITS issue openers lifetime ( ITS_OPENERS )

    Provided by: EclipseIts

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Number of distinct identities opening tickets during the overall time range covered by the analysis.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Opened means that the ticket information was submitted to the issue tracking system for the first time for that ticket. Identities of openers are the character strings found in the corresponding field in the ticket information. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).

  • ITS updates ( ITS_UPDATES_3M )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 4 ≤ 2 < 13 ≤ 3 < 37 ≤ 4 < 596 ≤ 5

    Number of updates to tickets during the last three months, in the issue tracking system.

    The subset of tickets considered from the issue tracking system are those specified in the project documentation, for example by specifying a tracker or a project name appearing in ticket data. Update means an operation on a ticket changing its state, or adding information. Opening and closing a ticket are considered as updates. Time range is measured as a three-calendar-month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Nov 3rd to Feb 2nd, both included).

  • Number of jobs ( JOBS )

    Provided by: Hudson

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of jobs defined on the Hudson engine.

  • Number of failed jobs ( JOBS_FAILED_1W )

    Provided by: Hudson

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of jobs that failed during the last week on the Hudson engine.

  • Number of green jobs ( JOBS_GREEN )

    Provided by: Hudson

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of green (successful) jobs on the Hudson engine.

    Green (or blue) jobs in Hudson define successful builds.

  • Number of red jobs ( JOBS_RED )

    Provided by: Hudson

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of red (failed) jobs on the Hudson engine.

    Red jobs in Hudson define failed builds.

  • Number of yellow jobs ( JOBS_YELLOW )

    Provided by: Hudson

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of yellow (unstable) jobs on the Hudson engine.

    Yellow jobs in Hudson denote unstable builds. According to Hudson's documentation, a build is unstable if it was built successfully and one or more publishers report it as unstable. For example, if the JUnit publisher is configured and a test fails, the build will be marked unstable.

  • License identification ( LIC_IDENT )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Is there a list of the different licences used in the product?

    This measure may use an external tool like Ninka (ninka.turingmachine.org) to identify the licences of the components used in the project. Another way is to look for a file named LICENCES in the root folder.

  • Percentage of lines of code covered by tests ( LINE_COVERAGE )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 0 ≤ 2 < 20 ≤ 3 < 48.89 ≤ 4 < 87.7 ≤ 5

    The percentage of source lines of code that are exercised by the tests. The SonarQube measure used for this purpose is line_coverage.

  • Depth of Inheritance Tree ( MDIT )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The maximum depth of inheritance tree of a class within the inheritance hierarchy is defined as the maximum length from the considered class to the root of the class hierarchy tree and is measured by the number of ancestor classes. In cases involving multiple inheritance, the MDIT is the maximum length from the node to the root of the tree [Chidamber1994].

    A deep inheritance tree makes the understanding of the object-oriented architecture difficult. Well structured OO systems have a forest of classes rather than one large inheritance lattice. The deeper the class is within the hierarchy, the greater the number of methods it is likely to inherit, making it more complex to predict its behavior and, therefore, more fault-prone [Chidamber1994]. However, the deeper a class is in the inheritance tree, the greater the potential reuse of inherited methods [Chidamber1994].
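
    As an illustration of the definition (using Python classes rather than Java; counting the universal base class `object` as the root is an assumption of this sketch):

```python
def inheritance_depth(cls):
    """Depth of inheritance tree: length of the longest path
    from `cls` up to the root of the class hierarchy, counted
    in ancestor classes. With multiple inheritance, take the
    maximum over all base classes."""
    if cls is object or not cls.__bases__:
        return 0
    return 1 + max(inheritance_depth(base) for base in cls.__bases__)

class A: pass
class B(A): pass
class C(B): pass

print(inheritance_depth(C))  # 3 ancestors: B, A, object
```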

    See also SonarQube's page on the depth in tree: http://docs.sonarqube.org/display/SONAR/Metrics+-+Depth+in+Tree

  • Average number of methods ( METHS )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Average number of methods defined in a class. This can be compared to the total number of operators (TOPT) considered at the class level.

  • Average number of public methods ( METHS_PUBLIC )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Average number of methods with a public modifier defined in a class.

    The publicness of methods is meaningful for reusability: if many public methods are available, finding the right one may be more time-consuming.

  • Number of favourites on the Marketplace ( MKT_FAV )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 70 ≤ 4 < 300 ≤ 5

    The number of favourites on the Eclipse Marketplace for the project.

    This measure uses the Marketplace REST API to retrieve the number of registered users who marked this project as a favourite. If the project is not registered on the Marketplace, the metric should be zero.

  • Number of successful installs on the Marketplace ( MKT_INSTALL_SUCCESS_1M )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 100 ≤ 2 < 500 ≤ 3 < 1000 ≤ 4 < 10000 ≤ 5

    The number of successful installs for the project as reported on the Marketplace during the last three months.

    This measure uses the list of successful installs available on the Marketplace for every registered project (metrics tab of the project's Marketplace page). If the project is not registered on the Marketplace, the metric should be zero.

  • ML authors ( MLS_SENDERS_30 )

    Provided by: EclipseMls

    Used by: QM_DIVERSITY

    Scale: 1 < 1.5 ≤ 2 < 3 ≤ 3 < 7.5 ≤ 4 < 26 ≤ 5

    Number of distinct senders for messages dated during the last month, in developer mailing list archives.

    Developer mailing list is the list or lists considered as 'for developers' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included). Distinct senders are those with distinct email addresses. Email addresses used are the strings found in 'From:' fields in messages.

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • ML posts ( MLS_SENT_30 )

    Provided by: EclipseMls

    Used by: QM_ACTIVITY

    Scale: 1 < 2 ≤ 2 < 6 ≤ 3 < 22 ≤ 4 < 103 ≤ 5

    Number of messages dated during the last month, in developer mailing list archives.

    Developer mailing list is the list or lists considered as 'for developers' in the project documentation. The date used is the mailing list server date, as stamped in the message. Time range is measured as a calendar month period ending the day before the data retrieval (example: if retrieval is on Feb 3rd, the period is from Jan 3rd to Feb 2nd, both included).

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • ML sent responses ( MLS_SENT_RESPONSE )

    Provided by: EclipseMls

    Used by: QM_SUPPORT

    Scale: 1 < 2 ≤ 2 < 6 ≤ 3 < 22 ≤ 4 < 103 ≤ 5

    Number of responses (i.e. replies in a thread) in developer mailing list archives.

    Threads are identified using 'In-Reply-To' message headers. Developer mailing list is the list or lists considered as 'for developers' in the project documentation.

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.

  • ML threads ( MLS_THREADS )

    Provided by: EclipseMls

    Used by: QM_SUPPORT

    Scale: 1 < 2 ≤ 2 < 6 ≤ 3 < 22 ≤ 4 < 103 ≤ 5

    Number of threads (i.e. an email and its replies) in developer mailing list archives.

    Threads are identified using 'In-Reply-To' message headers. Developer mailing list is the list or lists considered as 'for developers' in the project documentation.

    Note that developer communications are measured through the mailing list, and user communications are measured through the forums.
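
    The thread identification described above can be sketched as follows (messages are represented as plain dictionaries; real archives also need 'References' headers and handling of missing parents):

```python
def count_threads(messages):
    """Split an archive into threads and responses: a message
    without an 'In-Reply-To' header starts a new thread, every
    other message is counted as a response."""
    threads = sum(1 for m in messages if not m.get("In-Reply-To"))
    responses = len(messages) - threads
    return threads, responses

inbox = [
    {"Message-ID": "<1>"},                        # starts a thread
    {"Message-ID": "<2>", "In-Reply-To": "<1>"},  # response
    {"Message-ID": "<3>"},                        # starts a thread
    {"Message-ID": "<4>", "In-Reply-To": "<3>"},  # response
    {"Message-ID": "<5>", "In-Reply-To": "<4>"},  # response
]
print(count_threads(inbox))  # (2, 3): MLS_THREADS, MLS_SENT_RESPONSE
```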

  • Number of non-conformities for analysability ( NCC_ANA )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of violations of rules that impact analysability in the source code.

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Average number of non-conformities for analysability ( NCC_ANA_IDX )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 0.5 ≤ 3 < 0.3 ≤ 4 < 0.1 ≤ 5

    The total number of violations of rules that impact analysability in the source code, per thousand lines of code (KLOC).

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Number of non-conformities for changeability ( NCC_CHA )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of violations of rules that impact changeability in the source code.

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Average number of non-conformities for changeability ( NCC_CHA_IDX )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 3 ≤ 2 < 1 ≤ 3 < 0.5 ≤ 4 < 0.3 ≤ 5

    The total number of violations of rules that impact changeability in the source code, per thousand lines of code (KLOC).

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Number of non-conformities for reliability ( NCC_REL )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of violations of rules that impact reliability in the source code.

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Average number of non-conformities for reliability ( NCC_REL_IDX )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 0.5 ≤ 3 < 0.3 ≤ 4 < 0.1 ≤ 5

    The total number of violations of rules that impact reliability in the source code, per thousand lines of code (KLOC).

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Number of non-conformities for reusability ( NCC_REU )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of violations of rules that impact reusability in the source code.

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Average number of non-conformities for reusability ( NCC_REU_IDX )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 3 ≤ 2 < 1 ≤ 3 < 0.5 ≤ 4 < 0.3 ≤ 5

    The total number of violations of rules that impact reusability in the source code, per thousand lines of code (KLOC).

    See also the page on practices for a list of checked rules and their associated impact on quality characteristics.

  • Source Lines Of Code ( NCLOC )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 915828 ≤ 2 < 198290.25 ≤ 3 < 97961.5 ≤ 4 < 38776.25 ≤ 5

    Number of physical lines that contain at least one character which is neither whitespace, nor a tabulation, nor part of a comment. This is mapped to SonarQube's ncloc metric.

    See also SonarQube's definition for size metrics: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Size. More details can be seen here: http://docs.sonarqube.org/display/SONAR/Metrics+-+Lines+of+code

  • Maximum depth of nesting ( NEST )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The maximum depth of nesting counts the deepest level of nested control structures (including conditions and loops) in a function.

    Deeper nesting threatens the understandability of the code and requires more test cases to cover the different branches. Practitioners usually consider that a function with three or more nesting levels becomes significantly more difficult to understand.
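
    A sketch of the measurement using Python's ast module (the set of nesting constructs counted here is an assumption of this illustration):

```python
import ast

# Control structures that open a new nesting level.
_NESTING = (ast.If, ast.For, ast.While, ast.Try, ast.With)

def max_nesting(func_source):
    """Maximum depth of nested control structures in a
    function body (illustrative sketch)."""
    def depth(node, current=0):
        here = current + isinstance(node, _NESTING)
        children = [depth(child, here)
                    for child in ast.iter_child_nodes(node)]
        return max([here] + children)
    return depth(ast.parse(func_source))

src = """
def f(items):
    for item in items:          # level 1
        if item:                # level 2
            while item > 0:     # level 3
                item -= 1
"""
print(max_nesting(src))  # 3
```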

  • Number of milestones ( PLAN_MILESTONES_VOL )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 5 ≤ 3 < 10 ≤ 4 < 20 ≤ 5

    The number of milestones that occurred during the last five releases.

    Milestones are retrieved from the PMI file and are counted whatever their target release is. Milestones are useful to assess the maturity of a release and improve the predictability of the project's output, in terms of quality and time.

  • Project is on time for next milestone ( PLAN_NEXT_MILESTONE )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Is the project on time for the next milestone?

  • Reviews success rate ( PLAN_REVIEWS_SUCCESS_RATE )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 20 ≤ 2 < 40 ≤ 3 < 60 ≤ 4 < 80 ≤ 5

    What is the percentage of successful reviews over the past 5 releases?

    The reviews are retrieved from the project management infrastructure record, and are considered successful if the entry is equal to success. If fewer than 5 releases are defined, the percentage is computed on these only.

  • ITS information ( PMI_ITS_INFO )

    Provided by: EclipsePmi

    Used by: QM_ITS

    Scale: 1 < 2 ≤ 2 < 3 ≤ 3 < 4 ≤ 4 < 5 ≤ 5

    Is the bugzilla info correctly filled in the PMI records?

    The project management infrastructure file holds information about one or more bugzilla instances. This test checks that at least one bugzilla instance is defined, with a product identifier, a create_url to enter a new issue, and a query_url to fetch all the issues for the project.

  • SCM information ( PMI_SCM_INFO )

    Provided by: EclipsePmi

    Used by: QM_SCM

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    Is the source_repo info correctly filled in the PMI records?

    The project management infrastructure file holds information about one or more source repositories. This test checks that at least one source repository is defined, and accessible.

  • Public API ( PUBLIC_API )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 48347 ≤ 2 < 14018 ≤ 3 < 6497.5 ≤ 4 < 2420.75 ≤ 5

    Number of public Classes + number of public Functions + number of public Properties. The SonarQube measure used for this purpose is public_api.

    See also SonarQube's definition for size metrics: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Size. More details can be seen here: http://docs.sonarqube.org/display/SONAR/Metrics+-+Public+API

  • Public documented API ( PUBLIC_API_DOC )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The percentage of the public API that is documented. Density of public documented API = (Public API - Public undocumented API) / Public API * 100. The SonarQube measure used for this purpose is public_documented_api_density.

    See also SonarQube's definition for size metrics: http://docs.sonarqube.org/display/SONAR/Metric+definitions#Metricdefinitions-Size.
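
    The density formula above can be sketched directly (function and variable names are illustrative):

```python
def public_documented_api_density(public_api, public_undocumented_api):
    """(Public API - Public undocumented API) / Public API * 100."""
    if public_api == 0:
        return 0.0  # avoid division by zero for an empty API
    return (public_api - public_undocumented_api) / public_api * 100.0

print(public_documented_api_density(200, 50))  # 75.0
```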

  • Number of violated rules for analysability ( RKO_ANA )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of analysability rules that have at least one violation on the artefact.

  • Number of violated rules for changeability ( RKO_CHA )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of changeability rules that have at least one violation on the artefact.

  • Number of violated rules for reliability ( RKO_REL )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of reliability rules that have at least one violation on the artefact.

  • Number of violated rules for reusability ( RKO_REU )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of reusability rules that have at least one violation on the artefact.

  • Adherence to analysability rules ( ROKR_ANA )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 50 ≤ 4 < 75 ≤ 5

    Ratio of conform analysability practices on the number of checked analysability practices.

  • Adherence to changeability rules ( ROKR_CHA )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 50 ≤ 4 < 75 ≤ 5

    Ratio of conform changeability practices on the number of checked changeability practices.

  • Adherence to reliability rules ( ROKR_REL )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 50 ≤ 4 < 75 ≤ 5

    Ratio of conform reliability practices on the number of checked reliability practices.

  • Adherence to reusability rules ( ROKR_REU )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 10 ≤ 2 < 30 ≤ 3 < 50 ≤ 4 < 75 ≤ 5

    Ratio of conform reusability practices on the number of checked reusability practices.

  • Number of rules for analysability ( RULES_ANA )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of rules checked for analysability.

  • Number of rules for changeability ( RULES_CHA )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of rules checked for changeability.

  • Number of rules for reliability ( RULES_REL )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of rules checked for reliability.

  • Number of rules for reusability ( RULES_REU )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The total number of rules checked for reusability.

  • SCM committers ( SCM_AUTHORS_30 )

    Provided by: EclipseScm

    Used by: QM_DIVERSITY , QM_SCM

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 18 ≤ 5

    Total number of identities found as authors of commits in source code management repositories dated during the last three months.

    Source code management repositories are those listed as such in the project documentation. Commits in all branches are considered. The date used for each commit is the 'author date' (when the author date and committer date differ). An identity is counted as an author if it appears as such in the commit record (for systems that log several identities per commit, the authoring identity is used). The time range is a three-calendar-month period ending the day before data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).
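
    A minimal sketch of the counting rule, assuming a simplified list of (author identity, author date) pairs as input and stdlib date arithmetic for the calendar-month window:

```python
import calendar
from datetime import date, timedelta

def three_month_window(retrieval):
    """(start, end) of the three-calendar-month window ending the day
    before `retrieval`, both bounds included."""
    end = retrieval - timedelta(days=1)
    month, year = retrieval.month - 3, retrieval.year
    if month < 1:
        month, year = month + 12, year - 1
    # clamp the day for short months (e.g. May 31st -> Feb 28th)
    day = min(retrieval.day, calendar.monthrange(year, month)[1])
    return date(year, month, day), end

def distinct_authors(commits, retrieval):
    """commits: iterable of (author identity, author date) pairs."""
    start, end = three_month_window(retrieval)
    return len({author for author, d in commits if start <= d <= end})

commits = [
    ("alice", date(2017, 1, 15)),
    ("bob", date(2017, 1, 20)),
    ("alice", date(2016, 11, 10)),
    ("carol", date(2016, 9, 1)),  # outside the window
]
print(distinct_authors(commits, date(2017, 2, 3)))  # 2
```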

  • SCM Commits ( SCM_COMMITS_30 )

    Provided by: EclipseScm

    Used by: QM_ACTIVITY , QM_SCM

    Scale: 1 < 2 ≤ 2 < 5 ≤ 3 < 13 ≤ 4 < 121 ≤ 5

    Total number of commits in source code management repositories dated during the last three months.

    Source code management repositories are those listed as such in the project documentation. Commits in all branches are considered. The date used for each commit is the 'author date' (when the author date and committer date differ). The time range is a three-calendar-month period ending the day before data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

  • Files committed ( SCM_FILES )

    Provided by: EclipseScm

    Used by: QM_ACTIVITY

    Scale: 1 < 3 ≤ 2 < 19 ≤ 3 < 95.75 ≤ 4 < 2189 ≤ 5

    Total number of files touched by commits in source code management repositories dated during the last three months.

    Source code management repositories are those listed as such in the project documentation. Commits in all branches are considered. A file is 'touched' by a commit if its content or its path is modified by the commit. The date used for each commit is the 'author date' (when the author date and committer date differ). The time range is a three-calendar-month period ending the day before data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).

  • File stability index ( SCM_STABILITY_3M )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 3.79487179487 ≤ 2 < 1.4430952381 ≤ 3 < 1.14285714286 ≤ 4 < 1 ≤ 5

    Average number of commits touching each file in source code management repositories dated during the last three months.

    Source code management repositories are those listed as such in the project documentation. Commits in all branches are considered. A file is 'touched' by a commit if its content or its path is modified by the commit. The date used for each commit is the 'author date' (when the author date and committer date differ). The time range is a three-calendar-month period ending the day before data retrieval (example: if retrieval is on Feb 3rd, the period runs from Nov 3rd to Feb 2nd, both included).
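
    A minimal sketch of the average, assuming each commit is represented by the list of file paths it touches (a hypothetical input format):

```python
from collections import Counter

def stability_index(commit_files):
    """commit_files: one list of touched file paths per commit.
    Returns the average number of commits touching each file."""
    touches = Counter(path for files in commit_files for path in files)
    if not touches:
        return 0.0
    return sum(touches.values()) / len(touches)

commits = [["a.c", "b.c"], ["a.c"], ["a.c", "c.c"]]
print(round(stability_index(commits), 2))  # 5 touches over 3 distinct files -> 1.67
```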

  • Stack Overflow Answers (5Y) ( SO_ANSWERS_VOL_5Y )

    Provided by: StackOverflow

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 3 ≤ 4 < 4 ≤ 5

    The number of answers to questions related to the project's tag posted on Stack Overflow during the last 5 years.

    Having many answers posted about the project indicates strong interest from the community and good support. The list of questions and their answers associated with the tag can be browsed on the [Stack Overflow web site](https://stackoverflow.com/questions/tagged/java).

  • Stack Overflow Answer rate (5Y) ( SO_ANSWER_RATE_5Y )

    Provided by: StackOverflow

    Used by:

    Scale: 1 < 1 ≤ 2 < 2 ≤ 3 < 4 ≤ 4 < 5 ≤ 5

    The average number of answers per question related to the project's tag on Stack Overflow during the last 5 years.

    Having many answers posted about the project indicates strong interest from the community and good support. The list of questions and their answers associated with the tag can be browsed on the [Stack Overflow web site](https://stackoverflow.com/questions/tagged/java).

  • Stack Overflow Questions (5Y) ( SO_QUESTIONS_VOL_5Y )

    Provided by: StackOverflow

    Used by:

    Scale: 1 < 5 ≤ 2 < 50 ≤ 3 < 400 ≤ 4 < 800 ≤ 5

    The number of questions related to the project's tag posted on Stack Overflow during the last 5 years.

    Having many questions posted about the project indicates strong interest from the community. The list of questions associated with the tag can be browsed on the [Stack Overflow web site](https://stackoverflow.com/questions/tagged/java).

  • Stack Overflow Views (5Y) ( SO_VIEWS_VOL_5Y )

    Provided by: StackOverflow

    Used by:

    Scale: 1 < 100 ≤ 2 < 1000 ≤ 3 < 2500 ≤ 4 < 5000 ≤ 5

    The total number of views for questions related to the project's tag on Stack Overflow during the last 5 years.

    Having many views on questions about the project indicates strong interest from the community. The list of questions and their answers associated with the tag can be browsed on the [Stack Overflow web site](https://stackoverflow.com/questions/tagged/java).

  • Stack Overflow Votes (5Y) ( SO_VOTES_VOL_5Y )

    Provided by: StackOverflow

    Used by:

    Scale: 1 < 10 ≤ 2 < 100 ≤ 3 < 250 ≤ 4 < 500 ≤ 5

    The total number of votes on questions related to the project's tag on Stack Overflow during the last 5 years.

    Having many votes on questions about the project indicates strong interest from the community. The list of questions and their answers associated with the tag can be browsed on the [Stack Overflow web site](https://stackoverflow.com/questions/tagged/java).

  • Test success density ( TEST_SUCCESS_DENSITY )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 50 ≤ 2 < 65 ≤ 3 < 80 ≤ 4 < 95 ≤ 5

    The percentage of successful tests during the last execution of the test plan. The SonarQube measure used for this purpose is test_success_density.

    Computation is as follows: Test success density = (Unit tests - (Unit test errors + Unit test failures)) / Unit tests * 100, where Unit test errors is the number of unit tests that failed with an unexpected exception, and Unit test failures is the number of unit tests that failed with an assertion error.

  • Number of tests relative to the code size ( TST_VOL_IDX )

    Provided by: No provider defined!

    Used by:

    Scale: 1 < 2 ≤ 2 < 5 ≤ 3 < 7 ≤ 4 < 10 ≤ 5

    The total number of test cases for the product, divided by the size of the code in thousands of source lines (KSLOC).

    The metric is computed from SonarQube's tests and ncloc metrics.


Page generated by Alambic 3.2 on Wed Feb 22 14:39:54 2017.