Describes how to calculate some common metrics.

Note
This tutorial is written for version 1.6.0 of jQAssistant.

1. Introduction

A famous quote by Peter F. Drucker is

If you can't measure it, you can't improve it.

Metrics are just one way to measure your software system. They can help you to understand it and to discover anomalies. Once anomalies are discovered, it's up to you to improve your system so that it meets the metrics.

In the following chapters, I describe different metrics and their purpose. Each chapter consists of the following points:

  • Description of the metric.

  • How to calculate the metric with jQAssistant.

  • A simple example of how to calculate the metric.

2. OOD-Metrics

This chapter describes object-oriented design metrics presented by Robert C. Martin in his book "Agile Software Development: Principles, Patterns, and Practices". He distinguishes between stability metrics, abstractness and the main sequence. These three metrics are described in the following sections.

2.1. Stability Metrics

The stability of a package can be measured by counting its incoming and outgoing dependencies. The fewer dependencies a package has on other packages, the more stable it is. Conversely, the more dependencies it has on other packages, the more unstable it is.

To measure the stability you have to count incoming and outgoing dependencies.

2.1.1. Ca (afferent couplings)

The number of classes outside this component that depend on classes within this component. In short: the number of incoming dependencies.

Computes the afferent couplings for all packages
MATCH (p:Package)
OPTIONAL MATCH (p)-[:CONTAINS]->(it:Java:Type)<-[:DEPENDS_ON]-(et:Java:Type)<-[:CONTAINS]-(ep:Package)
WHERE p <> ep
WITH p, COUNT(et) AS afferentCouplings
SET p.ca = COALESCE(afferentCouplings, 0)
RETURN p
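
For example, assume a hypothetical package com.example.core: if two classes in other packages depend on classes inside it, its afferent coupling is Ca = 2.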

2.1.2. Ce (efferent couplings)

The number of classes inside this component that depend on classes outside this component. In short: the number of outgoing dependencies.

Computes the efferent couplings for all packages
MATCH (p:Package)
OPTIONAL MATCH (p)-[:CONTAINS]->(it:Java:Type)-[:DEPENDS_ON]->(et:Java:Type)<-[:CONTAINS]-(ep:Package)
WHERE p <> ep
WITH p, COUNT(et) AS efferentCouplings
SET p.ce = COALESCE(efferentCouplings, 0)
RETURN p
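
Continuing the hypothetical example: if the classes of com.example.core depend on six classes in other packages, its efferent coupling is Ce = 6.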

2.1.3. I (instability)

Formula
I = Ce / (Ca + Ce)

This metric has the range [0, 1]. I = 0 indicates a maximally stable component: no type (neither an interface nor a class) in this package depends on a type in another package, i.e. the package has no outgoing dependencies. I = 1 indicates a maximally unstable component: no type in another package depends on this package, while at least one type in this package depends on a type in another package, i.e. the package has only outgoing dependencies.

This metric requires the previous execution of Ca (afferent couplings) and Ce (efferent couplings).

Computes the instability for all packages
MATCH (p:Package)
WHERE p.ce + p.ca > 0
WITH p, toFloat(p.ce) / (p.ce + p.ca) as instability
SET p.instability = instability
RETURN p
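
A small worked example with the assumed values from above (Ca = 2, Ce = 6):
I = 6 / (2 + 6) = 0.75
The package is therefore rather unstable.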

2.1.4. Example

OO-Metrics Instability
Figure 1. OO-Metrics Instability

2.2. Abstractness Metrics

The abstractness is the ratio of abstract classes and interfaces to the total number of classes and interfaces in a package.

2.2.1. Na (number abstracts)

The number of abstract classes and interfaces in this component.

Computes the number of abstract classes and interfaces for all packages
MATCH (p:Package) WITH p
OPTIONAL MATCH (p)-[:CONTAINS]->(ac:Java:Type:Class {abstract:true}) WITH p, COUNT(ac) AS numberAbstractClasses
OPTIONAL MATCH (p)-[:CONTAINS]->(i:Java:Type:Interface) WITH p, numberAbstractClasses, COUNT(i) AS numberInterfaces
SET p.na = COALESCE(numberAbstractClasses + numberInterfaces, 0)
RETURN p
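
For the hypothetical package com.example.core, assume it contains two abstract classes and one interface: Na = 2 + 1 = 3.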

2.2.2. Nc (number classes)

The number of all classes and interfaces in this component.

Computes the number of all classes and interfaces for all packages
MATCH (p:Package)
OPTIONAL MATCH (p)-[:CONTAINS]->(c:Java:Type)
WITH p, COUNT(c) AS numberClasses
SET p.nc = COALESCE(numberClasses, 0)
RETURN p
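
Assume com.example.core contains ten types in total: Nc = 10.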

2.2.3. A (abstractness)

Formula
A = Na / Nc

The ratio of abstract classes and interfaces to the number of all classes and interfaces per package. This metric has the range [0, 1]. A=0 indicates that this package has no abstract classes or interfaces. A=1 indicates that this package only contains abstract classes or interfaces.

This metric requires the previous execution of Na (number abstracts) and Nc (number classes).

Computes the abstractness for all packages
MATCH (p:Package)
WHERE p.nc > 0
SET p.abstractness = toFloat(p.na) / p.nc
RETURN p
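
With the assumed values from above (Na = 3, Nc = 10):
A = 3 / 10 = 0.3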

2.2.4. Example

OO-Metrics Abstractness
Figure 2. OO-Metrics Abstractness

2.3. The main sequence

The main sequence describes the relationship between abstractness and instability. Ideally, a package is either maximally abstract and stable or maximally concrete and unstable. Packages with high abstractness and high instability are in the zone of uselessness: classes in this zone are maximally abstract but have no dependents. Packages with low abstractness and low instability are in the zone of pain: classes in this zone are highly stable and concrete. These classes are difficult to change because they are stable, but they cannot be extended because they aren't abstract.

OO-Metrics Distance quarter
Figure 3. OO-Metrics Distance

Ideally, the distance of a package from the main sequence is as low as possible.

2.3.1. D (distance)

Formula
D = ABS(A + I - 1) / SQRT(2)

This is the geometrically defined distance of a package from the main sequence. This metric has the range [0, ~0.707].

This metric requires the previous execution of I (instability) and A (abstractness) as well as their prerequisites.

Computes the distance for all packages
MATCH (p:Package)
SET p.distance = abs(p.abstractness + p.instability -1) / sqrt(2)
RETURN p
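
With the assumed values from above (A = 0.3, I = 0.75):
D = ABS(0.3 + 0.75 - 1) / SQRT(2) = 0.05 / 1.414 ≈ 0.035
The package lies very close to the main sequence.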

2.3.2. D' (normalized distance)

Formula
D' = ABS(A + I - 1)

The normalized distance is more commonly used. This metric has the range [0, 1].

Computes the normalized distance for all packages
MATCH (p:Package)
SET p.normalizedDistance = abs(p.abstractness + p.instability -1)
RETURN p
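
For the same assumed values (A = 0.3, I = 0.75):
D' = ABS(0.3 + 0.75 - 1) = 0.05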

2.3.3. Example

OO-Metrics Distance
Figure 4. OO-Metrics Distance

3. Visibility Metrics

This chapter describes the visibility of packages as defined in Visibility Metrics and the Importance of Hiding Things by Herbert Dowalil. The idea of these metrics is to relate the publicly visible types to the total number of types.

3.1. Relative Visibility (RV)

Formula
RV = (NumberOfInnerComponentsVisibleOutside / TotalNumberOfInnerComponents)

The relative visibility is the ratio of the number of inner components visible outside to the total number of inner components.

Computes RV per package
MATCH (p:Package)
WITH p
OPTIONAL MATCH (p)-[:CONTAINS]->(it:Java:Type {visibility:"public"})
WITH p, COUNT(it) AS noicvo
OPTIONAL MATCH (p)-[:CONTAINS]->(it:Java:Type)
WITH p, noicvo, COUNT(it) AS tnoic
WHERE noicvo > 0 AND tnoic > 0
SET p.relativeVisibility = toFloat(noicvo)/tnoic
RETURN p
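
For example, assume the hypothetical package com.example.core contains ten types of which four are public:
RV = 4 / 10 = 0.4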

3.2. Average Relative Visibility (ARV)

Formula
ARV = SumAllRVValues / NumberOfComponents

The average relative visibility is the sum of all relative visibility values divided by the total number of components. This metric is calculated once for the whole system.

This metric requires the previous execution of Relative Visibility (RV).

Computes ARV for all components
MATCH (p:Package)
RETURN SUM(p.relativeVisibility)/COUNT(p) AS averageRelativeVisibility
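
Assume a second hypothetical package com.example.api with three types, all of them public (RV = 1.0). Together with com.example.core (RV = 0.4):
ARV = (0.4 + 1.0) / 2 = 0.7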

3.3. Global Relative Visibility (GRV)

Formula
GRV = (NumberOfVisibleSubcomponentsOfAllComponents / NumberOfAllSubcomponentsOfAllComponents) * 100

The global relative visibility is the number of visible classes across all packages in relation to the total number of classes across all packages. This metric is calculated once for the whole system.

Computes GRV for all components
MATCH (p:Package)
WITH p
OPTIONAL MATCH (p)-[:CONTAINS]->(it:Java:Type {visibility:"public"})
WITH p, COUNT(it) AS noicvo
OPTIONAL MATCH (p)-[:CONTAINS]->(it:Java:Type)
WITH p, noicvo, COUNT(it) AS tnoic
WHERE noicvo > 0 AND tnoic > 0
RETURN toFloat(sum(noicvo))/sum(tnoic) AS globalRelativeVisibility
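
For the two assumed packages (4 of 10 and 3 of 3 types public):
GRV = ((4 + 3) / (10 + 3)) * 100 ≈ 54
Note that the query above returns the plain ratio (≈ 0.54); multiply by 100 to express it as a percentage as in the formula.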

3.3.1. Example

Relative Visibility
Figure 5. Relative Visibility

4. Implementation

These metrics are implemented in the jQAssistant Java Metrics plugin.

Please check the plugin's documentation to learn how to apply these metrics in your project. In case of issues, please report them there.

5. Critique

I've shown different metrics, each with a different purpose. Some focus on coupling, others on visibility or cohesion. Each of them can tell you something about your system.

Important
Metrics are an indicator, not a silver bullet.

The metrics don't know your architectural rules. Therefore, metrics can't express how well your system complies with your architectural rules. Metrics try to condense information into one or a few numbers, but software architecture is a complicated thing and cannot be expressed in just one number. Still, metrics are a good indicator for anomalies with respect to the common understanding of software systems. They are a hint that something is rather unusual. It is up to you to analyse the deviation.

Tip
Define metrics for your system, measure them, and take a closer look if your expectations aren't met. Then improve your code, change your metrics' thresholds, or align your architectural rules. Repeat this process for as long as your system lives.
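
As an illustration, here is a minimal sketch of such a check. It assumes a hypothetical threshold of 0.7 for the normalized distance, that the queries above have already been executed, and that the fqn property (the fully qualified package name provided by the jQAssistant Java plugin) is available.

Lists all packages whose normalized distance exceeds the assumed threshold
MATCH (p:Package)
WHERE p.normalizedDistance > 0.7
RETURN p.fqn AS package, p.normalizedDistance AS normalizedDistance
ORDER BY p.normalizedDistance DESC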