Wednesday, March 19, 2008

(Simple) Metrics

I've been using metrics for a long time (certainly more than 10 years now). I've been using metrics to control project quality (including my own stuff, of course), to define acceptance criteria for outsourced code, to understand the way people work, to "smell" large projects before attempting a refactoring activity, to help make an informed refactor / rewrite decision, to pinpoint functions or classes in need of a careful review, to estimate residual bugs, and so on.

Of course, I use different metrics for different purposes. I also combine metrics to get the right picture. In fact, you can now find several tools to calculate code metrics. You can also find many papers discussing (often with contradictory results) the correlation between a given metric and (e.g.) bug density. In most cases, those papers are misguided, as they look for a correlation between a single metric and the target (like bug density). Reality is not that simple; it can be simplified, but not to that point.

Consider good old cyclomatic complexity. You can use it as-is, and it can be useful to calculate the minimum reasonable number of test cases you need for a single function. It's also known that functions with higher cyclomatic complexity tend to have more bugs. But it's also well known that (on average) there is a strong, positive correlation between cyclomatic complexity (CC) and lines of code (LOC). That's really natural: long functions tend to have a complex control flow. Many people have therefore discounted CC, as you can just look at the highly correlated (and easier to calculate) LOC. Simple reasoning, except it's wrong :-).
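To make the link between control flow and test cases concrete, here is a minimal sketch (in Python, purely for illustration) that approximates CC as 1 + the number of decision points. Real tools handle more constructs (switch cases, ternaries, loop guards) with slightly different counting rules, so treat this as an approximation, not a reference implementation:

```python
import ast

# Decision points counted here: branches, loops, exception handlers.
# This is a simplification of what commercial metric tools do.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe's CC for a snippet of Python source."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES)
                    for node in ast.walk(tree))
    # each short-circuit 'and'/'or' operand adds one more path
    bool_ops = sum(len(node.values) - 1
                   for node in ast.walk(tree)
                   if isinstance(node, ast.BoolOp))
    return 1 + decisions + bool_ops

code = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # prints 3: at least 3 test cases needed
```

The printed value is exactly the minimum reasonable number of test cases mentioned above: one per independent path through the function.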

The problem with that, again, is trying to use just one number to understand something that's too complex to be represented by a single number. A better way is to get both CC and LOC for any function (or method) and then use quadrants.

Here is a real-world example, albeit from a very small program: a smart client invoking a few web services and dealing with some large XML files on the client side. It has been written in C# using Visual Studio, therefore some methods are generated by the IDE. Also, the XML parser is generated from the corresponding XSD. Since I'm concerned with code which is under the programmer's control, I've excluded all the generated files, resulting in about 20 classes. For each method, I gathered the LOC and CC count (more on "how" later). I used Excel to get the following picture:

[Figure: scatter chart of methods, LOC on the vertical axis and CC on the horizontal axis, split into four quadrants]
As you can see, every method is just a dot in the chart, and the chart has been split in 4 quadrants. I'll discuss the thresholds later, as it's more important to understand the meaning of each quadrant first.

The lower-left quadrant is home for low-LOC, low-CC methods. These are the best methods around: short and simple. Most code ought to be there (as it is in this case).

Moving clockwise, the next quadrant (top-left) is for high-LOC, low-CC methods. Although most coding standards somehow restrict the maximum length of any given method, it's pretty obvious that a long method with a low CC is not that bad. It's "linear" code, likely doing some initialization / configuration. No big deal.

The next quadrant (top-right) is for high LOC, high CC methods. Although this might seem the worst quadrant, it is not. High LOC means an opportunity for simple refactoring (extract method, create class, stuff like that). The code would benefit from changes, but those changes may require relatively little effort (especially if you can use refactoring tools).

The lower-right quadrant is the worst: short functions with high CC (there are none in this case). These are the puzzling functions which can pack a lot of alternative paths into just a few lines. In most cases, it's better to leave them alone (if working) or rewrite them from scratch (if broken). When outsourcing, I usually ask that no code falls in this quadrant.

For the project at hand, methods from 3 classes fell in the top-right quadrant, so they were candidates for refactoring. I took a look, and guess what, it was pretty obvious that those methods were dealing with business concerns inside the GUI. There were clearly 3 domain classes crying to be born (one shared by all three methods, one shared by two, one used by the remaining method). Extracting them led to better code, with little effort. This is a rather ordinary experience: quadrants pinpoint problematic code, then it's up to the programmer/designer to find the best way to fix it (or decide to leave it as it is).

A few words on the thresholds: 10 is a rather generous, but commonly accepted, threshold for CC. The threshold for LOC depends heavily on the overall project quality. I've accepted a threshold of 100 in quality-challenged projects; as quality improves (through refactoring / rewriting), we usually lower the threshold. Since this was new development, I adopted 20 LOC as a reasonable threshold.
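The quadrant logic itself is just a pair of comparisons against those thresholds. Here is a small sketch (Python, for illustration; the method names and numbers are made up, not taken from the project above):

```python
LOC_THRESHOLD = 20   # new development; raise toward 100 for legacy code
CC_THRESHOLD = 10    # commonly accepted, somewhat generous

def quadrant(loc: int, cc: int) -> str:
    """Map a method's (LOC, CC) pair to one of the four quadrants."""
    if loc <= LOC_THRESHOLD and cc <= CC_THRESHOLD:
        return "lower-left: short and simple (good)"
    if loc > LOC_THRESHOLD and cc <= CC_THRESHOLD:
        return "top-left: long but linear (no big deal)"
    if loc > LOC_THRESHOLD and cc > CC_THRESHOLD:
        return "top-right: refactoring opportunity"
    return "lower-right: short but convoluted (worst)"

# hypothetical per-method data: name -> (LOC, CC)
methods = {"InitGrid": (85, 4), "Validate": (40, 15), "Parse": (12, 14)}
for name, (loc, cc) in methods.items():
    print(f"{name}: {quadrant(loc, cc)}")
```

Feeding the whole method list through this function (instead of eyeballing the chart) also makes it trivial to enforce the "no code in the lower-right quadrant" rule for outsourced deliveries.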

As I said, I use several different metrics. Some can be used in isolation (like code clones), but in most cases I combine them (for instance, code clones vs. code stability gives a better picture of the problem). Coupling and cohesion should also be considered as pairs, never as single numbers, and so on.
Quadrants are not necessarily the only tool: sometimes I also look at the distribution function of a single metric. This is way superior to what too many people tend to do (like looking at the "average CC", which is meaningless). As usual, a tool is useless if we can't use it effectively.
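To see why the "average CC" hides exactly what you care about, consider a hypothetical set of per-method CC values (one pathological method among many trivial ones; the numbers are invented for illustration) and compare the mean with a simple cumulative distribution:

```python
# hypothetical per-method CC values: nine trivial methods, one monster
ccs = [1, 1, 1, 2, 2, 2, 3, 3, 4, 25]

mean = sum(ccs) / len(ccs)
print(f"mean CC: {mean:.1f}")   # prints 4.4 -- looks perfectly harmless

# cumulative distribution: fraction of methods at or below each value
for threshold in (5, 10, 20):
    frac = sum(cc <= threshold for cc in ccs) / len(ccs)
    print(f"CC <= {threshold}: {frac:.0%}")
```

The mean sits comfortably below any alarm level, while the distribution immediately shows that 10% of the methods blow past every threshold; that tail is what the average erases.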

Speaking of tools, the project above was in C#, so I used Source Monitor, a good free tool working directly on C# sources. Many .NET tools work on the MSIL instead, and while that may seem like a good idea, in practice it doesn't help much when you want a meaningful LOC count :-).

Source Monitor can export in CSV and XML. Unfortunately, the CSV didn't contain the detailed data I wanted, so I had to use the XML. I wrote a short XSLT file to extract the data I needed in CSV format (I suggest you use the "save as" feature, as unwanted spacing / carriage returns added by browsers may cripple the result). Use it freely: I didn't put a license statement inside, but all [my] source code in this blog can be considered under the BSD license unless otherwise stated.
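For readers who would rather skip XSLT altogether, the same extraction can be sketched in Python. Note that the element and attribute names below are made up for illustration only: the real Source Monitor detail export has its own schema, so adapt the paths to what the XML actually contains:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical XML layout, NOT the actual Source Monitor schema --
# inspect the real detail export and adjust the tag/attribute names.
SAMPLE = """<checkpoint>
  <file name="Form1.cs">
    <method name="LoadData" lines="42" complexity="7"/>
    <method name="Parse" lines="12" complexity="14"/>
  </file>
</checkpoint>"""

def to_csv(xml_text: str) -> str:
    """Flatten per-method metrics from the XML into CSV text."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["file", "method", "loc", "cc"])
    for f in root.iter("file"):
        for m in f.iter("method"):
            writer.writerow([f.get("name"), m.get("name"),
                             m.get("lines"), m.get("complexity")])
    return out.getvalue()

print(to_csv(SAMPLE))
```

A script like this has the same advantage as the XSLT route: no browser in the loop, so no stray whitespace or carriage returns crippling the CSV.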

4 comments:

Romano Scuri said...

A CHRISTMAS PRESENT

This article comes like a Christmas present, and the funny thing is that it's Easter.

Well, then... best wishes to everyone!

Carlo.Pescio said...

Romano, I can't claim to have understood the Christmas present thing :-)), but I certainly return the wishes; in fact, I extend them in turn to everyone!

Alessandro Gentilini said...

In SourceMonitor, to get the data for all the methods in a project, the following option must be checked:

File -> Options -> Export -> "Include method-level metrics counts in XML detail exports"

To get the report:

File -> Export Checkpoint Detail(s) As XML...

To apply Carlo Pescio's XSL transformation, here is a JScript script (Microsoft-based):

> cscript /nologo xslt.js details.xml SourceMonitor.xslt

/*
  xslt.js - Transform an XML file by means of an XSLT file.
*/

main( WScript.Arguments.length, WScript.Arguments );

function main( argc, argv )
{
    try {
        if ( argc != 2 ) {
            WScript.StdErr.Write( help() );
        } else {
            WScript.StdOut.Write( transform( argv(0), argv(1) ) );
        }
    } catch ( e ) {
        WScript.StdErr.WriteLine( e.message );
        WScript.Quit( e.number );
    }
}

function transform( xmlFile, xslFile )
{
    var xml = loadXMLDoc( xmlFile );
    var xslt = loadXMLDoc( xslFile );
    return xml.transformNode( xslt );
}

function loadXMLDoc( fileName )
{
    var xmlDoc = new ActiveXObject( "Microsoft.XMLDOM" );
    xmlDoc.async = false;
    xmlDoc.load( fileName );
    if ( xmlDoc.parseError.errorCode != 0 ) {
        throw new Error( xmlDoc.parseError.errorCode, "Error loading ``" +
            fileName + "'' : " + xmlDoc.parseError.reason );
    }
    return xmlDoc;
}

function help()
{
    var s = "NAME";
    s += "\n\t";
    s += "xslt.js - Transform an XML file by means of an XSLT file.";
    s += "\n\n";
    s += "SYNOPSIS";
    s += "\n\t";
    s += "cscript xslt.js xmlFile xsltFile";
    s += "\n\n";
    s += "DESCRIPTION";
    s += "\n\t";
    s += "Transform xmlFile by means of xsltFile.";
    s += "\n\t";
    s += "The transformed file is written to standard output.";
    s += "\n\t";
    s += "An exit code different from zero means errors happened; an ";
    s += "error message is written to standard error.";
    s += "\n";
    return s;
}

Carlo.Pescio said...

Alessandro, thanks for contributing your script.

For one-shot uses like exporting metrics I normally open both files (XML and XSLT) in Visual Studio and let it do the transformation, but a batch routine can be useful (for instance) when you want to get a report as part of the build process.