
Metrics in Visual Studio 2013

Radu Vunvulea
Solution Architect
@iQuest




In the previous issue we looked at how software metrics can be measured using Sonar, a tool that is useful not only to the technical lead but to the whole team: any team member can easily check the value of the different metrics in Sonar's web interface.

If we use Visual Studio 2013 as our development environment, we will find that some of these metrics can be calculated directly in Visual Studio, without having to use other applications or tools. In this article we will look at the metrics we can calculate using only what Visual Studio provides.

Why should we run such a tool?

Such a tool can help us not only detect potential problems in our application, but also assess the quality of the code we have written. As we will see further on, all the rules and recommendations that Microsoft has in relation to code can be found in this tool.

Some of the defects discovered by such a tool are difficult to find through unit tests. That is why a tool of this kind can strengthen our confidence that the application we are writing is of good quality.

What metrics can we obtain?

Starting with Visual Studio 2013, all Visual Studio editions (except Visual Studio Test Professional) offer the possibility to calculate metrics directly in the IDE. Right from the start we should know that the number of metrics we can calculate with Visual Studio is limited. Unfortunately, we cannot calculate all the metrics available in Sonar, but there are a few Visual Studio extensions that help us calculate additional metrics beyond the built-in ones.

Visual Studio calculates some of these metrics through Static Code Analysis. It analyzes the code and gives developers feedback on the project and the code they have written, even before it is pushed to source control. Based on this analysis, we can identify possible problems related to:

  • Design
  • Performance
  • Security
  • Globalization
  • Interoperability
  • Duplicated code
  • Code that is not being used

and many other problems. Much also depends on the developer's ability to interpret these results. A rather interesting aspect of this analyzer is that all the rules and recommendations Microsoft has in relation to code, coding style and the way different classes and methods should be used can be found within it. All these rules are grouped into different categories.
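As an illustration, here is a minimal C# sketch (the class and member names are invented for the example) of code the analyzer would typically flag: the unused local falls under the "code that is not being used" category, and a publicly visible field usually draws a design warning.

```csharp
using System;

namespace MetricsDemo
{
    public class OrderProcessor
    {
        // A visible instance field is usually reported as a design issue.
        public int retryCount;

        public void Process(string orderId)
        {
            // This local is never read, so the analyzer would typically report it
            // under the "code that is not being used" category.
            var timestamp = DateTime.Now;

            Console.WriteLine("Processing order " + orderId);
        }
    }
}
```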

This way it can be extremely easy to identify areas of our application that do not use an API as they should. If you wish to create a custom rule, you will need Visual Studio 2013 Premium or Ultimate. These two editions of Visual Studio allow us to add new rules that are specific to the project or the company we are working for. Once these rules are added, the code analyzer will check whether they are respected and will warn us when they are not. Unfortunately, at the moment we can only analyze code written in C#, F#, VB and C/C++. I would have very much liked to be able to analyze JavaScript code as well, to see what its quality is.

Some of our readers might say that this could also be done in older versions of Visual Studio. That is true: this could already be done, more or less, in Visual Studio 2012. What the new version (2013) brings is the possibility to analyze the code without having to run it.

How do we run this tool?

These tools can be run in several ways: manually, from the "Analyze" menu, or automatically. In order to run them automatically, we need to select the "Enable Code Analysis on Build" option for each project we wish to analyze.

Another quite interesting option is to activate a TFS policy that requires the developer to run this analyzer before being able to check in. This option can be activated from the "Check-in Policy" area, where we have to add a new rule of type "Code Analysis".

We must be aware that enforcing such a rule does not guarantee that the developer will also read the report that is generated and take it into account. All it guarantees is that the report is generated. That is why, when we decide to use such tools, each team should be educated to look at these reports and analyze them.

The moment we enforce this rule, we can select which rules must not be broken when checking in to TFS. For instance, a developer will not be able to check in code that uses an instance of an object implementing IDisposable without also calling its Dispose method.
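A hedged sketch of what such a rule catches (the class and method names are invented): the first method creates a StreamReader, which implements IDisposable, but never disposes it; the second wraps it in a using block, which guarantees Dispose is called and satisfies the rule.

```csharp
using System.IO;

public static class ReportReader
{
    // Flagged: the StreamReader implements IDisposable but is never disposed.
    public static string ReadFirstLineLeaky(string path)
    {
        var reader = new StreamReader(path);
        return reader.ReadLine();
    }

    // Passes: the using statement guarantees Dispose is called,
    // even if ReadLine throws.
    public static string ReadFirstLine(string path)
    {
        using (var reader = new StreamReader(path))
        {
            return reader.ReadLine();
        }
    }
}
```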

When a developer attempts to check in code that breaks one of these rules, he will get an error that won't allow him to push the change to TFS without solving the problem first.

In addition, we have the possibility to run this tool as part of the build. In order to do this, we have to activate the corresponding option in the Build Definition.

What does the Code Analysis tell us?

The result of running this tool is a set of warnings. The most important information that a warning contains is:

  • Title: the type of warning
  • Description: a short description of the warning
  • Category: the category it belongs to
  • Action: what we can do in order to solve the problem

Each warning lets us navigate directly to the line of code where the problem lies. Moreover, every warning includes a link to MSDN that explains in detail the cause of the warning and what we can do to eliminate it.

How can we create custom rules?

As I have already said, this can only be done with Visual Studio Premium or Ultimate. In order to do this, we have to go to "New > File > General > Installed Templates > Code Analysis Rule Set".

Once we have a blank rule set, we can specify the different properties we want it to have.

Besides this tool, Visual Studio also offers two other extremely interesting tools.

Code Clones

This tool automatically detects duplicated code. The most interesting thing about it is that it can detect several types of duplicated (cloned) code:

  • Exact match: the code is exactly the same, with no differences
  • Strong match: the code is similar, but not 100% identical (for example, it differs in the value of a string or in the action executed in a given case)
  • Medium match: the code is fairly similar, but there are a few differences
  • Weak match: the code only vaguely resembles other code; the chance that it is actually duplicated is the smallest

Besides this information, for each piece of duplicated code we can find out in how many locations it is duplicated and navigate to the exact line where it appears. Another metric I like quite a lot is the total number of duplicated (cloned) lines; from it, we can quite easily see how many lines of code we could get rid of.
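As a hedged illustration (the class and method names are invented), the two methods below differ only in the property they touch and the string they assign; this is the kind of pair Code Clones would typically report as a strong match rather than an exact one.

```csharp
using System;

public class Customer
{
    public string Name { get; set; }
    public string City { get; set; }
}

public class CustomerValidator
{
    // The two methods below are structurally identical; only the property
    // and the default string differ, so a clone analyzer would likely
    // classify them as strong (near-exact) matches.
    public void ValidateName(Customer customer)
    {
        if (customer == null) throw new ArgumentNullException("customer");
        if (string.IsNullOrWhiteSpace(customer.Name))
            customer.Name = "Unknown";
    }

    public void ValidateCity(Customer customer)
    {
        if (customer == null) throw new ArgumentNullException("customer");
        if (string.IsNullOrWhiteSpace(customer.City))
            customer.City = "Not specified";
    }
}
```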

Code Metrics

By means of this tool, we can analyze each project in the solution and extract different metrics. Being a tool integrated with Visual Studio, we can navigate into each project and see the value of each metric at project, namespace, class and method level.

There are 5 metrics that can be analyzed by using Code Metrics:

  • Lines of Code: this metric tells us the number of lines of code at method, class, namespace or project level. It is good to know that, at project level, it indicates the total number of lines of code the project contains.
  • Class Coupling: indicates how many other classes a class uses; the smaller the value, the better.
  • Depth of Inheritance: indicates how deep a class sits in its inheritance hierarchy; just like class coupling, the smaller the value, the better.
  • Cyclomatic Complexity: indicates the complexity level of a method, class or project by counting the independent paths through the code. We must be careful, because if we implement a complex algorithm we will always have a rather high value for this metric (see the sketch after this list).
  • Maintainability Index: a value between 0 and 100 that indicates how easily the code can be maintained; it is calculated from the other metrics. Here a high value is good: anything above 20 is in the safe area, values between 10 and 20 are of a medium level (not serious, but we have to be careful), and any value below 10 shows that we have big problems.
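To make Cyclomatic Complexity concrete, here is a minimal sketch (the method is invented for the example): there is one path through the method by default, and each if and each loop adds one more, so Code Metrics would typically report a complexity of 4 here.

```csharp
using System.Collections.Generic;

public static class DiscountCalculator
{
    // 1 (default path) + 1 (null check) + 1 (foreach) + 1 (inner if)
    // gives a cyclomatic complexity of 4 for this method.
    public static decimal TotalDiscount(IEnumerable<decimal> orderValues)
    {
        if (orderValues == null)
        {
            return 0m;
        }

        decimal discount = 0m;
        foreach (var value in orderValues)
        {
            if (value > 100m)
            {
                discount += value * 0.05m;
            }
        }
        return discount;
    }
}
```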

Conclusion

In this article we have seen that Visual Studio provides several ways to assess the quality of our code. Some of these tools are available in the regular editions of Visual Studio, while others are available only in the Premium and Ultimate editions. Compared to Sonar, Visual Studio does not allow us to share these metrics through a portal; instead, it allows us to export them to Excel so that we can send them to the team. The Visual Studio tools are a good start for any developer or team that wants to see the quality of the code they have written.
