Tuesday 17 November 2009

MbUnit Equality Assertions

The more I investigate MbUnit and the Gallio Automation Platform the more impressed I become.  The latest features I've been playing with are the new equality assertions in MbUnit 3.1.

How often would you like to do

Assert.AreEqual(a, b)

where a, b are object instances?

Suppose you haven't overridden Object.Equals and Object.GetHashCode. You shouldn't need to override them just because you want to write tests.

Gallio team member Yann Trévin has written an excellent post describing the new equality assertions in MbUnit 3.1, so there's little more for me to add here, except this.

I ran up a modified and simplified version of his example in the Gallio UI and one cool feature I noticed is the difference highlighting you get when two object instances fail their equality test. Below is an object equality comparison performed using the StructuralEqualityComparer that Yann describes in his post.

[Image: Gallio report highlighting the differing field values]

As you can see, the differences in the values of each field are highlighted.
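For flavour, here's roughly what such a comparison looks like in code. This is a sketch from memory of Yann's post, so treat the exact shape of the comparer (and the Assert.AreEqual overload taking a comparer) as an assumption; the Foo type is a hypothetical stand-in.

// Sketch only: Foo is hypothetical and needs no Equals/GetHashCode overrides.
// using MbUnit.Framework;
public class Foo
{
    public string Name { get; set; }
    public int Count { get; set; }
}

[Test]
public void FoosShouldBeStructurallyEqual()
{
    Foo expected = new Foo { Name = "Widget", Count = 42 };
    Foo actual = new Foo { Name = "Widget", Count = 42 };

    // Compare field by field via the structural comparer
    Assert.AreEqual(expected, actual,
        new StructuralEqualityComparer<Foo>
        {
            { x => x.Name },
            { x => x.Count }
        });
}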

Wednesday 16 September 2009

MbUnit and WatiN Screen Capture in Test Reports

I have blogged about WatiN recently. MbUnit is an alternative and very rich unit testing framework for .NET that is offered as part of the Gallio test automation platform. Gallio itself can be used with a variety of unit testing frameworks and other tools, not just MbUnit. For example, from within the Gallio GUI you can run any NUnit tests you may have.

MbUnit 3.1 was released recently and has some slick new features to facilitate web application testing with WatiN.

MbUnit allows you to create a WatiN test and automatically embed an image of the web page on test failure or on demand. This is very slick. The images below are from the results of running the Gallio sample project that comes with the Gallio installation (Gallio includes MbUnit). The images are rendered in .png format.
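To give an idea of what the feature saves you, here's roughly what you would otherwise do by hand. This is only a sketch: CaptureWebPageToFile (WatiN) and TestLog.EmbedImage (Gallio) are recalled from memory, so verify them before relying on this; MbUnit 3.1 does the equivalent declaratively.

// Hand-rolled capture-on-failure sketch; MbUnit 3.1 can do this for you.
// using System.Drawing; using System.IO;
// using Gallio.Framework; using MbUnit.Framework; using WatiN.Core;
[Test]
public void SearchShouldFindMbUnit()
{
    using (IE ie = new IE("http://www.google.com/"))
    {
        try
        {
            ie.TextField(Find.ByName("q")).TypeText("MbUnit");
            ie.Button(Find.ByName("btnI")).Click(); // I'm Feeling Lucky
        }
        catch
        {
            // On failure, save the page image and embed it in the report
            string file = Path.Combine(Path.GetTempPath(), "capture.png");
            ie.CaptureWebPageToFile(file);
            TestLog.EmbedImage("Page at failure", Image.FromFile(file));
            throw;
        }
    }
}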

[Image: Gallio test report with an embedded web page capture]

[Image: Gallio test report with another embedded web page capture]

Also notice how it describes the WatiN operations...

Capture on Failure

Go to Google, enter MbUnit as a search term and click I'm Feeling Lucky

Navigating to 'http://www.google.com/'
Typing 'MbUnit' into TextField 'Google Search'
Clicking Button 'I'm Feeling Lucky'

and

Capture on Success

Go to Google, enter MbUnit as a search term and click I'm Feeling Lucky

Navigating to 'http://www.google.com/'
Typing 'MbUnit' into TextField 'Google Search'
Clicking Button 'I'm Feeling Lucky'

Click on About.

Clicking Link 'About'

The Gallio test runner GUI also has options for viewing the report in various HTML formats. For example, if you select XHTML it loads the report in your default browser and you can expand and collapse the test fixtures and tests, e.g.,

[Image: XHTML report in the browser with expandable test fixtures]

This is very impressive and I would say this WatiN integration is a "killer feature" for MbUnit. I wish it had been available for my last commercial project that used WatiN.

Monday 7 September 2009

Wintellect's Power Collections for .NET

Introduction

Wintellect's open source Power Collections library describes itself as "a set of classes and methods that add new generic collection types and algorithms to the .NET Framework."

Power Collections was developed originally by a member of the C# 1.0 design team. It is designed to be used with .NET 2.0 generics and provides a library of data structures and algorithms somewhat similar to the C++ Standard Template Library (STL). With the advent of .NET 3.5 and LINQ the algorithms part of the mix is perhaps of less value than it was, since I suspect many of them could be implemented more "fluently" using LINQ. However, even now there are some data structures present in Power Collections that are not present in the .NET Base Class Library (BCL).

The library itself is well-documented and the source includes XML code comments (so you can see class and method documentation via IntelliSense) and an accompanying set of unit tests. It is generally easy to use in that it fits seamlessly with .NET's existing generics library.

Data Structures

A while ago I wrote a unique random number generator class. What does "unique" mean? Given a collection of positive integers that can contain duplicates, I wanted to select a number from a random position in the range such that, once selected, that position can never be selected again.

So the generator class works such that once a number has been generated for an instance, the number at that position in the range will not be generated again. Subsequent generations will eventually exhaust the entries in the range. An example will clarify this.

This range has duplicates.

{ 1, 2, 3, 4, 3, 6, 4, 8, 17, 42, 6 }

If the first 3 is selected at random then the generator can only subsequently select the second 3. If a random number were generated eleven times we might get this result: 42 6 3 4 2 4 6 17 3 8 1.

This range is an increasing sequence of consecutive numbers.

{ 1, 2, 3, 4, 5, 6, 7, 8 }

If a random number were generated eight times we might get this result: 4 2 5 1 8 7 3 6
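For illustration, here is a minimal sketch of one way such a generator could work (the names here are mine, not those of my original class): copy the range into a list and remove each entry as it is drawn, so a position can never be selected twice.

// Minimal sketch: each draw removes the selected position for good.
// using System; using System.Collections.Generic;
public class UniqueRandomGenerator
{
    private readonly List<int> remaining;
    private readonly Random random = new Random();

    public UniqueRandomGenerator(IEnumerable<int> range)
    {
        remaining = new List<int>(range);
    }

    public bool IsExhausted
    {
        get { return remaining.Count == 0; }
    }

    public int Next()
    {
        int index = random.Next(remaining.Count);
        int value = remaining[index];
        remaining.RemoveAt(index); // this position can never be drawn again
        return value;
    }
}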

I also created some unit tests for the class. One such test was to take a collection with duplicates and run the generator until it exhausted all the entries in the range and then check that the source and generated collections are equal.

A collection containing duplicates is a Bag (or Multiset) data structure. The .NET Base Class Library does not have such a data structure but the Power Collections library does. So it was convenient to make use of this in my unit tests.

I was then able to set up tests such as this:

int[] numbers = new int[] { 1, 2, 3, 4, 3, 6, 4, 8, 17, 42, 6 };
Bag<int> expected = new Bag<int>(numbers);
Bag<int> actual = new Bag<int>();
// Generate contents of actual
// ...
Assert.IsTrue(expected.IsEqualTo(actual));

I often need to pass around just a pair of values that have no behaviour associated with them. A struct would be suitable but rather than define a new type I can use Power Collections' Pair data structure, defined as Pair<TFirst,TSecond>. Power Collections also has a Triple, defined as Triple<TFirst,TSecond,TThird>.


In .NET 4.0 a new Tuple type has been defined which serves the same purpose as Pair and Triple but until then we can use the Power Collections implementations.


Here's how we might use Pair:

// Display print resolution and drops per dot for current selection
Pair<string, string> pair =
    SplitPrintResolutionAndDropsPerDot(text);
string printResolution = pair.First;
string dropsPerDot = pair.Second;

txtPrintModeResolution.Text = printResolution;
txtPrintModeDPD.Text = dropsPerDot;
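For comparison, the .NET 4.0 equivalent would use Tuple, whose items are exposed as Item1 and Item2:

// .NET 4.0 equivalent of Pair<TFirst, TSecond>
Tuple<string, string> tuple = Tuple.Create(printResolution, dropsPerDot);
string first = tuple.Item1;   // printResolution
string second = tuple.Item2;  // dropsPerDot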

Another useful data structure is the MultiDictionary (also referred to as a Multimap). This is missing from the .NET BCL prior to .NET 3.5 (.NET 3.5 has the Lookup class in the System.Linq namespace). The Power Collections library defines the class thus:


The MultiDictionary class associates values with a key. Unlike a Dictionary, each key can have multiple values associated with it. When indexing a MultiDictionary, instead of a single value associated with a key, you retrieve an enumeration of values.

Here's an example:

// Initialise
bool allowDuplicateValues = true;
MultiDictionary<int, int> dict =
    new MultiDictionary<int, int>(allowDuplicateValues);
dict.AddMany(1, new int[] { 4, 4, 6 });
dict.AddMany(2, new int[] { 40, 3, 6 });
dict.AddMany(3, new int[] { 42, 15, 8 });

// Print each key followed by its collection of values
dict.ForEach(
    delegate(KeyValuePair<int, ICollection<int>> kvp)
    {
        Console.WriteLine("Key: " + kvp.Key);
        Console.Write("Values: ");

        foreach (int value in kvp.Value)
        {
            Console.Write(value + " ");
        }
        Console.Write(Environment.NewLine);
    });


Algorithms


The Power Collections documentation describes the Algorithms class thus:


Algorithms contains a number of static methods that implement algorithms that work on collections. Most of the methods deal with the standard generic collection interfaces such as IEnumerable<T>, ICollection<T> and IList<T>.


Much of what is provided here can now be implemented more "fluently" using LINQ. It also includes some methods that are already available in .NET 2.0. However, Power Collections was written while .NET 2.0 was being developed and as such some methods were provided before they were added to the completed .NET 2.0 BCL.


A very useful algorithm is TypedAs, defined as follows:


Given a non-generic IEnumerable interface, wrap a generic IEnumerable<T> interface around it. The generic interface will enumerate the same objects as the underlying non-generic collection, but can be used in places that require a generic interface. The underlying non-generic collection must contain only items that are of type T or a type derived from it. This method is useful when interfacing older, non-generic collections to newer code that uses generic interfaces.


This is especially useful in UI code when dealing with collections defined as IEnumerable or ICollection where you want to apply some generic or Power Collections algorithm to them. Here's an example that is not a UI one but illustrates the idea:

AuthorizationRuleCollection rulesCol =
    fileSecurity.GetAccessRules(
        includeExplicit,
        includeInherited,
        type
    );
List<AuthorizationRule> rules =
    new List<AuthorizationRule>(
        Algorithms.TypedAs<AuthorizationRule>(rulesCol)
    );

AuthorizationRule rule =
    rules.Find(delegate(AuthorizationRule match)
    {
        return match.IdentityReference.Value.Contains(AccountName);
    });


By converting the AuthorizationRuleCollection to List<AuthorizationRule> via TypedAs we are able to apply List<T>.Find using an anonymous delegate.


In .NET 3.5 we can do this via the Enumerable.Cast<T> extension method, which works on any non-generic IEnumerable (and therefore on non-generic ICollection and IList instances too).
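For example, the TypedAs call above could be replaced like this under .NET 3.5:

// .NET 3.5 equivalent of Algorithms.TypedAs
List<AuthorizationRule> rules =
    rulesCol.Cast<AuthorizationRule>().ToList();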


Set Operations

Power Collections has a Set class that was missing from .NET 2.0 but there is now one in .NET 3.5 (HashSet<T>). However, Power Collections also has some set methods as generic algorithms that can be applied to IEnumerable<T> collections. Here is an example:

// Get the authorised roles for this page
string[] authorisedRoles =
    Array.ConvertAll<string, string>(
        GetAuthorisedRoles(configPath),
        StringHelper.ToLowerAndTrim
    );

// Get all roles for the user
string[] userRoles =
    Array.ConvertAll<string, string>(
        Roles.GetRolesForUser(),
        StringHelper.ToLowerAndTrim
    );

// Get the subset of the user's roles that are
// authorised for this page
List<string> userAuthorisedRoles =
    new List<string>(
        Algorithms.SetIntersection(authorisedRoles, userRoles)
    );

The idea here is that there is a set of authorised roles for the page, let's say A, B, C, D, E. The user may have a set of roles C, E, F, G. They can access the page if one or more of their roles matches an authorised role for the page, in this case C and E. This is a set intersection operation.
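Under .NET 3.5 the same intersection could be written with HashSet<T>:

// .NET 3.5 equivalent using HashSet<T>
HashSet<string> userAuthorisedRoles = new HashSet<string>(authorisedRoles);
userAuthorisedRoles.IntersectWith(userRoles);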


The Power Collections library is very easy to use and is well-documented and tested. More developers should have it in their toolbox.

Wednesday 5 August 2009

Testing an Internet Banking Simulator using WatiN

I blogged about WatiN back in December. Here I will show how to use it to automate a simplified Internet Banking Simulator.  Most Internet banking sites do not yet use two-factor authentication schemes, such as Barclays' PINsentry card reader system. Instead you are typically asked for a membership number and one or more pass codes. A pass code consists of a string of alphanumeric characters. When the user logs into the site they are not asked to enter the complete code. Instead they are asked to enter (usually three) characters from random positions within the code. The idea is to protect against keystroke logging programs.

Suppose a user's secret code is 373549. A typical login screen may ask the user to enter the 2nd, 4th and 5th digits. In a subsequent session it will ask for the 1st, 2nd and 6th. The three positions are randomly selected for each session.

On most sites these positions are selected as a strictly monotonically increasing sequence. In other words, they might ask you to enter the 2nd, 4th and 5th digits, in that order. But some sites do not impose this ordering and might ask you to enter, say, the 5th, 2nd and 4th digits, in that order. In my example I will stick to a monotonically increasing sequence, although the approach works just as well without it.

In order to simulate the generation of random positions I first had to write some code to generate a subset of unique random positions from all the positions in the given secret code.  The standard random number generator in .NET could not be used as is because it would potentially generate repetitions of the positions.  For the purposes of this post we can take it that this problem is solved. We can then concentrate on hooking this up to ASP.NET and WatiN.
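To sketch the idea anyway (this is purely illustrative, not the simulator's actual code): shuffle the candidate positions with a partial Fisher-Yates pass, take the first three, and sort them to restore the increasing order.

// Illustrative only: pick `count` unique 1-based positions from a
// code of length `codeLength`.
static int[] PickUniquePositions(int codeLength, int count, Random random)
{
    int[] positions = new int[codeLength];
    for (int i = 0; i < codeLength; i++)
    {
        positions[i] = i + 1;
    }

    // Partial Fisher-Yates shuffle: after `count` steps the first
    // `count` slots hold distinct random positions.
    for (int i = 0; i < count; i++)
    {
        int j = random.Next(i, codeLength);
        int temp = positions[i];
        positions[i] = positions[j];
        positions[j] = temp;
    }

    int[] chosen = new int[count];
    Array.Copy(positions, chosen, count);
    Array.Sort(chosen); // impose the monotonically increasing order
    return chosen;
}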

A typical Internet Banking entry screen looks like this:

[Image: the simulator's login page asking for three digits from the secret code]

The secret code is shown at the bottom for visual checking.

We wish to check that

  1. Entering valid entries and clicking the Next button navigates to the account page (just an empty finish page for this exercise).
  2. Entering invalid entries and clicking the Next button displays an error message.

To do so we create a pair of NUnit functional tests that invoke WatiN to type the entries and click the Next button for us.

Test 1 - Entering valid entries

/// <summary>
/// Enters valid entries that should display finish page.
/// </summary>
[Test]
public void EnterValidEntriesShouldDisplayFinishPage()
{
    // Extract entry positions
    int firstPosition;
    int secondPosition;
    int thirdPosition;
    ExtractEntryPositions(
        out firstPosition,
        out secondPosition,
        out thirdPosition
    );

    // Enter valid entries for those positions
    string validFirstEntry = SecretCode[firstPosition - 1].ToString();
    ie.TextField(Find.ByName("txtFirstEntry")).TypeText(validFirstEntry);

    string validSecondEntry = SecretCode[secondPosition - 1].ToString();
    ie.TextField(Find.ByName("txtSecondEntry")).TypeText(validSecondEntry);

    string validThirdEntry = SecretCode[thirdPosition - 1].ToString();
    ie.TextField(Find.ByName("txtThirdEntry")).TypeText(validThirdEntry);

    // Click Next
    ie.Button(Find.ByName("btnNext")).Click();

    // Assert Finished page displays
    string expectedPageName = "Finished.aspx";
    Assert.IsTrue(
        ie.Url.Contains(expectedPageName),
        String.Format("Url should contain {0}.", expectedPageName)
    );
}

[Image: the login page during the test run]

Here we have just shown the first two entries. It is a snapshot of WatiN as it was typing the entries.


Test 2 - Entering invalid entries

/// <summary>
/// Enters invalid entries that should display home page and error message.
/// </summary>
[Test]
public void EnterInvalidEntriesShouldDisplayHomePageAndErrorMessage()
{
    // Extract entry positions
    int firstPosition;
    int secondPosition;
    int thirdPosition;
    ExtractEntryPositions(
        out firstPosition,
        out secondPosition,
        out thirdPosition
    );

    // Enter invalid entries for those positions
    string invalidFirstEntry =
        (SecretCode[firstPosition - 1] + 1).ToString();
    ie.TextField(Find.ByName("txtFirstEntry")).TypeText(invalidFirstEntry);

    string invalidSecondEntry =
        (SecretCode[secondPosition - 1] + 1).ToString();
    ie.TextField(Find.ByName("txtSecondEntry")).TypeText(invalidSecondEntry);

    string invalidThirdEntry =
        (SecretCode[thirdPosition - 1] + 1).ToString();
    ie.TextField(Find.ByName("txtThirdEntry")).TypeText(invalidThirdEntry);

    // Click Next
    ie.Button(Find.ByName("btnNext")).Click();

    // Assert Home page displays
    string expectedPageName = "default.aspx";
    Assert.IsTrue(
        ie.Url.Contains(expectedPageName),
        String.Format("Url should contain {0}.", expectedPageName)
    );

    // Assert error message displays
    string expectedErrorText = "Invalid code. Please try again.";
    string actualErrorText = ie.Span("CustomValidator1").Text;
    Assert.AreEqual(expectedErrorText, actualErrorText);
}

[Image: the error message displayed after entering invalid entries]


In the tests above, the ie variable represents the Internet Explorer object. In the NUnit test setup method we start this up and navigate to the start page like this.

ie = new IE();
ie.GoTo(startUrl);

As you can see, the WatiN API sports a fluent interface. The best way of getting up to speed quickly is to use the WatiN Test Recorder and also refer to the HTML Mapping Table, which provides a list of mappings between the HTML elements in a web page and the WatiN API.
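To give a flavour of the mapping, here are a few illustrative one-liners; each WatiN class corresponds to an HTML element:

ie.TextField(Find.ByName("txtFirstEntry")).TypeText("3"); // <input type="text">
ie.Button(Find.ByName("btnNext")).Click();                // <input type="submit">
ie.Link(Find.ByText("About")).Click();                    // <a>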


At the moment both WatiN and WatiN Test Recorder are at version 1 for their official releases and support automation of Internet Explorer only. Both are in beta for version 2 and will support Firefox.

Thursday 4 June 2009

NUnit 2.5 Dabblings

Version 2.5 of NUnit was released recently. As I often do when a new version of a tool is released, I looked to see what's new. This also often gives me the opportunity to take a peek at features that were there in the last version which I hadn't noticed or hadn't had a reason to make use of. Looking back at unit tests I've written to date I've not been that adventurous in my use of NUnit. Here are the assertions I've mostly used.

Assert.IsTrue
Assert.IsFalse
Assert.AreEqual

I have also occasionally used the alternative "fluent" syntax but found that it doesn't offer much for simple asserts, e.g.,

Assert.That(x, Is.EqualTo(y))


vs


Assert.AreEqual(x, y)


The former doesn't offer anything over the latter and is more unwieldy to write. However, the fluent form comes into its own in contexts like this:


Assert.That(x, Is.EqualTo(y).Within(0.000001))


This could also have been written


Assert.AreEqual(x, y, 0.000001)


but in this case it is clear that the fluent form is more readable.


Parameterised Tests

These allow you to supply data to a test case via parameters. The MbUnit framework added them some time ago via their RowTest attribute and prior to NUnit 2.5 it was possible to get the same behaviour via an NUnit extension. Parameterised tests aid in reducing code duplication for tests that use the same algorithm but with differing inputs. Consider a simple test of an email address validation routine.


Example Using Test Attribute

[Test]
public void ValidEmailInUsername()
{
    string email = "joe@abc.co.uk";
    Assert.IsTrue(ValidationTool.IsValidEmail(email));
}

[Test]
public void ValidEmailWithPeriodInUsername()
{
    string email = "joe.bloggs@abc.com";
    Assert.IsTrue(ValidationTool.IsValidEmail(email));
}

[Test]
public void ValidEmailWithUnderscoreInUsername()
{
    string email = "joe_bloggs@abc.com";
    Assert.IsTrue(ValidationTool.IsValidEmail(email));
}

[Test]
public void ValidEmailWithHyphenInUsername()
{
    string email = "joe-bloggs@abc.com";
    Assert.IsTrue(ValidationTool.IsValidEmail(email));
}

Of course, there is no real logic in the test cases in this simple example but we can see how we can easily start to get nasty code duplication.  As the tests develop we can factor out this code into helper routines but we're still faced with staring at a bunch of tests that look structurally the same.


When we load these in NUnit we get:


[Image: NUnit GUI showing the four separately named tests]


Example Using TestCase Attribute - First Version

[TestCase(
    "joe@abc.co.uk",
    Description = "Valid email in username"
)]
[TestCase(
    "joe.bloggs@abc.com",
    Description = "Valid email with period in username"
)]
[TestCase(
    "joe_bloggs@abc.com",
    Description = "Valid email with underscore in username"
)]
[TestCase(
    "joe-bloggs@abc.com",
    Description = "Valid email with hyphen in username"
)]
public void ValidEmail(string email)
{
    Assert.IsTrue(ValidationTool.IsValidEmail(email));
}

Loading this in the NUnit GUI produces the following:


[Image: NUnit GUI showing ValidEmail with its four parameterised cases]


However, there is a disadvantage in that you cannot easily tell what is being tested. In the GUI you can see different parameters being passed to each test but you don't get a description such as ValidEmailWithPeriodInUsername, ValidEmailWithUnderscoreInUsername, etc. This is described in a post at Vadim Kreynin's blog. In our case we just have a single parameter but clearly it would be even worse with multiple parameters.


However, we can improve on this scenario while still getting the benefit of the TestCase attribute. We can set the TestName property on each TestCase.


Example Using TestCase Attribute - Second Version

[TestCase(
    "joe@abc.co.uk",
    Description = "Valid email in username",
    TestName = "ValidEmailInUsername"
)]
[TestCase(
    "joe.bloggs@abc.com",
    Description = "Valid email with period in username",
    TestName = "ValidEmailWithPeriodInUsername"
)]
[TestCase(
    "joe_bloggs@abc.com",
    Description = "Valid email with underscore in username",
    TestName = "ValidEmailWithUnderscoreInUsername"
)]
[TestCase(
    "joe-bloggs@abc.com",
    Description = "Valid email with hyphen in username",
    TestName = "ValidEmailWithHyphenInUsername"
)]
public void ValidEmail(string email)
{
    Assert.IsTrue(ValidationTool.IsValidEmail(email));
}

Loading this in the NUnit GUI produces the following:


[Image: NUnit GUI showing the named test cases grouped under ValidEmail]


We also get the valid email tests nicely grouped as sub-nodes of ValidEmail. So we get both the visual readability of the Test attribute and the elimination of code duplication of the TestCase attribute. Nice.

Monday 11 May 2009

Readability in Method Calls

I generally prefer to call methods using variables rather than literals or expressions as arguments. I don't follow this religiously, but it is especially helpful when calling API methods that have boolean or object parameters that can be true/false or null. For example,

AuthorizationRuleCollection rules =
    fileSecurity.GetAccessRules(true, true, typeof(NTAccount));

This is a fairly mild case but what does true mean? I've no idea. It's worse when we see this kind of thing,

DoSomething(width, height, null, null, name, false);

Here, I have no clue what null or false represents.


What I do in such cases is replace the null or boolean with a local variable. So, in the first case I write,

bool includeExplicit = true;
bool includeInherited = true;
Type type = typeof(NTAccount);
AuthorizationRuleCollection rules =
    fileSecurity.GetAccessRules(includeExplicit, includeInherited, type);

Suppose includeExplicit or includeInherited is false? Then I write,

bool includeExplicit = true;
bool includeInherited = true;
Type type = typeof(NTAccount);
AuthorizationRuleCollection rulesCol =
    fileSecurity.GetAccessRules(!includeExplicit, includeInherited, type);

(The only problem with the last case is that it is easy to miss the ! operator. This is one of the shortcomings of the C-family languages. A not keyword would have been preferable.)


Even outside the cases discussed it is generally more readable to use variables instead of literals or expressions as arguments.

Monday 6 April 2009

Developer Productivity Tools

My primary developer tool is Microsoft Visual Studio. However, I use a number of Visual Studio add-ins and other complementary tools.  Here I describe what (other than Visual Studio) I use and why.

Visual Studio Add-ins

The MSDN articles, Ten Must-Have Tools Every Developer Should Download Now and Visual Studio Add-Ins Every Developer Should Download Now, are a useful point of reference.

Smart Paster

Smart Paster allows you to paste text on the clipboard into a Visual Studio code document as a comment, a string, a StringBuilder or a region. See here. The link mentioned there is broken. The Visual Studio 2008 version can be found here. I most often use "Paste as Comment." This is useful for inserting words from technical specs into your code as comments. Smart Paster works with both C# and VB. I like it for the reasons stated in the MSDN article.

CodeKeep

CodeKeep is a repository for storing code snippets online. You can make these snippets available to the public or keep them private.  Snippets are available in multiple programming languages. You can either grab a snippet by browsing to the site snippet and copying and pasting it into your code editor or you can make use of a handy Visual Studio add-in. I find it most useful for being able to access my own code repository when I'm working at different client sites.

GhostDoc

This is one of the slickest add-ins I've used. Basically it reduces the tedium of writing XML documentation comments. Visual Studio allows you to type /// + Enter to generate an empty summary element for a class or member in C#. At that point, to complete it you must fill in your summary plus, for a method, documentation for any parameters and return value. What GhostDoc does is provide placeholders for these and also tries to infer a "starter" description from the name of your class or method. Often, depending on how well you've named your method, it gets the descriptions exactly right. But even when it doesn't, simply reducing the tedium of the angled brackets is a godsend. See also here.
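As an illustration (a typical shape rather than verbatim GhostDoc output), for a well-named method GhostDoc infers something like this, leaving you to refine the placeholders:

/// <summary>
/// Gets the authorised roles.
/// </summary>
/// <param name="configPath">The config path.</param>
/// <returns></returns>
public string[] GetAuthorisedRoles(string configPath)
{
    // ...
}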

GhostDoc is also intelligent enough to update its generated documentation should you, say, add an extra parameter to a method. It can also pick up existing documentation from base-class methods in derived classes. Its description rules are customisable, though I've barely scratched the surface. Because GhostDoc reduces the tedium of documentation it actually encourages you to write more of it than you otherwise would. For example, I used to be fairly good at writing at least summary documentation but now I pay more attention to documenting parameters as well, especially when combined with another excellent but more obscure plug-in I use called CR_Documentor, which I discuss next. GhostDoc works with both C# and VB, although VB support is described as "experimental." There are indeed one or two glitches with VB, though nothing too serious.

CR_Documentor

CR_Documentor is a plug-in for Developer Express's freely downloadable DXCore extensibility engine. If you are a user of CodeRush or Refactor!, either commercial or free, then DXCore is installed with them. It is the engine that makes those products work. Alternatively, DXCore can be installed by itself. There is a small community of plug-in developers who have provided a number of useful plug-ins. CR_Documentor is one such plug-in. Here is a good overview.

Below is an example from my own code. To examine this in more detail see here.

[Image: CR_Documentor rendering documentation comments in place]

The great thing about CR_Documentor is that it allows you to view "in-place" and in real time what your XML documentation comments look like when rendered in tools such as NDoc or Sandcastle without actually having to first build your solution and then run these tools.  With CR_Documentor you can spot any errors there and then rather than having to wait for Sandcastle to generate the docs in order for you to identify and correct the errors and re-run. Again, because this is such a fun product, it actually encourages you to write documentation so you can get instant gratification.

Refactor! Pro

Developer Express is a .NET components and tools vendor. One of their products is a code refactoring tool called Refactor! Pro. There is a companion tool called CodeRush that includes Refactor! Pro.  The two products together compete with the better known ReSharper from JetBrains.

I used ReSharper a few years ago at a client site. It was an excellent product and I daresay it must be even better today. But a while later I discovered Refactor! Pro, initially via the licensing agreement between Microsoft and Developer Express in 2005 to include Refactor! for Visual Basic in Visual Basic 2005. I happened to be working on a Visual Basic contract and one of my assignments was to engage in a major refactoring exercise, so I thought I'd give Refactor! a spin. I became hooked immediately by its slick, highly visual and non-modal UI paradigm. Below is a picture of the Extract Method refactoring. Visual Studio already has this for C# but I prefer the Refactor! implementation. Besides, Visual Studio C# has only about a half dozen built-in refactorings. Refactor! Pro now has nearly 200 and had around 50 when I bought it in about 2006.

[Image: the Extract Method refactoring in Refactor! Pro]

Not long after, I took the plunge and purchased the full version. Apart from preferring its UI paradigm, my other reason for choosing it over ReSharper was that Refactor! Pro offered support for both C# and VB, as well as C++ and, more recently, JavaScript. At the time of my decision ReSharper only offered C#. As I anticipated using all these languages, Refactor! Pro was a no-brainer. Thus far I've not taken the plunge and opted for the full CodeRush package. How do CodeRush and ReSharper compare today? As far as I can tell CodeRush/Refactor! Pro may be a little stronger on refactoring while ReSharper is stronger on code analysis and unit testing. Beyond that, which is preferable seems largely to be a matter of taste.

CodeRush Xpress

Recently Developer Express made available a cut-down version of CodeRush for C# developers called CodeRush Xpress. I have started using this and its best features are its file and class navigation support. You could say "Solution Explorer kiss my ass."

File Navigation

[Image: File Navigation in CodeRush Xpress]

Quick Navigation

[Image: Quick Navigation in CodeRush Xpress]

The idea here is that CodeRush dynamically displays a list of file names or code elements as you type additional letters. Moreover it searches for any fragment within a name, not just the starting characters. Especially useful is its Pascal/Camel Case feature. This is easiest to explain by examining the following picture.

[Image: Pascal/Camel Case search matching types by their initial letters]

Typing the letters BDA displays a list of all types whose constituent words start with B, D and A.

Code Metrics

Visual Studio Team System (VSTS) has a code metrics feature. This measures properties such as cyclomatic complexity and maintainability.  If you don't have VSTS then it is possible to obtain similar information via Reflector and its code metrics plug-in. See here for a close-up.

[Image: code metrics displayed via Reflector's code metrics plug-in]

CodeRush/Refactor! Pro also has a code metrics feature. The major difference and advantage it has over the other two is that the display is dynamic, i.e., the complexity graphs update themselves immediately after edits.

[Image: CodeRush's dynamic cyclomatic complexity display]

It's a useful way of driving your refactoring efforts. For example, Developer Express suggest that code complexity should be <= 10 and maintenance complexity should be <= 200. The complexity measures also work with C++ code; I don't think the VSTS metrics do.

Saturday 31 January 2009

A Visual Basic Annoyance - Option Strict Off By Default

In Microsoft's .NET environment I normally program in C#, but occasionally I am asked to do Visual Basic development. Visual Basic has a project property known as Option Strict, which is off by default. Unfortunately, I usually forget to set it to on. When I eventually remember, I update the global environment setting to make on the default for new projects. However, when developing I usually have one or two test projects on the side for trying things out before incorporating them in my main project. This was exactly the situation I found myself in recently. I had created some test projects but, unknown to me, some of them were created before I remembered to change the global environment setting, so they still had Option Strict off. So I merrily tried out my ideas and then copied and pasted them back into my main project, which had Option Strict on, only to find they didn't compile. Why didn't Microsoft make Option Strict on the default?