Category Archives: Other

Book review: Continuous Integration in .NET

Manning Publications provided me with a free electronic copy of their book “Continuous Integration in .NET”, which I was interested in. If you have started with automated unit testing, continuous integration (CI) is the next major step to increase the quality of the software that you deliver.

This book is excellent to introduce you to CI for .NET and to get you started with a set of tools (not only from Microsoft) such as:

  • CruiseControl.NET, TeamCity and Team Foundation Server (2010) as CI servers
  • MSBuild as build automation tool
  • NUnit and MSTest as unit testing frameworks
  • PartCover as test coverage tool
  • White, Selenium and FitNesse for integration, functional and acceptance tests
  • FxCop (Code Analysis), NDepend for compiled code analysis
  • StyleCop for source code analysis
  • Sandcastle as documentation generation tool
  • MS Setup, WiX and MS Deploy as setup / deployment tools
  • Etc.

I read programming books on a Kindle app for iPad / iPhone rather than on a computer, so I appreciate that the book is easy to read, with handy screenshots that save you from having to install and use the tools on a computer to get an idea of them.

The book covers the basics of each area and tool, detailing the strengths and weaknesses of each one, and providing examples of use with a sample project. If you have decided to stick to free tools, you can skip some sections (and the same applies if you have decided to use only Microsoft tools).

But once you have decided which of the above areas of CI and which tools you are going to use, you will likely need other books/docs to go deeper. For example, for unit testing I recommend The Art of Unit Testing, also from Manning, but wait until the second edition is published. If you decide to use Team Foundation Server (TFS) version 2012, I recommend the free book Testing for Continuous Delivery with Visual Studio 2012. For other tools, you will need to use the documentation of the product.

The book is also a good resource to introduce your team to CI, if that is your scenario. Many developers tend to be much more focused on the code / manual testing than on automated testing, let alone on continuous integration, which they may ignore entirely. Maybe with books like this they will realize the spectrum of things that can be done automatically and in a continuous fashion.

Book review: The Art of Unit Testing, 2nd edition (Roy Osherove)

After buying and reading the book Dependency Injection in .NET (Mark Seemann), I was about to buy The Art of Unit Testing (Roy Osherove) when I learned that the second edition would be available in a couple of months. I contacted Roy, and Manning Publications Co were kind enough to provide me a complimentary copy of the Early Access Edition of The Art of Unit Testing, Second Edition, which I have read avidly in the last few days.

I was interested in the book because, while I had worked with automated testing for version 7.0 of my MZ-Tools add-in, I was not sure if I was doing it properly, and I wanted to learn about integration testing / unit testing, the difference between fakes, mocks and stubs, and the best practices. After reading the book, there is no doubt that Roy knows his stuff, and he explains everything clearly:

– Part 1 explains the basics of unit testing and walks you through a first unit test.

– Part 2 explains the three kinds of testing: result testing, state testing and interaction testing, and how stubs are used for result / state testing while mocks are used for interaction testing. The last chapter of this part (and the first chapter of the next part, as well as appendix B) explains isolation (mocking) frameworks. It turns out that not all isolation frameworks are created equal (constrained vs. unconstrained), and the book explains quite well how they differ.

– Part 3 has one chapter to dig deeper into isolation frameworks and two chapters that I was very interested in: test hierarchies/organization and the pillars of good unit tests. The book puts enormous emphasis on the importance of writing tests that are 1) trustworthy, 2) maintainable and 3) readable, and I couldn’t agree more, because I have written tests that don’t follow those principles… This part of the book provides tons of good practices, to the point of almost reaching perfectionism.

– Part 4 has two chapters: one devoted to integrating unit testing into the organization and another to working with legacy code (lacking unit tests). It is sad that books have to include such chapters, because many developers and managers don’t see the value of automated testing, thinking only about the short-term costs and not about long-term QA. But since this is the reality, this part can help you a lot in that case.

I have enjoyed the book a lot. It seems that as part of his consulting job Roy has seen tons of bad practices (as any team leader has seen with any kind of code, sadly), and this edition is a new attempt to explain to developers how to become artists of unit testing. If they fail, it won’t be because of this excellent book.

Book review: Dependency Injection in .NET (Mark Seemann)

After hearing about Microsoft Unity from the .NET architects at my current company (my daily job), I bought the book Dependency Injection in .NET, by Mark Seemann, some weeks ago to learn about dependency injection and Inversion of Control (IoC) containers, just out of curiosity, because I thought it wouldn’t apply to my MZ-Tools add-in or to Visual Studio extensions. I couldn’t have been more wrong. I think this is the book that has most profoundly changed my mindset about creating software, because the concepts of programming against interfaces and loose coupling (explained in part 1 of the book) are, as I now realize, among the most important things that you need to master as a developer to create maintainable software. Dependency Injection (DI) is the means to achieve loose coupling.

The second part of the book explains DI patterns, anti-patterns and refactorings. The third part explains the three dimensions of DI: object composition, object lifetime and interception. The fourth part explains several DI containers from the many available for .NET. As the book explains, DI containers are optional, but even if you don’t use them, the software will benefit a lot from a loosely coupled design.

How does this relate to developing Visual Studio add-ins? After reading the book I have realized that, in my case, it applies in at least three aspects, and I wish I had read the book before:

1) Visual Studio add-ins are tightly coupled to some dependencies, such as the EnvDTE.dll, EnvDTE80.dll and VSLangProj.dll assemblies, which provide the automation model for add-ins. In all likelihood, all the features of your add-in depend on EnvDTE.Solution, EnvDTE.Project, EnvDTE.EditPoint, etc. Imagine now that you have to migrate your add-in to a Visual Studio package (SDK), because Microsoft plans to remove add-ins. It would be much easier if the features of your add-in depended only on interfaces such as ISolution, IProject and IEditPoint, implemented first by classes that encapsulate the automation model for add-ins (EnvDTE.Solution, EnvDTE.Project), and later by classes that encapsulate the SDK services of packages (IVsSolution, IVsProject).

2) A loosely coupled design and dependency injection make unit testing much easier. Automated testing for add-ins is hard. So far I had done integration testing (with my own test runner), which, while automated, is slow and can’t be integrated easily with continuous integration (CI). Now I want to use unit testing too (because of its speed) through stubs, but since I haven’t used dependency injection I am finding it difficult to achieve a clean approach.

3) MZ-Tools 8.0 will support Visual Studio, VB6 and VBA as hosts. I am encapsulating the very different automation models with concrete classes such as SolutionEx, ProjectEx, etc., using partial classes and conditional compilation. But after reading the book I have realized that it would be better to use either interfaces or abstract classes.
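To make the idea of points 1) and 2) concrete, here is a minimal sketch. The names ISolution, SolutionInfoFeature and StubSolution are hypothetical (not from MZ-Tools): a feature depends only on an interface, one implementation would wrap the EnvDTE automation model (shown commented out, since EnvDTE is not referenced here), and a stub makes the feature unit-testable without a running IDE.

```csharp
// Hypothetical abstraction over the IDE automation model.
public interface ISolution
{
    string FullName { get; }
    int ProjectCount { get; }
}

// Add-in flavor (sketch): wraps EnvDTE. A package flavor would wrap
// IVsSolution instead, without touching any feature code.
// public class DteSolution : ISolution
// {
//     private readonly EnvDTE.Solution _solution;
//     public DteSolution(EnvDTE.Solution solution) { _solution = solution; }
//     public string FullName { get { return _solution.FullName; } }
//     public int ProjectCount { get { return _solution.Projects.Count; } }
// }

// A feature receives its dependency through the constructor
// (constructor injection) and never references EnvDTE directly.
public class SolutionInfoFeature
{
    private readonly ISolution _solution;

    public SolutionInfoFeature(ISolution solution)
    {
        _solution = solution;
    }

    public string Describe()
    {
        return _solution.FullName + " (" + _solution.ProjectCount + " projects)";
    }
}

// A stub for fast unit tests; no IDE required.
public class StubSolution : ISolution
{
    public string FullName { get { return @"C:\Demo\Demo.sln"; } }
    public int ProjectCount { get { return 2; } }
}
```

With this shape, migrating from an add-in to a package means writing one new ISolution implementation, and unit tests exercise the features against StubSolution at full speed.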

I will try to find time to write a series of posts, with code, about using loosely coupled design and dependency injection when developing add-ins for Microsoft IDEs. Stay tuned.

Debugging .NET Framework working only “sometimes”

These days (well, nights) I am debugging an extremely difficult issue with the ImageList control, and I had the idea that debugging into the .NET Framework source code (source stepping) would help a lot. That feature was introduced in VS 2008, and my feeling over the years, having tested it from time to time, was that sometimes it worked and sometimes it didn’t. Yesterday, when I needed it badly, I searched the web and discovered that I was not alone. There are tons of issues and workarounds about it. I tried all of them to no avail.

Finally, someone from Microsoft posted here (only a few weeks ago, after all these years!) that the root cause is out-of-date PDBs, and that it is going to happen each time Microsoft releases a new build of an assembly, which happens not only in service packs but also in any security patch. Since there is a delay between the release of the assembly and the release of the PDB (which has been up to 1 year in the past!), this feature is basically broken, badly designed.

The alternative, strongly suggested by that person, is to use .NET Reflector, a paid tool which, for developers of Visual Studio extensions, is even more useful because it can also debug Visual Studio assemblies (not just .NET Framework assemblies), as I explained in this guest post on their blog.

Long time without blogging

It has been a long time since I last blogged, and the reason is that I have been quite busy doing the following:

– I have migrated the whole code base of my MZ-Tools 7.0 add-in from VB.NET to C#. After 10 years programming in VB.NET, I decided to switch to C#, and the best way is to use it every day, so I had to migrate the product. The migration was successful, and build 7.0.0.103, released on December 1, was C#-based. Only a couple of bugs were introduced that were not detected by the automated integration tests. FWIW, I used Instant C# from Tangible Software Solutions.

– I have enforced the Code Analysis feature of Visual Studio on the MZ-Tools code base with All Microsoft Rules, and after lots of fixes I was able to pass them all with some custom suppressions and four rules disabled: CA1031 Do not catch general exception types, CA1502 Avoid excessive complexity, CA1506 Avoid excessive class coupling and CA5122 P/Invoke declarations should not be safe-critical. If you have tried to enforce them on a large code base, you know how time-consuming that is.

– I have made massive architectural changes in the code base of MZ-Tools for Visual Studio to prepare a new unified version 8.0, .NET-based, that will support Visual Studio (VB.NET, C#), Visual Basic 6.0 (“Classic”), the VBA editor of Office 32-bit and the VBA editor of Office 64-bit. That means encapsulating the automation models of VS and VB “Classic”. I already have the user interface, options, setup and unit-test/integration-test subsystems. “Only” the features remain, but they will take me months yet :-). I will blog about this in the coming months when I am closer to the release.

– I have created an integration test runner that runs in the IDE where MZ-Tools is loaded, rather than in the IDE where the MZ-Tools source code project is loaded. While the Visual Studio SDK provides a remote MSTest-based host adapter for this purpose, I tried it two years ago with disappointing results, so I created my own integration testing infrastructure in MZ-Tools. But the MZ-Tools add-in, its integration tests and the test runner were in the same assembly (using a special configuration). Now I have isolated them, so I have the add-in, the integration tests and the test runner in three separate assemblies. I hope to release the test runner on CodePlex or similar some day.
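As a rough illustration of what an in-process test runner involves (a sketch only; the IntegrationTest attribute and SimpleTestRunner names are made up, and the real MZ-Tools runner is not public), a minimal runner can discover attributed methods via reflection, invoke them inside the host process, and report pass/fail:

```csharp
using System;
using System.Linq;
using System.Reflection;

// Hypothetical marker for integration test methods.
[AttributeUsage(AttributeTargets.Method)]
public class IntegrationTestAttribute : Attribute { }

public static class SimpleTestRunner
{
    // Runs every [IntegrationTest] method found in the assembly and
    // returns the number of failed tests.
    public static int Run(Assembly testAssembly)
    {
        int failures = 0;
        foreach (Type type in testAssembly.GetTypes())
        {
            MethodInfo[] testMethods = type
                .GetMethods(BindingFlags.Public | BindingFlags.Instance)
                .Where(m => m.GetCustomAttributes(typeof(IntegrationTestAttribute), false).Length > 0)
                .ToArray();

            if (testMethods.Length == 0)
                continue;

            object instance = Activator.CreateInstance(type);
            foreach (MethodInfo method in testMethods)
            {
                try
                {
                    method.Invoke(instance, null);
                    Console.WriteLine("PASS " + type.Name + "." + method.Name);
                }
                catch (TargetInvocationException ex)
                {
                    // Invoke wraps the test's exception; report the inner one.
                    failures++;
                    Console.WriteLine("FAIL " + type.Name + "." + method.Name + ": " + ex.InnerException.Message);
                }
            }
        }
        return failures;
    }
}
```

A real runner has to deal with loading the test assembly into the IDE's AppDomain and marshalling results back out, but the discovery/invocation core looks roughly like this.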

– I migrated to Visual Studio 2012 (I am almost used to the new UI style) and I am planning to adopt TFS (I am finishing the simultaneous reading of Professional Team Foundation Server 2012 and Testing for Continuous Delivery with Visual Studio 2012). In the past I used Perforce, but after a failed restore following a crash (likely my fault) I didn’t use source control for some time, and I want to adopt TFS now.

And I spent almost three weeks on vacation during Christmas, resting and watching lots of TV series 🙂

10 years as Microsoft Most Valuable Professional (MVP)

Happy New Year!

I have received a new Most Valuable Professional (MVP) award from Microsoft just a couple of hours ago. This is a very special one for me because it is my 10th MVP award and, as always, I am very grateful to Microsoft for this recognition:

“Dear Carlos Quintero,

Congratulations! We are pleased to present you with the 2013 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in Visual Basic technical communities during the past year.”

A wrong way of checking the Visual Studio version where an add-in is loaded

Take a look at this code of an add-in that checks that it is loaded only in Visual Studio 2012 and try to figure out what’s wrong:

private DTE2 _applicationObject;
private AddIn _addInInstance;

public void OnConnection(object application, ext_ConnectMode connectMode, object addInInst, ref Array custom)
{
   const string REGISTRY_KEY_VS_11_0 = @"SOFTWARE\MICROSOFT\VISUALSTUDIO\11.0";

   string addInRegistryRoot = null;

   _applicationObject = (DTE2)application;
   _addInInstance = (AddIn)addInInst;

   switch (connectMode)
   {
      case ext_ConnectMode.ext_cm_AfterStartup:
      case ext_ConnectMode.ext_cm_Startup:

         addInRegistryRoot = _applicationObject.RegistryRoot.ToUpper();

         if (!addInRegistryRoot.StartsWith(REGISTRY_KEY_VS_11_0))
         {
            System.Windows.Forms.MessageBox.Show("This add-in only works with Visual Studio 2012");
         }
         break;
   }
}

If you spotted it, either you are quite an expert on globalization issues or you are a Turkish developer. My MZ-Tools add-in had similar code that had worked correctly for years until a few weeks ago, when a developer reported that, using the Turkish culture (user interface, language input keyboard, etc.), MZ-Tools refused to load. You can simulate the effect by inserting the following line at the start of the OnConnection method, which sets the Turkish culture for the current thread:

System.Threading.Thread.CurrentThread.CurrentCulture = System.Globalization.CultureInfo.GetCultureInfo("tr-TR");

The reason is that in Turkish, the upper case of the character i is not I but İ (with a dot), as explained here. So _applicationObject.RegistryRoot.ToUpper() returns the following string:

“SOFTWARE\MİCROSOFT\VİSUALSTUDİO\11.0”

which is different from:

“SOFTWARE\MICROSOFT\VISUALSTUDIO\11.0”

Of course, the solution is to use _applicationObject.RegistryRoot.ToUpperInvariant(), or an overload of StartsWith() that uses an ordinal or invariant-culture comparison ignoring case.
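A small self-contained program (a sketch using the literal registry path rather than the real add-in code) shows the difference under the Turkish culture:

```csharp
using System;
using System.Globalization;
using System.Threading;

class TurkishIDemo
{
    static void Main()
    {
        // Simulate a Turkish user, as in the scenario reported above.
        Thread.CurrentThread.CurrentCulture = CultureInfo.GetCultureInfo("tr-TR");

        const string expected = @"SOFTWARE\MICROSOFT\VISUALSTUDIO\11.0";
        string registryRoot = @"Software\Microsoft\VisualStudio\11.0";

        // Culture-sensitive ToUpper(): under tr-TR, 'i' upper-cases
        // to 'İ' (dotted), so the prefix check fails.
        Console.WriteLine(registryRoot.ToUpper().StartsWith(expected));

        // Culture-insensitive alternatives: both succeed.
        Console.WriteLine(registryRoot.ToUpperInvariant().StartsWith(expected));
        Console.WriteLine(registryRoot.StartsWith(expected, StringComparison.OrdinalIgnoreCase));
    }
}
```

The first line prints False and the other two print True, which is exactly the difference between the buggy and the fixed versions of the check.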

So, if you make software for international markets, ensure that you run the Code Analysis feature of Visual Studio with the Microsoft Globalization Rules, because two rules are violated with the code above:

– CA1304: Specify CultureInfo (in _applicationObject.RegistryRoot.ToUpper())
– CA1307: Specify StringComparison (in addInRegistryRoot.StartsWith(REGISTRY_KEY_VS_11_0))

Back from Microsoft Tech / Community Day 2012 Madrid

I’m back from the Microsoft Tech Day / Community Day 2012 in Madrid, which was great. The aspect that I like most is networking with people: old MVPs that I already know, new MVPs and other people.

In this event I was introduced to Jesús Salas, the author of VS Anywhere, also available on the Visual Studio Gallery, an extension for Visual Studio that provides team collaboration in real time with a very innovative approach.

I also met my friend Pep Lluis Baños, a Visual Basic MVP like me, who brought to the event an intriguing kind of electronic mini motherboard that happened to be one of the mainboards used with the .NET Micro Framework, a technology that I knew only by name. Since I returned, I have been so excited about this technology that I have watched the Gadgeteering and the .NET Micro Framework video, read documentation, and bought the Getting Started with .NET Gadgeteer book, which I am reading, and I plan to buy a .NET Gadgeteer Starter Kit for Christmas to start playing. It’s also great to see how powerful and flexible Visual Studio and the .NET Framework have become, accommodating all kinds of extensions with project types, visual designers for electronic layouts, debuggers for external devices, etc.