Saturday 6 December 2008

WCF With The Castle Windsor Facility

Towards the end of a rather busy Saturday of coding in the office I decided to take on exposing some services at the boundary of a system I am working on. We're using the Castle project's Windsor container for IoC on all of our new projects, so I figured it would make sense to do a short spike into the WCF facility that ships with it to see whether it would be worth using going forwards. The short answer is that I think it is.

It proved very simple to get going. I'd recommend anyone looking to use this facility gets the latest version of the source code in the trunk before starting and has a look at the demo project in there as this proved very helpful.

I quickly defined an interface and set it up as a contract (as per normal with WCF), then created a class that implemented it.
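Purely for illustration, a minimal sketch of such a contract and implementation (the member here is invented, not the real one) looks like this:

using System.ServiceModel;

[ServiceContract]
public interface IBarEnterpriseService
{
    [OperationContract]
    string GetBarName(int id);
}

public class BarEnterpriseService : IBarEnterpriseService
{
    public string GetBarName(int id)
    {
        // hypothetical body - the real service delegates to the domain service and repositories
        return "Bar " + id;
    }
}

With the contract and its implementation in place, in Global.asax.cs I configured my Windsor container mappings like this: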


ServiceDebugBehavior returnFaultsAndMex =
    new ServiceDebugBehavior
        {
            IncludeExceptionDetailInFaults = true,
            HttpHelpPageEnabled = true
        };

ServiceMetadataBehavior metadata =
    new ServiceMetadataBehavior { HttpGetEnabled = true };

container = new WindsorContainer()
    .AddFacility<WcfFacility>()
    .Register(
        Component.For<INHibernateSessionManager>()
            .Named("NHibernateSessionManager")
            .ImplementedBy<NHibernateSessionManager>(),
        Component.For<IBarRepositoryFactory>()
            .Named("BarRepositoryFactory")
            .ImplementedBy<BarRepositoryFactory>()
            .DependsOn(Property.ForKey("sessionFactoryConfigPath")
                           .Eq(Path.Combine(
                                   Path.GetDirectoryName(GetType().Assembly.Location),
                                   "Hibernate.cfg.xml"))),
        Component.For<BarService>()
            .Named("BarDomainService")
            .ImplementedBy<BarService>(),
        Component.For<IBarEnterpriseService>()
            .Named("BarEnterpriseService")
            .ImplementedBy<BarEnterpriseService>());



Next up was the .svc file:


<% @ServiceHost Service="BarEnterpriseService"
    Factory="Castle.Facilities.WcfIntegration.DefaultServiceHostFactory, Castle.Facilities.WcfIntegration" %>



With a bit of work in the web.config file, a press of F5 lets me navigate to the .svc and get the MEX page:


  <system.serviceModel>
    <services>
      <service name="Foo.Bar.EnterpriseServices.BarEnterpriseService" behaviorConfiguration="ReturnFaultsAndMex">
        <endpoint contract="Foo.Bar.EnterpriseServiceContracts.IBarEnterpriseService"
                  binding="basicHttpBinding"/>
      </service>
    </services>
    <behaviors>
      <serviceBehaviors>
        <behavior name="ReturnFaultsAndMex">
          <serviceDebug includeExceptionDetailInFaults="true" />
          <serviceMetadata httpGetEnabled="true" />
        </behavior>
      </serviceBehaviors>
    </behaviors>
  </system.serviceModel>



This is great and integrates the whole WCF experience very nicely into my IoC-centric application. I do still have a couple of areas where I have questions. In the global.asax file I included details of two behaviours, one for error details and one to enable the MEX service; this code was lifted from the sample in the Castle trunk. I still needed to add these behaviours explicitly into the web.config though. Present or removed, these registrations seem to have no effect, and I find the same to be the case with the demo app.

Wednesday 19 November 2008

Continuous Integration with CI Factory part 2

I've just finished setting up a new build instance with CI Factory. This time I used the latest version (1.0.1.5 at the time of writing). I followed the steps here, together with some from my last CIFactory post.

This time I tried adding a property to Arguments.xml:
<property name="MSBuild.Framework.Version" value="${framework::get-framework-directory('net-3.5')}" />
I then edited Compile.Target.xml to use this property like so:
program="${MSBuild.Framework.Version}\msbuild.exe" workingdir="${ProductDirectory}"
After running the run.bat file the only thing that failed was the same old error about needing to set the path to the VSS share, so I went into the ccnetproject.xml file and edited the ssdir setting to point to the network share that holds my VSS ini file (as described in my last CIFactory post).

I ran CCNETServer.bat and that problem went away, but then the first build failed. Checking the log showed that the MSBuild.Framework.Version property was not being set. To resolve this I went to the Compile.Properties.xml file for the MSBuild package and added the property there; not ideal, but it did the job.

I now have a working build for the project on this server, alongside the build from the previous post (which is still to be upgraded). Next up the upgrade of the old one!

Monday 17 November 2008

Defending Scrum

Recently Scrum has been taking a bit of a battering in the wider community, being made to take responsibility for some of the perceived failings of Agile. Under particular attack have been the Scrum certifications, which are seen to churn out 'experts' on the basis of short courses, experts who in many cases lack the necessary expertise, of which experience is an important constituent, to actually make Scrum work. This is a view that I have much sympathy with, but I distinguish between Scrum the methodology and Scrum the business, and so am not required to hold the methodology to account for its, damn near literal, selling out by the associated business(es). That said, it was refreshing today to read a defence of agile, and Scrum in particular, on Robert C. Martin's 'Uncle Bob' blog.

http://blog.objectmentor.com/articles/2008/11/16/dirty-rotten-scrumdrels

I find myself in agreement with Uncle Bob's sentiment and arguments. The thing that I find a little surprising is the implication that teams adopt either XP or Scrum, though maybe I have misread. Personally I have always thought of Scrum as a kind of management wrapper that works very well when wrapped around XP practices. Another interesting post I read today was by Jeremy Miller.

http://codebetter.com/blogs/jeremy.miller/archive/2008/11/16/thoughts-on-the-decline-and-fall-of-agile.aspx

I agree with most of what Jeremy has to say, and for me it comes back to the twelve agile principles (http://agilemanifesto.org/principles.html). If we as a development team lack the discipline to ensure that each of these twelve principles is adhered to then we invite issues of the sort that Scrum's accusers make reference to. Scrum provides a framework that I find makes the introduction of Agile into an organisation easier. It is not the endpoint to my mind; as always there is a maturation process, and with the right people, and good use of retrospectives, the team's practices may evolve out of the strictures of Scrum and into something different. What it does do though is provide some initial norms that the team can work to. In a project I am managing at work we have brought together a number of contractors, all highly skilled and some with excellent agile experience, to develop some software where the need to 'provide early and continuous delivery of valuable software' could not be greater. Scrum, I find, accelerates the process through Tuckman's four stages of team development, enabling the team to reach the stages of norming and performing far more quickly than they might were they to have to deal with project management issues during the forming and storming phases. Quite frankly, given the timescales, it is quite enough that they come to a sense of group ownership of the architecture, design and requirements; I am more than happy to begin with a Scrum-based imposition of how the team will work, and allow them through retrospectives (starting after at least a couple of sprints) to begin debating, and incorporating, improvements. This to me parallels closely the idea from the 5-S model that Shitsuke only begins once the initial 4 S's have been established.

This parallel with the 5 S's brings me to Lean. Something that I've not been able to entirely figure out yet is the way that Scrum seems to be set off against Lean so much at the moment. If I look at the 5-S model, for example, I struggle to see how it would not be carried into a successful Scrum implementation. Perhaps it is because a lot of the Scrum material typically focuses on project management, but the requirement for every feature developed to be of quality (even if deciding on quality can be left to the team) is still there. If quality becomes a relative concept where it can be 'good enough' then this is up to the team. If the team is committed to agile then the hope would be that their definition of quality includes the principle that 'Continuous attention to technical excellence and good design enhances agility.' Kanban were introduced to me whilst I was working at Immediacy and have been an essential part of my Scrum toolkit since; Scrum in its essence seems to me to be as much about promoting visibility as anything else, and Kanban are an excellent way of achieving this. Like I would imagine any agile system to be, Scrum is centred around a pull-system vision where the software developed is closely driven by its end users. The system is also a self-pull system where the team pulls in detail and direction as required, and pulls in the removal of impediments or resources where required.

In all of these respects and many more Scrum provides a COTS for Agile teams, where the customisations may lead to the team no longer doing 'Scrum' in the strictest sense; it may be a more open 'Lean' style where sprints are not timeboxed, for example. At least some of the roots of Scrum (at the least its name) are in the work of Nonaka and Takeuchi, as well as in the Agile software movement, and their text 'The Knowledge-Creating Company' is one that I keep close as a constant source of interest. This text, with its examinations of companies such as Honda, Canon and Matsushita, provides valuable insights into the practices of management in Japanese companies which typically come under the banner of 'Lean'. As a case in point, let us again return to the 5-S model, which was first developed by Canon (Japanese Management Association, 1987), who were studied by Nonaka and Takeuchi, who in turn provided, at the least, inspiration to Ken Schwaber. A tenuous link perhaps, but one that perhaps makes it less surprising that this model fits so well with Scrum. The common ancestry, commonality of practice, shared principles, and current open warfare between Lean and Scrum always put me in mind that intra-denominational arguments can often be at least as fierce as extra-denominational ones.

As a final thought, when in his post James Shore states that 'when people say "Agile," they usually mean Scrum' I am left wondering if this is true. A great many people (particularly in project management and related capacities) who I deal with know little of Scrum, and many have not heard of it. They are as likely to think of Agile Unified Process or DSDM, and in many cases are so imbued with Waterfallesque thinking that they apply Scrum as a label to their clearly non-Agile UP practices. Agile is not something that they currently grok. Whilst I think it is true that Scrum has gained a very real momentum, in no small part due to the business that promotes it and certifies some of its practitioners, I do not think that this is in any way a bad thing. As Scrum implementations fail some will no doubt use it as an excuse to write it, and possibly Agile with it, off. To my mind such people are unlikely to be successful whatever they do; harsh perhaps, but not reflexively analysing and learning from failures is a sure way to stunt professional growth and progression (and probably a reason for a failure to make Scrum work). If, however, good professionals (and this is the vast majority of those whom I have been privileged to work with) reflexively analyse the reasons for a perceived failure of Scrum then in many cases I suspect they will see the problem as not being with the methodology, but rather with its interpretation and implementation in their context: the adoption of methods not compatible with it, and the failure to adopt methods required by it, in short the failure to work in the Scrum way.

Friday 14 November 2008

Continuous Integration with CI Factory

For the project that I'm currently working on I've needed to set up an automated build process; no surprises there. Having worked in environments previously where the development and maintenance of the build occupied a large amount of effort and required a reasonable amount of specialised knowledge, I was keen to ensure that this would not be the case here (it wouldn't be sustainable). With that in mind I chose to use CI Factory. There is some excellent documentation on how to install CI Factory on their site, and I will not repeat it here. Rather I will focus on what I had to do to get my application up and running. The application in question is a .NET 3.5 application.

I downloaded CI Factory following the link from the CI Factory website. This links to version 1.0.0.76 and this is the version that I used. I have just noticed that if you click to view the full download list of all versions then the current version is 1.0.1.5. I will be doing my next setup with that, and it looks like I'll be following the upgrade instructions from here, which should help me to better understand this tool.

The first thing I did was to get VS2008 Team Suite installed on the build machine, along with VSS 2005 (yes, we are still using VSS - TFS should be along in the next couple of weeks but we need our CI in place now).

I opened CIFactory.sln, converting it to 2008 along the way using the wizard, then opened the Arguments.xml file. Editing this file I:
  1. Renamed the project and pointed the projects location to the D drive. This is because, for SAN reasons, we have limited storage space on the C drive but the D drive can grow plenty.
  2. Entered the build master and an initial set of developer details.
  3. Removed the SVN details.
  4. Removed the unwanted packages (for now it is very stripped down, I will look to add more packages in the future when we buy licences for things like NDepend)
  5. Added the VSS package.
  6. Added the VSS arguments.
I opened the Properties.xml file for the MSBuild package and edited it to point to .NET 3.5 instead of .NET 2.0 (more on this later).

I then opened the Properties.xml file for the Visual Source Safe package and edited:
  1. The VSS.ExePath to point to the correct VSS install location, for me this is Program Files\Microsoft Visual SourceSafe\ss.exe
  2. The VSS.TemplateDB so that it used the D drive
At this point I ran run.bat and it went off and created the build system, the project structure in VSS, and the build solution and its files, all by itself and all very nicely.

I still had some extra work to do though.

I got an error that VSS needed the ssdir directory to be set. A bit of playing soon showed that doing this did not help. What I did to resolve it was to open the ccnetproject.xml file and edit it in two places:
  1. Change the cruisecontrol\sourcecontrol\ssdir element so that it points to the correct path for the VSS share.
  2. Change the project\sourcecontrol\sourcecontrolprovider\ssdir element so that it points to the correct path for the VSS share.
The final thing to sort out was in response to an error message: "External Program Failed: C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\msbuild.exe (return code was 1)".

To resolve this I opened the Compile.Target.xml file in the MSBuild package folder and altered the version of the framework that it refers to from 2.0 to 3.5, so that it looks like this:
<target name="Compile.CompileSource">
<exec
program="${framework::get-framework-directory('net-3.5')}\msbuild.exe" workingdir="${ProductDirectory}"
failonerror="false"
resultproperty="Private.Compile.Result"
verbose="true"
>
Having to change this was a bit annoying, as I had thought that the change I made to the MSBuild package's Properties.xml earlier would have prevented my needing to repeat this step.

With this change I got a green light from the build, happy days, and a nicely packaged set of files. I still have a bit to do, getting some more packages added in. I also have to get some builds set up for some other projects, and upgrade this install. Additionally, once I've upgraded I'll see if any of the issues I've had look like bugs and raise them if they are.

So I guess over the next couple of weeks I may blog more on this.

Blog posts that I found useful to get this far are:

http://www.cifactory.org/joomla/
http://geekswithblogs.net/twickers/archive/2007/04/03/110691.aspx
http://geekswithblogs.net/twickers/archive/2007/04/03/110693.aspx
http://codebetter.com/blogs/jeffrey.palermo/archive/2007/11/28/upgrade-nant-for-use-with-vs2008-solutions-and-net-3-5.aspx

Finally a big thanks to Jay Flowers for putting this all together.


Thursday 6 November 2008

Visual Studio 2010 (Rosario) - First Impressions (1)

I've just gone through the pain of downloading each of the files required to extract the Visual Studio 2010 aka Rosario September CTP VPC. Why couldn't they use the Microsoft Download Manager??

Anyway, first things first, I opened Visual Studio 2010 and went to the New Project dialog. The first thing that I noticed was that the MVC Framework is missing. Now I'm not surprised as it is yet to be released, and I don't doubt that it'll make it to the final release, but it's a bit disappointing all the same as it means I'll have to settle for bad old asp.net when I get to the web stuff.

New project dialog box

The new options that stand out to me are the WiX project options, nice to see that getting shipped with Visual Studio, and Modelling Projects (which currently just has an empty project). Being curious to see what this is, the Modelling Project is my first port of call.

Choosing to create a new Empty Modelling Project presents me with this:

MyFirstModellingProject1

I get a Model Explorer on the left and in the Solution Explorer I can see that I have been given a ModelDefinition folder. Opening this folder reveals a single file - ModelDefinition.uml. When I right click and choose to Add a File I have a number of different types of model presented to me. These are:

  • Activity Diagram
  • Component Diagram
  • Layer Diagram
  • Logical Class Diagram
  • Sequence Diagram
  • Use Case Diagram

My first thought is that it's a useful enough set of diagrams to work from. I'm a little surprised not to see a State Machine diagram available, especially given the obvious synergy Workflow Foundation might have with this type of model. There's also no sign of support for OCL, but I'm less surprised by this (and certainly not disappointed). I've also never come across a 'Layer Diagram' before, and a quick look at the OMG UML Superstructure specification doesn't seem to mention it - though maybe I just can't spot it there. That being the case, the Layer Diagram seems like a logical place to start playing.

The Layer Diagram seems to be about showing the various layers that will make up an application, and the dependencies between those layers. It has 3 patterns available out of the box that you can drag and drop on: 3 layer, 4 layer, and MVC. Seeing the MVC pattern there is nice, but I'm not sure how useful it is. How many applications end up with just these three layers? Hmmm, maybe if you're following the ActiveRecord pattern, but otherwise you're likely to have some form of Repository/DAO layer I would think. So, being a generally DDD guy, the first 'layer' pattern that I've tried to put together follows an Onion/Hexagonal pattern.

Onion Layer Diagram2

Ok, so why is this better than what I can do with basic shapes in Visio? Well, one thing is that I can specify acceptable namespaces for each layer (and unacceptable ones), and drag and drop projects onto a layer from the Solution Explorer, which associates the project with that layer. Now if I specify an invalid namespace in a class file then when the model is validated I get a series of errors telling me of the bad namespace naming. Now I've got to say that this feature on its own is not exactly compelling. Yes, I like to ensure compliance to conventions across the dev team(s) that I work with, but I wouldn't create this diagram just for that purpose. Also, I haven't created this in the way that I normally would - I typically follow the examples provided by Kyle Bailey and Jeffrey Palermo. With this I wasn't sure how to represent the utilities that would normally be present. I was also tempted to break some of the layers up and make this more like a project diagram, a temptation I resisted. Another feature I spotted was the ability to put layers inside of layers, so I could have dropped the MVC layer inside of my UI layer; if I did though then suddenly my UI layer would contain layers that for me are typically just folders inside of a project. Maybe I'm just not 'getting it', or I'm just not working on big enough systems (though I'd question the ability to manage the complexity of a system with many more layers that represented collections of projects). I guess that when more features are added to this type of model then its real potential for usefulness will become clear. At the moment I'm not really sold on it.

Looking at the xml that is used to represent the diagram behind the scenes makes me worried about the potential merge conflicts that might emerge should two people ever work on the same artifact at the same time. This is something that I think I'll definitely check out soon as it has a real impact on how practical something like this can be when you are working as a part of a team.

Next up is a sequence diagram. These are diagrams that I use a lot and typically I use Enterprise Architect to create them. I've tried using Visio and always just end up incredibly frustrated at the lack of even basic support for this type of artifact.

Well, the less than headline news is that this is nowhere near being Enterprise Architect, so Sparx Systems can sleep easy for at least another release of Visual Studio. On dragging a lifeline on to the surface I am able to specify that I am dealing with an actor. I am not able to specify that it is a boundary, however, which is something that I use quite a lot. Support for messaging seems to be ok, with synchronous and asynchronous messages easily added into the diagram. Self messaging is also nicely catered for. Creating objects is handled ok; I prefer it when the arrow points to the box rather than to the lifeline itself, but I'm not that fussy. I can't see that it is possible to denote deletion from another object, but I guess that given garbage collection in managed applications this doesn't ever really happen as such. I also can't see that it's possible to add constraints (OCL or otherwise). Frames seem to be supported through the 'Interaction Use', which I am assuming is an interaction frame.

SequenceDiagram1

Currently guard clauses don't seem to be possible, which is a bit of a miss. This is probably because the only operator currently available for the interaction frame is 'ref', and I'd need at least support for 'alt', 'loop', 'opt', and 'par' before I could make much practical use of the sequence diagram facility. Still, there's at least a year to go before release so I hope that they do expand this feature. Also it would be nice to be able to drag a class on to the designer and then use it. With Enterprise Architect I can reverse my code base into a model and then select from classes and their available methods when I am putting the model together. I don't expect this feature to be comparable to Enterprise Architect when it releases, but if it is to be more than a gimmick there is a very long way to go. The good news for Microsoft (I think) is that I'm already more inclined to use this CTP than the UML features of Visio 2007 (which are just bloody awful); they may not be as feature rich yet, but they already feel more right.

Another post to follow as I continue to explore this CTP.

Monday 27 October 2008

Searching, Ordering, and Paging with NHibernate

For a while now I've been looking at ways of making it as simple as possible to create scalable searches and add ordering and paging to them when using NHibernate as my ORM. I read this article by Oren a while back and it has guided my thinking since. I had the pleasure of mentioning this to Oren at the last ALT.NET drinks in London. He grabbed Seb's laptop and threw some code together, code very similar to this that has since appeared on his blog here.

I like the way that he has a class to organise the building of the criteria; it controls the complexity that filtering can quickly gain. I had a couple of reservations though. Primarily, the way that the filtering class assumes some of the responsibilities of the repository by executing the criteria. Relatedly, adding ordering and paging into the filtering class would seem to add further responsibilities that do not belong there, but that I do need to configure. So I've come up with a variant, and with more than a little trepidation I'm putting it on my blog.

In the repository class I have a FindAll method that takes a DetachedCriteria object:

public IList<T> FindAll(DetachedCriteria criteria)
{
    ICriteria toFire = criteria.GetExecutableCriteria(NHibernateHelper.OpenSession());
    return toFire.List<T>();
}

To provide this DetachedCriteria I have a SearchCriteria object, in this case a CandidateSearchCriteria object. This is currently very simple in only providing for two fields but obviously more could be added to the interface and implemented. It provides a List of ICriterion objects.

public class CandidateSearchCriteria : List<ICriterion>, ICandidateSearchCriteria
{
    private string _name;
    public string Name
    {
        get { return _name; }
        set
        {
            if (string.IsNullOrEmpty(value) || _name == value) return;
            _name = value;
            Add(Restrictions.Like("Name.FullName", value, MatchMode.Anywhere));
        }
    }

    private DateRange? _registrationDate;
    public DateRange? RegistrationDate
    {
        get { return _registrationDate; }
        set
        {
            // ignore nulls and unchanged values
            if (value == null || value.Equals(_registrationDate)) return;
            _registrationDate = value;
            AddRange(_registrationDate.Value.BuildCriterion());
        }
    }
}

public interface ICandidateSearchCriteria : IList<ICriterion>
{
    string Name { get; set; }
    DateRange? RegistrationDate { get; set; }
}

To enable easy addition of paging and ordering concerns I've then created some extension methods:

public static DetachedCriteria Build<T>(this IList<ICriterion> list)
{
    DetachedCriteria criteria = DetachedCriteria.For<T>();
    foreach (ICriterion criterion in list)
    {
        criteria.Add(criterion);
    }
    return criteria;
}

public static DetachedCriteria Page(this DetachedCriteria criteria, int pageNumber, int pageSize)
{
    // assumes 1-based page numbers: page 1 starts at the first result
    criteria.SetMaxResults(pageSize);
    criteria.SetFirstResult((pageNumber - 1) * pageSize);
    return criteria;
}

public static DetachedCriteria OrderBy(this DetachedCriteria criteria, string fieldName, Direction direction)
{
    criteria.AddOrder(new Order(fieldName, direction.IsAscending()));
    return criteria;
}

The first of these adds a Build extension to IList<ICriterion>, which produces the DetachedCriteria object. The next two provide for a more 'fluent' way to set the paging and ordering behaviour.
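The Direction type and its IsAscending() helper aren't shown here; a minimal sketch of what they might look like (the names are assumed, not the real code) is:

public enum Direction
{
    Ascending,
    Descending
}

public static class DirectionExtensions
{
    // true when the sort should be ascending; used above to build the NHibernate Order
    public static bool IsAscending(this Direction direction)
    {
        return direction == Direction.Ascending;
    }
}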

This all gets used like this:

ICandidateRepository candidateRepository = MvcApplication.WindsorContainer.Resolve<ICandidateRepository>();
ICandidateSearchCriteria criteria = MvcApplication.WindsorContainer.Resolve<ICandidateSearchCriteria>();
if (!string.IsNullOrEmpty(name)) { criteria.Name = name; }

DetachedCriteria toFire = criteria.Build<Candidate>().OrderBy(sortBy.ToString(), direction).Page(fetchPage, recordsPerPage);

In particular this call is what I like:

criteria.Build<Candidate>().OrderBy(sortBy.ToString(), direction).Page(fetchPage, recordsPerPage);

Having produced the DetachedCriteria from the search criteria class, I can simply add the paging and ordering using the new extension methods.

My biggest dislike for this at the moment is that I have needed to reference the DetachedCriteria in my Repository's interface. This bleeding through of NHibernate into my interface is something that I try to minimise. Ideally I'd like to be able to create a repository layer using a different ORM, or even that persists to something other than a relational database, without having to change my interface too much.

Another thing that is unsettling me with this (unfinished) solution is the whole open-closed thing. It was pointed out to me the other day that this approach is not open to extension, and is not closed to modification. I had already tried to accommodate this to an extent by maintaining the SRP (allowing the paging and ordering to sit outside of the filtering, and requiring collaboration with the repository) and by exposing the list of ICriterion, thinking that this would allow for some extension.

This is all very much a WIP for me at the moment and I think it'll be a while before I settle on anything. I suspect that I'll be blogging more on this topic before too long.

Thursday 2 October 2008

I Wish the Framework Team Used Interfaces More

One of the things that often bugs me about the .NET Framework is that so many of the classes are written without an interface. A case in point is the System.Net.Mail.SmtpClient class. This is an excellent class and very useful. It extends System.Object, but it does not implement an interface. This is extremely frustrating!

It's not that I would like to write my own version and swap it in, but I would like to be able to mock it nicely and follow that development 101 practice of coding to the interface. Not having an interface is very limiting: what if I want to use a Decorator pattern to implement logging (yes, I know that it does some of this internally) or authorisation? No chance!

How hard would it have been to write it to an interface? It takes me no more than a couple of clicks and movements of the mouse.
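The usual workaround is to hand-roll the interface and a thin adapter over the framework class; a minimal sketch (the ISmtpClient name and its single member are my own invention, covering only what I happen to need) looks something like this:

using System.Net.Mail;

public interface ISmtpClient
{
    void Send(MailMessage message);
}

public class SmtpClientAdapter : ISmtpClient
{
    private readonly SmtpClient _client = new SmtpClient();

    // delegate straight to the framework class; mocks and decorators now work against ISmtpClient
    public void Send(MailMessage message)
    {
        _client.Send(message);
    }
}

It works, but it's exactly the kind of ceremony that an interface on the framework class would have made unnecessary.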

Sunday 28 September 2008

Keeping code clean - Decorators for Error Handling?

I read a review a few days ago of the new book by Robert C Martin (Uncle Bob), 'Clean Code'. I'm going to order a copy as soon as I clear a couple of books from my current reading backlog, as it sounds interesting and the Uncle Bob blog on Object Mentor is one that I always enjoy reading. One of the things that stuck in my mind from the review was that the 'Do one thing' principle applies even to error handling.

Hmmm. If you see the example in the review (I don't know if it's from the book) then this is shown as not evaluating the exception and acting according to its type, but rather just logging it. This has been nagging at my mind since I read it. It's not that I disagree with the example; using exceptions for control of flow has always struck me as a thoroughly poor practice in every place I've seen it done. It was more the idea that perhaps I shouldn't even have the try-catch block around my code in the first place. Now I know that the wheres and whens of exception handling (let alone the hows and whys - any of Kipling's six honest serving men in fact) are a very contested country and I'm certainly not positing what follows as 'the answer', just an idea.

What struck me was that I could use the decorator pattern to implement my error handling. For example I have an interface:

public interface ICandidateRepository
{
    Candidate GetCandidate(Guid candidateId);

    Candidate RegisterNewCandidate();

    void SaveCandidate(Candidate candidate);

    List<Candidate> FindCandidates(ICandidateCriteria criteria);
}

I already have a concrete class that implements this interface and uses NHibernate to perform the functions that are described. Instead of wrapping these calls in try-catch blocks I can create a new concrete class that implements this interface and looks like this:

public class CandidateRepositoryExceptionDecorator : ICandidateRepository
{
    private readonly ICandidateRepository _nestedCandidateRepository;
    private ILogger _logger;

    public ILogger Logger
    {
        get { return _logger ?? NullLogger.Instance; }
        set { _logger = value; }
    }

    public CandidateRepositoryExceptionDecorator(ICandidateRepository candidateRepository)
    {
        _nestedCandidateRepository = candidateRepository;
    }

    public Candidate GetCandidate(Guid candidateId)
    {
        try
        {
            return _nestedCandidateRepository.GetCandidate(candidateId);
        }
        catch (Exception ex)
        {
            // use the property so that the NullLogger kicks in when no logger has been injected
            Logger.ErrorFormat(ex, "Exception when calling GetCandidate({0}).", candidateId);
            throw;
        }
    }

    // ... etc ...
}

I use my DI framework of choice to get the parameters into the constructor and to organise the nesting of my objects. In my case this is the Windsor container. The components section looks like this:

<components>
  <component id="candidateRepositoryErrorHandler"
             service="Semeosis.Examinations.Model.Interfaces.ICandidateRepository, Semeosis.Examinations.Model.Interfaces"
             type="Semeosis.Examinations.Model.Repositories.Decorators.CandidateRepositoryExceptionDecorator, Semeosis.Examinations.Model.Repositories.Decorators" />
  <component id="candidateRepository"
             service="Semeosis.Examinations.Model.Interfaces.ICandidateRepository, Semeosis.Examinations.Model.Interfaces"
             type="Semeosis.Examinations.Model.Repositories.NHibernate.CandidateRepository, Semeosis.Examinations.Model.Repositories.NHibernate" />
</components>

The order in which the components are listed determines the order in which the container chains the objects. See this post by Oren for more info.

The upshot of this is that when I call:

ICandidateRepository repository = container.Resolve<ICandidateRepository>();

I get an instance of my CandidateRepositoryExceptionDecorator and this contains my CandidateRepository class from my NHibernate repositories project.

Now my concerns are separated and I can keep the functions that deal with the creational and persistence concerns separate from the error handling concerns. If in the future I need to change the way that my exception handling is managed, or I need to perform other actions than just logging when an exception is thrown, then I can very easily extend this code. Also, unit testing this is trivial using mocking, whilst also reducing the number of tests on the functions that actually 'do the work'. Separation of concerns in one layer helps keep things clean and tidy across many other layers of the onion! I guess that's why TDD can have such a beneficial effect on model design - but the benefits can be two way.
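To illustrate that last point, here is a rough sketch of the kind of test I mean. I've used a hand-rolled fake to keep it self-contained (Rhino Mocks stubs would do the same job), and the fake type is invented for the example:

using System;
using System.Collections.Generic;
using MbUnit.Framework;

public class ThrowingCandidateRepository : ICandidateRepository
{
    // every member blows up; only GetCandidate matters for this test
    public Candidate GetCandidate(Guid candidateId) { throw new InvalidOperationException(); }
    public Candidate RegisterNewCandidate() { throw new NotImplementedException(); }
    public void SaveCandidate(Candidate candidate) { throw new NotImplementedException(); }
    public List<Candidate> FindCandidates(ICandidateCriteria criteria) { throw new NotImplementedException(); }
}

[TestFixture]
public class CandidateRepositoryExceptionDecoratorTests
{
    [Test]
    [ExpectedException(typeof(InvalidOperationException))]
    public void GetCandidate_Rethrows_WhenTheNestedRepositoryThrows()
    {
        // no logger injected, so the NullLogger silently absorbs the logging call
        var decorator = new CandidateRepositoryExceptionDecorator(new ThrowingCandidateRepository());
        decorator.GetCandidate(Guid.NewGuid());
    }
}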

Testing DateTime Type Properties for Value Equality in Integration Tests

I've just been writing some integration tests to ensure that some Repositories I'm writing work with NHibernate properly, basically ensuring that the mappings do what I want. The Repository classes in question use the latest release of NHibernate (2.0.0.GA) for their implementation and I'm using MbUnit for my integration tests.

To try and save on the number of asserts that I have to write I thought that I'd try to use the Assert.AreValueEqual method that MbUnit provides. This allows me to write this in my test:

foreach (PropertyInfo propertyInfo in typeof(Candidate).GetProperties())
{
    Assert.AreValueEqual(propertyInfo, expected, actual, null);
}

The context for this test is that I'm persisting a copy of a new Candidate object to the database (using NHibernate, but not my repository) which exposes a property, RegistrationDate, of the type DateTime?. When this property is cycled through to the assert, it always fails. Initially this puzzled me as the two DateTime instances appeared to be the same. A quick look at the DateTime.Equals(DateTime) method in Reflector soon revealed that when this operation is called it is the InternalTicks property that is the subject of the comparison:

return (this.InternalTicks == value.InternalTicks);

When I get the DateTime back from the database it is not as precise as it was originally in the world of the CLR. The expected (original) value is 633581595292333682 whilst the actual (retrieved) value is 633581595290000000. This isn't enough of a difference to be significant for my purposes, as the ToString() representations are equal and I don't need a level of accuracy beyond days, let alone fractions of a second.

I figure my options here are:

  1. Don't use Assert.AreValueEqual and hand write each Assertion as an Assert.AreEqual instead;
  2. Store the DateTime in the database as the value of the InternalTicks rather than the DateTime value itself. The obvious problem I can see with this is that any extraction, such as to a reporting database, would become more problematic if this was done. I figure there are times when testing should inform design, but this is clearly not one of them!
  3. Isolate NullableDateTime type properties and deal with these differently.

By changing the code a little I am able to isolate the nullable DateTime properties and deal with them using the ToString() method, like this:

Type NullableDateTimeType = Type.GetType("System.Nullable`1[[System.DateTime, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]]");
foreach (PropertyInfo propertyInfo in typeof(Candidate).GetProperties())
{
    if (propertyInfo.PropertyType == NullableDateTimeType)
    {
        DateTime expectedDateTime = (DateTime)propertyInfo.GetValue(expected, null);
        DateTime actualDateTime = (DateTime)propertyInfo.GetValue(actual, null);
        Assert.AreEqual(expectedDateTime.ToString(), actualDateTime.ToString(), "The expected RegistrationDate is not equal to the actual RegistrationDate.");
    }
    else
    {
        Assert.AreValueEqual(propertyInfo, expected, actual, null);
    }
}

I'm not too sure how much coding I've saved myself with this, but I figure that as the number of properties involved increases it will reduce maintenance. If anyone out there has a better solution (doing this at gone 2 in the morning makes me think that there probably is a better solution) then please feel free to suggest it.

Tuesday 23 September 2008

Uninstalling AVG Anti-Virus SBS Edition broke my MS Exchange

The licence for my AVG Small Business Server 2003 Edition recently came to an end and after looking around I decided that rather than renew it I'd replace it with Microsoft's new Forefront products. This is probably vast overkill, but I figured it'd be interesting to see how it all works and I generally prefer vast overkill anyway.

What I discovered was that it's quite an involved process to get all the various pieces installed and working (ever the way with 'Enterprise software') so I uninstalled the AVG software and for a little while now have been going without protection on this server. Yeah, yeah, bad idea I know.

Shortly after uninstalling I found that Outlook clients couldn't connect to the Exchange server on this box. Not only that, but people were getting bounce emails saying that their messages couldn't be delivered. Thankfully I route all email through another external email server (with its own anti-virus and spam filtering) before having it forwarded through to my internal Exchange server, so I wasn't actually losing the emails, but clearly something was wrong. I tried the old 'restart the box' trick but even this didn't work, so I turned my attention to the Event logs and started to look for any errors or warnings. What I found, amongst others not seemingly relevant, were the following:

Event Type: Error
Event Source: MSExchangeFBPublish
Event Category: General
Event ID: 8197
Date: 19/09/2008
Time: 11:18:36
User: N/A
Computer: The server name
Description:
Error initializing session for virtual machine SERVER01. The error number is 0x8004011d. Make sure Microsoft Exchange Store is running.

Event Type: Information
Event Source: EXCDO
Event Category: General
Event ID: 8196
Date: 19/09/2008
Time: 11:17:13
User: N/A
Computer: The server name
Description:
Calendaring agent is stopping successfully.

Event Type: Information
Event Source: EXOLEDB
Event Category: General
Event ID: 101
Date: 19/09/2008
Time: 11:17:13
User: N/A
Computer: The server name
Description:
Microsoft Exchange OLEDB has successfully shutdown.

Event Type: Error
Event Source: MSExchangeIS
Event Category: General
Event ID: 9564
Date: 19/09/2008
Time: 11:17:12
User: N/A
Computer: The server name
Description:
Error 0x80004005 starting the Microsoft Exchange Information Store.
Failed to init VSERVER.

Event Type: Error
Event Source: MSExchangeIS
Event Category: Virus Scanning
Event ID: 9581
Date: 19/09/2008
Time: 11:17:12
User: N/A
Computer: The server name
Description:
Error code -2147467259 returned from virus scanner initialization routine. Virus scanner was not loaded.

What I took from these was that due to my uninstalling the anti-virus software Exchange was now broken because the Microsoft Exchange Information Store couldn't now start.

A quick Google on 'Error 0x80004005 starting the Microsoft Exchange Information Store' led me to an Experts-Exchange post, and scrolling to the bottom informed me about a registry key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\MSExchangeIS\VirusScan. I changed this, as suggested, so that Enabled is set to 0, restarted the server, waited a few minutes, started Outlook and held my breath. Very quickly emails started to flood into their relevant folders.
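For reference, the same change can be made from a command prompt; this is a sketch assuming Enabled is a DWORD value under that key:

reg add "HKLM\SYSTEM\CurrentControlSet\Services\MSExchangeIS\VirusScan" /v Enabled /t REG_DWORD /d 0 /f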

Now all I have to do is get the Forefront products up and running nicely, and then probably upgrade my infrastructure to Server 2008 and Exchange 2007 (following the new branch office configuration).

ASP.NET MVC - If it feels this right it can't be wrong!

Having had a few days leave from work I've spent most of my evenings playing with the not-so-new MVC framework. It's been great to spend some focused time playing with this new toy for web development from Microsoft. One thing that I've noticed is how nicely and quickly you can start to put together powerful, enterprise-happy software with it. My starting point has been a model (a bunch of POCOs based around the educational assessment domain, which I know well) using NHibernate for persistence, Windsor for a DI framework (though not for the controllers yet), MbUnit and RhinoMocks for unit testing, and the latest (Preview 5) release of the MVC framework. After getting some simple stories completed (add a candidate, edit a candidate, register a candidate for an assessment) I wanted to try and improve the user experience by getting some Ajax into the views. I've had a play with using Scriptaculous and jQuery with the MVC framework during brief forays in the past, so this time I thought I'd just use the default MS stuff that ships with it.

One example of the kind of implicit wiring that the framework deals with for you, and that I found really neat, was this. I have a view that lists candidates based on a search (I hope to blog about that before long).

<%
foreach (Candidate candidate in ViewData.Model as List<Candidate>)
{%>
<tr>
    <td><%=Ajax.ActionLink("View", "ViewDetail", "Candidate", candidate, new AjaxOptions { UpdateTargetId = "CandidateDetail" })%></td>
    <td><%=Ajax.ActionLink("Edit", "EditDetail", "Candidate", candidate, new AjaxOptions { UpdateTargetId = "CandidateDetail" })%></td>
    <td><%=Html.Encode(candidate.Id.ToString())%></td>
    <td><%=Html.Encode(candidate.Name.FullName)%></td>
</tr>
<%
}%>

Note the use of the Ajax.ActionLink method here. This will update my CandidateDetail div with the results of the Ajax call. The call itself will be to the ViewDetail action of the CandidateController class, and the properties of the Candidate object for the row will be available to this action. The neat thing for me is that if, in the action, I request just an id argument then this is automagically wired up to the Id property of the Candidate object for me!

[AcceptVerbs("POST")]
public ActionResult ViewDetail(Guid id)
{
ICandidateRepository candidateRepository = MvcApplication.WindsorContainer.Resolve<ICandidateRepository>();
Candidate candidate = candidateRepository.GetCandidate(id);
return View("Detail", candidate);
}

As you can see, the code above has one argument that accepts a Guid and is called id. The framework, following convention (love that), realises I mean the id of the candidate object and passes it into the method. Neat huh! I then get the appropriate candidate object from my repository and return the detail view (an ascx control), passing the retrieved candidate object in. This control will then be rendered into the div originally specified (CandidateDetail).
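For completeness, the Detail view is just a strongly typed partial; a minimal reconstruction (not the actual file, namespaces omitted) of its shape would be something like:

<%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<Candidate>" %>
<div id="detail">
    <%=Html.Encode(ViewData.Model.Name.FullName) %>
</div>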

This is just one example of how easy it can be to get this framework working for you. I'm really, really liking working with all this new MVC goodness, it just feels so, right!

Monday 1 September 2008

Google Chrome

Google this and you will find a slew of links talking about a new open source browser that Google are developing.

Check this link for a scanned in copy of a comic book that is apparently being sent out to various chosen people/organisations. Also this link (same site) for a description.

It looks interesting, a bit more competition in this space should be good. I just hope that they implement the DOM and Javascript properly (unlike [all] other well known implementations). Which makes me wonder if there will be a Flash or Silverlight add-in for it (and if these will be treated with parity).

My favourite feature from those discussed is that there will be a task manager where users will be able to view the plugins and web pages active in the browser, and the resources that they are responsible for consuming.

Google's power in the web-space is already significant and ownership of a browser would no doubt increase this, open source or not. Hmmmm... guess I'll get back to playing with AppEngine and learning Python!

Tuesday 26 August 2008

Getting Ubuntu 'Hardy Heron' on to a VM - Part 2

New day, and armed with the download of the Desktop version of 'Hardy Heron' I'm installing it on to a Virtual PC VM following the instructions that I found here: http://blogs.technet.com/seanearp/archive/2008/05/13/installing-ubuntu-8-04-hardy-heron-in-virtual-pc-2007.aspx

After reaching the point where I have started the install on the desktop - so far so good.

installing linux on virtual pc 1

installing linux on virtual pc 2

It already is looking far more promising.

Well, I've gotten through now as far as altering the audio, and all good so far! I'm holding my breath whilst it restarts (not literally - it's not that quick).

Well, I just heard some bongo-like sounding noise immediately before being invited to log in (and again after getting the details wrong first time).

installing linux on virtual pc success

Success!

The moral to this seems to be to use the desktop version instead of the server version. If anyone has got the server version on an MS flavour VM then I'd love to hear about it.

Next will be following the instructions to get the screen size right, templating it, and then getting Oracle installed.

Saturday 23 August 2008

Getting Ubuntu 'Hardy Heron' on to a VM - An Unhappy Saga

I've been meaning to dip my toes into the Linux water for a while now - my last exposure being a brief one about ten years ago at Uni. Yesterday evening I thought I'd actually do it. My particular motivation in this case was to run Oracle, something I'm working with a lot at work, and I figured I might as well run it on Linux as not (although we run it on Windows at work).

So, with my Ubuntu Hardy Heron downloaded, and a nice enough VM fresh and clean (2GB RAM, 16GB HDD) I naively thought I'd have it done before bed. Sadly I was wrong :(

Problem number 1 was "An unrecoverable processor error has been encountered.".

HardyHeronVirtualPCError1

After a lot of Googling I found quite a few posts on this. A number of them recommended installing it in Safe Graphics Mode. When I pressed F4 though I only got one option, Normal, so that wasn't going to help me I figured. In the end the article that I followed was this one: http://www.corey-m.com/blog/?p=326.

This recommended that I press F6 and alter the Boot Options line by adding noreplace-paravirt to the end of it (well, just before the --). This was to be done for the initial install, and then again afterwards using the 'Rescue a broken system' option. Well, I did this and then from the shell I tried to execute the 'apt-get update' command.

This brought about problem 2.

linuxvirtualpcproblem2

Basically, a bunch of 'Could not resolve' messages. Hmmm; at this point I went to bed. This evening though, when I got in from work, I thought I'd have another crack at it. I tried pinging www.google.com with no joy, but then tried pinging it by IP address and that succeeded. Clearly my issue was connected with the resolution of domain names. A bit more Googling later and I found this posting: https://bugs.launchpad.net/ubuntu/+source/resolvconf/+bug/177767. Although not specifically for my situation, it seemed to present a resolution for something close enough, so I figured that I'd give it a go.

touch /etc/resolv.conf
dhclient eth0

Having entered the above, the resolv.conf file was created, Google could be pinged by name, and best of all the apt-get update command worked.

This working, I returned to the article I was following originally (http://www.corey-m.com/blog/?p=326) and tried the next instruction: apt-get install linux-386.

This brought me to problem 3. A series of 'dependency problems' with a variety of Linux packages were reported, apparently because they were 'not configured yet', leading to a message that 'Errors were encountered while processing'.

linuxvirtualpcproblem3

Day 3... still no further...

I tried various tricks to get the packages to be configured, in particular reading and following the advice in this 'http://ubuntuforums.org/showthread.php?t=186672' post. But no matter what I tried, still no success.

So I gave up (for now at least) and thought I'd try a different tack - install Virtual Server 2005 R2 and get it working on that. I remember seeing a blog post that said that Hardy installs easily onto a Virtual Server VM.

Installing Virtual Server proved to be not too much trouble. I'm running Vista Enterprise and I needed to install the CGI feature (I already had the IIS 6 Compatibility stuff) and the Windows Authentication feature. This allowed the exe to run (for the admin interface) and got rid of the annoying 'An error occurred accessing the website application data folder.' message. Just remember to run the Admin interface as an Administrator (and make sure you don't have Anonymous Authentication enabled for the site).

With Virtual Server 2005 R2 installed and working, I created a new VM and began the install of Hardy Heron on to it. I found I still had to use the noreplace-paravirt option to get the install to run. One install later and a screen I've unhappily seen before appeared.

linuxvirtualpcproblem4

I'd seen this screen already, when trying to let an install of Ubuntu boot.

Day 4: This is not an experience that I'm enjoying! I'm downloading the latest version of Hardy Heron, but this time the Desktop version rather than the Server version. I left my computer downloading it last night but Vista decided to restart after updating itself with only about 20MB left to go, so I've had to restart the download today. I feel as if this whole process is cursed! I have to drive to Liverpool now, so I'll post later when I've had a chance to see if this has made any difference.

NHibernate 2.0 is Finally Released

I've just read on Ayende's blog that the final release of NHibernate 2.0 is now available. I've grabbed the latest release and TortoiseSVN has just finished getting me the latest code.

That's my bank holiday sorted!

Thursday 21 August 2008

Simian - A Copy and Paste Code Hunter Killer!

In my new(ish) job I have to look after a legacy application with a very large code base. Over the years this has been developed with a lot of copy and paste coding practices. This is a significant contributor to our 'technical debt' and certainly increases the costs of ownership associated with the application, adversely affecting troubleshooting, maintenance, and new feature development.

I have already been making good use of NDepend and Resharper to assist with refactoring, and still have hopes that we will invest in Ants Profiler, but I was missing a tool that would nicely identify for me where copy and paste coding might have occurred. After a little Googling I discovered Simian.

This great little command prompt utility will analyse a Visual Studio solution and spot instances of the same code existing in multiple places. It was really simple to integrate into the Visual Studio IDE and will make a big difference to my team going forward. Now when working on a piece of code we can quickly check to see if it exists elsewhere and refactor accordingly. This new visibility is going to be invaluable as we move forward and attempt, bit by bit, to pay back the technical debt.

I found a great blog post by a chap from Conchango that showed how to integrate it with Visual Studio, but in the end I mainly just used the, plenty good enough, documentation that ships with it.

The line that I am currently using to have this tool scan my solution is:

-formatter=vs:c:\temp\buks_simian.log -language=cs $(ProjectDir)/../**/*.cs $(ProjectDir)/../**/*.aspx $(ProjectDir)/../**/*.js

What this does is:

-formatter=vs:c:\temp\buks_simian.log (specify that the output should be dumped to a file on my C drive and formatted for Visual Studio)

-language=cs $(ProjectDir)/../**/*.cs (specify that all the .cs files that are children of the parent of my current project directory should be searched - this is recursive thanks to the /**/)

$(ProjectDir)/../**/*.aspx $(ProjectDir)/../**/*.js (like above but also look at the aspx and js files too).

The reason I am starting from the project directory and going up a level before initiating a recursive scan, rather than just using the solution directory, is that my solution files live outside of the structure of the projects, ensuring that they don't end up source controlled.
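For completeness, wired up as a Visual Studio external tool the whole thing ends up as a single command along these lines (assuming the Simian console runner, here called simian.exe, is on the path):

simian.exe -formatter=vs:c:\temp\buks_simian.log -language=cs "$(ProjectDir)/../**/*.cs" "$(ProjectDir)/../**/*.aspx" "$(ProjectDir)/../**/*.js"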

Wednesday 6 August 2008

SQL Server 2008 is RTM!

So SQL Server 2008 has finally made it to RTM. I can't wait to get it installed and to start seeing how the features look now, and how the performance compares. They're available now on MSDN and TechNet.

SQL Server 2008 versions available for download.

So there they are, although I don't seem to be able to download the Developer version yet.

Expect blog posts on this before long...

Wednesday 30 July 2008

.NET 3.5 Microsoft Courseware with IEEE Computer Society

I logged on to the IEEE Computer Society site for the first time in a few months today and got a pleasant surprise when I saw that at least some of the latest .NET 3.5 courseware is available free (usually £210.33 last time I looked) to members. It even includes courses on ADO.NET Data Services and the Entity Framework.

I've done a few of the courses they provide before, including the Certified Ethical Hacker course and a bunch of .NET ones. The only downside with the .NET ones is that (at least last time I did one) you don't get the virtual labs; you do get the exercises though, so you can set up your own machine and work through them anyway. There are bucket loads of courses available to members on .NET, Java, databases (SQL Server 2000, 2005 and Oracle 8 and 9i), Cisco, Adobe, business skills, IT security, SharePoint (WSS and MOSS) and lots, lots more.


So if you're not an IEEE Computer Society member and you're thinking of doing any self-paced at-home IT courses then I'd recommend joining, because for the cost of buying one of these courses from Microsoft you can have a year's membership, a great magazine (Computer) every month, and lots and lots of free courses!

Oh and well done to Microsoft for supporting IEEE Computer Society members in this way.

Monday 28 July 2008

Resharper 4 Bug with 'Use Format String'

I'd like to say first that I really, really like Resharper 4.0 and have made it my mission to persuade my boss to buy it for all of the devs where I work. I got my copy free at a NxtGenUG Southampton meeting and it continues to make my life as a developer easier.

But...

I've come across a small bug in Resharper 4.0 today, so I thought that I'd share so that others become aware.

Real code has been changed to protect the innocent (and because it belongs to my employer).

Start off with a StringBuilder.Append statement that uses the '+' operator to concatenate a string with an object. Why an object? Well, because that's what the code I'm working with has; not ideal, but that's working with legacy code for you sometimes.

Of course, using the Append statement with string concatenation isn't ideal practice either, and what I'd like is to change it to an AppendFormat call. Now Resharper makes this very, very (seductively) simple.

Here's my example code before I let Resharper do its thing:

Here's Resharper offering to do its thing:


Here's the code that results:

The important line to note is that we now have:

stringBuilder.AppendFormat("{0} likes Resharper.", (object[])canBeCastToString);

Note the unnecessary cast to an object array. Because of this we now get the error:

System.InvalidCastException: Unable to cast object of type 'System.String' to type 'System.Object[]'.

The solution is just to get rid of the superfluous cast to Object[].
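To make the whole thing concrete without the screenshots, here is a small stand-alone sketch of the scenario; the variable name comes from the resulting line above, everything else is illustrative rather than the exact code from my employer's codebase:

using System;
using System.Text;

class ResharperFormatStringRepro
{
    static void Main()
    {
        // The legacy code hands me an object that happens to hold a string.
        object canBeCastToString = "Everybody";

        StringBuilder stringBuilder = new StringBuilder();

        // Before: string concatenation inside an Append call.
        stringBuilder.Append(canBeCastToString + " likes Resharper.");

        // What 'Use Format String' generated for me compiles, but throws
        // System.InvalidCastException at runtime because the string is not an object[]:
        //
        //     stringBuilder.AppendFormat("{0} likes Resharper.", (object[])canBeCastToString);

        // The fix: drop the superfluous cast and pass the object straight in.
        stringBuilder.AppendFormat("{0} likes Resharper.", canBeCastToString);

        Console.WriteLine(stringBuilder);
    }
}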

I still really, really like Resharper 4.0.

Saturday 19 July 2008

You Grok What You Eat

Following on from conversations that I've had with various people recently it has struck me, and it's not a revolutionary thought, that my practice, and my thinking about my practice, is intimately linked to the feeds I subscribe to, the books I read, the events I attend, the people I chat with, and my work. So, as I finally got around to adding my ALT.NET Geek Code, I thought I'd write a short blog post focusing on the feeds I subscribe to and some of the more influential (on me that is) books I have read in the last couple of years.

With feeds I of course subscribe to those of friends (see right) and I won't list them again as they are always present on this site. I'll start by giving a special mention to my two favourite blogs.

Ayende @ Rahien's blog has to be top of my list; his blog is a pure goldmine of ideas and inspiration and I'd thoroughly recommend his Hibernating Rhinos series if you haven't seen it.

Next up, Scott Gu, always interesting to see what MS is coming out with next.

After these two, here's the rest:

For books the list is a lot shorter.

First up comes Object Thinking. I just loved this book and will happily rave about it to anyone. Having originally read Social Anthropology at university and then gone on to do a master's in social science research methodologies, I found that reading this book brought together a number of thoughts that had been swirling around in my mind about the potential importance of insights from Anthropology, Sociology and other social sciences that are rarely given any attention in computing. In fact when I read it, I read it as much as an anthropology of the practice of software development as a book that gives great concrete guidance on the practice of software development. It wasn't until going to a great talk by Alan Dean at the Southampton NxtGen that I realised I had probably missed a lot of what the author was actually trying to communicate. I fully intend to reread it before too long in the light of Alan's talk.

Next is 'Domain Driven Design' by Eric Evans. A fantastic book that has really changed the way that I try to design and code software. Putting it here is almost as much of a cliché as citing Scott Gu's blog, but to ignore it would be to leave out an important part of my diet. I actually came to read it because it had been used so much as a counterpoint in Object Thinking's sometimes quite dualistic arguments.

Agile Project Management with Scrum by Ken Schwaber. The first Scrum book that I read. Easy to read, and every page is worthwhile.

C# In Depth by Jon Skeet. I haven't quite finished this yet but I'm going to give it a mention anyway because it is already the best 'code' book I've read to date. A lot of coding books that I've got and read have really frustrated me because they all seem to cover the same old, same old ground every time. Jon's book is focused on C# and avoids being drawn into the framework, which is fantastic, because the last thing that I'd have wanted is another book with another chapter on ADO.NET, ASP.NET, databinding, etc... I saw Jon give a talk on C# a while ago and for me it was the best technical talk I have ever attended. He involved the audience and I left feeling that I had really learnt something. This book is almost as good as his talks.

If C# in Depth was a book that I hadn't finished reading then the next book is a book that the authors haven't finished writing! 'Brownfield Application Development in .NET' by Kyle Baley and Donald Belcham is, so far, great and represents everything that I am trying to achieve in my job at the moment. I can't wait for the later chapters to be released and would recommend it to anyone who has to transform not just a poorly designed and coded piece of software, but the practices and patterns of a team also.

Other books worth a mention include the GoF Design Patterns, CLR via C#, Object Oriented Software Construction, The Knowledge Creating Company, The Object Primer, Being Digital (now that's going back some).

So where does this diet lead me? See the ALT.NET geek code on the right.

Thursday 19 June 2008

Resharper 4.0 and Lambda Expressions

Now I'm fairly new to the wonderful world of Resharper, so barely a day goes by where I don't discover something new about it, or through it. But today I found a feature that just made me want to share about it straight away.

Let me set the scene.

I'm coding a Lambda expression for use as a predicate and when finished I notice the now familiar little light bulb. Hmmm I thought, now what could Resharper be wanting to tell me about my code?

Resharper offers conversion from Lambda to statement, and to anonymous method. (Real code replaced throughout to protect the innocent.)

That's right, Resharper is offering to convert my Lambda to either an anonymous method or a statement! So let's see what it does to the code if I choose these options.

Well, if I choose the option 'Lambda expression to statement' it turns my code into this:

The same code as an inline statement. Note that with this option I am given the choice to convert on to an expression, or to an anonymous method.

If I choose the option 'To anonymous method' my code looks like this:

The same code as an anonymous method. Note that it is offering to convert my code back to a lambda.

If I choose the 'Lambda expression to anonymous method' option and place the caret in the delegate keyword, however, then I get a different set of options.

Different Resharper options with the anonymous method.

This time the options include one to convert my anonymous method to a named method. Choosing this does what you will no doubt expect: it creates a named method.

private static bool NewPredicate(Person p)
{
    return p.Name == "Sue";
}

and changes my code to read:

Person sue = people.Find(NewPredicate);

Actually I added the 'New' prefix myself, as Resharper just named it Predicate, but you get the idea. Of course, Resharper now gives me the option to inline my predicate as either an anonymous method, or as a Lambda.

Resharper offering to inline my named delegate method.

Neat huh!

Choosing to inline as an anonymous method acts as I would expect, and puts the code back to be the same as the anonymous version above. Choosing to inline as a Lambda, however, I have a tiny, tiny gripe with.

Resharper offering to remove the 'redundant parenthesis' that it inserted.

As you can see above, Resharper has placed the target variable within brackets. Still, at least Resharper knows that it shouldn't have done this, as it offers to remove them!

What I really, really like most about this, and what made me want to blog about it, is that it very neatly highlights some of the evolution of C# from version 1 with its named delegates (though of course we didn't have delegate comparisons then), through version 2, with its anonymous methods, through to the current version 3 with its Lambda goodness.
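For anyone who wants to see that evolution side by side, here is a quick sketch using the Person class and predicate from above (the surrounding program is just scaffolding of my own to make it run):

using System;
using System.Collections.Generic;

class Person
{
    public string Name { get; set; }
}

class LambdaEvolution
{
    private static bool NewPredicate(Person p)
    {
        return p.Name == "Sue";
    }

    static void Main()
    {
        List<Person> people = new List<Person>
        {
            new Person { Name = "Sue" },
            new Person { Name = "Bob" }
        };

        // C# 1.x style: an explicitly constructed delegate around a named method
        // (glossing over the fact that generics, and so List<T>.Find, only arrived with C# 2.0).
        Person sue1 = people.Find(new Predicate<Person>(NewPredicate));

        // C# 2.0 style: an anonymous method.
        Person sue2 = people.Find(delegate(Person p) { return p.Name == "Sue"; });

        // C# 3.0 style: a lambda expression.
        Person sue3 = people.Find(p => p.Name == "Sue");

        Console.WriteLine("{0}, {1}, {2}", sue1.Name, sue2.Name, sue3.Name);
    }
}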

Thursday 12 June 2008

NxtGenUG Fest '08

Just got back from an excellent day attending the NxtGenUG Fest '08 at Microsoft's campus at Reading.

Like all NxtGen related events everything was smooth and incredibly well put together. I enjoyed all of the talks and some of them have definitely inspired me to go and hit the computer for a play.

Mike Taulty's talk on ADO.NET Data Services was an excellent and engaging introduction to this feature that will be properly released with the forthcoming .NET 3.5 Service Pack 1. The technology seems very seductive, and I will try to find some time before long to have a play with it, but it seems very data driven to me, and the implications of that, if the technology were to be deployed much beyond a simple demo app, concern me a little. Still, until I look into it properly that's just a gut reaction and may well be wrong.

Daniel Moth gave a short talk on Parallel Linq which looked very interesting and this is something that I will definitely be looking into before long.

Dave Morrow gave a really interesting talk on providing BI using SQL Analysis Services, ProClarity, MS PerformancePoint and SharePoint (WSS 3). The world of BI (business intelligence) is not one that I know particularly well, but this talk has really gotten me wanting to have a play with these tools. It'll stretch the limits of my hardware (even with VMs) to get the necessary environments set up to play around sensibly with this stuff, but I think that I'll enjoy it and it'll be worth it. One thing that I wondered while watching this demo was whether I could bring SPSS (a statistical analysis package I used during my MSc) to the party and get some even more interesting information out of this mix. No doubt I'll blog some about this when I've looked into it and had a good play.

The talk that most interested me however was Oliver Sturm's talk on F#. I can't overstate how interesting and well delivered this presentation was in my opinion. I'd been wanting to look into F# for a few months now without ever getting around to it. Oliver's talk has made this (along with my continued playing with Spec#) the new big priority for my personal R&D time. I'd heard/read that F# would be a good tool for writing business rules engines, but had been leaning toward doing this with a Boo based DSL. This introduction to F# is making me think that I really need to look into it before I make that decision. I suspect also that, as an added benefit, a play in this space will do wonders for my use of Lambda expressions in C# too.

Of course the day was not just about the great speakers (they were all great, not just the ones that I've highlighted); like any NxtGen event it was also about the pizza and the swag. I got, amongst other things, a free copy of the full DevExpress software (no, that's not why I'm so full of praise for Oliver's fantastic talk), and some Microsoft 'Heroes' chocolates. Beyond all of that though, the chance to meet and chat with former colleagues (great to see such a large number from Immediacy) and other NxtGeners (great to see so many from the Southampton branch) really made the day. Congratulations and thanks to Dave, Rich and John for another brilliantly hosted and put together event.

Now I just can't wait for DDD 7 (for which, incidentally, I have just submitted a proposal for a Grok talk on Design by Contract and Spec#).

Saturday 7 June 2008

Revisited: Insufficient Permissions on web.config when delegating administration of a web site to remote users in IIS 7

I have been meaning to return to this since I first posted on it and gave my talk about the remote management of IIS 7 at NxtGenUG Southampton. Well I've finally gotten around to it.

This error occurs when remote administration of an IIS 7 web site and/or application is delegated to an IIS 7 user, and that user attempts to alter a setting of the site/application from a remote machine. The solution that I originally posted was to grant the LOCAL SERVICE account modify permissions on the web.config file in question. This always seemed unsatisfactory to me as it is quite a sweeping grant that increases the surface area for security issues, breaking with the principle of least privilege as it does. However, at the time I first looked into this it was already gone midnight and I had a big day the next day with work, a long drive, and then the talk to give, so I went to bed and decided to look into it again another day.

If you just want the solution then skip now to the bottom paragraph; if, like me, you believe that the journey is worth more than the destination (and if you've been struggling with this under pressure then I doubt you will), then here is also how I got there.

I finally got around to looking into this further last night (another post-midnight session). I ran procmon.exe (part of the Sysinternals suite) and replicated the issue on a new VM so that I could see what was going on 'under the hood'.

Image showing that the wmsvc process was impersonating the NT AUTHORITY\LOCAL SERVICE account when its access was denied.

As the image above shows, the wmsvc process was impersonating the NT AUTHORITY\LOCAL SERVICE account when it was denied access to the web.config file of the site. So it would seem that in giving this account permissions on the web.config file I did the correct thing (I didn't, so keep reading).

Something about all of this still puzzled me however: when I look at the events logged in procmon that show the wmsvc process successfully accessing the applicationHost.config file, it too is impersonating the NT AUTHORITY\LOCAL SERVICE account. Well, you might think that makes sense; wmsvc is after all a service and that is the account that it is impersonating. The strange thing for me was this.

Image showing that applicationHost.config does not have any ACL entries for the LOCAL SERVICE account, but does have an entry for 'WMSvc'.

Looking at this, two things stand out to me: 1, there is no entry for NT AUTHORITY\LOCAL SERVICE; 2, there is an entry for something called WMSvc.

After a bit of Googling and reading around I have sussed what is going on, and why I couldn't work it out that evening.

With Windows Vista (and so of course Windows Server 2008) Microsoft looked to address the security risks posed by having services all running as the LOCAL SERVICE account. When I set the permissions on the web.config file to allow LOCAL SERVICE I am not just permitting the service that I am interested in (wmsvc), I am permitting any service that runs as or impersonates that account. This is, after all, what bothered me in the first place. Microsoft have introduced a feature termed Service Hardening that addresses this issue: each service now gets its own identity that permissions can be granted against, which means I can give permissions that target a specific service only. Wow, that's a great (and necessary) improvement.

So the solution to the problem is this: give modify permissions to NT SERVICE\WMSvc. Do this and the remote delegated access will work like a dream, and the principle of least privilege will have been adhered to!
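If you prefer the command line to the Security tab on the file's properties, the same grant can be applied with icacls from an elevated prompt; something along these lines (the path is obviously just an example, and M is the shorthand for Modify):

icacls "C:\inetpub\wwwroot\MySite\web.config" /grant "NT SERVICE\WMSvc":M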

My only criticism of the new features in all of this is: why can't IIS Manager set this permission for me when I choose to delegate permissions to an IIS user for the management of a site or application? It would be a lot easier!