Tuesday, December 15, 2009

Using jQuery on the client with JSON data for smart client-side data filtering



Recently I started using some of the more advanced features of jQuery via a FUBU MVC project, and from this experience I have grown a deep love and appreciation for the power it brings to the web developer's toolset. jQuery is a full-featured JavaScript library that enables DOM traversal. To more easily grok what jQuery sets out to accomplish, think of traversing the browser DOM in a SQL-query-like manner. With jQuery I will demonstrate that you can easily parse through a JSON data set stored in the DOM. The solution requires client-side filtering for three selectors that reside on a page displaying a grid of data. The users desire an AJAX-like selection process. We will store the JSON data in a hidden field mimicking the initial data set. This hidden field will additionally contain fields for the client-side filtering that are not part of the grid's initial display.

View Models

FUBU MVC is an opinionated approach to web development. One of the underlying concepts of the FUBU MVC architecture that I think is fundamentally strong is what the authors refer to as the “Thunderdome Principle”: one model in and one model out. As I mentioned at the outset, the out model defined for the view will need a property containing the entire record set that is initially sent to the view, together with additional filter selectors. To aid in the transformation between jQuery and JSON we will utilize an additional jQuery plug-in called jQuery-JSON. Jason Grundy, one of the Elegant Coders, provides an insightful discussion of this plug-in that complements the ideas I will discuss here, together with applicable code for wiring up the solution. The idea is not that difficult. A word of caution: when your data set runs to thousands of rows, this approach may not be the most sensible choice due to the amount of data passing over the wire.

The first step is to utilize the JavaScriptSerializer class contained within the System.Web.Script.Serialization namespace. To serialize data to the client, simply call the Serialize method; conversely, to fill a property with JSON data coming back from the client, call the Deserialize method. The MapLocations property below is of type IEnumerable<T> and is initially mapped to a user control contained within our view that will render a list of mapping locations. We will include three drop-down selectors directly above the grid of map locations to implement the client-side filtering requirement. The property TargetMapLocationsListJson, which is of type string, will contain the data in JSON format for the grid, plus the fields required for filtering.

 outModel.MapLocations = locationList;
 outModel.TargetMapLocationsListJson = _javaScriptSerializer.Serialize( outModel.MapLocations);
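For reference, here is a sketch of what the serialized TargetMapLocationsListJson payload looks like once it reaches the client, round-tripped with plain JSON calls. The field names match the filtering code later in this post; the sample values are invented:

```javascript
// Sample of what TargetMapLocationsListJson might contain once serialized.
// Field names mirror the filtering code later in this post; values are made up.
var serialized = JSON.stringify([
  { mapId: 1, StateId: 10, CountyId: 7, ZipCodeId: 301,
    StoreName: "Main St", Sales: 125000, NumberOfEmployees: 12 },
  { mapId: 2, StateId: 11, CountyId: 3, ZipCodeId: 455,
    StoreName: "Elm Ave", Sales: 98000, NumberOfEmployees: 8 }
]);

// Round-tripping it back gives plain objects we can filter on the client.
var locations = JSON.parse(serialized);
console.log(locations.length);        // 2
console.log(locations[0].StoreName);  // "Main St"
```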


 <div id="selectedMapLocations">
     <table class="data-grid">
         <thead>
             <tr>
                 <th>Store Name</th>
                 <th>Sales</th>
                 <th>Number of Employees</th>
                 <th>Action</th>
             </tr>
         </thead>
         <tbody>
             <!-- rows rendered by the ForEachOf helper; item template elided -->
             <%= … ("No locations exist for these choices").WithoutItemWrapper().WithoutListWrapper() %>
         </tbody>
     </table>
 </div>

Our view model that displays the grid of map locations passes into the filtered list view a strongly typed list of store locations, rendered by a ForEachOf HTML helper extension included within FUBU MVC's core framework. Ryan Kelley of Elegant Code wrote a great series on FUBU MVC and he covers wiring up the view model with the view in greater detail than I will here. If you are committed to using the FUBU MVC architecture I highly recommend that you check out his 4-part series.

We set up a hidden control on the page, populated with another FUBU MVC helper extension called HiddenFor, to hold the serialized JSON data that mimics what is displayed within the selectedMapLocations div.

<div id="mapLocationsList">
    <%= this.HiddenFor(x => x.TargetMapLocationsListJson).ElementId("targetMapLocationsListJSON")%>
</div>




And now we can move on to the JavaScript. There will be no server involvement until the user clicks the submit button. You will note in the HTML for the data-grid above that we included a column named Action. This column contains a button whose click event allows the user to add the various map locations to their profile. Each click adds the location selection to a hidden field that is converted to JSON so that we can Deserialize the selections on the server. We then add each of the selections to an IEnumerable<T> property contained within the Domain object.
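As a rough sketch of that accumulation step, the selections could be gathered client-side and serialized for the hidden field like this. The array, function and element names here are illustrative, not from the original solution:

```javascript
// Sketch of accumulating the user's "Include Location" clicks into a JSON
// string destined for a hidden field. Names here are illustrative.
var selectedLocations = [];

function includeLocation(mapId) {
  // Avoid adding the same location twice.
  if (selectedLocations.indexOf(mapId) === -1) {
    selectedLocations.push(mapId);
  }
  // In the page this string would be written into the hidden input, e.g.:
  // $('#selectedMapLocationsJSON').val(JSON.stringify(selectedLocations));
  return JSON.stringify(selectedLocations);
}

includeLocation(3);
var payload = includeLocation(7);
console.log(payload); // "[3,7]"
```

On the server, Deserialize-ing this string back into a collection is then a one-liner, as described above.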

On document ready we create a function that contains initialization methods, event-handling routines and associated helper routines. We set three global Boolean properties to false. As previously noted, we use three filters invoked by dropdown controls. On initialization each of these dropdown controls is set to null, and then each of the dropdown event handlers is wired up. When a user changes a selection, the event handler associated with that dropdown control fires, as shown below for State.

            function initSelectedStateChangeEventHandling() {
                $("#list-maplocations-state").change(function() {
                    if (stateChanged == false && $(this).val() == "") return;
                    $("#list-maplocations-state option:selected").each(function() {
                        stateChanged = true;
                        findMapLocationsByFilter($(this).val(), 'state');
                    });
                });
            }

What should be evident from the above JavaScript snippet is that when the Boolean property is no longer false and the selection value is not null, we iterate through the user's selection choices and call into a function that performs the client-side filtering. Let's have a look at the findMapLocationsByFilter function.

            function findMapLocationsByFilter(criteria, filterType) {
                var originalList = $('tbody');
                originalList.find('tr').remove(); // clear the existing tr/td content
                var data = $("#targetMapLocationsListJSON").val();
                if (data == "") return;
                $.each($.evalJSON(data), function() {
                    originalList.append(getSelectedFilteredMapsHtml(this, criteria, filterType));
                });
            }

This is a generic function used by all dropdown selectors. In it we use a jQuery selector for the existing content within the tbody element, narrow the selection to the elements within the tr and td tags, and call the remove method on that content. A new selector is then built from the content of the hidden field, which is structured as JSON data. We use jQuery's each method together with the jQuery-JSON plugin's evalJSON method. Each data element in the hidden field is passed into another function that appends the filtered content back into the selector we previously cleared. Let's now have a look at the getSelectedFilteredMapsHtml function.

    function getSelectedFilteredMapsHtml(mapData, criteria, filterType) {
        if (!filterHasData(mapData, criteria, filterType)) return "";
        return "<tr id='maplocations-maplist-row-" + mapData.mapId + "' stateId='" + mapData.StateId + "'"
            + " countyId='" + mapData.CountyId + "' zipId='" + mapData.ZipCodeId + "'>"
            + "<td id='storeName'>" + mapData.StoreName + "</td>"
            + "<td id='sales'>" + mapData.Sales + "</td>"
            + "<td id='numEmployees'>" + mapData.NumberOfEmployees + "</td>"
            + "<td id='includemaps'>"
            + "<span class='form-item'>"
            + "<input id='save-maplocations-" + mapData.mapId + "' type='button' value='Include Location' />"
            + "</span>"
            + "</td></tr>";
    }



Clearly no row will be returned if the filter has no matching data. The filterHasData function compares the stateId within the JSON data to the value of the selector's changed stateId. If they are equal, we return the row and append its content back to the element we cleared earlier. The filterHasData function is quite simple and is shown below.

            function filterHasData(mapData, criteria, filterType) {
                switch (filterType) {
                    case "state":
                        return (mapData.StateId == criteria);
                    case "county":
                        return (mapData.CountyId == criteria);
                    case "zip":
                        return (mapData.ZipCodeId == criteria);
                    default:
                        return false;
                }
            }
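Stripped of the jQuery plumbing, the three functions above boil down to: parse the cached JSON, keep the rows matching the changed selector, and rebuild the row markup. A minimal stand-alone sketch under those assumptions (the field names mirror the snippets above; the sample data is made up):

```javascript
// The essence of the clear-and-rebuild cycle: parse the cached JSON,
// keep only rows matching the changed selector, regenerate the row HTML.
function filterRows(json, criteria, filterType) {
  var fieldByType = { state: "StateId", county: "CountyId", zip: "ZipCodeId" };
  var field = fieldByType[filterType];
  return JSON.parse(json)
    .filter(function (row) { return row[field] === criteria; })
    .map(function (row) {
      return "<tr><td>" + row.StoreName + "</td><td>" + row.NumberOfEmployees + "</td></tr>";
    })
    .join("");
}

var json = JSON.stringify([
  { StateId: 10, StoreName: "Main St", NumberOfEmployees: 12 },
  { StateId: 11, StoreName: "Elm Ave", NumberOfEmployees: 8 }
]);
console.log(filterRows(json, 10, "state"));
// "<tr><td>Main St</td><td>12</td></tr>"
```

In the real page the returned string is appended into the cleared tbody; here it is simply returned so the mechanics stand alone.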

What has become quite evident to me from having dabbled in a couple of MVC frameworks is the importance of knowing how to use JavaScript correctly. Adding a few extensions really assists you in developing smarter, more powerful UI experiences. For added assistance I would recommend the following books:

  1. Manning – jQuery in Action, Bear Bibeault & Yehuda Katz
  2. O’Reilly – JavaScript: The Good Parts, Douglas Crockford

I hope to post a detailed solution with all code in the near future. I appreciate any comments on, or alternatives to, the approach I have taken.

Sunday, October 4, 2009

NH Profiler – Using Filters to find the needle in the haystack

NH Profiler allows a developer to analyze the behavior between the domain and data layers more effectively. If you are using NHibernate, I think this piece of software should be in your arsenal of tools. Oh, and for you Java folks, it works against Hibernate.

Session Filtering

Before I start the analysis, notice to the left we have an unfiltered view of our current sessions. If you look a bit closer, you'll notice that the recent statements list contains some added noise. To remedy this we will apply a filter to exclude session statements containing a URL path for image, CSS and JavaScript files.

Locate the Filter component in the upper-right window of NH Profiler. By default filtering is Inactive, so let's define one custom to our session!

  1. Click on the filter icon and in the dropdown list of the Edit Filter window choose filtering for “Sessions by URL”.
  2. Click the Add button.
  3. Choose the not containing operator.
  4. Enter the text for filtering. In my example I want to filter out URLs containing the text '/Assets/'.
  5. Click Apply.


And voilà! We now have a more concise session list.

When the analysis is complex try to make it simpler with filters! In a future discussion we will look into how NH Prof can diff sessions and what insight we can gain from doing this.

Thursday, September 10, 2009

Part 2 - Extending S#arp Architecture with version 1.0 of Fluent NHibernate’s ManyToManyTableNameConvention


In my last post I showed how to override S#arp Architecture’s implementation of Fluent NHibernate’s auto-mapping conventions. In the text that follows we will show how you can easily continue following the default behavior of S#arp Architecture and use convention over configuration. We will add a convention mapping strategy to automatically handle ManyToMany relationships. In doing so, S#arp Architecture will be able to work with M:M entity relationships out of the box.

Before we dive into the implementation of the code changes that facilitate this functionality, we will review the steps required to adopt some of the new FNH interface improvements and changes introduced with the 1.0 release. Our current project is based off of S#arp Architecture 1.0, which uses the version preceding Fluent NHibernate 1.0. Our project was previously using an older build of NHibernate, and the new version of Fluent NHibernate compiles against a newer one. There has been mention by others that the Castle stack used in S#arp Architecture requires an update also. In my situation the only Castle component requiring updating was the Castle byte code provider, which had to be down-graded.

The Detour – Housekeeping tasks

I used a great new tool named Horn to ease the pain of upgrading versions of Fluent NHibernate and its interdependent parts. I had heard about Horn for a while but had not spent any cycles on it till now. There is a great thread on the S#arp Architecture Google group where a frustrated individual lamented the pain of upgrading open source software. Horn also has a discussion group and a contrib group. It's still in its infancy but is definitely worth a look. After downloading the Horn binary, build it and then issue the following command-line statements:

Horn –install:fluentnhibernate

Horn –install:nhibernate.validator

Through trial and error I found that, when building S#arp Architecture with the new version of NHibernate, the existing NHibernate Validator assembly did not work properly. Horn will look up FNH's hard dependencies, retrieve the projects and build them within your local Horn package tree. As an example, I recently installed the AutoMapper project by Jimmy Bogard. I issued the following command-line arguments against Horn.exe:

Horn –install:automapper

and in a little over a minute the result folder is populated as follows:


All updated assemblies and any required dependent assemblies are placed into this location. Take the rebuilt assemblies from this folder and add them to your S#arp Architecture’s lib folder location. Rebuild the binaries for S#arp Architecture and then copy these binaries together with the updated binaries for Fluent NHibernate, NHibernate, NHibernate Validator to your project’s lib folder. Rebuild your project solution and it will likely fail with the following error:

Server Error in '/' Application.

Could not load file or assembly 'NHibernate, Version=, Culture=neutral, PublicKeyToken=aa95f207798dfdb4' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)

The fix for this is to add a dependent-assembly binding entry for the NHibernate binary in your web.config file. This was likely required to get the S#arp Architecture binaries to fully compile with an app.config entry and is sometimes forgotten in projects that use the core framework libraries. Place the following configuration settings within your web.config file's <runtime> tag:

<dependentAssembly>
    <assemblyIdentity name="NHibernate" publicKeyToken="AA95F207798DFDB4" culture="neutral"/>
    <bindingRedirect oldVersion="" newVersion=""/>
</dependentAssembly>

A detailed 3-part series covers the individual changes required to get the conventions to compile with FNH 1.0. I recommend you review it and make the changes that make sense for your project's use of the FNH conventions.

Addition of new convention classes

The Fluent NHibernate convention classes are located within your data assembly and are organized within a folder named NHibernateMaps as follows:


I have added CustomManyToManyTableNameConvention and HasManyToManyConvention classes to the Conventions folder. Also notice the exclusion of the mapping classes.

Implement a class to derive from the ManyToManyTableNameConvention base class

public class CustomManyToManyTableNameConvention : ManyToManyTableNameConvention
{
    protected override string GetBiDirectionalTableName(IManyToManyCollectionInspector collection,
                                                        IManyToManyCollectionInspector otherSide)
    {
        return Inflector.Net.Inflector.Pluralize(collection.EntityType.Name) +
               Inflector.Net.Inflector.Pluralize(otherSide.EntityType.Name); // second operand truncated in the original; the other side's pluralized name is the likely intent
    }

    protected override string GetUniDirectionalTableName(IManyToManyCollectionInspector collection)
    {
        return Inflector.Net.Inflector.Pluralize(collection.EntityType.Name) +
               Inflector.Net.Inflector.Pluralize(collection.ChildType.Name); // likewise truncated; child type assumed
    }
}

Our custom implementation of the ManyToManyTableNameConvention base class includes overrides of the table names for both bidirectional and unidirectional associations. This is an approach recommended by James Gregory to avoid separate tables being created for each side of the respective associations.
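To make the naming behavior concrete, here is a toy model of what the convention computes, sketched in JavaScript with a deliberately naive pluralizer (Inflector.Net handles the real pluralization rules, including irregular forms):

```javascript
// Toy model of ManyToManyTableNameConvention's naming: the join table name is
// the pluralized entity name from each side, concatenated. The pluralizer
// here is deliberately naive; Inflector.Net handles irregular forms.
function pluralize(name) {
  return name + "s";
}

function biDirectionalTableName(entity, otherEntity) {
  return pluralize(entity) + pluralize(otherEntity);
}

console.log(biDirectionalTableName("AppUser", "Role")); // "AppUsersRoles"
```

Because both directions of the association resolve to the same name, only one join table is generated.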

Implement a class deriving from the IHasManyToManyConvention

    public class HasManyToManyConvention : IHasManyToManyConvention
    {
        public void Apply(IManyToManyCollectionInstance instance)
        {
            // body truncated in the original post; the generated mappings below
            // suggest it set the cascade, e.g. instance.Cascade.SaveUpdate();
        }
    }

Implement A Class to Override Foreign Key Naming For M:M and M:O Associations

public class CustomForeignKeyConvention : ForeignKeyConvention
{
    protected override string GetKeyName(PropertyInfo property, Type type)
    {
        if (property == null)
            return type.Name + "ID";
        return property.Name + "ID";
    }
}

In all, there is very little code needed to implement this new convention. The convention is bootstrapped in the AutoPersistenceModelGenerator class as follows. Focus your attention on the GetConventions method.

public AutoPersistenceModel Generate()
{
    var mappings = new AutoPersistenceModel();
    // configuration calls truncated in the original post
    return mappings;
}

private static Action<IConventionFinder> GetConventions()
{
    return c => {
        // the convention registrations were truncated in the original post
    };
}


Within my test runs I perform sanity checks on my NHibernate maps and save schema and HBM file changes. Here is an excerpt from the test run showing just the areas of interest introduced by the new convention classes. What becomes apparent, and is really cool, is that the default for collection associations is a bag. I presume this is because I am using an IList to manage this collection on either side of the association, and FNH uses reflection to auto-set the relationship to the .Net equivalent of the bag, which is an IList. By default the inverse of the relationship is added without any explicit code, and to the correct side of the relationship! This is pretty freaking cool if you ask me.


<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2" default-lazy="true"> …

  <bag cascade="save-update" … >
    <column name="RoleID" />
    <column name="AppUserID" />
  </bag>

<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2" default-lazy="true"> …

  <bag cascade="save-update" … >
    <column name="AppUserID" />
    <column name="RoleID" />
  </bag>

Here is the Edit view of a recently created user. Note the checked values for both roles to which I added this user.


Remove the user from both roles.


In the final post for this series I will show the front-end changes made to make this all come together.


Wednesday, September 9, 2009

VAN: S#arp Architecture Revisited – Advanced Techniques – November 4, 2009


Billy McCafferty will join us once again on the heels of releasing Service Pack 1 for S#arp Architecture version 1. He will spend some time reviewing feature improvements and changes, and add more context to the framework in areas where time did not permit in the first meeting. If you have any specific questions you would like addressed during the evening, please add a question to our group and we will make sure Billy is ready to answer it.


Who is Billy McCafferty and what makes him tick? Well, he is a long-time developer and a hopeless romantic when it comes to writing beautiful software. Billy currently leads a double life between helping to run the world's greatest IT training school at http://www.itsamuraischool.com/ and filling the part-time role of lead developer and architect with Parsons Brinckerhoff. Billy is enjoying getting a bit of his life back after the recent release of S#arp Architecture 1.0 and is currently hard at work on the first quarterly release, due later in September of 2009.

What is VAN?

Virtual ALT.NET (VAN) is the online gathering place of the ALT.NET community. Through conversations, presentations, pair programming and programming dojo’s, we strive to improve, explore, and challenge the way we create software. Using net conferencing technology such as Skype and LiveMeeting, we hold regular meetings, open to anyone, usually taking the form of a presentation or an Open Space Technology-style conversation.

Please see the Calendar to find a VAN group that meets at a time
convenient to you, and feel welcome to join a meeting. Past sessions can be found on the Recording page.

To stay informed about VAN activities, you can subscribe to the Virtual ALT.NET (VAN) Google Group and follow the Virtual ALT.NET blog.

Meeting Details

Times below are Central Daylight Time
Start Time: Wed, November 4, 2009 8:00 PM UTC/GMT -5 hours
End Time: Wed, November 4, 2009 10:00 PM UTC/GMT -5 hours
Attendee URL: Attend the meeting (Live Meeting)

Sunday, August 30, 2009

S#arp Architecture – Part 1: Implementing the M:M mapping override

User Story

Should be able to add a user to more than one role. Should be able to remove the user from specific roles.

Convention over Configuration

This topic is garnering quite a bit of interest of late in the .Net community. James Kovacs recently appeared on an episode of .NET Rocks! discussing Convention over Configuration. He spent a fair amount of time discussing the move away from XML configuration for NHibernate by using the conventions of Fluent NHibernate. By default S#arp Architecture implements the auto-mapping convention. This means that you only need to alter the convention for various edge cases, one of those being M:M associations. Knowing in advance that you will need to perform the override is handy, hence the reason for today’s discussion.

Review of the Model

I have the following two entities forming a many-to-many relationship.

Mapping Override using Class Auto Mapping

I created a sub-folder within my data assembly naming it NHibernateMaps and then proceeded to add the following two classes within it. I added using statements for Fluent NHibernate’s AutoMap and AutoMap Alteration namespaces. The key mapping attributes below for both sides of the relationship are the WithParentKeyColumn and WithChildKeyColumn values.

public class AppUserMap : IAutoMappingOverride<AppUser>
{
    public void Override(AutoMap<AppUser> mapping)
    {
        mapping.Id(x => x.Id, "AppUserID");
        mapping.SetAttribute("lazy", "false");
        mapping.Map(x => x.LoginName).WithLengthOf(50);
        mapping.Map(x => x.Password).WithLengthOf(255);
        mapping.HasManyToMany(x => x.Roles)
            .WithParentKeyColumn("AppUserID")   // chained calls were truncated in the
            .WithChildKeyColumn("RoleID")       // original; reconstructed from the text
            .AsBag();
    }
}

public class RoleMap : IAutoMappingOverride<Role>
{
    public void Override(AutoMap<Role> mapping)
    {
        mapping.Id(x => x.Id, "RoleID");
        mapping.Map(x => x.Name, "RoleName");
        mapping.HasManyToMany<AppUser>(x => x.AppUsers)
            .WithParentKeyColumn("RoleID")
            .WithChildKeyColumn("AppUserID")
            .AsBag();
    }
}

The choice of a Bag, List, Set, Map or array structure to handle the transient collection in memory depends on the context of the requirement at hand. In our domain model above we represent the User and Role associations as an IList structure, which is equivalent to NHibernate’s IBag object, and therefore we set the mapping to use a Bag. One of the bloggers over at CodeBetter.com discusses this point in the Relationships section of Part 6 of a great series of articles that formed his book.

In the next article I hope to implement the same override but by using the Conventional approach. Therefore there will not be any need to use any mapping classes. I will wrap up the series by indicating the additional changes needed in the UI and controller layers.

Saturday, August 29, 2009

S#arp Architecture - Architectural review


For the past 6 months or so I have been using S#arp Architecture and closely following the framework’s evolution from beta to release. During this time I have found the project, community and reference documentation excellent! Billy McCafferty and the team of contributors have done a professional job assembling this framework into a key source of guidance on how to assemble an enterprise architecture that embraces (from its project main page on Google Code):

  • Loose coupling leveraging Microsoft’s ASP.Net MVC
  • Persistence ignorance with NHibernate
  • Domain Driven Design
  • Pre-configured infrastructure

S#arp Architecture includes S#arp Scaffolding, which greatly speeds up the process of adding CRUD functionality for your entities through the use of T4 templates implemented with the T4 Template toolkit. The out-of-the-box templates can be extended, for example to add support for the Ext JS JavaScript library. It also includes a Visual Studio project template to build out your solution tree. Billy has a good post on the topic of extending the T4 template to embrace the Ext JS library, and he frequently responds to questions that come into the Google group discussion forum linked at the bottom of this post. Several videos on getting started and extending the architecture are also available, and others of note contribute insightful comments and blog entries that have helped my understanding.

What follows will be a series of blog posts documenting the ability to override the default generated templates and code to produce the desired result of managing the roles for various users accessing an MVC website in the context of varying controller actions.

The posts will be as follows:

Introducing the model and overriding Fluent NHibernate’s auto mapping strategy within S#arp Architecture


Wednesday, July 22, 2009

VAN: Lessons learned building NH Profiler with Ayende Rahien, Christopher Bennage and Rob Eisenberg Oct 21 and Oct 28, 2009


A three-way conversation with the main collaborators who created NH Profiler. This tool enables developers to gain deeper insight into profiling their application's communication from NHibernate (.Net) and Hibernate (Java) through to the database.

Who they are

Ayende Rahien contributed efforts on the back-end development.

Christopher Bennage and Rob Eisenberg contributed their efforts to the front-end.

Time and location of the meetings

Times below are Central Daylight Time
Start Time: Oct 21 and 28, Each week 8:00 PM UTC/GMT -5 hours
End Time: Oct 21 and 28, Each week 10:00 PM UTC/GMT -5 hours
Attendee URL: Attend the meeting (Live Meeting)

Wednesday, July 15, 2009

VAN: An evening with Scott Bellware discussing the myth of developer productivity August 18, 2009


The myth of developer productivity.

Scott’s Bio

Scott Bellware is a software product designer, developer, manager, and agile coach living in Austin, TX. He speaks at software industry conferences and teaches agile development practices and software production methodologies in workshops in the US, Canada, and Europe. He is the founder of the Lean Software Austin and the AgileATX communities of software practitioners. He is the organizer of the upcoming MonoSpace, ALT.NET Open Space, and Continuous Improvement conferences in Austin, and has served as the content chairman for the agile development track at the DevTeach conferences, as well as the chairman of the INETA Speaker Committee. He is the recipient of Microsoft's Most Valuable Professional award.

Meeting Details

Times below are expressed in Central Daylight Time

Start Time: Wed, August 19, 2009 8:00 PM UTC/GMT -5 hours

End Time: Wed, August 19, 2009 10:00 PM UTC/GMT -5 hours

Attendee URL: http://snipr.com/virtualaltnet (Live Meeting)

VAN: An evening of Questions and Sharing of group opinions regarding DDD pragmatic concepts facilitated by David Laribee July 29, 2009


In the spirit of Open Spaces we will be bringing in David Laribee to facilitate a discussion of opinions on Domain Driven Design.

Who is Dave?

He is a coach for the product development team at VersionOne. He has 12 years experience designing and developing enterprise applications and coaching Agile teams. David has worked on internal IT, product development, consulting, and rapid prototyping teams across a wide variety of industries. David is a frequent speaker at local and national developer events. He was awarded a Microsoft Architecture MVP for 2007 and 2008 and writes about Agile and Lean methods, coaching, and software design on the CodeBetter blog network.

Meeting Details

Times below are Central Daylight Time
Start Time: Wed, July 29, 2009 8:00 PM UTC/GMT -5 hours
End Time: Wed, July 29, 2009 10:00 PM UTC/GMT -5 hours
Attendee URL: (Live Meeting)


Friday, May 22, 2009

Enabling Kerberos delegation with Application Service Architectures and SQL Server Analysis Services 2005


This document describes how to set up Kerberos delegation to authenticate an application Windows service's HTTP requests to SQL Server Analysis Services 2005. The configuration steps that follow become required when the application host machines are separated but exist within the same domain. More information can be obtained on this here.

Active Directory Setup

When making changes in Active Directory, you will need a System Administrator with permissions to make any of the changes described below.

● The server which hosts your application Windows services must be set to 'Trust this computer for delegation (Kerberos only)'.
● All the AD user accounts that will utilize your application Windows service should have the setting "Sensitive: not allowed to be delegated" disabled. This means that all these accounts should be allowed to be delegated.
● Register the Service Principal Names (SPN) described in this document in your Active Directory.

Service Principal Name Registration

The Service Principal Name (SPN) tool, setspn, is part of the Windows Server 2003 and 2008 Support Tools and can be found on your product CD. Alternatively you can download it from here.

Machine hosting your application windows service.

Check the registered SPN’s before you continue with the registration by issuing the following text from the command line:
setspn –l <domainName>\<serverHostName>

This command will list (-l) the current SPNs, and the only ones that should show up are as follows:


Protocol registration

setspn –a HTTP/<domainName>\<serverHostName> <serviceAccount>
setspn –a HTTP/<serverHostName>.<domainName>.com <serviceAccount>

Application Service registration

setspn –a <appWindowsServiceName>/<domainName>\<serverHostName> <serviceAccount>
setspn –a <appWindowsServiceName>/<serverHostName>.<domainName>.com <serviceAccount>

Note: the <serviceAccount> token used above should be replaced with the <serverHostName> token when the services are running as LocalSystem/NetworkService. Otherwise specify the domain account that the services are running under.

Machine Hosting SQL Server Analysis Services

Check the registered SPN’s before you continue with the registration by issuing the following text from the command line:
setspn –l <domainName>\<serverHostName>

This command will list (-l) the registered SPNs. Typically the ones that appear are the same as noted above on your application Windows service machine.

SSAS Service registration

setspn –a MSOLAPSvc.3/<domainName>\<serverHostName> <serviceAccount>
setspn –a MSOLAPSvc.3/<serverHostName>.<domainName>.com <serviceAccount>

The notes above regarding the <serviceAccount> token apply here equally.

Setup the clients

● The URL used to connect to the application service web application (http://<serverhostname>.<domainname>.com/) should be added to the trusted sites list in Internet Explorer.
● To use Windows Integrated Authentication, the option 'Automatic logon with current username and password' must be selected in the Security settings dialog box for the trusted sites zone (Section: User authentication – Logon).

Sunday, May 3, 2009

Team City Addin for NUnit

Recently at our Virtual ALT.NET group I ran through getting a basic project up and running within Team City 4.0. During the demonstration we were lucky to have a couple of key insiders from JetBrains join us in the discussion. During the presentation I was painfully hitting the proverbial brick wall of "fail", namely around getting my NUnit test results integrated into Team City's build reports. To save all of you from having to watch the video, and to put this rather simple issue to rest quickly, I have included at the bottom of this post the steps required to have your NUnit tests integrated within Team City. I would like to thank Yegor Yarko and Eugene Petrenko and all participating attendees who offered their assistance in getting this to work!

In order to integrate the NUnit Test runner, Team City requires a bit of additional configuration and the documentation is not as forthcoming as I would like. Hopefully this post will assist others who want to integrate their testing statistics into their build server's dashboard reporting.

JetBrains indicates that they provide support for NUnit via an addin. The addin provides on-the-fly test reporting integrated within Team City.
The screen to the left appears as an additional chart located within the Settings tab indicating Test Count.

The screenshot below shows what happens when test reports are not configured correctly; the build itself is still successful. Notice that only four tabs are displayed — when correctly configured there are five, including a tab named Tests.

When the NUnit test runner integration is correct the screen will appear as follows:

Click on the Tests tab and you can filter the tests by ignored, successful, failed, or all. Additionally you can view your tests by classes, suites, namespaces/packages, or all. This filtering capability is directly above the listing of tests in the screenshot below.

Configuration Steps

It is important to note that with the following steps, integration of the NUnit test runner is supported only for NUnit versions 2.4.x and higher.

  • Locate the nunit-console.exe.config file for the version of NUnit that you will be using. In this case I chose NUnit 2.4.8, so the path to the file is C:\Program Files\NUnit 2.4.8\bin. Add the following XML to the config file, then save and close it.

  • I defined a property for locating the NUnit console runner for version 2.4.8. This is optional, of course; I just find it a cleaner approach, as I can reference this location with less noise later in the build script. It is also worth noting that the version of NUnit's console runner does not have to match the version of NUnit referenced within your project's test assembly.
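The property definition in the MSBuild script was along these lines (the property name and path shown here are illustrative, not necessarily the post's actual values):

```xml
<PropertyGroup>
  <!-- Location of the NUnit console runner used by the build -->
  <nunitconsole>C:\Program Files\NUnit 2.4.8\bin</nunitconsole>
</PropertyGroup>
```

Later tasks can then reference the runner location as $(nunitconsole) instead of repeating the full path.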

  • Copy the dll and pdb files for the Team City NUnit addin into an addins directory created beneath the location defined by the nunitconsole property above; NUnit will then pick up the addin when the tests are executed.
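A copy task for the addin files might look roughly like this sketch (the target name, addin file names, and use of the nunitconsole property are assumptions for illustration):

```xml
<Target Name="CopyTeamCityAddin">
  <!-- Ensure the addins directory exists beneath the NUnit console runner -->
  <MakeDir Directories="$(nunitconsole)\addins" />
  <!-- Copy the Team City addin binaries so NUnit loads them at test time -->
  <Copy SourceFiles="JetBrains.TeamCity.NUnitAddin-NUnit.dll;JetBrains.TeamCity.NUnitAddin-NUnit.pdb"
        DestinationFolder="$(nunitconsole)\addins" />
</Target>
```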

The Team City test runner for NUnit will work only with NUnit versions 2.4.x and higher. If anyone has an alternative configuration that has worked for them, I would love to hear about it.

Friday, April 10, 2009

Tricks for dealing with Assembly changes in DocProject

I have been working with Steve Bohlen of late in evaluating a couple of API documentation engines that are being considered for inclusion with NHibernate's API.  

One issue that was bothering me was how to best deal with detecting assembly changes in a project. As many of you know, NHibernate has a rather extensive API that undergoes regular changes and additions. DocProject is one tool that is being evaluated to perform the documentation requirements. During the initial project setup a wizard prompts you to select your assembly targets for inclusion. These assemblies then appear as reference dependencies inside of your DocProject project. So the question that was bothering me was: what happens if you create new assemblies? Having to manually add these new assemblies to DocProject would become a nuisance, as it would be easy to forget this step.

Steve suggested the following, which worked quite nicely!
Step 1
Remove the reference assemblies from DocProject

After removing the dependencies, you need to specify an external location where they can be found by DocProject. Below, this is done within the DocProject Properties window, where we supply the External sources location for the assemblies and their XML documentation files.

Step 2
Create a post-build event that copies the assemblies to the project's output directory; in this case we created an output folder called Help. This folder's content is automatically cleaned and scrubbed by the project's build script during the compile task. We added the following macro command in the Project Properties | Build Events section, defining a post-build event.
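The post-build event was a copy command along these lines (the exact macros, file patterns, and target folder are assumptions — adjust to your solution layout):

```
xcopy "$(TargetDir)$(TargetName).*" "$(SolutionDir)Help\" /Y /I
```

The /Y switch suppresses overwrite prompts and /I treats the destination as a directory, so the event runs unattended on the build server.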


Checking in the noted changes caused the build to fail, because the Help folder did not yet exist on the build server; we need to automate its creation.

Hence I added a make directory task to my compile target in the project's build script.
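The addition to the compile target is a one-liner, roughly as follows (the target and property names are assumed to match the project's build script):

```xml
<Target Name="compile">
  <!-- Ensure the Help output folder exists before the post-build copy runs -->
  <MakeDir Directories="$(OutputPath)\Help" />
</Target>
```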

Now any assemblies newly added within my project will be automatically detected for documentation. The post-build task can be tweaked to limit which project assemblies you would like to document.


DocProject fails to build when checked into Subversion

I found this error condition occurring after introducing a documentation project to my code project and checking this into source control. When the build server attempts to build the solution from source control the following error occurs:

The actual error message is:
  • C:\Program Files\Dave Sexton\DocProject\bin\DaveSexton.DocProject.targets(40, 5): Access to the path 'all-wcprops' is denied.

  • It's resolved by excluding the build output folders for DocProject from source control.
    Namely, ignore the following folders:
    buildhelp, Help\Html and Help\Html2
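With Subversion, these folders can be excluded by setting the svn:ignore property on their parent directories, along these lines (run from the project root in a Unix-style shell; paths are assumed to match the layout above):

```
svn propset svn:ignore "buildhelp" .
svn propset svn:ignore "Html
Html2" Help
svn commit -m "Ignore DocProject build output folders"
```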

Now the build is green once again and life is good!

I found assistance to fix this issue here.

Sunday, March 15, 2009

Solving slow parameterized query plans with SQL Server

This morning I was listening to Stack Overflow podcast #45, in which Jeff Atwood indicated he had uncovered situations involving poorly performing parameterized queries. The solution involves optimizing for 'UNKNOWN' as an optional hint when dealing with parameterized queries.


For example, with parameters @p1 = 1 and @p2 = 9998:

select * from t where col1 > @p1 or col2 > @p2 order by col1
option (OPTIMIZE FOR (@p1 UNKNOWN, @p2 UNKNOWN))

This optional optimization is available in SQL Server 2008 only. The hint forces the query optimizer to use all available statistical data to arrive at a more intelligent view of the values the parameters are likely to hold when generating the query plan, rather than sniffing the specific values being passed in by the application. The workarounds below do not offer good alternatives for solving this issue, most notably when dealing with dynamic parameters.


1. Recompile every time the query is executed using the RECOMPILE hint - This can be very CPU intensive and effectively eliminates the benefits of caching query plans. ex. option(RECOMPILE)

2. Un-parameterize the query - Not a viable option in most cases due to SQL injection risk.

3. Hint with specific parameters using the OPTIMIZE FOR hint (However, what value(s) should the app developer use?) - This is a great option if the values in the rows are static, that is, not growing in number; in my case, however, the rows were not static.

4. Force the use of a specific index.

5. Use a plan guide - applying any of the recommendations above.

Implications for NHibernate and other Object Relational Mappers

I am a user of NHibernate. At present NHibernate does not provide a SQL 2008 dialect and recommends using the SQL 2005 dialect configuration option with SQL 2008 data sources. I am wondering if anyone in the NHibernate community has come across this issue of slow parameterized SQL. Is this an issue an ORM needs to be aware of when supporting a given database dialect? My view is yes, though I am still somewhat of a newb with NHibernate.