Tuesday 13 December 2011

Automatically trim HTML controls in an ASP.NET MVC project

I have noticed over the years that users, being users, sometimes post forms on a website with leading or trailing spaces in the input controls. I find this is especially a problem when the user has copied and pasted an email/website address from an internet page or email. On the surface this might be OK, but for email and website addresses in particular, when I display the value back inside an href I sometimes get:-
- 'mailto:// email@email.com ' or
- href=' http://www.wildesoft.net '

So what I need to remember is to either a) trim before I send to the database or b) trim after I retrieve from the database. I for one prefer option one: better to keep the database correct rather than fixing the output in the UI. But is there a way we can do this for every HTML control that is posted? With ASP.NET MVC we can override the default model binder so that it performs a trim before the value reaches the action method on a controller.

All you need to do is to add the following class to your mvc project:-
public class TrimModelBinder : DefaultModelBinder
{
  protected override void SetProperty(
      ControllerContext controllerContext,
      ModelBindingContext bindingContext,
      PropertyDescriptor propertyDescriptor, object value)
  {
    if (propertyDescriptor.PropertyType == typeof(string))
    {
      var val = (string)value;
      if (!string.IsNullOrEmpty(val))
        val = val.Trim();

      value = val;
    }

    base.SetProperty(controllerContext, bindingContext, 
        propertyDescriptor, value);
  }
}
Then in your application start up method:-
protected void Application_Start() {
  InitContainer();
  ...
  ModelBinders.Binders.DefaultBinder = new TrimModelBinder();
}
Now when a user keys a leading or trailing space into any input control, the TrimModelBinder kicks in and automatically removes it for you.

The only instance where this may be a problem is when a user deliberately has a leading or trailing space in their password.
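If preserving such spaces matters to you, one option is to skip selected string properties inside the binder. Below is a minimal sketch of that idea; checking the property name against "Password" is just an illustrative assumption, so adjust the rule to suit your own models:-
protected override void SetProperty(
    ControllerContext controllerContext,
    ModelBindingContext bindingContext,
    PropertyDescriptor propertyDescriptor, object value)
{
  // Leave password-style fields untouched so that any deliberate
  // leading or trailing space survives the binding
  if (propertyDescriptor.PropertyType == typeof(string)
      && propertyDescriptor.Name != "Password")
  {
    var val = (string)value;
    if (!string.IsNullOrEmpty(val))
      val = val.Trim();

    value = val;
  }

  base.SetProperty(controllerContext, bindingContext,
      propertyDescriptor, value);
}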

Monday 5 December 2011

Simple.Data and mysql

To connect Simple.Data to a MySQL database, the easiest route is via NuGet. NuGet is a Visual Studio extension that makes it easy to install and update open source libraries and tools in Visual Studio.

As of 1st December 2011 the code has been updated to work with Simple.Data version 0.12.2.1.

To install via NuGet, go to your Package Manager Console window and type:-
PM> Install-Package Simple.Data.Mysql

After the installation has completed you may also need to install the data connector. As of 1st December the latest version of the connector is 6.4.4.

PM> Install-Package Mysql.Data

Please note: you may not need to perform this step, as the version of your choice will be loaded dynamically at run time. This dynamic loading means it is enough for the Mysql.Data.dll file to be present in the same directory as Simple.Data.Mysql.Mysql40.dll at runtime. You don't have to take a dependency on the connector if you don't want to.

If all has succeeded then your solution will contain a packages.config file.
<?xml version="1.0" encoding="utf-8"?>
<packages>
 <package id="MySql.Data" version="6.4.4" />
 <package id="Simple.Data.Ado" version="0.12.2.1" />
 <package id="Simple.Data.Core" version="0.12.2.1" />
 <package id="Simple.Data.Mysql" version="0.12.2.1" />
</packages>

Now you are ready to start querying your database.
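For example, assuming the same sort of User table used in my other posts and a named connection string called "dbConnection" (both are just placeholders here), a first query might look like this:-
var db = Database.OpenNamedConnection("dbConnection");
var users = db.User.FindAllByRoleId(1);

foreach (var user in users)
  Console.WriteLine(string.Concat(user.Id, " ", user.Username));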

If you need any help then please direct your questions to the user group.

A big thank you to Vidar Sømme and Richard Hopton, who have made all this possible.

Wednesday 30 November 2011

Simple.Data and bulk inserts

Simple.Data allows you to pass Lists (or IEnumerables) of dynamically or statically typed objects (ExpandoObjects) to the Insert, Update and UpdateBy methods. This is great if you need to perform any kind of bulk insert/update.
var db = Database.OpenNamedConnection("dbConnection");
var list = new List<User>();
for (var i = 0; i < 10; i++)
  list.Add(new User { 
      Id = i + 10000, Username = "User" + i, 
      Password = "Pwd" + i, 
      DateCreated = DateTime.Now.AddDays(-i), 
      RoleId = (i % 3) 
   }
  );
//All users are inserted into the database 
//  with this single call
db.User.Insert(list);
When bulk inserting into SQL Server, Simple.Data makes a call to ADO.NET's DbCommand.Prepare(), which creates a compiled version of the insert statement on the server itself. The insert statements are then run one by one, which should improve performance. However, you may not see any performance gain if you are only bulk inserting 2 or 3 rows, as there is a small upfront overhead. This compiled temporary stored procedure is destroyed when the current connection is closed. The SQL profiler shows us:-
declare @p1 int
set @p1=-1
exec sp_prepexec @p1 output,N'
  @p0 int,
  @p1 varchar(50),
  @p2 varchar(50),
  @p3 datetime,@p4 int',
  N'insert into [dbo].[User] ([Id],[Username],[Password],
    [DateCreated],[RoleId]) values (@p0,@p1,@p2,@p3,@p4)',
    @p0=10000,@p1='User0',@p2='Pwd0',
    @p3='Nov 29 2011  8:59:44:897PM',@p4=0
select @p1
and then sends each insert as:-
exec sp_execute 1,@p0=10001,@p1='User1',@p2='Pwd1',
  @p3='Nov 28 2011  8:59:44:897PM',@p4=1
exec sp_execute 1,@p0=10002,@p1='User2',@p2='Pwd2',
  @p3='Nov 27 2011  8:59:44:897PM',@p4=2
exec sp_execute 1,@p0=10003,@p1='User3',@p2='Pwd3',
  @p3='Nov 26 2011  8:59:44:897PM',@p4=0
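For comparison, this prepare-once, execute-many pattern is the same one you would write by hand with plain ADO.NET. The sketch below is purely illustrative (it is not Simple.Data's actual code, and connectionString is a placeholder); it needs System.Data and System.Data.SqlClient:-
using (var connection = new SqlConnection(connectionString))
using (var command = connection.CreateCommand())
{
  connection.Open();
  command.CommandText =
    "insert into [dbo].[User] ([Id],[Username]) values (@p0,@p1)";
  command.Parameters.Add("@p0", SqlDbType.Int);
  command.Parameters.Add("@p1", SqlDbType.VarChar, 50);

  // compile the statement once on the server (sp_prepexec)...
  command.Prepare();

  // ...then reuse it for every row (sp_execute)
  for (var i = 0; i < 10; i++)
  {
    command.Parameters["@p0"].Value = i + 10000;
    command.Parameters["@p1"].Value = "User" + i;
    command.ExecuteNonQuery();
  }
}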
Note: In the case of Insert you get back a new list of objects with any database-assigned default values such as identity values or timestamps.
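So if you need those values back, simply capture the return value of Insert, for example:-
var insertedUsers = db.User.Insert(list);

foreach (var user in insertedUsers)
  Console.WriteLine(user.Id);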

The following code is just a check to see that the inserts worked:-
foreach (var item in db.User.All())
  Console.WriteLine(string.Concat(
    item.Id, " ", 
    item.Username, " ", 
    item.Password, " ", 
    item.DateCreated, " ", 
    item.RoleId)
  );

I believe before version 1 release Mark is going to create an Upsert which will either update or insert your entity based on whether the record exists in the database.

Since writing this post, Mark has also done some macro optimisations using bulk insert, which makes use of SqlBulkCopy. Read this blog post to find out more.

Monday 28 November 2011

Simple.Data Ranges

In this blog post I am going to show you how you can use FindAll with ranges, which make use of the BETWEEN operator. The BETWEEN operator (in SQL) is used in a WHERE clause to select a range of data between two values.

Lets look at finding users by Id:-
var db = Database.OpenNamedConnection("dbConnection");
var list = db.User.FindAllById(10002.to(10005));

foreach (var item in list)
  Console.WriteLine(string.Concat(item.Id, " ", item.Username,
    " ", item.Password, " ", item.DateCreated, " ", 
    item.RoleId));
This actually produces the following SQL:-
select
  User.Id,
  User.Username,
  User.Password,
  User.DateCreated,
  User.RoleId
from
  User
WHERE
  User.Id BETWEEN 10002 AND 10005
and returns the following data:-


OK so far so good, but what about if we want to get all users that have a RoleId of 1 to 2? This is where the dynamic features of .NET 4 and Simple.Data come alive. All I need to do is change FindAllById to FindAllByRoleId:-
var list = db.User.FindAllByRoleId(1.to(2));

foreach (var item in list)
  Console.WriteLine(string.Concat(item.Id, " ", item.Username,
    " ", item.Password, " ", item.DateCreated, " ", 
    item.RoleId));
The SQL where clause will now be
WHERE User.RoleId BETWEEN 1 AND 2
and returns the following data:-


OK great but what about date ranges?
var list = db.User
  .FindAllByDateCreated("2011-11-20".to("2011-11-22 17:00"));

foreach (var item in list)
  Console.WriteLine(string.Concat(item.Id, " ", item.Username,
    " ", item.Password, " ", item.DateCreated, " ", 
    item.RoleId));
The SQL where clause will now be
WHERE User.DateCreated
  BETWEEN '2011-11-20  00:00:00' 
    AND '2011-11-22  17:00:00'
and returns the following data:-


Great, but can we also make use of the BETWEEN operator for strings? You bet:-
var list = db.User.FindAllByUsername("User2".to("User4"));
The SQL where clause will now be
WHERE User.Username BETWEEN 'User2' AND 'User4'

You can also use FindAllBy... with arrays; this makes use of the IN operator (in SQL), which allows you to specify multiple values in a WHERE clause. More to follow, but a quick taster is sketched below...
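Passing an array to the same dynamic FindAllBy... method should translate to an IN clause, along these lines:-
var list = db.User.FindAllByRoleId(new[] { 0, 2 });
which gives a where clause of:-
WHERE User.RoleId IN (0, 2)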



Monday 21 November 2011

Simple.Data explicit joins

I am delving a bit more into Simple.Data and starting to look at joins. I am going to begin with a classic one-to-many relationship between the tables cmsmeta and cmspage:-


So how does Simple.Data handle joins? In this blog post I am going to show you explicit joins.

The following code shows how we can project two columns from each table into a DTO using the On(...) form.
var db = Database.OpenConnection(...);

var pages = db.CmsPage
  .Query()
  .Join(db.CmsMeta).On(db.CmsPage.MetaId == db.CmsMeta.Id)
  .Select(
    db.CmsPage.Id, 
    db.CmsPage.PageName, 
    db.CmsMeta.Keywords.As("MetaKeywords"),
    db.CmsMeta.Title.As("MetaTitle")
  ).Cast<PageDto>();

foreach (var page in pages)
  Console.WriteLine(string.Concat(page.Id, " ", page.PageName, ",", page.MetaKeywords, "-", page.MetaTitle));
The DTO looks like this:-
internal class PageDto {
  public long Id { get; set; }
  public string PageName { get; set; }
  public string MetaTitle { get; set; }
  public string MetaKeywords { get; set; }
}
As you can see we are joining CmsPage with CmsMeta using On(...).

There is also another way to describe an explicit join, and that is to use named parameters, that is, describe the join as Id: db.CmsPage.MetaId; whichever you choose is down to personal taste. For example:-
var pages = db.CmsPage
  .Query()
  .Join(db.CmsMeta, Id: db.CmsPage.MetaId)
  .Select(...)
Both pieces of code produce the following:-
and in case you are wondering, the generated SQL looks like this:-
select 
  cmspage.Id,
  cmspage.PageName,
  cmsmeta.Keywords AS MetaKeywords,
  cmsmeta.Title AS MetaTitle
from 
  cmspage
JOIN cmsmeta ON (cmsmeta.Id = cmspage.MetaId)
There is another type of join called a natural join, which requires referential integrity and uses the foreign key constraints on the tables. Read more about natural joins. Note that with MySQL, foreign keys (and therefore natural joins) are only supported if you are using the InnoDB engine. A rough sketch follows below.
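As a taster (and assuming the foreign key between cmspage.MetaId and cmsmeta.Id is in place), a natural join is expressed by simply walking the reference in the select, something like:-
var pages = db.CmsPage
  .All()
  .Select(
    db.CmsPage.Id,
    db.CmsPage.PageName,
    db.CmsPage.CmsMeta.Title.As("MetaTitle")
  ).Cast<PageDto>();
I have not verified this against MyISAM/InnoDB myself, so treat it as a sketch rather than tested code.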

Sunday 20 November 2011

Simple.Data a projection into a dto

Sorry for the long delay since my last post, but my wife gave birth to our son Nico a few weeks ago and to be honest this has taken up all my time....

Anyway, back to the blog. Today I decided to have a quick look into Simple.Data. First I created a new console application and then installed the package via NuGet:-
PM> Install-Package Simple.Data.Mysql
If you are using Sql Server then just use:-
PM> Install-Package Simple.Data.SqlServer
The first thing I decided to try was projecting a couple of columns from a single table into a DTO. My schema looks like this (note mysql):-

var db = Database.OpenConnection("server=1.1.1.1;user=usera;database=wildesoft;password=*********;");

IEnumerable<CmsPageDto> pages = db.CmsPage
  .All()
  .Select(db.CmsPage.PageId.As("Id"), db.CmsPage.PageName)
  .Cast<CmsPageDto>();

foreach (var page in pages) {
  Console.WriteLine(page.Id + " " + page.PageName);
}
The dto looks like this:-
public class CmsPageDto {
  public int Id { get; set; }
  public string PageName { get; set; }
}
One thing that I could not fathom out was how to change the column name from PageId to just Id. After downloading the Simple.Data source code and looking at the tests, I worked out that I needed to use .As("NewColumnName")

and voila...


The generated SQL looks like this:-
select
  cmspage.PageId AS Id,
  cmspage.PageName 
from
  cmspage
A sweet spot is that it preserves the case of the table and column names, which is very important for MySQL.

Boy, Simple.Data really is just simple to use! I am going to use Simple.Data in my next project and hopefully be able to blog about it a bit more..

Thursday 22 September 2011

Mapping by Code - Part 1

Over the next few weeks I am going to put together a sample application showing how you would build a project using mapping by code. Why do this? Well, firstly there are not too many examples on the web yet about NHibernate 3.2 and mapping by code, and secondly I am running out of blog ideas :)

The domain I shall use will be based on events, that is, events that people pay to book on to. I have vastly simplified the domain so that it looks like this:-


Event

This table holds the events themselves; a typical event would be "Developers Forum" or "How to get ahead in business" etc. An event has a name and a start and finish time.

Event Package

This table holds the different packages that a visitor can book. For example a package could be:-
"Gold" - you get a better seat and a cup of tea
"Silver" - you get a seat but no tea
"Bronze" - you are forced to stand

Event Visitor

As you can see this table holds the people who are attending. A visitor can book more than one ticket, so I have decided to use a hierarchical structure; that is, the person who books has a ParentId of NULL and any additional people he/she books on get that person's Id in their ParentId.

Event Payment

This table just keeps track of who has paid. Notice that I have denormalised this table a bit by creating a NoTicketsSold column.

Please note that this domain is for demonstration purposes as I just wanted to show how hierarchical structures can be used in NHibernate.

The next post will cover how we create a C# project to map this domain.

Thursday 15 September 2011

Queryover, eager loading parent and children with a filter

Sometimes we want to use eager loading as:-

1. The dataset returned is small
2. We don't care about the cartesian product.
3. We want to load the parents
4. We want to load the children
5. We want to filter the results based on a where clause on the children

So how do we go about this? The domain model for this example looks like this:-
public class Movie : Entity
{
  public virtual string Director { get; set; }
  public virtual IList<Actor> ActorList { get; set; }
}

public class Actor : Entity
{
  public virtual string Name { get; set; }
  public virtual string Role { get; set; }
}
When using NHibernate and QueryOver we would specify eager loading by using:-
Actor actor = null;

return Session.QueryOver<Movie>()
  .Fetch(f => f.ActorList).Eager
  .Left.JoinAlias(f => f.ActorList, () => actor)
  .Where(() => actor.Name.Like("m%"))
  .List();
Lets look at the SQL output that gets generated:-

Excellent, this is exactly what I require: one SQL statement is sent to the database and both parents and children are hydrated at the same time.

However, there is one caveat here: look closely at my C# code and you will see that I am specifying a left join.
Now for me this is where some confusion crops up. If I take the .Left out of my code so that it is:-
return Session.QueryOver<Movie>()
  .Fetch(f => f.ActorList).Eager
  .JoinAlias(f => f.ActorList, () => actor)
  .Where(() => actor.Name.Like("m%"))
  .List();
and rerun my example, the SQL that gets generated is:-
As you can see we have reverted back to lazy loading. This is quite strange and I am not 100% sure if this is a bug or the intended behaviour.

So it looks like if you want to take advantage of eager loading with QueryOver then you always need to specify a .Left join in your code!

Monday 12 September 2011

NHibernate QueryOver and NewId or Rand

Returning a random record using NHibernate is quite simple to achieve. If you do not have the whole collection in memory then you ought to consider using NEWID() (for MSSQL) or RAND() (for MySQL).

The following MSSQL query will return 5 random Testimonials:-
select TOP 5 * from Testimonials order by newid()
So how would you do this using NHibernate? The following ICriteria example shows how you would register the MySQL function RAND(). If you are using MSSQL then you just need to replace RAND with NEWID.
public class RandomOrder : Order {
  public RandomOrder() : base("", true) { }
  public override SqlString ToSqlString(
      ICriteria criteria, ICriteriaQuery criteriaQuery) {
    return new SqlString("RAND()");
  }
}
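For reference, if you are targeting MSSQL then the only line that changes in the class above is the returned SqlString:-
return new SqlString("NEWID()");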
The following code, based on ICriteria, returns a random list of Testimonials:-
public IList<CmsTestimonial> GetRandomTestimonials(int count) {
  ICriteria criteria = Session
    .CreateCriteria(typeof(CmsTestimonial))
    .AddOrder(new RandomOrder())
    .SetMaxResults(count);
    return criteria.List<CmsTestimonial>();
}
The following code returns a single random Testimonial:-
public CmsTestimonial GetRandomTestimonial() {
  ICriteria criteria = Session
    .CreateCriteria(typeof(CmsTestimonial))
    .AddOrder(new RandomOrder())
    .SetMaxResults(1);
    return criteria.UniqueResult<CmsTestimonial>();
}

Can I do this using QueryOver?

Ok all good so far but what about if we want to use QueryOver rather than ICriteria?

Make sure you keep the RandomOrder class and then create this extension method:-
public static class NHibernateExtensions {
  public static IQueryOver<TRoot, TSubType> 
      OrderByRandom<TRoot, TSubType>(
        this IQueryOver<TRoot, TSubType> query) {
    query.UnderlyingCriteria.AddOrder(new RandomOrder());
    return query;
  }
}
and change your ICriteria query to the following QueryOver query
public IList<CmsTestimonial> GetRandomTestimonials(int count) {
  return Session
    .QueryOver<CmsTestimonial>()
    .OrderByRandom()
    .Take(count)
    .List();
}
NHibernate has so many extensibility points and really makes life easy for us developers! This is a real sweet spot.

Wednesday 7 September 2011

RequireHttps attribute in MVC3

I came across this gem of an attribute when looking to secure one of my controllers in an ASP.NET MVC 3 project.
[RequireHttps(Order = 1)]
public class PayController : Controller {
   ...
}
This will automatically redirect any URL:-

from: http://www.mydomain.com/pay/membership

to: https://www.mydomain.com/pay/membership

However this comes with one small glitch: after the user has finished with the pay controller and then navigates to, say, the home controller, the browser still shows https:// in the address bar. So is there an elegant solution to make all your other controllers HTTP only?

One way is to create a base controller that overrides the OnAuthorization method:-
public class BaseController : Controller {
  protected override void OnAuthorization(
        AuthorizationContext filterContext) {

    //Only check if we are already on a secure connection and 
    // we don't have a [RequireHttpsAttribute] defined
    if (Request.IsSecureConnection) {
      var requireHttps = filterContext.ActionDescriptor
                  .GetCustomAttributes(
                     typeof(RequireHttpsAttribute), false)
                  .Count() >= 1;

      //If we don't need SSL and we are not on a child action
      if (!requireHttps && !filterContext.IsChildAction) {
        var uriBuilder = new UriBuilder(Request.Url) {
                                 Scheme = "http", 
                                 Port = 80
                                };
        filterContext.Result = 
             this.Redirect(uriBuilder.Uri.AbsoluteUri);
      }
    }
    base.OnAuthorization(filterContext);
  }
}
And then make all our non-secure controllers inherit from this base controller:
public class CmsHomeController : BaseController {
  ...
}
Lovely!

Tuesday 6 September 2011

Update NHibernate 3.2 and Medium Trust Environments

NOTE: This is an update from a previous post - Nhibernate 3.2 and Medium Trust

Another NOTE: NHibernate 3.3 does not have this problem. See this post for more info.
A few weeks ago I asked the community if somebody could help me figure out how to get NHibernate working in a Medium Trust environment. Well, a person by the name of Randy Lee (@ranzlee) came to the rescue. Basically Randy downloaded the NHibernate source and got to work. He used PEVerify, a tool on MSDN that:-
The PEVerify tool helps developers who generate Microsoft intermediate language (MSIL) (such as compiler writers, script engine developers, and so on) to determine whether their MSIL code and associated metadata meet type safety requirements. Some compilers generate verifiably type-safe code only if you avoid using certain language constructs. If, as a developer, you are using such a compiler, you may want to verify that you have not compromised the type safety of your code. In this situation, you can run the PEVerify tool on your files to check the MSIL and metadata.
Thankfully the results of PEVerify showed one error:
[IL]: Error: [c:\Users\Randy\generatedassembly.dll : CatProxy::.ctor][mdToken=0x6000001][offset 0x00000001][found ref ('this' ptr) 'CatProxy'] Call to .ctor only allowed to initialize this pointer from within a .ctor. Try newobj.
Randy figured out that he needed to fix the proxy factory to emit a call to the "real" base type constructor, but this would have the side effect of preventing constructor DI in entities. He then added a new provider and contract (DefaultEntityInjector and IEntityInjector) so that DI is supported out of the box; the implementation simply needs to provide the constructor arguments for NHibernate's call to Activator.CreateInstance.

You can download Randy's version of NHibernate.DependencyInjection from Nuget.

Basically all you need to do to get up and running is to initialise the bytecode provider...
protected void Application_Start() {
  Initializer.RegisterBytecodeProvider();
  // ... rest of configuration code here
}
Also please note that if you think NHibernate 3.2 should support Medium Trust environments out of the box then please visit the NHibernate Jira and vote the issue up; the more votes, the more chance we have of getting this bug fixed.

Thanks

Monday 5 September 2011

Upcoming NHibernate presentations

Following on from my successful presentations at Birmingham, Manchester and Southampton, I can now confirm that I am presenting in two other regions.

Firstly I will be visiting NxtGenUg at Oxford on September 20th.

Secondly I will be visiting GL.NET at Gloucester on the 5th October.

The presentation covers:-

PART 1: Models and mappings - Shows how to model components, inheritance and collections using hbm XML mappings

PART 2: Mapping by code - Shows how to model the same model from part 1 using the sexy new mapping by code syntax introduced in NHibernate 3.2. A simple console application shows how to retrieve some data.

PART 3: Sessions and transactions - Shows how NHibernate can be configured in an MVC 3 application, how a unit of work can be implemented and how we can use NHProf.

PART 4: Queries - Shows how we break up an MVC 3 application into layers comprising a Domain.Model, a service layer, a repository layer and the MVC 3 layer. The application uses Unity to achieve a loosely coupled design. NHProf is used extensively to show how we can optimise the queries. This part also covers eager loading versus lazy loading and how you should be aware of unwanted side effects.

So if anyone is interested in seeing what NHibernate 3.2 is all about, including the sexy mapping by code, then please book a place. See you there.

Monday 29 August 2011

Projecting a discriminator column into a DTO

To project the discriminator value into a DTO is quite a simple thing to achieve using QueryOver in NHibernate.

I have the following table, for which I am using the table-per-class-hierarchy strategy. The discriminator column is a string and contains values such as 'Movie', 'Book' etc.

For the following mapping code:-
public class ProductMapping : EntityMapping<Product> {
  public ProductMapping() {
    Property(x => x.Name);
    Property(x => x.Description);
    Discriminator(x => x.Column("ProductType"));
  }
}
Note: I am not exposing the discriminator as a separate property; we really should let NHibernate handle this for us.

The DTO looks like this:-
public class ProductDto {
  public string ProductType { get; set; }
  public string Name { get; set; }
}
Now we need to build the query. Notice we are using .GetType() to return the underlying discriminator value and project it into a string, for example:-
ProductDto dto = null;

return Session
  .QueryOver<Product>()
  .SelectList(i => i  
    .Select(p => p.Name).WithAlias(() => dto.Name)  
    .Select(p => p.GetType()).WithAlias(() => dto.ProductType)
  )  
  .TransformUsing(Transformers.AliasToBean<ProductDto>())  
  .List<ProductDto>();  
The above code returns:-


Is it possible to do the same using Linq? The answer of course is yes; however we also need to add .GetType().Name, which gets the name of the underlying type:-
return (from product in Session.Query<Product>()
  select new ProductDto {
    Name = product.Name,
    ProductType = product.GetType().Name
}).ToList();  
You should be aware though that the Linq version will return the underlying class name and not the discriminator value. This means that your class name needs to be the same as your discriminator value. QueryOver, however, does NOT have the same limitation.

Wednesday 24 August 2011

NHibernate and optimising with lazy='extra'

I came across a Stack Overflow post today that asked how you could retrieve the count of children from a parent without having to load the entire collection.

The answer is quite simple: all you need to do is add lazy='extra' to your mappings. Let's put this to the test, for the following domain:-
public class Movie : Entity {
  public virtual string Name { get; set; }
  public virtual string Director { get; set; }
  public virtual IList<Actor> ActorList { get; set; }
}

public class Actor : Entity {
  public virtual string Name { get; set; }
  public virtual string Role { get; set; }
}
The only mapping of importance is the mapping for the movie:-
public class MovieMapping : SubclassMapping<Movie> {
  public MovieMapping() {
    Property(x => x.Name, x => x.NotNullable(true));
    Property(x => x.Director, x => x.NotNullable(true));
    Bag(x => x.ActorList, bag => {
      bag.Key(k => { 
         k.Column(col => col.Name("MovieId"));
         k.NotNullable(true); 
      });
      bag.Cascade(Cascade.All | Cascade.DeleteOrphans);
      bag.BatchSize(10);
    }, action => action.OneToMany());
  }
}
We now have the following code that retrieves a Movie and then the count of all actors:-
var movie = Session.Query<Movie>()
              .Where(w => w.Id == id).SingleOrDefault();
var actorCount = movie.ActorList.Count();
This actually will send two queries to the database:-
-- statement #1
select movie0_.Id          as Id1_,
       movie0_.Name        as Name1_,
       movie0_.Director    as Director1_
from   Movie movie0_
where  movie0_.Id = 'ffffffff-eeee-dddd-cccc-000000000005' /* @p0 */

-- statement #2
SELECT actorlist0_.MovieId as MovieId1_,
       actorlist0_.Id      as Id1_,
       actorlist0_.Id      as Id2_0_,
       actorlist0_.Name    as Name2_0_,
       actorlist0_.Role    as Role2_0_
FROM   ActorRole actorlist0_
WHERE  actorlist0_.MovieId = 'ffffffff-eeee-dddd-cccc-000000000005' /* @p0 */
As you can see this is NOT the result we would expect. The first SQL query retrieves the Movie, and the second SQL query performs a select * from ActorRole. This means that the second query selects the whole collection and performs the count in memory.

Is there an easy solution we can use to instruct NHibernate to issue a select count(*) rather than a select *? If you read the title of this blog and are still awake, you are probably screaming out lazy='extra'. So how do we use this?

If you are using XML mappings then you would add lazy="extra" to the bag, along the lines of:-
<bag name="ActorList" lazy="extra" cascade="all-delete-orphan" batch-size="10">
  <key column="MovieId" not-null="true" />
  <one-to-many class="Actor" />
</bag>
If you are using the new mapping by code then you can achieve the same by adding:-
Bag(x => x.ActorList, bag => {
  bag.Key(k => { 
    k.Column(col => col.Name("MovieId"));
    k.NotNullable(true); 
  });
  bag.Cascade(Cascade.All | Cascade.DeleteOrphans);
  bag.BatchSize(10);
  bag.Lazy(CollectionLazy.Extra);
}, action => action.OneToMany());
Now let's look at the SQL that gets sent to the database:-
-- statement #1
select movie0_.Id          as Id1_,
       movie0_.Name        as Name1_,
       movie0_.Director    as Director1_
from   Movie movie0_
where  movie0_.Id = 'ffffffff-eeee-dddd-cccc-000000000005' /* @p0 */

-- statement #2
SELECT count(Id)
FROM   ActorRole
WHERE  MovieId = 'ffffffff-eeee-dddd-cccc-000000000005' /* @p0 */
It should also be noted that it does not matter whether you are using HQL, Linq (Query), QueryOver or ICriteria to query your data; the outcome is exactly the same. Don't you just love being able to tweak the mappings to optimise your code?

Sunday 21 August 2011

NHibernate 3.2 nxtgenug presentation at Southampton

Just a quick blog post about my recent presentation at the Southampton NxtGenUG meeting, which took place on Thursday 18th August.

The presentation was broken up into 4 parts:-

Part 1

Models and mappings - Shows how to model components, inheritance and collections using hbm XML mappings

Part 2

Mapping by code - Shows how to model the same model from part 1 using the sexy new mapping by code syntax introduced in NHibernate 3.2. A simple console application showed how to retrieve some data.

Part 3

Sessions and transactions - Shows how NHibernate can be configured in an MVC 3 application, how a unit of work can be implemented and how we can use NHProf.

Part 4

Queries - Shows how we break up an MVC 3 application into layers comprising a Domain.Model, a service layer, a repository layer and the MVC 3 layer. The application uses Unity to achieve a loosely coupled design. NHProf is used extensively to show how we can optimise the queries. This part also covers eager loading versus lazy loading and how you should be aware of unwanted side effects.

Click here to download the complete presentation, the RAR file contains:-
  • DB structure
  • DB data
  • Presentation notes
  • All four VS2010 SLN

UPDATE : Newer demo and slides can be found here.

My next presentation is at Oxford on September 13th.

If you would like me to present NHibernate to your user group in the U.K. (or South Africa anytime in February 2012) then please contact me via twitter @rippo or email me richard AT wildesoft.net.

Wednesday 17 August 2011

Linq into a DTO

The QueryOver API has been in NHibernate since 3.0. It combines the use of extension methods and lambda expressions to provide a statically typesafe wrapper round the ICriteria API. This basically means that we lose all the 'magic strings' that the ICriteria API gave us. Another bonus is that refactoring tools like 'Find All References', and 'Refactor->Rename' work perfectly.

So QueryOver sounds like a dream querying tool for NHibernate. The following code shows how we would project a few columns from a table into a DTO:-
MarkerDto dto = null;
var schoolList = Session.QueryOver<LmSchool>()
         .SelectList(i => i
         .Select(p => p.Name).WithAlias(() => dto.Name)
         .Select(p => p.Lat).WithAlias(() => dto.Lat)
         .Select(p => p.Lng).WithAlias(() => dto.Lng)
       )
       .Where(w => w.Lat != null && w.Lng != null)
       .TransformUsing(Transformers.AliasToBean<MarkerDto>())
       .List<MarkerDto>();
However, is this as intuitive as you would like? Does it roll out of your head? There is another way to perform the same query and that is to use the built-in Linq provider. I shied away from the Linq provider in both version 3.0 and 3.1. However, since the release of NH3.2 there have been many improvements; sure, there is still a lot you can't do and you get the dreaded 'Not supported' exception, but it is coming of age. To rewrite the above QueryOver code with Linq we can do this:-
var schoolList = (from school in Session.Query<LmSchool>()
                    .Where(w => w.Lat != null && w.Lng != null)
                  select new MarkerDto {
                    Name = school.Name,
                    Lat = school.Lat,
                    Lng = school.Lng,
                  }).ToList();
Isn't this just a bit easier to understand and to be honest I think this is something that is easily remembered.

In both cases the following SQL is generated:-
select s.name, s.lat, s.lng from school s
where (s.lat is not null) and (s.lng is not null) 
Simple!

Monday 15 August 2011

Getting orders and the count of orderlines using Linq

When using NHibernate we sometimes need to pick the right tool from the toolbox. One example of this is when you have the following problem.

I need to get a list of orders and the count of order items against each order.

Now me, being a bit savvy with QueryOver, would sit down and write a complex piece of code. This kind of code doesn't roll off the top of your head and you need to fit a few pieces together to get it right.

Another option is to use NHibernate's built-in Linq provider. In my opinion, before NH3.1 the Linq provider was just not quite up to the job and people shied away from it. However, the NHibernate team has done a lot of work on it and it is now maturing into a useful tool.

So back to the solution to our problem, simply first create a DTO:-
public class OrderDto
{
    public long Id { get; set; }
    public DateTime OrderDate { get; set; }
    public int LineCount { get; set; }
}
I would always try to project into a DTO rather than using filters or formulas, as I try to keep my mappings as clean as possible. This gives me better flexibility and does not back me into a corner.

The linq query is very easy:-
return Session.Query<Order>()
        .Select(o => new OrderDto
        {
            Id = o.Id,
            OrderDate = o.DateOfOrder,
            LineCount = o.OrderLines.Count
        })
    .ToList();
This would produce the following SQL:-
select o.Id, o.OrderDate,
       (select cast(count(*) as INT) from  OrderLines
          where  Order.Id = OrderLines.orderId) as LineCount
from   Order o

I am going to look more closely into the built in NHibernate Linq provider as it can produce some easy and beautiful code.

Thursday 11 August 2011

Queryover into a DTO

Sometimes I get stuck with some of the syntax in QueryOver and how to go about projecting into a DTO. The code below simply goes to the database and pulls back all the schools that have a Lat and a Lng. However, I am only interested in three columns; my [school] table contains around 15 columns, including a LONG TEXT column. If I simply call select * from school then a lot of redundant data that is not actually needed will be sent down the wire.

So I first wrote the following code snippet, which pulls back all the columns from the database. I then iterate through the list and select only a small subset of the data, e.g.:-
var schoolList = Service.Lm.LmSchoolDao.QueryOver()
		.Where(w => w.Lat != null && w.Lng != null)
		.List<LmSchool>();

var xml = new XElement("markers",
  from s in schoolList
    select new XElement("marker",
      new XAttribute("name", s.Name),
      new XAttribute("lat", s.Lat.ToString()),
      new XAttribute("lng", s.Lng.ToString())
));
The above code works great, but I get that nagging feeling that I need to optimise it a bit. Therefore I think it would be a good idea to select just a few columns from the table and project the result into a list. Using QueryOver I would write something along the lines of:-
var schoolList = Service.Lm.LmSchoolDao.QueryOver()
        .SelectList(i => i
          .Select(p => p.Name)
          .Select(p => p.Lat)
          .Select(p => p.Lng)
        )
        .Where(w => w.Lat != null && w.Lng != null)
        .TransformUsing(Transformers.AliasToBean<MarkerDto>())
        .List<MarkerDto>();

public class MarkerDto
{
  public string Name { get; set; }
  public decimal? Lat { get; set; }
  public decimal? Lng { get; set; }
}
The above compiles fine, it returns the correct data and columns from the database, and I can see that my schoolList has the correct number of rows. So far so good.

However, it has one major flaw: each property on the schoolList is NULL. You sit back, scratch your head and realise that you have not told NHibernate how to map the returned columns into the correct properties. The missing piece of the jigsaw is the .WithAlias() QueryOverProjectionBuilder. Armed with this knowledge you tweak your code as follows:-
MarkerDto dto = null;
var schoolList = Service.Lm.LmSchoolDao.QueryOver()
         .SelectList(i => i
         .Select(p => p.Name).WithAlias(() => dto.Name)
         .Select(p => p.Lat).WithAlias(() => dto.Lat)
         .Select(p => p.Lng).WithAlias(() => dto.Lng)
       )
       .Where(w => w.Lat != null && w.Lng != null)
       .TransformUsing(Transformers.AliasToBean<MarkerDto>())
       .List<MarkerDto>();
Sometimes you have just got to sit back and think. However, I still personally prefer the QueryOver syntax over Linq. In fact I have no idea if the NHibernate Linq provider can do this; maybe that would make a good blog post for another day.

Wednesday 10 August 2011

Gary Short does node.js

A big thank you to Gary Short as he drove a whopping 410 miles from Dundee to Hereford NxtGenUG to talk about node.js. Respect!

The presentation started with Gary taking it easy with us:-

  • He first explained what node.js was and how to install it on Ubuntu via the package manager apt-get 
  • We then went on to see how to run node.js via a putty client and saw a "hello world" example

At this point I could not work out why I would use node.js over using .net.

After pizza (yes, we always serve pizza) Gary went on to show us how node.js can monitor lots of Twitter feeds and store the results in CouchDB. From here he showed us how node.js can send back Json via an HTTP request. A simple webforms (!) project then pulled back this Json and produced charting.

The reason Gary chose Ubuntu was that the Twitter servers are up and down like yo-yos, and utilising Ubuntu allowed him to take advantage of Upstart and respawn.

The benefits of using node.js are:-
  1. It can run on a low-spec machine as it runs on Linux (Ubuntu)
  2. It is totally non-blocking and does not deadlock
  3. It only uses a small heap allocation as it sleeps until a connection is made
  4. It uses fewer compute cycles as it employs a callback strategy, that is, it sleeps until the callback is made
  5. It can handle heavy loads of HTTP traffic
  6. It is therefore ideal for the cloud where you pay for compute cycles

All in all the presentation was a real eye opener and I for one would recommend it to anyone. I think he has proposed a more advanced session for DDD North.

If you would like to see any future sessions at Hereford then please click here.

Monday 8 August 2011

NHibernate 3.2 and medium trust environments

Note: There is a new blog post based on this one, however you should read this one first to get an idea of the problems.
I am trying out NHibernate in a medium trust environment using the built in DefaultProxy. However I get the following error:-
Operation could destabilize the runtime. 

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. 

Source File: ProxyFactory.cs Line: 51 
Stack Trace: 
 [VerificationException: Operation could destabilize the runtime.] CmsContentProxy..ctor() +27 
 [TargetInvocationException: Exception has been thrown by the target of an invocation.] 
   System.RuntimeTypeHandle.CreateInstance(RuntimeType type, Boolean publicOnly, Boolean noCheck, Boolean& canBeCached, 
   RuntimeMethodHandleInternal& ctor, Boolean& bNeedSecurityCheck) +0
   System.RuntimeType.CreateInstanceSlow(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache) +98 
   System.RuntimeType.CreateInstanceDefaultCtor(Boolean publicOnly, Boolean skipVisibilityChecks, Boolean skipCheckThis, Boolean fillCache) +241 
   System.Activator.CreateInstance(Type type, Boolean nonPublic) +69 
   System.Activator.CreateInstance(Type type) +6 
   NHibernate.Proxy.DynamicProxy.ProxyFactory.CreateProxy(Type instanceType, IInterceptor interceptor, Type[] baseInterfaces) in ProxyFactory.cs:51 
   NHibernate.Proxy.DefaultProxyFactory.GetProxy(Object id, ISessionImplementor session) in DefaultProxyFactory.cs:20
Line 51 within the NHibernate sources:-
/Proxy/DynamicProxy/ProxyFactory.cs line 51 
  object result = Activator.CreateInstance(proxyType); 
Things I have tried:-
1. Castle Proxy - they have created a proxy for 3.1 but not for 3.2. I have looked at the NHibernate sources for both 3.1 and 3.2, and the interface to implement has changed.
2. Googled how to work around this problem but to no avail
3. Asked the question on NHusers user group.

I could wait to see if the Castle/LinFu/Spring teams release a proxy that is compatible with version 3.2, however it would be nice to identify the problem within NHibernate itself so that I don't need a dependency on a third-party proxy.

If anyone is interested in helping me, I have provided a policy.config file; all you need to do is place it in your web root and put this in your web.config:-
<securityPolicy> 
  <trustLevel name="Custom" policyFile="policy.config" /> 
</securityPolicy> 
<trust level="Custom" originUrl="" /> 
I really would love to get this NH3.2 running in Medium trust as I love the new mapping by code syntax, any ideas, pointers, workarounds will be gratefully received.

What's next:-
I can see that Spring have now updated their framework to be compatible with NH3.2 GA; I will try this next...

Wednesday 3 August 2011

Gary Short and node.js

NxtGenUG are proud to announce that Gary Short is visiting Hereford on Monday 8th August 2011 to talk about Node.js. It seems that Node.js is gaining momentum in the developer community.

Gary's presentation will cover a quick introduction to Node.js before going on to de-construct a network appliance application written in Node.js. By the end of the session you will have a better appreciation for this language which is sure to become more important in the coming years.

According to WIKI:-
Node.js is an event-driven I/O server-side JavaScript (built on V8 JavaScript engine) environment for Unix-like platforms. It is intended for writing scalable network programs such as web servers. It was created by Ryan Dahl in 2009, and its growth is sponsored by Joyent, which employs Dahl. Node.js is similar in purpose to Twisted for Python, Perl Object Environment for Perl, libevent for C and EventMachine for Ruby. Unlike most JavaScript, it is not executed in a web browser, but is instead a form of server-side JavaScript. Node.js includes a REPL environment for interactive testing.

HTTP Server version of hello world in Node.js:
var http = require('http');
 
http.createServer(function (request, response) {
    response.writeHead(200, {'Content-Type': 'text/plain'});
    response.end('Hello World\n');
}).listen(8000);
If you would like to find out more then please register on the NxtGenUG site. The event is FREE for non-members; details of how to find us are:-

Shire Hall
Union Street
Hereford
HR1 2HX

Monday, August 08, 2011, 7:00 PM - 9:00 PM

We hope to see you there.

Monday 1 August 2011

Return parents based on a child filter with QueryOver

Sometimes an ORM doesn't quite do what you think it should. One classic example of this is when we need to get a parent entity based on a where condition on the child.

Given the following domain:-
public class Movie : Entity {
  public virtual string Name { get; set; }
  public virtual string Description { get; set; }
  public virtual IList<Actor> ActorList { get; set; }
}
public class Actor : Entity {
  public virtual string Name { get; set; }
  public virtual string Role { get; set; }
}
We have a Movie (parent) that has a collection of Actor (children). Now comes the challenge: we want to get a list of movies where an actor's name starts with the letter m. At first glance this is fairly trivial and we would sit down and construct the following query:-
return Session
  .QueryOver<Movie>()
  .Inner.JoinQueryOver<Actor>(a => a.ActorList)
  .Where(a => a.Name.IsLike("m%"))
  .List();
What do you think will get returned here?

Lets take one step back and look at the sql that gets generated:-
SELECT this_.Id          as Id0_1_,
       this_.Name        as Name0_1_,
       this_.Description as Descript4_0_1_,
       actor1_.Id        as Id2_0_,
       actor1_.Name      as Name2_0_,
       actor1_.Role      as Role2_0_
FROM   Movie this_
       inner join ActorRole actor1_
         on this_.Id = actor1_.MovieId
WHERE  actor1_.Name like 'm%' /* @p0 */
This then returns the following rows:-


As you can see, 4 rows are returned but the same movie appears twice, as there are two actors whose names start with the letter M in The Dark Knight. You would probably expect NHibernate to hydrate 3 parent entities but in actual fact it hydrates 4. After a bit of head scratching you realise NHibernate is doing what it should: it is taking the 4 rows from the database and hydrating the results. The problem now is that your UI would display 4 results rather than 3.


So how do we instruct NHibernate to filter duplicates from the parent? One very simple approach is to specify a ResultTransformer:-
return Session
  .QueryOver<Movie>()
  .Inner.JoinQueryOver<Actor>(a => a.ActorList)
  .Where(a => a.Name.IsLike("m%"))
  .TransformUsing(CriteriaSpecification.DistinctRootEntity)
  .List();
Now there is one important thing to note here. This only performs the filter on the client side: you still get 4 rows returned from the database, BUT only distinct parents are hydrated. Depending on a number of factors, for example if you don't require paging and the children collections are small, this might be suitable.

So is there another solution to this problem? Can we get only three parents returned from NHibernate? Maybe we want to page the results and don't like the idea of transmitting a lot of repeated data down the wire. Well, there is a solution: we need to introduce a subquery. If we were to sit down and write this in SQL we might come up with something like:-
SELECT this_.Id          as Id0_0_,
       this_.Name        as Name0_0_,
       this_.Description as Descript4_0_0_
FROM   Movie this_
WHERE  this_.Id in (SELECT distinct this_0_.Id as y0_
                        FROM   Product this_0_
                               inner join ActorRole actor1_
                                 on this_0_.Id = actor1_.MovieId
                        WHERE  actor1_.Name like 'm%' /* @p0 */)
OK, this is not the most optimal SQL but you get the picture; we want to return distinct movies that have at least one actor whose name begins with the letter m. The corresponding QueryOver code would look like this:-
var moviesWithActor = QueryOver.Of<Movie>()
  .JoinQueryOver<Actor>(m => m.ActorList)
  .Where(a => a.Name.IsLike("m%"))
  .Select(Projections.Distinct(Projections.Property<Movie>(m => m.Id)));

return Session
  .QueryOver<Movie>()
  .WithSubquery
  .WhereProperty(m => m.Id)
  .In(moviesWithActor)
.List();
This now sends only 3 rows down the wire and we require no client-side transformations (filtering):-

Is this the final solution? In reality the optimum SQL might look something like this:-
SELECT distinct
  m.Id,
  m.Name,
  m.Description,
  m.UnitPrice
FROM
  Movie m
inner join 
  ActorRole a on m.Id = a.MovieId
WHERE a.Name like 'm%'
However I will leave this to another day!

Wednesday 27 July 2011

Mixed mappings with HBM and mapping by code

Are you ready to embrace and upgrade to NHibernate 3.2 but unsure about converting all your project's hbm.xml files to the new mapping by code syntax?

It is possible to mix and match; that is, use your current project's XML files and start to use mapping by code at the same time. So if you find yourself in this position:-
I am ready to start using NHibernate 3.2 in an old project and utilise all the goodness that the mapping by code syntax gives me, but I am not quite ready to move over all my xml files yet.
I realise that one could argue about testing; that is, if all my tests cover the mappings then the conversion should not cause friction. However, some of us are left with named queries, right?

The solution is quite simple, first you need to define your web.config (or external hibernate config file) as:-
<configSections>
  <section name="hibernate-configuration"
   type="NHibernate.Cfg.ConfigurationSectionHandler, NHibernate"     requirePermission="false" />
</configSections>

<hibernate-configuration
   xmlns="urn:nhibernate-configuration-2.2">
  <session-factory>
    <property name="dialect">
      NHibernate.Dialect.MySQL5Dialect
    </property>
    <mapping assembly="Domain.Model" />
  </session-factory>
</hibernate-configuration>
Then in your start up code:-
var mapper = new ModelMapper();
mapper.AddMappings(typeof(CmsMeta).Assembly.GetTypes());
//Notice the .Configure, this is the magic that allows you to
//  use the mixed mappings
var configure = new Configuration().Configure();
configure.DataBaseIntegration(x =>
{
  x.Dialect<MySQL5Dialect>();
  x.ConnectionStringName = "db";
}).CurrentSessionContext<WebSessionContext>();

var mapping = mapper.CompileMappingForAllExplicitlyAddedEntities();
configure.AddDeserializedMapping(mapping, "Domain");
SessionFactory = configure.BuildSessionFactory();
If you notice, we are making a call to .Configure(). Because of this call, NHibernate always needs a hibernate-configuration section inside your web.config. We are therefore instructing NHibernate to configure the mappings as usual from web.config and then to add the mappings defined with the new mapping by code syntax (which is based on loquacious).

After you have converted your XML you can remove all references in web.config and change:-
var configure = new Configuration().Configure();
to just:-
var configure = new Configuration();

I personally have used this technique as I still have a few named queries left over in my XML files and have not yet moved them out, therefore this mixed mapping has really helped me out.

You have got to love the simplicity of being able to mix old and new; well done to the NHibernate team for taking away some pain points.

Monday 25 July 2011

View the xml generated when mapping by code

When using mapping by code it may not be 100% obvious what XML is being generated for NHibernate.

You have a couple of options, the first option is to tell NHibernate to write all your mappings into the bin folder:-
var mapper = new ModelMapper();
mapper.AddMappings(typeof(CmsMeta).Assembly.GetTypes());

//This will write all the XML into the bin/mappings folder
mapper.CompileMappingForEachExplicitlyAddedEntity().WriteAllXmlMapping();

The second option is to use the extension method .AsString(). However you will be presented with one large XML blob.
var mapper = new ModelMapper();
mapper.AddMappings(typeof(CmsMeta).Assembly.GetTypes());
var mapping = mapper.CompileMappingForAllExplicitlyAddedEntities();
//you could add a breakpoint here! 
var mappingXml = mapping.AsString();
Either way you will be able to quickly and easily view the XML that is being injected into NHibernate. This is very useful for debugging or if you are upgrading all your hbm.xml files into the new mapping by code syntax and wish to compare like for like.

Wednesday 20 July 2011

Mapping conventions using mapping by code

I asked the following question on the NHUsers google group; "I am playing with mapping by code and started wondering about conventions."

According to WIKI:-
Conventions may be formalized in a documented set of rules that an entire team or company follows, or may be as informal as the habitual coding practices of an individual.
All of my mapping files share the same common functionality:-

1. All Bags have bag.Cascade(Cascade.All | Cascade.DeleteOrphans) and bag.BatchSize(10)
2. All table names are lowercase Table("organisation");
3. All bags have the action => action.OneToMany()

Normally in NHibernate 3.2 I would define all my mapping classes using this type of syntax:-
internal class OrganisationMapping : ClassMapping<Organisation> {
  public OrganisationMapping() {
    Id(x => x.Id, map => map.Generator(Generators.GuidComb));
    Property(x => x.Name, x => x.NotNullable(true));
    Table("organisation");

    Bag(x => x.IncomeTypeList, bag => {
      bag.Key(k => k.Column(col => col.Name("OrganisationId")));
      bag.Cascade(Cascade.All | Cascade.DeleteOrphans);
      bag.BatchSize(20);
    }, action => action.OneToMany());
  }
}
The problem I have is that I have 20 or so of these class mappings to do and most of my bags share similar properties. This is what led me to think about conventions and a way to make my code a) compact, b) follow similar standards and c) allow me to map my classes quicker without having to think about what I had set before.

So how do we solve this problem? First I read this blog post from Fabio Maulo then I decided to have a go.

Below is a snippet of my configuration code:-
var mapper = new ModelMapper();
mapper.AddMappings(typeof(Organisation).Assembly.GetTypes());

var configure = new Configuration();
configure.DataBaseIntegration(x => {
  x.Dialect<MySQL5Dialect>();
  x.ConnectionStringName = "db";
}).CurrentSessionContext<WebSessionContext>();

var mapping = mapper.CompileMappingForAllExplicitlyAddedEntities();
configure.AddDeserializedMapping(mapping, "Domain");

SessionFactory = configure.BuildSessionFactory();
The first problem is how do I make all my bags have a batch size of 20 and set Cascade.All | Cascade.DeleteOrphans

This is really easy I just need to set a convention before my bag gets mapped:-
mapper.BeforeMapBag += (mi, t, map) => {
  map.BatchSize(20);
  map.Cascade(Cascade.All | Cascade.DeleteOrphans);
};
Note: this code gets inserted just after var mapper = new ModelMapper();

The second problem is how do I make all my table names lowercase. Again this convention is extremely easy to achieve:-
mapper.BeforeMapClass +=
  (mi, t, map) => map.Table(t.Name.ToLowerInvariant());
mapper.BeforeMapJoinedSubclass += 
  (mi, t, map) => map.Table(t.Name.ToLowerInvariant());
mapper.BeforeMapUnionSubclass += 
  (mi, t, map) => map.Table(t.Name.ToLowerInvariant());
Note: If you are using subclasses in your NHibernate project then you will also need to define the convention here as well.

The third problem is how do I remove the need to have action => action.OneToMany() defined on all my mappings. If you do not specify this action then your mapping files will contain:-
<bag name="IncomeTypeList" cascade="all,delete-orphan" batch-size="20">
  <key column="Id" />
  <element type="Domain.Model.Entities.IncomeType, Domain.Model" />
 </bag>
but what we really want is:-
<bag name="IncomeTypeList" cascade="all,delete-orphan" batch-size="20">
  <key column="Id" />
  <one-to-many class="IncomeType" />
 </bag>
This one is a little trickier to achieve. According to Fabio:-
We can’t define the one-to-many relation using the ModelMapper because the responsibility to define/discover all relation is delegated to the IModelInspector injected to the ModelMapper instance
So let's look at the solution. First we need to define a new ModelInspector that inherits from ExplicitlyDeclaredModel, then override the IsOneToMany method:-
public class MyModelInspector: ExplicitlyDeclaredModel
{
    public override bool IsOneToMany(MemberInfo member)
    {
        if(IsBag(member))
            return true;
        return base.IsOneToMany(member);
    }
}
To use our ModelInspector we need to slightly change the configuration code so we pass the ModelInspector into the ModelMapper:-
var modelInspector = new MyModelInspector();
var mapper = new ModelMapper(modelInspector);
and voila, all works as expected!

It should also be pointed out that all these conventions can be overridden by explicitly specifying your requirements on the individual classes themselves.

I have to say at first when I looked at this mapping by code syntax it looked scary and complex. However after a couple of hours playing with it and actually coding my application I have found it quite pleasing and fairly simple (with help from Fabio). I am now looking forward to mapping my domains using this syntax rather than the tried and trusted xml that I have used in the past.

If you haven't already looked at mapping by code (new in NHibernate 3.2) then I suggest you do, as it really is cool and, according to Fabio, sexy, but that is his personal taste!

Monday 18 July 2011

Lazy loading and the select n+1 problem

With NHibernate, lazy loading is switched on by default (in fact it has been since version 1.2). However, there are some things you need to understand: lazy loading only works as long as the session is open, and one of its side effects is that you may hit the select n+1 problem.

To see this problem in action, let's say you have a collection of Movies and each Movie has a collection of Actors. In other words, Movie to Actor is a one-to-many relationship.

Now, let's say we get the first 5 Movies and then, for each one, we need to show all of its actors.

The following SQL would be sent to the server:-
SELECT TOP 5 * FROM Movies;
SELECT * FROM Actor WHERE MovieId = 1;
SELECT * FROM Actor WHERE MovieId = 2;
SELECT * FROM Actor WHERE MovieId = 3;
SELECT * FROM Actor WHERE MovieId = 4;
SELECT * FROM Actor WHERE MovieId = 5;
In other words, you have one select for the Movies, and then N additional selects for actors. As you can see you would be forced to hit the database 6 separate times; this, in essence, is the select n+1 problem.
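The code that triggers this is nothing exotic; a minimal sketch (using the Movie/Actor domain shown below) is enough:-
// one SELECT for the first five movies...
var movies = Session.QueryOver<Movie>().Take(5).List();

foreach (var movie in movies)
{
  // ...and one further SELECT per movie the first time its lazy ActorList is touched
  foreach (var actor in movie.ActorList)
    Console.WriteLine("{0} played {1}", actor.Name, actor.Role);
}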

NHibernate overcomes this problem by allowing you to specify the batch-size in your mapping files. To see this in actual we are going to use the following domain:-
public class Movie : Entity
{
  public virtual string Director { get; set; }
  public virtual IList<Actor> ActorList { get; set; }
}

public class Actor : Entity
{
  public virtual string Name { get; set; }
  public virtual string Role { get; set; }
}
All you need to do is place the batch-size attribute on the bag definition:-
<bag name="ActorList" batch-size="5">
  <key column="MovieId" not-null="true" />
  <one-to-many class="Actor" />
</bag>
This now will produce the following SQL that only hits the database twice:-
SELECT TOP 5 * FROM Movies;
SELECT * FROM Actor WHERE MovieId IN (1,2,3,4,5);
If we set a batch-size of 4 we would hit the database three times:-
SELECT TOP 5 * FROM Movies;
SELECT * FROM Actor WHERE MovieId IN (1,2,3,4);
SELECT * FROM Actor WHERE MovieId=5;
This is a real sweet spot that NHibernate provides for you out of the box. I think it is a real testament to some of the power NHibernate gives you, allowing you to fine-tune your code with minimal fuss. Of course, for a simple model you could use eager loading, but for models with lots of properties (database columns) this might not be desirable. I for one prefer choice, and this reinforces the sweetness that comes with NHibernate.

Entity Framework (as of 4.2) cannot solve this problem out of the box. One approach is to use .Include("Actors"); however, this eager loads all of your entities and produces a cartesian product that can send a lot of duplicated data down the wire. I suspect, though, that the Microsoft team will tidy up lazy loading some time in the future.
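For reference, the Entity Framework call looks something like the sketch below (assuming a DbContext that exposes a Movies set with an Actors navigation property; both names are illustrative). It is issued as a single joined SELECT, which is exactly where the cartesian product comes from:-
// a single SQL statement, but every Movie row is repeated once per Actor
var movies = context.Movies
  .Include("Actors")
  .Take(5)
  .ToList();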

Friday 15 July 2011

Using eager loading can produce unexpected results

When using NHibernate for the first time you sometimes decide that you want to utilise eager loading for some of the following valid reasons:-
  1. You want to make only one trip to the database
  2. Your result set is only small, contains 10 or so rows and only a small number of columns
Therefore you have decided to embark on the eager loading journey; however, one of the side effects is that the generated SQL always returns a cartesian product.

According to Wikipedia, a Cartesian product (or product set) is the direct product of two sets.

The SQL snippet below shows quite clearly that duplicate movies will be returned for every actor in the table:-
SELECT * FROM
 Movie m
left outer join
 Actor a on m.Id = a.MovieId

So, now we are armed with this knowledge, how does NHibernate handle the cartesian product? I have the following model with a one-to-many join on actors. The domain model looks like this:-
public class Movie : Entity
{
  public virtual string Name { get; set; }   // used by the OrderBy in the query below
  public virtual string Director { get; set; }
  public virtual IList<Actor> ActorList { get; set; }
}

public class Actor : Entity
{
  public virtual string Name { get; set; }
  public virtual string Role { get; set; }
}
When using NHibernate and QueryOver we would specify eager loading by using:-
return Session
  .QueryOver<Movie>()
  .OrderBy(w => w.Name).Asc
  .Fetch(f => f.ActorList).Eager
  .List();
However, when this runs you can clearly see that NHibernate hydrates all movies (root entities) regardless of whether they are duplicated or not. So how do we stop this happening? One way is to tell NHibernate that we are only interested in hydrating distinct movies, e.g.:-
return Session
 .QueryOver<Movie>()
 .OrderBy(w => w.Name).Asc
 .Fetch(f => f.ActorList).Eager
 .TransformUsing(CriteriaSpecification.DistinctRootEntity)
 .List();
Notice that we are specifying a result transformer which tells NHibernate to hydrate each distinct root entity only once. Now we get the desired result.

It should be noted that the duplicates are removed (well, probably never added in the first place) in client code, but you should remember that the SQL itself still returns duplicate movie rows.

Also it should be noted that this only works for single joins. If you are joining two tables or more then you should read this article by Ayende.

Thursday 14 July 2011

QueryOver with multiple joins into a DTO

Consider the following NHibernate HQL. As you can see it is joining three tables and pulling back three columns from two different tables.

The [Organisation] table joins to [OrganisationIncomeType], which in turn joins to the [IncomeType] table.
hql = @"
 select 
  t.Name, ot.LocalRespend, ot.LocalRespendOutArea
 from
  organisation o 
 inner join
  organisationincometype ot on ot.OrganisationId = o.Id
 inner join
  incometype t on ot.IncomeTypeId = t.Id
 where 
  o.Id = :orgId and t.Id = :incomeTypeId;";
How would you do the following using the QueryOver syntax?

Simple when you know how:-
// the alias variables must be typed; the entity names below are assumed
// from the collection property names on the domain model
OrganisationIncomeType oit = null;
IncomeType it = null;

return Session.QueryOver<Organisation>()
  .Where(w => w.Id == orgId)
  .JoinAlias(r => r.OrganisationIncomeTypeList, () => oit)
  .JoinAlias(r => oit.IncomeTypeList, () => it)
  .Where(() => it.Id == incomeTypeId)
  .Select(
    Projections.Property(() => it.Name),
    Projections.Property(() => oit.LocalRespend),
    Projections.Property(() => oit.LocalRespendOutArea)
  )
  .TransformUsing(Transformers.AliasToBean<OrganisationIncomeTypeDto>())
  .SingleOrDefault<OrganisationIncomeTypeDto>();
The secret is that we are using JoinAlias with temporary local alias variables and then projecting through those aliases into our DTO. The beauty of the QueryOver syntax is that we lose all the magic strings that we would have with HQL or Criteria.
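The DTO itself is just a flat class whose properties line up with the projected columns; a hypothetical shape (names and types assumed) is shown below. If the transformer does not populate the object for you, remember that Transformers.AliasToBean matches on projection aliases, so you may need to project via SelectList(...).WithAlias(...) so that each column carries an alias matching a DTO property.
// hypothetical DTO - the property names are assumed to match the projection aliases
public class OrganisationIncomeTypeDto
{
  public string Name { get; set; }
  public decimal LocalRespend { get; set; }
  public decimal LocalRespendOutArea { get; set; }
}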

Friday 4 March 2011

Find out when your SQL Server databases were last backed up

Would you like to know when any of your SQL Server databases were last backed up? Or do you want to verify that a backup actually ran? Well, now you can:-
select
 max(isnull(datediff(dd,b.backup_start_date,getdate()),0))  as 'Days since backup',
 b.backup_size,
 d.name as database_name
from  
 master..sysdatabases d with (nolock)
left join 
 msdb..backupset b with (nolock)
  on d.name = b.database_name 
   and b.backup_start_date = (select max(backup_start_date) 
    from 
     msdb..backupset b2 
    where
     b.database_name = b2.database_name
     and b2.type = 'D')
where 
 d.name != 'tempdb'
group by
 d.name, b.type, b.backup_size
This produces one row per database, showing the number of days since the last backup, the backup size and the database name.

Thursday 3 March 2011

Find out how big each table is for a SQL database

Ever wanted to know in one swoop how big your tables are for a database in SQL Server?

Create Table #temp (
  table_name sysname ,
  row_count int,
  reserved_size varchar(50),
  data_size varchar(50),
  index_size varchar(50),
  unused_size varchar(50)
)

SET NOCOUNT ON
insert #temp exec sp_msforeachtable 'sp_spaceused ''?'''

select
  a.table_name,
  a.row_count,
  count(*) as col_count,
  a.data_size
from
  #temp a
inner join information_schema.columns b
  on a.table_name = b.table_name
group by
  a.table_name, a.row_count, a.data_size
Order by 
  CAST(Replace(a.data_size, ' KB', '') as integer) desc

drop table #temp
This lists each table with its row count, column count and data size, ordered by data size descending.

Wednesday 2 March 2011

Find out how big your databases are in SQL Server

Get a list of all database sizes in KB for the whole SQL server.
Create Table #temp (
    table_name varchar(200) ,
    database_size int,
    remarks varchar(50)
  )

  SET NOCOUNT ON
  insert #temp exec sp_databases

  select 
    table_name, 
    -- sp_databases already reports database_size in KB, so no conversion is needed
    cast(database_size as nvarchar) + ' KB' as size_in_kb
  from 
    #temp
  order by 
    database_size desc

  drop table #temp
Each database is then listed with its size in KB, largest first.

Saturday 1 January 2011

SQL Nested Sets


  1. Create a database called TreeTest, open Query Analyser and execute the Setup SQL
  2. Now you are ready to go. Simply select the test you wish to run, then copy the code below, paste it into Query Analyser and hit F5!
The tree is structured using Mike Hillyer's sample data (see the references at the end of this post); the table schema itself is created by the Setup SQL.

  • TEST 1 - _NestedSet_ConvertToAdjacency
  • TEST 2 - _NestedSet_DeleteAndClose
  • TEST 3 - _NestedSet_DeleteNodeAndLeafs
  • TEST 4 - _NestedSet_GetAll
  • TEST 5 - _NestedSet_GetDepthFullTree
  • TEST 6 - _NestedSet_GetDepthSubTree
  • TEST 7 - _NestedSet_GetLeafs
  • TEST 8 - _NestedSet_GetNextLevel
  • TEST 9 - _NestedSet_GetSinglePath
  • TEST 10 - _NestedSet_GetTree
  • TEST 11 - _NestedSet_InsertDirectlyBelow
  • TEST 12 - _NestedSet_InsertOnSameLevel
  • TEST 13 - _NestedSet_Move
  • TEST 14 - _NestedSet_MoveUpDown


Test 1 - _NestedSet_ConvertToAdjacency

Converts a nested set to an adjacency model

exec __Reset
exec _NestedSet_ConvertToAdjacency

Test 2 - _NestedSet_DeleteAndClose

Deletes a node then moves any sub nodes up to close the gap

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_DeleteAndClose 7
exec _NestedSet_TestLayout

Test 3 - _NestedSet_DeleteNodeAndLeafs

Deletes a node and all sub nodes then closes any gaps

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_DeleteNodeAndLeafs 2
exec _NestedSet_TestLayout

Test 4 - _NestedSet_GetAll

Simply gets all the nodes

exec __Reset
exec _NestedSet_GetAll

Test 5 - _NestedSet_GetDepthFullTree

Simply gets all the nodes with depths

exec __Reset
exec _NestedSet_GetDepthFullTree

Test 6 - _NestedSet_GetDepthSubTree

Gets the depths of the selected node and sub nodes

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_GetDepthSubTree 2

Test 7 - _NestedSet_GetLeafs

Gets all the leaf nodes - that is, nodes without sub nodes

exec __Reset
exec _NestedSet_GetLeafs

Test 8 - _NestedSet_GetNextLevel

Gets all nodes that are directly below the selected node

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_GetNextLevel 1

Test 9 - _NestedSet_GetSinglePath

Gets a full path for a single node

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_GetSinglePath 8

Test 10 - _NestedSet_GetTree

Gets nodes including and below the selected node

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_GetTree 2

Test 11 - _NestedSet_InsertDirectlyBelow

Inserts a node directly below the selected node

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_InsertDirectlyBelow 1,'new name'
exec _NestedSet_TestLayout

Test 12 - _NestedSet_InsertOnSameLevel

Inserts a node on the same level (at the bottom) as the selected node

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_InsertOnSameLevel 1,'new name'
exec _NestedSet_TestLayout

Test 13 - _NestedSet_Move

Moves a node to below another node

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_Move 6,3
exec _NestedSet_TestLayout

Test 14 - _NestedSet_MoveUpDown

Moves a node on the same level either up or down (1 for up, 0 for down) to another node

exec __Reset
exec _NestedSet_TestLayout
exec _NestedSet_MoveUpDown 2,0
exec _NestedSet_TestLayout


References
  • Mike Hillyer - Managing Hierarchical Data in MySQL (I have used his sample data)
  • Joe Celko's Trees and Hierarchies in SQL for Smarties - 2004
  • Alejandro Izaguirre - Google groups

I would also like to thank:-

  • Ryan O'Neill
  • Nick Crowther