Linq to SQL Stored procedure vs Functions

When it comes to Linq to SQL, there is a very interesting difference between using a stored procedure and a table-valued function (or user-defined function).  In this post, I want to discuss the similarities and differences between the two and which one should be used in a given scenario.

Stored Procedures

Stored Procedures can be dragged and dropped on the right hand side of the designer.  This will then add the stored procedure as a method on your Data Context. If the stored procedure accepts parameters, then these will be translated into parameters on the method.

image

The method code for the CustOrderHist stored procedure would be:

[Function(Name="dbo.CustOrderHist")]
public ISingleResult<CustOrderHistResult> CustOrderHist([Parameter(Name="CustomerID", DbType="NChar(5)")] string customerID)
{
    IExecuteResult result = this.ExecuteMethodCall(this, ((MethodInfo)(MethodInfo.GetCurrentMethod())), customerID);
    return ((ISingleResult<CustOrderHistResult>)(result.ReturnValue));
}

You could then call it in your code:

DataClasses1DataContext dc = new DataClasses1DataContext();
var sproc = dc.CustOrderHist("ALFKI");

If we look at the code produced, the first important part is the FunctionAttribute, which defines which stored procedure will be called in the database. The next important part is GetCurrentMethod, which gives Linq access to the method being executed. According to the documentation, it "Returns a MethodBase object representing the currently executing method". This is then passed into ExecuteMethodCall, which results in the query being executed against the database.

Once the result has been returned, it is converted into an ISingleResult<T>, which implements IEnumerable<T>, and returned to the calling client. As ISingleResult<T> is a generic interface, a class named after the sproc (CustOrderHistResult in this case) is generated with the same properties as the columns returned by the stored procedure.
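To make that concrete, the generated result class is just a plain class whose properties mirror the sproc's columns. For Northwind's CustOrderHist (which returns ProductName and Total columns) it would look roughly like this; a sketch of the designer output, not the exact generated code:

```csharp
// Rough sketch of the designer-generated result class for CustOrderHist.
// The real generated code also carries [Column] mapping attributes.
public partial class CustOrderHistResult
{
    private string _ProductName;
    private int _Total;

    public string ProductName
    {
        get { return _ProductName; }
        set { _ProductName = value; }
    }

    public int Total
    {
        get { return _Total; }
        set { _Total = value; }
    }
}
```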

However, if you know your stored procedure returns an object already in your system, for example GetCustomer might return a row from your Customer table, then you can use that object as the generic type: ISingleResult<Customer>. This saves having unnecessary objects in your system.

When using stored procedures, they can only return items of type ISingleResult<T> or IMultipleResults (discussed later). This causes a number of problems when consuming the returned objects.  Firstly, in your client code you will have to manually convert the result with something like ToList() if you want to bind the objects to a data source. Secondly, if you convert the objects to a list then they will not be change-tracked. Finally, you cannot benefit from deferred loading when using stored procedures, as the call to the database is executed as soon as the method is called.
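The ToList point can be illustrated without a database at all: a materialised list is a snapshot, while a deferred query is re-evaluated each time it is enumerated (a LINQ to Objects sketch, not LINQ to SQL):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class DeferredVsMaterialised
{
    static void Main()
    {
        List<int> source = new List<int> { 1, 2 };

        IEnumerable<int> deferred = source.Where(n => n > 0); // re-evaluated on each enumeration
        List<int> snapshot = deferred.ToList();               // evaluated once, right now

        source.Add(3);

        Console.WriteLine(deferred.Count()); // 3 - the deferred query sees the new element
        Console.WriteLine(snapshot.Count);   // 2 - the list was fixed at the ToList call
    }
}
```

A stored procedure behaves like the snapshot: the database call happens the moment the method is invoked, so there is nothing left to defer.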

If your stored procedure returns two record sets, for example something like below

SELECT * FROM Customers

SELECT * FROM Orders

Then there will be a problem with a method returning ISingleResult<T>. This is where IMultipleResults comes in handy. However, just dragging the stored procedure on as above will not generate the correct code. Instead, you will have to add your own partial method for the code.

The designer generates code like the following, which we will need to convert so it works correctly.

[Function(Name="dbo.ReturnCustomerAndOrder")]
public ISingleResult<ReturnCustomerAndOrderResult> ReturnCustomerAndOrder()
{
    IExecuteResult result = this.ExecuteMethodCall(this, ((MethodInfo)(MethodInfo.GetCurrentMethod())));
    return ((ISingleResult<ReturnCustomerAndOrderResult>)(result.ReturnValue));
}

The corrected code, able to access two or more different tables, would look something like this:

[Function(Name="dbo.ReturnCustomerAndOrder")]
[ResultType(typeof(Customer))]
[ResultType(typeof(Order))]
public IMultipleResults ReturnCustomerAndOrder()
{
    IExecuteResult result = this.ExecuteMethodCall(this, ((MethodInfo)(MethodInfo.GetCurrentMethod())));
    return ((IMultipleResults)(result.ReturnValue));
}

NOTE: It is important to put this in a partial class, otherwise the designer will overwrite your changes.

To access one of the sets of results, you need to explicitly request it using the GetResult<T> method.

DataClasses1DataContext dc = new DataClasses1DataContext();
IMultipleResults sproc = dc.ReturnCustomerAndOrder();
List<Order> orders = sproc.GetResult<Order>().ToList();

foreach (Order o in orders)
{
    Console.WriteLine(o.OrderID);
}

This allows you to deal with both situations for stored procedures. This is also what you will need to do if your sproc returns a single result set which can be of different types.

User Defined Functions

Like stored procedures, user defined functions can be added as methods on your data context.  User defined functions are similar to stored procedures; however, you have the ability to define what is being returned from the function. This allows Linq to correctly understand the SQL and produce the correct code to match.

The function below returns a table; as such, it is called a table-valued function.

CREATE FUNCTION [dbo].[AllOrdersInWeek52Function]()
    RETURNS TABLE
AS
RETURN (SELECT * FROM Orders
        WHERE DATEPART(ww, OrderDate) = 52)

When this code is added to the data context, the following code is produced.

[Function(Name="dbo.AllOrdersInWeek52Function", IsComposable=true)]
public IQueryable<Order> AllOrdersInWeek52Function()
{
    return this.CreateMethodCallQuery<Order>(this, ((MethodInfo)(MethodInfo.GetCurrentMethod())));
}

As you can see, this returns an IQueryable<Order> object, unlike the sproc which returned ISingleResult<T>.  This means that you can take full advantage of Linq when querying against this object in the database.
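Because the result is IQueryable<T>, further operators compose into the expression tree before anything executes. That composition behaviour can be sketched with LINQ to Objects via AsQueryable (no database involved):

```csharp
using System;
using System.Linq;

class ComposabilityDemo
{
    static void Main()
    {
        // Stand-in for dc.AllOrdersInWeek52Function(): any IQueryable<T> composes the same way
        IQueryable<int> weekOrders = Enumerable.Range(1, 10).AsQueryable();

        // Operators are appended to the expression tree; nothing has executed yet
        var composed = weekOrders.Where(n => n % 2 == 0).Select(n => n * 10);

        // Execution happens on enumeration (for LINQ to SQL, that is when the SQL is sent)
        Console.WriteLine(string.Join(",", composed)); // 20,40,60,80,100
    }
}
```

With LINQ to SQL, that deferred composition means a where clause added on top of the function call ends up inside the generated SQL rather than being applied in memory.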

Another advantage of functions is that they can be executed in the database, inline with an existing Linq to SQL query.  This means that we can do more advanced queries solely in the database, without having to bring everything into memory to perform the same task. For example, say we have the function below, which takes a date and returns an int representing the week of the year.

CREATE FUNCTION WeekOfYear(@Date DateTime)
RETURNS Int
AS
BEGIN
RETURN (CAST(DATEPART(ww, @Date) AS INT))
END

Then we can use that within our linq query.

DataClasses1DataContext dc = new DataClasses1DataContext();
var q = from o in dc.Orders
        where dc.WeekOfYear(o.OrderDate) == 23
        select o;

foreach (var item in q)
{
    Console.WriteLine(item.OrderID);
}

Now, instead of WeekOfYear being performed in memory over the entire table (as would be required without the function), it is included within the generated SQL.

SELECT [t0].[OrderID], [t0].[CustomerID], [t0].[EmployeeID], [t0].[OrderDate], [t0].[RequiredDate], [t0].[ShippedDate], [t0].[ShipVia], [t0].[Freight], [t0].[ShipName], [t0].[ShipAddress], [t0].[ShipCity], [t0].[ShipRegion], [t0].[ShipPostalCode], [t0].[ShipCountry]
FROM [dbo].[Orders] AS [t0]
WHERE [dbo].[WeekOfYear]([t0].[OrderDate]) = @p0

The important section is the WHERE clause: the function call is translated into SQL, which results in a much better execution.

Summary

In summary, user defined functions provide you with more flexibility and allow you to gain some great advantages over how the queries are executed. Very powerful and something to keep in mind.  Stored procedures have some limitations, but can solve certain scenarios.


DATEPART(WEEK,…) With Linq to SQL

An interesting question was raised on the Linq forums about how to query a date column based on which week number the date is in.

If we were to write this in SQL, we would use the DATEPART function within our WHERE clause. In Northwind, if we wanted to return all the orders which occurred in week 52, we would run the following query.

SELECT * FROM Orders
WHERE DATEPART(ww, OrderDate) = 52

However, in .Net there isn't a simple way to access the week number via DateTime. This makes it difficult for Linq to translate into SQL.

One approach you could take is the easy route and do the query in the database.

CREATE PROCEDURE AllOrdersInWeek52
AS
SELECT * FROM Orders
WHERE DATEPART(ww, OrderDate) = 52

With that, we can include the procedure in our diagram and query it using Linq to SQL. The code below calls the stored procedure and outputs all the dates which match.

public void LinqSproc()
{
    Console.WriteLine("LinqSproc");
    Console.WriteLine("---------------------------------------------------------------");
    DataClasses1DataContext dc = new DataClasses1DataContext();

    var query = dc.AllOrdersInWeek52();

    foreach (var item in query)
    {
        Console.WriteLine(item.OrderDate);
    }
}

The results returned:

LinqSproc
—————————————-
23/12/1996 00:00:00
23/12/1996 00:00:00
24/12/1996 00:00:00
25/12/1996 00:00:00
25/12/1996 00:00:00
26/12/1996 00:00:00
27/12/1996 00:00:00
27/12/1996 00:00:00
22/12/1997 00:00:00
22/12/1997 00:00:00
22/12/1997 00:00:00
23/12/1997 00:00:00
23/12/1997 00:00:00
24/12/1997 00:00:00
24/12/1997 00:00:00
24/12/1997 00:00:00
25/12/1997 00:00:00
25/12/1997 00:00:00
26/12/1997 00:00:00
26/12/1997 00:00:00
26/12/1997 00:00:00

Another approach would be to use a Function.  This gives us more control and flexibility within our database.

CREATE FUNCTION dbo.AllOrdersInWeek52Function()
    RETURNS TABLE
AS
RETURN (SELECT * FROM Orders
        WHERE DATEPART(ww, OrderDate) = 52)
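Either version could also be parameterised; for example, a hypothetical variant of the table-valued function that accepts the week number (the name and shape here are illustrative, not from the original post):

```sql
CREATE FUNCTION dbo.OrdersInWeekFunction(@Week INT)
    RETURNS TABLE
AS
RETURN (SELECT * FROM Orders
        WHERE DATEPART(ww, OrderDate) = @Week)
```

Linq to SQL would then expose this as a method taking an int, which composes into queries just like the parameterless version.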

That is all very easy because we are just using SQL and calling into our database; if we wanted, both could accept the week number as a parameter.  If we wanted to do this solely in .Net, we would have to use a custom method to return the week number for a date and then query based on that. The problem with using a custom method is that the query must be performed in memory, so we need to use .AsEnumerable().

public void ListAllOrdersWithDatesInWeek52()
{
    Console.WriteLine("ListAllOrdersWithDatesInWeek52");
    Console.WriteLine("---------------------------------------------------------------");
    DataClasses1DataContext dc = new DataClasses1DataContext();

    var query = from o in dc.Orders.AsEnumerable()
                where WeekOf(o.OrderDate) == 52
                select o;

    foreach (var item in query)
    {
        Console.WriteLine(item.OrderDate);
    }
}

private int WeekOf(DateTime? nullable)
{
    if (nullable.HasValue)
    {
        GregorianCalendar gCalendar = new GregorianCalendar();
        int WeekNumber = gCalendar.GetWeekOfYear(nullable.Value, CalendarWeekRule.FirstFourDayWeek, DayOfWeek.Monday);
        return WeekNumber;
    }
    else
        return 0;
}
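As a quick sanity check of the helper's calendar rule, using one of the dates from the earlier result listing (an illustration, not from the original post):

```csharp
using System;
using System.Globalization;

class WeekCheck
{
    static void Main()
    {
        GregorianCalendar cal = new GregorianCalendar();

        // 23/12/1996 appears in the week 52 results above
        int week = cal.GetWeekOfYear(new DateTime(1996, 12, 23),
                                     CalendarWeekRule.FirstFourDayWeek,
                                     DayOfWeek.Monday);
        Console.WriteLine(week); // 52
    }
}
```

Note that FirstFourDayWeek/Monday follows the ISO-style rule, whereas SQL Server's DATEPART(ww, ...) counts weeks from 1st January using the DATEFIRST setting, so the two can disagree around year boundaries.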

As this is occurring for every row in the database, performance will take a hit.  The best solution is to use a scalar function within the database.

CREATE FUNCTION WeekOfYear(@Date DateTime)
RETURNS Int
AS
BEGIN
RETURN (CAST(DATEPART(ww, @Date) AS INT))
END

Which we can use within our query.

DataClasses1DataContext dc = new DataClasses1DataContext();
var q = from o in dc.Orders
        where dc.WeekOfYear(o.OrderDate) == 23
        select o;

foreach (var item in q)
{
    Console.WriteLine(item.OrderID);
}

This will execute the function inside of the database, resulting in much better performance. The sql produced would look like this:

SELECT [t0].[OrderID], [t0].[CustomerID], [t0].[EmployeeID], [t0].[OrderDate], [t0].[RequiredDate], [t0].[ShippedDate], [t0].[ShipVia], [t0].[Freight], [t0].[ShipName], [t0].[ShipAddress], [t0].[ShipCity], [t0].[ShipRegion], [t0].[ShipPostalCode], [t0].[ShipCountry]
FROM [dbo].[Orders] AS [t0]
WHERE [dbo].[WeekOfYear]([t0].[OrderDate]) = @p0

The important part is the WHERE clause.  By using a function, the result is a nice, clean, quick execution of the query.

Original Question: http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=123&SiteID=1


Custom Insert logic with Linq to SQL

On the forums, it was asked how to have your own custom logic for the insert process, for example using a stored procedure to insert the object.

One approach is to redefine the insert behaviour within the designer inside Visual Studio 2008.  For these examples, I'm going to use the Categories table in Northwind.

Firstly, create your stored procedure which accepts all the properties on the object as arguments.  The stored procedure needs to have a special OUTPUT parameter.   This output parameter should return the ID of the inserted row so Linq can re-populate the ID on the object and track any future changes.

CREATE PROCEDURE [dbo].[InsertCategory]
@categoryID INT OUTPUT,
@categoryName nvarchar(15),
@description ntext,
@picture image
AS

INSERT INTO [Northwind].[dbo].[Categories]
           ([CategoryName]
           ,[Description]
           ,[Picture])
     VALUES
           (@categoryName
           ,@description
           ,@picture)

SET @CategoryID = CAST (SCOPE_IDENTITY() as Int)

Information on why to use Scope Identity vs Identity can be found here.

Then, drag it onto the designer so it is a method on the data context.

image

Then, select the category table on your designer surface (add it if required). In the properties window, you can set the action to be taken for Delete, Insert and Update.  Select insert and click the button on the property.

image

The Configure Behavior dialog will then be displayed. Here, you can select Customize, and then select the stored procedure to use on the data context.  It should then auto-configure the parameters for the sproc, but you can do this stage manually if required.

image

At this point, whenever you insert a Category and call SubmitChanges, the stored procedure will be used instead of Linq's own generated SQL. You can configure update and delete in a similar fashion. With update sprocs, you can also access the original data values as well as the updated data.

The other approach is to use the partial methods feature of C# 3.0.  In a partial class for your DataContext, you add the partial method for InsertCategory which calls your stored procedure on the data context.

partial void InsertCategory(Category instance)
{
    System.Nullable<int> nullableID = instance.CategoryID;
    this.InsertCategory(ref nullableID, instance.CategoryName, instance.Description, instance.Picture);
    instance.CategoryID = nullableID.GetValueOrDefault();
}

Another approach is just to have plain ADO.NET code, but I think the above method is neater.

partial void InsertCategory(Category instance)
{
    SqlConnection sqlConn = new SqlConnection(global::LinqPOC.Properties.Settings.Default.NorthwindConnectionString);
    SqlCommand sqlCmd = new SqlCommand("InsertCategory", sqlConn);
    sqlCmd.CommandType = CommandType.StoredProcedure;

    // Setup parameters for the sproc, including the OUTPUT parameter for the new ID
    SqlParameter idParam = sqlCmd.Parameters.Add("@categoryID", SqlDbType.Int);
    idParam.Direction = ParameterDirection.Output;
    sqlCmd.Parameters.AddWithValue("@categoryName", instance.CategoryName);
    sqlCmd.Parameters.AddWithValue("@description", instance.Description);
    sqlCmd.Parameters.AddWithValue("@picture", instance.Picture);

    try
    {
        sqlConn.Open();
        sqlCmd.ExecuteNonQuery();
        instance.CategoryID = (int)idParam.Value;
    }
    finally
    {
        sqlConn.Close();
    }
}

Which approach should you use? Well, under the covers they are very similar. When you configure the behaviour in the designer, instead of the partial method declaration below:

partial void InsertCategory(Category instance);

there is a private InsertCategory implementation. This takes the category object being inserted, calls the stored procedure as a method on the data context and then repopulates the CategoryID.

private void InsertCategory(Category obj)
{
    System.Nullable<int> p1 = obj.CategoryID;
    this.InsertCategory(ref p1, obj.CategoryName, obj.Description, obj.Picture);
    obj.CategoryID = p1.GetValueOrDefault();
}

As such, it just depends on whether you want to implement the method yourself or have the designer do it for you.

Forum post: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2592415&SiteID=1


Linq Quickly: Book Review

Linq Quickly

Recently I was sent a copy of a new book entitled Linq Quickly.  The book, written by N. Shateesh Kumar, is available from Packt Publishing.  This book is for .Net developers, with all the code samples in C#.  The book covers the main aspects of the Linq framework, including:

  • LINQ to Objects
  • LINQ to XML
  • LINQ to SQL
  • LINQ to DataSets
  • LINQ to XSD

    The book is 225 pages and provides a good introduction to the Linq framework.  It starts by introducing the architecture of Linq and some of the language features of C# 3.0.  It then moves on to Linq to Objects, providing the reader with a foundation in Linq, and then touches on Linq to XML, which was interesting and provided a good introduction. There is then a large section of the book on Linq to SQL, which I would expect as it's a major part of the framework, and the book discusses the concepts with code examples.  Finally, it talks about DataSets and XSD.

    The last chapter is more of a reference guide to the different extension methods provided by the framework; it is a good summary of each, with an example of how to implement and use the method.

    My main problem with the book is that there are a number of typos and small things which should easily have been picked up by the reviewer/author.  For example, it refers to Linq as being part of the .Net Framework 3.0; as everyone knows, it was part of .Net 3.5.  It does let the book down.  The book is also based on pre-RTM syntax, but that isn't really a problem as there weren't any major changes.

    In summary, the book provides a good introduction/quickstart to the Linq stack, and if you want a gentle introduction to the different sections and how they fit together then it's great. If you want something more in-depth then definitely look at Linq in Action (sadly, I only have that in ebook format), but this book does aim at a different reader. I'm sure I will refer back to it to look up some syntax.

    Linq in Action Review: Ben Hall’s Blog- LINQ In Action Book – Early Review

    Website and purchase: http://www.packtpub.com/linq-quick-starter/book

    Linq to SQL Beta 2 to RTM Diagram Error

    After installing RTM on my laptop, all of my Beta 2 Linq to SQL diagrams refused to open and gave the error:

    "Cannot load 'path': Error encountered, check Error List window for details"

    The error list had the following error

    "Error    2    There is no Unicode byte order mark. Cannot switch to Unicode.    1    1    TestingLinqToSQL"

    In the properties for the file, the encoding was already set as Unicode. The problem is the XML declaration at the top of the .dbml file: the declaration saved by Beta 2 no longer matches what RTM expects, as RTM requires the encoding to be utf-8 (<?xml version="1.0" encoding="utf-8"?>). Set the XML declaration to this, save and reopen, and your diagram should be working again.


    How To Unit Test – Linq to SQL and Mocking

    "All problems in computer science can be solved by another level of indirection." - Butler Lampson (or abstraction, as some people refer to it)

    Over the last few posts, I have been covering how to unit test various topics; in this post I will cover Linq to SQL. I've been thinking about this for a while, but never had the chance to post about it. Various people have spoken about this before, but I thought I would give my view.  I'm going to discuss how to unit test systems involving Linq to SQL, how to mock Linq to SQL, how this comes into play with the new ASP.net MVC framework, and how TypeMock might have the answer.

    Scenario

    For this post, I decided to implement a small part of a shopping cart.  You can return a list of categories, a list of products for each category and the product itself.

    Unit Testing Linq to SQL

    In my previous post, I discussed how to unit test Linq to XML and mentioned how it really isn't that different from any other system.  Linq to SQL is similar, in that you can easily ignore the underlying implementation (Linq) when unit testing; however, I'm going to focus more on how to design the system and how it all should hang together.

    Like any Linq to SQL implementation, you will need a data context.  While you could unit test this directly, I always prefer to test it as part of a higher layer, since it is generated code; that gives the tests more context and makes them more meaningful for the system. The most important part of the system will be how we actually access the DataContext, retrieve the data and make sure that it can be effectively unit tested.

    In my system I will have my HowToUnitTestDataContext generated by the Linq to SQL designer.  I will then have a ProductController which will talk to my DataContext and return the generated objects. Unlike with SubSonic, we can return the designer-generated objects as they are just POCOs (Plain Old C# Objects).  I will then have a set of unit tests targeting the ProductController.  We are not doing any mocking at this point, so all of our unit tests will hit the database.

    First test: create a data context and ensure it has a connection string.  This ensures everything is set up correctly.

    [Test]
    public void CreateDataContext_ConnectionString_ReturnsDataContextObject()
    {
        HowToUnitTestDataContext db = ProductController.CreateDataContext();
        Assert.IsNotNull(db);
        Assert.IsNotNull(db.Connection);
    }

    All the method does is initialise a data context and return it to the caller. The next test requests a list of all the categories from the database.  However, because we haven’t populated anything yet it should just return an empty list.

    [Test]
    public void GetCategories_NoContent_ReturnEmptyList()
    {
        List<Category> categories = ProductController.GetCategories();
        Assert.IsNotNull(categories);
        Assert.AreEqual(0, categories.Count);
    }

    We now have a base structure in place and can start filling in the details. The following test first inserts a category into the database (code is in the solution which you can download at the bottom of the post).  It then requests all the categories from the database and ensures that what we returned was correct. We then use the MbUnit RollBack feature to ensure the database is left in the same state as it was before the test. The rollback works perfectly with Linq to SQL!

    [Test]
    [RollBack]
    public void GetCategories_SimpleContent_ReturnPopulatedList()
    {
        InsertCategory();

        List<Category> categories = ProductController.GetCategories();
        Assert.IsNotNull(categories);
        Assert.AreEqual(1, categories.Count);
        Assert.AreEqual("Microsoft Software", categories[0].Title);
    }

    The code for GetCategories() is a simple Linq to SQL query which returns a generic list.

    public static List<Category> GetCategories()
    {
        HowToUnitTestDataContext db = CreateDataContext();
        var query = from c in db.Categories
                    select c;
        return query.ToList();
    }

    The next important test is the one which returns product information.  Here, we use a tested GetCategory method to return a particular category. We then insert a temp product into the database for that category, meaning that we now have a known database state to work with. The test then simply verifies that when given a category we can return all the products in the database for it.

    [Test]
    [RollBack]
    public void GetProductsForCategory_ValidCategoryWithProduct_PopulatedList()
    {
        InsertCategory();

        Category c = ProductController.GetCategory("Microsoft Software");
        InsertProduct(c);

        List<Product> products = ProductController.GetProductsForCategory(c);
        Assert.AreEqual(1, products.Count);
        Assert.AreEqual("Visual Studio 2008", products[0].Title);
    }

    The implementation of this method is a little bit more complex as it joins a ProductCategories table to return the products within the particular category.

    public static List<Product> GetProductsForCategory(Category c)
    {
        HowToUnitTestDataContext db = CreateDataContext();
        var query = from p in db.Products
                    join pc in db.ProductCategories on p.id equals pc.ProductID
                    where pc.CategoryID == c.id
                    select p;

        return query.ToList();
    }

    The final method is to return a particular product based on its ID.  It works in a similar fashion to the previous methods.

    [Test]
    [RollBack]
    public void GetProductByID_ValidProductID_ReturnProductID()
    {
        InsertCategory();

        Category c = ProductController.GetCategory("Microsoft Software");
        InsertProduct(c);

        List<Product> products = ProductController.GetProductsForCategory(c);
        Assert.AreEqual(1, products.Count);

        Product p = ProductController.GetProduct(products[0].id);
        Assert.AreEqual(p.Title, products[0].Title);
    }

    In the implementation we then just return a single product using a lambda expression.

    public static Product GetProduct(int productID)
    {
        HowToUnitTestDataContext db = CreateDataContext();

        Product product = db.Products.Single(p => p.id == productID);
        return product;
    }

    That pretty much covers unit testing basic Linq to SQL. Other parts of the system, such as the business layer or UI layer, can then talk directly to the ProductController to return all the information. However, this doesn’t offer anything if you want to mock out Linq to SQL.

    Mocking Linq to SQL

    Unit testing Linq to SQL isn't very difficult; however, mocking Linq to SQL is a different beast. As with SubSonic, the best approach is to abstract away from your database. In this case, I am going to add an additional layer between my ProductController and my DataContext, called LinqProductRepository, which can then be mocked.

    My first set of tests is focused on testing the LinqProductRepository, which talks to my DataContext and as such my database. The tests are very similar to the tests above for ProductController. I always test this layer against the database to ensure that it will work effectively when it's in production; with mock objects you can never have the same level of confidence.

    LinqProductRepository m_ProductRepository = new LinqProductRepository();
    [Test]
    [RollBack]
    public void GetCategoryByName_NameOfValidCategory_ReturnCategoryObject()
    {
        InsertCategory();

        Category c = m_ProductRepository.GetCategory("Microsoft Software");

        Assert.AreEqual("Microsoft Software", c.Title);
        Assert.AreEqual("All the latest Microsoft releases.", c.Description);
    }

    To give you an idea of the implementation, the GetCategory method looks like this:

    public Category GetCategory(string categoryTitle)
    {
        using (HowToUnitTestDataContext db = CreateDataContext())
        {
            Category category = db.Categories.Single(c => c.Title == categoryTitle);
            return category;
        }
    }

    In order to make the ProductRepository mockable, it is required to implement an interface.  The interface is very simple, as shown:

    public interface IProductRepository
    {
        List<Category> GetCategories();
        Category GetCategory(string categoryTitle);
        List<Product> GetProductsForCategory(Category c);
        Product GetProduct(int productID);
    }

    We now have a fully implemented and tested ProductRepository, so we can create the ProductController. To start with, in my ProductControllerTests I set up the variables and the [SetUp] method for each test.  This ensures that we have our MockRepository (via Rhino Mocks) to hand, a copy of our mocked IProductRepository, together with our stub category and product.  These two objects are simple, well-known objects (for the system) which we will return from our mocked methods.  I'm using property injection to set the mocked repository on the ProductController, which will be used during the tests. By using a property we can have a default implementation for our production code, but also a way for our test code to inject the mock object.

    MockRepository m_Mocks;
    IProductRepository m_ProductRepos;
    Category m_MockedCategory;
    Product m_MockedProduct;

    [SetUp]
    public void Setup()
    {
        m_Mocks = new MockRepository();
        m_ProductRepos = m_Mocks.CreateMock<IProductRepository>();
        ProductController.ProductRepository = m_ProductRepos;

        m_MockedCategory = MockCategory();
        m_MockedProduct = MockProduct();
    }

    We can then write our unit tests based on this information; they will be similar to our previous unit tests as they are implementing the same requirements. Within this test, we set up the expected return for the method GetCategories on our ProductRepository; this simply uses C# 3.0 collection initializers to create a list with one item, the stub category. We can then execute our test/asserts against the ProductController to verify it is all linked correctly and working as expected.

    [Test]
    public void GetCategories_SimpleContent_ReturnPopulatedList()
    {
        using (m_Mocks.Record())
        {
            Expect.Call(m_ProductRepos.GetCategories()).Return(new List<Category> { m_MockedCategory });
        }

        using (m_Mocks.Playback())
        {
            List<Category> categories = ProductController.GetCategories();
            Assert.IsNotNull(categories);
            Assert.AreEqual(1, categories.Count);
            Assert.AreEqual("Microsoft Software", categories[0].Title);
        }
    }

    The ProductController simply passes the call on to our ProductRepository, which is set via the property in our test SetUp.

    private static IProductRepository m_ProductRep = new LinqProductRepository();

    public static IProductRepository ProductRepository
    {
        get { return m_ProductRep; }
        set { m_ProductRep = value; }
    }

    public static List<Category> GetCategories()
    {
        return ProductRepository.GetCategories();
    }

    This allows us to use the ProductController with a mock object. The ProductController could be/do anything, and by going via the IProductRepository abstraction we can use mock objects to save us accessing the database.  In our real system, we would use the real, tested LinqProductRepository object.

    ASP.net MVC and Linq to SQL

    Recently there has been a lot of buzz around the new ASP.net MVC framework that the ASP.net team are releasing (CTP soon).  If you haven't read about this, Scott Guthrie has done a great post on the subject.

    However, Scott Hanselman did a demo of the MVC framework at DevConnections and has posted the source code online here – DevConnections And PNPSummit MVC Demos Source Code. Demos 3 and 4 are about TDD and use mock objects and Linq to SQL.

    The layout is as follows:

    ASP.net MVC

    The BlogController talks to a IPostRepository object which is either LinqPostRepository or TestPostRepository. They use a Stub object instead of a mock object, but the architecture is the same as my system. I would just like to say I didn’t just copy them 🙂

    From this demo code, it looks like the way to mock your data access layer will be to use this approach.

    Mocking the DataContext

    The thing which first struck me about this was why we couldn't just mock the DataContext itself instead of messing around with repositories and controllers. The first answer is that it doesn't implement an interface, but that's a simple fix.  The second problem is that the data context returns everything as System.Data.Linq.Table<T>, which cannot be mocked. It does implement an ITable interface; however, you cannot cast another object implementing ITable to Table<T>.

    Ayende, creator of Rhino Mocks, wrote a post called Awkward Testability for Linq for SQL which covers why it cannot be mocked.  It's a shame the Linq team didn't think about TDD and mocking, as it would have made a big difference to the system design. Maybe this is a lesson for anyone creating an API at the moment: think about testability! I think the ASP.net team have realised this.
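    One way to see why the repository indirection helps: if data access is hidden behind an interface that exposes IQueryable<T> rather than Table<T>, a test double can back the very same query with an in-memory list. This is a sketch with hypothetical names, not code from the demo:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string City { get; set; }
}

// Hypothetical abstraction: a real implementation would wrap the DataContext
// and return dataContext.Customers (Table<T> itself implements IQueryable<T>).
public interface ICustomerSource
{
    IQueryable<Customer> Customers { get; }
}

// Test double backed by an in-memory list instead of the database
public class FakeCustomerSource : ICustomerSource
{
    private readonly List<Customer> m_Customers = new List<Customer>
    {
        new Customer { Id = 1, Name = "Dave", City = "Sarasota" },
        new Customer { Id = 2, Name = "John", City = "Tampa" }
    };

    public IQueryable<Customer> Customers
    {
        get { return m_Customers.AsQueryable(); }
    }
}

public class Program
{
    // Code under test sees only the interface, so the same query runs
    // against SQL Server in production and against the list in tests.
    public static List<string> NamesIn(ICustomerSource source, string city)
    {
        return (from c in source.Customers
                where c.City == city
                select c.Name).ToList();
    }

    public static void Main()
    {
        Console.WriteLine(NamesIn(new FakeCustomerSource(), "Sarasota")[0]); // Dave
    }
}
```

    This is essentially what the IProductRepository/LinqProductRepository split above achieves, just expressed at the level of the queries themselves.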

    Mocking with TypeMock

    That all said, it looks like TypeMock might be able to provide an answer.  TypeMock is a mocking framework, but an extremely powerful one: it can mock any object within your system, and it doesn't care about the implementation or whether an interface is available. It can simply mock anything.  I will be looking at TypeMock more over the next few weeks, but before then visit their site. It's not free (a 30-day trial is available), but if you need to do this type of mocking then it really is your only solution, and it can greatly increase your test coverage and your test suite as a whole.

    You can read the initial ideas over on their blog – Eli Lopian’s Blog (TypeMock) » Blog Archive » Mocking Linq – Preview.  Going in depth on TypeMock and Linq deserves its own post, however I couldn’t resist posting some code now.

    Below, we have a test which mocks out querying a simple list.  Here, we have a list of customers and a dummyData collection. Our actual Linq query is against m_CustomerList; however, TypeMock does some ‘magic’ under the covers and tells the CLR that the query should just return dummyData. As such, instead of returning the one matching customer record, we get the two dummy data records. How cool!!!

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string City { get; set; }
    }

    [Test]
    [VerifyMocks]
    public void MockList()
    {
        List<Customer> m_CustomerList = new List<Customer> {
            new Customer{ Id = 1, Name="Dave", City="Sarasota" },
            new Customer{ Id = 2, Name="John", City="Tampa" },
            new Customer{ Id = 3, Name="Abe", City="Miami" }
        };

        var dummyData = new[] {new {Name="fake",City="typemock"},
                   new {Name="another",City="typemock"}};
        using (RecordExpectations recorder = RecorderManager.StartRecording())
        {
            // Mock Linq
            var queryResult = from c in m_CustomerList
                              where c.City == "Sarasota"
                              select new { c.Name, c.City };
            // return fake results
            recorder.Return(dummyData);
        }

        var actual =  from c in m_CustomerList
                      where c.City == "Sarasota"
                      select new { c.Name, c.City };
        Assert.AreEqual(2, actual.Count());
    }

    But it gets better!  Given our original ProductController (no repositories, mocks, fakes or stubs), we can tell TypeMock that for this statement it should always return the dummyData. As such, the ProductController never hits the database.

    [Test]
    [VerifyMocks]
    public void MockDataContext()
    {
        Category category = new Category();
        category.id = 1;
        category.Title = "Test";
        category.Description = "Testing";

        List<Category> dummyData = new List<Category> { category };
        using (RecordExpectations r = RecorderManager.StartRecording())
        {
            // Mock Linq
            List<Category> mocked = ProductController.GetCategories();
            // return fake results
            r.Return(dummyData);
        }

        List<Category> categories = ProductController.GetCategories();
        Assert.AreEqual(1, categories.Count);
        Assert.AreEqual("Test", categories[0].Title);
        Assert.AreEqual("Testing", categories[0].Description);
    }

    Feel free to download the solution and debug the methods to see it for yourself – I think it’s cool.

    Summary

    In summary, I hope you have found this post useful and interesting. I’ve covered how to unit test your Linq to SQL code and how you go about mocking Linq to SQL. I then finished by giving you a quick look at what TypeMock can offer. It is possible to mock Linq to SQL, however it’s not as simple as it could be. If you have any questions, please feel free to contact me.

    Download Solutions – TestingLinqToSQL.zip | TestingLinqToSQLMock.zip | TestingLinqToSQLTypeMock

    How To Unit Test – Linq To XML

    Now that I’m back from TechEd, it’s time to continue my series of “How To Unit Test” posts.  In this post, I’ve decided to have a quick look into Linq to XML and how that could be unit tested.  I’ve picked the nice standard sample for Linq to XML – an RSS reader :).  This also follows on nicely from my previous post on How To Unit Test Using SubSonic and Rhino Mocks.

    Unit testing Linq to XML is no great secret. The fact that you are performing Linq queries under the covers doesn’t really affect your unit tests, but it’s still quite interesting.

    To start with, I’m going to be using the BBC News feed as my sample XML feed to query. My first test is to return an RssChannel object which has a Url and Title.  Note, I’m not implementing all the properties as that would get very boring very quickly.

    [Test]
    public void GetRssChannel_StandardRss_ReturnDataRssChannel()
    {
        RssProcessor rss = new RssProcessor();
        RssChannel chan = rss.GetRssChannel(TestHelper.url);
        Assert.AreEqual(TestHelper.url, chan.Url);
        Assert.Contains(chan.Title, "BBC");
    }

    The implementation of the method then simply queries an XDocument, creates my RssChannel object and returns it to the caller.

    public RssChannel GetRssChannel(string url)
    {
        XDocument feed = XDocument.Load(url);
        var feeds = from f in feed.Elements("rss").Elements("channel").Take(1)
                           select new RssChannel
                           {
                               Title = f.Element("title").Value,
                               Url = url
                           };

        return feeds.First();
    }
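    If you want to exercise the same shape of query without going over the network, you can parse an inline string with XDocument.Parse. This is a minimal sketch using a made-up feed fragment, not the real BBC feed:

```csharp
using System.Linq;
using System.Xml.Linq;

public static class ChannelQuerySample
{
    // A tiny hand-written RSS fragment, purely for illustration.
    private const string FakeXml =
        "<rss><channel><title>Example Feed</title></channel></rss>";

    public static string GetChannelTitle()
    {
        // Same rss/channel query shape as GetRssChannel, but against
        // an in-memory document rather than a live URL.
        XDocument feed = XDocument.Parse(FakeXml);
        var titles = from f in feed.Elements("rss").Elements("channel").Take(1)
                     select f.Element("title").Value;
        return titles.First();
    }
}
```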

    The next method is similar but returns an RssItem for a given position in the document (Linq to XML is great for doing queries like this).

    [Test]
    public void GetRssItem_RssFeed_ReturnsASingleRSSItem()
    {
        RssProcessor rss = new RssProcessor();
        RssItem item = rss.GetRssItem(TestHelper.url, 1);
        Assert.IsNotNull(item);
        Assert.IsNotNull(item.Title);
    }

    The query would be:

    var feeds = from f in m_Feed.Elements("rss").Elements("channel").Elements("item").Skip(itemPosition - 1).Take(1)
                select new RssItem
                {
                    Title = f.Element("title").Value
                };
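    The Skip/Take pair is what gives the positional lookup: Skip(itemPosition - 1) discards everything before the item we want (the position is 1-based) and Take(1) keeps just that one. The same pattern works against a plain in-memory list (sample data made up):

```csharp
using System.Collections.Generic;
using System.Linq;

public static class PositionalSample
{
    // Returns the item at the given 1-based position, mirroring the
    // Skip/Take pattern used in the RSS item query.
    public static string GetItemAt(IList<string> items, int itemPosition)
    {
        return items.Skip(itemPosition - 1).Take(1).First();
    }
}
```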

    Instantly we can see from the tests that we can do a bit of refactoring.  Firstly, we can move the creation of the RssProcessor into a [SetUp] method.  We can also pass the Url for the feed as a parameter to the constructor.  This makes our test code cleaner as there is less duplication, but also more readable as we are focusing more on the intent of the tests.

    The complete solution can be downloaded at the end of the post, however there is one more test which I want to refer to.

    The following returns an RssFeed object which has a reference to the RssChannel and a List<RssItem>.

    [Test]
    public void GetRssFeed_EntireFeed_ReturnsChannelAndItems()
    {
        RssFeed feed = m_Rss.GetFeed();
        Assert.IsNotNull(feed.Channel);
        Assert.Contains(feed.Channel.Title, "BBC");
        Assert.GreaterThan(feed.Items.Count,0);
    }

    The test works fine; however, based on the implementation of the existing tests, each method loads its own RSS feed/XDocument. This is great for isolation, but doesn’t work well when you put it all together, as loading a single feed will cause it to actually be fetched three times – that simply doesn’t scale.

    The solution was to have a LoadFeed method on the RssProcessor which would be called before GetFeed().  This could also allow for a single object to be used for returning multiple feeds.  Note for future: My first test should really have been for this method to make sure I can actually receive an object to work with before querying it.

    m_Rss.LoadFeed(TestHelper.url);

    Having this single method as the way to load and return the XML makes mocking very easy. Following the approaches in previous posts, I created myself an IFeedGetter object, with a property on my RssProcessor to inject a mock if required.  The load method then simply passes its call on to the IFeedGetter object.

    public XDocument LoadFeed(string url)
    {
        m_Url = url;
        m_Feed = m_FeedGetter.GetXDocument(url);
        return m_Feed;
    }
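    The IFeedGetter abstraction itself is small. This is a sketch of how the interface and the injection property on RssProcessor could look (the WebFeedGetter name and the property shape are my assumptions; LoadFeed is repeated from above for context):

```csharp
using System.Xml.Linq;

public interface IFeedGetter
{
    XDocument GetXDocument(string url);
}

// Default implementation used in production: goes to the network.
public class WebFeedGetter : IFeedGetter
{
    public XDocument GetXDocument(string url)
    {
        return XDocument.Load(url);
    }
}

public partial class RssProcessor
{
    private string m_Url;     // remembered for later queries
    private XDocument m_Feed;

    // Defaults to the real getter; tests overwrite this with a stub.
    private IFeedGetter m_FeedGetter = new WebFeedGetter();

    public IFeedGetter FeedGetter
    {
        get { return m_FeedGetter; }
        set { m_FeedGetter = value; }
    }

    public XDocument LoadFeed(string url)
    {
        m_Url = url;
        m_Feed = m_FeedGetter.GetXDocument(url);
        return m_Feed;
    }
}
```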

    I can then create a stub feed getter to be used during unit testing.

    public class StubFeedGetter : IFeedGetter
    {
        public System.Xml.Linq.XDocument GetXDocument(string url)
        {
            return XDocument.Load(System.Xml.XmlReader.Create(new System.IO.StringReader(GetFakeXml())));
        }

        // Returns a canned feed for the tests to query (illustrative content, trimmed for brevity).
        private string GetFakeXml()
        {
            return "<rss><channel><title>BBC Test Feed</title><item><title>Example item</title></item></channel></rss>";
        }
    }

    So that’s it – we have now unit tested our complete RSS implementation using mocks and Linq to XML!  If you need support in getting this to work, please give me a shout.

    Download Solution: HowToUnitTest_MockLinq-ToXml.zip


    What are the files the Linq to SQL designer creates?

    When I spoke about SQLMetal, the files it generates were pretty straightforward.  You ask for code, it generates code.  You ask for the mapping file, it generates you the XML mapping file, and asking for the dbml generates a very similar XML file.  So what files does the Linq to SQL designer built into Visual Studio generate?

    The main file which you interact with is the dbml file.  When you open this file, it launches the designer. However, the file actually holds XML.  If you right-click on the file and choose Open With, you can view it as XML.  You will then see that the file contains information about the entities included in the DataContext, including all the metadata for the tables, such as column types, names and other information.

    One of the other files is the .dbml.layout file, which also contains XML.  This file simply tells the designer how to lay out the entities on the design surface.  There’s not really any need to edit this by hand.

    The most important file is .designer.cs.  This contains all of the C# implementation for the DataContext and is generated based on the information in the designer/dbml file.  As the built-in designer only creates attribute-based files, all of the database mapping information is also in this file.

    Finally, when the project is compiled, the only file taken into consideration and included within the assembly is the .designer.cs which contains the DataContext code.


    LINQ In Action Book – Early Review

    While Linq isn’t officially due out until February, there are already a number of books available.  The book I have been reading is the Linq In Action book, which I have had access to via the Manning Early Access Program as a PDF ebook.  This is a good way to access the book, as you can download the nearly finished chapters at certain points while the book is being completed, which allows you to read it in blocks and gain a good idea of the technology.  When the book is finally finished, you will be sent either a printed copy or the final ebook version.

    On Monday, all of the chapters were made available to download via the program.  I still have to read the final few chapters; however, I just wanted to comment on how good I have found this book.  The book doesn’t just focus on Linq to SQL, but instead covers all aspects of the framework, so you will gain a great understanding.

    The book starts with an introduction to what Linq is and builds on this more and more throughout.  It provides a very good high-level overview of all the sections, but also goes deep into the inner workings of Linq to give you an in-depth understanding when and where it is required.  It also covers the advanced parts of Linq, such as extending Linq and using Linq in the different layers of an application, so you will be really ready to use the technology on your first project.

    No matter what level of understanding you have of Linq, whether you’re trying it for the first time or already have a good grasp of it, this is a really good book and you will gain a lot from reading it.  Once I get the final version of the book, I will give it another read.

    Table of Contents

    Part I – Getting started
     1. Introducing LINQ
     2. C# and VB.NET language enhancements 
     3. LINQ building blocks
    Part II – Querying objects in memory
     4. Getting familiar with LINQ to Objects 
     5. Working with LINQ and DataSets 
     6. Beyond basic in-memory queries
    Part III – Mapping objects to relational databases
     7. Getting started with LINQ to SQL
     8. Peeking under the covers
     9. Advanced LINQ to SQL features
    Part IV – Manipulating XML
     10. Introducing LINQ to XML 
     11. Querying and transforming XML 
     12. Common LINQ to XML scenarios
    Part V – LINQing it all together
     13. Extending LINQ
     14. LINQ in every layer

    The book should be in stores in January 2008, but you can purchase and download the ebook today. You can download the first chapter for free.

    http://www.manning.com/marguerie/

    The book website is http://linqinaction.net/


    LINQ to SQL Beta2 to RTM Key Changes

    Dinesh just posted a list on the forum of the key changes which will occur between Beta 2 and RTM for Linq to SQL.

    Read them here:

    http://forums.microsoft.com/MSDN/showpost.aspx?postid=2061468&siteid=1&&notification_id=27123913&message_id=27123913

    The main changes are the renaming of the Add/Remove methods to make them more descriptive.  They will now be called InsertOnSubmit() and DeleteOnSubmit().

    The OnValidate() method will now receive a ChangeAction value so you know whether you’re validating a delete, insert or update.  This will be very useful.
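    As a sketch of how that hook might be used (ChangeAction really lives in System.Data.Linq – it is redefined below only so the sketch stands alone, and the Customer entity and its phone-number rule are made up; in a real entity this would be the partial OnValidate method rather than a public one):

```csharp
using System;

// Stand-in for System.Data.Linq.ChangeAction, redefined so this
// sketch compiles on its own.
public enum ChangeAction { None, Delete, Insert, Update }

public partial class Customer
{
    public string Phone { get; set; }

    // In a real Linq to SQL entity this would be the partial
    // OnValidate(ChangeAction) hook generated by the designer.
    public void OnValidate(ChangeAction action)
    {
        // Deletes don't need a valid phone number; inserts and updates do.
        if (action == ChangeAction.Delete)
            return;
        if (string.IsNullOrEmpty(Phone))
            throw new InvalidOperationException("Phone number is required.");
    }
}
```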

    One interesting thing mentioned is that the next release will be RTM – they are skipping an RC. Tiscali (my ADSL provider at the moment) will be happy…
