remote desktop using both monitors without additional software

Liam Westley had a very interesting post today where he mentioned using the /span property on mstsc (Remote Desktop client) to enable dual screen remote desktop.  One of the things I hate about remote desktop when I connect to my work machine is that I can only use one monitor, this solves the problem.

To activate this, you need to run mstsc from the command prompt:

C:\Windows\System32\mstsc.exe /span

More information can be found on TechNet, where it says the switch "Matches the Remote Desktop width and height with the local virtual desktop, spanning across multiple monitors if necessary".

One annoyance: any window with its position set to centre screen will end up in the split between my two monitors.

Hope you find it useful…


Gift from Microsoft – Visual Studio 2008 Thank you

Last night when I got back home from a NxtGenUG session I had a gift from Microsoft for helping with the Visual Studio 2008 Beta (I think it was due to submitting a few bugs). The gift was a glass cube with an engraving.  Looks cool.

[Image: The Box]

[Image: The message]

[Image: The Cube]

The engraving doesn't really come out in the photo, but it's got Visual Studio 2008 and .NET Framework on it.  It's now happily sitting on my desk.  Thanks team!


Upgrading a project from Visual Studio 2005 to Visual Studio 2008

While Visual Studio 2008 supports multi-targeting, sadly you are still required to convert the solution to the 2008 format.  I will briefly discuss how to convert an existing project from 2005 to 2008.

Visual Studio 2005 ASP.net 2.0 Website

To start with, I will discuss how to convert an ASP.net website. I've got an existing C# website stored on my file system which contains just a single Default.aspx page saying "This is a Visual Studio 2005 website." (I didn't get very far with it).

Load Visual Studio 2008 (Beta 2), select File > Open > Website and choose the 2005 project.  When you click Open, the following dialog will be displayed.

[Image: ".NET Framework 2.0 Web Site Found" dialog]

Clicking Yes will load the website, add references to System.Core and System.Xml.Linq (not System.Data.Linq; you will need to add that one manually yourself) and target the .NET Framework 3.5.

Visual Studio 2005 Solution

For all other solutions, including ASP.net 2.0 solutions, when you load the solution (File > Open > Solution) you will be presented with the conversion wizard, the same one used when converting 2003 to 2005.

[Images: the four pages of the Visual Studio Conversion Wizard]

Upon clicking Close, your project will be loaded. However, it will still act and behave as a 2.0 application and still compile against the 2.0 framework.

[Image: 2005WinFormApplication open in Microsoft Visual Studio]

Inside the actual solution file, the version text changes from

Microsoft Visual Studio Solution File, Format Version 9.00
# Visual Studio 2005

to

Microsoft Visual Studio Solution File, Format Version 10.00
# Visual Studio 2008

Because of this, when you try to load the project again in Visual Studio 2005 you will be given the following error message.

[Image: Microsoft Visual Studio error dialog]

In order to add Linq functionality to the application you need to convert it to a .NET 3.5 application.  First, change the Target Framework to .NET Framework 3.5 and add a reference to the System.Core.dll assembly.  To use Linq to SQL you will also need to add System.Data.Linq.dll, and for Linq to XML, System.Xml.Linq.dll.
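Once the project targets .NET 3.5 and references System.Core.dll, query syntax compiles. A minimal sketch (unrelated to any real project, just to confirm the references are in place):

```csharp
using System;
using System.Linq;

class LinqSmokeTest
{
    // Requires .NET 3.5 and a reference to System.Core.dll to compile.
    public static int[] FilterAndSort(int[] numbers)
    {
        var query = from n in numbers
                    where n > 2
                    orderby n
                    select n;
        return query.ToArray();
    }

    static void Main()
    {
        foreach (int n in FilterAndSort(new[] { 5, 3, 8, 1 }))
            Console.WriteLine(n); // 3, 5, 8
    }
}
```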


What are the files the Linq to SQL designer creates?

When I wrote about SQLMetal, the files it generates were pretty straightforward.  You ask for code, it generates code.  You ask for the mapping file, it generates the XML mapping file, and asking for the dbml generates a very similar XML file.  So what files does the Linq to SQL designer built into Visual Studio generate?

The main file you interact with is the dbml file.  When you open this file, it launches the designer. However, the file actually holds XML.  If you right click on the file and choose Open With, you can view it as XML.  You will then see that the file contains information about the entities included in the DataContext.  This file holds all the metadata for the tables, such as column types, names and other information.

One of the other files is the .dbml.layout file, which also contains XML.  This file simply tells the designer how to lay out the entities on the design surface.  There is no real need to edit this by hand.

Finally, the most important file is the .designer.cs.  This contains all of the C# implementation for the DataContext and is generated based on the information in the designer/dbml file.  As the built-in designer only creates attribute-based files, all of the database information is also in this file.

When the project is compiled, the only file taken into consideration and included within the assembly is the .designer.cs, which contains the DataContext code.


Remove Recent Projects from Visual Studio 2008

Ever wanted to remove an item from the recent projects menu on the start page of Visual Studio?

The list is stored in the registry under:

HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\ProjectMRUList

Here, you will find a list like this:

File1 REG_EXPAND_SZ Path

File2 REG_EXPAND_SZ Path

File3 REG_EXPAND_SZ Path

File4 REG_EXPAND_SZ Path

You just need to delete the items you don't want.  Note: if you delete item 2, you will need to rename items 3 and 4 so there are no gaps in the numbering (3 becomes 2, 4 becomes 3).
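That renumbering step can be sketched as a small helper (hypothetical code; the helper and its names are mine, not part of Visual Studio):

```csharp
using System;
using System.Collections.Generic;

static class MruHelper
{
    // Given the surviving MRU paths in their original order, rebuild
    // the File1..FileN value names with no gaps in the numbering.
    public static Dictionary<string, string> Renumber(IList<string> paths)
    {
        Dictionary<string, string> values = new Dictionary<string, string>();
        for (int i = 0; i < paths.Count; i++)
        {
            values["File" + (i + 1)] = paths[i];
        }
        return values;
    }
}
```

So after deleting File2 from a list of four, the remaining three paths come back as File1 to File3.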

Another way is just to wait until you have opened more projects.  Also, if you delete the project solution from disk and then try to open it, Visual Studio will display a dialog asking if you want to remove it from the list.


Linq to SQL – Mapping Tables to Objects

In this post I am going to cover how Linq to SQL deals with the mappings between the objects in your assembly and the database.

There are two ways in which Linq to SQL handles the mapping: AttributeMappingSource or XmlMappingSource.  By default, Linq uses AttributeMappingSource, where all the information linking tables and columns to classes and properties is stored as attributes within the DataContext.

For example, this code links the Categories object to the dbo.Categories table.

[Table(Name="dbo.Categories")]
public partial class Categories : INotifyPropertyChanging, INotifyPropertyChanged

While this code links the CategoryID property to the column, also called CategoryID.

[Column(Storage="_CategoryID", AutoSync=AutoSync.OnInsert, DbType="Int NOT NULL IDENTITY", IsPrimaryKey=true, IsDbGenerated=true)]
public int CategoryID

There are a few advantages to having the attributes stored within the code. Reading the code is easier as everything is in a single place, and the compiler can verify that everything is correct.

The other choice is to use an external file which contains all the database information as XML.  This is then combined with a DataContext (one without attributes) in order to query the database.

The external mapping file can be generated by SQLMetal as I discussed in my previous post.  The following command would generate the file.

SqlMetal /server:. /database:northwind /map:"%~dp0Northwind.map"

The contents of the map file would look something like this (abbreviated to a single table; the full file lists every table):

<?xml version="1.0" encoding="utf-8"?>
<Database Name="Northwind" xmlns="http://schemas.microsoft.com/linqtosql/mapping/2007">
  <Table Name="dbo.Categories" Member="Categories">
    <Type Name="Categories">
      <Column Name="CategoryID" Member="CategoryID" Storage="_CategoryID" DbType="Int NOT NULL IDENTITY" IsPrimaryKey="true" IsDbGenerated="true" />
      <Column Name="CategoryName" Member="CategoryName" Storage="_CategoryName" DbType="NVarChar(15) NOT NULL" />
    </Type>
  </Table>
</Database>
We can then generate the DataContext code and tell it to use the external file by having both map and code as options, like below.

SqlMetal /server:. /database:northwind /map:"%~dp0Northwind.map" /code:"%~dp0Northwind.cs"

The DataContext would then contain just the plain code.

public partial class Categories : INotifyPropertyChanging, INotifyPropertyChanged

and

public int CategoryID

As with attributes, there are advantages to using an external file.  For one, you can make schema changes without having to recompile the application, although only minor changes are possible without breaking the mapping.  Another advantage is that the same mapping file can be used by multiple DataContext objects.  One possible disadvantage is the two becoming out of sync, which would result in runtime errors.

However, to use this in our application we need to specify that we want to use the external mapping source; we can do this when constructing the DataContext.

Northwind db = new Northwind(LinqConsole.Properties.Settings.Default.NorthwindConnectionString, XmlMappingSource.FromUrl("Northwind.map"));

So which one should you use?  It really depends on your requirements; for the most part, either one will be fine.

Finally, instead of SQLMetal connecting to the database directly to generate the mapping and DataContext, it can gain all the information from the dbml file.  The dbml file contains all the database metadata and everything SQLMetal requires to generate the code and mappings.

To generate the dbml you use this command:

SqlMetal /server:. /Database:Northwind /dbml:Northwind.dbml

This creates an XML file containing data like the following (abbreviated).  It looks similar to the mapping file, however it is slightly different; the dbml also records the CLR type of each member.

<?xml version="1.0" encoding="utf-8"?>
<Database Name="Northwind" xmlns="http://schemas.microsoft.com/linqtosql/dbml/2007">
  <Table Name="dbo.Categories" Member="Categories">
    <Type Name="Categories">
      <Column Name="CategoryID" Type="System.Int32" DbType="Int NOT NULL IDENTITY" IsPrimaryKey="true" IsDbGenerated="true" />
      <Column Name="CategoryName" Type="System.String" DbType="NVarChar(15) NOT NULL" CanBeNull="false" />
    </Type>
  </Table>
</Database>

Then, to use the dbml to generate the mapping file and code instead of a database connection, you would use this command:

SqlMetal /map:"%~dp0Northwind.map" /code:"%~dp0Northwind.cs" Northwind.dbml


The power of SQLMetal

During my time with Linq, I have created all of my DataContexts using the designer in Visual Studio 2008.  I've been aware of SQLMetal but always thought it was just a command line tool which didn't offer much.  After looking at it today, I was wrong: SqlMetal is very powerful and cool!

To start with, SqlMetal can generate a DataContext for your entire database with a single command.  This is very useful if you have a large number of tables in your system, as dragging and dropping them onto the designer would get boring very quickly. By entering this command we can have our Northwind DataContext created for us and saved to the file NorthwindDataContext.cs.

SqlMetal /server:. /database:Northwind /code:NorthwindDataContext.cs

We can then include the class within our project and use it as if the designer had created it.  However, it would have been nice if it accepted a list of tables to exclude, or only include, during the creation process.  It also doesn't create a constructor overload which uses the App.config connection string like the designer does.

We can also get SqlMetal to include all of our views, functions and sprocs

SqlMetal /server:. /database:Northwind /code:NorthwindDataContext.cs /views /functions /sprocs

If it cannot extract an item from the database into code, then it will continue with the process and report the error at the end.  In my case I had this single error:

warning SQM1014: Unable to extract stored procedure 'dbo.sp_upgraddiagrams' from SqlServer. Invalid object name 'dbo.dtproperties'.

I think that’s a really powerful feature and makes life a lot simpler than manually creating everything.

In the above example, we are asking SQLMetal to generate our DataContext as code. However, SQLMetal can also output the dbml (/dbml:file.dbml) or mapping (/map:file.xml) file for the database.

By default, the Linq to SQL designer adjusts the table names when naming entities (Orders => Order).  SQLMetal doesn't do this by default, however by adding /pluralize you can force the same behaviour.  I've spoken about this before.

You can define the language in which the code should be generated using the /language: option, and you can set the namespace the DataContext should be part of using the /namespace: option.

If you want a different name for your DataContext class you can use the /context: option; by default it uses the same name as the database.

Onto my favourite two options. The entitybase option (/entitybase:) allows you to specify a class or interface which all the entities in the datacontext must inherit from.

SqlMetal /server:. /database:northwind /code:"%~dp0Northwind.cs" /entitybase:MyNamespace.ITable

The code generated would then look like this:

[Table(Name="dbo.Categories")]
public partial class Categories : MyNamespace.ITable, INotifyPropertyChanging, INotifyPropertyChanged

This is a very useful feature; as discussed in the previous post, we should place all custom code in a partial class. I don't think interfaces are best used in this situation, as you would need to implement the members on every entity, however it could be useful for base classes.

Finally, the /serialization:Unidirectional option adds a [DataContract()] attribute to the classes and [DataMember(Order=#)] to the properties.  These two attributes allow the objects to be serialised, which means they can be used with WCF.
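The generated code then takes roughly this shape (a sketch using the Northwind Categories names; the exact output may differ):

```csharp
using System;
using System.Runtime.Serialization;

// Sketch of what /serialization:Unidirectional emits for an entity.
[DataContract()]
public partial class Categories
{
    private int _CategoryID;

    [DataMember(Order = 1)]
    public int CategoryID
    {
        get { return _CategoryID; }
        set { _CategoryID = value; }
    }
}
```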

Ben’s top tip

Use a classic batch file (.bat) to store your SQLMetal command. Then when you need to regenerate the DataContext you will not end up using different settings by mistake.  This batch file can then be included within your source control for the rest of the team to use.  Use the "%~dp0" variable to ensure it runs from the batch file's location and not the current command line location; generally this should be your project folder.

Given this command in a batch file, %~dp0 will be expanded to the path where the batch file is located.

SqlMetal /server:. /database:northwind /code:"%~dp0Northwind.cs"

Becomes

SqlMetal /server:. /database:northwind /code:"E:\PATH\TO\MY\PROJECTS\Northwind.cs"


Effects database changes have on Linq to SQL

In this post I'm going to discuss the effects that changing the underlying database table has on Linq to SQL, and to make people aware of the potential issues, as quite a few people seem to be asking about this.  Changes to the underlying database structure are a problem with any application and data access technology, be it ADO.net and DataSets, SubSonic, Linq or any other approach.  If the data structure is changed without your application being updated or knowing about the change, then you will run into problems.

In this application, I will be working with the following entity.

[Image: the Person table, with columns id, Name, PostCode and Phone]

Writing queries against this isn’t a problem.   For example, if we wanted to write everything out to the console we could do this:

var query = from p in db.Persons
                   select p;

foreach (var person in query)
{

    Console.WriteLine("{0} | {1} | {2} | {3}",
        person.id, person.Name, person.PostCode, person.Phone);
}

This could result in:

1 | Bob | AAA | 123
2 | Jack | BBB | 0
3 | Gill | MB | 12346

Adding additional columns

If we added an additional column, say Email, onto our table then the above code would still work fine; however, we would never be able to access the new column until we regenerated our DataContext (see below).

Inserting a new person also wouldn't stop our application from working (unless the new column was set to be NOT NULL); however, we wouldn't be able to fill in that information.

Person newP = new Person();
newP.Name = "Test";
newP.PostCode = "TTTT";
newP.Phone = new Random().Next();
db.Persons.Add(newP);
db.SubmitChanges();

Changing column data types

Changing the types in our database could cause our application to stop functioning. If we changed the int to a bigint in the table design, our query would no longer run because Linq couldn't convert a long to an int.  If we changed something like a nvarchar(50) to a nvarchar(MAX), this wouldn't cause a problem as Linq treats them both as a string.

Removing columns

Removing columns is where most problems will occur. If I removed the Phone column from the table and executed the query, I would receive a SqlException: Invalid column name 'Phone'.  If our query was like the one below, it wouldn't cause an exception as Phone is never requested.

var query = from p in db.Persons
                        select new { p.id, p.Name, p.PostCode };

However, when inserting data we will always get a SqlException, even if the column was not populated, as Linq will try to insert null into the column.

Regenerating our DataContext

So, if you change the underlying table, updating the DataContext in your system is also required.  In Visual Studio 2008 there are only two ways to update a DataContext: either regenerate the entire DataContext using SQLMetal, or remove the table from the DataContext using the designer and then insert it again.  Shame there isn't a refresh button on the designer.

Thankfully, all the objects are partial, so you can store all your custom logic in a separate class file which will not be affected by this.  However, if you go against this and write your logic in the actual DataContext.designer.cs class, you will lose those changes when you recreate it.
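A sketch of that pattern, using the Person entity from above (the HasName helper is hypothetical, just to show where custom logic lives):

```csharp
// Generated half - lives in DataContext.designer.cs and is recreated
// whenever the table is removed and re-added in the designer:
public partial class Person
{
    public int id { get; set; }
    public string Name { get; set; }
}

// Hand-written half - kept in its own file, e.g. Person.Custom.cs,
// so regenerating the DataContext leaves it untouched:
public partial class Person
{
    public bool HasName()
    {
        return !string.IsNullOrEmpty(Name);
    }
}
```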


Converting IEnumerable<decimal?> to IEnumerable<decimal>

This morning on the forum there was a question regarding nullable types (e.g. decimal?) and how to convert an IEnumerable collection containing nullable types into one containing the equivalent non-nullable type (decimal).  This raised an interesting question: how do you convert a collection containing one type of object into a collection containing another type, be it nullable to non-nullable or otherwise?

To start with, I looked at the extension methods of IEnumerable<>, and in particular Cast<>.  However, if a value is null then we need to handle it differently, which Cast<> doesn't have the ability to do.  So no luck there, and there didn't seem to be any other suitable methods.

I then looked at the methods which List<> contains, as it implements IEnumerable<>.  List<> does have a method for this requirement, called ConvertAll(), which takes a Converter delegate, which in turn points to a method called for each object in the list to handle the conversion.  This allowed me to write this code:

List<decimal> nonNull;
List<decimal?> nullable = new List<decimal?>();

nullable.Add(null);
nullable.Add(1);

nonNull = nullable.ConvertAll(new Converter<decimal?, decimal>(ConvertToNonNull));

foreach (var i in nonNull)
{
   Console.WriteLine(i);   
}

public static decimal ConvertToNonNull(decimal? nullValue)
{
    return nullValue.GetValueOrDefault(0);
}

ConvertToNonNull is called for every item in the list, taking advantage of generics to keep everything strongly typed.  GetValueOrDefault() is available on all nullable types; you simply give it the value you want returned if the value is null.

The problem is that you have to be working in terms of List<> and not IEnumerable<>.  The solution: create your own extension method for IEnumerable<>.

public static class IEnumerableExtension
{
     public static IEnumerable<TOutput> ConvertAll<T, TOutput>(this IEnumerable<T> collection, Converter<T, TOutput> converter)
     {
         if (converter == null)
             throw new ArgumentNullException("converter");

         List<TOutput> list = new List<TOutput>();

         foreach (T i in collection)
         {
             list.Add(converter(i));
         }

         return list;
     }
}

The extension method above attaches itself to IEnumerable<> and takes a Converter as a parameter, just the same as List<> does.  I first check that the converter isn't null, then create an internal list of the output type (we need a way to hold the converted items internally).  For each item in the source collection, I call the converter method.  Finally, I return the list with a return type of IEnumerable<>.  Here's the calling code.

IEnumerable<decimal> notNull;
IEnumerable<decimal?> INull;

INull = nullable; //Original list.

notNull = INull.ConvertAll(new Converter<decimal?, decimal>(ConvertToNonNull)); //My new extension method

foreach (decimal i in notNull)
{
    Console.WriteLine(i);
}

One of the nice things about extension methods is that if a method with the same name and parameter list is defined locally within the class, it takes precedence over the extension implementation.  So, the List<>.ConvertAll() functionality is not affected.
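A quick illustration of that rule, using hypothetical types:

```csharp
using System;

class Widget
{
    // Instance method with the same signature as the extension below.
    public string Describe()
    {
        return "instance";
    }
}

static class WidgetExtensions
{
    // Never chosen when Widget defines its own Describe() - instance
    // members always win over extension methods during overload resolution.
    public static string Describe(this Widget w)
    {
        return "extension";
    }
}

class Program
{
    static void Main()
    {
        Console.WriteLine(new Widget().Describe()); // prints "instance"
    }
}
```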

Hope you find a use for this.


Linq to SQL – DataLoadOptions – Does it improve performance?

Following on from my previous post on DataLoadOptions, I decided to do a quick test to see if it does actually improve performance, even with its limitations.

If we take this query, with DataLoadOptions set only a single query is executed, while without it a query is executed per customer to fetch that customer's orders.

DataLoadOptions options = new DataLoadOptions();
options.LoadWith<Customer>(Customer => Customer.Orders);
db.LoadOptions = options;

var query = from c in db.Customers
            select c;

foreach (var c in query)
{
    Console.WriteLine("{0}, Phone: {1} with {2} orders", c.ContactName, c.Phone, c.Orders.Count);
}

Without using DataLoadOptions, it took 1,739,904 Ticks to execute, while with them it took 1,487,722 Ticks. An improvement, but nothing special.
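For reference, tick counts like these can be captured with System.Diagnostics.Stopwatch; a minimal sketch of the measurement, not the exact benchmark code used here:

```csharp
using System;
using System.Diagnostics;

class QueryTimer
{
    static void Main()
    {
        Stopwatch watch = Stopwatch.StartNew();

        // ...execute the query and enumerate the results here...

        watch.Stop();
        Console.WriteLine("{0} Ticks", watch.ElapsedTicks);
    }
}
```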

Now consider this query which, as covered in the previous post, causes a query to be executed for each order even with DataLoadOptions on:

DataLoadOptions options = new DataLoadOptions();
options.LoadWith<Customer>(Customer => Customer.Orders);
options.LoadWith<Order>(Order => Order.Order_Details);
db.LoadOptions = options;

var query = from c in db.Customers
            select c;

foreach (var c in query)
{
    Console.WriteLine("{0}, Phone: {1}", c.ContactName, c.Phone);
    foreach (var o in c.Orders)
    {
        Console.WriteLine("Order Details for OrderID: {0} with {1} lines", o.OrderID, o.Order_Details.Count);

        foreach (var od in o.Order_Details)
        {
            Console.WriteLine("Product ID: {0}", od.ProductID);
        }
    }
}

Without using DataLoadOptions, it took 10,003,725 Ticks to execute.  While using DataLoadOptions, taking into account the time to set up the options, it took 6,536,320 Ticks.  Much clearer performance improvements when a larger number of queries are being executed.

There are definite performance improvements to be had, even though it cannot load all the data in at the same time, so DataLoadOptions should be used when executing these kinds of queries.

However, what happens if we are not using the Order/Order Details information and are simply using customer data?  Executing our first query again but without c.Orders.Count: with options set it takes 1,609,867 Ticks, without them it takes 1,174,217 Ticks.  So, if you're not using the related data, there is additional overhead to be aware of.
