Nov 06, 2013
 

Introduction

Hi all.  I’ve been working with the old ASP.NET Dynamic Data website templates, as I wanted a quick and easy to use web UI for managing the data at http://rs-photography.org.

As the site is (obviously) image-centric, I wanted to make some modifications to the out-of-the-box Dynamic Data project to include native image rendering/loading and updating.

I thought this was going to be relatively straightforward, given that there is a NuGet package encouragingly named “Dynamic Data Database Embedded Image Field Template”, as well as a package called “Microsoft.Web.DynamicData.Handlers” which promises an “Asp.Net image handler for rendering images from database”.


The good news is that the implementation is relatively straightforward, but clear instructions are sadly lacking once the packages are successfully added.  This post will clarify exactly what you must do to get images displaying in your Dynamic Data site.

Note: the only references I could find online were quite old (i.e. 2007-2008), and pre-dated the NuGet packages, hence I felt the need to write this article.

The Scenario – NuGet Packages installed

I’m going to assume that if you are reading this article, you’re familiar with Dynamic Data websites.  If you aren’t, I’d suggest you get up to speed by reading a couple of tutorials (e.g. this one) – it’s a cinch to get a site built, and takes only a couple of minutes.

Once you have your site loading correctly, you’ll need to ensure you have a column in a table which is of SQL type image.  The Entity Framework models this (correctly) as a property of type byte array (byte[]).  By default, the Dynamic Data templates don’t have a field template for the byte array type, so the column is ignored when scaffolding occurs.

Assuming you have the NuGet packages installed, you’ll automatically have two new field templates added to the DynamicData folder, like so (left).

Your web.config file will also be updated to include an HTTP handler for image requests, like so:

<system.webServer>
    <handlers>
      <add name="ImageHandler" path="ImageHandler.ashx" verb="*"
           type="Microsoft.Web.DynamicData.Handlers.ImageHandler"/>
    </handlers>
</system.webServer>

The handler is implemented in a binary which is included in the project – Microsoft.Web.DynamicData.Handlers.
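For context, the handler’s job is conceptually simple: read the byte[] out of the database and write it to the response.  The following is a rough sketch of such a handler – it is not the actual Microsoft.Web.DynamicData.Handlers implementation, and the query-string parameter names and data-access call are assumptions for illustration only:

```csharp
using System.Web;

// Sketch only - NOT the shipped Microsoft.Web.DynamicData.Handlers code.
// Query-string parameter names (table/column/key) are hypothetical.
public class SketchImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // e.g. ImageHandler.ashx?table=Files&column=ImageData&key=42 (hypothetical)
        string table = context.Request.QueryString["table"];
        string column = context.Request.QueryString["column"];
        string key = context.Request.QueryString["key"];

        byte[] imageBytes = LoadImageBytes(table, column, key);

        if (imageBytes == null || imageBytes.Length == 0)
        {
            context.Response.StatusCode = 404;
            return;
        }

        context.Response.ContentType = "image/jpeg"; // or sniff the content type
        context.Response.BinaryWrite(imageBytes);
    }

    private byte[] LoadImageBytes(string table, string column, string key)
    {
        // Placeholder: query the byte[] column for the given row.
        throw new System.NotImplementedException();
    }
}
```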


The Data Schema

Now, if you run up the site and drill into the table you’ve defined your image field in, you’ll probably notice the image column is missing.  As stated above, because of the data type, there’s no field template for it by default.  This is where you need to do some work to help the scaffolding and field templates map properly.

Here’s my table schema and entity definitions so you can see exactly what I’ve done:

The “Files” Table:

CREATE TABLE [dbo].[Files] (
    [FileId]       INT            IDENTITY (1, 1) NOT NULL,
    [Filename]     NVARCHAR (100) NULL,
    [CatalogId]    INT            NULL,
    [SizeId]       INT            NULL,
    [IsDefault]    BIT            DEFAULT ((0)) NOT NULL,
    [Height]       INT            NULL,
    [Width]        INT            NULL,
    [ImageData]    IMAGE          NULL,
    [HasImageData] BIT            CONSTRAINT [DF_Files_HasImageData] DEFAULT ((0)) NOT NULL,
    PRIMARY KEY CLUSTERED ([FileId] ASC),
    CONSTRAINT [FK_Files_Catalog] FOREIGN KEY ([CatalogId]) REFERENCES [dbo].[Catalog] ([CatalogId]),
    CONSTRAINT [FK_Files_Sizes] FOREIGN KEY ([SizeId]) REFERENCES [dbo].[Sizes] ([SizeId])
);

The File class (entity):

public partial class File
{
    public int FileId { get; set; }
    public string Filename { get; set; }
    public Nullable<int> CatalogId { get; set; }
    public Nullable<int> SizeId { get; set; }
    public bool IsDefault { get; set; }
    public Nullable<int> Height { get; set; }
    public Nullable<int> Width { get; set; }
    public byte[] ImageData { get; set; }
    public bool HasImageData { get; set; }
    
    public virtual Catalog Catalog { get; set; }
    public virtual Size Size { get; set; }
}

Adding Attributes

You’ll need to extend on the partial class(es) in your Entity Framework model.  You need to keep the column definitions the same, so you need to just extend the class for metadata purposes (i.e. attributes).  If you add the attributes directly to the generated classes, they’ll be overwritten when you refresh the data model.

To add the appropriate attributes, I did the following:

using System.ComponentModel.DataAnnotations;

 

[MetadataType(typeof(File_MD))]
public partial class File { }

public partial class File_MD
{
    [ScaffoldColumn(true)]
    [UIHint("Image")]
    [ImageFormat(100, 100)]
    public byte[] ImageData { get; set; }
}

Examining the attributes – the important one is [UIHint], as this provides the scaffolding with the name of the field template to use (suffixes/prefixes are applied automatically); so by specifying “Image” we are implying the use of Image.ascx or Image_Edit.ascx.  To learn more about how scaffolding works, check out the following MSDN article.
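To make that naming convention concrete, a read-only field template along the lines of Image.ascx might have a code-behind roughly like this.  This is a sketch only – the real template ships in the NuGet package, and the handler URL format and control names here are assumptions:

```csharp
using System;
using System.Web.DynamicData;

// Sketch of a read-only Image field template code-behind (Image.ascx).
// The handler URL format below is an assumption for illustration.
public partial class ImageField : FieldTemplateUserControl
{
    protected System.Web.UI.WebControls.Image imageControl; // declared in the .ascx markup

    protected void Page_Load(object sender, EventArgs e)
    {
        byte[] data = FieldValue as byte[];
        if (data != null && data.Length > 0)
        {
            // Point an <img> at the registered handler, keyed by the row's PK.
            imageControl.ImageUrl = "~/ImageHandler.ashx?table=" + Column.Table.Name +
                "&column=" + Column.Name + "&key=" + GetSelectedKeyValue();
        }
        else
        {
            imageControl.Visible = false;
        }
    }

    private string GetSelectedKeyValue()
    {
        // Placeholder: resolve the primary key of the current row.
        throw new NotImplementedException();
    }
}
```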

Note that the extended partial class must be in the same namespace as the data model.

Once these attributes are in place and the site is reloaded – given a row that has data in the image column – we should see something more favourable:


Presto!  Nothing more to do.  We can also now use a file picker to upload an image for new and existing records:


..and here it is in the SQL table:

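For the curious, on the upload side the edit template gets the posted file into the column via the field template’s value extraction.  A sketch of the idea (the control name and the guard against wiping existing data are my assumptions, not the package’s actual code):

```csharp
using System.Collections.Specialized;
using System.Web.DynamicData;
using System.Web.UI.WebControls;

// Sketch of an edit-mode field template (Image_Edit.ascx) turning an
// uploaded file into the byte[] for the image column. Assumptions noted above.
public partial class ImageEditField : FieldTemplateUserControl
{
    protected FileUpload fileUpload; // declared in the .ascx markup

    protected override void ExtractValues(IOrderedDictionary dictionary)
    {
        // Only overwrite the column when a new file was actually chosen,
        // so editing other fields doesn't wipe the stored image.
        if (fileUpload.HasFile)
        {
            dictionary[Column.Name] = fileUpload.FileBytes;
        }
    }
}
```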

If you want to make your column naming a bit more pleasing, add the following attribute to the column definition (in your metadata extension):

[Display(Name="Image")] 


Summary

That’s it.  It comes down to getting the entity property attributes set properly, and the rest is easy.  The NuGet packages essentially add image handling support to Dynamic Data sites out of the box; the last part – interpreting your data model – is something a NuGet package can’t do for you.

I’ve tested this with IIS Express, Visual Studio 2013, SQL Server 2012 and Entity Framework v5 (the last version compatible with Dynamic Data sites).  Note that Entity Framework v6 will not work with Dynamic Data sites due to its namespace refactoring.

Enjoy.

Useful Links

http://www.olegsych.com/2010/09/understanding-aspnet-dynamic-data-entity-templates/

Jul 25, 2012
 

Hi All.  This is a quick post to introduce you to SQL Server Data Tools – support and tools for database developers.


Introduction

Recently, I started a new solution in Visual Studio 2010.  There is a need to build and maintain a database schema (for SQL Server 2008 R2), so I decided to add what was once formerly known as “DataDude” – the Database Project for Visual Studio.

This was in a copy of Visual Studio Professional 2010 with Service Pack 1 – and the out-of-the-box solution only supported SQL Server 2008 and prior.  A bit surprised, I did some digging.  You’ll recall there was a ‘Database Edition GDR’ which came out a few years back… well, there’s now an even better flavour of support.

Introducing SQL Server Data Tools

It’s called ‘Microsoft SQL Server Data Tools’ and you can get a copy from the following link on MSDN.  There are quite a number of new bits and pieces included, and it works with both Visual Studio 2010 (alert: apply Service Pack 1 beforehand) and Visual Studio 2012 (RC – although with some known issues if upgrading from the beta).

Although the installation takes a little while (depending on your connection speed), the wait is worthwhile.


I’ll borrow some text from the MSDN site in order to explain the purpose of SSDT:

Who is SSDT for, and what does it provide them?

SSDT is for SQL Server database developers, who often develop database schemas, views, stored procedures, and other database objects while developing their application logic.

  • Tooling for both SQL Server and SQL Azure Development: SSDT offers new capabilities in a single cohesive environment to compile, refactor, and deploy databases to specific editions of SQL Server and SQL Azure. The toolset makes it easy, for example, to migrate on-premise SQL Server schemas to the cloud on SQL Azure, and develop and maintain databases across both on premise and cloud deployments. SSDT can target SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, SQL Server 2012, and SQL Azure databases, including all editions of these database servers.
  • For SQL Server DBAs: SSDT provides a central and unified toolset targeted to the specific needs of DBAs to develop and maintain databases, with visual tools for developing tables, schema compare, and rich T-SQL support for refactoring databases, building views, stored procedures, functions and triggers. The toolset provides both a live development mode, and an offline project mode that tracks and manages all artifacts associated with a database. This mode optionally fully integrates with Visual Studio 2010 for team development, source control and change tracking. All change operations are automatically transformed into optimized T-SQL alter scripts, and can optionally be applied immediately to the online database or saved for later execution.

[http://msdn.microsoft.com/en-us/data/hh322942]

It’s actually very easy to use. 

Real World Applications

For example, if you open a solution containing legacy database projects the tools will automatically prompt you as to whether you wish to upgrade your existing database projects to the newer edition.

The basic benefit is targeting SQL Server 2008 (and R2) as well as SQL Server 2012 and SQL Azure.  That last part might get your attention! That’s right – SQL Azure.  We’ll be checking this out soon and reporting back in a bit more detail. 

How Do I get SSDT?

You can download the “pre-installer” here.

For more information, check out the ‘Getting Started with SQL Data Tools’ located here.. or stay tuned at this location for more!

But wait.. there’s a bug?

Today I got into full swing with the SSDT and a real world project.  During my work I came across a fairly horrible bug which has been documented on the MSDN Forums.

Basically, if you initialize (i.e. use) any of the SSDT tools you can’t use the Entity Framework tools and vice-versa.  The interim workaround is to load two instances of Visual Studio (perhaps even the same solution?) and use the SSDT tools in the first instance, and the Entity Framework tools in the other.

A symptom which might lead you to this article?  When you have the SQL Server Data Tools installed and try to update or create an Entity Framework data model, you may receive errors such as ‘Object not set’, and in some cases it might crash the VS IDE.  In my experience, I received the ‘object not set’ errors, the Model Explorer was greyed out/did not render, and I was not able to refresh my EDMX properly.

Whilst being a right royal pain in the butt, believe it or not this approach does work – even if it does make for a very disjointed development experience.  According to the thread, a fix is in the works – but no word on when it will be released.

Aug 23, 2011
 

Continuing on from Part 1, our objective in this article is to create a new Visual Studio solution, and then populate it with some existing projects.  To facilitate this, it would be beneficial if you had a few projects already created.  For this example, I have defined the following structure:

C:\UnitTesting – Root Folder

C:\UnitTesting\TestSolution – An existing folder which contains subdirectories which hold project files.

The sample structure is as follows:


So basically, there are three projects, two of which reference each other.  Our goal is to create a new solution file in the root directory which is then populated with references to the existing project files.

Please bear in mind that this is all demo code, and lacks the normal hardening and checks which production code should have.  Please use wisely, as there are no refunds.. ha!

I’ve reworked the previous example to be a little more robust, as follows:

  1. Encapsulated the functionality to create a new blank solution file in a method called CreateSolution().
  2. Added a new function called AddProjects, which takes a path plus details about the target solution file.

I’ve moved the DTE2 object to be a static object, and refactored CreateSolution thus:

private static DTE2 _dte2 = (DTE2)Microsoft.VisualBasic.Interaction.CreateObject("VisualStudio.DTE.10.0", "");

public static void CreateSolution(string path, string solutionName)
{
    Solution4 solutionObject = (Solution4)_dte2.Solution;
    solutionObject.Create(path, solutionName);
    solutionObject.Close(true);
}

Unit testing for this was easy enough:

[TestMethod]
public void CreateEmptySolution()
{
    SolutionHelper.CreateSolution(@"C:\UnitTesting\", "UnitTestSolution");
}

[TestMethod]
public void AddProjects()
{
    File.Delete(@"C:\UnitTesting\UnitTestSolution.sln");
    SolutionHelper.AddProjects(@"C:\UnitTesting\UnitTestSolution.sln", @"C:\UnitTesting\TestSolution", "*.*proj");
}

As you can see, all pretty straightforward.  So reviewing our CreateSolution function, it now just takes a path and solution name (omit the “.sln” extension) and creates the empty solution.

The next step is to iterate and find projects, and finally – add them to a solution file.  I’ve added some logic which creates the target Solution File if it does not exist.

The AddProjects function is pretty easy to understand:

public static void AddProjects(string solutionFile, string projectPath, string projectWildCardMatch)
{
    if (!File.Exists(solutionFile))
    {
        CreateSolution(Path.GetDirectoryName(solutionFile), Path.GetFileName(solutionFile));
    }

    Solution4 solutionObject = (Solution4)_dte2.Solution;
    solutionObject.Open(solutionFile);

    var fileNames = Directory.GetFiles(projectPath, projectWildCardMatch, SearchOption.AllDirectories);

    foreach (string projectFile in fileNames)
    { 
        solutionObject.AddFromFile(projectFile, false);
    }

    solutionObject.Close(true);
}

Hopefully you can see how easy it is to add projects to the solution.  The boolean value in the call to Solution4.AddFromFile is important: passing false adds the project to the currently loaded solution, whereas (in my experience) passing true attaches it to the project’s original solution.  Make sure you save the solution when closing it.

If we open the generated solution file in Visual Studio 2010, you’ll see that not only are the projects added, but they also show the references:


Here is a complete copy of the source file:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using EnvDTE80;
using EnvDTE100;
using System.IO;
using EnvDTE;
using System.Diagnostics;

namespace VSAutomation.Toolkit
{
    public static class SolutionHelper
    {
        private static DTE2 _dte2 = (DTE2)Microsoft.VisualBasic.Interaction.CreateObject("VisualStudio.DTE.10.0", "");

        public static void CreateSolution(string path, string solutionName)
        {
            Solution4 solutionObject = (Solution4)_dte2.Solution;
            solutionObject.Create(path, solutionName);
            solutionObject.Close(true);
        }

        public static void AddProjects(string solutionFile, string projectPath, string projectWildCardMatch)
        {
            if (!File.Exists(solutionFile))
            {
                CreateSolution(Path.GetDirectoryName(solutionFile), Path.GetFileName(solutionFile));
            }

            Solution4 solutionObject = (Solution4)_dte2.Solution;
            solutionObject.Open(solutionFile);

            var fileNames = Directory.GetFiles(projectPath, projectWildCardMatch, SearchOption.AllDirectories);

            foreach (string projectFile in fileNames)
            { 
                solutionObject.AddFromFile(projectFile, false);
            }

            solutionObject.Close(true);
        }
    }
}
Aug 08, 2011
 

Now for something completely different.  Recently I had the desire to build a Visual Studio 2010 solution file for a number of project files under a directory.  There were many projects, and each one linked to a solution file – but I wanted a “master” solution file.. one to rule them all(?)

Generally, I hate manual repetitious tasks, and laboriously adding all the projects to a new, blank solution file seemed tedious and not fun one bit.  Then I decided that I’d write a program which would iterate all the projects and create a solution file for me.  Sounds simple enough, right?

Well, it took me on a journey into Visual Studio automation.  There are many facets of working with VS, as I found out, and backwards compatibility reigns supreme.  What I’m going to introduce you to is pretty much just for Visual Studio 2010, but in the right hands a design could come which supports versions dating back to Visual Studio 2005.

The Prerequisites

Optional Extras

Once you’ve got your system all nicely configured, we’re ready to begin.

The Beginning

To begin with, I created a Class Library.  My intention was to create a large amount of nice, reusable automation for integration into Team Build, utilities and so forth.  You never know when this is going to be very, very handy.

To start our journey, I decided that my requirement was going to be very simple (for Part 1): create a blank solution file using Visual Studio 2010.  Easy enough, right?  Well, as you’d expect, the devil is in the details.  This took me quite a bit of time to get right, but in the end the outcome was most pleasing.

So open up Visual Studio, create a project of the type you’d prefer and then ready the project, like so:

  1. Add a reference
  2. Pick the COM tab
  3. Scroll and select “Microsoft Development Environment 10.0”
  4. This will add several COM interop assemblies to your project (and also include the previous versions)
  5. Click on the .Net tab and select “Microsoft.VisualBasic” (even if you aren’t coding in Visual Basic)


Now it is time to go code us up a simple solution.  I’m going to be working on a far more robust library, but in the meantime this will introduce you to the extremely powerful DTE and DTE2 objects.

We have to (sadly) instantiate a COM object wrapper, which will actually load devenv in a non-interactive mode, from which we can then cherry pick functionality we’d like to use.  For this demo, we’re just going to create a Visual Studio 2010 compatible solution file.

Consider the following very simple class:

using EnvDTE80;
using EnvDTE100;

namespace VSAutomation.Toolkit
{
    public static class SolutionHelper
    {
        public static void CreateSolution(string path, string solutionName)
        {
            DTE2 dte2 = (DTE2)Microsoft.VisualBasic.Interaction.CreateObject("VisualStudio.DTE.10.0", "");
            Solution4 solutionObject = (Solution4)dte2.Solution;
            solutionObject.Create(path, solutionName);
            solutionObject.Close(true);
        }
    }
}

You will notice that we create a COM object instance using the VisualBasic namespace.  When the code actually creates the object, under the hood, the machine will see an instance of Visual Studio created (but no GUI):


Once it is loaded, we can get access to the VS Solution functionality.  From here, it is a simple matter of supplying the path and solution name (omit the .sln extension) and, assuming all goes to plan, you will have the following output:


Obviously this is just the tip of the iceberg. 

We have so much powerful functionality to exploit, so we won’t stop at this.  Since we’ve achieved our objective for Part 1, we’ll wrap up here, but check back for Part 2 where we’ll populate the solution file with existing (or new) projects!

In the next part, we’ll also discuss ways of supporting the older versions (Visual Studio 2008 and 2005) as well as providing a more verbose sample.
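As a taste of that, version support mostly comes down to the COM progID, which encodes the Visual Studio version.  A sketch follows – the version-to-progID mapping is my assumption, and note that the Solution4 interface is specific to Visual Studio 2010 (older versions expose Solution2/Solution3 instead):

```csharp
using EnvDTE80;

// Sketch: pick the DTE progID for a given Visual Studio version.
// Mapping assumed: VS 2005 = 8.0, VS 2008 = 9.0, VS 2010 = 10.0.
public static class DteFactory
{
    public static DTE2 Create(string visualStudioVersion)
    {
        string progId;
        switch (visualStudioVersion)
        {
            case "2005": progId = "VisualStudio.DTE.8.0"; break;
            case "2008": progId = "VisualStudio.DTE.9.0"; break;
            case "2010": progId = "VisualStudio.DTE.10.0"; break;
            default:
                throw new System.ArgumentException(
                    "Unsupported Visual Studio version: " + visualStudioVersion);
        }
        return (DTE2)Microsoft.VisualBasic.Interaction.CreateObject(progId, "");
    }
}
```

With this in place, the SolutionHelper methods could take the DTE2 as a parameter rather than holding a static instance pinned to version 10.0.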


Further Reading

How to: Get References to the DTE and DTE2 Objects

Solution4.Create Method (String, String)

May 10, 2011
 

If you are migrating projects from previous versions of Visual Studio, you might intend to upgrade the project files to be compatible with Visual Studio 2010, but you may not want to necessarily upgrade to the latest version of the .Net Framework (v4.0) until you have properly tested compatibility.

Fair enough, but unless you have installed Visual Studio 2010 Service Pack 1, you might have some trouble with Unit Tests.  Until Service Pack 1 was released, upgrading would automatically retarget Unit Test projects to the .Net Framework v4.0, which could be a problem, especially if you have linked dependencies between projects.  To keep Unit Tests compiling against the .Net Framework 3.5, you need to do the following:

The MSDN article located here: http://msdn.microsoft.com/library/gg601487.aspx outlines the steps required – after you have installed Visual Studio 2010 Service Pack 1 (which is a prerequisite) – to allow Visual Studio Unit Test projects to target the previous version of the .Net Framework.

If you are in my position, where the TFS Power Tools were installed (prior to the installation of Service Pack 1) and modified the devenv.exe.config file, you will have to make some changes manually.  I highly recommend you back up the config file before making any manual changes.

There is a slight typo in the MSDN article though: at step #6, there should be a space after the “add” in the <appSettings> entry, i.e.

<appSettings>
     <add key="TestProjectRetargetTo35Allowed" value="true" />  
</appSettings>

..and this entry should be added after the <configSections> block (if you have one).

Then, you may add the following after the requisite <assemblyBinding> section element:

<dependentAssembly>
    <assemblyIdentity name="Microsoft.VisualStudio.QualityTools.UnitTestFramework" publicKeyToken="b03f5f7f11d50a3a" culture="neutral"/>
    <bindingRedirect oldVersion="10.1.0.0" newVersion="10.0.0.0"/>
</dependentAssembly>
<dependentAssembly>
    <assemblyIdentity name="Microsoft.VisualStudio.QualityTools.Tips.UnitTest.Adapter" publicKeyToken="b03f5f7f11d50a3a" culture="neutral"/>
    <bindingRedirect oldVersion="10.1.0.0" newVersion="10.0.0.0"/>
</dependentAssembly>
<dependentAssembly>
    <assemblyIdentity name="Microsoft.VisualStudio.QualityTools.Tips.UnitTest.ObjectModel" publicKeyToken="b03f5f7f11d50a3a" culture="neutral"/>
    <bindingRedirect oldVersion="10.1.0.0" newVersion="10.0.0.0"/>
</dependentAssembly>
<dependentAssembly>
    <assemblyIdentity name="Microsoft.VisualStudio.QualityTools.Tips.UnitTest.Tip" publicKeyToken="b03f5f7f11d50a3a" culture="neutral"/>
    <bindingRedirect oldVersion="10.1.0.0" newVersion="10.0.0.0"/>
</dependentAssembly>

If you are unsure whether your configuration is valid: should you find you are unable to connect to a TFS server, you know that you have borked your devenv.exe.config file!

Note:  I had trouble with this when I initially changed the Target Version.  When I reverted to .Net Framework v3.5, Visual Studio 2010 would force me through the Project Migration Wizard, which changed the Target Version back to .Net Framework 4.0!

Around we went in circles, until I stumbled across a Microsoft Connect entry which listed the following workaround (which I’ve tested successfully):

  1. Unload the test project
  2. Edit the xxxx.csproj file
  3. Remove {3AC096D0-A1C2-E12C-1390-A8335801FDAB}; from <ProjectTypeGuids>
  4. Change the TargetFrameworkVersion to v3.5 and save
  5. Reload the project (the migration wizard is still prompted)
  6. Change the reference to Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll as specified in the other workaround.

Note: you can try to re-add the ProjectTypeGuid, but VS 2010 will convert the project to .Net 4 again when you reload the project in step 5.
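To make steps 3 and 4 concrete, here’s what the relevant .csproj fragment looks like before and after (the test-project GUID is from the list above; the second GUID shown is the standard C# project type GUID, included here for illustration – your project’s remaining GUIDs may differ):

```xml
<!-- Before: the test-project type GUID is present -->
<ProjectTypeGuids>{3AC096D0-A1C2-E12C-1390-A8335801FDAB};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
<TargetFrameworkVersion>v4.0</TargetFrameworkVersion>

<!-- After steps 3 and 4 -->
<ProjectTypeGuids>{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
<TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
```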