Nov 06 2013
 

Introduction

Hi all.  I’ve been working with the old ASP.NET Dynamic Data website templates, as I wanted a quick, easy-to-use web UI for managing the data at http://rs-photography.org.

As the site is (obviously) image-centric, I wanted to make some modifications to the out-of-the-box Dynamic Data project to include native image rendering/loading and updating.

I thought this was going to be relatively straightforward, given that there is a NuGet package encouragingly called “Dynamic Data Database Embedded Image Field Template” as well as a package called “Microsoft.Web.DynamicData.Handlers” which promises “Asp.Net image handler for rendering images from database”.


The good news is that the implementation is relatively straightforward; the bad news is that clear instructions are lacking once the packages have been added.  This post will clarify exactly what you must do to get images displaying in your Dynamic Data site.

Note: the only references I could find online were quite old (i.e. 2007-2008), and pre-dated the NuGet packages, hence I felt the need to write this article.

The Scenario – NuGet Packages installed

I’m going to assume that if you are reading this article, you’re familiar with Dynamic Data websites.  If you aren’t, I’d suggest you get up to speed by reading a couple of tutorials (e.g. this one) – it’s a cinch to get a site built, and takes only a couple of minutes.

Once you have your site loading correctly, you’ll need to ensure you have a column in a table which is of SQL type image.  The Entity Framework models this (correctly) as a property of type byte array (byte[]).  By default, the Dynamic Data project doesn’t include a field template for the byte array type, so the column is ignored when scaffolding occurs.

Assuming you have the NuGet packages installed, you’ll automatically have two new field templates (Image.ascx and Image_Edit.ascx) added to the DynamicData folder.

Your web.config file will also be updated to include an HTTP handler for image requests, like so:

<system.webServer>
    <handlers>
      <add name="ImageHandler" path="ImageHandler.ashx" verb="*"
           type="Microsoft.Web.DynamicData.Handlers.ImageHandler" />
    </handlers>
</system.webServer>

The handler is implemented in a binary which is included in the project – Microsoft.Web.DynamicData.Handlers.


The Data Schema

Now, if you run up the site and drill into the table you’ve defined your image field in, you’ll probably notice the image column is missing.  As stated above, because of the data type there’s no matching field template by default.  This is where you need to do some work to help the scaffolding and field templates map properly.

Here’s my table schema and entity definitions so you can see exactly what I’ve done:

The “Files” Table:

CREATE TABLE [dbo].[Files] (
    [FileId]       INT            IDENTITY (1, 1) NOT NULL,
    [Filename]     NVARCHAR (100) NULL,
    [CatalogId]    INT            NULL,
    [SizeId]       INT            NULL,
    [IsDefault]    BIT            DEFAULT ((0)) NOT NULL,
    [Height]       INT            NULL,
    [Width]        INT            NULL,
    [ImageData]    IMAGE          NULL,
    [HasImageData] BIT            CONSTRAINT [DF_Files_HasImageData] DEFAULT ((0)) NOT NULL,
    PRIMARY KEY CLUSTERED ([FileId] ASC),
    CONSTRAINT [FK_Files_Catalog] FOREIGN KEY ([CatalogId]) REFERENCES [dbo].[Catalog] ([CatalogId]),
    CONSTRAINT [FK_Files_Sizes] FOREIGN KEY ([SizeId]) REFERENCES [dbo].[Sizes] ([SizeId])
);

The File class (entity):

public partial class File
{
    public int FileId { get; set; }
    public string Filename { get; set; }
    public Nullable<int> CatalogId { get; set; }
    public Nullable<int> SizeId { get; set; }
    public bool IsDefault { get; set; }
    public Nullable<int> Height { get; set; }
    public Nullable<int> Width { get; set; }
    public byte[] ImageData { get; set; }
    public bool HasImageData { get; set; }
    
    public virtual Catalog Catalog { get; set; }
    public virtual Size Size { get; set; }
}

Adding Attributes

You’ll need to extend the partial class(es) in your Entity Framework model.  The column definitions must stay the same, so you extend the class purely for metadata purposes (i.e. attributes).  If you add the attributes directly to the generated classes, they’ll be overwritten when you refresh the data model.

To add the appropriate attributes, I did the following:

using System.ComponentModel.DataAnnotations;

 

[MetadataType(typeof(File_MD))]
public partial class File { }

public partial class File_MD
{
    [ScaffoldColumn(true)]
    [UIHint("Image")]
    [ImageFormat(100, 100)]
    public byte[] ImageData { get; set; }
}

Examining the attributes, the important one is [UIHint], as it provides the scaffolding with the name of the field template to use (the suffixes/prefixes are applied automatically); so by specifying “Image” we are implying the use of Image.ascx or Image_Edit.ascx.  To learn more about how scaffolding works, check out the following MSDN article.

Note that the extended partial class must be in the same namespace as the data model.
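For example, here’s a minimal sketch of the complete metadata file, assuming the generated File entity lives in a namespace called RsPhotography.Data (a placeholder – substitute whatever namespace your model actually uses):

using System.ComponentModel.DataAnnotations;

// The namespace below is a placeholder; it must match the namespace of the
// generated File entity, otherwise the two halves of the partial class never merge.
namespace RsPhotography.Data
{
    [MetadataType(typeof(File_MD))]
    public partial class File { }

    public partial class File_MD
    {
        [ScaffoldColumn(true)]
        [UIHint("Image")]        // maps to the Image.ascx / Image_Edit.ascx field templates
        [ImageFormat(100, 100)]  // supplied by the NuGet package (may need its own using directive)
        public byte[] ImageData { get; set; }
    }
}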

Once we have these attributes in place and reload the site, any row that has data in the image column should render something more favourable: the image itself, displayed inline.


Presto!  Nothing more to do.  We can also now use a file picker to upload an image for new and existing records.


..and the uploaded image data is there in the SQL table.


If you want to make your column naming a bit more pleasing, add the following attribute to the property definition (in your metadata extension):

[Display(Name="Image")] 
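If you prefer to see it in context, the Display attribute simply stacks with the others on the metadata property, e.g.:

[ScaffoldColumn(true)]
[UIHint("Image")]
[ImageFormat(100, 100)]
[Display(Name = "Image")]  // column header now reads "Image" instead of "ImageData"
public byte[] ImageData { get; set; }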


Summary

That’s it.  It comes down to getting the entity property attributes set properly, and the rest just works.  The NuGet packages essentially add image handling support to Dynamic Data sites out of the box; the last part – interpreting your data model – is something a package can’t do for you.

I’ve tested this with IIS Express, Visual Studio 2013, SQL Server 2012 and Entity Framework v5 (the last version compatible with Dynamic Data sites).  Note that Entity Framework v6 will not work with Dynamic Data sites due to its namespace refactoring.

Enjoy.

Useful Links

http://www.olegsych.com/2010/09/understanding-aspnet-dynamic-data-entity-templates/

Jul 25 2012
 

Hi All.  This is a quick post to introduce you to SQL Server Data Tools – support and tools for database developers.


Introduction

Recently, I started a new solution in Visual Studio 2010.  There is a need to build and maintain a database schema (for SQL Server 2008 R2), so I decided to add what was formerly known as “DataDude” – the Database Project for Visual Studio.

This was in a copy of Visual Studio Professional 2010 with Service Pack 1 – and the out-of-the-box solution only supported SQL Server 2008 and prior.  A bit surprised, I did some digging.  You’ll recall there was a ‘Database Edition GDR’ which came out a few years back..  well there’s now an even better flavour of support.

Introducing SQL Server Data Tools

It’s called ‘Microsoft SQL Server Data Tools’ and you can get a copy from the following link on MSDN.  There are quite a number of new bits and pieces included, and it works with both Visual Studio 2010 (alert: apply Service Pack 1 beforehand) and Visual Studio 2012 (RC – although with some known issues if upgrading from the beta).

Although the installation takes a little while (depending on your connection speed), the wait is worthwhile.


I’ll borrow some text from the MSDN site in order to explain the purpose of SSDT:

Who is SSDT for, and what does it provide them?

SSDT is for SQL Server database developers, who often develop database schemas, views, stored procedures, and other database objects while developing their application logic.

  • Tooling for both SQL Server and SQL Azure Development: SSDT offers new capabilities in a single cohesive environment to compile, refactor, and deploy databases to specific editions of SQL Server and SQL Azure. The toolset makes it easy, for example, to migrate on-premise SQL Server schemas to the cloud on SQL Azure, and develop and maintain databases across both on premise and cloud deployments. SSDT can target SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, SQL Server 2012, and SQL Azure databases, including all editions of these database servers.
  • For SQL Server DBAs: SSDT provides a central and unified toolset targeted to the specific needs of DBAs to develop and maintain databases, with visual tools for developing tables, schema compare, and rich T-SQL support for refactoring databases, building views, stored procedures, functions and triggers. The toolset provides both a live development mode, and an offline project mode that tracks and manages all artifacts associated with a database. This mode optionally fully integrates with Visual Studio 2010 for team development, source control and change tracking. All change operations are automatically transformed into optimized T-SQL alter scripts, and can optionally be applied immediately to the online database or saved for later execution.

[http://msdn.microsoft.com/en-us/data/hh322942]

It’s actually very easy to use. 

Real World Applications

For example, if you open a solution containing legacy database projects the tools will automatically prompt you as to whether you wish to upgrade your existing database projects to the newer edition.

The basic benefit is targeting SQL Server 2008 (and R2) as well as SQL Server 2012 and SQL Azure.  That last part might get your attention! That’s right – SQL Azure.  We’ll be checking this out soon and reporting back in a bit more detail. 

How Do I get SSDT?

You can download the “pre-installer” here.

For more information, check out the ‘Getting Started with SQL Data Tools’ located here.. or stay tuned at this location for more!

But wait.. there’s a bug?

Today I got into full swing with the SSDT and a real world project.  During my work I came across a fairly horrible bug which has been documented on the MSDN Forums.

Basically, if you initialize (i.e. use) any of the SSDT tools you can’t use the Entity Framework tools, and vice-versa.  The interim workaround is to load two instances of Visual Studio (perhaps even with the same solution open in both) and use the SSDT tools in the first instance, and the Entity Framework tools in the other.

A symptom which might lead you to this article?  When you have the SQL Server Data Tools installed, and try to update or create an Entity Framework data model – you may receive errors such as ‘Object not set’ etc.  In some cases it might crash the VS IDE.  In my experience, I received the ‘object not set’ errors, and the Model Explorer was greyed/did not render.  I was also not able to refresh my EDMX properly.

Whilst being a right royal pain in the butt, believe it or not this approach does work – even if it does make for a very disjointed development experience.  According to the thread, a fix is in the works – but no word on when it will be released.

Aug 23 2011
 

Continuing on from Part 1, our objective in this article is to create a new Visual Studio solution, and then populate it with some existing projects.  To facilitate this, it would be beneficial if you had a few projects already created.  For this example, I have defined the following structure:

C:\UnitTesting – Root Folder

C:\UnitTesting\TestSolution – An existing folder which contains subdirectories which hold project files.

The sample structure consists of three projects in subdirectories under TestSolution, two of which reference each other.


Our goal is to create a new solution file in the root directory which is then populated with references to the existing project files.

Please bear in mind that this is all demo code, and lacks the normal hardening and checks which production code should have.  Please use wisely, as there are no refunds.. ha!

I’ve reworked the previous example to be a little more robust, as follows:

  1. Encapsulated the functionality to create a new blank solution file in a method called CreateSolution().
  2. Added a new function called AddProjects, which takes a path plus details about the target solution file.

I’ve moved the DTE2 object to be a static object, and refactored CreateSolution thus:

private static DTE2 _dte2 = (DTE2)Microsoft.VisualBasic.Interaction.CreateObject("VisualStudio.DTE.10.0", "");

public static void CreateSolution(string path, string solutionName)
{
    Solution4 solutionObject = (Solution4)_dte2.Solution;
    solutionObject.Create(path, solutionName);
    solutionObject.Close(true);
}

Unit testing for this was easy enough:

[TestMethod]
public void CreateEmptySolution()
{
    SolutionHelper.CreateSolution(@"C:\UnitTesting\", "UnitTestSolution");
}

[TestMethod]
public void AddProjects()
{
    File.Delete(@"C:\UnitTesting\UnitTestSolution.sln");
    SolutionHelper.AddProjects(@"C:\UnitTesting\UnitTestSolution.sln", @"C:\UnitTesting\TestSolution", "*.*proj");
}

As you can see, all pretty straightforward.  So reviewing our CreateSolution function, it now just takes a path and solution name (omit the “.sln” extension) and creates the empty solution.

The next step is to iterate and find projects, and finally – add them to a solution file.  I’ve added some logic which creates the target Solution File if it does not exist.

The AddProjects function is pretty easy to understand:

public static void AddProjects(string solutionFile, string projectPath, string projectWildCardMatch)
{
    if (!File.Exists(solutionFile))
    {
        CreateSolution(Path.GetDirectoryName(solutionFile), Path.GetFileName(solutionFile));
    }

    Solution4 solutionObject = (Solution4)_dte2.Solution;
    solutionObject.Open(solutionFile);

    var fileNames = Directory.GetFiles(projectPath, projectWildCardMatch, SearchOption.AllDirectories);

    foreach (string projectFile in fileNames)
    { 
        solutionObject.AddFromFile(projectFile, false);
    }

    solutionObject.Close(true);
}

Hopefully you can see how easy it is to add projects to the solution.  The boolean value in the call to Solution.AddFromFile is important, as passing false elects to use the currently loaded solution; otherwise, in my experience, the project ends up attached to its original solution.  Make sure you save the solution when closing it – hence the Close(true) calls.
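To recap usage (with the same example paths as the unit tests above), a single call builds the solution from scratch and pulls in every matching project:

// Creates C:\UnitTesting\UnitTestSolution.sln if it doesn't already exist,
// then adds every project file matching *.*proj found under TestSolution.
SolutionHelper.AddProjects(
    @"C:\UnitTesting\UnitTestSolution.sln",
    @"C:\UnitTesting\TestSolution",
    "*.*proj");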

If we open the generated solution file in Visual Studio 2010, you’ll see that not only are the projects added, but their references are resolved as well.


Here is a complete copy of the source file:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using EnvDTE80;
using EnvDTE100;
using System.IO;
using EnvDTE;
using System.Diagnostics;

namespace VSAutomation.Toolkit
{
    public static class SolutionHelper
    {
        private static DTE2 _dte2 = (DTE2)Microsoft.VisualBasic.Interaction.CreateObject("VisualStudio.DTE.10.0", "");

        public static void CreateSolution(string path, string solutionName)
        {
            Solution4 solutionObject = (Solution4)_dte2.Solution;
            solutionObject.Create(path, solutionName);
            solutionObject.Close(true);
        }

        public static void AddProjects(string solutionFile, string projectPath, string projectWildCardMatch)
        {
            if (!File.Exists(solutionFile))
            {
                CreateSolution(Path.GetDirectoryName(solutionFile), Path.GetFileName(solutionFile));
            }

            Solution4 solutionObject = (Solution4)_dte2.Solution;
            solutionObject.Open(solutionFile);
            
            var fileNames = Directory.GetFiles(projectPath, projectWildCardMatch, SearchOption.AllDirectories);

            foreach (string projectFile in fileNames)
            { 
                solutionObject.AddFromFile(projectFile, false);
            }

            solutionObject.Close(true);
        }
    }
}