
Preparing for ASP.NET vNext and Visual Studio 2015

Happy Thanksgiving to folks in the USA.

I’ve finally taken the plunge and decided to get stuck into the recently released Release Candidate (RC) of ASP.NET 5.  Prior to today, I’d stayed with the RTM version of Visual Studio 2015, which insulated me from some of the changes on the horizon.

A few months ago, I’d managed to put together a working (live) solution using VS 2015 and the new Web Projects, and you can see it here at http://windowsbinary.com

Whilst this was handy experience, it barely prepared me for the massive changes to the development environment which ASP.NET 5 RC requires.  This article contains my experiences in getting a Web API project compiled and running against the ASP.NET 5 RC packages.

Git Support

Whether you use GitHub, Team Foundation Server source control or no source control at all, you’ll want Git support in your dev environment anyway.  A lot of PowerShell scripts and commands pull and clone from Git repositories, and command-line integration is, IMHO, essential.  If you haven’t installed Git support with Visual Studio 2015, now’s the time to do so.

Install Git/GitHub support when you install Visual Studio 2015 (or modify your install)



You can also download Git tools for Windows from http://git-scm.com/download/win and support for Git in PowerShell here: https://github.com/dahlbyk/posh-git

Speaking of PowerShell….

Preparing PowerShell

Enable PowerShell script execution.  You’ll probably be working with PowerShell more than you have in the past, even if you aren’t writing scripts yourself.  You’ll certainly be using PowerShell commands, at a minimum inside the Package Manager Console in VS 2015.

Open a PowerShell console as Administrator, then: Set-ExecutionPolicy Unrestricted
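For reference, here’s a minimal sketch of the sequence from an elevated PowerShell prompt (RemoteSigned is a less permissive alternative if Unrestricted is broader than you need):

# Allow script execution (run as Administrator)
Set-ExecutionPolicy Unrestricted

# Confirm the effective policy at each scope
Get-ExecutionPolicy -List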


If you get the following error when loading the Package Manager Console inside Visual Studio 2015:

“Windows PowerShell updated your execution policy successfully, but the setting is overridden by a policy defined at a more specific scope. Due to the override, your shell will retain its current effective execution policy of Unrestricted. Type “Get-ExecutionPolicy -List” to view your execution policy settings. For more information please see “Get-Help Set-ExecutionPolicy”.”

Here’s my PowerShell Execution Policy on a Workgroup-based computer:


..and on a Domain-joined machine with a Group Policy Object applied:


It’s likely caused by a Group Policy Object (GPO) applying a domain-level policy for PowerShell script execution.  Even if you modify and update group policy, this error condition may persist.  Based on an article here: https://powershellpanda.wordpress.com/2013/12/01/override-gpo-for-powershell-execution-policy/ – a registry hack will get you past this annoying issue:

Windows Registry Editor Version 5.00
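; NOTE: the original listing did not survive the move to this page, so the values below are a
; sketch based on the well-known registry key that the "Turn on Script Execution" GPO writes to.
; Review the linked article and your own policy settings before applying this.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\PowerShell]
"EnableScripts"=dword:00000001
"ExecutionPolicy"="Unrestricted"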



Working with DNX, DNU and DNVM

To manage different versions of the .NET runtime environments, you’ll need to get familiar with dnx (the Microsoft .NET Execution Environment), dnu (the .NET Development Utility) and dnvm (the .NET Version Manager).  Screenshots below.  You should be able to execute them from the Developer Command Prompt for VS2015:
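In case the screenshots don’t render, a typical session with the RC tooling looks something like this (a sketch only; versions and output will differ on your machine):

rem list the runtimes dnvm knows about, and which one is active
dnvm list

rem download the latest runtime and make it the default
dnvm upgrade

rem restore NuGet packages for the project in the current folder
dnu restore

rem run the "web" command defined in the project's project.json
dnx web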








If you get the message:

“’dnx’ is not recognized as an internal or external command, operable program or batch file.” (see this Stack Overflow question: http://stackoverflow.com/questions/30440974/dnx-command-not-found-in-developer-command-prompt-for-vs2015)

You can fix this issue by running the following command: dnvm use default -p

which will persist the change to the PATH environment variable for the current user.
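A quick sanity check from the same prompt (standard Windows commands) looks something like this:

rem show which dnx.exe is being picked up from the PATH
where dnx

rem print the active runtime version
dnx --version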


On another machine, I was warned about a deprecated environment variable:


Which might raise the question….

What are KRE, KVM, KPM?


In short, KRE, KVM and KPM were the management bits for ASP.NET 5; the “K” bits have since been renamed to DNX, DNVM and DNU respectively.  I’m including this info in case a search for the old names leads you to this article.

From the link above:

K has three components:

  1. KRE – K Runtime Environment is the code required to bootstrap and run an ASP.NET vNext application. This includes things like the compilation system, SDK tools, and the native CLR hosts.
  2. KVM – K Version Manager is for updating and installing different versions of KRE. KVM is also used to set default KRE version.
  3. KPM – K Package Manager manages packages needed by applications to run. Packages in this context are NuGet packages.

Microsoft ASP.NET and Web Tools 2015 (RC) – Visual Studio 2015

Lastly, before you get too excited, there are a couple of hundred megabytes of updates you’ll need for the supporting tooling for the RC (the RTM tooling differs too much; some important things have been renamed since then).

The latest version, naturally, requires updated tooling.  If you only have Visual Studio 2015 RTM, then prepare for some fun.  You can download the RC bits here: https://www.microsoft.com/en-us/download/details.aspx?id=49959

Which leads me to installing all of the following on my Development machine:



The net result is that when I now open Visual Studio 2015 and create a new project, I can select .NET Framework 4.6, and when I create a new ASP.NET Web project the options include:




Here are some infuriating error messages you might stumble across while trying to compile a simple Web API…

“DNX 4.5.1 error NU1002: The dependency <Assembly> in project <Project> does not support framework DNX,Version=v4.5.1.” e.g.

“DNX 4.5.1 error NU1002: The dependency System.Runtime 4.0.0 in project Asp5Api does not support framework DNX,Version=v4.5.1.”
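One common cause is the frameworks section of project.json: a CoreCLR-only package (such as System.Runtime) listed as a top-level dependency while the project also targets the full framework (dnx451).  Below is a trimmed sketch of the shape that avoids the error; the package names and versions are illustrative, so check them against what the RC templates actually generate.

{
  "dependencies": {
    "Microsoft.AspNet.Mvc": "6.0.0-rc1-final"
  },
  "frameworks": {
    "dnx451": { },
    "dnxcore50": {
      "dependencies": {
        "System.Runtime": "4.0.21-*"
      }
    }
  }
}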


System.IO.FileNotFoundException: Could not load file or assembly ‘Microsoft.DNX.PackageManager’ or one of its dependencies. The system cannot find the file specified.

This means you probably haven’t installed the latest Web Tools.  The package manager assembly has apparently been renamed, and the change is reflected in the later (post-RTM) tooling.

Using the Dynamic Data Database Embedded Image Field Template


Hi all.  I’ve been working with the old ASP.NET Dynamic Data website templates, as I wanted a quick and easy-to-use web UI for managing the data at http://rs-photography.org.

As the site is (obviously) image-centric, I wanted to make some modifications to the out-of-the-box Dynamic Data project to include native image rendering/loading and updating.

I thought this was going to be relatively straightforward, given that there is a NuGet package encouragingly called “Dynamic Data Database Embedded Image Field Template”, as well as a package called “Microsoft.Web.DynamicData.Handlers” which promises an “Asp.Net image handler for rendering images from database”.




The good news is that the implementation is relatively straightforward, but it is sadly lacking clear instructions for what to do once the packages are added.  This post will clarify exactly what you must do to get images displaying in your Dynamic Data site.

Note: the only references I could find online were quite old (i.e. 2007-2008), and pre-dated the NuGet packages, hence I felt the need to write this article.

The Scenario – NuGet Packages installed

I’m going to assume that if you are reading this article, you’re familiar with Dynamic Data websites.  If you aren’t, I’d suggest you get up to speed by reading a couple of tutorials (e.g. this one) – it’s a cinch to get a site built, and takes only a couple of minutes.

Once you have your site loading correctly, you’ll need to ensure you have a column in a table which is of SQL type “image”.  The Entity Framework models this (correctly) as a property of type byte array (byte[]).  By default, Dynamic Data templates don’t have a field template for the byte array type, so the column is ignored when scaffolding occurs.

Assuming you have the NuGet packages installed, you’ll automatically have two new field templates added to the DynamicData folder, like so (left).

Your web.config file will also be updated to include a http handler for image requests, like so:

<system.webServer>
  <handlers>
    <add name="ImageHandler" path="ImageHandler.ashx" verb="*"
         type="Microsoft.Web.DynamicData.Handlers.ImageHandler" />
  </handlers>
</system.webServer>

The handler is implemented in a binary which is included in the project – Microsoft.Web.DynamicData.Handlers.


The Data Schema

Now, if you run up the site and drill into the table you’ve defined your image field in, you’ll probably notice the image column is missing.  As stated above, because of the data type, there’s no field template for it by default.  This is where you need to do some work to help the scaffolding and field templates map properly.

Here’s my table schema and entity definitions so you can see exactly what I’ve done:

The “Files” Table:

CREATE TABLE [dbo].[Files] (
    [FileId]       INT            IDENTITY (1, 1) NOT NULL,
    [Filename]     NVARCHAR (100) NULL,
    [CatalogId]    INT            NULL,
    [SizeId]       INT            NULL,
    [IsDefault]    BIT            DEFAULT ((0)) NOT NULL,
    [Height]       INT            NULL,
    [Width]        INT            NULL,
    [ImageData]    IMAGE          NULL,
    [HasImageData] BIT            CONSTRAINT [DF_Files_HasImageData] DEFAULT ((0)) NOT NULL,
    CONSTRAINT [FK_Files_Catalog] FOREIGN KEY ([CatalogId]) REFERENCES [dbo].[Catalog] ([CatalogId]),
    CONSTRAINT [FK_Files_Sizes] FOREIGN KEY ([SizeId]) REFERENCES [dbo].[Sizes] ([SizeId])
);

The File class (entity):

public partial class File
{
    public int FileId { get; set; }
    public string Filename { get; set; }
    public Nullable<int> CatalogId { get; set; }
    public Nullable<int> SizeId { get; set; }
    public bool IsDefault { get; set; }
    public Nullable<int> Height { get; set; }
    public Nullable<int> Width { get; set; }
    public byte[] ImageData { get; set; }
    public bool HasImageData { get; set; }
    public virtual Catalog Catalog { get; set; }
    public virtual Size Size { get; set; }
}

Adding Attributes

You’ll need to extend the partial class(es) in your Entity Framework model.  The column definitions must stay the same, so you extend the class purely for metadata purposes (i.e. attributes).  If you add the attributes directly to the generated classes, they’ll be overwritten when you refresh the data model.

To add the appropriate attributes, I did the following:

using System.ComponentModel.DataAnnotations;


[MetadataType(typeof(File_MD))]
public partial class File { }

public partial class File_MD
{
    [ScaffoldColumn(true)]
    [UIHint("Image")]
    [ImageFormat(100, 100)]
    public byte[] ImageData { get; set; }
}

Examining the attributes – the important one is [UIHint], as this gives the scaffolding the name of the field template to use (suffixes/prefixes are applied automatically), so by specifying “Image” we are implying the use of Image.ascx or Image_Edit.ascx.  To learn more about how scaffolding works, check out the following MSDN article.

Note that the extended partial class must be in the same namespace as the data model.

Once we have these attributes in place and reload the site, given a row that has data in the image column, we should see something more favourable:


Presto!  Nothing more to do.  We can also now use a file picker to upload an image for new and existing records:


..and here it is in the SQL table:


If you want to make your column naming a bit more pleasing, add the following attribute to the column definition (in your metadata extension):
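The original screenshot of the attribute hasn’t survived, but the standard data-annotations approach looks like the sketch below (the display text “Image” is just an example):

using System.ComponentModel.DataAnnotations;

public partial class File_MD
{
    [ScaffoldColumn(true)]
    [UIHint("Image")]
    [ImageFormat(100, 100)]
    [Display(Name = "Image")]   // friendlier column header than "ImageData"
    public byte[] ImageData { get; set; }
}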




That’s it.  It comes down to getting the entity property attributes set properly, and the rest is easy.  The packages essentially add image handling support out of the box for Dynamic Data sites; the last part, interpreting your data model, is something a NuGet package can’t do for you.

I’ve tested this with IIS Express, Visual Studio 2013, SQL Server 2012 and Entity Framework v5 (the last version compatible with Dynamic Data sites).  Note that Entity Framework v6 will not work with Dynamic Data sites due to the namespace refactoring.




SQL Server Data Tools for Visual Studio

Hi All.  This is a quick post to introduce you to SQL Server Data Tools – support and tools for database developers.



Recently, I started a new solution in Visual Studio 2010.  There was a need to build and maintain a database schema (for SQL Server 2008 R2), so I decided to add what was formerly known as “DataDude” – the Database Project for Visual Studio.

This was in a copy of Visual Studio Professional 2010 with Service Pack 1 – and the out-of-the-box solution only supported SQL Server 2008 and prior.  A bit surprised, I did some digging.  You’ll recall there was a ‘Database Edition GDR’ which came out a few years back… well, there’s now an even better flavour of support.

Introducing SQL Server Data Tools

It’s called ‘Microsoft SQL Server Data Tools’ and you can get a copy from the following link on MSDN.  There are quite a number of new bits and pieces included, and it works with both Visual Studio 2010 (alert: apply Service Pack 1 beforehand) and Visual Studio 2012 (RC – although with some known issues if upgrading from the beta).

Although the installation takes a little while (depending on your connection speed), the wait is worthwhile.


I’ll borrow some text from the MSDN site in order to explain the purpose of SSDT:

Who is SSDT for, and what does it provide them?

SSDT is for SQL Server database developers, who often develop database schemas, views, stored procedures, and other database objects while developing their application logic.

  • Tooling for both SQL Server and SQL Azure Development: SSDT offers new capabilities in a single cohesive environment to compile, refactor, and deploy databases to specific editions of SQL Server and SQL Azure. The toolset makes it easy, for example, to migrate on-premise SQL Server schemas to the cloud on SQL Azure, and develop and maintain databases across both on premise and cloud deployments. SSDT can target SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, SQL Server 2012, and SQL Azure databases, including all editions of these database servers.
  • For SQL Server DBAs: SSDT provides a central and unified toolset targeted to the specific needs of DBAs to develop and maintain databases, with visual tools for developing tables, schema compare, and rich T-SQL support for refactoring databases, building views, stored procedures, functions and triggers. The toolset provides both a live development mode, and an offline project mode that tracks and manages all artifacts associated with a database. This mode optionally fully integrates with Visual Studio 2010 for team development, source control and change tracking. All change operations are automatically transformed into optimized T-SQL alter scripts, and can optionally be applied immediately to the online database or saved for later execution.


It’s actually very easy to use. 

Real World Applications

For example, if you open a solution containing legacy database projects, the tools will automatically prompt you to upgrade your existing database projects to the newer project format.

The basic benefit is targeting SQL Server 2008 (and R2) as well as SQL Server 2012 and SQL Azure.  That last part might get your attention! That’s right – SQL Azure.  We’ll be checking this out soon and reporting back in a bit more detail. 

How Do I get SSDT?

You can download the “pre-installer” here.

For more information, check out ‘Getting Started with SQL Data Tools’ located here… or stay tuned at this location for more!

But wait… there’s a bug?

Today I got into full swing with the SSDT and a real world project.  During my work I came across a fairly horrible bug which has been documented on the MSDN Forums.

Basically, if you initialize (i.e. use) any of the SSDT tools, you can’t use the Entity Framework tools, and vice-versa.  The interim workaround is to load two instances of Visual Studio (perhaps even of the same solution) and use the SSDT tools in one instance and the Entity Framework tools in the other.

A symptom which might lead you to this article: with SQL Server Data Tools installed, when you try to update or create an Entity Framework data model you may receive errors such as ‘Object not set’, and in some cases the VS IDE might crash.  In my experience, I received the ‘object not set’ errors, the Model Explorer was greyed out and did not render, and I was not able to refresh my EDMX properly.

Whilst being a right royal pain in the butt, believe it or not this approach does work – even if it does make for a very disjointed development experience.  According to the thread, a fix is in the works – but no word on when it will be released.