Jan 13, 2011
 

Hi there and happy new year.  2011 promises to be quite an interesting year, and I hope that I can continue to contribute here at Sanders Technology.  To kick off the new year, I decided to revisit the Windows Workflow article I started late last year.

A little while ago I wrote a post entitled “A quick and dirty Rules Engine using Windows Workflow (Part 1)”, which has evidently been fairly popular.  Unfortunately, it seems that I forgot to follow it up with a part 2!  So, to welcome in the new year, I’m putting together the second part.

Honestly though, folks, this could easily be a multi-part mini project, because the uses of this Windows Workflow Foundation (WF) rules engine are immense!

I’ve extended the scope of the code shown in part 1 to include some dummy data items, and I’ve crafted some more reusable, general-purpose code (for example purposes).  You really ought to be able to see for yourselves how powerful and multi-purpose this approach is.

I was going to write a quick and dirty WinForms UI, but I ended up ditching it in favour of a bunch of unit tests instead.  You should be able to see the potential here; I don’t want to spoil the magic by adding an inept user interface.

Let’s take a look at the sample solution.  I’ve added some terribly (and perhaps insultingly) simple “objects” which, of course, you would substitute for your own DTOs/Entities/BusinessObjects.  It’s a basic class with some public properties; nothing terribly complex (it’s a demo, after all).  You can see it uses an Enum, just for fun, on one of the properties.  I’ve also included a screenshot of the Solution structure – nothing too scary here.

[Screenshots: the Class View for the “BusinessObjects” project | the Solution structure]
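For reference, here’s roughly what the Employee object looks like.  This is a sketch reconstructed from the properties used in the unit test further down, so the exact property types (and the StateEnum values) are assumptions:

using System;

namespace BusinessObjects
{
    // NOTE: a sketch only - the real BusinessObjects library ships with the sample solution.
    // The StateEnum values are assumed (Australian states/territories).
    public enum StateEnum
    {
        ACT, NSW, NT, QLD, SA, TAS, VIC, WA
    }

    public class Employee
    {
        public string FirstName { get; set; }
        public string Surname { get; set; }
        public StateEnum Location { get; set; }
        public Employee Manager { get; set; }
        public DateTime DateHired { get; set; }
        public int EmployeeNumber { get; set; }
    }
}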

Basically the entire solution consists of two class libraries and a Unit Test project.  I’m trying to keep this very simple.  You could plug a WinForms UI or a website or a WCF Web Service Application underneath this very easily!

The main fun is in the “RuleManager” class, which is basically just a wrapper for the main WF workflow engine parts.  I’ve put in an extremely vanilla implementation which allows a few interesting parts of functionality.  I think if you use your imagination, you’ll be able to come up with some much more interesting ways to play with the options.

So why don’t we have a look at the RuleManager class?  It is defined to take a generic type, so you can work upon different source object types.

For the purpose of this post, we have just the one main object defined, “Employee”.  It also doesn’t do anything to ensure that the rules loaded were actually built for the data type in question – I’ll do an expanded implementation later to show how we can account for this.

My sincere apologies for the crappiness of the format of the posted code!  I’m having a bit of a fight with my copy of Live Writer and the plugins for inserting code snippets are not working very well with the site layout theme.  I’ll try and get it looking right.. soon.

#region Using Directives
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Workflow.Activities.Rules.Design;
using System.Workflow.Activities.Rules;
using System.Windows.Forms;
using System.Workflow.ComponentModel.Serialization;
using System.Xml;
using System.IO;
using BusinessObjects;
using System.Collections.ObjectModel;
#endregion

namespace WorkFlowProvider
{
    /// <summary>
    /// Implements a wrapper around the Windows Workflow Foundation Rules Engine
    /// </summary>
    /// <typeparam name="T">A Data Object type to process</typeparam>
    public static class RulesManager<T> where T : new()
    {
        #region Rules Editor Support

        /// <summary>
        /// Launch the Rules Form to create a new rule
        /// </summary>
        public static RuleSet LaunchNewRulesDialog(string ruleName, string outputPath)
        {
            return LaunchRulesDialog(null, ruleName, outputPath);
        }

        /// <summary>
        /// Launch the Rules Editor with an existing rule (for editing),
        /// or to create a new rule (pass NULL to create a new rule)
        /// </summary>
        /// <param name="ruleSet">An existing rule set to edit, or null to create a new one</param>
        /// <param name="ruleName">The rule name (for the file name)</param>
        /// <param name="outputPath">The path to save rules to</param>
        /// <returns>A rule (if one is saved/edited)</returns>
        public static RuleSet LaunchRulesDialog(RuleSet ruleSet, string ruleName, string outputPath)
        {
            // You could pass in an existing ruleset object for editing if you
            // wanted to; if we're creating a new rule, it's set to null
            RuleSetDialog ruleSetDialog = new RuleSetDialog(typeof(T), null, ruleSet);

            if (ruleSetDialog.ShowDialog() == DialogResult.OK)
            {
                // grab the ruleset
                ruleSet = ruleSetDialog.RuleSet;

                // We're going to serialize it to disk so it can be reloaded later
                WorkflowMarkupSerializer serializer = new WorkflowMarkupSerializer();
                string fileName = String.Format("{0}.rules", ruleName);
                string fullName = Path.Combine(outputPath, fileName);

                if (File.Exists(fullName))
                {
                    File.Delete(fullName); //delete existing rule
                }

                using (XmlWriter rulesWriter = XmlWriter.Create(fullName))
                {
                    serializer.Serialize(rulesWriter, ruleSet);
                    rulesWriter.Close();
                }
            }

            return ruleSet;
        }

        #endregion

        #region Rule Processing

        /// <summary>
        /// Applies a set of rules to a specified data object
        /// </summary>
        public static T ProcessRules(T objectToProcess, ReadOnlyCollection<RuleSet> rules)
        {
            RuleValidation validation = new RuleValidation(typeof(T), null);
            RuleExecution execution = new RuleExecution(validation, objectToProcess);

            foreach (RuleSet rule in rules)
            {
                rule.Execute(execution);
            }

            return objectToProcess;
        }

        /// <summary>
        /// Execute a single rule on a single data object
        /// </summary>
        public static T ProcessRule(T objectToProcess, RuleSet rule)
        {
            RuleValidation validation = new RuleValidation(typeof(T), null);
            RuleExecution execution = new RuleExecution(validation, objectToProcess);

            rule.Execute(execution);

            return objectToProcess;
        }

        #endregion

        #region Rules Management

        /// <summary>
        /// Loads a single rule given a path and file name
        /// </summary>
        public static RuleSet LoadRule(string rulesLocation, string fileName)
        {
            RuleSet ruleSet = null;

            // Deserialize from a .rules file.
            using (XmlTextReader rulesReader = new XmlTextReader(Path.Combine(rulesLocation, fileName)))
            {
                WorkflowMarkupSerializer serializer = new WorkflowMarkupSerializer();
                ruleSet = (RuleSet)serializer.Deserialize(rulesReader);
            }

            return ruleSet;
        }

        /// <summary>
        /// Loads a set of rules from disk
        /// </summary>
        public static ReadOnlyCollection<RuleSet> LoadRules(string rulesLocation)
        {
            RuleSet ruleSet = null;
            List<RuleSet> rules = new List<RuleSet>();

            foreach (string fileName in Directory.GetFiles(rulesLocation, "*.rules"))
            {
                // Deserialize from a .rules file.
                using (XmlTextReader rulesReader = new XmlTextReader(fileName))
                {
                    WorkflowMarkupSerializer serializer = new WorkflowMarkupSerializer();
                    ruleSet = (RuleSet)serializer.Deserialize(rulesReader);
                    rules.Add(ruleSet);
                    rulesReader.Close();
                }
            }

            return rules.AsReadOnly();
        }

        #endregion
    }
}

This one class pretty much gives you all you need to create, load and save rules.  It’s a bit basic at this point in time; I will try to create a more robust and tolerant class in subsequent posts on this topic.  For now, though, I think it adequately demonstrates the sort of functionality which can be gleaned from the Rules Engine.

You can create or edit a rule using the LaunchNewRulesDialog or LaunchRulesDialog methods, with minimal user input.  I’ve written a very basic Unit Test which shows how simple this can be, and I’m sure you’ll be able to have some fun with it.

Next up, there are some functions to load existing rule files from disk: the aptly named LoadRule and LoadRules methods.  They are pretty self-explanatory, so I don’t think we need to go into too much detail about loading rules files.

Finally, there are some functions which can be called to execute rules against data objects.  At this stage I’m supporting the execution of a single rule against a single data object, or a collection of rules against a single data object.  Obviously you could easily expand upon this.  You may wish to consider a multi-threaded approach; I may be persuaded to implement a more robust solution which allows for concurrent multiple-item/multiple-rule processing if you leave a comment for me.
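To make the intended usage concrete, here’s a rough sketch of loading every .rules file from a folder and running them over an employee.  The wrapper class and method names below are hypothetical calling code, not part of the sample solution:

using System.Collections.ObjectModel;
using System.Workflow.Activities.Rules;
using BusinessObjects;
using WorkFlowProvider;

public static class RulesUsageExample
{
    public static Employee ApplyAllRules(Employee employee, string rulesFolder)
    {
        // Load every serialized .rules file found in the folder...
        ReadOnlyCollection<RuleSet> rules = RulesManager<Employee>.LoadRules(rulesFolder);

        // ...and execute each rule set against the employee instance.
        return RulesManager<Employee>.ProcessRules(employee, rules);
    }
}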

To wrap up, here’s the Unit Test which lets you create a new rule and apply it to the test data defined in the test:

[TestMethod]
public void CreateNewRule()
{
    Employee testEmployee = new Employee();    
    testEmployee.FirstName = "Joe";    
    testEmployee.Surname = "Smith";
    testEmployee.Location = StateEnum.ACT;
    testEmployee.Manager = null;
    testEmployee.DateHired = DateTime.Now.AddYears(-1);
    testEmployee.EmployeeNumber = 99;

    string ruleName = String.Format("{0}UnitTestRule", DateTime.Now.Millisecond);
    string path = Assembly.GetExecutingAssembly().Location.Replace(Assembly.GetExecutingAssembly().ManifestModule.Name, String.Empty);

    RuleSet newRule = RulesManager<Employee>.LaunchNewRulesDialog(ruleName, path);
    testEmployee = RulesManager<Employee>.ProcessRule(testEmployee, newRule);

    Trace.WriteLine(testEmployee.FirstName);
    Trace.WriteLine(testEmployee.Surname);
    Trace.WriteLine(testEmployee.DateHired);
}

So, in this post we’ve had a look at a very basic solution structure which demonstrates a reusable rules design.  At the moment it’s a bare-bones demo rather than anything immediately useful, but I’m only getting started.  Once you are familiar with the ‘RulesManager’ wrapper concept, we’ll be ready to expand upon it significantly.

This is part 2 of a multi-part series.  I’ll be expanding upon the concepts shown here in subsequent posts.

Check back soon!

Solution Files


Sep 07, 2010
 

Recently I put my mind towards developing a fairly lightweight rules-based utility for applying various logical patterns on top of some linear business data.  Honestly, my initial thoughts were “oh no, not another boring rules-based implementation”, because we (ought to) know how boring and convoluted those can be, but this time I decided to try something a little different.

Through no fault of my own, I’ve actually had very little to do with Windows Workflow Foundation, or WF for short (not to be confused with the WWF, whether that stands for the World Wildlife Fund or the World Wrestling Federation!).

Most of the projects I’ve worked on previously have used some other workflow product, usually something like K2 blackpearl or a custom implementation.  In any case, I’ve always been curious: from what I’d heard, WF kicks SharePoint’s “workflow” (if you could call it that) in the butt.  I’m all for that!

Now, from my admittedly sparse knowledge of WF, I knew there was some sort of Rules Engine sitting somewhere near some sort of workflow tooling aimed at Business Analysts, and it reminded me of a demo I’d seen a little while back involving a graphical interface for designing business rules.  Don’t fear, this is way cooler than that!

Anyhow, to cut a long and boring intro short (too late?) I got stuck into designing a quick and dirty rules engine for working with Data Transfer Objects or POCO entities.  Actually, you can pretty much use whatever object you like, it’s very unassuming :)

To start out, you need to add a reference to the following .Net assemblies, available in the .Net Framework v3.0 and onwards:

System.Workflow.Activities
System.Workflow.ComponentModel
System.Workflow.Runtime

So just add them to the project you’re working with, and you’re halfway there.  Next, let’s start off by actually creating some rules, or at least a mechanism to do so.  This is almost too easy for words, so let’s take a look at some code instead:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Workflow.Activities.Rules.Design;
using System.Workflow.Activities.Rules;
using System.Windows.Forms;
using System.Workflow.ComponentModel.Serialization;
using System.Xml;
using System.IO;

public static class CreateRulesHelper
{
    public static void LaunchRulesDialog(string outputPath)
    {
        RuleSet ruleSet = null;

        // You could pass in an existing ruleset object for editing if you wanted to, we're creating a new rule, so it's set to null
        RuleSetDialog ruleSetDialog = new RuleSetDialog(typeof(BasicTerm), null, ruleSet);
            
        if (ruleSetDialog.ShowDialog() == DialogResult.OK)
        {
            // grab the ruleset
            ruleSet = ruleSetDialog.RuleSet;
            // We're going to serialize it to disk so it can be reloaded later
            WorkflowMarkupSerializer serializer = new WorkflowMarkupSerializer();

            string fileName = String.Format("BusinessRule{0}.rules", DateTime.Now.Millisecond);
            string fullName = Path.Combine(outputPath, fileName);

            if (File.Exists(fullName))
            {
                File.Delete(fullName); //if it bleeds, we can kill it
            }

            using (XmlWriter rulesWriter = XmlWriter.Create(fullName))
            {
                serializer.Serialize(rulesWriter, ruleSet);
                rulesWriter.Close();
            }
        }
    }
}

See? This is a basic implementation inside a static class and essentially launches the WF rules engine editor.  I’m not doing anything fancy here, just taking whatever the user does and serializing it to disk (to be loaded later).  Some key things to point out here:

  1. I’m always creating a new ruleset – you could rewire this to allow editing of existing rules
  2. You may notice I’m using a random file name.  A more detailed implementation could take some input to form the rule name
  3. I’m using a specific object type (“BasicTerm”) – check back for Part 2 when I’ll rework this code so you can use it with different object definitions
  4. This is just a quick mock up!  Sample purposes only :)

Now assuming you had an object called “BasicTerm” and you compiled and ran this code, you’d get prompted with a window not unlike this one:

[Screenshot: the Rule Set Editor dialog]

The best part?  It has IntelliSense too!  You can create some very interesting IF..THEN..ELSE runs here, and you can create multiple rules per rule set.  The neat part is that you can modify the object directly; although somewhat limited, this could be a very handy asset for something like message-based routing rules (as in BizTalk).
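To give a feel for the editor, each rule is just a condition plus Then/Else actions written against your object.  A purely hypothetical example (it assumes BasicTerm has Amount and RequiresApproval properties, which your object may not):

// As typed into the Rule Set Editor (hypothetical properties):
// Condition:    this.Amount > 1000
// Then Actions: this.RequiresApproval = True
// Else Actions: this.RequiresApproval = False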

I won’t go into too much detail at this juncture, other than encouraging you to have a play with this relatively simple interface.  Once you’ve set up some rules (or even if you set none), provided you click OK you’ll get a .rules file written out to disk.  Check back for Part 2, where I’ll demonstrate how we can use these rules in a quick and dirty workflow.

Continued in Part 2

Further Reading

Introduction to the Windows Workflow Foundation Rules Engine

How to use Windows Workflow Rules

Aug 20, 2010
 

Introduction

Now, if you are like me, you’ve probably had some interest in POCO (plain old CLR) objects for quite some time.  They are an invaluable tool in distributed systems and service-oriented architectures, but until now the Microsoft data access stack has made them difficult to use in those designs.

In a nutshell, neither LINQ to SQL nor Entity Framework (v1) entity classes supported serialization for the purpose of stateless transport (such as web service communication).  This stems from the embedded context-tracking attributes, and from a design which makes for a fairly poor experience for anyone daring enough to detach entities and “pass them around”.

Enter the ADO.NET Entity Framework v2… ahem, version 4, which shipped in the early part of this year.  Whilst EF v4 doesn’t generate POCO objects out of the box (you have to use an online template), it’s easy enough to accomplish with minimal effort.  Plus, they can be used (almost) as seamlessly as non-POCO objects.

Before we get into the nitty gritty of this particularly long post, I will direct your attention to the following MSDN article which covers most of the steps for harmonious life with POCO objects and WCF services.  What the article does not cover is handling somewhat more complex object graphs.  In other words, the MSDN scenario is fine with fairly basic (and bland) objects, but it’s pretty nasty when you have objects containing, well, joins (collections, relationships, yada yada).

Now, what follows is based on a number of other articles floating around the Internet.  I’m not trying to take credit for the majority of it; I’m just collating the information into one handy-to-reach place.  I’m also going to supply sample code in case you have any trouble getting it all configured.  The parts which are my implementation alone, I’ll highlight.

The Data Model

First, let us take a quick look at the sample data model.  Nothing fancy, I’ll admit, but enough for our purposes:

[Diagram: the sample database schema]

We will use this schema with a WCF service or two.  You can use the attached T-SQL script to create and populate a SQL Server database (and later generate your EDMX model from that schema).  Next, create a solution containing WCF services, and add an ADO.NET Entity Framework (v4) model.  As you can see from my sample below, the model is admittedly not very complex.  Notice the “self join” on the Category table; this is not an uncommon way of designing parent/child relationships at the DB level.  It also has the (awesome) advantage of generating Parent/Child navigation properties (you may need to do some renaming if you generated the model from my sample schema).

The Object Model

[Diagram: the generated entity model]
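Once the POCO template from the next section has been applied, the generated Category and Product entities end up looking roughly like the sketch below.  The property names are assumptions based on the schema described above, and the real template also emits association fix-up logic which I’ve omitted:

using System.Collections.Generic;

// A simplified sketch of the generated POCO entities (assumed property names).
public partial class Category
{
    public int CategoryId { get; set; }
    public string Name { get; set; }
    public int? ParentCategoryId { get; set; }

    // Navigation properties produced by the Category self join
    public virtual Category Parent { get; set; }
    public virtual ICollection<Category> Children { get; set; }

    public virtual ICollection<Product> Products { get; set; }
}

public partial class Product
{
    public int ProductId { get; set; }
    public string Name { get; set; }
    public int CategoryId { get; set; }

    public virtual Category Category { get; set; }
}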

Solutions and Settings

Once you have generated the model, right-click anywhere on the blank model surface and select “Add Code Generation Item”.  This prompts you with a bulky dialog window; select “Online Templates” from the tree view on the left-hand side.

[Screenshots: the Add New Item dialog with Online Templates selected]

Select ADO.NET C# POCO Entity Generator and click OK a few times as needed.  The template builds up the POCO entities and removes the EDMX/Designer based implementation which the EF designer would have originally generated.  This leaves you with a number of new files in your solution, which should look a lot like the following:

[Screenshot: the solution after the POCO template has generated its files]

Web Services

Now that I’ve got your attention, let’s have a think about how we’re going to expose these via WCF.  I’ve created two WCF services, SystemLogService.svc and ProductService.svc.
The interface definition of each is shown below:

[Screenshots: the SystemLogService and ProductService contract definitions]
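For readers without the screenshots, one of the contracts looks roughly like the following sketch.  Only GetLogEntryByCategoryId appears verbatim later in this post; the other operation name, and the exact placement of the attributes, are assumptions:

using System.ServiceModel;

[ServiceContract]
public interface ISystemLogService
{
    // Maps EF proxy types back to POCO types during serialization (explained below)
    [ApplyDataContractResolver]
    [OperationContract]
    SystemLog GetLogEntryById(int logId);   // hypothetical single-entity operation

    // Collection-returning operations also need cyclic reference support (explained later)
    [ApplyDataContractResolver]
    [CyclicReferencesAware(true)]
    [OperationContract]
    SystemLog[] GetLogEntryByCategoryId(int categoryId);
}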

Don’t worry about those attributes just yet!  I’ll explain a little about why they are necessary shortly.  If you have reviewed the original MSDN article you’ll recall:

“The POCO proxy type cannot be directly serialized or deserialized by the Windows Communication Foundation (WCF), because the DataContractSerializer serialization engine can only serialize and deserialize known types. The proxy type is not a known type. For more information, see the Serializing POCO Proxies section in the Working with POCO Entities topic. To serialize POCO proxies as POCO entities, use the ProxyDataContractResolver class to map proxy types to POCO types during serialization.”

In other words, the dynamic proxy classes generated at runtime by LINQ/EF are incompatible with WCF, because the DataContractSerializer can only work with types it knows about at compile time.

The Solution

As such, you need to both disable the use of Proxies, and also label your web service methods with the [ApplyDataContractResolver] attribute as seen above.  You can obtain the details about this attribute from the MSDN article or from my sample solution.  You only need to use it on the service side.  This is as simple as creating a new class and pasting the implementation from either source.  Then add the attribute to decorate your web service definition (on the interfaces).

[Screenshot: the service contract decorated with the ApplyDataContractResolver attribute]
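If you’d rather not dig through the MSDN article, the attribute is essentially an IOperationBehavior which swaps the ProxyDataContractResolver into the serializer, along these lines (a sketch adapted from the MSDN walkthrough; double-check it against the article or my sample solution):

using System;
using System.Data.Objects;                 // ProxyDataContractResolver (EF4)
using System.ServiceModel.Channels;
using System.ServiceModel.Description;
using System.ServiceModel.Dispatcher;

public class ApplyDataContractResolverAttribute : Attribute, IOperationBehavior
{
    public void AddBindingParameters(OperationDescription description,
                                     BindingParameterCollection parameters)
    {
    }

    public void ApplyClientBehavior(OperationDescription description, ClientOperation proxy)
    {
        // Swap the default resolver for the EF proxy-aware one on the client side
        var behavior = description.Behaviors.Find<DataContractSerializerOperationBehavior>();
        behavior.DataContractResolver = new ProxyDataContractResolver();
    }

    public void ApplyDispatchBehavior(OperationDescription description, DispatchOperation dispatch)
    {
        // ...and on the service (dispatch) side
        var behavior = description.Behaviors.Find<DataContractSerializerOperationBehavior>();
        behavior.DataContractResolver = new ProxyDataContractResolver();
    }

    public void Validate(OperationDescription description)
    {
    }
}

Disabling proxy creation itself is a one-liner on the ObjectContext in your service implementation: context.ContextOptions.ProxyCreationEnabled = false;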

Now, for the part not previously covered – we generally encounter a problem with passing entities which are a little more complicated than the example POCO objects encountered in the MSDN article.  Take our sample application.  The System Log entities define a basic relationship, and the products include a (fairly standard) self join, allowing product categories to have a hierarchy.

If we then create a standard console application, and add a web service reference, we can observe the class definition from the generated WSDL (below). 

[Screenshot: the class definitions generated from the service WSDL]

If you’re unsure about how to view the WSDL code within Visual Studio, simply follow these steps:

  1. Right Click on the Service Reference
  2. Select “View in Object Browser”
  3. From here, expand the namespace of the reference, then right click on one of the interfaces
  4. Select “Go To Definition”

[Screenshots: locating the generated service reference code via the Object Browser]

Now assuming you have done everything correctly, you should be able to consume the web services and the POCO objects in your console application:

[Screenshot: the console client code]
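The client code boils down to a couple of proxy calls, something like the sketch below.  The proxy class name follows the usual “service name plus Client” convention, and the single-entity operation is hypothetical; only the GetLogEntryByCategoryId call appears verbatim in this post:

using (var logClient = new SystemLogServiceClient())
{
    // The first call returns a single entity and works without any extra effort.
    SystemLog log = logClient.GetLogEntryById(1);             // hypothetical operation

    // The second call returns a collection, and is the one that fails below.
    SystemLog[] logs = logClient.GetLogEntryByCategoryId(1);
}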

 

Execution

If we execute the code, the first web service call returns fine, with no errors.  The second call, however, which returns a collection, is not as fortunate.
When we step over the following line of code, we receive an exception with the following message:

SystemLog[] logs = logClient.GetLogEntryByCategoryId(1);

“The underlying connection was closed: The connection was closed unexpectedly.”

 

[Screenshot: the client-side exception]

Looking deeper into the service side of affairs (debugging), we may discover that the exception being thrown is, in fact, the following:

There was an error while trying to serialize parameter http://tempuri.org/:GetLogEntryByCategoryIdResult. The InnerException message was ‘Object graph for type ‘Products.WcfServices.SystemLogCategory’ contains cycles and cannot be serialized if reference tracking is disabled.’.  Please see InnerException for more details.

After a fair amount of searching, I found a way to work around this little problem.  Applying the suggested [CyclicReferencesAware(true)] attribute to methods involving collections appears to fix it.  After applying the attribute and updating the service reference (just to be sure!) you will find the call succeeds, as per below:

[Screenshot: the collection call succeeding]

 

But Wait.. There’s More..

Just when you thought it was safe to go back into the ocean… what happens when we want to send things in the other direction?

Let’s look ahead to a web service method which takes one of our POCO objects, and tries to apply an update.
The logic I’ve used here detects a brand new entity, and also handles the case where an existing entity cannot be located in the data store.

[Screenshot: the update method on the service side]
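In case the screenshot doesn’t come through, the update method is along these lines.  The context, entity set and key names are assumptions; see the sample solution for the real implementation:

public void UpdateSystemLog(SystemLog entry)
{
    // Requires System.Linq and System.ServiceModel.
    using (var context = new ProductsEntities())   // assumed ObjectContext name
    {
        if (entry.LogId == 0)
        {
            // No key yet - treat it as a new entity.
            context.SystemLogs.AddObject(entry);
        }
        else
        {
            var existing = context.SystemLogs.SingleOrDefault(l => l.LogId == entry.LogId);
            if (existing == null)
            {
                // An existing entity could not be located in the data store.
                throw new FaultException("The log entry could not be found.");
            }

            // Copy the scalar values from the detached POCO onto the attached entity.
            context.SystemLogs.ApplyCurrentValues(entry);
        }

        context.SaveChanges();
    }
}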

So, nothing terribly complicated, correct?  Now let’s implement something very simple on the client side, like the following:

[Screenshot: the client-side update code]

When we try to execute this rather simple update scenario, we get the same kind of exception we’ve seen before:

[Screenshot: the exception thrown by the update call]

 

I love it when a plan comes together..

So what is the solution?  It’s rather simple in concept, if somewhat involved in the implementation.
The approach I found which works quite well is to emit the same attribute into the client-side code generated from the WSDL when the service reference is created.
This turned out to be a pretty straightforward idea, but a terribly intriguing problem to solve.

Without delving too much into the details (please download and examine the sample solution), the basic premise was twofold:

  1. Define the required files in a common or shared assembly that both the service and the client project can consume.
  2. Build a class which implements several WSDL and code generation extension points: IWsdlImportExtension, IServiceContractGenerationExtension, IOperationContractGenerationExtension and IOperationBehavior

Basically, the class is triggered when the WSDL is being imported, and it adds the [CyclicReferencesAware(true)] attribute above the appropriate methods.
To do this, you must modify the client’s App.Config to include the following configuration:

[Screenshot: the App.config entry registering the WSDL import extension]

When the WSDL import runs, the referenced extension finds operations decorated with the CyclicReferencesAware attribute (the export side marks them with a documentation annotation).
When such an operation is found, the importer adds a reference to itself to the operation’s behaviours collection.
Then, as the client code is generated, it’s a relatively easy step to output the required attribute.
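Pulling that together, the import/code-generation side of the extension looks something like this fragment.  It’s a sketch only: the class name is mine, the sample solution combines these roles with the attribute and the export side, and the empty IOperationBehavior members exist purely so the importer can attach itself to the operation’s behaviours collection:

using System.CodeDom;
using System.ServiceModel.Channels;
using System.ServiceModel.Description;
using System.ServiceModel.Dispatcher;

public class CyclicReferencesAwareCodeGenerator : IOperationContractGenerationExtension, IOperationBehavior
{
    // Called by "Add Service Reference" / svcutil while the client proxy code is generated.
    public void GenerateOperation(OperationContractGenerationContext context)
    {
        // Emit [CyclicReferencesAware(true)] above the generated operation.
        var attribute = new CodeAttributeDeclaration(
            "CyclicReferencesAware",
            new CodeAttributeArgument(new CodePrimitiveExpression(true)));

        context.SyncMethod.CustomAttributes.Add(attribute);
    }

    // IOperationBehavior members intentionally left empty.
    public void AddBindingParameters(OperationDescription description, BindingParameterCollection parameters) { }
    public void ApplyClientBehavior(OperationDescription description, ClientOperation proxy) { }
    public void ApplyDispatchBehavior(OperationDescription description, DispatchOperation dispatch) { }
    public void Validate(OperationDescription description) { }
}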

Now, when you update the service reference, the appropriate attribute is applied to the generated service reference code, as you can see from the screenshot below:

[Screenshot: the regenerated reference code with the CyclicReferencesAware attribute applied]

Side Notes

The only thing I didn’t figure out was how to add the required using directive to the generated code; however, it is very easy to add it yourself.  Just compile the client project and you’ll get the appropriate errors; double-click on one, right-click on the unresolved reference, and you can easily add it to the code.  I realise it’s bad practice to modify generated code, but I ran out of patience and figured this wasn’t a terrible oversight.  If you find a nicer way to fix this, please get in touch.

After updating the configuration (and referencing the shared assembly), the previous code now runs just fine.  You can check the database to ensure the update occurred.

[Screenshot: the updated row in the database]

 

Summary and Disclaimer

Thus far, I haven’t had much time to test this beyond the scenarios above, although I’ve implemented it on a number of web service clients without any problems.
I’ve not tried anything more complicated, but I’d really appreciate feedback if people run into further issues.

To wrap up, I’ve included the sample project and T-SQL to create a database.  This is not production code, so please use it as a demo. 
There’s no encryption, compression or other types of scenarios we might encounter in a complete system. 
It is supplied “As-IS” and no warranty is implied :)

As always, if you have any feedback please leave it here or get in touch.

Seriously though, I sincerely hope this might help out some folks who are as intrigued and equally baffled with WCF and the Entity Framework.

Bon Appétit.  /R

[ Download Sample Project and Schema ]

Additional Reading

http://blogs.msdn.com/adonet/archive/2009/12/22/poco-proxies-part-1.aspx
http://blogs.msdn.com/adonet/archive/2010/01/05/poco-proxies-part-2-serializing-poco-proxies.aspx

MSDN Walkthrough on POCO Entities

http://msdn.microsoft.com/en-us/library/ee705457.aspx

The source for the cyclic check is courtesy of:

http://chabster.blogspot.com/2008/02/wcf-cyclic-references-support.html