28 May, 2011

.NET: How to externalize your configuration

At times you run into a situation where an “externalization” of your configuration files would be nice.

Your host is not always your “own” process; sometimes your code is being executed by some existing process (e.g. a service), and you do not want to “pollute” this system’s native configuration with settings only your code requires. Put in other words – settings that apply only to your own code and not to the hosting service.

How to do this?
In the illustration below, we want to provide connection strings from the outside, as our own code (extension1.dll) needs them. In addition to connection strings, we also want to supply appSettings from the outside.

external configuration

This is quite easily achieved as .NET’s configuration subsystem has had support for this since .NET 2.0.

In the hosting process, you tell the configuration subsystem that it should go looking for settings in another file. This is achieved by an attribute called configSource in the app.config file.

app.config

settings.config

 

connections.config
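The original screenshots of the three files are not reproduced here, so below is a minimal sketch of what they could look like (the key and connection string names are just examples). Note how the hosting process’ app.config points to the two external files via configSource:

app.config (hosting process):

<?xml version="1.0"?>
<configuration>
  <appSettings configSource="settings.config" />
  <connectionStrings configSource="connections.config" />
</configuration>

settings.config (fragment only – no <configuration> root element):

<appSettings>
  <add key="Extension1.SomeSetting" value="SomeValue" />
</appSettings>

connections.config (fragment only – no <configuration> root element):

<connectionStrings>
  <add name="Extension1Db"
       connectionString="Data Source=.;Initial Catalog=Extension1;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>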

Do pay attention to the fact that these external files only contain xml fragments. This is important, as the configuration system will fail to initialize otherwise.

Why is this so? The reason is found in the way the .NET configuration system merges these 3 files together in memory. It is analogous to the configuration transformations that later came out in Visual Studio.

Anyway – to make use of this, you read the settings the traditional way (this is just a sample to test out the configuration!). The beauty is that the hosting process is completely unaware of this “externalization”.

program code
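The code screenshot is not reproduced here either, so here is a minimal sketch of such a test program (the key and connection string names simply match the example fragments above):

using System;
using System.Configuration;   // requires a reference to System.Configuration.dll

class Program
{
    static void Main()
    {
        // Read through the normal configuration API; the fact that the values
        // physically live in settings.config/connections.config is invisible here.
        string setting = ConfigurationManager.AppSettings["Extension1.SomeSetting"];
        var connection = ConfigurationManager.ConnectionStrings["Extension1Db"];

        Console.WriteLine("AppSetting: {0}", setting);
        Console.WriteLine("ConnectionString: {0}", connection.ConnectionString);
    }
}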

Result:

Resulting process

Note1: This only works if your configuration files are located in the same folder (or a subfolder) as the parent process.

Note2: The (native) configuration section you wish to provide from the outside needs to carry the ‘configSource’ attribute in app.config.

That’s all there is to it…


25 May, 2011

WP7/7.1: Where is INotifyPropertyChanging defined?

Given the new ability (in the 7.1 SDK) to create a local database in a WP7 application, I set out today to create one for proper relational storage.

According to samples and documentation, you should implement INotifyPropertyChanged as well as INotifyPropertyChanging. The first is easy to find, but the latter is not where the documentation says it should be. The documentation claims the INotifyPropertyChanging interface lives in System.dll, but alas, that is not the case!

image

You will need to find (and thus reference) this in a new assembly: mscorlib.Extensions.

image
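For completeness, a minimal sketch of a table class implementing both interfaces (the class and column names are made up for this example) could look like this:

using System.ComponentModel;
using System.Data.Linq.Mapping;

[Table]
public class Customer : INotifyPropertyChanged, INotifyPropertyChanging
{
    private int _id;
    private string _name;

    [Column(IsPrimaryKey = true, IsDbGenerated = true)]
    public int Id
    {
        get { return _id; }
        set { OnPropertyChanging("Id"); _id = value; OnPropertyChanged("Id"); }
    }

    [Column]
    public string Name
    {
        get { return _name; }
        set { OnPropertyChanging("Name"); _name = value; OnPropertyChanged("Name"); }
    }

    public event PropertyChangedEventHandler PropertyChanged;
    public event PropertyChangingEventHandler PropertyChanging;   // defined in mscorlib.Extensions on WP7

    private void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null) handler(this, new PropertyChangedEventArgs(propertyName));
    }

    private void OnPropertyChanging(string propertyName)
    {
        var handler = PropertyChanging;
        if (handler != null) handler(this, new PropertyChangingEventArgs(propertyName));
    }
}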

Who would have known… ;-)

Update (30.05.2011): The mscorlib.Extensions.dll is found here:
’C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\Silverlight\v4.0\Profile\WindowsPhone71\mscorlib.Extensions.dll’


24 May, 2011

FIM 2010: How to run a “Hello world” - workflow

Given the very small number of examples available for FIM 2010, I’ll try to remedy this for people new to FIM 2010. Manipulation of data in FIM is done in a number of different ways. One developer-centric way is via execution of a custom workflow (.NET 3.5), which is the method I’ll be exploring in this sample.

For an overall introduction to FIM 2010 – see this post:
http://blog.clauskonrad.net/2011/05/fim-2010-developers-quick-introduction.html

The process goes through these steps:

1) Create the custom workflow in Visual Studio (2008/2010)

2) Install it into FIM 2010

3) Create a condition for invoking the workflow (via an MPR)

4) Invoke the workflow

architecture

The above figure (coming from Microsoft's website) presents the extension points in FIM 2010. As seen, the FIM Service exposes 3 different kinds of workflows for you to extend:

6) Authentication (who is calling)

7) Authorization (is the individual allowed access/execution)

8) Action (basically every other manipulation)


The nature of these workflow types is kind of self-explanatory and the “Hello World” sample will be an Action Workflow. But, before engaging in code, let’s determine what actually happens when triggering an event in the FIM Service.

Every incoming action is presented to the FIM Service via its web-service API as a Request. The Request contains info on what Operation (Create, Update, Read, Delete) is requested, who (the Actor) is calling, and what Resource (e.g. a Person object) the request is intended for. The illustration below (coming from Microsoft) shows the processing of an incoming Request.

FIM service request pipeline

Note: The webservice (as far as I know) in itself does not perform any action apart from presenting the Request to the FIM Service for evaluation.

As seen in the figure – the last step is the calling of an ‘Action’ workflow, if one exists. This is where we come in. Before invoking the Action Workflow, let’s see how workflows are actually hosted in FIM.

 

Workflows in FIM 2010

Coming from a WF 3.5 background, you might think it is straightforward? Well – not entirely :-). The workflows need to obey certain rules and be embedded as an activity in a special parent sequential workflow class (Microsoft.ResourceManagement.Workflow.Activities.SequentialWorkflow) from FIM’s universe. This FIM parent workflow itself inherits from the System.Workflow.Activities.SequentialWorkflowActivity class from the native WF 3.5 namespace.

FIM version of SequentialWorkflow

Why a need for this special FIM workflow?
The FIM version of the SequentialWorkflow holds a number of FIM-related properties that allow you to gain access to, among other things, the incoming Request object as well as the Resource you wish to manipulate (e.g. a Person object). If you inherited directly from the std. WF version, you would not be able to access these properties, which are essential in a FIM context! If you open the FIM version (the blue element above) in e.g. Reflector or Visual Studio, you will see properties such as ResourceID, Resource, Request etc.

Okay – so far so good; now we know how it is hosted. So how do you create a HelloWorld workflow?

Creating your HelloWorld workflow

1) Open Visual Studio and select Target = 3.5

2) Select ProjectType = ‘Workflow Activity Library’

Visual Studio 2010

3) By default, this project type will present you with an Activity that inherits from System.Workflow.Activities.SequenceActivity, which is exactly what you need to create a sequence of actions.

Default designer: SequenceActivity

4) To read a request – drag a ReadResourceActivity into the design surface

ReadResourceActivity

5) Bind Actor, Resource and ResourceID of this activity to 3 new fields

new fields...
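The screenshot of the new fields is not reproduced here, but the bindings end up pointing at plain public members on your activity class; roughly something like this (the exact types follow the ReadResourceActivity properties you bound, and the names simply match the code in the next step):

public Guid readRequest_ActorId = default(Guid);
public Guid readRequest_ResourceId = default(Guid);
public object readRequest_Resource = null;   // holds the resource that was read (see the note below about when it is populated)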

6) Override the Execute method and retrieve the parent workflow (the FIM version)

protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
{
    SequentialWorkflow parentWF = null;
    if (!SequentialWorkflow.TryGetContainingWorkflow(this, out parentWF))
        throw new InvalidOperationException("Logging activity can not run outside the context of the FIM SequentialWorkflow.");

    readRequest_ResourceId = parentWF.RequestId;
    readRequest_ActorId = parentWF.ActorId;

    m_logData.Add("Got called by FIM...");
    m_logData.Add("\tIncoming requestId: {0}", readRequest_ResourceId);
    m_logData.Add("\tIncoming actor: {0}", readRequest_ActorId);

    base.OnSequenceComplete(executionContext);
    return base.Execute(executionContext);
}
The parent workflow is the special FIM version, remember? This guy carries the incoming Request, Actor and Resource to be manipulated. Set the newly created fields ‘ResourceID’ and ‘Actor’ to point to these properties of the parent WF (i.e. let the parent WF populate the fields of your custom activity). Note that if we fail to retrieve the parent workflow, we throw an exception – it makes little sense to run without it!

7) Override the OnSequenceComplete method to report on progress
protected override void OnSequenceComplete(ActivityExecutionContext executionContext)
{
    var reqType = readRequest_Resource as RequestType;
    if (reqType == null)
        throw new InvalidOperationException("Cannot locate current request...");

    m_logData.Add("Request (type=[{0}]) has been processed with the following parameters:", reqType);

    var reqParameters = reqType.ParseParameters<CreateRequestParameter>();
    foreach (var item in reqParameters)
    {
        m_logData.Add("\trequestParameter.PropertyName: [{0}] / requestParameter.Value: [{1}]", item.PropertyName, item.Value);
    }

    base.OnSequenceComplete(executionContext);
}
Note: For some reason unknown to me at this time, the ‘Resource’ itself is only available when the OnSequenceComplete method is called by the parent workflow. For this reason, it is extracted in the OnSequenceComplete method and NOT in the Execute method (where Resource = null).
That was a long way to go, but we are not done yet.
 

Installing the workflow in FIM 2010

To install the workflow (the parent FIM version) including your own SequenceActivity, you need to write some XOML to import into FIM. The XOML is generated quite easily in a console app with these lines of code:
static void Main(string[] args)
{
    var logAct = new FIMLoggingActivity();

    var seqWorkflow = new SequentialWorkflow(); // parent = special FIM SeqWF!
    seqWorkflow.Activities.Add(logAct);         // add std. WF activity to parent

    var settings = new XmlWriterSettings();
    settings.OmitXmlDeclaration = true;

    using (var xmlWriter = XmlWriter.Create("CustomWF.xoml", settings))
    {
        var ser = new WorkflowMarkupSerializer();
        ser.Serialize(xmlWriter, seqWorkflow);
        xmlWriter.Flush();
    }
}

As seen, these few lines of code create a FIM SequentialWorkflow as the parent and add your custom activity (‘logAct’ in the above) as its single child activity. Next, the object graph is simply serialized to disk using the standard XmlWriter (System.Xml). The resulting file (CustomWF.xoml) will be imported into FIM when creating the workflow.


1) Copy the assembly from your development machine –> FIM Server at (C:\Program Files\Microsoft Forefront Identity Manager\2010\Service\*.*)


2) Open the FIM Portal and hit ‘administration’ – ‘workflows’


3) Press ‘new’ and import the xoml-file


WF creation in FIM


4) You are done importing the WF into FIM


 


Invoking the workflow in FIM 2010


Finally – you need some means of invoking the new workflow.


1) Open ‘Management Policy Rules’


2) Add a new policy


MPR


3) Note that it will run when a single property is edited on a Person object (All People)


4) Select Target Resources (before/after)


image 


5) Select your ‘Action’ workflow (in the below ‘CK_MobileWF’)


image


6) You are done!


 


Now – open a person object and change a property. This will trigger the MPR which will trigger your workflow (technically the parentWF which will call your activity).


Change a property -> invoke WF


That’s how ‘easy’ it is (in the simple version!)


 



23 May, 2011

FIM 2010: A developer’s quick introduction

Wow – if everything was as poorly documented as FIM 2010 (Forefront Identity Manager 2010), the world would surely look a lot different! Given the very poor assistance available when it comes to programmatic interaction with FIM, I’ll be doing a short intro to aid people who are in the same boat as I was when I started playing around with this “beast”.

Overall Design
architecture

Concepts

First off – let’s establish some terminology. The following 3 entities are important to know:

FIM Service: Windows Service + Web Service (hosted by the Windows Service)
FIM Sync: Windows Service
FIM Portal: SharePoint (WSS 3.0) based website

FIM Service
This guy is the more recent entry point to FIM. It is technically a Windows Service (‘Forefront Identity Manager Service’) which runs under a specific account (‘FIMServiceAccount’). In the FIM world, this service account is “God” and can do “anything” in FIM. This FIM Service holds most of the extension points to FIM (5, 6, 7, 8). The Windows Service hosts and presents a WCF service to outside clients, which is the entry point (5) for all incoming Requests to FIM. In the above illustration, the FIM Service is seen as ‘FIM Web Service’. This component has its own database (name = ‘FIMService’).

FIM Sync
This is also a Windows Service (‘Forefront Identity Manager Synchronization Service’) and is the ‘traditional’ functionality of FIM. This service is responsible for calling the various Management Agents in FIM when a ‘Run Profile’ is executed. Management Agents are seen as ‘Adaptors’ (10) in the above illustration. This component has its own database (name = ‘FIMSynchronizationService’).

Services

FIM Portal
The FIM Portal is a WSS 3.0 website that allows the user to indirectly manipulate the objects in the metaverse. Indirectly means that the manipulation is made in the Portal’s database and subsequently synchronized to the metaverse (see below). This portal has the traditional WSS databases (see documentation on WSS at Microsoft’s website).

FIM Portal

 

Sync Service (further)

The FIM Sync service is functionally an ‘engine’ that manipulates a number of sub-elements that are also important to know about. These elements are seen below:

Metaverse: A ‘space’ containing combined identity information for all entities
Management Agent: An adapter that connects the Connector Space objects with the outside system (e.g. HR-system, AD or other system)
Connector Space: A temporary storage area for entities

Metaverse
This is technically a set of SQL Server tables (now we are comfortable again!) holding all attributes about entities modeled in FIM. These ‘entities’ are typically Persons but can in theory be anything you wish to model in FIM. Entities in the Metaverse are manipulated by Management Agents. Management Agents update and modify the metaverse from multiple connected data sources via their representations found in the Connector Space. As an example, an HR-system does not hold all attributes about an employee, but only a subset. The entire list of attributes relevant to an employee is a combination of different sources (HR, telephone system etc.) that in unison form the “truth” about an employee. The metaverse object (Person) holds the combined “truth” about this employee – combining the various Connector Space objects into one.

Management Agent
A Management Agent (MA) is basically an adapter that connects the Connector Space object with a specific source/target system. An example could be an MA that connects an HR-system holding data about employees with the employee representation in the Connector Space.

Connector Space
The connector space is a storage area where object additions, deletions, and modifications are written before they are synchronized with the metaverse or the connected data source. Again this is a set of SQL Server tables. It is important to recognize that a single Connector Space representation of e.g. a Person holds only a subset of the full metaverse object’s attributes (as per the metaverse description).

The 3 entities are seen in combination below. As previously mentioned, the Metaverse representation is the full ‘truth’ about an entity and is the combined attribute collection from the 3 connector space objects above.

Connector Space, MA and Metaverse

How are the ‘FIM Service’ and the ‘FIM Sync’ Service related, you might ask? And rightly so. It is not evident from the above where the connection exists (but it does!). The FIM Service and its data are synchronized with the metaverse via a Management Agent, just like any other external system. The FIM Sync Service and its metaverse therefore see the portal data as just another external system.

I sincerely hope this gave an initial overview of FIM 2010’s components.

 


19 May, 2011

Book Review: “.NET Windows Development; Everyday Tips, Tricks & Optimization”

Review
I've been given the opportunity to review the Krasis Press book (".NET Windows Development; Everyday Tips, Tricks & Optimization") by Alberto Población (C# MVP). It is an easily read book of approx. 250 pages, written by an author with a good command of the English language, hence it is easily approachable.

Audience to this book
In the foreword, the author states that the audience of the book is Line-Of-Business (LOB) developers. This is not a very precise definition, as this category spans a great variety of people and can include both junior developers with little experience under the belt and senior developers with years of experience in .NET development. The proper audience for this book falls into the first category (junior developers), who will benefit most from reading it. There is not much to gain for the senior developer.

Content
The book seems to be a listing of the everyday obstacles the author has faced over his long career as a consultant in the field. In that respect it does cover some by now “ancient” development techniques that are not considered best practice anymore.

As an example, ADO.NET is covered with problems/solutions that were indeed issues if you go back 4-5 years, whereas these problems no longer exist for the everyday developer, as free and built-in ORM tools like Linq2SQL, Entity Framework, NHibernate etc. have replaced the need for direct manipulation of data. You can of course think of a situation where direct manipulation is called for, but such cases are rather rare in my view. In that light, some of the chapters unfortunately seem an outdated waste to people with senior experience.

Another example that I’m curious about is the fact that the author chooses to focus on another ancient UI technology, WinForms. WinForms is an ageing technology that, granted, is still in use in a lot of places today, but it is definitely not considered the obvious choice for new applications. Here Silverlight and WPF have taken over the throne as best practice and the first choice for UI development; this is not touched on at all.

Not all is outdated though; a few chapters touching on newer technology, like Windows 7 specific APIs, are covered, which comes as a breath of fresh air in the book. That’s a good thing.

Overall evaluation
In my view, this is a book definitely intended for the junior developer. As the book is a listing of problems the author has seen in the field, it is not very focused but covers a great array of areas in .NET development in general. A rather large portion of the book addresses known and basic technology that you should already know if you are a .NET developer, hence this seems a waste to me. To gain a lot from this book, you need to be facing a specific problem that matches one in the book. In that case, you can adopt its solution; otherwise it is just a listing of problems with solutions.

I would have liked to see topics like WCF, WF, WPF and Design Patterns covered if the intention was to present a comprehensive list of problems and matching solutions. None of these are touched on in the book, unfortunately. I wonder whether people presented with a problem would not go hunting the Internet for a solution instead of reading a book that may possibly present a solution to the problem they are facing.

15 May, 2011

WP7: How to unit-test your ViewModels when using MVVM

Update:
For this to work, you need to install the NUnit Silverlight framework. You can download the .vsi file from here: http://www.testdriven.net/downloads/SilverlightNUnitProject.zip

This will give you this project template:

image

 

The combination of Silverlight, WP7 and unit testing has always been a source of great frustration for me. It has remained an unsolved mystery to me why it is not possible, with the native unit-testing framework from Microsoft, to author and run unit tests against your viewmodels. Come on, they are just plain assemblies (dll).

I know a specific Silverlight unit-testing framework has emerged, but I want to use the code-friendly unit testing experience that I’m used to inside Visual Studio. I don’t want to be unit testing in the phone emulator!

Anyway – being forced to use an old-time friend (NUnit) in my current work situation has, it seems, actually solved the problem. NUnit does not care what assemblies it is asked to load; it just loads them and runs the unit tests you specify! Therefore, you can use NUnit to perform the ViewModel unit testing you wish.
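As an illustration, a minimal NUnit fixture against a hypothetical MainViewModel (the class and property names are made up for this example) could look like this:

using NUnit.Framework;

[TestFixture]
public class MainViewModelTests
{
    [Test]
    public void Setting_Title_raises_PropertyChanged()
    {
        var vm = new MainViewModel();                 // hypothetical ViewModel under test
        string raisedFor = null;
        vm.PropertyChanged += (s, e) => raisedFor = e.PropertyName;

        vm.Title = "Hello";

        Assert.AreEqual("Title", raisedFor);
        Assert.AreEqual("Hello", vm.Title);
    }
}

No emulator, no special test host – just a plain assembly loaded by NUnit.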

But this requires an external GUI for running unit tests, you might say? That is not entirely correct; an external GUI test runner does exist, but you can also download Visual NUnit (see screenshot) from the Online Gallery inside Visual Studio. This runner can be nicely docked in your right-hand side toolbox and will allow you to run unit tests as easily as with any other (Microsoft) framework you might think of.

Visual NUnit

The quality of these unit-tests is not dependent on what framework you are using, but on your own code.

So this is actually the way to do unit testing of the ViewModel in an MVVM application for the WP7 phone…

Happy coding.

14 May, 2011

Same codebase –> multiple .NET frameworks ?

Building a logger that I want to be available to both .NET 3.5 and .NET 4.0 clients, I do not want to have two copies of the same project (one for 3.5 and one for 4.0). So what to do? How can you set up Visual Studio to build and emit code for 3.5 and 4.0 respectively, all from the same code base?

The short answer is: you cannot! You have to have a physical copy of the project for each version, unless you hand-fiddle with the project (.csproj) files ;-) In that case: Yes, we can (to quote a politician I hold in very positive esteem).

Build for 2 frameworks...

How to do?

1) In Visual Studio – create 2 new Configurations (‘3.5’ and ‘4.0’)

Settings dialog

2) Close the project

3) Open the project file (.csproj) in an external editor (e.g. Notepad++)

4) Locate the ‘<TargetFrameworkVersion>’ tag in the first PropertyGroup of the file

csproj-file

5) Take out this element/tag and insert it into the two new PropertyGroups just created in step 1

csproj-file

6) Replace the tag with the following values respectively (one for each PropertyGroup):
<TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
<TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
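To illustrate, the two configuration-specific PropertyGroups could end up looking roughly like this in the .csproj (the output paths are just an example):

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == '3.5|AnyCPU' ">
  <TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
  <OutputPath>bin\3.5\</OutputPath>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == '4.0|AnyCPU' ">
  <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
  <OutputPath>bin\4.0\</OutputPath>
</PropertyGroup>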

7) Close the file (.csproj) and re-open it in Visual Studio 2010

8) Now you can select the target configuration (3.5 or 4.0) and you will see an assembly built for the two different frameworks respectively

bin-folder

9) You are done!

 

Verification:
To verify that this actually works, open the files (dll) in Reflector and see the internal references

.NET 3.5 version     .NET 4.0 version
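If you would rather verify this programmatically than in Reflector, a small console sketch (the path and assembly name are hypothetical) can list the referenced assemblies and their versions:

using System;
using System.Reflection;

class ReferenceDump
{
    static void Main()
    {
        // Point this at one of the freshly built assemblies (hypothetical path/name).
        var asm = Assembly.ReflectionOnlyLoadFrom(@"bin\3.5\MyLogger.dll");

        foreach (AssemblyName reference in asm.GetReferencedAssemblies())
            Console.WriteLine("{0}, Version={1}", reference.Name, reference.Version);
    }
}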

How clever is that?

Can’t copy a file: The file is in use by another process…?

Often you run into the file copy issue (“the file is in use by another process”). Sometimes it is evident which process is using your file; other times it can be a little complicated to determine which process is indeed holding on to the file you are trying to copy from or to.

Unable to copy

Well – to determine this in a quite robust way, you have the luxury of using ‘Process Explorer’ by Sysinternals. Follow this recipe to see how to determine the exact process (PID) holding on to your assembly.

Example:
I want to copy an assembly (FIMLoggingActivityLibrary.dll) like this:

Development machine –> Test machine

But, when I attempt this, I’m presented with the below dialogs:

image

File Copy failure

I know the name of the assembly I want to replace (FIMLoggingActivityLibrary.dll), so armed with this knowledge, I can use ProcessExplorer to find the hosting process:

Press ‘Find’ – ‘Find Handle or Dll…’

Process Explorer

Next – enter the known part of the file name you want to search for:

Process identified

Here you have the name and process ID (PID) of the process holding on to your file. Now, either shut the process down in a nice fashion (in the above case using Control Panel – Services) or in a rude manner by killing the process from Task Manager. Which you do really depends on the situation.

Service Manager

When this process is stopped or restarted, you are able to copy the file initially requested.


04 May, 2011

Hyper-V: How to enable remote access to guest environment

Setting up a multi-machine development environment on a Windows 2008 server, I’ve run into some challenges surrounding networking and access to the Guest environment from the Host (my own machine).

Now – I have found a workable solution that allows for the following which pretty much covers my needs at this point in time:
1) Access from Host –> Guest
2) Internet access from Guest
3) Drive sharing: Host drives are visible to Guest
4) Remote Desktop (RDP) access: Host –> Guest

Overall View

How to do this?
To my surprise, Hyper-V works very differently when it comes to networking compared to the (other) Virtual PC product from Microsoft that I’m used to. Reading a great deal of blogs and internet posts today has educated me in this area. Anyway – here is how.

Required software (host):
1) Windows Server 2008 (x64 in my case)
2) Terminals (terminals.codeplex.com) (native RDP client will do as well)

Recipe:
1) Enable Virtual Networking on Hyper-V (host). Make sure this is Internal Networking
2) Call the Virtual Network something meaningful (e.g. Virtual Internal Network)

Hyper-V Manager    Virtual Network

Note: When this is done, you will see an additional network adapter on your Host machine (which explains the meaningful naming ;-))

image

 

3) Associate the Virtual Network just created on the Guest machine configuration in Hyper-V

 image   image

Now we are done with Hyper-V for the moment. Next is the configuration of the physical network cards on the Host machine.

4) Share your active physical networking card (NIC) on your Host using ICS (Internet Connection Sharing)

image

5) Start the Hyper-V machine (Guest)
6) Once it is up and running – right-click and ‘Connect’ from within the Hyper-V Manager
7) Login to the Guest
8) Open a Cmd-prompt and observe the allocated IP-address of the Guest

image

Now – armed with the IP-address of the Guest machine, you can RDP into the machine from the Host.

9) Open Terminals and enter the IP-address
image

10) To present the Host-drives to the Guest, enable this in Terminals

image

11) Connect to the Guest using Terminals

image

12) Observe the Host-drives are visible from the Guest

image

 

The Guest machine will now present itself like any other networked machine visible from your Host machine. By this maneuver you are able to transfer files from Host –> Guest as well as use the Guest as a webserver/test machine from the Host.

 

