Thursday, October 30, 2008

VS2010 and Architecture

I had a little time to spare and started playing around with Visual Studio 2010, and I noticed some cool features. There's a new project type called the Modeling Project. I was a bit sceptical about this, because the modeling experience in VS2005 and VS2008 was not what I expected...

In VS2010 it is now possible to create Use Case, Activity, Component, Sequence and Layer diagrams. Even versioning is possible with your favorite source control product (TFS). Now I must have been asleep for a while or busy doing other things, but Microsoft has joined the OMG, so the new UML diagrams are all UML 2.4! Now this I hadn't expected.

It's also possible to create Sequence diagrams from existing code, which is a big help if you're a new architect on an existing project. Talking about architectural discovery: VS2010 adds a new Architecture Explorer which creates Sequence diagrams, project dependency graphs and all sorts of handy diagrams that can help you understand the system.

From my point of view the Architecture Explorer is the best! It's a cool new tool in VS2010, which helps me get a better understanding of the system I'm working with. I can't wait to use it in real projects.

.NET Services and The Service Bus

The Service Bus is a new messaging infrastructure, part of the new Windows Azure Services. It addresses a few connectivity challenges we are all aware of:

  • IPv4 Address Shortage
    • Dynamic IP address allocation
    • Network address translation
  • Firewalls layered over firewalls over firewalls

So how does the Service Bus address these problems? There are a couple of components the Service Bus consists of:

- Naming: Because of the practical constraints of DNS (high update-propagation latency, DNS naming hosts rather than instances, etc.), the Service Bus naming system works in a different way. I'm not going to dive into it now, but a few features are: read/write access with access control through the Service Bus Registry, updates that are reflected instantaneously, and names for endpoints rather than machines.

- Service Registry: The service registry is a registry for service endpoints, not a general-purpose directory. The registry is layered over the naming system, and services should publish themselves in the registry. The registry also provides the family of bindings for the Service Bus. Right now the Service Bus can be used with the following bindings:

  • BasicHttpBinding
  • WebHttp
  • WSHttp
  • WS2007Http
  • WSHttpContext
  • WS2007FederationHttp
  • NetTcp
  • NetTcpContextBinding

So how does the Service Bus work with these bindings? If you use NetOnewayRelayBinding, the receiver creates an outbound bidirectional socket and registers itself in the cloud (the registry of the Service Bus); this requires TCP/SSL port 828 to be open for outbound traffic. The sender makes an outbound one-way net.tcp connection (TCP/SSL 808/828) to the Service Bus. When the sender sends a message, the Service Bus creates a route to the receiver.
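To make the receiver side concrete, a minimal sketch (assuming the .NET Services SDK with the Microsoft.ServiceBus assembly is installed; the contract, solution name and endpoint URI are hypothetical, not from the session):

using System;
using System.ServiceModel;
using Microsoft.ServiceBus; // from the .NET Services SDK

[ServiceContract]
public interface IPing
{
    [OperationContract(IsOneWay = true)] // one-way operations fit the one-way relay binding
    void Ping(string text);
}

public class PingService : IPing
{
    public void Ping(string text) { Console.WriteLine("Got: " + text); }
}

class Receiver
{
    static void Main()
    {
        // The sb:// URI names an endpoint in the Service Bus registry,
        // not a machine; "mysolution" is a hypothetical solution name.
        ServiceHost host = new ServiceHost(typeof(PingService));
        host.AddServiceEndpoint(typeof(IPing),
            new NetOnewayRelayBinding(),   // outbound TCP/SSL 828
            new Uri("sb://servicebus.windows.net/services/mysolution/ping"));
        host.Open(); // registers the listener in the cloud
        Console.ReadLine();
        host.Close();
    }
}

The sender side is the mirror image: a ChannelFactory<IPing> with the same binding and URI.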

The NetEventRelayBinding allows multiple receivers, so you can send a message and all subscribed receivers will receive it.

NetTcpRelayBinding / Relayed is the one you should almost always use, because it has the highest throughput. A connection starts just like NetOneway, but instead of connecting to the Service Bus we connect to the load-balanced front-end nodes. These nodes are a sort of socket-to-socket forwarder: a node sends a control message to the cloud, the cloud sends a control message to the receiver, the receiver then connects to the socket-to-socket forwarder, and from then on the sender and receiver are connected through the forwarder.

The last part I haven't discussed is security; this is where the Relay Access Control Model comes in.

- Relay Access Control Model:

RECEIVER MODEL: The receiver goes to the Access Control Service (Azure), acquires an access token, and asks for a claim granting Listen permissions on the Service Bus relay. This access token is then passed to the Service Bus relay, which evaluates the token.

SENDER MODEL: The same way: get a token, send it to the relay, the relay evaluates it, and then the relay passes the normal message on to the receiver.

There's even a possibility, if you don't want to use the Access Control Service, to just add the following line to your binding configuration: <security relayClientAuthenticationType="None"/>. I don't think it's a good idea, but you can.
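In app.config that would sit inside a relay binding configuration, roughly like this (the binding name is hypothetical):

<bindings>
  <netTcpRelayBinding>
    <binding name="unauthenticatedRelay">
      <!-- Skips the Access Control check for clients; not recommended -->
      <security relayClientAuthenticationType="None" />
    </binding>
  </netTcpRelayBinding>
</bindings>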

Wednesday, October 29, 2008

"Huron" Sync enabled cloud data hub

To my surprise the Microsoft Sync Framework team has teamed up with the SQL Services Lab to produce "Huron".

"Huron" is a combination of SQL Data Services and Sync Framework. Which actually means that an arbritary database can be published to the SQL Data Services (which is a part of the Windows Azure Services) and after that Sync Framework kicks in.

For example: you have some sort of SQL database containing information that you want to share with your colleagues (on mobile devices, for example). It's fairly easy to publish this database to the "cloud" (SQL Data Services). The data service will store your database as a blob in the cloud. After you've uploaded your database, Sync Framework does a first sync of the data in the database. Another user can subscribe to your database, which means that a template of the database is downloaded to their local drive and Sync Framework does a first sync of all the data. This user also gets a range of primary keys to use when adding rows. So when a user adds a row, its primary key will not be the id of the last row + 1, but the first free number in that user's primary key range. This little trick is done to simplify the sync logic: if two users each added a row that got the same id, Sync Framework couldn't determine which row should be persisted in the real database.
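The key-range trick can be sketched like this (a minimal illustration of the idea, not Huron's actual implementation; all names are mine):

public class KeyRangeAllocator
{
    private readonly long blockSize;
    private long nextBlockStart = 1;

    public KeyRangeAllocator(long blockSize)
    {
        this.blockSize = blockSize;
    }

    // Called by the hub when a new subscriber joins: hands out
    // a disjoint [start, end] range of primary keys.
    public KeyRange AllocateRange()
    {
        long start = nextBlockStart;
        nextBlockStart += blockSize;
        return new KeyRange(start, start + blockSize - 1);
    }
}

public class KeyRange
{
    private long next;
    private readonly long end;

    public KeyRange(long start, long end) { this.next = start; this.end = end; }

    // Each locally inserted row takes the next id from this range,
    // instead of "last id + 1" in the shared table.
    public long NextId()
    {
        if (next > end)
            throw new InvalidOperationException("Range exhausted; request a new block.");
        return next++;
    }
}

So subscriber A might get ids 1-1000 and subscriber B ids 1001-2000; rows they add while offline can never collide on the same id when they sync back.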

Everyone who has been working with Sync Framework won't see anything new here, except the fact that the database is now out in the "cloud" and not inside the company. So actually nothing new here, just hype to sell Live Services. I understand that for smaller companies, which don't have the money to cover the cost of administration, deployment and availability, this could be a good solution. For bigger companies that already have the network and the maintenance departments (and the budget) to support these kinds of technologies, this can be an option if you want your data up in the "cloud", but I think there are better options.

First contact with Oslo

The first presentation about the language Oslo was by Don Box and David Langworthy. It was really good to see that I don't have to know T-SQL anymore!

Why "M"

  • We want creating and interacting with Oslo content to be simple and natural
    • Having a box-and-line design experience is an important enabler
    • Having a complementary textual experience is equally important
  • M is how we achieve the latter

What is "M"

  • "M" is a language for defining domain models and textual domain-specific languages (DSLs)
  • M domain models define schema and query over structured data
    • Values, constraints, and views
    • Natural projection to SQL
  • M DSLs define projections from Unicode text to structured data
    • Rule-based transformation
    • Grammar driven text editor integration

What "M" is not

  • An object-oriented language
    • No polymorphism, virtual dispatch
    • "Is-a" determined based on structural subtyping, not stipulation
  • A data access technology
    • M domain models compile down to T-SQL
    • Tool chain supports coarse-grained loading/unloading of schemas and values - not an OLTP solution
  • A replacement for T-SQL
    • Far less expansive feature set
    • Tool chain supports linking/invoking T-SQL

So Oslo is an abstraction over T-SQL, and although it isn't finished yet, it's already fun to play with. A little example:

First to start with a normal table:

Module Contacts
{
    People
    {
        { Name="persona", Age=29 },
        { Name="personb", Age=28 }
    }
}

This actually means that we've defined a schema and filled the table with two rows.

Next up constraints:

Module Contacts
{
    -- Constraints:
    People :
    {
        Name : Text where value.Count <= 30;
        Age : Integer32;
    }*;

    People
    {
        { Name="persona", Age=29 },
        { Name="personb", Age=28 }
    }
}

What you see here is a constraint on the People table, which in T-SQL would look like this:

create table [Contacts].[People]
(
    [Age] int not null,
    [Name] nvarchar(30) not null
)

It's still another language to learn, but I like this more than T-SQL.

Tuesday, October 28, 2008

WPF on Windows 7

Scott Guthrie just presented some new features of WPF on Windows 7. They added some more interesting controls, like a jumplist. A jumplist adds some contextual support for your app: when you right-click your app in the taskbar, it shows the jumplist with its available actions. They also improved the theme support, and Deep Zoom will be in WPF 4.0!

They enabled multitouch, which you can use to make Surface-like applications (if you've got the hardware!).

Just to prove they believe in WPF (how could they not?), they even built Visual Studio 2010 on WPF! This means it's going to be really easy to extend Visual Studio, especially on the visualisation side.

Today a new WPF Toolkit will be released, with a new DataGrid (finally), DatePicker, Calendar, Ribbon, and a preview of the Visual State Manager that will be available in WPF in .NET 4.0. The Visual State Manager introduces two concepts: state transitions and visual states. A control, such as a DatePicker, has multiple visual states that define how it should look when the mouse is hovering over it, when it is pressed, etc. Transitions are used to define how visuals dynamically change as the control moves from one state to another.
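As a sketch of the two concepts, a control template might declare its states like this (a hedged illustration; the state group, state names and the HoverOverlay element are hypothetical, not taken from the toolkit):

<VisualStateManager.VisualStateGroups>
  <VisualStateGroup x:Name="CommonStates">
    <!-- Transitions: how visuals animate when moving between states -->
    <VisualStateGroup.Transitions>
      <VisualTransition To="MouseOver" GeneratedDuration="0:0:0.2" />
    </VisualStateGroup.Transitions>
    <!-- Visual states: how the control looks in each state -->
    <VisualState x:Name="Normal" />
    <VisualState x:Name="MouseOver">
      <Storyboard>
        <DoubleAnimation Storyboard.TargetName="HoverOverlay"
                         Storyboard.TargetProperty="Opacity"
                         To="1" Duration="0" />
      </Storyboard>
    </VisualState>
  </VisualStateGroup>
</VisualStateManager.VisualStateGroups>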

The new WPF Toolkit will be available here.

WF 4.0 vs WF 3.0

A fast blog for now: the best improvements for WF 4.0!

Activity:

  • Authoring is simpler and takes much less code
  • Fully declarative workflows and activities
  • Alignment across Expressions, Rules, and Activities
  • Seamless Composition Across Flow Styles

Runtime

  • 10-100x Performance Improvements
  • Full control over persistence
  • Flow-in Transactions
  • Partial Trust Support
  • Integrates with WCF, WPF, ASP.NET

Tools

  • Designer performance and usability
  • Rehosting Improvements
  • Unified Debugging Experience

I've seen the new designer and it rocks! It looks better and even has a new zoom function which works much better than the old one! But they added more: you can now define external arguments that flow in and out of the workflow. And, my personal favourite, it even shows which variables are in scope! And it's possible to use expressions on those variables.

WF 4.0 is a part of "Dublin" for a complete overview on "Dublin" check it out here!

Visual Studio 2010 CTP

Visual Studio 2010 and .NET Framework 4.0 CTP is out!

Available Windows Azure Services

Today there was another session on Windows Azure. It was sort of a walkthrough of all the available services on the Azure platform; in my last post I gave a general idea of the available services (SharePoint, CRM, Live, .NET and SQL services). Now I can give a little bit more info on the Azure services.
There are six available services on the Azure platform:
  • Access Control Service (identity management)
  • Mesh Service
  • Service Bus (application integration, with messaging events)
  • Workflow Service
  • LiveID Service
  • SQL Data Service

I found the Access Control Service, the Mesh Service and the Workflow Service the most interesting.

The Access Control Service can be used to delegate your identity management to the cloud. What you can do is to allow users to be identified by their LiveID or by using your own Active Directory, which can save you a lot of work.

The Mesh Service can be used to get more information about the user who has been identified. For example: if a user wants to register for an event, he has to fill in where he lives, his date of birth and all sorts of private info. The Mesh Service allows you to request this data so a user doesn't have to fill it in. (Of course you should have the user's permission to do this.)

The last one, the Workflow Service, is my personal favourite: you can create a workflow as you usually would and then publish that workflow to the Workflow Service! After you've published it, it is running in the cloud and can, for example, process the registration.

Monday, October 27, 2008

Windows Azure is coming!

Today we got a little surprise @PDC2008: Windows Azure was announced!
I immediately changed sessions so I could attend Manuvir Das' presentation "A Lap Around Windows Azure". For more information check out: Windows Azure.

So what is Windows Azure?
According to Microsoft, Windows Azure is an internet-scale cloud services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services that can be used individually or together. Azure's flexible and interoperable platform can be used to build new applications to run from the cloud, or to enhance existing applications with cloud-based capabilities.

I will be attending more sessions about Windows Azure, so more technical information will follow. From what I've seen today, it's going to be big; we've seen some demos of live services for SharePoint, CRM and SQL, all available for us to develop on.

Live Services is a set of building blocks within the Azure Services Platform for handling user data and application resources. Live Services provides developers with an easy way to build applications, across a range of digital devices that can connect with them.

The Azure platform consists of five major services: Live Services, .NET Services, SQL Services, SharePoint Services and Dynamic CRM Services.

We got access to the preview of Windows Azure, but the commercial release will be in 2009.
A couple of notes about the business model of Windows Azure:
  • Consumption based billing
  • Strict SLAs with financial guarantees
  • Global reach
  • Presence in multiple datacenters
  • Geo-distribution
  • Service hosting options
A bit of a side note: I attended this session with my colleague Wouter Crooy, who immediately started blogging about it and showed me he was first with his blog. Gratz to you :-)

Monday, October 13, 2008

PDC 2008 Achievements

In two weeks I will be attending the PDC 2008 in LA. This will be my first time, so I have great expectations and can't wait to go.

Today I read that they will be handing out different badges, "each attendee will start out with the official PDC 2008 badge and will be given new badges as a result of an accomplishment or in some cases being at the right place at the right time", according to Adam Kinney. As a true Xbox 360 fan, I love the achievements and these badges are just the same only in real life!

So if you see me running around, raving like a madman, there's nothing wrong with me (well, nothing more than usual), I'm just collecting my achievements :-)

Monday, October 6, 2008

Pageflow available on CodePlex

After a three week vacation in France and Italy, I'm catching up on the latest facts and the first good news I read was that my colleague Wouter Crooy finally managed to get Pageflow on CodePlex!

We've been working with Pageflow for over a year now and he's been trying to get it on CodePlex for about half a year, so congratulations!

Thursday, September 11, 2008

ResourceLocator for Silverlight

Last year I did a large project with WPF, so I couldn't wait to start with Silverlight. I know there's a lot of stuff "missing", so once in a while I'll be posting some stuff that I know could come in handy.

The first thing I started with is a decent resource locator. Usually you would create your animations in App.xaml and reference them in your UserControl. There are no triggers in Silverlight; in WPF, triggers are able to connect an animation to its target object. In Silverlight you should use events to do this: nothing is connected automatically, so you have to wire it up yourself in code-behind.

So I've created a helper class that can find Resources for you (like in WPF):

public static class ResourceLocator
{
    /// <summary>
    /// Helper method for finding resources located in App.xaml,
    /// falling back to a by-name lookup from the visual root.
    /// </summary>
    public static object FindResource(string name)
    {
        if (App.Current.Resources.Contains(name))
        {
            return App.Current.Resources[name];
        }

        // Fall back to searching from the root visual; guard against
        // being called before the root visual exists.
        FrameworkElement root = App.Current.RootVisual as FrameworkElement;
        return root != null ? root.FindName(name) : null;
    }
}
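Using it from code-behind looks something like this ("FadeIn" is a hypothetical Storyboard key defined in App.xaml):

// E.g. wired to a UserControl's MouseEnter event, since Silverlight
// has no triggers to start the animation for you.
private void OnMouseEnter(object sender, MouseEventArgs e)
{
    Storyboard fadeIn = ResourceLocator.FindResource("FadeIn") as Storyboard;
    if (fadeIn != null)
    {
        fadeIn.Begin();
    }
}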

Wednesday, September 10, 2008

Presentation CodeCamp 2008

Together with my colleague Wouter Crooy, I gave a session about using Pageflow / Workflow Foundation for navigation in applications.
(September 6th 2008, @ Microsoft innovation centre, Barneveld.)

For people who are interested in the demo bits and presentation click here.

More info about CodeCamp.

[AOP] Aspect-Oriented Programming part 1

I've always been interested in AOP (Aspect-Oriented Programming). I've spent a lot of time trying to figure out the Policy Injection Application Block (PIAB) in Enterprise Library 3 and the Spring Framework. Both frameworks made AOP a lot more accessible than it was before. Of course PIAB has its downside: it's not compatible with ObjectBuilder, so combining AOP and DI with Entlib is a no-go.

The only thing the .NET Framework was missing is a compile-time weaver. PIAB and Spring are runtime weavers, which basically means your code knows it's using AOP. For example, a logging advice:

Basic .NET:
public class MyClass
{
    public void DoSomething()
    {
        MessageLogging.Write("Starting method DoSomething");
        // Run code

        MessageLogging.Write("End method DoSomething");
    }
}


class Program
{
    static void Main(string[] args)
    {
        MyClass mine = new MyClass();
        mine.DoSomething();
    }
}

When you're using Spring it should look something like this:


public class MyClass
{
    public void DoSomething()
    {
        // run code
    }
}

// Spring advice:
public class LoggingAdvice : IMethodInterceptor
{
    public object Invoke(IMethodInvocation invocation)
    {
        MessageLogging.Write("Starting method DoSomething");
        object returnValue = invocation.Proceed();
        MessageLogging.Write("End method DoSomething");
        return returnValue;
    }
}

class Program
{
    static void Main(string[] args)
    {
        ProxyFactory factory = new ProxyFactory(new MyClass());
        factory.AddAdvice(new LoggingAdvice());
        ((MyClass)factory.GetProxy()).DoSomething();
    }
}


The crosscutting concern (logging) is separated from the method: DoSomething isn't aware of the logging code. But the code using the method is well aware of the use of AOP (it has to create a proxy factory). One advantage is that, as a developer, you will see the use of AOP. (But the code isn't getting any better, in my opinion.)

PostSharp is a compile-time weaver, which means that PostSharp will rewrite the MSIL; PostSharp is a post-compiler. PostSharp injects itself into the build process and transforms or analyzes the program after it's compiled. PostSharp is integrated in the MSBuild process and already has some plugins available:

- PostSharp Laos (for easily writing aspects)
- Software Transactional Memory (for using In-memory datastructures)
- Entity Framework Bindings
- PostSharp4Entlib (extend PIAB with PostSharp)
- PostSharp4Spring (integrate PostSharp with the Spring Framework)
- Log4PostSharp (Custom attribute for Log4Net)


Because PostSharp is a compile-time weaver, I don't have to use a proxy factory. I just have to create an aspect and put it above my method:

public class MyClass
{
    [LoggingAspect]
    public void DoSomething()
    {
        // run code
    }
}

class Program
{
    static void Main(string[] args)
    {
        MyClass mine = new MyClass();
        mine.DoSomething();
    }
}

[Serializable]
public class LoggingAspect : OnMethodInvocationAspect
{
    public override void OnInvocation(MethodInvocationEventArgs eventArgs)
    {
        MessageLogging.Write("Method: " + eventArgs.Delegate.Method + " is called");
        eventArgs.ReturnValue = eventArgs.Delegate.DynamicInvoke(eventArgs.GetArguments());
        MessageLogging.Write("Method call ended");
    }
}


My calling code and method looks clean and mean, just the way I like it :).