Setup and Deploy a DTC App in Windows Azure

Intro

As part of my TechEd 2012 talk I went through the process of configuring and deploying an application that requires the Microsoft Distributed Transaction Coordinator (aka DTC).  This piece of Windows Server infrastructure is more commonly found in legacy applications but still comes up from time to time in new solutions whenever they require multiple resource managers.  When multiple resources (e.g., two SQL databases) need to participate in an atomic transaction, they rely on DTC to make sure the work is committed or rolled back correctly.

In Windows Azure this has been an unsupported dependency since the platform was released.  If developers wanted to run applications that used DTC, they had to rewrite their code to handle transaction commits and rollbacks without it, which blocked a lot of scenarios that otherwise work really well in Windows Azure. Fortunately, with the release of Infrastructure as a Service (IaaS) on Windows Azure we can very quickly configure the environment necessary to support DTC, and in my TechEd 2012 demo I did just that.  What follows is the list of steps to configure and deploy the sample application I created for TechEd to showcase this new architecture option in Windows Azure.

Step 1 – Create an Active Directory domain controller in IaaS and note the assigned IP by running ipconfig from a cmd window after you create the new VM. This IP will remain stable, so we can use it from the other server instances to allow domain joining and DNS resolution.

This will require you to configure a new VM using the Windows Server image and go through the typical dcpromo process you would have always used to create a new domain controller.  For more details on creating a DC, read more here: http://technet.microsoft.com/en-us/library/cc755103(v=WS.10).aspx

Step 2 – Provision two SQL Server 2012 instances and an IIS server from the images provided (see screenshot above) and ensure port 1433 is opened in Windows Firewall on the two SQL Server boxes. These are initially very straightforward steps and just require you to set up the VMs in the same virtual network.  To do this, all you need to ensure is that the first VM is created as a standalone virtual machine and the following servers are connected to that initial VM.
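If you prefer the command line over the firewall GUI, the 1433 rule can also be added from an elevated command prompt on each SQL VM (the rule name here is arbitrary):

```shell
REM Allow inbound SQL Server traffic on TCP 1433 (run on each SQL Server VM)
netsh advfirewall firewall add rule name="SQL Server 1433" dir=in action=allow protocol=TCP localport=1433
```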

Step 3 – Once that is done, enable network access for the DTC process from Control Panel > System & Security > Windows Firewall > Allow a Program through Windows Firewall. You should see the dialog in the screenshot to the right, which needs the DTC checkboxes enabled.
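Checking those boxes is equivalent to enabling the built-in DTC firewall rule group, which you can also do from an elevated command prompt:

```shell
REM Enable the built-in Distributed Transaction Coordinator firewall rule group
netsh advfirewall firewall set rule group="Distributed Transaction Coordinator" new enable=yes
```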

Step 4 – Now we need to configure the DTC settings on all three instances.  To do this you’ll have to open the MMC and add the Component Services snap-in.  Once that is open, navigate to the Local DTC on each of the three instances and enable network DTC access on every one of them.

Note that these settings do not create any generic admin user; they rely on the calling process to provide credentials and do proper authentication.  We are also not enabling any of the client admin access. The purpose of this configuration on all three servers is to enable the transaction flow only.
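For reference, the Component Services UI writes these choices to the registry, so the same configuration can be scripted. This is a hedged sketch of the equivalent registry values (double-check them against your server before relying on this; MSDTC must be restarted afterwards):

```shell
REM Equivalent of the Local DTC network access checkboxes in Component Services
reg add HKLM\SOFTWARE\Microsoft\MSDTC\Security /v NetworkDtcAccess /t REG_DWORD /d 1 /f
reg add HKLM\SOFTWARE\Microsoft\MSDTC\Security /v NetworkDtcAccessTransactions /t REG_DWORD /d 1 /f
reg add HKLM\SOFTWARE\Microsoft\MSDTC\Security /v NetworkDtcAccessInbound /t REG_DWORD /d 1 /f
reg add HKLM\SOFTWARE\Microsoft\MSDTC\Security /v NetworkDtcAccessOutbound /t REG_DWORD /d 1 /f
REM Restart the service so the new settings take effect
net stop msdtc && net start msdtc
```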

Step 5 – Now that we have DTC all enabled and ready to go, we need to domain join the machines and ensure they can participate in the security model we just set up for DTC. Now is the time to use that IP we noted back when we initially created the AD server.

The next thing we’ll need to do is manually set the default DNS server address on each instance to that IP and then do a standard domain join, which will require you to reboot all of the instances.
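Both pieces can be done from an elevated command prompt if you prefer. The DC's IP and the interface name below are examples; substitute the IP you noted in step 1 and your own domain admin account:

```shell
REM Point this VM's DNS at the domain controller (example IP and interface name)
netsh interface ip set dns name="Local Area Connection" source=static addr=10.0.0.4

REM Join the domain and reboot; you will be prompted for the password
netdom join %COMPUTERNAME% /domain:developertofu.com /userd:administrator /passwordd:* /reboot
```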

Step 6 – Create a new account in Active Directory that you will use as a least-privileged account from IIS into the DTC processes on the two SQL databases. Remember this account because we will use it in a future step to configure IIS to run as that account. As an example, the domain I created was developertofu.com and my account was dtcdemoaccount.

Step 7 – The next thing to do is set up the web server we created to support IIS and .NET 4.

To do this you’ll have to add the Application Server and Web Server (IIS) roles and install .NET 4 ( http://go.microsoft.com/fwlink/?linkid=186916 ).
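On Windows Server 2008 R2 the roles can be added from the command line instead of the Server Manager wizard (the .NET 4 installer from the link above is still a separate download):

```shell
REM Add the Web Server (IIS) and Application Server roles from an elevated prompt
powershell -Command "Import-Module ServerManager; Add-WindowsFeature Web-Server,Application-Server"
```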

Step 8 – We will also need to add an endpoint in the Windows Azure portal so the load balancer knows to point incoming traffic to our IIS server.  This is simply a port 80 endpoint configured on the IIS VM (see screenshot to the right).

Step 9 – Now we need to get the databases set up in SQL Server so we can get the proper security configured.  To do this we’ll need some SQL scripts, which we can get by first running the code locally.  You can download the code from here: http://bit.ly/L3OgYq and in that code you will see a web.config that points to two different SQL instances.

You can use SQL Express to create these instances locally and then run the code so Entity Framework generates the databases, which you can then export as SQL using either EF SQL migrations or simple SQL Server Management Studio script generation.
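If you take the EF Migrations route, the script can be produced without touching a database. This assumes the sample is on an EF version with Code First Migrations enabled; run it from the Visual Studio Package Manager Console:

```shell
REM In the Package Manager Console; -Script emits the SQL instead of applying it
Update-Database -Script
```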

Note: The code you are downloading is made up of two projects.  One is a data creation utility that you can run locally to set up some test data, and the other is the main MVC 4 WebAPI sample that uses a simple HTML/jQuery front end. The key logic for DTC is in the controllers and wraps calls to two EF contexts using System.Transactions (see below).
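The controller logic follows this pattern. Treat this as a hedged sketch rather than the exact sample code: the context names, entity names, and method here are illustrative, but the System.Transactions shape is the important part.

```csharp
using System.Linq;
using System.Transactions;

// Illustrative method: move stock between two databases atomically.
// WarehouseContext and StoreContext stand in for the sample's two EF contexts.
public void TransferInventory(int productId, int quantity)
{
    using (var scope = new TransactionScope())
    {
        using (var warehouseDb = new WarehouseContext())
        using (var storeDb = new StoreContext())
        {
            var source = warehouseDb.Stock.Single(s => s.ProductId == productId);
            source.Quantity -= quantity;

            var target = storeDb.Stock.Single(s => s.ProductId == productId);
            target.Quantity += quantity;

            warehouseDb.SaveChanges();  // first resource manager enlists
            storeDb.SaveChanges();      // second connection promotes the transaction to DTC
        }
        scope.Complete();  // both commits succeed, or both roll back
    }
}
```

The promotion to a distributed transaction happens automatically when the second SQL connection enlists inside the scope; that is exactly the moment all of the DTC plumbing configured above gets exercised.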

Step 10 – We are getting close now.  The SQL databases need to be set up to allow the dtcdemoaccount we created in step 6 to access them, and this is simple SQL security.  First add the login to the server-level security logins, then add the user mapping to the databases you created after first getting the app set up and running locally and exporting the script.
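The same security setup can be scripted with sqlcmd against each SQL instance. The database name below is an example; use whatever names the exported scripts created, and adjust the roles to the minimum the sample actually needs:

```shell
REM Create the domain login at the server level (run on each SQL instance)
sqlcmd -S . -E -Q "CREATE LOGIN [developertofu\dtcdemoaccount] FROM WINDOWS"

REM Map the login into the database and grant read/write (example database name)
sqlcmd -S . -E -d InventoryDb -Q "CREATE USER [developertofu\dtcdemoaccount] FOR LOGIN [developertofu\dtcdemoaccount]; EXEC sp_addrolemember 'db_datareader', 'developertofu\dtcdemoaccount'; EXEC sp_addrolemember 'db_datawriter', 'developertofu\dtcdemoaccount'"
```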

Step 11 – Now it’s basically time to upload the code and configure the app pool on our IIS server to run as the DTC demo account.  To do this we’ll first move the InventoryServer directory into the C:\inetpub\wwwroot folder, and then in IIS we can right-click it and choose “Create Application”.  This converts the directory into an application that can use .NET 4.  Make sure that the default app pool is configured to use .NET 4 (right-click on the app pool > Basic Settings).  Also change the identity of the default app pool to our DTC demo account so it will be the account used to call the DTC process.

Step 12 – We are finally ready to test.  You should be able to go to your site from any web browser using the DNS address of the IIS server (found in the portal) + /InventoryService and add, edit, or delete products.  If the DTC configuration you set up is correct, you will see a transaction count increasing on any of the servers involved in the DTC transaction (see screenshot).

Summary

While this may look like quite a few steps, please take comfort in the fact that all of this was done in the cloud with full RDP access and it was really straightforward. In only a few hours this four-node configuration with a complex DTC dependency was set up and running using a modern WebAPI sample.  I hope this high-level set of steps helps you see how to take advantage of this first-ever capability in Windows Azure and start moving DTC apps to the cloud right now!


What my wife thinks of ‘Cloud Computing’

First let me just say that my wife is somewhat atypical when it comes to her level of knowledge of software and corporate IT. The poor thing has had me working from home for the past 5 years and is effectively my sounding board for anything and everything that bugs me or pops into my head. She shows patience and attentiveness as she listens to me ramble on about things that must be so boring to her. As I made the decision to launch this new blog it came to me that she would be the best inspiration for a blog about cloud computing. Why? Because she has probably logged at least a hundred hours in the past 2 years hearing me spew out vision and long term cost benefits for companies moving to the cloud. So … how much of it did she absorb? I put it to the test.

Understanding the ‘Cloud’

It is truly ironic that we have this term ‘Cloud’ being tossed around and almost anyone you ask to explain it will give you 10 different definitions as to what it is. Ironic because we effectively have a ‘Cloudy’ picture of what the ‘Cloud’ actually is (har har har). I have heard at least a dozen times in the last two years how the cloud is effectively ‘Just the internet’. Well that’s interesting because that means Al Gore invented the cloud too … damn he’s smart! Actually, this is the first thing I list because it’s one of the biggest areas of confusion and I think I can guess why. How long have we been using the little cloud icon to represent something that was hosted in or being called from the WWW? I would say that’s been a common design technique for as long as the Internet has been around. So I get it, but let’s set a baseline here. I want to cover ‘Cloud Computing’ and not necessarily the underlying ‘Cloud’ as it is traditionally described. In other words, I’m not saying folks are wrong when they just call the ‘Cloud’ the ‘Internet’ … they’ve just unfortunately misunderstood the question.

Here’s another one, and I just heard it a week ago when doing some Windows Azure talks: “I have a company that hosts all my web sites and I can just push changes to them via FTP. In my mind, that is what ‘Cloud Computing’ is.” Interesting, but from my point of view this is an infinitesimal slice of what ‘Cloud Computing’ is and can be for an organization. Again, not a wrong definition, just not quite yet grasping the full conceptual value of ‘Cloud Computing’.

Now what about those that think ‘Cloud Computing’ is simply about virtual machines? Well this is another very confusing element indeed because many of the folks that define it this way are in corporate IT and probably even managing or consuming virtual infrastructures. From their point of view, they’ve been doing ‘Cloud Computing’ since they put in their VM infrastructure and started to spin things up on the fly to suit the demand of their internal corporate consumers. It’s a fair statement but again one that overly constrains the overall value of ‘Cloud Computing’. At this point we’d need to start talking about Private vs. Public cloud computing but I think I’ll save that for another post. In the end, we’re getting closer if the conversation shifts to VMs but that’s just too in the weeds if we’re going to continue to forward the ‘Cloud Computing’ revolution.

Ok, back to my wife

Now when I decided to circle back to my wife on the topic, I wanted to make sure she was ok with how I would frame all of this, and just this morning I took another poll. This is after spending the last few weeks talking about things like iCloud and how it relates. See, that’s what is astonishing about this in my opinion: the whole concept of clouds and cloud computing is being put in front of consumers as well. So I took a step back and asked her what three things she thought of when she heard ‘Cloud Computing’.

1. Expandable

This one is probably unique to my wife because she gets a lot of background information on corporate IT cost structures and how I’ve positioned the value of ‘Cloud Computing’. This is a sign that my wife actually listens to me more than I would have thought considering the lack of applicability to her daily life … I know … what a lucky guy I am to have someone so interested in something so incredibly boring 🙂

So the key thing that always perks up people’s ears and really seems to stick is when you describe those scenarios that require an extreme deviation from the normal volume or usage trend. Easy examples exist in our everyday world, like tax time or day-after-Thanksgiving retail spikes. I have always used these types of metaphors when explaining it and it really works. Wouldn’t it be great if you could buy computer(s) for just what you need, for as long as you need them, and then give them back and not deal with their loss of value? I really wonder how much personal computing power we’ll need 10 years from now if a majority of it can be centralized using ‘Cloud Computing’.

2. Cost Effective

The notion of cost is absolutely something my wife gets because we can sit down and look at how much owning a house has gained us personal economic benefit in the last 10 years. Obviously that’s a sarcastic remark related to the depreciating value of real estate assets in the US. It does however become a powerful metaphor for ‘Cloud Computing’. The rent vs. own story is huge and it takes you not only into talking about ‘Cloud Computing’ but leveraging someone like Microsoft or Amazon or Cloud Vendor X to absorb the costs of purchasing, managing, and maintaining your rented compute power. These massive software giants all want to be your new landlord and you have to decide which one of them you trust to show up at 3 am when your washing machine is spilling out all over the floor!

That’s not the whole story though; the idea of being cost effective spills over into all the decisions you make when moving to cloud computing, and as a developer it becomes something you allow to enter your frontal lobes of consciousness. For a very long time developers (me included) have taken for granted that hardware and infrastructure were in infinite supply and, by comparison to our time, very cheap. The cloud computing universe is bringing together folks that normally had totally orthogonal concerns when producing and managing a software asset. Now when folks look to start building something they have to understand the impact of their designs and the longer-term cost impact.

Now this is somewhat controversial because there are those that believe cost is not something to get too concerned about in these granular terms. I think I fall somewhere in the middle on this argument. I do not think developers should be doing herculean things to save $.25 a month on cloud costs but I do think having it in their consciousness leads to more efficient and smart designs that are highly optimized. This is a topic near and dear to my heart on the Windows Azure platform and my next blog post will dive deep into the things that are extremely important when considering the cost of your solution in the cloud.

3. It’s Everywhere

This actually came out of our most recent conversations about the cloud and how my wife wanted to understand if she should pay $24.95 for a subscription to iCloud (so much for keeping this free as Steve intended) with her iPod Shuffle. She said to me, “I don’t get why I would need this for just my PC and my iPod?” This made me happy in two ways. First, my wife was instinctively avoiding giving more money to Apple 🙂 Second, she totally got the fact that the cloud was there to help her access her content anywhere but if she only had one device that needed it then the cloud meant nothing to her.

I suspect this is why folks always turn cloud computing into a conversation about the WWW. It is a fundamental underpinning and should never be forgotten. In fact, the beauty of leveraging others’ assets in ‘The Cloud’ means you have incredible reach. Geographic distribution of your solutions has never been easier. I can see a time when mom and pop companies will be able to expand into regions almost seamlessly because of the cloud. Getting software and content as close to your consumers (internal and external to your company) is what the next generation will demand. No one will understand why the application they use in Atlanta runs 5 times slower when they’re in London. That’s simply not going to be acceptable in the future. How in the world would any small company be able to do this without the public cloud?

Wrap it up

It’s been a really long time since I sat down and just wrote up some of my thoughts and this was a fun way to get back into it. My wife and I spend so much time talking about things like this (when we’re not talking about family stuff of course). I was so impressed by her when she hit these three key points on the head. If you want to explain Cloud Computing to your significant other, then look at these three things and how much they could change everyone’s perspective on the value of cloud computing. Sure, we’re not quite there yet when it comes to getting everyone into the public cloud, but the model itself cannot be ignored. We must tear down the remaining walls that create friction for getting to the cloud so everyone can benefit from this. Less cost equals more opportunity, which equals better options for developers and ultimately consumers.
