Recent Updates

  • Richard Prodger 5:00 pm on March 2, 2011 Permalink | Reply

    Windows Azure and the Global Alerting Platform 

    Just spent the last couple of weeks at the Microsoft Technology Centre in the UK helping out with a proof of concept for a Microsoft ISV. Whilst there, David Gristwood (aka Steven Spielberg) and I took a few minutes out of the day to capture some info on our recently launched Global Alerting Platform (GAP). GAP is a global hub for satellite messaging devices providing location tracking, emergency alerting, message routing and storage.

  • Richard Prodger 1:38 pm on March 2, 2011 Permalink | Reply

    Am I running in the Azure Devfabric? 

    There are several ways to check if you’re running in the DevFabric. Here is my way.

    if (cloudStorageAccount.TableEndpoint.OriginalString.StartsWith("") && IsRunningInDevFabric())
    {
        // Yes, I'm running in the DevFabric
    }

    private static bool IsRunningInDevFabric()
    {
        // Try to parse the deployment ID as a GUID
        Guid guidId;
        if (GuidTryParse(RoleEnvironment.DeploymentId, out guidId))
            return false;   // Valid GUID? We're in the Azure fabric
        return true;        // Can't parse it as a GUID? We're in the DevFabric
    }

    // This is needed as Guid.TryParse is only available in .NET 4
    private static bool GuidTryParse(string s, out Guid result)
    {
        if (s == null)
            throw new ArgumentNullException("s");
        Regex format = new Regex(
            "^[A-Fa-f0-9]{32}$|" +
            "^({|\\()?[A-Fa-f0-9]{8}-([A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}(}|\\))?$|" +
            "^({)?[0xA-Fa-f0-9]{3,10}(, {0,1}[0xA-Fa-f0-9]{3,6}){2}, {0,1}({)([0xA-Fa-f0-9]{3,4}, {0,1}){7}[0xA-Fa-f0-9]{3,4}(}})$");
        if (format.Match(s).Success)
        {
            result = new Guid(s);
            return true;
        }
        result = Guid.Empty;
        return false;
    }
  • Richard Prodger 1:39 pm on February 17, 2011 Permalink | Reply

    Getting WCF services in the DevFabric to work 

    I must have encountered these problems a dozen times or more over recent months when working with customers, so I thought I ought to write it down. When porting a service to run in Azure, you typically start by targeting the DevFabric, which essentially simulates your service behind a load balancer. However, there are some challenges in getting this to work. Here are the 3 key steps you need to take.

    1. Install the HOTFIX for WCF services to work behind a load balancer. NB: There are 2 versions depending on your OS.

    The hotfix causes WCF to generate the correct URI by using the “Host” HTTP header of the incoming metadata request. In this case, the “Host” header contains the load balancer address instead of the internal node address.

    2. Add the following behaviour to your service model

       <behavior name="<name>">
         <useRequestHeadersForMetadataAddress>
           <defaultPorts>
             <add scheme="http" port="81" />
             <add scheme="https" port="444" />
           </defaultPorts>
         </useRequestHeadersForMetadataAddress>
       </behavior>

    If a URI inside the WSDL document has a different scheme than the scheme of the “Host” header URI, for example, if a request for metadata comes over HTTPS but the metadata contains HTTP URIs, the hotfix will need the port number for that different scheme. The port number can be specified per scheme in the <defaultPorts> section.

    3. Decorate your service class with the following attribute:

       [ServiceBehavior(AddressFilterMode = AddressFilterMode.Any)]

    By default, WCF ensures that the To header of each message matches the address it was intended for. This attribute essentially turns off address filtering, so the service doesn’t care anymore.

    I’m sure there are other tweaks you’ll need to make, but these fixes are the most common ones I’ve come across.

  • Richard Prodger 6:39 pm on January 17, 2011 Permalink | Reply

    GAP launches at Iridium Partner Conference 

    Global Alerting Platform Logo

    We are proud to announce that the Global Alerting Platform (GAP) was officially launched this week at the 2011 Iridium Partners Conference in New Orleans.

    The team was invited to attend the conference as an Iridium Value Added Developer (VAD) partner, which gave us the perfect opportunity to launch the new global hub for satellite messaging devices; it has received lots of interest. The conference, which was open only to registered Iridium partners, took place at the Intercontinental Hotel.

    From Tuesday to Thursday, the conference allowed partners to network and connect, offering a full agenda that included interesting updates, presentations and round-table discussions, as well as an impressive exhibition hall where GAP was displayed.

    The 2011 Partner Conference gave attendees an opportunity to join other industry experts, interact with the Iridium executive team, receive product and service updates, and share stories.

  • Richard Prodger 1:41 pm on January 1, 2011 Permalink | Reply

    CommunicationObjectFaultedException with Azure SDK 1.3 Full IIS 

    This is a particularly bizarre bug in the SDK. If you are using the Full IIS features with multiple sites defined in your ServiceDefinition.csdef, this exception is thrown when you start debugging if any of the web.config files are read-only. This really caught me out for ages. All was working well until I checked in my files, and then the deployment started throwing this error. I’m using TFS, which sets files to read-only on check-in. I don’t understand why this occurs, as the config files are not being modified. So now I have a post-build script that searches for all web.config files and sets them to be writable. Please sort this out, Microsoft.
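    The post-build step can be as simple as clearing the read-only flag on every web.config below the solution root. A minimal sketch, shown here as a portable shell command (the path and the Windows post-build equivalent in the comment are illustrative, not my exact script):

    ```shell
    # Clear the read-only bit on every web.config under the current directory.
    # (A Visual Studio post-build event would do the same with something like:
    #   for /r "$(SolutionDir)" %f in (web.config) do attrib -R "%f")
    find . -name 'web.config' -exec chmod u+w {} +
    ```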

  • Richard Prodger 1:42 pm on September 27, 2010 Permalink | Reply

    AzureRunMe Demo, Telnet and Java on Windows Azure 

    Richard Prodger talks to Rob Blackwell about running Java and other copy-deployable software, unmodified on Windows Azure.

    Rob demonstrates running a telnet session to interact with a Windows Azure compute instance, and shows how AzureRunMe makes it easy to run Java software, including the Apache Tomcat web server.

    They discuss the upcoming Windows Azure Launch Pad work and how this will enable Independent Software Vendors (ISVs) to get up and running quickly in the cloud.

    • Maria Hayes 4:43 pm on March 9, 2011 Permalink | Reply

      From that Java code, you can do whatever you want, but typically you’ll want to poll a Windows Azure queue and maybe write results back to storage. Because Windows Azure storage is based on HTTP and XML, it’s not too hard to write Java code to interact with storage.

  • Richard Prodger 1:43 pm on August 23, 2010 Permalink | Reply

    Running Hosted Web Core in Windows Azure problems 

    Steve Marx’s article on how to run IIS Hostable Web Core in Azure is a great starting point for simple hosting in a worker role. However, I struggled to get this working locally and kept getting ‘module not found’ errors. A quick look in the event log revealed that HWC couldn’t load the URL Rewrite module (%windir%\system32\inetsrv\rewrite.dll). It turns out that this module is not installed on 32-bit Windows 7 by default. You have two options: either comment it out of the applicationHost.config file or, better still, install it using the Web Platform Installer. It is installed in the Azure VM by default.



  • Richard Prodger 1:53 pm on July 2, 2010 Permalink | Reply

    StreamInsight & Silverlight 

    In August 2008 I started to work on a project that used a Silverlight client to plot charts from real-time data. On the server, the data was imported from a third-party application into a SQL database, and we loaded selected data into the Silverlight client using SOAP services with periodic polling.

    In February 2010 I got the opportunity to have a look at our architecture again and solve the same task using the new Microsoft StreamInsight platform. To compare the new approach with the old one I decided to create a simple test application.

    Diagram of the test application

    Current architecture

    If you have a look at the blue part of the diagram above you can see the architecture of our current application.

    The third-party application is replaced with the DataGenerator. That’s a console application that generates about five thousand randomly valued records every second. Each record consists of a text symbol and a numeric value.

    Then there is the block named SqlImporter, another console application that periodically queries the DataGenerator for new data and inserts it into the SQL database.

    The last piece of functionality executed on the server is the SqlService. This is a standard WCF service that uses a request/response approach. Through this service, Silverlight clients periodically request data for a specified symbol; the returned data contains a time and a numeric value.

    New architecture

    The new solution is featured in the red part of the diagram. It is based on Microsoft StreamInsight, a new platform for complex event processing applications that can work with large volumes of data with very little latency. I used the latest currently available version, November 2009 CTP.

    Another important difference is the use of a duplex WCF service that can push data from the server to the Silverlight client. The client subscribes to a symbol only once, and whenever a new item with this symbol is processed by the StreamInsight server it is automatically sent to the Silverlight client application.
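    For illustration, a duplex service like this can be declared with a callback contract along these lines. This is a minimal sketch; the interface and member names are hypothetical, not the actual project code:

    ```csharp
    using System;
    using System.ServiceModel;

    // Hypothetical duplex contract: the client calls Subscribe once per symbol,
    // and the server pushes matching records back through the callback channel.
    [ServiceContract(CallbackContract = typeof(IRecordCallback))]
    public interface IRecordService
    {
        [OperationContract(IsOneWay = true)]
        void Subscribe(string symbol);
    }

    public interface IRecordCallback
    {
        [OperationContract(IsOneWay = true)]
        void OnRecord(string symbol, double value, DateTime timestamp);
    }
    ```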

    Silverlight client

    The client application uses both services to get the data and renders it into two charts with some extra tracing information, such as average delay and number of service calls. Once the user picks a symbol from a drop-down list, both charts start to render the values. The blue one plots data loaded through SQL Server and the red one plots values that were processed using StreamInsight.

    Screenshot of the client application

    StreamInsight application

    Let’s return to the moment when the user picks a symbol and see what happens. The Silverlight client application uses the duplex WCF service to subscribe to all records with this symbol. This starts a new StreamInsight query that processes all generated data, filters the appropriate records and returns them to the WCF service, which pushes them to the client.

    I used the explicit server development model, with the server running as a separate process. The input adapter loads data from the DataGenerator as XML using its API, parses it into individual records and inserts them into a queue.

    The actual query is very simple: it only filters records with the specified symbol and passes them to the output adapter.
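    In StreamInsight’s LINQ syntax such a filter is essentially a one-liner. A sketch, assuming an input stream of a simple Record payload (the payload type and names are illustrative, not the project’s actual types):

    ```csharp
    using System.Linq;
    using Microsoft.ComplexEventProcessing.Linq;  // StreamInsight LINQ provider

    // Illustrative payload: one generated record.
    public class Record
    {
        public string Symbol { get; set; }
        public double Value { get; set; }
    }

    public static class Queries
    {
        // Keep only the events whose symbol matches the client's subscription;
        // the resulting stream is then bound to the output adapter.
        public static CepStream<Record> FilterBySymbol(CepStream<Record> input, string symbol)
        {
            return from e in input
                   where e.Symbol == symbol
                   select e;
        }
    }
    ```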

    The output adapter is responsible for sending the values to the client using the duplex service. As the StreamInsight server runs in a different process, I used .NET Remoting over named pipes to pass the records to the WCF service.

    Server screenshot with DataGenerator, SqlImporter and StreamInsight server console

    Running the test application

    I uploaded my application to the company server in the UK and tested it from my laptop while working from our Prague office. With both the old and new technologies, the data reached my Silverlight client less than one second after it was generated. From my test I can’t say that either approach was significantly better.

    The StreamInsight server usually needed more time to initialize the query, so when I picked a symbol from the drop-down list I saw the first records earlier on the SQL chart. But after a few seconds the StreamInsight chart got up to full speed, and new data appeared on the red chart slightly faster than on the blue one.

    It’s important to state that, unlike this experiment, our real application plots charts not only from real-time data but also from historical data, and for that part there is nothing StreamInsight could help us with. It can give you immediate insight into a stream of real-time data, but it doesn’t store the data once it has been processed. So even if we used StreamInsight, we would have to build the SQL part for historical data anyway.

    I should also mention that StreamInsight supports many complex processing algorithms, but in our case I needed only very simple filtering. Using such a potent technology for a simple task like this seems almost inappropriate.

  • Richard Prodger 1:49 pm on June 24, 2010 Permalink | Reply

    Active Web Solutions in top 3 Windows Azure Partners worldwide 

    Microsoft today announced the winners and finalists of its 2010 Microsoft Partner of the Year Awards. I am delighted to say that Active Web Solutions was announced as one of the 2 finalists in the Windows Azure Partner of the Year category. The annual awards honor Microsoft Registered, Certified and Gold Certified partners for delivering exemplary solutions for their customers during the past year. Award winners and finalists were chosen from nominations from around the world. Winners and finalists will be recognized at the Microsoft Worldwide Partner Conference 2010, the company’s premier annual event for industry partners, July 12–15 in Washington, D.C.

    We are thrilled to have done so well. To be in the top 3 companies in the world in this exciting and ground breaking area is quite an achievement by the team. We are all passionate about technology and aligning our efforts with Microsoft has helped us to remain at the forefront of new technologies. Being recognised by Microsoft for this award is an accolade in itself.

    Nearly 3,000 entries were submitted by partners from more than 110 countries; the award finalists and winners were selected from a group of nominations based on their dedicated use of Microsoft technologies to provide solutions for their customers’ needs.

    “Congratulations to the 2010 Partner Award finalists and winners for delivering such creative and superior Microsoft solutions and services,” said Allison Watson, corporate vice president, Worldwide Partner Group, Microsoft. “It’s incredible to see the level of expertise our partners continue to exhibit as they create and deliver innovative solutions and services to grow their businesses, meet customer needs and drive down costs.”

  • Richard Prodger 1:50 pm on April 14, 2010 Permalink | Reply

    Real World Windows Azure: Interview with Richard Prodger, Technical Director at Active Web Solutions 

    This article is a re-posting from

    As part of the Real World Windows Azure series, MSDN talked to Richard Prodger, Technical Director at Active Web Solutions (AWS), about using the Windows Azure platform to deliver the company’s search-and-rescue application and the benefits that Windows Azure provides. Here’s what he had to say:

    MSDN: Tell us about Active Web Solutions and the services you offer.

    Prodger: AWS specializes in Web application and custom software development. In 2006, the Royal National Lifeboat Institution contracted AWS to build an automated alerting system for fishing vessels in the United Kingdom. We developed a location-based service infrastructure, code-named GeoPoint, that transmits position data to a centralized tracking and alerting system. We then used GeoPoint to build MOB Guardian, a search-and-rescue application for fishing vessels (MOB stands for “man overboard”).

    MSDN: What was the biggest challenge Active Web Solutions faced with GeoPoint and MOB Guardian before migrating to the Windows Azure platform?

    Prodger: Our original infrastructure could handle approximately 10,000 boats, but we wanted to offer MOB Guardian to the 500,000 leisure craft in the U.K. and the millions of marine users worldwide. However, as a small company, we would find it hard to accommodate the massive infrastructure that would be required to offer MOB Guardian more broadly.

    MSDN: Can you describe the solution you built with Windows Azure and Windows Azure platform AppFabric to help address your need for a cost-effective, scalable solution?

    Prodger: We migrated our existing application to Windows Azure very quickly. Now, instead of passing emergency messages by satellite to physical servers, messages are transferred by satellite using the Simple Mail Transfer Protocol (SMTP) and delivered to a number of message queues. Multiple service instances read from the queues, process the messages, and store the data using Table storage in Windows Azure. Emergency alarms are then relayed through the AppFabric Service Bus to the end-user monitoring application in the search-and-rescue operations center. We also used the AppFabric Service Bus to connect cloud-based GeoPoint to on-premises databases without exposing the data to the public Internet.
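    The worker pattern Prodger describes, multiple instances draining message queues, looks roughly like this with the StorageClient library of that era. A sketch only; the queue name and the message handling are hypothetical, not the actual GeoPoint code:

    ```csharp
    using System;
    using System.Threading;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    public class AlertWorker
    {
        // Each worker instance loops, pulling emergency messages off a queue.
        public static void Run(CloudStorageAccount account)
        {
            CloudQueueClient queueClient = account.CreateCloudQueueClient();
            CloudQueue queue = queueClient.GetQueueReference("alerts"); // hypothetical name
            queue.CreateIfNotExist();

            while (true)
            {
                CloudQueueMessage message = queue.GetMessage();
                if (message == null)
                {
                    Thread.Sleep(TimeSpan.FromSeconds(1)); // back off when the queue is empty
                    continue;
                }

                // Process the position/alert message here: write it to table
                // storage and relay any alarm over the Service Bus, then remove it.
                queue.DeleteMessage(message);
            }
        }
    }
    ```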

    Figure 1. Search-and-rescue teams use a graphical user interface created by AWS to locate troubled boats.

    MSDN: What makes your solution unique?

    Prodger: In the waters surrounding the United Kingdom, an estimated 350 fishing crew lost their lives at sea between 1996 and 2007. However, in our first 18 months of operation, MOB Guardian helped to save nine lives. Our original architecture supported 10,000 vessels, but with Windows Azure and AppFabric, we can support hundreds of thousands or even millions of vessels, without any capital expenses, and help save more lives.

    MSDN: Are you offering MOB Guardian to any new customer segments or niche markets? 

    Prodger: By using the Windows Azure platform, we’ve been able to transform the original fishing vessel-focused MOB Guardian application into a broader geolocation services platform that has more extensive capabilities and can be marketed to many more customers. There are also new market opportunities for GeoPoint where pleasure sailors might access GeoPoint from Facebook, for example, to see a map of where they’ve been or to log their trip once they get home. With AppFabric Access Control, we wouldn’t have to force users to create another set of authentication credentials; they could use their Facebook credentials as their GeoPoint ID.

    MSDN: What are some of the key benefits Active Web Solutions has seen since migrating to the Windows Azure platform?

    Prodger: The ability to scale up without capital expenditures is key. To scale, we simply provision more computing capacity and add more message queues from the Windows Azure platform Web portal. With AppFabric Service Bus, we can manage requests and message volumes globally at massive scale. We’re also able to avoid additional IT staffing costs because Microsoft handles all operating system updates and upgrades, which occur without downtime. In addition, we slashed development time and costs: Windows Azure saved us three to six months of development effort.

    Read the full story at:

    To read more Windows Azure customer success stories, visit:

