Monday 31 October 2016

EventHub - No connection could be made because the target machine actively refused it 127.0.0.1:10000

I was recently trying out a simple POC that listens to messages from Azure Event Hub. I referred to the online documentation and everything was very simple to set up.

I set up separate policies for Listen and Send and used them in some simple code that sends and receives a message. The code was fairly simple, but it failed :)

        static string eventHubName = "eventhub1";
        static string listeneventHubConnectionString = "";
        static string sendeventHubConnectionString = "";
        static string storageConnectionString = "UseDevelopmentStorage=true";

        static void Main(string[] args)
        {

            Console.WriteLine("Press Ctrl-C to stop the sender process");
            Console.WriteLine("Press Enter to start now");
            Console.ReadLine();
            SendingRandomMessages();

            string eventProcessorHostName = Guid.NewGuid().ToString();
            EventProcessorHost eventProcessorHost = new EventProcessorHost(eventProcessorHostName, eventHubName, EventHubConsumerGroup.DefaultGroupName, listeneventHubConnectionString, storageConnectionString);
            Console.WriteLine("Registering EventProcessor...");
            var options = new EventProcessorOptions();
            options.ExceptionReceived += (sender, e) => { Console.WriteLine(e.Exception); };
            eventProcessorHost.RegisterEventProcessorAsync<SimpleEventProcessor>(options).Wait();

            Console.WriteLine("Receiving. Press enter key to stop worker.");
            Console.ReadLine();
            eventProcessorHost.UnregisterEventProcessorAsync().Wait();

        }
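The sample relies on an IEventProcessor implementation (the Azure sample calls it SimpleEventProcessor); a minimal sketch, assuming the same Microsoft.ServiceBus.Messaging SDK used above, looks roughly like this:

// Assumes: System, System.Collections.Generic, System.Text, System.Threading.Tasks, Microsoft.ServiceBus.Messaging
class SimpleEventProcessor : IEventProcessor
{
    public Task OpenAsync(PartitionContext context)
    {
        Console.WriteLine("Processor opened for partition " + context.Lease.PartitionId);
        return Task.FromResult<object>(null);
    }

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        foreach (var eventData in messages)
        {
            // Do something useful with the event body.
            Console.WriteLine(Encoding.UTF8.GetString(eventData.GetBytes()));
        }

        // Checkpointing is what talks to the storage account (the emulator in this post).
        await context.CheckpointAsync();
    }

    public Task CloseAsync(PartitionContext context, CloseReason reason)
    {
        Console.WriteLine("Processor closing: " + reason);
        return Task.FromResult<object>(null);
    }
}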

Here is the exception:

{"Unable to connect to the remote server"}



At first the exception seems a little misleading - you get the impression that the Event Hub is not reachable or that you have typed the connection string incorrectly. However, if you look at the error stack details, it turns out that the Azure Storage Emulator (which the EventProcessorHost uses for leases and checkpoints, via the storage connection string above) is not running on the machine :) - starting the emulator (for example with "AzureStorageEmulator.exe start") fixes it. You get more hints when you start digging into the exception details.


{"No connection could be made because the target machine actively refused it 127.0.0.1:10000"}  
at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
   at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.EndGetResponse[T](IAsyncResult getResponseResult)


Saturday 15 October 2016

Unit testing Azure Table Storage

This one is a little less explored as far as Internet search goes. How do you mock Azure Table Storage operations?

I recently ran into this problem and thought of documenting the steps required to easily shim Azure Table Storage. I used MSTest and Microsoft Fakes for creating the shims.


Once the Fakes assembly is added, add shims for the Azure Table Storage related classes. The key parts are the query-binding lines near the end, as they let you control what data is queried. Once you have bound the observable source to ShimTableQuery, even the dynamic filters applied in your ATS queries get applied to the mock data.

private ShimsContext shimsContext;

[TestInitialize]
public void Initialize()
{
    // Keep the context in a field so it can be disposed in Cleanup.
    shimsContext = ShimsContext.Create();

    // Shim the account -> client -> table plumbing.
    ShimCloudStorageAccount.ParseString = s => new ShimCloudStorageAccount();
    ShimCloudStorageAccount.AllInstances.CreateCloudTableClient = account => new ShimCloudTableClient();
    ShimCloudTableClient.AllInstances.GetTableReferenceString = (client, s) => new ShimCloudTable();

    ShimTableOperation.InsertOrMergeITableEntity = entity => new ShimTableOperation(); // Insert/Update shim
    ShimTableResult.Constructor = result => new ShimTableResult();
    ShimTableResult.AllInstances.ResultGet = result => new Entity(); // The entity mapped to the storage.

    ShimCloudTable.AllInstances.CreateIfNotExistsAsync = table => Task.FromResult(true);
    ShimCloudTable.AllInstances.ExecuteAsyncTableOperation =
        (table, operation) => Task.FromResult(new ShimTableResult().Instance);

    // The important part: bind the query shim to the in-memory data.
    var q = new ShimTableQuery();
    q.Bind(entities); // observable collection of entities that you want to reflect in the shimmed table storage.

    ShimCloudTable.AllInstances.CreateQueryOf1<Entity>((table) => q);
}

[TestCleanup]
public void Cleanup()
{
    shimsContext?.Dispose();
}

Now - the good thing is that there is nothing to be changed in the code under test; it should work against both the actual ATS and the shimmed ATS.
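For illustration, a test could then look like the following; EntityRepository, GetEntitiesAsync and the Entity properties are hypothetical stand-ins for your own code under test, and entities is the in-memory collection bound to ShimTableQuery above:

[TestMethod]
public async Task Query_ReturnsOnlyMatchingShimmedEntities()
{
    // Arrange - rows that the shimmed table query will serve.
    entities.Add(new Entity { PartitionKey = "p1", RowKey = "r1" });
    entities.Add(new Entity { PartitionKey = "p2", RowKey = "r2" });

    // Act - the repository internally calls CloudStorageAccount.Parse, GetTableReference
    // and CreateQuery, all of which now hit the shims set up in Initialize().
    var repository = new EntityRepository("UseDevelopmentStorage=true", "MyTable");
    var results = await repository.GetEntitiesAsync(e => e.PartitionKey == "p1");

    // Assert - the dynamic filter ran against the mock data.
    Assert.AreEqual(1, results.Count());
}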

Wednesday 12 October 2016

OData-like filtering and paging in ASP.NET Web API

OData is quite nice. It lets you present a query- and filter-friendly REST API endpoint by giving you a standard to stick to. What's more - there is a great set of tooling available if you want to bind your data source wrappers/providers like Entity Framework (or otherwise) when you build your Web APIs using ASP.NET MVC. It is especially helpful if you want to add "free" filtering and paging abilities to your Web API. The queryable data source and the OData model binder take care of it for you.

However, when you build a real-world application, you often do not want to expose your data source directly, or sometimes it is not practical at all. In those scenarios, supporting OData-like filtering and paging capabilities in your Web API action is a little more involved.

There are some options available.

1. Build an IQueryable data source and expose that. It is not practical in most cases though.
2. Build a custom model binder that can pick the paging and filtering expressions from QueryString and bind it to a custom model parameter before executing the WebAPI action.

Help links: Link # 1, Link # 2

Here is a simple run down.

Create a simple FilterModel class (you can build similar paging model too).

public class FilterModel
{
    public Dictionary<string, string> FilterCriteria { get; set; }
}

Create a simple FilterModelBinder class that implements IModelBinder interface.

public class FilterModelBinder : IModelBinder
{
    public bool BindModel(HttpActionContext actionContext, ModelBindingContext bindingContext)
    {
        if (bindingContext.ModelType != typeof(FilterModel))
        {
            return false;
        }

        var model = new FilterModel();
        model.FilterCriteria = new Dictionary<string, string>();
        foreach (var kv in actionContext.Request.GetQueryNameValuePairs())
        {
            model.FilterCriteria.Add(kv.Key, kv.Value);
        }

        bindingContext.Model = model;
        return true;
    }
}

Bind the specific Web API action's parameter to the custom model binder.

public string Get(int id, [ModelBinder(typeof(FilterModelBinder))] FilterModel model)
        {
            return "value";
        }

Now, you can try a URL like: http://localhost:3539/api/values/5?firstname eq 'David'&pageNumber=10

You can see that the filtering and paging criteria get populated and passed on to the Web API action method for your custom implementation to take over.
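For example, inside the action you could apply the criteria to an in-memory source. A rough sketch, assuming simple key=value pairs and a made-up Person class/People list:

// Assumes: System.Collections.Generic, System.Linq, System.Web.Http, System.Web.Http.ModelBinding
public IEnumerable<Person> Get(int id, [ModelBinder(typeof(FilterModelBinder))] FilterModel model)
{
    IEnumerable<Person> result = People; // hypothetical in-memory source

    string firstName;
    if (model.FilterCriteria.TryGetValue("firstname", out firstName))
    {
        result = result.Where(p => p.FirstName == firstName);
    }

    string pageNumber;
    if (model.FilterCriteria.TryGetValue("pageNumber", out pageNumber))
    {
        const int pageSize = 10;
        result = result.Skip((int.Parse(pageNumber) - 1) * pageSize).Take(pageSize);
    }

    return result;
}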

Wednesday 7 September 2016

Static Map APIs (snapshots of maps with drawings on them)

Have you ever run into a requirement where you need to share a static image of a map with things (e.g. pins, polylines, areas etc.) drawn on it? My guess is that this is a regular requirement for social integration/sharing use cases. Nowadays, almost every new product lets you share statistics (either related to a user's activities on a new mobile app, or just a map on which a couple of folks were drawing lines etc.) on Facebook or Twitter or (is there a third :O?).

As it turns out, the big map API providers, i.e. Google and Bing, support what they call a static map API, which essentially returns what you need: an image with the requested items drawn on it, in response to a single HTTP call.


Below is a simple guide to get it done with Google APIs.

1. Get a key from Google Maps API portal.


2. You are set :). Just try out the sample URLs mentioned on the developer portal from your browser; the browser should display the picture. (An example request appears after these steps.)






3. Give it a try from Fiddler.




You are set to share map snapshots with others.
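For reference, a request can also be made from C#; the sketch below uses the documented center/zoom/size/maptype/markers/key parameters, with YOUR_API_KEY standing in for the key from step 1:

// Fetch a static map image and save it to disk.
string url = "https://maps.googleapis.com/maps/api/staticmap" +
             "?center=Brooklyn+Bridge,New+York,NY" +
             "&zoom=13&size=600x300&maptype=roadmap" +
             "&markers=color:red%7Clabel:A%7C40.702147,-74.015794" +
             "&key=YOUR_API_KEY";

using (var client = new System.Net.WebClient())
{
    client.DownloadFile(url, "map.png"); // the response body is the rendered image
}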

The only thing to keep in mind is the pricing model of the different providers. For most general usage scenarios, this API is almost free.




Tuesday 6 September 2016

Azure Data Lake vs Azure Blob Storage

References:
https://azure.microsoft.com/en-us/pricing/details/data-lake-store/
https://azure.microsoft.com/en-us/pricing/details/storage/
https://azure.microsoft.com/en-gb/documentation/articles/data-lake-store-comparison-with-blob-storage/

I was going through the costing structure of Azure Data Lake and was wondering how its pricing stacks up against the pricing of Azure Blob Storage.

As it turns out, there is a good difference, and for very good reasons.

First of all, its storage is a little more expensive than Azure Blob Storage. But that is expected, as Azure Data Lake is designed for storing massive amounts of unstructured and semi-structured data and has no practical limit on the size of the data that can be stored.

Azure Data Lake:

Azure Blob Storage:


So, per GB, the storage cost is about 10 times that of ZRS, and comparable to the other, more expensive options like GRS and RA-GRS. If you are keeping data in a single zone and you don't plan to store the whole Internet (what?), Azure Blob Storage is definitely cheaper.

Now, let us look at transaction costs:

Azure Data Lake:

Point to note: 1 transaction means reading/writing a 128 KB chunk.

Azure Blob Storage:
        $0.0036 per 100,000 transactions for Block Blobs. Transactions include both read and write operations to storage.

So how do you choose?

1. If you are just piling up unstructured data with a requirement for frequent and fast retrieval, go for Azure Blob Storage.
2. If you want to run analytics (ADLA jobs) on the stored data, go for Azure Data Lake.
3. If you want both frequent, fast data retrieval and analytics, duplicate the data in both stores. There is no either/or option here; with the current state of PaaS services, that seems to be the only way right now.

Monday 22 August 2016

Receiving expiration messages for commands in IoT Hub

When you are dealing with devices in an IoT scenario, you want to ensure that you get feedback on the commands you issue. Generally, the nature of the feedback depends on the type of IoT scenario you are trying to build. For example - if the device is tied to a building and sends out temperature readings of the rooms, it may not be necessary for the device to receive a "ChangeTemperature" command immediately; it could very well be acceptable if it picks up the command after a minute. However, imagine a scenario where you are trying to remotely control a moving object, e.g. a drone, toy car, ship or cattle - you may want to ensure that commands are received by the remote object quickly, or that you get to re-issue another command after the previous one has expired.

IoT Hub's service SDK provides a very useful API to receive feedback to every command that you issued.

1. Enable acknowledgement of the command when you issue it. If you are only looking for negative acknowledgements (expiration, rejection etc.), set the acknowledgement enum to "NegativeOnly". (Both steps are sketched in code below.)


2. Start a feedback receiver and verify the status of the commands.



Now, when a message expires before it is received by the device, you get feedback with the status "Expired" and you can re-issue the command if you like :).
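A condensed sketch of both steps, assuming the Microsoft.Azure.Devices service SDK (the connection string, device id, payload and timeouts are placeholders):

// Assumes: System, System.Text, System.Threading.Tasks, Microsoft.Azure.Devices
static async Task IssueCommandAndWatchFeedbackAsync(string serviceConnectionString, string deviceId)
{
    var serviceClient = ServiceClient.CreateFromConnectionString(serviceConnectionString);

    // 1. Issue the command with an expiry and ask for negative acknowledgements only.
    var command = new Message(Encoding.ASCII.GetBytes("ChangeTemperature:22"))
    {
        MessageId = Guid.NewGuid().ToString(),
        ExpiryTimeUtc = DateTime.UtcNow.AddSeconds(30),
        Ack = DeliveryAcknowledgement.NegativeOnly
    };
    await serviceClient.SendAsync(deviceId, command);

    // 2. Listen on the feedback endpoint and inspect the status of each record.
    var feedbackReceiver = serviceClient.GetFeedbackReceiver();
    while (true)
    {
        var batch = await feedbackReceiver.ReceiveAsync(TimeSpan.FromSeconds(10));
        if (batch == null) continue;

        foreach (var record in batch.Records)
        {
            if (record.StatusCode == FeedbackStatusCode.Expired)
            {
                Console.WriteLine("Command " + record.OriginalMessageId + " expired - re-issue if needed.");
            }
        }

        await feedbackReceiver.CompleteAsync(batch);
    }
}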



Thursday 11 August 2016

Creating GIF images programmatically

Do you have a requirement to create GIF files dynamically? I had a similar requirement, where I needed to create GIF files that would show a set of data points over a fixed time duration.

It turns out that StackOverflow and GitHub are great places :). There are plenty of libraries available that could help achieve this. In the end, I picked BumpKit and mixed its capabilities with a wonderful extension class mentioned in a StackOverflow post.

Code (it uses System.Drawing, Newtonsoft.Json and the BumpKit GifWriter plus the image extension class from the StackOverflow post):

static void Main(string[] args)
        {
            string content = @"{
""items"":
[
    {
        ""Value"": ""120""
    },
    {
        ""Value"": ""125""
    },
    {
        ""Value"": ""130""
    },
    {
        ""Value"": ""135""
    },
    {
        ""
Value"": ""140""
    }
]
            }";
            JObject obj = JObject.Parse(content);
            Console.WriteLine(obj);
            var img = Image.FromFile(@"Untitled.png");
            int delay = 0;
            using (GifWriter writer = new GifWriter("output.gif", 1000, -1))
            {
                foreach (var jItem in obj["items"])
                {
                    delay += 1000;
                    string imagePath = null;
                    var backgroundImage = img.ScaleToFit(new Size(100, 100), false, ScalingMode.Overflow);
                    using (var gfx = Graphics.FromImage(backgroundImage))
                    {
                        gfx.DrawString(jItem["
Value"].ToString(),
                            new Font(FontFamily.Families.First(f => f.Name.Contains("Times")), 15),
                            Brushes.White, 15, 25, 10,
                            new[] { Color.Black},
                            new[] { (float) 1 });
                        gfx.DrawImage(backgroundImage, 0, 0);
                    }
                    writer.WriteFrame(backgroundImage, delay);
                }
            }
            Console.ReadLine();
        }
    }

Untitled.png is a blank image with a white background that I created using MS Paint.

Output:


You can let your imagination decide how you want to use this to create interesting gifs programmatically.

Tuesday 9 August 2016

Simple OTP verification - Azure Multi Factor Authentication Provider

Have you ever run into a situation where you need to apply an additional layer of security on certain areas of your application (web and mobile both)? If you look at most of the applications available today, they tend to use "Multi Factor Authentication" for a similar objective, but during the authentication process. One way of implementing MFA is to SMS a one-time password/phrase to the user's registered mobile phone number and have the user enter that value. Quite familiar, isn't it?

Imagine a similar requirement where you have already authenticated the user but want them to go through another verification step before letting them perform some action - this is really not a user authentication process but additional security for whatever they are going to do next (e.g. Change Password, Remove Account etc.). A one-time password is a common utility for such scenarios as well.

Let us have a look at the Multi Factor Authentication Provider service available in Microsoft Azure and how to use it to achieve this specific scenario.

You can create a standalone "MFA Provider" in Azure.


You can choose to associate the MFA provider with an Azure Active Directory if you like. In my scenario, I did not associate it with an AAD. Please do note that this setting cannot be modified later, so do give it a thought.



You are now all set to integrate the OTP feature in your application.

Click the "Manage" link.


It opens up another portal where you can configure multiple properties of the MFA server (including the audio message that users hear when they get a phone call, if you use phone call verification). How much you modify the out-of-the-box functionality is totally up to you; I won't delve into that in this post.


Click on the SDK link. It opens a page listing the standard SDKs that you can use for interacting with the MFA provider you created. Each download has a specific certificate associated with it - therefore, if you want to reuse it with other MFA providers, ensure that you have the right certificate.


I downloaded the ASP.NET 2.0 C# version. It is a website; you can open it in Visual Studio.

The version I downloaded gave me compilation errors, as it was not able to identify the highlighted file as part of the solution.




There can be n reasons why it was not working - I just took the short route of copying the file's content into the "example.aspx.cs" file and excluding the highlighted file. Since the source code uses a hard-coded path "C:\\cert_key.p12" for locating the certificate file, I (being lazy) copied the file to C:\ and ran the application. You can change the path to your liking.

Once you run the solution, it opens a test page that lets you test out the different capabilities of the MFA provider. In this case, I am interested in verifying the OTP functionality.


If you are not in the USA, it will not work out of the box, as country code "1" is hard-coded in the SDK code :). You can modify the country code in the "PfAuthParams" class.


When testing the app, enter your phone number without the country code. If you want to debug, put a breakpoint in the "pf_authenticate_internal" function.


"get_response_status" function returns the value of OTP sent to the end user. Your application can store that and compare that against the user entered value for OTP. Easy, right!!


Of course, there are better ways to do this; this is just one simple approach. The good thing is that you do not have to procure an MFA server, and you pay for the number of authentications (or number of users) your application goes through.

Wednesday 3 August 2016

protobuf vs json

It can be a demanding ask for you to decide between two great options. Case in point - ProtoBuf or JSON.

In my experience, the decision boils down to the following (a small serialization sketch follows the list):

1. Is the messaging contract between two parties strict?

winner: protobuf.

protobuf expects folks to work with an Interface Definition Language (IDL) and produce a .proto file. Once the file is ready, both parties generate their platform-specific proxies from the .proto file, and the generated code should not be modified.

json is a more liberal format and does not enforce a schema by itself. There are tools available to enforce a json schema, but the json serialization process does not perform schema validation on its own; it is always an added step, done by referencing another third-party (and maybe not free) library.

2. What are the expectations around the speed of serialization/deserialization?

winner: protobuf

3. What is the expectation around size of serialized data?

winner: protobuf

4. What is the expectation around readability of serialized data?

winner: json

5. What is the application?

winner: json (web applications, web apis, dynamic data type requirements.)
winner: protobuf (low latency scenarios, high throughput etc.)
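To make points 2-4 a bit more concrete, here is a rough comparison sketch using protobuf-net and Json.NET as example libraries (the Reading type is made up for illustration):

using System;
using System.IO;
using Newtonsoft.Json;
using ProtoBuf;

[ProtoContract]
public class Reading
{
    [ProtoMember(1)] public string Sensor { get; set; }
    [ProtoMember(2)] public double Value { get; set; }
    [ProtoMember(3)] public long Timestamp { get; set; }
}

public static class SerializationComparison
{
    public static void Compare()
    {
        var reading = new Reading { Sensor = "temp-01", Value = 22.5, Timestamp = 1470182400 };

        // protobuf: compact binary payload, but requires the contract up front.
        byte[] protoBytes;
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, reading);
            protoBytes = ms.ToArray();
        }

        // json: human readable, no schema enforced by the serializer itself.
        string json = JsonConvert.SerializeObject(reading);

        Console.WriteLine("protobuf: " + protoBytes.Length + " bytes, json: " + json.Length + " chars");
    }
}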


reference: http://maxondev.com/serialization-performance-comparison-c-net-formats-frameworks-xmldatacontractserializer-xmlserializer-binaryformatter-json-newtonsoft-servicestack-text/

http://ganges.usc.edu/pgroupW/images/a/a9/Serializarion_Framework.pdf

File upload from device in IoT Hub ("Message":"ErrorCode:InvalidStorageEndpointProperty;BadRequest")

I was recently trying out the newly introduced "File Upload" feature in IoT Hub and ran into an interesting issue. I followed every step mentioned in the Azure documentation for the code, and configured file upload on the IoT Hub using the prescribed link.
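For context, the device-side call follows the documented pattern and looks roughly like this (the connection string and file name are placeholders):

// Assumes: System.IO, System.Threading.Tasks, Microsoft.Azure.Devices.Client
static async Task UploadFileAsync(string deviceConnectionString)
{
    var deviceClient = DeviceClient.CreateFromConnectionString(deviceConnectionString, TransportType.Http1);

    using (var sourceData = new FileStream(@"image.jpg", FileMode.Open))
    {
        // IoT Hub hands the device a SAS URI for the configured storage container and the
        // SDK uploads the stream to that blob - which is why the missing storage
        // configuration described below makes the call fail.
        await deviceClient.UploadToBlobAsync("image.jpg", sourceData);
    }
}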

The code kept failing with the following error:

Exception: Exception caught: 'System.ArgumentException' in mscorlib.dll ("{"Message":"ErrorCode:InvalidStorageEndpointProperty;BadRequest",
 "ExceptionMessage":"Tracking ID:98cf5df7f44c4d0f9a463b1cec691e14-G:4-TimeStamp:08/03/2016 05:59:50"}").
 Exception caught: 'System.ArgumentException' in mscorlib.dll
 ("{"Message":"ErrorCode:InvalidStorageEndpointProperty;BadRequest",
 "ExceptionMessage":"Tracking ID:98cf5df7f44c4d0f9a463b1cec691e14-G:4-TimeStamp:08/03/2016 05:59:50"}")

As it turns out, there is a bug in the current Azure Portal UI: it does not set the storage account details and container name in the "File Upload" options. The UI shows the following:


You can verify that by getting the details of the resource through PowerShell.

PS C:\windows\system32> Login-AzureRmAccount
PS C:\windows\system32> $res = Get-AzureRmResource -ResourceId /subscriptions/XYZ/resourceGroups/StreamAnalytics-Default-East-US/providers/Microsoft.Devices/IotHubs/123iothub
PS C:\windows\system32> $res

Name              : 123iothub
ResourceId        : /subscriptions/XYZ/resourceGroups/StreamAnalytics-Default-East-US/
                    providers/Microsoft.Devices/IotHubs/123iothub
ResourceName      : 123iothub
ResourceType      : Microsoft.Devices/IotHubs
ResourceGroupName : StreamAnalytics-Default-East-US
Location          : eastus
SubscriptionId    : XYZ
Tags              : {}
Properties        : @{state=Active; provisioningState=Succeeded; hostName=123iothub.azure-devices.net;
                    eventHubEndpoints=; storageEndpoints=; messagingEndpoints=; enableFileUploadNotifications=True;
                    cloudToDevice=; operationsMonitoringProperties=; features=DeviceManagement; generationNumber=0}
ETag              : AAAAAABLpj8=
Sku               : @{name=F1; tier=Free; capacity=1}

PS C:\windows\system32> $res.Properties

state                          : Active
provisioningState              : Succeeded
hostName                       : 123iothub.azure-devices.net
eventHubEndpoints              : @{events=; operationsMonitoringEvents=}
storageEndpoints               : @{$default=}
messagingEndpoints             : @{fileNotifications=}
enableFileUploadNotifications  : True
cloudToDevice                  : @{maxDeliveryCount=10; defaultTtlAsIso8601=PT1H; feedback=}
operationsMonitoringProperties : @{events=}
features                       : DeviceManagement
generationNumber               : 0

PS C:\windows\system32> $res.Properties.storageEndpoints
$default
--------
@{sasTtlAsIso8601=PT2H; connectionString=; containerName=}

So, till the UI gets fixed, the workaround is either to update the properties through PowerShell or to use a wonderful site called https://resources.azure.com.



Once you have updated the details, the UI starts to show the assigned storage account, and of course the file upload functionality starts to work as well.




Friday 29 July 2016

Writing Roslyn based Code Analyzer and Fixer - 2

The previous post talked about my code analyzer that identifies Code Analysis suppressions in source files and provides a fix for them.

So how does it work?

Reporting diagnostics:


It is as simple as looking at all "attribute lists" and checking whether any of the attributes is "SuppressMessage" in a file that is not called "GlobalSuppressions.cs". I know it is not the most optimal approach, but it works for me for now.

The new version of Roslyn provides an elegant way of dealing with code through analysis actions and syntax generators. Here I used the "RegisterSyntaxNodeAction" method, roughly as sketched below.
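A minimal sketch of that diagnostic half (the descriptor id, title and messages are placeholders):

using System.Collections.Immutable;
using System.IO;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;
using Microsoft.CodeAnalysis.Diagnostics;

[DiagnosticAnalyzer(LanguageNames.CSharp)]
public class SuppressionAnalyzer : DiagnosticAnalyzer
{
    private static readonly DiagnosticDescriptor Rule = new DiagnosticDescriptor(
        id: "XX0001",
        title: "In-source suppression",
        messageFormat: "Move SuppressMessage attributes to GlobalSuppressions.cs",
        category: "Maintainability",
        defaultSeverity: DiagnosticSeverity.Warning,
        isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics => ImmutableArray.Create(Rule);

    public override void Initialize(AnalysisContext context)
    {
        // Look at every attribute list in the syntax tree.
        context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.AttributeList);
    }

    private static void AnalyzeNode(SyntaxNodeAnalysisContext context)
    {
        // Skip the central suppression file itself.
        if (Path.GetFileName(context.Node.SyntaxTree.FilePath) == "GlobalSuppressions.cs")
            return;

        var attributeList = (AttributeListSyntax)context.Node;
        foreach (var attribute in attributeList.Attributes
            .Where(a => a.Name.ToString().Contains("SuppressMessage")))
        {
            context.ReportDiagnostic(Diagnostic.Create(Rule, attribute.GetLocation()));
        }
    }
}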

Fixing the warning:


Register your code fix.


Modify the solution. Basically do 3 things:
   a. Add GlobalSuppressions file if it is not present.
   b. Modify the current source file.
   c. Modify the GlobalSuppressions file.

You will notice that I have added some hacks, like prepending the "assembly:" text or changing "SuppressMessage" to "System.Diagnostics.CodeAnalysis.SuppressMessage", to ensure that there is no dependency on using directives. I am sure you can improve on this.


Happy code analysis!!

Tuesday 26 July 2016

Visual Studio Online Service Hooks

Visual Studio Online provides a nifty feature called "Service Hooks". It is a great way to integrate your Visual Studio Online setup with external services like Slack, Bamboo, custom web hooks etc.

You can set it up by clicking the settings icon on your .visualstudio.com site. It redirects to the administration portal, where you see the "Service Hooks" tab.



You will notice a plethora of options to integrate with. Some are well-known third-party services, some are Azure's queueing offerings (Azure Storage Queue; Azure Service Bus: Notification Hub, Topic, Queue), and the like.

You can pick the events that need to be integrated with the external endpoint, e.g. Build Completed, Check-in Completed etc.



On the next screen, you see options specific to the external system selected on the first screen. For example - if you pick Azure Service Bus, you get to provide the following details:


And you are set.


I set up the service hook against a check-in event. As soon as a check-in completes, a message shows up in the Azure Service Bus topic.



And if you are wondering what happens when you delete the topic or provide wrong Azure Service Bus details, it is not that drastic. As you would expect, it fails safely, and you can see a detailed log of the failure so that you can correct it.






Thursday 21 July 2016

Python # 1

It is not an easy task to learn a language. It gets slightly easier if it is a programming language though. I thought of picking up a bit of Python just for fun. Since I have a little background in programming with a modern language (C#), it seemed more or less similar, with a few quirks in Python's syntax.

Here is how easy it is to have a Python program running on your machine:
  1. Go to Python.org site.
  2. Install python - either version 2.x or 3.x or both.
  3. Copy the Fibonacci program from the home page :)
  4. Run "python <python file path>"
Step 4 reminded me of the new .NET Core way of running programs (dotnet "DLL path"). Maybe they picked it up from older languages to ensure that folks who have been working with Node/Python etc. do not feel out of place when running a .NET Core application.

Python is interpreted rather than statically compiled, but it does have many advanced features that you would find in modern-day languages - function pointers, lambdas, exception handling, garbage collection, classes etc. As I was exploring the different aspects/features of the language, I discovered tons of recipes and modules available on the Internet to help with learning.

Plus you can run it anywhere you like - Windows or Mac or Linux. Linux OS (in my case Ubuntu 16.04 LTS) comes with Python installed - so it saves you the effort.

It does give you a few surprises though. I borrowed snippets (from Python.org and PythonCentral) and wrote the program below:

import timeit
import time

def fib(n):
    a, b = 0, 1
    while a < n:
        try:
            a, b = b, a + b
        except:
            print(Exception)

def wrapper(func, *args):
    def wrapped():
        return func(*args)
    return wrapped


wrapped = wrapper(fib, 10000000000000000)
response = timeit.timeit(wrapped, number=100)
print(response)


This gives different results on different operating systems (Windows and Ubuntu), as expected. This particular program seems to run faster on a virtualized Linux OS with 4 GB RAM than on a non-virtualized Windows 10 machine with 32 GB RAM :)


Windows OS: 0.0007623 seconds
Ubuntu OS: 0.00033211 seconds

Strange!! I am sure that is not a conclusive thing.

Wednesday 13 July 2016

Linux - File name and Directory name are case sensitive

Note to self: Not every operating system is as forgiving as Windows.

So I have started using the Linux (Ubuntu 16.04) operating system as a side activity for the last couple of weeks, primarily because Microsoft now allows me to be less dependent on the OS when I build applications (read: .NET Core). While using the OS, I learned that it is unforgiving when it comes to names of files and directories: they are case sensitive. See the examples below:




Another encounter with the same problem:

I installed "Docker for windows" and "Docker tools for Visual Studio". After adding "docker support" in the ASP.NET Core application, "publish" operation kept failing with following error:


But the file was present :)


And then it struck me again. The script was instructing the Docker daemon to look for a file with the extension ".Debug", but the file present in the package had the extension ".debug". It's a wonderful world :).

PS: The experience with Ubuntu has been quite positive for me. I can run the VM with just 4 GB RAM and it does not have any responsiveness issues. Plus, I can install VS Code and code in .NET.

Azure Service Fabric : Cloud deployment

Azure Service Fabric is Microsoft's entry into the world of "microservices". While the most popular non-Microsoft way of implementing a microservice is to utilize "containers" (e.g. Docker), Microsoft seems to have chosen "Virtual Machine Scale Sets" to implement the first version.

Since microservice concepts are not something one can cover in a single post, and there is so much awesome content available on the Internet, I will only try to cover the experience with Azure Service Fabric.

In order to run an Azure Service Fabric application, you need to install the Azure Service Fabric SDK on your machine. Once it is set up, you should start to see the relevant options in the "New Project" dialog box.


In the Service Fabric world, a microservice can be one of the following categories:


Azure Service Fabric (ASF) comes with a broad implementation of reliable data structures (reliable queues, reliable dictionaries) which can be utilized when building a stateful service. Given that the ASF cluster launches and manages the application executable inside its infrastructure, it is entirely possible to host an arbitrary executable (called a "guest executable") in the ASF environment.

The fastest way to get ramped up on the different types of ASF applications is to go through the samples provided here.

I downloaded one of the web applications (Link), as I was interested in finding out what infrastructure would be set up for web applications once we deploy the ASF application to Azure. The solution's help page is quite informative and, as expected, it runs fantastically on the local machine.

Now, let us try to deploy it on Azure. Deploying an ASF application is quite simple - just select the ASF application in Visual Studio and say "Publish". A prerequisite for the deployment is the availability of a Service Fabric cluster in your Azure subscription.



While configuring the ASF cluster, you specify the different parameters (including the ports you will be using for different purposes, like managing the ASF cluster or accessing an HTTP endpoint mapped to the web application hosted in the cluster). Once you have deployed the ASF application to the cluster, the resource group looks like the following:


The following components are deployed:
  1. Virtual Machine Scale Set: For managing the virtual machines for resiliency, reliability and scale purposes. The default setting is to create a set of 5 virtual machines.
  2. Load Balancer: To balance the inbound traffic on specific ports.
  3. Public IP Address: To create a global address for the ASF app.
  4. Virtual Network: A vNet so that each machine in the VM scale set has a unique internal IP.
  5. Service Fabric Cluster
  6. Storage Accounts: To store logs and the VHD images of the VM scale set machines.

You might notice that your web application is not accessible on port 19000 as configured on the ASF cluster. This is because the web application's service manifest causes the web server to run on port 8505 instead of 19000 (which is what the ASF cluster configuration exposes).




The fix: correct the LB rule.


Once that is done, the inbound traffic on port 19000 gets rerouted to internal port 8505 on the VM scale set machines, and the application works like a charm.