Friday 29 July 2016

Writing Roslyn based Code Analyzer and Fixer - 2

The previous post talked about my code analyzer that identifies Code Analysis suppressions in source files and provides a fix for them.

So how does it work?

Reporting diagnostics:


It is as simple as looking at all "attribute lists" and checking whether any of the attributes is "SuppressMessage", in any file not named "GlobalSuppressions.cs". I know it is not the most optimal approach, but it works for me for now.

Roslyn's new version provides a new and elegant way of dealing with code through IAction and Syntax Generators. Here I used the "RegisterSyntaxNodeAction" method.

Fixing the warning:


Register your code fix.


Modify the solution. Basically do 3 things:
   a. Add GlobalSuppressions file if it is not present.
   b. Modify the current source file.
   c. Modify the GlobalSuppressions file.

You would notice that I have added some hacks, like appending the "assembly:" text or changing "SuppressMessage" to "System.Diagnostics.CodeAnalysis.SuppressMessage", to ensure that there is no dependency on using directives. I am sure you can improve on this.


Happy code analysis!!

Tuesday 26 July 2016

Visual Studio Online Service Hooks

Visual Studio Online provides a nifty feature called "Service Hooks". It is a great way to integrate your Visual Studio Online setup with external services like Slack, Bamboo, custom web hooks, and so on.

You can set it up by clicking the settings icon on your .visualstudio.com portal. It redirects to the administration portal, where you will see the "Service Hooks" tab.



You would notice a plethora of options to integrate with. Some are well-known third-party services, some are Azure's queue offerings (Azure Storage Queue; Azure Service Bus: Notification Hub, Topic, Queue), and the like.

You can pick the events that need to be integrated with the external endpoint, e.g. Build Completed or Check-in Completed.



On the next screen, you see options specific to the external system selected on the first screen. For example, if you pick Azure Service Bus, you get to provide the following details:


And you are set.


I set up the service hook against a check-in event. As soon as a check-in completes, a message shows up in the Azure Service Bus topic.



And if you are wondering what happens when you delete the topic or provide wrong Azure Service Bus details, it is not that drastic. As you would expect, it fails in a safe manner, and you can see a detailed log of the failure so that you can correct it.






Thursday 21 July 2016

Python # 1

It is not an easy task to learn a language. It gets slightly easier if it is a programming language, though. I thought of picking up a bit of Python just for fun. Since I have a little background in programming with a modern language (C#), Python seemed more or less similar, with a few quirks in its syntax.

Here is how easy it is to have a Python program running on your machine:
  1. Go to Python.org site.
  2. Install python - either version 2.x or 3.x or both.
  3. Copy Fibonacci program from home page :)
  4. Run "python <python file path>"
Step 4 reminded me of the new DotNet Core way of running programs (dotnet "DLL path"). Maybe they picked it up from older languages to ensure that folks who have been working on Node/Python etc. do not feel strange when running a DotNetCore application.

Python is interpreted rather than statically compiled, but it does have many advanced features that you would find in modern-day languages: function pointers, lambdas, exception handling, garbage collection, classes, etc. As I was exploring the different aspects/features of the language, I discovered tons of recipes and modules available on the Internet to help with learning.
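A tiny sketch of my own showing a few of those features side by side (the names here are made up for illustration, not taken from any library):

```python
class Counter:
    """A minimal class with state, roughly like a C# class with a field."""
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1
        return self.count

# Functions are first-class values ("function pointers" in C/C# speak).
operations = {
    "double": lambda x: x * 2,   # lambda expression
    "square": lambda x: x ** 2,
}

def apply(name, value):
    try:
        return operations[name](value)
    except KeyError:
        # Structured exception handling, much as in C#.
        return None

c = Counter()
c.increment()
c.increment()
print(c.count)              # 2
print(apply("double", 21))  # 42
print(apply("cube", 3))     # None - no such operation registered
```

Garbage collection is the one feature in the list you do not see here: as in .NET, it simply happens behind the scenes.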

Plus you can run it anywhere you like - Windows or Mac or Linux. Linux OS (in my case Ubuntu 16.04 LTS) comes with Python installed - so it saves you the effort.

It does give you a few surprises, though. I borrowed snippets (from Python.org and PythonCentral) and wrote the program below:

import timeit

def fib(n):
    # Iterate the Fibonacci sequence until it reaches n.
    a, b = 0, 1
    while a < n:
        try:
            a, b = b, a + b
        except Exception as e:
            print(e)

def wrapper(func, *args):
    # timeit expects a zero-argument callable, so bind the arguments here.
    def wrapped():
        return func(*args)
    return wrapped


wrapped = wrapper(fib, 10000000000000000)
response = timeit.timeit(wrapped, number=100)
print(response)


This gives different results on different operating systems (Windows and Ubuntu), as expected. This particular program seems to run faster on a virtualized Linux OS with 4 GB RAM than on a non-virtualized Windows 10 OS with 32 GB RAM :)


Windows OS: 0.0007623 seconds
Ubuntu OS: 0.00033211 seconds

Strange!! I am sure that is not conclusive, though.
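One reason single numbers like these jump around is scheduler and cache noise. For a steadier comparison, timeit.repeat runs the measurement several times so you can take the minimum; a minimal sketch along those lines:

```python
import timeit

def fib(n):
    # Same benchmark body as above.
    a, b = 0, 1
    while a < n:
        a, b = b, a + b

# repeat() runs the 100-iteration measurement 5 times; the minimum is
# usually the most stable figure, since the larger values include
# interference from other processes.
timings = timeit.repeat(lambda: fib(10000000000000000), number=100, repeat=5)
print(min(timings))
```

Comparing the minimum of several runs on each OS would make the Windows-vs-Ubuntu numbers a fairer fight.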

Wednesday 13 July 2016

Linux - File name and Directory name are case sensitive

Note to self: Not every operating system is as forgiving as Windows.

So I have been using the Linux (Ubuntu 16.04) operating system as a side activity for the last couple of weeks, primarily because Microsoft has allowed me to be less dependent on the OS when I build applications (read: .NET Core). While using the OS, I learned that it is unforgiving when it comes to names of files and directories. They are case sensitive. See the examples below:
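You can see the same behavior from Python in a few lines (a sketch of my own; the file name is made up, and on Windows both lookups would succeed, since NTFS is case insensitive by default):

```python
import os
import tempfile

# Create a file named "Report.txt", then look it up with a different case.
d = tempfile.mkdtemp()
path = os.path.join(d, "Report.txt")
with open(path, "w") as f:
    f.write("hello")

exact = os.path.exists(os.path.join(d, "Report.txt"))
wrong_case = os.path.exists(os.path.join(d, "report.txt"))

# On Linux (ext4 is case sensitive): True False
# On Windows (NTFS, case insensitive by default): True True
print(exact, wrong_case)
```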




Another encounter with the same problem:

I installed "Docker for Windows" and "Docker Tools for Visual Studio". After adding "Docker support" in the ASP.NET Core application, the "publish" operation kept failing with the following error:


But the file was present :)


And then it struck me again. The script was instructing the Docker daemon to look for a file with the extension ".Debug", but the file present in the package had the extension ".debug". It's a wonderful world :).

PS: Experience with Ubuntu has been quite positive for me. I can run the VM with just 4 GB RAM and it does not have any responsiveness issues. Plus I can install VS Code and code in .NET.

Azure Service Fabric : Cloud deployment

Azure Service Fabric is Microsoft's entry into the world of "MicroServices". While the most popular non-Microsoft way of implementing a MicroService is to utilize "containers" (e.g. Docker), Microsoft seems to have chosen the option of "Virtual Machine Scale Set" to implement the first version.

Since the concepts of MicroServices are not something one can cover in a single post, and there is so much awesome content available on the Internet, I will try to cover only the experience with Azure Service Fabric.

In order to run an Azure Service Fabric application, you need to install the Azure Service Fabric SDK on your machine. Once it is set up, you should start to see the relevant options in the "New Project" dialog box.


In Service Fabric world, a microservice can be of one of the following categories:


Azure Service Fabric (ASF) comes with a broad implementation of reliable data structures (reliable queues, reliable dictionaries) which can be utilized when building a Stateful Service. Given that the ASF cluster launches and manages the application executable inside its infrastructure, it is entirely possible to host an arbitrary executable (called a "guest executable") in the ASF environment.

The fastest way to get ramped up on the different types of ASF applications is to go through the samples provided here.

I downloaded one of the web applications (Link) as I was interested in finding out the infrastructure that would be set up for web applications once we deploy the ASF application to Azure. The solution's help page is quite informative, and, as expected, it runs fantastically on the local machine.

Now, let us try to deploy it on Azure. Deploying an ASF application is quite simple - just select the ASF application in Visual Studio and say "Publish". A prerequisite to the deployment activity is the availability of a Service Fabric cluster in your Azure subscription.



While configuring the ASF cluster, you specify the different parameters (including ports that you would be using for different purposes like managing ASF cluster or accessing an HTTP end point that is mapped to the web application hosted in ASF cluster). Once you have deployed the ASF application to the cluster, the resource group looks like following:


The following components are deployed:
  1. Virtual Machine Scale Set: For managing the virtual machines for resiliency, reliability, and scale. The default setting is to create a set of 5 virtual machines.
  2. Load Balancer: To balance the inbound traffic on specific ports.
  3. Public IP Address: To create a global address for the ASF app.
  4. Virtual Network: A vNet so that each machine in the VM scale set has a unique internal IP.
  5. Service Fabric Cluster
  6. Storage Accounts: To store the logs and the VHD images of the VM scale set machines.

You might notice that your web application is not accessible at port 19000, as configured on the ASF cluster. This is because the web application's manifest causes the web server to run on port 8505 instead of 19000 (which is what the ASF cluster configuration specifies).




The fix: correct the LB rule.


Once that is done, the inbound traffic on port 19000 gets rerouted to internal port 8505 on the VM Scale Set machines, and the application works like a charm.



Friday 8 July 2016

Audit Logs on Microsoft Azure Platform

Auditing is something that everyone wants. Auditing is also something that everyone wants for free :). If you account for the number of hours you have to spend to implement auditing functionality in an application, the count would be quite high, which means that the auditing "feature" cannot be delivered to a customer for free. Your mind is caught between two conflicting thoughts.

When you are using Microsoft's Azure platform, there are plenty of logging features available which you can utilize to build basic infrastructure "Auditing" for your set up without much manual effort. Of course, there is some cost, but that is the running cost of storing data, not the effort of implementing it. So how do you get it done?

Search "AuditLogs" on the Azure portal :)


You will be able to see all the activities done on different resources by different folks. Now you can tell who deleted what :)


What is better is that you can configure an export of the data. The destination can be either an Azure Storage Account (good for batch processing) or an Event Hub (for real-time processing).



You can view the logged audit data through any Azure Storage data viewer tool of your choice. The table name is "insights-operational-logs". The logged data contains a lot of useful information.






Once you have either of the two exports set up, you can achieve a lot of monitoring on your Azure subscription. Imagine streaming this data through a Stream Analytics job :).

Application Insights Trends In Visual Studio Update 3

Visual Studio Update 3 brings a plethora of new and useful features. One of the features added in this update is named "Application Insights Trends". I have been a long-time user of the "Application Insights" service and have found it very useful. Integrating "Application Insights" into any .NET application, whether it is a web application, a web job, or an on-premise application, is very easy, and the tooling support is pretty good as well. If there ever was an issue with AI, it was that whenever I wanted to look at the collected data or query it for different metrics/factors, I was forced to leave Visual Studio and log in through the browser. With "Application Insights Trends", this problem is resolved.


If, like me, you are also not able to locate the "Application Insights Trends" after installing Update 3, use the quick search feature of Visual Studio - I think this feature needs to be popularized more.



Once you have opened the window, you can either choose to use predefined searches/analysis or create a custom search. If you want to switch Azure accounts, use the "Settings" button.



You can apply filters and search for a specific metric as well.


Or view predefined reports.


and drill down :)