Monday 31 August 2015

JsonRequestBehavior in ASP.NET MVC

ASP.NET MVC supports multiple types of result responses from an action. It can be a View, a File, Content, JavaScript, a JSON object or even a custom result. For most cases where you want to support AJAX calls, you may choose to use JsonResult. While creating one is quite straightforward, one of the common mistakes I found was around the usage of the JsonRequestBehavior value. If you use the default implementation of the Json method present in the Controller class, you may notice a few overloads of the method.
 
 
A few of the overloads accept JsonRequestBehavior as a parameter. JsonRequestBehavior is an enum that supports two values - AllowGet and DenyGet.

 
The AllowGet and DenyGet values specify whether the JsonResult response should be allowed or denied for an action exposed over the HTTP GET method. If you do not specify the value, i.e. use an overload of the Json method that does not take this parameter, the internal implementation uses DenyGet, which means the action method will return HTTP status 500 (an InvalidOperationException is thrown) if it is invoked over HTTP GET. This is, in my opinion, a good thing because you need to be cautious about what you expose as a GET operation.
 
You would have surmised by now that this kind of restriction does not apply to HTTP POST operations :).
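To make the behavior concrete, here is a minimal sketch (the controller, action names and payload are hypothetical, but Json and JsonRequestBehavior are the standard MVC APIs discussed above):

```csharp
using System.Web.Mvc;

public class ProductsController : Controller
{
    [HttpGet]
    public JsonResult GetProducts()
    {
        var data = new[] { new { Id = 1, Name = "Widget" } };
        // Without JsonRequestBehavior.AllowGet, a GET request to this action
        // fails with an InvalidOperationException (surfaced as HTTP 500).
        return Json(data, JsonRequestBehavior.AllowGet);
    }

    [HttpPost]
    public JsonResult SaveProduct(string name)
    {
        // No JsonRequestBehavior needed here: POST is allowed by default.
        return Json(new { success = true });
    }
}
```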
 

Saturday 22 August 2015

Stories of 2 IActionFilter interfaces

If your web application has both classic MVC controllers and Web API controllers and you want to apply one action filter to both, then you need to be careful about the two types of IActionFilter interfaces involved. At first, I thought writing a standard MVC action filter should work, but then I learned otherwise through the bitter experience of debugging :).
1. System.Web.Mvc.IActionFilter - [System.Web.Mvc assembly]

2. System.Web.Http.Filters.IActionFilter - [System.Web.Http assembly]
Both interfaces are named IActionFilter but have different definitions and reside in different assemblies. This is because even the base classes for the classic MVC controller and ApiController reside in different assemblies.
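From memory (exact signatures may vary slightly across framework versions), the two definitions look like this - note that the MVC one is synchronous with two callbacks, while the Web API one is asynchronous and continuation-based:

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// System.Web.Mvc assembly
namespace System.Web.Mvc
{
    public interface IActionFilter
    {
        void OnActionExecuting(ActionExecutingContext filterContext);
        void OnActionExecuted(ActionExecutedContext filterContext);
    }
}

// System.Web.Http assembly
namespace System.Web.Http.Filters
{
    public interface IActionFilter : IFilter
    {
        Task<HttpResponseMessage> ExecuteActionFilterAsync(
            System.Web.Http.Controllers.HttpActionContext actionContext,
            CancellationToken cancellationToken,
            Func<Task<HttpResponseMessage>> continuation);
    }
}
```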

Learning: If your web application has both ApiController and Controller, then you will need two different implementations (at least two thin placeholder classes) if you want to apply common action filter logic.
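One way to structure those two placeholder classes (a sketch; AuditLogic and the class names are hypothetical) is to keep the common code in a plain helper and wrap it in two thin adapters, one deriving from each framework's ActionFilterAttribute base class:

```csharp
// Hypothetical shared logic, free of any framework dependency.
public static class AuditLogic
{
    public static void OnExecuting(string actionName)
    {
        // ... common filtering/auditing code ...
    }
}

// Adapter for classic MVC controllers.
public class AuditMvcFilter : System.Web.Mvc.ActionFilterAttribute
{
    public override void OnActionExecuting(System.Web.Mvc.ActionExecutingContext filterContext)
    {
        AuditLogic.OnExecuting(filterContext.ActionDescriptor.ActionName);
    }
}

// Adapter for Web API controllers.
public class AuditApiFilter : System.Web.Http.Filters.ActionFilterAttribute
{
    public override void OnActionExecuting(System.Web.Http.Controllers.HttpActionContext actionContext)
    {
        AuditLogic.OnExecuting(actionContext.ActionDescriptor.ActionName);
    }
}
```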

Saturday 15 August 2015

JavaScript serializer, performance, Kendo Grid

We are using Kendo controls (mainly Kendo Grid for web pages) in one of our web applications. Kendo's controls are very good in terms of the simplicity they bring to building standard (but slightly complex) web application features. However, sometimes you run into strange issues.

One of our screens used the client-side paging feature of Kendo Grid and one day, out of nowhere, we started to notice exceptions like the one below:

Exception Type: System.InvalidOperationException  Exception: System.InvalidOperationException: Error during serialization or deserialization using the JSON JavaScriptSerializer. The length of the string exceeds the value set on the maxJsonLength property.
   at System.Web.Script.Serialization.JavaScriptSerializer.Serialize(Object obj, SerializationFormat serializationFormat)
   at Kendo.Mvc.Infrastructure.JavaScriptInitializer.Serialize(IDictionary`2 object)
   at Kendo.Mvc.Infrastructure.JavaScriptInitializer.Serialize(IDictionary`2 object)
   at Kendo.Mvc.Infrastructure.JavaScriptInitializer.Serialize(IDictionary`2 object)
   at Kendo.Mvc.Infrastructure.JavaScriptInitializer.InitializeFor(String selector, String name, IDictionary`2 options)
   at Kendo.Mvc.UI.Grid`1.WriteInitializationScript(TextWriter writer)
   at Kendo.Mvc.UI.WidgetBase.WriteHtml(HtmlTextWriter writer)
   at Kendo.Mvc.UI.Grid`1.WriteHtml(HtmlTextWriter writer)
   at Kendo.Mvc.UI.WidgetBase.ToHtmlString()
   at Kendo.Mvc.UI.Fluent.WidgetBuilderBase`2.ToHtmlString()
   at System.Web.WebPages.WebPageExecutingBase.WriteTo(TextWriter writer, Object content)

It was quite strange given that we did not have more than 5,000 records, and I had hoped Kendo's grid would be able to handle that. As it turned out, there were two reasons:

1. By default, Kendo's controls use .NET's built-in JavaScriptSerializer class, which has its own limitations (its MaxJsonLength property defaults to 2,097,152 characters). The good news is that Kendo controls give you a way to switch to a JSON serializer of your choice (JSON.NET in my case). This improved the number of records that could be handled by the grid. In fact, it simply doubled the number of records that could be handled through client-side paging/sorting.
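The limit itself is easy to reproduce outside of Kendo with plain .NET (a sketch; the record count is arbitrary, chosen to exceed the default limit):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Script.Serialization;

class MaxJsonLengthDemo
{
    static void Main()
    {
        // Build a payload larger than the default MaxJsonLength (2,097,152 chars).
        var rows = new List<string>();
        for (int i = 0; i < 300000; i++)
            rows.Add("some reasonably long grid cell value " + i);

        var serializer = new JavaScriptSerializer();
        try
        {
            serializer.Serialize(rows);
        }
        catch (InvalidOperationException ex)
        {
            // Same "maxJsonLength property" message as in the stack trace above.
            Console.WriteLine(ex.Message);
        }

        // Raising the limit (or switching to JSON.NET) avoids the exception.
        var relaxed = new JavaScriptSerializer { MaxJsonLength = int.MaxValue };
        relaxed.Serialize(rows);
    }
}
```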

2. The general recommendation from Kendo (and it applies everywhere, irrespective of whether you are using a set of professional controls or building your own) is that a web page should carry only the data that is absolutely required. In fact, every view on the page should have its own ViewModel class that is simple in structure.
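As a hypothetical illustration of that "lean ViewModel" advice: expose only the columns the grid actually renders, not the full domain entity.

```csharp
using System;

// Carries exactly what the grid displays - no navigation properties,
// audit fields, blobs or other payload the serializer would drag along.
public class OrderGridRowViewModel
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
    public DateTime OrderDate { get; set; }
    public decimal Total { get; set; }
}
```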

Sunday 9 August 2015

Tuple in ASP.NET MVC

So Tuple is a good thing. It is very useful in scenarios where you need to return multiple values from a method but do not intend to create a dedicated DTO for that sole purpose. It is not a structure type, which means it is passed by reference. Additionally, it is a read-only placeholder: all the values you want to return must be set in the constructor, and the properties Item1, Item2... are read-only. That indicates that the purpose of the Tuple class is to serve purely as a read-only DTO.

This brings us to another point. Given that it is quite useful, you may want to use Tuple for binding to ASP.NET MVC views. However, given its read-only design, it is problematic during non-HttpGet operations. There is a way, though: a custom ModelBinder.

I created two methods in my controller for testing Tuple - one for GET and the other for POST.

        [HttpGet]
        public ActionResult TestTuple()
        {
            // Tuple is generic; the model type must be Tuple<string, int>.
            Tuple<string, int> t = new Tuple<string, int>("test", 123);
            return View(t);
        }

        [HttpPost]
        public ActionResult TestTuple(Tuple<string, int> tuple)
        {
            return new EmptyResult();
        }

The view part is straightforward:

@model Tuple<string, int>
@{
    ViewBag.Title = "TestTuple";
}

@using (Html.BeginForm("TestTuple", "Home", FormMethod.Post))
{
    @Html.EditorFor(m => m)

    <input type="submit" value="Test Tuple" />
}

If you post the form, you get the following exception:

No parameterless constructor defined for this object. Object type 'System.Tuple`2[[System.String, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089],[System.Int32, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]]'.

Quite logical. Let's write a custom model binder and try to create the Tuple from the posted values.

public class CustomModelBinder : System.Web.Mvc.DefaultModelBinder
{
    protected override object CreateModel(ControllerContext controllerContext, System.Web.Mvc.ModelBindingContext bindingContext, Type modelType)
    {
        if (modelType == typeof(Tuple<string, int>))
        {
            // Build the immutable Tuple manually from the posted form values.
            Tuple<string, int> t = new Tuple<string, int>(
                bindingContext.ValueProvider.GetValue("Item1").AttemptedValue,
                int.Parse(bindingContext.ValueProvider.GetValue("Item2").AttemptedValue));
            return t;
        }

        return base.CreateModel(controllerContext, bindingContext, modelType);
    }
}

And assign it in the Application_Start method of Global.asax:

ModelBinders.Binders.DefaultBinder = new CustomModelBinder();

Everything works now :). Similar custom model binding steps can be applied wherever standard binding does not work. 
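Instead of replacing the global default binder, the binder can also be scoped to just this model type using the standard MVC registration API - a narrower alternative to the Global.asax line above:

```csharp
// Register the custom binder only for Tuple<string, int>; every other model
// keeps the default binder. CustomModelBinder is the class shown earlier.
ModelBinders.Binders.Add(typeof(Tuple<string, int>), new CustomModelBinder());
```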

Saturday 1 August 2015

Semantic Logging Application Block - Issues with SQL database provider customization

Enterprise Library's Semantic Logging Application Block (SLAB) is the right way to capture structured information (trace, audit and error). Enterprise Library 6.0 comes with an out-of-the-box Windows service which lets you capture these events in an out-of-process manner. This capability helps in two ways:

1. SLAB uses the high-performance ETW (Event Tracing for Windows) infrastructure to publish events. Multiple sinks can be configured to subscribe to the desired events and publish them to the desired destinations, e.g. flat file, SQL Server database, SQL Azure database, email, etc. In fact, the out-of-the-box service comes prepackaged with listeners for file and SQL database (on-premises and Azure). This also means your application's core implementation is not blocked waiting for the publishing to complete, so it can perform better.

2. When you use the Windows-service-based out-of-process listener host, you can actually cater to multiple applications, which is an awesome capability. If you have a good telemetry design and structure in place, it becomes really easy to have all of your enterprise's applications dump information to a single destination, enabling you to analyze the captured data for better decisions.

Now, let us switch to the issues. Most of us would eventually decide to dump the traces, errors and audits into a single database (because we are too lazy and it is super easy to write a T-SQL query). However, there are a couple of gotchas:

1. The out-of-the-box database listener/sink gives us the capability to rename the destination tables and have things work seamlessly: you can specify the table name in the configuration file. That said, if you change the schema (say, add new columns or rename some columns, e.g. rename ProviderID to MyProviderId), you will have to change the dbo.WriteTraces stored procedure to adhere to the new schema. That works (mostly), but you would start to notice the intermittent occurrence of the following warnings in the event viewer:

EventId : 102, Level : Warning, Message : A database sink discarded XXX events due to failures while attempting to publish a batch., Payload : [numberOfEntries : XXX] , EventName : DatabaseSinkPublishEventsFailedAndDiscardsEntriesInfo, Timestamp : 
EventId : 103, Level : Warning, Message : A database sink discarded an event with index 0 due to a failure while attempting to publish a batch. Message: The given ColumnMapping does not match up with any column in the source or destination., Payload : [message : The given ColumnMapping does not match up with any column in the source or destination.] [entryOrder : 0] , EventName : DatabaseSinkPublishEventsFailedAndDiscardSingleEntryInfo, Timestamp : 

If you run SQL Profiler to figure out the root cause of the "ColumnMapping" mismatch, you will notice the intermittent occurrence of the following command:

select @@trancount; SET FMTONLY ON select * from Trace SET FMTONLY OFF exec ..sp_tablecollations_100 N'.[Trace]'

And you wonder why - you modified the SP to take care of the mismatches and things should work. Well, look into the implementation of the database listener/sink and you will know the reason.

The out-of-the-box implementation has the following function (the generic type parameters were stripped by the blog formatting; restored here):

private async Task<int> PublishEventsAsync(IList<EventEntry> collection)
{
    int publishedEvents = collection.Count;

    try
    {
        if (collection.Count > 128)
        {
            await this.UseSqlBulkCopy(collection).ConfigureAwait(false);
        }
        else
        {
            await this.UseStoredProcedure(collection).ConfigureAwait(false);
        }

        return publishedEvents;
    }
    catch (OperationCanceledException)
    {
        return 0;
    }
    catch (Exception ex)
    {
        if (this.cancellationTokenSource.IsCancellationRequested)
        {
            return 0;
        }

        SemanticLoggingEventSource.Log.DatabaseSinkPublishEventsFailed(ex.ToString());
        throw;
    }
}

This means that if the application is publishing a large number of events, the default behavior of the database sink is to use SqlBulkCopy to copy data to the destination table. That means the bulk copy can fail if your new table schema is not compatible with the structure assumed in the code. In fact, in my case, the bulk copy operation failed due to a case mismatch in a column name: the name EventId did not work, EventID worked :). Probably that is just the way bulk copy works. There is another option by which you can force the service to always use the new SP that you wrote: change the maximum buffer batch size in the configuration file to 128, which ensures the bulk copy path is never used. But think about the impact of this change before doing it - bulk copy operations are pretty efficient for transferring large amounts of data :).

One piece of good news is that SLAB lets you specify the batch size for each type of event, so you can selectively decide the behavior per sink.
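For reference, the batch size is controlled in the out-of-the-box service's configuration file (SemanticLogging-svc.xml). The fragment below is a sketch from memory - the sink name, source name and connection string are placeholders, and attribute names may differ slightly across SLAB versions:

```xml
<sinks>
  <sqlDatabaseSink name="sqlSink"
                   instanceName="MyApp"
                   connectionString="Data Source=.;Initial Catalog=Logging;Integrated Security=True"
                   tableName="Trace"
                   bufferingCount="128">
    <!-- Keeping the buffering count at 128 or below forces the stored
         procedure path instead of SqlBulkCopy (see the code above). -->
    <sources>
      <eventSource name="MyCompany-MyApp" level="Warning" />
    </sources>
  </sqlDatabaseSink>
</sinks>
```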

2. If one of the events in the batch carries a payload of more than 4,000 characters, the whole batch of events will get discarded by the SQL provider due to a schema mismatch. That happens because the dbo.WriteTraces SP uses a table type parameter which limits the payload size. So be careful!!

Happy logging.