Monday 26 January 2015

Inline JavaScript blocks and Content Security Policy (CSP)

Recently I was working with a team that was writing a large Web application using the ASP.NET MVC framework. They had written a lot of inline JavaScript like the following:
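The original snippet did not survive here, but a representative sketch of the pattern (the controller, action and element names are made up for illustration) would be something like:

```html
<!-- Razor view with an inline script block (illustrative names) -->
<div id="records"></div>
<script type="text/javascript">
    $(document).ready(function () {
        $.getJSON('@Url.Action("GetRecords", "Records")', function (data) {
            $.each(data, function (i, record) {
                $('#records').append($('<p/>').text(record.Name));
            });
        });
    });
</script>
```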


While this does not look all that awful, this kind of code has a couple of issues:

Maintainability and Re-usability:


If you have ever worked in a large team, you can relate to the feeling you get when you see a razor view full of multiple script blocks performing different kinds of functions. You start to wonder about the re-usability and maintainability of such code blocks - how many bugs could have been prevented, and how many development hours saved, if folks had just reused the code. Let's accept it - keeping code in common files makes things simpler.

Performance:


Web servers and CDNs can easily cache static files. If the JavaScript code is moved into separate JS files, those files can be cached, which helps improve the performance of the web page.

Security:

Content Security Policy is something you should consider applying to your web pages to prevent Cross-Site Scripting and other similar attacks. Essentially, when you enable CSP in your web application, you declare to the browser which origins it should trust when loading and executing page content (that includes JS, CSS, fonts etc.). Although in its current state CSP does not enforce any security measures against browser extensions and add-ons, it is still far safer than writing inline scripts and fretting about the consequences if someone finds a way to inject JavaScript code into your web pages.
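As a sketch, one way to emit a CSP header on every response in an IIS-hosted ASP.NET MVC application is straight from web.config - the policy value below is illustrative, not a recommendation:

```xml
<!-- web.config: add a Content-Security-Policy response header -->
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Content-Security-Policy"
           value="default-src 'self'; script-src 'self' https://cdn.example.com" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
```

Note that with a policy like this, inline script blocks stop executing altogether - which is exactly what forces the code into external files.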

So that is that. What happens when you try to implement this in a standard ASP.NET MVC application? Potential challenges:

1. A major disadvantage is that you cannot use server-side objects - e.g. messages declared in your resource files, constants declared in C# classes etc. - in an external JavaScript file. You could find a clever workaround, such as using the Razor template engine outside of views to generate the script file content dynamically, but that could end up being a little unmanageable in terms of caching and clean-up of the dynamically generated content.
2. Another disadvantage shows up in scenarios where you need to render dynamic views incrementally. You may have to switch from ViewResult to JsonResult so that the rendering can be managed through an external JavaScript file.
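One common way around the first challenge, sketched here with made-up action and resource names, is to surface the server-side values in the markup and let the external script read them - no inline script block required:

```html
@* Razor view: expose server-side values via data-* attributes (illustrative) *@
<div id="app"
     data-save-url="@Url.Action("Save", "Records")"
     data-error-message="@Resources.Messages.SaveFailed">
</div>
```

The external app.js can then pick these up with something like `document.getElementById('app').getAttribute('data-save-url')`.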

Tools:
In order to overcome those challenges, you should move to a client-side MVVM programming model. There are plenty of tools available now that make such a programming model easy, e.g. AngularJS and KnockoutJS. Both fit well inside an ASP.NET MVC application. To ease the pain of handling unstructured contracts and data types, you could also leverage a wonderful tool released by Microsoft called TypeScript. Integration between TypeScript and the popular client-side MVVM tools has already been made easy by the enthusiastic community - Example # 1, Example # 2
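To illustrate what TypeScript buys you over unstructured contracts, here is a minimal sketch - the DTO shape and names are made up - showing a contract the compiler can check at build time:

```typescript
// Hypothetical contract mirroring a server-side DTO (names are illustrative).
interface RecordDto {
    id: number;
    name: string;
}

// Any call site passing a mis-shaped object now fails at compile time
// instead of failing at runtime in the browser.
function formatRecord(record: RecordDto): string {
    return "#" + record.id + ": " + record.name;
}

console.log(formatRecord({ id: 1, name: "Record1" })); // prints "#1: Record1"
```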

Peace!!

Sunday 18 January 2015

Strange SGEN issue

Mixed mode assembly is built against version 'v2.0.50727' of the runtime and cannot be loaded in the 4.0 runtime without additional configuration information. Debugger\SGEN

I recently faced this strange compilation issue. It was strange because the compilation failed in release mode only; things compiled and ran fine in debug mode. After reading about the issue online, I figured out that the compilation process was trying to run "SGEN.exe" in order to create an assembly containing XML serializers for the types in the targeted project. It turned out that the issue was with one of the dependency assemblies referenced in the project, as it was built against .NET Framework version 2.0. Given that my implementation had no serialization requirements, I simply turned the option off in the project properties and things started to work just fine.
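For reference, the project-properties switch ("Generate serialization assembly: Off") maps to an MSBuild property in the .csproj; a sketch for the Release configuration might look like:

```xml
<!-- .csproj: skip the SGEN step for Release builds -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <GenerateSerializationAssemblies>Off</GenerateSerializationAssemblies>
</PropertyGroup>
```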


Sunday 11 January 2015

SQL Server - data consistency is important

Sometimes it is difficult to tell why SQL Server is complaining about the failure of a statement if you are not paying close attention. Here is one example:

Create a table "dbo.Records" and insert some records in it.

Create Table dbo.[Records]
(
Id int not null primary key identity (1,1),
Name nvarchar(100) not null
)
GO

Insert Into dbo.Records
(Name)
SELECT 'Record1'

Create a table dbo.MasterRecords and insert some records in it.

Create Table dbo.[MasterRecords]
(
Id int not null primary key,
Name nvarchar(50) not null
)
GO

Insert Into dbo.MasterRecords
(Id, name)
SELECT 1, 'TestMaster1'

Now, if you realize that you should have a reference relationship between the two tables, you may choose to run Entity Framework Migrations. If you let it generate the change by itself, it will produce a script like the following:

ALTER TABLE dbo.[Records]
ADD MasterRecordId Int NOT NULL Default (0)
GO

ALTER TABLE dbo.[Records]
Add constraint FK_RecordId_Master_MasterId Foreign Key (MasterRecordId) References MasterRecords(Id)
GO

You will notice that the last statement, which creates the foreign key, fails. It fails for a very valid reason though: the MasterRecords table does not have a record with Id "0", so the foreign key cannot be created - the data present in the two tables is not consistent with what the SQL statement demands.

Solution? Either make the existing records consistent, i.e. update the rows in the Records table to match existing values in the MasterRecords table, or add a record to MasterRecords with an Id value of "0".
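The first option - making the existing rows consistent - could be sketched like this, assuming the MasterRecordId column added by the migration above:

```sql
-- Option 1 (illustrative): point existing Records rows at a master
-- record that already exists (Id 1 was inserted earlier), then add the FK.
UPDATE dbo.Records
SET MasterRecordId = 1
WHERE MasterRecordId = 0
```

Alternatively, add the missing master record, as the next snippet does: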

Insert Into dbo.MasterRecords
(Id, name)
SELECT 0, 'None'

Once you do that everything works fine.