
Validation in MVC and its different ways of implementation

We can implement validation in two ways:
1) Client-side validation
2) Server-side validation

On the server side, by default we make use of the built-in validation through DataAnnotations.
Declare standard validation attributes like “Required”, “RegularExpression”, and “Range” on top of the relevant class properties to enforce certain rules.
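For example, a model class decorated with these attributes might look like the following (the Student class and its properties are illustrative, not from the original post):

```csharp
using System.ComponentModel.DataAnnotations;

// Illustrative model; property names and rules are assumptions.
public class Student
{
    [Required(ErrorMessage = "Name is required")]
    public string Name { get; set; }

    [RegularExpression(@"^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$",
        ErrorMessage = "Invalid email address")]
    public string Email { get; set; }

    [Range(5, 100, ErrorMessage = "Age must be between 5 and 100")]
    public int Age { get; set; }
}
```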

The other way is by implementing IValidatableObject.

1) This interface contains a Validate() method whose return type is IEnumerable<ValidationResult>.
2) By implementing the IValidatableObject interface on a class, we can implement our own custom rules and conditions inside Validate().
3) Since we can implement our own custom rules and conditions, cross-property validation becomes easy.
However, this approach has a limitation in reusability, because the validation logic becomes tightly coupled to the class.
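A minimal sketch of cross-property validation via this interface (the Booking class and date rule are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;

// Illustrative class; the cross-property rule is an assumption.
public class Booking : IValidatableObject
{
    public DateTime StartDate { get; set; }
    public DateTime EndDate { get; set; }

    public IEnumerable<ValidationResult> Validate(ValidationContext validationContext)
    {
        // Custom rule spanning two properties: EndDate must come after StartDate.
        if (EndDate <= StartDate)
        {
            yield return new ValidationResult(
                "End date must be later than start date",
                new[] { nameof(EndDate) });
        }
    }
}
```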

Model binding in controller

> By default, model validation is handled during the model binding process: when request data arrives and starts binding to the class properties, the assigned validation attribute classes are applied to check whether the data is correct.
> Validation implemented using the IValidatableObject interface runs after the prebuilt attribute validation; this ensures each property has been loaded with correct and required data before cross-property validation takes place.
> After these two steps are completed, the validation results are stored in the “ModelState” property of the Controller. So even when we implement our custom validation using the IValidatableObject interface, those validation results are automatically reflected in “ModelState.IsValid”.
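In a controller action this typically looks like the following (the action and Student model are illustrative):

```csharp
// Illustrative MVC action; Student is an assumed model type.
public ActionResult Create(Student model)
{
    if (!ModelState.IsValid)
    {
        // Both attribute results and IValidatableObject results land here.
        return View(model);
    }
    // ... save the model, then redirect ...
    return RedirectToAction("Index");
}
```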

The other way is by using the ValidationAttribute class, as below.
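The original code sample did not survive; a minimal sketch of a custom attribute (the attribute name and rule are illustrative, not the author's original):

```csharp
using System.ComponentModel.DataAnnotations;

// Illustrative custom attribute: rejects string values containing spaces.
public class NoWhitespaceAttribute : ValidationAttribute
{
    protected override ValidationResult IsValid(object value, ValidationContext validationContext)
    {
        var text = value as string;
        if (text != null && text.Contains(" "))
        {
            return new ValidationResult(ErrorMessage ?? "Whitespace is not allowed");
        }
        return ValidationResult.Success;
    }
}

// Usage on a model property:
// [NoWhitespace(ErrorMessage = "Username cannot contain spaces")]
// public string UserName { get; set; }
```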

Code-based Migration in the EF Code First approach

In EF we have two types of migration:

  1. Automated Migration
  2. Code-based Migration

The information below is for Code-based Migration.

Enable-Migrations: Enables the migration in your project by creating a Configuration class.
PM> Enable-Migrations

Add-Migration: Creates a new migration class with the specified name, containing the Up() and Down() methods.
PM> Add-Migration "MigrationName"
The above command will create a “timestamp_MigrationName”.cs file with the Up() and Down() methods
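Such a generated class looks roughly like this (the migration name, table, and column are illustrative):

```csharp
using System.Data.Entity.Migrations;

// Illustrative EF6 code-based migration; names are assumptions.
public partial class AddStudentPhone : DbMigration
{
    public override void Up()
    {
        // Applied to the schema by Update-Database.
        AddColumn("dbo.Students", "Phone", c => c.String(maxLength: 20));
    }

    public override void Down()
    {
        // Applied when rolling back with -TargetMigration.
        DropColumn("dbo.Students", "Phone");
    }
}
```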

Update-Database: Executes the pending migration(s) created by the Add-Migration command and applies the changes to the database schema.
PM> Update-Database

Rollback to a previous migration
PM> Update-Database -TargetMigration:"timestamp_MigrationName"

ASP.NET Core AddSingleton vs AddScoped vs AddTransient

In .NET Core, while registering services with the dependency injection container in “ConfigureServices(IServiceCollection services)” in Startup.cs, we can define the lifetime of the instance across the application using any one of the following three methods.


AddSingleton

  • Only one instance is created, and that instance is available throughout the application.
  • Whenever we need the instance that was created during application start, we can access it at any point in the application.
  • Across any number of HTTP requests, we get access to the same instance.

AddScoped

  • For each HTTP request, a new instance gets created.
  • An instance created using AddScoped is alive only for that particular HTTP request.

AddTransient

  • Irrespective of the HTTP request, every time we ask for an instance a new one is created.
  • For example, within a single HTTP request, if we try to access the Student repository instance more than once, a new instance is created each time.
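The three lifetimes are registered like this (the service and implementation names are illustrative):

```csharp
// In Startup.cs; ICacheService, IStudentRepository, IEmailSender and their
// implementations are assumed names, not from the original post.
public void ConfigureServices(IServiceCollection services)
{
    // One instance for the whole application lifetime.
    services.AddSingleton<ICacheService, CacheService>();

    // One instance per HTTP request.
    services.AddScoped<IStudentRepository, StudentRepository>();

    // A new instance every time it is resolved, even within one request.
    services.AddTransient<IEmailSender, EmailSender>();
}
```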

Steps to Install OpenSSL in Ubuntu

Update the Ubuntu repository and install package dependencies
sudo apt update
sudo apt install build-essential checkinstall zlib1g-dev -y

Download OpenSSL (older releases such as 1.0.2o are kept in the old-releases area on openssl.org; the URL below assumes that layout)
cd /usr/local/src/
sudo wget https://www.openssl.org/source/old/1.0.2/openssl-1.0.2o.tar.gz

Now extract the openssl-1.0.2o.tar.gz file and go to the ‘openssl-1.0.2o’ directory.
tar -xf openssl-1.0.2o.tar.gz
cd openssl-1.0.2o

Install OpenSSL
cd /usr/local/src/openssl-1.0.2o

Configure and compile OpenSSL
./config --prefix=/usr/local/ssl --openssldir=/usr/local/ssl shared zlib

make
make test
sudo make install

Configure Link Libraries

cd /etc/ld.so.conf.d/
vim openssl-1.0.2o.conf

Paste the openssl library path directory (the lib directory under the --prefix used above):

/usr/local/ssl/lib

Save and exit.

Reload the dynamic link
sudo ldconfig -v

Configure OpenSSL Binary
mv /usr/bin/c_rehash /usr/bin/c_rehash.BEKUP
mv /usr/bin/openssl /usr/bin/openssl.BEKUP

Edit the ‘/etc/environment’ file using vim.
vim /etc/environment

Now add the new OpenSSL binary directory (/usr/local/ssl/bin) to the PATH variable as below (the other entries shown are the Ubuntu defaults and may differ on your system)

PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/local/ssl/bin"

Save and exit.

Reload the environment file and test the new updated binary PATH.
source /etc/environment
echo $PATH
which openssl

openssl version -a

Steps to create Certificate using OpenSSL

Step 1: Create Private Key

openssl genrsa -out C:\cert\private.key 1024


Step 2: Create self-signed root CA certificate

openssl req -new -x509 -days 1826 -key C:\cert\private.key -out C:\cert\rootCert.crt


Step 3: Create intermediate private key

openssl genrsa -out C:\cert\intermediatekey.key 1024


Step 4: Create intermediate request file

openssl req -new -key C:\cert\intermediatekey.key -out C:\cert\intermediate.csr


Step 5: Create intermediate certificate

openssl x509 -req -days 1826 -in C:\cert\intermediate.csr -CA C:\cert\rootCert.crt -CAkey C:\cert\private.key -set_serial 01 -out C:\cert\intermediateCert.crt


Step 6: Package all the above files into a .pfx file


openssl pkcs12 -export -out C:\cert\DSS2.pfx -inkey C:\cert\intermediatekey.key -in C:\cert\intermediateCert.crt -chain -CAfile C:\cert\rootCert.crt

How to sign and verify data using digital certificate

While communicating with others, there may be sensitive data that needs to reach the other end in a secure manner.

This is where digitally signed data comes into the picture, providing us a secure and trustworthy way to communicate.

Below is the code to digitally sign plain text and send it to the other end, where the signed data will be verified using the public key.

To sign and verify the data we need a private and a public key. I am going to use the private key to sign the text, and using the public key I am going to validate the signed text.

Certificate sign and verify using the SHA1 algorithm



Signing text using Private Key
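The original code block did not survive; a minimal sketch using the classic .NET Framework crypto APIs (the .pfx path and password are assumptions):

```csharp
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using System.Text;

public static class Signer
{
    // Illustrative: load the .pfx (which contains the private key) and sign text.
    public static byte[] SignText(string plainText)
    {
        var certificate = new X509Certificate2(@"C:\cert\DSS2.pfx", "password");
        var rsa = (RSACryptoServiceProvider)certificate.PrivateKey;

        byte[] data = Encoding.UTF8.GetBytes(plainText);
        // Sign the data with the private key, hashing with SHA1.
        return rsa.SignData(data, "SHA1");
    }
}
```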


Verify the signed text at the other end using the public key
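Again a sketch, assuming the receiving end holds only the certificate (.crt) with the public key:

```csharp
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using System.Text;

public static class Verifier
{
    // Illustrative: verify the signature using the certificate's public key.
    public static bool VerifySignedText(string plainText, byte[] signature)
    {
        var certificate = new X509Certificate2(@"C:\cert\intermediateCert.crt");
        var rsa = (RSACryptoServiceProvider)certificate.PublicKey.Key;

        byte[] data = Encoding.UTF8.GetBytes(plainText);
        // Returns true only if the signature matches and the data is unmodified.
        return rsa.VerifyData(data, "SHA1", signature);
    }
}
```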



How can you get the best out of Sitecore?

Get the website requirements, analyze them, break them down into smaller entities, use those entities to build normalized Sitecore data templates, and then work further to understand the content structure.

To build good templates, you should be aware of the checklist below:

– Based on the site’s requirements, think about entities and fields.
– Group all related fields under a logical section.
– Identify all common fields and create a base template.
– Field names used should be easily understood by business users.
– Use the “Display Name” property to provide a friendly name for an item.
– Always set default values in standard values.
– Apply an icon to templates, so that they are easy to recognize.
– Presentation should be configured in standard values.
– Utilize branch templates for predefined structure creation.

Content Structure:
Once the templates have been created, you need to begin creating content items in the content tree.

– As a standard, try not to have more than 100 items under a particular node. If expecting more than 100, then consider using buckets, or create folders in such a way that no folder/item exceeds 100 children.
– Try to have only page items created under the home page.
– Make sure security is added for items based on user access roles.
– For faster access, indexes should be configured.
– Also maintain minimal versions for each item.


Now that the content structure is ready, let’s get started on creating the presentation.

– Make sure all the LAYOUTS are created under ‘/Sitecore/Layout/Layouts’; you can also create sub-folders if needed, with access limitations.
– There should be no more than 3 layout structures per device.
– Layout details should be assigned in standard values, not at the Template/Item level.
– Use the Field Renderer object to render the fields in the presentation.
– Caching options should be configured wherever controls are used, based on the control definitions.



ASP.NET 5 Preview runtime

ASP.NET 5 includes the following features:

1) New flexible and cross-platform runtime
2) New modular HTTP request pipeline
3) Cloud-ready environment configuration
4) Unified programming model that combines MVC, Web API, and Web Pages
5) Ability to see changes without re-building the project
6) Side-by-side versioning of the .NET Framework
7) Ability to self-host or host on IIS
8) New tools in Visual Studio 2015
9) Open source in GitHub


Here are some helpful links to get you started with ASP.NET 5 Preview:


For more information, see the ASP.NET 5 Overview.

Getting Started – Git

ADO.NET best practices – Reading data from data reader

A nice extract that I found. Developers who want to implement some standards in their code should read it; if you are already an expert, just recap it. 🙂

{love to code?}

I have seen many people using DataReader incorrectly. In this post, I will try to explain some good practices that can be followed when reading from a data reader. Consider the following problematic code:
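The quoted snippet did not survive the reblog; a representative sketch of the kind of problematic DataReader code being discussed (connection string, query, and columns are assumptions, not the original author's code):

```csharp
using System;
using System.Data.SqlClient;

public class CustomerPrinter
{
    // Deliberately problematic, for discussion:
    public void ReadCustomers()
    {
        SqlConnection connection = new SqlConnection("Server=.;Database=Shop;Integrated Security=true");
        connection.Open();
        SqlCommand command = new SqlCommand("SELECT * FROM Customers", connection);
        SqlDataReader reader = command.ExecuteReader();
        while (reader.Read())
        {
            // No DBNull checks, column lookup by name on every row,
            // and nothing is disposed if an exception is thrown.
            Console.WriteLine(reader["Name"].ToString() + " " + reader["Email"].ToString());
        }
        reader.Close();
        connection.Close();
    }
}
```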

How many problems can you figure out from the above code? There are many problems with this code.

