Memory dumps of a developer

Articles and tutorials on .NET Core, ASP.NET MVC, Kendo UI, Windows 10, Windows Mobile, Orchard

  • Breaking Changes Coming Your Way in ASP.NET Core 3.0

    Microsoft is in the process of releasing a new version of its .NET Core framework, and there are some significant changes coming your way in that release. The most important ones are:

    1. Removal of some sub-components
    2. Removal of the PackageReference to Microsoft.AspNetCore.App
    3. Reducing duplication between NuGet packages and shared frameworks

    In v3.0, the ASP.NET Core framework will contain only those assemblies that are fully developed, supported, and serviceable by Microsoft. They are doing this to reap all the benefits provided by the .NET Core shared framework, such as smaller deployment size, faster boot-up time, centralized patching, etc.

    Removal of some sub-components

    In this version, they are removing some sub-components from the ASP.NET Core shared framework, and the most notable among them are the following:

    Json.NET 

    The JSON format has become immensely popular and is now the primary method for transferring data in modern applications. But .NET doesn't have a built-in library to deal with JSON and has relied on third-party libraries like Json.NET for some time now. ASP.NET Core's tight integration with Json.NET restricted users from choosing another library, or even a different version of Json.NET itself.

    So with version 3.0 they have decoupled Json.NET from the ASP.NET Core shared framework and are planning to replace it with high-performance JSON APIs. That means you will now need to add Json.NET as a separate package to your ASP.NET Core 3.0 project.
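
    For example, the package reference can be added from the command line; a minimal sketch, assuming the Microsoft.AspNetCore.Mvc.NewtonsoftJson compatibility package that pairs with AddNewtonsoftJson():

    dotnet add package Microsoft.AspNetCore.Mvc.NewtonsoftJson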

    Then, update your ConfigureServices method to include a call to AddNewtonsoftJson() as shown below:

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc()
            .AddNewtonsoftJson();
    }
    


  • Resilient Connections in Entity Framework Core

    When you work with databases in your application, you may face connection issues from time to time that are beyond your control. When this happens, the application will normally raise a connection timeout or a server-not-available exception. In Entity Framework Core, you can overcome this kind of scenario by setting up resilient connections with exponential retries.

    The code snippet given below will retry the connection up to 10 times in case of a failure, with a delay of up to 30 seconds between each try.

    services.AddDbContext<BlogContext>(o =>
    {
        o.UseSqlServer(connectionString,
            sqlServerOptionsAction: options =>
            {
                // Retry up to 10 times, waiting at most 30 seconds between attempts
                options.EnableRetryOnFailure(maxRetryCount: 10,
                    maxRetryDelay: TimeSpan.FromSeconds(30),
                    errorNumbersToAdd: null);
            });
    });
    

    Also, when you enable retries in EF Core connections, each operation you perform becomes its own retriable operation. That means every query or call to the SaveChanges method will be retried as a unit during a transient-failure scenario. But when you initiate a transaction block in your code using BeginTransaction, you are defining your own group of operations that needs to be treated as a single unit.

    So, in this case, you will need to manually invoke an execution strategy with a delegate method that contains everything that needs to be executed as a block. When a transient failure occurs, the execution strategy will invoke the delegate again as part of the retry operation.

    var strategy = blogContext.Database.CreateExecutionStrategy();
    await strategy.ExecuteAsync(async () =>
    {
        using (var transaction = blogContext.Database.BeginTransaction())
        {
            blogContext.PostItems.Update(postItem);
            await blogContext.SaveChangesAsync();

            if (raisePostChangedEvent)
                await eventLogService.SaveEventAsync(postChangedEvent);

            transaction.Commit();
        }
    });

    Reference: https://docs.microsoft.com/en-us/dotnet/standard/modern-web-apps-azure-architecture/work-with-data-in-asp-net-core-apps


  • Deploying Resources to Azure using Azure Resource Manager Templates - Part #3

    In the previous post, I explained the steps needed for a deployment in Azure using an empty template. Let's explore further and see how we can deploy a storage account in Azure using ARM templates.

    Step 1: Create the template file

    Open any text editor and create a template like the one given below, and save it as a JSON file named StorageTemplate.json.

    In this template, we define two parameters, storageName and storageLocation, for accepting the name of the resource as well as the location where it needs to be provisioned.

    Under the resources section, we use these parameters to set the name and the location properties for the storage account. We also set the values for the resource type, kind, and SKU.

    {
        "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
        "contentVersion": "1.0.0.1",
        "parameters": {
            "storageName": {
                "type": "string"
            },
            "storageLocation": {
                "type": "string"
            }
        },
        "variables": {},
        "resources": [
            {
              "apiVersion": "2016-01-01",
              "type": "Microsoft.Storage/storageAccounts",
              "name":  "[parameters('storageName')]",
              "location": "[parameters('storageLocation')]",  
              "sku": {
                "name": "Standard_LRS"
              },
              "kind": "Storage",
              "properties": {
              }
            }
        ],
        "outputs": {}
    }
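
    Once the file is saved, the template can be deployed with the Azure CLI. A minimal sketch, assuming an existing resource group (the group name and parameter values below are placeholders):

    az group deployment create --resource-group MyResourceGroup --template-file StorageTemplate.json --parameters storageName=mystorageacct01 storageLocation=eastus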
    


  • Deploy to Azure using an Empty Azure Resource Manager Template - Part #2

    In the earlier post, I went through the basic concepts and terminologies for deploying resources using Azure Resource Manager (ARM) templates; please refer to it for a quick recap. In this post, I will show you how to perform a deployment using an empty ARM template.

    Step 1: Create an empty template

    Create an empty template like the one given below using any text editor. Save it as a JSON file with any name you want; in my case, I named it EmptyTemplate.json.

    {
      "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
      "contentVersion": "1.0.0.0",
      "parameters": {
      },
      "variables": {
      },
      "resources": [
      ],
      "outputs": {
      }
    }
    

    Step 2: Configure Azure CLI

    I am going to use the Azure CLI for the deployment. Before you start deploying, make sure that your local machine has the Azure CLI installed and configured correctly. The Azure CLI is a cross-platform tool, available for download from Microsoft, which helps you connect to your Azure subscription and execute various commands to manage and monitor it.

    The best way to verify whether it is installed is to execute the command below.

    az --version
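
    With the CLI verified, the deployment step would look something like the sketch below: create a resource group and deploy the empty template into it (the group name and location are placeholders):

    az group create --name EmptyTemplateRG --location eastus

    az group deployment create --resource-group EmptyTemplateRG --template-file EmptyTemplate.json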


  • Deploying Resources to Azure using Azure Resource Manager Templates - Part #1

    A component provisioned in Azure can contain a set of resources; for example, a Virtual Machine in Azure can have components such as storage accounts, virtual networks, IP addresses, etc. And most of the time you may want to manage, deploy, and delete these interdependent resources as a single entity. Azure Resource Manager (ARM) helps you work with these resources in a single, coordinated operation.

    ARM supports various tools for interacting with its management layer; the most used ones include the Azure CLI, Azure PowerShell, the REST APIs, and Azure Cloud Shell. The portal gets newly released functionality within 180 days of the initial release.

    The tools interact with the Azure Resource Manager API, which passes the request to the Resource Manager service to perform authentication and authorization. Once this is completed, the Resource Manager routes the request to the appropriate service for performing the requested operation.

    Source: docs.microsoft.com


  • Creating NuGet Package using .NET Core CLI

    NuGet is a great tool for managing your third-party dependencies as well as for distributing your own libraries. The dotnet pack command available in the .NET Core CLI toolset will build the project and create a NuGet package. The output of this command is a .nupkg file, which can be pushed to a public registry like nuget.org or to any private registry.

    Let's start with a .NET Core class library project and see how we can pack it using the CLI toolchain:

    dotnet new classlib --name SampleLib

    It will create a new project with a single C# file inside it. Let's create a package using the command below:
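
    The command referenced is dotnet pack, introduced above; run from the project folder, it builds the project and produces the .nupkg file (shown here in its basic form, without any optional switches):

    dotnet pack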


  • Spin up Docker Containers in a Kubernetes Cluster hosted in Azure Container Service

    In one of the earlier posts, I explained in detail the steps that need to be performed for running Docker containers in a Kubernetes cluster hosted in Azure. In that example, I used the default IIS image from Docker Hub for spinning up a new container in the cluster. In this post, I will show you how to containerize an ASP.NET Core MVC application using a private Docker registry and spin up containers in a cluster hosted in Azure using Azure Container Service.

    Pre-Requisites

    1. Azure Subscription
    2. Azure CLI
    3. kubectl 

    You need to install both the CLI tools, for Azure and for Kubernetes, on your local machine for these commands to work, and you need an Azure subscription for deploying the cluster in Azure Container Service.

    Step 1: Create a Kubernetes Cluster using Azure Container Service

    The first step is to create the cluster in Azure; for that we will use the az acs create command available in the Azure CLI. You need to provide a resource group and a name for the cluster. A resource group in Azure is like a virtual container that holds a collection of assets for easy monitoring, access control, etc. The --generate-ssh-keys parameter tells the command to create the public and private key files which can be used for connecting to the cluster.

    az acs create --orchestrator-type kubernetes --resource-group TrainingInstanceRG1 --name TrainingCluster1 --generate-ssh-keys

    Step 2: Get the credentials for the Kubernetes Cluster

    Now we need to download the credentials to our local machine for accessing the cluster. 

    az acs kubernetes get-credentials --name TrainingCluster1 --resource-group TrainingInstanceRG1

    When the command is executed, it will download the key files to your local machine; by default, they will reside in the .kube folder under your user folder.
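
    Once the credentials are in place, a quick way to confirm that kubectl can reach the cluster is to list its nodes:

    kubectl get nodes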


  • Running SQL Server on Linux in a Docker Container

    One of the new features added in SQL Server 2017 is the ability to run it on Linux. Along with that, Microsoft brought in support for Docker and released an official image, which is available on Docker Hub. It has already got over 5 million pulls from the repository and is gaining momentum day by day. Let's see the various steps for setting up a container based on this image.

    Step 1 

    Search for the image on Docker Hub and pull the official image:

    docker search microsoft

    The image we are looking for is mssql-server-linux:

    docker pull microsoft/mssql-server-linux
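
    With the image pulled, a container can be spun up with docker run. A minimal sketch, using the ACCEPT_EULA and SA_PASSWORD environment variables the image documents (the password below is a placeholder):

    docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=YourStrong!Passw0rd" -p 1433:1433 -d microsoft/mssql-server-linux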


  • Implementing Functional Testing in MVC Application using ASP.NET Core 2.1.0-preview1

    Functional testing plays an important role in delivering software products with great quality and reliability. Even though the ability to write in-memory functional tests for an ASP.NET Core MVC application was present in ASP.NET Core 2.0, it had some pitfalls:

    1. Manually copying the .deps files from the application project into the bin folder of the test project
    2. Needing to manually set the content root of the application project so that static files and views can be found
    3. Bootstrapping the app on the test server

    In ASP.NET Core 2.1, Microsoft has released a new package, Microsoft.AspNetCore.Mvc.Testing, which solves all of the problems mentioned above and helps you write and execute in-memory functional tests more efficiently. To see how it's done, let's create two projects: one for the web application and another for our test project.

    Step 1

    Create an ASP.NET Core MVC project without any authentication. The following command will create one in the folder specified by the -o switch:

    dotnet new mvc -au none -o SampleWeb/src/WebApp

    Step 2

    Let's add a test project based on the xUnit test framework using the command below. As in Step 1, the project will be created in the specified folder:

    dotnet new xunit -o SampleWeb/test/WebApp.Tests

    Step 3

    Now we will create a solution file and add these two projects into it. 

    cd SampleWeb

    dotnet new sln

    dotnet sln add src/WebApp/WebApp.csproj

    dotnet sln add test/WebApp.Tests/WebApp.Tests.csproj
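
    With the solution wired up, a minimal in-memory test could look like the sketch below. The exact type names shifted between the 2.1 previews; this sketch assumes the WebApplicationFactory<TEntryPoint> API from the released Microsoft.AspNetCore.Mvc.Testing package, a Startup class in the WebApp project, and that the test project references both that package and the WebApp project:

    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Mvc.Testing;
    using Xunit;

    public class HomePageTests : IClassFixture<WebApplicationFactory<WebApp.Startup>>
    {
        private readonly WebApplicationFactory<WebApp.Startup> _factory;

        public HomePageTests(WebApplicationFactory<WebApp.Startup> factory)
        {
            _factory = factory;
        }

        [Fact]
        public async Task Get_HomePage_ReturnsSuccess()
        {
            // CreateClient boots the app on an in-memory test server
            var client = _factory.CreateClient();

            var response = await client.GetAsync("/");

            response.EnsureSuccessStatusCode();
        }
    }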


  • Make Your HTML Pages Dynamic Using Handlebars Templating Engine

    Technologies and methods for designing and developing web pages have come a long way, and with a plethora of tools available at one's disposal, getting a minimal website up is not rocket science anymore. As web developers, we often face a dilemma: go for a static site using plain HTML, or go for a dynamic one with some server-side programming stack. Both approaches have advantages as well as disadvantages.

     If we go for a static one, the advantages are  

    1. Time needed for development is less.
    2. Easy to create and less expertise is needed 
    3. Site will be super fast 

     Disadvantages,  

    1. Hard to maintain 
    2. Lot of repetitive work 
    3. Hard to scale 

     And for the dynamic approach, advantages are 

    1. Easy to maintain 
    2. Repetitive work can be avoided 
    3. Scaling is easy and more features can be easily integrated 

     Disadvantages 

    1. Dedicated personnel are needed 
    2. Cost factor increases 

    We can overcome some of the cons mentioned above by applying JavaScript templating. JavaScript templates help to segregate the HTML code from the content it renders in the browser. This separation of concerns helps to build a codebase that is easy to maintain, where modifications can be made with minimal disruption to the existing code.

    Some of the most popular JavaScript templating engines are Mustache, Underscore, EJS, and Handlebars, and in this post, I am going to show in detail how we can make use of Handlebars to generate HTML content from a template.
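
    As a taste of what that looks like, here is a minimal sketch. The template markup and data are made up for illustration, and it assumes an element with id "output" exists on the page; Handlebars.compile is the library's standard entry point:

    <script id="post-template" type="text/x-handlebars-template">
      <h2>{{title}}</h2>
      <p>{{body}}</p>
    </script>

    <script>
      // Compile the template once, then render it with a data object
      var source = document.getElementById('post-template').innerHTML;
      var template = Handlebars.compile(source);
      document.getElementById('output').innerHTML = template({
        title: 'Hello Handlebars',
        body: 'Content generated from a template.'
      });
    </script>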