ASP.NET Core, C#, Programming

Tips on using Autofac in .NET Core 3.x

.NET Core supports the dependency injection (DI) design pattern, a technique for achieving Inversion of Control (IoC) between classes and their dependencies. The native, minimalistic implementation is known as a conforming container, which is considered an anti-pattern; you can read more about the issues related to it here. Microsoft has promised better integration points for third-party DI vendors in future .NET Core releases, and the built-in container is good enough for most pet projects and small production projects. Having used DI for quite a long time, though, I am used to broader support from dependency injection frameworks. To keep this article short and focused I will skip the definitive list of functionality I miss in the native DI implementation for .NET Core and mention only a few items: extended lifetime scope support, automatic assembly scanning for implementations, aggregate services and multi-tenant support. There are plenty of DI frameworks on the market. Back in 2008/2009, when I switched to .NET, one of my favorite DI frameworks was StructureMap. With its rich functionality it was one of the standard choices for my projects. Another popular framework was Castle Windsor. For some time I was also a user of Ninject, which I found very easy to use.

However, StructureMap has been deprecated for some time already, and while Ninject is still good, I was looking for a different DI container to try with one of my new .NET Core projects. Autofac caught my attention immediately. It has been on the market since 2007 and it only gets better, with 3400+ stars and 700+ forks on GitHub. It has exhaustive documentation and an extensive feature list. At the moment of writing this post the latest version of Autofac is 6, and the way you bootstrap it in .NET Core 3.x and 5 has changed compared to the 5.x branch.

So, enough talk: talk is cheap, show me some code…

Tip 1

Setting up

public class Program
{
  public static void Main(string[] args)
  {
    // ASP.NET Core 3.0+:
    // The UseServiceProviderFactory call attaches the
    // Autofac provider to the generic hosting mechanism.
    var host = Host.CreateDefaultBuilder(args)
        .UseServiceProviderFactory(new AutofacServiceProviderFactory())
        .ConfigureWebHostDefaults(webHostBuilder => {
          webHostBuilder
            .UseContentRoot(Directory.GetCurrentDirectory())
            .UseIISIntegration()
            .UseStartup<Startup>();
        })
        .Build();
    host.Run();
  }
}

Startup Class

public class Startup
{
  public Startup(IWebHostEnvironment env)
  {
    // In ASP.NET Core 3.0+ `env` is an IWebHostEnvironment (IHostingEnvironment is obsolete).
    // Add configuration sources here as needed.
    this.Configuration = new ConfigurationBuilder().Build();
  }

  public IConfigurationRoot Configuration { get; private set; }

  public ILifetimeScope AutofacContainer { get; private set; }

  public void ConfigureServices(IServiceCollection services)
  {
    services.AddOptions();
    services.AddControllers();
  }

  public void ConfigureContainer(ContainerBuilder builder)
  {
    // Register your own things directly with Autofac here. Don't
    // call builder.Populate(); that happens in AutofacServiceProviderFactory
    // for you.
    builder.RegisterModule(new MyApplicationModule());
  }

  public void Configure(IApplicationBuilder app)
  {
    // If, for some reason, you need a reference to the built container, you
    // can use the convenience extension method GetAutofacRoot.
    this.AutofacContainer = app.ApplicationServices.GetAutofacRoot();

    // Console and debug logging providers are already wired up by
    // Host.CreateDefaultBuilder; the old ILoggerFactory.AddConsole/AddDebug
    // extensions were removed in 3.0.
    app.UseRouting();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
  }
}

Tip 2

Scanning assemblies

Autofac can use conventions to find and register components in assemblies.

public void ConfigureContainer(ContainerBuilder builder)
{
    builder.RegisterAssemblyTypes(typeof(Startup).Assembly)
        .AsClosedTypesOf(typeof(IConfigureOptions<>));
}

This will register types that are assignable to closed implementations of the open generic type. In this case it will register all implementations of IConfigureOptions<>. See the options pattern for more information on how to work with configuration settings through dependency injection.
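
For illustration, here is a minimal sketch of an options class and an IConfigureOptions<> implementation that such a scan would pick up; NinjaOptions and ConfigureNinjaOptions are hypothetical types, not part of the original project:

using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Options;

public class NinjaOptions
{
    public int MaxShurikens { get; set; }
}

// Closes the open generic IConfigureOptions<>, so the assembly scan above
// registers this class as IConfigureOptions<NinjaOptions> automatically.
public class ConfigureNinjaOptions : IConfigureOptions<NinjaOptions>
{
    private readonly IConfiguration _configuration;

    public ConfigureNinjaOptions(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    public void Configure(NinjaOptions options)
    {
        options.MaxShurikens = _configuration.GetValue<int>("Ninja:MaxShurikens");
    }
}

Consumers can then simply take a dependency on IOptions<NinjaOptions> and get the configured values injected.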

Tip 3

Use Autofac to instantiate MVC/API controllers

By default, controllers aren’t resolved from the container; only their constructor parameters are. That means controller lifecycles, property injection, and other concerns aren’t managed by Autofac – they’re managed by ASP.NET Core. You can change that using AddControllersAsServices().

public void ConfigureServices(IServiceCollection services)
{
    // AddControllersAsServices is an IMvcBuilder extension, so chain it after AddControllers().
    services.AddControllers().AddControllersAsServices();
}

public void ConfigureContainer(ContainerBuilder builder)
{
    var controllerTypesInAssembly = typeof(Startup).Assembly
        .GetExportedTypes()
        .Where(type => typeof(ControllerBase).IsAssignableFrom(type))
        .ToArray();

    builder.RegisterTypes(controllerTypesInAssembly).PropertiesAutowired();
}

Here we register all types that descend from ControllerBase. We also enable property injection via PropertiesAutowired(). This is useful when you want a property on a base controller implementation that can be reused across controllers (e.g. IMediator).
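
For example, a base controller along these lines can rely on that property injection; this is only a sketch, assuming the MediatR package and its IMediator interface, which the original post mentions just in passing:

using MediatR;
using Microsoft.AspNetCore.Mvc;

public abstract class ApiControllerBase : ControllerBase
{
    // Populated by Autofac because the controller registration above uses
    // PropertiesAutowired(); derived controllers can use Mediator without
    // declaring a constructor of their own.
    public IMediator Mediator { get; set; }
}

Any controller deriving from ApiControllerBase can then call Mediator.Send(...) directly.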

Tip 4

Register EF Core DbContext with Autofac

If you use Entity Framework Core, you want your DbContext to be managed by the DI container. One important note is that the DbContext should behave as a unit of work and be scoped to the request lifetime. In the native DI it is registered as a scoped service, which in Autofac is equivalent to InstancePerLifetimeScope.

public static void AddCustomDbContext(this ContainerBuilder builder, IConfiguration configuration)
{
    builder.Register(c =>
    {
        var options = new DbContextOptionsBuilder<ApplicationContext>();
        options.UseLoggerFactory(c.Resolve<ILoggerFactory>()).EnableSensitiveDataLogging();
        options.UseSqlServer(configuration["ConnectionStrings:ApplicationDb"], sqlOptions =>
        {
            sqlOptions.MigrationsAssembly(typeof(Startup).GetTypeInfo().Assembly.GetName().Name);
            sqlOptions.EnableRetryOnFailure(maxRetryCount: 15, maxRetryDelay: TimeSpan.FromSeconds(30), errorNumbersToAdd: null);
        });

        return options.Options;
    }).InstancePerLifetimeScope();

    builder.RegisterType<ApplicationContext>()
        .AsSelf()
        .InstancePerLifetimeScope();
}

public void ConfigureContainer(ContainerBuilder builder)
{
    builder.AddCustomDbContext(this.Configuration);
}
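
For completeness, here is a minimal sketch of an ApplicationContext that consumes the options registered above; the Ninja entity is hypothetical and only there to make the example self-contained:

using Microsoft.EntityFrameworkCore;

public class ApplicationContext : DbContext
{
    // Autofac resolves DbContextOptions<ApplicationContext> from the factory
    // registration above and injects it when constructing the context.
    public ApplicationContext(DbContextOptions<ApplicationContext> options)
        : base(options)
    {
    }

    public DbSet<Ninja> Ninjas { get; set; }
}

public class Ninja
{
    public int Id { get; set; }
    public string Name { get; set; }
}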

Tip 5

Use modules for your registrations

public void ConfigureContainer(ContainerBuilder builder)
{
    builder.RegisterModule(new MediatorModule());
    builder.RegisterModule(new ApplicationModule());
}

public class ApplicationModule : Autofac.Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.RegisterType<NinjaService>().As<INinjaService>().SingleInstance();
        builder.RegisterType<KatanaService>().As<IKatanaService>().InstancePerLifetimeScope();

        builder.RegisterAssemblyTypes(typeof(Startup).GetTypeInfo().Assembly)
            .AsClosedTypesOf(typeof(INinjaRepository<>))
            .InstancePerLifetimeScope();
    }
}

Keeping registrations in modules makes your wire-up code structured and allows deployment-time settings to be injected.
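
As a sketch of injecting a deployment-time setting, a module can take a value through its constructor; the PersistenceModule and repository types below are hypothetical and only illustrate the idea:

using Autofac;

public class PersistenceModule : Autofac.Module
{
    private readonly bool _useInMemoryRepositories;

    public PersistenceModule(bool useInMemoryRepositories)
    {
        _useInMemoryRepositories = useInMemoryRepositories;
    }

    protected override void Load(ContainerBuilder builder)
    {
        // The deployment-time flag decides which implementation gets registered.
        if (_useInMemoryRepositories)
        {
            builder.RegisterType<InMemoryNinjaRepository>().As<INinjaRepository>().SingleInstance();
        }
        else
        {
            builder.RegisterType<SqlNinjaRepository>().As<INinjaRepository>().InstancePerLifetimeScope();
        }
    }
}

It could then be registered in ConfigureContainer with something like builder.RegisterModule(new PersistenceModule(Configuration.GetValue<bool>("UseInMemoryRepositories"))).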

Tip 6

Follow best practices and recommendations
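
The Autofac documentation offers plenty of guidance worth reading. One commonly cited recommendation is to resolve components from nested lifetime scopes rather than from the root container, so that IDisposable components are cleaned up when the scope ends. A minimal sketch (INinjaService is a hypothetical service):

// Resolving from a child scope ensures IDisposable components are disposed
// when the scope ends instead of living as long as the root container.
using (var scope = this.AutofacContainer.BeginLifetimeScope())
{
    var ninjaService = scope.Resolve<INinjaService>();
    ninjaService.Strike();
}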

.NET, ASP.NET Core, Programming

From Zero to Hero: Build ASP.NET Core 3.1 production-ready solution from the ground up (Part 1)

How often do you start a new project with the latest and greatest version of .NET Core and C# to try some new fancy language features, or perhaps to create a new solution for implementing your ideas? It happens to me a lot: I find myself creating pet projects over and over again. Sometimes a project grows and gets more contributors. People work from different places, with different IDEs and operating systems. The solution should work the same way on each workstation and each OS. It is also important to have code style conventions and scripts for building and running the solution. I would like to share my experience on structuring a .NET solution, containerizing it with Docker, adding HTTPS support for development and many more nice bonuses like adding code analyzers, following conventions and formatting code. As an example we will create a simple ASP.NET Core 3.1 API.

From this post you will learn:

  • How to properly structure your solution
  • How to add Git and other configuration files
  • How to create ASP.NET Core API application
  • How to containerize ASP.NET Core application
  • How to add support for HTTPS development certificate
  • How to add styling and code conventions with analyzers
  • How to make it work cross-platform in different editors and OSes (Visual Studio, Visual Code, CLI)

Structure solution and add Git with configuration files

Okay. Let's start from the beginning. I assume you have Git installed:

mkdir ninja-core
cd ninja-core
git init

I suggest structuring your solution in the following way:

  • /
    • src
      • project-1
      • project-2
    • docs
    • tests
    • build
    • deploy

src – solution source files, which include all project sources

docs – documentation on your solution. This could be diagrams containing sequence or activity flows, or just simple use cases

tests – all kinds of tests for your solution, including unit tests, integration tests, acceptance tests, etc.

build – any scripts for building your solution

deploy – scripts related to deploying your solution to different environments or localhost

The suggested solution structure of our deadly Ninja .NET Core app could look like the layout above for now.

Let's add the following files in the root folder of our project:

.gitattributes – Defines Git behavior on certain attributes-aware operations like line endings, merge settings for different file types and much more.

.gitignore – Defines patterns for files which should be ignored by Git (like binaries, tooling output, etc.). This one is adapted for Visual Studio/Code and .NET projects.

.gitlab-ci.yml – Configuration file for the GitLab pipeline (will be covered in Part 2). We would like to be sure that our code is continuously integrated and delivered.

README.md – Every well-made project should contain a readme file with instructions on how to build and run the solution and, optionally, a list of team members and responsible persons.

You can use these files as is or adapt them to your project's needs. After you have created the folder structure and added all the needed configuration files, push everything to your repository (I assume you've created one). Typically it looks something like this:

git remote add origin git@gitlab.com:username/your_repo_name.git
git add .
git commit -m "Initial commit"
git push -u origin master

Create ASP.NET Core web application

Creating an ASP.NET Core web app is really simple. Just run the following commands in the CLI:

#inside of ninja-core/src folder
mkdir Iga.Ninja.Api
cd Iga.Ninja.Api
dotnet new webapi

By default, it generates a WeatherForecastController.cs file in the Controllers folder. Because we're building a deadly ninja API, we will delete this file and instead add a simple NinjaController.cs with the following content:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

namespace Iga.Ninja.Controllers
{
    [ApiController]
    [Route("[controller]")]
    public class NinjaController : ControllerBase
    {
        private readonly ILogger<NinjaController> _logger;

        public NinjaController(ILogger<NinjaController> logger)
        {
            _logger = logger;
        }

        [HttpGet]
        public string Get() => "Go Ninjas!!!";
    }
}

Cool. Now we should be able to build and run it:

dotnet run

Open it in your browser and see it working: http://localhost:5000/ninja.

Containerize ASP.NET Core application

Since its introduction back in 2013, Docker has changed the way modern software development looks, especially in microservice-oriented architectures. You want your application to work exactly the same on your local machine, on test and on production, with all the package dependencies required for the app to run. This also helps a lot in end-to-end testing, when your application depends on external services and you would like to test the whole flow.

First, we need to create an image. In the root of the Iga.Ninja.Api folder add a Dockerfile; here is the one I use:

# Stage 1 - Build SDK image
FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build
WORKDIR /build

# Copy csproj and restore as distinct layers
COPY *.csproj ./
RUN dotnet restore

# Copy everything else and build
COPY . ./
RUN dotnet build -c Release -o ./app

# Stage 2 - Publish
FROM build AS publish
RUN dotnet publish -c Release -o ./app

# Stage 3 - Build runtime image
FROM mcr.microsoft.com/dotnet/core/aspnet:3.1
WORKDIR /app
COPY --from=publish /build/app .
ENTRYPOINT ["dotnet", "Iga.Ninja.Api.dll"]

Here we use what are known as multi-stage builds, available in Docker starting from version 17.05, so don't forget to check that you are up to date. There are two base images: one with the .NET Core SDK, which contains all the tools required for building a .NET Core application, and one with the .NET Core runtime, which is needed to run the application. We use the SDK image in the first stage to restore packages and build the application. You may notice that we have dotnet restore and dotnet build as two separate commands in the Dockerfile instead of one. That is a small trick to make building the image a bit faster.

Each command that is found in a Dockerfile creates a new layer. Each layer contains the filesystem changes of the image between the state before the execution of the command and the state after the execution of the command.

Docker uses a layer cache to optimize the process of building Docker images and make it faster.

Docker layer caching mainly works on the RUN, COPY and ADD commands.

So if the csproj file hasn't changed since the last build, the cached layer for dotnet restore will be reused. In Stage 2 we just publish the binaries built by Stage 1. Stage 3 uses the ASP.NET Core runtime image and the published artifacts from Stage 2; that will be our final image. With the last line we instruct Docker which command to execute when a new container is instantiated from that image. By the way, an ASP.NET Core application is just a console app which runs with the built-in, lightweight Kestrel web server. The preferred option if you run under IIS on Windows, though, is the in-process hosting model with the IIS HTTP Server (IISHttpServer) instead of Kestrel, which gives performance advantages.

That’s it. You can build an image and run it:

docker build -t ninja-api .
docker run --rm -d -p 8000:80 --name deadly-ninja ninja-api

Now you should be able to see a deadly ninja in action by visiting http://localhost:8000/ninja in your browser.

Congratulations! You've just containerized your web API.

Add HTTPS development certificate (with support in Docker)

So far so good. Now we would like to enforce HTTPS in our API project for development and make it work also when running in a Docker container. In order to achieve that we need to take the following steps:

  • Trust ASP.NET Core HTTPS development certificate.

When you install the .NET Core SDK, it installs a development certificate into the local user certificate store. But it is not trusted, so run this command to fix that:

dotnet dev-certs https --trust

That's already enough if we are going to run our API locally. However, if we would like to add this support in Docker, we need a few additional steps:

  • Export the HTTPS certificate into a PFX file using the dev-certs global tool to %USERPROFILE%/.aspnet/https/<>.pfx using a password of your choice

The PFX filename should correspond to your application name:

# Inside Iga.Ninja.Api folder
dotnet dev-certs https -ep %USERPROFILE%\.aspnet\https\Iga.Ninja.Api.pfx -p shinobi
  • Add the password to the user secrets in your project:

dotnet user-secrets init -p Iga.Ninja.Api.csproj
dotnet user-secrets -p Iga.Ninja.Api.csproj set "Kestrel:Certificates:Development:Password" "shinobi"

Now we are able to run our container with ASP.NET Core HTTPS development support using the following command:

docker run --rm -it -p 8000:80 -p 8001:443 -e ASPNETCORE_URLS="https://+;http://+" -e ASPNETCORE_HTTPS_PORT=8001 -e ASPNETCORE_ENVIRONMENT=Development -v %APPDATA%\microsoft\UserSecrets\:/root/.microsoft/usersecrets -v %USERPROFILE%\.aspnet\https:/root/.aspnet/https/ --name deadly-ninja-secure ninja-api

Navigate to https://localhost:8001/ninja. Now our deadly ninja is even more secure and trusted than ever.

P.S. Because Docker mounts the user secrets as a volume, it is very important to check that Docker has access rights to the required folders, so please check your Docker resource settings.

Add styling and code conventions with analyzers

When you work on a project with more than one developer, you want common conventions and agreement on how to style and format your code. It is time to add that. First, I suggest creating a solution file for our project. Although not necessary, it is very handy to have, especially if you work outside of an IDE. It will serve as a project container, and you can issue dotnet build in the /src root so that the solution file is used for the build process. Let's add a solution file and our API project:

cd ./src
dotnet new sln --name Iga.Ninja
dotnet sln add Iga.Ninja.Api/Iga.Ninja.Api.csproj

Okay. Let's move on. There are a lot of source code analyzer packages out there. For our example we will use SecurityCodeScan, SonarAnalyzer.CSharp and StyleCop.Analyzers. You can add them by running the following commands in the Iga.Ninja.Api folder:

dotnet add package SonarAnalyzer.CSharp
dotnet add package SecurityCodeScan
dotnet add package StyleCop.Analyzers

But I will suggest a different approach here. Instead of adding these packages manually to a specific project, it would be nice to have a way to automatically add them to any project we add to our solution. This is because we want to have code analyzers in each of our projects and enforce code validation on the solution build. And there is a way to do it: we need to add a Directory.Build.props file in the root of our /src folder.

Directory.Build.props is a user-defined file that provides customizations to projects under a directory.

When MSBuild runs, Microsoft.Common.props searches your directory structure for the Directory.Build.props file (and Microsoft.Common.targets looks for Directory.Build.targets). If it finds one, it imports the property.

Let's add the Directory.Build.props file. Here is the content of mine:

<Project>
  <PropertyGroup>
    <Company>NinjaCorp</Company>
    <ProductName>MessageLog</ProductName>
  </PropertyGroup>
  <!-- StyleCop Analyzers configuration -->
  <PropertyGroup>
    <SolutionDir Condition="'$(SolutionDir)'==''">$(MSBuildThisFileDirectory)</SolutionDir>
    <CodeAnalysisRuleSet>$(SolutionDir)ca.ruleset</CodeAnalysisRuleSet>
  </PropertyGroup>
  <PropertyGroup>
    <TreatWarningsAsErrors>false</TreatWarningsAsErrors>
  </PropertyGroup>
  <ItemGroup>
    <AdditionalFiles Include="$(SolutionDir)stylecop.json" Link="stylecop.json" />
    <PackageReference Include="Microsoft.CodeAnalysis.FxCopAnalyzers" Version="3.0.0">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
    <PackageReference Include="SecurityCodeScan" Version="3.5.3.0">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
    <PackageReference Include="SonarAnalyzer.CSharp" Version="8.10.0.19839">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
    <PackageReference Include="StyleCop.Analyzers" Version="1.1.118">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
  </ItemGroup>
</Project>

An attentive reader will have noticed the following line in the file:

<CodeAnalysisRuleSet>$(SolutionDir)ca.ruleset</CodeAnalysisRuleSet>

This is a reference to a code analysis rule set file, which describes the configuration of the different rules for StyleCop and the other analyzers. You don't have to agree with these rules 100%, so you can configure them. As a base I use the Roslyn Analyzer rule set with a few tweaks. You can find this rule set for our ninja-core project here. And again, you should customize it for your organization's needs. This file will be picked up each time you issue the dotnet build command on your solution, and your build will be validated against this rule set. You will see warnings in the output of your build, which you can resolve later.

The next line which you may have noticed is

<AdditionalFiles Include="$(SolutionDir)stylecop.json" Link="stylecop.json" />

This file is used to fine-tune the behavior of certain StyleCop rules and to specify project-specific text. You can find the full reference here. In our project stylecop.json looks like this:

{
  "$schema": "https://raw.githubusercontent.com/DotNetAnalyzers/StyleCopAnalyzers/master/StyleCop.Analyzers/StyleCop.Analyzers/Settings/stylecop.schema.json",
  "settings": {
    "documentationRules": {
      "companyName": "Ninja Coreporation",
      "copyrightText": "Copyright (c) {companyName}. All Rights Reserved.\r\n See LICENSE in the project root for license information.",
      "xmlHeader": false,
      "fileNamingConvention": "stylecop"
    },
    "layoutRules": {
      "newlineAtEndOfFile": "allow"
    }
  }
}

By the way, all package references and additional files described in the Directory.Build.props file will be automatically added to all projects on dotnet build/publish, without the need to add the packages to each project manually.

Last steps

Okay. Now we have a pretty decent solution which runs locally and in Docker with HTTPS support, with code analyzers in place. You can build and run it from the CLI on Windows and Linux, and you should be able to run it in VS Code or in Visual Studio 2019. Before committing changes to Git, what I like to do is format the code according to the conventions in our .editorconfig file. There is a very nice tool for that: dotnet-format. You can install it globally:

dotnet tool install -g dotnet-format

Then all you need to do is go to your project/solution folder and issue the following command:

dotnet format

This will ensure your files are formatted according to your conventions, so when you commit to Git you are good.

In the next part we will look at how to set up a CI/CD pipeline for our ninja-core web API project, using GitLab infrastructure as an example.

You can find the sample for this article on my GitLab: https://gitlab.com/dnovhorodov/ninjacore

Happy coding and stay tuned.