
Microservices architecture benefits and business value

April 17, 2017 — by Samuele Resca


Microservices are small, autonomous services that work together.
—Sam Newman, Thoughtworks

So, what is a service?
A service is a piece of software that…

  • is responsible for holding, processing, and distributing a particular kind of information within the scope of the system;
  • is deployed and runs independently;
  • communicates with consumers and other services, presenting information using conventions and contracts;
  • handles failure conditions.

A system is a federation of services aiming to provide a solution for a well-defined scope. The solution scope may be motivated by business, technology, legal, or other criteria.

Understanding Microservices

There are three main concepts behind microservices:

  • The approach is ideal for big systems: the size of a system is relative, so what does “big” mean? Here, big does not refer to the physical size of the system, but to its need to scale;
  • The architecture is goal-oriented: microservice architecture isn’t about identifying a specific collection of practices; rather, it is an acknowledgment that software professionals are trying to reach a similar goal using a particular approach;
  • Microservices are focused on replaceability and loose coupling: replacing a component is better than maintaining and changing it.

Microservices characteristics

Microservice applications share some important characteristics. First of all, they are small in size and decentralized. Decentralized means that your application will no longer be managed by a central body.

Microservices are also autonomously developed and independently deployable. Each team develops a single microservice and manages its deployment. Less centralization results in fewer bottlenecks and less resistance to change, while more autonomy means that changes and decisions can be made much more quickly.

Adopting microservices

Microservices architecture does not refer to a particular set of technologies, processes, or tools. Instead, you will need to stay focused on the goal itself. The real value of microservices is realised when we focus on two key aspects, speed and safety, and find an effective balance between them at scale.

The desire for speed is a desire for immediate change and, ultimately, a desire for adaptability. Speed of change gets a lot of attention in case studies about microservices architecture, but another important keyword is safety of change. After all, “speed kills”, and in most software shops nobody wants to be responsible for breaking production. Finally, another aim of the microservices architecture is to solve the problems that arise when software gets too big.

Microservices architecture benefits

The microservice architectural style was defined based on common patterns observed across a number of pioneering organizations. These organizations did not consciously implement a microservices architecture. They evolved to it in pursuit of specific goals.

Adrian Trenaman, Senior Vice President of Engineering at Gilt, talks about microservices architecture in this article. Microservices have given Gilt the following benefits:

  • Lessens dependencies between teams – resulting in faster code to production;
  • Allows lots of initiatives to run in parallel;
  • Supports multiple technologies/languages/frameworks;
  • Enables graceful degradation of service;
  • Promotes ease of innovation through ‘disposable code’ – it is easy to fail and move on;

Some services require high availability, but are low volume, and it is the opposite for other services. A microservice approach allows us to tune for both of these situations, whereas in a monolith it’s all or nothing.

Deriving business value

The speed aspect of microservices architecture provides the following business value:

  • Agility allows organizations to deliver new products, functions, and features more quickly;
  • Composability reduces development time and provides a compound benefit through reusability over time;
  • Comprehensibility of the software system simplifies development planning, increases accuracy, and allows new resources to come up to speed more quickly;
  • Independent deployability of components gets new features into production more quickly and provides more flexible options for piloting and prototyping;
  • Organizational alignment of services to teams reduces ramp-up time and encourages teams to build more complex products and features iteratively;
  • Polyglotism: permits the use of the right tools for the right task, thus accelerating technology introduction and increasing solution options;

The safety aspect described earlier provides the following business value:

  • Greater efficiency in the software system reduces infrastructure costs and reduces the risk of capacity-related service outages;
  • Independent manageability contributes to improved efficiency, and also reduces the need for scheduled downtime;
  • Replaceability of components reduces the technical debt that can lead to aging, unreliable environments;
  • Stronger resilience and higher availability ensure a good customer experience;
  • Better runtime scalability allows the software system to grow or shrink with the business;
  • Improved testability allows the business to mitigate implementation risks;

Final thought

In conclusion, we have introduced some of the concerns that first-time implementers often have. We also introduced the microservices architecture process, a goal-driven approach to building adaptable, reliable software. The balance of speed and safety at scale is key to understanding the essence of microservices.

For more information about microservices architecture:

Microservices Resource Guide – Martin Fowler

Microservice Architecture: Aligning Principles, Practices, and Culture – Book

Microservices Design and Patterns – Clemens Vasters


Unit testing ASP.NET Core Identity

March 27, 2017 — by Samuele Resca


ASP.NET Core Identity is a membership system which allows you to add login functionality to your application. Users can create an account and log in with a user name and password, or they can use an external login provider such as Facebook, Google, Microsoft Account, Twitter, and more.

In the following article, you will learn how to implement and unit test ASP.NET Core Identity.

You can configure ASP.NET Core Identity to use a SQL Server database to store user names, passwords, and profile data. Alternatively, you can keep the data in another persistent store of your own. The following article will use SQL Server as the data source engine.

The project described in the article will also use OpenIddict to implement token authentication: OpenIddict aims at providing a simple and easy-to-use solution to implement an OpenID Connect server in any ASP.NET Core application.

Setup Project

The following article will add ASP.NET Core Identity to the sample project used in Implementing SOLID REST API using ASP.NET Core.

In order to use OpenIddict, add the appropriate MyGet repositories to your NuGet sources. This can be done by adding a new NuGet.Config file at the root of your solution:
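The original code snippet is not reproduced here, but the NuGet.Config would look roughly like this (a sketch; the exact MyGet feed URL depends on the OpenIddict version you target):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- default NuGet feed -->
    <add key="NuGet" value="https://api.nuget.org/v3/index.json" />
    <!-- aspnet-contrib MyGet feed hosting the OpenIddict pre-release packages -->
    <add key="aspnet-contrib" value="https://www.myget.org/F/aspnet-contrib/api/v3/index.json" />
  </packageSources>
</configuration>
```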

In order to use ASP.NET Core Identity and OpenIddict add the following packages to your project:
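With the csproj project format, the package references might look like the following (version numbers are illustrative for the ASP.NET Core 1.x timeframe, not prescriptive):

```xml
<ItemGroup>
  <PackageReference Include="Microsoft.AspNetCore.Identity.EntityFrameworkCore" Version="1.1.*" />
  <PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="1.1.*" />
  <PackageReference Include="OpenIddict" Version="1.0.0-*" />
  <PackageReference Include="OpenIddict.EntityFrameworkCore" Version="1.0.0-*" />
</ItemGroup>
```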

Setup Authentication

Create a new Startup.Auth.cs file which will contain the setup of authentication:

The Startup.Auth.cs file contains the partial Startup class and initialises the identity environment:

  • adds the IdentityDbContext to the application services;
  • maps the AppUser model class as the identity user class;
  • configures the use of OpenIddict.
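These three steps can be sketched as follows (a rough outline only: the OpenIddict fluent API has changed between versions, and names such as AppUser and the connection-string key are illustrative):

```csharp
// Startup.Auth.cs — partial Startup class holding the authentication setup.
public partial class Startup
{
    public void ConfigureAuthentication(IServiceCollection services)
    {
        // Register the identity context and point it at SQL Server.
        services.AddDbContext<IdentityDbContext<AppUser>>(options =>
        {
            options.UseSqlServer(Configuration["Data:DefaultConnection:ConnectionString"]);
            options.UseOpenIddict();
        });

        // Map AppUser as the identity user class.
        services.AddIdentity<AppUser, IdentityRole>()
            .AddEntityFrameworkStores<IdentityDbContext<AppUser>>()
            .AddDefaultTokenProviders();

        // Configure OpenIddict to issue tokens via the password flow.
        services.AddOpenIddict()
            .AddEntityFrameworkCoreStores<IdentityDbContext<AppUser>>()
            .EnableTokenEndpoint("/connect/token")
            .AllowPasswordFlow();
    }
}
```
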
Retrieve data from data source

The following schema shows the API implementation, from Data access layer to the API layer:


In order to retrieve user data from the data source, the application uses four key components:

  • AppUser defines the user data source model;
  • UserRepository connects the service classes to the data source. It uses DbContext to retrieve information from the database;
  • UserService aggregates different providers: UserValidator, PasswordValidator, SignInManager. It is used by the UsersController to obtain information from the database;
  • UsersController handles HTTP requests from the client and retrieves information about users.

The following code shows the implementation of the UsersController. The UserService and UserRepository are available on GitHub.
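In outline, the controller looks roughly like this (a sketch: the IUserService member names are illustrative; the real code is in the linked repository):

```csharp
// UsersController — handles HTTP requests about users and delegates
// data access to the injected user service.
[Route("api/[controller]")]
public class UsersController : Controller
{
    private readonly IUserService _userService;

    public UsersController(IUserService userService)
    {
        _userService = userService;
    }

    // GET api/users/{email} — returns the user matching the given email.
    [HttpGet("{email}")]
    public async Task<IActionResult> Get(string email)
    {
        var user = await _userService.GetByEmailAsync(email);
        if (user == null)
            return NotFound();
        return Ok(user);
    }
}
```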


Unit test User APIs

Obviously, we need to cover the UsersController with unit tests. The following project will use xUnit as test runner and Moq as mocking framework.

Firstly, the UsersControllerTests defines two fake classes: FakeUserManager and FakeSignInManager, which will be used by the mocking framework:
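The fake classes exist because UserManager<T> and SignInManager<T> have no parameterless constructors, so Moq cannot proxy them directly. Their shape is roughly the following (a sketch; the exact base-constructor signatures depend on the ASP.NET Core Identity version):

```csharp
// FakeUserManager supplies a callable base constructor so Moq can subclass it.
public class FakeUserManager : UserManager<AppUser>
{
    public FakeUserManager()
        : base(new Mock<IUserStore<AppUser>>().Object,
               null, null, null, null, null, null, null, null)
    { }
}

// FakeSignInManager does the same for SignInManager<AppUser>.
public class FakeSignInManager : SignInManager<AppUser>
{
    public FakeSignInManager()
        : base(new FakeUserManager(),
               new Mock<IHttpContextAccessor>().Object,
               new Mock<IUserClaimsPrincipalFactory<AppUser>>().Object,
               null, null)
    { }
}
```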

In order to mock ASP.NET Core Identity, create a new test server which will resolve the application dependencies:

Finally, we need to mock our FakeUserManager and FakeSignInManager classes by using Moq. The mocking is implemented in the constructor (setup) of the UsersControllerTests class:
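The constructor setup would look roughly like this (a sketch; the member names being overridden are illustrative, the full setup is in the repository):

```csharp
// Test-class constructor: Moq overrides only the members the controller uses.
public UsersControllerTests()
{
    var userManager = new Mock<FakeUserManager>();
    userManager.Setup(m => m.FindByEmailAsync(It.IsAny<string>()))
               .ReturnsAsync(new AppUser { Email = "test@test.com" });

    var signInManager = new Mock<FakeSignInManager>();
    signInManager.Setup(m => m.PasswordSignInAsync(
                   It.IsAny<string>(), It.IsAny<string>(), false, false))
                 .ReturnsAsync(SignInResult.Success);

    _controller = new UsersController(
        new UserService(userManager.Object, signInManager.Object));
}
```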


Conclusion

In conclusion, ASP.NET Core Identity is the out-of-the-box membership framework provided by ASP.NET Core. This article has shown how to test the behaviour of user authentication; you can find the complete project on GitHub.

Cover picture by Corrado Zeni.

Cheers 🙂


Implementing SOLID REST API using ASP.NET Core

February 28, 2017 — by Samuele Resca


The following article shows how to implement a SOLID REST API using ASP.NET Core. The solution uses the generic repository pattern to perform CRUD operations on the database, and xUnit as test runner.

The solution will contain three key namespaces:

  • Data access: it implements the domain model and the relationships between entities. It also contains the data context;
  • Domain logic: it implements the repositories and services used by our web APIs;
  • API: it implements the controllers and middleware that manage incoming requests.

The project is available on GitHub.

Project overview

Here is a brief schema of the project structure:


The Blog.Turnmeup.DAL project represents the Data access namespace, the Blog.Turnmeup.DL project represents the Domain logic namespace, and finally the Blog.Turnmeup.API project exposes the REST APIs.

Project Testing

The key parts of the solution are covered by unit tests and integration tests. Test projects will use Moq as mocking library and xUnit as test runner. All tests are contained in two namespaces:

  • DL.Tests: it will contain domain logic tests;
  • API.Tests: it will contain APIs tests;

Data access using Entity Framework

Data access layer uses Entity Framework Core as ORM.

The project (Blog.Turnmeup.DAL) contains the entity classes which describe the model data. The namespace also defines the BaseEntity model class, which contains some common attributes and is extended by the other entity classes. The Domain logic namespace references these models to retrieve data.
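For illustration, the base class and a derived entity might look like this (a sketch; the exact attribute set and entity names are assumptions, see the repository for the real classes):

```csharp
// BaseEntity — common attributes shared by every entity in the model.
public abstract class BaseEntity
{
    public int Id { get; set; }
    public DateTime CreatedOn { get; set; }
    public DateTime UpdatedOn { get; set; }
}

// A concrete entity simply extends BaseEntity with its own fields.
public class Course : BaseEntity
{
    public string Name { get; set; }
    public string Description { get; set; }
}
```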

Here is a schema of the project:


In order to generate the database schema, you need to run the following command inside the project directory:

dotnet ef database update

Domain Logic layer overview

This project (Blog.Turnmeup.DL) implements the Domain logic layer. It uses the Generic repository pattern combined with a generic service class. Here is an overview of the project:

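The combination of a generic repository and a generic service has roughly this shape (a sketch; member names are illustrative, the real interfaces are in the repository):

```csharp
// Generic repository: one implementation serves every entity type.
public interface IBaseRepository<T> where T : BaseEntity
{
    IQueryable<T> GetAll();
    T GetById(int id);
    void Insert(T entity);
    void Update(T entity);
    void Delete(T entity);
}

// Generic service: wraps the repository and hosts the domain logic.
public class BaseService<T> where T : BaseEntity
{
    private readonly IBaseRepository<T> _repository;

    public BaseService(IBaseRepository<T> repository)
    {
        _repository = repository;
    }

    public IQueryable<T> GetAll() => _repository.GetAll();
    public T GetById(int id) => _repository.GetById(id);
}
```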

Unit testing

The Domain logic layer is covered by unit tests. The project Blog.Turnmeup.DL.Tests defines the unit tests inside the CourseServiceTests class. Here is the source code of tests:
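A single test case gives the flavour of the class (a sketch; the mocked members and the Course entity are illustrative, the full suite is in the repository):

```csharp
// CourseServiceTests — the repository is mocked with Moq and the
// service under test is exercised directly.
public class CourseServiceTests
{
    [Fact]
    public void GetById_ReturnsTheExpectedCourse()
    {
        var repository = new Mock<IBaseRepository<Course>>();
        repository.Setup(r => r.GetById(1))
                  .Returns(new Course { Id = 1, Name = "Test course" });

        var service = new BaseService<Course>(repository.Object);

        var course = service.GetById(1);

        Assert.Equal(1, course.Id);
    }
}
```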

Model definition

The Domain logic layer also defines the BaseResponseModel class. Blog.Turnmeup.API uses BaseResponseModel to serialize data through the HTTP response, and it uses AutoMapper to map the Data access layer models to the Domain logic models.

API namespace overview

The Blog.Turnmeup.API project defines the controller classes which return data over HTTP. The CourseController handles incoming HTTP requests and calls the Domain logic layer to retrieve or update data. Here is a schema overview of the project:


Dependency injection and class mapping

The Blog.Turnmeup.API project uses dependency injection to bind interfaces to concrete classes. It also uses the AutoMapper package to map Blog.Turnmeup.DAL entities to Blog.Turnmeup.DL response models. Dependency injection is managed in the Startup.cs file, and class mapping is managed by the Infrastructure.Mapper class:
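The two pieces fit together roughly like this (a sketch; the exact registrations and type names are illustrative, the real configuration is in the repository):

```csharp
// Startup.cs — binding interfaces to their concrete implementations.
public void ConfigureServices(IServiceCollection services)
{
    services.AddScoped(typeof(IBaseRepository<>), typeof(BaseRepository<>));
    services.AddScoped(typeof(IBaseService<>), typeof(BaseService<>));
    services.AddMvc();
}

// Infrastructure/Mapper.cs — class mapping between DAL entities and
// DL response models, using AutoMapper's static configuration.
public static class Mapper
{
    public static void Configure()
    {
        AutoMapper.Mapper.Initialize(cfg =>
            cfg.CreateMap<Course, BaseResponseModel>());
    }
}
```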

Unit testing

The REST API layer is covered by unit tests. The project Blog.Turnmeup.API.Tests defines the unit tests inside the CourseControllerTests class. Tests use the TextFixture class to initialize a test server and retrieve the dependency services:

Final thought

You can find the code on GitHub. The project can be extended with other model classes and controllers. The next article covers the authentication part using ASP.NET Core Identity:

Unit testing ASP.NET Core Identity

Cover picture by Corrado Zeni.

Cheers 🙂