Power of Simplicity: A Quick Tour of the dotnet CLI / How to Test ASP.NET Core Web API
.NET Core eMag Issue 68 - Jan 2019
GENERAL FEEDBACK [email protected]
ADVERTISING [email protected]
EDITORIAL [email protected]
There was a movement inside Microsoft and in the .NET community around the early 2010s to bring the next generation of the framework to developers. One need was to shed the legacy technology that had been accumulating in the .NET Framework since the beginning. Another was to allow the .NET Framework to run on the three major operating systems as well as in the cloud. The final push came after Satya Nadella was promoted to CEO of Microsoft in 2014, when the open-source wave was sweeping through Redmond with the Roslyn C# compiler and the acquisition of Xamarin.
What was created is .NET Core, which allowed the team inside Microsoft to learn from the past with the .NET Framework, to draw on the new ideas in software development since 2000, and finally to open-source the .NET Core code to all developers. We now have a rich platform and set of APIs with performance and efficiency that rival many of the other application frameworks in the open-source world. We are just at the start of the .NET Core story, but it is one with great minds already pushing it to great heights.
I had a great experience working on the two series of articles in this eMag, as well as with the authors of all of the pieces. They are all great community members as well as friends that I have grown to know better during the process of creating the series. Thanks to all of them for giving their time, knowledge, and wisdom. I want to thank the entire InfoQ staff for their support and patience over the last 16 months it took to publish all of these articles online. I hope you enjoy the articles as much as I have and that they give you the confidence to dive into the .NET Core stack of technologies and create the next great apps.
CONTRIBUTORS
Chris “Woody” Woodruff
has been developing and designing software solutions for over 20 years and has worked with many different platforms and tools. He is a community leader, helping with events such as GR DevNight, GR DevDay, West Michigan Day of .NET, and CodeMash. He has been a Microsoft MVP in Visual C#, Data Platform, and SQL, and was recognized in 2010 as one of the top-20 MVPs worldwide. Woodruff is a developer advocate for JetBrains and evangelizes .NET, .NET Core, and JetBrains’ products in North America.
To reduce memory usage, C# 7 and VB.NET 15 added a language feature to return multiple values from a method. Here’s a before and after:

// Before:
private Tuple<string, int> GetNameAndAge()
{
    return new Tuple<string, int>("Maarten", 33);
}

// After:
private (string, int) GetNameAndAge()
{
    return ("Maarten", 33);
}

In the first case, we are allocating a Tuple<string, int>. While the effect will be negligible in this example, the allocation is done on the managed heap and the Garbage Collector (GC) will have to clean it up at some point. In the second case, the compiler-generated code uses the ValueTuple<string, int> type, which is itself a struct and is created on the stack — giving us access to the two values we want to work with while making sure the GC won’t have to work on the containing data structure.
// Before:
.method private hidebysig instance class [System.Runtime]System.Tuple`2<string, int32>
    GetNameAndAge() cil managed
{
    // ...
}

// After:
.method private hidebysig instance valuetype [System.Runtime]System.ValueTuple`2<string, int32>
    GetNameAndAge() cil managed
{
    // ...
}
We can clearly see that the first example returns an instance of a class and the second example returns an instance of a value type. The class is allocated on the managed heap (tracked and managed by the Common Language Runtime (CLR), subject to garbage collection, mutable), whereas the value type is allocated on the stack (fast, less overhead, immutable). In short, System.ValueTuple itself is not tracked by the CLR and merely serves as a simple container for the embedded values we care about.
That is shorter, but we have benefits beyond that. Due to the way that Span<T> is implemented, our method does not return a copy of the source data. It instead returns a Span<T> that refers to a subset of our source. In the example of splitting an HTTP request into headers and body, we’d have three Span<T> instances: one for the incoming HTTP request, one pointing to the original data’s header, and another pointing to the request body. The data would be in memory only once (the data from which the first Span<T> is created); everything else would just point to slices of the original. There’s no duplicate data and no overhead in copying and duplicating data.
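As a concrete illustration (not code from the article), slicing a buffer with Span<T> might look like the sketch below; the buffer contents and the header length are made up for the example:

using System;

class SpanSlicingExample
{
    static void Main()
    {
        // pretend this buffer holds a raw HTTP request
        byte[] requestBuffer = new byte[1024];
        int headerLength = 256;  // hypothetical end of the headers

        ReadOnlySpan<byte> request = requestBuffer;                  // spans the whole buffer
        ReadOnlySpan<byte> headers = request.Slice(0, headerLength); // a view of the headers
        ReadOnlySpan<byte> body = request.Slice(headerLength);       // a view of the body

        // all three spans share the same underlying memory; no bytes were copied
        Console.WriteLine($"{request.Length} total, {headers.Length} header, {body.Length} body bytes");
    }
}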
Conclusion
With .NET Core and its faster release cycle, Microsoft and the open-source community can progress faster on new features related to performance. We have seen a lot of work go into improving existing code and constructs in the framework, such as improving LINQ’s .ToList() method.
Faster cycles and easier upgrades also speed development of new improvements to .NET Core performance, such as types like System.ValueTuple and Span<T> that make it more natural for .NET developers to use the different types of memory we have available in the runtime while avoiding their pitfalls.
Recently, folks who have either been hesitant or unable to switch off the older, full .NET Framework have asked me about the advantages of moving to .NET Core. In reply, I’ll mention the better performance, the improved csproj file format, the improved testability of ASP.NET Core, and that it is cross-platform.
As the author of several OSS tools (including Marten, StructureMap, and Alba), the single biggest advantage to me personally has been the advent of the dotnet command-line interface (CLI). Used in conjunction with the new .NET SDK csproj file format, the dotnet CLI tooling has made it far easier for me to create and maintain build scripts for my projects. It’s easier to run tests in build scripts, easier to both consume and publish NuGet packages, and the CLI extensibility mechanism is fantastic for incorporating custom executables distributed through NuGet packages into automated builds.

To get started with the dotnet CLI, let’s first install the .NET SDK on our development machine. Once that’s done, there’s a couple of helpful things to remember:

• The dotnet tools are globally installed in our PATH and are available anywhere in our command-line prompts.
Since I just happen to be building this in a small Git repository, after adding any new files with git add, I use git status to see what has been newly created:
Now, to add the new project to our empty solution file, we can use the
dotnet sln command like this:
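The command itself appeared as a screenshot in the original; assuming the solution file and project are named HeyWorld.sln and HeyWorld, it would look something like this:

dotnet sln HeyWorld.sln add HeyWorld/HeyWorld.csproj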
We’ve now got the shell of a new ASP.NET Core API service without ever
having to open Visual Studio.NET (or JetBrains Rider, in my case). To go
a little farther and start our testing project before we write any actual
code, issue these commands:
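The screenshots aren’t reproduced here, but based on the description that follows, the commands would be along these lines:

dotnet new xunit -o HeyWorld.Tests
dotnet sln HeyWorld.sln add HeyWorld.Tests/HeyWorld.Tests.csproj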
These commands create a new project using xUnit.NET as the test library
and add that new project to our solution file. The test project needs a
project reference to the HeyWorld project, and fortunately we can add a
project reference with the nifty dotnet add tool like so:
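A representative form of that command (paths assumed from the project layout above):

dotnet add HeyWorld.Tests/HeyWorld.Tests.csproj reference HeyWorld/HeyWorld.csproj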
Next, I want to add at least one more NuGet reference to the testing project called Alba.AspNetCore2, which I’ll
use to author HTTP contract tests against the new web application:
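For example (no version is pinned here, so the latest package would be used):

dotnet add HeyWorld.Tests/HeyWorld.Tests.csproj package Alba.AspNetCore2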
Before working with the code, let’s make sure everything compiles just fine by issuing this command to build all
the projects in our solution at the command line:
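For example:

dotnet build HeyWorld.sln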
And, ugh, that didn’t compile because of a diamond-dependency version conflict between Alba.AspNetCore2
and the versions of the ASP.NET Core NuGet references in the HeyWorld project. No worries though, because
that’s easily addressed by fixing the version dependency of the Microsoft.AspNetCore.All NuGet in the
testing project like this:
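Based on the description below, the command would look like this:

dotnet add HeyWorld.Tests/HeyWorld.Tests.csproj package Microsoft.AspNetCore.All --version 2.1.2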
In the example above, using the --version flag with the value “2.1.2” will fix the reference to exactly that version instead of just using the latest version found from our NuGet feeds.
To double-check that our NuGet dependency problems have all gone away, we can use the commands shown below to do a quicker check than we’d get by recompiling everything:
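Given the two commands described next, that check would look something like:

dotnet clean HeyWorld.sln
dotnet restore HeyWorld.sln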
As an experienced .NET developer, I’m paranoid about lingering files in the temporary /obj and /bin folders. I use
the “Clean Solution” command in Visual Studio.NET any time I try to change references, just in case something is
left behind. The dotnet clean command does the exact same thing from a command line.
Likewise, the dotnet restore command will try to resolve all known NuGet dependencies of the solution file I specified. In this case, using dotnet restore will let us quickly spot any potential conflicts or missing NuGet references without having to do a complete compilation — and that’s the main way I use this command in my own work. In the latest versions of the dotnet CLI, NuGet resolution is done automatically (though we can override that behavior with a flag) in calls to dotnet build/test/pack/etc. that would otherwise first require NuGet packages to be restored.
using System.Threading.Tasks;
using Alba;
using Xunit;

namespace HeyWorld.Tests
{
    public class verify_the_endpoint
    {
        [Fact]
        public async Task check_it_out()
        {
            using (var system = SystemUnderTest.ForStartup<Startup>())
            {
                await system.Scenario(s =>
                {
                    s.Get.Url("/");
                    s.ContentShouldBe("Hey, world.");
                    s.ContentTypeShouldBe("text/plain; charset=utf-8");
                });
            }
        }
    }
}
The resulting file should be saved in the HeyWorld.Tests directory with an appropriate name such as verify_the_
endpoints.cs.
Without getting too much into the Alba mechanics, this just specifies that the home route of our new HeyWorld
application should write “Hey, world.” We haven’t actually coded anything real in our HeyWorld application, but
let’s still run this test to see if it’s wired correctly and fails for the right reason.
Back in the command line, we can run all of the tests in the testing project with this command:
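A representative form of the command (matching the one referenced later in the article):

dotnet test HeyWorld.Tests/HeyWorld.Tests.csproj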
To sum up that output, one test was executed and it failed. We also see the standard xUnit output that gives us
some information about why the test failed. It’s important to note here that the dotnet test command will
return an exit code of zero, denoting success, if all the tests pass and a non-zero exit code, denoting failure, if
any test fails. This is important for CI scripting where most CI tools use the exit code of any commands to know
when the build has failed.
I’m going to argue that the test above failed for the “right” reason, meaning that the test harness seems to be able to bootstrap the real application, and I expected a 404 response because nothing has been coded yet. Moving on, let’s implement an MVC Core endpoint for the expected behavior:
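The endpoint code itself was shown as an image; a minimal sketch of what the text describes (the class and method names are assumptions, not the article’s exact code) could be:

public class HomeController : Controller
{
    [HttpGet("/")]
    public string Get()
    {
        // returning a string from an action produces a text/plain; charset=utf-8 response
        return "Hey, world.";
    }
}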
(Note that the previous code should be appended as an additional class in our HeyWorld\startup.cs file.)
Returning to the command line, let’s run the previous dotnet test HeyWorld.Tests/HeyWorld.Tests.csproj command again and hopefully we’ll see results like this:
To test out our new application now that it’s running, just navigate in a browser like so:
Note that we’re assuming all commands are being executed with the current directory set to the solution root
folder. If the current directory is a project directory and there is only one *.csproj file in that directory, we can
just type dotnet run.
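For reference, from the solution root that would be something like (the default Kestrel address is an assumption):

dotnet run --project HeyWorld/HeyWorld.csproj

and then browse to http://localhost:5000.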
Now that we have a tested web API application, let’s take the next step and put HeyWorld into a Docker image.
Using the standard template for dockerizing a .NET Core application, we’ll add a Dockerfile to our HeyWorld
project with this content:
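Since the Dockerfile contents were shown as an image, here is a representative version based on the standard .NET Core 2.1 template the text mentions (image tags and paths are assumptions):

# build stage: restore and publish with the SDK image
FROM microsoft/dotnet:2.1-sdk AS build
WORKDIR /app
COPY . .
RUN dotnet restore
RUN dotnet publish -c Release -o out

# runtime stage: copy the published output into the smaller runtime image
FROM microsoft/dotnet:2.1-aspnetcore-runtime
WORKDIR /app
COPY --from=build /app/out .
ENTRYPOINT ["dotnet", "HeyWorld.dll"]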
As this article is just about the dotnet CLI, I want to focus on the two usages of the dotnet CLI within the Dockerfile:
Note that we had to explicitly tell dotnet publish to compile with the Release configuration through the usage of the -c Release flag. Any dotnet CLI command that compiles code (build, publish, or pack, for example) will by default build assemblies with the Debug configuration. Watch out for this behavior and remember to specify -c Release or --configuration Release if you are publishing a NuGet package or an application that is meant for production usage. You’ve been warned.
Just to complete the circle, we can now build and deploy our little HeyWorld application through Docker with these commands:
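The commands, matching the description that follows (the port mapping is an assumption based on the URL given below):

docker build -t heyworld .
docker run -d -p 8080:80 --name myapp heyworld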
The first command builds and locally publishes a Docker image named “heyworld” for our application. The second command actually runs our application as a Docker container named “myapp”. You can verify this by sending your browser to http://localhost:8080.
Summary
The dotnet CLI makes automating and scripting builds on .NET projects simple,
especially compared to the state of the art in .NET a decade or so ago. In many
cases, you may even eschew any kind of task-based build-script tool (Cake, Fake,
Rake, Psake, etc.) in favor of simple shell scripts that just delegate to the dotnet
CLI. Moreover, the dotnet CLI extensibility model makes it possible to incorporate
external .NET-authored command-line applications distributed via NuGet into your
automated builds.
Installing Couchbase
The first step is to get the distributed-cache server running. Choose the installation method that’s most convenient for you. You can use Docker or a cloud provider, or you can install it on your local machine (which is what I did for this article). Couchbase Server is a free download, and you can use the free Couchbase Community edition or the Enterprise Edition. (The Enterprise Edition is free and unlimited for pre-production use.) I’ll be using the Community edition here.

When you install Couchbase, you’ll open up your web browser and go through a short wizard as shown in figure 1. The default settings are fine for these examples.

Once you’ve installed Couchbase, create a data bucket. This is where you will store your cached data. I called my bucket “infoqcache”. I created an “Ephemeral” bucket (which is a memory-only option). You can also use a “Couchbase” bucket (which will store the data in memory first and persist to disk asynchronously) - see figure 2.

The last step in setting up Couchbase is security. Add a Couchbase user with appropriate permissions to that bucket. I called my user “infoq” and gave it a password of “password” (please use something stronger in production!). The Enterprise Edition offers a lot of roles to choose from, but we don’t need them for this simple use case. “Bucket Full Access” for infoqcache is enough. (Figure 3)

Make sure you’ve completed all these installation steps before moving on to ASP.NET Core. Here are the steps with links to more detailed documentation:

1. Install Couchbase (follow the instructions on the downloads page).
2. Create a bucket (I called mine “infoqcache”).
3. Create a user with appropriate permissions on that bucket (I called mine “infoq”).
ASP.NET Core and Couchbase integration
We now have an ASP.NET Core application that needs caching and a Couchbase Server instance that wants to help out. Let’s get them to work together.

The first step is to install a package from NuGet. You can use the NuGet UI to search for Couchbase.Extensions.Caching, or you can run this command in the Package Manager Console: Install-Package Couchbase.Extensions.Caching -Version 1.0.1. This is an open-source project and the full source code is available on GitHub.

NuGet will install all the packages you need for your ASP.NET Core application to talk to Couchbase Server and to integrate with ASP.NET Core’s built-in distributed-caching capabilities.

Now open up the Startup.cs file in the project. You will need to add some setup code to the ConfigureServices method here.

services.AddCouchbase(opt =>
{
    opt.Servers = new List<Uri>
    {
        new Uri("http://localhost:8091")
    };
    opt.Username = "infoq";
    opt.Password = "password";
});

services.AddDistributedCouchbaseCache("infoqcache", opt => { });

(I also added using Couchbase.Extensions.Caching; and using Couchbase.Extensions.DependencyInjection; at the top of the file, but I use ReSharper to identify and add those for me automatically.)

In the above code, AddCouchbase and AddDistributedCouchbaseCache are extension methods that add to the built-in ASP.NET Core IServiceCollection interface.

With AddCouchbase, I’m telling ASP.NET Core how to connect to Couchbase, giving it the user and password that I chose earlier.
public class ValuesController : Controller
{
    private readonly IDistributedCache _cache;

    public ValuesController(IDistributedCache cache)
    {
        _cache = cache;
    }

    [Route("api/get")]
    public string Get()
    {
        // generate a new string
        var myString = Guid.NewGuid() + " " + DateTime.Now;
        return myString;
    }

    [Route("api/getfast")]
    public string GetUsingCache()
    {
        // is the string already in the cache?
        var myString = _cache.GetString("CachedString1");
        if (myString == null)
        {
            // string is NOT in the cache: generate it and store it for next time
            myString = Guid.NewGuid() + " " + DateTime.Now;
            _cache.SetString("CachedString1", myString);
        }
        return myString;
    }
}
In the above example, the cached data will live in the cache indefinitely. But you can also specify an expiration for the cache. In the example below, the endpoint will cache data for five minutes (on a sliding expiration).
_cache.SetString("CachedString1", myString,
    new DistributedCacheEntryOptions { SlidingExpiration = TimeSpan.FromMinutes(5) });
Summary
ASP.NET Core can work hand in hand with Couchbase Server for distributed caching. ASP.NET Core’s standard distributed-cache interface makes it easy for you to start working with the cache. Next, get your ASP.NET Core distributed applications up to speed with caching.
Before we can deploy our .NET Core apps into Azure, we need to set up an application host or runtime in Azure. There are numerous ways we can deploy infrastructure and services in Azure. The easiest way to get started is using the Azure Portal. From the portal, we can find the services we need in the marketplace and go through a series of guided questions to configure and deploy these services. Once the VM is in a running state, we can remotely manage and configure it by using Remote Desktop if it is running Windows or using SSH if it’s running Linux.

If you’re a fan of DevOps like me, you probably like to script and automate as much as you can so it’s repeatable and streamlined. Azure Resource Manager (ARM) templates allow us to automate the service deployment in Azure. ARM templates are simply JSON files that define the resources that we want to deploy and their relationships to each other. These ARM templates are very popular, and there is a GitHub repo that contains hundreds of pre-built templates for lots of services, platforms, and configurations.
In addition to deploying and configuring Azure services, we can use ARM templates to configure the OS and install other dependencies using VM extensions. For example, if we are setting up a web server on Ubuntu Linux, we will need to deploy the Ubuntu Linux VM and then deploy a web server like Apache. Using the Custom Script Extension, we can execute our custom script after the VM has finished deploying. These custom scripts can do things like install other services and application servers like Apache and PHP. We can see an example of an ARM template that deploys an Ubuntu Linux server with Apache in the Azure Quickstart Templates repo at GitHub. In the rendered README.md file there, we can click the Deploy to Azure button as shown in Figure 1 to start the deployment of the selected template into our Azure subscription. Once we have a web server, we can deploy our ASP.NET Core apps and run them in Azure.
To get started, I created a new project in Visual Studio and selected the Web
category and the ASP.NET Core Web Application template as shown in Figure
2.
After creating the project, I added a model class that defines the properties for a to-do-list item using the code shown in Figure 3. I kept it pretty lightweight and only created properties for the ID and name of the to-do-list item and a Boolean to track if the item is completed.
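Figure 3 isn’t reproduced here, but a sketch of the model it describes (property names are assumptions) would be:

public class TodoItem
{
    public long Id { get; set; }
    public string Name { get; set; }
    public bool IsComplete { get; set; }
}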
public class TodoItemRepository : ITodoItemRepository
{
    private readonly TodoContext _context;

    // the constructor signature is reconstructed; the excerpt began mid-class
    public TodoItemRepository(TodoContext context)
    {
        _context = context;

        // seed the store with an initial item if it is empty
        if (!_context.TodoItems.Any())
        {
            _context.TodoItems.Add(new TodoItem { Name = "Item1" });
            _context.SaveChanges();
        }
    }
    // ...
}
[Produces("application/json")]
[Route("api/Todo")]
public class TodoController : Controller
{
    private ITodoItemRepository _repository;

    public TodoController(ITodoItemRepository repository)
    {
        // the repository implementation is supplied by ASP.NET Core dependency injection
        _repository = repository;
    }

    [HttpGet]
    public IEnumerable<TodoItem> Get()
    {
        return _repository.Get();
    }

    [HttpPost]
    public void Post([FromBody]TodoItem value)
    {
        _repository.Add(value);
    }

    [HttpPut("{id}")]
    public void Put(int id, [FromBody]TodoItem value)
    {
        _repository.Update(value);
    }
}
Now that I have completed the classes that implement the to-do-list web API, I need to configure a couple of things for the web API to work. When I created the web API controller implementation, I mentioned that I’m using the ITodoItemRepository, but after reviewing the code, you may be wondering how that ITodoItemRepository field gets an instance of the TodoItemRepository that implements Entity Framework. ASP.NET Core has built-in dependency-injection container support to inject implementations at runtime, and with a call to an IServiceCollection.Add* method in the Startup.cs as shown in Figure 7, I can associate the TodoItemRepository class with the ITodoItemRepository interface so that whenever a field of type ITodoItemRepository is needed, it can be initialized with an instance of the TodoItemRepository implementation. In this case, I am using the AddScoped() method, which creates a new instance per request and is recommended for Entity Framework. You can read more about the service-lifetime options.
public void ConfigureServices(IServiceCollection services)
{
    // ...
    services.AddScoped<ITodoItemRepository, TodoItemRepository>();
}

public void Configure(IApplicationBuilder app)
{
    // ...
    app.UseMvc();
}

Figure 7: Startup.cs.
One other aspect that I need to configure is the data store for Entity Framework. For my simple “Hello World!” to-do-list API, I chose to use the in-memory database. In the Startup.cs shown in Figure 7, the call to IServiceCollection.AddDbContext configures the in-memory database for the Entity Framework context.

Azure App Service is one of those higher-level platform services that abstract away and hide the servers and infrastructure and just provide a target for deploying our web applications. In Visual Studio, we can right-click on an ASP.NET project and select the Publish option as shown in Figure 8 to start deploying a web application to Azure App Service.
Visual Studio will then display a dialog that allows us to choose whether we want Windows or Linux Docker containers. After we select a platform, Visual Studio will add a Dockerfile to our project and will place in the Solution Explorer a docker-compose node that contains a couple of docker-compose.yml files. (Figure 10: Adding Docker support to ASP.NET Core apps.)

Conclusion
As I mentioned at the beginning of this article, one of my favorite features of .NET Core is the broad platform support that includes Windows, Linux, and macOS. Combined with Azure, this not only provides cross-platform development but also a cloud platform to host and run your applications that supports many OSs, including Windows Server and multiple distributions of Linux, and additionally provides many higher-level platform services like Azure App Service, Docker containers, and serverless computing with Azure Functions. This capability is extremely exciting and opens many possibilities by providing broad ecosystem support and being open.
On the eve of the first Cloud Native .NET track at Cloud Foundry Summit, we sat down with Neal to discuss the state of .NET, and why he’s bullish on its future.

How would you describe the state of the .NET Framework over the last 10 years?

The .NET world takes its cues from Microsoft. The community looks to Microsoft to lead the way, to show them where to go, how to change.

To be sure, Microsoft introduced incremental upgrades, but seemed more focused on JavaScript, WPF, and other non-.NET technologies. CIOs and other executives saw .NET apps as a liability. There wasn’t a clear path to modernize applications. The anxiety with .NET shops was palpable. It was easy to understand why people would feel this way. When your most important business systems run on .NET, and you’re not sure of the tech’s future, there’s a real risk there.

Everything has changed for the better. It’s exciting.

Why do you say .NET Core was the catalyst for this?

The executives and operations folks I talk to feel this way for one simple reason: .NET Core isn’t tied to Windows. In 2018, the operating system is a commodity. That’s true of Windows or Linux. Now, people care about platforms and distributed systems.

When the OS doesn’t matter - when Windows doesn’t matter - you can break free from the APIs that tethered you to the OS. Deep integration with the OS is now an anti-pattern. It’s slow, inefficient, and you have to deal with licensing.

All that said, you are still going to have plenty of Windows Server deployments, and lots of .NET Framework apps for the next decade. That’s because of the ASP.NET Webforms module. A huge part of your portfolio uses this component.
…similar in concept to HTTP modules but quite different in implementation. Middleware classes are integrated through code, and are much simpler to configure. They form a request/response pipeline chain of modifiers for the response to a request.

Injecting middleware into an ASP.NET Core application that performs monitoring is almost trivially easy. I’ll demonstrate an example with Azure Application Insights. I created an Application Insights resource in my Azure portal, then edited exactly three files in my repository to enable Application Insights monitoring:

• dotnet-angular.csproj — added a line to reference the Application Insights assembly (this manual step was only necessary because I was using Visual Studio for Mac; details here).

• appsettings.json — added my Application Insights instrumentation key.

• Startup.cs — where middleware is configured. I added the Application Insights middleware here.

Having done these things, I was able to start debugging locally and gather monitoring data from Application Insights. You can try it out yourself — just replace the sample key in appsettings.json with your key.

Monitoring works across environments (especially production!). This is critical to DevOps practice, as it allows you to verify, with hard numbers, that the changes you are making to your system are not degrading its performance. You can then add new features with the confidence that moving fast does not have to break things.

Conclusion
.NET Core was conceived and developed with DevOps practices in mind. The CLI and open build system and libraries make it possible to automate and adapt the software delivery process to just about any imaginable set of requirements. Build automation and continuous integration are achieved through CLI scripting, or deeper programmatic integration if you prefer. Application monitoring with open-source or paid enterprise tools is available to turn your system from a black box to a clean pane of glass. .NET Core, delivered using DevOps practices, is a compelling platform for modern software systems.
Ports-and-adapters pattern
We want our objects throughout our API solution to have single responsibilities. This keeps our objects simple and allows us to easily fix bugs or enhance our code. If code is difficult to change, it might be violating the single-responsibility principle. As a general rule, I look at the implementations of the interface contracts for length and complexity. I do not limit the number of lines of code in my methods, but if a method extends past a single view in the IDE, it might be too long. Also, check the cyclomatic complexity of methods to determine the complexity of a project’s methods and functions.

We can imagine the pattern best like an onion, with ports on the outside of the hexagon and the adapters and business logic located in layers closer to the core. I see the external connections of the architecture as the ports. The API endpoints that are consumed or the database connection used by Entity Framework Core 2.0 would be examples of ports, while the internal data repositories would be the adapters.
Domain layer
We need to explain how we build out the contracts through interfaces and the implementations of our API business logic. Let’s look at the domain layer. The domain layer has the following functions:
• It defines the entity objects that will be used throughout the solution. These models will represent the data
layer’s DataModels.
• It defines the ViewModels that the API layer will use for HTTP requests and responses as single objects or
sets of objects.
• It defines the interfaces through which our data layer can implement the data-access logic.
• It implements the supervisor that will contain methods called from the API layer. Each method will represent
an API call and will convert data from the injected data layer to ViewModels to be returned.
Our domain entity objects are a representation of the database that we are using to store and retrieve data used for the API business logic. Each entity will contain the properties represented in the SQL table. As an example, this is the Album entity:
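The entity code isn’t reproduced in this extract; a sketch of what the text describes (the navigation-property names are assumptions) might be:

public class Album
{
    public int AlbumId { get; set; }
    public string Title { get; set; }
    public int ArtistId { get; set; }

    // navigation properties for the related artist and tracks
    public Artist Artist { get; set; }
    public List<Track> Tracks { get; set; }
}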
The Album table in the SQL database has three columns: AlbumId, Title, and ArtistId. These three properties are part of the Album entity, as are the artist’s name, the tracks, and the associated artist. As we will see in the other layers in the API architecture, we will build upon this entity object’s definition for the ViewModels in the project.
The ViewModels are the extensions of the entities and supply more information to the consumers of the
APIs. Let’s look at the Album ViewModel. It is similar to the Album entity but has an additional property. In the
design of my API, I determined that each Album should have the name of the artist in the payload passed back
from the API. This allows the API consumer to have that crucial piece of information about the Album without
having to have the Artist ViewModel passed back in the payload (especially when we are sending back a large
set of albums). An example of our Album ViewModel is below.
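A sketch of that ViewModel (the AlbumViewModel type name comes from the controller code later in the article; the property set is assumed from the description above):

public class AlbumViewModel
{
    public int AlbumId { get; set; }
    public string Title { get; set; }
    public int ArtistId { get; set; }

    // the additional property: the artist's name carried in the payload
    public string ArtistName { get; set; }
}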
The interface defines the methods needed to implement the data-access methods for the Album entity. Each
entity object and interface is well defined and simple, and that allows the next layer to be well defined.
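The interface itself isn’t shown in this extract; a minimal sketch of what such a contract could look like (the method names and exact set of operations are assumptions):

public interface IAlbumRepository
{
    Task<List<Album>> GetAllAsync(CancellationToken ct);
    Task<Album> GetByIdAsync(int id, CancellationToken ct);
    Task<Album> AddAsync(Album newAlbum, CancellationToken ct);
    Task<bool> UpdateAsync(Album album, CancellationToken ct);
    Task<bool> DeleteAsync(int id, CancellationToken ct);
}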
Finally, the core of the domain layer is the Supervisor class. Its purpose is to translate to and from entities and
ViewModels and to perform business logic away from the API endpoints and the data-access logic. Having the
supervisor handle this also isolates the logic to allow unit testing on the translations and business logic.
Looking at the Supervisor method for acquiring and passing a single Album to the API endpoint, we can see
the logic in connecting the API front end to the data access injected into the supervisor while keeping each
isolated.
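A sketch of such a supervisor method (the names are assumptions, apart from the patterns described above):

public async Task<AlbumViewModel> GetAlbumByIdAsync(int id, CancellationToken ct)
{
    // fetch the entity through the injected repository...
    var album = await _albumRepository.GetByIdAsync(id, ct);

    // ...and convert it to the ViewModel returned to the API layer
    return AlbumConverter.Convert(album);
}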
Keeping most of the code and logic in the domain layer will allow every project to adhere to the single-responsibility principle.
Data layer
We are using Entity Framework Core 2.0 in this example, which means that we have the Entity Framework Core DbContext defined and the data models generated for each entity in the SQL database. The data model for the Album entity, for example, has three properties stored in the database along with a property that contains a list of the tracks associated with the album and a property that contains the artist object.
While we can have a multitude of data-layer implementations, remember that all must adhere to the requirements documented in the domain layer: each data-layer implementation must work with the ViewModels and repository interfaces detailed there. The architecture we are developing uses the repository pattern to connect the API layer to the data layer. This is done using dependency injection (as discussed earlier) for each of the repository objects we implement (the “API layer” section contains the code for dependency injection). The key to the data layer is the implementation of each entity repository using the interfaces developed in the domain layer. The data layer’s Album repository implements the IAlbumRepository interface. Each repository will inject the DbContext that will allow access to the SQL database using Entity Framework Core.
// the method signature is reconstructed; the original excerpt began mid-method
public async Task<bool> UpdateAsync(Album album, CancellationToken ct)
{
    _context.Update(album);
    await _context.SaveChangesAsync(ct);
    return true;
}
API layer
This is where our API consumers will interact. This layer contains the code for the web-API endpoint logic, including the controllers. The API project for the solution will have a single responsibility: to handle the HTTP requests received by the web server and to return the HTTP responses with either success or failure. This project will have minimal business logic. We will handle exceptions and errors that have occurred in the domain or data projects and effectively communicate with the API consumer. This communication will use HTTP response codes, and any data to be returned will be located in the HTTP response body.
ASP.NET Core 2.0 handles web-API routing with attribute routing (to learn more about attribute routing in ASP.NET Core, go here). We are also using dependency injection to assign the supervisor to each controller. Each controller’s Action method has a corresponding Supervisor method that will handle the logic for the API call. A segment of the Album controller shows these concepts:
[Route("api/[controller]")]
public class AlbumController : Controller
{
    private readonly IChinookSupervisor _chinookSupervisor;

    [HttpGet]
    [Produces(typeof(List<AlbumViewModel>))]
    public async Task<IActionResult> Get(CancellationToken ct = default(CancellationToken))
    {
        try
        {
            return new ObjectResult(await _chinookSupervisor.GetAllAlbumAsync(ct));
        }
        catch (Exception ex)
        {
            return StatusCode(500, ex);
        }
    }
    ...
}
The web API for the solution is simple and thin. I strive to keep as little code as possible in this solution as it
could be replaced with another form of interaction in the future.
Conclusion
Designing and developing a great ASP.NET Core 2.0 web API solution takes insight, and can lead to a decoupled architecture that allows each layer to follow the single-responsibility principle and to be easily testable. I hope this information helps you create and maintain production web APIs for your organization’s needs.
…test the entire logic for each API on my HTTP endpoint. This testing will follow the entire workflow of the API from the API project’s controllers to the domain project’s supervisor, and finally to the data project’s repositories (and back the entire way to respond).

Creating the integration test project
To take advantage of your existing knowledge of testing, integration-testing functionality is based on current unit-testing libraries. I will use xUnit for creating my integration tests. After I have created a new xUnit test project named Chinook.IntegrationTest, I need to add the appropriate NuGet package: add the package Microsoft.AspNetCore.TestHost to the Chinook.IntegrationTest project. This package contains the resources to perform the integration testing. (Figure 2)

Creating your first integration test
To start with the external testing of all of the APIs in my solution, I am going to create a new folder called API to contain my tests. I will also create a new test class for the Album API, with the following using directives:

using Xunit;
using Chinook.API;
using Microsoft.AspNetCore.TestHost;
using Microsoft.AspNetCore.Hosting;

Figure 3: Integration test using directives.

I now have to set up the class with our TestServer and HttpClient to perform the tests. I need a private variable called _client of type HttpClient that will be created based on the TestServer initialized in the constructor of the AlbumAPITest class. The TestServer is a wrapper around a small web server that is created based on the Chinook.API Startup class and the desired development environment. In this case, I am using the development environment. I now have a web server that is running the API and a client that understands how to call the APIs in the TestServer. I can now write the code for the integration tests. (Figure 4: Our first integration test to get all albums.)

In addition to the constructor code, Figure 4 also shows the code for the first integration test. The AlbumGetAllTestAsync method will verify that the call to get every Album from the API works. Just like unit testing, the logic for my integration testing uses the arrange/act/assert pattern. I first create an HttpRequestMessage object with the HTTP verb supplied as a variable from the InlineData annotation and the URI segment that represents the call for all albums (/api/Album/). I next have the HttpClient _client send an HTTP request, and finally I assert the response.
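Since Figure 4 itself isn’t reproduced here, a rough sketch of what the text describes — the constructor that builds the TestServer and the first test — might look like this (the builder details and the assertion are assumptions, not the article’s exact code):

using System.Net.Http;
using System.Threading.Tasks;
using Chinook.API;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.TestHost;
using Xunit;

public class AlbumAPITest
{
    private readonly HttpClient _client;

    public AlbumAPITest()
    {
        // build a TestServer from Chinook.API's Startup class in the development environment
        var server = new TestServer(new WebHostBuilder()
            .UseEnvironment("Development")
            .UseStartup<Startup>());
        _client = server.CreateClient();
    }

    [Theory]
    [InlineData("GET")]
    public async Task AlbumGetAllTestAsync(string httpMethod)
    {
        // arrange: the request for all albums
        var request = new HttpRequestMessage(new HttpMethod(httpMethod), "/api/Album/");

        // act: send the request through the in-memory server
        var response = await _client.SendAsync(request);

        // assert: the endpoint responds successfully
        response.EnsureSuccessStatusCode();
    }
}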
I can also create integration tests for specific entity keys from the APIs. For this type of test, I add an additional value to the InlineData annotation that will be passed through the AlbumGetTestAsync method parameters. The new test follows the same logic and uses the same resources as the previous test, but will pass the entity key in the API URI segment for the HttpRequestMessage object. You can see the code in Figure 5.

After you have created all of your integration tests to test your API, you will need to run them through a test runner and make sure they all pass. You can also perform all of the tests you have created during your DevOps CI process to test your API over the entire development and deployment lifecycle.

Conclusion
Having a well-thought-out test plan using both unit testing for internal testing and integration testing for verifying external API calls is just as important as the architecture you create for the development of your ASP.NET Core Web API solution.