CODE Magazine — September/October 2023
codemag.com - THE LEADING INDEPENDENT DEVELOPER MAGAZINE - US $ 8.95 Can $ 11.95
The World of
Server-Side Events
Cover: @midjourney + onsightdesign
We will send an expert to your office to meet with you. You will receive:
1. An overview presentation of the current state of AI.
2. Guidance on how to use AI in your business while ensuring the privacy of your and your clients' information.
3. A sample application built on your own HR documents, allowing your employees to query those documents in English, which will cut down the number of questions that you and your HR group have to answer.
4. A roadmap for future use of AI catered to what you do.
CONTACT US TODAY FOR A FREE CONSULTATION AND DETAILS ABOUT OUR SERVICES
codemag.com/ai-services
832-717-4445 ext. 9 • [email protected]
TABLE OF CONTENTS

Features

8  Top Azure Active Directory Mistakes
   Sahil examines some of the most common mistakes and misunderstood concepts that cause insecure applications in Active Directory. The protocols he covers are portable to any identity platform.
   Sahil Malik

15 Building Web APIs Using Node.js and Express: Part 3
   In the third article of this series, Paul examines how to build a website using Node.js and Express to serve web pages, how to use a templating engine called Mustache to create dynamic web pages from the data retrieved from API calls, and how to configure cross-domain resource sharing (CORS).
   Paul D. Sheriff

58 Vite and Progressive Web Apps
   Shawn shows you how the development build environment Vite uses abstraction for packaging, offline work, and installing a website outside the JavaScript framework.
   Shawn Wildermuth

63 Authentication in Laravel, Part 2: Token Authentication
   In Part 2 of Bilal's Authentication series, you learn about how tokens work and what you can do to take advantage of the token options that Laravel offers.
   Bilal Haidar
US subscriptions are US $29.99 for one year. Subscriptions outside the US pay $50.99 USD. Payments should be made in US dollars drawn on a US bank. American
Express, MasterCard, Visa, and Discover credit cards are accepted. Back issues are available. For subscription information, send e-mail to [email protected]
or contact Customer Service at 832-717-4445 ext. 9.
Subscribe online at www.codemag.com
CODE Component Developer Magazine (ISSN # 1547-5166) is published bimonthly by EPS Software Corporation, 6605 Cypresswood Drive, Suite 425, Spring, TX 77379 U.S.A.
POSTMASTER: Send address changes to CODE Component Developer Magazine, 6605 Cypresswood Drive, Suite 425, Spring, TX 77379 U.S.A.
Error-Driven Development
Any developer who's been in this profession for any amount of time will relate to this story. I was working on an application integration using a vendor's REST API. The purpose of this integration was to manipulate metadata across several different customer sites and the documents contained within.

My initial uses of the API went well. I was able to query the main list of sites, and then I was able to get a list of the documents for each site. The only problem was that once I tried to access documents beyond the first site in the list, I received an HTTP 403 error. This error is a security access error that says the current user doesn't have proper access to the requested resource.

This was strange. The account I was using to test was a Site Administrator account with full access. It was a head-scratcher. So I started doing what all developers do when faced with a difficult challenge: I went fishing for ideas (AKA trying stuff). My first thought was to set up an account with different rights. I gave that a try only to be foiled by licensing. There were no licenses available, so I was going to need to go a different route. I did what all good developers do: I consulted Google and came up with a few loose threads, but for the most part, it was no help.

Back to the drawing board. I decided to return to the code to try to reproduce the error and see if there was any additional information I might have missed. With a few tweaks, I was able to reproduce the error. It returned the same vague error. Okay, now what? I returned to the code and tweaked something to see if I could get a different answer. (It worked with my parents, so why not this API? LOL.) After a few more failed attempts, I finally received a different error value. It read something like: "The current logged-in user is not authenticated against this site."

BINGO! I had a new error to investigate, this time with actionable information. I returned to the documentation to be informed that I needed to authenticate against each site individually versus sharing a single common authentication. I quickly tweaked my code to authenticate with each site individually. Huzzah! This worked perfectly. After this blocker, I was able to finish my application integration in short order. Error-driven development had worked.

I smile at the irony of causing errors (bugs) to find solutions to fixing other errors (more bugs). It's not every day that you get paid to cause a bug. But I digress. What happened here? Was some senior-developer Jedi mind trick used to root out the culprit? Or was it something else?

It was 100% something else.

Let's talk about another development story. Last week, I was working with a colleague who came to me with the following message over Teams: "When you have a few cycles, can you take a look at the data for client ABDC to see why it might not be loading?" I was between tasks, so I took a look right away. This was a new client, so I queried the data warehouse to see if they had their initial setup made. Here's the query:

SELECT * FROM v_clients WHERE code = 'ABDC'

Zero records returned. I returned to Teams: "Hey COLLEAGUE. Are you sure this client was set up at all? Not seeing it in v_clients. Is the spelling proper?" The answer came back immediately: "Doh?!?!? Yeah, that's the problem: ABCD, not ABDC. Thanks for the assist!" Huzzah! Another victory. Was it my Jedi-level query skills that solved this problem or was it something else?

It was 100% something else.

In each of these situations, it wasn't the micro solutions of error codes and queries that solved the problem. Nope. What was it? I posit that the fix came from applying a different perspective to each given problem. In the former, it was triggering a different error code; the latter solution was achieved by simply having someone else look at the problem. I can't tell you how many times that particular colleague helped me find a bug with laser-like precision in a millisecond just by looking at a block of code for a moment. "There it is!" The old fresh-pair-of-eyes technique seems to always work.

Sometimes you need to step away from a problem to find a different path to solving it. There's an old technique I learned when I was doing professional developer training. Sometimes students pose questions that I would find difficult to understand or that just didn't make sense in my mind. My technique was to ask the student this question in return: "Can you ask that a different way?" This required the student to rethink the problem and gave me a few more beats to come up with an answer. This technique was often successful, and I continue to use it today when working on solving problems with other teammates.

So now you have a tool to solve sticky problems. Simply ask yourself: "Can I look at this another way?"

Rod Paddock
APPS WITHOUT COPILOTS ARE NOW LEGACY!
Microsoft has introduced Copilot-enabled applications and announced that all Microsoft applications
will follow this approach going forward. This aims to capitalize on the extreme productivity
gains this new paradigm promises. Microsoft also has tools to let you create Copilots for
your own applications and systems.
CODE Consulting can help you build Copilots into your own applications.
VISIT OUR COPILOT PAGE TO FIND OUT MORE ABOUT OUR SERVICES
AND TO SEE DEMONSTRATIONS AND EXAMPLES!
codemag.com/copilot
832-717-4445 ext. 9 • [email protected]
ONLINE QUICK ID 2309021
…was given to the API surface, and thinking through every possible way it would be used and abused. Products were released later; documentation was released first. Then some of us worked closely with product groups and wrote books.

How times have changed! Development today is built as a skyscraper of SDKs that we constantly update, sometimes automatically, and almost never understand the inner workings and dependencies of. Security vulnerabilities creep in after we ship code, and most companies don't have budgets to remediate or even detect these vulnerabilities. Developers ship code knowing it has bugs, as time to market and feature lists trump everything else. The recourse is the internet, constant beta state, and quick updates and frequent patches. Documentation is a soldier missing in action. Books are not something being written anymore, and it's all about time to market for guidance as videos and blog posts. Learning has become more of a reference.

Pair this with two other facts: Products have gotten more complex and we depend increasingly on computers for everything, and well-funded nation-state actors have great incentives to cause damage.

And don't even get me started with AI-driven development, where we aren't even writing code; we're hoping the computer will write it for us.

Boy, I'm a worrywart, aren't I? Well, when it comes to computer security, such shortcuts and speedy development will almost always land you in trouble. You need to understand how security works because your features are no good if the system isn't secure.

So I thought it would be a good idea to write an article highlighting some of the most common mistakes and misunderstood concepts I've seen that cause developers to write insecure applications. I'll keep my discussion specific to Azure Active Directory, but the protocols that are standard to many of the things I'll talk about in this article are portable to any identity platform.

Also, it's worth saying that these are not shortcomings of Azure Active Directory, but common mistakes that people make.

With that background, let's get started.

Sahil Malik
www.winsmarts.com
@sahilmalik

Sahil Malik is a Microsoft MVP, INETA speaker, a .NET author, consultant, and trainer. Sahil loves interacting with fellow geeks in real time. His talks and trainings are full of humor and practical nuggets. His areas of expertise are cross-platform mobile app development, Microsoft anything, and security and identity.

Redirect URI Pollution

Almost every modern identity protocol depends on redirect URIs. Redirect URIs are a whitelist of URIs where the identity provider feels safe sharing sensitive information, be it SAML that may post you back a SAML assertion, or an OIDC flow that posts back tokens or codes that can be exchanged for tokens. Those tokens are post-backs and are very sensitive. You should treat them with the same…
There are good reasons behind each of these limitations. For instance, requiring HTTPS ensures that the packets are not sniffable over the network. Also, browser protections ensure that a site masquerading as another will automatically be detected. Or, for that matter, a man-in-the-middle sniffer will also be detected. This is, of course, defeatable if the trusted certificate store of the client computer can be altered by the attacker. This is a common issue with certain governments or even organizations where they push certificates to your computer that can effectively allow a man-in-the-middle to sniff HTTPS traffic. So you can't assume that HTTPS is the final word in security.

Of special mention is the localhost redirect URI. It's quite common to use localhost as a redirect URI when developing locally or for certain categories of apps, specifically thick client apps that don't use brokers. The sad reality is that Azure AD does not have a good DevOps story, and yeah, lecture me as much as you wish about Microsoft Graph and the sales pitch. Developers are frequently forced to develop in a production tenant because that's where all the licenses are, and that's where all the data resides. If they do set up a separate tenant, it's quite an onerous task for them to move their app registration and all the necessary configuration, including all the users, permissions, roles, etc., to another tenant. It ends up being a lot of work writing scripts, relying on third-party products, etc.

So guess what most developers do? You guessed it. They develop in production (at least as far as Azure Active Directory goes), and there's been a battle waging for the past thousand years between developer fairies and IT ogres.

Now what URI will a developer use when developing locally? You guessed it: https://localhost. If you go back a few years, this used to be http://localhost. Invariably, this slips into production. The developer either leaves that redirect URI because they may still want to develop against the same app registration as production, or they simply forget to remove that stray redirect URI.

The problem this causes is that a listener on any computer running on localhost can now accept your tokens, and oh, those listeners do exist and are hard to detect. Organizations also try to reduce authentication prompts, so effectively, you're SSOing a user into an app with a redirect URI to a nefarious listener that can now post your tokens to the internet. Ouch!

But this problem of mixing app concerns goes a bit deeper than that. Not every OIDC flow is equally secure. The problem is that the flows are designed to offer you the maximum security any individual platform offers. But there's no way a single page application (SPA) can be as secure as a thick client app. This is simply because the browser can't offer the same crypto capabilities that a mobile app can. Similarly, a mobile app can't be as secure as a web app that never shares the tokens with the client app. The client computer is an untrustable surface.

So the one obvious conclusion that comes out of this is that you should always use the right OIDC flow for the right kind of app. The second conclusion that comes out of this is to offer your users the most secure app choice you can offer. For instance, if you have a choice between a SPA doing an OIDC flow to acquire an access token vs. hosting APIs within the web app and doing an OIDC flow suited more for web apps, all other things being equal, lean toward the web app approach.

But this is where things get funny. Earlier, I talked about IT ogres being onerous while developer fairies try to keep the users happy. So imagine that Olaf the IT ogre refuses to create a new app registration for you. But for a particular scenario, you need to support a web application and a thick client application. So instead of dealing with Olaf, you decide to enable public client flows in the same app as the web app. This can be seen in Figure 2 under the authentication section of an app registration.

You may be thinking that you're using thick client application OIDC flows and web application flows in the same app registration. But it's important to see that you just opened the gates to also allow device code flow, Windows integrated auth flow, and ROPC flow. None of these are great choices when it comes to security. Don't get me wrong, there are situations where you need them and you have no alternative. ROPC, for instance, relies on sending the user's password and username as a POST request. Device code flow involves authenticating on a separate computer from where you did your authentication. Neither of these works well with conditional access policies or device cleanliness. But you just opened the doors to these less-than-ideal OIDC flows.

What you should have done instead is restrict the less desirable OIDC flows to as low of a consent surface as possible, and separate them in a new app registration. If Olaf the IT ogre objects, have him read this article, please, OK?
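One way to guard against this pollution is to periodically audit the redirect URIs on your app registrations. The sketch below assumes you've already fetched the registrations from Microsoft Graph (GET /v1.0/applications); the field names (web.redirectUris, spa.redirectUris, publicClient.redirectUris) follow the Graph application resource, but treat this as an illustrative check, not a finished tool.

```javascript
// Flag redirect URIs that point at localhost or use plain http.
// `app` is one application object as returned by Microsoft Graph.
function findRiskyRedirectUris(app) {
  const all = [
    ...(app.web?.redirectUris ?? []),
    ...(app.spa?.redirectUris ?? []),
    ...(app.publicClient?.redirectUris ?? []),
  ];
  return all.filter(
    (uri) => uri.includes('localhost') || uri.startsWith('http://')
  );
}

// Example: a stray development URI left behind in production
const registration = {
  displayName: 'HR Portal',
  web: {
    redirectUris: [
      'https://hr.contoso.com/signin-oidc',
      'https://localhost/signin-oidc',
    ],
  },
};
console.log(findRiskyRedirectUris(registration));
```

Anything this check flags is either a development leftover to delete or, at minimum, something to justify explicitly.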
…and the tenant ID. Now you can acquire access tokens for as long as you desire. What's worse, if such a client secret is leaked, there's no way to get an alert either.

Managed identities are service principals whose credentials (in this case, certificates) are managed entirely by Azure. They're never shared with an app, and they're never shared with the user. They remain out of reach of any surface that could leak them. Additionally, they're automatically rotated for you. How often are they rotated? I don't care, because I don't have to do that rotation, update a number of apps, and inform users when such rotation is occurring.

There are a few important things you need to understand about token validation.

Both access tokens and ID tokens are intended to be validated, albeit by different parties, but the validation logic has a few things in common.

First, the tokens are comprised of three parts: the header, the body, and the signature.

The header gives you "packet" information. Think of it as the label on your package. It tells you what kind of signature algorithm is used and where to find the keys to validate the signature. Azure AD has the private key to sign the access token or ID token. It never shares the private key, and it rotates the public/private keypair every now and then. The public key is available to you at an anonymous URL. You can use the public key to validate the signature. In Figure 5, you can see the header for an access token I just acquired from my tenant.

Figure 5: The access token header

Of special note is the "alg," which stands for algorithm, and the "kid," or key ID claim. The idea is that Azure AD is telling you that it used RS256 to sign the token, and you can find the key ID from the JWKS URI, which resides at the following URL:

https://login.microsoftonline.com/
<tenant_id>/discovery/keys

You can see the key ID for my app in Figure 6. I got this as a simple GET request to the JWKS URI.

Figure 6: The JWKS URI output

To validate the signature of the token, grab the signing key from the JWKS URI. This is where I should mention one of the biggest issues I see so many applications make a mistake in.

Azure AD will rotate the key every now and then. You're supposed to follow this pattern:

1. First, check if you have the signing key. If you don't, ask Azure AD for it and cache the key.
2. Now, use the cached key to check the signature of the token.
3. If the signature fails, re-query Azure AD for a new key, just in case the key has changed.

And this is where we're done talking about signature validation issues. But token validation has other common problems.

When you validate a token in Azure AD, you must validate, at the bare minimum, the audience and the tenant, and then beyond that, per your authorization logic, the subject and other claims such as roles and scopes.

What would happen if you didn't validate the tenant? Well, then I could just set up another Azure AD tenant, set up the same audience, and pass your validation. Hopefully you're validating the sub claim (subject), but if you weren't, I'm in. Yay!

How confident are you that the copy/paste code from the internet that you used is indeed performing all this validation, and caching the signature validation cert? Are you sure? Better double-check.

Abusing Claims

OIDC defines a bare-minimum standard. It allows you to add claims that the standard doesn't define but doesn't prevent you from adding. There are a few problems here.

First, this gives you the leeway to stray away from the standard. I suggest that you don't. Do the bare-minimum validation that OIDC requires, and then add more on top as necessary and very judiciously.

Second, don't use tokens as a database. Tokens are supposed to be validated at every authorization step. The larger the token, the worse your performance will be. In fact, in many situations, it will break your application logic. Many platforms or even SDKs will cap the token size. Browsers are especially notorious for this. I've seen well-known applications (that shall remain unnamed) with ridiculously low limits like 256 characters. My South Indian friends have very long last names, and I have seen even the bare-minimum token go beyond the 256-character limit.

Additionally, Azure AD has some common-sense protections built in. For instance, if you're a member of groups, and you're a member of many groups, you can configure your application to issue the groups claim. But you…
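The three-step key-caching pattern described earlier can be sketched as follows. This is a minimal sketch with hypothetical helper names (`fetchKeys` stands in for a GET to the JWKS URI; `checkSignature` stands in for your JWT library's verify call); in production you'd lean on a maintained library such as jwks-rsa rather than rolling your own cache.

```javascript
// Minimal sketch of the cache-then-refresh signing-key pattern.
// `fetchKeys` is a hypothetical helper that fetches the JWKS URI and
// resolves to an object mapping kid -> public key.
function createKeyStore(fetchKeys) {
  let keys = {};

  return {
    // Step 1: use the cached key if present; otherwise query and cache.
    async getKey(kid) {
      if (!keys[kid]) {
        keys = await fetchKeys();
      }
      return keys[kid];
    },

    // Steps 2 and 3: validate with the cached key; on failure, re-query
    // once in case the key rotated, then fail for real.
    async verifySignature(token, checkSignature) {
      const kid = token.header.kid;
      let key = await this.getKey(kid);
      if (key && checkSignature(token, key)) {
        return true;
      }
      keys = await fetchKeys(); // the key may have rotated
      key = keys[kid];
      return Boolean(key && checkSignature(token, key));
    },
  };
}
```

The important property is that a signature failure triggers exactly one re-fetch, so a rotated key doesn't lock out valid tokens, and a stream of bad tokens can't make you hammer the JWKS endpoint.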
…a SQL Server database. In this article, part three of the series, you'll build a website using Node.js and Express to serve web pages. You're going to see how to use a templating engine, Mustache, to create dynamic web pages from the data retrieved from API calls. To communicate from your website to your Web API server, you must configure cross-domain resource sharing (CORS). You'll see how to enable CORS in your Web API project. You'll then build a set of search, add, edit, and delete pages that make calls to your Web APIs.

Paul D. Sheriff
http://www.pdsa.com

Paul has been working in the IT industry since 1985. In that time, he has successfully assisted hundreds of companies architect software applications to solve their toughest business problems. Paul has been a teacher and mentor through various mediums such as video courses, blogs, articles, and speaking engagements at user groups and conferences around the world. Paul has multiple courses in the www.pluralsight.com library (https://bit.ly/3gvXgvj) and on Udemy.com (https://bit.ly/3WOK8kX) on topics ranging from C#, LINQ, JavaScript, Angular, MVC, WPF, XML, jQuery, and Bootstrap. Contact Paul at [email protected].

Create a Node.js Web Project Using VS Code

Like my last two articles, this article is designed for you to follow along. You learn the most by doing, so please feel free to join me to create a website together. You only need a few tools, most of which you probably already have on your computer. You need to install Node.js and VS Code on your computer. You also need access to a SQL Server, so install one on your computer or use SQL Server Express. Everything else is downloaded as you go along. For more information about installing these tools, please go back and review the instructions in Part 1 of this article (https://www.codemag.com/Article/2305031/Building-Web-APIs-Using-Node.js-and-Express-Part-1).

Let's get started creating this new website using Node.js and Express. Open a Command Prompt, the Windows PowerShell app, or some other terminal as appropriate for your computer and navigate to where you normally place your development projects. You can even open VS Code and use the terminal window in there. For this article, I'm using my folder D:\Samples to create the new Node.js project. After opening a command prompt within the development folder, create a new folder under that folder and navigate to that folder using the following two commands.

mkdir AdvWorks
cd AdvWorks

Open the Visual Studio Code editor in this new folder using the following command. Note that this is the word code followed by a space and a period (.).

code .

From the menu system in VS Code, open a new terminal by selecting Terminal > New Terminal. Type the following command in the terminal window to start building a new JavaScript project.

npm init

Within the terminal window, it asks for some information to describe this project. If you wish to accept the default answers, press the Enter key after each prompt; otherwise, enter the appropriate information for your project, as shown in Figure 1. At the end, answer Yes to save a file called package.json in your new project folder. The package.json file contains meta-data about your project to help npm run the scripts in the project, install dependencies, and identify the initial JavaScript file used to start the application.
Be sure to save the changes to the package.json file at this point.

Create Website Starting Point

Whether you're creating a Web API project or a website project, the starting code for Express is very similar. Just like before, you need to create a file named index.js as the starting point. Open the index.js file and add the code shown in Listing 1. There are only a few differences in this code from the Web API project created in the first part of this article series, as you will notice as you build the website in this article. First, set the port number to a different value from that of your Web API project. Invoke the method app.use(express.static('public')) to set the location of where your static HTML files are located. Within the app.get() function, call the send() function with no parameters to send the HTML to the requesting browser.

Listing 1: In the index.js file is where you serve up the static HTML home page.

// Load the express module
const express = require('express');
// Create an instance of express
const app = express();
// Create an instance of a Router
const router = express.Router();
// Specify the port to use for this server
const port = 3010;

// Configure location(s) of static HTML files
app.use(express.static('public'));

/**
 * GET
 * @returns index.html file
 */
app.get('/', (req, res, next) => {
  res.status(200).send();
});

// Create web server to listen
// on the specified port
let server = app.listen(port, function () {
  console.log(`AdvWorks web server is running on http://localhost:${port}.`);
});

How does the Express server know to use a file called index.html from the app.get("/" …) call? In the line of code app.use(express.static('public')), the express.static() function assumes that a file named index.html is within the public folder. Override this file name, if you wish, by passing in a second parameter to set the index property to the name of the HTML file you wish to use. For example, if you rename the index.html file in the public folder to home.html, use the following code to use that file name:

app.use(express.static('public',
  {
    "index": "home.html"
  }
));

Create the HTML Home Page

Because you declared your intention to use a folder named public in the express.static('public') call, create a folder named public in your project and add a file named index.html. Add the code shown in Listing 2 to this new index.html file. Make sure you put the <script> tag that references the googleapis.com library all on one line. I had to break the line because of formatting restrictions in this article.

Listing 2: Create the index HTML page.

<!DOCTYPE html>
<html>
<head>
  <title>Product Maintenance</title>
  <link rel="icon" type="image/x-icon"
        href="#">
  <style>
    .text-center {
      text-align: center
    }
  </style>
</head>

<body>
  <div class="text-center">
    <h1>Product Maintenance</h1>

    <button onclick="getAllProducts();">
      Get All Products
    </button>
  </div>

  <div class="text-center">
    <textarea rows="10" cols="80"
              id="products">
    </textarea>
  </div>

  <script
    src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.3/jquery.min.js">
  </script>
  <script src="./scripts/index.js">
  </script>
</body>
</html>

Create JavaScript File to Make API Call

At the bottom of the index.html file, there are two <script> tags. The first tag references the ajax library from Google to help make API calls. The second <script> tag references a local file called index.js into which you're going to write the code to make the API calls to the AdvWorksAPI project you created in the previous articles.

On the index.html file, you can see a button that calls a function named getAllProducts(). Create a file named index.js in the public folder, and add the getAllProducts() function, as shown in Listing 3. The code in the getAllProducts() function should be very familiar to…
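Listing 3 itself falls outside this excerpt, so here is a hedged sketch of what such a getAllProducts() function might look like using the jQuery library loaded above. The API port (3000) and route (/api/product) are assumptions carried over from the earlier parts of this series, not values confirmed by this excerpt.

```javascript
// Hypothetical sketch of getAllProducts(); Listing 3 in the full
// article is the authoritative version.
function getAllProducts() {
  $.ajax({
    url: 'http://localhost:3000/api/product', // assumed API location
    method: 'GET',
  })
    .done(function (data) {
      // Show the returned product array in the <textarea id="products">
      $('#products').val(productsToText(data));
    })
    .fail(function (xhr) {
      $('#products').val('Error: ' + xhr.status);
    });
}

// Pure helper: pretty-print the product array for display
function productsToText(products) {
  return JSON.stringify(products, null, 2);
}
```

Separating the formatting into a small helper like productsToText() keeps the AJAX callback trivial and makes the display logic easy to test on its own.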
// Configure CORS
const corsHelper = require('./helpers/cors');
corsHelper.configure(app);
Try It Out

Save all changes made to your AdvWorksAPI project. Assuming that both projects are running, navigate to http://localhost:3010 and click on the Get All Products button. If you've done everything correctly, you should see the product array appear in the text area, as shown in Figure 3.

Figure 3: You should get an array of product data if everything worked correctly.

Did You Get an Error?

If you get an error message about not being able to make a connection, make sure that TCP/IP is enabled for SQL Server on your computer. By default, it's usually turned off. Open the Computer Management app (Figure 4) in Windows and expand the SQL Server Configuration Manager node, then expand the Protocols for MSSQLSERVER node (or Protocols for SQLEXPRESS if using SQL Server Express). Double-click on the TCP/IP protocol and, in the dialog box, change the status to Enabled.

Listing 5: Create a hard-coded product page.

<!DOCTYPE html>
<html>
<head>
  <title>Product Maintenance</title>
</head>

<body>
  <h1>Product List</h1>

  <select size=10>
    <option id="345">HL Road Frame - Red, 58</option>
    <option id="346">Sport-100 Helmet, Red</option>
    <option id="348">Mountain Bike Socks, M</option>
  </select>
</body>
</html>
module.exports = router;
How does Express know to use the file named product.mustache and not product.html or some other file? The answer is that it doesn't yet, but it will as soon as you download Mustache and register it as your view engine. Let's do that next.

Add Mustache to Your Project

Add the mustache templating engine to your AdvWorks project by submitting the following command in the terminal window.

npm install mustache-express

Open the index.js file and add the code shown in Listing 6 after the definition of the port constant. In this code, you load the Mustache module, tell Express what view engine you're using with the app.set() function, and then register the engine using the app.engine() function. You then use the router.use() function to create the product route with the routes defined in your routes\product.js file.

Figure 5: Install the Mustache template extension to make it easier to work with Mustache files.

Listing 6: Configure the mustache-express package as the view engine for this website.

// Configure Mustache Templating Engine
let mustacheExpress =
  require('mustache-express');
// The following is the default
// change the directory name if you wish
//app.set('views', `${__dirname}/views`);
// Tell express the view engine you are using
app.set('view engine', 'mustache');
// Set the engine to Mustache
app.engine('mustache', mustacheExpress());

// Mount routes from modules
router.use('/product',
  require('./routes/product'));
// Configure router so all routes
// have no prefix
app.use('/', router);
The token {{#data}} says to iterate over the array of product objects contained in the data property. For each product object, emit all the HTML between the starting {{#data}} token and the closing {{/data}} token. While iterating through each one, replace each property within the tokens ({{productID}}, for example) with the data from that property in the product object.

Try It Out
Save all the changes to the files in your AdvWorks project, go to your browser, and refresh the page at http://localhost:3010/product. You should see a different set of products appear in your HTML list. These are now the products coming from the hard-coded product array in your router.

Using Partial Pages
A great feature of Mustache is that it allows you to break your HTML pages up into smaller chunks. This feature is called partial pages and is very similar to the way partial pages work in other web frameworks.

Create a Table of Products
Open the routes\product.js file and add more properties to each of the product objects in the array, as shown in Listing 8. This provides better data to display in the _productList partial page. Now that you have more properties for each product object, an HTML table is more appropriate for displaying that data. Open the views\_productList.mustache file and replace the entire contents of the file with the HTML shown in Listing 9.

Listing 9: Add an HTML file with some templating in it to render the product data.

<table>
  <thead>
    <tr>
      <th>Product ID</th>
      <th>Product Name</th>
      <th>Product Number</th>
      <th>Color</th>
      <th class="text-end">Cost</th>
      <th class="text-end">Price</th>
    </tr>
  </thead>
  <tbody>
    {{#data}}
    <tr>
      <td>{{productID}}</td>
      <td>{{name}}</td>
      <td>{{productNumber}}</td>
      <td>{{color}}</td>
      <td class="text-end">{{standardCost}}</td>
      <td class="text-end">{{listPrice}}</td>
    </tr>
    {{/data}}
  </tbody>
</table>

Let's add a little bit of styling to the table. Open the views\product.mustache file and within the <head> element, add the following styles.

<style>
  table, th, td {
    border: 1px solid;
    border-collapse: collapse;
  }
  th, td {
    padding: 1em;
    text-align: left;
  }
  tr:nth-child(even) {
    background-color: #f2f2f2;
  }
  .text-end {
    text-align: right;
  }
</style>

Try It Out
Save all the changes you have made to the files in your AdvWorks project. Go to your browser and refresh the page at http://localhost:3010/product. You should now see a page with the product data in a nicely formatted table, as shown in Figure 6.

Figure 6: A table can be built using the Mustache templating engine.

Formatting Columns
In the HTML table, you can see that the standard cost and the list price fields are both right-justified. Because each of these values is a dollar amount, they should be formatted as currency values. Mustache renders the data via the {{ and }} tokens. Within these tokens, you're only allowed to use properties or functions that are attached to the object passed as the second parameter to the res.render() function. This means you need to add a couple of properties to that object to render the cost and the price as currency values. Open the routes\product.js file and add two new properties before (or after) the data property within the second parameter passed to the res.render() function, as shown in Listing 10.

Then, in the views\_productList.mustache file, use these new properties in place of the raw values:

<td class="text-end">
  {{costAsCurrency}}
</td>
<td class="text-end">
  {{priceAsCurrency}}
</td>

As Mustache processes each row in the array of product objects, it calls the costAsCurrency function, which uses the this keyword to get the standardCost property from the current row being processed. It uses the toLocaleString() method on the Number data type in JavaScript to convert the value to a U.S. currency format.
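The toLocaleString() call used by the costAsCurrency and priceAsCurrency properties can be tried on its own. Here is a small standalone sketch; the asCurrency helper name is mine, not from the article:

```javascript
// Standalone sketch of the currency formatting the
// costAsCurrency/priceAsCurrency properties perform.
// The helper name asCurrency is illustrative only.
function asCurrency(value) {
  return Number(value).toLocaleString("en-US", {
    style: "currency",
    currency: "USD",
  });
}

console.log(asCurrency(1059.31)); // $1,059.31
console.log(asCurrency(1500));    // $1,500.00
```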
Listing 10: Add two properties to render numbers as currency values.

res.render('product', {
  "costAsCurrency": function () {
    return new Number(this.standardCost)
      .toLocaleString("en-US",
        { "style": "currency",
          "currency": "USD" });
  },
  "priceAsCurrency": function () {
    return new Number(this.listPrice)
      .toLocaleString("en-US",
        { "style": "currency",
          "currency": "USD" });
  },
  "data": [
    {
      "productID": 345,
      "name": "HL Road Frame - Red, 58",
      "productNumber": "FR-R92R-58",
      "color": "Red",
      "standardCost": 1059.3100,
      "listPrice": 1500.0000,
      "modifiedDate": "2019-09-11"
    },
    // REST OF THE OBJECTS HERE
  ]
});

Another Method to Make API Calls
Instead of using the hard-coded data you put into the route, let's make a call to the Web API to retrieve the product data from SQL Server. There are many tools, such as the Google APIs used earlier, that allow you to make Web API calls within an Express application. I like using the tiny-json-http package as it has a very simple syntax. Within your AdvWorks project, open a terminal window and install this package by typing in the following command:

npm install tiny-json-http

For more information on tiny-json-http, see:
• https://www.npmjs.com/package/tiny-json-http
• https://snyk.io/advisor/npm-package/tiny-json-http/functions/tiny-json-http.post
• https://github.com/brianleroux/tiny-json-http

Add an HTTP Call to the Product Route
Open the routes\product.js file and replace the entire contents with the code shown in Listing 11. The first line of code still sets up the router from express. Next, load the tiny-json-http module so you can use it within this module. Create a constant that has the URL for making the calls to the Web API project you created. Within the router.get() function, wrap the call to your Web API within a try…catch block. Call tiny.get() passing in a JSON object with the url property set to the url constant. The response object that's returned has a data property within the body that contains the array of product objects.

In this version of the res.render() function, the second parameter has four properties on this object: isListVisible, data, costAsCurrency, and priceAsCurrency. On the product.mustache page (Listing 12) you're going to use the {{#isListVisible}} token that says if this variable is a true value, then display whatever is contained within this token and its closing token {{/isListVisible}}. In this case, you display the HTML table in the _productList.mustache file.

{{#isListVisible}}
{{> _productList}}
{{/isListVisible}}

Make a Nicer Looking Product List Page
Open the views\product.mustache file and replace the entire contents with the code shown in Listing 12. This HTML is like the code you wrote before, only you're now adding the Bootstrap 5.x CSS framework so you can take advantage of all the nice CSS classes available in this framework.
Listing 11: Call the Web API using the tiny JSON HTTP package.

// Create an instance of a Router
const router = require('express').Router();

// Load tiny-json-http module
const tiny = require('tiny-json-http');

// Create URL for Web API calls
const url = 'http://localhost:3000/api/product';

// GET /product
router.get('/', async (req, res, next) => {
  try {
    // Request data from Web API
    let response = await tiny.get({
      "url": url
    });
    // Get data from response
    let data = response.body.data;
    // Render the page
    res.render('product',
      {
        "isListVisible": true,
        "data": data,
        "costAsCurrency": function () {
          return new Number(this.standardCost)
            .toLocaleString("en-US",
              { "style": "currency",
                "currency": "USD" });
        },
        "priceAsCurrency": function () {
          return new Number(this.listPrice)
            .toLocaleString("en-US",
              { "style": "currency",
                "currency": "USD" });
        }
      }
    );
  } catch (err) {
    next(err);
  }
});

module.exports = router;
<div class="row">
  <div class="col text-center">
    <h1>Product List</h1>
  </div>
</div>

Next, add the class attribute to the <table> element and add the Bootstrap classes shown in the following code. These classes make formatting your HTML table easy.

<table class="mt-2 table table-striped
              table-bordered">
  // REST OF THE TABLE HERE
</table>

Search for Products
In the Web API project, you created a search endpoint to allow you to search for products based on a partial name and/or if the list price is greater than a specified amount. Create a new file in the views folder named _productSearch.mustache and add the code shown in Listing 13 to this new file.

Listing 13: Add HTML to allow the user to input some search criteria.

<form class="mt-4"
      action="/product/search" method="get">
  <div class="card">
    <div class="card-header bg-primary
                text-light">
      <h5 class="card-title">
        Search for Products
      </h5>
    </div>
    <div class="card-body">
      <div class="form-group">
        <label for="searchName">
          Product Name (or partial)
        </label>
        <input type="text"
               id="searchName"
               name="searchName"
               class="form-control"
               value="{{search.name}}" />
      </div>
      <div class="form-group">
        <label for="searchListPrice">
          Greater Than Price
        </label>
        <input type="text"
               id="searchListPrice"
               name="searchListPrice"
               class="form-control"
               value="{{search.listPrice}}" />
      </div>
    </div>
    <div class="card-footer bg-primary
                text-light">
      <button class="btn btn-success">
        Search
      </button>
      <a href="/product"
         class="btn btn-secondary">
        Reset
      </a>
    </div>
  </div>
</form>

Open the views\product.mustache file and within the {{#isListVisible}} tokens, add the token to bring in the new file you created. Make sure this file goes before the one that references the _productList file.

{{#isListVisible}}
{{> _productSearch}}
{{> _productList}}
{{/isListVisible}}
Try It Out
Save the changes made to your project files and ensure that the project is running. Navigate to the http://localhost:3010/product page and type the value HL into the Product Name (or partial) input field. Click the Search button and you should see a few rows that match this criterion displayed in your HTML table. Input the value 1439 into the Greater Than Price input field and click the Search button again. You should now see fewer rows. Finally, click the Reset button to see the whole list of products reappear in the table.
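Because the form uses method="get", submitting it issues a GET with the two field values in the query string. A quick sketch of the URL the browser builds (the field names come from Listing 13):

```javascript
// Build the query string the search form submits as a GET.
// searchName and searchListPrice match the input names in Listing 13.
const params = new URLSearchParams({
  searchName: "HL",
  searchListPrice: "1439",
});
const searchUrl = `/product/search?${params}`;
console.log(searchUrl);
// /product/search?searchName=HL&searchListPrice=1439
```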
<th>Actions</th>

<td>
  <a href="product/{{productID}}"
     class="btn btn-primary">
    Edit
  </a>
</td>

{{^isListVisible}}
{{> _productDetail}}
{{/isListVisible}}

Figure 8: This detail page is displayed after clicking on the Edit button on the table.
Listing 17: Add an insert function to add a product to the SQL Server table.

// POST from Detail Page
router.post('/', async (req, res, next) => {
  try {
    // Declare the response object
    let response = {};
    // Get posted values from form
    let product = req.body;
    if (product.isAdding != 'false') {
      // POST a new product
      response = await tiny.post({
        "url": url,
        "data": product
      });
    }
    else {
      let request = url +
        `/${product.productID}`;
      // PUT an updated product
      response = await tiny.put({
        "url": request,
        "data": product
      });
    }
    // TODO: Handle a 404 or a 400

    // Redisplay the list
    res.redirect('/product');
  } catch (err) {
    next(err);
  }
});
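The add-versus-update decision in Listing 17 can be isolated into a small pure helper. This refactoring is my own sketch, not code from the article:

```javascript
// Decide whether the posted product should be created (POST)
// or updated (PUT), mirroring the isAdding check in Listing 17.
function buildSaveRequest(baseUrl, product) {
  if (product.isAdding != "false") {
    // New product: POST to the collection URL
    return { method: "post", url: baseUrl, data: product };
  }
  // Existing product: PUT to the item URL
  return {
    method: "put",
    url: `${baseUrl}/${product.productID}`,
    data: product,
  };
}

const req1 = buildSaveRequest("/api/product",
  { isAdding: "true", productID: 0 });
console.log(req1.method, req1.url); // post /api/product

const req2 = buildSaveRequest("/api/product",
  { isAdding: "false", productID: 345 });
console.log(req2.method, req2.url); // put /api/product/345
```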
Add the new delete route (Listing 19) by opening the routes\product.js file and adding the new route BEFORE the router.get("/:id"…) route. This route extracts the ID passed in from the req.params.id property and uses that ID to build the call to the Web API delete route. After the delete has occurred, the product list page is once again displayed, minus the row just deleted.

Paul D. Sheriff
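Listing 19 itself isn't reproduced in this excerpt, so here is a sketch consistent with the description above; the makeDeleteHandler factory shape and the injected del function (standing in for tiny.del()) are my assumptions:

```javascript
// Sketch of the delete flow described above: take the ID from
// req.params.id, call the Web API's delete endpoint, then
// redirect back to the product list. The factory shape and the
// injected del function are illustrative assumptions.
function makeDeleteHandler(url, del) {
  return async (req, res, next) => {
    try {
      await del({ url: `${url}/${req.params.id}` });
      res.redirect("/product");
    } catch (err) {
      next(err);
    }
  };
}

// Exercise with stubs (no Express or Web API needed):
const calls = [];
const handler = makeDeleteHandler("http://localhost:3000/api/product",
  async (opts) => calls.push(opts.url));
handler(
  { params: { id: 345 } },
  { redirect: (path) => calls.push(`redirect:${path}`) },
  (err) => calls.push(`error:${err}`)
).then(() => console.log(calls));
// [ 'http://localhost:3000/api/product/345', 'redirect:/product' ]
```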
design—the contracting bounded context—to focus on for the initial tactical design phase. That particular bounded context presented some interesting complexity because of the many iterations a contract may go through, as well as the possibility of co-authors on a book. The added complexity makes the bounded context a great candidate for applying DDD patterns.

The outcome of the tactical design is a contract aggregate (see Figure 1) defined by its entities, value objects, invariants, and other business rules. The contract entity itself is the root of the aggregate. A contract has one or more versions, including a version with default values and default specifications for every new contract. Each version includes a set of specifications that are encapsulated in a value object. And each version also has one or more authors.

The Author class is also a value object that leans on yet another value object: PersonName. Figure 2 shows the properties of each of the aggregate classes in Visual Studio's class designer. There's a lot of logic involved to create default versions, create revisions, finalize contracts, etc. If you're curious about that logic, it's all detailed as part of my Pluralsight course: EF Core 6 and Domain-Driven Design (https://www.pluralsight.com/courses/ef-core-6-domain-driven-design). If you're not familiar with DDD, you may want to start with the Domain-Driven Design Fundamentals course, also on Pluralsight, that I co-authored with Steve Smith.

As the developers incorporate the contract aggregate further into their solution, it's important that they make it easy for users to find a contract to work on, whether they want to make tweaks, create a new revision, or just look at some details. Users might want to search by author names, contract status, or one of the relevant dates, such as when the contract was first initiated or when an author needs to respond to a version during negotiations.

The focus of this article is on enabling users to find pre-existing contracts by filtering. That seems like a problem with a well-known and obvious solution. But is it really?

Although the lessons of this article apply to any stack, I'll be using .NET Core and EF Core with a SQL Server database to explain the solution. The demo code is on my GitHub account at https://github.com/julielerman/FilteringwithEFCoreandDDD.

Factors that Lead to Filtering Problems
In the solution, the EF Core DbContext designed to support the aggregate takes care of ensuring that the aggregate is mapped correctly to the relational database. Additionally, in order to protect the aggregate, this DbContext only exposes the aggregate root—the contract entity—for querying and updates. It's possible to use the DbContext.Set<> method directly, but you should design your data access logic such that you are not circumventing this guard rail other than, perhaps, in integration tests as needed.

For example, many of the details that would be helpful for selecting a contract, such as author name or acceptance deadline, are exposed throughout the aggregate in different classes. Yet accessing them to filter queries is overly complicated. It can be done but requires a lot of LINQ trickery and expertise. And that's just to filter the results. The results themselves need to be a string composed of appropriate highlights sufficient for the user to select the exact one that they're seeking. Listing 1 shows an example of the most efficient solution I could come up with to find a list of contracts based on the last name of any of the authors. Remember that the authors could vary from one version of the contract to another and, in this search, the domain experts specified that this filter should only find contracts with that author in the current version.

Julie Lerman
@julielerman
thedatafarm.com/contact

Julie Lerman is a Microsoft Regional Director, Docker Captain, and a long-time Microsoft MVP who now counts her years as a coder in decades. She makes her living as a coach and consultant to software teams around the world. You can find Julie presenting on Entity Framework, Domain-Driven Design and other topics at user groups and conferences around the world. Julie blogs at thedatafarm.com/blog, is the author of the highly acclaimed "Programming Entity Framework" books, and many popular videos on Pluralsight.com.
public class SearchResult
{
    public SearchResult(Guid key,
        string description, string contractNumber)
    {
        KeyValue = key;
        Description = description;
        ContractNumber = contractNumber;
    }
    private SearchResult() {}
    public Guid KeyValue { get; private set; }
    public string? Description
        { get; private set; }
    public string? ContractNumber
        { get; private set; }
}

Again, with this scenario, I have one DbContext for reading and writing and another that only reads. Because the second DbContext performs no writes and doesn't even perform LINQ queries, it has no bearing on the database schema. The primary DbContext, ContractContext, will be used with EF Core migrations to control the database schema. You should never use the SearchContext with migrations because EF Core would assume that the database has one single table with three columns and trash what was created by ContractContext.

Listing 2: The SearchContext class

public class SearchContext : DbContext
{
    public SearchContext
        (DbContextOptions<SearchContext> options) : base(options)
    {
        ChangeTracker.QueryTrackingBehavior =
            QueryTrackingBehavior.NoTracking;
    }

    public DbSet<SearchResult> SearchResults =>
        Set<SearchResult>();

    protected override void OnModelCreating
        (ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<SearchResult>().HasNoKey();
    }
}

As you are reading about having two different DbContexts in a single bounded context sharing the same database, you may be wondering about the case of multiple aggregates, each with their own DbContext, in this scenario. That's not relevant to this article, but I do want to address the question. If you have two aggregates, you need to find a balance where a single DbContext serves them both and serves and controls a single database. If achieving this is anything less than easy-peasy, that should raise a red flag, yet again, that something is probably wrong with your models or, more likely, that your determination that they belong in the same bounded context was in error.

What About Those Database Objects?
Here comes the fun part of the solution where it's time to put on your database hat or find someone in your org who wears one. Recall that I mentioned one view, one stored procedure, and one function. (Is anyone hearing a George Thorogood song now?) The stored procedure may seem complicated, but for a database whiz, probably not. The hardest part will be formatting the SQL so that it's readable in this article!
If the sproc is complicated, does that raise a red flag and tell me I'm doing it wrong? To me, the answer is no. I'll be asking the database to do something it's very good at, and my coded solution will be clean and easy to read. And from a DDD perspective, that means it will be easier to understand and maintain my code.

Based on the ContractContext, the database tables are structured so that the contract highlights are in a Contracts table and each version of the contract is in a Versions table. The Versions table also contains the data for the SpecificationSet value object, thanks to EF Core's Owned Entities mapping. Because there could be multiple authors (remember Author is also a value object) for a version, there's a third table, Version_Authors, that contains the author data. If you're curious about how my mappings and migrations came up with this schema, I go through that in detail in the above-mentioned Pluralsight course. I'm aware that a collection of value objects is seen as an anti-pattern. I thought long and hard before I decided to take this path. It's a decision made based on plenty of experience, so I'm comfortable with it. In fact, it's the first time I've ever designed an aggregate that has a collection of value objects as a property of the aggregate root.

Let's start with the view. Why do you need a view? The database schema is designed to persist the aggregate. Like the LINQ query, I'll have to dig through multiple tables to find the data I need to build the query for the filter and the results. By encapsulating the search details and output details into a single view, queries will be much simpler to build.

I thought it would be easier for you to comprehend the view in designer mode as it's displayed in Figure 3. You can see which fields I'm extracting from which tables and how those tables relate to one another. The SQL for the view is captured in a migration file in the solution that you can download. This gives me all of the fields I might want to search on and all of the fields I need to output.

The view lists all versions of each contract (Figure 4) and notes which ContractVersionId represents the current version of a particular contract. Note that for space consideration, the figure doesn't display the full GUID values.

Explicit Sprocs or One to Rule Them All?
I originally used different stored procedures to execute the needed queries. That was tidy, but it led to complexity in the application because I had to have different methods and different calls. It worked and felt nicely organized.

There was a lot of duplication in these procedures. I removed some of it by encapsulating the creation of the description string into a function. Otherwise, the only differences were in the parameter lists and the WHERE statements in each procedure's subquery.

For example, to find contracts by author last name, there's a single parameter (@LastName) and the subquery is:

select currentversionid
from currentcontractversions
where left(LastName,len(trim(@LastName)))=
  trim(@LastName)

To find contracts by date initiated, the sproc takes two parameters, @initdatestart and @initdateend, with the subquery:

select currentversionid
from CurrentContractversions
where
  cast(dateinitiated as date)>=@initdatestart
  and cast(dateinitiated as date)<=@initdateend

Adding filters meant adding a new stored procedure and then a new method in the search service to call that procedure. Lots of copy/pasting, and the red flag came up again!

I considered dynamic queries (which would have meant embedding strings into the TSQL) and decided against that because—eww—that would introduce too many complications. However, I really wanted to make things simpler in the application and had to push the boundaries of my TSQL skills to come up with a single stored procedure. This may have been a simpler task for someone more adept at TSQL, but I did a lot of due diligence to ensure that I was making the best choice. I hope that research led me (and, as I pass this on to you) to the best conclusion.

The stored procedure takes in all three parameters.

CREATE PROCEDURE GetContractsFlexTempTable
  @LastName varchar(15),
  @initdatestart varchar(20),
  @initdateend varchar(20)

You can inspect the TSQL in the download to see the full details, but I will highlight some of them here.

There's some trickery to deal with null dates, for example, if you're looking for a contract initiated after June 1 with no end date. I convert the incoming start and end dates to default values if they are null.

DECLARE @SDate DATETIME
DECLARE @EDate DATETIME

SET @SDate = ISNULL(@initdatestart, '19000101')
SET @EDate = ISNULL(@initdateend, GETDATE()+100)

And there's some more trickery to set up a temporary table in order to collect the key of any versions that match the filter.

select currentversionid into #ContractSubSet from
CurrentContractversions WHERE 1=2

The matching versions are then inserted into that temporary table using the appropriate WHERE clause, such as:

WHERE left(LastName,len(trim(@LastName)))
  =trim(@LastName);

Finally, I execute one last query against that temporary table where I concatenate all of the info needed in the list. In other words, combining the date initiated, the working title, and, using an aggregate function, the names of any authors involved in that contract's current version. Again, I recommend perusing the TSQL if you are interested in how I implemented it all.

SELECT groupednames.contractId as
  KeyValue,[description],ContractNumber
FROM
  (SELECT contractid,currentversionid,
    dbo.BuildContractHighlights(various data)
    AS [description],ContractNumber
  FROM CurrentContractversions
  WHERE currentversionid IN
    (SELECT currentversionid
    FROM #ContractSubSet)
  GROUP BY various data) groupednames

With this procedure in place, the database now has a single entry point—this one stored procedure that takes in my three parameters. And I can always expand the logic to add more parameters and filters. For some context, the current version of this sproc is 57 lines. It isn't a beast. It just felt like it when I had to figure out how to write it!

I've truly put all of the pain in the database into the view, stored procedure, and function. And, thanks to this, you'll see that the code in my app is simple, readable, and succinct.
of SSE, and how to implement real-time updates, etc. If you're to work with the code examples discussed in this article, you'll need the following installed in your system:

• Visual Studio 2022
• .NET 7.0
• ASP.NET 7.0 Runtime

If you don't already have Visual Studio 2022 installed, you can download it from here: https://visualstudio.microsoft.com/downloads/.

Joydip Kanjilal
[email protected]

Joydip Kanjilal is an MVP (2007-2012), software architect, author, and speaker with more than 20 years of experience. He has more than 16 years of experience in Microsoft .NET and its related technologies. Joydip has authored eight books, more than 500 articles, and has reviewed more than a dozen books.

Introducing Server-Sent Events (SSE)
Server-Sent Events (SSE, or Event Source) is a W3C standard for real-time communication between servers and clients over HTTP. With SSE, the server may provide the client with real-time, event-driven changes through an HTTP connection. SSE is a standardized push technology conceptualized first in 2004 and included as part of the HTML5 specification. It enables you to transmit notifications, messages, and events from a server to a client over HTTP.

A protocol for streaming events, SSE is supported by the majority of contemporary web browsers, including Edge, Firefox, Chrome, and Safari. SSE eliminates the need to manually query the server or establish several connections to provide changes to the client by enabling unidirectional, real-time communication between the server and the client. Figure 1 demonstrates an overview of how SSE works.

The Server-Sent DOM Events are the SSE foundation. By subscribing to events produced by a server using the EventSource interface, browsers may get alerts any time new events occur. When an EventSource attempts to get data, it accepts an HTTP event stream connection from a specific URL and keeps the connection open. A server-sent event is one that is always pushed from the server to a web browser rather than retrieved or requested.

Message Format
The message format in SSE is defined by the W3C. It should be noted that the SSE data sent by a server to a client should be UTF-8 encoded and have the following headers:

Content-Type: text/event-stream
Cache-Control: no-cache
Connection: keep-alive

The data sent from the server to the client consists of several messages, each separated by \n\n characters. A field contains the following values:

• Data: Indicates the payload to be sent from the server to the client
• Retry: Optional and indicates the time the client will wait before it attempts a reconnection in the event of a connection drop
• Event: Represents the event type defined by the application
• ID: Optional and represents the ID of the data transmitted from the server to the client
The following is an example of an SSE response:

HTTP/1.1 200 OK
Content-Type: text/event-stream

event: event-1
data: This is a sample text.

event: event-2
data: {"code": "p001", "quantity": 456}

Why Server-Sent Events?
Here are some of the key benefits of SSE in web application development:

• Updates in real-time: SSE permits real-time communication between the server and client. Instead of the client continually querying the server for new information, it enables the server to send changes to the client as soon as they happen. For applications like live feeds, chat rooms, market tickers, and alerts, where quick updates are critical, this real-time feature is essential.
• Simplicity: For server-to-client communication, SSE offers a straightforward and lightweight protocol. SSE has a lower overhead than other real-time technologies, such as WebSockets, and doesn't need complicated handshakes or bidirectional communication. SSE messages are text-based, simple to read, and easy to use.
• Reduced load on the server: SSE allows the server to transmit updates to clients only when required, reducing the workload on the server. This lessens the need for clients to submit queries to the server frequently, which lowers server load and enhances scalability. SSE is very effective for applications that have a lot of customers and need server resource optimization.
• Support for cross-origin communication: SSE enables cross-origin communication, enabling the client to get updates from several domains or origins. The ability to stream data from a server that is housed on a different domain or subdomain is handy in such situations. Cross-origin resource sharing (CORS) guidelines are followed by SSE to enable safe connections between several sources.
• Automatic reconnection: SSE connections are durable and can reestablish themselves if they are lost as a result of network problems or server restarts. In order to maintain an uninterrupted stream of updates without user intervention, the client will make an effort to reconnect. A strong and dependable communication route is offered by the automatic reconnection capability.
• Browser compatibility: SSE is supported by the majority of current web browsers, including Chrome, Firefox, Safari, and Edge. It performs effectively in settings like limited networks or outdated browser versions where WebSockets may not be accessible or permitted. When WebSockets are not practical for real-time communication, SSE may be used as an alternative.
• Seamless integration: SSE is simple to incorporate into current web applications without requiring significant infrastructure modifications. It makes use of the already-existing HTTP infrastructure and doesn't call for any new network settings or unique server configurations. A number of server-side technologies, such as ASP.NET Core, Node.js, Django, and others, support SSE.

For real-time changes in web applications, Server-Sent Events provide a simple and effective option. Without the complexity of conventional real-time protocols, they let developers create responsive and engaging experiences, increase server effectiveness, and increase user engagement.

Key Features of SSE
Some of the best features include:

• Unidirectional communication: SSE offers a server-to-client unidirectional communication channel. In this type of communication, the server can transmit data to a connected client, but the client cannot send data back to the server.
• Text-based protocol: Because SSE is a text-based protocol, messages are sent via HTTP in plain text, which makes it simpler to debug and comprehend. Fields like "event," "data," and "id," which are sent as a string of text lines in the form of event fields, make up an SSE message.
• Real-time updates: SSE enables servers to transmit updates depending on server events, allowing servers to provide event-driven updates to clients in real-time. A particular event name, data related to the event, and, optionally, an identity that may
codemag.com Developing Real-Time Web Applications with Server-Sent Events in ASP.NET 7 Core 35
be given to the event are typically included in all updates. The client can listen for specific events or receive all events sent by the server.
• Connection persistence: SSE creates a durable HTTP connection between the client and server that endures for the lifetime of the client. SSE keeps the connection open to permit continuous data streaming, contrary to conventional HTTP requests, which are transient and are closed when a response has been received.
• Resilient: Because SSE connections are robust, if the connection is lost, SSE will automatically reestablish it. As soon as they become disconnected, clients will make an effort to rejoin the server, ensuring that updates continue to flow consistently and seamlessly.
• Cross-origin support: SSE allows the client to receive updates from a domain or origin other than the website to which it is linked. You can configure cross-origin resource sharing (CORS) rules on the server to control access and security.

Polling
• You can leverage simple AJAX requests and page reloads to implement polling in your applications.
• Clients repeatedly request updates even when there are none, resulting in unnecessary network traffic and increased server load.
• This approach is suitable for scenarios where updates are infrequent or a real-time response is not a priority.

Long Polling
• Long polling reduces unnecessary requests to the server and enables near real-time updates compared to regular polling.
• Servers hold requests open until an update is available rather than responding immediately to a client request.
• The server responds when an update is available. Then, the client sends a new request to keep the connection alive.
• When no updates are available within a particular timeframe, the server responds with an empty response. The client sends a new request and continues listening.
• Although long polling reduces the frequency of requests and enables a real-time response, it still involves frequent connections and overhead due to request/response cycles.

WebSocket
• WebSocket enables communication between servers and consumers over a single, persistent, reliable, and full-duplex connection.
• WebSocket is ideally suited for applications requiring continuous data transfers, such as chat applications and collaboration tools.
• Due to server-side infrastructure requirements, WebSocket isn't supported in all legacy or restricted environments, such as older browsers and certain network configurations.

Server-Sent Events
• SSE provides a lightweight, unidirectional approach to server-client communication over HTTP.
• Contrary to WebSockets, communication between server and client in Server-Sent Events runs in only one direction, from server to client.
• SSE enables real-time updates without the complexity of WebSockets.

How Do Server-Sent Events Work?
Server-Sent Events (SSE) establish a long-lived connection between a server and its connected client. Once this connection is established, the server communicates event-driven changes to the client over a single HTTP connection. Thanks to the SSE connection management layer and parsing algorithms, a server can submit new events one by one while HTTP responses remain open. Here are the series of steps that outline how SSE works:

1. Server-Sent Events (SSE) establishes a persistent connection between a server and its client.
2. Once this connection is established, the server communicates event-driven changes to the client over a single HTTP connection. As soon as the SSE connection is established, the server can start sending SSE events to the client.
3. Once the server receives an SSE request, it processes the request. Once processing is over, the server responds with the appropriate SSE headers.
4. Next, the server sets the response headers to indicate
cate that SSE events will follow. • SSE is well suited for scenarios where communica-
5. When the client receives the SSE event, it extracts tion is unidirectional, i.e., the server needs to for-
the event fields and takes appropriate action based ward updates to clients, such as news feeds, notifi-
on the data received. cations, or real-time monitoring dashboards.
Polling
• Polling involves a client sending requests to the server Implementing SSE
at regular intervals to check if there are any updates. The server must inform the client that the content type
• On receiving the request, the server responds with of the response should be text/event-stream. Upon estab-
new data if one is available or an empty response if lishing a connection between the server and client, the
no data has been updated. server keeps it active until HTTP requests are received.
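The wire format behind the text/event-stream content type is plain text: each event is a block of `field: value` lines (data, event, id) terminated by a blank line. A minimal sketch of a parser for that format (the helper name and sample payload are my own, not from the article):

```python
def parse_sse_block(block: str) -> dict:
    """Parse one server-sent event block (the lines up to a blank line)."""
    event = {"event": "message", "data": [], "id": None}
    for line in block.splitlines():
        if not line or line.startswith(":"):  # blank line or comment
            continue
        field, _, value = line.partition(":")
        value = value.removeprefix(" ")       # spec: drop one leading space
        if field == "data":
            event["data"].append(value)       # multiple data lines accumulate
        elif field == "event":
            event["event"] = value
        elif field == "id":
            event["id"] = value
    event["data"] = "\n".join(event["data"])
    return event

# A typical update as it travels over the open HTTP response:
raw = "event: update\nid: 42\ndata: hello\n"
evt = parse_sse_block(raw)
```

On the client, the browser's EventSource performs exactly this parsing for you and raises the parsed events.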
Unless a timeout occurs or there are no further events to process, the connection remains open. If a timeout occurs, the client can reconnect to the server using the built-in reconnection mechanism. Figure 2 illustrates a typical implementation of an SSE server and client.

The first step is connecting to an EventSource, which is accomplished by initializing an EventSource instance with the URL of the stream to connect to. Under the hood, EventSource connects to the server by sending an HTTP request. The server responds to the request with a stream of event data having text/event-stream as the content type. Until the server determines there's no more data to send, or until the client actively closes the connection using EventSource.close, the connection between the server and client persists. A keep-alive message can be sent every minute or so to avoid a timeout.

…the messages from the shared location. The key advantage of this pattern is that the producer and the consumer are decoupled and disconnected from one another; in other words, they don't have any knowledge of each other.

Creating the View
Replace the source code of the Index.cshtml file with the source code given in Listing 2.
codemag.com Developing Real-Time Web Applications with Server-Sent Events in ASP.NET 7 Core 37
Listing 2: The Index.cshtml file

@{
    ViewBag.Title = "Home Page";
}
<script>
    function display() {
        var source = new EventSource('/home/getmessage');
        var ul = document.getElementById("sse");
        source.onmessage = function (e) {
            var li = document.createElement("li");
            var retrievedData = JSON.parse(e.data);
            li.textContent = retrievedData.message;
            ul.appendChild(li);
        }
    }
    window.addEventListener("DOMContentLoaded", display, false);
</script>
<ul id="sse">
</ul>

Now that we know the flow of the application, let's examine the components of the application.

Application Components
In this application, there are three projects involved:

• SSE_Server
• SSE_Client
• SSE_Scheduler

Figure 5 shows the components of the application together with the classes and interfaces used to build the components. As evident from their names, SSE_Server is the server project that sends out messages, when requested, to all connected clients. SSE_Client is the client project that connects to the server to retrieve messages. SSE_Scheduler is used to send data to the server at regular intervals of time.
Create the NotificationRepository Class (SSE_Scheduler)
The NotificationRepository class implements the methods of the INotificationRepository interface. Create a new class named NotificationRepository in a file having the same name with a .cs extension. Now write the source code given in Listing 3 in there.

builder.Services.AddSingleton
    <INotificationRepository,
     NotificationRepository>();
Listing 4: The CustomHostedService Class

public sealed class CustomHostedService :
    IHostedService, IAsyncDisposable
{
    private readonly INotificationRepository
        _notificationRepository;
    private Timer? _timer;

    public CustomHostedService(
        INotificationRepository notificationRepository)
        => _notificationRepository = notificationRepository;

    public async Task StartAsync
        (CancellationToken cancellationToken)
    {
        _timer = new Timer(SendMessage, null,
            TimeSpan.Zero, TimeSpan.FromSeconds(60));
    }

    private void SendMessage(object? state)
    {
        using var client = new HttpClient();
        client.BaseAddress =
            new Uri("http://localhost:5101/" +
                "api/notification/");
        var notifications = _notificationRepository.
            GetNotifications().Result;
        foreach (var notification in notifications)
        {
            if (!notification.IsProcessed)
            {
                HttpContent body =
                    new StringContent(JsonSerializer.
                        Serialize(notification),
                        Encoding.UTF8, "application/json");
                var response = client.PostAsync(
                    "postmessage", body).Result;
            }
        }
    }

    public async Task StopAsync
        (CancellationToken cancellationToken)
    {
        await Task.Delay(TimeSpan.FromSeconds(60), cancellationToken);
    }

    public async ValueTask DisposeAsync()
    {
        _timer.Dispose();
    }
}
Listing 7: The Subscribe Method

public async Task<IActionResult> Subscribe(string id)
{
    Response.StatusCode = 200;
    Response.Headers.Add("Cache-Control", "no-cache");
    Response.Headers.Add("Connection", "keep-alive");
    Response.Headers.Add("Content-Type",
        "text/event-stream");

    try
    {
        Notification notification =
            new Notification();
        notification.Id = id;
        notification.Message = $"Subscribed to" +
            $" client {id}";
        _messageQueue.Register(id);
        StreamWriter streamWriter =
            new StreamWriter(Response.Body);
        await _messageQueue.EnqueueAsync(
            notification,
            HttpContext.RequestAborted);
        await foreach (var message
            in _messageQueue.DequeueAsync
            (id, HttpContext.RequestAborted))
        {
            await streamWriter.WriteLineAsync
                ($"Message received: " +
                $"{message} at {DateTime.Now}");
            await streamWriter.FlushAsync();
        }
    }
    catch (Exception ex)
    {
        return BadRequest(ex.Message);
    }
    finally
    {
        _messageQueue.Deregister(id);
    }

    return Ok();
}

The CustomMessageQueue class implements the methods of the ICustomMessageQueue interface. The code listing given in Listing 6 shows the CustomMessageQueue class.

Listing 8: The PostMessage Action Method

public async Task<IActionResult> PostMessage
    ([FromBody] Notification notification)
{
    try
    {
        _messageQueue.Register(notification.Id);
        await _messageQueue.EnqueueAsync(
            notification, HttpContext.RequestAborted);
        return Ok();
    }
    catch (Exception ex)
    {
        return BadRequest(ex.Message);
    }
}

Install NuGet Package(s)
So far so good. The next step is to install the necessary NuGet package(s). To install the required packages into your project, right-click on the solution and then select Manage NuGet Packages for Solution…. Now search for the package named Lib.AspNetCore.ServerSentEvents in the search box and install it. Alternatively, you can type the command shown below at the NuGet Package Manager Command Prompt:

PM> Install-Package
    Lib.AspNetCore.ServerSentEvents
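Behind Listings 7 and 8 is a simple idea: a queue per client ID, which the Subscribe action drains and the PostMessage action fills. A rough Python sketch of that register/enqueue/dequeue shape (the method names mirror the article's interface; the implementation is my own illustration, not the library's code):

```python
import asyncio

class CustomMessageQueue:
    """Per-client message queues, in the spirit of ICustomMessageQueue."""

    def __init__(self):
        self._queues: dict[str, asyncio.Queue] = {}

    def register(self, client_id: str) -> None:
        # Idempotent: a second Register for the same ID keeps the queue.
        self._queues.setdefault(client_id, asyncio.Queue())

    def deregister(self, client_id: str) -> None:
        self._queues.pop(client_id, None)

    async def enqueue(self, client_id: str, message: str) -> None:
        await self._queues[client_id].put(message)

    async def dequeue(self, client_id: str):
        # An async generator, analogous to the C# IAsyncEnumerable
        # consumed with `await foreach` in the Subscribe action.
        while True:
            yield await self._queues[client_id].get()

async def demo() -> str:
    q = CustomMessageQueue()
    q.register("client-1")
    await q.enqueue("client-1", "Subscribed to client client-1")
    async for message in q.dequeue("client-1"):
        return message  # one message is enough for the demo

print(asyncio.run(demo()))
```

The producer (PostMessage) and consumer (Subscribe) only share the queue, which is the decoupling the article points out.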
Listing 10: The GetMessages Action Method

public async Task<IActionResult> GetMessages()
{
    Response.Headers.Add("Content-Type",
        "text/event-stream");
    Response.Headers.Add("Cache-Control",
        "no-cache");
    Response.Headers.Add("Connection",
        "keep-alive");
    Response.StatusCode = 200;

    try
    {
        StreamWriter streamWriter =
            new StreamWriter(Response.Body);
        await foreach (var message in
            _messageQueue.DequeueAsync())
        {
            await streamWriter.WriteLineAsync
                ($"{DateTime.Now} {message}");
            await streamWriter.FlushAsync();
        }
        return Ok();
    }
    catch (Exception ex)
    {
        return BadRequest(ex.Message);
    }
}
Listing 12: The Program.cs file of the SSE Console Client Application

HttpClient client = new HttpClient();
client.Timeout = TimeSpan.FromSeconds(60);

string url = $"http://localhost:5101/" +
    "api/notification/subscribe";

while (true)
{
    try
    {
        Console.WriteLine("Establishing connection" +
            " with the server.");
        using (var streamReader =
            new StreamReader(await client.GetStreamAsync(url)))
        {
            while (!streamReader.EndOfStream)
            {
                var message =
                    await streamReader.ReadLineAsync();
                Console.WriteLine(message);
            }
        }
    }
    catch (Exception ex)
    {
        throw;
    }
}
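The console client's outer `while (true)` loop is what gives it resilience: when the stream ends or the connection drops, it simply reconnects and keeps reading. The same shape in a small Python sketch (the `flaky_connect` simulator and all names here are mine, standing in for a real HTTP stream):

```python
import time

def read_stream(connect, max_retries=3, retry_delay=0.0):
    """Connect, read lines until the stream ends, then reconnect.
    `connect` is any callable returning an iterable of lines; a real
    client would open an HTTP response stream here instead."""
    received = []
    for attempt in range(max_retries):
        try:
            for line in connect():
                received.append(line)
        except ConnectionError:
            time.sleep(retry_delay)  # back off before reconnecting
            continue
    return received

# Simulated server: drops the connection once, then delivers a message.
attempts = {"n": 0}

def flaky_connect():
    attempts["n"] += 1
    if attempts["n"] == 1:
        raise ConnectionError("dropped")
    return ["data: hello"]

messages = read_stream(flaky_connect, max_retries=2)
```

Note that the browser's EventSource performs this retry loop automatically; a raw stream reader like the console client has to do it by hand.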
ONLINE QUICK ID 2309061
Just a few months ago, it seems that statements like this would not only have been preposterous, but nobody would have even known what I'm talking about. Then, OpenAI released ChatGPT at the end of November 2022 and GPT-4 in March of 2023. These artificial intelligence (AI) products not only took most of the tech world by surprise, but they captured the imagination of people who normally don't take an interest in software products. All of a sudden, I'd go to a BBQ with people who had no connection to the software industry, and all they would want to talk to me about is artificial intelligence! I've explained generative AI to retired people, hairdressers, and journalists. I've held executive briefings showing some of America's largest companies how this technology can elevate what they are doing. It seems to not matter how technical a person is; everyone is fascinated by AI (and at times, also a little afraid).

I haven't seen anything like this since the earlier days of the internet, except this happened much faster. Much faster! In fact, ChatGPT is the fastest adopted product of all time (at least outside of China), gaining millions of users in a matter of days. Yet one can argue whether it's even a "product" or whether it's "just a tech demo." Perhaps the truth lies somewhere in between.

Whatever the case may be, one thing is clear: ChatGPT as a product is not the grand vision. Instead, the technology that powers ChatGPT—the large language models (LLMs) that OpenAI created—is the basis for a completely new generation of software that, for the first time in any modern developer's career, completely changes how we write programs and interact with computers. We move from a model of clear and unambiguous instructions that we code into software and that results in very predictable and meticulously orchestrated results, to a fuzzier, but orders of magnitude more powerful, paradigm. We now interact with software much more like we would interact with a human. The interaction is rich and powerful, but also, at times, somewhat unpredictable.

…paradigm and places LLMs at the base of pretty much any upcoming Microsoft product, whether you are an Office user, or whether you are a system administrator concerned with security, or anything in between. For instance, you'll be able to use AI built into MS Teams to provide you with a summary of a meeting you missed. You can then have the same AI create a to-do list based on the decisions made in that meeting. You can also have that AI create a PowerPoint presentation or a Word document based on what was discussed. Perhaps if one of the discussion points was that a new low-code application needs to be created, you can have the AI create a Power Platform application that satisfies the criteria. And most of the time, the AI will be able to do it better than most human users. And if not, then you either interfere by hand or ask the AI to change something. You can do all of that in plain English or any other language of your choice.

We refer to this approach as an AI "Copilot." The AI is not autonomous. The AI does not take over. But it's always there to assist you and do what you want, even if you're not a programmer or don't even express yourself particularly well or accurately. Welcome to the Age of Copilot! This may sound like science fiction, and half a year ago it would have been, but this is now concrete reality.

Copilots are not only for Microsoft applications. Copilots are a new application paradigm (just like Windows was once a new paradigm over command line interfaces, or the web was a new paradigm over desktop apps). It's hard to imagine an application that can't benefit from the Copilot paradigm. Having built a number of business applications with Copilot features, I can testify to the enormous productivity increase this provides. (Even though we are arguably just scratching the surface.) Applications supporting this new paradigm have a considerable benefit over apps that don't. Conversely, applications that do not have such features, even if they're dominant players in their market segments today, will find themselves leapfrogged and made obsolete. It's an enormous opportunity but can be a great danger for those who "miss the boat."

Many Copilot/AI features in modern applications would have been considered science fiction just a few months ago.

Markus Egger
[email protected]

Markus is the founder and publisher of CODE Magazine and is EPS President and Chief Software Architect. He is also a Microsoft RD (Regional Director) and one of the longest (if not THE longest) running Microsoft MVPs (Most Valuable Professional). Markus is also a renowned speaker and author. Markus' client list contains some of the world's largest companies, including many on the Fortune 500. Markus has been published extensively, including MSDN Magazine, Visual Studio Magazine, his own CODE Magazine, and much more. Markus focuses on development in .NET (Windows, Web, Windows Phone, and WinRT) as well as Android and iOS. He is passionate about overall application architecture, SOA, user interfaces, general development productivity, and building maintainable and reusable systems.
Now there's something you don't see every day! It solves a problem I originally struggled with when I wanted to show off AI in real-world applications: I often couldn't show the generated results for privacy reasons. But using a system prompt like this, the AI understands that everything that appears to be a name should be replaced with a made-up name instead. This is the kind of thing that's very difficult to do with traditional programming techniques—how would you even write an algorithm that detects what's a name, let alone replace it with fake names, and then apply that fake name consistently going forward in follow-up conversations? Because we're dealing with a model that is, at the very least, very good at applying statistical mathematical tricks to fake intelligence, it understands this kind of request and does quite well with it. Amazing indeed!

Here's one more system prompt I often use to amuse myself when I sit in front of my computer at 3 a.m. to meet the latest deadline (after all, you gotta have some fun every now and then, don't you?):

You respond in the style of John Cleese from
Monty Python, acting as a medieval peasant.

If the output created by this doesn't put a smile on your face, brave sir knight, I must humbly request you turn in your geek card. <g>

As you can imagine, using a system prompt effectively is important. You want to use a system prompt that's precise. At the same time, you don't want the system prompt to be too long. After all, when dealing with LLMs, you're constantly battling size limitations, as they can only process a certain amount of text. Therefore, try to be specific without wasting too much space on the system prompt.

If this doesn't put a smile on your face, brave sir knight, I must humbly request you turn in your geek card!

The next type of prompt we're interested in is the user prompt. This is the core question you send to the AI. If you've ever used ChatGPT, the user prompt is what you type into the interface on the web page. A common usage pattern is to let the user enter the text and then send it to the AI as the user prompt. Note that this isn't the only way to use this type of prompt. Very often, you'll engineer other ways of coming up with the input text (as you will see below). For now, let's enhance the prior example by reading the input from the console so the user can enter anything they want.

Also, while I'm at it, I'm going to switch from a simple "completions" scenario to the slightly more advanced…

using Azure;
using Azure.AI.OpenAI;

var client = new OpenAIClient(
    new Uri("https://<APP>.openai.azure.com/"),
    new AzureKeyCredential("<YOUR API KEY>"));

var options =
    new ChatCompletionsOptions { MaxTokens = 1000 };

options.Messages.Add(new ChatMessage(
    ChatRole.System,
    """
    You are a friendly and funny chap,
    used to amuse an online audience.
    You respond as if you were John Cleese from
    Monty Python but with an old-English, medieval
    way of talking.
    """));

var userPrompt = Console.ReadLine();
options.Messages.Add(new ChatMessage(
    ChatRole.User, userPrompt));

var response = await
    client.GetChatCompletionsAsync("ChatBot", options);

Console.WriteLine(
    response.Value.Choices[0].Message.Content);

This generates a response, which is also known as a "prompt." The AI considers itself to be an "assistant" to the person using it (or at least OpenAI considers it as such—the AI itself has no opinion on the matter) and, therefore, this is known as the assistant prompt.

You might wonder why the response is considered a "prompt" at all. Isn't it just the "response?" Yes and no. As you have longer conversations with the AI, you somehow need to preserve the state of the conversation. For instance, if you ask the AI "What is a Ferrari," it responds with something like "an Italian sports car." If you then subsequently ask "what is its typical color," the AI needs to somehow know what "it" is. To us humans, it's obvious that "it" refers to the "Ferrari" from the prior question. However, an LLM is a completely stateless system. It does not memorize prior answers or dialogs. Therefore, if you want the AI to understand such context, you must send it all the parts of the conversation that are meaningful for the context. The easiest way to do so is to simply add to the collection of prompts, including the assistant prompts, on every subsequent call:

using Azure;
using Azure.AI.OpenAI;

var client = new OpenAIClient(
    new Uri("https://<APP>.openai.azure.com/"),
    new AzureKeyCredential("<YOUR API KEY>"));

SPONSORED SIDEBAR: AI-Searchable Knowledgebase

One of the first scenarios most companies want to implement is an AI-searchable knowledgebase. This way, documents, such as employee manuals, can be "indexed" and then used by generative AI to answer questions. This is fun, but most companies quickly outgrow such an approach. The next step is a system that can provide this capability in a way that's secure and tailored (not every user should be able to see everything, and not all information applies to all scenarios) and provides real-time information, rather than pre-indexed static documents. Finally, organizations realize that if such a system also had access to data (in a secure and appropriate fashion), such a system evolves from a fringe offering to the center that the business evolves around and that all other systems are controlled from. Regardless of how far you want to travel down this rabbit hole, our CODE Consulting division can help you in implementing such a system quickly. Find out how at codemag.com/AI-Docs.
var encoding =
SharpToken.GptEncoding.
GetEncodingForModel("gpt-4");
var text = encoding.Decode(tokens);
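The SharpToken snippet above counts tokens; the usual reason to count them is to trim the running conversation so it fits the model's context window. A back-of-the-envelope Python sketch (the four-characters-per-token estimate is a rough rule of thumb, not a real tokenizer, and the trimming policy is my own illustration):

```python
def estimate_tokens(text: str) -> int:
    """Crude stand-in for a real tokenizer such as SharpToken or
    tiktoken: roughly four characters per token for English text."""
    return max(1, len(text) // 4)

def trim_history(messages, budget):
    """Drop the oldest non-system messages until the estimated
    token count fits the model's context budget."""
    kept = list(messages)
    def total():
        return sum(estimate_tokens(m["content"]) for m in kept)
    while total() > budget and len(kept) > 1:
        # Index 0 is the system prompt; drop the oldest turn after it.
        del kept[1]
    return kept

history = [
    {"role": "system", "content": "Be brief."},
    {"role": "user", "content": "x" * 400},       # ~100 tokens
    {"role": "assistant", "content": "y" * 400},  # ~100 tokens
    {"role": "user", "content": "z" * 40},        # ~10 tokens
]
trimmed = trim_history(history, budget=120)
print([m["role"] for m in trimmed])
```

A production system would count with the model's actual encoding, but the budget-then-trim loop is the same.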
Adding Your Own Data
We have now created a rather nice chatbot that can converse coherently and probably fool people into thinking they are talking to a person. This is impressive! But it can only hold interest for a short period of time. To make a truly useful artificial intelligence, or even a Copilot, you need to make the AI understand your specific needs, data, and institutional knowledge.

Let's create an example scenario, in which you imagine that you're running a vacation rental business that rents out properties to vacationers all over the world. Vacationers may have many questions about the properties they rent, as well as the area they are visiting. I'm sure everyone in the vacation rental business must have answered questions such as "How do I turn on the air conditioning system?" or "How do I operate the safe?" a million times and would rather offload such tasks to an AI that's available 24 hours a day and doesn't get annoyed easily.

In his article in the July/August 2023 issue of CODE Magazine, Wei-Meng Lee created an example using the Python LangChain package to index his entire collection of CODE Magazine issues, so the AI could answer specific questions related to everything CODE has ever published. This is very useful, but for the vacation rental example, you need to go a step further. Wei-Meng created a static index of all magazines and then kept using it (an approach that is useful for some scenarios), but you need to use data that is current up to the minute. Furthermore, and more importantly, you need to apply additional logic to create a correct and meaningful answer. For instance, it's not useful in this scenario to index the documentation you have for all air conditioning systems in all of the vacation homes. Instead, you need to know which property the vacationer has rented, whether it has an air conditioning system (or whether the vacationer is authorized to use it), and then only use this information for the AI to create an accurate answer. (Another concern that applies in very many scenarios is security and access rights.)

To make a truly useful AI, you need to make it understand your own data and institutional knowledge.

A great way to support such a scenario is a pattern known as Retrieval Augmented Generation (which results in the somewhat unfortunate acronym RAG). The general idea is that you first let the user enter a question, such as "How do I turn on the AC?" Then, you must retrieve all of the information you have about the air conditioning system specific to the property the user has rented. You then take all that information, hand it to the AI, and let the AI answer the original question.

In some ways, this is easier said than done. First, you need to detect the user's intent. Do they want to just chat about something LLMs know (such as "what is an air conditioning system?"), or do you need to retrieve different types of information? In this example, intent detection is relatively trivial, assuming you're creating an AI specifically for answering such questions. The intent is always that you need to find documents related to the property they rented. (Similarly, Bing Chat always knows that the intent includes some sort of web search before the question can be answered.) Therefore, intent detection isn't a big issue for this scenario. However, I'm mentioning intent detection here, because many Copilot scenarios must do some heavy lifting around this concept, and I'll explore this in a future article. (For instance, a user's question in a business application may have to do with customers or with product databases or with invoices or… you get the idea. To have an AI answer such questions, you first…

SPONSORED SIDEBAR: AI Training

Our CODE Training division was the first training organization that offered in-depth training about how to create Copilots for your own applications. We're now regularly scheduling these types of training classes (and others), which can be attended online, in-person, or be delivered, customized, at and for your organization. Find out more at codemag.com/Training.
Augmenting Prompts
Now that you have a search term, you can fire that search
term into Azure Cognitive Search and retrieve a list of
ranked documents.
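Those ranked documents then get folded into the prompt. A minimal Python sketch of the RAG assembly step the article describes (the sample documents and the exact instruction wording are made up for illustration):

```python
def build_rag_prompt(question, documents):
    """Retrieval Augmented Generation in miniature: number the
    retrieved, property-specific documents and instruct the model
    to answer only from them."""
    context = "\n\n".join(
        f"[{i + 1}] {doc}" for i, doc in enumerate(documents))
    return (
        "Answer the question using only the documents below.\n\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

# Documents that the search step returned for the guest's property:
docs = [
    "Unit 12B: the AC thermostat is in the hallway; press Mode, then Cool.",
    "Unit 12B: the safe code is set by the guest on arrival.",
]
prompt = build_rag_prompt("How do I turn on the AC?", docs)
```

The assembled string would be sent as the prompt, so the model answers from the retrieved context rather than from its general training data.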
…offline use, and being able to install a website as an app outside the JavaScript framework you're using. In this article, I'll show you how.

What's a Progressive Web App
Although there's always been a difference between building mobile apps and building websites, Progressive Web Applications (PWAs) mean to fill in the gap between them. The basic features that PWAs offer to web developers include:

• Installation: Can be installed from the web or submitted to app stores. Web assets are installed on the local OS.
• Platform-agnostic: Can integrate with host OSs to give the user a more native experience.
• Offline: Supports running the PWA offline so you don't need a persistent network connection to launch the application.
• Background processing: Supports multiple threads via Service Workers.
• Versioning: PWAs can have a consistent way to see if the code needs to be updated.
• Protocol handlers: Associate files with the PWA.

To be clear, PWAs aren't magical. They're just web technologies, so they have limitations about performance, and theming your application to look like the underlying OS is up to you. PWAs aren't meant to replace native apps or app frameworks like Flutter, MAUI, and Xamarin.

…to deal with offline usage, caching, and other features, this feature is not the same as a PWA. As discussed earlier, unless you write custom code, PWAs can also use service workers for background processes, enable offline mode, and do updates in a more standard way. But this article is about using PWAs in your Vite project, so let's talk about Vite.

Vite
Vite (rhymes with "beat") is taking the JavaScript world by storm. It presents a way to develop your applications in a very quick fashion. What might not be obvious is that Vite isn't a particular framework or library; it's simply a tool for running an application and reacting to changes. It isn't dependent on any specific way you write your own code. Instead, it's a development-time tool for working with JavaScript or TypeScript.

Vite has a plug-in that can implement PWAs for you. The plug-in is called vite-plugin-pwa. The plug-in is specific to Vite, so it doesn't matter what web framework you're using. It's agnostic to the underlying application. So, this works if you're using Vite for React, Vue, Svelte-Kit, or even Vanilla JS. I like this approach because I can learn to handle a PWA in one place and apply it to multiple technologies.

Next up, let's add PWA support to a Vite project. You can get the starting project from https://github.com/wilder-minds/vite-pwa.

Installing the PWA Plug-in
The PWA plug-in is just another Vite plug-in. You can install it via NPM like this:

npm install vite-plugin-pwa --save-dev

The purpose of the VitePWA plug-in is to enable creation of a manifest.webmanifest file and JavaScript files to set up and run the service worker. The VitePWA plug-in automatically creates these files without you having to understand the nature of the files.

For a more in-depth explanation of PWAs, see Chris Love's article: https://www.codemag.com/Article/1805091/Introducing-Progressive-Web-Apps-The-No-Store-App-Solution.

Shawn Wildermuth
[email protected]
wildermuth.com
@shawnwildermuth

Shawn Wildermuth has been tinkering with computers and software since he got a Vic-20 back in the early '80s. As a Microsoft MVP since 2003, he's also involved with Microsoft as an ASP.NET Insider and ClientDev Insider. He's the author of over twenty Pluralsight courses, has written eight books, is an international conference speaker, and is one of the Wilder Minds. You can reach him at his blog at http://wildermuth.com. He's also making his first, feature-length documentary about software developers today called "Hello World: The Film." You can see more about it at http://helloworldfilm.com.
This just installs the website in an OS window. It asks for metadata because it only has the <title></title> as the name of the app, as seen in Figure 2.

Figure 1: Installing a website as an app

This is great for certain projects, but PWAs extend this idea even further. Although hosting a website in its own window does some of what PWAs do, unless you write specific code…

import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'
import { VitePWA } from "vite-plugin-pwa";

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [
    vue(),
    VitePWA({
      manifest: {}
    }),
    ...
For most people, you'll want to be able to test and debug this in development mode. So the first real configuration that I'd suggest you do is to enable devOptions:enabled:

export default defineConfig({
  plugins: [
    vue(),
    VitePWA({
      manifest: {},
      devOptions: {
        enabled: true
      }
    }),

Figure 2: Installing a website as an application
export default defineConfig({
  plugins: [
    vue(),
    VitePWA({
      manifest: {
        icons: [
          {
            src: '/icons/icon-512x512.png',
            sizes: '512x512',
            type: 'image/png'
          }
        ]
      },
      devOptions: {
        enabled: true
      }
    }),
If you run the Vite project (e.g., "vite dev"), you'll see that the app is now installable, as seen in Figure 4. You'll notice that the browser has named the app "vite-pwa". This is the default name. If you open the dev-tools…

Figure 3: The empty manifest file

With the basic metadata complete, you can see how installing the app works.
Offline Support
Although you can certainly create a native application that doesn't support offline usage, a PWA has different requirements. While in the browser, caching can help load certain assets (e.g., HTML, JS, CSS), but typically, this still depends on checking the server for a new version even if there's a cache. In PWAs, all the assets to load the page need to be stored for offline use. To do this, it uses Cache storage. If you run the installed app, you can still load the dev tools. With this view, you can see the cache storage that's being used, as seen in Figure 10.

Figure 6: A working manifest file
VitePWA({
  ...
  workbox: {
    globPatterns: ["**/*.{js,css,html,pdf}"]
  }
})

Figure 7: Installation prompt
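As a quick sanity check, you can reason about which build outputs that glob covers. The helper below is a hand-rolled approximation of this one pattern, written only for illustration — Workbox does the real matching at build time:

```javascript
// Approximate the "**/*.{js,css,html,pdf}" precache glob with a
// regex over the file extension (illustration only).
const matchesPrecacheGlob = (path) => /\.(js|css|html|pdf)$/.test(path);

const assets = ["index.html", "assets/app.js", "logo.png", "style.css"];
console.log(assets.filter(matchesPrecacheGlob));
// [ 'index.html', 'assets/app.js', 'style.css' ]
```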
workbox: {
  globPatterns: ["**/*.{js,css,html,pdf}"],
  runtimeCaching: [{
    urlPattern: ({ url }) => {
      return url.pathname.startsWith("/api");
    }
  ...

Figure 9: Installed on the OS

Figure 10: HTML page caching
Now that you have that working, let's talk about handling updates to the application.

Figure 11: API caching in action
Configuring Updates

When the network is available and the code or markup has changed, you'll need a way of updating the cached code. By default, the behavior is to prompt the user to update the application by recreating the cache from the server. To get this behavior, you don't need to configure this option.

VitePWA({
  ...,
  registerType: 'autoUpdate'
})

Figure 12: Forcing the Service Worker to update on page reload
Bilal Haidar
[email protected]
https://www.bhaidar.dev
@bhaidar

Bilal Haidar is an accomplished author, Microsoft MVP of 10 years, ASP.NET Insider, and has been writing for CODE Magazine since 2007. With 15 years of extensive experience in Web development, Bilal is an expert in providing enterprise Web solutions. He works at Consolidated Contractors Company in Athens, Greece as a full-stack senior developer. Bilal offers technical consultancy for a variety of technologies including Nest JS, Angular, Vue JS, JavaScript and TypeScript.

including session, token, and stateless guards. In the previous article, I covered authenticating users with the Session Guard in detail. In this article, I'll cover Token Guard and explore how to authenticate a user in Laravel using tokens only.

Token Authentication in Laravel is typically used when building stateless APIs. The client application (such as a mobile app or a JavaScript application) needs to authenticate with the server on every request without storing any session information on the server.

Token Authentication works by issuing a token to the client upon successful authentication, which is then used to authenticate subsequent requests. The client sends this token in the request headers, and the server uses it to authenticate the user and authorize access to the requested resource.

In Laravel, Sanctum is the Token Authentication implementation used to authenticate API requests through the auth:sanctum middleware, which protects the routes under the routes/api.php file.

There are other ways to implement Token Authentication in Laravel other than Sanctum. Still, being a package developed and maintained by the Laravel team, it's considered the optimal implementation for Token Authentication in the Laravel application.

What's Token Authentication?

Token Authentication is a method in which a token is exchanged between the client and server to establish and maintain authentication status. In this method, a token is issued to a user after successful authentication. It's then used for subsequent authentication attempts instead of sending the user's credentials, such as username and password, with each request.

The token is typically a string of characters generated by the server. It contains the user's identity and other relevant information that allows the server to validate the authenticity of the request. The token is usually included in the HTTP Header of each request sent by the client, and the server can use it to verify that the request is coming from an authenticated user.

Token Authentication is helpful for several reasons:

• It improves performance by reducing the amount of data that needs to be sent with each request.
• It simplifies the implementation of authentication mechanisms in complex systems and enables easy integration with other systems and applications.
• It can be used in distributed systems, where authentication needs to be performed across multiple servers without requiring the servers to share authentication data.

How Laravel Implements Token Authentication

Laravel offers two packages for Token Authentication: Laravel Passport and Laravel Sanctum.

• Laravel Passport: Passport is a full-featured OAuth2 server implementation that provides a complete solution for API authentication. It allows clients to authenticate with your API using various OAuth2 flows, including password grant, authorization code grant, and client credentials grant. Passport requires more configuration and setup than Sanctum, but it offers more advanced features for OAuth2 authentication.
• Laravel Sanctum: Sanctum is a lightweight package that provides a simple way to authenticate API requests using tokens. It's designed for single-page applications, mobile applications, and APIs that need a straightforward and easy-to-use authentication system. Sanctum uses Token Authentication, and it does not support OAuth2 authentication flows.

Both packages provide a secure and reliable way to authenticate API requests, but they differ in complexity and the features they offer. Choose the package that best fits the specific requirements of your application.

Before you begin working on your application, consider whether Laravel Passport or Laravel Sanctum would be more suitable for your needs. If your application must support OAuth2, then Laravel Passport is the appropriate choice.

On the other hand, if you're looking to authenticate a single-page application, or mobile application, or generate API tokens, then Laravel Sanctum is the recommended choice. Although it doesn't support OAuth2, it offers a more straightforward API authentication development experience.
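In practice, the mechanics described here boil down to attaching the issued token to each request. A small client-side sketch — the helper name and token value are mine, not from the article:

```javascript
// Headers a client would attach to every request under Token
// Authentication: the bearer token plus the JSON content headers
// Laravel APIs expect. The token value is a placeholder.
function authHeaders(token) {
  return {
    Authorization: `Bearer ${token}`,
    Accept: "application/json",
    "Content-Type": "application/json",
  };
}

console.log(authHeaders("example-token").Authorization);
// Bearer example-token
```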
Laravel Sanctum helps solve two different problems: API Tokens and SPA Authentication.

API Tokens

Sanctum provides a straightforward way to create API tokens for your users. You can integrate Sanctum into your application's Account Settings page, where users can generate and manage their API tokens. These tokens usually have a long lifespan but can be manually revoked by the user. Laravel Sanctum stores user API tokens in a single database table and verifies incoming HTTP requests using the Authorization header that contains a valid API token.

Create a New Laravel Application

Start by creating a new Laravel application. There are several methods for creating a new Laravel app. I chose the Docker-based one using the Laravel Sail package. You can read more about Laravel installation by following this URL: https://laravel.com/docs/10.x/installation.

Choose the method that best suits you. Before you start, make sure you have Docker running on your computer.

I'm using a MacBook Pro. I start by running this command:
./vendor/bin/sail run dev

The application is now accessible at http://localhost. Open the URL in the browser, and you'll see the same view as in Figure 1.

Next, let's install the Laravel Breeze starter kit. The Laravel team provides this starter kit for scaffolding Laravel authentication and Profile management. This is my ultimate choice when starting a new Laravel project. Trust me: It saves you a lot of time! You can read more about Laravel Breeze here: https://laravel.com/docs/10.x/starter-kits#laravel-breeze.

Laravel Breeze comes in four flavors:

• Breeze & Blade
• Breeze & React
• Breeze & Vue
• Breeze & Next.js/API

I'll use Breeze & Next.js/API for this article.

Run the following commands to install Laravel Breeze:

./vendor/bin/sail \
  composer require \
  laravel/breeze --dev

./vendor/bin/sail \
  artisan breeze:install api

./vendor/bin/sail \
  artisan migrate

Laravel Breeze for API added the basic configuration settings for Laravel Sanctum and all necessary HTTP Controllers to log in, log out, register users, and more.

Laravel Breeze for API removes your application's front-end-related files and folders. This makes it suitable for projects that only serve APIs. As a result, the package.json file and other front-end files are removed.

If you wish to serve your API project using Laravel Breeze for API, you can do so by running the following command:

./vendor/bin/sail \
  artisan serve

This command starts the Laravel Sail development environment and serves your API project.

Login Mobile Application Users

The routes/api.php file governs the communication between the Laravel API application and a mobile application. Let's explore the content of this file.

<?php

Route::middleware(['auth:sanctum'])
    ->get('/user', function (Request $request) {
        return $request->user();
    });

The code snippet defines a GET route at the path /api/user inside the routes/api.php file. This route is protected by the auth:sanctum middleware, meaning the user must be authenticated using Laravel Sanctum before accessing this route.

The code inside the closure function of the route definition retrieves the currently authenticated user using the $request->user() method. This method returns an instance of the authenticated user model if the user is authenticated or null if the user is not.
When a user makes a GET request to the /api/user endpoint with a valid authentication token, the closure function returns a JSON response containing the user object, which includes details such as the user's name, email address, and other information stored in the user model.

If the user is not authenticated or the authentication token is invalid, the auth:sanctum middleware automatically returns a 401 Unauthorized response, denying the user access.

When communicating with Laravel API endpoints, both the request and response payloads contain JSON data. Hence, it's essential to always include two request header keys in every request to the API:

• Accept
• Content-Type

The Content-Type header of the request should be set to application/json because you're sending JSON data in the request body. This header tells the server that the request body is in JSON format and should be parsed as such.

The Accept header, on the other hand, is used to indicate the desired response format. In this case, you're returning JSON data, so the Accept header can be set to application/json to tell the server that you expect a JSON response.

Let's add a new POST endpoint to log in users coming from a mobile application. To do so, add the following route to the routes/api.php file:

Route::post('/login', LoginController::class);

Next, let's create the LoginController class as an invokable controller using this command:

./vendor/bin/sail \
  artisan make:controller \
  API/Auth/LoginController --invokable

I'm placing the new controller inside the app/Http/Controllers/API/Auth folder.

Open the LoginController class and paste the source code shown in Listing 1 inside the __invoke() controller method.

Let's explain the code in Listing 1 step by step.

1. The controller method accepts an HTTP request via the $request parameter.
2. The $request parameter is then validated to ensure that it contains the required fields for authentication. These fields are the user's email, password, and the device name they are using to log in.
3. Next, the method queries the User model to find a user with the email address provided in the request.
4. If the user is found, their password is checked to ensure that it matches the password provided in the request. If the password does not match, a validation exception is thrown with an error message stating that the credentials are incorrect.
5. If the user's email and password are validated successfully, the method generates a new token for the user using the createToken() method provided by Sanctum. This token is associated with the device name provided in the request.
6. Finally, the method returns the plain text value of the token to the client. The client can use this token for subsequent authenticated requests to the server.

In this case, the result of authenticating a user is that a new token is generated and sent to the user. This token should then be added to the header of every future request to the application.

Back to the original defined route in the routes/api.php file:

<?php

Route::middleware(['auth:sanctum'])
    ->get('/user', function (Request $request) {
        return $request->user();
    });

Assuming that the user is authenticated and owns a token, an incoming request to the URL /api/user will be authenticated using the Sanctum guard represented by the middleware auth:sanctum.

This middleware first checks if a session cookie exists in the incoming request (this is the default for SPA applications). Otherwise, it tries to locate a token in the request header. For mobile users, that's the default behavior. Laravel Sanctum validates the token and accordingly allows or forbids access to the /api/user route.

Now, let's add a few tests to ensure this code runs.
Create a new test using this command:

./vendor/bin/sail \
  artisan make:test \
  API/Auth/LoginControllerTest

Paste the source code in Listing 2 inside the LoginControllerTest file.

Listing 3: More tests for LoginController

/** @test */
public function access_user_endpoint()
{
    Sanctum::actingAs(User::factory()->create());

    $response = $this->get('/api/user');

    $response->assertOk();
}
2. The validation rules are defined in an array and include the following:
   • name: required, must be present in the request.
   • email: required, must be a valid email address, and must be unique in the users' table.
   • password: required, must be at least eight characters long, and must match the confirmation password.
   • device_name: required, must be present in the request.
3. A new user is created using the User::create() method if the validation passes. The user's name, email, and password are obtained from the request data and stored in the users' table.
4. The user's password is encrypted using the bcrypt() method, which hashes the password and ensures that it cannot be read in plain text.
5. Finally, the user's personal access token is created using the createToken() method, which generates a new token for the user and associates it with the provided device name. The plainTextToken attribute of the token is returned to the client, which can be used to authenticate future requests.

Next, let's add a test to ensure this endpoint works properly.

Create a new test using this command:

./vendor/bin/sail \
  artisan make:test \
  API/Auth/RegisterControllerTest

Paste the code in Listing 5 inside the RegisterControllerTest file.

The register_new_user() test method creates a new user array with the following fields:

• name: the name of the user
• email: the email address of the user
• password: the password of the user
• device_name: the device's name associated with the user's access token

The test then makes a POST request to the /api/register endpoint with the user data in the request body.

The assertSuccessful() method ensures that the response has a status code of 200 or 201, indicating that the user was successfully created.

The assertNotEmpty() method is used to ensure that the response content is not empty.

The assertDatabaseHas() method ensures that the newly created user is stored in the database. The first call checks for the existence of the user's email in the users' table, and the second call checks for the presence of the device name in the personal_access_tokens table.

Logout Mobile Application Users

Let's look at how to implement logout functionality for your mobile application users. Start by creating a new controller and add a new route to the routes/api.php file.
Paste the code in Listing 7 inside the LogoutControllerTest file. The test in Listing 7 does the following:

1. The Sanctum::actingAs($user) method is called to authenticate the newly created user by generating an access token using Laravel Sanctum's actingAs() method. This allows the user to make authenticated requests to the API.
2. The $this->post('/api/logout') method is used to send a POST request to the /api/logout route to log out the authenticated user.
3. The assertOk() method is called on the response object to ensure that the response status code is 200 OK, indicating that the logout request was successful.
4. The $this->assertDatabaseCount('personal_access_tokens', 0) method is used to ensure that the access tokens were removed from the database.

Listing 6: Add logout route to routes/api.php

Route::post('/login', LoginController::class);
Route::post('/register', RegisterController::class);

Route::middleware(['auth:sanctum'])
    ->group(function() {
        Route::get('/user', function (Request $request) {
            return $request->user();
        });
        Route::post('/logout', LogoutController::class);
    });

Run the following command to create a new Vue3 project.

npm init vue@latest

Follow the instructions on the screen to create and run your SPA application. Notice the last step.
CORS works by adding specific headers to HTTP requests and responses that indicate whether a particular request is allowed. These headers include Access-Control-Allow-Origin, Access-Control-Allow-Headers, Access-Control-Allow-Methods, and Access-Control-Allow-Credentials, among others.

What is an Origin anyway? An Origin is a combination of a scheme (also known as the protocol, for example, HTTP or HTTPS), hostname, and port (if specified).

Therefore, the domains we have so far, http://localhost:5173 and http://localhost:8000, have two different origins because the ports are different. That's why you should configure CORS at the Laravel API application to allow this communication between two applications hosted on two different origins.

To configure CORS in a Laravel application, go to the config/cors.php file. Listing 8 shows the content of this file.

Listing 8: config/cors.php file

return [
    'paths' => ['*'],

    'allowed_methods' => ['*'],

    'allowed_origins' => [env(
        'FRONTEND_URL', 'http://localhost:3000'
    )],

    'allowed_origins_patterns' => [],

    'allowed_headers' => ['*'],

    'exposed_headers' => [],

    'max_age' => 0,

    'supports_credentials' => true,
];

The max_age field specifies the amount of time to cache Preflight CORS requests. Let's explore Preflight requests.

Preflight requests are a mechanism used by the browser to determine whether it's safe to make a cross-origin request to a server. A preflight request is an HTTP OPTIONS request sent to the server before the actual cross-origin request is made.

The preflight request includes headers such as Origin, Access-Control-Request-Method, and Access-Control-Request-Headers, which specify the origin of the request, the HTTP method that will be used in the actual request, and the custom headers that will be sent in the actual request, respectively.

The server must respond to the preflight request with the appropriate CORS headers, including Access-Control-Allow-Origin, Access-Control-Allow-Methods, and Access-Control-Allow-Headers, indicating that the request is allowed and which origins, methods, and headers are allowed.

Once the browser receives the appropriate CORS headers in response to the preflight request, it proceeds with the actual cross-origin request.

Finally, the supports_credentials field specifies whether the Laravel API application wants to share the Cookies with the SPA application.

Now open the .env file and update the FRONTEND_URL environment variable to match the URL of the SPA application.

FRONTEND_URL=http://localhost:5173

Make sure you don't add a trailing forward slash. That's very important to remember.
Ensure that you don't add a trailing forward slash, a port number, or a scheme (HTTP and HTTPS).

Finally, you need to add one more environment variable, the SANCTUM_STATEFUL_DOMAINS.

SANCTUM_STATEFUL_DOMAINS is an environment variable used in Laravel Sanctum that specifies the domains for which Sanctum's stateful authentication mechanisms will be used.

Stateful authentication in Sanctum involves cookies to authenticate the user. When a user logs in, Sanctum creates a cookie containing a signed JSON Web Token (JWT) that identifies the user. This cookie is sent with every subsequent request to the server, allowing the server to authenticate the user without requiring their credentials with each request.

However, cookies can only be sent to the domain that sent them and not to other domains. This means that if you have a single-page application (SPA) that requests your API from a different domain, the cookies that Sanctum sets for authentication won't be sent with the requests, and the user won't be authenticated.

To enable stateful authentication for a different domain, add the domain to the SANCTUM_STATEFUL_DOMAINS environment variable. This tells Sanctum to also allow cookies to be sent to that domain, so the user can be authenticated across domains.

Login SPA Users

When you installed Laravel Breeze for APIs, it included all the necessary server-side code to authenticate, log out, register, reset passwords, and many other functions. Navigate to the app/Http/Controllers/Auth folder and study the code.

Locate the routes/web.php file and navigate to the last line in that file:

require __DIR__.'/auth.php';

It requires the auth.php routes file. If you open this file, you will find all the necessary routes to Login, Logout, Register, Reset Password, and many other routes you need for your SPA.

For instance, here is a code snippet to show the endpoint route to allow SPA users to log in to the application:

Route::post('/login', [
    AuthenticatedSessionController::class,
    'store'
])
->middleware('guest')
->name('login');

The AuthenticatedSessionController@store method is responsible for authenticating the user. This is the same logic used in any Laravel application, not only in Laravel API applications.
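Conceptually, the SANCTUM_STATEFUL_DOMAINS check is a membership test of the request's host against that comma-separated list. A rough illustration of the idea — not Sanctum's actual implementation, and the function name is mine:

```javascript
// Rough sketch: treat a request as "stateful" (cookie-based) only
// when its host appears in the configured comma-separated list.
function isStatefulHost(statefulDomains, requestHost) {
  return statefulDomains
    .split(",")
    .map((domain) => domain.trim())
    .includes(requestHost);
}

console.log(isStatefulHost("localhost:5173,example.test", "localhost:5173"));
// true
```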
Let's add an HTML Form to the SPA application to allow the user to log in to the Laravel API application.

First, let's install Axios (https://axios-http.com/docs/intro) into the SPA application. Navigate to the SPA application and run the following command:

npm install axios

One nice thing I like about Axios is that it does some things out of the box. For example, it takes the CSRF cookie generated by the Laravel application and sets it as an HTTP Header when sending the requests to the Laravel API application.

Also, I've installed and configured Tailwind CSS. To install Tailwind CSS, follow this guide: https://v2.tailwindcss.com/docs/guides/vue-3-vite

Add a new Login.vue component. Paste the code you see in Listing 9 inside this new component file.

Let's discover the significant sections of the Login component.

1. Start by importing Axios and the ref function from the Vue library.
2. Set the axios.defaults.withCredentials configuration property to true, allowing cross-site requests to include credentials such as cookies.
3. Create a reactive form object with email and password properties initialized to null.
4. Create a reactive user object to hold the user data once logged in.
5. Define an onLogin() function that will be called when the form is submitted. This function does the following:
   • Sends a request to the server to get a CSRF token by calling axios.get('http://localhost:1006/sanctum/csrf-cookie'). This request is a must to let the Laravel API application issue a CSRF token to protect all of the non-GET requests.
   • Sends a POST request to the server to authenticate the user by calling axios.post('http://localhost:1006/login', { email: form.value.email, password: form.value.password }).
   • Sends a GET request to the server to get the user data by calling axios.get('http://localhost:1006/api/user'). The user data is stored in the data variable.
   • Sets the user object to the data variable (user.value = data) so that it can be displayed in the template.
6. Include an HTML form with a submit binding @submit.prevent="onLogin".
7. Bind the email and password fields of the HTML form to the corresponding form.email and form.password fields.

When the user clicks the button, the form submits, and eventually, the user details are retrieved accordingly for a successful login request.
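That onLogin() sequence can be sketched independently of Vue by passing the HTTP client in, which also makes the three-step order explicit. This is a simplification of Listing 9, not its actual code; the localhost:1006 host is the article's placeholder:

```javascript
// The SPA login sequence: CSRF cookie first, then credentials,
// then fetch the authenticated user. `http` stands in for an
// Axios-like client exposing get/post methods.
async function login(http, form) {
  // 1) Ask Laravel to issue the CSRF cookie before any non-GET request.
  await http.get("http://localhost:1006/sanctum/csrf-cookie");
  // 2) Authenticate with the submitted credentials.
  await http.post("http://localhost:1006/login", {
    email: form.email,
    password: form.password,
  });
  // 3) Retrieve the now-authenticated user.
  const { data } = await http.get("http://localhost:1006/api/user");
  return data;
}
```

Because the client is injected, the sequence is easy to exercise with a fake client before wiring it to Axios.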
One last step is to import the Login.vue component into the App.vue component to see the Login page in the browser.

Figure 2 shows the result of trying the Login form in the browser.

By doing this, you finish implementing Laravel Sanctum for both SPA and Mobile applications.

(Continued from 74)

Snowbird environment being one built on trust, on respect-centered values within which people wanted to work. Finally, it all boiled down to Agile being about "mushy stuff of values and culture." Channeling Hamilton, the history page is a contemporaneous account by one who was in the room when it happened!

Sep/Oct 2023, Volume 24 Issue 5
Group Publisher: Markus Egger
Associate Publisher: Rick Strahl
CODA:
What Lies at Agile's Heart?

As soon as I heard the term applied to software development and considering the 17 individuals who codified, ratified, and released the Agile Manifesto in February 2001, I instinctively knew the trail the Agile Manifesto's authors were blazing was the right path. I knew that because Agile wasn't conjured