Docker Cheat Sheet


The package.json file was not available; that's why npm install didn't work.
The COPY ./ ./ command copies from the current working directory on our filesystem into the current working directory inside the container.
docker build -t stephengrider/simpleweb . (-t is for tagging; the convention is your Docker ID / name of the image or project, plus :latest at the end if we want to specify a version)

docker run stephengrider/simpleweb


Here we have copied everything into the root directory. To avoid that we need to specify a working directory: if we have any files or folders that conflict with the default folder structure, like /var, /root or /run, we might accidentally override something inside our container.

With WORKDIR set, everything lives in that folder and any following command is executed relative to it. We could also put it inside /var or in the home directory.
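
As a rough sketch of what that looks like (the /usr/app path is just an example choice):

# Use a small Node base image
FROM node:alpine

# Every following instruction (COPY, RUN, CMD) now runs relative to /usr/app,
# so we don't accidentally override /var, /root, /run etc. at the image root
WORKDIR /usr/app

COPY ./ ./
RUN npm install

CMD ["npm", "start"]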
We can also use docker exec to run a command inside the running container.
Suppose we change "Bye there" to "Hi there" in the source: then we have to rebuild the image and rerun the container.
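
For reference, a hedged example of those commands (the image name reuses the tag from above; the container ID is whatever docker ps shows):

# rebuild the image after changing the source, then run it again
docker build -t stephengrider/simpleweb .
docker run stephengrider/simpleweb

# or run an extra command (e.g. a shell) inside an already running container
docker exec -it <container_id> sh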

The only file that npm install actually needs is package.json, so in an initial step we copy just the package.json file so that npm install doesn't have to run over again. Now we can make as many changes as we want to the source; the package.json copy stays in cache. The only time npm install will be executed again is if we make a change to that step or any step above it.
In other words, the only situation in which npm install will normally be executed again during the build process is if we make a change to the package.json file.
Minimize cache busting
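
A minimal sketch of the cache-friendly ordering (paths are examples):

FROM node:alpine
WORKDIR /usr/app

# Copy only package.json first, so the npm install layer stays cached
# unless package.json itself changes
COPY ./package.json ./
RUN npm install

# Source changes only invalidate the cache from this step onward
COPY ./ ./

CMD ["npm", "start"]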
App overview: we want to be able to scale up the Node app.
With docker-compose we can connect one container to another simply by referring to its service name.
Automatically restart our container if it crashes: we specify a restart policy for one specific container.
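
A sketch of what that looks like in docker-compose (service names and ports are just examples; the valid policies are "no", always, on-failure and unless-stopped):

version: '3'
services:
  redis-server:
    image: 'redis'
  node-app:
    # restart this one container automatically if it exits with an error
    restart: on-failure
    build: .
    ports:
      - "4001:8081"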

We need to run docker-compose ps from the directory where our docker-compose file is.

Creating a production-based workflow:


Any changes that we make to our master branch are eventually going to be automatically deployed to our hosting provider.

Suppose another developer has made some code changes on a feature branch. We pull that code, make our own changes, and push them back to the GitHub repository, again to the feature branch. Once we push these changes to the feature branch, we open a pull request and merge it over to the master branch.
Merging the feature branch into the master branch is done through a pull request.
Pushing our application to GitHub is what sends it over to Travis CI, and two very important things are going to occur.
First, when we merge the pull request into master, the workflow we set up automatically takes our application and pushes it over to a service called Travis CI. Travis CI is a continuous integration provider: essentially, it pulls down your code and runs a set of tests that you write against your codebase.
For our application (the one we're going to test this flow out on) we're just going to use a couple of pre-generated tests, so we don't need to worry too much about the testing for right now.
Assuming that Travis is able to pull all the code from our master branch and run the tests successfully, Travis CI is then set up to automatically take our entire project and push it over to Amazon Web Services hosting.
So this entire flow depends on pushing some code up to the feature branch, creating the pull request, and merging it into the master branch; the instant you do that the tests run, and if they pass, Travis CI will automatically deploy your application to AWS.
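
A hedged sketch of what a .travis.yml for this flow could look like; the image tag, region, app, environment and bucket names below are placeholders, not values from the course:

language: generic
sudo: required
services:
  - docker

before_install:
  # build the dev image so the tests can run inside it
  - docker build -t example/docker-react -f Dockerfile.dev .

script:
  # CI=true makes the test runner exit instead of watching for changes
  - docker run -e CI=true example/docker-react npm run test

deploy:
  provider: elasticbeanstalk
  region: us-east-1          # placeholder
  app: docker-react          # placeholder
  env: DockerReact-env       # placeholder
  bucket_name: <s3-bucket>   # placeholder
  on:
    branch: master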
Project Generation

The application that we are going to use is a React app.

We will generate a tiny little React project and wrap it up inside our container; so here we have generated the small project.
In the last section we generated a brand new React project. In this section we're going to learn how to work with this project just a little bit, in case you're new to Node.js or React.
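
If you need to generate the project yourself, a typical command (the folder name "frontend" is just an example) is:

npx create-react-app frontend
cd frontend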

The three commands that we need to be aware of, and that we're going to run several times as we develop this entire workflow, are npm run start, npm run test and npm run build. npm run start is a development-only command: it starts up a development server which is used to host our application and make it available inside our web browser.
It's very important to note that the server it creates is very much a development server.
It is not appropriate for use in production.
So when it comes to figuring out how we're going to take the Docker container and deploy it to the outside world, we'll definitely have to do a little bit of follow-up on that side of things.

npm run test: used to run any tests associated with the project. When we generate the React project it automatically creates a test for us, but our focus here is just running the tests, in the context of making sure that we only deploy our application if all the tests pass.
The last command that we need to be aware of for right now is
npm run build.
This command is used to take all the JavaScript files that are tied to the project and essentially concatenate them all down into one single file, which can then be served in a production environment.
Try npm run build inside the frontend directory: when we execute it, it says "Creating an optimized production build".
Again, this takes all of our different files and essentially compacts them down into a single file.
We can then list out all of our files and folders: you'll notice that there is now a build directory.

Inside the build directory we have these files, and inside build/static/js we have the JavaScript files.
This is the actual JavaScript that is our application. So at some point in time we're going to want to serve up this index.html file and JavaScript file from some AWS instance or AWS service.
The last command is npm run start.
Again, that starts up a development server.
Executing npm run start automatically opens a tab inside your browser at localhost:3000, and you'll see the default React application appear.

We're now going to start thinking about how to wrap up our application inside a Docker container that is appropriate specifically for development purposes.
Development Dockerfile

docker build -f Dockerfile.dev . (Dockerfile.dev is the development Dockerfile; -f specifies which file to use)
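
A minimal sketch of what Dockerfile.dev can look like (the /app path is an example):

# Dockerfile.dev
FROM node:alpine
WORKDIR /app

COPY package.json .
RUN npm install

# straight copy of the source for now; later we will lean on volumes instead
COPY . .

CMD ["npm", "run", "start"]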

Here we are seeing a huge build context being sent, because of node_modules. Why are we seeing that now, when we definitely did not see it previously?

Well, here's the issue: when we installed the create-react-app tool and used it to generate a new project, that tool automatically installed all of our dependencies into our project directory (about 155 MB worth). In the past we did not install any of our dependencies into our working folder; instead we relied upon our Docker image to install those dependencies when the image was initially created.
So at present we essentially have two copies of the dependencies, and we really do not need two.

The easiest solution here is to delete the node_modules folder inside of our working directory.

How to create a container out of an image


When we make these copies over here, we are essentially taking a snapshot of the contents of src and public.
It's a snapshot that is locked in time and by default is not going to be updated any time we make a change to this code.
So in order to pick up the changes that we're making to files inside src and public, we need to abandon this approach of doing a straight copy.
Rather than doing the straight copy, we're going to adjust the docker run command that we use to start up our running container.
We're going to be making use of a feature included with Docker called volumes.
With a Docker volume we essentially set up a placeholder of sorts inside of our Docker container. So we're no longer going to copy over the entire src directory or the entire public directory; instead we're going to put in a kind of reference.


The volume essentially sets up a reference that points back to our local machine and gives us access to the files and folders inside those folders on the local machine.
So a Docker volume can be thought of in a very similar fashion to the port mappings that we were setting up before: a port mapping maps a port inside the container to a port outside the container, while a Docker volume essentially sets up a mapping from a folder inside the container to a folder outside the container.

docker run -p 3000:3000 -v /app/node_modules -v $(pwd):/app <image_id>


$(pwd):/app means: map whatever is inside the present working directory to /app inside the container.
We got an error because we didn't map the node_modules folder, so the container was unable to find the node modules dependencies. The issue here is that when we set up that binding, or volume, we essentially said: take everything inside of our present working directory and map it up to the /app folder inside of our container.
The problem is that inside of our current directory we do not presently have a node_modules folder, since we already deleted it.
So when we try to take everything inside of here and map it into that folder, the node_modules folder inside the container, which is where all of our dependencies exist, essentially gets overwritten: there was a node_modules folder in the container, and when we set up that mapping it got hidden.

When we do not use the colon, we just list a folder inside the container, essentially saying: we want this to be a placeholder for the folder that is inside the container, don't try to map it up against anything.
And so, by setting up this volume, any changes that we make to our local file system get propagated into the running container; the React server inside that running container sees the change and updates the page.
If needed, we can still get away with leaving COPY . . in the Dockerfile.

We've now got some solid infrastructure in place to run our container in a development environment, set up via the Dockerfile.dev file.
We've also set up a docker-compose file that makes building the Docker container and starting it up a little bit easier.
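
A sketch of such a docker-compose.yml, assuming the paths used above (the service name is an example):

version: '3'
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - "3000:3000"
    volumes:
      # bookmark node_modules inside the container (no colon, no mapping)
      - /app/node_modules
      # map everything else in the project folder into /app
      - .:/app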
We're now going to shift our focus over to running the tests inside of our container.
We're going to first focus on just running our tests in our development environment, and then we're going to very quickly take that knowledge and apply it to running our tests over on Travis CI, which, remember, is a continuous integration service specifically made to run tests for your project.
The good news here is that executing npm run test inside of our container is going to be awfully straightforward.
All we really have to do is build our container using the Dockerfile.dev file that we've already put together.

Here we got our image ID. Remember, to run a specific command inside a container when it starts up (or rather, to override the existing default command), all we have to do is append the command we actually want to run onto the end of the docker run command.
So we can execute docker run, then the image ID, and then add on npm run test.
This is why we should use tags: the IDs are long numbers, and with a tag we can refer to the image by name instead of by ID.
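
A hedged example of those two steps (the tag name is just an example, used so we don't have to copy image IDs around):

# build from the dev Dockerfile and give the image a readable tag
docker build -f Dockerfile.dev -t example/docker-react .

# override the default command so the container runs the test suite instead
docker run -it example/docker-react npm run test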
In the src directory we have the App.test.js file.

Above is the test that was in the application; here we have pasted the same code again, which means we now have two tests.
But if we rerun the tests by hitting Enter, the output is still the same.
We've got a container that's been created specifically to run some tests. When we created that container, we essentially took a snapshot of all of our working files and folders and put that inside the container. This very temporary container that we've made just to run our tests does not have all that volume stuff set up; that is the issue.
And so without those volumes set up we are using old and outdated files inside of our container, and any changes we make to our test suite will not be reflected in there.
Attaching to the container started by docker-compose (which does have the volumes) with docker exec is definitely a solution that works.
However, it is not necessarily the best solution, because if you are developing this application it's going to require you to start up docker-compose, then get the ID of the running container and run that docker exec command, which is kind of hard to remember off the top of your head.
We will create a whole new service in our docker-compose file just to run our tests.
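
A sketch of that extra service, added under the services: key of the compose file and reusing the dev Dockerfile and volumes (the service name is an example):

  tests:
    build:
      context: .
      dockerfile: Dockerfile.dev
    volumes:
      - /app/node_modules
      - .:/app
    # override the default command so this service runs the tests
    command: ["npm", "run", "test"]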
Remember that npm run build builds a production version of the application.
Essentially it takes all the JavaScript files, processes them all together into a single file, and spits it out to a folder on your hard drive.

So we need to put something together here that's just going to take incoming requests and respond to them with those different files.
To solve this we're going to make use of a server called nginx, an extremely popular web server.
It doesn't have a lot of logic tied to it.
It's really just about taking incoming traffic and routing it, or responding to it with some static files, which is exactly what you and I are going to use it for. So we are going to create a separate Dockerfile that is going to create a production version of our web container.

The result of npm run build is the main.js bundle and the index.html file.

We're going to take the result of all that, just the build directory, and copy it over to our run phase.

Implementing Multistep Build


Starting from the nginx base image means it takes care of the startup command automatically, so we don't need a CMD.
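
A minimal sketch of the multi-step production Dockerfile (the stage name and paths are conventional examples):

# Build phase: install dependencies and produce the /app/build folder
FROM node:alpine as builder
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
RUN npm run build

# Run phase: nginx serves the static files; its base image already defines
# the start command, so no CMD is needed here
FROM nginx
COPY --from=builder /app/build /usr/share/nginx/html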
Here we get the ID of the built image.
