
3.4 Package Management
Every Linux system needs to add, remove, and update software. In the past
this meant downloading the source code, setting it up, compiling it, and
copying files onto each system that required updating. Thankfully, modern
distributions use packages, which are compressed files that bundle up an
application and its dependencies (or required files), greatly simplifying the
installation by making the right directories, copying the proper files into
them, and creating such needed items as symbolic links.
A package manager takes care of keeping track of which files belong to which
package and even downloading updates from repositories, typically a remote
server sharing out the appropriate updates for a distribution. In Linux, there
are many different software package management systems, but the two most
popular are those from Debian and Red Hat.
3.4.1 Debian Package Management
The Debian distribution, and its derivatives such as Ubuntu and Mint, use the
Debian package management system. At the heart of Debian package
management are software packages that are distributed as files ending in
the .deb extension.
The lowest-level tool for managing these files is the dpkg command. This
command can be tricky for novice Linux users, so the Advanced Package
Tool, apt-get (a front-end program to the dpkg tool), makes management of
packages easier. Additional front-ends to dpkg include the command line tool
aptitude and graphical tools such as Synaptic and Software Center.
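As a brief sketch, a typical install on a Debian-based system might look like the following (tmux is used here purely as an example package name):

# Refresh the package index from the configured repositories (requires root)
$ sudo apt-get update
# Install a package along with any dependencies it needs
$ sudo apt-get install tmux
# Ask the low-level dpkg database which files belong to the package
$ dpkg -L tmux
# Remove the package again
$ sudo apt-get remove tmux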
3.4.2 RPM Package Management
The Linux Standard Base, which is a Linux Foundation project, is designed
to specify (through a consensus) a set of standards that increase the
compatibility between conforming Linux systems. According to the Linux
Standard Base, the standard package management system is RPM.
RPM makes use of an .rpm file for each software package. This system is what
distributions derived from Red Hat, including CentOS and Fedora, use to
manage software. Several other distributions that are not Red Hat derived,
such as SUSE and openSUSE, also use RPM.
Like the Debian system, RPM Package Management systems track
dependencies between packages. Tracking dependencies ensures that when a
package is installed, the system also installs any packages needed by that
package to function correctly. Dependencies also ensure that software updates
and removals are performed properly.
The back-end tool most commonly used for RPM Package Management is
the rpm command. While the rpm command can install, update, query and
remove packages, the command line front-end tools such
as yum and up2date automate the process of resolving dependency issues.
There are also GUI-based front-end tools, such as Yumex and GNOME
PackageKit, that make RPM package management easier.
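As a comparable sketch on an RPM-based system (httpd is only an example package name):

# Query the RPM database to see whether a package is installed
$ rpm -q httpd
# List the files that belong to that package
$ rpm -ql httpd
# Install the package, letting yum resolve and download dependencies (requires root)
$ sudo yum install httpd
# Remove it again
$ sudo yum remove httpd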
Some RPM-based distributions have implemented the ZYpp (or libzypp)
package management style, most notably openSUSE and SUSE Linux Enterprise,
as well as the mobile distributions MeeGo, Tizen, and Sailfish.
The zypper command is the basis of the ZYpp method, and it features short
and long English commands to perform functions, such as zypper
in packagename which installs a package including any needed dependencies.
Most of the commands associated with package management require root
privileges. The rule of thumb is that if a command affects the state of a
package, administrative access is required. In other words, a regular user can
perform a query or a search, but to add, update or remove a package requires
the command to be executed as the root user.
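To illustrate the rule of thumb with zypper (apt-get and yum follow the same pattern, and the package name here is only an example):

# Searching only queries package information, so no special privileges are needed
$ zypper search vim
# Installing changes the state of the system, so it must run as root (here via sudo)
$ sudo zypper in vim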
3.5 Development Languages
It should come as no surprise that, as software built on contributions from
programmers, Linux has excellent support for software development. The
shells are built to be programmable, and there are powerful editors included
on every system. There are also many development tools available, and many
modern programming languages treat Linux as a first-class citizen.
Computer programming languages provide a way for a programmer to enter
instructions in a more human readable format, and for those instructions to
eventually become translated into something the computer understands.
Languages fall into one of two camps: interpreted or compiled. An interpreted
language translates the written code into computer code as the program runs,
and a compiled language is translated all at once.
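A quick sketch of the difference from the shell, assuming a C compiler and a Python interpreter are installed and that hello.c and hello.py are small example programs:

# Compiled: translate the whole program ahead of time, then run the resulting binary
$ gcc hello.c -o hello
$ ./hello
# Interpreted: the interpreter translates the code while the program runs
$ python3 hello.py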
Linux itself was written in a compiled language called C. The main benefit of
C is that the language itself maps closely to the generated machine code so
that a skilled programmer can write code that is small and efficient. When
computer memory was measured in kilobytes, this was very important. Even
with large memory sizes today, C is still helpful for writing code that must run
fast, such as an operating system.
C has been extended over the years. There is C++, which adds object support
to C (a different style of programming), and Objective-C, which took another
direction and is in heavy use in Apple products.
The Java language puts a different spin on the compiled approach. Instead of
compiling to machine code, Java first imagines a hypothetical CPU called
the Java Virtual Machine (JVM) and then compiles all the code to that. Each
host computer then runs JVM software to translate the JVM instructions
(called bytecode) into native instructions.
The additional translation with Java might make you think it would be slow.
However, the JVM is relatively simple so it can be implemented quickly and
reliably on anything from a powerful computer to a low power device that
connects to a television. A compiled Java file can also be run on any computer
implementing the JVM!
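For example, assuming a Java Development Kit is installed and Hello.java contains a small example class, compiling to bytecode and running it on the JVM looks like this:

# Compile the source into JVM bytecode (produces Hello.class)
$ javac Hello.java
# Any machine with a JVM can now run the same bytecode unchanged
$ java Hello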
Another benefit of compiling to an intermediate target is that the JVM can
provide services to the application that usually wouldn’t be available on a
CPU. Allocating memory to a program is a complex problem, but it’s built
into the JVM. As a result, JVM makers can focus their improvements on the
JVM as a whole, so any progress they make is instantly available to
applications.
Interpreted languages, on the other hand, are translated to machine code as
they execute. The extra computer power spent doing this can often be
recouped by the increased productivity the programmer gains by not having
to stop working to compile. Interpreted languages also tend to offer more
features than compiled languages, meaning that often less code is needed. The
language interpreter itself is usually written in another language such as C,
and sometimes even Java! In that case, an interpreted language is effectively
running on the JVM, and is ultimately translated at runtime into actual machine code.
JavaScript is a high-level interpreted programming language that is one of the
core technologies on the world wide web. It is similar to but fundamentally
different from Java, which is a completely object-oriented programming
language owned by Oracle. JavaScript is a cross-platform scripting language
for adding interactive elements to web pages and is in wide use across the
internet. By using JavaScript libraries, web programmers can add everything
from simple animations to complex server-side applications for internet users.
JavaScript is continuously evolving to meet the functionality and security
needs of internet users and is capable of being released under a GNU GPL
License.
Consider This
The term object-oriented refers to programming that abstracts complex actions
and processes so that the end user only deals with basic tasks. To visualize this
concept, think of a machine that performs a complex set of tasks by simply
pushing a button.
Perl is an interpreted language. Perl was originally developed to perform text
manipulation. Over the years, it gained favor with systems administrators and
continues to be improved and used in everything from automation to building
web applications.
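As a small example of that text-manipulation heritage, a Perl one-liner can rewrite a stream of text directly from the shell (notes.txt is just an example file):

# Print each line of the file, replacing every occurrence of "colour" with "color"
$ perl -pe 's/colour/color/g' notes.txt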
PHP is a language that was initially built to create dynamic web pages. A PHP
file is read by a web server such as Apache. Special tags in the file indicate
that parts of the code should be interpreted as instructions. The web server
pulls all the different parts of the file together and sends it to the web browser.
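A minimal sketch of those special tags, run here with PHP's command line interpreter for illustration rather than through a web server (hello.php is just an example file name):

$ cat hello.php
<p>Plain HTML outside the tags is passed through unchanged.</p>
<?php echo "This part between the tags is interpreted as PHP code.\n"; ?>
$ php hello.php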
PHP’s main advantages are that it is easy to learn and available on almost any
system. Because of this, many popular projects are built on PHP. Notable
examples include WordPress (for blogging), Cacti (for monitoring), and even
parts of Facebook.
Ruby is another language that was influenced by Perl and Shell, along with
many other languages. It makes complex programming tasks relatively easy,
and with the inclusion of the Ruby on Rails framework, is a popular choice for
building complex web applications. Ruby is also the language that powers
many of the leading automation tools like Chef and Puppet, which make
managing a large number of Linux systems much simpler.
Python is another scripting language that is in general use. Much like Ruby it
makes complex tasks easier and has a framework called Django that makes
building web applications very easy. Python has excellent statistical
processing abilities and is a favorite in academia.
A computer programming language is just a tool that makes it easier to tell
the computer what you want it to do. A library bundles common tasks into a
distinct package that can be used by the developer. ImageMagick is one such
library that lets programmers manipulate images in code. ImageMagick also
ships with some command line tools that enable programmers to process
images from a shell and take advantage of the scripting capabilities there.
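For instance, ImageMagick's convert tool (invoked as magick on newer releases) can resize an image straight from the shell; the file names here are only examples:

# Create a half-size copy of photo.jpg
$ convert photo.jpg -resize 50% photo-small.jpg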
OpenSSL is a cryptographic library that is used in everything from web
servers to the command line. It provides a standard interface for adding
cryptography into a Perl script, for example.
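The same library also backs the openssl command line tool, which can, for example, hash or encrypt a file from the shell (report.txt is just an example file):

# Compute a SHA-256 digest of the file
$ openssl dgst -sha256 report.txt
# Encrypt the file with AES-256, prompting for a passphrase
$ openssl enc -aes-256-cbc -salt -in report.txt -out report.txt.enc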
At a much lower level is the C library. The C library provides a basic set of
functions for reading and writing to files and displays, and is used by
applications and other languages alike.
3.6 Security
Administrators and computer users are increasingly aware of privacy
concerns in both their personal and professional lives. High-profile data
breaches have been in the news all too often recently, and the cost of these
break-ins can reach into the millions of dollars for the institutions that fall
victim to hackers and ransomware attacks. Many times the cause of these
breaches is simply human error such as opening a suspicious email or
entering passwords into a phony login page.
Cookies are the primary mechanism that websites use to track you. Sometimes
this tracking is good, such as to keep track of what is in your shopping cart or
to keep you logged in when you return to the site.
As you browse the web, a web server can send back the cookie, which is a
small piece of text, along with the web page. Your browser stores this
information and sends it back with every request to the same site. Cookies are
normally only sent back to the site they originated from, so a cookie from
example.com wouldn’t be sent to example.org.
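You can observe this mechanism with a command line client such as curl, which shows any Set-Cookie header the server sends along with the page (the address is only a placeholder):

# Fetch just the response headers and look for cookies being set
$ curl -sI https://www.example.com/ | grep -i set-cookie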
However, many sites have embedded scripts that come from third parties,
such as a banner advertisement or Google analytics pixel. If both
example.com and example.org have a tracking pixel, such as one from an
advertiser, then that same cookie will be sent when browsing both sites. The
advertiser then knows that you have visited both example.com and
example.org.
With a broad enough reach, such as placement on social network sites with
“Like” buttons and such, a website can gain an understanding of which
websites you frequent and figure out your interests and demographics.
There are various strategies for dealing with this. One is to ignore it. The
other is to limit the tracking pixels you accept, either by blocking them
entirely or clearing them out periodically.
Browsers typically offer cookie-related settings; users can opt to have the
browser tell the site not to track. This voluntary tag is sent in the request, and
some sites will honor it. The browser can also be set never to remember third-
party cookies and remove regular cookies (such as from the site you are
browsing) after being closed.
Tweaking privacy settings can make you more anonymous on the Internet,
but it can also cause problems with some sites that depend on third-party
cookies. If this happens, you might have to explicitly permit some cookies to
be saved.

Browsers also offer a private or incognito mode where cookies and tracking
pixels are deleted upon exiting the window. This mode can be helpful if you
would like to search for something without letting other websites know what
you are looking for.
3.6.1 Password Issues
Good password management is essential to security in any computing
environment. The Linux systems administrator is often the person responsible
for setting and enforcing password policies for users at all levels. The most
privileged user on any Linux system is root; this account is the
primary administrator and is created when the operating system is installed.
Often administrators will disable root access as the first line of defense against
intrusion since computer hackers will try to gain root access in order to take
control of the system.
There are many levels of access and various means of password management
on a Linux system. When users are created, they are given different login
permissions depending on what groups they are assigned to. For example,
administrators can create and manage users while regular users cannot.
Services that run on systems such as databases can also have login
permissions with their own passwords and privileges. Additionally, there are
specific passwords for accessing systems remotely through SSH, FTP, or other
management programs.
Managing all these accounts and their accompanying passwords is a
complicated and necessary part of the systems administrator role. Passwords
need to be complex enough not to be easily guessed by hackers, yet easy to
remember for users. Increasingly users and administrators are turning
to password manager programs to store login credentials in encrypted form.
Another trend is two-factor authentication (2FA), a technique where a
password is supplemented by a second “factor,” often a passcode sent to the
user's phone or other devices. Keeping up with current security trends, while
ensuring authorized users' ease of access, is an ongoing challenge that must be
met.
3.6.2 Protecting Yourself
As you browse the web, you leave a digital footprint. Much of this information
goes ignored; some of it is gathered to collect statistics for advertising, and
some can be used for malicious purposes.
The easiest thing you can do is to use a good, unique password everywhere
you go, especially on your local machine. A good password is at least 10
characters long and contains a mixture of numbers, letters (both upper and
lower case) and special symbols. Use a password manager like KeePassX to
generate passwords, and then you only need to have a login password to your
machine and a password to open up your KeePassX file.
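If you prefer working from the shell, one hedged alternative for generating a strong random password is the openssl tool:

# Produce 16 random bytes and print them as a base64 string suitable for a password
$ openssl rand -base64 16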
Also, limit the information you give to sites to only what is needed. While
giving your mother’s maiden name and birthdate might help unlock your
social network login if you lose your password, the same information can be
used to impersonate you to your bank.
After that, make a point of checking for updates periodically. The system can
be configured to check for updates on a regular basis. If there are security-
related updates, you may be prompted immediately to install them.
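What that looks like depends on the distribution; as a sketch:

# On a Debian-based system, refresh the package index and apply pending updates
$ sudo apt-get update && sudo apt-get upgrade
# On an RPM-based system using yum
$ sudo yum update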

Finally, you should protect your computer from accepting incoming
connections. A firewall is a device that filters network traffic, and Linux has
one built in. If you are using Ubuntu, then Gufw is a graphical interface
to Ubuntu's Uncomplicated Firewall (UFW).

Under the hood, you are using iptables, which is the built-in firewall system.
Instead of entering complicated iptables commands, you use a GUI. While this
GUI lets you build an effective policy for a desktop, it barely scratches the
surface of what iptables can do.
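As a hedged illustration of what the underlying commands look like on an Ubuntu system (UFW wraps iptables, and these all require administrative access):

# Turn the firewall on and allow incoming SSH connections
$ sudo ufw enable
$ sudo ufw allow ssh
# Review the resulting policy
$ sudo ufw status verbose
# The equivalent low-level view through iptables itself
$ sudo iptables -L -n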
3.6.3 Privacy Tools
The use of modern privacy tools, both at the server and user level, can help
prevent system intrusions and unauthorized access to data.
The good news is that Linux is by default one of the most secure operating
systems ever created. Many of the exploits that plague other operating
systems simply won’t work on Linux due to the underlying architecture.
However, there are still many known weaknesses that hackers can take
advantage of so the proactive systems administrator is wise to deploy privacy
tools that protect their users as well as the systems they use.
Encryption is probably the best-known and most widely-deployed privacy tool
in use today. Administrators deploy encryption with authentication keys on
almost every system that communicates with the outside world. One well-
known example is the HyperText Transfer Protocol Secure (HTTPS) standard
used on web servers to ensure that data transmitted between users and online
resources cannot be intercepted as it travels on the open Internet.
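One way to see that encryption in action is to inspect the certificate a web server presents, using the openssl tool mentioned earlier (www.example.com is only a placeholder host):

# Open a TLS connection and print the certificate chain the server offers
$ openssl s_client -connect www.example.com:443 </dev/null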
Virtual private networks (VPN) have been in use by companies to connect their
remote servers and employees for many years. Now they are gaining
popularity amongst ordinary users looking to protect their privacy online.
They work by creating an encrypted channel of communication between two
systems, so the data transmitted between them is scrambled by an algorithm
only the systems know.
The Tor project has long been involved in creating privacy tools like its Tor
Browser, which works by relaying internet requests through a network of
servers, preventing websites and others from learning the identity of the
person making the request.
These tools are constantly evolving and choosing which ones are appropriate
for the users and systems involved is an essential part of the systems
administrator's role.

3.7 The Cloud


No doubt you’ve heard of the cloud. Whether you’re using Google Docs for
your homework or storing music and photos on iCloud, you probably have at
least some of your digital content hosted on a cloud server somewhere.
Cloud computing has revolutionized the way we access technology. As Internet
connectivity and speeds have increased, it’s become easier to move computing
resources to remote locations where content can be accessed, manipulated and
shared around the globe. Organizations are increasingly looking at the cloud
as essential to their businesses and operations. The migration of an
organization's IT applications and processes to cloud services, known as cloud
adoption, is rapidly becoming a strategic business decision for many. With
cloud adoption rising significantly all over the globe, cloud computing is not
the catchphrase that it once was. Cloud computing is seen as one of the major
disruptive technologies of the coming decade which will significantly
transform businesses, economies, and lives globally.
Physically, a cloud can be described as computing resources from one or
many off-site data centers which can be accessed over the internet. The cloud
builds on the benefits of a data center and provides computing solutions to
organizations who need to store and process data, and it allows them to
delegate management of IT infrastructure to a third party. The resources that
organizations place in the cloud can include data, servers, storage,
application hosting, analytics, and a myriad of other services.
A cloud deployment model provides a basis for how cloud infrastructure is
built, managed, and accessed. There are four primary cloud deployment
models:
 Public Cloud: A public cloud is a cloud infrastructure deployed by a
provider to offer cloud services to the general public and organizations
over the Internet. In the public cloud model, there may be multiple
tenants (consumers) who share common cloud resources. More than
likely, many of us have accessed public cloud resources at some point
through providers such as Amazon, Google, and other popular public
cloud providers.
 Private Cloud: A private cloud is a cloud infrastructure that is set up
for the sole use of a particular organization. When compared to a public
cloud, a private cloud offers organizations a greater degree of privacy,
and control over the cloud infrastructure, applications, and data. It can
be hosted either on servers managed by the company that is using it or
through a managed private cloud provider such as Rackspace or IBM.
 Community Cloud: A community cloud is a cloud infrastructure that is
set up for the sole use by a group of organizations with common goals or
requirements. The organizations participating in the community
typically share the cost of the community cloud service. This option may
be more expensive than the public cloud; however, it may offer a higher
level of control and protection against external threats than a public
cloud.
 Hybrid Cloud: A hybrid cloud is composed of two or more individual
clouds, each of which can be a private, community, or public cloud. A
hybrid cloud may change over time as component clouds join and leave.
The use of such technology enables data and application portability. It
also allows companies to leverage outside resources while retaining
control of sensitive resources.
3.7.1 Linux in the Cloud
Linux plays a pivotal role in cloud computing. It powers 90% of the public
cloud workload, most virtual servers are based on some version of the Linux
kernel, and Linux is often used to host the applications behind cloud
computing services. So what makes Linux uniquely suited to enabling cloud
computing?
Flexibility
Cloud computing provides the capability to provision IT resources quickly
and at any time. This agility enables rapid development and experimentation
that, in turn, facilitates innovation which is essential for research and
development, the discovery of new markets and revenue opportunities,
creating new customer segments, and the development of new products.
As a result, cloud computing must compensate for the fact that each
organization has a unique, evolving set of resource requirements.
Linux stands out here because it is highly adaptable. For starters, Linux is
modular by design, and at the center of an enormous ecosystem of open
source applications providing endless configuration options to suit various
systems and use cases. On top of that, Linux scales efficiently, allowing it to
run anything from a tiny remote sensor to an entire server farm.
Accessibility
In a traditional environment, IT resources are accessed from dedicated
devices, such as a desktop or a laptop. In cloud computing, applications and
data reside centrally and are accessed from anywhere over a network from
any device, such as desktop, mobile, or thin client, and there is a version of
Linux for every single one of these devices.
Cost-Effective
Cloud computing is attractive as it has the potential for consumers to reduce
their IT costs. In cloud computing, consumers can unilaterally and
automatically scale IT resources to meet workload demand, thereby
eliminating overhead from underutilized resources. Additionally, the expenses
associated with IT configuration, management, floor space, power, and
cooling are reduced.
Cloud providers absorb these infrastructure costs but must remain a low-cost
alternative. Choosing Linux is one of the most cost-effective solutions
providers can deploy. Linux is one of the most power efficient operating
systems, and the Linux kernel is completely free, as are many associated
applications, utilities, and additional software components.
Enterprise and government organizations can opt to pay for commercially-
supported distributions, which are still more cost-effective when compared to
licensed competitors. Non-commercial distributions that support cloud
computing also are a viable option for many organizations.
Not only can vendors pass these savings on to customers, but offering Linux-
based solutions can also be cheaper for the client to implement. Setting up Linux
on their own systems eliminates expensive user licensing fees potentially
associated with competing operating systems.
Manageability
While Linux began as a niche operating system, its widespread presence in the
IT industry has made Linux use and administration a necessary skill for IT
professionals. It is becoming increasingly easy for cloud vendors and
consumers to acquire the necessary talent, or reallocate existing team
members.
The nature of Linux, built on the C programming language, also lends itself to
automated management tools. A significant portion of Linux servers
operating in the cloud are created and managed by automated management
programs rather than human operators. This process frees up administrators
to monitor computing operations rather than manually configuring and
updating systems.
Security
When using a cloud solution, especially a public cloud, an organization may
have concerns related to privacy, external threats, and lack of control over the
IT resources and data.
Linux can help offset these issues because it is one of the most secure and
reliable operating systems available. Linux is open source, meaning its source
code is available for anyone to obtain, review, and modify. This also means the
code can be inspected for vulnerabilities and compatibility issues, resulting in
an extensive community effort to rectify these issues and uphold the robust
reputation of Linux.
Virtualization
Virtualization is one of the most significant advancements that has contributed
to the enablement of cloud computing.
Linux is a multi-user operating system, which means that many different users
can work on the same system simultaneously and for the most part can’t do
things to harm other users. However, this does have limitations – users can
hog disk space or take up too much memory or CPU resources and make the
system slow for everyone. Sharing the system in multi-user mode also requires
that everyone run as unprivileged users, so letting each user run their own
web server, for example, is challenging.
Virtualization is the process where one physical computer, called the host,
runs multiple copies of an operating system, each copy called a guest. These
guest images can be pre-configured for specific functions to allow rapid
deployment, often automatically, when needed. The host system runs software
called a hypervisor that switches resources between the various guests just like
the Linux kernel does for individual processes. With bare metal hypervisors,
the hypervisor runs directly on computer hardware rather than on top of an
OS, freeing up more resources for guest images.
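As a sketch, on a Linux host using the KVM hypervisor with the libvirt tools installed (an assumption, since other hypervisors ship their own tooling):

# Count CPU flags indicating hardware virtualization support (vmx for Intel, svm for AMD)
$ grep -Ec 'vmx|svm' /proc/cpuinfo
# List the guest virtual machines known to libvirt, whether running or not
$ virsh list --all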
Virtualization works because servers spend most of their time idling and don't
need physical resources such as a monitor and keyboard. With software such as
VMware and VirtualBox, administrators can take a powerful CPU, use it to run
multiple virtual machines, and thereby optimize the usage of physical
resources and dramatically reduce costs over the previous one-machine, one-OS
data center model. The main limitation is usually memory; however, with
advances in hypervisor technology and CPUs, it is possible to put more virtual
machines on one host than ever.
In a virtualized environment one host can run dozens of guest operating
systems, and with support from the CPU itself, the guests don’t even know
they are running on a virtual machine. Each guest gets its own virtual
resources and communicates with the network on its own. It is not even
necessary to run the same operating system on all the guests, which further
reduces the number of physical servers needed.
Virtualization offers a way for an enterprise to lower power usage and reduce
data center space over an equivalent fleet of physical servers. Guests are now
just software configurations, so it is easy to spin up a new machine for testing
and destroy it when its usefulness has passed.
Since it is possible to run multiple instances of an operating system on one
physical machine and connect to it over the network, the location of the
machine doesn’t matter. Cloud computing takes this approach and allows
administrators to have virtual machines in a remote data center owned by
another company, and only pay for the resources used. Cloud computing
vendors can take advantage of economies of scale to offer computing resources
at far lower prices than operating an on-site data center.
Containers and Bare Metal Deployments
With the rise of containerization technologies
like Docker and Kubernetes, application software is now being written that
runs in a serverless environment. Essentially, programmers are creating
software that does one single function of a system (like database processing or
storage) that runs in a container. These containers are organized in pods that
run within a node and can talk with each other, and the outside world if
needed. Nodes, in turn, are organized and controlled by a master node that
provides services to each component within the structure. Building
applications in this way decouples each of the components from the others,
and from the overhead of running an OS. Since each piece of the puzzle can
be automatically destroyed and recreated by the master node they no longer
need to be as robust as software that runs on top of an OS. Although these
new programming architectures are in many ways bypassing the need for a
traditional OS the underlying technology that makes them work is still Linux.
So, working in Linux will increasingly be working within a development team
that draws on the disciplines of programming, database design, networking,
and systems administration to create the systems of the future.
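As a brief illustration, assuming Docker is installed and, for the second command, that kubectl is configured to talk to a Kubernetes cluster:

# Run one containerized web server, mapping host port 8080 to port 80 in the container
$ docker run --rm -d -p 8080:80 nginx
# Ask the Kubernetes cluster which pods are currently running
$ kubectl get pods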
