
LOVELY PROFESSIONAL UNIVERSITY

SYNOPSIS
SUBJECT:
LINUX PROGRAMMING

TOPIC:
VARIOUS FILTERS AVAILABLE IN LINUX. BENEFITS OF PIPES IN LINUX.

SUBMITTED TO:
MS.

SUBMITTED BY:
YOGENDRA SINGH
ROLL_NO: C1912A26
REG_NO: 10905201
B.TECH (CSE)
Review of the filters available in Linux and the benefits of pipes:

FILTER IN LINUX:
A filter is a program that performs one step in processing a
data stream. Every filter has a data input and a data output. At
each processing step the input data are transformed, or
modified: while transforming the data, it is possible to extract,
delete, add, or substitute data within the stream. The kind of
transformation is determined by the filter itself.
In other words, filters are stream modifiers, which
process incoming data in some specialized way and send the
modified data stream out over a pipe to another filter.

One of the most common uses of filters is to restructure
output. We will discuss a couple of the most important filters.
Some useful ones are the commands awk, grep, sed, spell, and
wc.
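
As a small illustration of how such filters fit together (the file
/etc/passwd is only an example input; any text stream would do),
grep can select the lines of interest and wc can count them:

# select the lines that mention "bash", then count them
grep bash /etc/passwd | wc -l

Here grep reads the stream, keeps only the matching lines and passes
them on; wc -l simply counts whatever lines reach it.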

SOME USEFUL FILTERS:


There are many Linux commands that are filters, in addition to
awk, grep, and sort. Two filters to consider are tr (translate) and
sed (stream editor). Both commands allow us to modify the
stream: tr for simple changes and sed for the more complex.

For example, we can use tr '[a-z]' '[A-Z]' to convert everything to
uppercase, or sed 's/\*//g' to remove the stars that mark the names of
executable files.
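
A minimal sketch combining both filters in one pipeline (assuming a
directory listing produced by ls -F, where a trailing star marks an
executable):

# list the current directory, strip the executable markers,
# then print the names in uppercase
ls -F | sed 's/\*$//' | tr '[a-z]' '[A-Z]'

Each command in the chain is a filter: sed edits the stream, and tr
translates the characters that sed passes on.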

Another filter to consider is tee, which enables you to split a
stream between stdout and a file. For example:

ls -l | tee file.lst | wc -l

This will create a file (file.lst) containing the result of ls -l
and will display the number of files to the screen (or pass it on
to another filter, if you require).

PIPES:

A pipe is a form of redirection that is used in Linux and other Unix-like
operating systems to send the output of one program to another program
for further processing.

Redirection is the transferring of standard output to some other
destination, such as another program, a file or a printer, instead of the
display monitor, which is its default destination. Standard output,
sometimes abbreviated stdout, is the destination of the output from
command-line programs in Unix-like operating systems.
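
To make the distinction concrete (the file name listing.txt is just an
illustrative choice):

# redirection: send stdout to a file instead of the screen
ls -l > listing.txt

# pipe: send stdout to another program instead of the screen
ls -l | less

In the first line the output is captured in a file; in the second it
becomes the input of the pager less.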

Basically, pipes are used to create what can be visualized as a pipeline
of commands, which is a temporary direct connection between two or
more simple programs. This connection makes possible the
performance of some highly specialized task that none of the
constituent programs could perform by itself. A command is
merely an instruction provided by a user telling a computer to do
something, such as launch a program. The command-line programs
that do the further processing are referred to as filters.
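
A sketch of such a pipeline (report.txt is a hypothetical input file),
in which several small programs cooperate to produce a word-frequency
list that none of them could produce alone:

# split the text into one word per line, sort the words,
# collapse duplicates with a count, then show the most frequent
tr -s ' ' '\n' < report.txt | sort | uniq -c | sort -rn | head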

This direct connection between programs allows them to operate
simultaneously and permits data to be transferred between them
continuously, rather than having to pass it through temporary text files
or through the display screen and having to wait for one program to
complete before the next program begins.
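
A small example of this concurrent, continuous flow (the log file path
/var/log/syslog is only an assumed example; any growing file works):

# both programs run at the same time: as tail emits new lines,
# grep filters them immediately, with no intermediate file
tail -f /var/log/syslog | grep -i error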

PIPES AND FILTERS FEATURES:

 Incremental delivery: data is output as work is conducted
 Filters work independently of one another, and are therefore
plug-and-play
 Filters are ignorant of the other filters in the pipeline; there are
no filter-to-filter interdependencies
 Multiple readers and writers are possible
 Very good at supporting producer-consumer mechanisms (see the
sketch after this list)
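
A minimal producer-consumer sketch using a named pipe (the pipe name
/tmp/demo.fifo is an illustrative assumption):

# create a named pipe, start a consumer that counts lines,
# then let a producer write into the same pipe
mkfifo /tmp/demo.fifo
wc -l < /tmp/demo.fifo &         # consumer: reads from the pipe
seq 1 100 > /tmp/demo.fifo       # producer: writes into the pipe
rm /tmp/demo.fifo

The consumer blocks until the producer starts writing; the two then run
side by side, exchanging data through the pipe alone.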

BENEFITS:

 Fairly simple to understand and implement
 Filters are substitutable black boxes that can be plugged and
played, and thus reused in creative ways (an example of such a
substitution follows this list)
 Filters are highly modifiable, since there is no coupling between
filters, and new filters can be created and added to an existing
pipeline.
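
For instance (names.txt is a hypothetical input file), any stage of a
pipeline can be swapped for another filter without touching the rest:

# original pipeline: case-sensitive search, then count matches
grep smith names.txt | wc -l

# substitute a case-insensitive filter for the first stage;
# wc neither knows nor cares which filter feeds it
grep -i smith names.txt | wc -l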

LIMITATIONS:

 A batch-processing metaphor is not inherently limiting, but this
pattern does not facilitate highly dynamic responses to system
interaction.
 Because filters are black boxes and are ignorant of one another,
they can't intelligently re-order themselves dynamically.
 Once a pipeline is in progress, it can't be altered without
corrupting the stream.
 It is difficult to configure dynamic pipelines, where data is routed
to one filter or another depending on its contents.
