Queues
Table of Contents
3: Queue
3.1: Introduction
3.2: Operations on Queue
3.3: Software Queue
3.3.1: Implementation of queue
3.3.1.1: Linear Representation
3.3.1.2: Queue Representation using two stacks
3.3.1.3: Linked representation
3.3.1.4: Comparing two implementations of Queue representation
3.3.2: Generic queue
3.4: Circular Queue
3.5: Deque
3.6: Priority Queue
3.7: Queue: Applications
3.7.1: Queue as a Scheduling Queue
3.7.2: Queue as a Buffering Queue
3.7.3: Queue: Search Space Exploration
3.7.4: Circular Queue in Real Life
3.7.5: Applications of Priority Queue
3.7.6: Message Queue
3.7.7: The M/M/1 Queue
Summary
Exercises
Glossary
References
3 Queue
As discussed in the previous section, a stack is an abstract data structure with the special
property that elements are always added to and removed from the top. In many situations,
however, we need to access data elements in a different manner. A queue is another
abstract data structure, in which elements are accessed in a different order.
3.1 Introduction
A queue is a linear data structure in which data is added at one end and retrieved
from the other. Just like a queue in the real world, the data that goes first into the
queue is the first to be retrieved; for this reason queues are also called First-In-First-Out
(FIFO) lists.
We use this data structure in many real-life situations. For example, the line of
students at a college fee counter is a queue, and so is the line of commuters waiting at a
bus stop for a bus. There are several other situations where a queue is used so that addition
takes place only at the rear and deletion only at the front.
Fig 3.1: A bus queue (Source: haldolongwidget.files.wordpress.com/2008/12/bus-queue.jpg; ortiznellengraceit123.blogspot.com/)
Fig 3.2: Building a queue (Source: http://www.lean-academy.com/web/Lean_Ausbildung/Lean_Expert/107/120/127/67.html)
In a stack, the data added last is the first to be retrieved, while in a queue the data
added first is the first to be retrieved. Two terms are used in the representation of a
queue:
1. Rear: a variable indicating the end of the queue where new data is added.
2. Front: a variable indicating the end from which data is retrieved.
3.2 Operations on Queue
Create: This operation creates an initial empty queue, with the two pointers front and rear
set to their initial values.
Destroy: This operation destroys the queue by deleting all elements from it.
Enqueue: This operation adds an element at the end of the queue known as the rear, and
updates the rear pointer to reflect the change. If the queue is empty, adding an element
moves both the front and rear pointers.
Dequeue: This operation deletes an element from the end known as the front, and updates
the front pointer to reflect the deletion. If the queue is empty, no element is deleted and
both pointers keep their initial values. If the queue has one element, deleting it returns
front and rear to the same initial value. Dequeue retrieves and removes the front element
of the queue.
Peep: This operation retrieves the front element of the queue without removing it, unlike
the Dequeue operation.
IsFull: This operation checks whether the queue is full by examining the rear and front
pointers.
IsEmpty: This operation checks whether the queue is empty by examining the rear and
front pointers.
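For reference, these operations can be summarised as a C++ class interface. This is only an illustrative sketch: the class name QueueADT, the int element type and the capacity MAX are assumptions made here, and the array-based data members anticipate the linear representation discussed later.
const int MAX = 10;          // assumed capacity for this sketch

class QueueADT
{
    int arr[MAX];            // storage for the elements
    int front, rear;         // the two ends of the queue
public:
    QueueADT();              // Create: set front and rear to initial values
    ~QueueADT();             // Destroy
    void enqueue(int item);  // add item at the rear
    int dequeue();           // remove and return the front element
    int peep() const;        // return the front element without removing it
    bool isFull() const;
    bool isEmpty() const;
};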
Source :Wikipedia.org
Source: http://www.javacoffeebreak.com/books/extracts/javanotesv3/c11/queue.gif
3.3 Software Queue
3.3.1 Implementation of Queue
3.3.1.1 Linear Representation
The process of adding and retrieving data in the queue changes the positions of the pointers
Front and Rear. Let us discuss one example to illustrate this.
Suppose we have a queue represented by an array queue[10], which is empty to start
with. The values of the front and rear variables after each action are mentioned in braces,
using the convention of the code below, where an empty queue has front = -1 and rear = 0:
enqueue('a')   {front = 0, rear = 1}
enqueue('b')   {front = 0, rear = 2}
In this way a queue, like a stack, can grow and shrink over time.
Source: http://codefords.wordpress.com
Procedure Dequeue(Q, front, rear, item): This procedure removes an element from the
front end of the queue Q and returns it through item.
STEP 1 : [Check for underflow on the queue]
If (front = rear)
then write('Queue empty')
Return
STEP 2 : [Hold the former front element of the queue in item]
item <-- Q[front]
STEP 3 : [Increment the front pointer or index by 1]
front <-- front + 1
STEP 4 : [Finished - return the retrieved item from the queue]
Return(item)
The underflow condition is checked in the first step of the algorithm. If such a
condition exists, the deletion cannot be performed and an appropriate error
message results.
Procedure Display(Q, front) : This procedure displays the contents of the queue i.e.,
vector Q.
STEP 1 : [check for empty on queue]
if (front=rear)
then write('queue empty')
Return
STEP 2 : [Repeat through STEP 3 for i = front to rear-1 in steps of 1]
STEP 3 : [Display the queue content]
write (Q[i])
STEP 4 : [Finished]
Return
The first step of this algorithm checks for the empty condition. If such a condition exists,
the contents of the queue cannot be displayed and an appropriate error message
results.
Now have a look at the following example program that illustrates all this in C++:
#include <iostream>
using namespace std;
const int MAX = 10;       // capacity of the array
// queue class
class queue
{
    int arr[MAX];
    int front, rear;
public:
    queue();
    void enqueue(int);
    int dequeue(void);
};
// queue class ends
// member functions
queue::queue()            //constructor
{
    // initialize index variables
    front = -1;
    rear = 0;
}
void queue::enqueue(int data)   // add an element at the rear
{
    if (rear == MAX - 1)
    {
        cout << "QUEUE FULL!";
        return;
    }
    arr[rear] = data;
    // increase index variable
    rear++;
    if (front == -1)
        front = 0;
}
int queue::dequeue()            // remove and return the front element
{
    int data;
    if (front == -1)
    {
        cout << "QUEUE EMPTY!";
        return 0;
    }
    data = arr[front];
    arr[front] = 0;
    front++;                    // move the front index past the removed element
    if (front == rear)          // queue became empty: reset the indexes
    {
        front = -1;
        rear = 0;
    }
    return data;
}
// member functions end
int main()
{
    queue obj;
    int ch = 0, num;
    while (ch != 3)
    {
        cout << "1> ADD";
        cout << "\n2> RETRIEVE";
        cout << "\n3> QUIT\n";
        cin >> ch;
        switch (ch)
        {
        case 1:
            cout << "enter element:";
            cin >> num;
            obj.enqueue(num);
            break;
        case 2:
            cout << "\n\nRetrieved: ";
            cout << obj.dequeue();
            cout << "\n\n";
            break;
        }
    }
    return 0;
}
Another problem with the linear queue is that if the last position of the queue is occupied,
it is not possible to enqueue any more elements, even though some positions towards the
front of the queue are vacant.
This limitation can be overcome by moving the elements forward, so that the first element
of the queue goes to the position with index 0 and the rest of the elements move
accordingly; finally, the front and rear variables are adjusted appropriately. The modified
versions of enqueue and dequeue that take care of this shifting are:
Dequeue()
int queue::dequeue()
{
    int data;
    if (front == -1)
    {
        cout << "QUEUE EMPTY!";
        return 0;
    }
    else
    {
        data = arr[front];
        arr[front] = 0;
        if (front == rear - 1)
        {   // the only element was removed: reset to the empty state
            front = -1;
            rear = 0;
        }
        else
        {
            front++;
            // this loop shifts the data from the later queue positions to the
            // beginning to make space for more data at the end
            int k = 0;
            for (int i = front; i < rear; i++)
            {
                arr[k] = arr[i];
                k++;
            }
            front = 0;
            rear = k;
        }
    }
    return data;
}
Enqueue()
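The body of the modified Enqueue is not reproduced in the source. Assuming that all shifting is done in dequeue (as above), so that front is 0 whenever the queue is non-empty, a minimal sketch is essentially the original enqueue:
void queue::enqueue(int data)
{
    if (rear == MAX - 1)     // no free slot even after shifting in dequeue
    {
        cout << "QUEUE FULL!";
        return;
    }
    arr[rear] = data;
    rear++;
    if (front == -1)         // first element of an empty queue
        front = 0;
}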
3.3.1.2 Queue Representation Using Two Stacks
We can implement a FIFO queue using two stacks; let us call them Instack and Outstack.
An element is inserted in the queue by pushing it onto Instack. An element is extracted
from the queue by popping it from Outstack. If Outstack is empty, all elements currently
in Instack are first transferred to Outstack, which reverses their order.
For example, enqueue 1 and then 2: both are pushed onto Instack, so Instack holds 1, 2
(with 2 on top). To dequeue, Outstack is empty, so 2 and then 1 are popped from Instack
and pushed onto Outstack, which now holds 2, 1 (with 1 on top). Popping Outstack returns
1 and then 2, so the output is 1 2, the same order in which the elements were inserted.
Source: http://keithblue1984.spaces.live.com/blog/
Fig 3.5: Implementation of queue using two stacks
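As an illustration, a queue built from two standard-library stacks could be sketched as follows; the class name TwoStackQueue and the int element type are chosen here only for illustration.
#include <stack>

class TwoStackQueue
{
    std::stack<int> instack;    // receives enqueued elements
    std::stack<int> outstack;   // serves dequeued elements
public:
    void enqueue(int x)
    {
        instack.push(x);
    }
    int dequeue()               // assumes the queue is not empty
    {
        if (outstack.empty())
        {
            // the transfer reverses the order, turning LIFO into FIFO
            while (!instack.empty())
            {
                outstack.push(instack.top());
                instack.pop();
            }
        }
        int x = outstack.top();
        outstack.pop();
        return x;
    }
};
Each element is pushed and popped at most twice, so a sequence of n queue operations takes O(n) time in total.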
Given two queues with their standard operations (enqueue, dequeue, isempty, size), we
can also implement a stack with its standard operations (pop, push, isempty, size). There
are two versions of the solution.
Version A:
push:
    enqueue in queue1
pop:
    while the size of queue1 is bigger than 1, pipe dequeued items from queue1 into queue2
    dequeue and return the last item of queue1, then switch the names of queue1 and queue2
Version B:
push:
    enqueue in queue2
    enqueue all items of queue1 in queue2, then switch the names of queue1 and queue2
pop:
    dequeue from queue1
The following diagram shows the working according to Version B.
Step 0: initial empty stack and queues
Step 0:
"Stack"
+---+---+---+---+---+
| | | | | |
+---+---+---+---+---+
Queue A Queue B
+---+---+---+---+---+ +---+---+---+---+---+
| | | | | | | | | | | |
+---+---+---+---+---+ +---+---+---+---+---+
Step 1: push(1): enqueue 1 in queue B, enqueue all elements of queue A into queue B (there are none yet), and swap the names of the queues.
Step 1:
"Stack"
+---+---+---+---+---+
| 1 | | | | |
+---+---+---+---+---+
Queue A Queue B
+---+---+---+---+---+ +---+---+---+---+---+
| 1 | | | | | | | | | | |
+---+---+---+---+---+ +---+---+---+---+---+
Step 2: push(2): enqueue 2 in queue B, enqueue 1 from queue A into queue B, and swap the names of the queues.
Step 2:
"Stack"
+---+---+---+---+---+
| 2 | 1 | | | |
+---+---+---+---+---+
Queue A Queue B
+---+---+---+---+---+ +---+---+---+---+---+
| | | | | | | 2 | 1 | | | |
+---+---+---+---+---+ +---+---+---+---+---+
Step 3: push(3): enqueue 3 in queue B, move 2 and 1 from queue A into queue B, and swap the names of the queues. A subsequent pop simply dequeues 3 from queue A, the element that was pushed last; the structure thus works on the stack principle "LIFO".
Step 3:
"Stack"
+---+---+---+---+---+
| 3 | 2 | 1 | | |
+---+---+---+---+---+
Queue A Queue B
+---+---+---+---+---+ +---+---+---+---+---+
| 3 | 2 | 1 | | | | | | | | |
+---+---+---+---+---+ +---+---+---+---+---+
The idea is to simulate LIFO behaviour using data structures that work in FIFO order.
Source:http://stackoverflow.com/questions/688276/implement-stack-using-two-
queues(modified-self)
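A sketch of Version B in C++, using two std::queue objects; the class name and the use of std::swap to "switch the names" of the queues are illustrative choices.
#include <cstddef>
#include <queue>
#include <utility>     // std::swap

class TwoQueueStack
{
    std::queue<int> q1;    // always holds the elements in LIFO order
    std::queue<int> q2;    // temporary helper
public:
    void push(int x)
    {
        q2.push(x);                 // the new element goes in first
        while (!q1.empty())         // the older elements follow behind it
        {
            q2.push(q1.front());
            q1.pop();
        }
        std::swap(q1, q2);          // "switch the names" of the two queues
    }
    int pop()                       // assumes the stack is not empty
    {
        int x = q1.front();         // dequeue from queue1
        q1.pop();
        return x;
    }
    bool isempty() const { return q1.empty(); }
    std::size_t size() const { return q1.size(); }
};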
The array-based linear representation of a queue is sometimes called a bounded queue
because it has several limitations, illustrated below:
1. The size of the queue must be known in advance; it is fixed.
2. In many real situations an attempt to add an element causes overflow. However, a
queue as an abstract data structure cannot be full; abstractly, it is always possible to add
an element to a queue. Representing the queue as an array therefore prohibits the growth
of the queue beyond a finite number of elements.
Therefore, some dynamic structure is required to represent a queue.
3.3.1.3 Linked Representation
A queue represented using a linked list is also known as a linked queue. A linked queue
can be represented as a singly linked list or a doubly linked list with two pointer variables,
front and rear. The linked representation allows a queue to grow up to the limit of the
computer's memory. The class structures for the linked queue are as follows: a node of
the linked list has an integer element info and a pointer next to the next element, while
the class queue contains the pointers front and rear, pointing to the first and last elements
of the linked list respectively, together with all the related operations as member
functions. The addatrear (enqueue) and deleteatfront (dequeue) operations are essentially
the same as before, except that the list is maintained through the next pointers. The
traverse operation prints all the elements of the queue.
#include <cstddef>
#include <iostream>
using namespace std;
// node of the singly linked list
class node
{
public:
    int info;              // data stored in the node
    node *next;            // pointer to the next node
};
class queue
{
    node *front, *rear;    // first and last nodes of the list
public:
    queue() { front = rear = NULL; }
    void addatrear(int);   // enqueue
    int deleteatfront();   // dequeue
    void traverse();
};
void queue::addatrear(int x)
{
    node *p = new node;
    p->info = x;
    p->next = NULL;
    if (front == NULL)     // first element: both pointers refer to it
        front = rear = p;
    else
    {
        rear->next = p;
        rear = p;
    }
}
int queue::deleteatfront()
{
    if (front == NULL)
    { cout << "QUEUE EMPTY!"; return 0; }
    node *t = front;
    int m = t->info;
    if (front == rear)     // only one node left
        front = rear = NULL;
    else
        front = front->next;
    delete t;
    return m;
}
void queue::traverse()
{
    node *q = front;
    cout << "Entered info is......";
    while (q != NULL)
    {
        cout << q->info << " ";
        q = q->next;
    }
}
int main()
{
    queue v;
    int e;
    char ch;
    do
    {
        cout << "\nEnter the info.....";
        cin >> e;
        v.addatrear(e);
        cout << "want to enter more elements..(y/n)....";
        cin >> ch;
    } while (ch == 'y' || ch == 'Y');
    v.traverse();
    int n = v.deleteatfront();
    v.traverse();
    return 0;
}
The deleteatfront (dequeue) operation deletes the first node, pointed to by front, from the
linked list by performing the following tasks: it saves the front node and its info, advances
front to the next node (or sets both pointers to NULL if this was the only node), deletes
the saved node and returns its info.
The addatrear (enqueue) operation adds the new node at the end of the linked list by
performing the following tasks: it creates a new node, stores the info in it, and, if the
queue is empty, makes both front and rear point to it; otherwise it links the node after
rear and updates rear.
The traverse operation prints the information from all the nodes of the list by performing
the following task: starting from front, it prints the info of each node and follows the next
pointer until the end of the list.
Fig 3.6: Linked representation of a queue (elements 6 3 9 5): enqueue adds a new link at the rear, and deleteatfront advances front past the deleted node (Source: self)
3.3.1.4 Comparing Two Implementations of Queue Representation
When we compare the queue implementations, we look at two different factors: the
amount of memory required to store the structure and the amount of "work" required
by the solution, as expressed in Big-O notation. Let us compare the two implementations
that we have coded completely: the array-based implementation and the dynamically
linked implementation.
An array variable of the maximum queue size takes the same amount of memory no
matter how many array slots are actually used; we must reserve space for the maximum
possible number of elements. The linked implementation, by contrast, uses space
proportional to the actual number of elements in the queue, plus the overhead of one
pointer per node.
We can also compare the relative "efficiency" of the implementations, in terms of Big-O
notation. The class constructors, IsFull, and IsEmpty operations are clearly O(1); they
always take the same amount of work regardless of how many items are on the queue.
What about Enqueue and Dequeue? Does the number of elements in the queue affect
the amount of work done by these operations? No, it does not; in both implementations,
we can directly access the front and rear of the queue. The amount of work done by
these operations is independent of the queue size, so these operations also have O(1)
complexity.
Only the MakeEmpty operation differs from one implementation to the other. The static
array-based implementation merely sets the front and rear indexes, so it is clearly an
O(1) operation. The dynamic array-based implementation is only one statement, so it
also is O(1). The linked implementation must process every node in the queue to free
the node space. This operation, therefore, has O(N) complexity, where N is the number
of nodes in the queue. The class destructor was not needed in the statically allocated
array-based structure but was required in the dynamically allocated array-based
structure. The class destructor in the array-based implementation in dynamic storage
has only one statement, so it has O(1) complexity. The class destructor in the
dynamically allocated linked structure contains a loop that executes as many times as
there are items on the queue. Thus the dynamically linked version has O(N) complexity.
As with the array-based and linked implementations of stacks, these two queue
implementations are roughly equivalent in terms of the amount of work they do,
differing only in one of the six operations and in the class destructor. Table 5.3
summarizes the Big-O comparison of the queue operations.
In a circular variant of the Enqueue operation, the next rear position is calculated using
the modulus operator; this circular queue is discussed in Section 3.4.
Source: www.cs.usfca.edu/~srollins/courses/cs112-f07/web/notes/queues.html
3.3.2 Generic Queue
A queue that can be used to hold different types of data is called a generic queue. In C++,
a language construct allows the compiler to generate multiple versions of a queue class
by parameterizing its element type; such a queue is known as a generic queue. This can
be achieved using template parameters.
template <class QueueType>
class queue
{
    QueueType arr[MAX];
    int front, rear;
public:
    void enqueue(QueueType);
    QueueType dequeue(void);
    queue();
};
The main program code creates three different types of queue q1,q2,q3 where q1 is
integer queue, q2 is floating point queue, q3 is character queue using the following
statements:
queue<int> q1;
queue<float> q2;
queue<char> q3;
Similarly, the member functions can be made generic by using the template keyword
while writing their definitions.
// Enqueue
template <class QueueType>
void queue<QueueType> :: enqueue(QueueType item)
{
……..........
}
//Dequeue
template <class QueueType>
QueueType queue<QueueType> :: dequeue(void)
{
……..........
}
//queue constructor
template <class QueueType>
queue<QueueType> :: queue()
{
……..........
}
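For comparison, the C++ standard library already provides a generic queue through the std::queue class template; a brief usage sketch:
#include <iostream>
#include <queue>

int main()
{
    std::queue<int> q1;        // integer queue
    std::queue<char> q3;       // character queue

    q1.push(10);               // enqueue
    q1.push(20);
    std::cout << q1.front();   // peep: prints 10
    q1.pop();                  // dequeue
    q3.push('a');
    return 0;
}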
3.4 Circular Queue
The need to move the elements in the array arose from our decision to keep the front
of the queue fixed in the first array slot. If we keep track of the index of the front as
well as the rear, we can let both ends of the queue float in the array.
Figure 3.8 shows how several Enqueue and Dequeue operations would affect the queue.
(For simplicity, the figure shows only the elements in the queue. The other slots
contain logical garbage, including dequeued values.) The Enqueue operations have the
same effect as before; they add elements to subsequent slots in the array and
increment the index of the rear indicator. The Dequeue operation is simpler, however.
Instead of moving elements up to the beginning of the array, it merely increments the
front indicator to the next slot. Here maxQue denotes the size of the queue.
Enqueue(‘A’)
Front=0
A Rear =0
0 1 2 3 4
Enqueue(‘B’)
Front=0
A B Rear =1
0 1 2 3 4
Enqueue(‘C’)
A B C Front=0
Rear =2
0 1 2 3 4
Dequeue()
B C Front=1
Rear =2
0 1 2 3 4
Source:self
Figure 3.8: The effect of Enqueue and Dequeue
(a) The rear indicator has reached the end of the array
Front=3
J K Rear =4
0 1 2 3 4
(b)Using circular array, wrap the queue around to top of array
L J K Front=3
Rear =0
0 1 2 3 4
source: self
Figure 3.9: Wrapping the queue elements around
Letting the queue elements float in the array creates a new problem when the rear
indicator reaches the end of the array. In our first design, this situation told us that the
queue was full. Now, however, the rear of the queue might potentially reach the end of
the (physical) array when the (logical) queue is not yet full (Figure 3.9a).
Because space may still be available at the beginning of the array, the obvious solution
is to let the queue elements "wrap around" the end of the array. In other words, we
can treat the array as a circular structure, in which the last slot is followed by the first
slot (Figure 3.9b). To get the next position for the rear indicator, for instance, we can
use an if statement:
if (rear == maxQue - 1)
    rear = 0;
else
    rear = rear + 1;
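Equivalently, the wrap-around can be written in one line with the modulus operator, which is the form mentioned earlier:
rear = (rear + 1) % maxQue;   // wraps from maxQue - 1 back to 0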
(a)Initial conditions
Front=2
A Rear =2
0 1 2 3 4
(b)Dequeue()
Front=3
Rear =2
0 1 2 3 4
Source:self
Fig 3.10: An empty queue
(a) Initial conditions
Front=3
C D A B Rear =1
0 1 2 3 4
(b) Enqueue('E')
C D E A B Front=3
Rear =2
0 1 2 3 4
Fig 3.11: A full queue (Source: self)
But this creates a new problem: after Enqueue('E') the values of front and rear are exactly
the same as in the empty queue of Fig 3.10(b), so with this scheme alone we cannot
distinguish a full queue from an empty one.
The first solution that comes to mind is to add another data member to our queue class,
in addition to front and rear: a count of the elements in the queue. When the count
member is 0, the queue is empty; when the count is equal to the maximum number of
array slots, the queue is full. Note that keeping this count adds work to the Enqueue
and Dequeue routines. If the queue user frequently needed to know the number of
elements in the queue, however, this solution would certainly be a good one. Another
common, but less intuitive approach is to let front indicate the index of the array slot
preceding the front element in the queue, rather than the index of the front element
itself. If rear still indicates the index of the rear element in the queue, the queue is
empty when front is equal to rear. To dequeue an element, we increment front to
indicate the true location of the front queue element, and assign the value in that array
slot to item. (Updating front precedes assigning the value in this design, because front
does not point to the actual front element at the beginning of Dequeue.) After this
Dequeue operation, IsEmpty finds that front is equal to rear, indicating that the queue
is empty (see Figure 3.12).
(a)Initial conditions
Front=1
A Rear =2
0 1 2 3 4
(b)Dequeue()
Front=2
Rear =2
0 1 2 3 4
An additional convention that we must establish to implement this scheme is that the
slot indicated by front (the slot preceding the true front element) is reserved. It cannot
contain a queue element. Thus, if there are 100 array positions, the maximum size of
the queue is 99 elements. To test for a full queue, we check whether the next space
available (after rear) is the special reserved slot indicated by front (see Figure 3.13).
C D reserved A B Front=2
Rear =1
0 1 2 3 4
To enqueue an element, we must first increment rear so that it contains the index of
the next free slot in the array. We can then insert the new element into this space.
Using this scheme, how do we initialize a queue to its empty state? We want front to
indicate the array index that precedes the front of the queue, so that when we first call
Enqueue the front of the queue is in the first slot of the array. Which position precedes
the first array slot? Because the array is circular, the first slot is preceded by the last
slot. As a consequence, we initialize front to maxQue - 1. Because our test for an
empty queue is checking whether front is equal to rear, we initialize rear to front, or
maxQue - 1.
Now we see that we must add two data members to the QueType class: front and rear.
The header file follows. Through the parameterized constructor, we let the user
determine the maximum size of the queue when a class object is declared. Because our
implementation takes one more array slot, we must increment max (the parameter to
the constructor) before we save it in maxQue. (This implementation is called a circular
or ring queue.) The modified version of circular queue code is given below:
// exception classes thrown by Enqueue and Dequeue
class FullQueue
{};
class EmptyQueue
{};
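The rest of the class is not reproduced in the source. A minimal sketch of the circular (ring) queue along the lines just described might look as follows; the element type is fixed to int here, and the member names follow the discussion above.
class QueType
{
    int  maxQue;     // actual array size, including the reserved slot
    int  front;      // index of the slot *before* the true front element
    int  rear;       // index of the rear element
    int *items;      // dynamically allocated array
public:
    QueType(int max)
    {
        maxQue = max + 1;             // one extra, reserved slot
        items  = new int[maxQue];
        front  = maxQue - 1;          // empty queue: front == rear
        rear   = maxQue - 1;
    }
    ~QueType() { delete [] items; }
    bool IsEmpty() const { return rear == front; }
    bool IsFull()  const { return (rear + 1) % maxQue == front; }
    void Enqueue(int item)
    {
        if (IsFull()) throw FullQueue();
        rear = (rear + 1) % maxQue;   // advance rear, wrapping around
        items[rear] = item;
    }
    void Dequeue(int &item)
    {
        if (IsEmpty()) throw EmptyQueue();
        front = (front + 1) % maxQue; // advance to the true front element
        item  = items[front];
    }
};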
Note that Dequeue, like the stack Pop operation, does not actually remove the value of
the item from the array. The dequeued value still physically exists in the array. It no
longer exists in the queue, however, and cannot be accessed because of the change in
front. That is, the dequeued data element exists in the implementation but not in the
abstraction.
Comparing array implementations: the circular array solution is not nearly as simple or
intuitive as the first queue design. What did we gain by adding some complexity to our
design? By using a more efficient Dequeue algorithm, we achieved better performance.
To find out how much better, let us analyze the first design. Because the amount of work
needed to move all of the remaining elements is proportional to the number of elements,
that version of Dequeue is an O(N) operation. The second array-based queue design simply
requires Dequeue to change the value of the front indicator and to put the value into item
to be returned. The amount of work never exceeds some fixed constant, no matter how
many elements are in the queue, so the algorithm has O(1) complexity.
The other operations all have O(1) complexity as well. No matter how many items are in
the queue, they do (essentially) a constant amount of work.
Our QueType class contains two pointers, one to each end of the queue. This design is
based on the linear structure of the linked queue. Given only a pointer to the front of
the queue, we could follow the pointers to reach the rear, but this tactic turns
accessing the rear (to Enqueue an item) into an O(N) operation. With a pointer to the
rear of the queue only, we could not access the front because the pointers go only from
front to rear.
If we made the queue circularly linked, we could access both ends of the queue from a
single pointer. That is, the next member of the rear node would point to the front node
of the queue (see Figure 3.15). Now QueType has only one data member, rather than
two. One interesting thing about this queue implementation is that it differs from the
logical picture of a queue as a linear structure with two ends. This queue is a circular
structure with no ends. What makes it a queue is its support of FIFO access.
Fig 3.15: A circularly linked queue accessed through a single pointer, rear (Source: self)
To Enqueue an element, we access the "rear" node directly through the pointer rear. To
Dequeue an element, we access the "front" node of the queue. We do not have a pointer
to this node, but we do have a pointer to the node preceding it: rear. The pointer to the
"front" node of the queue is in Next(rear). An empty queue is represented by rear = NULL.
We can test both linked implementations of the Queue ADT by using the same test plan
that we wrote for the array-based version.
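A sketch of Enqueue and Dequeue for this circularly linked representation, with rear as the only data member; the node structure and names are illustrative.
#include <cstddef>    // NULL

struct Node
{
    int   info;
    Node *next;
};

class CircularLinkedQueue
{
    Node *rear;                        // rear->next is the front node
public:
    CircularLinkedQueue() : rear(NULL) {}
    void Enqueue(int x)
    {
        Node *p = new Node;
        p->info = x;
        if (rear == NULL)              // empty queue: the node points to itself
            p->next = p;
        else
        {
            p->next = rear->next;      // new node's next is the front node
            rear->next = p;
        }
        rear = p;
    }
    int Dequeue()                      // assumes the queue is not empty
    {
        Node *frontNode = rear->next;
        int x = frontNode->info;
        if (frontNode == rear)         // removing the only node
            rear = NULL;
        else
            rear->next = frontNode->next;
        delete frontNode;
        return x;
    }
};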
3.5 Deque
A double-ended queue is an abstract data type similar to an ordinary queue, except
that it allows insertion and deletion at both ends. The term deque is a contraction of the
name double-ended queue. There are two variations of a deque:
1. Input-restricted deque, which allows insertion only at one end but allows deletion at
both ends.
2. Output-restricted deque, which allows insertion at both ends but allows deletion only
at one end.
(Figure: a deque stored in an array of five slots, with addatfront and addatback inserting at the two ends and deleteatfront and deleteatback removing from the two ends.)
#include <cstddef>
#include <iostream>
using namespace std;
// node of a doubly linked list
class node
{
public:
    int data;
    node *next, *prev;
};
//description of double ended queue having two pointers head and tail
class dqueue
{
    node *head, *tail;
public:
    dqueue()
    {
        head = NULL;
        tail = NULL;
    }
    void insert(int x);   // insertion at front or back
    void del();           // deletion at front or back
};
// the insertion into the double ended queue at front and back
void dqueue::insert(int x)
{
    node *temp;
    int ch;
    if (head == NULL)
    {   // add the first node when the deque is empty
        head = new node;
        head->data = x;
        head->next = NULL;
        head->prev = NULL;
        tail = head;
    }
    else
    {
        cout << " Add element 1.FRONT 2.BACK\n enter ur choice:";
        cin >> ch;
        if (ch == 1)
        {   // add at front
            temp = new node;
            temp->data = x;
            temp->next = head;
            temp->prev = NULL;
            head->prev = temp;
            head = temp;
        }
        else
        {   // add at back
            temp = new node;
            temp->data = x;
            temp->next = NULL;
            temp->prev = tail;
            tail->next = temp;
            tail = temp;
        }
    }
}
// the deletion from the double ended queue at front and back
void dqueue::del()
{
    int ch;
    if (head == NULL)
    {
        cout << "DEQUE EMPTY!";
        return;
    }
    cout << " Delete element 1.FRONT 2.BACK\n enter ur choice:";
    cin >> ch;
    if (ch == 1)
    {
        // delete at front
        node *t = head;
        head = head->next;
        if (head != NULL)
            head->prev = NULL;
        else
            tail = NULL;
        delete t;
    }
    else
    {
        // delete at back
        node *t = tail;
        tail = tail->prev;
        if (tail != NULL)
            tail->next = NULL;
        else
            head = NULL;
        delete t;
    }
}
(Figure: a deque stored as a doubly linked list: the original list 6 3 9 5, the list after deleteatfront(), after addatfront(7), and after deleteatback(). Source: self)
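For reference, the standard library's std::deque already provides this double-ended behaviour; a short usage sketch:
#include <deque>
#include <iostream>

int main()
{
    std::deque<int> d;
    d.push_front(3);            // addatfront
    d.push_back(9);             // addatback
    d.push_back(5);
    d.pop_front();              // deleteatfront (removes 3)
    d.pop_back();               // deleteatback  (removes 5)
    std::cout << d.front();     // prints 9
    return 0;
}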
3.6 Priority Queue
A priority queue is an abstract data type (ADT) whose elements are arranged in the list
according to their priority. It supports three main operations: inserting an element,
finding the element with maximum (or minimum) priority, and deleting that element; in
addition, an element can be searched for according to its priority. A priority queue can be
implemented using an array, a singly linked list or a doubly linked list. Here we use a
doubly linked list, assuming that the data stored in a node is the priority itself. (A node
could instead store both data and a separate priority, or the data can be such that it acts
as the priority.) The following code contains the class structure and some of the member
functions required to implement the priority queue.
The classes required for the priority queue are Node and PriorityQueue: Node is a node of
the doubly linked list, and PriorityQueue holds the head pointer together with the member
functions.
#include <cstddef>
#include <iostream>
using namespace std;
// node of the doubly linked list
class Node
{
public:
    int Data;
    Node *Next;
    Node *Previous;
};
class PriorityQueue
{
    Node *head;    // first node of the list
public:
    PriorityQueue();
    int Maximum(void);
    int Minimum(void);
    void Insert(int);
    int Delete(int);
    void Display(void);
    int Search(int);
    ~PriorityQueue(void);
};
The constructor can be written as:
// Constructor
PriorityQueue::PriorityQueue()
{
    head = 0;
}
Some of the functions are given for reference. The others can be written similarly to
complete the program.
// Function Finding Maximum Priority Element
int PriorityQueue::Maximum(void)
{
Node *ptr;
int Temp;
ptr=head;
Temp=ptr->Data;
while(ptr->Next!=NULL)
{
if(ptr->Data>Temp)
Temp=ptr->Data;
ptr=ptr->Next;
}
if(ptr->Next==NULL && ptr->Data>Temp)
Temp=ptr->Data;
return(Temp);
}
// Function Deleting an Element from the Priority Queue
int PriorityQueue::Delete(int DataDel)
{
    Node *ptr, *temp, *mynode;
    ptr = head;
    if (head == 0)
    {
        cout << "Queue is empty, nothing to delete" << endl;
        return 0;
    }
    if (ptr->Data == DataDel)
    {
        /*** Deletion of the first node ***/
        head = ptr->Next;
        if (head != NULL)
            head->Previous = NULL;
        delete ptr;
        return (1);
    }
    while (ptr->Next != NULL && ptr->Next->Next != NULL)
    {
        /*** Checking condition for deletion of ***/
        /*** all nodes except first and last node ***/
        if (ptr->Next->Data == DataDel)
        {
            mynode = ptr;
            temp = ptr->Next;
            mynode->Next = mynode->Next->Next;
            mynode->Next->Previous = ptr;
            delete temp;
            return (1);
        }
        ptr = ptr->Next;
    }
    if (ptr->Next != NULL && ptr->Next->Data == DataDel)
    {
        /*** Checking condition for deletion of last node ***/
        temp = ptr->Next;
        delete temp;
        ptr->Next = NULL;
        return (1);
    }
    return (0);
}
// Function Searching element in Priority Queue
int PriorityQueue::Search(int DataSearch)
{
    Node *ptr;
    ptr = head;
    if (ptr == NULL)
        return (0);
    while (ptr->Next != NULL)
    {
        if (ptr->Data == DataSearch)
            return ptr->Data;
        ptr = ptr->Next;
    }
    if (ptr->Next == NULL && ptr->Data == DataSearch)
        return ptr->Data;
    return (0);
}
// Destructor: delete all remaining nodes
PriorityQueue::~PriorityQueue(void)
{
    while (head != NULL)
    {
        Node *t = head;
        head = head->Next;
        delete t;
    }
}
//Main Function
int main()
{
PriorityQueue PQ;
int choice;
int DT;
while(1)
{
cout<<"Enter your choice"<<endl;
cout<<"1. Insert an element"<<endl;
cout<<"2. Display a priorty Queue"<<endl;
cout<<"3. Delete an element"<<endl;
cout<<"4. Search an element"<<endl;
cout<<"5. Exit"<<endl;
cin>>choice;
switch(choice)
{
case 1:
cout<<"Enter a Data to enter Queue"<<endl;
cin>>DT;
PQ.Insert(DT);
break;
case 2:
PQ.Display();
break;
case 3:
{
int choice;
cout<<"Enter your choice"<<endl;
cout<<"1. Maximum Priority Queue"<<endl;
cout<<"2. Minimum Priority Queue"<<endl;
cin>>choice;
switch(choice)
{
case 1:
PQ.Delete(PQ.Maximum());
break;
case 2:
PQ.Delete(PQ.Minimum());
break;
default:
cout<<"Sorry Not a correct choice"<<endl;
}
}
break;
case 4:
cout<<"Enter a Data to Search in Queue"<<endl;
cin>>DT;
if(PQ.Search(DT)!=0)
cout<<DT<<" Is present in Queue"<<endl;
else
cout<<DT<<" is Not present in Queue"<<endl;
break;
case 5:
return 0;
default:
cout<<"Cannot process your choice"<<endl;
}
}
}
Source: www.coopsoft.com/ar/PriQueArticle.html
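The same behaviour can also be obtained from the standard library's std::priority_queue, which is typically implemented as a binary heap; a short sketch:
#include <iostream>
#include <queue>

int main()
{
    std::priority_queue<int> pq;       // maximum-priority queue by default
    pq.push(3);
    pq.push(9);
    pq.push(5);
    while (!pq.empty())
    {
        std::cout << pq.top() << " ";  // prints 9 5 3
        pq.pop();
    }
    return 0;
}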
3.7 Queue: Applications
3.7.1 Queue as a Scheduling Queue
A queue is a natural data structure for a system that serves incoming requests. Most of the
process-scheduling and disk-scheduling algorithms in operating systems use queues.
We can implement a round-robin scheduler using a queue, Q, by repeatedly performing
the following steps (a sketch follows the steps):
1. e = Q.dequeue()
2. Service element e
3. Q.enqueue(e)
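A minimal sketch of this round-robin loop using std::queue; the process names and the fixed number of time slices are only for illustration.
#include <iostream>
#include <queue>
#include <string>

int main()
{
    std::queue<std::string> Q;
    Q.push("P1");
    Q.push("P2");
    Q.push("P3");

    for (int slice = 0; slice < 6; slice++)        // six time slices, for illustration
    {
        std::string e = Q.front();                 // 1. e = Q.dequeue()
        Q.pop();
        std::cout << "servicing " << e << "\n";    // 2. service element e
        Q.push(e);                                 // 3. Q.enqueue(e)
    }
    return 0;
}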
Multilevel queue scheduling
While working, the operating system handles many jobs. Jobs are added to a queue known as the
job queue; they are retrieved one by one from the job queue and added to the ready queue for
assignment to the processor. Multilevel queue scheduling then partitions the ready queue into
several separate queues. The jobs in the different queues have a different nature, and according
to that nature each queue is given a priority for executing the jobs in it. These jobs are in the
form of processes; for example, a system queue contains all system processes. The processes are
permanently assigned to one queue based on some property of the process, and each queue has
its own scheduling algorithm. Consider the following illustration of multilevel priority queue
scheduling:
Each queue has absolute priority over lower priority queues. No process in batch queue will be able to
run until the queues above it are empty. Another possibility is to time slice between the queues by
allotting each process with a certain portion of CPU time.
Source:http://read.cs.ucla.edu/111/notes/lec7
3.7.2 Queue as a Buffering Queue
Computer hardware such as a processor or a network card also maintains buffers, in the
form of queues, for incoming resource requests. A stack-like data structure would cause
starvation of the earliest requests and is therefore not applicable in such cases. A mailbox
or port used to hold messages communicated between two users or processes in a system
is essentially a queue-like structure. More specifically, a circular queue buffer is used for
storing data in various activities of a computer. This buffer is known as a cyclic buffer or
ring buffer: a data structure that uses a single, fixed-size buffer as if it were connected
end-to-end. This structure lends itself easily to buffering data streams.
An example that could possibly use an overwriting circular buffer is with multimedia. If
the buffer is used as the bounded buffer in the producer-consumer problem then it is
probably desired for the producer (e.g., an audio generator) to overwrite old data if the
consumer (e.g., the sound card) is unable to momentarily keep up. Another example is
the digital waveguide synthesis method which uses circular buffers to efficiently
simulate the sound of vibrating strings or wind instruments.
The "prized" attribute of a circular buffer is that it does not need to have its elements
shuffled around when one is consumed. (If a non-circular buffer were used then it
would be necessary to shift all elements when one is consumed.) In other words, the
circular buffer is well suited as a FIFO buffer while a standard, non-circular buffer is
well suited as a LIFO buffer.
A ring buffer is a ring-like structure accessed through an index (or pointer) that wraps
around; it acts as a buffer to store data so that no locations in the queue are wasted.
Elements are inserted and deleted in sequence; after the queue reaches its maximum
size, it starts again from the first location and acts as a continuous buffer for storing data.
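A compact sketch of such a ring buffer, with overwrite-on-full behaviour as in the multimedia example above; the size and names are illustrative.
const int BUF_SIZE = 8;

class RingBuffer
{
    int buf[BUF_SIZE];
    int head;                  // next position to read
    int count;                 // number of stored elements
public:
    RingBuffer() : head(0), count(0) {}
    void put(int x)            // producer: overwrites the oldest data when full
    {
        int tail = (head + count) % BUF_SIZE;
        buf[tail] = x;
        if (count == BUF_SIZE)
            head = (head + 1) % BUF_SIZE;   // the oldest element was overwritten
        else
            count++;
    }
    bool get(int &x)           // consumer: returns false when the buffer is empty
    {
        if (count == 0)
            return false;
        x = buf[head];
        head = (head + 1) % BUF_SIZE;
        count--;
        return true;
    }
};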
3.7.3 Queue: Search Space Exploration
Like stacks, queues can be used in traversal algorithms to remember the search space that still
needs to be explored at a given point in time. Breadth-first search (BFS) of a tree uses a queue to
remember the nodes yet to be visited; it is a level-by-level traversal of a graph or tree. For the
following tree, BFS gives the sequence 1 2 3 4 5 6 7 8 9 10 11 12 after traversing the nodes level
by level. To code this traversal we require a queue, as described below:
Algorithm - BFS
The traversal starts from the root node. Visiting this parent node requires storing its
child nodes, which are at the next level; the queue is used to store the children of the
currently visited node. The next node is then dequeued from the queue, visited, and its
children are in turn stored in the queue. This process goes on until the queue is empty.
The sequence of visits gives the breadth-first traversal of the tree.
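A sketch of this breadth-first traversal for a binary tree using std::queue; the TreeNode structure is an illustrative assumption.
#include <cstddef>    // NULL
#include <iostream>
#include <queue>

struct TreeNode
{
    int value;
    TreeNode *left, *right;
};

void bfs(TreeNode *root)
{
    if (root == NULL) return;
    std::queue<TreeNode *> q;
    q.push(root);                        // start from the root node
    while (!q.empty())
    {
        TreeNode *n = q.front();         // dequeue the next node to visit
        q.pop();
        std::cout << n->value << " ";    // visit it
        if (n->left)  q.push(n->left);   // store its children for the next level
        if (n->right) q.push(n->right);
    }
}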
3.7.4 Circular Queue in Real Life
The traffic-light sequence on a large roundabout is an example of a circular queue: it changes the
signals circularly, in a sequence, with equal intervals of time. Some other real-time examples are
the print spooler of an operating system, bottle-capping systems in a soft-drink factory, the
revolving bullet cylinder of a revolver, placing objects into a container that is open at both sides,
and biscuit-baking lines.
Source: http://i.ytimg.com/vi/GELrKh8lrRw/0.jpg
3.7.5 Applications of Priority Queue
Bandwidth management
Many modern protocols for Local Area Networks also include the concept of Priority
Queues at the Media Access Control (MAC) sub-layer to ensure that high-priority
applications (such as VoIP or IPTV) experience lower latency than other applications
which can be served with Best effort service. Examples include IEEE 802.11e (an
amendment to IEEE 802.11 which provides Quality of Service) and ITU-T G.hn (a
standard for high-speed local area networks using existing home wiring: power lines,
phone lines and coaxial cables).
Usually a limitation (policer) is set to limit the bandwidth that traffic from the highest
priority queue can take, in order to prevent high priority packets from choking off all
other traffic. This limit is usually never reached due to high level control instances such
as the Cisco Callmanager, which can be programmed to inhibit calls which would
exceed the programmed bandwidth limit.
Another use of a priority queue is to manage the events in a discrete event simulation.
The events are added to the queue with their simulation time used as the priority. The
execution of the simulation proceeds by repeatedly pulling the top of the queue and
executing the event thereon.
Dijkstra's algorithm
When the graph is stored in the form of adjacency list or matrix, priority queue can be
used to extract minimum efficiently when implementing Dijkstra's algorithm.
The A* search algorithm finds the shortest path between two vertices of a weighted
graph, trying out the most promising routes first. The priority queue (also known as the
fringe) is used to keep track of unexplored routes; the one for which a lower bound on
the total path length is smallest is given highest priority. If memory limitations make
A* impractical, the SMA* algorithm can be used instead, with a double-ended priority
queue to allow removal of low-priority items.
The semantics of priority queues naturally suggest a sorting method: insert all the
elements to be sorted into a priority queue, and sequentially remove them; they will
come out in sorted order. This is actually the procedure used by several sorting
algorithms, once the layer of abstraction provided by the priority queue is removed.
Depending on how the priority queue is implemented, this sorting method is equivalent
to heapsort (heap-based priority queue), selection sort (unordered list) or insertion sort
(ordered list).
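A sketch of this priority-queue sort using std::priority_queue (with a heap-based queue this is effectively heapsort); it sorts in descending order because std::priority_queue is a max-priority queue by default.
#include <cstddef>
#include <queue>
#include <vector>

void pqSort(std::vector<int> &v)
{
    std::priority_queue<int> pq;
    for (std::size_t i = 0; i < v.size(); i++)
        pq.push(v[i]);               // insert all the elements
    for (std::size_t i = 0; i < v.size(); i++)
    {
        v[i] = pq.top();             // remove them in priority order
        pq.pop();
    }
}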
3.7.6 Message Queue
Applications communicate over networks by simply putting messages into queues and getting
messages from queues.
Source: http://middlewares.files.wordpress.com/2008/04/19.jpg
Source: http://middlewares.files.wordpress.com/2008/04/21.jpg
Optimistic simulation is, besides the conservative methods, the most promising approach
to distributed discrete-event simulation (DDEVS). In the optimistic Time Warp strategy,
every sub-simulator has to handle, in addition to its main event queue (the input queue),
two other queues holding information about sent messages and past states, respectively.
Efficient data structures and algorithms have been developed for the three queue types
in Time Warp simulation. For two of the queue types the simple doubly linked list (DLL)
approach suffices; for the input queue, however, significant speed-up can be achieved by
enhancing a data structure originally developed for sequential simulation, improving some
operations from an O(n) to an O(1) average-case behaviour without increasing the
average-case or worst-case order of the other operations. Empirical results show that the
new data structure yields substantially faster enqueuing of items into the list, which,
regardless of the overhead it introduces for the other queue operations, results in faster
queue access for typical Time Warp simulations even for medium input-queue sizes of
about 100. For large queue sizes of about 10000, the new data structure yields more than
15 times faster overall queue-operation performance.
Summary
A queue is an abstract data structure with two ends, front and rear, for deleting and
inserting elements respectively.
It works on the principle of first in, first out. Its main operations are enqueue and
dequeue, for inserting and deleting an element respectively. Insertion takes place at the
rear and deletion at the front, and the two pointers move to reflect the status of the
queue at each moment.
A queue is implemented mainly in two representations: array and linked. The array
implementation is static in nature, while the linked representation allows the queue to
grow dynamically.
A queue can be represented using two stacks, by simulating the FIFO principle with
LIFO structures.
A circular queue is a queue that is continuous in storage, not wasting the empty
locations of the queue.
Another form of queue is the priority queue, which stores elements depending upon
the priority attached to them. This queue has many applications in different areas of
computing.
Similarly, a deque is a double-ended queue which allows insertion and deletion at
both ends.
Queues have many applications, such as job scheduling, breadth-first traversal,
message communication, sorting techniques and simulation.
Exercises
Q.1 Design complete code for all the functions of the Queue.
Q.2 Write code for solving polynomial addition using a Queue.
Q.3 Implement a Linked Queue using a singly linked list and a doubly linked list.
Q.6 List some real-life examples where you can use a priority Queue.
Creative Exercises
Write a data type Deque.cpp that implements the deque using a singly linked
list.
Glossary
Algorithm: A step-by-step procedure, with a finite number of steps, for solving a problem.
Big-O notation: A notation expressing the order of magnitude of an algorithm's running
time or memory use as a function of the problem size N.
Circular linked list: A list in which every node has a successor; the last element is
succeeded by the first element.
Constructor: A class member function that is invoked automatically when an object is
created, used to initialise it.
Delimiter: A symbol or keyword that marks the beginning or end of a construct.
Deque: Double-ended queue; a queue in which insertion and deletion can be done at
both ends.
Doubly linked list: A linked list in which each node is linked to both its successor and its
predecessor.
Dynamic data structure: A data structure that can expand and contract during program
execution.
Implementation: A concrete realisation, in code, of an abstract data type.
Index: A value that selects a particular slot within an array.
Priority Queue: A queue in which each element has a priority and elements are removed
in order of priority rather than order of arrival.
Queue: A linear First-In-First-Out (FIFO) data structure in which elements are added at
the rear and removed from the front.
References
Suggested Readings
Web Links
1. www.codefords.wordpress.com
2. www.wikipedia.org
3. read.cs.ucla.edu
4. www.cs.usfca.edu