Lecture 6 - Queues

4.2. Queues (Sections 3.4, 3.6, 4.7 & 4.4)

A queue is a chronologically ordered list; the only difference between a stack and a queue is that, in a stack, elements are added and removed at the same end (the top), whereas in a queue, values are added at one end (the rear) and removed from the other end (the front). The order of elements in a queue therefore perfectly reflects the order in which they were added: the first element added will be the first one removed (so queues are called FIFO - "first in, first out"), and the last one added will be the last one removed.

In everyday life, we encounter queues everywhere - a line of people waiting to buy a ticket or waiting to be served in a bank; all such lines of people are queues. The first person in line is the next one served, and when another person arrives, he/she joins the queue at the rear.

The specification of a queue is almost identical to that of a stack; rather than giving a full specification, I will only sketch the main points.
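The main points of that sketch can be expressed as an interface. The following Python class is an illustrative assumption (the operation names ENQUEUE and SERVE follow the lecture; everything else, including the hidden list representation, is invented here for demonstration):

```python
# Sketch of a queue interface, mirroring a stack's specification.
# Operation names follow the lecture (enqueue, serve); the list-based
# representation is only a placeholder to make the sketch runnable.

class Queue:
    def __init__(self):
        self._items = []          # hidden representation; front is index 0

    def is_empty(self):
        """True iff the queue contains no elements."""
        return not self._items

    def enqueue(self, x):
        """Add x at the rear of the queue."""
        self._items.append(x)

    def serve(self):
        """Remove and return the element at the front of the queue."""
        if not self._items:
            raise IndexError("serve from empty queue")
        return self._items.pop(0)

    def front(self):
        """Return (without removing) the front element."""
        if not self._items:
            raise IndexError("front of empty queue")
        return self._items[0]
```

Note that this naive version is not constant-time: `pop(0)` shifts every remaining element, which is exactly the problem the representations discussed next are designed to avoid.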

For all these operations to be constant-time, it is necessary to have constant-time access to both ends of the data structure representing the queue. We have seen several ways of doing this: circular lists, for example, or ordinary lists with pointers to both the first and last nodes.

For example, the queue Q containing 9-1-5 could be represented this way:

    front -> 9 -> 1 -> 5 <- rear

If we now did a SERVE operation, the value 9 would be returned and Q would be:

    front -> 1 -> 5 <- rear

If we ENQUEUE 7 onto this queue, we get:

    front -> 1 -> 5 -> 7 <- rear
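The same sequence of operations can be traced in code. Here Python's `collections.deque` stands in for the queue (an assumption made for brevity; `popleft` plays the role of SERVE and `append` the role of ENQUEUE):

```python
from collections import deque

Q = deque([9, 1, 5])      # the queue containing 9-1-5, front at the left

served = Q.popleft()      # SERVE: remove the value at the front
print(served)             # -> 9
print(list(Q))            # -> [1, 5]

Q.append(7)               # ENQUEUE 7: add at the rear
print(list(Q))            # -> [1, 5, 7]
```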
There are two main uses of queues in computing.

Serving Requests

In computing it often happens that many processes (human users or programs) require the service of a single, shared resource. For example, several people usually share the use of a printer. If requests for printing are submitted faster than the printer can satisfy them, the requests are placed on a queue so as to preserve the order in which they were submitted. New requests are added at the rear of the queue, and, when the printer finishes one request, it starts on the request at the front of the queue. Mass-storage devices, such as large disks, are another computing resource usually shared by many users. The textbook (section 3.7) discusses in some detail the use of queues to schedule requests to read/write data from/to a disk.
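The print-request scheme above can be sketched with a queue. The function names and job names below are invented for illustration:

```python
from collections import deque

# Hypothetical sketch of print-request scheduling: requests arrive
# faster than the printer can handle them, so they wait in FIFO order.

print_queue = deque()

def submit(job):
    """A new request joins the queue at the rear."""
    print_queue.append(job)

def printer_finished_current_job():
    """The printer takes the request at the front, or None if idle."""
    if print_queue:
        return print_queue.popleft()
    return None

# Three users submit jobs while the printer is busy...
submit("ann.txt")
submit("bob.txt")
submit("carol.txt")

# ...and the printer serves them in the order they were submitted.
print(printer_finished_current_job())   # -> ann.txt
print(printer_finished_current_job())   # -> bob.txt
```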


Buffering Data

The other main use of queues is storing data that is being transferred asynchronously between two processes. Asynchronous means that the sender and receiver are not synchronized: the receiver is not necessarily ready or able to receive the data at the time and speed at which the sender is sending it. A queue is placed between the two processes: the sender enqueues the data whenever it likes, and the receiver serves out the data whenever it likes. Such a queue is sometimes called a buffer.
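A minimal sketch of such a buffer follows. Within a single thread the "asynchrony" reduces to interleaved enqueue/serve operations; a real asynchronous transfer between processes or threads would need synchronization (for instance Python's thread-safe `queue.Queue`), which is omitted here for clarity:

```python
from collections import deque

# A queue used as a buffer between a sender and a receiver.
buffer = deque()

# The sender produces three items in a burst, whenever it likes...
for ch in "abc":
    buffer.append(ch)           # enqueue at the rear

# ...the receiver was busy, and only now drains the buffer.
# The data comes out in exactly the order it was sent (FIFO).
received = []
while buffer:
    received.append(buffer.popleft())   # serve from the front

print(received)   # -> ['a', 'b', 'c']
```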