Unit 2: Data Structure Operations and Algorithms Complexity
Insertion: Insertion operation is used for adding a new element to the list.
Deletion: Deletion operation is used to remove an element from the list.
Sorting: This operation is used for arranging the elements in some type of order.
Merging: By means of merging operation, we can combine two lists into a single list.
Notes: Depending upon the relative frequency with which we perform these operations, we may
prefer one particular linear structure over another.
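As a brief illustration (a minimal sketch, not part of the original text), the C fragment below shows how insertion and deletion might be carried out on an array-based list; the capacity MAX, the element type, and the function names insert and delete_at are assumptions chosen only for this example.

#include <stdio.h>

#define MAX 10   /* assumed capacity of the list */

/* Insert value at position pos (0-based), shifting later elements right. */
int insert(int list[], int *n, int pos, int value) {
    if (*n >= MAX || pos < 0 || pos > *n)
        return 0;                      /* list full or position invalid */
    for (int i = *n; i > pos; i--)
        list[i] = list[i - 1];
    list[pos] = value;
    (*n)++;
    return 1;
}

/* Delete the element at position pos, shifting later elements left. */
int delete_at(int list[], int *n, int pos) {
    if (pos < 0 || pos >= *n)
        return 0;                      /* position invalid */
    for (int i = pos; i < *n - 1; i++)
        list[i] = list[i + 1];
    (*n)--;
    return 1;
}

int main(void) {
    int list[MAX] = {10, 20, 40};
    int n = 3;

    insert(list, &n, 2, 30);           /* list becomes 10 20 30 40 */
    delete_at(list, &n, 0);            /* list becomes 20 30 40    */

    for (int i = 0; i < n; i++)
        printf("%d ", list[i]);
    printf("\n");
    return 0;
}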
Self Assessment
Fill in the blanks:
1. ......................... operation is used for adding a new element to the list.
2. ......................... operation is used for arranging the elements in some type of order.
2.2 Algorithm Complexity
An algorithm is a clearly specified set of simple instructions to be followed to solve a problem.
An algorithm is a procedure that you can write as a C function or program, or in any other
language. Once an algorithm is given for a problem and decided (somehow) to be correct, an
important step is to determine how much in the way of resources, such as time or space, the
algorithm will require. An algorithm that solves a problem but requires a year is hardly of any
use.
Caution: An algorithm that requires a gigabyte of main memory is not (currently) useful.
In order to analyze algorithms in a formal framework, we need a model of computation. Our
model is basically a normal computer, in which instructions are executed sequentially. Our
model has the standard repertoire of simple instructions, such as addition, multiplication,
comparison, and assignment, but, unlike real computers, it takes exactly one time unit to do
anything (simple). To be reasonable, we will assume that, like a modern computer, our model
has fixed size (say 32-bit) integers and that there are no fancy operations, such as matrix inversion
or sorting, that clearly cannot be done in one time unit. We also assume infinite memory.
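To see how this unit-cost model is applied, consider the small C function below (an illustrative sketch added here, not taken from the text). Counting roughly one time unit per simple operation, and treating sum += i as a single unit, the loop contributes on the order of 3n units, so the running time of this function grows linearly with n.

/* Sum the first n integers; under the unit-cost model each assignment,
   comparison, addition, and increment counts as one time unit. */
int sum_to_n(int n) {
    int sum = 0;                  /* 1 unit */
    for (int i = 1; i <= n; i++)  /* roughly 2n + 2 units: initialization, tests, increments */
        sum += i;                 /* n units */
    return sum;                   /* 1 unit */
}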
This model clearly has some weaknesses. Obviously, in real life, not all operations take exactly
the same time. In particular, in our model one disk read counts the same as an addition, even
though the addition is typically several orders of magnitude faster. Also, by assuming infinite
memory, we never worry about page faulting, which can be a major problem in many
applications, especially for efficient algorithms.
The most important resource to analyze is generally the running time. Several factors affect the
running time of a program. Some, such as the compiler and computer used, are obviously
beyond the scope of any theoretical model, so, although they are important, we cannot deal with
them here. The other main factors are the algorithm used and the input to the algorithm.
An algorithm states explicitly how the data will be manipulated. Some algorithms are more
efficient than others.
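As a brief illustration of this point (a sketch added here, not part of the original text), the two C functions below solve the same problem, locating a value in a sorted array, yet they differ greatly in efficiency: linear search may examine all n elements, while binary search examines only about log2 n of them.

/* Linear search: may examine up to n elements. */
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;                    /* not found */
}

/* Binary search on a sorted array: examines about log2(n) elements. */
int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* midpoint without overflow */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;                    /* not found */
}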