
Introduction to Algorithms

Notes for UNIT I – Introduction to Algorithms, with clear explanations and worked examples for each topic.

1. Introduction to Algorithms

Definition:
An algorithm is a well-defined, step-by-step procedure to solve a problem in a finite amount of time.

Characteristics of a Good Algorithm:

  • Input: Zero or more inputs
  • Output: At least one output
  • Definiteness: Each step must be clear and unambiguous
  • Finiteness: Must terminate after finite steps
  • Effectiveness: Every step must be basic and executable

Example: Problem: Find maximum number in a list [5, 2, 9, 1, 7]
Algorithm (Pseudocode):

MAXIMUM(arr)
    max ← arr[0]
    for i ← 1 to length(arr)-1
        if arr[i] > max
            max ← arr[i]
    return max

Output: 9
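The pseudocode above translates directly to a runnable sketch; this Python version (an illustration, not part of the original notes) follows the same linear scan:

```python
def maximum(arr):
    """Return the largest element of a non-empty list (single linear scan)."""
    max_val = arr[0]                 # max ← arr[0]
    for i in range(1, len(arr)):     # for i ← 1 to length(arr)-1
        if arr[i] > max_val:
            max_val = arr[i]         # max ← arr[i]
    return max_val

print(maximum([5, 2, 9, 1, 7]))  # 9
```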

2. Analyzing Algorithms

We analyze algorithms based on two main resources:

  • Time Complexity: How running time increases with input size
  • Space Complexity: How much memory is used

We focus mainly on worst-case, average-case, and best-case time complexity.

Example: Linear Search:

  • Best case: O(1) → element found at first position
  • Average case: O(n)
  • Worst case: O(n) → element at the end or not present

3. Complexity of Algorithms (Time & Space)

Expressed using Big-O Notation (asymptotic upper bound)

| Notation   | Name         | Example Algorithms               |
|------------|--------------|----------------------------------|
| O(1)       | Constant     | Accessing an array element       |
| O(log n)   | Logarithmic  | Binary Search                    |
| O(n)       | Linear       | Linear Search, Traversal         |
| O(n log n) | Linearithmic | Merge Sort, Quick Sort (average) |
| O(n²)      | Quadratic    | Bubble Sort, Insertion Sort      |
| O(2ⁿ)      | Exponential  | Recursive Fibonacci, TSP (naive) |

4. Growth of Functions

Understanding how fast a function grows as input size increases.

Order from slowest to fastest growth: 1 < log n < n < n log n < n² < n³ < 2ⁿ < n!

Example Comparison (for n = 64):

| Function    | Value      | Growth Rate    |
|-------------|------------|----------------|
| log₂ 64     | 6          | Very slow      |
| 64          | 64         | Linear         |
| 64 log₂ 64  | 384        | Moderate       |
| 64²         | 4096       | Fast           |
| 2⁶⁴         | ≈1.8×10¹⁹  | Extremely fast |

Rule: Even a small increase in growth rate dominates over time.
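The comparison above can be reproduced with a few lines of Python (an illustrative snippet, not part of the original notes):

```python
import math

n = 64
rows = [
    ("log2(n)",   math.log2(n)),        # 6
    ("n",         n),                   # 64
    ("n*log2(n)", n * math.log2(n)),    # 384
    ("n^2",       n ** 2),              # 4096
    ("2^n",       2 ** n),              # about 1.8 * 10^19
]
for name, value in rows:
    print(f"{name:>10}: {value:.0f}")
```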

5. Performance Measurement

Ways to measure algorithm performance:

  1. Theoretical Analysis → Using asymptotic notation (Big-O, Omega, Theta)
  2. Empirical Analysis → Actual running time on real machines (affected by hardware, language, compiler)

Note: Theoretical analysis is preferred because it is machine-independent.

6. Sorting & Order Statistics

A. Shell Sort

  • Improvement over Insertion Sort
  • Compares elements that are far apart first (using gaps)
  • Gap sequence: usually n/2, n/4, ..., 1

Example: Array: [8, 3, 7, 4, 9, 2, 6]
Gap = 3 → Gapped insertion sort on the subsequences (8, 4, 6), (3, 9), (7, 2) → [4, 3, 2, 6, 9, 7, 8]
Gap = 1 → Normal insertion sort → Final: [2, 3, 4, 6, 7, 8, 9]

Time Complexity: roughly between O(n¹·²) and O(n¹·⁵), depending on the gap sequence — better than Insertion Sort's O(n²)
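A Python sketch of Shell Sort using the n/2, n/4, ..., 1 gap sequence described above (one common variant; the gap sequence is a design choice):

```python
def shell_sort(arr):
    """Sort arr in place using gap sequence n//2, n//4, ..., 1."""
    n = len(arr)
    gap = n // 2
    while gap > 0:
        # Gapped insertion sort: sort each gap-separated subsequence.
        for i in range(gap, n):
            temp = arr[i]
            j = i
            while j >= gap and arr[j - gap] > temp:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = temp
        gap //= 2
    return arr

print(shell_sort([8, 3, 7, 4, 9, 2, 6]))  # [2, 3, 4, 6, 7, 8, 9]
```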

B. Quick Sort

  • Divide and Conquer + In-place sorting
  • Choose a pivot, partition array into < pivot and > pivot, recurse

Steps:

  1. Pick pivot (e.g., last element)
  2. Partition: rearrange so left < pivot, right > pivot
  3. Recurse on left and right subarrays

Example: [5, 3, 8, 4, 9, 1, 6, 2, 7], pivot = 7 (last element)
After Lomuto partition: [5, 3, 4, 1, 6, 2] 7 [8, 9]
Continue recursively → Final sorted array

Complexity:

  • Best/Average: O(n log n)
  • Worst: O(n²) → when already sorted & pivot is min/max
  • Fix: Use randomized pivot or median-of-three
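The steps above can be sketched in Python with the Lomuto partition scheme (last element as pivot, matching the example):

```python
def partition(arr, low, high):
    """Lomuto partition: put pivot arr[high] in its final position."""
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] <= pivot:          # move elements <= pivot to the left side
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1                     # pivot's final index

def quick_sort(arr, low=0, high=None):
    """In-place quicksort; O(n log n) average, O(n²) worst case."""
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quick_sort(arr, low, p - 1)   # recurse on left of pivot
        quick_sort(arr, p + 1, high)  # recurse on right of pivot
    return arr

print(quick_sort([5, 3, 8, 4, 9, 1, 6, 2, 7]))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```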

C. Merge Sort

  • Classic Divide and Conquer
  • Divide → Sort → Merge

Steps:

  1. Divide array into two halves
  2. Recursively sort both halves
  3. Merge two sorted halves

Example: [38, 27, 43, 3, 9, 82, 10]
→ Split → [38,27,43] [3,9,82,10]
→ Sort → [27,38,43] [3,9,10,82]
→ Merge → [3,9,10,27,38,43,82]

Complexity:

  • Always O(n log n) – stable and predictable
  • Space: O(n) → needs extra array for merging
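The divide–sort–merge steps can be sketched in Python as follows (this version returns a new list rather than sorting in place):

```python
def merge_sort(arr):
    """Return a new sorted list; O(n log n) time, O(n) extra space."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # recursively sort both halves
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps equal elements in order (stable)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])           # append whichever half has leftovers
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```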

D. Heap Sort (Using Max-Heap)

  • Build Max-Heap → Repeatedly extract max

Steps:

  1. Build max-heap from array
  2. Swap root (max) with last element
  3. Reduce heap size by 1, heapify root
  4. Repeat until sorted

Example: [10, 20, 15, 30, 40]
→ Build Max Heap: [40, 30, 15, 10, 20]
→ Extract max repeatedly → [10, 15, 20, 30, 40]

Complexity:

  • Time: O(n log n) always
  • Space: O(1) → In-place
  • Not stable
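The build-heap and extract-max steps above correspond to this Python sketch:

```python
def heapify(arr, n, i):
    """Sift arr[i] down so the subtree rooted at i is a max-heap."""
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    """In-place heap sort: O(n log n) time, O(1) extra space, not stable."""
    n = len(arr)
    # Step 1: build a max-heap bottom-up.
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Steps 2-4: swap root (max) to the end, shrink the heap, re-heapify.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        heapify(arr, end, 0)
    return arr

print(heap_sort([10, 20, 15, 30, 40]))  # [10, 15, 20, 30, 40]
```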

7. Comparison of Sorting Algorithms

| Algorithm      | Best Time  | Average Time | Worst Time | Space    | Stable? | In-place? |
|----------------|------------|--------------|------------|----------|---------|-----------|
| Bubble Sort    | O(n)       | O(n²)        | O(n²)      | O(1)     | Yes     | Yes       |
| Insertion Sort | O(n)       | O(n²)        | O(n²)      | O(1)     | Yes     | Yes       |
| Selection Sort | O(n²)      | O(n²)        | O(n²)      | O(1)     | No      | Yes       |
| Shell Sort     | –          | O(n¹·³)      | O(n¹·⁵)    | O(1)     | No      | Yes       |
| Merge Sort     | O(n log n) | O(n log n)   | O(n log n) | O(n)     | Yes     | No        |
| Quick Sort     | O(n log n) | O(n log n)   | O(n²)      | O(log n) | No      | Yes       |
| Heap Sort      | O(n log n) | O(n log n)   | O(n log n) | O(1)     | No      | Yes       |

8. Sorting in Linear Time (O(n))

Possible when data has special properties (limited range)

Counting Sort

  • For integers in small range [0, k]
  • Count occurrences → Build output using cumulative count

Example: [4, 2, 1, 4, 3, 2] (k=4)
Count: [0, 1, 2, 1, 2] → Cumulative: [0, 1, 3, 4, 6]
Output: [1,2,2,3,4,4]

Time: O(n + k), Space: O(k)
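The count–accumulate–place steps can be sketched in Python as follows (the backwards pass is what makes the sort stable):

```python
def counting_sort(arr, k):
    """Sort non-negative integers in [0, k]; O(n + k) time, stable."""
    count = [0] * (k + 1)
    for x in arr:                     # count occurrences of each value
        count[x] += 1
    for v in range(1, k + 1):         # cumulative counts: count[v] = #elements <= v
        count[v] += count[v - 1]
    output = [0] * len(arr)
    for x in reversed(arr):           # backwards pass preserves stability
        count[x] -= 1
        output[count[x]] = x
    return output

print(counting_sort([4, 2, 1, 4, 3, 2], k=4))  # [1, 2, 2, 3, 4, 4]
```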

Radix Sort

  • Sort digits from LSD to MSD
  • Use stable sort (like Counting Sort) for each digit

Example: 170, 45, 75, 90, 802, 2
→ Sort units → tens → hundreds → Final: 2, 45, 75, 90, 170, 802

Time: O(d(n + k)) where d = number of digits
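A compact Python sketch of LSD radix sort; for brevity this version uses digit buckets (which are order-preserving, hence stable) in place of a full counting-sort pass:

```python
def radix_sort(arr):
    """LSD radix sort for non-negative integers: O(d(n + k)) time."""
    if not arr:
        return arr
    place = 1
    while max(arr) // place > 0:
        # Stable pass on the digit at `place`: buckets keep insertion order.
        buckets = [[] for _ in range(10)]
        for x in arr:
            buckets[(x // place) % 10].append(x)
        arr = [x for bucket in buckets for x in bucket]
        place *= 10                   # move to the next digit (units → tens → ...)
    return arr

print(radix_sort([170, 45, 75, 90, 802, 2]))  # [2, 45, 75, 90, 170, 802]
```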

Summary Table (Most Important Sorting Algorithms)

| Algorithm     | Time (Worst) | Space    | Stable | When to Use                           |
|---------------|--------------|----------|--------|---------------------------------------|
| Merge Sort    | O(n log n)   | O(n)     | Yes    | Need stability, external sorting      |
| Quick Sort    | O(n²)        | O(log n) | No     | Fastest in practice (average)         |
| Heap Sort     | O(n log n)   | O(1)     | No     | Guaranteed O(n log n), no extra space |
| Counting Sort | O(n + k)     | O(k)     | Yes    | Small integer range                   |
| Radix Sort    | O(d(n + k))  | O(n + k) | Yes    | Fixed-length integers/strings         |

These notes combine conceptual explanations with worked examples, covering the core material for exams and interviews.