Sorting Algorithm Visualizer

Visualize and understand how different sorting algorithms work with animated step-by-step demonstrations.

About Sorting Algorithm Visualizer

Our Sorting Algorithm Visualizer helps you understand how different sorting algorithms work through animated visualizations. Watch as elements are compared, swapped, and sorted in real-time. Perfect for students learning algorithms, developers preparing for interviews, and educators teaching computer science concepts.

How to Use the Sorting Algorithm Visualizer

  1. Select a sorting algorithm from the dropdown (Bubble, Selection, Insertion, Merge, Quick, or Heap Sort)
  2. Choose array size (10, 20, 30, 50, or 100 elements)
  3. Set animation speed (Slow, Medium, or Fast)
  4. Click "New Array" to generate a random unsorted array
  5. Click "Start Sorting" to begin the visualization
  6. Watch the algorithm sort the array with color-coded animations
  7. View real-time statistics: comparison count, swap count, and the algorithm's time and space complexity
  8. Click "Stop" to pause the animation at any time

Sorting Algorithms Explained

Bubble Sort

Bubble Sort repeatedly steps through the list, compares adjacent elements, and swaps them if they're in the wrong order. The pass through the list is repeated until the list is sorted. Larger elements "bubble" to the end of the array.

  • Time Complexity: O(n²) - quadratic
  • Space Complexity: O(1) - constant
  • Best For: Small datasets, educational purposes
  • Stable: Yes
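
For reference, here is a minimal Python sketch of bubble sort (illustrative only, not the visualizer's own code). It includes the early-termination check mentioned under Optimization Tips below.

    def bubble_sort(arr):
        """Sort arr in place; stop early once a pass makes no swaps."""
        n = len(arr)
        for i in range(n - 1):
            swapped = False
            # After i passes, the last i elements are already in their final place.
            for j in range(n - 1 - i):
                if arr[j] > arr[j + 1]:
                    arr[j], arr[j + 1] = arr[j + 1], arr[j]
                    swapped = True
            if not swapped:  # no swaps means the array is already sorted
                break
        return arr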

Selection Sort

Selection Sort divides the array into sorted and unsorted regions. It repeatedly selects the smallest (or largest) element from the unsorted region and moves it to the end of the sorted region.

  • Time Complexity: O(n²) - quadratic
  • Space Complexity: O(1) - constant
  • Best For: Small datasets, minimizing swaps
  • Stable: No
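
A short Python sketch of the same idea (an illustration, not the visualizer's implementation). Note that it performs at most n - 1 swaps, which is why selection sort suits cases where writes are expensive.

    def selection_sort(arr):
        """Sort arr in place using at most n - 1 swaps."""
        n = len(arr)
        for i in range(n - 1):
            # Find the index of the smallest element in the unsorted region.
            min_idx = i
            for j in range(i + 1, n):
                if arr[j] < arr[min_idx]:
                    min_idx = j
            # Append it to the end of the sorted region.
            if min_idx != i:
                arr[i], arr[min_idx] = arr[min_idx], arr[i]
        return arr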

Insertion Sort

Insertion Sort builds the final sorted array one item at a time. It takes each element and inserts it into its correct position in the already sorted portion of the array.

  • Time Complexity: O(n²) - quadratic (O(n) best case)
  • Space Complexity: O(1) - constant
  • Best For: Small or nearly sorted datasets
  • Stable: Yes
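
The same idea as a compact Python sketch (illustrative only): the prefix arr[:i] is always sorted, and each new element is shifted into place within it.

    def insertion_sort(arr):
        """Sort arr in place by growing a sorted prefix one element at a time."""
        for i in range(1, len(arr)):
            key = arr[i]
            j = i - 1
            # Shift larger elements of the sorted prefix one slot to the right.
            while j >= 0 and arr[j] > key:
                arr[j + 1] = arr[j]
                j -= 1
            arr[j + 1] = key
        return arr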

Merge Sort

Merge Sort is a divide-and-conquer algorithm that divides the array into halves, recursively sorts them, and then merges the sorted halves back together. It's one of the most efficient sorting algorithms.

  • Time Complexity: O(n log n) - linearithmic
  • Space Complexity: O(n) - linear
  • Best For: Large datasets, linked lists
  • Stable: Yes
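
A straightforward Python sketch of the divide-and-merge idea (illustrative only; it returns a new list rather than sorting in place, which is where the O(n) extra space goes).

    def merge_sort(arr):
        """Return a new sorted list built by recursive splitting and merging."""
        if len(arr) <= 1:
            return arr
        mid = len(arr) // 2
        left = merge_sort(arr[:mid])
        right = merge_sort(arr[mid:])
        # Merge the two sorted halves; <= keeps equal elements in order (stable).
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged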

Quick Sort

Quick Sort selects a "pivot" element and partitions the array around it, placing smaller elements before it and larger elements after it. It then recursively sorts the two sub-arrays. In practice it is one of the fastest general-purpose sorts.

  • Time Complexity: O(n log n) average, O(n²) worst case
  • Space Complexity: O(log n) - logarithmic
  • Best For: Large datasets, general purpose
  • Stable: No (typical implementation)
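
An illustrative in-place Python sketch using the Lomuto partition scheme with the last element as pivot (one common variant; not necessarily what the visualizer animates).

    def quick_sort(arr, lo=0, hi=None):
        """Sort arr[lo..hi] in place (Lomuto partition, last element as pivot)."""
        if hi is None:
            hi = len(arr) - 1
        if lo >= hi:
            return arr
        pivot = arr[hi]
        i = lo
        # Move every element smaller than the pivot to the front of the range.
        for j in range(lo, hi):
            if arr[j] < pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[hi] = arr[hi], arr[i]  # place the pivot in its final slot
        quick_sort(arr, lo, i - 1)
        quick_sort(arr, i + 1, hi)
        return arr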

Heap Sort

Heap Sort builds a max heap from the array, then repeatedly swaps the maximum element to the end and restores the heap property on the remaining elements. It combines merge sort's O(n log n) running time with insertion sort's in-place, constant-space operation.

  • Time Complexity: O(n log n) - linearithmic
  • Space Complexity: O(1) - constant
  • Best For: Large datasets, memory-constrained systems
  • Stable: No
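
A compact Python sketch of the sift-down approach (illustrative only): heapify the whole array, then shrink the heap one element at a time.

    def heap_sort(arr):
        """Sort arr in place: build a max heap, then repeatedly pop the maximum."""
        n = len(arr)

        def sift_down(root, end):
            # Push arr[root] down until the subtree rooted there is a max heap.
            while 2 * root + 1 < end:
                child = 2 * root + 1
                if child + 1 < end and arr[child + 1] > arr[child]:
                    child += 1
                if arr[root] >= arr[child]:
                    return
                arr[root], arr[child] = arr[child], arr[root]
                root = child

        # Heapify: start from the last internal node and sift down to the root.
        for i in range(n // 2 - 1, -1, -1):
            sift_down(i, n)
        # Repeatedly swap the maximum to the end and shrink the heap.
        for end in range(n - 1, 0, -1):
            arr[0], arr[end] = arr[end], arr[0]
            sift_down(0, end)
        return arr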

Color Legend

  • Gray (Default): Unsorted elements
  • Yellow (Comparing): Elements being compared
  • Red (Swapping): Elements being swapped
  • Green (Sorted): Elements in their final sorted position

Complexity Comparison

  Algorithm      | Time (Best) | Time (Average) | Time (Worst) | Space
  Bubble Sort    | O(n)        | O(n²)          | O(n²)        | O(1)
  Selection Sort | O(n²)       | O(n²)          | O(n²)        | O(1)
  Insertion Sort | O(n)        | O(n²)          | O(n²)        | O(1)
  Merge Sort     | O(n log n)  | O(n log n)     | O(n log n)   | O(n)
  Quick Sort     | O(n log n)  | O(n log n)     | O(n²)        | O(log n)
  Heap Sort      | O(n log n)  | O(n log n)     | O(n log n)   | O(1)

Common Use Cases

  • Learning Algorithms: Understand how sorting works visually
  • Interview Preparation: Practice explaining sorting algorithms
  • Teaching: Demonstrate algorithm concepts to students
  • Algorithm Comparison: Compare efficiency of different approaches
  • Debugging: Visualize algorithm behavior for debugging
  • Performance Analysis: Understand why certain algorithms are faster

When to Use Each Algorithm

  • Bubble Sort: Educational purposes, very small datasets
  • Selection Sort: When memory writes are expensive
  • Insertion Sort: Small or nearly sorted datasets, online sorting
  • Merge Sort: Stable sort needed, linked lists, external sorting
  • Quick Sort: General purpose, average case performance critical
  • Heap Sort: Guaranteed O(n log n), memory-constrained systems

Key Concepts

  • Comparison-Based: All six algorithms sort by comparing elements to one another
  • In-Place: Bubble, Selection, Insertion, and Heap Sort use O(1) extra space
  • Stable: Bubble, Insertion, and Merge Sort maintain the relative order of equal elements
  • Adaptive: Insertion Sort and Bubble Sort perform better on partially sorted data
  • Divide-and-Conquer: Merge Sort and Quick Sort use this strategy
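
As a quick illustration of stability, sorting some made-up (name, score) records by score with a stable sort keeps records that tie on score in their original order:

    # Hypothetical records: two students share the same score.
    records = [("Ann", 90), ("Bob", 75), ("Cid", 90), ("Dee", 60)]

    # Python's built-in sort (Timsort) is stable, so Ann stays ahead of Cid.
    by_score = sorted(records, key=lambda r: r[1])
    print(by_score)  # [('Dee', 60), ('Bob', 75), ('Ann', 90), ('Cid', 90)]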

Real-World Applications

  • Database Systems: Use merge sort for external sorting of large datasets
  • Operating Systems: Use heap sort for priority queues
  • Programming Languages: Many use hybrid algorithms (Timsort in Python, Introsort in C++)
  • Search Engines: Sort search results by relevance
  • E-commerce: Sort products by price, rating, popularity
  • Data Analysis: Sort data for visualization and analysis

Optimization Tips

  • Bubble Sort: Add early termination if no swaps occur
  • Quick Sort: Use median-of-three for pivot selection
  • Insertion Sort: Use binary search to find the insertion position (fewer comparisons, though shifting elements still costs O(n²) in the worst case)
  • Hybrid Approaches: Switch to insertion sort for small subarrays
  • Parallel Processing: Merge sort and quick sort can be parallelized
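
As an example of the pivot tip above, here is a small median-of-three helper that could be paired with the quick sort sketch shown earlier (a sketch under those assumptions, not a prescribed implementation):

    def median_of_three(arr, lo, hi):
        """Return the index of the median of arr[lo], arr[mid], arr[hi]."""
        mid = (lo + hi) // 2
        # Order the three candidate values so arr[mid] ends up holding the median.
        if arr[lo] > arr[mid]:
            arr[lo], arr[mid] = arr[mid], arr[lo]
        if arr[lo] > arr[hi]:
            arr[lo], arr[hi] = arr[hi], arr[lo]
        if arr[mid] > arr[hi]:
            arr[mid], arr[hi] = arr[hi], arr[mid]
        return mid

With the Lomuto scheme, you would swap the element at the returned index into the last position before partitioning so that it becomes the pivot.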

Frequently Asked Questions

Which sorting algorithm is the fastest?

It depends on the data and constraints. Quick Sort is generally fastest for average cases. Merge Sort guarantees O(n log n). For small or nearly sorted data, Insertion Sort can be fastest.

Why is Quick Sort popular despite O(n²) worst case?

Quick Sort has excellent cache performance, low overhead, and the worst case is rare with good pivot selection. It's very fast in practice for most real-world data.

When should I use Merge Sort over Quick Sort?

Use Merge Sort when you need stable sorting, guaranteed O(n log n) performance, or when sorting linked lists. It's also better for external sorting of large datasets.

Are these the only sorting algorithms?

No! There are many others including Radix Sort, Counting Sort, Bucket Sort, Shell Sort, and Timsort. These six are the most fundamental and commonly taught algorithms.

Learning Resources

  • Try different array sizes to see how performance scales
  • Compare algorithms side-by-side by running them multiple times
  • Implement these algorithms in your favorite programming language
  • Practice explaining each algorithm's approach and complexity
  • Solve sorting problems on coding platforms