Comparing algorithms

Algorithms are instructions on how to perform a task. Not all algorithms are created equal: some are fast, some are excruciatingly slow. Most algorithms are affected by the size of the input (typically denoted by n). The first step to improving code performance is measuring it.

Can you time how long it takes to run a computer program? Of course you can. However, the same program run on a smartwatch, a cellphone, or a desktop computer will take different amounts of time. Wouldn’t it be great if we could compare algorithms regardless of the hardware they run on? That’s what time complexity is for.
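
For instance, here is a minimal timing sketch using Node.js’s built-in perf_hooks module (sumTo is a made-up workload, not part of this chapter’s examples):

const { performance } = require('perf_hooks');

function sumTo(n) {
  // hypothetical workload: add the numbers from 1 to n
  let total = 0;
  for (let i = 1; i <= n; i++) total += i;
  return total;
}

const start = performance.now();
sumTo(10000000);
console.log(`Took ${performance.now() - start} ms`); // varies per machine and per run

The printed number differs between machines, and even between runs on the same machine, which is exactly the problem time complexity sidesteps.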

But why stop at time? It would be just as useful to compare the memory usage of different algorithms. That’s what space complexity is for.

So we have some questions to answer:

  • What is the best way to measure the performance of an algorithm independent of the hardware?
  • What is Big O notation, and how is it used to compare algorithms?
  • How can algorithm analysis be used to improve program speed?

Time complexity

Time complexity, in computer science, is a function that describes the number of operations a program will execute for an input of size n.

How do we get a function that gives us the number of operations a program will execute? We count line by line, paying special attention to code inside loops. Here’s an example:

function getMin(array) {
  let min;    // 1 operation

  for (let index = 0; index < array.length; index++) {  // init: 1 operation; body runs n times
    const element = array[index];                       // 1 operation
    if (min === undefined || element < min) {           // 1 operation
      min = element;       // 1 operation (worst case)
    }
  }

  return min;  // 1 operation
}

Total number of operations = 3n + 3
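
As a quick sanity check with made-up inputs:

getMin([7, 3, 9, 1]); // returns 1; the loop body ran for all 4 elements
getMin([]);           // returns undefined; the loop body never ran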

Space complexity

Space complexity works like time complexity, except that instead of counting the number of operations, we measure the memory the algorithm uses in addition to the input.

In the example above, we create a single variable, min, so the space complexity is 1. If we had copied the array for some reason, the space complexity would have been n.
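
For instance, a hypothetical variant (not the chapter’s getMin) that finds the minimum by sorting a copy of the array; the copy alone raises the space complexity to n:

function getMinWithCopy(array) {
  const copy = [...array];    // a second array of size n: space grows with the input
  copy.sort((a, b) => a - b); // sort the copy ascending, leaving the original intact
  return copy[0];             // smallest element (undefined for an empty array)
}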

Simplifying complexity with asymptotic analysis

Asymptotic analysis describes the behaviour of functions as their inputs approach infinity. In the example above, we found the time complexity to be 3n + 3. So,

n (size)        Operations              Total
10              3(10) + 3               33
10,000          3(10,000) + 3           30,003
1,000,000       3(1,000,000) + 3        3,000,003

As the input size n grows, the expression 3n + 3 gets closer and closer to 3n alone. At large scales, the highest-order term of the function matters far more than the lower-order terms and constants.

There’s a notation called Big O, where O refers to the order of the function. A program with a time complexity of 7n³ + 3n² + 5 can be expressed in Big O notation as O(n³). The other terms (3n² + 5) become increasingly insignificant as the input grows.
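
A quick numeric check (a sketch, with arbitrary values of n) shows the lower-order terms fading:

function exact(n) { return 7 * n ** 3 + 3 * n ** 2 + 5; }
function dominant(n) { return 7 * n ** 3; }

for (const n of [10, 1000, 100000]) {
  console.log(n, exact(n) / dominant(n)); // the ratio approaches 1 as n grows
}

At n = 10 the ratio is about 1.04; by n = 100,000 it is within a few millionths of 1.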

All algorithms have three scenarios (a linear-search sketch after this list makes them concrete):

  • Best-case: the most favourable input, so the program takes the fewest operations to complete. E.g. an already sorted array benefits some sorting algorithms.
  • Average-case: the most common case. E.g. array items in random order for a sorting algorithm.
  • Worst-case: the input is arranged so that the program takes the longest to complete. E.g. array items in reverse order for some sorting algorithms.
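
To make the scenarios concrete, consider a hypothetical linear search; where the target sits decides which scenario we hit:

function linearSearch(array, target) {
  for (let i = 0; i < array.length; i++) {
    if (array[i] === target) return i; // found: stop early
  }
  return -1; // not found: every element was checked
}

linearSearch([4, 8, 15], 4);  // best case: one comparison, the target is first
linearSearch([4, 8, 15], 99); // worst case: n comparisons, the target is absent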

Big O notation only keeps the “biggest” (highest-order) term of the time/space complexity and, by convention, describes the worst-case scenario.