Asynchronous Code & Promises

Node.js is a cross-platform runtime environment and library for running JavaScript applications outside of the browser.  It uses non-blocking I/O and asynchronous events to create a highly performant environment built on Google’s V8 engine.

The engine consists of the memory heap (where memory allocations happen) and the call stack (where stack frames sit as code executes).

Asynchronous callbacks allow us to execute heavy code without blocking the UI and making the browser unresponsive.

You can execute any chunk of code asynchronously by using setTimeout(callback, milliseconds).  setTimeout doesn’t put your callback on the callback queue right away.  It sets up a timer, and when that timer expires, your callback is added to the callback queue to be executed.
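A minimal sketch of this ordering (the array name `order` is my own): even with a 0 ms delay, the setTimeout callback waits in the callback queue until all synchronous code has finished and the call stack is empty.

```javascript
var order = [];

order.push("sync 1");

// The timer is 0 ms, but the callback still waits in the callback
// queue until the call stack is empty.
setTimeout(function () {
  order.push("timeout callback");
}, 0);

order.push("sync 2");
// order is now ["sync 1", "sync 2"]; "timeout callback" is appended later.
```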

The event loop’s job is to monitor the call stack and the callback queue.  If the call stack is empty, it takes the first event from the callback queue and pushes it onto the call stack, which effectively runs it.

Below is an awesome gif showing this process of queueing up asynchronous callbacks and executing them.

Promises are the solution to callback hell.

  • A promise is an object that may produce a single value some time in the future: either a resolved value, or a reason that it’s rejected.
  • A promise may be in one of three possible states: fulfilled, rejected, or pending.
  • All promise instances get a then method which allows you to react to the promise’s eventual value.  then callbacks can also be chained.
  • The catch callback is executed when the promise is rejected.
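The bullets above can be sketched as follows (the function name `delayedDouble` is invented for illustration):

```javascript
// Returns a promise that resolves with n * 2 after a short delay,
// or rejects immediately if n is not a number.
function delayedDouble(n) {
  return new Promise(function (resolve, reject) {
    if (typeof n !== "number") {
      reject(new Error("not a number"));
      return;
    }
    setTimeout(function () { resolve(n * 2); }, 10);
  });
}

delayedDouble(21)
  .then(function (result) { return result + 1; })   // chained: receives 42, returns 43
  .then(function (result) { console.log(result); }) // prints 43
  .catch(function (err) { console.error(err.message); }); // runs only on rejection
```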


Time Complexity

Time complexity describes how many operations an algorithm has to do relative to its input size.

slow-growing: runtime increases slowly (or not at all) with additional input
fast-growing: runtime increases very quickly with additional input

O(1):  An algorithm runs in constant time if it requires the same amount of time regardless of the input size.  (e.g. array lookup).

O(log N): runtime grows logarithmically; each time the input doubles, only a constant amount of work is added.  (e.g. binary search – halving the search space at each step).

O(n): runtime increases at a rate that’s directly proportional to the input size.  (e.g. a for loop to iterate over every item in an array).

O(n^2): runtime increases proportionally to the square of the input size.  (e.g. two nested for loops)

O(c^n): runtime multiplies by a constant factor c for each additional element of input, so it explodes even for modest input sizes.  (e.g. guessing password combos)


Big O

Big O notation describes the performance or complexity of an algorithm.


O(1) – Constant time complexity – describes an algorithm (often a single statement) that will always execute in the same time (or space) regardless of the size of the input data set. An example is accessing a value of an array by index.

var arr = [1, 2, 3, 4, 5];
arr[2]; // => 3


O(N) – Linear time complexity – describes an algorithm (usually a loop) whose performance will grow linearly and in direct proportion to the size of the input data set. For example, if the array has 10 items, we have to print 10 times. If it has 1,000 items, we have to print 1,000 times.

// using a for loop to print out every value in the array
for (var i = 0; i < array.length; i++) {
  console.log(array[i]);
}


O(log N) – Logarithmic time complexity – describes an algorithm where you have a large set of data and you halve the dataset at each iteration until you get the result that you’re looking for.  An example of this is finding a word in a dictionary (binary search).   Sorting a deck of cards (merge sort) would be O(N log N).

Other examples:

Example 1 (i doubles each iteration, so the loop runs about log2(n) times):
for (var i = 1; i < n; i = i * 2)
Example 2 (i halves each iteration, so it is also O(log N)):
for (var i = n; i >= 1; i = i / 2)
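A complete binary search, as a sketch: it halves the search space each iteration, so the worst case is O(log n) comparisons on a sorted array.

```javascript
// Returns the index of target in the sorted array, or -1 if absent.
function binarySearch(sorted, target) {
  var lo = 0;
  var hi = sorted.length - 1;
  while (lo <= hi) {
    var mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1; // discard the lower half
    else hi = mid - 1;                      // discard the upper half
  }
  return -1; // not found
}

binarySearch([1, 3, 5, 7, 9, 11], 7); // → 3
```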


O(N^2) – Quadratic time complexity – represents an algorithm whose performance is directly proportional to the square of the size of the input data set. This is common with algorithms that involve nested iterations over the data set. Deeper nested iterations will result in O(N^3), O(N^4) etc.  Examples include checking for duplicates in a deck of cards, bubble sort, selection sort, or insertion sort.

for (var i = 0; i < length; i++) {     // the outer loop alone is O(n)
    for (var j = 0; j < length; j++) { // nesting makes the whole thing O(n^2)
      // constant-time work here
    }
}


O(2^N) – Exponential time complexity – denotes an algorithm whose growth doubles with each addition to the input data set. An example of an O(2^N) function is the naive recursive calculation of Fibonacci numbers.  Another example is trying to break a password by testing every possible combination (assuming a numerical password of length N).
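The Fibonacci example, sketched naively: each call spawns two more calls, so the call tree roughly doubles per level, giving exponential time.

```javascript
// Naive recursive Fibonacci: O(2^n) time because fib(n) recomputes
// the same subproblems over and over.
function fib(n) {
  if (n < 2) return n;
  return fib(n - 1) + fib(n - 2);
}

fib(10); // → 55
```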

Amortized time: 

If you do an operation say a million times, you don’t really care about the worst-case or the best-case of that operation – what you care about is how much time is taken in total when you repeat the operation a million times.

So it doesn’t matter if the operation is very slow once in a while, as long as “once in a while” is rare enough for the slowness to be diluted away (that the cost is “amortized”).  Essentially, amortized time means “average time taken per operation, if you do many operations”.
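The classic example is a dynamic array. A minimal sketch (the `DynArray` name and `copies` counter are my own, added to make the cost visible): the backing array doubles when full, so an occasional O(n) copy is spread across many cheap O(1) pushes.

```javascript
function DynArray() {
  this.capacity = 1;
  this.length = 0;
  this.items = new Array(1);
  this.copies = 0; // counts element copies, to show the amortized cost
}

DynArray.prototype.push = function (value) {
  if (this.length === this.capacity) {
    // Rare, expensive case: double the capacity and copy everything.
    this.capacity *= 2;
    var bigger = new Array(this.capacity);
    for (var i = 0; i < this.length; i++) {
      bigger[i] = this.items[i];
      this.copies++;
    }
    this.items = bigger;
  }
  // Common, cheap O(1) case.
  this.items[this.length++] = value;
};

var a = new DynArray();
for (var i = 0; i < 1000; i++) a.push(i);
// Total copies stay under 2 × pushes, so the cost per push is O(1) amortized.
```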

Source: stackoverflow

Space complexity: 

Sometimes we want to optimize for using less memory instead of (or in addition to) using less time. Talking about memory cost (or “space complexity”) is very similar to talking about time cost and we also use the Big O notation.
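As a sketch of the distinction (both function names are my own): these two functions take O(n) time, but the first uses O(1) extra space while the second builds an O(n) result array.

```javascript
// O(1) extra space: only one accumulator variable.
function sum(arr) {
  var total = 0;
  for (var i = 0; i < arr.length; i++) total += arr[i];
  return total;
}

// O(n) extra space: builds an array of running totals.
function runningTotals(arr) {
  var totals = [];
  var total = 0;
  for (var i = 0; i < arr.length; i++) {
    total += arr[i];
    totals.push(total);
  }
  return totals;
}

sum([1, 2, 3, 4]);           // → 10
runningTotals([1, 2, 3, 4]); // → [1, 3, 6, 10]
```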

Big O Cheatsheet.
