LeetCode #315: Count of Smaller Numbers After Self
The question is as follows:
You are given an integer array nums and you have to return a new counts array. The counts array has the property where counts[i] is the number of smaller elements to the right of nums[i].
Example:
Given nums = [5, 2, 6, 1]
To the right of 5 there are 2 smaller elements (2 and 1).
To the right of 2 there is only 1 smaller element (1).
To the right of 6 there is 1 smaller element (1).
To the right of 1 there are 0 smaller elements.
Return the array [2, 1, 1, 0].
The trivial solution is pretty obvious. For every number in our nums array, we just iterate through the rest of the array and count all of the numbers that are less than that specific number. The runtime of this solution would be O(n²). Can we do better?
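For concreteness, a minimal Python sketch of that brute-force idea (the function name is just illustrative) could look like:

```python
def count_smaller_brute_force(nums):
    # for each index, scan everything to its right and count strictly smaller values
    return [sum(1 for right in nums[i + 1:] if right < nums[i])
            for i in range(len(nums))]
```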
One clever solution is to build a binary search tree. To do this, we will need to define a new TreeNode class that keeps track of a value from the array as well as how many elements are less than that value. As you can infer, this is some variation of dynamic programming: if we iterate through the nums array from the highest index to the lowest, every node will remember how many lesser values we have already processed. Thus, as a by-product of the tree traversal we perform while inserting, we can build the result we want to return. This solution is better visualized in code.
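A Python sketch along these lines is below; the count field for duplicates and the running_sum parameter are one possible way to realize the idea, not the only one:

```python
class TreeNode:
    def __init__(self, val):
        self.val = val
        self.lessThan = 0   # values smaller than val inserted so far (size of left subtree)
        self.count = 1      # duplicates of val (assumed here to handle repeated values)
        self.left = None
        self.right = None

def countSmaller(nums):
    result = [0] * len(nums)
    root = None
    # iterate from the highest index to the lowest
    for i in range(len(nums) - 1, -1, -1):
        root = insert(root, nums[i], result, i, 0)
    return result

def insert(node, val, result, i, running_sum):
    if node is None:
        # a freshly inserted node starts with lessThan = 0
        result[i] = running_sum
        return TreeNode(val)
    if val < node.val:
        node.lessThan += 1            # the new value goes left, so it is smaller than node
        node.left = insert(node.left, val, result, i, running_sum)
    elif val > node.val:
        # everything in node's left subtree, plus node and its duplicates, is smaller
        node.right = insert(node.right, val, result, i,
                            running_sum + node.lessThan + node.count)
    else:
        node.count += 1               # duplicate: nothing new counts as smaller
        result[i] = running_sum + node.lessThan
    return node
```

On [5, 2, 6, 1], this fills result from the back: 1 gets 0, 6 gets 1, 2 gets 1, and 5 gets 2.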
Tracing this code should be pretty standard. We only update the lessThan field of a TreeNode when we traverse left. Note that when we traverse right, we have to pay close attention to duplicates. Furthermore, note that once we finally insert a TreeNode, we set that TreeNode’s lessThan attribute equal to 0; that value only gets updated by later traversals. We could avoid this by getting rid of the running sum parameter in our recursive function and instead letting every TreeNode’s lessThan attribute serve that purpose.
This solution runs in O(n log n) in the average case. However, if the nums array is given to us sorted in ascending or descending order, that runtime degenerates into O(n²). We can fix this by using a self-balancing BST (an AVL tree or a red-black tree) while retaining the same logic.
Now, a better solution that runs in O(n log n) in both the worst and average cases is a variation of merge sort. The main idea is that since merge sort is a stable sort, we just need to keep track of how many times an element from the right half “leaps” over an element from the left half during the merging step; every such leap means a smaller element sits to that left element’s right in the original array.
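A Python sketch of that merge-sort idea (sorting indices so each count can be attributed back to its original position; the helper names are assumptions) could look like:

```python
def count_smaller_merge_sort(nums):
    counts = [0] * len(nums)
    indices = list(range(len(nums)))      # sort indices so counts map back to positions

    def sort(lo, hi):
        # sorts indices[lo:hi] by value and tallies "leaps" during the merge
        if hi - lo <= 1:
            return
        mid = (lo + hi) // 2
        sort(lo, mid)
        sort(mid, hi)
        merged = []
        i, j = lo, mid
        while i < mid and j < hi:
            if nums[indices[j]] < nums[indices[i]]:
                merged.append(indices[j])  # a right-half element leaps over the left half
                j += 1
            else:
                # j - mid right-half elements (all smaller) have already leapt past this one
                counts[indices[i]] += j - mid
                merged.append(indices[i])
                i += 1
        while i < mid:                     # remaining left elements: every taken right element leapt past them
            counts[indices[i]] += j - mid
            merged.append(indices[i])
            i += 1
        merged.extend(indices[j:hi])
        indices[lo:hi] = merged

    sort(0, len(nums))
    return counts
```

The tie-breaking in the merge (taking the left element when values are equal) is what keeps the sort stable and prevents equal values from being counted as smaller.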