LeetCode: Binary Search

Table of Contents
- Insight on Pointer Updates
- Limitation with duplicates
- Binary Search Application: Target Binary Search
- Binary Search Application: Multiple Layers Binary Search
- Binary Search Application: Optimization Search Min Max Binary Search
- Binary Search Application: Condition Adapted Binary Search
- Binary Search Application: Upper Ceiling or Lower Floor Trick Based on Target Binary Search
- 704. Binary Search ::2:: - Easy
- 74. Search a 2D Matrix ::2:: - Medium
- 875. Koko Eating Bananas ::2:: - Medium
- 153. Find Minimum in Rotated Sorted Array ::1:: - Medium
- 33. Search in Rotated Sorted Array ::1:: - Medium
- 981. Time Based Key Value Store ::2:: - Medium
- 162. Find Peak Element ::1:: - Medium
- 4. Median of Two Sorted Arrays ::2:: - Hard
Binary Search Intro:
Leetcode problems with elegant solutions using binary search.
What is Binary Search
Binary Search is a divide and conquer algorithm for efficiently searching sorted lists or ranges.
It works by computing a middle index, comparing the middle value to the target (or evaluating a condition), and halving the search space based on that comparison.
Why Use Binary Search
Binary search reduces the search space by half on each iteration.
Technique | Time Complexity | Space Complexity |
---|---|---|
Linear Search | O(n) | O(1) |
Binary Search (Recursive) | O(log n) | O(log n) |
Binary Search (Iterative) | O(log n) | O(1) |
Target Binary Search
Used to search for a specific value or index.
def binary_search(nums, target):
left, right = 0, len(nums) - 1
# Note 1:
# Target Binary Search: uses '<='
while left <= right:
mid = (left + right) // 2
# Note 2:
# 3 parts during iteration
# target check, remove left half, remove right half
if nums[mid] == target:
return mid
elif nums[mid] < target:
left = mid + 1
else:
right = mid - 1
# Note 3:
# Exit: loop ends when left > right
# Return value if not found
return -1
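As a quick sanity check, the function above (reproduced so the snippet runs standalone) behaves like this on a small input:

```python
def binary_search(nums, target):
    left, right = 0, len(nums) - 1
    while left <= right:
        mid = (left + right) // 2
        if nums[mid] == target:
            return mid
        elif nums[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # -> 3 (index of 7)
print(binary_search([1, 3, 5, 7, 9, 11], 4))  # -> -1 (not present)
```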
Monotonic Optimization Min/Max Binary Search
Used to find the smallest or largest value that satisfies a condition.
- Find the min element greater than or equal to the target:
In ceiling search, the goal is to locate the smallest number in a sorted range that is greater than or equal to the target.
def find_min_valid_value():
left, right = 0, N-1
# Note 1:
# Optimization Binary Search: uses '<'
while left < right:
mid = (left + right) // 2
# Note 2:
# Check if condition is satisfied
# If so, include mid with left or right half
if condition(mid):
# or left = mid, depending on which half we are checking
right = mid
else:
# or right = mid + 1, depending on which half we are removing
left = mid + 1
# Note 3:
# Exit: left == right triggers the exit
# we have found the optimal solution
# (returning right is equivalent, since left == right)
return left
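One concrete instantiation of the template above, assuming condition(mid) means nums[mid] >= target (i.e., find the leftmost index whose value reaches the target):

```python
def find_min_index_geq(nums, target):
    # condition(mid): nums[mid] >= target (monotonic: false...false, true...true)
    left, right = 0, len(nums) - 1
    while left < right:
        mid = (left + right) // 2
        if nums[mid] >= target:
            right = mid       # mid may be the answer, keep it
        else:
            left = mid + 1    # mid is invalid, discard it
    return left  # caller should still verify nums[left] >= target

print(find_min_index_geq([1, 3, 5, 7], 4))  # -> 2 (value 5)
```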
Optimization Ceiling and Floor Style Binary Search
Variant of Optimization Binary Search:
The rule is that we can search directly for ceilings, but not for floors.
You can search directly for the ceiling (smallest element >= target) with binary search because the loop naturally stops at that boundary.
def find_ceiling(nums, target):
left, right = 0, len(nums)
# Loop condition: left < right
while left < right:
mid = (left + right) // 2
# If mid element is less than target,
# the ceiling must be to the right of mid (exclude mid)
if nums[mid] < target:
left = mid + 1
else:
# nums[mid] >= target, potential ceiling candidate
# Keep mid in search space by moving right boundary to mid
right = mid
# At the end, left == right, points to the smallest element >= target if exists
# If left == len(nums), no ceiling found
if left == len(nums):
return None
return nums[left]
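Checking find_ceiling on a found case and a not-found case (function reproduced so the snippet runs standalone):

```python
def find_ceiling(nums, target):
    left, right = 0, len(nums)  # half-open interval [left, right)
    while left < right:
        mid = (left + right) // 2
        if nums[mid] < target:
            left = mid + 1   # ceiling must be right of mid
        else:
            right = mid      # nums[mid] >= target, keep mid as candidate
    return nums[left] if left < len(nums) else None

print(find_ceiling([2, 4, 6, 8], 5))  # -> 6
print(find_ceiling([2, 4, 6, 8], 9))  # -> None
```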
You cannot search directly for the floor (largest element <= target) with binary search
def find_floor_wrong(nums, target):
left, right = 0, len(nums)
# Loop condition: left < right
while left < right:
mid = (left + right) // 2
# Attempt:
# mid element <= target:
# move left boundary up to mid (include mid)
if nums[mid] <= target:
# PROBLEM:
# left might not move forward when left == mid, causing infinite loop
left = mid
else:
# mid element > target:
# move right boundary down (exclude mid)
right = mid
# When loop ends, left == right, but it may not point to floor directly
# Without shifting left or checking bounds, this is unreliable
if left == 0 and nums[left] > target:
return None
return nums[left]
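A working floor search, sketched here using the trick the broken version misses: search for the first element strictly greater than the target, then step one index left.

```python
def find_floor(nums, target):
    left, right = 0, len(nums)  # half-open interval [left, right)
    while left < right:
        mid = (left + right) // 2
        if nums[mid] <= target:
            left = mid + 1   # floor candidates may lie further right
        else:
            right = mid      # nums[mid] > target, exclude it
    # left is now the first index with nums[left] > target
    return nums[left - 1] if left > 0 else None

print(find_floor([2, 4, 6, 8], 5))  # -> 4
print(find_floor([2, 4, 6, 8], 1))  # -> None
```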
Understanding Optimization Search: Min vs Max
In optimization style binary search, we're not finding a specific value, but rather the smallest or largest value that satisfies a condition.
- Finding the Minimum Valid Value
if condition(mid):
right = mid # mid might be the answer, keep it
else:
left = mid + 1 # mid is invalid, discard it
- Finding the Maximum Valid Value
if condition(mid):
left = mid # mid might be the answer, keep it
else:
right = mid - 1 # mid is invalid, discard it
# Note: with left = mid, compute mid = (left + right + 1) // 2 (upper mid),
# otherwise the loop can stall when right == left + 1
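A runnable sketch of the maximum-valid-value pattern, using an upward-biased mid so that left = mid always makes progress; the condition nums[i] <= limit here is just an illustrative choice:

```python
def find_max_index_leq(nums, limit):
    # condition(mid): nums[mid] <= limit (true...true, false...false)
    left, right = 0, len(nums) - 1
    while left < right:
        mid = (left + right + 1) // 2  # upper mid: required when keeping mid on the left
        if nums[mid] <= limit:
            left = mid          # mid might be the answer, keep it
        else:
            right = mid - 1     # mid is invalid, discard it
    return left  # caller should still verify nums[left] <= limit

print(find_max_index_leq([1, 3, 5, 7], 5))  # -> 2
```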
Why Binary Search Can Find < or > but Not <= or >= Directly
Binary Search splits the space at mid and decides whether to include or exclude it.
This means it works best with strict boundaries.
It's easy to shrink toward the point where the transition happens on < or >.
However, it struggles with inclusive boundaries like <= or >=, because you don't know which direction to move in.
Ex: first element <= target
# When:
arr[mid] == target
# Should you go left or right?
# If you go right, you might miss an earlier match
# If you go left, you might go too far and miss it entirely
# If you don't reduce the search space and include mid, the loop may never terminate
So binary search prefers to work with < for the upper bound and > for the lower bound, then shifts the answer manually to get <= or >=.
Insight on Pointer Updates
A basic but easily overlooked principle in binary search is that the search space must shrink with every iteration.
This ensures the algorithm makes progress and terminates.
In target binary search:
We are looking for a specific index or value. After comparing mid, we either return mid as we have found the target, or search the left or right subspace while excluding mid as mid has already been checked.
In target binary search, we might hit the target before the pointers trigger termination
In optimization binary search:
We are looking for a min/max valid value, not just looking for a specific value. In this case after comparing mid, search the left or right subspace and include mid as mid could be the potential solution, until our pointers hit each other and we have found our optimal solution.
In optimization binary search, the trigger will be that the pointers will equal each other every time, returning the optimal solution.
Limitation with duplicates
In most standard binary search problems, duplicates do not interfere with correctness or efficiency as long as:
- The array is fully sorted
- We're doing a target lookup or numeric optimization
- No structural transformations (like rotation or flattening) are applied that would affect the sorted order
Some problems depend on structure or ordering of elements (e.g., rotated arrays). In these cases, duplicates introduce ambiguity that binary search cannot resolve cleanly.
Ex: without duplicates
nums = [4, 5, 6, 7, 0, 1, 2]
def find_min_no_duplicates(nums):
left, right = 0, len(nums) - 1
while left < right:
mid = (left + right) // 2
if nums[mid] > nums[right]:
# Min must be in the right half (excluding mid)
left = mid + 1
else:
# Min is at mid or in the left half
right = mid
return nums[left]
Ex: with duplicates
nums = [2, 2, 2, 0, 1, 2]
def find_min_with_duplicates(nums):
left, right = 0, len(nums) - 1
while left < right:
mid = (left + right) // 2
if nums[mid] > nums[right]:
left = mid + 1
elif nums[mid] < nums[right]:
right = mid
else:
# nums[mid] == nums[right], can't determine side, shrink range
right -= 1
return nums[left]
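Running the duplicate-aware version on the example input (reproduced so the snippet is standalone); note the worst case degrades to O(n) when most elements are equal:

```python
def find_min_with_duplicates(nums):
    left, right = 0, len(nums) - 1
    while left < right:
        mid = (left + right) // 2
        if nums[mid] > nums[right]:
            left = mid + 1
        elif nums[mid] < nums[right]:
            right = mid
        else:
            right -= 1  # duplicate: cannot decide a side, shrink by one

    return nums[left]

print(find_min_with_duplicates([2, 2, 2, 0, 1, 2]))  # -> 0
```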
Binary Search Application: Target Binary Search
Using Binary Search to locate the position of a target element in a sorted list by repeatedly dividing the search interval in half.
Ex: Searching for a target number in a sorted array.
def binarySearch(nums, target):
left, right = 0, len(nums) - 1
while left <= right:
mid = (left + right) // 2
if nums[mid] == target:
return mid # Target found
elif nums[mid] < target:
left = mid + 1 # Search right half
else:
right = mid - 1 # Search left half
return -1 # Target not found
Binary Search Application: Multiple Layers Binary Search
Applying binary search in multiple stages or layers, typically to first narrow down a substructure (e.g., a row, segment, or time range), and then search again within that substructure.
This approach is useful when data is structured in nested sorted layers.
Ex: Searching for a target in a sorted 2D matrix
def searchMatrix(matrix: List[List[int]], target: int) -> bool:
if not matrix or not matrix[0]:
return False
rows, cols = len(matrix), len(matrix[0])
# First layer: Binary search over rows to find candidate row
top, bottom = 0, rows - 1
while top <= bottom:
mid_row = (top + bottom) // 2
if matrix[mid_row][0] <= target <= matrix[mid_row][-1]:
break
elif matrix[mid_row][0] > target:
bottom = mid_row - 1
else:
top = mid_row + 1
else:
return False # Target is outside all row ranges
# Second layer: Binary search within the found row
row = mid_row
left, right = 0, cols - 1
while left <= right:
mid_col = (left + right) // 2
if matrix[row][mid_col] == target:
return True
elif matrix[row][mid_col] < target:
left = mid_col + 1
else:
right = mid_col - 1
return False
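The two-layer search above, reproduced as a standalone function and run against a small matrix:

```python
from typing import List

def searchMatrix(matrix: List[List[int]], target: int) -> bool:
    if not matrix or not matrix[0]:
        return False
    rows, cols = len(matrix), len(matrix[0])
    # Layer 1: binary search over rows for the candidate row
    top, bottom = 0, rows - 1
    while top <= bottom:
        mid_row = (top + bottom) // 2
        if matrix[mid_row][0] <= target <= matrix[mid_row][-1]:
            break
        elif matrix[mid_row][0] > target:
            bottom = mid_row - 1
        else:
            top = mid_row + 1
    else:
        return False  # no row range can contain the target
    # Layer 2: binary search within the candidate row
    left, right = 0, cols - 1
    while left <= right:
        mid_col = (left + right) // 2
        if matrix[mid_row][mid_col] == target:
            return True
        elif matrix[mid_row][mid_col] < target:
            left = mid_col + 1
        else:
            right = mid_col - 1
    return False

grid = [[1, 3, 5], [7, 9, 11], [13, 15, 17]]
print(searchMatrix(grid, 9))  # -> True
print(searchMatrix(grid, 4))  # -> False
```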
Binary Search Application: Optimization Search Min Max Binary Search
Using binary search to find the smallest or largest valid value that satisfies a given constraint (e.g., minimum speed, minimum time, maximum capacity)
Searches over a numeric range, not for a target, and typically uses while left < right (the loop ends when left == right) to find the min or max boundary.
Ex: Find the minimum eating speed to finish all bananas within h hours
def minEatingSpeed(piles: List[int], h: int) -> int:
def hours_needed(speed):
return sum((pile + speed - 1) // speed for pile in piles) # ceil
left, right = 1, max(piles)
# Optimization Search
while left < right:
mid = (left + right) // 2
# Try slower (minimization)
if hours_needed(mid) <= h:
right = mid
# Need faster
else:
left = mid + 1
# Note:
# there is no case where left > right breaks the loop,
# because the condition is < and not <=,
# so trigger will always be:
# left == right
# Smallest k that works
return left
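The same minimization, reproduced standalone and checked against sample inputs from LeetCode 875:

```python
from typing import List

def minEatingSpeed(piles: List[int], h: int) -> int:
    def hours_needed(speed):
        # integer ceiling of pile / speed
        return sum((pile + speed - 1) // speed for pile in piles)
    left, right = 1, max(piles)
    while left < right:
        mid = (left + right) // 2
        if hours_needed(mid) <= h:
            right = mid      # mid works, try slower
        else:
            left = mid + 1   # mid too slow, go faster
    return left

print(minEatingSpeed([3, 6, 7, 11], 8))        # -> 4
print(minEatingSpeed([30, 11, 23, 4, 20], 5))  # -> 30
```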
Binary Search Application: Condition Adapted Binary Search
Adapting search conditions of binary search to account for special problem parameters or constraints (e.g., rotated array, duplicates, bounded search).
Covers problems where binary search is applied, but the standard algorithm is modified.
Ex: Finding the minimum element in a rotated sorted array
def findMin(nums: List[int]) -> int:
left, right = 0, len(nums) - 1
# If array is not rotated (fully sorted ascending)
if nums[left] < nums[right]:
return nums[left]
# Modified binary search with parameter-based decision
while left < right:
mid = (left + right) // 2
# Decision based on comparing mid and right elements
if nums[mid] > nums[right]:
left = mid + 1
else:
right = mid
# left == right points to minimum element
return nums[left]
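findMin reproduced standalone, with a rotated and a non-rotated input:

```python
from typing import List

def findMin(nums: List[int]) -> int:
    left, right = 0, len(nums) - 1
    if nums[left] < nums[right]:
        return nums[left]  # already fully sorted, no rotation
    while left < right:
        mid = (left + right) // 2
        if nums[mid] > nums[right]:
            left = mid + 1   # minimum lies strictly right of mid
        else:
            right = mid      # minimum is mid or left of it
    return nums[left]

print(findMin([4, 5, 6, 7, 0, 1, 2]))  # -> 0
print(findMin([11, 13, 15, 17]))       # -> 11
```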
Binary Search Application: Upper Ceiling or Lower Floor Trick Based on Target Binary Search
Binary search cannot directly find the largest element less than or equal to a target (or similarly find the smallest elements greater than or equal to a target) in one step.
Instead we search for the first element greater than the target and then simply shift one position left to get the largest element less than or equal.
Ex: Find the largest element less than or equal to target
def largest_leq_shift(nums, target):
left, right = 0, len(nums) # half-open interval [left, right)
while left < right:
mid = (left + right) // 2
if nums[mid] <= target:
left = mid + 1 # move right since nums[mid] <= target
else:
right = mid # mid > target, shrink right side
# left is now the index of first element > target
# so the answer is at left - 1 (if valid)
return nums[left - 1] if left > 0 else None # no element <= target exists
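Tracing the shift trick on a small array (function reproduced standalone, returning None when nothing qualifies):

```python
def largest_leq_shift(nums, target):
    left, right = 0, len(nums)  # half-open interval [left, right)
    while left < right:
        mid = (left + right) // 2
        if nums[mid] <= target:
            left = mid + 1   # first index > target is further right
        else:
            right = mid      # nums[mid] > target, shrink right side
    # left is the index of the first element > target
    return nums[left - 1] if left > 0 else None

print(largest_leq_shift([2, 4, 6, 8], 5))  # -> 4
print(largest_leq_shift([2, 4, 6, 8], 1))  # -> None
```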
704. Binary Search ::2:: - Easy
Topics: Array, Binary Search
Intro
Given an array of integers nums which is sorted in ascending order, and an integer target, write a function to search target in nums. If target exists, then return its index. Otherwise, return -1. You must write an algorithm with O(log n) runtime complexity.
Example Input | Output |
---|---|
nums = [-1,0,3,5,9,12] target = 9 | 4 |
nums = [-1,0,3,5,9,12], target = 2 | -1 |
Constraints:
1 ≤ nums.length ≤ 10^4
-10^4 ≤ nums[i], target ≤ 10^4
All the integers in nums are unique
nums is sorted in ascending order
Abstraction
Given a sorted list of elements, find the target
Space & Time Complexity
Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
---|---|---|---|---|
Bug | Error |
---|---|
Brute Force: (iterative)
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Find the Bug: Missing return value
def search(self, nums: List[int], target: int) -> int:
def helper(left, right):
# base case: target not found
if left > right:
return -1
mid = (left+right)//2
# base case target found
if nums[mid] == target:
return mid
# search right
elif nums[mid] < target:
# INCORRECT:
# no return value
# Instead:
# return helper(mid+1, right)
helper(mid+1, right)
# search left
else:
# INCORRECT:
# no return value
# Instead:
# return helper(left, mid-1)
helper(left, mid-1)
res = helper(0, len(nums)-1)
return res
Solution 1: Recursive Binary Search - Binary Search/Searching
def search(self, nums: List[int], target: int) -> int:
# Note:
# Recursive binary search calls itself with smaller range,
# and leads to call stack of O(log n)
def helper(left, right):
# base case: target not found
if left > right:
return -1
mid = (left + right) // 2
# base case: target found
if nums[mid] == target:
return mid
# search right side
elif nums[mid] < target:
return helper(mid + 1, right)
# search left side
else:
return helper(left, mid - 1)
# time complexity: each call halves the search space O(log n)
# space complexity: call stack grows with depth O(log n)
result = helper(0, len(nums) - 1)
# overall: time complexity O(log n)
# overall: space complexity O(log n)
return result
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Variable assigning | O(1) | O(1) | Initializing variables in constant O(1) | No additional memory allocation for constant variables O(1) |
Recursive Binary Search | O(log n) | O(log n) | Search space is halved on each recursive call O(log n) | Call stack grows to depth O(log n) |
Overall | O(log n) | O(log n) | Binary search over sorted list dominates, leading to O(log n) | Stack memory used for recursive calls dominates, leading to O(log n) |
Solution 2: Iterative Binary Search - Binary Search/Searching
def search(self, nums: List[int], target: int) -> int:
# Note:
# simple binary search
# Precondition:
# nums is sorted in ascending order
# Invariant:
# Subarray nums[left:right] contains the target if it exists
# Search space halves on each iteration
# space complexity: simple variables in constant O(1)
left, right = 0, len(nums)-1
# target binary search: '<='
# time complexity: search space halves each iteration O(log n)
while left <= right:
mid = (left+right)//2
# Discard Mid
if nums[mid] == target:
return mid
# Mid is less than target:
# discard left side of [left, mid, right]
# new search interval: [mid+1, right]
elif nums[mid] < target:
left = mid + 1
# mid is greater than target:
# discard right side of [left, mid, right]
# new search interval: [left, mid-1]
else:
right = mid - 1
# overall: time complexity O(log n)
# overall: space complexity O(1)
return -1
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Variable assigning | O(1) | O(1) | Initializing variables in constant O(1) | No additional memory allocation for constant variables O(1) |
Iterative Binary Search | O(log n) | O(1) | Loop halves the search space on each iteration O(log n) | No additional memory allocation beyond loop variables O(1) |
Overall | O(log n) | O(1) | Binary search over sorted array O(log n) | No additional memory allocation O(1) |
74. Search a 2D Matrix ::2:: - Medium
Topics: Array, Binary Search, Matrix
Intro
You are given an m x n integer matrix matrix with the following two properties: Each row is sorted in non-decreasing order. The first integer of each row is greater than the last integer of the previous row. Given an integer target, return true if target is in matrix or false otherwise. You must write a solution in O(log(m * n)) time complexity.
Example Input | Output |
---|---|
matrix = [[1,3,5,7],[10,11,16,20],[23,30,34,60]], target = 3 | true |
matrix = [[1,3,5,7],[10,11,16,20],[23,30,34,60]], target = 13 | false |
Constraints:
m == matrix.length, n == matrix[i].length
1 ≤ m, n ≤ 100
-10^4 ≤ matrix[i][j], target ≤ 10^4
Abstraction
Given a sorted matrix, check if target is in matrix
Space & Time Complexity
Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
---|---|---|---|---|
Bug | Error |
---|---|
Brute Force:
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Find the Bug: Did not follow Binary Search
Find the Bug: Forgot to break
def searchMatrix(self, matrix: List[List[int]], target: int) -> bool:
if len(matrix) == 0 or len(matrix[0]) == 0:
return False
rows, cols = len(matrix), len(matrix[0])
top, bottom = 0, rows-1
rowCand = 0
# first binary search
while top <= bottom:
mid = (top+bottom)//2
if matrix[mid][0] <= target <= matrix[mid][-1]:
# INCORRECT:
# never exiting the while loop
# Instead:
# break
rowCand = mid
elif matrix[mid][-1] < target:
top = mid+1
else:
bottom = mid-1
else:
return False
left, right = 0, len(matrix[rowCand])-1
while left <= right:
mid = (left+right)//2
val = matrix[rowCand][mid]
if val == target:
return True
elif val < target:
left = mid + 1
else:
right = mid - 1
else:
return False
Solution 1: Flattened Matrix by Index To Coordinates Binary Search - Binary Search/Searching
def searchMatrix(self, matrix: List[List[int]], target: int) -> bool:
# Note:
# Treats the 2D matrix as a sorted 1D array by flattening indices.
# Uses binary search on the 1D index space,
# and converts indices to 2D coordinates with the helper function.
# This approach runs in O(log(m * n)) time and O(1) space.
# Precondition:
# Each row is sorted in ascending order,
# and first element of each row is greater than the last of the previous row
# diagram:
# matrix = [
# col 0 1 2
# [ 1, 3, 5], # row 0
# [ 7, 9, 11], # row 1
# [13, 15, 17], # row 2
# ]
# Invariant:
# Target, if it exists, lies within subarray [left, right]
# Search space is reduced by half on each iteration.
# Helper: Convert 1D index to 2D matrix coordinates (row, col)
# index 4 -> (4//3, 4%3) -> (1, 1) = 9, index 5 -> (5//3, 5%3) -> (1, 2) = 11
def index_to_coords(index, cols) -> tuple[int, int]:
row = index // cols
col = index % cols
return (row, col)
# Empty Check
if not matrix or not matrix[0]:
return False
# calculate 2D matrix representation
rows, cols = len(matrix), len(matrix[0])
left, right = 0, (rows * cols) - 1
# binary target search: '<='
# time complexity: binary search on flattened 2D matrix [0,(m*n)-1] O(log(m*n))
while left <= right:
# calculating mid
mid = (left + right) // 2
(row, col) = index_to_coords(mid, cols)
mid_value = matrix[row][col]
# Check Mid
if mid_value == target:
return True
# mid is less than target:
# discard left side of [left, mid, right]
# new search interval: [mid+1, right]
elif mid_value < target:
left = mid + 1
# mid is greater than target:
# discard the right side of [left, mid, right]
# new search interval: [left, mid-1]
else:
right = mid - 1
# left > right: searched entire array
# target not found and does not exist
# overall time complexity: O(log (m * n))
# overall space complexity: O(1)
return False
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Binary search 1D index | O(log (m*n)) = O(log(m) + log(n)) | O(1) | Binary search over full flattened matrix treated as 1D O(log (m*n)) | No additional memory allocation |
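The index-to-coordinate mapping can be verified in isolation; for a 3-column matrix it matches the diagram in the comments:

```python
def index_to_coords(index, cols):
    # flatten a 2D position: row-major order
    return (index // cols, index % cols)

print(index_to_coords(4, 3))  # -> (1, 1)
print(index_to_coords(5, 3))  # -> (1, 2)
```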
Solution 2: Two Phase Binary Search over Row Ranges and Binary Search in Row Candidate - Binary Search/Searching
def searchMatrix(self, matrix: List[List[int]], target: int) -> bool:
# Note:
# Performs two separate binary searches:
# 1. Binary Search over rows to locate potential row
# 2. Binary Search within that row to find the target
# Precondition:
# Each row is sorted in ascending order,
# and the first element of each row is greater than the last of the previous row
# diagram:
# matrix = [
# col 0 1 2
# [ 1, 3, 5], # row 0
# [ 7, 9, 11], # row 1
# [13, 15, 17], # row 2
# ]
# Invariant (First Binary Search):
# Target must lie within range of candidate row window [top, bottom] (if it exists)
# Empty Check
if not matrix or not matrix[0]:
return False
# top and bottom rows (refer to diagram)
rows, cols = len(matrix), len(matrix[0])
top, bottom = 0, rows - 1
# binary target search: '<='
# time complexity: binary search over n rows O(log n)
while top <= bottom:
# get mid
mid_row = (top + bottom) // 2
# Check Mid
if matrix[mid_row][0] <= target <= matrix[mid_row][-1]:
break
# mid_row smallest value is greater than target:
# discard upper rows
# new search interval: [top, mid_row - 1]
elif matrix[mid_row][0] > target:
bottom = mid_row - 1
# mid_row smallest value is less than target:
# discard lower rows
# new search interval: [mid_row + 1, bottom]
else:
top = mid_row + 1
# No break triggered:
# top > bottom: all row ranges checked
# no target exists
else:
return False
# row boundaries
row = mid_row
left, right = 0, cols - 1
# binary target search: '<='
# time complexity: binary search across m columns in row O(log m)
while left <= right:
# mid of row
mid_col = (left + right) // 2
# Check mid
if matrix[row][mid_col] == target:
return True
# mid is less than target:
# discard left side of [left, mid, right]
# new search interval: [mid + 1, right]
elif matrix[row][mid_col] < target:
left = mid_col + 1
# mid is greater than target:
# discard right side of [left, mid, right]
# new search interval: [left, mid - 1]
else:
right = mid_col - 1
# left > right: all columns in candidate row checked
# no target exists
# overall: time complexity O(log (m * n))
# overall: space complexity O(1)
return False
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Binary search rows | O(log m) | O(1) | Binary search across m rows O(log m) | No additional memory allocation for binary search O(1) |
Binary search columns | O(log n) | O(1) | Binary search across column of n length O(log n) | No additional memory allocation for binary search O(1) |
Overall | O(log m + log n) = O(log (m*n)) | O(1) | Sum of two binary searches dominates, leading to O(log m + log n) | No additional memory allocation, leading to O(1) |
875. Koko Eating Bananas ::2:: - Medium
Topics: Array, Binary Search, Network Packet Routing, Design, System Level Design
Intro
Koko loves to eat bananas. There are n piles of bananas, the ith pile has piles[i] bananas. The guards have gone and will come back in h hours. Koko can decide her bananas-per-hour eating speed of k. Each hour, she chooses some pile of bananas and eats k bananas from that pile. If the pile has less than k bananas, she eats all of them instead and will not eat any more bananas during this hour. Koko likes to eat slowly but still wants to finish eating all the bananas before the guards return. Return the minimum integer k such that she can eat all the bananas within h hours.
Example Input | Output |
---|---|
piles = [3,6,7,11], h = 8 | 4 |
piles = [30,11,23,4,20], h = 5 | 30 |
piles = [30,11,23,4,20], h = 6 | 23 |
Constraints:
1 ≤ piles.length ≤ 10^4
piles.length ≤ h ≤ 10^9
1 ≤ piles[i] ≤ 10^9
Abstraction
Find minimum bananas per hour so that all are eaten in time.
Space & Time Complexity
Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
---|---|---|---|---|
Bug | Error |
---|---|
Brute Force:
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Find the Bug: Hours For Speed k Is Always Rounding Down
def minEatingSpeed(self, piles: List[int], h: int) -> int:
left, right = 1, max(piles)
while left < right:
speed = (left+right)//2
hours = 0
for pile in piles:
# INCORRECT:
# division is rounding down (floor)
# Instead:
# hours += math.ceil(pile / speed) -> true division keeps the remainder for ceil
hours += math.ceil(pile//speed)
if hours <= h:
right = speed
else:
left = speed+1
return left
Find the Bug: Floor Caused K Starting Speed at 0, Leading to Division By 0 Error
def minEatingSpeed(self, piles: List[int], h: int) -> int:
# INCORRECT:
# Division could lead to remainder 0.1
# that rounds to 0.
# Safer to round up
# Instead:
# speedCheck1 = math.ceil(totalBananas / h)
totalBananas = sum(piles)
speedCheck1 = math.floor(totalBananas / h)
# INCORRECT:
# Division could lead to remainder 0.1
# that rounds to 0.
# Safer to round up
# Instead:
# speedCheck2 = math.ceil(min(piles) / evenTime)
numPiles = len(piles)
evenTime = int(h/numPiles)
speedCheck2 = math.floor(min(piles) / evenTime)
# INCORRECT:
# Division could lead to remainder 0.1
# that rounds to 0.
# Safer to round up
# Instead:
# speedCheck3 = math.ceil(max(piles) / (h - (numPiles - 1)))
speedCheck3 = math.floor(max(piles) / (h - (numPiles - 1)))
speed = max(speedCheck1, speedCheck2, speedCheck3)
while (True):
# Grab hours needed for current speed
smallKTotalTime = 0
for pile in piles:
smallKTotalTime += math.ceil(pile / speed)
# Greedy Search: designed to search from lower bound eating speed k
# Implies: first valid speed we encounter is the smallest one
if smallKTotalTime <= h:
return speed
# if totalTime is too large, double the speed
elif smallKTotalTime >= 2 * h:
speed *= 2
# if close but still too large, we increment the speed by 1
else:
speed += 1
Find the Bug: Wait a minute, This House Has No Roof
def minEatingSpeed(self, piles: List[int], h: int) -> int:
# INCORRECT:
# speed will have remainder, not valid speed
# Instead:
# speed1 = math.ceil(allBananas/h)
allBananas = sum(piles)
speed1 = allBananas/h
# INCORRECT:
# speed will have remainder, not valid speed
# Instead:
# speed2 = math.ceil(min(piles)/evenTime)
evenTime = int(h/len(piles))
speed2 = min(piles)/evenTime
# INCORRECT:
# speed will have remainder, not valid speed
# Instead:
# speed3 = math.ceil(max(piles)/unEvenTime)
unEvenTime = h - (len(piles)-1)
speed3 = max(piles)/unEvenTime
speed = max(speed1, speed2, speed3)
while (True):
requiredHours = 0
for pile in piles:
requiredHours += math.ceil(pile/speed)
if requiredHours <= h:
return speed
elif requiredHours >= 2*h:
speed *= 2
else:
speed += 1
Find the Bug: Hours Divided Per Pile Cannot Carry a Remainder
def minEatingSpeed(self, piles: List[int], h: int) -> int:
allBananas = sum(piles)
speed1 = math.ceil(allBananas/h)
# INCORRECT:
# evenTime will have a remainder, leading to errors in later divisions
# Instead:
# evenTime = int(h/len(piles))
evenTime = h/len(piles)
speed2 = math.ceil(min(piles)/evenTime)
unEvenTime = h - (len(piles)-1)
speed3 = math.ceil(max(piles)/unEvenTime)
speed = max(speed1, speed2, speed3)
while (True):
requiredHours = 0
for pile in piles:
requiredHours += math.ceil(pile/speed)
if requiredHours <= h:
return speed
elif requiredHours >= 2*h:
speed *= 2
else:
speed += 1
Solution 1: Binary Search - Binary Search/Optimization Search Min Max
import math
def minEatingSpeed(self, piles: List[int], h: int) -> int:
# Note:
# Find minimum int k such that Koko can eat all bananas within h hours
# Preconditions:
# piles is a non empty list of positive integers
# h >= len(piles), h is the number of hours
# Invariant:
# k min speed is in the interval [left, right]
# Search space halves each iteration
# Search Space:
# Min speed: 1 banana per hour (slowest possible)
# Max speed: max(piles), no pile would need > max(piles) per hour
left, right = 1, max(piles)
# time complexity: iterate over n piles O(n)
def hours_needed(speed):
total_hours = 0
for pile in piles:
# Koko has fixed eating speed k
# if she has pile bananas and eats at k bananas/hr
# she will take ceil(pile/k) (cannot stop in middle of hour)
# 3 bananas with speed 9, will still take 1 full hour
total_hours += math.ceil(pile / speed)
return total_hours
# binary optimization search: '<' for min k
# time complexity: binary search over k speeds O(log(max(piles)))
while left < right:
# mid speed
mid = (left + right) // 2
required_hours = hours_needed(mid)
# Zoom In Including Mid:
# Mid is valid speed: can finish in h hours
# Discard faster speeds on right side of [left, mid, right]
# Include current mid in slower speed search
# new search interval: [left, mid]
if required_hours <= h:
right = mid
# Zoom In Excluding Mid:
# Mid is invalid speed: cannot finish in h hours
# Discard slower speeds on left side of [left, mid, right]
# Exclude current mid from faster speed search
# new search interval: [mid + 1, right]
else:
left = mid + 1
# Invariant:
# k min speed is in the interval [left, right]
# Loop Terminates:
# left == right
# Valid [left, right] has been reduced to one point
# Both left and right point to min valid k speed
# overall: time complexity O(n * log(max(piles)))
# overall: space complexity O(1)
return left
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Outer Binary Search | O(log(max(piles))) | O(1) | Binary search over range of possible speeds from 1 to max pile size O(log(max(piles))) | No additional memory allocation for binary search O(1) |
Hours calculation | O(n) | O(1) | Iterate over all piles to calculate if piles are eaten O(n) | No additional memory allocation for calculation O(1) |
Overall | O(n * log(max(piles))) | O(1) | Binary search over range alongside hours calculation per candidate dominates, leading to O(n * log(max(piles))) | No additional memory allocation O(1) |
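math.ceil(pile / speed) goes through float division; an all-integer equivalent (used in the intro sketch of this pattern) is (pile + speed - 1) // speed, which avoids float rounding for very large counts:

```python
import math

def hours_needed(piles, speed):
    # integer ceiling division: no float rounding involved
    return sum((pile + speed - 1) // speed for pile in piles)

float_version = sum(math.ceil(p / 4) for p in [3, 6, 7, 11])
print(hours_needed([3, 6, 7, 11], 4), float_version)  # -> 8 8
```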
Solution 2: Networking Router Analogy, Greedy Heuristic Hybrid Target Optimization Monotonic Search with Exponential Speed Doubling - Binary Search/Optimization Search Min Max
def minEatingSpeed(self, piles: List[int], h: int) -> int:
# -----------------------------------------------------
# Network Router Analogy:
# Think of Koko as a router,
# banana piles as incoming data packets,
# banana in pile as amount of data to be processed for packet p,
# h as the deadline (maximum allowable latency in seconds),
# and speed as the throughput (packets processed per second).
# Note:
# Greedy Start:
# Start speed k from the highest heuristic lower bound between three choices.
# Note:
# Greedy search:
# If total eating time exceeds h by double allowed time:
# increase speed k aggressively by doubling.
# If total eating time is close to h:
# increment speed k slowly by 1.
# Pros:
# Can outperform binary search in practice for some input with high k speeds.
# Cons:
# Lacks worst case guarantees, (binary search guarantees O(log n) worst case)
# Note:
# Choosing Heuristics -> System Level Perspective:
# 1. Total bananas: Throughput -> Global Bandwidth
# 2. Smallest Pile: Fair Time Allocation + Small Packet -> Quick Packet Burst
# 3. Largest Pile: Unfair Time Allocation + Big Packet -> Bottleneck Prevention
# Good Heuristics by Constraints:
# Must be lower bound -> never overestimate
# Close to real answer -> to minimize search time
# Computable in O(n) time -> scalable
# Multiple heuristics -> different system perspectives
# Discarding Heuristics -> Bad Choices:
# 1. Average between piles: Does not factor time constraint, assumes infinite time
# 2. Max pile divided over h hours: Does not factor rest of piles, high router k speed wasted on small piles
# Choosing Heuristics Take away :
# Should respect at least two (or all three) key constraints, to generate a good lower bound k
# Packets (piles)
# Data size (bananas in pile)
# Time Constraint (h)
# -----------------------------------------------------
# 3 Heuristics -> Greedy Lower Bound Starting Speed:
# ------------
# Heuristic 1: Banana Based Average (Throughput Constraint)
# Spread all bananas evenly over h available hours
# All bananas / hours -> baseline average k speed
# Lower Bound: No matter how time is spent between small or large piles,
# if k is less than totalBananas/h, all bananas will not get finished
# Produces small k:
# Underestimates when bananas are skewed into a few large piles
# by assuming bananas are even across piles
totalBananas = sum(piles)
speedCheck1 = math.ceil(totalBananas / h)
# ------------
# Heuristic 2: Hour Per Pile Based Average, Split Time Evenly (Fair time Sharing For Each Pile Constraint)
# Heuristic 2: Minimum Speed to Finish Smallest Pile within Split Even Time
# Total hours / num of piles -> equal time allocated per pile
# Smallest Pile / Allocated Time -> speed k needed to eat smallest pile within allocated time
# Lower Bound: If each pile gets equal time, this is the minimum speed
    #     required to finish the smallest pile without exceeding the time budget
# Produces small k:
# Dividing h across numPiles, regardless of pile size,
# giving each pile generous time budget,
# and since we are testing it on the smallest pile,
# which doesn't need much time, makes k small
numPiles = len(piles)
    evenTime = h // numPiles
speedCheck2 = math.ceil(min(piles) / evenTime)
# ------------
    # Heuristic 3: Biggest Pile Based Speed (Bottleneck Prevention Constraint)
# Give Koko 1 hour per pile, excluding the largest one,
# spend the remaining time on largest pile.
# Lower Bound: Assume speed k is high enough to finish every pile in just 1 hour.
# That takes (numPiles - 1) hours, leaving (h - (numPiles-1)) hours.
# To finish the largest pile in this remaining time,
# k must be at least maxPile / remaining time.
# Otherwise, the final pile will not get finished.
# Produces small k:
    #     since much of h may remain after giving 1 hour to each other pile.
remainingTime = h - (numPiles - 1)
speedCheck3 = math.ceil(max(piles) / remainingTime)
# ------------
# Greedy Starting Speed: Pick the highest lower bound
# avoids testing extremely high speeds unnecessarily
# by choosing realistic starting speed
# and only incrementing speed
speed = max(speedCheck1, speedCheck2, speedCheck3)
# -----------------------------------------------------
# Conditions:
# Preconditions:
# piles is a non empty list of positive integers
# h >= len(piles), h is the number of hours
# variable 'speed' starts with speed k that is a valid lower bound based on heuristics
# Postconditions:
# Returns min speed k such that koko can eat all bananas within h hours
# Invariant:
    #   min valid speed k lies in [speed, +inf)
# speed is monotonically increasing during search
# first valid time <= h: speed is guaranteed to be min due to greedy monotonic increasing search strategy
# Greedy Search:
# If totalTime is too large compare to h constraint, double eating k speed
# If totalTime is close to h constraint, increment speed k by 1
# monotonically increase k search speed
# phase 1 time complexity: exponential doubling O(log max(piles))
# phase 2 time complexity: linear scan O(max(piles))
    while True:
# Get hours for speed k
smallKTotalTime = 0
for pile in piles:
smallKTotalTime += math.ceil(pile / speed)
# Valid speed found
# Implies: first valid speed we encounter is the smallest one
if smallKTotalTime <= h:
return speed
# speed is much too slow
        # totalTime is at least twice the available h -> double speed k
        # new search interval: [speed*2, +inf)
elif smallKTotalTime >= 2 * h:
speed *= 2
# totalTime close to required time -> increment speed by 1
        # new search interval: [speed+1, +inf)
else:
speed += 1
# overall: time complexity O(n * log (max(piles)))
# overall: space complexity O(1)
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Heuristic lower bounds | O(n) | O(1) | Sum, min, and max over piles each O(n) | No additional memory allocation O(1) |
Greedy search loop | O(n * max(piles)) worst case | O(1) | Doubling phase O(log max(piles)) plus a linear +1 phase, each step costing O(n) for the hours computation | No additional memory allocation O(1) |
Overall | O(n * max(piles)) worst case | O(1) | Fast in practice from the strong heuristic start, but the +1 phase lacks a logarithmic guarantee | No additional memory allocation O(1) |
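As a quick sanity check of the heuristic start and the doubling/increment phases, the method above can be condensed into a standalone function (the function name `min_eating_speed` and the sample piles are assumptions for illustration):

```python
import math

def min_eating_speed(piles, h):
    # Greedy lower bound: the best of the three heuristics above
    speed1 = math.ceil(sum(piles) / h)                     # throughput
    speed2 = math.ceil(min(piles) / (h // len(piles)))     # fair time sharing
    speed3 = math.ceil(max(piles) / (h - len(piles) + 1))  # bottleneck prevention
    speed = max(speed1, speed2, speed3)
    # Monotonically increase speed until the hour budget is met
    while True:
        hours = sum(math.ceil(p / speed) for p in piles)
        if hours <= h:
            return speed
        speed = speed * 2 if hours >= 2 * h else speed + 1

print(min_eating_speed([3, 6, 7, 11], 8))        # 4
print(min_eating_speed([30, 11, 23, 4, 20], 6))  # 23
```

Both sample inputs agree with the known minimal speeds for the classic problem, and the heuristic start lands at or very near the answer.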
153. Find Minimum in Rotated Sorted Array ::1:: - Medium
Topics: Array, Binary Search
Intro
Suppose an array of length n sorted in ascending order is rotated between 1 and n times. For example, the array nums = [0,1,2,4,5,6,7] might become: [4,5,6,7,0,1,2] if it was rotated 4 times. [0,1,2,4,5,6,7] if it was rotated 7 times. Notice that rotating an array [a[0], a[1], a[2], ..., a[n-1]] 1 time results in the array [a[n-1], a[0], a[1], a[2], ..., a[n-2]]. Given the sorted rotated array nums of unique elements, return the minimum element of this array. You must write an algorithm that runs in O(log n) time.
Example Input | Output |
---|---|
nums = [3,4,5,1,2] | 1 |
nums = [4,5,6,7,0,1,2] | 0 |
nums = [11,13,15,17] | 11 |
Constraints:
n == nums.length
1 ≤ n ≤ 5000
-5000 ≤ nums[i] ≤ 5000
All of the integers of nums are unique
nums is sorted and rotated between 1 and n times
Abstraction
Find the smallest number in an array that has been rotated some number of times.
Space & Time Complexity
Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
---|---|---|---|---|
Bug | Error |
---|---|
Brute Force:
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Find the Bug:
Solution 1: Find Min by Determining if Smaller Numbers on Left or Right of Middle Binary Search - Binary Search/Condition Adapted Binary Search
def findMin(self, nums: List[int]) -> int:
# Preconditions:
# nums is a non empty list of integers
# nums was originally sorted in ascending order
# nums has been rotated
# no duplicates
# Postcondition:
# Loop terminates when left == right
# both left and right point to min element
# Invariant:
# The min value is always in the interval [left, right]
# At each iteration we discard half of search space
# outer boundaries
left, right = 0, len(nums) - 1
# Check: Rotated into Ascending Order
if nums[left] < nums[right]:
return nums[left]
# Binary Optimization Search: '<' for min
while left < right:
mid = (left + right) // 2
# property of rotated sorted array:
# 1: [4, 5, 6, 7, 0, 1, 2]
# 7 > 2:
# smaller numbers on right half of mid
# 2: [10, 1, 2, 3, 6, 7, 8]
# 3 < 8:
# smaller elements on left half of mid (including mid)
# mid is less than right
# discard higher elements on right side
# include lower elements on left side with mid
# new search interval: [left, mid]
if nums[mid] < nums[right]:
right = mid
# mid is greater than right
# discard higher elements on left side with mid
# include lower elements on right side
# new search interval: [mid+1, right]
else:
left = mid+1
# Loop Terminated:
# left == right is the minimum element index
# Via Invariant:
# left and right point to min element
# overall: time complexity O(log n)
# overall: space complexity O(1)
return nums[left]
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Binary Search Loop | O(log n) | O(1) | Binary search iteration halves search space due to sorted and rotated property O(log n) | No additional memory allocation O(1) |
No explicit target | O(1) | O(1) | We are optimizing loop to converge to min O(1) | No additional memory allocation O(1) |
Overall | O(log n) | O(1) | Binary search over array of n length dominates, leading to O(log n) | No additional memory allocation for binary search |
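A standalone version of the search above, exercised on the problem's examples (the function name `find_min` is an assumption; logic mirrors the solution):

```python
def find_min(nums):
    left, right = 0, len(nums) - 1
    # Already in ascending order (rotated n times back into sorted order)
    if nums[left] < nums[right]:
        return nums[left]
    while left < right:
        mid = (left + right) // 2
        if nums[mid] < nums[right]:
            right = mid      # min is at mid or to its left
        else:
            left = mid + 1   # min is strictly to the right of mid
    return nums[left]

print(find_min([3, 4, 5, 1, 2]))        # 1
print(find_min([4, 5, 6, 7, 0, 1, 2]))  # 0
print(find_min([11, 13, 15, 17]))       # 11
```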
33. Search in Rotated Sorted Array ::1:: - Medium
Topics: Array, Binary Search
Intro
There is an integer array nums sorted in ascending order (with distinct values). Prior to being passed to your function, nums is possibly rotated at an unknown pivot index k (1 ≤ k ≤ nums.length) such that the resulting array is [nums[k], nums[k+1], ..., nums[n-1], nums[0], nums[1], ..., nums[k-1]] (0-indexed). For example, [0,1,2,4,5,6,7] might be rotated at pivot index 3 and become [4,5,6,7,0,1,2]. Notice that rotating an array [a[0], a[1], a[2], ..., a[n-1]] 1 time results in the array [a[n-1], a[0], a[1], a[2], ..., a[n-2]]. Given the array nums after the possible rotation and an integer target, return the index of target if it is in nums, or -1 if it is not in nums. You must write an algorithm with O(log n) runtime complexity.
Example Input | Output |
---|---|
nums = [4,5,6,7,0,1,2], target = 0 | 4 |
nums = [4,5,6,7,0,1,2], target = 3 | -1 |
nums = [1], target = 0 | -1 |
Constraints:
1 ≤ nums.length ≤ 5000
-10^4 ≤ nums[i] ≤ 10^4
All values of nums are unique.
nums is an ascending array that is possibly rotated.
-10^4 ≤ target ≤ 10^4
Abstraction
Find target in an array that has been rotated some number of times.
Space & Time Complexity
Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
---|---|---|---|---|
Bug | Error |
---|---|
Brute Force:
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Find the Bug: Returned Index Instead of Value
def findMin(self, nums: List[int]) -> int:
left, right = 0, len(nums)-1
while left < right:
mid = (left+right)//2
if nums[mid] < nums[right]:
right = mid
else:
left = mid + 1
# INCORRECT:
# returning index of min
# instead of actual min value
# Instead:
# return nums[left]
return left
Solution 1: Inequality Boundaries Discard Mid Binary Search - Binary Search/Condition Adapted Binary Search
def search(self, nums: List[int], target: int) -> int:
# Precondition:
# num is non empty list of integers
# num is sorted in ascending order
# num has been rotated
# num has no duplicates
# Postcondition:
# Returns index of target if exists
# Returns -1 if target does not exist
# Invariant:
# target, if exists, is in interval [left, right]
# interval [left, right] is always either
# a) entirely sorted
# b) contains a rotation
# search space halves each iteration
# outer boundaries
left, right = 0, len(nums) - 1
# binary target search: '<='
# time complexity: binary search over list n length O(log n)
while left <= right:
mid = (left + right) // 2
# Check Mid
if nums[mid] == target:
return mid
# guarantees right half is sorted
# contains ascending range: [mid, right]
if nums[mid] <= nums[right]:
# target is within range (mid, right]
# discard elements on left side
# new search interval: [mid+1, right]
if nums[mid] < target <= nums[right]:
left = mid + 1
# target is not with range (mid, right]
# discard higher elements on right side
# new search interval: [left, mid-1]
else:
right = mid - 1
# guarantees left half is sorted
# contains ascending range: [left, mid]
else:
# target is within range [left, mid)
# discard on right side
# new search interval: [left, mid-1]
if nums[left] <= target < nums[mid]:
right = mid - 1
# target is not within range [left, mid)
# discard elements on left side
# new search interval: [mid+1, right]
else:
left = mid + 1
# Loop exited: left > right
# Invariant broken: no valid [left, right] remaining
# Target not found -> return -1
# overall: time complexity O(log n)
# overall: space complexity O(1)
return -1
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Binary search loop | O(log n) | O(1) | Search space halved each iteration O(log n) | No additional memory allocation for iterative binary search O(1) |
One target comparison | O(1) | O(1) | Mid comparison in constant O(1) | No additional memory allocation for comparison O(1) |
Overall | O(log n) | O(1) | Binary search over n elements dominates, leading to O(log n) | No additional memory allocation O(1) |
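The inequality-boundary search above, rewritten as a standalone function and run on the problem's examples (the snake_case name `search` as a free function is an assumption):

```python
def search(nums, target):
    left, right = 0, len(nums) - 1
    while left <= right:
        mid = (left + right) // 2
        if nums[mid] == target:
            return mid
        if nums[mid] <= nums[right]:   # right half [mid, right] is sorted
            if nums[mid] < target <= nums[right]:
                left = mid + 1
            else:
                right = mid - 1
        else:                          # left half [left, mid] is sorted
            if nums[left] <= target < nums[mid]:
                right = mid - 1
            else:
                left = mid + 1
    return -1

print(search([4, 5, 6, 7, 0, 1, 2], 0))  # 4
print(search([4, 5, 6, 7, 0, 1, 2], 3))  # -1
print(search([1], 0))                    # -1
```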
981. Time Based Key Value Store ::2:: - Medium
Topics: Hash Table, String, Binary Search, Design
Intro
Design a time based key value data structure that can store multiple values for the same key at different time stamps and retrieve the key's value at a certain timestamp. Implement the TimeMap class: TimeMap(): Initializes the object of the data structure. void set(String key, String value, int timestamp): Stores the key key with the value value at the given time timestamp. String get(String key, int timestamp): Returns a value such that set was called previously, with timestamp_prev ≤ timestamp. If there are multiple such values, it returns the value associated with the largest timestamp_prev. If there are no values, it returns "".
Example Input | Output |
---|---|
["TimeMap", "set", "get", "get", "set", "get", "get"] [[], ["foo", "bar", 1], ["foo", 1], ["foo", 3], ["foo", "bar2", 4], ["foo", 4], ["foo", 5]] | [null, null, "bar", "bar", null, "bar2", "bar2"] |
Constraints:
1 ≤ key.length, value.length ≤ 100
key and value consist of lowercase English letters and digits.
1 ≤ timestamp ≤ 10^7
All the timestamps timestamp of set are strictly increasing.
At most 2 * 10^5 calls will be made to set and get.
Abstraction
Design a key value store where each key holds multiple timestamped values, and retrieval returns the value with the largest timestamp less than or equal to a query timestamp.
Space & Time Complexity
Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
---|---|---|---|---|
Bug | Error |
---|---|
Brute Force:
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Find the Bug:
Solution 1: Reverse Linear Scan - Binary Search/Condition Adapted Binary Search
class TimeMap:
# Note:
# Stores multiple (value, timestamp) tuples per key in insertion order
# The 'get' function performs a reverse linear scan to find
# the latest time <= the query timestamp.
# 'get' can be slow if many timestamps per key O(m)
# space complexity: n is the total set calls across all keys O(n)
def __init__(self):
# key -> [(value, timestamp)]
self.store = {}
# overall: time complexity O(1) amortized
def set(self, key: str, value: str, timestamp: int) -> None:
# append to key list: (value, timestamp)
if key not in self.store:
self.store[key] = []
self.store[key].append((value, timestamp))
# overall: time complexity where m is number of timestamps for this key O(m)
def get(self, key: str, timestamp: int) -> str:
# if key list does not exist
if key not in self.store:
return ''
# grab key list: [(value, timestamp)]
values = self.store[key]
# outer boundaries
i = len(values) - 1
# Reverse linear scan:
# grab first valid highest timestamp <= query
        while i >= 0 and values[i][1] > timestamp:
i -= 1
# not found case: index -1
# or
# timestamp is now <= query
return values[i][0] if i != -1 else ''
# overall: time complexity O(m)
# overall: space complexity O(n)
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
set | O(1) | O(1) | Amortized constant time append O(1) | One tuple stored per call O(1) |
get | O(m) | O(1) | Linear scan backward through at most m timestamps | No additional memory allocation |
Overall | O(m) | O(n) | get scaling linearly with timestamps count dominates, leading to O(m) | Storage is proportional to total number of sets O(n) |
Solution 2: Upper Ceiling to Find First Invalid TimeStamp - Binary Search/Upper Ceiling or Lower Floor Trick Based on Target Binary Search
class TimeMap:
# Note:
# Upper Ceiling Trick:
    #   We cannot directly binary search for the condition 'timestamp <= target' with the usual '<=' template.
    #   Instead, binary search on the condition 'timestamp > target' (strictly greater).
    #   The search finds the index of the first timestamp strictly greater than the target.
    #   Then we shift the pointer by one (left - 1) to get the largest timestamp <= target.
# Note:
# Stores (value, timestamp) tuples per key in insertion order (sorted by timestamp)
# 'get' uses binary search to find the largest timestamp <= the query timestamp.
# 'get' more efficient O(log m)
# space complexity: n is the total set calls across all keys O(n)
def __init__(self):
# key -> [(value, timestamp)]
self.store = {}
# overall: time complexity amortized O(1)
def set(self, key: str, value: str, timestamp: int) -> None:
# append to key list: (value, timestamp)
if key not in self.store:
self.store[key] = []
self.store[key].append((value, timestamp))
# overall: time complexity where m is number of timestamps for this key O(log m)
def get(self, key: str, timestamp: int) -> str:
# Precondition:
# self.store[key] contains (value, timestamp) pairs sorted ascending by timestamp
# timestamp is an integer >= 1
# Postcondition:
# returns value with largest timestamp <= given timestamp
# returns '' if none exists
# Empty Check
if key not in self.store:
return ''
values = self.store[key]
# Upper Ceiling Binary Search Shift Trick:
# In order to find: highest timestamp <= query
        #   First find: the first timestamp strictly greater than the query
        #   Then shift the index by one: the largest timestamp <= query
# outer boundaries
# intentionally start right at len(values)
left, right = 0, len(values)
# Loop invariant:
# 1. 0 <= left <= right <= len(values)
# 2. forall i :: 0 <= i < left ==> values[i][1] <= timestamp
# 3. forall j :: right <= j < len(values) ==> values[j][1] > timestamp
# Interpretation:
# all indices before 'left' are valid timestamps <= query
# all indices after 'right' are invalid timestamps > query
# When loop exits (left == right)
# left points to first invalid timestamp > query (if any)
# left-1 points to largest valid timestamp <= query or -1 if none exists
# time complexity: binary search over time stamps O(log(m))
while left < right:
mid = (left + right) // 2
            # timestamp > query -> invalid region, mid could be the first invalid index
            if values[mid][1] > timestamp:
                right = mid
            # timestamp <= query -> valid, the first invalid index lies to the right
            else:
                left = mid + 1
        # left is the index of the first timestamp > query (len(values) if none)
        # Shifting by one (left - 1) yields the largest timestamp <= query
        # If left == 0, no timestamp <= query exists -> return ''
return values[left - 1][0] if left != 0 else ''
# overall: time complexity O(log m)
# overall: space complexity O(n)
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
set | O(1) | O(1) | Amortized constant time append O(1) | One tuple stored per call O(1) |
get | O(log m) | O(1) | Binary search over m timestamps O(log m) | No additional memory allocation for binary search O(1) |
Overall | O(log m) | O(n) | Binary search retrieval dominates, leading to O(log m) | Storage proportional to total number of sets O(n) |
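The upper-ceiling shift is exactly what the standard library's `bisect.bisect_right` computes: the insertion point after any equal entries, i.e. the index of the first timestamp strictly greater than the query. A minimal sketch of the same idea leaning on the standard library (the class name `TimeMapBisect` and the parallel `times`/`vals` lists are assumptions, not the solution above verbatim):

```python
import bisect
from collections import defaultdict

class TimeMapBisect:
    def __init__(self):
        self.times = defaultdict(list)  # key -> ascending timestamps
        self.vals = defaultdict(list)   # key -> values, parallel to times

    def set(self, key: str, value: str, timestamp: int) -> None:
        # timestamps arrive strictly increasing, so append keeps order
        self.times[key].append(timestamp)
        self.vals[key].append(value)

    def get(self, key: str, timestamp: int) -> str:
        # index of the first timestamp strictly greater than the query
        i = bisect.bisect_right(self.times[key], timestamp)
        # shift by one -> largest timestamp <= query, or '' if none
        return self.vals[key][i - 1] if i else ''
```

Splitting timestamps and values into parallel lists lets `bisect` search the timestamps directly without a key function.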
Solution 3: Floor Search Tracking Largest Valid Index With Right Pointer - Binary Search/Upper Ceiling or Lower Floor Trick Based on Target Binary Search
class TimeMap:
def __init__(self):
self.mp = {}
def set(self, key: str, value: str, timestamp: int) -> None:
# put a key: val -> to a timestamp
item = (timestamp, value)
if key not in self.mp:
self.mp[key] = []
self.mp[key].append(item)
def get(self, key: str, timestamp: int) -> str:
# Precondition:
# self.map[key] contains sorted (timestamp, value) pairs by ascending timestamp
# timestamp is an int >= 1
# Postcondition:
# returns value with largest timestamp <= given timestamp
# returns empty string if no such timestamp exists
# Empty Case
if key not in self.mp:
return ''
values = self.mp[key]
# Edge Case:
# If timestamp >= end of array timestamp, return last value
if timestamp >= values[-1][0]:
return values[-1][1]
left, right = 0, len(values)-1
# Loop invariant:
# 1. 0 <= left <= right + 1 <= len(values)
# 2. All values before 'left' have timestamp <= query timestamp
# 3. All values after 'right' have timestamp > query timestamp
# Ensures:
# When left > right, right points to
# the largest timestamp <= query or -1 if none exists
while left <= right:
mid = (left + right) // 2
time, val = values[mid]
# found target
if time == timestamp:
return val
# if time larger than query: reduce right side
elif time > timestamp:
right = mid - 1
# if time smaller than request: reduce left side
else:
left = mid + 1
# refer to postcondition
        return values[right][1] if right != -1 else ''
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
get | O(log m) | O(1) | Binary search over m timestamps for the key O(log m) | No additional memory allocation O(1) |
Overall | O(log m) | O(n) | Floor binary search dominates, leading to O(log m) | Storage proportional to total number of sets O(n) |
162. Find Peak Element ::1:: - Medium
Topics: Array, Binary Search
Intro
A peak element is an element that is strictly greater than its neighbors. Given a 0-indexed integer array nums, find a peak element, and return its index. If the array contains multiple peaks, return the index to any of the peaks. You may imagine that nums[-1] = nums[n] = -inf. In other words, an element is always considered to be strictly greater than a neighbor that is outside the array. You must write an algorithm that runs in O(log n) time.
Example Input | Output |
---|---|
nums = [1,2,3,1] | 2 |
nums = [1,2,1,3,5,6,4] | 5 |
Constraints:
1 ≤ nums.length ≤ 1000
-2^31 ≤ nums[i] ≤ 2^31 - 1
nums[i] != nums[i + 1] for all valid i.
Abstraction
Find a peak element in an array, where a peak is an element strictly greater than both of its neighbors. Multiple peaks can exist per array, and returning any one of them is valid.
Space & Time Complexity
Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
---|---|---|---|---|
Bug | Error |
---|---|
Brute Force:
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Find the Bug:
Solution 1: Modified Binary Search over Towards Higher Slope - Binary Search/Condition Adapted Binary Search
def findPeakElement(self, nums: List[int]) -> int:
# Precondition:
# len(nums) >= 1
# nums[i] != nums[i+1] for all valid i
# Postcondition:
# returns index i such that nums[i] is a peak:
# nums[i] > nums[i-1] (if i > 0)
# nums[i] > nums[i+1] (if i < n-1)
left, right = 0, len(nums) - 1
# Invariant:
# At least one peak exists within interval [left, right]
# Half of search space removed per iteration
# Binary Search Optimization: '<'
while left < right:
# mid
mid = (left + right) // 2
# Zoom In Excluding Mid:
# Peak towards right
# Discard left half of [left, mid, right]
# new search interval: [mid+1, right]
if nums[mid] < nums[mid + 1]:
left = mid + 1
# Zoom Including Mid:
# Peak towards left
# Discard right half of [left, mid, right] including Mid
# new search interval: [left, mid]
else:
right = mid
# Loop Exits: left == right
# Invariant: peak exists within interval [left, right]
# both left and right are pointing to peak
# overall: time complexity O(log n)
# overall: space complexity O(1)
return left
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Binary search loop | O(log n) | O(1) | Each iteration halves the interval while keeping a peak inside O(log n) | No additional memory allocation O(1) |
Overall | O(log n) | O(1) | Binary search toward the higher slope dominates, leading to O(log n) | No additional memory allocation O(1) |
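The climb-toward-the-higher-slope search, condensed into a standalone function and run on the problem's examples (the name `find_peak_element` as a free function is an assumption):

```python
def find_peak_element(nums):
    left, right = 0, len(nums) - 1
    while left < right:
        mid = (left + right) // 2
        if nums[mid] < nums[mid + 1]:
            left = mid + 1  # ascending slope: a peak lies to the right
        else:
            right = mid     # descending slope: mid itself may be the peak
    return left

print(find_peak_element([1, 2, 3, 1]))           # 2
print(find_peak_element([1, 2, 1, 3, 5, 6, 4]))  # 5
```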
4. Median of Two Sorted Arrays ::2:: - Hard
Topics: Array, Binary Search, Divide and Conquer
Intro
Given two sorted arrays nums1 and nums2 of size m and n respectively, return the median of the two sorted arrays. The overall run time complexity should be O(log (m+n)).
Example Input | Output |
---|---|
nums1 = [1,3], nums2 = [2] | 2.00000 |
nums1 = [1,2], nums2 = [3,4] | 2.50000 |
Constraints:
nums1.length == m
nums2.length == n
0 ≤ m ≤ 1000
0 ≤ n ≤ 1000
1 ≤ m + n ≤ 2000
-10^6 ≤ nums1[i], nums2[i] ≤ 10^6
Abstraction
Find median between two sorted arrays.
Space & Time Complexity
Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
---|---|---|---|---|
Bug | Error |
---|---|
Brute Force:
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Find the Bug:
Solution 1: Two Pointer Merge Until Median - Binary Search/Condition Adapted Binary Search
def findMedianSortedArrays(self, nums1, nums2):
# Precondition:
# nums1 and nums2 are sorted in non-decreasing order
# 0 <= len(nums1), len(nums2) <= 10^6
# Postcondition:
# Returns the median value of the merged sorted array
# If merged length is odd: median is the middle value
# If merged length is even: median is the average of the two middle values
# Lengths of the two arrays
m, n = len(nums1), len(nums2)
# Pointers to the current index in nums1 and nums2
p1, p2 = 0, 0
# Helper function:
# simulating one step of merge by returning the smallest
# of the current elements pointed to by p1 and p2
def get_min():
nonlocal p1, p2
# Case 1: Both pointers in bounds
# Return the smaller element and advance the pointer
if p1 < m and p2 < n:
# p1 smaller
if nums1[p1] < nums2[p2]:
ans = nums1[p1]
p1 += 1
# p2 smaller
else:
ans = nums2[p2]
p2 += 1
# Case 2: One of the pointers is out of range
# Return the smaller element in the other pointer and advance
# p2 out of bounds
elif p2 == n:
ans = nums1[p1]
p1 += 1
# p1 out of bounds
else:
ans = nums2[p2]
p2 += 1
# current smallest value between p1 and p2
return ans
# Invariant:
# At call i to get_min() (0 based indexing):
# Returns (i+1)th smallest element
# After i calls to get_min():
# i smallest elements have been discarded
# Note:
# Instead of merging the arrays fully,
# simulate merge only up to the median index.
# Note:
# For length k = m + n:
# If Odd: median is the k//2 element
# k = 5 -> k // 2 = 2 (3rd smallest)
# If Even: median is average of (k//2)-1 and k//2
# k = 6 -> (k // 2)-1 = 2 k//2 = 3 (3rd and 4th smallest)
# For odd length merged array
if (m+n)%2 == 1:
# Discard the first k//2 smallest elements
for i in range((m + n) // 2):
get_min()
# Return the k//2 smallest element
return get_min()
# For even length merged array
else:
# Discard the first (k//2 - 1) elements
for i in range((m + n) // 2 - 1):
get_min()
# Grab (k//2)-1
# Grab (k//2)
# Return average
return (get_min() + get_min()) / 2
# overall: time complexity O(m + n)
# overall: space complexity O(1)
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Merge simulation | O(m + n) | O(1) | Advances one pointer per get_min call up to the median index | Only two pointers maintained O(1) |
Overall | O(m + n) | O(1) | Simulated merge up to the median dominates, leading to O(m + n) | No additional memory allocation O(1) |
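The merge-until-median idea above, as a compact standalone function with the out-of-bounds handling folded into one condition (the snake_case name `find_median_sorted_arrays` as a free function is an assumption):

```python
def find_median_sorted_arrays(nums1, nums2):
    m, n = len(nums1), len(nums2)
    p1 = p2 = 0

    def get_min():
        # advance whichever pointer holds the smaller current element
        nonlocal p1, p2
        if p1 < m and (p2 == n or nums1[p1] < nums2[p2]):
            ans, p1 = nums1[p1], p1 + 1
        else:
            ans, p2 = nums2[p2], p2 + 1
        return ans

    if (m + n) % 2 == 1:
        for _ in range((m + n) // 2):   # discard the k//2 smallest
            get_min()
        return get_min()                # the middle element
    for _ in range((m + n) // 2 - 1):   # discard the first (k//2 - 1)
        get_min()
    return (get_min() + get_min()) / 2  # average of the two middle elements

print(find_median_sorted_arrays([1, 3], [2]))     # 2
print(find_median_sorted_arrays([1, 2], [3, 4]))  # 2.5
```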
Solution 2: Binary Search Partition - Binary Search/Condition Adapted Binary Search
def findMedianSortedArrays(self, nums1, nums2):
# Precondition:
# nums1 and nums2 are sorted in ascending order
# 0 <= len(nums1), len(nums2) <= 10^6
# Postcondition:
# returns the median of the merged sorted array:
# If Odd: middle value of total length
# If Even: average of two middle values
# Invariant:
# At each iteration, we maintain a partition (cut point) in nums1 (partitionX)
# and implicit partition (cut point) in nums2 (partitionY) such that:
# 1. Total count of elements between both left partitions =
# Total count of elements between both right partitions
# 2. maxLeftX <= minRightY and maxLeftY <= minRightX:
# means the partition correctly divides arrays so
# 3. all elements in left partitions are less than or equal to all elements in right partitions.
    # ensure nums1 is the smaller array so the binary search space is minimal
if len(nums1) > len(nums2):
nums1, nums2 = nums2, nums1
# lens of the two arrays
m, n = len(nums1), len(nums2)
# search space for the cut point in nums1
# range between [0, m]
# can put anywhere from no elements to all elements on left
low, high = 0, m
# Target Binary Search: "<=" in smaller array
# Each iteration is "guess" a cut across the smaller array
# (and implicitly the larger array) such that the Partition Rule is true
while low <= high:
# Guess cut point for nums1
# current nums1 search interval: [low, high]
        # we guess partitionX in nums1, and compute partitionY in nums2 accordingly
partitionX = (low + high) // 2
# Total count left half is fixed (half of all elements),
# so once you choose how many elements from nums1 go left (partitionX),
# you know how many elements from nums2 must go left (partitionY),
        # in order to keep the two halves of all elements balanced
# (5+1)//2 -> 3, (6+1)//2 -> 3
partitionY = (m + n + 1) // 2 - partitionX
# Grab values for partitions
# Allow for our invariant condition to be possible
# after reaching out of bounds of array
# if maxLeftX <= minRightY and maxLeftY <= minRightX:
# If the left part of nums1 is empty (partitionX == 0), then the maximum of an empty set should not restrict the comparison.
# Setting maxLeftX = -inf ensures the condition maxLeftX <= minRightY is always satisfied in this case.
maxLeftX = float('-inf') if partitionX == 0 else nums1[partitionX - 1]
# If the right part of nums1 is empty (partitionX == m), then there are no values to compare in the right half.
# Setting minRightX = +inf ensures the condition maxLeftY <= minRightX is always satisfied in this case.
minRightX = float('inf') if partitionX == m else nums1[partitionX]
# If the left part of nums2 is empty (partitionY == 0), then the maximum of an empty set should not restrict the comparison.
# Setting maxLeftY = -inf ensures the condition maxLeftY <= minRightX is always satisfied in this case.
maxLeftY = float('-inf') if partitionY == 0 else nums2[partitionY - 1]
# If the right part of nums2 is empty (partitionY == n), then the minimum of an empty set should not restrict the comparison.
# Setting minRightY = inf ensures the condition maxLeftX <= minRightY is always satisfied in this case.
minRightY = float('inf') if partitionY == n else nums2[partitionY]
# Discard Mid via Partitions Rule:
# max of left partition nums1 < min of right partition of nums2
# max of left partition nums2 < min of right partition of nums1
# we have successfully split 'merged' array into halves
if maxLeftX <= minRightY and maxLeftY <= minRightX:
# Even Merged Array
if (m + n) % 2 == 0:
return (max(maxLeftX, maxLeftY) + min(minRightX, minRightY)) / 2
# Odd Merged Array:
else:
return max(maxLeftX, maxLeftY)
# Continue Split Monotonic Increasing Shift
# If maxLeftX > minRightY:
# Means largest element on the left side of nums1 partition is greater
# than the smallest element on the right side of nums2 partition
# Implies: our partitionX is too far right -> too many elements from nums1
# are in the left half
# Condition Breaks Invariant: maxLeftX <= minRightY
# Discard right half of [low, partitionX, high]
# new search interval: [low, partitionX - 1]
elif maxLeftX > minRightY:
high = partitionX - 1
# If maxLeftY > minRightX
# Means largest element on the left side of nums2 partition is greater
# than the smallest element on the right side of nums1 partition
# Implies: our partitionX is too far left -> not enough elements from nums1
# are in the left half
        # Condition Breaks Invariant: maxLeftY <= minRightX
# Discard left half of [low, partitionX, high]
        # new search interval: [partitionX + 1, high]
elif maxLeftY > minRightX:
low = partitionX + 1
# overall: time complexity O(log (min(m, n)))
# overall: space complexity O(1)
Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
---|---|---|---|---|
Binary search on cut points | O(log(min(m, n))) | O(1) | Search runs over the smaller array's cut points | Only partition variables maintained O(1) |
Overall | O(log(min(m, n))) | O(1) | Partition binary search dominates, leading to O(log(min(m, n))) | No additional memory allocation O(1) |
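The partition search above, condensed into a standalone function for a quick check against the same examples (the name `find_median_partition` and the shortened variable names are assumptions):

```python
def find_median_partition(nums1, nums2):
    # binary search cut points on the smaller array
    if len(nums1) > len(nums2):
        nums1, nums2 = nums2, nums1
    m, n = len(nums1), len(nums2)
    low, high = 0, m
    while low <= high:
        px = (low + high) // 2            # cut in nums1
        py = (m + n + 1) // 2 - px        # implied cut in nums2
        # sentinels for empty sides of a partition
        max_left_x = float('-inf') if px == 0 else nums1[px - 1]
        min_right_x = float('inf') if px == m else nums1[px]
        max_left_y = float('-inf') if py == 0 else nums2[py - 1]
        min_right_y = float('inf') if py == n else nums2[py]
        if max_left_x <= min_right_y and max_left_y <= min_right_x:
            if (m + n) % 2 == 0:
                return (max(max_left_x, max_left_y) + min(min_right_x, min_right_y)) / 2
            return max(max_left_x, max_left_y)
        if max_left_x > min_right_y:
            high = px - 1                 # cut in nums1 is too far right
        else:
            low = px + 1                  # cut in nums1 is too far left

print(find_median_partition([1, 3], [2]))     # 2
print(find_median_partition([1, 2], [3, 4]))  # 2.5
print(find_median_partition([], [1]))         # 1
```

The empty-array case works because the `-inf`/`+inf` sentinels make the vacuous side of a partition never restrict the comparison.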