LeetCode: Two Pointers II Binary Search

Table Of Contents
Binary Search Intro
- Insight on Pointer Updates
- Limitation with duplicates
- Binary Search Application: Target Binary Search
- Binary Search Application: Multiple Layers Binary Search
- Binary Search Application: Optimization Search Min Max Binary Search
- Binary Search Application: Condition Adapted Binary Search
- Binary Search Application: Upper Ceiling or Lower Floor Trick Based on Target Binary Search
167. Two Sum II Sorted Array ::2:: - Medium
15. 3Sum ::2:: - Medium
- Intro
- Abstraction
- Space & Time Complexity
- Brute Force (all triplets comparison)
- Brute Force Hashmap Two Sum I Approach (No Sorting)
Bugs
- Find the Bug: List is unHashable
- Find the Bug: Index Into Wrong Array (2x) [im half asleep rn ok? :) cut me some slack]
- Find the Bug: Giving Sorted() 3 arguments instead of 1 [i dont know python syntax, YET :)]
- Find the Bug: Adding Set to Set, cannot because set is un-hashable type [again, syntax]
- Find the Bug: Did not increment pointers if currSum == 0 case
- Solution 1: [Sorting] Two Sum Mimic By Per Element Opposite Ends Pointer Shift By BinarySearch Modification For 0 [TC Opt] - Two Pointers/K Pointer Variants
- Solution 2: [No Sorting] Grouping By Parity 4 Triplet Combinations [Time Limit Exceeded] - Two Pointers/Algorithm
74. Search a 2D Matrix ::2:: - Medium
- Intro
- Abstraction
- Space & Time Complexity
- Brute Force:
- Find the Bug: Did not follow Binary Search
- Find the Bug: Forgot to break
- Solution 1: Iterative Opposite Ends Two Pointer Flattened Matrix by Index To Coordinates Binary Search - Binary Search/Searching
- Solution 2: Two Layer Binary Search Outer Row Range Search And Inner Row Element Search - Binary Search/Searching
875. Koko Eating Bananas ::2:: - Medium
- Intro
- Abstraction
- Space & Time Complexity
- Brute Force:
- Find the Bug: Hours For Speed k Is Always Rounding Down
- Find the Bug: Floor Caused K Starting Speed at 0, Leading to Division By 0 Error
- Find the Bug: Wait a minute, This House Has No Roof
- Find the Bug: Cannot Divide Bananas By Hour With Remainder
- Solution 1: Binary Search - Binary Search/Optimization Search Min Max
- Solution 2: [Greedy] [Networking] Networking Router Analogy, Greedy Heuristic Hybrid Target Optimization Monotonic Search with Exponential Speed Doubling - Binary Search/Optimization Search Min Max
981. Time Based Key Value Store ::2:: - Medium
- Intro
- Abstraction
- Space & Time Complexity
- Brute Force:
- Find the Bug:
- Solution 1: Reverse Linear Scan - Binary Search/Condition Adapted Binary Search
- Solution 2: Upper Ceiling to Find First Invalid TimeStamp - Binary Search/Upper Ceiling or Lower Floor Trick Based on Target Binary Search
- Solution 3: Floor Search Tracking Largest Valid Index With Right Pointer - Binary Search/Upper Ceiling or Lower Floor Trick Based on Target Binary Search
Binary Search Intro:
Leetcode problems with elegant solutions using binary search.
What is Binary Search
Binary Search is a divide and conquer algorithm for efficiently searching sorted lists or ranges.
It works by computing a middle index, comparing the middle value to the target (or evaluating a condition there), and halving the search space based on that comparison.
Why Use Binary Search
Binary search reduces the search space by half on each iteration.
| Technique | Time Complexity | Space Complexity |
|---|---|---|
| Linear Search | O(n) | O(1) |
| Binary Search (Recursive) | O(log n) | O(log n) |
| Binary Search (Iterative) | O(log n) | O(1) |
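The halving claim in the table can be made concrete with a quick sketch (pure arithmetic, not an actual search):

```python
# A sorted list of 1,000,000 elements is reduced to a single candidate
# after 19 floor-halvings, versus up to 1,000,000 probes for linear search.
n = 1_000_000
halvings = 0
while n > 1:
    n //= 2          # each binary search iteration halves the search space
    halvings += 1
assert halvings == 19   # floor(log2(1_000_000)) == 19
```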
Target Binary Search
Used to search for a specific value or index.
def binary_search(nums, target):
left, right = 0, len(nums) - 1
# Note 1:
# Target Binary Search: uses '<='
while left <= right:
mid = (left + right) // 2
# Note 2:
# 3 parts during iteration
# target check, remove left half, remove right half
if nums[mid] == target:
return mid
elif nums[mid] < target:
left = mid + 1
else:
right = mid - 1
# Note 3:
# Exit: left > right triggers ends
# Return value if not found
return -1
Monotonic Optimization Min/Max Binary Search
Used to find the smallest or largest value that satisfies a condition.
- Find the min element greater than or equal to the target:
In ceiling search, the goal is to locate the smallest number in a sorted array that is greater than or equal to the target.
def find_min_valid_value():
left, right = 0, N-1
# Note 1:
# Optimization Binary Search: uses '<'
while left < right:
mid = (left + right) // 2
# Note 2:
# Check if condition is satisfied
# If so, include mid with left or right half
if condition(mid):
# or left = mid, depending on which half we are checking
right = mid
else:
# or right = mid + 1, depending on which half we are removing
left = mid + 1
# Note: 3
# Exit: left == right, triggers the exit
# we have found the optimized solution
# or right
return left
Optimization Ceiling and Floor Style Binary Search
Variant of Optimization Binary Search:
The rule is that we can search directly for ceilings, but not for floors.
You can search directly for the ceiling (smallest element >= target) with binary search because the loop naturally stops at that boundary.
def find_ceiling(nums, target):
left, right = 0, len(nums)
# Loop condition: left < right
while left < right:
mid = (left + right) // 2
# If mid element is less than target,
# the ceiling must be to the right of mid (exclude mid)
if nums[mid] < target:
left = mid + 1
else:
# nums[mid] >= target, potential ceiling candidate
# Keep mid in search space by moving right boundary to mid
right = mid
# At the end, left == right, points to the smallest element >= target if exists
# If left == len(nums), no ceiling found
if left == len(nums):
return None
return nums[left]
You cannot search directly for the floor (largest element <= target) with binary search
def find_floor_wrong(nums, target):
left, right = 0, len(nums)
# Loop condition: left < right
while left < right:
mid = (left + right) // 2
# Attempt:
# mid element <= target:
# move left boundary up to mid (include mid)
if nums[mid] <= target:
# PROBLEM:
# left might not move forward when left == mid, causing infinite loop
left = mid
else:
# mid element > target:
# move right boundary down (exclude mid)
right = mid
# When loop ends, left == right, but it may not point to floor directly
# Without shifting left or checking bounds, this is unreliable
if left == 0 and nums[left] > target:
return None
return nums[left]
Understanding Optimization Search: Min vs Max
In optimization style binary search, we are not finding a specific value, but rather the smallest or largest value that satisfies a condition.
- Finding the Minimum Valid Value
if condition(mid):
right = mid # mid might be the answer, keep it
else:
left = mid + 1 # mid is invalid, discard it
- Finding the Maximum Valid Value
if condition(mid):
left = mid # mid might be the answer, keep it
else:
right = mid - 1 # mid is invalid, discard it
# Note: when keeping mid with left = mid, compute mid = (left + right + 1) // 2
# (round up) so the search space still shrinks each iteration
Why Binary Search Can Find < or > but Not <= or >=
Binary Search splits the space at mid and decides whether to include or exclude it.
This means it works best with strict boundaries.
It's easy to shrink toward the point where the transition happens on a strict < or >.
However, it struggles with inclusive boundaries like <= and >=, because at equality you don't know which direction to move in.
Ex: first element <= target
# When:
arr[mid] == target
# Should you go left or right?
# If you go right, you might miss an earlier match
# If you go left, you might go too far and miss it entirely
# If you don't reduce the search space and include mid, the loop may never terminate
So binary search prefers to work with strict < for the upper bound and > for the lower bound, then shifts the answer manually to get <= and >=.
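The manual shift can be sketched as follows; `first_greater` is an illustrative helper, not part of any problem below. We search for the first element strictly greater than the target, then step one index left to obtain the largest element <= target:

```python
def first_greater(nums, target):
    # Find the index of the first element > target (strict boundary),
    # using a half-open interval [left, right)
    left, right = 0, len(nums)
    while left < right:
        mid = (left + right) // 2
        if nums[mid] > target:
            right = mid       # mid could be the first element > target
        else:
            left = mid + 1    # everything up to mid is <= target
    return left

nums = [1, 3, 3, 5, 8]
idx = first_greater(nums, 3)   # first element strictly greater than 3
assert idx == 3 and nums[idx] == 5
# Shift one position left to get the largest element <= target:
assert nums[idx - 1] == 3
```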
Insight on Pointer Updates
A basic but easily overlooked principle in binary search is that the search space must shrink with every iteration.
This ensures the algorithm makes progress and terminates.
In target binary search:
We are looking for a specific index or value. After comparing mid, we either return mid as we have found the target, or search the left or right subspace while excluding mid as mid has already been checked.
In target binary search, we might hit the target before the pointers trigger termination
In optimization binary search:
We are looking for a min/max valid value rather than a specific target. After comparing mid, we search the left or right subspace while keeping mid in range, since mid could be the solution, until the pointers meet.
In optimization binary search, the loop always exits with left == right, and that position holds the optimal solution.
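The shrink-every-iteration principle matters most when a max-style search keeps mid with left = mid: with the usual floor-division mid, left can stall forever. A minimal runnable sketch (the helper name and the example bound are made up for illustration, and condition is assumed True on a prefix and False after) rounds mid up so the space still shrinks:

```python
def find_max_valid(condition, lo, hi):
    # Largest value in [lo, hi] with condition(x) True,
    # assuming condition is True on a prefix [lo..k] and False after
    left, right = lo, hi
    while left < right:
        # Round mid UP so that `left = mid` always makes progress
        mid = (left + right + 1) // 2
        if condition(mid):
            left = mid        # mid might be the answer, keep it
        else:
            right = mid - 1   # mid is invalid, discard it
    return left

# Example: largest x in [0, 100] with x*x <= 2000
# 44*44 = 1936 <= 2000, but 45*45 = 2025 > 2000
assert find_max_valid(lambda x: x * x <= 2000, 0, 100) == 44
```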
Limitation with duplicates
In most standard binary search problems, duplicates do not interfere with correctness or efficiency as long as:
- The array is fully sorted
- We're doing a target lookup or numeric optimization
- No structural transformations like rotation or flattening are applied that would affect the sorted order
Some problems depend on structure or ordering of elements (e.g., rotated arrays). In these cases, duplicates introduce ambiguity that binary search cannot resolve cleanly.
Ex: without duplicates
nums = [4, 5, 6, 7, 0, 1, 2]
def find_min_no_duplicates(nums):
left, right = 0, len(nums) - 1
while left < right:
mid = (left + right) // 2
if nums[mid] > nums[right]:
# Min must be in the right half (excluding mid)
left = mid + 1
else:
# Min is at mid or in the left half
right = mid
return nums[left]
Ex: with duplicates
nums = [2, 2, 2, 0, 1, 2]
def find_min_with_duplicates(nums):
left, right = 0, len(nums) - 1
while left < right:
mid = (left + right) // 2
if nums[mid] > nums[right]:
left = mid + 1
elif nums[mid] < nums[right]:
right = mid
else:
# nums[mid] == nums[right], can't determine side, shrink range
right -= 1
return nums[left]
Binary Search Application: Target Binary Search
Using Binary Search to locate the position of a target element in a sorted list by repeatedly dividing the search interval in half.
Ex: Searching for a target number in a sorted array.
def binarySearch(nums, target):
left, right = 0, len(nums) - 1
while left <= right:
mid = (left + right) // 2
if nums[mid] == target:
return mid # Target found
elif nums[mid] < target:
left = mid + 1 # Search right half
else:
right = mid - 1 # Search left half
return -1 # Target not found
Binary Search Application: Multiple Layers Binary Search
Applying binary search in multiple stages or layers, typically to first narrow down a substructure (e.g., a row, segment, or time range), and then searching again within that substructure.
This approach is useful when data is structured in nested sorted layers.
Ex: Searching for a target in a sorted 2D matrix
def searchMatrix(matrix: List[List[int]], target: int) -> bool:
if not matrix or not matrix[0]:
return False
rows, cols = len(matrix), len(matrix[0])
# First layer: Binary search over rows to find candidate row
top, bottom = 0, rows - 1
while top <= bottom:
mid_row = (top + bottom) // 2
if matrix[mid_row][0] <= target <= matrix[mid_row][-1]:
break
elif matrix[mid_row][0] > target:
bottom = mid_row - 1
else:
top = mid_row + 1
else:
return False # Target is outside all row ranges
# Second layer: Binary search within the found row
row = mid_row
left, right = 0, cols - 1
while left <= right:
mid_col = (left + right) // 2
if matrix[row][mid_col] == target:
return True
elif matrix[row][mid_col] < target:
left = mid_col + 1
else:
right = mid_col - 1
return False
Binary Search Application: Optimization Search Min Max Binary Search
Using binary search to find the smallest or largest valid value that satisfies a given constraint (e.g., minimum speed, minimum time, maximum capacity)
Searches over a numeric range, not for a target, and typically uses while left < right (with left == right as the exit trigger) to find the min or max boundary.
Ex: Find the minimum eating speed to finish all bananas within h hours
def minEatingSpeed(piles: List[int], h: int) -> int:
def hours_needed(speed):
return sum((pile + speed - 1) // speed for pile in piles) # ceil
left, right = 1, max(piles)
# Optimization Search
while left < right:
mid = (left + right) // 2
# Try slower (minimization)
if hours_needed(mid) <= h:
right = mid
# Need faster
else:
left = mid + 1
# Note:
# there is no case where left > right breaks the loop,
# because the condition is < and not <=,
# so trigger will always be:
# left == right
# Smallest k that works
return left
Binary Search Application: Condition Adapted Binary Search
Adapting search conditions of binary search to account for special problem parameters or constraints (e.g., rotated array, duplicates, bounded search).
Covers problems where binary search is applied, but the standard algorithm is modified.
Ex: Finding the minimum element in a rotated sorted array
def findMin(nums: List[int]) -> int:
left, right = 0, len(nums) - 1
# If array is not rotated (fully sorted ascending)
if nums[left] < nums[right]:
return nums[left]
# Modified binary search with parameter-based decision
while left < right:
mid = (left + right) // 2
# Decision based on comparing mid and right elements
if nums[mid] > nums[right]:
left = mid + 1
else:
right = mid
# left == right points to minimum element
return nums[left]
Binary Search Application: Upper Ceiling or Lower Floor Trick Based on Target Binary Search
Binary search cannot directly find the largest element less than or equal to a target (or, similarly, the smallest element greater than or equal to a target) in one step.
Instead we search for the first element greater than the target and then simply shift one position left to get the largest element less than or equal.
Ex: Find the largest element less than or equal to target
def largest_leq_shift(nums, target):
left, right = 0, len(nums) # half-open interval [left, right)
while left < right:
mid = (left + right) // 2
if nums[mid] <= target:
left = mid + 1 # nums[mid] <= target, the first element > target is to the right
else:
right = mid # mid > target, shrink right side
# left is now the index of first element > target
# so the answer is at left - 1 (if valid)
return nums[left - 1] if left > 0 else None # no element <= target exists
167. Two Sum II Sorted Array ::2:: - Medium
Topics: Array, Two Pointers, Binary Search
Intro
Given a 1-indexed array of integers sorted in non-decreasing order, find two numbers that add up to a target. Let these two numbers be numbers[index1] and numbers[index2] where 1 ≤ index1 < index2 ≤ numbers.length. Return the indices of the two numbers, index1 and index2, added by one, as an integer array [index1, index2] of length 2. The tests are generated such that there is exactly one solution. You may not use the same element twice.
| Input List | Input Target | Output |
|---|---|---|
| [2, 7, 11, 15] | 9 | [1,2] |
| [2, 3, 4] | 6 | [1,3] |
| [-1, 0] | -1 | [1,2] |
Constraints:
Solutions can only use constant space O(1)
Abstraction
Given a sorted array, find two numbers that add up to target, and numbers cannot share indexes.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Binary Search for Complement | O(n log n) | O(1) | Iterates over list of n elements O(n), performing binary search on each O(log n) leading to O(n log n) | No additional memory allocation O(1) |
| Two Pointers | O(n) | O(1) | Iterates through array of n length using two pointers O(n) | No additional memory allocation O(1) |
Bug: Binary Search updates left and right pointer by midValue instead of midIndex
def twoSum(self, numbers: List[int], target: int) -> List[int]:
for i in range(len(numbers)):
currComplementTarget = target - numbers[i]
left, right = i + 1, len(numbers) - 1
while left <= right:
midIndex = (left + right)//2
midInt = numbers[midIndex]
if currComplementTarget == midInt:
return [i + 1, midIndex + 1]
# BUG:
# moving pointer based on int value, instead of index
elif currComplementTarget < midInt:
right = midInt - 1
# BUG:
# moving pointer based on int value, instead of index
else:
left = midInt + 1
return []
Solution 1: [Two Pointers] Opposite Ends Pointer Shift by BinarySearch Modification for Target [TC Opt] - Two Pointers/Opposite Ends
def twoSumII(self, numbers: List[int], target: int) -> List[int]:
# Set up BinarySearch Modification
# initialize outer pointers
# space complexity: left and right pointers constant O(1)
left, right = 0, len(numbers) - 1
# BinarySearch Modification
# "<": working within subarray containing left and right
# cannot evaluate num at left == right, breaks constraints of index1 < index2
# Base Case: no more array remaining to search,
# return []
# time complexity: iterate over list of n length O(n)
while left < right:
# grab current sum
currSum = numbers[left] + numbers[right]
# Note:
# BinarySearch Pattern (1)
# 3 Paths: check middle target, if left path, else right path
# check middle target: found sum
if currSum == target:
# convert to 1-indexed array, per test cases
return [left + 1, right + 1]
# check left path: currSum is too low
elif currSum < target:
left += 1
# else right path: currSum is too high
else:
right -= 1
# overall: time complexity O(n)
# overall: space complexity O(1)
return []
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Pointer Loop | O(n) | O(1) | Left and right pointers iterate over list of n length O(n) | No additional memory allocation O(1) |
| Comparison | O(1) | O(1) | Sum and comparison operations take constant O(1) | No additional memory allocation for sum and comparison operations O(1) |
| Overall | O(n) | O(1) | Iterating with left and right pointers over list of n length dominates leading to O(n) | No additional memory allocation leading to constant O(1) |
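As a quick sanity check, here is the two-pointer approach restated as a standalone function (outside the class, with a name of our choosing) and run against the sample cases from the table above:

```python
def two_sum_sorted(numbers, target):
    # Opposite-ends two-pointer scan over a sorted array
    left, right = 0, len(numbers) - 1
    while left < right:
        curr = numbers[left] + numbers[right]
        if curr == target:
            return [left + 1, right + 1]   # 1-indexed, per the problem
        elif curr < target:
            left += 1                      # sum too low, raise it
        else:
            right -= 1                     # sum too high, lower it
    return []

# Sample cases from the table above
assert two_sum_sorted([2, 7, 11, 15], 9) == [1, 2]
assert two_sum_sorted([2, 3, 4], 6) == [1, 3]
assert two_sum_sorted([-1, 0], -1) == [1, 2]
```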
15. 3Sum ::2:: - Medium
Topics: Array, Two Pointers, Sorting, Binary Search
Intro
Given an integer array nums, return all the triplets [nums[i], nums[j], nums[k]] such that i != j, i != k, and j != k, and nums[i] + nums[j] + nums[k] == 0. Notice that the solution set must not contain duplicate triplets.
| nums | Output |
|---|---|
| [-1,0,1,2,-1,-4] | [[-1,-1,2],[-1,0,1]] |
| [0,1,1] | [] |
| [0,0,0] | [[0,0,0]] |
Constraints:
Abstraction
Given an array find all the groups of 3 integers that add up to 0, without duplicates.
Sorting vs non-sorting is the first distinction which will lead to different solutions.
We can:
- Sort by increasing value
- Sort by parity
Both of these give us different assumptions we can use to our advantage as we traverse the array. Let's see:
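A small sketch of why parity grouping works: enumerating every zero-sum triplet in the first sample shows its sign pattern must be one of four combinations (all-zero, -/0/+, -/-/+, or -/+/+), since an all-positive or all-negative triplet can never sum to zero:

```python
from itertools import combinations

def sign(x):
    # Maps x to -1, 0, or +1
    return (x > 0) - (x < 0)

nums = [-1, 0, 1, 2, -1, -4]
patterns = {
    tuple(sorted(sign(v) for v in trip))
    for trip in combinations(nums, 3)
    if sum(trip) == 0
}
# Only mixed-sign (or all-zero) patterns can appear
assert patterns <= {(0, 0, 0), (-1, 0, 1), (-1, -1, 1), (-1, 1, 1)}
```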
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Brute Force (all triplets comparison) | O(n^3) | O(m) | All triplet combinations via three nested loops O(n^3) | Stores m unique triplet tuples O(m) |
| Hashmap (Two Sum I Approach) | O(n^2) | O(n) | Outer loop freezes one value O(n), inner Two Sum pass O(n), leading to O(n^2) | Seen map and result set O(n) |
| Two Pointers Set | O(n^2) | O(n) | Sort O(n log n) plus per-element two pointer scan O(n^2) | Result set of unique triplets O(n) |
| Two Pointers Early Breaks Set | O(n^2) | O(n) | Same as above with duplicate skips and early break | Result set of unique triplets O(n) |
Brute Force (all triplets comparison)
def threeSum(self, nums: List[int]) -> List[List[int]]:
# space complexity: set of unique m triplets tuples O(m)
res = set()
n = len(nums)
# time complexity: n choose three for all triplet combinations / three nested loops O(n^3)
for i in range(n - 2):
for j in range(i + 1, n - 1):
for k in range(j + 1, n):
# time complexity: sum and comparison operation in constant O(1)
if nums[i] + nums[j] + nums[k] == 0:
# time complexity: sorting 3 elements in near constant O(1)
triplet = sorted([nums[i], nums[j], nums[k]])
# time complexity: insert operation in constant O(1)
res.add(tuple(triplet))
# overall: time complexity O(n^3)
# overall: space complexity O(m)
return [list(triplet) for triplet in res]
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Triple Loop | O(n^3) | O(1) | Three nested loops over n elements O(n^3) | No additional memory allocation O(1) |
| Result Storage | O(1) | O(m) | Insert takes constant O(1) | Stores m unique triplet tuples O(m) |
| Overall | O(n^3) | O(m) | Triple loop dominates, leading to O(n^3) | Result set of m triplets O(m) |
Brute Force Hashmap Two Sum I Approach (No Sorting)
def threeSum(self, nums: List[int]) -> List[List[int]]:
# space complexity: set of unique m triplet tuples O(m)
n = len(nums)
res = set()
# freezing one of the triple tuple values
# time complexity: iteration over list of n length O(n)
for i in range(n - 2):
# grab complement
target = -nums[i]
seen = {} # [ (value → its index), ...]
# freezing second of the triple tuple values
# time complexity: iteration over list of n length per outer iteration O(n^2)
for j in range(i + 1, n):
# grab complement
complement = target - nums[j]
# time complexity: lookup operation in constant O(1)
if complement in seen:
# add unique tuple to res
triplet = tuple(sorted((nums[i], nums[j], complement)))
res.add(triplet)
else:
seen[nums[j]] = j
# overall: time complexity O(n^2)
# overall: space complexity O(n)
return [list(t) for t in res]
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Outer Loop | O(n) | O(1) | Freezes first triplet value over n elements O(n) | No additional memory allocation O(1) |
| Inner Loop | O(n^2) | O(n) | Two Sum style pass O(n) per outer iteration, leading to O(n^2) | Seen map of up to n entries O(n) |
| Result Storage | O(1) | O(m) | Insert takes constant O(1) | Stores m unique triplet tuples O(m) |
| Overall | O(n^2) | O(n) | Inner Two Sum pass dominates, leading to O(n^2) | Seen map and result set O(n) |
Bugs
Find the Bug: List is unHashable
def threeSum(self, nums: List[int]) -> List[List[int]]:
# Note: Two Sum vs 3Sum
# Two sum, finding two numbers whose sum equals a target is simple
# with hashmaps or two pointer approaches being efficient
# 3Sum adds complexity with a third variable.
# Grouping techniques aim to narrow and organize the solution space
# reducing redundant checks while finding valid triplets
# positive, negative, and zero sets
# space complexity: three lists of total n length O(n)
p, n, z = [], [], []
# set ensures no duplicate triplets
# space complexity: set of unique m triplet tuples O(m)
res = set()
# time complexity: iterate over list of n length O(n)
for i in nums:
if i < 0:
n.append(i)
elif i > 0:
p.append(i)
else:
z.append(i)
# sets for unique groups
# time complexity: convert list into set O(n)
P, N = set(p), set(n)
# (0, num, -num)
# time complexity: iterate over positive numbers list n length O(n)
if len(z) > 0:
for i in P:
# time complexity: negative target lookup in constant O(1)
nTarget = -i
if nTarget in N:
res.add((nTarget, 0, i))
# (0, 0, 0)
# time complexity: len operation constant O(1)
if len(z) >= 3:
res.add((0, 0, 0))
# (-, -, +) negative pairs
# time complexity: iterate over list of negative numbers n length O(n)
for i in range(len(n)):
# time complexity: iterate over list of negative numbers n length O(n) per outer iteration O(n), leading to O(n^2)
for j in range(i+1, len(n)):
# time complexity: lookup operation constant O(1)
pTarget = -(n[i] + n[j])
if pTarget in P:
# INCORRECT:
# sorted() -> List[int]
# cannot hash list into set()
# must use tuple(sorted())
res.add((sorted([n[i], n[j], pTarget])))
# (-, +, +) positive pairs
# time complexity: iterate over list of positive numbers n length O(n)
for i in range(len(p)):
# time complexity: iterate over list of positive numbers n length O(n) per outer iteration O(n), leading to O(n^2)
for j in range(i+1, len(p)):
# time complexity: lookup operation constant O(1)
nTarget = -(p[i] + p[j])
if nTarget in n:
# INCORRECT:
# sorted() -> List[int]
# cannot hash list into set()
# must use tuple(sorted())
res.add(tuple(sorted([p[i], p[j], nTarget])))
# convert valid set of tuple triplets into valid list of tuple triplets
# overall: time complexity O(n^2)
# overall: space complexity O(n)
return list(res)
Find the Bug: Index Into Wrong Array (2x) [im half asleep rn ok? :) cut me some slack]
def threeSum(self, nums: List[int]) -> List[List[int]]:
# Note: Two Sum vs 3Sum
# Two sum, finding two numbers whose sum equals a target is simple
# with hashmaps or two pointer approaches being efficient
# 3Sum adds complexity with a third variable.
# Grouping techniques aim to narrow and organize the solution space
# reducing redundant checks while finding valid triplets
# positive, negative, and zero sets
# space complexity: three lists of total n length O(n)
p, n, z = [], [], []
# set ensures no duplicate triplets
# space complexity: set of unique m triplet tuples O(m)
res = set()
# time complexity: iterate over list of n length O(n)
for i in nums:
if i < 0:
n.append(i)
elif i > 0:
p.append(i)
else:
z.append(i)
# sets for unique groups
# time complexity: convert list into set O(n)
P, N = set(p), set(n)
# (0, num, -num)
# time complexity: iterate over positive numbers list n length O(n)
if len(z) > 0:
for i in P:
# time complexity: negative target lookup in constant O(1)
nTarget = -i
if nTarget in N:
res.add((nTarget, 0, i ))
# (0, 0, 0)
# time complexity: len operation constant O(1)
if len(z) >= 3:
res.add((0, 0, 0))
# (-, -, +) negative pairs
# time complexity: iterate over list of negative numbers n length O(n)
for i in range(len(n)):
# time complexity: iterate over list of negative numbers n length O(n) per outer iteration O(n), leading to O(n^2)
for j in range(i+1, len(n)):
# time complexity: lookup operation constant O(1)
# INCORRECT:
# stop being half asleep, index into the correct array
pTarget = -(nums[i] + nums[j])
if pTarget in P:
# INCORRECT:
# stop being half asleep, index into the correct array
res.add(tuple(sorted([nums[i], nums[j], pTarget])))
# (-, +, +) positive pairs
# time complexity: iterate over list of positive numbers n length O(n)
for i in range(len(p)):
# time complexity: iterate over list of positive numbers n length O(n) per outer iteration O(n), leading to O(n^2)
for j in range(i+1, len(p)):
# time complexity: lookup operation constant O(1)
# INCORRECT:
# stop being half asleep, index into the correct array
nTarget = -(nums[i] + nums[j])
if nTarget in n:
# INCORRECT:
# stop being half asleep, index into the correct array
res.add(tuple(sorted([nums[i], nums[j], nTarget])))
# convert valid set of tuple triplets into valid list of tuple triplets
# overall: time complexity O(n^2)
# overall: space complexity O(n)
return list(res)
Find the Bug: Giving Sorted() 3 arguments instead of 1 [i dont know python syntax, YET :)]
def threeSum(self, nums: List[int]) -> List[List[int]]:
res = set()
p, n, z = [], [], []
for num in nums:
if num > 0:
p.append(num)
elif num < 0:
n.append(num)
else:
z.append(num)
P, N = set(p), set(n)
# (0, -k, k)
if len(z) >= 1:
for neg in N:
pos = (-neg)
if pos in P:
res.add((neg, 0, pos))
# (0, 0, 0)
if len(z) >= 3:
res.add((0, 0, 0))
# (-a, -b, c)
for i in range(len(n)):
for j in range (i+1, len(n)):
pos = -1 * (n[i] + n[j])
if pos in P:
# INCORRECT:
# sorted takes 1 argument list of ints,
# not 3 arguments
res.add(tuple(sorted(n[i], n[j], pos)))
# (a, b, -c)
for i in range(len(p)):
for j in range(i+1, len(p)):
neg = -1 * (p[i] + p[j])
if neg in N:
# INCORRECT:
# sorted takes 1 argument list of ints,
# not 3 arguments
res.add(tuple(sorted(p[i], p[j], neg)))
return res
Find the Bug: Adding Set to Set, cannot because set is un-hashable type [again, syntax]
def threeSum(self, nums: List[int]) -> List[List[int]]:
res = set()
p, n, z = [], [], []
for num in nums:
if num > 0:
p.append(num)
elif num < 0:
n.append(num)
else:
z.append(num)
P, N = set(p), set(n)
# (0, -k, k)
if len(z) >= 1:
for neg in N:
pos = (-neg)
if pos in P:
res.add((neg, 0, pos))
# (0, 0, 0)
if len(z) >= 3:
res.add((0, 0, 0))
# (-a, -b, c)
for i in range(len(n)):
for j in range (i+1, len(n)):
pos = -1 * (n[i] + n[j])
if pos in P:
# INCORRECT:
# TypeError: un-hashable type: 'set'
# as sets are mutable
# instead do
# tuple(sorted([n[i], n[j], pos]))
# as tuples are not mutable, and thus hashable
res.add(set(sorted([n[i], n[j], pos])))
# (a, b, -c)
for i in range(len(p)):
for j in range(i+1, len(p)):
neg = -1 * (p[i] + p[j])
if neg in N:
# INCORRECT:
# TypeError: un-hashable type: 'set'
# as sets are mutable
# instead do
# tuple(sorted([p[i], p[j], neg]))
# as tuples are not mutable, and thus hashable
res.add(set(sorted([p[i], p[j], neg])))
return list(res)
Find the Bug: Did not increment pointers if currSum == 0 case
def threeSum(self, nums: List[int]) -> List[List[int]]:
res = set()
n = len(nums)
nums.sort()
for i in range(n - 2):
# only iterate over 1 number once
if i > 0 and nums[i] == nums[i-1]:
continue
# only allow negative numbers
if nums[i] > 0:
break
left, right = i + 1, n-1
# '<' because left != right
while left < right:
currSum = nums[i] + nums[left] + nums[right]
if currSum == 0:
res.add((nums[i], nums[left], nums[right]))
# INCORRECT:
# forgot to iterate both left and right
# should have been:
# left += 1
# right -= 1
elif currSum < 0:
left += 1
else:
right -= 1
return res
Solution 1: [Sorting] Two Sum Mimic By Per Element Opposite Ends Pointer Shift By BinarySearch Modification For 0 [TC Opt] - Two Pointers/K Pointer Variants
def threeSum(self, nums: List[int]) -> List[List[int]]:
# Note:
# Two sum: finding two numbers whose sum equals a target is simple
# with hashmaps or two pointer solutions being efficient
# 3Sum: adds complexity with a third variable,
# but adding constraints turns 3Sum into a two sum
# and allows the use of hashmap or two pointer solutions again
# set ensures no duplicate triplets
# sc: relative to input O(n)
results = set()
n = len(nums)
# tc: python (2.3-3.10) TimSort, Python (3.11+) PowerSort Worst/Avg: O(n log n), Best: O(n)
# sc: sorts in place, no extra memory O(1)
nums.sort()
# tc: iterate over list O(n)
for i in range(n - 2):
# skip iteration:
# i should only "BinarySearch" through any number once
if i > 0 and nums[i] == nums[i - 1]:
continue
# early break:
# only allow negative numbers for i in (i, j, k)
if nums[i] > 0:
break
# Note:
# BinarySearch Pattern (1)
# 3 Paths: check middle target, if left path, else right path
# shift left boundary
# reset right boundary to rightmost
left, right = i + 1, n - 1
# BinarySearch Modification (1)
# cannot evaluate num at left == right given constraint i != j, i != k, j != k
# thus, change to left < right instead of left <= right
# Base case: no more array remaining, numbers[i] complement not found,
# next iteration
# tc: two pointer scan over remaining elements O(n), for n outer iterations, O(n^2)
while left < right:
# grab current sum (i, j, k)
currSum = nums[i] + nums[left] + nums[right]
# check middle target: found sum
if currSum == 0:
# add triplet to result set
results.add((nums[i], nums[left], nums[right]))
# complete pair skip:
# only j and k can add to 0, we can step to next pair
left += 1
right -= 1
# Early duplicates skip:
# we already checked the combination of j and k, we can skip all duplicates
while left < right and nums[left] == nums[left - 1]:
left += 1
while left < right and nums[right] == nums[right + 1]:
right -= 1
# if left path: currSum is too low
elif currSum < 0:
left += 1
# else right path: currSum is too high
else:
right -= 1
# convert set of tuple triplets into list
# Time Complexity: O(n^2)
# Space Complexity: O(n)
return list(results)
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Sorting | O(n log n) | O(1) | Sorting array using TimSort in O(n log n) | No additional memory allocation for in place sorting |
| Outer Loop | O(n) | O(1) | Iterate over n elements O(n) | No additional memory allocation for iteration constant O(1) |
| Inner Loop | O(n^2) | O(1) | Two pointer traversal for subarray search over n elements O(n) per outer iteration n O(n), leading to O(n^2) | No additional memory allocation for pointer traversal |
| Result Storage | O(1) | O(n) | Insert takes constant O(1) | Stores k unique triplets O(k) which in worst case is just O(n) |
| Overall | O(n^2) | O(n) | Two pointer traversal for n elements dominates, leading to O(n^2) | Worst case of n/3 valid triplets O(n/3), which leads to O(n) |
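As a sanity check, Solution 1 restated as a standalone function (dropping `self`) and run against the sample cases:

```python
def three_sum(nums):
    # Sort, freeze nums[i], then run an opposite-ends two-pointer scan
    nums.sort()
    res = set()
    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:
            continue          # skip duplicate anchors
        if nums[i] > 0:
            break             # smallest element of a zero triplet must be <= 0
        left, right = i + 1, len(nums) - 1
        while left < right:
            s = nums[i] + nums[left] + nums[right]
            if s == 0:
                res.add((nums[i], nums[left], nums[right]))
                left += 1
                right -= 1
            elif s < 0:
                left += 1
            else:
                right -= 1
    return [list(t) for t in res]

# Sample cases from the table above (triplet order may vary, so sort)
assert sorted(three_sum([-1, 0, 1, 2, -1, -4])) == [[-1, -1, 2], [-1, 0, 1]]
assert three_sum([0, 1, 1]) == []
assert three_sum([0, 0, 0]) == [[0, 0, 0]]
```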
Solution 2: [No Sorting] Grouping By Parity 4 Triplet Combinations [Time Limit Exceeded] - Two Pointers/Algorithm
def threeSum(self, nums: List[int]) -> List[List[int]]:
# Note:
# Two sum: finding two numbers whose sum equals a target is simple
# with hashmaps or two pointer approaches
# 3Sum: adds complexity with a third variable.
# Grouping techniques aim to narrow and organize the solution space
# reducing redundant checks while finding valid triplets
# group by parity: positive, negative, and zero sets
# sc: relative to input O(n)
p, n, z = [], [], []
# set ensures no duplicate triplets
# sc: relative to input O(n)
res = set()
# tc: iterate n length O(n)
for i in nums:
if i < 0:
n.append(i)
elif i > 0:
p.append(i)
else:
z.append(i)
# sets ensures no duplicates
# tc: convert list into set O(n)
P, N = set(p), set(n)
# (0, 0, 0): at least 3 0's for a valid triplet
# tc: len operation constant O(1)
if len(z) >= 3:
res.add((0, 0, 0))
# (0, +, -): at least 1 additive inverse for a valid triplet with at least 1 0
if len(z) > 0:
# tc: iterate n positive numbers O(n)
for i in P:
nTarget = -i
# tc: inverse lookup constant O(1)
if nTarget in N:
# tc: put constant O(1)
res.add((nTarget, 0, i))
# (-, -, +): negative pairs which are additive inverse of 1 positive number
# tc: iterate n negative numbers O(n)
for i in range(len(n)):
# tc: iterate n O(n)
for j in range(i+1, len(n)):
pTarget = -(n[i] + n[j])
# tc: inverse lookup constant O(1)
if pTarget in P:
# tc: put constant O(1)
res.add(tuple(sorted([n[i], n[j], pTarget])))
# (-, +, +): positive pairs which are additive inverse of 1 negative number
# tc: iterate n positive numbers O(n)
for i in range(len(p)):
# tc: iterate n positive numbers O(n)
for j in range(i+1, len(p)):
nTarget = -(p[i] + p[j])
# tc: inverse lookup constant O(1)
if nTarget in N:
# tc: put constant O(1)
res.add(tuple(sorted([p[i], p[j], nTarget])))
# convert valid set of tuple triplets into valid list of tuple triplets
# overall: time complexity O(n^2)
# overall: space complexity O(n)
return list(res)
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Grouping Iteration | O(n) | O(n) | Iterate over list of n length to group into positive, negative, and zero lists O(n) | Grouping list of n length into three groups O(n) |
| Conversion to Sets | O(n) | O(n) | Convert list of positive and negative numbers into sets O(n) | Convert list of n length O(n) |
| Inverse Check | O(n) | O(1) | Iterate over positive list n length O(n) and lookup inverse in negative set O(1), leading to O(n) | No additional memory allocation for iteration O(1) |
| Zero Check | O(1) | O(1) | Zero list len check in constant O(1) | No additional memory allocation for len check O(1) |
| Negative Pairs | O(n^2) | O(n) | Outer/inner iteration over list of negative numbers O(n^2) with inverse positive lookup O(1), leading to O(n^2) | Adding n/3 triplets to result O(n/3), leading to O(n) |
| Positive Pairs | O(n^2) | O(n) | Outer/inner iteration over list of positive numbers O(n^2) with inverse negative lookup O(1), leading to O(n^2) | Adding n/3 triplets to result O(n/3), leading to O(n) |
| Overall | O(n^2) | O(n) | Negative and positive pair iterations dominate, leading to O(n^2) | Memory allocation for n/3 valid triplets, leading to O(n) |
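The parity solution leans on set semantics: tuples are hashable and deduplicate automatically, while lists raise TypeError when added to a set (the "List is unHashable" bug above). A quick demonstration:

```python
res = set()
res.add((-1, 0, 1))
res.add((-1, 0, 1))  # duplicate tuple: the set keeps a single copy
assert res == {(-1, 0, 1)}

# lists are mutable and unhashable, so they cannot be set members
try:
    res.add([-1, 0, 1])
    raise AssertionError("expected TypeError")
except TypeError:
    pass
```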
16. 3Sum Closest ::1:: - Medium
Topics: Array, Two Pointers, Sorting, Binary Search
Intro
Given an integer array nums of length n and an integer target, find three integers at distinct indices in nums such that the sum is closest to target. Return the sum of the three integers. You may assume that each input would have exactly one solution.
| Example Input | Output |
|---|---|
| nums = [-1,2,1,-4], target = 1 | 2 |
| nums = [0,0,0], target = 1 | 0 |
Constraints:
3 ≤ nums.length ≤ 500
-1000 ≤ nums[i] ≤ 1000
-10^4 ≤ target ≤ 10^4
Abstraction
Given a list of integers and a target, find the triplet whose sum is closest to the target, and return that sum.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
Brute Force (all triplets comparison)
Find the Bug:
Solution 1: [Two Pointer] Optimal Two Pointers - Two Pointers/K Pointer Variants
def threeSumClosest(self, nums: List[int], target: int) -> int:
# Two Pointer Pattern (Sorted Array)
# Idea:
# 1. Sort array.
# 2. Fix one number.
# 3. Use two pointers to find closest sum.
# Since array is sorted:
# - moving left pointer increases sum
# - moving right pointer decreases sum
nums.sort()
n = len(nums)
# initialize with first possible sum
closest = nums[0] + nums[1] + nums[2]
# Fix one element
for i in range(n - 2):
left = i + 1
right = n - 1
# Two Pointer Search
while left < right:
curr_sum = nums[i] + nums[left] + nums[right]
# update closest if better
if abs(curr_sum - target) < abs(closest - target):
closest = curr_sum
# exact match: best possible answer
if curr_sum == target:
return curr_sum
# move pointers
if curr_sum < target:
left += 1
else:
right -= 1
# overall: tc O(n^2)
# overall: sc O(1) (excluding sorting)
return closest
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
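A minimal standalone version of the two-pointer closest-sum scan (the function name `three_sum_closest` is illustrative), runnable against the examples above:

```python
from typing import List

def three_sum_closest(nums: List[int], target: int) -> int:
    nums = sorted(nums)
    # seed with the first possible triplet sum
    closest = nums[0] + nums[1] + nums[2]
    for i in range(len(nums) - 2):
        left, right = i + 1, len(nums) - 1
        while left < right:
            total = nums[i] + nums[left] + nums[right]
            # keep the sum with the smallest distance to target
            if abs(total - target) < abs(closest - target):
                closest = total
            if total == target:
                return total
            if total < target:
                left += 1
            else:
                right -= 1
    return closest
```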
18. 4Sum ::1:: - Medium
Topics: Array, Two Pointers, Sorting
Intro
Given an array nums of n integers, return an array of all the unique quadruplets [nums[a], nums[b], nums[c], nums[d]] such that: 0 ≤ a, b, c, d < n; a, b, c, and d are distinct; and nums[a] + nums[b] + nums[c] + nums[d] == target. You may return the answer in any order.
| Example Input | Output |
|---|---|
| nums = [1,0,-1,0,-2,2], target = 0 | [[-2,-1,1,2],[-2,0,0,2],[-1,0,0,1]] |
| nums = [2,2,2,2,2], target = 8 | [[2,2,2,2]] |
Constraints:
1 ≤ nums.length ≤ 200
-10^9 ≤ nums[i] ≤ 10^9
-10^9 ≤ target ≤ 10^9
Abstraction
Given a list of integers and a target, find all unique quadruplets that sum to the target.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
Brute Force (all quadruplets comparison)
Find the Bug:
Solution 1: [Two Pointer] Two Pointers - Two Pointers/K Pointer Variants
def fourSum(self, nums: List[int], target: int) -> List[List[int]]:
# Two Pointer Pattern (Extended 3Sum)
# Idea:
# 1. Sort array.
# 2. Fix first number (i).
# 3. Fix second number (j).
# 4. Use two pointers to find remaining two numbers.
# Need duplicate skipping to ensure unique quadruplets.
#
nums.sort()
#
n = len(nums)
res = []
# Fix first number
for i in range(n - 3):
# skip duplicates for first number
if i > 0 and nums[i] == nums[i - 1]:
continue
# Fix second number
for j in range(i + 1, n - 2):
# skip duplicates for second number
if j > i + 1 and nums[j] == nums[j - 1]:
continue
# Two Pointer Search
left = j + 1
right = n - 1
while left < right:
curr_sum = (nums[i] + nums[j] + nums[left] + nums[right])
if curr_sum == target:
res.append([nums[i], nums[j], nums[left], nums[right]])
# skip duplicates
left += 1
right -= 1
while left < right and nums[left] == nums[left - 1]:
left += 1
while left < right and nums[right] == nums[right + 1]:
right -= 1
elif curr_sum < target:
left += 1
else:
right -= 1
# overall: tc O(n^3)
# overall: sc O(1) (excluding output)
return res
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
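The fix-two-then-two-pointer structure can be exercised standalone; a sketch (the name `four_sum` is illustrative) mirroring Solution 1:

```python
from typing import List

def four_sum(nums: List[int], target: int) -> List[List[int]]:
    nums = sorted(nums)
    n = len(nums)
    res = []
    for i in range(n - 3):
        # skip duplicate first numbers
        if i > 0 and nums[i] == nums[i - 1]:
            continue
        for j in range(i + 1, n - 2):
            # skip duplicate second numbers
            if j > i + 1 and nums[j] == nums[j - 1]:
                continue
            left, right = j + 1, n - 1
            while left < right:
                total = nums[i] + nums[j] + nums[left] + nums[right]
                if total == target:
                    res.append([nums[i], nums[j], nums[left], nums[right]])
                    left += 1
                    right -= 1
                    # skip duplicate third and fourth numbers
                    while left < right and nums[left] == nums[left - 1]:
                        left += 1
                    while left < right and nums[right] == nums[right + 1]:
                        right -= 1
                elif total < target:
                    left += 1
                else:
                    right -= 1
    return res
```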
Solution 2: [Two Pointer] Optimal Two Pointers Early Pruning [TC Opt] - Two Pointers/K Pointer Variants
def fourSum(self, nums: List[int], target: int) -> List[List[int]]:
# Two Pointer Pattern (Extended 3Sum)
# Idea:
# 1. Sort array.
# 2. Fix first number (i).
# 3. Fix second number (j).
# 4. Use two pointers to find remaining two numbers.
# Optimization:
# - Early pruning using minimum and maximum possible sums
# - Since array is sorted, we can safely break or continue early.
nums.sort()
n = len(nums)
res = []
# Fix first number
for i in range(n - 3):
# Skip duplicates for first number
if i > 0 and nums[i] == nums[i - 1]:
continue
# Early Prune:
# smallest possible sum already exceeds target
if nums[i] + nums[i+1] + nums[i+2] + nums[i+3] > target:
break
# Early Continue:
# largest possible sum still smaller than target
if nums[i] + nums[n-3] + nums[n-2] + nums[n-1] < target:
continue
# Fix second number
for j in range(i + 1, n - 2):
# Skip duplicates for second number
if j > i + 1 and nums[j] == nums[j - 1]:
continue
# Early Prune (inner loop):
# minimum possible sum too large
if nums[i] + nums[j] + nums[j+1] + nums[j+2] > target:
break
# Early Continue (inner loop):
# maximum possible sum too small
if nums[i] + nums[j] + nums[n-2] + nums[n-1] < target:
continue
# Two Pointer Search
left, right = j + 1, n - 1
while left < right:
total = (nums[i] + nums[j] + nums[left] + nums[right])
if total == target:
res.append([nums[i], nums[j], nums[left], nums[right]])
# Skip duplicates for third and fourth numbers
while left < right and nums[left] == nums[left + 1]:
left += 1
while left < right and nums[right] == nums[right - 1]:
right -= 1
left += 1
right -= 1
elif total < target:
left += 1
else:
right -= 1
# overall: tc O(n^3)
# overall: sc O(1) (excluding output)
return res
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
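The prune conditions rest on sortedness: with anchor i fixed, nums[i]+nums[i+1]+nums[i+2]+nums[i+3] is the smallest reachable quadruplet sum and nums[i]+nums[n-3]+nums[n-2]+nums[n-1] is the largest. A small numeric check (values chosen arbitrarily):

```python
nums = sorted([-5, -1, 0, 2, 4, 7])
n, target = len(nums), 3
# anchor at i = 1 (nums[1] == -1)
i = 1
smallest = nums[i] + nums[i + 1] + nums[i + 2] + nums[i + 3]  # -1 + 0 + 2 + 4 = 5
largest = nums[i] + nums[n - 3] + nums[n - 2] + nums[n - 1]   # -1 + 2 + 4 + 7 = 12
assert smallest == 5 and largest == 12
# smallest reachable sum already exceeds target: safe to break,
# since later anchors are even larger
assert smallest > target
```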
704. Binary Search ::2:: - Easy
Topics: Array, Binary Search
Intro
Given an array of integers nums which is sorted in ascending order, and an integer target, write a function to search target in nums. If target exists, then return its index. Otherwise, return -1. You must write an algorithm with O(log n) runtime complexity.
| Example Input | Output |
|---|---|
| nums = [-1,0,3,5,9,12] target = 9 | 4 |
| nums = [-1,0,3,5,9,12], target = 2 | -1 |
Constraints:
1 ≤ nums.length ≤ 10^4
-10^4 ≤ nums[i], target ≤ 10^4
All the integers in nums are unique
nums is sorted in ascending order
Abstraction
Given a sorted list of elements, find the target
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Bug | Error |
|---|---|
Brute Force: (iterative)
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
Find the Bug: Missing return value
def search(self, nums: List[int], target: int) -> int:
def helper(left, right):
# base case: target not found
if left > right:
return -1
mid = (left+right)//2
# base case target found
if nums[mid] == target:
return mid
# search right
elif nums[mid] < target:
# INCORRECT:
# no return value
# Instead:
# return helper(mid+1, right)
helper(mid+1, right)
# search left
else:
# INCORRECT:
# no return value
# Instead:
# return helper(left, mid-1)
helper(left, mid-1)
res = helper(0, len(nums)-1)
return res
Solution 1: Recursive Function Call Binary Search - Binary Search/Searching
def search(self, nums: List[int], target: int) -> int:
# Binary Search: Recursive
# Recursive function that checks a middle target,
# and depending on result searches the left or right subspace.
# The recursive function calls continue for O(log n) on
# smaller and smaller range determining the time complexity,
# and the recursive call stack creates a space complexity of O(log n)
# Invariant:
# Subarray nums[left:right] contains the target if it exists
# Search space halves on each iteration
# sc: setting up search space [0, n-1] O(1)
left, right = 0, len(nums)-1
# Binary Search 1:
# Target Binary Search Space: '<='
# Exits when left > right, as this means target was not found
# tc: binary search over n elements O(log(n))
def bs_recursive(left, right):
# Current Search Space:
# [left, mid, right]
# End Case: out of search space, target not found
if left > right:
return -1
# Middle Target:
mid = (left + right) // 2
# Base Case: target found
if nums[mid] == target:
return mid
# Target is greater than mid:
# Discard left side of search space [left, mid]
# Search right side of previous space, exclude mid: [mid+1, right]
elif nums[mid] < target:
return bs_recursive(mid + 1, right)
# Target is less than mid:
# Discard right side of search space [mid, right]
# Search left side of previous space, exclude mid: [left, mid-1]
else:
return bs_recursive(left, mid - 1)
# tc: each call halves the search space O(log n)
# sc: call stack grows with depth O(log n)
result = bs_recursive(left, right)
# overall: tc O(log n)
# overall: sc O(log n)
return result
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Variable assigning | O(1) | O(1) | Initializing variables in constant O(1) | No additional memory allocation for constant variables O(1) |
| Recursive Binary Search | O(log n) | O(log n) | Search space is halved on each recursive call O(log n) | Call stack grows to depth O(log n) |
| Overall | O(log n) | O(log n) | Binary search over sorted list dominates, leading to O(log n) | Stack memory used for recursive calls dominates, leading to O(log n) |
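Python's standard library already ships this target-search pattern; `bisect_left` returns the leftmost insertion point, which equals the target's index whenever the target is present. A standalone sketch:

```python
from bisect import bisect_left
from typing import List

def search(nums: List[int], target: int) -> int:
    # bisect_left: index of the first element >= target, O(log n)
    i = bisect_left(nums, target)
    # the target exists only if that slot actually holds it
    if i < len(nums) and nums[i] == target:
        return i
    return -1
```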
Solution 2: Iterative Opposite Ends Two Pointer Binary Search - Binary Search/Searching
def search(self, nums: List[int], target: int) -> int:
# Binary Search: Iterative
# Iterative function that checks a middle target,
# and depending on result searches the left or right subspace.
# The iterative function calls continue for O(log n) on
# smaller and smaller range determining the time complexity,
# and since we are using pointers there is no call stack
# leaving a space complexity of O(1)
# Invariant:
# Subarray nums[left:right] contains the target if it exists
# Search space halves on each iteration
# sc: setting up search space [0, n-1] O(1)
left, right = 0, len(nums)-1
# Binary Search 1:
# Target Binary Search Space: '<='
# Exits when left > right, as this means target was not found
# tc: search space halves each iteration O(log n)
while left <= right:
# Current Search Space:
# [left, mid, right]
# Middle Target:
mid = (left+right)//2
# Base Case: target found
if nums[mid] == target:
return mid
# Target is greater than mid:
# Discard left side of search space [left, mid]
# Search right side of previous space, exclude mid: [mid+1, right]
elif nums[mid] < target:
left = mid + 1
# Target is less than mid:
# Discard right side of search space [mid, right]
# Search left side of previous space, exclude mid: [left, mid-1]
else:
right = mid - 1
# End Case: out of search space, target not found
# overall: tc O(log n)
# overall: sc O(1)
return -1
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Variable assigning | O(1) | O(1) | Initializing variables in constant O(1) | No additional memory allocation for constant variables O(1) |
| Iterative Binary Search | O(log n) | O(1) | Loop halves the search space on each iteration O(log n) | No additional memory allocation beyond loop variables O(1) |
| Overall | O(log n) | O(1) | Binary search over sorted array O(log n) | No additional memory allocation O(1) |
74. Search a 2D Matrix ::2:: - Medium
Topics: Array, Binary Search, Matrix
Intro
You are given an m x n integer matrix matrix with the following two properties: Each row is sorted in non-decreasing order. The first integer of each row is greater than the last integer of the previous row. Given an integer target, return true if target is in matrix or false otherwise. You must write a solution in O(log(m * n)) time complexity.
| Example Input | Output |
|---|---|
| matrix = [[1,3,5,7],[10,11,16,20],[23,30,34,60]], target = 3 | true |
| matrix = [[1,3,5,7],[10,11,16,20],[23,30,34,60]], target = 13 | false |
Constraints:
m == matrix.length, n == matrix[i].length
1 ≤ m, n ≤ 100
-10^4 ≤ matrix[i][j], target ≤ 10^4
Abstraction
Given a sorted matrix, check if target is in matrix
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Bug | Error |
|---|---|
Brute Force:
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
Find the Bug: Did not follow Binary Search
Find the Bug: Forgot to break
def searchMatrix(self, matrix: List[List[int]], target: int) -> bool:
if len(matrix) == 0 or len(matrix[0]) == 0:
return False
rows, cols = len(matrix), len(matrix[0])
top, bottom = 0, rows-1
rowCand = 0
# first binary search
while top <= bottom:
mid = (top+bottom)//2
if matrix[mid][0] <= target <= matrix[mid][-1]:
# INCORRECT:
# never exiting the while loop
# Instead:
# break
rowCand = mid
elif matrix[mid][-1] < target:
top = mid+1
else:
bottom = mid-1
else:
return False
left, right = 0, len(matrix[rowCand])-1
while left <= right:
mid = (left+right)//2
val = matrix[rowCand][mid]
if val == target:
return True
elif val < target:
left = mid + 1
else:
right = mid - 1
else:
return False
Solution 1: Iterative Opposite Ends Two Pointer Flattened Matrix by Index To Coordinates Binary Search - Binary Search/Searching
def searchMatrix(self, matrix: List[List[int]], target: int) -> bool:
# Matrix Manipulation Flattening:
# Represent the 2D matrix as a sorted 1D array by flattening its indices.
# Then simply use binary search on the 1D index space,
# and converts indices back to the original 2D coordinates with a helper function.
# Matrix =
# [ row
# [ 1, 3, 5], 0
# [ 7, 9, 11], 1
# [13, 15, 17] 2
# ]
# col 0 1 2
# Flattened:
# [ 1, 3, 5, 7, 9, 11, 13, 15, 17 ]
# 0 1 2 3 4 5 6 7 8
# Invariant:
# Target, if it exists, lies within subarray [left, right]
# Search space is reduced by half on each iteration.
# Helper:
# Revert 1D flat index to original 2D tuple coordinates (row, col)
# Flat Index 4 => (4//3, 4%3) = Matrix (1, 1)
# Flat Index 5 => (5//3, 5%3) = Matrix (1, 2)
def index_to_coords(index, cols) -> tuple[int, int]:
row = index // cols
col = index % cols
return (row, col)
# Empty Check: target not within matrix
if not matrix or not matrix[0]:
return False
# Flattened Representation:
# Calculate the 2D matrix representation as a single array
# sc: setting up search space [0, n-1] O(1)
rows, cols = len(matrix), len(matrix[0])
left, right = 0, (rows * cols) - 1
# Binary Search 1:
# Target Binary Search Space: '<='
# Exits when left > right, as this means target was not found
# tc: binary search over m*n O(log(m*n))
while left <= right:
# Current Search Space:
# [left, mid, right]
# Middle Target:
mid = (left + right) // 2
(row, col) = index_to_coords(mid, cols)
mid_value = matrix[row][col]
# Base Case: target found
if mid_value == target:
return True
# Target is greater than mid:
# Discard left side of search space [left, mid]
# Search right side of previous space, exclude mid: [mid+1, right]
elif mid_value < target:
left = mid + 1
# Target is less than mid:
# Discard right side of search space [mid, right]
# Search left side of previous space, exclude mid: [left, mid-1]
else:
right = mid - 1
# End Case: out of search space, target not found
# overall tc: O(log (m * n))
# overall sc: O(1)
return False
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Binary search 1D index | O(log (m*n)) = O(log(m) + log(n)) | O(1) | Binary search over full flattened matrix treated as 1D O(log (m*n)) | No additional memory allocation |
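A standalone sketch of the flattened-index search (function name illustrative), using `divmod` for the index-to-coordinates mapping instead of a helper:

```python
from typing import List

def search_matrix(matrix: List[List[int]], target: int) -> bool:
    if not matrix or not matrix[0]:
        return False
    rows, cols = len(matrix), len(matrix[0])
    left, right = 0, rows * cols - 1
    while left <= right:
        mid = (left + right) // 2
        # divmod maps the flat index back to (row, col)
        row, col = divmod(mid, cols)
        value = matrix[row][col]
        if value == target:
            return True
        if value < target:
            left = mid + 1
        else:
            right = mid - 1
    return False
```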
Solution 2: Two Layer Binary Search Outer Row Range Search And Inner Row Element Search - Binary Search/Searching
def searchMatrix(self, matrix: List[List[int]], target: int) -> bool:
# Outer Inner Binary Search
# Performs two separate binary searches
# - Binary Search over rows to locate potential row candidate
# - Binary Search within that row candidate to verify target existence
# Matrix =
# [ row
# [ 1, 3, 5], 0
# [ 7, 9, 11], 1
# [13, 15, 17] 2
# ]
# col 0 1 2
# Invariant (First Binary Search):
# Target, if it exists, lies within row range [matrix[row][0], matrix[row][-1]]
# Search space is reduced by half on each iteration.
# Invariant (Second Binary Search):
# Target, if it exists, lies within row elements [left, right]
# Search space is reduced by half on each iteration.
# Empty Check: target not within matrix
if not matrix or not matrix[0]:
return False
# Matrix Representation:
# Top is row 0 and bottom is row n-1 (refer to diagram)
# sc: setting up search space [0, n-1] O(1)
rows, cols = len(matrix), len(matrix[0])
top, bottom = 0, rows - 1
# Binary Search 1:
# Target Binary Search Space: '<='
# Exits when top > bottom, as this means target was not found
# tc: binary search over n rows O(log n)
while top <= bottom:
# Middle Target:
mid_row = (top + bottom) // 2
# Base Case: target found
if matrix[mid_row][0] <= target <= matrix[mid_row][-1]:
break
# Target is greater than this row's last element:
# Discard this row and rows above: [top, mid_row]
# Search rows below, exclude mid_row: [mid_row+1, bottom]
elif matrix[mid_row][-1] < target:
top = mid_row + 1
# Target is less than this row's first element:
# Discard this row and rows below: [mid_row, bottom]
# Search rows above, exclude mid_row: [top, mid_row-1]
else:
bottom = mid_row - 1
# End Case: out of search space, target not found
else:
return False
# row boundaries
row = mid_row
left, right = 0, cols - 1
# Binary Search 1:
# Target Binary Search Space: '<='
# tc: binary search across m columns in row O(log m)
while left <= right:
# Middle Target:
mid_col = (left + right) // 2
# Base Case: target found
if matrix[row][mid_col] == target:
return True
# Target is greater than mid:
# Discard left side of search space [left, mid]
# Search right side of previous space, exclude mid: [mid+1, right]
elif matrix[row][mid_col] < target:
left = mid_col + 1
# Target is less than mid:
# Discard right side of search space [mid, right]
# Search left side of previous space, exclude mid: [left, mid-1]
else:
right = mid_col - 1
# End Case: out of search space, target not found
# overall: tc O(log (m * n))
# overall: sc O(1)
return False
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Binary search rows | O(log m) | O(1) | Binary search across m rows O(log m) | No additional memory allocation for binary search O(1) |
| Binary search columns | O(log n) | O(1) | Binary search across column of n length O(log n) | No additional memory allocation for binary search O(1) |
| Overall | O(log m + log n) = O(log (m*n)) | O(1) | Sum of two binary searches dominates, leading to O(log m + log n) | No additional memory allocation, leading to O(1) |
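The break-out-of-the-row-search above leans on Python's while/else: the else branch runs only when the loop finishes without hitting break. A small sketch of just the outer row search (helper name illustrative):

```python
from typing import List

def find_candidate_row(matrix: List[List[int]], target: int) -> int:
    top, bottom = 0, len(matrix) - 1
    while top <= bottom:
        mid = (top + bottom) // 2
        # target could live in this row's value range
        if matrix[mid][0] <= target <= matrix[mid][-1]:
            break
        elif matrix[mid][-1] < target:
            top = mid + 1
        else:
            bottom = mid - 1
    else:
        # loop exhausted without break: no row can contain target
        return -1
    return mid
```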
875. Koko Eating Bananas ::2:: - Medium
Topics: Array, Binary Search, Network Packet Routing, Design, System Level Design
Intro
Koko loves to eat bananas. There are n piles of bananas, the ith pile has piles[i] bananas. The guards have gone and will come back in h hours. Koko can decide her bananas-per-hour eating speed of k. Each hour, she chooses some pile of bananas and eats k bananas from that pile. If the pile has less than k bananas, she eats all of them instead and will not eat any more bananas during this hour. Koko likes to eat slowly but still wants to finish eating all the bananas before the guards return. Return the minimum integer k such that she can eat all the bananas within h hours.
| Example Input | Output |
|---|---|
| piles = [3,6,7,11], h = 8 | 4 |
| piles = [30,11,23,4,20], h = 5 | 30 |
| piles = [30,11,23,4,20], h = 6 | 23 |
Constraints:
1 ≤ piles.length ≤ 10^4
piles.length ≤ h ≤ 10^9
1 ≤ piles[i] ≤ 10^9
Abstraction
Find minimum bananas per hour so that all are eaten in time.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Bug | Error |
|---|---|
Brute Force:
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
Find the Bug: Hours For Speed k Is Always Rounding Down
def minEatingSpeed(self, piles: List[int], h: int) -> int:
left, right = 1, max(piles)
while left < right:
speed = (left+right)//2
hours = 0
for pile in piles:
# INCORRECT:
# division is rounding down (floor)
# Instead:
# (pile/speed) -> leaves remainder to be picked up by ceiling
hours += math.ceil(pile//speed)
if hours <= h:
right = speed
else:
left = speed+1
return left
Find the Bug: Floor Caused K Starting Speed at 0, Leading to Division By 0 Error
def minEatingSpeed(self, piles: List[int], h: int) -> int:
# INCORRECT:
# Division could lead to remainder 0.1
# that rounds to 0.
# Safer to round up
# Instead:
# speedCheck1 = math.ceil(totalBananas / h)
totalBananas = sum(piles)
speedCheck1 = math.floor(totalBananas / h)
# INCORRECT:
# Division could lead to remainder 0.1
# that rounds to 0.
# Safer to round up
# Instead:
# speedCheck2 = math.ceil(min(piles) / evenTime)
numPiles = len(piles)
evenTime = int(h/numPiles)
speedCheck2 = math.floor(min(piles) / evenTime)
# INCORRECT:
# Division could lead to remainder 0.1
# that rounds to 0.
# Safer to round up
# Instead:
# speedCheck2 = math.ceil(max(piles) / (h - (numPiles - 1)))
speedCheck3 = math.floor(max(piles) / (h - (numPiles - 1)))
speed = max(speedCheck1, speedCheck2, speedCheck3)
while (True):
# Grab hours needed for current speed
smallKTotalTime = 0
for pile in piles:
smallKTotalTime += math.ceil(pile / speed)
# Greedy Search: designed to search from lower bound eating speed k
# Implies: first valid speed we encounter is the smallest one
if smallKTotalTime <= h:
return speed
# if totalTime is too large, double the speed
elif smallKTotalTime >= 2 * h:
speed *= 2
# if close but still too large, we increment the speed by 1
else:
speed += 1
Find the Bug: Wait a minute, This House Has No Roof
def minEatingSpeed(self, piles: List[int], h: int) -> int:
# INCORRECT:
# speed will have remainder, not valid speed
# Instead:
# speed1 = math.ceil(allBananas/h)
allBananas = sum(piles)
speed1 = allBananas/h
# INCORRECT:
# speed will have remainder, not valid speed
# Instead:
# speed2 = math.ceil(min(piles)/evenTime)
evenTime = int(h/len(piles))
speed2 = min(piles)/evenTime
# INCORRECT:
# speed will have remainder, not valid speed
# Instead:
# speed3 = math.ceil(max(piles)/unEvenTime)
unEvenTime = h - (len(piles)-1)
speed3 = max(piles)/unEvenTime
speed = max(speed1, speed2, speed3)
while (True):
requiredHours = 0
for pile in piles:
requiredHours += math.ceil(pile/speed)
if requiredHours <= h:
return speed
elif requiredHours >= 2*h:
speed *= 2
else:
speed += 1
Find the Bug: Hours Divided By Piles Leaves a Remainder
def minEatingSpeed(self, piles: List[int], h: int) -> int:
allBananas = sum(piles)
speed1 = math.ceil(allBananas/h)
# INCORRECT:
# evenTime will have a remainder, leading to errors in later divisions
# Instead:
# evenTime = int(h/len(piles))
evenTime = h/len(piles)
speed2 = math.ceil(min(piles)/evenTime)
unEvenTime = h - (len(piles)-1)
speed3 = math.ceil(max(piles)/unEvenTime)
speed = max(speed1, speed2, speed3)
while (True):
requiredHours = 0
for pile in piles:
requiredHours += math.ceil(pile/speed)
if requiredHours <= h:
return speed
elif requiredHours >= 2*h:
speed *= 2
else:
speed += 1
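Several of the bugs above revolve around where the ceiling is applied; a quick check of the difference (pile and speed values chosen arbitrarily):

```python
import math

pile, speed = 7, 3
# floor division drops the final partial hour
assert pile // speed == 2
# true division plus ceil counts the partial hour
assert math.ceil(pile / speed) == 3
# ceil of an already-floored value cannot recover the lost hour
assert math.ceil(pile // speed) == 2
```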
Solution 1: Binary Search - Binary Search/Optimization Search Min Max
def minEatingSpeed(self, piles: List[int], h: int) -> int:
import math
# Binary Search: Optimization Search
# Iterative function that checks if middle element is valid
# using some constraint, in this case under h minimum hours,
# and depending on result searches the left or right subspace
# but also includes the middle element if it was valid.
# The iterative function calls continue for O(log n) on
# smaller and smaller range determining the time complexity,
# and since we are using pointers there is no call stack
# leaving a space complexity of O(1)
# Banana Abstraction:
# - k = minimum eating speed such that koko is able to eat all bananas
# within the h hours deadline
# - Piles = groups of bananas, piles vary in size, only positive integers
# - h = hours deadline, which is at least the number of piles: len(piles) <= h
# Setting Up Search Space:
# - Left Speed: 1 banana per hour (slowest possible)
# - Right speed: max(piles) as no pile would need more than max(piles) eating per hour
left, right = 1, max(piles)
# Invariant:
# Min k eating speed is within the interval [left, right]
# Search space halves each iteration
# Helper:
# Finds the h number of hours needed for some k eating speed to finish piles
# overall tc: iterate over n piles O(n)
def hours_needed(speed):
# time counter for candidate k
total_hours = 0
# Koko iterates over piles to eat
# tc: iterate over n piles O(n)
for pile in piles:
# Round up time:
# We cannot stop eating in middle of hour, example:
# p = 3 bananas, k = 9 eating speed, bananas / eating speed,
# will still take 1 full hour, so we must round the result up
total_hours += math.ceil(pile / speed)
# total h hours needed for k eating speed for all piles
return total_hours
# Binary Search 2:
# Optimization Search Space: '<'
# Exits when left == right, as this means the answer has been optimized
# tc: binary search over speeds 1 -> max(piles) speeds O(log(max(piles)))
while left < right:
# Middle Target:
mid = (left + right) // 2
# Hours needed for candidate k eating speed
required_hours = hours_needed(mid)
# Candidate is valid eating speed (below h hours)
# Discard faster speeds on right side [mid, right]
# Search left side of previous space, and include mid => [left, mid]
if required_hours <= h:
right = mid
# Candidate is not valid eating speed (above h hours)
# Discard slower speeds on left side [left, mid]
# Search right side of previous space, and exclude mid => [mid+1, right]
else:
left = mid + 1
# Optimization Exit:
# left == right, answer has been optimized
# The space of valid speeds, [left, right], has been reduced to one speed
# This final speed is the optimized speed
# overall: tc O(n * log(max(piles)))
# overall: sc O(1)
return left
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Outer Binary Search | O(log(max(piles))) | O(1) | Binary search over range of possible speeds from 1 to max pile size O(log(max(piles))) | No additional memory allocation for binary search O(1) |
| Hours calculation | O(n) | O(1) | Iterate over all piles to calculate if piles are eaten O(n) | No additional memory allocation for calculation O(1) |
| Overall | O(n * log(max(piles))) | O(1) | Binary search over range alongside hours calculation per candidate dominates, leading to O(n * log(max(piles))) | No additional memory allocation O(1) |
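The optimization search can be run standalone against the example inputs above; this sketch (function name illustrative) uses integer ceiling division, (p + k - 1) // k, to avoid float rounding on large piles:

```python
from typing import List

def min_eating_speed(piles: List[int], h: int) -> int:
    # speeds below 1 are invalid; no pile needs more than max(piles)/hour
    left, right = 1, max(piles)
    while left < right:
        mid = (left + right) // 2
        # integer ceiling: hours to clear each pile at speed mid
        hours = sum((p + mid - 1) // mid for p in piles)
        if hours <= h:
            right = mid       # feasible: keep mid in the search space
        else:
            left = mid + 1    # too slow: discard mid
    return left
```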
Solution 2: [Greedy] [Networking] Networking Router Analogy, Greedy Heuristic Hybrid Target Optimization Monotonic Search with Exponential Speed Doubling - Binary Search/Optimization Search Min Max
def minEatingSpeed(self, piles: List[int], h: int) -> int:
# README.md
# Koko eats bananas is a cleverly disguised networking problem!
# This problem is a system design exercise regarding networking packets:
# -----------------------------------------------------
# Network Router Analogy:
# Router = Koko
# Incoming Data Packets = Banana Piles
# Data In Packet To Be Process = Bananas In Pile
# Maximum Allowed Latency = h deadline
# Throughput = k eating speed
# -----------------------------------------------------
# Greedy Choosing K Speed Start:
# - Start with the highest reasonable lower bound to speed up search efficiently
# - Different system perspectives for lower bound, use at least 3 perspectives
# - Use the networking analogy to determine the perspectives
# Review Networking Analogy To Choose Perspectives:
# - Packets (piles)
# - Data size (bananas in pile)
# - Time Constraint (h)
# A perspective should take into account at least two or all three
# (packets, data size, and time) in order to generate a good lower bound k
# 3 Perspectives:
#
# 1. Throughput/Global Bandwidth
# - Total bananas being processed (total data to be processed)
#
# 2. Quick Packet Bursts
# - Speed based on smallest pile
# - To avoid having excessively large speed
# - Smallest Pile Consideration With Fair Time Allocation
#
# 3. Bottleneck Prevention
# - Speed based on largest pile
# - Avoid having small speed that gets stuck on largest pile
# - Unfair Time Allocation:
# Give every pile 1 hour of time, and the rest of time to the largest pile
# (Largest Pile Consideration With Unfair Time Allocation)
# Why these 3 perspectives?
# - Determined the perspectives by constraints:
# 1. Throughput => Close to real answer in order to minimize search time
# 2. smallest packet => Must be lower bound and never overestimate
# 3. largest packet => avoid too low a lower bound
# - Computable in O(n) time in order to scalable = simple perspectives
# - Multiple heuristics to take into account different system perspectives = at least 3 perspectives
# Why not other perspectives?
# - Other perspectives only take into account 1 aspect of the analogy
# (packets, data size, time constraint), and are thus not good lower bounds
#
# 1. Average between all piles
# - Does not consider the time constraint, assumes infinite time
#
# 2. Max pile divided over h hours
# - Does not factor rest of pile size, if max pile is much larger,
# we would waste k speed on the smaller piles
# Overall Pros:
# Can outperform binary search in practice for some input with high k speeds.
# Overall Cons:
# Lacks worst case guarantees, binary search has O(log n) worst case,
# this does not have a worst case guarantee
# ------------------------
# Perspective 1: Total Data Volume + Required Processing/Throughput
# - Compute total workload (total bananas)
# - Determine minimum average processing rate (throughput) needed to finish within h hours
# - Provides a baseline lower bound for the eating speed
# Throughput Constraint Summary:
# - Minimum average speed needed to finish all bananas within the h hour deadline
# Solution Idea:
# Spread all bananas evenly over h available hours
# All bananas / hours -> baseline average k speed
# Lower Bound:
# No matter how time is split between small and large piles,
# if k is less than totalBananas/h, the bananas cannot all be finished
# Creates Small k:
# Underestimates when bananas are skewed into a few large piles
# by assuming bananas are even across piles
totalBananas = sum(piles)
speedCheck1 = math.ceil(totalBananas / h)
# ------------------------
# Perspective 2: Smallest Packet Processing + Even Time
# - Assume time is evenly split among all piles
# - Focus on the smallest pile
# - Determine a minimum speed to finish the smallest pile using evenly split time
# - Prevents overestimating speed due to small piles, giving a realistic lower bound
# Quick Packet Bursts Constraint Summary:
# - The minimum speed needed to finish the smallest pile while splitting the time evenly between piles
# Solution Idea:
# Part 1: Split Time Evenly Between Piles (Fair time Sharing) = Hour Per Pile Based Average,
# Part 2: Minimum Speed to Finish Smallest Pile using Split Even Time = Minimum Speed Needed For Fair Time Sharing
# Lower Bound:
# If each pile gets equal time, find the minimum speed
# required to finish the smallest pile within its even time share.
# Since k is based on the smallest pile, which does not need much time,
# this produces a small k:
numPiles = len(piles)
splittingTimeEvenly = int(h/numPiles)
smallestPile = min(piles)
speedCheck2 = math.ceil(smallestPile / splittingTimeEvenly)
# ------------------------
# Perspective 3: Largest Pile + Bottleneck Time Simulation
# - Assume bottleneck scenario, most time is spent on largest pile
# - Focus on the largest pile
# - Ensures chosen speed processes largest pile under h deadline
# - Provides another lower bound to prevent underestimating speed
# Bottleneck Constraint Summary:
# - Spend the majority of time on the largest pile, to simulate a bottleneck
# and determine a speed that will work within that constraint
# Solution Idea:
# Part 1: Simulate a bottleneck, by assuming we will spend little time on smaller piles, giving them all 1 hour
# Part 2: Simulate a bottleneck, by assuming we will spend most of our time on the largest pile, giving it the rest of the time
# Lower Bound:
# As the largest pile gets most of the hours to finish,
# the k eating speed can be lower,
# producing a small k:
remainingTime = h - (numPiles - 1)
speedCheck3 = math.ceil(max(piles) / remainingTime)
# ------------
# Greedy Starting Speed k: highest lower bound
# - We have created 3 lower bounds between the 3 perspectives
# - We will pick the highest lower bound between the 3
# - Avoids wasting time on speeds below k
speed = max(speedCheck1, speedCheck2, speedCheck3)
# -----------------------------------------------------
# Greedy Aggressive Monotonic Time Search Optimization:
# - If the total time needed for k eating speed is double the h deadline,
# we will aggressively increase k by doubling it
# - If total eating time less than double the h deadline,
# we will increment speed k slowly by 1.
# -----------------------------------------------------
# Invariant:
# min valid speed k lies in [speed, +inf)
# speed is monotonically increased during search
# the first valid speed found is returned as the min
# (the doubling step assumes eating time roughly halves when speed doubles;
# ceiling rounding makes this approximate, so doubling only fires while total time >= 2h)
# tc doubling x2: exponential doubling O(log max(piles))
# tc iterating +1: linear scan O(max(piles))
# overall: worst case linear scan O(max(piles))
while (True):
# time counter for candidate k
smallKTotalTime = 0
# Koko iterates over piles to eat
# tc: iterate over n piles O(n)
for pile in piles:
# Round up time:
# We cannot stop eating in middle of hour, example:
# p = 3 bananas, k = 9 eating speed, bananas / eating speed,
# will still take 1 full hour, so we must round the result up
smallKTotalTime += math.ceil(pile / speed)
# Greedy Aggressive Monotonic Time Search Optimization:
# Check: if valid k speed found
# Implies: we found the first valid k, which is returned as the min
if smallKTotalTime <= h:
return speed
# Check: if total eating time is at least double the deadline h
# Implies: k is at most half the required speed
# Then: aggressively double k speed,
# tc: search space halves each iteration [speed*2, +inf]: O(log n)
elif smallKTotalTime >= 2 * h:
speed *= 2
# Check: if total eating time is below double the deadline h
# Implies: k is close to its min speed
# Then: iterate by 1 until we find the first valid k speed
# tc: search space decreases by 1 each iteration [speed+1, +inf]: O(n)
else:
speed += 1
# overall: tc best: O(n * log(max(piles))), worst: O(n * max(piles))
# overall: sc O(1)
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
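For contrast with the greedy scan above, the standard formulation is a plain binary search over the candidate speed range [1, max(piles)], which keeps the O(log max(piles)) worst case. A minimal sketch (the function name min_eating_speed is mine):

```python
import math

def min_eating_speed(piles, h):
    # Binary search over candidate speeds [1, max(piles)]
    left, right = 1, max(piles)
    while left < right:
        mid = (left + right) // 2
        # Total hours needed at speed mid; each pile rounds up to a full hour
        hours = sum(math.ceil(p / mid) for p in piles)
        if hours <= h:
            right = mid      # mid is valid, min speed is at or below mid
        else:
            left = mid + 1   # mid too slow, min speed is above mid
    return left
```

Unlike the doubling/increment scan, this needs no lower-bound heuristics to be fast: the search space halves every iteration regardless of input shape.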
153. Find Minimum in Rotated Sorted Array ::1:: - Medium
Topics: Array, Binary Search
Intro
Suppose an array of length n sorted in ascending order is rotated between 1 and n times. For example, the array nums = [0,1,2,4,5,6,7] might become: [4,5,6,7,0,1,2] if it was rotated 4 times. [0,1,2,4,5,6,7] if it was rotated 7 times. Notice that rotating an array [a[0], a[1], a[2], ..., a[n-1]] 1 time results in the array [a[n-1], a[0], a[1], a[2], ..., a[n-2]]. Given the sorted rotated array nums of unique elements, return the minimum element of this array. You must write an algorithm that runs in O(log n) time.
| Example Input | Output |
|---|---|
| nums = [3,4,5,1,2] | 1 |
| nums = [4,5,6,7,0,1,2] | 0 |
| nums = [11,13,15,17] | 11 |
Constraints:
n == nums.length
1 ≤ n ≤ 5000
-5000 ≤ nums[i] ≤ 5000
All of the integers of nums are unique
nums is sorted and rotated between 1 and n times
Abstraction
Find the smallest number in an array that has been rotated some number of times.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Bug | Error |
|---|---|
Brute Force:
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
Find the Bug:
Solution 1: Find Min by Determining if Smaller Numbers on Left or Right of Middle Binary Search - Binary Search/Condition Adapted Binary Search
def findMin(self, nums: List[int]) -> int:
# Property of rotated sorted array:
# At some point, there will be an element that is splitting the array
# ------------------------
# T/F Function: compare mid to right
# Example 1:
# arr = [4, 5, 6, 7, 0, 1, 2]
# mid < right => 7 < 2 ==> false ==> smaller nums on right half of mid (excluding mid)
# Example 2:
# arr = [10, 1, 2, 3, 6, 7, 8]
# mid < right => 3 < 8 ==> true ==> smaller nums on left half of mid (including mid)
# ------------------------
# Invariant:
# The target min value is always in the search space [left, right]
# At each iteration we discard half of search space
# Defining search space
left, right = 0, len(nums) - 1
# Check: if array is in original correct order
if nums[left] < nums[right]:
return nums[left]
# Binary Search 2:
# Optimization Search Space: '<'
# Exits when left == right, as this means the answer has been optimized
# tc: binary search over n elements O(log n)
while left < right:
# Middle Target:
mid = (left + right) // 2
# Mid is less than right
# Smaller elements are on the left of mid
# Discard higher elements on right side [mid, right]
# Include lower elements on left side, include lower mid element [left, mid]
if nums[mid] < nums[right]:
right = mid
# Mid is greater than right
# Smaller elements are on the right of mid
# Discard higher elements on left side with mid [left, mid]
# Include lower elements on right side, exclude larger mid element [mid+1, right]
else:
left = mid+1
# Optimization Loop Termination:
# left == right
# The space of smaller elements, [left, right], has been reduced to one element
# This final element is the minimum element
# overall: tc O(log n)
# overall: sc O(1)
return nums[left]
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Binary Search Loop | O(log n) | O(1) | Binary search iteration halves search space due to sorted and rotated property O(log n) | No additional memory allocation O(1) |
| No explicit target | O(1) | O(1) | We are optimizing loop to converge to min O(1) | No additional memory allocation O(1) |
| Overall | O(log n) | O(1) | Binary search over array of n length dominates, leading to O(log n) | No additional memory allocation for binary search |
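The solution above can be exercised as a standalone function against the problem's examples; a minimal sketch (the snake_case name find_min is mine):

```python
def find_min(nums):
    # Binary search converging on the rotation point, which holds the minimum
    left, right = 0, len(nums) - 1
    while left < right:
        mid = (left + right) // 2
        if nums[mid] < nums[right]:
            right = mid      # min is at mid or to its left
        else:
            left = mid + 1   # min is strictly to the right of mid
    return nums[left]
```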
33. Search in Rotated Sorted Array ::1:: - Medium
Topics: Array, Binary Search
Intro
There is an integer array nums sorted in ascending order (with distinct values). Prior to being passed to your function, nums is possibly rotated at an unknown pivot index k (1 ≤ k < nums.length) such that the resulting array is [nums[k], nums[k+1], ..., nums[n-1], nums[0], nums[1], ..., nums[k-1]] (0-indexed). For example, [0,1,2,4,5,6,7] might be rotated at pivot index 3 and become [4,5,6,7,0,1,2]. Given the array nums after the possible rotation and an integer target, return the index of target if it is in nums, or -1 if it is not in nums. You must write an algorithm with O(log n) runtime complexity.
| Example Input | Output |
|---|---|
| nums = [4,5,6,7,0,1,2], target = 0 | 4 |
| nums = [4,5,6,7,0,1,2], target = 3 | -1 |
| nums = [1], target = 0 | -1 |
Constraints:
1 ≤ nums.length ≤ 5000
-10^4 ≤ nums[i] ≤ 10^4
All values of nums are unique.
nums is an ascending array that is possibly rotated.
-10^4 ≤ target ≤ 10^4
Abstraction
Find a target in an array that has been rotated some number of times.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Bug | Error |
|---|---|
Brute Force:
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
Find the Bug: Returned Index Instead of Value
def findMin(self, nums: List[int]) -> int:
left, right = 0, len(nums)-1
while left < right:
mid = (left+right)//2
if nums[mid] < nums[right]:
right = mid
else:
left = mid + 1
# INCORRECT:
# returning index of min
# instead of actual min value
# Instead:
# return nums[left]
return left
Solution 1: Inequality Boundaries Discard Mid Binary Search - Binary Search/Condition Adapted Binary Search
def search(self, nums: List[int], target: int) -> int:
# Property of rotated sorted array:
# At some point, there will be an element that is splitting the array
# Invariant:
# target, if exists, is in interval [left, right]
# interval [left, right] is always either
# a) entirely sorted
# b) contains a rotation
# search space halves each iteration
# Defining search space
left, right = 0, len(nums) - 1
# Binary Search 1:
# Target Binary Search Space: '<='
# tc: binary search over list n length O(log n)
while left <= right:
# Current Search Space:
# [left, mid, right]
# Middle Target:
mid = (left + right) // 2
# Base Case: target found
if nums[mid] == target:
return mid
# Check: if mid less than or equal to right
# Implies: right half of search space is sorted
# Implies: we have a guaranteed ascending range of: [mid, right]
if nums[mid] <= nums[right]:
# Check: if target within ascending range of [mid, right]
# Implies: target is within right section
# Discard left side of search space: [left, mid]
# Search right side of previous space,
# exclude middle: [mid+1, right]
if nums[mid] < target <= nums[right]:
left = mid + 1
# Check: if target not within ascending range of [mid, right]
# Implies: target is not within range (mid, right]
# Discard right side of search space: [mid, right]
# Search left side of previous space
# exclude middle: [left, mid-1]
else:
right = mid - 1
# Check: if mid is greater than right
# Implies: left half of search space is sorted
# Implies: we have a guaranteed ascending range of [left, mid]
else:
# Check: if target is within ascending range of [left, mid]
# Implies: target is within left section
# Discard right side of search space: [mid, right]
# Search left side of previous space,
# exclude middle: [left, mid-1]
if nums[left] <= target < nums[mid]:
right = mid - 1
# Check: if target is not within ascending range of [left, mid]
# Implies: target is within right section
# Discard left side of search space: [left, mid]
# Search right side of previous space,
# exclude middle: [mid+1, right]
else:
left = mid + 1
# End Case: out of search space, target not found
# overall: tc O(log n)
# overall: sc O(1)
return -1
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| Binary search loop | O(log n) | O(1) | Search space halved each iteration O(log n) | No additional memory allocation for iterative binary search O(1) |
| One target comparison | O(1) | O(1) | Mid comparison in constant O(1) | No additional memory allocation for comparison O(1) |
| Overall | O(log n) | O(1) | Binary search over n elements dominates, leading to O(log n) | No additional memory allocation O(1) |
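An alternative to the single-pass inequality-boundary search is a two-stage approach: first locate the rotation pivot with the findMin logic from problem 153, then run a plain binary search in whichever sorted segment could contain the target. A sketch under that decomposition (the name search_rotated and the use of bisect are mine):

```python
import bisect

def search_rotated(nums, target):
    # Stage 1: locate the rotation pivot (index of the minimum) in O(log n)
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if nums[mid] < nums[hi]:
            hi = mid
        else:
            lo = mid + 1
    pivot = lo
    # Stage 2: pick the sorted segment whose value range could hold target
    if nums[pivot] <= target <= nums[-1]:
        seg, offset = nums[pivot:], pivot
    else:
        seg, offset = nums[:pivot], 0
    # Stage 3: plain binary search within the chosen segment
    i = bisect.bisect_left(seg, target)
    return offset + i if i < len(seg) and seg[i] == target else -1
```

Both stages are O(log n), so the total is still O(log n); the slicing is only for clarity and could be replaced by index arithmetic to keep O(1) space.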
981. Time Based Key Value Store ::2:: - Medium
Topics: Hash Table, String, Binary Search, Design
Intro
Design a time based key value data structure that can store multiple values for the same key at different time stamps and retrieve the key's value at a certain timestamp. Implement the TimeMap class: TimeMap(): Initializes the object of the data structure. void set(String key, String value, int timestamp): Stores the key key with the value value at the given time timestamp. String get(String key, int timestamp): Returns a value such that set was called previously, with timestamp_prev ≤ timestamp. If there are multiple such values, it returns the value associated with the largest timestamp_prev. If there are no values, it returns "".
| Example Input | Output |
|---|---|
| ["TimeMap", "set", "get", "get", "set", "get", "get"] [[], ["foo", "bar", 1], ["foo", 1], ["foo", 3], ["foo", "bar2", 4], ["foo", 4], ["foo", 5]] | [null, null, "bar", "bar", null, "bar2", "bar2"] |
Constraints:
1 ≤ key.length, value.length ≤ 100
key and value consist of lowercase English letters and digits.
1 ≤ timestamp ≤ 10^7
All the timestamps timestamp of set are strictly increasing.
At most 2 * 10^5 calls will be made to set and get.
Abstraction
Design a key value store where each key holds multiple timestamped values; a get returns the value with the largest timestamp at or before the query timestamp, or '' if none exists.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Bug | Error |
|---|---|
Brute Force:
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
Find the Bug:
Solution 1: Reverse Linear Scan - Linear Scan/Baseline
class TimeMap:
# TimeMap:
# We need to track, per key, every (value, timestamp) pair that was set
# Note:
# Stores multiple (value, timestamp) tuples per key in insertion order
# The 'get' function performs a reverse linear scan to find
# the latest time <= the query timestamp.
# 'get' can be slow if many timestamps per key O(m)
# sc: n is the total set calls across all keys O(n)
def __init__(self):
# key -> [(value, timestamp)]
self.store = {}
# overall: tc O(1) amortized
def set(self, key: str, value: str, timestamp: int) -> None:
# append to key list: (value, timestamp)
if key not in self.store:
self.store[key] = []
self.store[key].append((value, timestamp))
# Reverse linear scan:
# overall: tc where m is number of timestamps for this key O(m)
def get(self, key: str, timestamp: int) -> str:
# if key list does not exist
if key not in self.store:
return ''
# grab key list: [(value, timestamp)]
values = self.store[key]
# outer boundaries
i = len(values) - 1
# Reverse linear scan:
# grab first valid highest timestamp <= query
while 0 <= i and values[i][1] > timestamp:
i -= 1
# not found case: index -1
# or
# timestamp is now <= query
return values[i][0] if i != -1 else ''
# overall: tc O(m)
# overall: sc O(n)
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| set | O(1) | O(1) | Amortized constant time append O(1) | One tuple stored per call |
| get | O(m) | O(1) | Linear scan backward through at most m timestamps | No additional memory allocation |
| Overall | O(m) | O(n) | get scaling linearly with timestamps count dominates, leading to O(m) | Storage is proportional to total number of sets O(n) |
Solution 2: Upper Ceiling to Find First Invalid TimeStamp - Binary Search/Upper Ceiling or Lower Floor Trick Based on Target Binary Search
class TimeMap:
# Note:
# Upper Ceiling Trick:
# We cannot directly optimize for 'timestamp <= target' via this binary search.
# Instead, optimize for the condition 'timestamp > target' (strictly greater than).
# The search finds the index of the first timestamp greater than the target.
# After, we shift the pointer by one (left - 1) to get the largest timestamp <= target.
# Note:
# Stores (value, timestamp) tuples per key in insertion order (sorted by timestamp)
# 'get' uses binary search to find the largest timestamp <= the query timestamp.
# 'get' more efficient O(log m)
# space complexity: n is the total set calls across all keys O(n)
def __init__(self):
# key -> [(value, timestamp)]
self.store = {}
# overall: time complexity amortized O(1)
def set(self, key: str, value: str, timestamp: int) -> None:
# append to key list: (value, timestamp)
if key not in self.store:
self.store[key] = []
self.store[key].append((value, timestamp))
# overall: time complexity where m is number of timestamps for this key O(log m)
def get(self, key: str, timestamp: int) -> str:
# Precondition:
# self.store[key] contains (value, timestamp) pairs sorted ascending by timestamp
# timestamp is an integer >= 1
# Postcondition:
# returns value with largest timestamp <= given timestamp
# returns '' if none exists
# Empty Check
if key not in self.store:
return ''
values = self.store[key]
# Upper Ceiling Binary Search Shift Trick:
# In order to find: highest timestamp <= query
# First find: the first timestamp > query
# Then shift index back by one: highest timestamp <= query
# outer boundaries
# intentionally start right at len(values)
left, right = 0, len(values)
# Loop invariant:
# 1. 0 <= left <= right <= len(values)
# 2. forall i :: 0 <= i < left ==> values[i][1] <= timestamp
# 3. forall j :: right <= j < len(values) ==> values[j][1] > timestamp
# Interpretation:
# all indices before 'left' are valid timestamps <= query
# all indices after 'right' are invalid timestamps > query
# When loop exits (left == right)
# left points to first invalid timestamp > query (if any)
# left-1 points to largest valid timestamp <= query or -1 if none exists
# time complexity: binary search over time stamps O(log(m))
while left < right:
mid = (left + right) // 2
# If mid timestamp is greater than query -> candidate for first timestamp > query
if values[mid][1] > timestamp:
right = mid
# If mid timestamp is less than or equal to query -> valid, search right half
else:
left = mid + 1
# Found smallest timestamp > query:
# If we shift index by 1 -> largest timestamp <= query:
# Or ''
return values[left - 1][0] if left != 0 else ''
# overall: time complexity O(log m)
# overall: space complexity O(n)
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
| set | O(1) | O(1) | Amortized constant time append O(1) | One tuple stored per call O(1) |
| get | O(log m) | O(1) | Binary search over m timestamps O(log m) | No additional memory allocation for binary search O(1) |
| Overall | O(log m) | O(n) | Binary search retrieval dominates, leading to O(log m) | Storage proportional to total number of sets O(n) |
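The upper-ceiling trick in Solution 2 is exactly what the standard library's bisect_right computes: the index of the first element strictly greater than the query. A sketch delegating the search to bisect (the class name TimeMapBisect and the parallel-lists layout are mine):

```python
import bisect

class TimeMapBisect:
    def __init__(self):
        # key -> ([timestamps], [values]) as parallel lists;
        # timestamps stay sorted because set calls arrive in increasing order
        self.store = {}

    def set(self, key, value, timestamp):
        ts, vals = self.store.setdefault(key, ([], []))
        ts.append(timestamp)
        vals.append(value)

    def get(self, key, timestamp):
        if key not in self.store:
            return ''
        ts, vals = self.store[key]
        # first index with ts > query, i.e. the "upper ceiling"
        i = bisect.bisect_right(ts, timestamp)
        # shift back by one to land on the largest timestamp <= query
        return vals[i - 1] if i else ''
```

Keeping timestamps in their own list lets bisect compare plain integers instead of tuples.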
Solution 3: Floor Search Tracking Largest Valid Index With Right Pointer - Binary Search/Upper Ceiling or Lower Floor Trick Based on Target Binary Search
class TimeMap:
def __init__(self):
self.mp = {}
def set(self, key: str, value: str, timestamp: int) -> None:
# put a key: val -> to a timestamp
item = (timestamp, value)
if key not in self.mp:
self.mp[key] = []
self.mp[key].append(item)
def get(self, key: str, timestamp: int) -> str:
# Precondition:
# self.mp[key] contains sorted (timestamp, value) pairs by ascending timestamp
# timestamp is an int >= 1
# Postcondition:
# returns value with largest timestamp <= given timestamp
# returns empty string if no such timestamp exists
# Empty Case
if key not in self.mp:
return ''
values = self.mp[key]
# Edge Case:
# If timestamp >= end of array timestamp, return last value
if timestamp >= values[-1][0]:
return values[-1][1]
left, right = 0, len(values)-1
# Loop invariant:
# 1. 0 <= left <= right + 1 <= len(values)
# 2. All values before 'left' have timestamp <= query timestamp
# 3. All values after 'right' have timestamp > query timestamp
# Ensures:
# When left > right, right points to
# the largest timestamp <= query or -1 if none exists
while left <= right:
mid = (left + right) // 2
time, val = values[mid]
# found target
if time == timestamp:
return val
# if time larger than query: reduce right side
elif time > timestamp:
right = mid - 1
# if time smaller than query: reduce left side
else:
left = mid + 1
# refer to postcondition
return values[right][1] if right != -1 else ''
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
162. Find Peak Element ::1:: - Medium
Topics: Array, Binary Search
Intro
A peak element is an element that is strictly greater than its neighbors. Given a 0-indexed integer array nums, find a peak element, and return its index. If the array contains multiple peaks, return the index to any of the peaks. You may imagine that nums[-1] = nums[n] = -inf. In other words, an element is always considered to be strictly greater than a neighbor that is outside the array. You must write an algorithm that runs in O(log n) time.
| Example Input | Output |
|---|---|
| nums = [1,2,3,1] | 2 |
| nums = [1,2,1,3,5,6,4] | 5 |
Constraints:
1 ≤ nums.length ≤ 1000
-2^31 ≤ nums[i] ≤ 2^31 - 1
nums[i] != nums[i + 1] for all valid i.
Abstraction
Find a peak element in an array, where a peak is an element strictly greater than both of its neighbors. Multiple peaks can exist per array.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Bug | Error |
|---|---|
Brute Force:
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
Find the Bug:
Solution 1: Modified Binary Search over Towards Higher Slope - Binary Search/Condition Adapted Binary Search
def findPeakElement(self, nums: List[int]) -> int:
# Precondition:
# len(nums) >= 1
# nums[i] != nums[i+1] for all valid i
# Postcondition:
# returns index i such that nums[i] is a peak:
# nums[i] > nums[i-1] (if i > 0)
# nums[i] > nums[i+1] (if i < n-1)
left, right = 0, len(nums) - 1
# Invariant:
# At least one peak exists within interval [left, right]
# Half of search space removed per iteration
# Binary Search 2:
# Optimization Search Space: '<'
# Exits when left == right, as this mean answer has been optimized
# tc: binary search over n elements O(log n)
while left < right:
# Current Search Space:
# [left, mid, right]
# Middle Target:
mid = (left + right) // 2
# Check: if mid less than its following element
# Implies: peak is on the right side of mid
# Discard lower elements on left side [left, mid]
# Search right side of previous space,
# and exclude mid: [mid+1, right]
if nums[mid] < nums[mid + 1]:
left = mid + 1
# Check: if mid greater than its following element
# Implies: peak is on left side of mid
# Discard lower elements on right side [mid+1, right]
# Search left side of previous space,
# and include mid: [left, mid]
else:
right = mid
# Optimization Exit:
# left == right, answer has been optimized
# The space of larger elements, [left, right],
# has been reduced to one element
# This final element is a peak
# overall: tc O(log n)
# overall: sc O(1)
return left
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
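The slope-climbing search above as a standalone function, checked against the problem's examples; a minimal sketch (the snake_case name find_peak is mine):

```python
def find_peak(nums):
    # Walk toward the rising slope; at least one peak lies in [left, right]
    left, right = 0, len(nums) - 1
    while left < right:
        mid = (left + right) // 2
        if nums[mid] < nums[mid + 1]:
            left = mid + 1   # rising slope: a peak exists to the right
        else:
            right = mid      # falling slope: mid itself may be the peak
    return left
```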
4. Median of Two Sorted Arrays ::2:: - Hard
Topics: Array, Binary Search, Divide and Conquer
Intro
Given two sorted arrays nums1 and nums2 of size m and n respectively, return the median of the two sorted arrays. The overall run time complexity should be O(log (m+n)).
| Example Input | Output |
|---|---|
| nums1 = [1,3], nums2 = [2] | 2.00000 |
| nums1 = [1,2], nums2 = [3,4] | 2.50000 |
Constraints:
nums1.length == m
nums2.length == n
0 ≤ m ≤ 1000
0 ≤ n ≤ 1000
1 ≤ m + n ≤ 2000
-10^6 ≤ nums1[i], nums2[i] ≤ 10^6
Abstraction
Find median between two sorted arrays.
Space & Time Complexity
| Solution | Time Complexity | Space Complexity | Time Remark | Space Remark |
|---|---|---|---|---|
| Bug | Error |
|---|---|
Brute Force:
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
Find the Bug:
Solution 1: Two Pointer Merge Until Median - Binary Search/Condition Adapted Binary Search
def findMedianSortedArrays(self, nums1, nums2):
# Precondition:
# nums1 and nums2 are sorted in non-decreasing order
# 0 <= len(nums1), len(nums2) <= 1000
# Postcondition:
# Returns the median value of the merged sorted array
# If merged length is odd: median is the middle value
# If merged length is even: median is the average of the two middle values
# Lengths of the two arrays
m, n = len(nums1), len(nums2)
# Pointers to the current index in nums1 and nums2
p1, p2 = 0, 0
# Helper function:
# simulating one step of merge by returning the smallest
# of the current elements pointed to by p1 and p2
def get_min():
nonlocal p1, p2
# Case 1: Both pointers in bounds
# Return the smaller element and advance the pointer
if p1 < m and p2 < n:
# p1 smaller
if nums1[p1] < nums2[p2]:
ans = nums1[p1]
p1 += 1
# p2 smaller
else:
ans = nums2[p2]
p2 += 1
# Case 2: One of the pointers is out of range
# Return the smaller element in the other pointer and advance
# p2 out of bounds
elif p2 == n:
ans = nums1[p1]
p1 += 1
# p1 out of bounds
else:
ans = nums2[p2]
p2 += 1
# current smallest value between p1 and p2
return ans
# Invariant:
# At call i to get_min() (0 based indexing):
# Returns (i+1)th smallest element
# After i calls to get_min():
# i smallest elements have been discarded
# Note:
# Instead of merging the arrays fully,
# simulate merge only up to the median index.
# Note:
# For length k = m + n:
# If Odd: median is the k//2 element
# k = 5 -> k // 2 = 2 (3rd smallest)
# If Even: median is average of (k//2)-1 and k//2
# k = 6 -> (k // 2)-1 = 2 k//2 = 3 (3rd and 4th smallest)
# For odd length merged array
if (m+n)%2 == 1:
# Discard the first k//2 smallest elements
for i in range((m + n) // 2):
get_min()
# Return the k//2 smallest element
return get_min()
# For even length merged array
else:
# Discard the first (k//2 - 1) elements
for i in range((m + n) // 2 - 1):
get_min()
# Grab (k//2)-1
# Grab (k//2)
# Return average
return (get_min() + get_min()) / 2
# overall: time complexity O(m + n)
# overall: space complexity O(1)
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
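The two-pointer merge above can also be expressed with the standard library: heapq.merge lazily merges the sorted inputs, which makes a compact O(m + n) baseline that is handy as a testing oracle for the binary-search solution. A sketch (the name median_via_merge is mine):

```python
import heapq

def median_via_merge(nums1, nums2):
    # Merge both sorted arrays (O(m + n)) and index the middle directly
    merged = list(heapq.merge(nums1, nums2))
    k = len(merged)
    if k % 2 == 1:
        return float(merged[k // 2])
    return (merged[k // 2 - 1] + merged[k // 2]) / 2
```

Materializing the full merge costs O(m + n) space, unlike the pointer version above, so this is for cross-checking rather than meeting the problem's bounds.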
Solution 2: Binary Search Partition - Binary Search/Condition Adapted Binary Search
def findMedianSortedArrays(self, nums1, nums2):
# --------------------------------------------------
# Median of merged sorted array:
# - If Odd: return the middle value = [o o * o o]
# - If Even: return average of two middle values = [o * * o]
# nums1 = [1, 3, 8]
# nums2 = [7, 9, 10, 11]
# [1, 3, 7, 8, 9, 10, 11]
# odd Length => median = 8,
# --------------------------------------------------
# Partition Step:
# nums1: [1, 3 | 8] => left max = 3, right min = 8
# nums2: [7 | 9, 10, 11] => left max = 7, right min = 9
# left: [1, 3, 7]
# right: [8, 9, 10, 11]
# --------------------------------------------------
# The Final Partition Must Pass These Conditions:
#
# - left side has equal or more elements than right side:
# odd: left side has 1 more element than right
# even: left side has same number of elements as right
#
# - every element on left is <= than every element on the right
#
# The Final Partition Will Point To The Median:
# - Odd total length, median = max(left side)
# - Even total length, median = (max(left side) + min(right side)) / 2
# Final Partition:
# nums1: [1, 3, 8 | ] => left max = 8, right min = -
# nums2: [7 | 9, 10, 11] => left max = 7, right min = 9
# odd Length => median = 8,
# --------------------------------------------------
# Invariant:
# At each iteration, we maintain a partition (cut point) in nums1 (partitionX)
# and an implicit partition (cut point) in nums2 (partitionY) such that:
#
# 1. Left num count >= Right num count
#
# 2. max of nums1 left part <= min of nums2 right part
# & max of nums2 left part <= min of nums1 right part
#
# 3. therefore: all left elements <= all right elements
# --------------------------------------------------
# Ensures: nums1 is the smaller or equal array
if len(nums1) > len(nums2):
nums1, nums2 = nums2, nums1
# Len of two arrays
num1Len, num2Len = len(nums1), len(nums2)
# Preparing for nums2 determination
totalLen = num1Len + num2Len
leftSize = (totalLen + 1) // 2 # how many elements should be on the left
# Search space for nums1 cut point: [0, len(nums1)]
nums1Left, nums1Right = 0, num1Len
# Binary Search 1:
# Target Binary Search Space: '<='
# Each iteration is a 'guess' in smaller array that partitions it,
# this 'guess' implicitly creates a second 'guess' in the larger array as well
while nums1Left <= nums1Right:
# Current Search Space nums1 Cut:
# [low, mid, high]
# nums1 Cut:
# guess cut point in nums1, and compute cut point in nums2 accordingly
nums1Guess = (nums1Left + nums1Right) // 2
# Invariant 1: Left num count >= Right num count
#
# - nums1 cut grabs a certain number of elements from nums1 and assigns them to the
# left or right side
#
# - this determines how many elements we need from nums2 for the left and right side
# to keep invariant 1 true
# Examples:
# nums1 = [1, 3, 8] # num1Len = 3
# nums2 = [7, 9, 10, 11] # num2Len = 4
# nums1Guess = 1
# leftSize = (3+4+1)//2 = 4 -> nums2Determined = 4 - 1 = 3
nums2Determined = leftSize - nums1Guess
# Grab values for partitions
# Use sentinel values so the invariant check
# (num1MaxLeft <= num2MinRight and num2MaxLeft <= num1MinRight)
# still works when a cut lands at an array boundary
# If the left part of nums1 is empty (partitionX == 0), then the maximum of an empty set should not restrict the comparison.
# Setting num1MaxLeft = -inf ensures the condition num1MaxLeft <= num2MinRight is always satisfied in this case.
num1MaxLeft = float('-inf') if nums1Guess == 0 else nums1[nums1Guess - 1]
# If the right part of nums1 is empty (partitionX == m), then there are no values to compare in the right half.
# Setting num1MinRight = +inf ensures the condition num2MaxLeft <= num1MinRight is always satisfied in this case.
num1MinRight = float('inf') if nums1Guess == num1Len else nums1[nums1Guess]
# If the left part of nums2 is empty (partitionY == 0), then the maximum of an empty set should not restrict the comparison.
# Setting num2MaxLeft = -inf ensures the condition num2MaxLeft <= num1MinRight is always satisfied in this case.
num2MaxLeft = float('-inf') if nums2Determined == 0 else nums2[nums2Determined - 1]
# If the right part of nums2 is empty (partitionY == n), then the minimum of an empty set should not restrict the comparison.
# Setting num2MinRight = inf ensures the condition num1MaxLeft <= num2MinRight is always satisfied in this case.
num2MinRight = float('inf') if nums2Determined == num2Len else nums2[nums2Determined]
# Discard Mid via Partitions Rule:
# max of left partition nums1 < min of right partition of nums2
# max of left partition nums2 < min of right partition of nums1
# we have successfully split 'merged' array into halves
if num1MaxLeft <= num2MinRight and num2MaxLeft <= num1MinRight:
# Even Merged Array
if totalLen % 2 == 0:
return (max(num1MaxLeft, num2MaxLeft) + min(num1MinRight, num2MinRight)) / 2
# Odd Merged Array:
else:
return max(num1MaxLeft, num2MaxLeft)
# Continue Split Monotonic Increasing Shift
# If num1MaxLeft > num2MinRight:
# Means largest element on the left side of nums1 partition is greater
# than the smallest element on the right side of nums2 partition
# Implies: our partitionX is too far right -> too many elements from nums1
# are in the left half
# Condition Breaks Invariant: num1MaxLeft <= num2MinRight
# Discard right half of [low, partitionX, high]
# new search interval: [low, partitionX - 1]
elif num1MaxLeft > num2MinRight:
nums1Right = nums1Guess - 1
# If num2MaxLeft > num1MinRight
# Means largest element on the left side of nums2 partition is greater
# than the smallest element on the right side of nums1 partition
# Implies: our partitionX is too far left -> not enough elements from nums1
# are in the left half
# Condition Breaks Invariant: num2MaxLeft <= num1MinRight
# Discard left half of [low, partitionX, high]
# new search interval: [partitionX + 1, high]
elif num2MaxLeft > num1MinRight:
nums1Left = nums1Guess + 1
# overall: time complexity O(log (min(m, n)))
# overall: space complexity O(1)
| Aspect | Time Complexity | Space Complexity | Time Remarks | Space Remarks |
|---|---|---|---|---|
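The partition search above, restated compactly so it can be run against the examples (the names find_median_sorted, left_size, and the single-letter locals are mine; logic mirrors Solution 2):

```python
def find_median_sorted(a, b):
    # Ensure a is the shorter array so the cut search is O(log min(m, n))
    if len(a) > len(b):
        a, b = b, a
    m, n = len(a), len(b)
    left_size = (m + n + 1) // 2
    lo, hi = 0, m
    while lo <= hi:
        i = (lo + hi) // 2          # elements of a assigned to the left half
        j = left_size - i           # elements of b assigned to the left half
        # Sentinels stand in for empty sides of a cut
        a_left = a[i - 1] if i > 0 else float('-inf')
        a_right = a[i] if i < m else float('inf')
        b_left = b[j - 1] if j > 0 else float('-inf')
        b_right = b[j] if j < n else float('inf')
        if a_left <= b_right and b_left <= a_right:
            if (m + n) % 2:
                return float(max(a_left, b_left))
            return (max(a_left, b_left) + min(a_right, b_right)) / 2
        if a_left > b_right:
            hi = i - 1              # cut in a is too far right
        else:
            lo = i + 1              # cut in a is too far left
```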