Problem Description
Given a positive integer n, you can apply one of the following operations:
- If n is even, replace n with n / 2.
- If n is odd, replace n with either n + 1 or n - 1.

Return the minimum number of operations needed for n to become 1.
Example 1:
Input: n = 8
Output: 3
Explanation: 8 -> 4 -> 2 -> 1
Example 2:
Input: n = 7
Output: 4
Explanation: 7 -> 8 -> 4 -> 2 -> 1 or 7 -> 6 -> 3 -> 2 -> 1
Example 3:
Input: n = 4
Output: 2
Constraints:
1 <= n <= 2^31 - 1
Difficulty: Medium
Tags: dynamic programming, greedy, bit manipulation, memoization
Rating: 73.74%
Approach 1: Greedy with Bit Manipulation
The first solution uses a greedy approach combined with bit manipulation. The insight here is that we want to minimize the number of 1 bits in our binary representation whenever possible, as each 1 bit will require additional operations.
def integerReplacement(n: int) -> int:
    operations = 0
    while n != 1:
        if n % 2 == 0:
            n = n // 2
        else:
            # Special case for 3: subtracting reaches 1 in two steps
            if n == 3:
                n = n - 1
            # For other odd numbers, check the second-to-last bit
            elif (n & 2) == 2:  # last two bits are '11'
                n = n + 1
            else:               # last two bits are '01'
                n = n - 1
        operations += 1
    return operations
How the Bit Manipulation Works
Let’s understand why we look at the second-to-last bit for odd numbers:
- When n is odd, its binary form ends with ‘1’
- If the last two bits are ‘11’:
  - Adding 1 gives us ‘100’, creating trailing zeros we can divide away
  - Subtracting 1 gives us ‘10’, which still has a 1 bit to deal with
- If the last two bits are ‘01’:
  - Adding 1 gives us ‘10’, which still has one 1 bit
  - Subtracting 1 gives us ‘00’, which is better
For example:
- 15 (1111) → 16 (10000) is better than 14 (1110)
- 5 (101) → 4 (100) is better than 6 (110)
The special case for n = 3 (11) exists because going to 2 (10) is better than going to 4 (100) in this specific case.
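These bit patterns are easy to check directly in Python. The sketch below is an illustration only (not part of either solution): for a few odd numbers it prints both choices and counts their set bits with `bin(x).count('1')`:

```python
# For each odd n, compare the set-bit counts of n + 1 and n - 1.
for n in (15, 7, 5, 3):
    up, down = n + 1, n - 1
    print(f"n={n:2d} ({bin(n)}): "
          f"+1 -> {bin(up)} with {bin(up).count('1')} one-bit(s), "
          f"-1 -> {bin(down)} with {bin(down).count('1')} one-bit(s)")
```

For 15 and 7 (ending in ‘11’), adding 1 leaves a single set bit; for 5 (ending in ‘01’), subtracting 1 does.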
Approach 2: Dynamic Programming with Memoization
Our second approach uses dynamic programming to cache intermediate results, preventing redundant calculations for numbers we’ve already processed.
def integerReplacement(n: int) -> int:
    cache = {}

    def helper(num):
        if num == 1:
            return 0
        if num in cache:
            return cache[num]
        if num % 2 == 0:
            result = 1 + helper(num // 2)
        else:
            result = 1 + min(helper(num + 1), helper(num - 1))
        cache[num] = result
        return result

    return helper(n)
How the DP Solution Works
The dynamic programming solution breaks down the problem into smaller subproblems:
- For even numbers: We only have one choice (divide by 2)
- For odd numbers: We try both options (add 1 or subtract 1) and take the minimum
- We cache results to avoid recalculating the same values
- The base case is when we reach 1, requiring 0 additional operations
This approach is particularly useful for:
- Understanding the optimal solution for each number
- Avoiding repeated calculations
- Handling complex cases automatically
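Because the memoized DP tries both choices exhaustively, it serves as a convenient oracle for the greedy rule. The sketch below repeats both approaches (under hypothetical snake_case names, so it runs standalone) and checks that they agree for every n up to 10,000:

```python
def integer_replacement_greedy(n: int) -> int:
    """Greedy: for odd n, subtract when n == 3 or the last two bits are '01'."""
    ops = 0
    while n != 1:
        if n % 2 == 0:
            n //= 2
        elif n == 3 or (n & 2) == 0:
            n -= 1
        else:
            n += 1
        ops += 1
    return ops

def integer_replacement_dp(n: int) -> int:
    """Memoized recursion: try both choices for odd numbers."""
    cache = {1: 0}
    def helper(num: int) -> int:
        if num not in cache:
            if num % 2 == 0:
                cache[num] = 1 + helper(num // 2)
            else:
                cache[num] = 1 + min(helper(num + 1), helper(num - 1))
        return cache[num]
    return helper(n)

# The greedy rule should match the exhaustive DP answer everywhere.
assert all(integer_replacement_greedy(n) == integer_replacement_dp(n)
           for n in range(1, 10_001))
print("greedy and DP agree on 1..10000")
```

This kind of cross-check is a cheap way to gain confidence in a greedy argument before proving it.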
Complexity Analysis
Greedy Approach
- Time Complexity: O(log n). We divide by 2 on every even step, and each add or subtract is immediately followed by a division, so the loop runs a number of times proportional to the bit length of n.
- Space Complexity: O(1). We only use a constant amount of extra space.
DP Approach
- Time Complexity: O(log n). Each odd number branches into two recursive calls, but the subproblems at any recursion depth cluster around n / 2^depth, so memoization keeps the number of distinct values computed logarithmic in n.
- Space Complexity: O(log n). The cache and the recursion stack each hold O(log n) entries.