#46 - Permutations
- Difficulty: Medium
- Topics: Array, Backtracking
- Link: https://leetcode.com/problems/permutations/
Problem Description
Given an array nums of distinct integers, return all the possible permutations. You can return the answer in any order.
Example 1:
Input: nums = [1,2,3]
Output: [[1,2,3],[1,3,2],[2,1,3],[2,3,1],[3,1,2],[3,2,1]]
Example 2:
Input: nums = [0,1]
Output: [[0,1],[1,0]]
Example 3:
Input: nums = [1]
Output: [[1]]
Constraints:
- 1 <= nums.length <= 6
- -10 <= nums[i] <= 10
- All the integers of nums are unique.
Solution
Permutations - Comprehensive Solution
1. Problem Deconstruction
Rewrite the Problem
Technical Version: Given an array nums of n distinct integers, where 1 ≤ n ≤ 6 and -10 ≤ nums[i] ≤ 10, compute the complete set of all possible permutations, where each permutation is an ordered n-tuple containing every element of nums exactly once. Formally, compute {σ(nums) : σ ∈ S_n}, where S_n is the symmetric group on n elements. The output must be a collection of arrays (or equivalent data structure) containing all n! distinct permutations, without any specific ordering requirement.
Beginner Version: You have a list of different numbers. Your task is to find every possible way to rearrange these numbers into a different order. For example, if you have [1,2,3], you need to produce all arrangements like [1,2,3], [1,3,2], [2,1,3], [2,3,1], [3,1,2], and [3,2,1]. You need to make sure you don’t miss any arrangement and don’t repeat any arrangement. The numbers are all different, so you won’t have to worry about duplicates like [1,1,2].
Mathematical Version: Let A = {a_1, a_2, …, a_n} be a set of n distinct integers. Find the set P = {(a_σ(1), a_σ(2), …, a_σ(n)) : σ ∈ S_n}, where S_n is the symmetric group on {1, …, n} (the group of all bijections from {1, …, n} to itself). In simpler terms: every possible ordering of the n elements appears exactly once. The cardinality of the solution set is |P| = n!.
Constraint Analysis
1 ≤ nums.length ≤ 6
- This constraint fundamentally changes the algorithmic approach we can consider. Since n ≤ 6, even a factorial-time algorithm like brute-force recursion (O(n · n!)) is completely acceptable. For n = 6, we generate only 720 permutations, which is trivial for modern computers.
- The upper bound of 6 eliminates concerns about stack overflow in recursive solutions (maximum recursion depth is 6).
- This constraint allows us to consider algorithms with factorial time complexity without worrying about performance. In contrast, if n could be 20, then 20! ≈ 2.4 × 10^18 permutations would be impossible to compute.
-10 ≤ nums[i] ≤ 10
- This small integer range has several implications:
- Memory efficiency: Small integers can be stored efficiently (often in 4 bytes or less).
- No overflow concerns: Operations on these values won’t cause integer overflow in any language.
- Possible optimization: Since values are bounded, we could theoretically use bitmasking or other encoding schemes, though with distinct values, this isn’t particularly beneficial.
- Edge cases: The range includes negative numbers, so algorithms shouldn’t assume non-negative values for indexing purposes.
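The bitmasking idea can be made concrete by masking indices rather than values, which sidesteps the negative-number concern entirely. A sketch (the name `permute_bitmask` is illustrative, not part of the problem):

```python
def permute_bitmask(nums):
    """Backtracking where the visited set is an integer bitmask over indices."""
    n = len(nums)
    result, current = [], []

    def backtrack(used):
        if len(current) == n:
            result.append(current[:])
            return
        for i in range(n):
            if not used & (1 << i):         # index i is still free
                current.append(nums[i])
                backtrack(used | (1 << i))  # mark index i as taken
                current.pop()               # undo the choice

    backtrack(0)
    return result
```

Because the mask covers indices 0..n-1, values like -10 never enter the mask, and with n ≤ 6 the mask fits comfortably in a machine word.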
All integers of nums are unique
- This is the most critical constraint that simplifies the problem dramatically:
- No duplicate permutations: We don’t need deduplication logic, which would otherwise add significant complexity.
- Simpler backtracking: In backtracking algorithms, we don’t need to track which specific instance of a duplicate value was used.
- Simpler swapping: In swap-based approaches, swapping identical values would create identical permutations, but this isn’t a concern here.
- Counting is straightforward: The number of permutations is exactly n!, not fewer due to duplicates.
- Algorithm selection: We can use simpler permutation generation algorithms that assume distinct elements.
Hidden edge cases implied by constraints:
- Single-element array (n = 1): The only permutation is the array itself.
- Maximum-size array (n = 6): The algorithm must handle 720 permutations efficiently.
- Arrays with negative numbers: Must be handled correctly without assuming the values can be used as non-negative indices.
- The small bound on n means space complexity is less critical than with larger inputs, as the output itself occupies O(n · n!) space anyway.
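The factorial growth these observations rely on is easy to check directly:

```python
import math

# Number of permutations for each allowed input size (n! grows as 1, 2, 6, 24, 120, 720)
for n in range(1, 7):
    print(n, math.factorial(n))

# By contrast, n = 20 would already be hopeless to enumerate:
print(math.factorial(20))  # 2432902008176640000, about 2.4e18
```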
2. Intuition Scaffolding
Three Analogies
1. Real-World Metaphor: Seating Arrangements Imagine you have n distinct people (named by their numbers) and n distinct chairs in a row. A permutation corresponds to a specific seating arrangement where each person gets exactly one chair. Generating all permutations is like trying every possible seating arrangement. You start by placing one person in the first chair, then for the second chair, you choose from the remaining people, and so on. This directly maps to the recursive backtracking approach.
2. Gaming Analogy: Inventory Rearrangement Think of a role-playing game where you have n unique items and n inventory slots. You want to try every possible arrangement of these items to see which looks best or works best for quick access. Starting with an empty arrangement, you pick an item for the first slot, then another for the second, continuing until all slots are filled. Each complete arrangement is a permutation. The “undo” operation (backtracking) is like putting an item back into your pool to try a different one.
3. Math Analogy: Function Mapping Consider a set A = {a_1, …, a_n}. A permutation is a bijective function f: {1, …, n} → A. Generating all permutations means enumerating all possible bijections. This is equivalent to filling an n-length sequence where position i gets f(i), with the constraint that f is injective (no repeats). The number of such functions is the number of ways to arrange n distinct objects: n · (n-1) · (n-2) ⋯ 1 = n!.
Common Pitfalls Section
1. The “Generate All Arrays and Filter” Fallacy What: Attempting to generate all possible arrays of length n using elements from the input, then filtering to keep only those with all distinct elements. Why it fails: For n = 6, this generates 6^6 = 46656 candidates instead of the 720 valid permutations, wasting enormous computation. The time complexity would be O(n^n · n), which grows much faster than O(n · n!).
2. The “Global Tracking Without Backtracking” Error What: Using a single global array to build permutations but forgetting to “undo” choices when returning from recursion. Why it fails: This leads to permutations that contain the same element multiple times or incomplete permutations, as the global state becomes corrupted across recursive calls.
3. The “Swap and Forget to Swap Back” Mistake What: In iterative swap-based approaches (like Heap’s algorithm), forgetting to swap elements back after recursion or iteration, leading to incorrect future permutations. Why it fails: The array becomes permanently modified, preventing exploration of all possible permutations.
4. The “Assume Sorted Input for Ordering” Assumption What: Assuming the output must be in lexicographic order when the problem explicitly states “any order.” Why it’s problematic: While not wrong, it adds unnecessary complexity. Some algorithms naturally produce lexicographic order (like next_permutation), while others don’t (like backtracking). Imposing ordering when not required can complicate implementation.
5. The “Recursive Depth Equal to Array Length” Oversight What: Not realizing that recursion depth equals the number of elements, which for n ≤ 6 is trivial but would be problematic for larger n. Why it’s worth noting: While safe here due to n ≤ 6, this pattern wouldn’t scale to n in the thousands due to stack overflow, so it’s good to acknowledge the limitation.
6. The “Mutable vs Immutable Confusion” What: Modifying the input array directly without creating copies, then accidentally returning references to the same array for multiple permutations. Why it fails: All entries in the result would point to the same array object, which gets modified, resulting in all permutations being identical (the last permutation generated).
3. Approach Encyclopedia
Approach 1: Recursive Backtracking with Visited Tracking
Concept: Build permutations incrementally by choosing one unused element at each step, marking it as used, recursing to build the rest, then unmarking it (backtracking) to try other possibilities.
Pseudocode:
```
function generate_permutations(nums):
    result = empty list
    current = empty list
    visited = boolean array of size n, all false

    function backtrack():
        if current.length == nums.length:
            result.append(copy of current)
            return
        for i from 0 to nums.length - 1:
            if not visited[i]:
                // Choose
                visited[i] = true
                current.append(nums[i])
                // Explore
                backtrack()
                // Unchoose (backtrack)
                current.remove_last()
                visited[i] = false

    backtrack()
    return result
```
Complexity Proof:
- Time: O(n · n!). The recursion tree has n! leaves, and each leaf requires O(n) work to copy the completed permutation. The internal nodes total Σ (k = 0 to n-1) n!/k! ≈ e · n!, so the overall cost is O(n · n!).
- Space: O(n) for the recursion stack + O(n) for the current permutation + O(n) for the visited array = O(n). The output occupies O(n · n!) space, which is counted separately from the algorithm's working space.
Visualization for nums = [1,2,3]:
```
Level 0: []
  Choose 1: [1]
    Choose 2: [1,2]
      Choose 3: [1,2,3] ✓
    Choose 3: [1,3]
      Choose 2: [1,3,2] ✓
  Choose 2: [2]
    Choose 1: [2,1]
      Choose 3: [2,1,3] ✓
    Choose 3: [2,3]
      Choose 1: [2,3,1] ✓
  Choose 3: [3]
    Choose 1: [3,1]
      Choose 2: [3,1,2] ✓
    Choose 2: [3,2]
      Choose 1: [3,2,1] ✓
```
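The pseudocode above translates almost line for line into Python; a minimal, self-contained sketch (the name `permute_visited` is illustrative):

```python
def permute_visited(nums):
    """Approach 1: backtracking with a boolean visited array."""
    result, current = [], []
    visited = [False] * len(nums)

    def backtrack():
        if len(current) == len(nums):
            result.append(current[:])  # append a copy, not a reference
            return
        for i in range(len(nums)):
            if not visited[i]:
                visited[i] = True       # choose
                current.append(nums[i])
                backtrack()             # explore
                current.pop()           # unchoose (backtrack)
                visited[i] = False

    backtrack()
    return result
```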
Approach 2: Backtracking by Swapping (Reduced Space)
Concept: Generate permutations in-place by swapping elements. The first k elements form the current partial permutation, and we swap elements to generate new possibilities.
Pseudocode:
```
function generate_permutations(nums):
    result = empty list

    function backtrack(start):
        if start == nums.length:
            result.append(copy of nums)
            return
        for i from start to nums.length - 1:
            swap(nums[start], nums[i])
            backtrack(start + 1)
            swap(nums[start], nums[i])  // backtrack

    backtrack(0)
    return result
```
Complexity Proof:
- Time: O(n · n!), same as Approach 1. Each leaf creates a copy (O(n)), and there are n! leaves.
- Space: O(n) for the recursion stack only (no visited array or separate current list). The swaps happen in place.
Visualization for nums = [1,2,3], showing array state at each node:
```
Start: [1,2,3]
  swap(0,0): [1,2,3]
    swap(1,1): [1,2,3]
      swap(2,2): [1,2,3] ✓
    swap(1,2): [1,3,2]
      swap(2,2): [1,3,2] ✓
  swap(0,1): [2,1,3]
    swap(1,1): [2,1,3]
      swap(2,2): [2,1,3] ✓
    swap(1,2): [2,3,1]
      swap(2,2): [2,3,1] ✓
  swap(0,2): [3,2,1]
    swap(1,1): [3,2,1]
      swap(2,2): [3,2,1] ✓
    swap(1,2): [3,1,2]
      swap(2,2): [3,1,2] ✓
```
Approach 3: Heap’s Algorithm (Iterative)
Concept: An efficient algorithm that generates each permutation from the previous one by a single swap. It minimizes swaps and can be implemented iteratively.
Pseudocode:
```
function generate_permutations(nums):
    result = [copy of nums]
    n = nums.length
    c = array of size n, all zeros  // counter array
    i = 1
    while i < n:
        if c[i] < i:
            if i is even:
                swap(nums[0], nums[i])
            else:
                swap(nums[c[i]], nums[i])
            result.append(copy of nums)
            c[i] += 1
            i = 1
        else:
            c[i] = 0
            i += 1
    return result
```
Complexity Proof:
- Time: O(n · n!). The algorithm generates all n! permutations, each derived from the previous one by a single O(1) swap, plus O(n) per permutation for copying into the result.
- Space: O(n) for the counter array. Beyond that, the algorithm uses minimal extra space.
Visualization for nums = [1,2,3]:
```
Initial: [1,2,3], c = [0,0,0], i = 1
c[1]=0 < 1, i odd → swap(nums[c[1]], nums[1]) = swap(nums[0], nums[1]): [2,1,3]
  add [2,1,3]; c[1]=1, i=1
c[1]=1 < 1? No → c[1]=0, i=2
c[2]=0 < 2, i even → swap(nums[0], nums[2]): [3,1,2]
  add [3,1,2]; c[2]=1, i=1
c[1]=0 < 1, i odd → swap(nums[0], nums[1]): [1,3,2]
  add [1,3,2]; c[1]=1, i=1
c[1]=1 < 1? No → c[1]=0, i=2
c[2]=1 < 2, i even → swap(nums[0], nums[2]): [2,3,1]
  add [2,3,1]; c[2]=2, i=1
c[1]=0 < 1, i odd → swap(nums[0], nums[1]): [3,2,1]
  add [3,2,1]; c[1]=1, i=1
c[1]=1 < 1? No → c[1]=0, i=2
c[2]=2 < 2? No → c[2]=0, i=3
i=3 >= n → terminate
```
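A direct Python transcription of the pseudocode (the name `permute_heaps` is illustrative); for [1,2,3] it emits the six permutations in exactly the order traced above:

```python
def permute_heaps(nums):
    """Heap's algorithm, iterative form driven by a counter array."""
    nums = nums[:]           # work on a copy; the caller's list is untouched
    result = [nums[:]]
    n = len(nums)
    c = [0] * n              # c[i] counts the swaps performed at level i
    i = 1
    while i < n:
        if c[i] < i:
            if i % 2 == 0:
                nums[0], nums[i] = nums[i], nums[0]
            else:
                nums[c[i]], nums[i] = nums[i], nums[c[i]]
            result.append(nums[:])
            c[i] += 1
            i = 1
        else:
            c[i] = 0
            i += 1
    return result
```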
Approach 4: Lexicographic Ordering via Next Permutation
Concept: Start with the lexicographically smallest permutation (sorted array), then repeatedly generate the next lexicographic permutation until we cycle back to the beginning.
Next Permutation Algorithm:
1. Find the largest index `i` such that `nums[i] < nums[i+1]` (the rightmost position where the sequence increases).
2. Find the largest index `j` such that `nums[j] > nums[i]`.
3. Swap `nums[i]` and `nums[j]`.
4. Reverse the suffix starting at `i+1`.
Pseudocode:
```
function generate_permutations(nums):
    result = []
    nums = sort(nums)  // start from the smallest permutation
    result.append(copy of nums)
    while true:
        // Find the rightmost i with nums[i] < nums[i+1]
        i = nums.length - 2
        while i >= 0 and nums[i] >= nums[i+1]:
            i -= 1
        if i < 0:
            break  // all permutations generated
        // Find the rightmost j with nums[j] > nums[i]
        j = nums.length - 1
        while nums[j] <= nums[i]:
            j -= 1
        swap(nums[i], nums[j])
        // Reverse the suffix starting at i+1
        left = i + 1
        right = nums.length - 1
        while left < right:
            swap(nums[left], nums[right])
            left += 1
            right -= 1
        result.append(copy of nums)
    return result
```
Complexity Proof:
- Time: O(n · n!). Generating each next permutation takes O(n) time (finding the two indices plus reversing the suffix), and we generate and copy n! permutations.
- Space: O(1) extra space (excluding the output).
Visualization for nums = [1,2,3]:
```
Sorted: [1,2,3]
1. [1,2,3]: i=1 (2<3), j=2 (3>2), swap → [1,3,2], reverse suffix (2) → [1,3,2]
2. [1,3,2]: i=0 (1<3), j=2 (2>1), swap → [2,3,1], reverse suffix (3,1) → [2,1,3]
3. [2,1,3]: i=1 (1<3), j=2 (3>1), swap → [2,3,1], reverse suffix (1) → [2,3,1]
4. [2,3,1]: i=0 (2<3), j=1 (1>2? No; 3>2), swap → [3,2,1], reverse suffix (2,1) → [3,1,2]
5. [3,1,2]: i=1 (1<2), j=2 (2>1), swap → [3,2,1], reverse suffix (1) → [3,2,1]
6. [3,2,1]: i<0, break
```
Approach 5: Using Python’s itertools.permutations (Built-in)
Concept: Leverage Python’s standard library function which is implemented in C and highly optimized.
Code:

```python
import itertools

def generate_permutations(nums):
    return list(itertools.permutations(nums))  # list of tuples

# Or convert to a list of lists:
# return [list(p) for p in itertools.permutations(nums)]
```
Complexity Proof:
- Time: O(n · n!). The C implementation is efficient but still must generate all n! permutations.
- Space: O(n) for the iterator state (excluding the output).
Note: While correct, this approach is typically not accepted in coding interviews where the goal is to demonstrate algorithmic understanding.
4. Code Deep Dive
Optimal Solution: Backtracking by Swapping (Approach 2)
This approach balances simplicity, efficiency, and clarity. It uses only O(n) extra space (the recursion stack) and has a clean recursive structure.
```python
class Solution:
    def permute(self, nums):
        """
        Generate all permutations of distinct integers.

        Args:
            nums: List[int] - Distinct integers (length 1-6)

        Returns:
            List[List[int]] - All permutations in any order
        """
        # Result list to store all permutations
        result = []

        def backtrack(start):
            """
            Recursive backtracking that generates permutations
            by swapping elements in place.

            Args:
                start: int - Current position in the array to fix
            """
            # Base case: when start reaches the end, we have a complete permutation
            if start == len(nums):
                # IMPORTANT: append a copy, not a reference to nums,
                # since nums is being modified in place
                result.append(nums[:])
                return

            # Try every remaining element at position 'start'
            for i in range(start, len(nums)):
                # Place nums[i] at position start by swapping
                nums[start], nums[i] = nums[i], nums[start]
                # Recursively generate permutations for the remaining positions
                backtrack(start + 1)
                # Backtrack: restore the original order to try other possibilities
                nums[start], nums[i] = nums[i], nums[start]

        # Start the recursive generation from position 0
        backtrack(0)
        return result
```
Code Walkthrough
- `class Solution` / `def permute`: standard LeetCode solution format; the docstring documents the inputs, outputs, and constraints.
- `result = []`: initialize the empty list that will eventually contain all n! permutations.
- `def backtrack(start)`: inner recursive function. At recursion depth `start`, positions `0` through `start - 1` are already fixed.
- Base case: when `start == len(nums)`, all positions are fixed, so we have a complete permutation.
- `result.append(nums[:])`: critical detail - we append a copy of `nums` using slice notation. If we appended `nums` directly, every entry in `result` would reference the same mutable list, and all of them would end up identical (the last permutation generated).
- `for i in range(start, len(nums)):`: loop over the indices from `start` to the end. At recursion level `start`, we are deciding which element goes at position `start`.
- The first `nums[start], nums[i] = nums[i], nums[start]`: swap to place `nums[i]` at position `start`. This is the “choose” step.
- `backtrack(start + 1)`: recursively generate permutations for the remaining positions (`start + 1` to the end).
- The second swap undoes the first, restoring the array to its state before the recursive call so that other elements can be tried at position `start`.
- `backtrack(0)`: initiate the recursive process starting at position 0.
- `return result`: return the complete list of permutations.
Edge Case Handling
Example 1: nums = [1,2,3]
- The algorithm generates all 6 permutations through recursive swapping.
- The order within `result` may vary with the swapping order, but every permutation will be present.
- The base case triggers when `start == 3`, capturing each complete permutation.
Example 2: nums = [0,1]
- Initial call: `backtrack(0)`
  - i=0: swap(0,0) → [0,1], `backtrack(1)`
    - i=1: swap(1,1) → [0,1], `backtrack(2)` → append [0,1]
  - i=1: swap(0,1) → [1,0], `backtrack(1)`
    - i=1: swap(1,1) → [1,0], `backtrack(2)` → append [1,0]
- Returns `[[0,1], [1,0]]`
Example 3: nums = [1]
- Initial call: `backtrack(0)`
  - i=0: swap(0,0) → [1], `backtrack(1)` → append [1]
- Returns `[[1]]`
Single Element Array:
- The for loop runs once (`range(0, 1)`), swaps the element with itself, recurses to the base case, and adds the single permutation.
Maximum Size Array (n=6):
- Recursion depth is 6, well within Python’s default recursion limit (1000).
- Generates 720 permutations, each of length 6.
- The swapping approach is efficient as it avoids creating intermediate lists for each partial permutation.
Negative Numbers: Works correctly as the algorithm only swaps values, never uses them as indices.
5. Complexity War Room
Hardware-Aware Analysis
Memory Footprint Analysis:
- For n = 6, output size: 720 permutations × 6 ints × ~28 bytes ≈ 121 KB (small CPython ints are ~28 bytes each).
- Recursion stack: 6 frames × ~200 bytes/frame ≈ 1.2 KB.
- Working array: 6 elements × 28 bytes = 168 bytes.
- Total: ~122 KB, which easily fits in L2/L3 cache (typically 256 KB-8 MB).
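A back-of-envelope check of the payload figure above (the ~28-byte int size is a CPython implementation detail, not a language guarantee):

```python
# 720 permutations × 6 elements × ~28 bytes per small CPython int object
payload = 720 * 6 * 28
print(payload)  # 120960 bytes, roughly 121 KB before list overhead
```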
Performance Characteristics:
- Algorithm is CPU-bound, not memory-bound.
- With n = 6, the algorithm generates 720 permutations. Even at 1,000 operations per permutation, that is only 720,000 operations - trivial for modern CPUs (billions of operations per second).
- Recursion overhead is minimal due to the small depth.
- The swapping approach minimizes memory allocations compared to approaches that create new lists at each step.
Cache Efficiency:
- The `nums` array fits in a single cache line (6 × 8 = 48 bytes on 64-bit, assuming 8-byte pointers).
- The sequential access pattern in the loops has good spatial locality.
- The result list grows sequentially, benefiting from cache prefetching.
Industry Comparison Table
| Approach | Time Complexity | Space Complexity (excl. output) | Readability (1-10) | Interview Viability | Key Trade-offs |
|---|---|---|---|---|---|
| Recursive Backtracking with Visited | O(n·n!) | O(n) | 9 | ✅ Excellent | Clear logic, uses extra O(n) for visited array |
| Backtracking by Swapping | O(n·n!) | O(n) (recursion) | 8 | ✅ Recommended | Minimal extra memory, in-place manipulation |
| Heap's Algorithm (Iterative) | O(n·n!) | O(n) (counter array) | 6 | ✅ Good | Non-recursive, one swap per permutation |
| Lexicographic Next Permutation | O(n·n!) | O(1) | 7 | ✅ Good | Produces sorted output, no recursion |
| itertools.permutations | O(n·n!) | O(n) | 10 | ❌ Not in interviews | Too trivial, doesn't demonstrate understanding |
| Brute Force (generate all n^n arrays) | O(n^n·n) | O(n) | 5 | ❌ Poor | Exponentially worse than factorial |
Interview Viability Rationale:
- Backtracking by Swapping is ideal: demonstrates recursion, backtracking, in-place manipulation, and understanding of permutations.
- Lexicographic approach shows knowledge of algorithmic permutation generation.
- Heap’s algorithm demonstrates knowledge of specialized algorithms but is harder to derive during interview.
- Built-in functions fail to demonstrate algorithmic thinking.
6. Pro Mode Extras
Variants Section
Variant 1: Permutations with Duplicates
```python
def permuteUnique(nums):
    result = []
    nums.sort()  # sort to handle duplicates
    def backtrack(start):
        if start == len(nums):
            result.append(nums[:])
            return
        used = set()
        for i in range(start, len(nums)):
            if nums[i] in used:
                continue  # skip duplicate values at this level
            used.add(nums[i])
            nums[start], nums[i] = nums[i], nums[start]
            backtrack(start + 1)
            nums[start], nums[i] = nums[i], nums[start]
    backtrack(0)
    return result
```
Difference: Need to track used values at each level to avoid duplicate permutations from duplicate elements.
Variant 2: k-Permutations (Partial Permutations) Generate all permutations of length k from n elements:
```python
def k_permutations(nums, k):
    result = []
    def backtrack(path, used):
        if len(path) == k:
            result.append(path[:])
            return
        for i in range(len(nums)):
            if not used[i]:
                used[i] = True
                path.append(nums[i])
                backtrack(path, used)
                path.pop()
                used[i] = False
    backtrack([], [False] * len(nums))
    return result
```
Difference: Stop recursion when path reaches length k instead of n.
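For comparison, itertools.permutations accepts an optional length argument that produces the same k-permutations:

```python
import itertools

# All ordered length-2 selections from 3 elements: P(3, 2) = 3!/1! = 6
pairs = [list(p) for p in itertools.permutations([1, 2, 3], 2)]
print(pairs)  # [[1, 2], [1, 3], [2, 1], [2, 3], [3, 1], [3, 2]]
```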
Variant 3: Permutations in Lexicographic Order (Next Permutation as Generator)
```python
def next_permutation(nums):
    # Approach 4's step, in place; returns True if a next permutation exists.
    i = len(nums) - 2
    while i >= 0 and nums[i] >= nums[i + 1]:
        i -= 1
    if i < 0:
        return False
    j = len(nums) - 1
    while nums[j] <= nums[i]:
        j -= 1
    nums[i], nums[j] = nums[j], nums[i]
    nums[i + 1:] = reversed(nums[i + 1:])
    return True

def sorted_permutations(nums):
    nums = sorted(nums)
    result = [nums[:]]
    while next_permutation(nums):
        result.append(nums[:])
    return result
```
Difference: Guarantees lexicographic order, useful when order matters.
Variant 4: Permutations with Custom Constraint Example: Only include permutations where adjacent elements differ by more than 1:
```python
def constrained_permutations(nums):
    result = []
    def backtrack(start):
        if start == len(nums):
            # Check the constraint before adding
            valid = True
            for i in range(len(nums) - 1):
                if abs(nums[i] - nums[i + 1]) <= 1:
                    valid = False
                    break
            if valid:
                result.append(nums[:])
            return
        for i in range(start, len(nums)):
            nums[start], nums[i] = nums[i], nums[start]
            # Optional: prune early if the constraint is already violated
            if start == 0 or abs(nums[start] - nums[start - 1]) > 1:
                backtrack(start + 1)
            nums[start], nums[i] = nums[i], nums[start]
    backtrack(0)
    return result
```
Interview Cheat Sheet
Key Points to Mention:
- Complexity First: Always state time/space complexity upfront: “This generates all n! permutations, each requiring O(n) work, so O(n·n!) time and O(n) extra space.”
- Constraint Awareness: “Given n ≤ 6, n! ≤ 720, so factorial complexity is acceptable.”
- Distinct Elements: “Since elements are distinct, we don’t need deduplication logic.”
- Choice of Algorithm: “I’ll use backtracking with swapping as it’s space-efficient and demonstrates the recursive permutation generation clearly.”
Common Interview Questions:
- Q: “How would you handle duplicates?” A: “Sort the array and skip duplicate elements at each recursion level using a set or by comparing with previous element when sorted.”
- Q: “What if n could be much larger, say 100?” A: “We couldn’t generate all permutations as 100! is astronomically large. We’d need to sample or use a different approach depending on the problem.”
- Q: “Can you do it iteratively?” A: “Yes, using Heap’s algorithm or the next_permutation approach which uses O(1) extra space and no recursion.”
- Q: “How does this relate to combinatorial problems?” A: “This is the classic permutation problem which forms the basis for many combinatorial generation algorithms. The backtracking pattern is reusable for subset, combination, and other constraint satisfaction problems.”
Optimization Insights:
- Early pruning: If the problem has constraints, check them during recursion to avoid exploring dead ends.
- Memory: For very large n (not this problem), consider iterative approaches to avoid recursion depth limits.
- Parallelization: Permutation generation can be parallelized by dividing the initial choices among threads.
Testing Strategy:
- Test with n=1 (single element)
- Test with n=2 (minimum non-trivial case)
- Test with n=6 (maximum per constraints)
- Test with negative numbers
- Verify count: output should have exactly n! permutations
- Verify completeness: all elements appear in each permutation
- Verify uniqueness: no duplicate permutations (automatically satisfied with distinct inputs)
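This checklist can be automated; a sketch that cross-checks a compact copy of the swap-based solution (repeated here so the snippet stands alone) against itertools.permutations:

```python
import itertools
import math

def permute(nums):
    # Compact copy of the swap-based solution from section 4
    result = []
    def backtrack(start):
        if start == len(nums):
            result.append(nums[:])
            return
        for i in range(start, len(nums)):
            nums[start], nums[i] = nums[i], nums[start]
            backtrack(start + 1)
            nums[start], nums[i] = nums[i], nums[start]
    backtrack(0)
    return result

for case in [[1], [0, 1], [-3, 7], [1, 2, 3], [-10, -1, 0, 5, 9, 10]]:
    out = permute(case[:])
    assert len(out) == math.factorial(len(case))        # exact count: n!
    assert len({tuple(p) for p in out}) == len(out)     # no duplicate permutations
    assert all(sorted(p) == sorted(case) for p in out)  # each is a rearrangement
    assert {tuple(p) for p in out} == set(itertools.permutations(case))
print("all checks passed")
```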