
#137 - Single Number II

January 2, 2026

Single Number II

Problem Description

Given an integer array nums where every element appears three times except for one, which appears exactly once. Find the single element and return it.

You must implement a solution with a linear runtime complexity and use only constant extra space.

Example 1:

Input: nums = [2,2,3,2]
Output: 3

Example 2:

Input: nums = [0,1,0,1,0,1,99]
Output: 99

Constraints:

  • 1 <= nums.length <= 3 * 10^4
  • -2^31 <= nums[i] <= 2^31 - 1
  • Each element in nums appears exactly three times except for one element which appears once.

Solution

1. Problem Deconstruction

Rewrite the Problem

  1. Technical Version
    Given an array nums of 32-bit signed integers, each element appears exactly three times except for one unique element that appears exactly once. The task is to identify that unique element under the constraints of linear time complexity ( O(n) ) and constant extra space ( O(1) ). Formally, let ( f: \mathbb{Z} \to \mathbb{N} ) be the multiplicity function for the multiset ( S ) represented by nums. Then ( f(a) = 1 ) for some ( a ), and ( \forall x \neq a, f(x) = 3 ). The algorithm must compute ( a ) without sorting or using hash maps.

  2. Beginner Version
    You have a list of numbers. In this list, every number repeats three times, except for one number that shows up only once. Your goal is to find that single number. You must do this by looking at each number just once and without using extra memory proportional to the list size.

  3. Mathematical Version
    Let ( n = 3k + 1 ) and ( nums = [a_1, a_2, \dots, a_n] ). Define the bitwise representation of each number as a vector over ( \mathbb{F}_3 ) (the field of integers modulo 3). For each bit position ( i ) (0-indexed from the LSB), let ( c_i = \sum_{j=1}^{n} \text{bit}_i(a_j) ). Then ( c_i \equiv b_i \pmod{3} ), where ( b_i ) is the ( i )-th bit of the unique element. Thus, the unique element is ( \sum_{i=0}^{31} b_i \cdot 2^i ), with appropriate interpretation as a signed 32-bit integer.
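
    As a worked instance on Example 1 (nums = [2,2,3,2]): bit 0 is set only in 3, so ( c_0 = 1 \equiv 1 \pmod{3} ); bit 1 is set in all three 2's and in 3, so ( c_1 = 4 \equiv 1 \pmod{3} ); every higher bit has ( c_i = 0 ). Reconstructing gives ( 2^0 + 2^1 = 3 ), the unique element.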

Constraint Analysis

  • 1 <= nums.length <= 3 * 10^4
    This bounds the input size to at most 30,000 elements. An ( O(n^2) ) algorithm would approach ( 9 \times 10^8 ) operations, which is infeasible within typical time limits. The linear runtime requirement forces us to process each element a constant number of times. The lower bound of 1 ensures we always have at least one element, simplifying edge cases.

  • -2^31 <= nums[i] <= 2^31 - 1
    The numbers are 32-bit signed integers, requiring handling of negative values and the sign bit. This range precludes using the numbers directly as indices in a frequency array (which would require ( 2^{32} ) entries). Bitwise operations must consider two’s complement representation, but we can treat the numbers as 32-bit unsigned for bit counting and then convert to signed.

  • Each element appears exactly three times except for one
    This structural guarantee ensures the total count is ( 3k + 1 ). It allows mathematical approaches like summing bits modulo 3. Edge cases:

    • The unique element could be zero (all bits zero).
    • The unique element could be negative (sign bit set).
    • The elements are not grouped: the three copies of a value and the unique element can be interleaved anywhere in the array, so order cannot be relied upon without sorting.
    • Since all other elements appear exactly three times, any solution based on modulo 3 arithmetic will be exact.

Hidden Implications

  • Because of the sign bit, a naive bit-counting approach might produce a result interpreted as unsigned; we must convert back to signed 32-bit (see the sketch after this list).
  • The linear time constraint prohibits sorting or nested loops.
  • Constant extra space forbids hash tables, sets, or arrays proportional to ( n ) or the value range.
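
A minimal sketch of that signed conversion, assuming the 32 per-bit counts have already been reduced modulo 3 into a non-negative bit pattern (the helper name is illustrative, not part of the reference solution):

    def to_signed_32(pattern):
        # If bit 31 is set, the unsigned pattern encodes a negative number
        # in two's complement; shift it down into the signed 32-bit range.
        return pattern - 2**32 if pattern >= 2**31 else pattern

    print(to_signed_32(3))           # 3  (positive values pass through)
    print(to_signed_32(0xFFFFFFFE))  # -2 (all bits set except the LSB)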

2. Intuition Scaffolding

Analogies

  1. Real-World Metaphor (Library Books)
    Imagine a library with many copies of books. Each book has a barcode. Most barcodes appear on three identical copies, but one rare book has a unique barcode with only one copy. You have a scanner that reads barcodes and a special counter that only remembers counts modulo 3. As you scan, you update the modulo-3 count for each bit of the barcode. At the end, the bit positions whose modulo-3 count is 1 spell out the barcode of the rare book.

  2. Gaming Analogy (Inventory Management)
    In an RPG, monsters drop items. Common items drop in sets of three, but a legendary item drops alone. Your inventory has limited slots (constant space). You use a magic spell that combines every three identical items into a token, leaving the legendary item uncombined. This is like counting modulo 3: when three identical items are seen, they cancel out.

  3. Math Analogy (Vector Spaces over GF(3))
    View each number as a 32-dimensional vector over the finite field ( \mathbb{F}_3 ) (integers modulo 3). Adding all vectors (using component-wise addition modulo 3) yields the vector of the unique element because triples sum to zero modulo 3. This is equivalent to counting set bits modulo 3 per position.
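
To make the ( \mathbb{F}_3 ) view concrete, here is a minimal NumPy sketch (the function name is illustrative; it anticipates the bit-count approach in section 3):

    import numpy as np

    def single_number_gf3(nums):
        # Each number becomes a 32-component bit vector; Python's negative
        # integers sign-extend, so bits 0..31 give the two's-complement pattern.
        bits = np.array([[(x >> i) & 1 for i in range(32)] for x in nums])
        # Component-wise sum over F_3: every triple contributes 0 mod 3,
        # so the residual vector is the bit pattern of the unique element.
        residual = bits.sum(axis=0) % 3
        value = sum(int(b) << i for i, b in enumerate(residual))
        return value - 2**32 if value >= 2**31 else value

    print(single_number_gf3([2, 2, 3, 2]))            # 3
    print(single_number_gf3([0, 1, 0, 1, 0, 1, 99]))  # 99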

Common Pitfalls

  1. Using XOR naively – XOR cancels pairs (( x \oplus x = 0 )) but not triples (( x \oplus x \oplus x = x )), so XOR-ing all numbers yields the XOR of all distinct values, not the singleton (see the check after this list).
  2. Hash map for frequencies – Requires ( O(n) ) extra space, violating constant space.
  3. Sorting then scanning – ( O(n \log n) ) time, violates linear time.
  4. Summing all numbers and dividing – Sum = ( 3 \times \sum_{x \neq a} x + a ). Unknown sum of distinct numbers makes this intractable.
  5. Using a set to compute ( (3 \times \text{sum(set)} - \text{sum(nums)}) / 2 ) – The arithmetic is actually sound (the difference is always ( 2a ), so the division is exact), but building the set requires ( O(n) ) extra space, violating the constant-space constraint.
  6. Ignoring negative numbers in bit manipulation – Not converting back to signed 32-bit may return a large positive number for negative singletons.
  7. Misunderstanding the state machine – Incorrect transitions in the two-variable bitwise approach can lead to wrong results.
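
A quick check of pitfall 1 on Example 1, using only the standard library:

    from functools import reduce
    from operator import xor

    # XOR cancels pairs, not triples: 2 ^ 2 ^ 3 ^ 2 = 1, which is neither
    # the singleton (3) nor anything meaningful for this problem.
    print(reduce(xor, [2, 2, 3, 2]))  # 1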

3. Approach Encyclopedia

Approach 1: Brute Force (Nested Loops)

  • Idea: For each element, count its occurrences by scanning the entire array.
  • Pseudocode:
    def singleNumber(nums):
        for i in range(len(nums)):
            count = 0
            for j in range(len(nums)):
                if nums[j] == nums[i]:
                    count += 1
            if count == 1:
                return nums[i]
    
  • Complexity Proof:
    Time: ( \sum_{i=1}^n n = n^2 = O(n^2) ).
    Space: ( O(1) ) for counters.
  • Visualization:
    nums = [2,2,3,2]
    i=0: count 2's → 3 ✗
    i=1: count 2's → 3 ✗
    i=2: count 3's → 1 ✓ → return 3
    

Approach 2: Sorting

  • Idea: Sort the array, then scan in steps of 3; the singleton will break the triple pattern.
  • Pseudocode:
    def singleNumber(nums):
        nums.sort()
        n = len(nums)
        for i in range(0, n, 3):
            if i + 1 >= n or nums[i] != nums[i + 1]:
                return nums[i]
        # Should not reach here
    
  • Complexity Proof:
    Time: ( O(n \log n) ) for sorting + ( O(n) ) for scan = ( O(n \log n) ).
    Space: ( O(1) ) for an in-place sort; typical library sorts use ( O(\log n) ) stack space, and Python's built-in Timsort may use up to ( O(n) ) auxiliary space.
  • Visualization:
    sorted nums = [0,0,0,1,1,1,99]
    i=0: nums[0]=0 == nums[1]=0 → triplet, skip to i=3
    i=3: nums[3]=1 == nums[4]=1 → triplet, skip to i=6
    i=6: no nums[7] → return 99
    

Approach 3: Hash Map (Frequency Count)

  • Idea: Count occurrences of each number using a hash map.
  • Pseudocode:
    def singleNumber(nums):
        freq = {}
        for num in nums:
            freq[num] = freq.get(num, 0) + 1
        for num, cnt in freq.items():
            if cnt == 1:
                return num
    
  • Complexity Proof:
    Time: ( O(n) ) for building map + ( O(n) ) for scanning = ( O(n) ).
    Space: ( O(n) ) in worst case for storing distinct numbers.
  • Visualization:
    nums = [2,2,3,2]
    freq = {2:3, 3:1} → return 3
    

Approach 4: Bit Count Array (32 Counters)

  • Idea: Maintain 32 counters for each bit position, sum bits modulo 3, reconstruct the number.
  • Pseudocode:
    def singleNumber(nums):
        count = [0] * 32
        for num in nums:
            for i in range(32):
                if (num >> i) & 1:
                    count[i] += 1
        result = 0
        for i in range(32):
            if count[i] % 3 == 1:
                result |= (1 << i)
        # Convert to signed 32-bit
        if result >= 2**31:
            result -= 2**32
        return result
    
  • Complexity Proof:
    Time: ( 32 \times n = O(n) ).
    Space: ( O(32) = O(1) ).
  • Visualization (4-bit example, nums = [2,2,3,2]):
    Bit positions:
    i=0: count 1's = 1 (from 3) → 1 mod 3 = 1 → set bit 0
    i=1: count 1's = 4 (three 2's have bit1 set, 3 has bit1 set) → 4 mod 3 = 1 → set bit1
    i=2+: all 0.
    Result: binary 11 = 3.
    

Approach 5: Bit Manipulation with Two Variables (Optimal)

  • Idea: Simulate a finite state machine that tracks counts modulo 3 for each bit using two integers ones and twos.
  • State Transition per bit:
    Let state (twos, ones) represent count modulo 3:
    • 00 → count 0
    • 01 → count 1
    • 10 → count 2
    On input bit b:
    • If b=0, state unchanged.
    • If b=1, state transitions: 00 → 01, 01 → 10, 10 → 00.
  • Update Rules:
    ones = (ones ^ num) & ~twos
    twos = (twos ^ num) & ~ones
    
  • Pseudocode:
    def singleNumber(nums):
        ones, twos = 0, 0
        for num in nums:
            ones = (ones ^ num) & ~twos
            twos = (twos ^ num) & ~ones
        return ones
    
  • Complexity Proof:
    Time: ( n \times 6 \text{ bitwise operations} = O(n) ) (two XORs, two NOTs, and two ANDs per element).
    Space: ( O(1) ) for two integers.
  • Visualization (nums = [2,2,3,2]):
    Start: ones=0, twos=0
    num=2 (0010): ones=(0^2)&~0=2, twos=(0^2)&~2=0
    num=2: ones=(2^2)&~0=0, twos=(0^2)&~0=2
    num=3 (0011): ones=(0^3)&~2=1, twos=(2^3)&~1=0
    num=2: ones=(1^2)&~0=3, twos=(0^2)&~3=0
    return ones=3
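
As a sanity check on the update rules, the single-bit state machine can be enumerated exhaustively; this is an illustrative test, not part of the solution:

    # Enumerate all reachable states (twos, ones) and both input bits.
    # Expected: b=0 leaves the state alone; b=1 advances 00 -> 01 -> 10 -> 00.
    for twos, ones in [(0, 0), (0, 1), (1, 0)]:
        for b in (0, 1):
            new_ones = (ones ^ b) & ~twos & 1   # mask to one bit for display
            new_twos = (twos ^ b) & ~new_ones & 1
            print(f"({twos}{ones}) --b={b}--> ({new_twos}{new_ones})")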
    

4. Code Deep Dive

Line-by-Line Annotations (Optimal Solution)

def singleNumber(nums):
    ones = 0  # Bits that have appeared 1 time modulo 3
    twos = 0  # Bits that have appeared 2 times modulo 3
    for num in nums:
        # Update ones: XOR adds current num's bits to ones, 
        # then mask out bits that are already in twos (since those bits are in state 2).
        ones = (ones ^ num) & ~twos
        
        # Update twos: XOR adds current num's bits to twos,
        # then mask out bits that are now in ones (since those bits just entered state 1).
        twos = (twos ^ num) & ~ones
    
    # After processing all numbers, ones holds bits of the singleton
    # (state 01), and twos is 0 (state 00).
    return ones

Edge Case Handling

  • Example 1 (nums = [2,2,3,2]):
    Iteration details as above. Final ones = 3.
  • Example 2 (nums = [0,1,0,1,0,1,99]):
    After processing first six numbers, ones=0, twos=0. Last number 99:
    ones = (0^99) & ~0 = 99, twos = (0^99) & ~99 = 0. Returns 99.
  • Negative Numbers:
    Suppose nums = [-1,-1,-1,-2]. In two's complement, -1 has all bits set, while -2 has all bits set except the LSB. Per bit position, the three copies of -1 contribute a count of 3 (≡ 0 mod 3), and -2 contributes 1 to every bit except the LSB. The surviving pattern is therefore all ones except the LSB, which is exactly -2. In Python the two-variable solution returns the correct negative integer directly, because Python's bitwise operators behave like two's complement with unlimited sign extension.
  • Single Element Array:
    If nums = [5], loop runs once: ones = (0^5)&~0=5, twos=0, returns 5.
  • Large Input:
    Handles up to 30,000 elements efficiently.
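
A small test sketch covering the cases above (expected values follow from the traces; the helper simply repeats the optimal solution):

    def single_number(nums):
        ones, twos = 0, 0
        for num in nums:
            ones = (ones ^ num) & ~twos
            twos = (twos ^ num) & ~ones
        return ones

    assert single_number([2, 2, 3, 2]) == 3
    assert single_number([0, 1, 0, 1, 0, 1, 99]) == 99
    assert single_number([-1, -1, -1, -2]) == -2
    assert single_number([5]) == 5
    assert single_number([0, 7, 7, 7]) == 0   # the singleton may be zero
    print("all edge cases pass")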

5. Complexity War Room

Hardware-Aware Analysis

  • Time: For ( n = 3 \times 10^4 ), the two-variable solution performs six bitwise operations per element → roughly 180,000 operations. This is negligible on modern CPUs (well under 1 ms; a measurement sketch follows this list).
  • Space: Two 32-bit integers (4 bytes each in C; Python int objects carry additional overhead). The input array itself is ~120 KB (30k * 4 bytes), fitting in L2/L3 cache, making memory accesses fast.
  • Cache Efficiency: Sequential array traversal maximizes cache locality; no random memory access.
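
A rough timing sketch for the worst-case input size; the numbers it prints depend on hardware and Python version, and the input construction is illustrative only:

    import random
    import time

    def single_number(nums):
        ones, twos = 0, 0
        for num in nums:
            ones = (ones ^ num) & ~twos
            twos = (twos ^ num) & ~ones
        return ones

    values = random.sample(range(-2**31, 2**31 - 1), 10_000)
    nums = values[:1] + values[1:] * 3   # one singleton + 9,999 triples, ~3 * 10^4 elements
    random.shuffle(nums)

    start = time.perf_counter()
    result = single_number(nums)
    print(f"{len(nums)} elements -> {result} in {(time.perf_counter() - start) * 1000:.2f} ms")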

Industry Comparison Table

Approach                      | Time            | Space                     | Readability          | Interview Viability
Brute Force                   | ( O(n^2) )      | ( O(1) )                  | High (simple loops)  | ❌ Fails large ( n )
Sorting                       | ( O(n \log n) ) | ( O(1) ) or ( O(\log n) ) | Medium               | ⚠️ Not linear time
Hash Map                      | ( O(n) )        | ( O(n) )                  | High                 | ⚠️ Violates constant space
Bit Count Array               | ( O(n) )        | ( O(1) )                  | Medium (clear logic) | ✅ Acceptable
Two-Variable Bit Manipulation | ( O(n) )        | ( O(1) )                  | Low (bitwise tricks) | ✅ Optimal & expected

6. Pro Mode Extras

Variants Section

  1. Single Number I (LC 136) – Every element appears twice except one. Solution: XOR all numbers.
  2. Single Number III (LC 260) – Two elements appear once, others twice. Solution: XOR all to get xor = a ^ b, find a set bit in xor, partition numbers into two groups based on that bit, XOR each group.
  3. Single Number with k Occurrences – Generalization: every element appears k times except one appears once. Solution: Use ( m = \lceil \log_2 k \rceil ) counters, update with digital logic. For k=3, m=2 (ones, twos).
  4. Single Number II with Arbitrary k and p – Every element appears k times except one appears p times (p not a multiple of k). Solution: Count bits modulo k, reconstruct the number from bits where count % k = p % k (a sketch follows this list).
  5. Single Number in Streaming Data – Because each element is read once and only two integers of state are kept, the two-variable approach applies directly to streaming input with constant memory.
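
A hedged sketch of the generalization in variants 3 and 4, assuming every value appears k times except one that appears p times with p % k != 0 (function name illustrative):

    def single_number_general(nums, k=3, p=1):
        # Count set bits per position modulo k. A position's residue equals
        # p % k if the singleton has that bit set, and 0 otherwise.
        counts = [0] * 32
        for num in nums:
            for i in range(32):
                counts[i] = (counts[i] + ((num >> i) & 1)) % k
        result = 0
        for i in range(32):
            if counts[i] == p % k:
                result |= 1 << i
        # Reinterpret the 32-bit pattern as a signed integer.
        return result - 2**32 if result >= 2**31 else result

With k = 3 and p = 1 this reduces to Approach 4; with k = 2 and p = 1 it solves LC 136, though plain XOR is simpler there.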

Interview Cheat Sheet

  • First Steps: Restate problem, confirm constraints (linear time, constant space).
  • Key Insight: Counting bits modulo 3 unlocks the solution.
  • Explain State Machine: Draw state diagram for one bit, show transitions with input 0/1.
  • Mention Alternatives: Discuss hash map and sorting, then explain why they are suboptimal.
  • Handle Negatives: For bit-count array, convert result to signed 32-bit by subtracting ( 2^{32} ) if sign bit set.
  • Test Thoroughly: Use examples with positives, negatives, zero, and single-element array.
  • Complexity: Always articulate time and space complexity clearly.
  • Practice: Be ready to derive update rules for the two-variable method or extend to general k.