Two Pointer Patterns — Complete Guide

Master the three two-pointer patterns: opposite-direction converging pointers for sorted arrays, same-direction fast/slow pointers for linked lists and duplicate removal, and fixed-gap pointers that bridge two pointers and sliding window. Learn when each applies and how to avoid the most common pitfalls.

11 min read

When to Use Two Pointers

Two pointers is not a single technique — it is a family of patterns that reduce O(n²) brute-force approaches to O(n) by maintaining two indices into a sequence and advancing them according to a problem-specific rule. The key recognition signal is a problem on sorted arrays, palindromes, linked lists, or in-place operations that asks for O(1) extra space.

The three variants each target a different class of problem. Opposite-direction (converging) pointers work on sorted arrays where you can use the sorted order to prune search space. Same-direction (fast/slow) pointers work when you need to detect cycles, find midpoints, or remove elements in-place without a second pass. Fixed-gap pointers maintain a constant distance between the two indices and are the bridge to sliding-window problems.

If a problem says "in-place," "O(1) space," or involves a sorted linear structure with a two-value constraint (sum equals target, palindrome check, container area), two pointers is almost certainly the right tool. Learn to recognize which of the three variants fits before writing any code.

  • Sorted array operations — two sum II, container with most water, 3sum, trapping rain water
  • Palindrome checks — valid palindrome, longest palindromic substring (expand-around-center)
  • Linked list cycle detection and midpoint finding — Floyd's tortoise and hare
  • In-place partitioning — remove duplicates, move zeroes, Dutch national flag
  • Signal: problem says "in-place" or "O(1) space" on sorted or linear data
  • Signal: brute force is O(n²) with two nested loops iterating the same array

Opposite Direction (Converging)

Converging pointers start at opposite ends of a sorted array — left at index 0, right at index n - 1 — and move toward each other until they meet. At each step, you evaluate the pair (nums[left], nums[right]). If the evaluated quantity (typically the pair's sum) is too large, decrement right to reduce it; if too small, increment left to increase it. This eliminates one candidate per step, giving O(n) total iterations.

The canonical example is two sum II: given a sorted array, find two numbers that sum to a target. If nums[left] + nums[right] == target, you are done. If the sum is less than target, left++ (need a larger value). If the sum is greater than target, right-- (need a smaller value). Each iteration either finds the answer or eliminates one index, so the loop runs at most n times.
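That loop can be sketched in Python as follows (the function name and the 1-based return format are assumptions following the LeetCode 167 contract):

```python
def two_sum_sorted(nums, target):
    """Find two values in a sorted list summing to target.

    Assumes nums is sorted ascending and exactly one solution exists
    (the LeetCode 167 contract). Returns 1-based indices.
    """
    left, right = 0, len(nums) - 1
    while left < right:
        s = nums[left] + nums[right]
        if s == target:
            return [left + 1, right + 1]  # problem uses 1-based indices
        if s < target:
            left += 1   # need a larger value
        else:
            right -= 1  # need a smaller value
    return []  # unreachable under the problem's guarantee
```

Each iteration either returns or shrinks the window by one, so the loop runs at most n - 1 times.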

Container with most water and 3sum follow the same template. For 3sum, fix the first element with an outer loop, then run converging pointers on the remaining subarray for each fixed element. For valid palindrome, converging pointers check if s[left] == s[right] while skipping non-alphanumeric characters — left and right converge symmetrically from the ends.
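The 3sum template described above can be sketched like this (the function name is illustrative; note the input is sorted up front, since LeetCode 15 does not guarantee sorted input):

```python
def three_sum(nums):
    """All unique triples summing to zero: sort, fix one, converge the rest."""
    nums = sorted(nums)
    res = []
    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:
            continue  # skip duplicate anchors (compare with the previous element)
        left, right = i + 1, len(nums) - 1
        while left < right:
            s = nums[i] + nums[left] + nums[right]
            if s < 0:
                left += 1
            elif s > 0:
                right -= 1
            else:
                res.append([nums[i], nums[left], nums[right]])
                left += 1
                right -= 1
                while left < right and nums[left] == nums[left - 1]:
                    left += 1   # skip duplicate left values after recording
                while left < right and nums[right] == nums[right + 1]:
                    right -= 1  # skip duplicate right values after recording
    return res
```

The outer loop makes the total O(n²) despite the O(n) inner converge.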

💡

Why converging pointers work on sorted data

Converging pointers work on sorted data because moving left increases the value and moving right decreases it. This monotonic behavior enables pruning: if the current pair's sum exceeds the target, you know every pair with the current left pointer also exceeds the target — so you can safely decrement right and discard an entire column of candidates in one step.

Same Direction (Fast/Slow)

Fast/slow pointers both start at the beginning and move in the same direction, but at different speeds or with different advancement conditions. The classic use is cycle detection in a linked list (Floyd's tortoise and hare): slow moves one step per iteration, fast moves two. If there is a cycle, fast laps slow and they meet inside the cycle. If there is no cycle, fast reaches null first.
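A minimal Python sketch of Floyd's cycle detection (the ListNode class is the usual LeetCode shape, reproduced here so the example is self-contained):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def has_cycle(head):
    """Floyd's tortoise and hare: True if the list contains a cycle."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next        # one step
        fast = fast.next.next   # two steps
        if slow is fast:
            return True         # fast lapped slow inside the cycle
    return False                # fast fell off the end: no cycle
```

The `fast and fast.next` guard is what prevents a null dereference on the double advance.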

For finding the middle of a linked list, slow moves one step and fast moves two. When fast reaches the end, slow is at the middle. This is useful in merge sort on linked lists — split at the middle, sort each half, merge. For "remove duplicates from sorted array," both pointers start at 0 but fast always advances while slow advances only when a new unique value is found. The slow pointer marks the boundary of the deduplicated prefix.
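The in-place dedup described above can be sketched as follows (function name is illustrative; it returns the new logical length per the LeetCode 26 contract):

```python
def remove_duplicates(nums):
    """In-place dedup of a sorted list; returns the new logical length.

    slow marks the end of the deduplicated prefix; fast scans ahead.
    """
    if not nums:
        return 0
    slow = 0
    for fast in range(1, len(nums)):
        if nums[fast] != nums[slow]:
            slow += 1
            nums[slow] = nums[fast]  # extend the unique prefix
    return slow + 1
```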

The happy number problem maps naturally to fast/slow: treat the number sequence as an implicit linked list. If the sequence eventually loops back to a number seen before (other than 1), the fast pointer will catch up to the slow pointer inside the loop. If the fast pointer reaches 1 first, the sequence has settled at the fixed point 1 and the number is happy.
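One possible implementation with fast/slow (the digit-square helper is the standard step function; converting to a string is just one way to extract digits):

```python
def is_happy(n):
    """Happy number via Floyd's cycle detection on the implicit sequence."""
    def step(x):
        return sum(int(d) ** 2 for d in str(x))  # sum of squared digits

    slow, fast = n, step(n)
    while fast != 1 and slow != fast:
        slow = step(slow)         # one step
        fast = step(step(fast))   # two steps
    return fast == 1              # reached the fixed point, not a loop
```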

  1. Step 1 — Initialize slow = head (or 0), fast = head (or 0)
  2. Step 2 — Advance: slow = slow.next (or slow += 1), fast = fast.next.next (or fast += 2)
  3. Step 3 — Cycle detection: if slow == fast inside the loop, a cycle exists
  4. Step 4 — Middle finding: when fast == null or fast.next == null, slow is at the middle
  5. Step 5 — In-place dedup: if nums[fast] != nums[slow], increment slow and copy nums[fast] to nums[slow]
  6. Step 6 — Always check the fast pointer and fast.next for null before double-advancing on linked lists

Fixed-Gap Pattern

The fixed-gap pattern maintains a constant distance between the two pointers. The canonical example is "remove nth node from end of list": advance the fast pointer n steps ahead of slow. Then advance both one step at a time until fast reaches the end. At that point, slow points to the node just before the target, enabling O(1) deletion in a single pass.
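A single-pass sketch of remove nth from end (ListNode is the usual LeetCode definition; the dummy sentinel covers removal of the head itself):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def remove_nth_from_end(head, n):
    """One pass: keep fast exactly n nodes ahead of slow, then delete slow.next."""
    dummy = ListNode(0, head)   # sentinel handles removing the head itself
    slow = fast = dummy
    for _ in range(n):
        fast = fast.next        # open a gap of exactly n
    while fast.next:
        slow = slow.next
        fast = fast.next        # advance in lockstep until fast hits the tail
    slow.next = slow.next.next  # slow.next is the nth node from the end
    return dummy.next
```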

Fixed-gap also appears in problems where you need to look ahead a fixed number of positions without revisiting. In a sorted array, if you want to allow each element to appear at most twice (remove duplicates II), you can compare nums[slow] with nums[slow - 2] — a gap of 2 — to decide whether to include the current element. The gap acts as a built-in memory without an explicit extra pass.
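The gap-2 comparison can be sketched as follows (function name is illustrative; it returns the new logical length per the LeetCode 80 contract):

```python
def remove_duplicates_ii(nums):
    """Keep each value at most twice in a sorted list; returns new length.

    Comparing against nums[slow - 2] is the built-in gap-2 memory: if the
    candidate equals the element two slots back, it would be a third copy.
    """
    slow = 0
    for x in nums:
        if slow < 2 or x != nums[slow - 2]:
            nums[slow] = x
            slow += 1
    return slow
```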

Fixed-gap differs from sliding window in that the gap is constant and set at initialization rather than varying based on a condition. When the gap becomes variable — expand when a condition is met, shrink when it is violated — the pattern graduates into a true sliding window.

ℹ️

Fixed-gap is the bridge to sliding window

The fixed-gap pattern is a bridge between two pointers and sliding window. When the gap is constant (always n steps apart), it is fixed-gap. When the gap varies based on conditions — expand right when a constraint is satisfied, shrink left when it is violated — it becomes a true sliding window. Recognizing this continuum helps you choose the right template before coding.

Two Pointers on Linked Lists

Linked lists do not support random access, so two-pointer techniques must traverse from a known starting point (the head). The fast/slow pattern is the primary tool: fast moves twice as fast as slow, enabling cycle detection, middle finding, and k-th-from-end location in a single traversal. For cycle detection, if fast and slow meet, follow up by resetting one pointer to head — the two pointers, now advancing one step at a time, meet again at the cycle entrance.
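Cycle-entrance finding, sketched in Python (ListNode reproduced for self-containment; the identity comparison with `is` matters because distinct nodes may hold equal values):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def cycle_entrance(head):
    """Return the node where the cycle begins, or None if the list is acyclic.

    After slow and fast meet, resetting one pointer to head and advancing
    both one step at a time makes them meet exactly at the entrance.
    """
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:           # cycle detected; now find the entrance
            slow = head
            while slow is not fast:
                slow = slow.next
                fast = fast.next
            return slow
    return None
```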

Merging two sorted linked lists uses dual pointers — one into each list — and a sentinel dummy node to simplify edge cases. Compare the current values of both pointers, attach the smaller one to the result list, advance that pointer, and repeat until one list is exhausted. Append the non-exhausted list to the result. This is O(n + m) time and O(1) extra space.
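A sketch of the merge with a sentinel (function and variable names are illustrative):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def merge_two_lists(l1, l2):
    """Merge two sorted lists with a sentinel dummy; O(n + m) time, O(1) space."""
    dummy = ListNode()
    cur = dummy
    while l1 and l2:
        if l1.val <= l2.val:
            cur.next, l1 = l1, l1.next  # attach the smaller node
        else:
            cur.next, l2 = l2, l2.next
        cur = cur.next
    cur.next = l1 or l2  # append whichever list is not exhausted
    return dummy.next
```

Returning dummy.next hands back the merged head without any special-casing of an empty result.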

Intersection detection for two linked lists (LeetCode 160) uses a length-difference pre-computation trick: find the lengths of both lists, advance the longer one by the difference, then traverse both simultaneously. When the pointers are equal (same node reference, not just equal value), you have found the intersection. If they reach null simultaneously without meeting, there is no intersection.
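The length-difference trick, sketched in Python (again note the identity comparison `is`, since intersection means the same node object, not an equal value):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def get_intersection(headA, headB):
    """Length-difference trick: align the starts, then walk in lockstep."""
    def length(node):
        n = 0
        while node:
            n, node = n + 1, node.next
        return n

    lenA, lenB = length(headA), length(headB)
    if lenA < lenB:
        headA, headB = headB, headA   # make headA the longer list
        lenA, lenB = lenB, lenA
    for _ in range(lenA - lenB):
        headA = headA.next            # burn off the extra length
    while headA is not headB:         # compare node identity, not value
        headA, headB = headA.next, headB.next
    return headA                      # intersection node, or None
```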

  1. Step 1 — Cycle detection: slow = fast = head; advance until fast or fast.next is null (no cycle) or slow == fast (cycle found)
  2. Step 2 — Cycle entrance: reset one pointer to head; advance both one step at a time until they meet — the meeting point is the cycle entrance
  3. Step 3 — Middle: slow = fast = head; advance slow by 1, fast by 2 until fast reaches the end; slow is at the middle
  4. Step 4 — Merge two sorted lists: dummy sentinel, cur pointer; compare list1.val and list2.val; attach smaller, advance that pointer
  5. Step 5 — Intersection: compute len(A) and len(B); advance the longer pointer by |len(A) - len(B)| steps; then advance both until equal
  6. Step 6 — Two-list merge result: return dummy.next to get the merged list head without the dummy node

Common Mistakes

The most frequent bug in converging pointer problems is forgetting to sort the input first. Two sum II works because the array is already sorted; for 3sum and 4sum, you must sort before running the pointer loop. Without sorting, the monotonic property that makes converging pointers correct breaks down — moving left or right no longer guarantees a smaller or larger value.

Skipping duplicates in 3sum and 4sum requires advancing past equal values after processing a valid combination, not before. After fixing the outer element and finding an inner pair, increment left while nums[left] == nums[left - 1] and decrement right while nums[right] == nums[right + 1]. In the outer loop, skip a duplicate anchor by comparing with the previous element (continue when i > 0 and nums[i] == nums[i - 1]); skipping against the next element instead discards the first occurrence and misses valid triples that legitimately reuse a value, such as [-1, -1, 2].

Off-by-one errors in gap calculations are common in fixed-gap problems. For "remove nth from end," advance fast exactly n steps (not n - 1 and not n + 1) before starting the synchronized advance. A dummy head node simplifies edge cases where the node to remove is the first node — slow starts at dummy, and slow.next is always the node to remove after the synchronized traversal.

⚠️

Always ensure at least one pointer advances each iteration

Always ensure at least one pointer advances each iteration. If both pointers can stay still under certain conditions — for example, if your convergence condition is a no-op when left == right — you will get an infinite loop. In 3sum duplicate skipping, after a match, always advance both left and right at minimum once before checking for duplicates, otherwise the same pair can be processed repeatedly.

Problem Roadmap

Start with easy problems to build confidence with each variant. Two sum II (167) is the canonical converging pointer problem — sort is already done, one pass. Valid palindrome (125) combines converging pointers with character filtering. Merge sorted array (88) uses two pointers starting from the end to avoid overwriting elements, a clever trick worth internalizing.

Medium problems introduce complexity: 3sum (15) wraps converging pointers in an outer loop with duplicate skipping; container with most water (11) requires understanding why greedy pointer movement is always optimal; remove duplicates from sorted array II (80) uses the same-direction pattern with a gap-2 comparison trick. These three problems cover all three variants and should be the core of your practice.

For hard problems, trapping rain water (42) can be solved with converging pointers tracking the running max from each side — the classic two-pointer solution avoids the O(n) extra space of the stack or precomputed arrays approach. 4sum (18) extends 3sum with an additional outer loop and requires careful duplicate handling at two levels.
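The converging-pointer solution to trapping rain water can be sketched as follows (a standard formulation; variable names are illustrative):

```python
def trap(height):
    """Converging pointers with running left-max / right-max; O(1) extra space.

    The side with the smaller wall is the bottleneck, so the water above the
    current bar on that side is fully determined and the pointer can advance.
    """
    left, right = 0, len(height) - 1
    left_max = right_max = 0
    water = 0
    while left < right:
        if height[left] < height[right]:
            left_max = max(left_max, height[left])
            water += left_max - height[left]    # bounded by the left wall
            left += 1
        else:
            right_max = max(right_max, height[right])
            water += right_max - height[right]  # bounded by the right wall
            right -= 1
    return water
```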

  • Easy: Two Sum II (167) — converging pointers on sorted array; no sorting needed
  • Easy: Valid Palindrome (125) — converging pointers with alphanumeric filtering
  • Easy: Merge Sorted Array (88) — two pointers from the end to avoid overwrite
  • Medium: 3Sum (15) — outer loop + converging inner pointers + duplicate skipping
  • Medium: Container With Most Water (11) — converging pointers; greedy move is always safe
  • Medium: Remove Duplicates II (80) — same-direction with gap-2 comparison trick
  • Hard: Trapping Rain Water (42) — converging pointers tracking left-max and right-max
  • Hard: 4Sum (18) — double outer loop + converging inner pointers; O(n³) total

Ready to master algorithm patterns?

YeetCode flashcards help you build pattern recognition through active recall and spaced repetition.

Start practicing now