Claude vs DeepSeek for Coding

Claude wins for coding with a 6.5-point SWE-bench advantage (79.6% vs 73.1%) and superior extended thinking capabilities for tackling complex algorithms—the stronger choice if performance is your priority. DeepSeek is a capable, ultra-affordable alternative that still delivers solid results at a fraction of the cost, making it ideal if budget constraints matter. For professional coding work, Claude is the clearer pick; for cost-conscious projects, DeepSeek is a reasonable compromise.

Head-to-Head for Coding

Code Generation Quality (Winner: Claude)

  • Claude: 79.6% on SWE-bench Verified
  • DeepSeek: 73.1% on SWE-bench Verified

Cost Efficiency (Winner: DeepSeek)

  • Claude: $3.00 per 1M input tokens
  • DeepSeek: $0.56 per 1M input tokens

Complex Problem Reasoning (Tie)

  • Claude: Extended thinking with adjustable depth
  • DeepSeek: DeepSeek R1 for dedicated reasoning

Code Explanation Quality (Winner: Claude)

  • Claude: Nuanced, clear, detailed explanations
  • DeepSeek: Direct, functional explanations

Open Source Availability (Winner: DeepSeek)

  • Claude: Closed source, proprietary
  • DeepSeek: Open weights available

See Them Answer a Coding Question

We asked both models the same coding-focused prompt.

Write a Python function that implements a least-recently-used (LRU) cache from scratch without using functools, including proper handling of edge cases and clear code documentation.

Claude 4.6 Sonnet

LRU Cache Implementation from Scratch

Overview

This implementation uses a doubly linked list combined with a hash map to achieve O(1) time complexity for both get and put operations.

  • Doubly Linked List: Tracks access order (most recent → least recent)
  • Hash Map: Provides O(1) key lookups
"""
LRU Cache Implementation

Architecture:
- Doubly Linked List maintains access order
- Dictionary maps keys to nodes for O(1) lookup
- HEAD <-> [most recent] <-> ... <-> [least recent] <-> TAIL

Time Complexity:  O(1) for get and put
Space Complexity: O(capacity)
"""

from typing import Optional, Any


class Node:
    """
    A node in the doubly linked list.
    
    Each node stores a key-value pair and maintains
    pointers to adjacent nodes in the list.
    """

    def __init__(self, key: Any = None, value: Any = None):
        self.key = key
        self.value = value
        self.prev: Optional["Node"] = None
        self.next: Optional["Node"] = None

    def __repr__(self) -> str:
        return f"Node(key={self.key!r}, value={self.value!r})"


class LRUCache:
    """
    Least-Recently-Used (LRU) Cache.

    Evicts the least recently used item when capacity is exceeded.
    Supports O(1) get and put operations.

    Args:
        capacity (int): Maximum number of items the cache can hold.

    Raises:
        ValueError: If capacity is less than 1.

    Example:
        >>> cache = LRUCache(2)
        >>> cache.put("a", 1)
        >>> cache.put("b", 2)
        >>> cache.get("a")   # Returns 1, makes "a" most recent
        1
        >>> cache.put("c", 3)  # Evicts "b" (least recently used)
        >>> cache.get("b")   # Returns None ("b" was evicted)
        
    """

    def __init__(self, capacity: int):
        if not isinstance(capacity, int) or isinstance(capacity, bool):
            raise TypeError(f"Capacity must be an integer, got {type(capacity).__name__}")
        if capacity < 1:
            raise ValueError(f"Capacity must be at least 1, got {capacity}")

        self.capacity = capacity
        self._size = 0

        # Sentinel nodes eliminate edge cases for empty list / boundary operations
        # Layout: HEAD <-> [nodes...] <-> TAIL
        self._head = Node()  # Dummy head — next is most recently used
        self._tail = Node()  # Dummy tail — prev is least recently used
        self._head.next = self._tail
        self._tail.prev = self._head

        # Maps key → Node for O(1) access
        self._cache: dict[Any, Node] = {}

    # ------------------------------------------------------------------
    # Public API
    # ------------------------------------------------------------------

    def get(self, key: Any) -> Optional[Any]:
        """
        Retrieve the value associated with a key.

        Marks the accessed item as most recently used.

        Args:
            key: The key to look up.

        Returns:
            The cached value, or None if the key is not present.

        Time Complexity: O(1)
        """
        if key not in self._cache:
            return None

        node = self._cache[key]
        self._move_to_front(node)  # Mark as most recently used
        return node.value

    def put(self, key: Any, value: Any) -> None:
        """
        Insert or update a key-value pair in the cache.

        - If the key exists, updates its value and marks it as most recently used.
        - If the cache is full, evicts the least recently used item first.

        Args:
            key:   The key to insert or update.
            value: The value to associate with the key.

        Time Complexity: O(1)
        """
        if key in self._cache:
            # Update existing node and promote it
            node = self._cache[key]
            node.value = value
            self._move_to_front(node)
        else:
            # Evict LRU item if at capacity
            if self._size == self.capacity:
                self._evict_lru()

            # Insert new node at the front (most recently used position)
            new_node = Node(key, value)
            self._cache[key] = new_node
            self._insert_at_front(new_node)
            self._size += 1

    def delete(self, key: Any) -> bool:
        """
        Explicitly remove a key from the cache.

        Args:
            key: The key to remove.

        Returns:
            True if the key was present and removed, False otherwise.

        Time Complexity: O(1)
        """
        if key not in self._cache:
            return False

        node = self._cache.pop(key)
        self._remove_node(node)
        self._size -= 1
        return True

    def clear(self) -> None:
        """
        Remove all items from the cache and reset its state.

        Time Complexity: O(1)
        """
        self._cache.clear()
        self._head.next = self._tail
        self._tail.prev = self._head
        self._size = 0

    def peek(self, key: Any) -> Optional[Any]:
        """
        Retrieve a value WITHOUT updating its recency.

        Useful for inspecting cache contents in tests or monitoring.

        Args:
            key: The key to look up.

        Returns:
            The cached value, or None if the key is not present.

        Time Complexity: O(1)
        """
        node = self._cache.get(key)
        return node.value if node else None

    # ------------------------------------------------------------------
    # Private helpers — doubly linked list operations
    # ------------------------------------------------------------------

    def _remove_node(self, node: Node) -> None:
        """
        Unlink a node from the doubly linked list.

        Before: ... <-> prev <-> node <-> next <-> ...
        After:  ... <-> prev <-> next <-> ...

        Time Complexity: O(1)
        """
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

        # Avoid dangling pointers (aids garbage collection)
        node.prev = None
        node.next = None

    def _insert_at_front(self, node: Node) -> None:
        """
        Insert a node immediately after the HEAD sentinel (most recent position).

        Before: HEAD <-> first <-> ...
        After:  HEAD <-> node <-> first <-> ...

        Time Complexity: O(1)
        """
        node.prev = self._head
        node.next = self._head.next
        self._head.next.prev = node
        self._head.next = node

    def _move_to_front(self, node: Node) -> None:
        """
        Move an existing node to the most-recently-used position.

        Combines _remove_node + _insert_at_front.

        Time Complexity: O(1)
        """
        self._remove_node(node)
        self._insert_at_front(node)

    def _evict_lru(self) -> None:
        """
        Remove the least recently used item (node just before TAIL).

        Time Complexity: O(1)
        """
        lru_node = self._tail.prev  # Node just before the dummy tail
        if lru_node is self._head:
            return  # Cache is empty — nothing to evict (safety guard)

        del self._cache[lru_node.key]
        self._remove_node(lru_node)
        self._size -= 1

    # ------------------------------------------------------------------
    # Dunder methods for convenience
    # ------------------------------------------------------------------

    def __len__(self) -> int:
        """Return the current number of items in the cache."""
        return self._size

    def __contains__(self, key: Any) -> bool:
        """Support `key in cache` syntax (does NOT update recency)."""
        return key in self._cache

    def __repr__(self) -> str:
        """Show cache contents ordered from most to least recently used."""
        items = []
        current = self._head.next
        while current is not self._tail:
            items.append(f"{current.key!r}: {current.value!r}")
            current = current.next
        return f"LRUCache(capacity={self.capacity}, items=[{', '.join(items)}])"

Tests

def run_tests():
    print("=" * 60)
    print("Running LRU Cache Tests")
    print("=" * 60)

    # ------------------------------------------------------------------
    # 1. Basic put and get
    # ------------------------------------------------------------------
    print("\n1. Basic put and get")
    cache = LRUCache(3)
    cache.put("x", 10)
    cache.put("y", 20)
    cache.put("z", 30)
    assert cache.get("x") == 10
    assert cache.get("y") == 20
    assert cache.get("z") == 30
    assert len(cache) == 3
    print("   PASSED")

    # ------------------------------------------------------------------
    # 2. Cache miss returns None
    # ------------------------------------------------------------------
    print("2. Cache miss returns None")
    cache = LRUCache(2)
    assert cache.get("missing") is None
    print("   PASSED")

    # ------------------------------------------------------------------
    # 3. Eviction of LRU item
    # ------------------------------------------------------------------
    print("3. LRU eviction")
    cache = LRUCache(2)
    cache.put(1, "one")
    cache.put(2, "two")
    cache.put(3, "three")   # Should evict key=1 (oldest)
    assert cache.get(1) is None, "Key 1 should have been evicted"
    assert cache.get(2) == "two"
    assert cache.get(3) == "three"
    assert len(cache) == 2
    print("   PASSED")

    # ------------------------------------------------------------------
    # 4. get() updates recency — accessed item survives eviction
    # ------------------------------------------------------------------
    print("4. get() updates recency")
    cache = LRUCache(2)
    cache.put("a", 1)
    cache.put("b", 2)
    cache.get("a")           # "a" is now most recent; "b" becomes LRU
    cache.put("c", 3)        # Should evict "b"
    assert cache.get("b") is None, "Key 'b' should have been evicted"
    assert cache.get("a") == 1
    assert cache.get("c") == 3
    print("   PASSED")

    # ------------------------------------------------------------------
    # 5. put() on existing key updates value and recency
    # ------------------------------------------------------------------
    print("5. Updating an existing key")
    cache = LRUCache(2)
    cache.put("a", 1)
    cache.put("b", 2)
    cache.put("a", 99)       # Update "a"; "b" becomes LRU
    cache.put("c", 3)        # Should evict "b"
    assert cache.get("a") == 99
    assert cache.get("b") is None
    assert cache.get("c") == 3
    print("   PASSED")

    # ------------------------------------------------------------------
    # 6. Capacity of 1
    # ------------------------------------------------------------------
    print("6. Capacity of 1 (edge case)")
    cache = LRUCache(1)
    cache.put("only", 42)
    assert cache.get("only") == 42
    cache.put("new", 7)       # Evicts "only"
    assert cache.get("only") is None
    assert cache.get("new") == 7
    print("   PASSED")

    # ------------------------------------------------------------------
    # 7. Explicit delete
    # ------------------------------------------------------------------
    print("7. Explicit delete")
    cache = LRUCache(3)
    cache.put("a", 1)
    cache.put("b", 2)
    assert cache.delete("a") is True
    assert cache.get("a") is None
    assert cache.delete("nonexistent") is False
    assert len(cache) == 1
    print("   PASSED")

    # ------------------------------------------------------------------
    # 8. Clear cache
    # ------------------------------------------------------------------
    print("8. Clear cache")
    cache = LRUCache(3)
    cache.put("a", 1)
    cache.put("b", 2)
    cache.clear()
    assert len(cache) == 0
    assert cache.get("a") is None
    cache.put("fresh", 100)  # Should work normally after clear
    assert cache.get("fresh") == 100
    print("   PASSED")

    # ------------------------------------------------------------------
    # 9. peek() does NOT affect recency
    # ------------------------------------------------------------------
    print("9. peek() does not update recency")
    cache = LRUCache(2)
    cache.put("a", 1)
    cache.put("b", 2)
    cache.peek("a")          # Should NOT promote "a"
    cache.put("c", 3)        # "a" should be evicted (still LRU)
    assert cache.get("a") is None, "'a' should still be LRU after peek()"
    assert cache.get("b") == 2
    assert cache.get("c") == 3
    print("   PASSED")

    # ------------------------------------------------------------------
    # 10. __contains__ does NOT affect recency
    # ------------------------------------------------------------------
    print("10. `in` operator does not update recency")
    cache = LRUCache(2)
    cache.put("a", 1)
    cache.put("b", 2)
    _ = "a" in cache         # Should NOT promote "a"
    cache.put("c", 3)        # "a" should still be evicted
    assert "a" not in cache
    assert "b" in cache
    assert "c" in cache
    print("   PASSED")

    # ------------------------------------------------------------------
    # 11. Invalid capacity raises errors
    # ------------------------------------------------------------------
    print("11. Invalid capacity")
    try:
        LRUCache(0)
        assert False, "Should have raised ValueError"
    except ValueError:
        pass

    try:
        LRUCache(-5)
        assert False, "Should have raised ValueError"
    except ValueError:
        pass

    try:
        LRUCache("two")
        assert False, "Should have raised TypeError"
    except TypeError:
        pass
    print("   PASSED")

    # ------------------------------------------------------------------
    # 12. repr() shows most-to-least-recent order
    # ------------------------------------------------------------------
    print("12. repr() order")
    cache = LRUCache(3)
    cache.put("a", 1)
    cache.put("b", 2)
    cache.put("c", 3)
    cache.get("a")           # Access "a" — now most recent
    r = repr(cache)
    # Most recent first: a, c, b
    assert r.index("'a'") < r.index("'c'") < r.index("'b'"), f"Wrong order: {r}"
    print("   PASSED")

    # ------------------------------------------------------------------
    # 13. Large-scale stress test
    # ------------------------------------------------------------------
    print("13. Stress test (10,000 operations)")
    import random
    cache = LRUCache(100)
    reference = {}           # Simple dict to validate against

    for _ in range(10_000):
        op = random.choice(["put", "get"])
        key = random.randint(0, 149)   # Key space larger than capacity

        if op == "put":
            value = random.randint(0, 9999)
            cache.put(key, value)
            reference[key] = value  # reference keeps every write (it never evicts)
        else:
            result = cache.get(key)
            if result is not None:
                # If cache returned something, it must match the last written value
                assert result == reference.get(key), (
                    f"Mismatch for key {key}: cache={result}, ref={reference.get(key)}"
                )

    assert len(cache) <= 100, "Cache exceeded capacity"
    print("   PASSED")

    print("\n" + "=" * 60)
    print("All tests passed!")
    print("=" * 60)


if __name__ == "__main__":
    run_tests()

    # Quick demo
    print("\nDemo:")
    cache = LRUCache(3)
    cache.put("a", 1)
    cache.put("b", 2)
    cache.put("c", 3)
    print(cache)                  # c, b, a → most to least recent
    cache.get("a")                # Promote "a"
    print(cache)                  # a, c, b
    cache.put("d", 4)             # Evict "b"
    print(cache)                  # d, a, c

Sample Output

============================================================
Running LRU Cache Tests
============================================================

1. Basic put and get
   PASSED
2. Cache miss returns None
   PASSED
3. LRU eviction
   PASSED
4. get() updates recency
   PASSED
5. Updating an existing key
   PASSED
6. Capacity of 1 (edge case)
   PASSED
7. Explicit delete
   PASSED
8. Clear cache
   PASSED
9. peek() does not update recency
   PASSED
10. `in` operator does not update recency
   PASSED
11. Invalid capacity
   PASSED
12. repr() order
   PASSED
13. Stress test (10,000 operations)
   PASSED

============================================================
All tests passed!
============================================================

Demo:
LRUCache(capacity=3, items=['c': 3, 'b': 2, 'a': 1])
LRUCache(capacity=3, items=['a': 1, 'c': 3, 'b': 2])
LRUCache(capacity=3, items=['d': 4, 'a': 1, 'c': 3])

Key Design Decisions

  • Sentinel head/tail nodes: eliminate "if not head" boundary checks, since every real node always has valid neighbours
  • _remove_node clears pointers: setting .prev = .next = None breaks reference cycles, aiding garbage collection
  • peek() / __contains__: read-without-promote operations let tests and monitoring inspect the cache without polluting recency state
  • Key stored in Node: required to delete the key from the hash map during eviction in O(1)
  • Input validation: rejects bool (a subclass of int) and zero or negative capacities at construction time
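For context (not part of either model's answer): the prompt forbids functools, which is why the answer hand-rolls a linked list, but Python's standard library can express the same behaviour far more compactly with collections.OrderedDict. A minimal sketch, with the class name OrderedDictLRU chosen here purely for illustration:

```python
from collections import OrderedDict


class OrderedDictLRU:
    """Minimal LRU cache built on OrderedDict (illustrative alternative)."""

    def __init__(self, capacity: int):
        if capacity < 1:
            raise ValueError("Capacity must be at least 1")
        self.capacity = capacity
        self._data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        elif len(self._data) >= self.capacity:
            self._data.popitem(last=False)  # evict least recently used
        self._data[key] = value
```

move_to_end and popitem(last=False) are both O(1), so the complexity guarantees match the hand-rolled version; the from-scratch linked list mainly demonstrates understanding of the underlying mechanics.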


Detailed Breakdown

When it comes to coding assistance, both Claude and DeepSeek are genuinely capable tools — but they serve different types of developers in different situations.

Claude's coding performance is best-in-class for practical software development. Its SWE-bench Verified score of 79.6% is among the highest of any model, meaning it reliably fixes real-world GitHub issues — not just toy problems. Claude excels at multi-file refactors, writing idiomatic code that matches a codebase's existing style, and explaining complex logic clearly. Its instruction-following is precise: if you ask for a function with specific constraints, Claude doesn't go rogue and redesign the whole module. For web developers, backend engineers, and anyone doing production work, this predictability matters enormously. Claude Code, the CLI tool, extends this into agentic coding workflows — letting Claude read, write, and execute across your codebase autonomously.

DeepSeek is a legitimate coding contender, scoring 73.1% on SWE-bench Verified and demonstrating strong reasoning on algorithmic challenges. It performs particularly well on competitive programming-style problems and math-heavy code (think numerical methods, optimization algorithms, or anything that benefits from its strong math benchmark scores). DeepSeek R1, the dedicated reasoning variant, shines on problems that require step-by-step logic before writing a single line. For developers who want open-source weights they can self-host or fine-tune for a specific domain, DeepSeek is one of the few models at this performance tier that allows it.

The practical differences show up in everyday use. Claude handles ambiguous requirements better — you can describe what you want at a high level and get production-ready code back with sensible defaults and edge cases handled. DeepSeek sometimes requires more explicit prompting to hit the same output quality, and it lacks image understanding, so you can't paste a UI mockup or architecture diagram and ask it to generate code from that. Claude's file upload support is also a meaningful advantage for dropping in an entire codebase or spec document as context.

Cost is where DeepSeek pulls ahead sharply. Its API pricing is roughly one-fifth of Claude's per input token, making it compelling for teams running high-volume code generation pipelines where budget matters more than marginal quality gains. For individual developers on the consumer apps, Claude Pro at $20/month is a strong value; DeepSeek's generous free tier is hard to beat for casual use.
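The per-token gap compounds quickly at pipeline scale. A rough back-of-envelope using the input prices listed above (the 500M-token monthly volume is a hypothetical figure for illustration, and output-token pricing is ignored):

```python
CLAUDE_PER_M = 3.00    # $ per 1M input tokens (from the comparison above)
DEEPSEEK_PER_M = 0.56  # $ per 1M input tokens

monthly_tokens_m = 500  # hypothetical volume: 500M input tokens per month

claude_cost = CLAUDE_PER_M * monthly_tokens_m
deepseek_cost = DEEPSEEK_PER_M * monthly_tokens_m

print(f"Claude:   ${claude_cost:,.2f}/month")    # $1,500.00/month
print(f"DeepSeek: ${deepseek_cost:,.2f}/month")  # $280.00/month
print(f"Ratio:    ~{claude_cost / deepseek_cost:.1f}x")
```

At this volume the input-token bill alone differs by well over $1,000 a month, which is the kind of margin that justifies accepting a few benchmark points of quality loss.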

Recommendation: For professional developers doing real product work — debugging, refactoring, code review, agentic workflows — Claude is the stronger choice. For developers on a tight budget, those working on algorithmically complex problems, or teams that need self-hostable open-source weights, DeepSeek is a serious, capable alternative.
