0. Setup & Environment

Install Python via Homebrew

macOS provides a system Python 3 (via the Xcode Command Line Tools), but you should never use it for development — it is managed by the OS and may be outdated or change after system updates. The most common way to install your own Python on macOS is Homebrew.

# Install Homebrew if you don't have it
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Install Python 3 (currently 3.12+)
brew install python

# Verify
python3 --version   # Python 3.12.x
pip3 --version      # pip 24.x

python vs python3

On macOS, the bare python command may not exist at all (modern macOS no longer ships Python 2; python3 is what the system provides). Always use python3 and pip3 in the terminal unless you are inside a virtual environment, where python correctly refers to the venv's interpreter.

Virtual Environments

A virtual environment is an isolated Python installation scoped to a single project. It keeps your project's dependencies separate from the system and from other projects — this is essential for reproducible builds.

# Create a venv named .venv inside your project directory
python3 -m venv .venv

# Activate it (your prompt will show (.venv))
source .venv/bin/activate

# Now 'python' and 'pip' point to the venv
python --version
pip install requests

# Deactivate when done
deactivate

One venv per project

Always create a .venv at the root of each project. Add it to .gitignore — commit requirements.txt (or pyproject.toml) instead. Recreate the venv from that file on any machine.

# Freeze current dependencies
pip freeze > requirements.txt

# Recreate a venv from requirements on another machine
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

Alternative: pyenv (Multiple Python Versions)

If you work across projects that require different Python versions (e.g., 3.10 for one and 3.12 for another), pyenv is the standard tool for managing multiple interpreters side by side.

# Install pyenv
brew install pyenv

# Add to your shell config (~/.zshrc or ~/.bashrc)
echo 'export PYENV_ROOT="$HOME/.pyenv"' >> ~/.zshrc
echo 'export PATH="$PYENV_ROOT/bin:$PATH"' >> ~/.zshrc
echo 'eval "$(pyenv init -)"' >> ~/.zshrc
source ~/.zshrc

# List available versions
pyenv install --list | grep "3\."

# Install a specific version
pyenv install 3.12.3

# Set a global default
pyenv global 3.12.3

# Set a per-directory version (writes .python-version file)
pyenv local 3.11.8

# Check active version
pyenv version

pyenv-virtualenv

The pyenv-virtualenv plugin (brew install pyenv-virtualenv) lets you create named virtualenvs tied to a specific Python version and auto-activate them per directory. It is useful for power users but optional — plain python3 -m venv .venv is simpler for most projects.

Editor Setup

VS Code is the most widely used editor for Python. Install the Python extension by Microsoft, which provides IntelliSense, linting, debugging, and automatic venv detection.

# Install VS Code from https://code.visualstudio.com
# Then install the Python extension from the Extensions panel (Cmd+Shift+X)
# or via CLI if you have 'code' in PATH:
code --install-extension ms-python.python

VS Code will automatically detect a .venv at the project root and use it as the interpreter. You can also select the interpreter manually with Cmd+Shift+P → Python: Select Interpreter.

PyCharm

PyCharm (JetBrains) is a full-featured Python IDE that excels at large-scale projects and has first-class Django and FastAPI support. The Community edition is free. It is heavier than VS Code but provides deeper refactoring and inspection tools.

Quick Verify

Run this sequence from scratch to confirm your entire setup — Homebrew Python, venv creation, activation, and a simple script — all works end to end.

mkdir ~/python-refresher && cd ~/python-refresher

# Create and activate a venv
python3 -m venv .venv
source .venv/bin/activate

# Confirm the interpreter is the venv's, not the system one
which python        # should show .../python-refresher/.venv/bin/python

# Write and run a hello world
echo 'print("Hello, Python!")' > hello.py
python hello.py     # Hello, Python!

# Clean up
deactivate
cd ~ && rm -rf ~/python-refresher

uv: a faster alternative to pip + venv

uv (by Astral) is a modern, Rust-based drop-in replacement for pip and venv that is 10–100x faster. Install with brew install uv. Use uv venv to create environments and uv pip install to install packages. It is fully compatible with existing requirements.txt and pyproject.toml workflows.

1. Language Fundamentals

Everything Is an Object

In Python, every value — integers, functions, classes, modules — is an object with an identity (id()), a type (type()), and a value. Types are themselves objects of type type.

x = 42
print(type(x))        # <class 'int'>
print(type(int))      # <class 'type'>
print(type(type))     # <class 'type'>  (type is its own type)
print(isinstance(x, object))  # True — everything inherits from object

# Functions are first-class objects
def greet(name): return f"Hello, {name}"
print(type(greet))    # <class 'function'>
print(greet.__name__) # greet

Data Model: Dunder Methods

Python's data model lets you hook into built-in operations by implementing special ("dunder") methods on your classes.

class Vector:
    def __init__(self, x: float, y: float):
        self.x = x
        self.y = y

    def __repr__(self) -> str:
        # Used by repr(), shown in REPL, should be unambiguous
        return f"Vector({self.x!r}, {self.y!r})"

    def __str__(self) -> str:
        # Used by str() and print(), human-readable
        return f"({self.x}, {self.y})"

    def __add__(self, other: "Vector") -> "Vector":
        return Vector(self.x + other.x, self.y + other.y)

    def __mul__(self, scalar: float) -> "Vector":
        return Vector(self.x * scalar, self.y * scalar)

    def __rmul__(self, scalar: float) -> "Vector":
        # Called when left operand doesn't support *: 3 * v
        return self.__mul__(scalar)

    def __abs__(self) -> float:
        return (self.x**2 + self.y**2) ** 0.5

    def __bool__(self) -> bool:
        return abs(self) != 0

    def __eq__(self, other: object) -> bool:
        if not isinstance(other, Vector):
            return NotImplemented
        return self.x == other.x and self.y == other.y

    def __hash__(self) -> int:
        # Required if __eq__ is defined and you want hashability
        return hash((self.x, self.y))

    def __len__(self) -> int:
        return 2  # 2D vector has 2 components

    def __getitem__(self, index: int) -> float:
        return (self.x, self.y)[index]

v1 = Vector(1, 2)
v2 = Vector(3, 4)
print(v1 + v2)     # (4, 6)
print(3 * v1)      # (3, 6)
print(abs(v2))     # 5.0
print(bool(Vector(0, 0)))  # False

Dunder Method          Triggered By          Notes
__init__               Object construction   Runs after __new__
__repr__               repr(obj), REPL       Should be unambiguous
__str__                str(obj), print()     Falls back to __repr__
__len__                len(obj)              Must return a non-negative int
__getitem__            obj[key]              Enables iteration if no __iter__
__contains__           x in obj              Falls back to linear scan via __iter__
__call__               obj()                 Makes instances callable
__enter__ / __exit__   with statement        Context manager protocol
__iter__ / __next__    for loop, iter()      Iterator protocol
__eq__ / __hash__      ==, dicts/sets        Define together; defining __eq__ alone makes instances unhashable
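The Vector class above doesn't exercise every row of the table. A minimal sketch (a hypothetical Bag class, not from the original) covering __contains__, __call__, __iter__, and the context-manager pair:

```python
class Bag:
    def __init__(self, *items):
        self._items = list(items)

    def __contains__(self, item):    # x in bag
        return item in self._items

    def __call__(self, item):        # bag("plum") — instance is callable
        self._items.append(item)
        return self

    def __iter__(self):              # for x in bag, list(bag)
        return iter(self._items)

    def __enter__(self):             # with bag as b:
        return self

    def __exit__(self, exc_type, exc, tb):
        self._items.clear()          # "release resources" on exit
        return False                 # don't suppress exceptions

bag = Bag("apple", "pear")
print("apple" in bag)    # True   (__contains__)
bag("plum")              # __call__ appends
print(list(bag))         # ['apple', 'pear', 'plum']   (__iter__)
with bag:                # __enter__ / __exit__
    pass
print(list(bag))         # []  — __exit__ cleared it
```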
Predict

The type() Chain

Everything in Python is an object — even types themselves. What does this print?

print(type(42))
print(type(type(42)))
print(type(type(type(42))))

<class 'int'>, <class 'type'>, <class 'type'>. type is its own metaclass, so the chain bottoms out at type.

Challenge

Make Vector Subtractable

The Vector class above supports + and *. Add the __sub__ method so subtraction works (you will also need __eq__ so the tests can compare by value).

class Vector:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __repr__(self):
        return f"Vector({self.x}, {self.y})"

    # YOUR CODE HERE


# Tests — all should print True
print(Vector(5, 7) - Vector(2, 3) == Vector(3, 4))
print(Vector(0, 0) - Vector(1, 1) == Vector(-1, -1))

Follow the same pattern as __add__ — create a new Vector with the difference of each component.

def __sub__(self, other):
    return Vector(self.x - other.x, self.y - other.y)

def __eq__(self, other):
    return self.x == other.x and self.y == other.y

Mutable vs Immutable

# Immutable: int, float, str, bytes, tuple, frozenset, bool
x = "hello"
# x[0] = "H"  # TypeError: strings are immutable

# Mutable: list, dict, set, bytearray, custom objects
lst = [1, 2, 3]
lst[0] = 99    # OK

# Implication: immutables are safe as dict keys / set members
d = {(1, 2): "point"}   # tuple key — fine
# d = {[1, 2]: "point"} # TypeError: list is unhashable

# Assignment rebinds names, not copies
a = [1, 2, 3]
b = a           # b and a point to the SAME list
b.append(4)
print(a)        # [1, 2, 3, 4]

b = a.copy()    # shallow copy — now independent at top level
b.append(5)
print(a)        # [1, 2, 3, 4]  — unaffected

Identity vs Equality

a = [1, 2, 3]
b = [1, 2, 3]
c = a

print(a == b)   # True  — same value
print(a is b)   # False — different objects
print(a is c)   # True  — same object

# None, True, False are singletons
print(None is None)   # True — always use `is None`
x = None
print(x is None)      # correct idiom
print(x == None)      # works but discouraged (can be overridden)

# Small int / string interning (CPython implementation detail)
x = 256; y = 256
print(x is y)   # True  (CPython interns -5..256)
x = 257; y = 257
print(x is y)   # implementation-dependent: may be True (constants folded) or False — never rely on it
Never use is for value comparison
is checks object identity (the same object in memory). Use == for value equality. Reserve is for identity checks against singletons such as None, True, and False.
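To see why == None is discouraged while is None is safe, a contrived sketch (hypothetical Sneaky class) whose __eq__ claims equality with everything — identity cannot be overridden:

```python
class Sneaky:
    def __eq__(self, other):
        return True     # claims equality with anything, including None

s = Sneaky()
print(s == None)    # True  — __eq__ hijacks the comparison
print(s is None)    # False — identity check can't be fooled
```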
Predict

Shallow Copy Surprise

Slicing creates a shallow copy. What does this print?

a = [1, 2, [3, 4]]
b = a[:]
b[2].append(5)
b[0] = 99
print(a)

[1, 2, [3, 4, 5]] — The slice [:] is a shallow copy: top-level items are copied, but nested objects (like the inner list) are shared references. So b[2].append(5) mutates both, but b[0] = 99 only affects b.
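When you need the nested objects copied too, copy.deepcopy recursively duplicates them, so the surprise above goes away:

```python
import copy

a = [1, 2, [3, 4]]
b = copy.deepcopy(a)   # inner list is copied as well, not shared
b[2].append(5)
print(a)   # [1, 2, [3, 4]]     — inner list unaffected
print(b)   # [1, 2, [3, 4, 5]]
```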

Truthiness

# Falsy values
falsy = [None, False, 0, 0.0, 0j, "", b"", [], {}, set(), ()]

# Any object is truthy unless:
# - __bool__ returns False, OR
# - __len__ returns 0 (and __bool__ is absent)

class Empty:
    def __len__(self): return 0

print(bool(Empty()))  # False

# Common idiom
items = []
if not items:         # preferred over: if len(items) == 0
    print("empty")

2. Type System & Typing

Static vs Runtime
Python type hints are not enforced at runtime by default. They are annotations for static checkers (mypy, pyright) and documentation. Use beartype or Pydantic if you need runtime enforcement.
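A quick demonstration that hints are only metadata at runtime — this call "violates" the annotation yet runs without complaint, and the hints are simply stored on the function:

```python
def double(x: int) -> int:
    return x * 2

print(double("ha"))              # 'haha' — no runtime check; str * 2 just works
print(double.__annotations__)    # {'x': <class 'int'>, 'return': <class 'int'>}
```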

Basic Annotations

from __future__ import annotations  # defers annotation evaluation; useful for forward refs in 3.7+

def greet(name: str, times: int = 1) -> str:
    return (name + " ") * times

# Variables
x: int = 5
items: list[str] = []
mapping: dict[str, int] = {}

Union Types (3.10+)

from typing import Optional, Union

# Old style (still valid)
def old(x: Optional[str]) -> None: ...      # Optional[X] == Union[X, None]
def old2(x: Union[int, str]) -> None: ...

# Python 3.10+ preferred syntax
def parse(value: str | int | None) -> str:
    match value:
        case None:
            return "nothing"
        case int(n):
            return f"int: {n}"
        case str(s):
            return f"str: {s}"

Generics

from typing import TypeVar, Generic, Sequence, Callable, Any

T = TypeVar("T")
S = TypeVar("S", bound="Comparable")  # bounded TypeVar ("Comparable" would be a class or Protocol you define)

# Generic function
def first(items: Sequence[T]) -> T:
    return items[0]

# Generic class
class Stack(Generic[T]):
    def __init__(self) -> None:
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

s: Stack[int] = Stack()
s.push(1)

# Python 3.12 — new syntax for generic functions/classes
# def first[T](items: Sequence[T]) -> T: ...
# class Stack[T]: ...

Protocol (Structural Typing)

from typing import Protocol, runtime_checkable

@runtime_checkable
class Drawable(Protocol):
    def draw(self) -> None: ...
    def area(self) -> float: ...

class Circle:
    def __init__(self, r: float):
        self.r = r
    def draw(self) -> None:
        print(f"Drawing circle r={self.r}")
    def area(self) -> float:
        return 3.14159 * self.r ** 2

# No explicit inheritance needed — structural compatibility
def render(shape: Drawable) -> None:
    shape.draw()

c = Circle(5)
render(c)                       # works
print(isinstance(c, Drawable))  # True (runtime_checkable)

Special Forms

from typing import Literal, TypeAlias, TypeGuard, Final, Never, Annotated

# Literal — restrict to specific values
Mode: TypeAlias = Literal["r", "w", "rb", "wb"]

def open_file(path: str, mode: Mode) -> None: ...

# TypeGuard — narrowing predicate
def is_string_list(val: list[object]) -> TypeGuard[list[str]]:
    return all(isinstance(x, str) for x in val)

# Final — constant, cannot be reassigned
MAX_SIZE: Final = 1024

# Annotated — attach metadata (used by Pydantic, FastAPI); imported above
PositiveInt = Annotated[int, "must be positive"]

# Never — function that never returns (raises or loops forever)  # 3.11+ (use NoReturn for 3.10)
def fail(msg: str) -> Never:
    raise RuntimeError(msg)

Dataclasses

from dataclasses import dataclass, field, KW_ONLY
from typing import ClassVar

@dataclass
class Point:
    x: float
    y: float
    label: str = ""                          # default value
    tags: list[str] = field(default_factory=list)  # mutable default
    _count: ClassVar[int] = 0               # class variable, not a field

    def __post_init__(self):
        # Called after __init__ — use for validation/transformation
        if self.x < 0 or self.y < 0:
            raise ValueError("Coordinates must be non-negative")
        Point._count += 1

@dataclass(frozen=True)  # immutable, hashable
class FrozenPoint:
    x: float
    y: float

@dataclass(order=True)   # generates __lt__, __le__, __gt__, __ge__
class Ranked:
    rank: int
    name: str

# KW_ONLY separator (3.10+)
@dataclass
class Config:
    host: str
    _: KW_ONLY
    port: int = 8080      # keyword-only after _: KW_ONLY
    debug: bool = False

p = Point(1.0, 2.0, tags=["geo"])
fp = FrozenPoint(1.0, 2.0)
print(hash(fp))           # works because frozen=True

Pydantic Overview

from pydantic import BaseModel, Field, field_validator, model_validator
from typing import Annotated

PositiveFloat = Annotated[float, Field(gt=0)]

class User(BaseModel):
    name: str
    age: int = Field(ge=0, le=150)
    email: str
    score: PositiveFloat = 1.0

    @field_validator("email")
    @classmethod
    def email_must_have_at(cls, v: str) -> str:
        if "@" not in v:
            raise ValueError("invalid email")
        return v.lower()

    @model_validator(mode="after")
    def check_adult_score(self) -> "User":
        if self.age < 18 and self.score > 100:
            raise ValueError("minors can't exceed score 100")
        return self

# Parsing — raises ValidationError on bad input
u = User(name="Alice", age=30, email="alice@example.com", score=95.5)
print(u.model_dump())  # {'name': 'Alice', 'age': 30, ...}
print(u.model_dump_json())

# From dict / JSON
u2 = User.model_validate({"name": "Bob", "age": 25, "email": "bob@example.com"})

3. Data Structures

Lists

# Construction and slicing
lst = [1, 2, 3, 4, 5]
print(lst[1:3])      # [2, 3]
print(lst[::2])      # [1, 3, 5]  (every 2nd)
print(lst[::-1])     # [5, 4, 3, 2, 1]  (reversed)

# Key methods — all O(1) unless noted
lst.append(6)        # add to end — O(1) amortized
lst.insert(0, 0)     # insert at index — O(n)
lst.pop()            # remove from end — O(1)
lst.pop(0)           # remove from front — O(n) — use deque instead
lst.extend([7, 8])   # extend with iterable
lst.remove(3)        # remove first occurrence — O(n)
lst.index(4)         # find index — O(n)
lst.sort()           # in-place, stable — O(n log n)
sorted(lst)          # returns new sorted list

# List comprehensions
squares = [x**2 for x in range(10)]
evens   = [x for x in range(20) if x % 2 == 0]
matrix  = [[i * j for j in range(4)] for i in range(4)]

# Flatten
nested = [[1, 2], [3, 4], [5]]
flat = [x for sublist in nested for x in sublist]

Dicts

from collections import defaultdict, Counter, OrderedDict

# Dict comprehension
word_len = {w: len(w) for w in ["apple", "banana", "cherry"]}

# Merging (3.9+)
a = {"x": 1}; b = {"y": 2}
merged = a | b          # new dict
a |= b                  # in-place merge

# Safe access
d = {"key": "value"}
d.get("missing", "default")  # "default"
d.setdefault("new", []).append(1)  # insert if missing, return value

# defaultdict — auto-creates missing keys
graph = defaultdict(list)
graph["A"].append("B")   # no KeyError
graph["A"].append("C")

# Counter — frequency counting
words = "the cat sat on the mat".split()
c = Counter(words)
print(c.most_common(3))   # [('the', 2), ('cat', 1), ...]
c2 = Counter("hello")
print(c + c2)             # merge counters

# Dict iteration
for key in d: ...
for key, val in d.items(): ...
for val in d.values(): ...

# Ordered insertion guaranteed since Python 3.7

Sets

a = {1, 2, 3, 4}
b = {3, 4, 5, 6}

print(a | b)    # union:        {1, 2, 3, 4, 5, 6}
print(a & b)    # intersection: {3, 4}
print(a - b)    # difference:   {1, 2}
print(a ^ b)    # symmetric diff: {1, 2, 5, 6}
print(a <= b)   # subset check

# frozenset — immutable, hashable set
fs = frozenset([1, 2, 3])
d = {fs: "value"}   # usable as dict key

# Set comprehension
squares_set = {x**2 for x in range(10)}

deque and heapq

from collections import deque
import heapq

# deque — O(1) at both ends
dq = deque([1, 2, 3], maxlen=5)  # maxlen auto-evicts from other end
dq.appendleft(0)   # O(1)
dq.popleft()       # O(1) — use instead of list.pop(0)
dq.rotate(2)       # rotate right by n

# heapq — min-heap on a list
heap = [5, 3, 8, 1]
heapq.heapify(heap)          # O(n) — transforms in place
heapq.heappush(heap, 2)      # O(log n)
smallest = heapq.heappop(heap)  # O(log n)
top3 = heapq.nsmallest(3, heap)

# Max-heap trick: negate values
max_heap = []
heapq.heappush(max_heap, -10)
heapq.heappush(max_heap, -5)
largest = -heapq.heappop(max_heap)  # 10

Structure   Access     Insert                Delete                Search        Use When
list        O(1)       O(1) end / O(n) mid   O(1) end / O(n) mid   O(n)          Ordered sequence, random access
deque       O(n)       O(1) both ends        O(1) both ends        O(n)          Queue/stack, BFS
dict        O(1)       O(1)                  O(1)                  O(1) by key   Key-value mapping, membership
set         N/A        O(1)                  O(1)                  O(1)          Unique membership, set ops
heapq       O(1) min   O(log n)              O(log n)              O(n)          Priority queue, top-k
Challenge

Loop to Dict Comprehension

Rewrite this loop as a single dict comprehension that maps each word to its length, but only for words longer than 3 characters.

words = ["the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"]

# Rewrite as a dict comprehension:
# result = {word: len(word) for ...}

# YOUR CODE HERE
result = {}

print(result)

Dict comprehension syntax: {key: value for item in iterable if condition}

result = {w: len(w) for w in words if len(w) > 3}

4. Functions

Arguments & Signatures

def func(
    pos_only: int,          # positional-only (before /)
    /,
    normal: str,            # positional or keyword
    *args: float,           # variadic positional
    kw_only: bool = False,  # keyword-only (after *)
    **kwargs: str,          # variadic keyword
) -> None:
    print(pos_only, normal, args, kw_only, kwargs)

func(1, "hello", 1.0, 2.0, kw_only=True, extra="x")

# Unpacking at call site
args_list = ["hello", 1.0, 2.0]
kwargs_dict = {"kw_only": True, "extra": "x"}
func(1, *args_list, **kwargs_dict)  # equivalent to the call above

Closures

def make_counter(start: int = 0):
    count = start          # captured in closure

    def increment(step: int = 1) -> int:
        nonlocal count     # declare intent to rebind (not just read)
        count += step
        return count

    return increment

counter = make_counter(10)
print(counter())    # 11
print(counter(5))   # 16

Decorators

import functools
import time
from typing import Callable, TypeVar, ParamSpec

P = ParamSpec("P")
R = TypeVar("R")

# Basic decorator
def timer(func: Callable[P, R]) -> Callable[P, R]:
    @functools.wraps(func)  # preserves __name__, __doc__, etc.
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

# Decorator factory (takes arguments)
def retry(times: int = 3, exceptions: tuple = (Exception,)):
    def decorator(func: Callable[P, R]) -> Callable[P, R]:
        @functools.wraps(func)
        def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
            for attempt in range(times):
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    if attempt == times - 1:
                        raise
                    print(f"Attempt {attempt+1} failed: {e}")
        return wrapper
    return decorator

@retry(times=3, exceptions=(IOError,))
@timer
def fetch(url: str) -> str:
    # ...
    return ""

# Class-based decorator
class Memoize:
    def __init__(self, func: Callable):
        functools.update_wrapper(self, func)
        self.func = func
        self.cache: dict = {}

    def __call__(self, *args):
        if args not in self.cache:
            self.cache[args] = self.func(*args)
        return self.cache[args]

@Memoize
def fib(n: int) -> int:
    return n if n <= 1 else fib(n-1) + fib(n-2)

functools

import functools

# lru_cache — memoization with LRU eviction
@functools.lru_cache(maxsize=128)
def expensive(n: int) -> int:
    return n ** 2

# cache (3.9+) — unbounded, simpler
@functools.cache
def fib(n: int) -> int:
    return n if n <= 1 else fib(n-1) + fib(n-2)

# partial — freeze some arguments
from functools import partial
def power(base, exp): return base ** exp
square = partial(power, exp=2)
cube   = partial(power, exp=3)
print(square(5))   # 25

# reduce
from functools import reduce
product = reduce(lambda acc, x: acc * x, [1, 2, 3, 4, 5])  # 120

Challenge

Write a Timing Decorator

Write a decorator @timer that prints how long a function takes to execute. Use time.perf_counter().

import time
import functools

def timer(func):
    # YOUR CODE HERE
    pass


@timer
def slow_add(a, b):
    time.sleep(0.1)
    return a + b

result = slow_add(2, 3)
print(f"Result: {result}")

Use @functools.wraps(func) on the wrapper. Capture start before calling func(*args, **kwargs), then compute elapsed time after.

def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

5. Classes & OOP

Class Definition

class Animal:
    # Class variable — shared across all instances
    kingdom: str = "Animalia"

    def __init__(self, name: str, sound: str) -> None:
        self.name = name       # instance variable
        self._sound = sound    # convention: "private"
        self.__mangled = True  # name-mangled to _Animal__mangled

    @property
    def sound(self) -> str:
        return self._sound

    @sound.setter
    def sound(self, value: str) -> None:
        if not value:
            raise ValueError("sound cannot be empty")
        self._sound = value

    @classmethod
    def from_dict(cls, data: dict) -> "Animal":
        # Alternative constructor
        return cls(data["name"], data["sound"])

    @staticmethod
    def validate_name(name: str) -> bool:
        return bool(name and name.isalpha())

    def __repr__(self) -> str:
        return f"Animal({self.name!r}, {self._sound!r})"

Inheritance & MRO

class Dog(Animal):
    def __init__(self, name: str, breed: str) -> None:
        super().__init__(name, "woof")  # call parent __init__
        self.breed = breed

    def speak(self) -> str:
        return f"{self.name} says {self.sound}!"

# Multiple inheritance — MRO resolved by C3 linearization
class A:
    def hello(self): return "A"

class B(A):
    def hello(self): return "B -> " + super().hello()

class C(A):
    def hello(self): return "C -> " + super().hello()

class D(B, C):
    def hello(self): return "D -> " + super().hello()

print(D().hello())        # D -> B -> C -> A
print(D.__mro__)          # (D, B, C, A, object)

Abstract Base Classes

from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self) -> float: ...

    @abstractmethod
    def perimeter(self) -> float: ...

    def describe(self) -> str:   # concrete method
        return f"Area={self.area():.2f}, Perimeter={self.perimeter():.2f}"

class Rectangle(Shape):
    def __init__(self, w: float, h: float):
        self.w, self.h = w, h

    def area(self) -> float:
        return self.w * self.h

    def perimeter(self) -> float:
        return 2 * (self.w + self.h)

# Shape()  # TypeError: Can't instantiate abstract class
r = Rectangle(3, 4)
print(r.describe())

__slots__

class Point:
    __slots__ = ("x", "y")  # disables __dict__, saves memory

    def __init__(self, x: float, y: float):
        self.x = x
        self.y = y

# ~40% less memory per instance, slightly faster attribute access
# Cannot add arbitrary attributes: p.z = 1 raises AttributeError

# slots + dataclass (3.10+)
from dataclasses import dataclass

@dataclass(slots=True)
class FastPoint:
    x: float
    y: float

6. Pattern Matching (3.10+)

Not just a switch statement
Python's match/case is structural pattern matching — it deconstructs data rather than just comparing values. It's closer to Rust or Haskell than to C's switch.

from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def classify(value):
    match value:
        # Literal patterns
        case 0 | 0.0:
            return "zero"

        # Capture pattern
        case int(n) if n > 0:      # with guard clause
            return f"positive int: {n}"

        # Sequence patterns
        case [x, y]:
            return f"two-element list: {x}, {y}"

        case [first, *rest]:
            return f"head={first}, tail={rest}"

        # Mapping patterns
        case {"action": "move", "x": x, "y": y}:
            return f"move to ({x}, {y})"

        case {"action": action, **rest}:
            return f"action={action}, extra={rest}"

        # Class patterns
        case Point(x=0, y=0):
            return "origin"

        case Point(x=x, y=0):
            return f"on x-axis at {x}"

        case Point(x=x, y=y):
            return f"point at ({x}, {y})"

        # Wildcard
        case _:
            return "no match"

# Real-world: command parsing
Command = tuple[str, ...]

def handle(cmd: Command) -> str:
    match cmd:
        case ("quit",):
            return "goodbye"
        case ("go", direction) if direction in ("north","south","east","west"):
            return f"going {direction}"
        case ("get", item):
            return f"picking up {item}"
        case _:
            return f"unknown command: {cmd}"

print(handle(("go", "north")))   # going north
print(handle(("get", "sword")))  # picking up sword

7. Iterators & Generators

Iterator Protocol

class Range:
    """Custom range-like iterator."""
    def __init__(self, start: int, stop: int, step: int = 1):
        self.current = start
        self.stop = stop
        self.step = step

    def __iter__(self):
        return self     # iterator returns itself

    def __next__(self):
        if self.current >= self.stop:
            raise StopIteration
        value = self.current
        self.current += self.step
        return value

for n in Range(0, 5):
    print(n)    # 0 1 2 3 4

# Separate iterable and iterator
class NumberRange:
    """Iterable — can create multiple independent iterators."""
    def __init__(self, n: int):
        self.n = n

    def __iter__(self):
        return iter(range(self.n))  # fresh iterator each time

Generators

from typing import Generator, Iterator

# Generator function — lazy, produces values on demand
def fibonacci() -> Generator[int, None, None]:
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

gen = fibonacci()
print([next(gen) for _ in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]

# Generator expression — lazy list comprehension
squares = (x**2 for x in range(1_000_000))  # no memory allocated yet
total = sum(squares)                          # consumed lazily

# yield from — delegate to sub-generator
def chain(*iterables):
    for it in iterables:
        yield from it

list(chain([1, 2], [3, 4], [5]))  # [1, 2, 3, 4, 5]

# Two-way communication with generators
def accumulator() -> Generator[float, float, str]:
    total = 0.0
    while True:
        value = yield total     # yield sends current total, receives next value
        if value is None:
            return f"final: {total}"
        total += value

acc = accumulator()
next(acc)          # prime the generator (advance to first yield)
acc.send(10)       # 10.0
acc.send(20)       # 30.0

itertools

import itertools as it

# Chaining
list(it.chain([1,2], [3,4], [5]))         # [1, 2, 3, 4, 5]
list(it.chain.from_iterable([[1,2],[3]])) # [1, 2, 3]

# Slicing lazy iterables
list(it.islice(fibonacci(), 10))          # first 10 Fibonacci numbers

# Grouping (input must be sorted by key)
data = [("a", 1), ("a", 2), ("b", 3)]
for key, group in it.groupby(data, key=lambda x: x[0]):
    print(key, list(group))

# Combinatorics
list(it.product("AB", repeat=2))         # AA AB BA BB
list(it.combinations("ABCD", 2))         # AB AC AD BC BD CD
list(it.permutations("ABC", 2))          # 6 permutations

# Padding shorter iterables
list(it.zip_longest([1,2,3], [4,5], fillvalue=0))  # [(1,4),(2,5),(3,0)]

# starmap — unpack tuples as args
list(it.starmap(pow, [(2,3),(3,2),(4,2)]))   # [8, 9, 16]

# takewhile / dropwhile
list(it.takewhile(lambda x: x < 5, [1,2,3,5,4]))  # [1, 2, 3]
list(it.dropwhile(lambda x: x < 5, [1,2,3,5,4]))  # [5, 4]

# cycle and repeat
it.cycle("AB")        # A B A B A B ... (infinite)
it.repeat(42, times=3)  # 42 42 42

Challenge

Fibonacci Generator

Write a generator function fib() that yields the infinite Fibonacci sequence: 0, 1, 1, 2, 3, 5, 8, ...

def fib():
    # YOUR CODE HERE
    pass


# Test — should print: 0 1 1 2 3 5 8 13 21 34
from itertools import islice
print(*islice(fib(), 10))

Use two variables a, b = 0, 1 and a while True loop. yield a, then update with a, b = b, a + b.

def fib():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

8. Async / Await

Single-threaded concurrency
asyncio uses a single thread with an event loop. It achieves concurrency for I/O-bound work by suspending coroutines while waiting. It does NOT parallelize CPU-bound work.
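The suspension-based concurrency is easy to verify with a timing sketch — three 0.1 s sleeps finish in roughly 0.1 s total, not 0.3 s (exact timings will vary slightly):

```python
import asyncio
import time

async def io_task() -> None:
    await asyncio.sleep(0.1)   # suspends this coroutine; the loop runs the others

async def main() -> None:
    start = time.perf_counter()
    await asyncio.gather(io_task(), io_task(), io_task())
    print(f"elapsed: {time.perf_counter() - start:.2f}s")  # ~0.1s, not 0.3s

asyncio.run(main())
```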

Coroutines & Tasks

import asyncio

async def fetch_url(url: str) -> str:
    await asyncio.sleep(0.1)  # simulate I/O
    return f"content of {url}"

# Run a single coroutine
result = asyncio.run(fetch_url("https://example.com"))

# Run concurrently — gather
async def main():
    urls = ["https://a.com", "https://b.com", "https://c.com"]

    # gather — all run concurrently, returns list of results
    results = await asyncio.gather(*[fetch_url(u) for u in urls])

    # create_task — schedule a coroutine to run in the background; await it later
    task = asyncio.create_task(fetch_url("https://d.com"))
    # ... do other work ...
    result = await task

    return results

asyncio.run(main())

Async Context Managers & Generators

import asyncio
from contextlib import asynccontextmanager

class AsyncDB:
    async def __aenter__(self):
        print("connect")
        return self

    async def __aexit__(self, exc_type, exc, tb):
        print("disconnect")
        return False  # don't suppress exceptions

    async def query(self, sql: str) -> list:
        await asyncio.sleep(0.01)
        return []

async def use_db():
    async with AsyncDB() as db:
        rows = await db.query("SELECT 1")

# asynccontextmanager helper
@asynccontextmanager
async def managed_resource(name: str):
    print(f"acquiring {name}")
    try:
        yield name
    finally:
        print(f"releasing {name}")

# Async generator
async def ticker(delay: float, count: int):
    for i in range(count):
        await asyncio.sleep(delay)
        yield i

async def consume():
    async for tick in ticker(0.1, 5):
        print(tick)

Concurrency Limiting with Semaphore

import asyncio
import aiohttp  # pip install aiohttp

async def fetch(session: aiohttp.ClientSession, url: str, sem: asyncio.Semaphore) -> str:
    async with sem:    # at most N concurrent requests
        async with session.get(url) as resp:
            return await resp.text()

async def crawl(urls: list[str], max_concurrent: int = 10) -> list[str | BaseException]:
    sem = asyncio.Semaphore(max_concurrent)
    async with aiohttp.ClientSession() as session:
        tasks = [fetch(session, url, sem) for url in urls]
        return await asyncio.gather(*tasks, return_exceptions=True)

TaskGroup (3.11+) & ExceptionGroup

import asyncio

async def main():
    # TaskGroup — structured concurrency, cancels all on first failure
    async with asyncio.TaskGroup() as tg:
        task1 = tg.create_task(fetch_url("https://a.com"))
        task2 = tg.create_task(fetch_url("https://b.com"))
    # Both tasks are done here; any exception is re-raised

    # ExceptionGroup handling (3.11+)
    try:
        async with asyncio.TaskGroup() as tg:
            tg.create_task(bad_task())
            tg.create_task(good_task())
    except* ValueError as eg:         # except* handles ExceptionGroup
        for exc in eg.exceptions:
            print(f"ValueError: {exc}")
    except* IOError as eg:
        for exc in eg.exceptions:
            print(f"IOError: {exc}")

9. Concurrency

The GIL (Global Interpreter Lock)
CPython's GIL ensures only one thread executes Python bytecode at a time. Threading is effective for I/O-bound tasks (GIL released during I/O). For CPU-bound tasks, use multiprocessing or external libraries (NumPy releases GIL internally). Python 3.13 introduces an experimental "free-threaded" mode that removes the GIL.
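The I/O-bound claim is easy to observe: blocking calls like `time.sleep` release the GIL, so threads wait in parallel even though only one executes bytecode at a time. A minimal sketch (real code would block on sockets or files; timing bounds are approximate):

```python
import threading
import time

def blocking_io() -> None:
    time.sleep(0.2)   # like a socket read: the GIL is released while blocked

start = time.perf_counter()
threads = [threading.Thread(target=blocking_io) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start
print(f"4 waits of 0.2s took {elapsed:.2f}s")  # ~0.2s total — the waits overlap
```

Replace `time.sleep` with a CPU-bound loop and the speedup disappears: the GIL serializes pure-Python computation.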

threading

import threading
import time

# Thread — basic usage
def worker(name: str, results: list, lock: threading.Lock) -> None:
    time.sleep(0.1)
    with lock:
        results.append(f"done: {name}")

lock = threading.Lock()
results: list[str] = []
threads = [threading.Thread(target=worker, args=(f"t{i}", results, lock))
           for i in range(5)]

for t in threads:
    t.start()
for t in threads:
    t.join()    # wait for all to finish

print(results)

# Event — thread signaling
ready = threading.Event()

def producer():
    time.sleep(0.5)
    ready.set()    # signal

def consumer():
    ready.wait()   # block until set
    print("producer is ready")

# Condition — more complex signaling
cond = threading.Condition()
buffer: list = []

def produce():
    with cond:
        buffer.append(42)
        cond.notify_all()

def consume_item():
    with cond:
        cond.wait_for(lambda: len(buffer) > 0)
        return buffer.pop()

multiprocessing

from multiprocessing import Process, Pool, Manager, Queue
import os

def cpu_task(n: int) -> int:
    return sum(i * i for i in range(n))

# Pool — parallel map across worker processes
if __name__ == "__main__":  # Required guard on Windows/macOS
    with Pool(processes=4) as pool:
        results = pool.map(cpu_task, [1_000_000] * 8)
        # async variant
        async_result = pool.map_async(cpu_task, [500_000] * 4)
        results2 = async_result.get(timeout=30)

    # Shared state via Manager (slow — uses IPC)
    with Manager() as mgr:
        shared_list = mgr.list()
        shared_dict = mgr.dict()

    # Shared memory (3.8+) — fast, zero-copy
    from multiprocessing import shared_memory
    shm = shared_memory.SharedMemory(create=True, size=1024)
    shm.close()
    shm.unlink()

concurrent.futures

from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor, as_completed

# ThreadPoolExecutor — I/O-bound
def download(url: str) -> str:
    import urllib.request
    with urllib.request.urlopen(url) as r:
        return r.read().decode()

urls = ["https://httpbin.org/delay/1"] * 5

with ThreadPoolExecutor(max_workers=5) as executor:
    # map — blocking, ordered results
    results = list(executor.map(download, urls, timeout=10))

    # submit — returns Future objects
    futures = {executor.submit(download, url): url for url in urls}
    for future in as_completed(futures):
        url = futures[future]
        try:
            data = future.result()
        except Exception as e:
            print(f"{url} failed: {e}")

# ProcessPoolExecutor — CPU-bound
def heavy(n: int) -> int:
    return sum(range(n))

with ProcessPoolExecutor() as executor:
    results = list(executor.map(heavy, [10**6] * 4))

| Approach | Best For | GIL? | Memory | Complexity |
|---|---|---|---|---|
| threading | I/O-bound (network, disk) | Shared | Shared | Low |
| multiprocessing | CPU-bound computation | Separate per process | Isolated | Medium |
| asyncio | Many concurrent I/O ops | Single thread | Shared | Medium |
| concurrent.futures | Both (wrapper API) | Depends on executor | Depends | Low |

10. Error Handling

try / except / else / finally

import json

def parse_config(path: str) -> dict:
    try:
        with open(path) as f:
            data = json.load(f)
    except FileNotFoundError:
        raise FileNotFoundError(f"Config not found: {path}") from None
    except json.JSONDecodeError as e:
        raise ValueError(f"Invalid JSON in {path}: {e}") from e
    except (PermissionError, OSError) as e:
        # catch multiple exceptions
        raise RuntimeError(f"Cannot read {path}") from e
    else:
        # runs only if no exception was raised
        return data
    finally:
        # ALWAYS runs — use for cleanup
        print("parse_config completed")

Custom Exceptions

class AppError(Exception):
    """Base exception for this application."""
    pass

class ValidationError(AppError):
    def __init__(self, field: str, message: str) -> None:
        self.field = field
        self.message = message
        super().__init__(f"{field}: {message}")

class NotFoundError(AppError):
    def __init__(self, resource: str, id: int) -> None:
        self.resource = resource
        self.id = id
        super().__init__(f"{resource} with id={id} not found")

# Exception chaining
try:
    result = int("bad")
except ValueError as e:
    raise ValidationError("age", "must be a number") from e
    # `from e` preserves original traceback chain

# Suppress: raise from None to hide original
try:
    risky()
except SomeError:
    raise CleanError("user-friendly message") from None
Challenge

Custom Exception Hierarchy

Create a ValidationError that inherits from a base AppError. It should accept a field name and message.

# YOUR CODE HERE — define AppError and ValidationError


# Tests
try:
    raise ValidationError("email", "must contain @")
except AppError as e:
    print(f"Caught: {e}")
    print(f"Field: {e.field}")
    print(f"Is ValidationError: {isinstance(e, ValidationError)}")

Define class AppError(Exception): pass, then class ValidationError(AppError) with an __init__ that stores field and calls super().__init__(message).

class AppError(Exception):
    pass

class ValidationError(AppError):
    def __init__(self, field, message):
        self.field = field
        super().__init__(f"{field}: {message}")

Context Managers

from contextlib import contextmanager, suppress, closing, ExitStack
import contextlib

# contextmanager decorator — simplest way
@contextmanager
def temp_directory():
    import tempfile, shutil
    path = tempfile.mkdtemp()
    try:
        yield path
    finally:
        shutil.rmtree(path, ignore_errors=True)

with temp_directory() as tmpdir:
    print(f"working in {tmpdir}")

# suppress — swallow specific exceptions
with suppress(FileNotFoundError):
    import os
    os.remove("maybe_missing.txt")

# ExitStack — dynamic number of context managers
with ExitStack() as stack:
    files = [stack.enter_context(open(f)) for f in ["a.txt", "b.txt"]]
    # all files closed on exit, even if one fails

# Exception groups (3.11+)
try:
    raise ExceptionGroup("multiple failures", [
        ValueError("bad value"),
        TypeError("bad type"),
        KeyError("missing key"),
    ])
except* ValueError as eg:
    for exc in eg.exceptions:
        print(f"Value error: {exc}")
except* (TypeError, KeyError) as eg:
    for exc in eg.exceptions:
        print(f"Type/Key error: {exc}")
Predict

Context Manager Call Order

What order do the print statements execute?

class Ctx:
    def __init__(self, name):
        self.name = name
        print(f"{name}: init")
    def __enter__(self):
        print(f"{self.name}: enter")
        return self
    def __exit__(self, *args):
        print(f"{self.name}: exit")

with Ctx("A") as a, Ctx("B") as b:
    print("body")

Output order: A: init, A: enter, B: init, B: enter, body, B: exit, A: exit. Context managers are entered left-to-right but exited in reverse (LIFO), like nested with blocks.
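The same LIFO rule governs `contextlib.ExitStack`, which makes the order easy to verify; a small sketch using `callback` registrations in place of full context managers:

```python
from contextlib import ExitStack

order: list[str] = []
with ExitStack() as stack:
    for name in "AB":
        # callbacks registered first run last (LIFO), mirroring nested `with`
        stack.callback(order.append, f"{name}: exit")
    order.append("body")

print(order)  # ['body', 'B: exit', 'A: exit']
```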

11. Modules & Packages

Import System

# Absolute imports (preferred)
import os
import os.path
from pathlib import Path
from collections import defaultdict

# Relative imports (within a package)
from . import sibling_module
from .sibling_module import SomeClass
from ..parent_module import util

# Aliasing
import numpy as np
from typing import Optional as Opt

# Conditional import
try:
    import ujson as json
except ImportError:
    import json

# Lazy import (for startup speed)
def get_numpy():
    import numpy as np  # only imported when called
    return np

Package Structure

mypackage/
    __init__.py          # makes directory a package; can be empty
    __main__.py          # run with: python -m mypackage
    core.py
    utils/
        __init__.py
        helpers.py
    _internal/           # convention: private subpackage
        __init__.py

# mypackage/__init__.py
# Control what `from mypackage import *` exposes
__all__ = ["MyClass", "main_function"]

# Re-export for cleaner public API
from .core import MyClass, main_function
from .utils.helpers import helper_fn

# __version__ convention
__version__ = "1.2.3"

importlib & Dynamic Imports

import importlib
import sys

# Dynamic import by name
module = importlib.import_module("os.path")
func = getattr(module, "join")

# Reload a module (useful in REPL development)
importlib.reload(module)

# Check if module is available without importing
from importlib.util import find_spec
if find_spec("ujson") is not None:
    print("ujson is available")

# sys.path manipulation (usually avoid — prefer venv)
sys.path.insert(0, "/path/to/extra/modules")

12. File I/O & Data

pathlib.Path

from pathlib import Path

# Construction
p = Path(".")                             # current directory
p = Path("/home/user/docs")
p = Path.home() / "docs" / "report.txt"   # / operator joins paths

# Properties
p.name        # "report.txt"
p.stem        # "report"
p.suffix      # ".txt"
p.parent      # Path("/home/user/docs")
p.parts       # ("/", "home", "user", "docs", "report.txt")

# Existence checks
p.exists()
p.is_file()
p.is_dir()

# Reading / writing
text = p.read_text(encoding="utf-8")
p.write_text("hello", encoding="utf-8")
data = p.read_bytes()
p.write_bytes(b"binary data")

# Directory operations
p.parent.mkdir(parents=True, exist_ok=True)  # mkdir -p
for child in p.parent.iterdir():
    print(child)

# Glob
py_files = list(Path(".").glob("**/*.py"))   # recursive
txt_files = list(Path(".").glob("*.txt"))

# Rename / delete
p.rename(p.with_suffix(".bak"))
p.unlink(missing_ok=True)
p.parent.rmdir()                             # must be empty
import shutil
shutil.rmtree(p.parent)                      # recursive delete

JSON, CSV, Pickle

import json
import csv
import pickle
from pathlib import Path

# JSON
data = {"name": "Alice", "scores": [95, 87, 92]}
json_str = json.dumps(data, indent=2, ensure_ascii=False)
parsed  = json.loads(json_str)

Path("data.json").write_text(json.dumps(data), encoding="utf-8")
with open("data.json") as f:
    loaded = json.load(f)

# CSV
rows = [{"name": "Alice", "age": 30}, {"name": "Bob", "age": 25}]
with open("data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "age"])
    writer.writeheader()
    writer.writerows(rows)

with open("data.csv") as f:
    reader = csv.DictReader(f)
    for row in reader:
        print(row["name"])

# Pickle — Python-specific binary serialization
obj = {"complex": [1, 2], "data": set([3, 4])}
with open("data.pkl", "wb") as f:
    pickle.dump(obj, f, protocol=pickle.HIGHEST_PROTOCOL)

with open("data.pkl", "rb") as f:
    loaded = pickle.load(f)
Never unpickle untrusted data
pickle.load() can execute arbitrary code. Only load pickled data from trusted sources you control.
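To see why the warning is serious: any object's `__reduce__` method tells pickle what callable to invoke on load, so a hostile payload chooses the code that runs. A harmless demonstration, using `eval` on arithmetic as a stand-in for arbitrary code:

```python
import pickle

class Harmless:
    def __reduce__(self):
        # tells pickle: "to rebuild me, call eval('6 * 7')"
        return (eval, ("6 * 7",))

payload = pickle.dumps(Harmless())
result = pickle.loads(payload)
print(result)  # 42 — eval ran during load; you don't even get a Harmless back
```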

tempfile & shutil

import tempfile
import shutil

# Temporary file — delete=False keeps it after the with block (clean up yourself)
with tempfile.NamedTemporaryFile(suffix=".json", mode="w", delete=False) as f:
    json.dump(data, f)
    temp_path = f.name

# Temporary directory
with tempfile.TemporaryDirectory() as tmpdir:
    work_path = Path(tmpdir) / "output.txt"
    work_path.write_text("processed")
    # tmpdir deleted on exit

# shutil operations
shutil.copy("src.txt", "dst.txt")
shutil.copy2("src.txt", "dst.txt")    # preserves metadata
shutil.copytree("src_dir", "dst_dir", dirs_exist_ok=True)
shutil.move("old_path", "new_path")
shutil.disk_usage("/")               # total, used, free in bytes
shutil.make_archive("archive", "zip", root_dir="my_folder")

13. Testing

pytest

# test_math.py
import pytest

def add(a: int, b: int) -> int:
    return a + b

# Basic test
def test_add():
    assert add(2, 3) == 5

# Parametrize — run one test with multiple inputs
@pytest.mark.parametrize("a,b,expected", [
    (1, 2, 3),
    (0, 0, 0),
    (-1, 1, 0),
    (100, 200, 300),
])
def test_add_parametrized(a: int, b: int, expected: int):
    assert add(a, b) == expected

# Expected exceptions
def test_division_by_zero():
    with pytest.raises(ZeroDivisionError, match="division by zero"):
        1 / 0

# Skip / xfail
@pytest.mark.skip(reason="not implemented yet")
def test_future(): ...

@pytest.mark.xfail(reason="known bug #123")
def test_known_broken():
    assert False

Fixtures & conftest.py

# conftest.py — shared fixtures across test files
import pytest
from pathlib import Path

@pytest.fixture
def sample_data() -> dict:
    return {"name": "Alice", "age": 30}

@pytest.fixture(scope="session")   # created once per test session
def db_connection():
    conn = create_test_db()
    yield conn           # setup done; test runs; then teardown
    conn.close()

@pytest.fixture
def temp_config(tmp_path: Path) -> Path:
    # tmp_path is a built-in pytest fixture (unique per test)
    config = tmp_path / "config.json"
    config.write_text('{"debug": true}')
    return config

# test_features.py
def test_uses_fixture(sample_data: dict):
    assert sample_data["name"] == "Alice"

def test_with_file(temp_config: Path):
    import json
    data = json.loads(temp_config.read_text())
    assert data["debug"] is True

Mocking

from unittest.mock import patch, MagicMock, AsyncMock, call
import pytest

# Patch a dependency in the module under test
def get_user_name(user_id: int) -> str:
    import requests
    r = requests.get(f"https://api.example.com/users/{user_id}")
    return r.json()["name"]

def test_get_user_name():
    mock_response = MagicMock()
    mock_response.json.return_value = {"name": "Alice"}

    with patch("requests.get", return_value=mock_response) as mock_get:
        result = get_user_name(42)
        assert result == "Alice"
        mock_get.assert_called_once_with("https://api.example.com/users/42")

# Patch as decorator
@patch("os.path.exists", return_value=True)
def test_file_exists(mock_exists):
    import os
    assert os.path.exists("/fake/path")

# AsyncMock for async functions
async def fetch_data(session, url: str) -> dict:
    response = await session.get(url)
    return await response.json()

@pytest.mark.asyncio
async def test_fetch_data():
    mock_session = AsyncMock()
    mock_response = AsyncMock()
    mock_response.json.return_value = {"key": "value"}
    mock_session.get.return_value = mock_response

    result = await fetch_data(mock_session, "https://api.example.com")
    assert result == {"key": "value"}

# monkeypatch (pytest built-in)
def test_env_var(monkeypatch):
    monkeypatch.setenv("API_KEY", "test-key-123")
    monkeypatch.delenv("UNNEEDED_VAR", raising=False)
    monkeypatch.setattr("mymodule.CONSTANT", 42)

Hypothesis (Property-Based Testing)

import pytest
from hypothesis import given, strategies as st, assume

def sort_and_deduplicate(items: list[int]) -> list[int]:
    return sorted(set(items))

@given(st.lists(st.integers()))
def test_output_is_sorted(items: list[int]):
    result = sort_and_deduplicate(items)
    assert result == sorted(result)

@given(st.lists(st.integers()))
def test_output_has_no_duplicates(items: list[int]):
    result = sort_and_deduplicate(items)
    assert len(result) == len(set(result))

@given(st.text(), st.text())
def test_concatenation_length(a: str, b: str):
    assert len(a + b) == len(a) + len(b)

# assume() — skip inputs that don't meet precondition
@given(st.integers(), st.integers())
def test_division(a: int, b: int):
    assume(b != 0)
    assert (a / b) * b == pytest.approx(a, rel=1e-9)

14. Package Management

pyproject.toml (PEP 621)

[build-system]
requires = ["setuptools>=68", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "mypackage"
version = "0.1.0"
description = "A short description"
readme = "README.md"
requires-python = ">=3.10"
license = { text = "MIT" }
authors = [{ name = "Alice", email = "[email protected]" }]

dependencies = [
    "httpx>=0.27",
    "pydantic>=2.0",
]

[project.optional-dependencies]
dev = ["pytest>=8", "mypy", "ruff"]
docs = ["mkdocs", "mkdocs-material"]

[project.scripts]
mycli = "mypackage.cli:main"   # entry point

[tool.pytest.ini_options]
testpaths = ["tests"]
asyncio_mode = "auto"

[tool.mypy]
strict = true
python_version = "3.11"

[tool.ruff]
line-length = 100

[tool.ruff.lint]
select = ["E", "F", "I", "N", "UP"]

uv — Fast Package Manager

# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create project + venv
uv init myproject
uv venv                        # creates .venv/
source .venv/bin/activate

# Add dependencies (writes to pyproject.toml + uv.lock)
uv add httpx pydantic
uv add --dev pytest ruff mypy

# Install all deps from lockfile (fast — parallel, cached)
uv sync

# Run without activating venv
uv run python script.py
uv run pytest

# Upgrade all deps
uv lock --upgrade
uv sync

# pip-compatible
uv pip install -r requirements.txt
uv pip freeze > requirements.txt

| Tool | Speed | Lock File | Virtual Envs | Best For |
|---|---|---|---|---|
| pip + venv | Slow | requirements.txt | Manual | Simple projects |
| uv | Very fast (Rust) | uv.lock | Automatic | Modern projects |
| poetry | Medium | poetry.lock | Automatic | Library publishing |
| conda | Slow | environment.yml | Automatic | Data science, non-Python deps |

15. Modern Features

f-strings (3.6+, enhanced in 3.12)

name = "Alice"
pi = 3.14159

# Basic
f"Hello, {name}!"

# Expressions
f"2 + 2 = {2 + 2}"
f"Upper: {name.upper()}"

# Format spec
f"Pi = {pi:.2f}"                 # Pi = 3.14
f"Hex: {255:#010x}"             # Hex: 0x000000ff
f"Thousands: {1_000_000:,}"     # Thousands: 1,000,000
f"Padded: {name:>10}"           # right-align in 10 chars

# Debug (3.8+) — prints name=value
x = 42
f"{x=}"                          # 'x=42'
f"{x=:.2f}"                      # 'x=42.00'

# !r, !s, !a conversions
f"{name!r}"                      # "'Alice'" (repr)
f"{name!s}"                      # "Alice"  (str)

# Nested quotes (3.12+ — the outer quote can be reused inside the f-string)
items = ["a", "b"]
f"Items: {', '.join(items)}"

Walrus Operator :=

import re

# Assign and test in one expression
data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

# Without walrus
chunk = data[:3]
if chunk:
    process(chunk)

# With walrus
if chunk := data[:3]:
    process(chunk)

# In while loop — common idiom
import sys
while line := sys.stdin.readline():
    process(line.strip())

# In comprehensions
values = [y for x in data if (y := x * 2) > 5]

# Regex match
text = "Hello, World! 2024"
if m := re.search(r"\d+", text):
    print(m.group())   # 2024

Python 3.11 Features

import json
import tomllib  # built-in TOML parser (3.11+)
from enum import StrEnum  # (3.11+)

# tomllib
with open("pyproject.toml", "rb") as f:
    config = tomllib.load(f)  # read-only, binary mode required

# StrEnum — members are also strings
class Color(StrEnum):
    RED = "red"
    GREEN = "green"
    BLUE = "blue"

print(Color.RED == "red")       # True
print(f"Color: {Color.GREEN}")  # Color: green
json.dumps({"color": Color.RED})  # works without custom encoder

# ExceptionGroup / except* (3.11+)
def multi_error():
    raise ExceptionGroup("errors", [
        ValueError("v1"),
        ValueError("v2"),
        TypeError("t1"),
    ])

try:
    multi_error()
except* ValueError as eg:
    print(f"Got {len(eg.exceptions)} ValueErrors")
except* TypeError as eg:
    print(f"Got {len(eg.exceptions)} TypeErrors")

Python 3.12 Features

# type statement — type aliases with proper generics
type Vector = list[float]
type Matrix[T] = list[list[T]]

# Generic functions and classes with new syntax
def first[T](items: list[T]) -> T:
    return items[0]

class Stack[T]:
    def __init__(self) -> None:
        self._items: list[T] = []
    def push(self, item: T) -> None:
        self._items.append(item)
    def pop(self) -> T:
        return self._items.pop()

# f-string improvements — quotes inside
name = "World"
f"{'Hello'!r}, {name}"     # no escaping needed
f"{', '.join(['a','b'])}"  # nested quotes work

# Better error messages
# Python 3.12 pinpoints the exact token in SyntaxError
# AttributeError suggests similar names
# NameError suggests similar names in scope

16. Common Pitfalls

Mutable Default Arguments

# WRONG — the list is created ONCE when def is evaluated
def append_to(item, lst=[]):
    lst.append(item)
    return lst

append_to(1)  # [1]
append_to(2)  # [1, 2]  — not [2]! Same list reused!

# RIGHT — use None as sentinel
def append_to(item, lst=None):
    if lst is None:
        lst = []
    lst.append(item)
    return lst

# Same issue with dicts, sets, custom objects
Predict

The Mutable Default Trap

This is Python's most infamous gotcha. What does the third call print?

def append_to(item, lst=[]):
    lst.append(item)
    return lst

print(append_to(1))
print(append_to(2))
print(append_to(3))

All three calls share the same list: [1], [1, 2], [1, 2, 3]. The default [] is created once at function definition time, not per call. Fix: use None as default and create a new list inside the function.

Late Binding Closures

# WRONG — i is looked up at call time, not creation time
fns = [lambda: i for i in range(5)]
print([f() for f in fns])  # [4, 4, 4, 4, 4] — all use final i

# RIGHT — capture current value with default argument
fns = [lambda i=i: i for i in range(5)]
print([f() for f in fns])  # [0, 1, 2, 3, 4]

# Or use functools.partial
from functools import partial
def make_fn(i): return lambda: i
fns = [make_fn(i) for i in range(5)]
Predict

Late Binding Closures

Closures capture variables by reference, not value. What does this print?

funcs = [lambda: i for i in range(4)]
print([f() for f in funcs])

[3, 3, 3, 3] — All lambdas close over the same variable i, which is 3 after the loop ends. Fix: lambda i=i: i captures the current value.

Modifying While Iterating

# WRONG — modifying list while iterating over it
items = [1, 2, 3, 4, 5]
for item in items:
    if item % 2 == 0:
        items.remove(item)   # shifts later elements, so the next one is skipped!
print(items)  # [1, 3, 5] — right answer by luck; try [1, 2, 2, 3] → [1, 2, 3]

# RIGHT — iterate over a copy
for item in items[:]:
    if item % 2 == 0:
        items.remove(item)

# BETTER — filter comprehension
items = [x for x in items if x % 2 != 0]

# Dict: cannot modify size during iteration
d = {"a": 1, "b": 2}
# for k in d: del d[k]   # RuntimeError!
for k in list(d.keys()):  # iterate over copy of keys
    del d[k]

Shallow vs Deep Copy

import copy

original = [[1, 2], [3, 4]]

# Assignment — same object
ref = original
ref[0].append(99)
print(original)   # [[1, 2, 99], [3, 4]] — mutated!

# Shallow copy — new outer list, same inner lists
shallow = copy.copy(original)
# or: shallow = original[:]
# or: shallow = list(original)
shallow[0].append(88)
print(original)   # [[1, 2, 99, 88], [3, 4]] — inner list still shared!

# Deep copy — fully independent
deep = copy.deepcopy(original)
deep[0].append(77)
print(original)   # unchanged — no sharing at any level

is vs ==

# WRONG — using `is` for value comparison
a = "hello world"  # not interned (contains space)
b = "hello world"
print(a is b)      # may be True or False — an implementation detail, never rely on it
print(a == b)      # True — correct

# CORRECT uses of `is`
x = None
if x is None: ...      # singleton comparison
if x is not None: ...  # preferred over != None
if flag is True: ...   # rarely needed; usually just `if flag:`

# The trap: CPython caches small ints (-5 to 256) and interns some strings
# This makes `is` appear to work, but it's an implementation detail
x = 256; y = 256
print(x is y)   # True in CPython (small-int cache)
x = 257; y = 257
print(x is y)   # implementation-dependent — often False on separate REPL lines,
                # but may be True when both literals are compiled together

Circular Imports

# Problem: module_a.py imports from module_b.py
#          module_b.py imports from module_a.py
# Python partially executes each, leading to AttributeError

# Solution 1: Move shared code to a third module
# Solution 2: Import inside the function (local import)

# module_a.py
def func_a():
    from .module_b import func_b  # import at call time, not module load
    return func_b()

# Solution 3: Use TYPE_CHECKING guard (for type hints only)
from __future__ import annotations
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from .module_b import ClassB  # only imported by type checkers, not runtime

def process(obj: "ClassB") -> None:  # use string annotation or __future__
    ...

Other Pitfalls

################################
# GIL misconception
# Threading does NOT speed up CPU-bound Python code
# Use multiprocessing or ctypes/C extensions for CPU parallelism
################################

# String concatenation in loop — O(n^2) due to immutability
# WRONG
result = ""
for s in many_strings:
    result += s  # creates new string each time

# RIGHT
result = "".join(many_strings)

################################
# except Exception is too broad — catches ValueError, TypeError, etc. Use specific types
# WRONG
try:
    risky()
except Exception:   # at least leaves SystemExit/KeyboardInterrupt alone, but still hides real bugs
    pass

# RIGHT — catch specific exceptions
try:
    risky()
except (ValueError, IOError):
    pass

# Or if you must catch everything, clean up and always re-raise
try:
    risky()
except BaseException:
    log_failure()   # hypothetical logging/cleanup hook
    raise           # re-raise so KeyboardInterrupt / SystemExit still propagate

################################
# Missing __all__ — unintentionally exports imports
# In mymodule.py:
import os   # now `from mymodule import *` exports os!

# Fix: define __all__
__all__ = ["MyClass", "my_function"]
Quick Pitfall Checklist
  • Default mutable arguments? Use None sentinel.
  • Lambda in loop? Capture with default arg lambda i=i: i.
  • Mutating while iterating? Iterate over list(collection) copy.
  • Shallow copy when deep needed? Use copy.deepcopy().
  • is for value comparison? Use == (except is None).
  • Circular imports? Refactor or use local imports.
  • String concat in loop? Use "".join(...).
  • Missing __all__? Export only what's intentional.