Programming Paradigms¶
Programming paradigms are fundamental approaches to organizing and structuring code. While the Design Patterns chapter (7.2) covers OOP patterns, and the language chapters cover syntax, the paradigms themselves deserve attention as conceptual frameworks that shape how you think about software. Understanding paradigms deeply helps you choose the right approach for each part of your system and write more expressive, maintainable code.
Overview of Paradigms¶
| Paradigm | Core Idea | State Management | Key Languages |
|---|---|---|---|
| Imperative | Step-by-step instructions (how to do it) | Mutable state, explicit state changes | C, Go, Assembly |
| Object-Oriented | Objects encapsulating data + behavior | Mutable state in objects, controlled via methods | Java, C#, Python, Ruby |
| Functional | Pure functions, immutable data, composition | Immutable (or minimized mutation) | Haskell, Erlang, Elixir, Clojure |
| Declarative | Describe what you want, not how | Managed by runtime/engine | SQL, HTML, CSS, Terraform, Prolog |
| Reactive | Asynchronous data streams and propagation of change | Observable streams, derived state | RxJS, Akka Streams, ReactiveX |
| Logic | Define facts and rules, let the engine derive answers | Unification and backtracking | Prolog, Datalog, miniKanren |
| Concurrent/Actor | Independent actors communicating via messages | Isolated state per actor, message-passing | Erlang/OTP, Akka, Elixir |
Most modern languages are multi-paradigm (Python, Rust, JavaScript, Scala, Kotlin, Swift), supporting elements of several paradigms. The skill is knowing when to apply which paradigm.
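To make the contrast concrete, here is one task (summing the squares of the even numbers) sketched in three styles in Python:

```python
numbers = [1, 2, 3, 4, 5, 6]

# Imperative: say HOW, mutating an accumulator step by step
total = 0
for n in numbers:
    if n % 2 == 0:
        total += n * n

# Functional: compose transformations, no mutation
total_fp = sum(map(lambda n: n * n, filter(lambda n: n % 2 == 0, numbers)))

# Declarative (comprehension): say WHAT you want
total_decl = sum(n * n for n in numbers if n % 2 == 0)

print(total, total_fp, total_decl)  # 56 56 56
```

All three compute the same value; the difference is where the control flow lives, in your code or in the language's machinery.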
Reactive programming — streams and propagation of change¶
Reactive programming models data and events as streams that propagate changes to dependents. Subscribers react to values over time instead of polling. A minimal example in Python using an async generator as a simple stream:

```python
import asyncio

async def event_stream(events: list[str]):
    """Simple reactive stream: emit events one by one."""
    for e in events:
        await asyncio.sleep(0.1)  # Simulate async source
        yield e

async def main():
    async for event in event_stream(["open", "click", "close"]):
        print(f"Reactive: {event}")  # Subscriber reacts to each value

asyncio.run(main())
```
In Rust, the same idea appears with Stream and async iteration; libraries like ReactiveX (RxPy, rx-rs) provide full observable/observer APIs with operators (map, filter, merge).
Logic programming — facts, rules, and queries¶
Logic programming describes what is true (facts and rules) and lets the engine derive answers by unification and backtracking. No explicit control flow for search—the engine explores possibilities.
```prolog
% Facts: who is a parent of whom
parent(tom, bob).
parent(tom, liz).
parent(bob, ann).

% Rule: X is an ancestor of Z if X is a parent of Z,
% or X is a parent of Y and Y is an ancestor of Z
ancestor(X, Z) :- parent(X, Z).
ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).

% Query: ?- ancestor(tom, ann).  →  true (tom → bob → ann)
```
Datalog restricts Prolog for use in databases and program analysis; same idea—declare relations and let the engine compute fixed points.
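The fixed-point idea can be sketched in a few lines of Python (a toy bottom-up evaluator, not a real Datalog engine), reusing the parent/ancestor relations above:

```python
# Naive bottom-up evaluation: apply the rules until no new facts
# appear; that fixed point is the full ancestor relation.
parent = {("tom", "bob"), ("tom", "liz"), ("bob", "ann")}

ancestor = set(parent)  # Rule 1: ancestor(X, Z) :- parent(X, Z).
while True:
    # Rule 2: ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).
    derived = {(x, z) for (x, y) in parent for (y2, z) in ancestor if y == y2}
    new = derived - ancestor
    if not new:
        break  # Fixed point: nothing new can be derived
    ancestor |= new

print(("tom", "ann") in ancestor)  # True: tom → bob → ann
```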
Imperative Programming¶
Imperative programming tells the computer how to do something through a sequence of statements that change program state. It's the most natural paradigm for thinking about algorithms—it mirrors how you'd explain a procedure to another person.
Core Concepts¶
- Statements: Instructions executed sequentially (assignment, control flow, I/O)
- Variables: Named mutable state that changes over time
- Control flow: if/else, for, while, switch—explicit branching and looping
- Side effects: Functions that modify external state (write to disk, mutate globals, print output)
```c
// Imperative: step-by-step instructions
int sum_of_squares(int* arr, int n) {
    int sum = 0;                      // Mutable accumulator
    for (int i = 0; i < n; i++) {
        if (arr[i] % 2 == 0) {        // Explicit control flow
            sum += arr[i] * arr[i];   // Mutation
        }
    }
    return sum;
}
```
Strengths: Direct hardware mapping (performance), easy to follow step-by-step, intuitive for algorithms with sequential dependencies, full control over memory and execution.
Weaknesses: Harder to reason about in concurrent contexts (mutable shared state → race conditions), code can become tangled (spaghetti code), scaling complexity is harder to manage than with OOP or FP.
Structured Programming¶
Structured programming is a refinement of imperative programming that eliminates goto in favor of structured control flow (if/else, while, for, functions). Introduced by Dijkstra ("Go To Statement Considered Harmful", 1968), it's the baseline for all modern imperative programming.
The key insight: any program can be written using only three control structures: sequence (one thing after another), selection (if/else), and iteration (while/for). No goto needed.
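A sketch in Python: even a tiny program is built entirely from these three structures.

```python
def count_vowels(text: str) -> int:
    # Sequence: statements execute one after another
    vowels = "aeiou"
    count = 0
    # Iteration: a for/while loop replaces any backward goto
    for ch in text.lower():
        # Selection: if/else replaces any conditional jump
        if ch in vowels:
            count += 1
    return count

print(count_vowels("Structured"))  # 3
```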
Object-Oriented Programming (OOP)¶
OOP organizes code around objects—units that bundle data (state) and behavior (methods). Objects interact by sending messages (calling methods), and complexity is managed through encapsulation, inheritance, and polymorphism.
The Four Pillars¶
1. Encapsulation
Bundle data with the methods that operate on it, and hide internal state behind a public interface. The object controls access to its own data—external code can't put the object into an invalid state.
```python
class BankAccount:
    def __init__(self, owner: str, initial_balance: float = 0):
        self._owner = owner
        self._balance = initial_balance  # Private: external code can't set directly
        self._transactions: list[dict] = []

    @property
    def balance(self) -> float:
        return self._balance  # Read-only access

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("Deposit amount must be positive")
        self._balance += amount
        self._transactions.append({"type": "deposit", "amount": amount})

    def withdraw(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("Withdrawal amount must be positive")
        if amount > self._balance:
            raise ValueError("Insufficient funds")
        self._balance -= amount
        self._transactions.append({"type": "withdrawal", "amount": amount})
```
The invariant (balance >= 0) is enforced by the object—external code cannot set _balance directly and bypass the validation. This is the essence of encapsulation: protecting invariants.
2. Abstraction
Hide complex implementation details behind a simple interface. Users of a class don't need to know how it works internally—only what it does.
```python
from abc import ABC, abstractmethod

class PaymentProcessor(ABC):
    """Abstract interface — callers don't know the implementation."""
    @abstractmethod
    def charge(self, amount: float, currency: str) -> str:
        """Charge the payment method. Returns transaction ID."""
        ...

class StripeProcessor(PaymentProcessor):
    def charge(self, amount: float, currency: str) -> str:
        # Complex Stripe API interaction hidden behind simple interface
        response = stripe.PaymentIntent.create(amount=int(amount * 100), currency=currency)
        return response.id

class PayPalProcessor(PaymentProcessor):
    def charge(self, amount: float, currency: str) -> str:
        # Complex PayPal API interaction hidden behind same interface
        order = paypal.Order.create(amount=amount, currency=currency)
        return order.id
```
3. Inheritance
Create new classes that reuse, extend, or modify behavior of existing classes. The child class (subclass) inherits from the parent class (superclass).
```python
class Animal:
    def __init__(self, name: str):
        self.name = name

    def speak(self) -> str:
        raise NotImplementedError

class Dog(Animal):
    def speak(self) -> str:
        return f"{self.name} says Woof!"

class Cat(Animal):
    def speak(self) -> str:
        return f"{self.name} says Meow!"
```
Inheritance pitfalls: Deep inheritance hierarchies become brittle—a change to a base class can break all subclasses. The "fragile base class problem" and "diamond problem" (multiple inheritance ambiguity) have led to a modern preference for composition over inheritance.
4. Polymorphism
The ability to treat objects of different types through a common interface. The same method call behaves differently depending on the object's actual type.
```python
def make_all_speak(animals: list[Animal]) -> None:
    for animal in animals:
        print(animal.speak())  # Polymorphic: calls Dog.speak() or Cat.speak()

animals = [Dog("Rex"), Cat("Whiskers"), Dog("Buddy")]
make_all_speak(animals)
# Rex says Woof!
# Whiskers says Meow!
# Buddy says Woof!
```
Composition Over Inheritance¶
Modern OOP strongly prefers composition (objects containing other objects) over inheritance (IS-A relationships):
```python
# Inheritance approach (rigid, fragile):
class FlyingAnimal(Animal):
    def fly(self): ...

class SwimmingAnimal(Animal):
    def swim(self): ...

# Problem: what about a duck that can both fly AND swim?
# class Duck(FlyingAnimal, SwimmingAnimal):  ← diamond problem!

# Composition approach (flexible):
class FlyAbility:
    def fly(self):
        print("Flying!")

class SwimAbility:
    def swim(self):
        print("Swimming!")

class Duck:
    def __init__(self):
        self.flying = FlyAbility()     # HAS-A flying ability
        self.swimming = SwimAbility()  # HAS-A swimming ability

    def fly(self):
        self.flying.fly()

    def swim(self):
        self.swimming.swim()
```
SOLID Principles¶
SOLID is a set of five design principles for writing maintainable OOP code:
| Principle | Description | Violation Example |
|---|---|---|
| Single Responsibility | A class should have one reason to change | A User class that handles both user data AND email sending |
| Open/Closed | Open for extension, closed for modification | Adding new payment methods by modifying a giant if/else chain |
| Liskov Substitution | Subtypes must be substitutable for their base types | A Square subclass of Rectangle that breaks when width is set independently |
| Interface Segregation | Clients shouldn't depend on interfaces they don't use | A fat Worker interface with both work() and eat() methods forced on robots |
| Dependency Inversion | Depend on abstractions, not concrete implementations | Business logic directly importing and using MySQLDatabase instead of a Database interface |
```python
# Dependency Inversion example

# BAD: business logic depends on a concrete implementation
class OrderService:
    def __init__(self):
        self.db = MySQLDatabase()  # Tightly coupled to MySQL

# GOOD: business logic depends on an abstraction
class OrderService:
    def __init__(self, repository: OrderRepository):  # Depends on interface
        self._repository = repository

# Now we can inject any implementation:
service = OrderService(PostgresOrderRepository())
service = OrderService(InMemoryOrderRepository())  # For testing
```
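The Liskov Substitution violation from the table can also be made concrete; a minimal sketch of the classic Square/Rectangle problem (class and function names are illustrative):

```python
class Rectangle:
    def __init__(self, width: int, height: int):
        self.width = width
        self.height = height

    def set_width(self, w: int) -> None:
        self.width = w

    def area(self) -> int:
        return self.width * self.height

class Square(Rectangle):
    """A Square IS-A Rectangle mathematically, but not behaviorally."""
    def __init__(self, side: int):
        super().__init__(side, side)

    def set_width(self, w: int) -> None:
        self.width = w
        self.height = w  # Keeps the square invariant, and breaks LSP

def stretch(rect: Rectangle) -> int:
    rect.set_width(10)  # Caller assumes only the width changes
    return rect.area()

print(stretch(Rectangle(2, 5)))  # 50, as expected
print(stretch(Square(5)))        # 100: surprise, the height changed too
```

Because `Square` silently strengthens `set_width`, code written against `Rectangle` behaves differently when handed a `Square`: the subtype is not substitutable.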
Functional Programming (FP)¶
Functional programming treats computation as the evaluation of mathematical functions, emphasizing immutability, pure functions, and composition over stateful objects and side effects. FP has gained significant adoption in mainstream programming because its properties (purity, immutability) make code easier to test, reason about, parallelize, and maintain.
Core Principles¶
1. Pure Functions
A pure function: (a) always returns the same output for the same input (referential transparency), and (b) has no side effects (no I/O, no mutation of external state).
```python
# Pure function — deterministic, no side effects
def add(a: int, b: int) -> int:
    return a + b

add(2, 3)  # Always 5, no matter when or how many times you call it

# Impure function — depends on / modifies external state
total = 0

def add_to_total(value: int) -> int:
    global total
    total += value  # Side effect: mutates external state
    return total

# add_to_total(5) returns different values depending on when it's called!
```
Benefits of purity:

- Testable: No mocking, no setup, no teardown—just assert input → output
- Cacheable: Since output depends only on input, results can be memoized
- Parallelizable: No shared mutable state → no race conditions → safe to run on multiple threads
- Composable: Pure functions combine freely without unexpected interactions
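The cacheable property is easy to demonstrate; a sketch using Python's built-in memoization, which is safe only because the function is pure:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # Safe ONLY because fib is pure: same input, same output
def fib(n: int) -> int:
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(80))  # 23416728348467685, computed instantly; naive recursion would take ages
```

Memoizing an impure function would be a bug: cached results would go stale as the external state it depends on changes.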
2. Immutability
Data, once created, is never modified. Instead, new data structures are created with the desired changes. This eliminates an entire class of bugs (unexpected mutation, race conditions, aliasing issues).
```python
# Mutable approach (imperative) — risky in concurrent code
numbers = [1, 2, 3]
numbers.append(4)  # Modifies the original list — any other reference sees the change

# Immutable approach (functional)
numbers = (1, 2, 3)           # Tuple — immutable
new_numbers = numbers + (4,)  # Creates a new tuple; original unchanged

# Immutable data structures with dataclasses
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen=True makes it immutable
class User:
    name: str
    email: str
    age: int

    def with_email(self, new_email: str) -> 'User':
        """Return a new User with updated email (immutable update)."""
        return User(name=self.name, email=new_email, age=self.age)

alice = User("Alice", "alice@old.com", 30)
updated_alice = alice.with_email("alice@new.com")
# alice is unchanged; updated_alice is a new object
```
```rust
// Rust encourages immutability by default
let numbers = vec![1, 2, 3];     // Immutable — cannot push, pop, or modify
let mut numbers = vec![1, 2, 3]; // Must explicitly opt into mutability
numbers.push(4);                 // Only works with `mut`
```
Persistent data structures (used in Clojure, Scala, Haskell): Data structures that efficiently share structure between versions. When you "update" an immutable tree, the new tree shares most of its nodes with the old tree—only the changed path is copied. This gives O(log n) immutable updates instead of O(n) full copies.
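The structural-sharing idea can be sketched with a cons list in Python (a toy model for illustration, not a production persistent structure):

```python
# A persistent singly linked list: "prepending" builds a new head
# that shares the entire tail with the old list. No copying happens.
from typing import NamedTuple, Optional

class Node(NamedTuple):
    value: int
    rest: Optional["Node"]

base = Node(3, Node(2, Node(1, None)))  # [3, 2, 1]
extended = Node(4, base)                # [4, 3, 2, 1], an O(1) "update"

# Both versions coexist; the tail is shared, not copied:
print(extended.rest is base)  # True: structural sharing
```

Persistent trees generalize this: only the path from root to the changed node is rebuilt, and everything else is shared.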
3. First-Class and Higher-Order Functions
Functions are values—they can be passed as arguments, returned from other functions, and stored in variables.
```python
# Higher-order function: takes a function as argument
def apply_operation(numbers: list[int], operation) -> list[int]:
    return [operation(n) for n in numbers]

double = lambda x: x * 2
square = lambda x: x ** 2

apply_operation([1, 2, 3], double)  # [2, 4, 6]
apply_operation([1, 2, 3], square)  # [1, 4, 9]

# Higher-order function: returns a function
def make_adder(n: int):
    def adder(x: int) -> int:
        return x + n
    return adder

add_five = make_adder(5)
add_five(3)   # 8
add_five(10)  # 15
```
4. Function Composition
Build complex operations by combining simple functions. This is the functional equivalent of piping data through a series of transformations.
```python
from functools import reduce

# Compose: apply functions right to left
def compose(*functions):
    return reduce(lambda f, g: lambda x: f(g(x)), functions)

add_one = lambda x: x + 1
double = lambda x: x * 2
square = lambda x: x ** 2

transform = compose(square, double, add_one)  # square(double(add_one(x)))
transform(3)  # square(double(4)) = square(8) = 64
```
```rust
// Rust iterator chains are function composition
let result: i32 = vec![1, 2, 3, 4, 5]
    .iter()
    .map(|x| x + 1)          // [2, 3, 4, 5, 6]
    .filter(|x| x % 2 == 0)  // [2, 4, 6]
    .map(|x| x * x)          // [4, 16, 36]
    .sum();                  // 56
// This is a pipeline: data flows through transformations.
// Lazy evaluation: only one pass through the data!
```
5. Map, Filter, Reduce
The three foundational FP operations for transforming collections:
```python
numbers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

# Map: apply a function to each element (transform)
squares = list(map(lambda x: x**2, numbers))
# [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]

# Filter: keep elements matching a predicate (select)
evens = list(filter(lambda x: x % 2 == 0, numbers))
# [2, 4, 6, 8, 10]

# Reduce: combine all elements into a single value (aggregate)
from functools import reduce
total = reduce(lambda acc, x: acc + x, numbers, 0)
# 55

# Pythonic equivalents (list comprehensions — often preferred in Python)
squares = [x**2 for x in numbers]
evens = [x for x in numbers if x % 2 == 0]
total = sum(numbers)

# Chaining: functional pipeline
result = sum(x**2 for x in numbers if x % 2 == 0)
# Sum of squares of even numbers: 4 + 16 + 36 + 64 + 100 = 220
```
```rust
// Rust iterators are functional and zero-cost (compiled to loops)
let numbers = vec![1, 2, 3, 4, 5, 6, 7, 8, 9, 10];

let squares: Vec<i32> = numbers.iter().map(|x| x * x).collect();
let evens: Vec<&i32> = numbers.iter().filter(|x| *x % 2 == 0).collect();
let total: i32 = numbers.iter().sum();

// Chaining operations (lazy evaluation — only one pass!)
let result: i32 = numbers.iter()
    .filter(|x| *x % 2 == 0) // Keep even numbers
    .map(|x| x * x)          // Square them
    .sum();                  // Sum the result
// 4 + 16 + 36 + 64 + 100 = 220
```
Advanced FP Concepts¶
Closures: Functions that capture variables from their enclosing scope. The captured variables persist as long as the closure exists.
```python
def make_multiplier(factor):
    def multiplier(x):
        return x * factor  # 'factor' is captured from the outer scope
    return multiplier

double = make_multiplier(2)
triple = make_multiplier(3)

double(5)  # 10
triple(5)  # 15

# 'factor' persists inside the closure even after make_multiplier returns
```
Currying and Partial Application:

- Currying: Transform a function that takes multiple arguments into a chain of functions, each taking a single argument: f(a, b, c) → f(a)(b)(c)
- Partial application: Fix some arguments of a function to create a new function: f(a, b, c) with a=5 → g(b, c) = f(5, b, c)
```python
from functools import partial

def add(a, b):
    return a + b

add_five = partial(add, 5)  # Partially apply the first argument
add_five(3)  # 8

# Manual currying
def curry_add(a):
    def inner(b):
        return a + b
    return inner

curry_add(5)(3)  # 8

# Practical use: configuring reusable functions
def log(level, message):
    print(f"[{level}] {message}")

info = partial(log, "INFO")
error = partial(log, "ERROR")

info("Server started")    # [INFO] Server started
error("Connection lost")  # [ERROR] Connection lost
```
Monads: A design pattern for chaining operations that include context (error handling, optional values, async). Think of them as "programmable semicolons"—they control what happens between computation steps. The key operation is bind (also called flatMap or >>=): given a value in a context and a function that produces a new context, combine them.
```python
# The "Maybe" / "Optional" pattern — a monad for handling None
from typing import Optional, TypeVar, Callable

T = TypeVar('T')
U = TypeVar('U')

def bind(value: Optional[T], func: Callable[[T], Optional[U]]) -> Optional[U]:
    """Chain operations that might return None."""
    if value is None:
        return None
    return func(value)

# Without monadic chaining (nested null checks — "pyramid of doom")
def get_user_city_ugly(user_id):
    user = get_user(user_id)
    if user is None:
        return None
    address = get_address(user)
    if address is None:
        return None
    return get_city(address)

# With monadic chaining (flat, composable)
def get_user_city(user_id):
    return bind(bind(get_user(user_id), get_address), get_city)
```
```rust
// Rust's Option and Result types are monads;
// the ? operator is monadic bind for Result/Option
fn get_user_city(user_id: u64) -> Option<String> {
    let user = get_user(user_id)?;     // Returns None if None
    let address = get_address(&user)?; // Returns None if None
    let city = get_city(&address)?;    // Returns None if None
    Some(city)
}

// Result monad for error handling
fn process_order(order_id: &str) -> Result<Receipt, OrderError> {
    let order = fetch_order(order_id)?;     // Returns Err on error
    let payment = process_payment(&order)?; // Returns Err on error
    let receipt = generate_receipt(&order, &payment)?;
    Ok(receipt)
}
// The ? operator automatically propagates errors — no manual error checking
```
Pattern Matching: Functional languages (and increasingly multi-paradigm languages) use pattern matching to destructure data and branch based on structure:
```rust
// Rust pattern matching — exhaustive and expressive
enum Shape {
    Circle { radius: f64 },
    Rectangle { width: f64, height: f64 },
    Triangle { base: f64, height: f64 },
}

fn area(shape: &Shape) -> f64 {
    match shape {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rectangle { width, height } => width * height,
        Shape::Triangle { base, height } => 0.5 * base * height,
    }
    // Compiler ensures all variants are handled — can't forget one!
}
```
```python
# Python 3.10+ structural pattern matching
def process_command(command):
    match command:
        case {"action": "create", "name": str(name), "type": str(type_)}:
            create_resource(name, type_)
        case {"action": "delete", "id": int(id_)}:
            delete_resource(id_)
        case {"action": "list", "filter": dict(filter_)}:
            list_resources(filter_)
        case _:
            raise ValueError(f"Unknown command: {command}")
```
Algebraic Data Types (ADTs): The foundation of FP type systems. ADTs combine sum types (a value is one of several variants, like an enum) and product types (a value has multiple fields, like a struct/record):
```rust
// Sum type (OR): a value is ONE of these variants
enum Result<T, E> {
    Ok(T),  // Success with value
    Err(E), // Error with error info
}

enum Option<T> {
    Some(T), // Has a value
    None,    // No value
}

// Product type (AND): a value has ALL of these fields
struct User {
    name: String,
    email: String,
    age: u32,
}

// Combined: a rich domain model
enum PaymentResult {
    Success { transaction_id: String, amount: f64 },
    Declined { reason: String },
    Error { code: u32, message: String },
}
```
Declarative Programming¶
Declarative programming describes what you want, not how to compute it. The runtime/engine figures out the execution strategy.
```sql
-- Declarative: describe the desired result
SELECT users.name, COUNT(orders.id) AS order_count
FROM users
JOIN orders ON users.id = orders.user_id
WHERE users.active = true
GROUP BY users.name
HAVING COUNT(orders.id) > 5
ORDER BY order_count DESC;
-- The database decides: which indexes to use, join strategy, execution order
```
```hcl
# Terraform: declare desired infrastructure state
resource "aws_instance" "web" {
  ami           = "ami-0abcdef1234567890"
  instance_type = "t3.medium"
}
# Terraform figures out what to create, update, or destroy to reach this state
```
Declarative UI (React, SwiftUI): Declare what the UI should look like given the current state; the framework handles updates:
```jsx
// React: UI is a pure function of state (declarative)
function UserProfile({ user }) {
  return (
    <div>
      <h1>{user.name}</h1>
      <p>{user.email}</p>
      {user.isAdmin && <Badge text="Admin" />}
    </div>
  );
}
// You never imperatively manipulate the DOM.
// React computes the diff and applies minimal updates.
```
Reactive Programming¶
Reactive programming models computation as data streams and the propagation of change. When a source value changes, all derived values update automatically.
```javascript
// RxJS: reactive streams
import { fromEvent } from 'rxjs';
import { map, filter, debounceTime, switchMap } from 'rxjs/operators';

// Search-as-you-type with debouncing
const searchInput = document.getElementById('search');

fromEvent(searchInput, 'input')
  .pipe(
    map(event => event.target.value),    // Extract input value
    filter(query => query.length >= 3),  // Only search 3+ chars
    debounceTime(300),                   // Wait 300ms after typing stops
    switchMap(query => fetch(`/api/search?q=${query}`)), // API call (cancel previous)
  )
  .subscribe(results => renderResults(results));

// This declaratively describes: "When the user types 3+ characters and pauses
// for 300ms, search the API and render results. Cancel any in-flight request
// if the user types again."
```
Key reactive concepts:

- Observable: A stream of values over time (clicks, HTTP responses, timer ticks)
- Observer: Subscribes to an observable and reacts to emitted values
- Operators: Transform streams (map, filter, merge, combine, debounce, retry)
- Backpressure: Handling the case where a producer emits faster than a consumer can process
Reactive is not just frontend: Akka Streams (Scala/Java), Reactor (Spring), and RxPy are used in backend systems for processing event streams, handling backpressure in data pipelines, and building resilient microservices.
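Backpressure can be sketched in plain Python with a bounded asyncio queue (a simplified model, not a full reactive library):

```python
import asyncio

async def producer(queue: asyncio.Queue, n: int):
    for i in range(n):
        await queue.put(i)  # Blocks when the queue is full: that's backpressure

async def consumer(queue: asyncio.Queue, n: int) -> list[int]:
    received = []
    for _ in range(n):
        item = await queue.get()
        await asyncio.sleep(0.01)  # Slow consumer sets the pace
        received.append(item)
    return received

async def main() -> list[int]:
    queue = asyncio.Queue(maxsize=2)  # Bounded buffer: at most 2 items in flight
    _, received = await asyncio.gather(producer(queue, 5), consumer(queue, 5))
    return received

print(asyncio.run(main()))  # [0, 1, 2, 3, 4]
```

With an unbounded buffer the producer would race ahead and memory would grow without limit; the bounded queue forces it down to the consumer's rate.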
Concurrent Programming Paradigms¶
Shared Memory (Threads + Locks)¶
The traditional approach: multiple threads share memory, protected by locks (mutexes, semaphores, read-write locks).
```python
import threading

counter = 0
lock = threading.Lock()

def increment():
    global counter
    for _ in range(100_000):
        with lock:        # Acquire lock, preventing race conditions
            counter += 1  # Critical section: only one thread at a time

threads = [threading.Thread(target=increment) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()

print(counter)  # 400000 — correct with the lock; without it: race condition, unpredictable result
```
Problems: Deadlocks (A waits for B, B waits for A), race conditions (forgot a lock), priority inversion, hard to reason about.
Message Passing (Actor Model)¶
Instead of sharing memory, actors communicate by sending messages. Each actor has:

- Private state (no other actor can access it)
- A mailbox (receives messages)
- Behavior (processes messages one at a time)
```elixir
# Elixir actor (process) example
defmodule Counter do
  def start(initial_value \\ 0) do
    spawn(fn -> loop(initial_value) end)
  end

  defp loop(count) do
    receive do
      {:increment, caller} ->
        send(caller, {:count, count + 1})
        loop(count + 1)

      {:get, caller} ->
        send(caller, {:count, count})
        loop(count)
    end
  end
end

# Usage:
pid = Counter.start(0)
send(pid, {:increment, self()})
send(pid, {:get, self()})
# No shared state, no locks, no race conditions!
```
Erlang/OTP uses the actor model to build systems with 99.9999999% uptime (nine 9s). The "let it crash" philosophy: if an actor encounters an error, it crashes. A supervisor actor detects the crash and restarts it. This is simpler and more reliable than trying to handle every possible error.
Communicating Sequential Processes (CSP)¶
Go's concurrency model: goroutines communicate through channels (typed, synchronous or buffered pipes):
```go
// Go: CSP with goroutines and channels
func producer(ch chan<- int) {
    for i := 0; i < 100; i++ {
        ch <- i // Send value to channel (blocks if channel is full)
    }
    close(ch)
}

func consumer(ch <-chan int, results chan<- int) {
    sum := 0
    for value := range ch { // Receive values until channel is closed
        sum += value * value
    }
    results <- sum
}

func main() {
    ch := make(chan int, 10) // Buffered channel (capacity 10)
    results := make(chan int)

    go producer(ch)          // Start producer goroutine
    go consumer(ch, results) // Start consumer goroutine

    total := <-results // Wait for result
    fmt.Println(total) // 328350
}
```
CSP vs. Actor Model:

- CSP (Go): Processes are anonymous and communicate over channels; the focus is on the communication pattern.
- Actor (Erlang): Actors have identities and named mailboxes; the focus is on the communicating entities.
Software Transactional Memory (STM)¶
STM treats memory operations like database transactions: read and write shared memory inside a transaction, and the runtime ensures atomicity (all-or-nothing) and isolation (no interference between transactions).
```haskell
-- Haskell STM: transfer money between accounts atomically
transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to amount = do
  fromBalance <- readTVar from
  if fromBalance < amount
    then retry -- Block until fromBalance changes
    else do
      writeTVar from (fromBalance - amount)
      toBalance <- readTVar to
      writeTVar to (toBalance + amount)
-- No locks! The runtime handles conflict detection and retries.
```
```clojure
;; Clojure STM: multiple refs updated atomically
(def account-a (ref 1000))
(def account-b (ref 2000))

(dosync ; Transaction — atomic and isolated
  (alter account-a - 100)
  (alter account-b + 100))
;; If another transaction conflicts, one is automatically retried
```
π-Calculus (Pi-Calculus)¶
Robin Milner's π-calculus (1992) extends CSP by allowing channels themselves to be passed as messages. In CSP, channels are fixed conduits between known processes; in π-calculus, channels are first-class values that can be sent and received, enabling dynamic topology—processes can create new communication links at runtime. This models real-world scenarios like service discovery (a process receives a channel to a newly discovered service) and mobile code (a channel to a remote resource is passed along).
The connection to Erlang is conceptual: Erlang's dynamic process linking and the ability to send process IDs (PIDs) as messages mirrors π-calculus—you can send a reference to another process over a channel, effectively passing a "channel" to that process. Session types formalize protocols over channels (who sends what, when) and have been explored in Rust and Scala for type-safe concurrent communication.
```text
// Conceptual pseudocode: a channel sent over a channel.
// Process A has channel c1 to Process B; B sends a new channel c2 back over c1.
// Now A can communicate with a third process via c2, without knowing its identity upfront.
A --[c1]--> B    : A and B communicate over channel c1
B --[c2]--> A    : B sends channel c2 back over c1 (a channel as a message!)
A --[data]--> C  : A uses c2 to send data to whoever B connected it to
```
Practical relevance: Service discovery (receiving a handle to a service), dynamic load balancing, and type-safe session protocols in Rust (e.g., session-types crate) and Scala (effekt, etc.).
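The channel-mobility idea can be sketched in Python, using queues as stand-ins for channels (names like `service` and `reply_ch` are illustrative):

```python
import queue
import threading

def service(request_ch: queue.Queue):
    # The service receives a fresh reply channel inside the request:
    # the caller's "address" travels as a message, π-calculus style.
    reply_ch = request_ch.get()  # A channel received over a channel!
    reply_ch.put("hello from the service")

request_ch = queue.Queue()
threading.Thread(target=service, args=(request_ch,)).start()

reply_ch = queue.Queue()  # A brand-new private channel...
request_ch.put(reply_ch)  # ...sent as a message, creating a new communication link
print(reply_ch.get())     # hello from the service
```

The topology is dynamic: the service never knew `reply_ch` existed until it arrived in its mailbox.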
Petri Nets¶
Petri nets (Carl Adam Petri, 1962) model concurrent systems as a bipartite graph of places (passive holders of tokens) and transitions (active elements that consume and produce tokens). Tokens flow through the net when transitions fire, subject to enabling conditions. This formalism excels at modeling resource allocation, synchronization, and concurrent workflows.
| Primitive | Role | Analogy |
|---|---|---|
| Place | Holds tokens (state); passive | A buffer, a condition, or a resource pool |
| Transition | Consumes tokens from input places, produces tokens in output places; active | An action, event, or operation |
| Token | Movable unit of marking; indicates "something is present" | A resource, a message, or a unit of work |
```text
// Producer-consumer with mutual exclusion (conceptual pseudocode)
// Places:      buffer_slot, producer_ready, consumer_ready, mutex
// Transitions: produce, consume
//
// produce: requires (producer_ready AND mutex) -> outputs (buffer_slot, mutex)
// consume: requires (buffer_slot AND consumer_ready AND mutex) -> outputs (consumer_ready, mutex)
//
// The mutex token ensures only one of produce/consume fires at a time.
```
Formal properties you can prove include: reachability (can we reach a given marking?), boundedness (does any place ever hold unbounded tokens?), liveness (can every transition eventually fire?), and deadlock detection (is there a marking where no transition is enabled?). Tools like CPN Tools (Coloured Petri Nets) support simulation and analysis. Petri nets underpin BPMN (Business Process Model and Notation) for workflow modeling and are used in hardware verification (asynchronous circuits, protocol validation).
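A toy marking/firing simulator in Python makes the semantics concrete (a sketch of the producer-consumer net above, not an analysis tool):

```python
# Minimal Petri net: a marking maps places to token counts; a transition
# is enabled when every input place holds a token, and firing consumes
# from the inputs and produces into the outputs.
def enabled(marking, transition):
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1               # Consume one token from each input place
    for p in outputs:
        m[p] = m.get(p, 0) + 1  # Produce one token in each output place
    return m

# The producer-consumer net: (input places, output places)
produce = (["producer_ready", "mutex"], ["buffer_slot", "mutex"])
consume = (["buffer_slot", "consumer_ready", "mutex"], ["consumer_ready", "mutex"])

m = {"producer_ready": 1, "consumer_ready": 1, "mutex": 1}
print(enabled(m, consume))  # False: no token in buffer_slot yet
m = fire(m, produce)
print(enabled(m, consume))  # True: produce put a token in buffer_slot
m = fire(m, consume)
print(m["buffer_slot"])     # 0: consume took the token back out
```

Reachability analysis is just a search over this `fire` relation from the initial marking.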
Dataflow Model¶
The dataflow model treats computation as a directed graph: nodes are operations, edges are data streams, and a node fires when all its inputs are available. There is no explicit control flow—execution is driven by data availability. This differs from imperative control flow (where the programmer specifies sequence) and is naturally parallelizable: independent nodes can execute concurrently as soon as their inputs arrive.
```python
# Python: generator pipeline (dataflow-style — data streams through stages)
def read_lines(path):
    with open(path) as f:
        for line in f:
            yield line.strip()

def filter_empty(stream):
    for item in stream:
        if item:
            yield item

def to_upper(stream):
    for item in stream:
        yield item.upper()

# Pipeline: each stage consumes from the previous, produces for the next.
# Stages run lazily; data flows through as needed.
pipeline = to_upper(filter_empty(read_lines("data.txt")))
for line in pipeline:
    print(line)
```
```rust
// Rust: iterator chains — dataflow as a DAG of operations
let result: Vec<i32> = vec![1, 2, 3, 4, 5]
    .into_iter()
    .filter(|x| x % 2 == 0) // Node 1: filter
    .map(|x| x * 2)         // Node 2: map (depends on filter)
    .collect();             // Sink: consumes the stream
// Each operation is a node; data flows along the chain.
// The runtime can fuse or parallelize stages (e.g., rayon's par_iter).
```
Real-world systems: Apache Spark (DAG of RDD/DataFrame operations), TensorFlow computation graphs (nodes = ops, edges = tensors), RxJS (reactive streams as dataflow), and hardware description languages (VHDL, Verilog—combinational logic is dataflow: outputs update when inputs change). The key insight: dataflow inverts control—you declare dependencies, and the runtime schedules execution.
Metaprogramming¶
Metaprogramming is code that generates or manipulates other code. It enables powerful abstractions but can make code harder to understand.
| Technique | Description | Languages |
|---|---|---|
| Macros | Code that generates code at compile time | Rust (macro_rules!), Lisp, Elixir |
| Reflection | Inspect and modify program structure at runtime | Java, Python, C# |
| Decorators | Wrap functions to add behavior | Python, TypeScript |
| Code generation | Generate source code from schemas/specs | Protocol Buffers, OpenAPI, GraphQL codegen |
# Python decorator: metaprogramming for cross-cutting concerns
import functools
import time

import requests  # third-party; used by fetch_data below

def retry(max_attempts=3, delay=1):
    """Decorator that retries a function on failure."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    if attempt == max_attempts:
                        raise
                    print(f"Attempt {attempt} failed: {e}. Retrying in {delay}s...")
                    time.sleep(delay)
        return wrapper
    return decorator

@retry(max_attempts=3, delay=2)
def fetch_data(url):
    response = requests.get(url)
    response.raise_for_status()
    return response.json()
// Rust derive macros: generate code at compile time
// (Serialize and Deserialize come from the serde crate:
//  use serde::{Serialize, Deserialize};)
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
struct User {
    name: String,
    email: String,
    age: u32,
}
// The derive macro generates implementations of Debug, Clone, etc.
// automatically based on the struct's fields. Zero runtime cost.
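Reflection, the runtime counterpart listed in the table above, can be shown with Python's `inspect` module and `getattr`. The `Greeter` class and its `hello`/`bye` methods below are hypothetical; the point is that the program discovers and invokes methods by name at runtime, with no static reference to them.

```python
import inspect

class Greeter:
    def hello(self, name: str) -> str:
        return f"Hello, {name}!"

    def bye(self, name: str) -> str:
        return f"Bye, {name}!"

g = Greeter()

# Discover the object's bound methods at runtime
methods = [name for name, _ in inspect.getmembers(g, inspect.ismethod)]
print(methods)        # ['bye', 'hello']

# Look one up by name and call it (no static reference to `hello`)
action = getattr(g, "hello")
print(action("Ada"))  # Hello, Ada!
```

This is the mechanism behind plugin registries, serializers, and test discovery, and also why reflective code is harder to statically analyze.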
Type Systems and Paradigms¶
Type systems deeply influence programming paradigms:
| Type System | Description | Languages | Paradigm Influence |
|---|---|---|---|
| Static, strong | Types checked at compile time, no implicit conversions | Rust, Haskell, Java, Go | Catches errors early, enables powerful abstractions |
| Static, with inference | Compiler infers types (no explicit annotations needed) | Rust, Haskell, Kotlin, TypeScript | FP-friendly (less boilerplate for generic functions) |
| Dynamic, strong | Types checked at runtime, no implicit conversions | Python, Ruby, Elixir | Rapid prototyping, duck typing, flexible |
| Gradual | Optional type annotations (dynamic by default, static when annotated) | Python (mypy), TypeScript, PHP | Best of both worlds (flexibility + safety where needed) |
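To make the gradual row concrete, here is a minimal sketch of gradual typing in Python: annotated and unannotated functions coexist in one module, and a checker such as mypy verifies only the annotated parts. The function and variable names are illustrative.

```python
from typing import Optional

def legacy_lookup(data, key):  # unannotated: dynamic, unchecked by mypy
    return data.get(key)

def typed_lookup(data: dict[str, int], key: str) -> Optional[int]:
    return data.get(key)       # mypy verifies this returns int | None

prices = {"apple": 3, "pear": 5}
print(typed_lookup(prices, "apple"))  # 3
print(typed_lookup(prices, "mango"))  # None
```

At runtime both functions behave identically; the annotations only change what the static checker can catch (e.g., passing a non-string key to `typed_lookup`).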
Type-driven development (common in Haskell, Rust): Use the type system to make illegal states unrepresentable:
// BAD: illegal states are representable
struct Order {
    status: String,               // Could be anything: "shipped", "banana", ""
    shipped_at: Option<DateTime>, // Could be Some even when status != "shipped"
}

// GOOD: illegal states are unrepresentable
enum Order {
    Pending { items: Vec<Item>, created_at: DateTime },
    Paid { items: Vec<Item>, paid_at: DateTime, payment_id: String },
    Shipped { items: Vec<Item>, shipped_at: DateTime, tracking: String },
    Delivered { items: Vec<Item>, delivered_at: DateTime },
    Cancelled { reason: String, cancelled_at: DateTime },
}
// A Shipped order ALWAYS has a tracking number. A Pending order NEVER has a payment_id.
// The compiler enforces this — no runtime checks needed.
When to Use Which Paradigm¶
| Scenario | Best Paradigm | Reasoning |
|---|---|---|
| Data transformations, ETL pipelines | Functional | Immutable data flows naturally through map/filter/reduce |
| Domain modeling with complex state | Object-Oriented | Encapsulation, inheritance, polymorphism model real-world entities |
| Systems programming, performance-critical | Imperative | Direct control over memory and execution flow |
| Configuration, infrastructure | Declarative | Describe desired state, let the system figure out how |
| Concurrent/parallel processing | Functional or Actor | Immutable data avoids race conditions; actors isolate state |
| UI with state management | Functional + Declarative | React model: declarative UI = f(state), functional state updates |
| Business logic with many rules | OOP or FP | OOP: strategy/state patterns. FP: pattern matching, composition |
| Real-time event processing | Reactive | Observable streams handle async data naturally |
| Highly reliable distributed systems | Actor (Erlang/OTP) | Let-it-crash + supervision trees = self-healing systems |
| Data validation, constraint solving | Logic/Declarative | Express rules, let the engine find solutions |
The Multi-Paradigm Reality¶
Most production codebases use multiple paradigms. The best approach is to understand each paradigm and use the right tool for each part of your system:
# A Pythonic example mixing paradigms effectively

# Functional: data transformation pipeline (pure, composable, testable)
def process_orders(orders: list[Order]) -> list[Summary]:
    return [
        summarize(order)
        for order in orders
        if order.status == "completed"
    ]

# OOP: domain modeling (encapsulation, polymorphism, state management)
class OrderService:
    def __init__(self, repository: OrderRepository, notifier: Notifier):
        self._repository = repository  # Dependency injection (SOLID)
        self._notifier = notifier

    def complete_order(self, order_id: str) -> None:
        order = self._repository.get(order_id)
        order.complete()  # State transition via OOP method
        self._repository.save(order)
        self._notifier.send(OrderCompleted(order_id))  # Side effect

# Declarative: configuration (describe what, not how)
RETRY_CONFIG = {
    "max_retries": 3,
    "backoff_factor": 2,
    "retry_on": [ConnectionError, TimeoutError],
}

# Metaprogramming: cross-cutting concerns via decorators
@retry(**RETRY_CONFIG)
@cache(ttl=300)
@trace("order-service")
def get_order(order_id: str) -> Order:
    return db.query(Order).get(order_id)
The key insight: paradigms are tools, not religions. Use FP for data transformations. Use OOP for domain modeling. Use declarative approaches for configuration and infrastructure. Use imperative code when you need fine-grained control. The best engineers are fluent in multiple paradigms and know when to reach for each one.