REST APIs & HTTP Clients


📖 Concept

Understanding REST conventions and knowing how to consume APIs with Python's HTTP client libraries is essential for both building and integrating with web services. In interviews, you will be asked about status codes, idempotency, and how to handle unreliable network calls.

REST Conventions:

  • GET — Read resource(s). Must be idempotent and safe (no side effects). Cacheable.
  • POST — Create a new resource. Not idempotent (calling twice creates two resources).
  • PUT — Full replacement of a resource. Idempotent (calling twice has the same effect).
  • PATCH — Partial update. May or may not be idempotent depending on implementation.
  • DELETE — Remove a resource. Idempotent (deleting twice gives 404 on second call but the state is the same).
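The POST/PUT distinction is easiest to see offline with a toy in-memory store (the `UserStore` class below is hypothetical, not a real framework): replaying a PUT converges on the same state, while replaying a POST keeps creating resources.

```python
import itertools


class UserStore:
    """Toy in-memory resource store to illustrate idempotency."""

    def __init__(self):
        self._users: dict[int, dict] = {}
        self._ids = itertools.count(1)

    def post(self, body: dict) -> int:
        """POST /users: the server picks the ID, so a retried POST duplicates."""
        new_id = next(self._ids)
        self._users[new_id] = body
        return new_id

    def put(self, user_id: int, body: dict) -> None:
        """PUT /users/{id}: the client names the resource, so retries converge."""
        self._users[user_id] = body

    def count(self) -> int:
        return len(self._users)


store = UserStore()
store.post({"name": "Alice"})
store.post({"name": "Alice"})   # a retried POST creates a second resource
assert store.count() == 2

store.put(99, {"name": "Bob"})
store.put(99, {"name": "Bob"})  # a retried PUT leaves the same state
assert store.count() == 3
```

This is exactly why automatic retries are safe for PUT but dangerous for POST.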

Status Code Families:

  • 2xx — Success: 200 OK, 201 Created, 204 No Content
  • 3xx — Redirection: 301 Moved Permanently, 304 Not Modified
  • 4xx — Client Error: 400 Bad Request, 401 Unauthorized, 403 Forbidden, 404 Not Found, 422 Unprocessable Entity, 429 Too Many Requests
  • 5xx — Server Error: 500 Internal Server Error, 502 Bad Gateway, 503 Service Unavailable
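In client code, the family (the first digit) often decides the handling strategy: 5xx and 429 are usually transient and worth retrying, while other 4xx mean the request itself is wrong. A minimal sketch, with hypothetical helper names:

```python
def status_family(code: int) -> str:
    """Map an HTTP status code to its family name."""
    families = {2: "success", 3: "redirection", 4: "client error", 5: "server error"}
    return families.get(code // 100, "unknown")


def is_retryable(code: int) -> bool:
    """5xx and 429 are usually transient; other 4xx are the caller's fault."""
    return code // 100 == 5 or code == 429


assert status_family(201) == "success"
assert status_family(404) == "client error"
assert is_retryable(503) and is_retryable(429)
assert not is_retryable(400)
```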

Python HTTP Libraries:

  • `requests` — The classic synchronous library. Simple API, widely used, but blocks the calling thread during I/O.
  • `httpx` — Modern library supporting both sync and async. Drop-in replacement for `requests` with an async client (`httpx.AsyncClient`). Supports HTTP/2.
  • `aiohttp` — Async-only library, more low-level than httpx. Used heavily in older async codebases.

Production patterns include retry logic with exponential backoff, connection pooling via sessions/clients, timeout configuration, and proper error handling for transient failures. These patterns prevent cascading failures in microservice architectures.

💻 Code Example

```python
# ============================================================
# HTTP Client Patterns — requests vs httpx
# ============================================================
import asyncio
import random

import httpx
import requests


# ============================================================
# BAD: Naive HTTP calls without error handling or sessions
# ============================================================
# def get_user_bad(user_id):
#     # Creates a new TCP connection every call (slow)
#     response = requests.get(f"https://api.example.com/users/{user_id}")
#     return response.json()  # Crashes on non-200 or invalid JSON


# ============================================================
# GOOD: Production-grade synchronous client with requests
# ============================================================
class APIClient:
    """Reusable HTTP client with session pooling and retries."""

    def __init__(self, base_url: str, api_key: str, timeout: int = 30):
        self.base_url = base_url.rstrip("/")
        self.session = requests.Session()

        # Connection pooling — reuses TCP connections
        adapter = requests.adapters.HTTPAdapter(
            pool_connections=10,
            pool_maxsize=20,
            max_retries=requests.adapters.Retry(
                total=3,
                backoff_factor=1,  # ~1s, 2s, 4s between retries
                status_forcelist=[502, 503, 504],  # retry on these status codes
                allowed_methods=["GET", "PUT", "DELETE"],  # only idempotent methods
            ),
        )
        self.session.mount("https://", adapter)
        self.session.mount("http://", adapter)

        # Default headers for all requests
        self.session.headers.update({
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
            "Accept": "application/json",
        })
        self.timeout = timeout

    def get(self, path: str, params: dict | None = None) -> dict | None:
        """GET with proper error handling."""
        url = f"{self.base_url}{path}"
        try:
            response = self.session.get(url, params=params, timeout=self.timeout)
            response.raise_for_status()  # raises HTTPError for 4xx/5xx
            return response.json()
        except requests.exceptions.Timeout:
            raise TimeoutError(f"Request to {url} timed out after {self.timeout}s")
        except requests.exceptions.HTTPError as e:
            if e.response.status_code == 404:
                return None
            raise
        except requests.exceptions.ConnectionError:
            raise ConnectionError(f"Cannot reach {url}")

    def post(self, path: str, data: dict) -> dict:
        """POST with response validation."""
        url = f"{self.base_url}{path}"
        response = self.session.post(url, json=data, timeout=self.timeout)
        response.raise_for_status()
        return response.json()

    def close(self):
        """Always close sessions to release connections."""
        self.session.close()

    def __enter__(self):
        return self

    def __exit__(self, *args):
        self.close()


# Usage:
# with APIClient("https://api.example.com", api_key="secret") as client:
#     user = client.get("/users/42")
#     new_user = client.post("/users", {"name": "Alice", "role": "admin"})


# ============================================================
# GOOD: Async client with httpx (for FastAPI/async apps)
# ============================================================
async def fetch_multiple_users(user_ids: list[int]) -> list[dict]:
    """Fetch multiple users concurrently with httpx.AsyncClient."""
    async with httpx.AsyncClient(
        base_url="https://api.example.com",
        headers={"Authorization": "Bearer secret"},
        timeout=httpx.Timeout(30.0, connect=5.0),
    ) as client:
        # Fire all requests concurrently
        tasks = [client.get(f"/users/{uid}") for uid in user_ids]
        responses = await asyncio.gather(*tasks, return_exceptions=True)

        results = []
        for resp in responses:
            if isinstance(resp, Exception):
                results.append({"error": str(resp)})
            elif resp.status_code == 200:
                results.append(resp.json())
            else:
                results.append({"error": f"HTTP {resp.status_code}"})
        return results


# ============================================================
# Retry with exponential backoff (manual implementation)
# ============================================================
async def fetch_with_retry(
    client: httpx.AsyncClient,
    url: str,
    max_retries: int = 3,
    base_delay: float = 1.0,
) -> httpx.Response:
    """Retry with exponential backoff and jitter."""
    for attempt in range(max_retries + 1):
        try:
            response = await client.get(url)
            if response.status_code in (502, 503, 504):
                raise httpx.HTTPStatusError(
                    "Server error", request=response.request, response=response
                )
            response.raise_for_status()
            return response
        except (httpx.HTTPStatusError, httpx.ConnectTimeout) as e:
            # Non-transient HTTP errors (e.g. 400, 404) are not worth retrying
            if (
                isinstance(e, httpx.HTTPStatusError)
                and e.response.status_code not in (502, 503, 504)
            ):
                raise
            if attempt == max_retries:
                raise
            delay = base_delay * (2 ** attempt) + random.uniform(0, 1)
            await asyncio.sleep(delay)
```

🏋️ Practice Exercise

Exercises:

  1. Build a `GitHubClient` class using `requests.Session` that wraps the GitHub API (`https://api.github.com`). Implement methods for `get_user(username)`, `list_repos(username)`, and `get_repo_languages(owner, repo)`. Handle rate limiting (403 with `X-RateLimit-Remaining: 0` header) by sleeping until the reset time.

  2. Rewrite the `GitHubClient` using `httpx.AsyncClient`. Fetch 10 users' repositories concurrently using `asyncio.gather()` and compare the total time against sequential synchronous requests.

  3. Implement a retry decorator: `@retry(max_attempts=3, backoff_factor=2, retryable_exceptions=(ConnectionError, TimeoutError))`. Apply it to an HTTP client method and write tests that mock network failures to verify the retry behavior.

  4. Build a REST API testing suite: write a script that creates a resource (POST), reads it (GET), updates it (PUT and PATCH), and deletes it (DELETE). Verify correct status codes at each step (201, 200, 200, 204). Use `httpx` and run against a local FastAPI server.

  5. Implement a simple HTTP caching layer: create a wrapper around `httpx.AsyncClient` that respects `Cache-Control` headers and stores responses in a dictionary with TTL-based expiration.

⚠️ Common Mistakes

  • Not using a session or client instance for repeated requests. Each bare `requests.get()` call opens a fresh TCP connection, skipping connection pooling and adding latency. Use `requests.Session()` or `httpx.Client()` for repeated calls to the same host.

  • Retrying POST requests on failure without understanding idempotency. POST is not idempotent — retrying may create duplicate resources. Only retry idempotent methods (GET, PUT, DELETE) automatically. For POST, use idempotency keys or check for an existing resource first.

  • Setting no timeout on HTTP requests. `requests` defaults to no timeout — a hanging server will block your process forever. Always set `timeout=(connect_timeout, read_timeout)`, e.g. `timeout=(5, 30)`.

  • Calling `response.json()` without checking the status first. A 500 error may return an HTML body, and `.json()` will raise a `JSONDecodeError`. Call `response.raise_for_status()` or check `response.status_code` before parsing the body.

  • Using synchronous `requests` inside `async def` handlers in FastAPI or asyncio applications. This blocks the event loop. Use `httpx.AsyncClient` in async contexts, or run the sync call in a thread pool with `asyncio.to_thread()`.
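The last bullet is the easiest to get wrong silently. A minimal sketch of the fix, using `time.sleep` as a stand-in for a blocking call like `requests.get` so it runs offline; the handler names are made up:

```python
import asyncio
import time


def blocking_fetch(user_id: int) -> dict:
    # Stand-in for a sync HTTP call: blocks whichever thread runs it
    time.sleep(0.1)
    return {"id": user_id}


async def handler_good(user_id: int) -> dict:
    # Runs the blocking call in a worker thread, keeping the event loop free
    return await asyncio.to_thread(blocking_fetch, user_id)


async def main() -> float:
    start = time.perf_counter()
    # Three concurrent "requests": with to_thread the sleeps overlap
    results = await asyncio.gather(*(handler_good(i) for i in range(3)))
    assert [r["id"] for r in results] == [0, 1, 2]
    return time.perf_counter() - start


elapsed = asyncio.run(main())
assert elapsed < 0.25  # overlapped, not 3 × 0.1s sequential
```

Calling `blocking_fetch` directly inside the handlers would serialize the three calls to roughly 0.3s, because a blocked event loop cannot switch tasks.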
